CN116091651A - Processing method and device for nonstandard sparse color filter array image - Google Patents

Publication number
CN116091651A
Authority
CN
China
Prior art keywords
image; channel; pixels; interpolation; APS
Legal status
Pending
Application number
CN202111309465.1A
Other languages
Chinese (zh)
Inventor
韩江涛
杨力林
蒋坤君
李程辉
Current Assignee
Sunny Optical Zhejiang Research Institute Co Ltd
Original Assignee
Sunny Optical Zhejiang Research Institute Co Ltd
Application filed by Sunny Optical Zhejiang Research Institute Co Ltd filed Critical Sunny Optical Zhejiang Research Institute Co Ltd
Priority to CN202111309465.1A
Publication of CN116091651A

Classifications

    • G — Physics
    • G06 — Computing; Calculating or Counting
    • G06T — Image Data Processing or Generation, in General
    • G06T 11/00 — 2D [Two Dimensional] image generation
    • G06T 11/40 — Filling a planar surface by adding surface attributes, e.g. colour or texture

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

The application relates to a processing method and device for a nonstandard sparse color filter array image. The method comprises: acquiring a nonstandard sparse color filter array image; reconstructing three-channel color information at the APS pixel positions from the single-channel color information at the APS pixel positions in the color filter array image by means of an interpolation algorithm, and filling the three-channel color information into the corresponding APS pixel positions in the color filter array image to obtain at least three half-color array images; determining an interpolation direction map for interpolating the APS pixels at the DVS pixels based on the three-channel color information at the APS pixel positions of the half-color array images, so as to reconstruct three-channel color information at the DVS pixel positions; and filling the reconstructed three-channel color information into the corresponding DVS pixel positions in the color filter array image to obtain a full-color image. The method and device solve the problem that reconstruction of nonstandard sparse color filter array images cannot be achieved in the related art, realize such reconstruction, and make the result compatible with a conventional ISP algorithm.

Description

Processing method and device for nonstandard sparse color filter array image
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and apparatus for processing a nonstandard sparse color filter array image.
Background
The dynamic vision sensor (Dynamic Vision Sensor, DVS) is a novel vision sensor with advantages such as high-speed response (up to tens of thousands of hertz) and a large imaging dynamic range. It achieves low-data-volume, low-power data output and alleviates the algorithm bloat caused by the insufficient frame rate, limited dynamic range, frame loss, and excessive data volume of a conventional camera image sensor (CIS). Because a DVS cannot capture the fine textures of a natural scene, an image chip vision sensor has been proposed in which conventional APS (Active Pixel Sensor) pixels and novel DVS pixels are arranged in a crossed pattern, outputting a color CFA image and DVS data simultaneously; this yields high-resolution RGB images together with event data while guaranteeing a high signal-to-noise ratio. The image chip vision sensor is covered with a color filter array arranged in a Bayer pattern, so that each pixel samples only one of the three primary colors (red R, green G, or blue B); the other two color values must be interpolated from neighborhood information to obtain the required full-color image. This processing is called demosaicing.
Existing demosaicing methods for color filter array images generally adopt directional linear minimum mean square error estimation. Such a method first estimates the color-difference values in the horizontal and vertical directions via linear minimum mean square error (LMMSE), then obtains a final color-difference signal by weighted mixing of the two directions, and finally estimates the missing pixel values of each channel. However, because the DVS and APS pixels are distributed at intervals, the binary data output by the DVS pixels cannot provide intensity information for the APS pixels, so the APS pixel arrangement is discontinuous and the three channel colors at the DVS pixels cannot be estimated by the above method; reconstruction of a nonstandard sparse color filter array image therefore cannot be achieved.
For the problem that reconstruction of nonstandard sparse color filter array images cannot be achieved in the related art, no effective solution has yet been proposed.
Disclosure of Invention
In this embodiment, a method and an apparatus for processing a non-standard sparse color filter array image are provided to solve the problem that in the related art, the non-standard sparse color filter array image reconstruction cannot be achieved.
In a first aspect, in this embodiment, there is provided a method for processing a nonstandard sparse color filter array image, including:
Acquiring a nonstandard sparse color filter array image, wherein the color filter array image comprises APS pixels and DVS pixels which are arranged in a crossing manner with the APS pixels;
reconstructing three channel color information of the APS pixel position based on the single channel color information of the APS pixel position in the color filter array image by using an interpolation algorithm;
filling three-channel color information of the reconstructed APS pixel position into the corresponding position of the APS pixel in the color filter array image to obtain at least three half-color array images;
determining an interpolation direction map for interpolating the APS pixels at the DVS pixels based on the three-channel color information of the APS pixel positions of the half-color array image, so as to reconstruct three-channel color information of the DVS pixel positions;
and filling the three-channel color information of the reconstructed DVS pixel position to the corresponding position of the DVS pixel in the color filter array image to obtain a full-color image.
In some of these embodiments, after the full-color image is obtained, the method further comprises:
downsampling the full-color image to obtain a standard Bayer array image, wherein the standard Bayer array image is suitable for ISP algorithms.
In some of these embodiments, the reconstructing three channel color information for the APS pixel locations in the color filter array image based on the single channel color information for the APS pixel locations using an interpolation algorithm comprises:
downsampling the color filter array image to obtain a first bayer array subgraph of APS pixels;
evaluating pixels missing in a green channel in the first Bayer array subgraph based on the single-channel color information of the APS pixel position by using an interpolation algorithm to obtain a first green channel image;
performing bilinear interpolation on pixels with missing red channels in a first Bayer array subgraph based on the first green channel image by using an interpolation algorithm to obtain a first red channel image;
performing bilinear interpolation on pixels with missing blue channels in a first Bayer array subgraph based on the first green channel image by using an interpolation algorithm to obtain a first blue channel image;
and reconstructing three-channel color information of the APS pixel position according to the first green channel image, the first red channel image and the first blue channel image.
In some embodiments, the estimating, by using an interpolation algorithm, pixels missing in the green channel in the first Bayer array subgraph based on the single-channel color information of the APS pixel positions to obtain a first green channel image comprises:
Determining a weight pattern of a green channel missing pixel in the first bayer array subgraph based on the single-channel color information of the APS pixel position;
and interpolating the pixels with the missing green channels in the first Bayer array subgraph based on the weight direction diagram of the pixels with the missing green channels in the first Bayer array subgraph by using an interpolation formula to obtain a first green channel image.
In some embodiments, the interpolation formula is:
g = ω_V · g_V + (1 − ω_V) · g_H
where g denotes the green value interpolated at an R/B position; g_V the interpolation in the vertical direction; g_H the interpolation in the horizontal direction; and ω_V the weight in the vertical direction.
In some of these embodiments, the determining a weight direction map of the green-channel missing pixels in the first Bayer array subgraph based on the single-channel color information of the APS pixel positions comprises:
determining the gradient of a green channel missing pixel in a first Bayer array subgraph, and determining initial weights of APS pixels in the upper, lower, left and right directions by using the gradient;
determining first weights in the horizontal direction and the vertical direction by utilizing the gradient variance of the interpolation map;
determining second weights in the horizontal direction and the vertical direction by using the interpolation map chromatic aberration variance;
And carrying out noise judgment on the direction of the initial weight according to the first weight and the second weight so as to determine a weight direction diagram of the missing pixels of the green channel in the first Bayer array subgraph.
In some of these embodiments, the determining an interpolation direction map for interpolating the APS pixels at the DVS pixels based on the three-channel color information of the APS pixel positions of the half-color array image, so as to reconstruct three-channel color information of the DVS pixel positions, comprises:
downsampling the half-color array image to obtain a second Bayer array subgraph of DVS pixels;
and determining a weight direction diagram of each channel missing pixel in the second Bayer array subgraph of the DVS pixel based on the three channel color information of the APS pixel position so as to reconstruct the three channel color information of the DVS pixel position.
In some of these embodiments, the reconstructing three-channel color information for the DVS pixel location includes:
interpolating pixels with a missing green channel in a second bayer array subgraph of the DVS pixels based on a weight pattern of the missing green channel pixels in the second bayer array subgraph by using a first weighted interpolation formula to obtain a second green channel image;
Interpolation is carried out on the pixels with the missing red channels in the second Bayer array subgraph based on the weight direction diagram of the pixels with the missing red channels in the second Bayer array subgraph of the DVS pixels by using a second weighted interpolation formula, so that a second red channel image is obtained;
interpolating pixels with missing blue channels in a second Bayer array subgraph of the DVS pixels based on a weight pattern of the pixels with missing blue channels in the second Bayer array subgraph by using a third weighted interpolation formula to obtain a second blue channel image;
and reconstructing three-channel color information of the DVS pixel position according to the second green channel image, the second red channel image and the second blue channel image.
In a second aspect, in this embodiment, there is provided a processing apparatus for a nonstandard sparse color filter array image, including: the device comprises an acquisition module, a first reconstruction module, a first filling module, a second reconstruction module and a second filling module;
the acquisition module is used for acquiring a nonstandard sparse color filter array image, wherein the color filter array image comprises APS pixels and DVS pixels which are arranged in a crossing manner with the APS pixels;
The first reconstruction module is configured to reconstruct three channel color information of the APS pixel position based on the single channel color information of the APS pixel position in the color filter array image using an interpolation algorithm;
the first filling module is used for filling three-channel color information of the reconstructed APS pixel position to the corresponding position of the APS pixel in the color filter array image to obtain at least three half-color array images;
the second reconstruction module is configured to determine an interpolation direction diagram of an interpolation APS pixel at the DVS pixel based on three channel color information of the APS pixel position of the half color array image, so as to reconstruct three channel color information of the DVS pixel position;
and the second filling module is used for filling the three-channel color information of the reconstructed DVS pixel position to the corresponding position of the DVS pixel in the color filter array image to obtain a full-color image.
In a third aspect, in this embodiment, there is provided a sensor, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the method for processing a nonstandard sparse color filter array image according to the first aspect.
In a fourth aspect, in this embodiment, there is provided a storage medium having stored thereon a computer program which, when executed by a processor, implements the processing method of the nonstandard sparse color filter array image described in the first aspect above.
Compared with the related art, the processing method and device for a nonstandard sparse color filter array image provided in this embodiment acquire a nonstandard sparse color filter array image comprising APS pixels and DVS pixels arranged in a crossed pattern with the APS pixels; reconstruct three-channel color information of the APS pixel positions from the single-channel color information of the APS pixel positions in the color filter array image using an interpolation algorithm; fill the reconstructed three-channel color information into the corresponding APS pixel positions in the color filter array image to obtain at least three half-color array images; determine an interpolation direction map for interpolating the APS pixels at the DVS pixels based on the three-channel color information of the APS pixel positions of the half-color array images, so as to reconstruct three-channel color information of the DVS pixel positions; and fill the reconstructed three-channel color information into the corresponding DVS pixel positions in the color filter array image to obtain a full-color image. This solves the problem that reconstruction of nonstandard sparse color filter array images cannot be achieved in the related art, realizes such reconstruction, and makes the result compatible with a conventional ISP algorithm.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below to provide a more thorough understanding of the other features, objects, and advantages of the application.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
fig. 1 is a hardware block diagram of a terminal device of a processing method of a nonstandard sparse color filter array image according to an embodiment of the present application;
FIG. 2 is a flow chart of a method for processing a non-standard sparse color filter array image according to one embodiment of the present application;
fig. 3 is a schematic diagram illustrating the process of step S220 in fig. 2;
fig. 4 is a flowchart of step S220 in fig. 2;
FIG. 5 is a flow chart of a method of processing a non-standard sparse color filter array image provided by another embodiment of the present application;
FIG. 6 is a schematic diagram of a one-dimensional horizontal bayer array pattern according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a one-dimensional vertical bayer array pattern according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a process for determining a first weight using interpolation map gradient variance according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a process for determining a second weight using interpolation map color difference variance according to an embodiment of the present application;
FIG. 10 is a schematic illustration of demosaicing provided in an embodiment of the present application;
FIG. 11 is a schematic diagram of a process for determining an interpolation pattern of interpolation APS pixels at DVS pixels, according to one embodiment of the present application;
fig. 12 is a block diagram of a processing apparatus for a nonstandard sparse color filter array image according to an embodiment of the present application.
In the figure: 210. an acquisition module; 220. a first reconstruction module; 230. a first filling module; 240. a second reconstruction module; 250. and a second filling module.
Detailed Description
For a clearer understanding of the objects, technical solutions and advantages of the present application, the present application is described and illustrated below with reference to the accompanying drawings and examples.
Unless otherwise defined, technical or scientific terms used herein shall have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terms "a," "an," "the," "these," and the like in this application are not intended to be limiting in number, but rather are singular or plural. The terms "comprising," "including," "having," and any variations thereof, as used in the present application, are intended to cover a non-exclusive inclusion; for example, a process, method, and system, article, or apparatus that comprises a list of steps or modules (units) is not limited to the list of steps or modules (units), but may include other steps or modules (units) not listed or inherent to such process, method, article, or apparatus. The terms "connected," "coupled," and the like in this application are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. Reference to "a plurality" in this application means two or more. "and/or" describes an association relationship of an association object, meaning that there may be three relationships, e.g., "a and/or B" may mean: a exists alone, A and B exist together, and B exists alone. Typically, the character "/" indicates that the associated object is an "or" relationship. The terms "first," "second," "third," and the like, as referred to in this application, merely distinguish similar objects and do not represent a particular ordering of objects.
The method embodiments provided in the present embodiment may be executed in a terminal, a computer, or similar computing device. For example, the method runs on a terminal, and fig. 1 is a block diagram of a hardware structure of the terminal of the processing method of a nonstandard sparse color filter array image in this embodiment. As shown in fig. 1, the terminal may include one or more (only one is shown in fig. 1) processors 102 and a memory 104 for storing data, wherein the processors 102 may include, but are not limited to, a microprocessor MCU, a programmable logic device FPGA, or the like. The terminal may also include a transmission device 106 for communication functions and an input-output device 108. It will be appreciated by those of ordinary skill in the art that the configuration shown in fig. 1 is merely illustrative and is not intended to limit the configuration of the terminal described above. For example, the terminal may also include more or fewer components than shown in fig. 1, or have a different configuration than shown in fig. 1.
The memory 104 may be used to store a computer program, for example, a software program of application software and a module, such as a computer program corresponding to a processing method of a non-standard sparse color filter array image in the present embodiment, and the processor 102 executes the computer program stored in the memory 104 to perform various functional applications and data processing, that is, to implement the above-described method. Memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory remotely located relative to the processor 102, which may be connected to the terminal via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used to receive or transmit data via a network. The network includes a wireless network provided by a communication provider of the terminal. In one example, the transmission device 106 includes a network adapter (Network Interface Controller, simply referred to as NIC) that can connect to other network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is configured to communicate with the internet wirelessly.
In this embodiment, a processing method of a nonstandard sparse color filter array image is provided, and fig. 2 is a flowchart of the processing method of the nonstandard sparse color filter array image in this embodiment, as shown in fig. 2, where the flowchart includes the following steps:
step S210, obtaining a nonstandard sparse color filter array image, wherein the color filter array image comprises APS pixels and DVS pixels which are arranged in a crossing manner with the APS pixels;
step S220, rebuilding three channel color information of the APS pixel position based on the single channel color information of the APS pixel position in the color filter array image by utilizing an interpolation algorithm;
step S230, filling three channel color information of the reconstructed APS pixel position into the corresponding position of the APS pixel in the color filter array image to obtain at least three half-color array images;
Step S240, determining an interpolation pattern of interpolation APS pixels at the DVS pixels based on three-channel color information of the APS pixel positions of the half-color array image so as to reconstruct three-channel color information of the DVS pixel positions;
and step S250, filling three channel color information of the reconstructed DVS pixel position into the corresponding position of the DVS pixel in the color filter array image to obtain a full-color image.
Note that the nonstandard sparse color filter array (Color Filter Array, CFA) image includes APS pixels and DVS pixels, arranged in a crossed pattern: in the array, the blank positions are DVS pixels and the RGB positions are APS pixels. The three-channel color information pic1 of the APS pixel positions is reconstructed from the single-channel color information of the APS pixel positions using an interpolation algorithm and filled into the corresponding APS pixel positions in the color filter array image, as shown in fig. 3. The reconstructed three-channel color information of the APS pixel positions then guides the reconstruction of the three-channel color information of the DVS pixel positions; finally, the reconstructed three-channel color information of the DVS pixel positions is filled into the corresponding DVS pixel positions in the color filter array image to obtain a full-color image.
The interpolation algorithm is not limited; for example, it may be a nearest-neighbor, bilinear, or bicubic interpolation algorithm. In other embodiments, it may also be implemented with a local-embedding (Neighbor Embedding) algorithm, a deep-learning algorithm, and the like. A half-color array image is an image in which the APS pixels of the color filter array image have been completely filled, so that the reconstructed APS pixel positions carry complete three-channel color information, namely a green channel image, a red channel image, and a blue channel image; each filled channel yields one half-color array image, hence at least three half-color array images. In other embodiments, the three half-color array images may also be combined into one.
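The fill step described above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the function name, the checkerboard-style APS layout (APS pixels on even rows and even columns), and the toy values are all assumptions introduced here; the patent does not fix the exact APS/DVS geometry.

```python
import numpy as np

def fill_half_color_images(cfa_shape, aps_mask, aps_rgb):
    # Scatter the reconstructed three-channel APS values back into three
    # full-resolution planes (the "half-color array images"); DVS sites
    # remain zero because they have not yet been reconstructed.
    h, w = cfa_shape
    planes = [np.zeros((h, w)) for _ in range(3)]
    for c in range(3):
        planes[c][aps_mask] = aps_rgb[:, c]
    return planes

# Toy example: APS pixels at even rows/columns of a 4x4 grid (hypothetical
# layout), each reconstructed as the same RGB triple.
mask = np.zeros((4, 4), dtype=bool)
mask[0::2, 0::2] = True
rgb = np.tile([10.0, 20.0, 30.0], (int(mask.sum()), 1))
r, g, b = fill_half_color_images((4, 4), mask, rgb)
print(r[0, 0], g[0, 0], b[0, 0])  # 10.0 20.0 30.0 (filled APS site)
print(r[1, 1])                    # 0.0 (DVS site, not yet filled)
```

The three planes here are kept separate, matching the "at least three half-color array images" wording; stacking them into one H×W×3 array corresponds to the combined variant mentioned above.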
Through the above steps, the three-channel color information of the APS pixel positions is reconstructed first, and the three-channel color information of the DVS pixel positions is then reconstructed based on it, thereby achieving reconstruction of the nonstandard sparse color filter array image; the result is applicable to a conventional ISP algorithm, solving the problem that reconstruction of nonstandard sparse color filter array images cannot be achieved in the related art.
The following describes the above steps in detail:
in some of these embodiments, as shown in fig. 4, step S220 includes the following steps;
step S221, downsampling the color filter array image to obtain a first Bayer array subgraph of APS pixels;
step S222, estimating pixels missing in the green channel in the first Bayer array subgraph based on the single-channel color information of the APS pixel positions by using an interpolation algorithm, to obtain a first green channel image;
step S223, bilinear interpolation is carried out on pixels missing in a red channel in a first Bayer array subgraph based on a first green channel image by utilizing an interpolation algorithm, so as to obtain a first red channel image;
step S224, performing bilinear interpolation on pixels missing in a blue channel in the first Bayer array subgraph based on the first green channel image by using an interpolation algorithm to obtain a first blue channel image;
Step S225, reconstructing three channel color information of APS pixel positions according to the first green channel image, the first red channel image, and the first blue channel image.
Specifically, according to the specific structure in which the APS pixels and the DVS pixels are arranged in a crossed pattern in the color filter array image, a first Bayer array subgraph composed of the APS pixels is obtained by downsampling. The downsampling may be performed with a window of a preset size, for example 2×2, 3×3, 4×4, or 5×5. The downsampled first Bayer array subgraph may be an RGGB first Bayer array subgraph Raw1. In other embodiments, two or more first Bayer array subgraphs may be obtained simultaneously by downsampling and processed in parallel to improve processing efficiency.
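A minimal sketch of this downsampling step, under an assumed layout: the patent does not specify the exact crossed arrangement, so the code below hypothetically places the APS pixels on even rows and even columns, in which case a strided slice extracts the first Bayer array subgraph directly.

```python
import numpy as np

def extract_aps_subgraph(cfa):
    # Downsample the sparse CFA: keep only the (assumed) APS sites.
    # Hypothetical layout: APS pixels occupy even rows and even columns,
    # DVS pixels fill the remaining sites; the APS sub-grid then forms
    # a standard RGGB Bayer mosaic (the "first Bayer array subgraph").
    return cfa[0::2, 0::2].copy()

# Build a toy 8x8 sparse CFA: APS samples at even/even sites, 0 elsewhere.
cfa = np.zeros((8, 8), dtype=np.uint16)
aps = np.arange(16, dtype=np.uint16).reshape(4, 4) + 1
cfa[0::2, 0::2] = aps

sub = extract_aps_subgraph(cfa)
print(sub.shape)                 # (4, 4)
print(np.array_equal(sub, aps))  # True
```

With a different window size (3×3, 4×4, …), the slice strides would change accordingly, and several phase-shifted slices could produce the multiple subgraphs mentioned for parallel processing.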
In this embodiment, the interpolation algorithm is a demosaic algorithm. The weights for interpolating g at the R/B positions in the first Bayer array subgraph are calculated, and weights in different directions (horizontal, vertical, and so on) are used to complete the interpolation of g, thereby reconstructing high-quality green-channel missing pixels. Exploiting the fact that the reconstructed green channel is clearer and more complete, the missing red-channel and blue-channel pixels in the first Bayer array subgraph are then interpolated in the color-difference plane, restoring the first red channel image and the first blue channel image. Finally, the corresponding three-channel color information of the reconstructed three-channel images is filled into the corresponding APS pixel positions in the color filter array image, with the flow shown in fig. 4, so that the color values missing at the APS pixels are restored. In other embodiments, the preferred interpolation algorithm of steps S223 and S224 is bilinear interpolation; bicubic interpolation, nearest-neighbor interpolation, and the like may of course also be adopted.
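The color-difference-plane idea can be illustrated with the simplest case: recovering red at a blue site of an RGGB mosaic. This is a generic sketch of the technique, not the patent's exact bilinear formula; the function name and the four-diagonal-neighbor averaging are assumptions.

```python
import numpy as np

def red_at_blue_site(r, g, y, x):
    # Color-difference interpolation: at a blue site of an RGGB mosaic the
    # four diagonal neighbours carry red, so the missing red value is taken
    # as G plus the average of (R - G) over those neighbours.
    nbrs = [(y - 1, x - 1), (y - 1, x + 1), (y + 1, x - 1), (y + 1, x + 1)]
    diff = np.mean([r[j, i] - g[j, i] for j, i in nbrs])
    return g[y, x] + diff

g = np.full((3, 3), 100.0)
r = np.zeros((3, 3))
for j, i in [(0, 0), (0, 2), (2, 0), (2, 2)]:
    r[j, i] = 120.0  # known red samples at the diagonal neighbours
print(red_at_blue_site(r, g, 1, 1))  # 120.0 (R - G is constant, so exact)
```

Interpolating R − G rather than R itself is what makes the step robust: color differences vary far more slowly than raw intensities across edges, which is why the sharper reconstructed green channel is used as the guide.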
In some of these embodiments, as shown in fig. 5, after obtaining a full-color image, the embodiment of fig. 2 further includes the following steps:
step S260, downsampling the full-color image to obtain a standard Bayer array image; standard bayer array images are suitable for ISP algorithms.
The standard Bayer array image obtained by downsampling may be one of four kinds: a BGGR, GBRG, GRBG, or RGGB Bayer array image. The ISP (Image Signal Processor) algorithm post-processes the obtained standard Bayer array image, performing functions such as linear correction, noise removal, dead-pixel removal, interpolation, white balance, and automatic exposure control. Since the present method achieves reconstruction of the nonstandard sparse color filter array image, it is compatible with a conventional ISP algorithm: the sensor can output RGB images and event data simultaneously, combining the low latency, high temporal resolution, and high dynamic range of an event camera with the ability to output high-resolution RGB images recording the textures of static scenes, while remaining compatible with conventional processing algorithms. Such a sensor can be applied to low-power detection, eye tracking, deblurring, high-speed photography, and large-dynamic-range photography in the notebook computer field; to privacy shielding, quick response, and large-dynamic-range video in the security and autonomous driving fields; and to positioning, mapping, and human-machine interaction in the robotics field, giving it broad application prospects.
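One possible form of this downsampling is shown below: re-mosaicking the full-color image by keeping a single channel per site. The choice of the RGGB variant (among the four listed) and the function name are assumptions for illustration.

```python
import numpy as np

def rgb_to_rggb(img):
    # Keep one channel per site following the RGGB pattern: R at even/even
    # sites, G at the two mixed-parity sites, B at odd/odd sites, producing
    # a standard Bayer mosaic a conventional ISP can consume.
    h, w, _ = img.shape
    bayer = np.empty((h, w), dtype=img.dtype)
    bayer[0::2, 0::2] = img[0::2, 0::2, 0]  # R
    bayer[0::2, 1::2] = img[0::2, 1::2, 1]  # G
    bayer[1::2, 0::2] = img[1::2, 0::2, 1]  # G
    bayer[1::2, 1::2] = img[1::2, 1::2, 2]  # B
    return bayer

# Flat test image: R=1, G=2, B=3 everywhere.
img = np.zeros((4, 4, 3), dtype=np.uint8)
img[..., 0], img[..., 1], img[..., 2] = 1, 2, 3
print(rgb_to_rggb(img))
```

Any of the other three patterns (BGGR, GBRG, GRBG) is obtained by shifting which parity pair maps to which channel.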
In some of these embodiments, step S222 includes the steps of:
determining a weight pattern of a green channel missing pixel in the first bayer array subgraph based on the single-channel color information of the APS pixel position;
and interpolating the pixels with the missing green channels in the first Bayer array subgraph based on the weight direction diagram of the pixels with the missing green channels in the first Bayer array subgraph by using an interpolation formula to obtain a first green channel image.
The interpolation formula is as follows:
g = ω_V · g_V + (1 − ω_V) · g_H

wherein g represents the interpolated value at an R/B position; g_V represents the interpolation in the vertical direction; g_H represents the interpolation in the horizontal direction; and ω_V is the weight in the vertical direction.
The method for determining the weight direction diagram of the missing pixels of the green channel in the first Bayer array subgraph based on the single-channel color information of the APS pixel positions comprises the following steps:
determining the gradient of a green channel missing pixel in the first Bayer array subgraph, and determining initial weights of the APS pixels in the upper, lower, left and right directions by utilizing the gradient;
determining first weights in the horizontal direction and the vertical direction by utilizing the gradient variance of the interpolation map;
determining second weights in the horizontal direction and the vertical direction by using the interpolation map chromatic aberration variance;
And carrying out noise judgment on the direction of the initial weight according to the first weight and the second weight so as to determine a weight pattern of the missing pixels of the green channel in the first Bayer array subgraph.
Specifically, the gradient of the same color reflects the correlation of the center pixel with pixels in different directions, and is determined as follows. As shown in Fig. 6 (a schematic diagram of a Bayer array subgraph, taking the horizontal direction and one-dimensional coordinates as an example), the horizontal gradient of R_0 may be: δ_0 = |G_{-1} − G_{1}|; the left-direction gradient sum is Δ_L = δ_{-2} + δ_{-1} + δ_0; the right-direction gradient sum is Δ_R = δ_0 + δ_1 + δ_2. From the gradients of R_0, the left and right initial weights of R_0 can be determined. Taking the left direction as an example, the initial weight ω_L of R_0 in the left direction, and likewise the initial weight ω_R in the right direction, are given by formulas that are rendered only as images in the original.
Similarly, as shown in Fig. 7 (a schematic diagram of a Bayer array subgraph, taking the vertical direction and one-dimensional coordinates as an example), the vertical gradient of R_0 may be: δ_0 = |G_{-1} − G_{1}|, where G_{-1} is the pixel below R_0 and G_{1} is the pixel above R_0. From this, the initial weight ω_U of R_0 in the upward direction and the initial weight ω_D in the downward direction can be determined. The smaller the gradient, the more similar the pixels, and the greater the likelihood of being used for interpolation.
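For illustration only, the directional gradient sums Δ_L and Δ_R described above can be sketched in Python. Since the initial-weight formula appears only as an image in the original publication, the inverse-gradient weighting in `initial_weights` is an assumed choice (smaller gradient, larger weight), not the patent's exact formula:

```python
def directional_gradients(row, c):
    """Sum of same-color gradients delta_i = |row[c+i-1] - row[c+i+1]|
    over the left (i = -2, -1, 0) and right (i = 0, 1, 2) windows,
    following the delta_L / delta_R definitions in the text."""
    delta = lambda i: abs(row[c + i - 1] - row[c + i + 1])
    grad_l = delta(-2) + delta(-1) + delta(0)   # left-direction sum
    grad_r = delta(0) + delta(1) + delta(2)     # right-direction sum
    return grad_l, grad_r

def initial_weights(grad_l, grad_r):
    """Assumed inverse-gradient weighting, normalized so the two
    directional weights sum to 1."""
    w_l = 1.0 / (1.0 + grad_l)
    w_r = 1.0 / (1.0 + grad_r)
    s = w_l + w_r
    return w_l / s, w_r / s
```

On a flat neighborhood both gradient sums vanish and the two weights are equal, which matches the intuition that no direction is preferred.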
Specifically, the interpolation gradient is defined as the difference between interpolations in different directions; for example, the horizontal interpolation gradient of R_0 is the left interpolation minus the right interpolation. The interpolation-map gradient variances in the 4 directions are then calculated; the horizontal and vertical interpolation-map gradient variances are obtained by combining the direction weights on the CFA; finally, the first weights in the horizontal and vertical directions are obtained by normalizing the interpolation-map gradient variances. For example, as shown in Fig. 8, interpolation is first carried out on the neighborhood in the different directions, and the interpolations are then differenced to obtain the interpolation gradients. With the left and right candidate interpolations (reconstructed here from the difference given below; the original renders them as images)

gL_0 = G_{-1} + 0.5(R_0 − R_{-2}),
gR_0 = G_{1} + 0.5(R_0 − R_{2}),

one determines id_0 = gL_0 − gR_0 = G_{-1} − G_{1} + 0.5R_{2} − 0.5R_{-2}. The interpolation-map gradient variance of R_0 in the left direction is then defined accordingly:
(The left-direction variance formula is rendered as an image in the original.) Similarly, the interpolation-map gradient variances of R_0 in the right, upper and lower directions can be determined. For the first weight, the interpolation-map gradient variance of R_0 in the horizontal direction and in the vertical direction may each be obtained by combining the corresponding directional variances; after normalization, the first weight in the vertical direction and the first weight in the horizontal direction are obtained (these formulas likewise appear only as images in the original).
Specifically, as shown in Fig. 9, horizontal interpolation is first performed, and the interpolation color differences are then computed. The interpolation-map color-difference variance of R_0 in the left direction is defined by formulas that are rendered as images in the original. Similarly, the interpolation-map color-difference variances of R_0 in the right, upper and lower directions can be determined.
For a second weight; r is R 0 The interpolation map color difference variance in the horizontal direction may be: />
Figure BDA0003341394510000117
R 0 The interpolation map gradient variance in the vertical direction may be: />
Figure BDA0003341394510000118
Normalization processing is carried out, and a first weight in the vertical directionThe method comprises the following steps: />
Figure BDA0003341394510000119
The first weight in the horizontal direction is: />
Figure BDA00033413945100001110
Specifically, in order to determine whether noise exists, noise judgment is performed on the direction of the initial weight according to the first weight and the second weight. If the direction given by the statistics of the first weight and the second weight differs from the direction of the initial weight, the pixel is judged to be a noise pixel whose direction is wrong, and it is corrected using a statistical formula. The statistical formula for the corrected vertical-direction weight ω_V (rendered as an image in the original) operates on a neighborhood of weights, wherein ω_{0,0} is the center weight, and ω_{-1,-1}, ω_{-1,1}, ω_{1,-1} and ω_{1,1} are respectively the upper-left, upper-right, lower-left and lower-right weights of the center weight.
Based on this, the weight direction map of the green-channel-missing pixels in the first Bayer array subgraph is determined. Then, under the judgment of this accurate weight direction map, the complete green component is obtained for the green-channel-missing pixels in the first Bayer array subgraph (interpolating g at the R/B positions), yielding the first green channel image.
Finally, bilinear interpolation is performed on the red-channel-missing pixels and the blue-channel-missing pixels in the first Bayer array subgraph based on the first green channel image, to obtain a first red channel image and a first blue channel image respectively; the three-channel color information of the APS pixel positions is thus reconstructed. Interpolating B/R at R/B: for example, take R_{0,0} as the center pixel R, with B_{-1,-1}, B_{-1,1}, B_{1,-1} and B_{1,1} its upper-left, upper-right, lower-left and lower-right pixels B. At R_{0,0}, b_{0,0} is obtained in the color-difference plane by bilinear interpolation; the original gives the formula only as an image, and in the standard color-difference form it is:

b_{0,0} = g_{0,0} + 0.25 · [(B_{-1,-1} − g_{-1,-1}) + (B_{-1,1} − g_{-1,1}) + (B_{1,-1} − g_{1,-1}) + (B_{1,1} − g_{1,1})]

Interpolating b/r at G: for example, take G_{0,0} as the center pixel G, with B_{-1,0} the pixel B directly above it, R_{0,1} the pixel R to its right, B_{1,0} the pixel B directly below it, and R_{0,-1} the pixel R to its left. At G_{0,0}, bilinear interpolation gives (likewise reconstructed in the standard color-difference form):

b_{0,0} = g_{0,0} + 0.5 · [(B_{-1,0} − g_{-1,0}) + (B_{1,0} − g_{1,0})]

r_{0,0} = g_{0,0} + 0.5 · [(R_{0,1} − g_{0,1}) + (R_{0,-1} − g_{0,-1})]
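The color-difference interpolation of B at an R site described above can be sketched as follows. Because the original renders the formula only as an image, the standard color-difference bilinear form used here is a reconstruction from the surrounding text, and the function name is illustrative:

```python
def interp_b_at_r(g, cfa, y, x):
    """Estimate the missing B at an R site (y, x): average the color
    differences (B - g) of the four diagonal B neighbors in the CFA,
    then add back the already-reconstructed green value g[y][x]."""
    diffs = [cfa[y + dy][x + dx] - g[y + dy][x + dx]
             for dy in (-1, 1) for dx in (-1, 1)]
    return g[y][x] + sum(diffs) / 4.0
```

When the color difference B − g is locally constant, the estimate is exact, which is the assumption that makes color-difference-plane interpolation effective on natural images.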
As previously described, the method of the present application estimates the colors missing at the center pixel from the neighborhood of the color filter array, using the spatial correlation and spectral correlation of the image. Spatial correlation, i.e. intra-channel correlation, interpolates using information on the original color filter array, whereas spectral correlation can be understood as inter-channel correlation, i.e. color correlation. Color correlation is largely divided into two categories: color-difference maps (R−G, B−G) and color-ratio maps (R/G, B/G); color-difference maps are adopted in this embodiment. In other embodiments, color-ratio maps (R/G, B/G) may also be used.
Because the green components account for half of the pixels in the first Bayer array subgraph, the green component is interpolated first; after the complete first green channel image is obtained, bilinear interpolation of the red and blue components is carried out in the color-difference plane, which is accurate, simple and easy to implement. Interpolating r and b in the color-difference plane is equivalent to following the texture direction of the green component (g); interpolating accurate r and b from accurate g requires no additional direction judgment, effectively reducing algorithm complexity. The accuracy of the green-component reconstruction therefore plays a decisive role in the subsequent effects.
The key to interpolating the green component (g) is the accuracy of the interpolation direction: pre-interpolation is performed according to directional assumptions, and each direction is then given a different weight according to some criterion. Interpolating by weight combination can effectively reduce interpolation artifacts while accurately reconstructing texture, further improving the quality of the interpolated image. Compared with existing interpolation methods that select a single direction, this yields smoother interpolation results, particularly in flat areas, avoids horizontal and vertical pseudo-stripes, and benefits the processing of subsequent modules in the image system. As shown in Fig. 10, at each R/B position the horizontal interpolation g_H and the vertical interpolation g_V are computed (the interpolation may use the ACPI method); then the weights of the two directions are obtained from the gradients; finally the horizontal and vertical candidate interpolations are weighted to obtain the interpolation result g = ω_V·g_V + ω_H·g_H. As shown in Fig. 6, taking the current center pixel R as an example, the horizontal interpolation of R_0 is defined as: g_H = −0.25R_{-2} + 0.5G_{-1} + 0.5R_0 + 0.5G_{1} − 0.25R_{2}, where R and G are pixel values at the corresponding positions in the nonstandard sparse color filter array image, g represents the pixel value calculated by this formula, and the subscript denotes the offset from the current center position, negative to the left and positive to the right; e.g., G_{-1} is the pixel value G one position to the left in the nonstandard sparse color filter array image. As shown in Fig. 7, the vertical interpolation of R_0 is defined as: g_V = −0.25R_{-2} + 0.5G_{-1} + 0.5R_0 + 0.5G_{1} − 0.25R_{2}, where the subscript denotes the vertical offset; e.g., G_{-1} is the pixel value G one position above in the nonstandard sparse color filter array image.
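The ACPI candidate interpolation and the weighted blend above can be sketched as follows (function names are illustrative):

```python
def acpi_g(samples):
    """Candidate green at an R site from the 1-D neighborhood
    [R_-2, G_-1, R_0, G_1, R_2] along one direction:
    g = -0.25*R_-2 + 0.5*G_-1 + 0.5*R_0 + 0.5*G_1 - 0.25*R_2."""
    r_m2, g_m1, r0, g_p1, r_p2 = samples
    return -0.25 * r_m2 + 0.5 * g_m1 + 0.5 * r0 + 0.5 * g_p1 - 0.25 * r_p2

def blend_g(g_h, g_v, w_v):
    """Final green: g = w_V * g_V + (1 - w_V) * g_H."""
    return w_v * g_v + (1.0 - w_v) * g_h
```

Note that the five coefficients sum to 1, so a flat neighborhood is reproduced exactly; the −0.25 terms act as a Laplacian correction on the R samples.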
Since DVS pixels do not provide any color information, the three-channel color information of the APS pixels must be used to guide the color interpolation of the DVS pixels. In one embodiment, step S240 includes the following steps:
Step S241, downsampling the half color array image to obtain a second Bayer array subgraph of DVS pixels;
step S242, determining a weight pattern of each channel missing pixel in the second bayer array subgraph of the DVS pixel based on the three channel color information of the APS pixel position, so as to reconstruct the three channel color information of the DVS pixel position.
Specifically, as shown in Fig. 11, the downsampling of the half-color array images may employ a binning operation. For example, within a 2×2 window, a binning operation is performed on the three half-color array images PIC3 to obtain two types of second Bayer array subgraphs: one of the RGGB type and one of the GRBG type, denoted second Bayer array subgraph raw3 and second Bayer array subgraph raw4, respectively. The direction maps of the interpolation g at the R/B channel positions in raw3 and raw4 are calculated and denoted DMap1 and DMap2. The calculation of these weight direction maps is the same as the calculation of the interpolation g at the R/B positions in the Demosaic algorithm, and the specific process is not repeated. In other embodiments, the size of the window may be adjusted, which is not described here.
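The 2×2 binning operation referred to above can be sketched generically as follows; the routing of the three half-color images onto the RGGB/GRBG subgraphs is omitted, and average binning is assumed for the example:

```python
import numpy as np

def binning_2x2(img):
    """Non-overlapping 2x2 average binning: each output pixel is the
    mean of a 2x2 input window, halving each dimension (odd trailing
    rows/columns are dropped)."""
    h = img.shape[0] // 2 * 2
    w = img.shape[1] // 2 * 2
    v = img[:h, :w].astype(float)
    return (v[0::2, 0::2] + v[0::2, 1::2] + v[1::2, 0::2] + v[1::2, 1::2]) / 4.0
```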
In one embodiment, reconstructing three-channel color information for DVS pixel locations includes:
Interpolation is carried out on the pixels with the missing green channels in the second Bayer array subgraph based on the weight directional diagram of the pixels with the missing green channels in the second Bayer array subgraph of the DVS pixels by using a first weighted interpolation formula, so that a second green channel image is obtained;
interpolation is carried out on the pixels with the missing red channels in the second Bayer array subgraph based on the weight direction diagram of the pixels with the missing red channels in the second Bayer array subgraph of the DVS pixels by using a second weighted interpolation formula, so that a second red channel image is obtained;
interpolation is carried out on the pixels with the missing blue channels in the second Bayer array subgraph based on the weight direction diagram of the pixels with the missing blue channels in the second Bayer array subgraph of the DVS pixels by using a third weighted interpolation formula, so that a second blue channel image is obtained;
and reconstructing three-channel color information of the DVS pixel position according to the second green channel image, the second red channel image and the second blue channel image.
The three reconstructed channels of a DVS pixel are obtained by weighted interpolation of the 4 APS pixels in the horizontal and vertical directions of its neighborhood, using the weight direction map. Specifically, the interpolation direction weight at the R/B position is found in the second Bayer array subgraph corresponding to the DVS pixel after the binning operation. As shown in Fig. 11, in the first 2×2 window, only the R/B positions in the corresponding second Bayer array subgraph raw3 carry interpolation direction weights, so the interpolation of the DVS pixels at positions a and b in the first window is guided by the direction map corresponding to raw3. Similarly, in the second window, the interpolation of the DVS pixels at positions c and d is guided by the direction map corresponding to the second Bayer array subgraph raw4.
The r/g/b reconstruction values at the DVS pixels are obtained using weighted interpolation formulas. For example, for g_{0,0}, the center pixel g to be reconstructed: G_{-1,0} is the pixel G to its left; G_{0,1} is the pixel G directly below it; G_{1,0} is the pixel G to its right; G_{0,-1} is the pixel G directly above it.
The first weighted interpolation formula (the original renders it as an image; reconstructed here in the standard weighted-average form) is:

g_{0,0} = ω_h · (G_{-1,0} + G_{1,0}) / 2 + ω_v · (G_{0,-1} + G_{0,1}) / 2

wherein ω_h and ω_v are the horizontal and vertical weights obtained in the second Bayer array subgraph.
For r_{0,0}, the center pixel r to be reconstructed: R_{-1,0} is the pixel R to its left; R_{0,1} is the pixel R directly below it; R_{1,0} is the pixel R to its right; R_{0,-1} is the pixel R directly above it. The second weighted interpolation formula (likewise rendered as an image in the original; reconstructed in the same form) is:

r_{0,0} = ω_h · (R_{-1,0} + R_{1,0}) / 2 + ω_v · (R_{0,-1} + R_{0,1}) / 2
For b_{0,0}, the center pixel b to be reconstructed: B_{-1,0} is the pixel B to its left; B_{0,1} is the pixel B directly below it; B_{1,0} is the pixel B to its right; B_{0,-1} is the pixel B directly above it. The third weighted interpolation formula (likewise reconstructed) is:

b_{0,0} = ω_h · (B_{-1,0} + B_{1,0}) / 2 + ω_v · (B_{0,-1} + B_{0,1}) / 2

wherein R/G/B represent the pixel values of the three channels.
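The three weighted interpolation formulas can be sketched as one generic function. Since the original gives them only as images, the weighted-average form below is an assumption consistent with the surrounding description, and the function name is illustrative:

```python
def dvs_interp(left, right, up, down, w_h, w_v):
    """Weighted interpolation of one channel at a DVS site from its four
    APS neighbors: average each directional pair, then blend with the
    horizontal/vertical weights w_h, w_v (assumed w_h + w_v = 1),
    taken from the direction map of the second Bayer array subgraph."""
    return w_h * (left + right) / 2.0 + w_v * (up + down) / 2.0
```

The same call reconstructs r, g or b at the DVS position by passing in the four neighbors of the corresponding channel.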
And finally, filling three-channel color information of the reconstructed DVS pixel position into the corresponding position of the DVS pixel in the color filter array image to obtain a full-color image.
It should be noted that the steps illustrated in the above-described flow or flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order other than that illustrated herein.
In this embodiment, a device for demosaicing a non-standard sparse color filter array image is further provided, and the device is used to implement the foregoing embodiments and preferred embodiments, and is not described in detail. The terms "module," "unit," "sub-unit," and the like as used below may refer to a combination of software and/or hardware that performs a predetermined function. While the means described in the following embodiments are preferably implemented in software, implementations in hardware, or a combination of software and hardware, are also possible and contemplated.
Fig. 12 is a block diagram of the processing apparatus of the nonstandard sparse color filter array image of this embodiment, as shown in fig. 12, including: an acquisition module 210, a first reconstruction module 220, a first population module 230, a second reconstruction module 240, and a second population module 250;
An acquisition module 210, configured to acquire a nonstandard sparse color filter array image, where the color filter array image includes APS pixels and DVS pixels arranged to intersect the APS pixels;
a first reconstruction module 220, configured to reconstruct three channel color information of the APS pixel positions based on the single channel color information of the APS pixel positions in the color filter array image using an interpolation algorithm;
a first filling module 230, configured to fill three channel color information of the reconstructed APS pixel position into a corresponding position of the APS pixel in the color filter array image, to obtain at least three half-color array images;
a second reconstruction module 240, configured to determine an interpolation direction map for interpolating APS pixels at the DVS pixels based on the three-channel color information of the APS pixel positions of the half-color array image, so as to reconstruct the three-channel color information of the DVS pixel positions;
and a second filling module 250, configured to fill the three channel color information of the reconstructed DVS pixel location into the corresponding location of the DVS pixel in the color filter array image, so as to obtain a full-color image.
By the device, the problem that nonstandard sparse color filter array image reconstruction cannot be realized in the related art is solved; nonstandard sparse color filter array image reconstruction is realized, and the result is compatible with conventional ISP algorithms.
In one embodiment, on the basis of fig. 12, a downsampling module is further included;
the downsampling module is used for downsampling the full-color image to obtain a standard Bayer array image; the standard bayer array image is suitable for ISP algorithms.
In one embodiment, the first reconstruction module 220 includes a first downsampling unit, an evaluation unit, a first bilinear interpolation unit, a second bilinear interpolation unit, and a first reconstruction unit;
the first downsampling unit is used for downsampling the color filter array image to obtain a first Bayer array subgraph of APS pixels;
the evaluation unit is used for evaluating pixels with missing green channels in the first Bayer array subgraph based on the single-channel color information of the APS pixel position by utilizing an interpolation algorithm to obtain a first green channel image;
the first bilinear interpolation unit is used for carrying out bilinear interpolation on pixels with missing red channels in the first Bayer array subgraph based on the first green channel image by utilizing an interpolation algorithm to obtain a first red channel image;
the second bilinear interpolation unit is used for carrying out bilinear interpolation on pixels with missing blue channels in the first Bayer array subgraph based on the first green channel image by utilizing an interpolation algorithm to obtain a first blue channel image;
And the first reconstruction unit is used for reconstructing three-channel color information of the APS pixel position according to the first green channel image, the first red channel image and the first blue channel image.
In one embodiment, the evaluation unit is further configured to determine a weight pattern of the missing green channel pixels in the first bayer array pattern based on the single-channel color information of the APS pixel positions;
and interpolating the pixels with the missing green channels in the first Bayer array subgraph based on the weight direction diagram of the pixels with the missing green channels in the first Bayer array subgraph by using an interpolation formula to obtain a first green channel image.
In one embodiment, the interpolation formula is:
g=ω V g V +(1-ω V )g H
wherein g represents interpolation at R/B; g V Interpolation representing the vertical direction; g H Interpolation in the horizontal direction; omega V Weight in the vertical direction.
In one embodiment, the evaluation unit is further configured to determine a gradient of a green channel missing pixel in the first bayer array subgraph, and determine initial weights of the APS pixels in four directions, i.e., up, down, left, and right, using the gradient;
determining first weights in the horizontal direction and the vertical direction by utilizing the gradient variance of the interpolation map;
Determining second weights in the horizontal direction and the vertical direction by using the interpolation map chromatic aberration variance;
and carrying out noise judgment on the direction of the initial weight according to the first weight and the second weight so as to determine a weight direction diagram of the missing pixels of the green channel in the first Bayer array subgraph.
In one embodiment, the second reconstruction module 240 includes a second downsampling unit and a second reconstruction unit;
the second downsampling unit is used for downsampling the half-color array image to obtain a second Bayer array subgraph of DVS pixels;
and the second reconstruction unit is used for determining the weight direction diagram of each channel missing pixel in the second Bayer array subgraph of the DVS pixel based on the three channel color information of the APS pixel position so as to reconstruct the three channel color information of the DVS pixel position.
In one embodiment, the second reconstructing unit is further configured to interpolate the pixels with the missing green channel in the second bayer array sub-graph based on the weighted pattern of the pixels with the missing green channel in the second bayer array sub-graph of the DVS pixels by using the first weighted interpolation formula, to obtain a second green channel image;
interpolation is carried out on the pixels with the missing red channels in the second Bayer array subgraph based on the weight direction diagram of the pixels with the missing red channels in the second Bayer array subgraph of the DVS pixels by using a second weighted interpolation formula, so that a second red channel image is obtained;
Interpolating pixels with missing blue channels in a second Bayer array subgraph of the DVS pixels based on a weight pattern of the pixels with missing blue channels in the second Bayer array subgraph by using a third weighted interpolation formula to obtain a second blue channel image;
and reconstructing three-channel color information of the DVS pixel position according to the second green channel image, the second red channel image and the second blue channel image.
The above-described respective modules may be functional modules or program modules, and may be implemented by software or hardware. For modules implemented in hardware, the various modules described above may be located in the same processor; or the above modules may be located in different processors in any combination.
There is also provided in this embodiment a sensor comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the sensor may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
Alternatively, in the present embodiment, the above-described processor may be configured to execute the following steps by a computer program:
s1, acquiring a nonstandard sparse color filter array image, wherein the color filter array image comprises APS pixels and DVS pixels which are arranged in a crossing manner with the APS pixels;
s2, reconstructing three-channel color information of the APS pixel position based on the single-channel color information of the APS pixel position in the color filter array image by utilizing an interpolation algorithm;
s3, filling three-channel color information of the reconstructed APS pixel position into the corresponding position of the APS pixel in the color filter array image to obtain at least three half-color array images;
s4, determining an interpolation direction diagram of interpolation APS pixels at the DVS pixels based on three-channel color information of the APS pixel positions of the half-color array image so as to reconstruct three-channel color information of the DVS pixel positions;
and S5, filling three-channel color information of the reconstructed DVS pixel position into the corresponding position of the DVS pixel in the color filter array image to obtain a full-color image.
It should be noted that, specific examples in this embodiment may refer to examples described in the foregoing embodiments and alternative implementations, and are not described in detail in this embodiment.
In addition, in combination with the processing method of the nonstandard sparse color filter array image provided in the above embodiment, a storage medium may also be provided in this embodiment to implement. The storage medium has a computer program stored thereon; the computer program when executed by a processor implements the method of processing a non-standard sparse color filter array image of any of the above embodiments.
It should be understood that the specific embodiments described herein are merely illustrative of this application and are not intended to be limiting. All other embodiments that can be made by one of ordinary skill in the art without undue burden from the embodiments provided herein are within the scope of the present application.
It is to be understood that the drawings are merely examples or embodiments of the present application, and that it is within the purview of one of ordinary skill in the art to adapt the present application to other similar situations without the exercise of inventive faculty. In addition, it should be appreciated that while the development effort might be complex and lengthy, it would nevertheless be a routine undertaking of design, fabrication, or manufacture for those of ordinary skill having the benefit of this disclosure.
The term "embodiment" in this application means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive. It will be clear or implicitly understood by those of ordinary skill in the art that the embodiments described in this application can be combined with other embodiments without conflict.
The above examples only represent a few embodiments of the present application, which are described in more detail and are not to be construed as limiting the scope of the patent. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which fall within the protection scope of the present application. Accordingly, the scope of protection of the present application shall be subject to the appended claims.

Claims (11)

1. A method for processing a nonstandard sparse color filter array image, comprising:
acquiring a nonstandard sparse color filter array image, wherein the color filter array image comprises APS pixels and DVS pixels which are arranged in a crossing manner with the APS pixels;
Reconstructing three channel color information of the APS pixel position based on the single channel color information of the APS pixel position in the color filter array image by using an interpolation algorithm;
filling three-channel color information of the reconstructed APS pixel position into the corresponding position of the APS pixel in the color filter array image to obtain at least three half-color array images;
determining an interpolation pattern of interpolation APS pixels at the DVS pixels based on three-way color information for the APS pixel locations of the half-color array image to reconstruct three-way color information for the DVS pixel locations;
and filling the three-channel color information of the reconstructed DVS pixel position to the corresponding position of the DVS pixel in the color filter array image to obtain a full-color image.
2. The method of processing a non-standard sparse color filter array image of claim 1, further comprising, after obtaining the full-color image:
downsampling the full-color image to obtain a standard Bayer array image, the standard Bayer array image being suitable for ISP algorithms.
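One common way to realize the downsampling of claim 2 is to re-sample the full-color image onto a standard Bayer mosaic. A hedged sketch, assuming an RGGB phase (the claim fixes neither the phase nor the sampling ratio):

```python
import numpy as np

def full_color_to_bayer(rgb):
    """Sample an HxWx3 full-color image into a single-channel RGGB mosaic.

    The RGGB phase is an illustrative assumption; a real pipeline would
    match the phase expected by the downstream ISP demosaicing stage.
    """
    h, w, _ = rgb.shape
    bayer = np.empty((h, w), dtype=rgb.dtype)
    bayer[0::2, 0::2] = rgb[0::2, 0::2, 0]  # R at even rows / even cols
    bayer[0::2, 1::2] = rgb[0::2, 1::2, 1]  # G at even rows / odd cols
    bayer[1::2, 0::2] = rgb[1::2, 0::2, 1]  # G at odd rows / even cols
    bayer[1::2, 1::2] = rgb[1::2, 1::2, 2]  # B at odd rows / odd cols
    return bayer

# Flat test image: R=1, G=2, B=3 everywhere.
rgb = np.zeros((4, 4, 3))
rgb[..., 0], rgb[..., 1], rgb[..., 2] = 1.0, 2.0, 3.0
bayer = full_color_to_bayer(rgb)
```

The output is single-channel and follows the standard RGGB site pattern, which is what "suitable for ISP algorithms" requires.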
3. The method of processing a non-standard sparse color filter array image of claim 1, wherein reconstructing three-channel color information at the APS pixel positions based on the single-channel color information at the APS pixel positions in the color filter array image using an interpolation algorithm comprises:
downsampling the color filter array image to obtain a first Bayer array sub-image of the APS pixels;
estimating the pixels missing the green channel in the first Bayer array sub-image based on the single-channel color information at the APS pixel positions, using an interpolation algorithm, to obtain a first green channel image;
performing bilinear interpolation on the pixels missing the red channel in the first Bayer array sub-image based on the first green channel image, to obtain a first red channel image;
performing bilinear interpolation on the pixels missing the blue channel in the first Bayer array sub-image based on the first green channel image, to obtain a first blue channel image; and
reconstructing the three-channel color information at the APS pixel positions from the first green channel image, the first red channel image, and the first blue channel image.
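The R/B steps of claim 3 are often implemented as color-difference interpolation: interpolate C − G where channel C is known, then add the full green channel back. A minimal sketch under that assumption (the claim only states that the interpolation is bilinear and guided by the first green channel image; the 3x3 normalized average and all names here are illustrative):

```python
import numpy as np

def masked_mean_3x3(values, mask):
    """Average `values` over the valid 3x3 neighbourhood indicated by `mask`."""
    h, w = values.shape
    out = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            acc, cnt = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w and mask[yy, xx]:
                        acc += values[yy, xx]
                        cnt += 1
            out[y, x] = acc / cnt if cnt else 0.0
    return out

def interpolate_channel(sparse_c, c_mask, green):
    """Fill a sparse R or B channel by interpolating the C - G difference."""
    diff = np.where(c_mask, sparse_c - green, 0.0)
    diff_full = masked_mean_3x3(diff, c_mask)
    return np.where(c_mask, sparse_c, green + diff_full)

# Flat scene: green is 10 everywhere, red is 15 at its known (even/even) sites.
green = np.full((4, 4), 10.0)
red_mask = np.zeros((4, 4), dtype=bool)
red_mask[0::2, 0::2] = True
red_sparse = np.where(red_mask, 15.0, 0.0)
red_full = interpolate_channel(red_sparse, red_mask, green)
```

On this flat scene the color difference is constant (5), so the reconstructed red channel is 15 everywhere, which is the behavior a color-difference interpolator should show.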
4. The method of processing a non-standard sparse color filter array image of claim 3, wherein estimating the pixels missing the green channel in the first Bayer array sub-image based on the single-channel color information at the APS pixel positions using an interpolation algorithm, to obtain a first green channel image, comprises:
determining a weight direction map of the pixels missing the green channel in the first Bayer array sub-image, based on the single-channel color information at the APS pixel positions; and
interpolating the pixels missing the green channel in the first Bayer array sub-image based on the weight direction map, using an interpolation formula, to obtain the first green channel image.
5. The method of processing a non-standard sparse color filter array image of claim 4, wherein the interpolation formula is:
g = ω_V · g_V + (1 − ω_V) · g_H
wherein g denotes the green value interpolated at an R/B position; g_V denotes the interpolation in the vertical direction; g_H denotes the interpolation in the horizontal direction; and ω_V denotes the weight in the vertical direction.
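The formula of claim 5 blends a vertical and a horizontal estimate with a single weight. A direct transcription (the scalar values below are illustrative; in practice g_V and g_H come from directional filters over the mosaic):

```python
def weighted_directional_interpolation(g_v, g_h, w_v):
    """g = w_v * g_v + (1 - w_v) * g_h, the directional blend of claim 5."""
    return w_v * g_v + (1.0 - w_v) * g_h
```

When ω_V approaches 1 the estimate follows the vertical direction only; ω_V = 0.5 reduces to a plain average of the two directional estimates.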
6. The method of processing a non-standard sparse color filter array image of claim 4, wherein determining the weight direction map of the pixels missing the green channel in the first Bayer array sub-image based on the single-channel color information at the APS pixel positions comprises:
determining the gradient at each pixel missing the green channel in the first Bayer array sub-image, and determining initial weights of the APS pixels in the up, down, left, and right directions from the gradient;
determining first weights in the horizontal and vertical directions from the gradient variance of the interpolation map;
determining second weights in the horizontal and vertical directions from the color-difference variance of the interpolation map; and
performing noise judgment on the direction of the initial weights according to the first weights and the second weights, to determine the weight direction map of the pixels missing the green channel in the first Bayer array sub-image.
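Claim 6 derives directional weights from local gradients. A minimal sketch of one plausible rule, in which a large horizontal gradient (i.e. a vertical edge) pushes the weight toward vertical interpolation; the gradient-variance and color-difference-variance refinements of the claim are not reproduced here:

```python
import numpy as np

def vertical_weight(img, y, x, eps=1e-6):
    """Assumed gradient rule: weight the vertical estimate by how much
    stronger the horizontal gradient is than the vertical one."""
    grad_v = abs(float(img[y - 1, x]) - float(img[y + 1, x]))
    grad_h = abs(float(img[y, x - 1]) - float(img[y, x + 1]))
    return (grad_h + eps) / (grad_v + grad_h + 2.0 * eps)

# Vertical edge: columns 0-1 dark, columns 2-3 bright.
img = np.array([[0, 0, 100, 100]] * 4, dtype=float)
w_v = vertical_weight(img, 1, 1)
```

At the edge pixel the horizontal gradient dominates, so w_v is close to 1 and interpolation proceeds along the edge rather than across it.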
7. The method of processing a non-standard sparse color filter array image of any one of claims 1 to 6, wherein determining an interpolation direction map for interpolating APS pixels at the DVS pixels based on the three-channel color information at the APS pixel positions of the half-color array images, so as to reconstruct three-channel color information at the DVS pixel positions, comprises:
downsampling the half-color array images to obtain a second Bayer array sub-image of the DVS pixels; and
determining a weight direction map of the pixels missing each channel in the second Bayer array sub-image of the DVS pixels, based on the three-channel color information at the APS pixel positions, so as to reconstruct the three-channel color information at the DVS pixel positions.
8. The method of processing a non-standard sparse color filter array image of claim 7, wherein reconstructing the three-channel color information at the DVS pixel positions comprises:
interpolating the pixels missing the green channel in the second Bayer array sub-image of the DVS pixels based on the weight direction map of those pixels, using a first weighted interpolation formula, to obtain a second green channel image;
interpolating the pixels missing the red channel in the second Bayer array sub-image of the DVS pixels based on the weight direction map of those pixels, using a second weighted interpolation formula, to obtain a second red channel image;
interpolating the pixels missing the blue channel in the second Bayer array sub-image of the DVS pixels based on the weight direction map of those pixels, using a third weighted interpolation formula, to obtain a second blue channel image; and
reconstructing the three-channel color information at the DVS pixel positions from the second green channel image, the second red channel image, and the second blue channel image.
9. A processing apparatus for a non-standard sparse color filter array image, comprising: an acquisition module, a first reconstruction module, a first filling module, a second reconstruction module, and a second filling module; wherein
the acquisition module is configured to acquire a non-standard sparse color filter array image, wherein the color filter array image comprises APS pixels and DVS pixels arranged alternately with the APS pixels;
the first reconstruction module is configured to reconstruct three-channel color information at the APS pixel positions based on the single-channel color information at the APS pixel positions in the color filter array image, using an interpolation algorithm;
the first filling module is configured to fill the reconstructed three-channel color information of the APS pixel positions into the corresponding APS pixel positions in the color filter array image, to obtain at least three half-color array images;
the second reconstruction module is configured to determine an interpolation direction map for interpolating APS pixels at the DVS pixels, based on the three-channel color information at the APS pixel positions of the half-color array images, so as to reconstruct three-channel color information at the DVS pixel positions; and
the second filling module is configured to fill the reconstructed three-channel color information of the DVS pixel positions into the corresponding DVS pixel positions in the color filter array image, to obtain a full-color image.
10. A sensor, comprising a memory and a processor, wherein the memory stores a computer program, and the processor is configured to run the computer program to perform the method of processing a non-standard sparse color filter array image of any one of claims 1 to 8.
11. A computer-readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, implements the steps of the method of processing a non-standard sparse color filter array image of any one of claims 1 to 8.
CN202111309465.1A 2021-11-06 2021-11-06 Processing method and device for nonstandard sparse color filter array image Pending CN116091651A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111309465.1A CN116091651A (en) 2021-11-06 2021-11-06 Processing method and device for nonstandard sparse color filter array image


Publications (1)

Publication Number Publication Date
CN116091651A true CN116091651A (en) 2023-05-09

Family

ID=86201142


Country Status (1)

Country Link
CN (1) CN116091651A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination