CN111741277B - Image processing method and image processing device - Google Patents


Info

Publication number
CN111741277B
Authority
CN
China
Prior art keywords
image
full
pixel
rgb
sampling
Prior art date
Legal status
Active
Application number
CN202010670512.4A
Other languages
Chinese (zh)
Other versions
CN111741277A (en)
Inventor
李伟冲
张玮
程祥
Current Assignee
Shenzhen Goodix Technology Co Ltd
Original Assignee
Shenzhen Goodix Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Goodix Technology Co Ltd
Priority to CN202010670512.4A
Publication of CN111741277A
Application granted
Publication of CN111741277B


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/843 Demosaicing, e.g. interpolating colour pixel values
    • H04N23/88 Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Color Image Communication Systems (AREA)
  • Color Television Image Signal Generators (AREA)
  • Image Processing (AREA)

Abstract

The embodiment of the application provides an image processing method and an image processing device, wherein the method comprises the following steps: sub-sampling a WRGB image to obtain a first sampling image comprising R pixels, G pixels and B pixels and a second sampling image comprising W pixels; converting the first sampling image into a Bayer array image and interpolating the second sampling image into a full W pixel image; demosaicing the Bayer array image to generate a first RGB image; and performing panchromatic sharpening on the first RGB image according to the full W pixel image to generate a second RGB image. The method and device can convert a WRGB image into an RGB image and fuse the W channel into it, so that the generated RGB image preserves the luminance information of the W pixels while spatial detail information is injected, effectively improving image quality in dim light.

Description

Image processing method and image processing device
Technical Field
The present embodiments relate to the field of images, and more particularly, to a method of image processing and an image processing apparatus.
Background
White Red Green Blue (WRGB) is a four-color pixel design obtained by adding a white (W) pixel to the original Red Green Blue (RGB) layout. WRGB absorbs more light than RGB and therefore has higher sensitivity and better imaging performance, especially in low-light conditions.
Although WRGB improves sensitivity, it suffers from a decrease in spatial resolution, resulting in poor image quality.
Disclosure of Invention
The embodiments of the application provide an image processing method and an image processing device, which can convert a WRGB image into an RGB image and fuse the W channel into it, so that the generated RGB image preserves the luminance information of the W pixels while spatial detail information is injected, effectively improving image quality in dim light.
In a first aspect, a method of image processing is provided, the method comprising: sub-sampling a WRGB image to obtain a first sampling image comprising R pixels, G pixels and B pixels and a second sampling image comprising W pixels; converting the first sampling image into a Bayer array image and interpolating the second sampling image into a full W pixel image; demosaicing the Bayer array image to generate a first RGB image; and performing panchromatic sharpening processing on the first RGB image according to the full W pixel image to generate a second RGB image.
With reference to the first aspect, in an implementation manner of the first aspect, the performing a panchromatic sharpening process on the first RGB image according to the full-W pixel image to generate a second RGB image includes: and according to the full W pixel image, carrying out image restoration processing on the first RGB image subjected to panchromatic sharpening processing to generate a second RGB image.
With reference to the first aspect and the foregoing implementation manner of the first aspect, in another implementation manner of the first aspect, the performing panchromatic sharpening on the first RGB image according to the full-W pixel image to generate a second RGB image includes: and according to the full W pixel image, carrying out panchromatic sharpening processing on the first RGB image subjected to image restoration processing to generate the second RGB image.
With reference to the first aspect and the foregoing implementation manner, in another implementation manner of the first aspect, the image restoration processing includes automatic white balance (AWB), color correction matrix (CCM), gamma correction (Gamma), or tone curve (Tone Curve) processing.
With reference to the first aspect and the foregoing implementation manner of the first aspect, in another implementation manner of the first aspect, the converting the first sampling image into a Bayer array image includes: performing a mean-value operation on the first sampling image to obtain the Bayer array image.
With reference to the first aspect and the foregoing implementation manner of the first aspect, in another implementation manner of the first aspect, the demosaicing the bayer array image to generate a first RGB image includes: sub-sampling positions of R pixels, B pixels and G pixels in the Bayer array image respectively to obtain a third sampling image, a fourth sampling image and a fifth sampling image; interpolating the fifth sample map into a full G pixel image; acquiring a full R pixel image according to the full G pixel image and the third sampling image, and acquiring a full B pixel image according to the full G pixel image and the fourth sampling image; and generating the first RGB image according to the full G pixel image, the full R pixel image and the full B pixel image.
With reference to the first aspect and the foregoing implementation manner of the first aspect, in another implementation manner of the first aspect, the acquiring a full R pixel image according to the full G pixel image and the third sampling map, and acquiring a full B pixel image according to the full G pixel image and the fourth sampling map includes: performing guided filtering processing on the full G pixel image and the third sampling image to obtain an initial estimation image of the full R pixel image, and performing guided filtering processing on the full G pixel image and the fourth sampling image to obtain an initial estimation image of the full B pixel image; acquiring a first residual image between the third sampling image and the initial estimation image of the full R pixel image, and acquiring a second residual image between the fourth sampling image and the initial estimation image of the full B pixel image; performing linear interpolation on the first residual error map to obtain a first full-resolution residual error map, and performing linear interpolation on the second residual error map to obtain a second full-resolution residual error map; and performing a dot-adding operation on the initial estimation image of the full-R pixel image and the first full-resolution residual image to obtain the full-R pixel image, and performing a dot-adding operation on the initial estimation image of the full-B pixel image and the second full-resolution residual image to obtain the full-B pixel image.
With reference to the first aspect and the foregoing implementation manner of the first aspect, in another implementation manner of the first aspect, the performing panchromatic sharpening on the first RGB image according to the full-W pixel image to generate a second RGB image includes: the first RGB image is subjected to up-sampling to obtain a third RGB image; and generating the second RGB image according to the third RGB image and the full W pixel image.
With reference to the first aspect and the foregoing implementation manner of the first aspect, in another implementation manner of the first aspect, the generating the second RGB image according to the third RGB image and the full W pixel image includes: and conducting guiding filtering processing on the third RGB image and the full W pixel image to generate the second RGB image.
In a second aspect, there is provided an image processing apparatus comprising: a sampling unit, configured to sub-sample a WRGB image to acquire a first sampling image comprising R pixels, G pixels and B pixels and a second sampling image comprising W pixels; a Bayer array generating unit, configured to convert the first sampling image into a Bayer array image; an interpolation unit, configured to interpolate the second sampling image into a full W pixel image; a first RGB image generation unit, configured to perform demosaicing processing on the Bayer array image to generate a first RGB image; and a second RGB image generation unit, configured to perform panchromatic sharpening processing on the first RGB image according to the full W pixel image to generate a second RGB image.
With reference to the second aspect, in an implementation manner of the second aspect, the second RGB image generating unit is specifically configured to: and according to the full W pixel image, carrying out image restoration processing on the first RGB image subjected to panchromatic sharpening processing to generate a second RGB image.
With reference to the second aspect and the foregoing implementation manner of the second aspect, in another implementation manner of the second aspect, the second RGB image generating unit is specifically configured to: and according to the full W pixel image, carrying out panchromatic sharpening processing on the first RGB image subjected to image restoration processing to generate the second RGB image.
With reference to the second aspect and the foregoing implementation manner, in another implementation manner of the second aspect, the image restoration processing includes Automatic White Balance (AWB), Color Correction Matrix (CCM), gamma correction (Gamma), or tone curve (Tone Curve) processing.
With reference to the second aspect and the foregoing implementation manner of the second aspect, in another implementation manner of the second aspect, the bayer array generation unit is specifically configured to: and carrying out average value operation on the first sampling image to obtain the Bayer array image.
With reference to the second aspect and the foregoing implementation manner of the second aspect, in another implementation manner of the second aspect, the first RGB image generating unit is specifically configured to: sub-sampling positions of R pixels, B pixels and G pixels in the Bayer array image respectively to obtain a third sampling image, a fourth sampling image and a fifth sampling image; interpolating the fifth sample map into a full G pixel image; acquiring a full R pixel image according to the full G pixel image and the third sampling image, and acquiring a full B pixel image according to the full G pixel image and the fourth sampling image; and generating the first RGB image according to the full G pixel image, the full R pixel image and the full B pixel image.
With reference to the second aspect and the foregoing implementation manner of the second aspect, in another implementation manner of the second aspect, the first RGB image generating unit is specifically configured to: performing guided filtering processing on the full G pixel image and the third sampling image to obtain an initial estimation image of the full R pixel image, and performing guided filtering processing on the full G pixel image and the fourth sampling image to obtain an initial estimation image of the full B pixel image; acquiring a first residual image between the third sampling image and the initial estimation image of the full R pixel image, and acquiring a second residual image between the fourth sampling image and the initial estimation image of the full B pixel image; performing linear interpolation on the first residual error map to obtain a first full-resolution residual error map, and performing linear interpolation on the second residual error map to obtain a second full-resolution residual error map; and performing a dot-adding operation on the initial estimation image of the full-R pixel image and the first full-resolution residual image to obtain the full-R pixel image, and performing a dot-adding operation on the initial estimation image of the full-B pixel image and the second full-resolution residual image to obtain the full-B pixel image.
With reference to the second aspect and the foregoing implementation manner of the second aspect, in another implementation manner of the second aspect, the sampling unit is further configured to: the first RGB image is subjected to up-sampling to obtain a third RGB image; the second RGB image generating unit is specifically configured to: and generating the second RGB image according to the third RGB image and the full W pixel image.
With reference to the second aspect and the foregoing implementation manner of the second aspect, in another implementation manner of the second aspect, the second RGB image generating unit is specifically configured to: and conducting guiding filtering processing on the third RGB image and the full W pixel image to generate the second RGB image.
With reference to the second aspect and the foregoing implementation manner of the second aspect, in another implementation manner of the second aspect, the image processing apparatus further includes: and the photoelectric conversion unit is used for acquiring the WRGB image.
With reference to the second aspect and the foregoing implementation manner of the second aspect, in another implementation manner of the second aspect, the photoelectric conversion unit may adopt a Charge-Coupled Device (CCD) structure or a Complementary Metal-Oxide-Semiconductor (CMOS) structure.
In a third aspect, a chip is provided, which includes: and the processor is used for calling and running the computer program from the memory so that the device provided with the chip executes the method in the first aspect or each implementation mode thereof.
In a fourth aspect, there is provided an image processing apparatus comprising: a processor and a memory, wherein the memory is used for storing a computer program, and the processor is used for calling and running the computer program stored in the memory, and executing the method in the first aspect or each implementation manner thereof.
In a fifth aspect, a computer-readable storage medium is provided for storing a computer program, the computer program causing a computer to execute the method of the first aspect or its implementation modes.
Drawings
Fig. 1 is a system architecture diagram of an image processing apparatus according to an embodiment of the present application.
Fig. 2 is a schematic diagram of a bayer array image.
Fig. 3 is a schematic diagram of a WRGB image.
Fig. 4 is a schematic block diagram of a method of image processing according to an embodiment of the present application.
Fig. 5 is a process diagram of a method of image processing according to an embodiment of the present application.
Fig. 6 is a schematic diagram of each pixel value of the subsampling map in the embodiment of the present application.
Fig. 7 is a process diagram of the demosaicing process according to the embodiment of the present application.
Fig. 8 is a process diagram of a panchromatic sharpening process according to an embodiment of the present application.
Fig. 9(a), 9(b), and 9(c) show WRGB images of the same picture and effect diagrams obtained using the method of image processing according to the embodiment of the present application, respectively.
Fig. 10 shows a schematic block diagram of an image processing apparatus of an embodiment of the present application.
Fig. 11 shows a schematic block diagram of a chip of an embodiment of the application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
The image processing device converts the optical image of the imaging object into an electric signal in a corresponding proportional relation with the optical image by utilizing the photoelectric conversion function of the pixel array, and then obtains the image of the imaging object. Fig. 1 shows a schematic block diagram of an image processing apparatus 100, the image processing apparatus 100 may refer to any electronic device, for example, the image processing apparatus 100 may be a mobile phone; alternatively, the image processing apparatus 100 may be a part of an electronic device, for example, an image pickup module in the electronic device, and the embodiment of the present application is not limited thereto.
As shown in fig. 1, the image processing apparatus 100 generally includes a Pixel Array (Pixel Array)101 (or may also be referred to as a photoelectric conversion unit 101 or an image sensor 101), a signal reading circuit 102, a signal processor 103, a controller 104, an interface circuit 105, and a power supply 106. The electrical signal output end of the pixel array 101 is connected to the input end of the signal reading circuit 102, the control end of the pixel array 101 is connected to the output end of the controller 104, the output end of the signal reading circuit 102 is connected to the input end of the signal processor 103, and the power supply 106 is used for supplying power to the signal reading circuit 102, the signal processor 103, the controller 104, and the interface circuit 105.
The pixel array 101 is configured to collect an optical signal returned through an imaging object, convert the optical signal into an electrical signal, and reflect an optical image of the imaging object by the intensity of the electrical signal. The signal reading circuit 102 is for reading an electric signal output by each pixel. The signal processor 103 is configured to perform analog-to-digital conversion on the electric signals output by the pixel array, and output image data of an imaging object. The interface circuit 105 is used to transfer image data to the outside. The controller 104 is configured to output a control signal for controlling each pixel in the pixel array to work in cooperation.
The core component of the image processing apparatus 100 is the pixel array 101. The pixel structures in the pixel array 101 are similar: each generally includes a lens (or microlens), a Color Filter, and a photosensitive element, with the lens above the filter and the filter above the photosensitive element. Light returning from the imaging object is focused by the lens, exits the lens, is filtered by the color filter, and then reaches a photosensitive element such as a Photodiode (PD), which converts the optical signal into an electrical signal. Depending on the type of light transmitted by the different filters, the pixels may include red pixels (hereinafter R pixels), green pixels (hereinafter G pixels), and blue pixels (hereinafter B pixels). An R pixel is a photosensitive element that receives only red light after filtering; the G and B pixels work on the same principle and are not described again.
The principle by which the image sensor generates color image data is as follows: each pixel in the pixel array can convert only one type of optical signal into an electrical signal; the image color of the area covered by the current pixel is then restored by interpolation, combining the optical signals acquired by the surrounding pixels of the other types. This operation is called Demosaicing and is usually completed in a processor. For example, if the current pixel is an R pixel, it can convert only the red light signal into an electrical signal, so the intensities of blue and green light at the current pixel are restored by combining the electrical signals collected by the surrounding B and G pixels, thereby determining the image color of the current pixel.
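As a minimal illustration of this neighbour-combining interpolation (a simplified sketch only — practical demosaicing algorithms are usually edge-aware), the missing green value at an R pixel can be estimated as the mean of its four green neighbours:

```python
def interpolate_green_at_red(img, y, x):
    """Estimate the missing G value at an R pixel position (y, x) as the
    mean of the four horizontally/vertically adjacent G pixels.
    `img` is a 2D list of raw sensor values in a Bayer layout where the
    4-neighbours of an R pixel are all G pixels."""
    neighbors = [img[y - 1][x], img[y + 1][x], img[y][x - 1], img[y][x + 1]]
    return sum(neighbors) / len(neighbors)

# A hypothetical 3x3 crop centred on an R pixel; its 4-neighbours are G pixels.
crop = [
    [ 10, 100,  12],   # B G B
    [105,  80,  95],   # G R G
    [ 14, 120,  16],   # B G B
]
print(interpolate_green_at_red(crop, 1, 1))  # mean of 100, 120, 105, 95 -> 105.0
```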
Therefore, in order to acquire a color image, a color filter in a specific color arrangement, also called a Color Filter Array (CFA), must be disposed above the photosensitive element array included in the pixel array. Currently, for most pixel arrays, such as CCD and CMOS image sensors, the CFA employs the Bayer format based on the three RGB primary colors shown in fig. 2. The basic unit of the Bayer pattern is a 2 × 2 four-pixel array comprising one red pixel R, one blue pixel B, and two green pixels G, where the two green pixels are placed diagonally, sharing a common vertex. Since any pixel can only obtain a signal of one of the RGB colors, complete color information must be restored by a specific image processing algorithm.
This pure RGB Bayer layout allows only light of a specific color to pass through each pixel, i.e., it cuts off most of the photons, so the image may not be accurately restored in a low-light environment. Therefore, CFAs using the WRGB four-color pixel design shown in fig. 3 have emerged, which add to the RGB array a white pixel W that transmits light of all wavelengths. A WRGB CFA absorbs more light than the standard Bayer CFA and increases the amount of charge accumulated per pixel, giving it higher sensitivity and better imaging performance, especially in low-light conditions.
But a WRGB CFA also has problems: color resolution decreases, color noise increases, and image quality degrades. The embodiments of the present application therefore provide an image processing method to solve these problems.
It should be noted that WRGB refers to a format including W pixels, R pixels, G pixels, and B pixels, which is independent of the arrangement of the respective pixels, and may also be referred to as RGBW.
Fig. 4 shows a schematic block diagram of a method 200 of image processing according to an embodiment of the present application. The method 200 may be performed by the image processing apparatus 100 shown in fig. 1, and in particular by the signal processor 103 in the image processing apparatus 100. As shown in fig. 4, the method 200 may include some or all of the following:
s210, sub-sampling is carried out on the WRGB image, and a first sampling image comprising R pixels, G pixels and B pixels and a second sampling image comprising W pixels are obtained.
And S220, converting the first sampling image into a Bayer array image.
And S230, interpolating the second sampling image into a full W pixel image.
And S240, performing demosaicing processing on the Bayer array image to generate a first RGB image.
And S250, carrying out panchromatic sharpening processing on the first RGB image according to the full W pixel image to generate a second RGB image.
In the embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not limit the implementation process of the embodiments of the present application.
The image sensor (i.e., the pixel array or photoelectric conversion unit) in the embodiments of the present application may include a photosensitive cell array and a filter cell array as shown in fig. 3, where the filter cells correspond one-to-one to the photosensitive cells. A green filter cell passes green light, a red filter cell passes red light, a blue filter cell passes blue light, and a white filter cell passes white light; correspondingly, the photosensitive cells under the green, blue, red, and white filter cells output G, B, R, and W pixels respectively. The image format collected by the image sensor is therefore WRGB.
The image sensor may transmit the acquired WRGB image to the processor, which performs the process described above. Fig. 5 shows the architecture of an image processing flow of an embodiment of the present application as executed by the processor. Specifically, with reference to fig. 4 and 5, in S210 the processor sub-samples the acquired WRGB image to obtain a first sampling image containing only B, R, and G pixels and a second sampling image containing only W pixels. The positions of the B, R, and G pixels in the first sampling image are the positions of the corresponding pixels in the WRGB image, and likewise the positions of the W pixels in the second sampling image are their positions in the WRGB image.
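The sub-sampling of S210 can be sketched as follows; the 2×2 repeating layout used here is hypothetical and only illustrates the position-preserving split (`None` marks positions not selected by the respective map):

```python
def subsample(wrgb, layout):
    """Split a WRGB mosaic into a first sampling map (R/G/B pixels only)
    and a second sampling map (W pixels only), preserving positions.
    `layout` gives the colour ('W', 'R', 'G' or 'B') at each position."""
    h, w = len(wrgb), len(wrgb[0])
    first = [[wrgb[y][x] if layout[y][x] != 'W' else None for x in range(w)]
             for y in range(h)]
    second = [[wrgb[y][x] if layout[y][x] == 'W' else None for x in range(w)]
              for y in range(h)]
    return first, second

# Hypothetical 2x2 layout with W on one diagonal.
layout = [['W', 'B'], ['B', 'W']]
wrgb = [[200, 30], [32, 210]]
first, second = subsample(wrgb, layout)
print(first)   # [[None, 30], [32, None]]
print(second)  # [[200, None], [None, 210]]
```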
Optionally, in S220, the processor may convert the first sampling image into a Bayer array image; for example, the processor may perform a mean-value operation on the first sampling image to obtain the Bayer array image. Assuming the pixel values in the first sampling image are as shown in fig. 6, the pixel values in the converted Bayer array image may respectively be (B1,2 + B2,1)/2, (G1,4 + G2,3)/2, (G3,2 + G4,1)/2, and (R3,4 + R4,3)/2.
Optionally, the processor may instead perform another operation on the first sampling image, such as a maximum-value operation, to obtain the Bayer array image. In that case, the pixel values in the converted Bayer array image may respectively be max{B1,2, B2,1}, max{G1,4, G2,3}, max{G3,2, G4,1}, and max{R3,4, R4,3}.
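Both variants reduce two same-colour samples per cell to a single Bayer pixel. A minimal sketch (the diagonal pairing of the two same-colour pixels follows the Fig. 6-style arrangement assumed here):

```python
def reduce_cell(pair, mode="mean"):
    """Combine the two same-colour samples of one 2x2 cell into a single
    Bayer pixel, by averaging (the mean-value operation) or by taking the
    maximum (the alternative operation described in the text)."""
    a, b = pair
    return (a + b) / 2 if mode == "mean" else max(a, b)

# Example pair, e.g. B1,2 with B2,1 from a Fig.6-style sampling map.
print(reduce_cell((40, 60)))          # mean -> 50.0
print(reduce_cell((40, 60), "max"))   # max  -> 60
```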
Optionally, in S230, the processor may interpolate the second sampling image into a full W pixel image. That is, by interpolating W values at the positions occupied by non-W pixels in the WRGB image, a full-resolution luminance image can be obtained.
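A minimal sketch of this interpolation, assuming the W pixels lie on a checkerboard and each missing position is filled with the mean of its available 4-neighbours (real pipelines may use directional, edge-aware interpolation instead):

```python
def interpolate_w(w_map):
    """Fill the empty (None) positions of the W sampling map with the mean
    of the available 4-neighbour W values, yielding a full W pixel image."""
    h, w = len(w_map), len(w_map[0])
    out = [row[:] for row in w_map]
    for y in range(h):
        for x in range(w):
            if out[y][x] is None:
                vals = [w_map[ny][nx]
                        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                        if 0 <= ny < h and 0 <= nx < w and w_map[ny][nx] is not None]
                out[y][x] = sum(vals) / len(vals)
    return out

# W pixels on a checkerboard; None marks non-W positions to be filled.
w_map = [[100, None], [None, 120]]
print(interpolate_w(w_map))  # [[100, 110.0], [110.0, 120]]
```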
As shown in fig. 5, since the converted Bayer array image is a low-resolution image, it may be processed further. Optionally, in S240, the processor may perform Demosaicing on the converted Bayer array image to obtain a low-resolution first RGB image.
Optionally, in this embodiment of the present application, the positions of the R, B, and G pixels in the Bayer array image are sub-sampled respectively to obtain a third sampling image, a fourth sampling image, and a fifth sampling image; the fifth sampling image is interpolated into a full G pixel image; a full R pixel image is acquired from the full G pixel image and the third sampling image, and a full B pixel image is acquired from the full G pixel image and the fourth sampling image.
The specific flow of the demosaicing process will be described in detail below with reference to fig. 7. The input of the demosaicing processing is a Bayer array image, and the output is a first RGB image.
Step 1: the Bayer array image may be sub-sampled at the R, B, and G pixel positions respectively to obtain the corresponding sampling maps, i.e., the third, fourth, and fifth sampling maps, denoted Rs, Bs, and Gs respectively.
Step 2: for GsAnd performing linear interpolation to obtain an interpolated full G pixel image which is marked as G.
Step 3: acquire the full R pixel image from G and Rs, denoted R, and acquire the full B pixel image from G and Bs, denoted B. Specifically:
step 3.1: using G as guide map, for G and RsObtaining an initial estimation image of the R channel full resolution ratio, namely an initial estimation image of the full R pixel image by using a guiding filtering algorithm, and recording the initial estimation image as
Figure GDA0003448840820000091
Using G as guide map, for G and BsUsing a Guided Filter (Guided Filter) algorithm to obtain an initial estimation image of the B channel full resolution, i.e. an initial estimation image of a full B pixel image, which is recorded as
Figure GDA0003448840820000092
Alternatively, the guided filtering algorithm in step 3.1 may be replaced by other algorithms, such as joint bilateral filtering or Hybrid Color Mapping (HCM).
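The guided filtering of step 3.1 fits, over each local window, a linear model between the guide and the source. A minimal 1D sketch (pure Python; the radius and eps values are arbitrary illustration choices, and real pipelines operate in 2D):

```python
def box_mean(a, r):
    """Mean of `a` over a window of radius r (1D, clamped at the borders)."""
    n = len(a)
    return [sum(a[max(0, i - r):min(n, i + r + 1)]) /
            len(a[max(0, i - r):min(n, i + r + 1)]) for i in range(n)]

def guided_filter(guide, src, r=1, eps=1e-4):
    """1D guided filter: the output is locally a linear function
    a*guide + b of the guide, with a and b fitted over each window."""
    mean_i = box_mean(guide, r)
    mean_p = box_mean(src, r)
    corr_ip = box_mean([i * p for i, p in zip(guide, src)], r)
    corr_ii = box_mean([i * i for i in guide], r)
    a = [(cip - mi * mp) / (cii - mi * mi + eps)
         for cip, mi, mp, cii in zip(corr_ip, mean_i, mean_p, corr_ii)]
    b = [mp - ai * mi for mp, ai, mi in zip(mean_p, a, mean_i)]
    mean_a = box_mean(a, r)
    mean_b = box_mean(b, r)
    return [ma * i + mb for ma, i, mb in zip(mean_a, guide, mean_b)]

guide = [1.0, 2.0, 3.0, 4.0, 5.0]
src = [2.0, 4.0, 6.0, 8.0, 10.0]   # exactly 2 * guide
out = guided_filter(guide, src)
print([round(v, 2) for v in out])  # close to src, since src is linear in guide
```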
Step 3.2: the residual error deltaR is calculated at the R channel position and is expressed mathematically as
Figure GDA0003448840820000093
MRRepresenting a binary diagram, wherein the value of the R channel position is 1, and the values of other positions are 0; the residual error Delta B is calculated at the position of the B channel and is expressed mathematically as
Figure GDA0003448840820000094
MBThe binary map is shown, and the value is 1 at the B channel position and 0 at the other positions.
Step 3.3: perform linear interpolation on the residual map ΔR obtained in step 3.2 to obtain the interpolated full-resolution residual map, denoted ΔR′; perform linear interpolation on the residual map ΔB obtained in step 3.2 to obtain the interpolated full-resolution residual map, denoted ΔB′.
Step 3.4: the full-resolution residual map ΔR_full obtained in step 3.3 and the initial estimation map R_0 of the full R pixel image obtained in step 3.1 are added point-wise to obtain the interpolated R channel image, i.e. the full R pixel image R = R_0 + ΔR_full. Likewise, the full-resolution residual map ΔB_full obtained in step 3.3 and the initial estimation map B_0 of the full B pixel image obtained in step 3.1 are added point-wise to obtain the interpolated B channel image, i.e. the full B pixel image B = B_0 + ΔB_full.
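Steps 3.2–3.4 can be summarized per channel as: compute the residual at the sampled sites, spread it to full resolution, then add it back to the initial estimate point-wise. A hedged numpy sketch follows; a constant mean fill stands in for the patent's linear interpolation of the residual, and `refine_channel` is an illustrative name.

```python
import numpy as np

def refine_channel(est, samples, mask):
    """Residual refinement for one channel (steps 3.2-3.4).

    est     : initial full-resolution estimate from step 3.1
    samples : sparse Bayer samples of this channel, valid where mask is True
    mask    : binary map (M_R or M_B), True at the channel's Bayer positions
    """
    # step 3.2: residual only at the sampled positions
    delta = np.where(mask, samples - est, 0.0)
    # step 3.3: spread the residual to every pixel; a constant mean fill is
    # used here as a simple stand-in for linear interpolation
    fill = delta[mask].mean() if mask.any() else 0.0
    delta_full = np.where(mask, delta, fill)
    # step 3.4: point addition of initial estimate and residual map
    return est + delta_full
```

The point of the residual formulation is that any bias of the initial guided-filter estimate is corrected exactly at the measured positions and smoothly in between.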
Step 4: combine the full G pixel image obtained in step 2 with the full R pixel image and the full B pixel image obtained in step 3 to obtain the first RGB image.
The first RGB image obtained by the demosaicing processing is a low-resolution image; further processing of the first RGB image is therefore required in order to obtain the full-resolution second RGB image.
Alternatively, in S250, the processor may perform panchromatic sharpening (pansharpening) on the demosaiced first RGB image according to the full W pixel image, so as to obtain a full-resolution second RGB image. Panchromatic sharpening is also referred to as image fusion.
Optionally, in this embodiment of the present application, the panchromatic sharpening process may include: up-sampling the first RGB image to obtain a third RGB image; and generating the second RGB image according to the third RGB image and the full W pixel image.
The specific flow of the panchromatic sharpening will be described in detail below with reference to fig. 8. The input is the low-resolution first RGB image, denoted as RGB_lr, and the full W pixel image, denoted as W; the output is the full-resolution second RGB image, denoted as RGB.
Step 1: bilinear interpolation may be performed on RGB_lr to obtain an initial estimation map of the up-sampled RGB, denoted as RGB_hr; this initial estimate is a full-resolution image.
Step 2: W may be used as the guide map, and a guided filtering operation is performed on each of the R, G and B channels of RGB_hr to obtain RGB. Specifically, taking the R channel as an example, this is expressed mathematically as:
R_i,j = a_p,q · W_i,j + b_p,q
a_p,q = cov_ω(W, R_hr) / (var_ω(W) + ε)
b_p,q = mean_ω(R_hr) − a_p,q · mean_ω(W)
Similarly, taking the B channel as an example:
B_i,j = a_p,q · W_i,j + b_p,q
a_p,q = cov_ω(W, B_hr) / (var_ω(W) + ε)
b_p,q = mean_ω(B_hr) − a_p,q · mean_ω(W)
Similarly, taking the G channel as an example:
G_i,j = a_p,q · W_i,j + b_p,q
a_p,q = cov_ω(W, G_hr) / (var_ω(W) + ε)
b_p,q = mean_ω(G_hr) − a_p,q · mean_ω(W)
where R_i,j, B_i,j and G_i,j are the pixel values at position (i, j) of the R, B and G pixel images in RGB; R_hr_i,j, B_hr_i,j and G_hr_i,j are the pixel values at position (i, j) of the R, B and G pixel images in RGB_hr; W_i,j is the pixel value at position (i, j) in W; ω_p,q denotes a window of size p × q over which the means, covariance cov_ω and variance var_ω are computed; and ε is the regularization constant of the guided filtering algorithm.
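When the window ω_p,q is taken to be the whole image, the per-window linear fit above collapses to a single least-squares fit of each channel against W. The sketch below uses that global simplification purely for illustration; `pansharpen_channel`, `pansharpen` and the ε = 1e-6 regularizer are illustrative names and values, not from the patent.

```python
import numpy as np

def pansharpen_channel(W_img, ch_hr):
    """Fit ch_hr ≈ a*W + b by least squares over the whole image (one global
    window standing in for the per-window ω_p,q fit), then rebuild the
    channel from the full-resolution W so it inherits W's spatial detail."""
    w, c = W_img.ravel(), ch_hr.ravel()
    a = ((w * c).mean() - w.mean() * c.mean()) / (w.var() + 1e-6)
    b = c.mean() - a * w.mean()
    return a * W_img + b

def pansharpen(W_img, rgb_hr):
    """Apply the fit to each of the R, G and B channels (step 2)."""
    return np.stack([pansharpen_channel(W_img, rgb_hr[..., k])
                     for k in range(3)], axis=-1)
```

In the actual method the fit is local, so a and b vary across the image and the high-frequency detail of W is injected window by window rather than globally.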
Therefore, the image processing method of the embodiment of the present application can convert a WRGB image into an RGB image and fuse the W image into the RGB image, so that spatial detail information is injected into the processed RGB image while the luminance information of W is better maintained, which can effectively improve image quality under low-light conditions.
Optionally, in this embodiment of the present application, color reproduction (also referred to as image restoration) processing may further be performed on the processed RGB image. For example, the first RGB image may be subjected to image restoration processing, and the image-restored first RGB image may then be subjected to panchromatic sharpening processing based on the full W pixel image, thereby obtaining the second RGB image. This scheme is convenient, flexible and simple, and can be adapted to an existing Image Signal Processor (ISP) or image processing chip.
For another example, the first RGB image may be subjected to full-color sharpening based on the full-W pixel image, and then subjected to image restoration processing, thereby obtaining the second RGB image. The scheme is more beneficial to color restoration of the RGB image.
Fig. 9(a), 9(b) and 9(c) respectively show a WRGB image of the same scene and the effect diagrams obtained using the technical solutions of the embodiments of the present application. Fig. 9(a) is the WRGB image, fig. 9(b) is an RGB image obtained by performing image restoration processing first and then panchromatic sharpening processing, and fig. 9(c) is an RGB image obtained by performing panchromatic sharpening processing first and then image restoration processing.
Having described the method of image processing according to the embodiment of the present application in detail above, an image processing apparatus according to an embodiment of the present application will be described below with reference to fig. 10, and the technical features described in the method embodiment are applicable to the following apparatus embodiments.
Fig. 10 shows a schematic block diagram of an image processing apparatus 300 according to an embodiment of the present application, and as shown in fig. 10, the image processing apparatus 300 includes:
the sampling unit 310 is configured to sub-sample the WRGB image, and acquire a first sampling diagram including R pixels, G pixels, and B pixels, and a second sampling diagram including W pixels.
A bayer array generating unit 320 for converting the first sampling pattern into a bayer array image.
An interpolation unit 330, configured to interpolate the second sample map into a full W pixel image.
The first RGB image generation unit 340 is configured to perform demosaicing on the bayer array image to generate a first RGB image.
A second RGB image generating unit 350, configured to perform a full-color sharpening process on the first RGB image according to the full-W pixel image to generate a second RGB image.
Therefore, the image processing device of the embodiment of the application can convert the WRGB image into the RGB image, and perform image fusion on the RGB image, so that the generated RGB image can better maintain the luminance information of W while injecting spatial detail information, and thus the image quality under dark light can be effectively improved.
Optionally, in this embodiment of the application, the second RGB image generating unit is specifically configured to: and performing image restoration processing on the first RGB image subjected to panchromatic sharpening processing according to the full W pixel image to generate the second RGB image.
Optionally, in this embodiment of the application, the second RGB image generating unit is specifically configured to: and carrying out panchromatic sharpening processing on the first RGB image subjected to image restoration processing according to the full W pixel image so as to generate the second RGB image.
Alternatively, in the embodiment of the present application, the image restoration processing includes automatic white balance (AWB), color correction matrix (CCM), gamma, or tone curve.
Optionally, in this embodiment of the present application, the bayer array generating unit is specifically configured to: and carrying out average value operation on the first sampling image to obtain the Bayer array image.
Optionally, in this embodiment of the application, the first RGB image generating unit is specifically configured to: respectively sub-sampling the positions of R pixels, B pixels and G pixels in the Bayer array image to obtain a third sampling image, a fourth sampling image and a fifth sampling image; interpolating the fifth sample map into a full G pixel image; and acquiring a full R pixel image according to the full G pixel image and the third sampling image, and acquiring a full B pixel image according to the full G pixel image and the fourth sampling image.
Optionally, in this embodiment of the present application, the first RGB image unit is specifically configured to: performing guided filtering processing on the full G pixel image and the third sampling image to obtain an initial estimation image of the full R pixel image, and performing guided filtering processing on the full G pixel image and the fourth sampling image to obtain an initial estimation image of the full B pixel image; acquiring a first residual image between the third sampling image and the initial estimation image of the full R pixel image, and acquiring a second residual image between the fourth sampling image and the initial estimation image of the full B pixel image; performing linear interpolation on the first residual error map to obtain a first full-resolution residual error map, and performing linear interpolation on the second residual error map to obtain a second full-resolution residual error map; and performing a dot-adding operation on the initial estimation image of the full-R pixel image and the first full-resolution residual image to obtain the full-R pixel image, and performing a dot-adding operation on the initial estimation image of the full-B pixel image and the second full-resolution residual image to obtain the full-B pixel image.
Optionally, in this embodiment of the present application, the sampling unit is further configured to: up-sampling the first RGB image to obtain a third RGB image; the second RGB image generation unit is specifically configured to: and generating the second RGB image according to the third RGB image and the full W pixel image.
Optionally, in this embodiment of the application, the second RGB image generating unit is specifically configured to: perform guided filtering processing on the third RGB image and the full W pixel image to generate the second RGB image.
Optionally, in an embodiment of the present application, the image processing apparatus further includes: and the photoelectric conversion unit is used for acquiring the WRGB image.
Alternatively, in the embodiment of the present application, the photoelectric conversion unit is applied to a charge coupled device CCD structure or a complementary metal oxide semiconductor CMOS structure.
Fig. 11 is a schematic structural diagram of an image processing apparatus 400 according to an embodiment of the present application. The image processing apparatus 400 shown in fig. 11 includes a processor 410, and the processor 410 can call and run a computer program from a memory to implement the method in the embodiment of the present application.
Optionally, as shown in fig. 11, the image processing apparatus 400 may further include a memory 420. From the memory 420, the processor 410 can call and run a computer program to implement the method in the embodiment of the present application.
The memory 420 may be a separate device from the processor 410, or may be integrated into the processor 410.
Optionally, the image processing apparatus 400 may specifically be an image processing apparatus according to this embodiment, and the image processing apparatus 400 may implement a corresponding process implemented by the image processing apparatus in each method according to this embodiment, which is not described herein again for brevity.
The embodiment of the present application further provides a chip, where the chip includes a processor, and the processor can call and run a computer program from a memory to implement the method in the embodiment of the present application.
Optionally, the chip may be applied to the image processing apparatus in the embodiment of the present application, and the chip may implement a corresponding process implemented by the image processing apparatus in each method in the embodiment of the present application, and for brevity, no further description is given here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as a system-level chip, a chip system, or a system-on-chip (SoC), etc.
Optionally, the present application further provides a computer-readable medium for storing a computer program to implement the method in the present application.
It should be understood that the processor of the embodiments of the present application may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method embodiments may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The processor may be a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the methods disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in RAM, flash memory, ROM, PROM or EEPROM, registers, or other storage media well known in the art. The storage medium is located in a memory, and a processor reads information from the memory and completes the steps of the method in combination with its hardware.
It will be appreciated that the memory in the embodiments of the subject application can be either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory. The non-volatile Memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash Memory. Volatile Memory can be Random Access Memory (RAM), which acts as external cache Memory. By way of example, but not limitation, many forms of RAM are available, such as Static random access memory (Static RAM, SRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic random access memory (Synchronous DRAM, SDRAM), Double Data Rate Synchronous Dynamic random access memory (DDR SDRAM), Enhanced Synchronous SDRAM (ESDRAM), Synchronous link SDRAM (SLDRAM), and Direct Rambus RAM (DR RAM). It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (23)

1. A method of image processing, comprising:
sub-sampling the WRGB image to obtain a first sampling image comprising R pixels, G pixels and B pixels and a second sampling image comprising W pixels;
converting the first sample map into a bayer array image and interpolating the second sample map into a full W pixel image;
demosaicing the Bayer array image to generate a first RGB image, wherein the first RGB image comprises a smaller number of pixels than the WRGB image;
and carrying out panchromatic sharpening processing on the first RGB image according to the full W pixel image so as to generate a second RGB image.
2. The method of claim 1, wherein the panchromatic sharpening of the first RGB image from the full W pixel image to generate a second RGB image comprises:
and according to the full W pixel image, carrying out image restoration processing on the first RGB image subjected to panchromatic sharpening processing to generate a second RGB image.
3. The method of claim 1, wherein the panchromatic sharpening of the first RGB image from the full W pixel image to generate a second RGB image comprises:
and according to the full W pixel image, carrying out panchromatic sharpening processing on the first RGB image subjected to image restoration processing to generate the second RGB image.
4. The method according to claim 2 or 3, wherein the image restoration process comprises automatic white balance AWB, color correction matrix CCM, gamma, or tone curve.
5. The method of any of claims 1 to 3, wherein said converting the first sample pattern into a Bayer array image comprises:
and carrying out average value operation on the first sampling image to obtain the Bayer array image.
6. The method of any one of claims 1 to 3, wherein said demosaicing said Bayer array image to generate a first RGB image comprises:
sub-sampling positions of R pixels, B pixels and G pixels in the Bayer array image respectively to obtain a third sampling image, a fourth sampling image and a fifth sampling image;
interpolating the fifth sample map into a full G pixel image;
acquiring a full R pixel image according to the full G pixel image and the third sampling image, and acquiring a full B pixel image according to the full G pixel image and the fourth sampling image;
and generating the first RGB image according to the full G pixel image, the full R pixel image and the full B pixel image.
7. The method of claim 6, wherein the obtaining a full R pixel image from the full G pixel image and the third sample map and obtaining a full B pixel image from the full G pixel image and the fourth sample map comprises:
performing guided filtering processing on the full G pixel image and the third sampling image to obtain an initial estimation image of the full R pixel image, and performing guided filtering processing on the full G pixel image and the fourth sampling image to obtain an initial estimation image of the full B pixel image;
acquiring a first residual image between the third sampling image and the initial estimation image of the full R pixel image, and acquiring a second residual image between the fourth sampling image and the initial estimation image of the full B pixel image;
performing linear interpolation on the first residual error map to obtain a first full-resolution residual error map, and performing linear interpolation on the second residual error map to obtain a second full-resolution residual error map;
and performing a dot-adding operation on the initial estimation image of the full-R pixel image and the first full-resolution residual image to obtain the full-R pixel image, and performing a dot-adding operation on the initial estimation image of the full-B pixel image and the second full-resolution residual image to obtain the full-B pixel image.
8. The method of any of claims 1-3, wherein said panchromatic sharpening of the first RGB image from the full W pixel image to generate a second RGB image comprises:
the first RGB image is subjected to up-sampling to obtain a third RGB image;
and generating the second RGB image according to the third RGB image and the full W pixel image.
9. The method of claim 8, wherein generating the second RGB image from the third RGB image and the full W pixel image comprises:
and conducting guiding filtering processing on the third RGB image and the full W pixel image to generate the second RGB image.
10. An image processing apparatus characterized by comprising:
the sampling unit is used for sub-sampling the WRGB image to acquire a first sampling image comprising R pixels, G pixels and B pixels and a second sampling image comprising W pixels;
a bayer array generating unit configured to convert the first sampling pattern into a bayer array image;
an interpolation unit for interpolating the second sample map into a full W pixel image;
a first RGB image generation unit configured to perform demosaicing processing on the bayer array image to generate a first RGB image, where the first RGB image includes a smaller number of pixels than the WRGB image;
and the second RGB image generation unit is used for carrying out panchromatic sharpening processing on the first RGB image according to the full W pixel image so as to generate a second RGB image.
11. The image processing apparatus according to claim 10, wherein the second RGB image generating unit is specifically configured to:
and according to the full W pixel image, performing image restoration processing on the first RGB image subjected to panchromatic sharpening processing to generate a second RGB image.
12. The image processing apparatus according to claim 10, wherein the second RGB image generating unit is specifically configured to:
and according to the full W pixel image, carrying out panchromatic sharpening processing on the first RGB image subjected to image restoration processing to generate the second RGB image.
13. The image processing apparatus according to claim 11 or 12, wherein the image restoration processing includes automatic white balance AWB, color correction matrix CCM, gamma, or tone curve.
14. The image processing apparatus according to any one of claims 10 to 12, wherein the bayer array generation unit is specifically configured to:
and carrying out average value operation on the first sampling image to obtain the Bayer array image.
15. The image processing apparatus according to any one of claims 10 to 12, wherein the first RGB image generation unit is specifically configured to:
sub-sampling positions of R pixels, B pixels and G pixels in the Bayer array image respectively to obtain a third sampling image, a fourth sampling image and a fifth sampling image;
interpolating the fifth sample map into a full G pixel image;
acquiring a full R pixel image according to the full G pixel image and the third sampling image, and acquiring a full B pixel image according to the full G pixel image and the fourth sampling image;
and generating the first RGB image according to the full G pixel image, the full R pixel image and the full B pixel image.
16. The image processing apparatus according to claim 15, wherein the first RGB image generating unit is specifically configured to:
performing guided filtering processing on the full G pixel image and the third sampling image to obtain an initial estimation image of the full R pixel image, and performing guided filtering processing on the full G pixel image and the fourth sampling image to obtain an initial estimation image of the full B pixel image;
acquiring a first residual image between the third sampling image and the initial estimation image of the full R pixel image, and acquiring a second residual image between the fourth sampling image and the initial estimation image of the full B pixel image;
performing linear interpolation on the first residual error map to obtain a first full-resolution residual error map, and performing linear interpolation on the second residual error map to obtain a second full-resolution residual error map;
and performing a dot-adding operation on the initial estimation image of the full-R pixel image and the first full-resolution residual image to obtain the full-R pixel image, and performing a dot-adding operation on the initial estimation image of the full-B pixel image and the second full-resolution residual image to obtain the full-B pixel image.
17. The image processing apparatus according to any one of claims 10 to 12, wherein the sampling unit is further configured to:
the first RGB image is subjected to up-sampling to obtain a third RGB image;
the second RGB image generating unit is specifically configured to:
and generating the second RGB image according to the third RGB image and the full W pixel image.
18. The image processing apparatus according to claim 17, wherein the second RGB image generating unit is specifically configured to:
and conducting guiding filtering processing on the third RGB image and the full W pixel image to generate the second RGB image.
19. The image processing apparatus according to any one of claims 10 to 12, characterized by further comprising:
and the photoelectric conversion unit is used for acquiring the WRGB image.
20. The image processing device according to claim 19, wherein the photoelectric conversion unit is adapted to a Charge Coupled Device (CCD) structure or a Complementary Metal Oxide Semiconductor (CMOS) structure.
21. A chip, comprising: a processor for calling and running a computer program from a memory so that a device on which the chip is installed performs the method of any one of claims 1 to 9.
22. An image processing apparatus characterized by comprising: a processor and a memory for storing a computer program, the processor for invoking and executing the computer program stored in the memory, performing the method of any one of claims 1 to 9.
23. A computer-readable storage medium for storing a computer program which causes a computer to perform the method of any one of claims 1 to 9.
CN202010670512.4A 2020-07-13 2020-07-13 Image processing method and image processing device Active CN111741277B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010670512.4A CN111741277B (en) 2020-07-13 2020-07-13 Image processing method and image processing device


Publications (2)

Publication Number Publication Date
CN111741277A CN111741277A (en) 2020-10-02
CN111741277B true CN111741277B (en) 2022-04-29



Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BR112014023256A8 (en) * 2012-03-27 2017-07-25 Sony Corp IMAGE PROCESSING APPARATUS AND METHOD, AND IMAGE FORMING DEVICE, AND PROGRAM
JP2014126903A (en) * 2012-12-25 2014-07-07 Toshiba Corp Image processing apparatus, image processing method, and program
CN104754321B (en) * 2013-12-25 2017-04-12 Himax Technologies, Inc. Camera array system
CN108419062B (en) * 2017-02-10 2020-10-02 Hangzhou Hikvision Digital Technology Co., Ltd. Image fusion apparatus and image fusion method
CN110507283A (en) * 2019-09-17 2019-11-29 Fuzhou Xintu Photoelectric Co., Ltd. Retina camera and its implementation
CN110579279B (en) * 2019-09-19 2021-08-06 Xi'an University of Technology Design method of nine-spectral-band multispectral imaging system of single sensor

Also Published As

Publication number Publication date
CN111741277A (en) 2020-10-02

Similar Documents

Publication Publication Date Title
CN111741277B (en) Image processing method and image processing device
CN112261391B (en) Image processing method, camera assembly and mobile terminal
US10200639B2 (en) Image pickup device and method enabling control of spectral sensitivity and exposure time
US8355074B2 (en) Exposing pixel groups in producing digital images
US10136107B2 (en) Imaging systems with visible light sensitive pixels and infrared light sensitive pixels
CN111614886B (en) Image sensor and electronic device
WO2021208593A1 (en) High dynamic range image processing system and method, electronic device, and storage medium
WO2021179806A1 (en) Image acquisition method, imaging apparatus, electronic device, and readable storage medium
WO2021196553A1 (en) High-dynamic-range image processing system and method, electronic device and readable storage medium
WO2021212763A1 (en) High-dynamic-range image processing system and method, electronic device and readable storage medium
CN113676675B (en) Image generation method, device, electronic equipment and computer readable storage medium
US20060050956A1 (en) Signal processing apparatus, signal processing method, and signal processing program
US8111298B2 (en) Imaging circuit and image pickup device
JP2019208201A (en) Image processing method and filter array
CN113840067B (en) Image sensor, image generation method and device and electronic equipment
US9143747B2 (en) Color imaging element and imaging device
US8582006B2 (en) Pixel arrangement for extended dynamic range imaging
WO2023124607A1 (en) Image generation method and apparatus, electronic device, and computer-readable storage medium
US9179110B2 (en) Imaging systems with modified clear image pixels
JP4531007B2 (en) Image processing system
WO2022011506A1 (en) Image processing method and image processing apparatus
JP5753395B2 (en) Imaging device
WO2022088310A1 (en) Image processing method, camera assembly, and mobile terminal
JP2004088260A (en) Digital camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant