CN116055896A - Image generation method and device and electronic device - Google Patents

Image generation method and device and electronic device

Info

Publication number
CN116055896A
CN116055896A (application CN202211728356.8A)
Authority
CN
China
Prior art keywords
image
pixel
images
pixels
color
Prior art date
Legal status
Pending
Application number
CN202211728356.8A
Other languages
Chinese (zh)
Inventor
王土生
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202211728356.8A
Publication of CN116055896A
Legal status: Pending

Landscapes

  • Color Television Image Signal Generators (AREA)

Abstract

The application discloses an image generation method, an image generation device, and an electronic device, and belongs to the field of communication technology. The camera module in the electronic device includes N monochromatic cameras arranged in parallel, where N is a positive integer greater than 2. Each monochromatic camera includes a pixel array composed of a plurality of minimum repeating units; each minimum repeating unit includes first pixels and second pixels, where the first pixel is a first visible light pixel or an invisible light pixel, and the second pixel is a second visible light pixel. The wavelength band of light transmitted through the light-transmitting portion of the first visible light pixel covers the range from blue to red, and the wavelength band transmitted through the light-transmitting portion of the second visible light pixel is narrower than that of the first visible light pixel.

Description

Image generation method and device and electronic device
Technical Field
The application belongs to the technical field of communication, and particularly relates to an image generation method, an image generation device and an electronic device.
Background
At present, consumer electronic products such as mobile phones are equipped with front and rear camera modules. The number of cameras has grown from a single camera to multiple cameras, and shooting has likewise evolved from single-camera capture to binocular or multi-camera cooperation. Functionally, camera modules are divided into main camera, telephoto, wide-angle, ultra-wide-angle, black-and-white, macro, telephoto macro, Time-of-Flight (TOF) ranging, and so on. Visible-light modules can further be classified into color and black-and-white. A common color acquisition scheme for color camera modules uses a beam-splitting prism and dichroic filters to split incident light into red, green, and blue beams, which are imaged on three separate image sensor arrays. Because the optical paths are coaxial, the three Red-Green-Blue (RGB) images can be aligned by intrinsic-parameter calibration and then fused into an image with complete color information at every pixel.
However, the beam-splitting prism must be compressed in size, which limits the sensor area and results in low image quality.
Disclosure of Invention
An embodiment of the application aims to provide an image generation method, an image generation device and an electronic device, which can solve the problem of low image quality in the prior art.
In a first aspect, an embodiment of the present application provides a camera module, including:
N monochromatic cameras arranged in parallel, wherein N is a positive integer greater than 2;
the monochromatic camera comprises a pixel array, wherein the pixel array comprises a plurality of minimum repeating units, the minimum repeating units comprise first pixels and second pixels, the first pixels are first visible light pixels or invisible light pixels, and the second pixels are second visible light pixels;
wherein a wavelength band of light transmitted through the light transmitting portion of the first visible light pixel includes a wavelength band ranging from blue to red, and a wavelength band of light transmitted through the light transmitting portion of the second visible light pixel is narrower than a wavelength band of light transmitted through the light transmitting portion of the first visible light pixel.
In a second aspect, an embodiment of the present application provides an electronic device, including the camera module as described above.
In a third aspect, an embodiment of the present application provides an image generating method, which is applied to the electronic device, and the method includes:
acquiring N first images through N monochromatic cameras;
taking one of the N first images as a reference, and respectively calculating other first images to obtain N-1 offset pixel matrixes;
and according to the N-1 offset pixel matrixes, performing pixel superposition on the pixels of the N first images to obtain a target image.
In a fourth aspect, an embodiment of the present application provides an image generating apparatus, which is applied to the above electronic apparatus, including:
the first acquisition module is used for acquiring N first images through N monochromatic cameras;
the first processing module is used for taking one of the N first images as a reference and calculating each of the other first images through a stereo registration algorithm to obtain N-1 offset pixel matrixes;
and the second processing module is used for carrying out pixel superposition on the pixels of the N first images according to the N-1 offset pixel matrixes to obtain a target image.
In a fifth aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, and the camera module according to the first aspect, where the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the method according to the third aspect.
In a sixth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor implement the steps of the method according to the third aspect.
In a seventh aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the third aspect.
In an eighth aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the method according to the third aspect.
In the embodiment of the application, as the camera module consists of N monochromatic cameras which are arranged in parallel, the thickness of the camera module can be reduced, and the Z-direction height can be reduced; each single-color camera comprises a pixel array, the pixel array comprises a plurality of minimum repeating units, the minimum repeating units comprise a first pixel and a second pixel, the first pixel is a first visible light pixel or a non-visible light pixel, and the second pixel is a second visible light pixel; and moreover, a first pixel is added in the pixel array of each monochromatic camera, and pixel registration among the monochromatic cameras is realized through the assistance of the first pixel, so that finer color images can be obtained.
Drawings
Fig. 1 is a schematic structural diagram of a camera module provided in an embodiment of the present application;
FIG. 2 is a schematic diagram of a pixel arrangement of a color filter array according to an embodiment of the present disclosure;
FIG. 3 is a second schematic diagram of a pixel arrangement of a color filter array according to an embodiment of the present disclosure;
FIG. 4 is a third schematic diagram of a pixel arrangement of a color filter array according to an embodiment of the present disclosure;
FIG. 5 is a fourth schematic diagram of a pixel arrangement of a color filter array according to an embodiment of the present disclosure;
FIG. 6 is a flowchart of an image generation method provided by an embodiment of the present application;
fig. 7 is one of pixel arrangement diagrams of a gray scale image according to an embodiment of the present application;
FIG. 8 is a second schematic diagram of a pixel arrangement of a gray scale image according to an embodiment of the present disclosure;
FIG. 9 is a third schematic diagram of pixel arrangement of a gray scale image according to an embodiment of the present application
Fig. 10 is a schematic structural diagram of an image generating apparatus according to an embodiment of the present application;
fig. 11 is a block diagram of an electronic device according to an embodiment of the present application;
fig. 12 is a block diagram of another electronic device according to an embodiment of the present application.
Detailed Description
Technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application; apparently, the described embodiments are some, but not all, embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application fall within the protection scope of the present application.
The terms first, second and the like in the description and in the claims, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged, as appropriate, such that embodiments of the present application may be implemented in sequences other than those illustrated or described herein, and that the objects identified by "first," "second," etc. are generally of a type and not limited to the number of objects, e.g., the first object may be one or more. Furthermore, in the description and claims, "and/or" means at least one of the connected objects, and the character "/", generally means that the associated object is an "or" relationship.
At present, the color acquisition scheme of common color camera modules is usually a Bayer pattern, or a variant of it, repeated across a single imaging array, such as Red-Green-Green-Blue (RGGB); other color combinations such as Red-Yellow-Yellow-Blue (RYYB); patterns with an added panchromatic pixel such as Red-Green-Blue-White (RGBW); and the like. Because such images must be restored by demosaicing and similar processing into an image with complete color information at every pixel, demosaicing can cause color confusion in high-frequency color details in scenes with rich texture, high-frequency foregrounds, or differently colored backgrounds.
There are two main reasons:
First: since color is acquired with a Bayer pattern on the imaging pixel array, each pixel captures only one color, and the missing colors are restored from surrounding pixels by demosaicing. This is an interpolation process, so color confusion is difficult to avoid in scenes with rich texture, high-frequency foregrounds, or differently colored backgrounds.
Second: crosstalk exists between adjacent pixels of an imaging pixel array, in two forms: optical crosstalk, where light incident at wide angles leaks into adjacent pixels; and electrical crosstalk between the charge-collection wells of neighboring pixels.
Therefore, in order to solve the above-mentioned problems, an embodiment of the present application provides a camera module. The camera module provided in the embodiments of the present application is described in detail below through specific embodiments and application scenarios with reference to the accompanying drawings.
An embodiment of the present application provides a camera module, the camera module including:
N monochromatic cameras arranged in parallel, wherein N is a positive integer greater than 2;
the monochromatic camera comprises a pixel array, wherein the pixel array comprises a plurality of minimum repeating units, the minimum repeating units comprise first pixels and second pixels, the first pixels are first visible light pixels or invisible light pixels, and the second pixels are second visible light pixels;
Wherein a wavelength band of light transmitted through the light transmitting portion of the first visible light pixel includes a wavelength band ranging from blue to red, and a wavelength band of light transmitted through the light transmitting portion of the second visible light pixel is narrower than a wavelength band of light transmitted through the light transmitting portion of the first visible light pixel.
Specifically, to prevent the monochromatic cameras from lacking common information for registration under a monochromatic light source or in a scene with monochromatic regions, a first pixel that transmits visible or invisible light is added to each monochromatic camera; the first pixel serves as a stereo-matching reference between the individual monochromatic cameras and assists pixel registration between their images. If the first pixel is a first visible light pixel, it has higher light transmittance under dark-light conditions, so more texture details of the color image can be acquired. If the first pixel is an invisible light pixel, the monochromatic camera also needs to include a light supplementing device; under dark-light conditions, supplementary light can provide higher signal-to-noise support for stereo matching and hence more detail.
The scheme of the present application is described below by taking N = 3 as an example (in general, N is a positive integer greater than 2):
If the value of N is 3, fig. 1 schematically shows the structure of the camera module 11: the camera module 11 includes 3 monochromatic cameras 111 arranged in parallel. This arrangement reduces the thickness of the camera module 11 and thus the Z-direction height.
It should be noted that, the number of monochrome cameras in fig. 1 is merely an example, and the value of N may be other values. The arrangement of the 3 single-color cameras in fig. 1 is also merely an example, and is not limited to a lateral arrangement.
The pixel array is a periodic repetition of a single-color pixel pattern (i.e., the minimum repeating unit). Each pattern is a 2×2 group of four pixels consisting of first pixels and second pixels, with at least 1 first pixel and at least 2 second pixels. The first pixel is a first visible light pixel or an invisible light pixel; the spectral band accepted by the first visible light pixel covers the range from blue light to red light, while the band accepted by the invisible light pixel is near-infrared. The second pixel accepts a narrower spectral band than the first visible light pixel, preferably one of the red, green, and blue bands.
If the three monochromatic cameras mainly transmit red light R, green light G, and blue light B respectively, the camera mainly transmitting red light R serves as the red camera, the one mainly transmitting green light G as the green camera, and the one mainly transmitting blue light B as the blue camera. The three monochromatic cameras transmitting visible light of different colors differ in the spectral band accepted by the second pixel of each camera.
Each monochromatic camera further includes a motor, a motor driver, an infrared (IR) filter, a photosensitive pixel array, and the like.
For example: as shown in fig. 2, each minimum repeating unit is a 2×2 pixel arrangement containing at least 2 second pixels and at least 1 first pixel. If the minimum repeating unit of the red camera contains 3 second pixels and 1 first pixel, its pixel arrangement format is as shown in fig. 2, where R denotes a red second pixel and W denotes a first pixel. If the minimum repeating unit of the red camera contains 2 second pixels and 2 first pixels, its pixel arrangement format is as shown in fig. 3.
Similarly, if the pixel arrangement of the minimum repeating unit in the green camera includes 3 second pixels and 1 first pixel, the pixel arrangement format of the minimum repeating unit of the green camera is as shown in fig. 4, G represents the second pixel of green, and W represents the first pixel.
Similarly, if the pixel arrangement of the minimum repeating unit in the blue camera includes 3 second pixels and 1 first pixel, the pixel arrangement format of the minimum repeating unit of the blue camera is as shown in fig. 5, B represents the second pixel of blue, and W represents the first pixel.
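A minimal sketch of how such 2×2 minimum repeating units tile into a full filter mask (the position of the W pixel inside each unit and the numpy string representation are illustrative assumptions, not specified by the patent):

```python
import numpy as np

def tile_pattern(unit: np.ndarray, rows: int, cols: int) -> np.ndarray:
    """Tile a 2x2 minimum repeating unit into a (rows, cols) filter mask."""
    return np.tile(unit, (rows // unit.shape[0], cols // unit.shape[1]))

# 'W' marks the wide-band first pixel; 'R'/'G'/'B' mark narrow-band second
# pixels. The W position inside each unit is illustrative.
unit_red_3_1   = np.array([['R', 'W'],
                           ['R', 'R']])   # fig. 2: 3 second pixels, 1 first pixel
unit_red_2_2   = np.array([['R', 'W'],
                           ['W', 'R']])   # fig. 3: 2 second pixels, 2 first pixels
unit_green_3_1 = np.array([['G', 'W'],
                           ['G', 'G']])   # fig. 4
unit_blue_3_1  = np.array([['B', 'W'],
                           ['B', 'B']])   # fig. 5

print(tile_pattern(unit_red_3_1, rows=4, cols=4))
```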
In the embodiment of the application, as the camera module consists of N monochromatic cameras which are arranged in parallel, the thickness of the camera module can be reduced, and the Z-direction height can be reduced; each single-color camera comprises a pixel array, the pixel array comprises a plurality of minimum repeating units, the minimum repeating units comprise a first pixel and a second pixel, the first pixel is a first visible light pixel or a non-visible light pixel, and the second pixel is a second visible light pixel; and moreover, a first pixel is added in the pixel array of each monochromatic camera, and pixel registration among the monochromatic cameras is realized through the assistance of the first pixel, so that finer color images can be obtained.
As an alternative embodiment, the parallel arrangement includes: are arranged in parallel at equal intervals in the horizontal direction or are arranged in parallel at equal intervals in the vertical direction.
Specifically, the N single-color cameras may be arranged in a horizontal direction, and may be further set so that the distance interval between every two adjacent single-color cameras is equal, so that the thickness of the camera module may be reduced, and the Z-direction height may be further reduced. For example: as shown in fig. 1, when N is 3, 3 monochrome cameras are arranged in parallel at equal intervals in the horizontal direction.
An embodiment of the present application further provides an electronic device, which includes the camera module of any of the above embodiments.
The electronic device may also include a central processor, a display screen, memory, an imaging device, etc.
The electronic device can be a mobile phone, a tablet personal computer, an intelligent television and other equipment.
The central processing unit is connected with the imaging device; it is used to configure the imaging device and to issue acquisition instructions, it controls storing the images output by the imaging device into the memory, and it controls the display screen to display the images output by the imaging device.
As shown in fig. 6, an embodiment of the present application provides an image generating method, which may be applied to the above-described electronic device. The image generation method specifically comprises the following steps:
In step 601, N first images are acquired through the N monochromatic cameras.
Specifically, images are acquired by the N monochromatic cameras respectively, giving N acquired images. During acquisition, the exposure control parameter and the focal length parameter of one monochromatic camera can serve as the reference, with the other monochromatic cameras adopting the same exposure control parameter and focal length parameter.
The N acquired images are preprocessed through at least N image preprocessing channels to obtain the N first images, where the preprocessing includes at least one of defective-pixel removal, noise reduction, lens correction, and the like.
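A minimal sketch of one such preprocessing channel (the specific filters, their parameters, and the gain-map-based lens correction are assumptions; the patent does not specify the internals of these operations):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, median_filter

def remove_defective_pixels(raw: np.ndarray, k: float = 4.0) -> np.ndarray:
    """Replace pixels that deviate strongly from their 3x3 neighborhood median."""
    img = raw.astype(np.float32)
    med = median_filter(img, size=3)
    diff = np.abs(img - med)
    bad = diff > k * (diff.std() + 1e-6)   # threshold rule is an assumption
    img[bad] = med[bad]
    return img

def preprocess(raw: np.ndarray, gain_map: np.ndarray) -> np.ndarray:
    """One preprocessing channel: defective-pixel removal, noise reduction,
    then lens (shading) correction with a pre-calibrated per-pixel gain map."""
    img = remove_defective_pixels(raw)
    img = gaussian_filter(img, sigma=0.8)  # mild noise reduction
    return img * gain_map                  # lens correction

# Hypothetical usage: first_images = [preprocess(raw, gain) for raw in raws]
```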
Step 602, taking one of the N first images as a reference, and respectively calculating the other first images to obtain N-1 offset pixel matrixes.
Specifically, one of the N first images is taken as the reference standard, and the offset pixel matrix of each of the other first images relative to that reference is computed by a stereo registration algorithm or the like.
For example: if N is 3 and the three first images are C1, C2, and C3, then taking C2 as the reference standard, a stereo registration algorithm computes the offset pixel matrix of C1 relative to C2 and the offset pixel matrix of C3 relative to C2.
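A minimal sketch of computing an offset pixel matrix by block matching along the row direction (the patent does not name a specific stereo registration algorithm; the SAD cost, window size, and search range here are assumptions):

```python
import numpy as np

def offset_matrix(ref: np.ndarray, other: np.ndarray,
                  max_disp: int = 16, win: int = 3) -> np.ndarray:
    """Per-pixel column offset d such that other[i, j + d[i, j]] best matches
    ref[i, j], using sum-of-absolute-differences (SAD) block matching.
    Only column offsets are searched, i.e., the cameras are assumed to be
    aligned in the row direction."""
    h, w = ref.shape
    ref_p = np.pad(ref.astype(np.float32), win, mode='edge')
    oth_p = np.pad(other.astype(np.float32),
                   ((win, win), (win + max_disp, win + max_disp)), mode='edge')
    d = np.zeros((h, w), dtype=np.int32)
    for i in range(h):
        for j in range(w):
            block = ref_p[i:i + 2 * win + 1, j:j + 2 * win + 1]
            costs = [np.abs(block - oth_p[i:i + 2 * win + 1,
                                          j + max_disp + s:
                                          j + max_disp + s + 2 * win + 1]).sum()
                     for s in range(-max_disp, max_disp + 1)]
            d[i, j] = int(np.argmin(costs)) - max_disp
    return d

# Hypothetical N = 3 usage with C2 as the reference standard:
# d1 = offset_matrix(C2, C1)
# d3 = offset_matrix(C2, C3)
```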
Step 603, performing pixel superposition on the pixels of the N first images according to the N-1 offset pixel matrixes to obtain a target image.
Specifically, for each of the other first images, according to its offset pixel matrix relative to the reference standard, its pixels are superimposed onto the corresponding pixels of the reference standard, yielding pixels that carry superimposed multi-color values and hence the target image; that is, the target image has N pieces of color information at each pixel position.
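A minimal sketch of this superposition step, assuming equal-size single-channel images and row-aligned cameras (function and variable names are hypothetical):

```python
import numpy as np

def superimpose(ref_img: np.ndarray, other_imgs: list, offsets: list) -> np.ndarray:
    """Stack N color planes at each reference pixel position: the k-th extra
    color value at (i, j) is other_imgs[k][i, j + offsets[k][i, j]]."""
    h, w = ref_img.shape
    ii, jj = np.meshgrid(np.arange(h), np.arange(w), indexing='ij')
    planes = [ref_img]
    for img, d in zip(other_imgs, offsets):
        j_src = np.clip(jj + d, 0, w - 1)  # clamp at image borders
        planes.append(img[ii, j_src])
    return np.stack(planes, axis=-1)       # (h, w, N) target image

# Hypothetical N = 3 usage: target = superimpose(C2, [C1, C3], [d1, d3])
```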
According to the above embodiment, N first images are acquired through the N monochromatic cameras; taking one of the N first images as the reference, each of the other first images is processed by a stereo registration algorithm to obtain N-1 offset pixel matrixes; and according to the N-1 offset pixel matrixes, the pixels of the N first images are superimposed to obtain the target image. Stereo registration between the monochromatic cameras yields the offset pixel matrixes, and merging the first images of the N monochromatic cameras according to these matrixes produces finer color images; moreover, the N monochromatic cameras gather more light, so the signal-to-noise ratio is better.
As an optional embodiment, before the step 603 performs pixel stacking on the pixels of the N first images according to the N-1 offset pixel matrices to obtain the target image, the method may further include:
for each first image of the N first images, performing interpolation on the first pixels in that first image to obtain a first interpolation result;
and replacing the first pixel with the first interpolation result to obtain a monochromatic image.
Further, the step 603 performs pixel stacking on the pixels of the N first images according to the N-1 offset pixel matrixes to obtain a target image, and specifically includes:
and according to the N-1 offset pixel matrixes, carrying out pixel superposition on the N monochromatic images to obtain a target image.
Specifically, for each first image of the N first images, the first pixels in that image are interpolated: the average of the second pixels surrounding each first pixel is computed as the first interpolation result, and the first-pixel position is filled with that result. After all first-pixel positions are filled, a monochromatic image is obtained. Then, according to the offset pixel matrix corresponding to each of the other monochromatic images, the pixels of that monochromatic image are superimposed onto the corresponding pixel positions of the reference monochromatic image; after the superposition is completed, the multi-color target image is obtained.
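A minimal sketch of this interpolation, assuming a 4-neighborhood average of valid second pixels (the neighborhood choice is an assumption):

```python
import numpy as np

def fill_first_pixels(raw: np.ndarray, w_mask: np.ndarray) -> np.ndarray:
    """Replace each first (W) pixel with the mean of its up/down/left/right
    second-pixel neighbors, yielding a monochromatic image.
    w_mask is True at first-pixel positions."""
    img = raw.astype(np.float32)
    h, w = img.shape
    out = img.copy()
    for i, j in zip(*np.nonzero(w_mask)):
        vals = [img[i + di, j + dj]
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))
                if 0 <= i + di < h and 0 <= j + dj < w
                and not w_mask[i + di, j + dj]]
        if vals:
            out[i, j] = np.mean(vals)
    return out
```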
As an optional embodiment, the step of performing pixel stacking on the N single-color images according to the N-1 offset pixel matrices to obtain a target image specifically includes:
according to the N-1 offset pixel matrixes, one of the N single-color images is used as a reference image, and pixels in other single-color images are overlapped on corresponding pixels of the reference image, so that a multi-color image is obtained;
And carrying out fusion processing on the multicolor image and the reference image to obtain a target image.
Specifically, if the first pixel is a first visible light pixel, then for each first image the first visible light pixels are extracted to form a gray image, giving N gray images. The gray image corresponding to one of the N monochromatic images is taken as the reference image, and the offset pixel matrix of each of the other gray images relative to the reference image is computed by a stereo registration algorithm, giving N-1 offset pixel matrixes. Preferably, the first image acquired by the middle one of the N monochromatic cameras is selected as the reference standard.
For example: there are 3 first images, namely the first image C1 shown in fig. 2, the first image C2 shown in fig. 4, and the first image C3 shown in fig. 5. Fig. 7 shows the gray image W1 obtained by extracting the first visible light pixels from fig. 2; fig. 8 shows the gray image W2 obtained by extracting the first visible light pixels from fig. 4; and fig. 9 shows the gray image W3 obtained by extracting the first visible light pixels from fig. 5. Assuming the three monochromatic cameras are aligned in the row direction, if W2 is the reference image, the offset pixel matrix d1 of W1 relative to W2 and the offset pixel matrix d3 of W3 relative to W2 need to be computed.
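Because the first pixels occupy one fixed position per 2×2 unit, extracting a gray image reduces to a strided slice. A minimal sketch, assuming the W pixel sits at offset (0, 1) of each unit as in the hypothetical masks earlier:

```python
import numpy as np

def extract_gray(raw: np.ndarray) -> np.ndarray:
    """Pull the W (first-pixel) plane out of a mosaic whose 2x2 unit has the
    wide-band pixel at offset (0, 1); the result is half resolution per axis."""
    return raw[0::2, 1::2]

# Hypothetical usage: W1, W2, W3 = map(extract_gray, (C1, C2, C3))
```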
According to the offset pixel matrix corresponding to each of the other gray images, the pixels of the corresponding monochromatic image are superimposed onto the corresponding pixel positions of the reference image's monochromatic image; after superposition, each pixel position carries several pieces of color information, giving the multi-color image. Fusing the multi-color image with the reference image then yields a target image with more detail and a higher signal-to-noise ratio under dark-light conditions.
As an optional embodiment, the step of fusing the multi-color image with the reference image to obtain the target image specifically includes:
processing the multi-color image and the reference image into images of the same resolution;
performing format conversion on the multicolor image with the same resolution as the reference image to obtain a second image in YUV format;
and carrying out fusion processing on the reference image with the same resolution as the multicolor image and the Y component in the second image to obtain a target image.
Specifically, if the multi-color image is a full-resolution image and the reference image has 1/2 or 1/4 resolution, the reference image can be upscaled from its original 1/2 or 1/4 resolution to full resolution by super-resolution processing, giving a reference image with the same resolution as the multi-color image. Alternatively, the multi-color image can be downscaled to the same resolution as the reference image.
If the reference image is upscaled to full resolution, the full-resolution multi-color image is format-converted to obtain a second image in YUV format, and the Y component of the second image is fused with the full-resolution reference image, giving a target image with lower luminance noise.
Further, the above-mentioned fusion processing of the reference image with the same resolution as the multicolor image and the Y component in the second image includes, but is not limited to, the following ways:
replacing the Y component with a reference image of the same resolution as the multi-color image;
or decomposing a base layer and a detail layer of the Y component by adopting a Gaussian filter, and decomposing the base layer and the detail layer of a reference image with the same resolution as the multicolor image;
the detail layer of the Y component is replaced with the detail layer of a reference image of the same resolution as the multicolor image.
Specifically, one way may be: replacing the Y component with a reference image; another way may be: the detail layer of the Y component is replaced with the detail layer of the reference picture.
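A minimal sketch of the second fusion route (Gaussian base/detail decomposition); the BT.601 RGB-to-YUV weights and the Gaussian sigma are assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def rgb_to_yuv(img: np.ndarray) -> np.ndarray:
    """BT.601 RGB -> YUV for an (h, w, 3) image with values in [0, 1]."""
    m = np.array([[ 0.299,  0.587,  0.114],
                  [-0.147, -0.289,  0.436],
                  [ 0.615, -0.515, -0.100]])
    return img @ m.T

def fuse_detail(y: np.ndarray, ref_gray: np.ndarray,
                sigma: float = 2.0) -> np.ndarray:
    """Keep the base layer of the Y component and replace its detail layer
    with the detail layer of the same-resolution reference image."""
    y_base = gaussian_filter(y, sigma)           # base layer of Y
    ref_base = gaussian_filter(ref_gray, sigma)  # base layer of the reference
    return y_base + (ref_gray - ref_base)        # base(Y) + detail(reference)

# Hypothetical usage: y = rgb_to_yuv(multi_color_img)[..., 0]
# fused_y = fuse_detail(y, reference_gray)
```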
The pixel stacking process is described in detail below by way of a specific embodiment:
for example: as shown in fig. 7 to 9, if W2 is a reference image, the offset pixel matrix of W1 with respect to W2 is d1, and the offset pixel matrix of W3 with respect to W2 is d3. Acquiring C2 (i, j) pixels by d1 corresponds to C1 (i, j 1) on C1, i.e., C1 (i, j 1) is C1 (i, j+d1 ij ) The method comprises the steps of carrying out a first treatment on the surface of the Acquiring C2 (i, j) pixel corresponding to C3 (i, j 3) on C3 by d3, i.e. C3 (i, j 3) is C3 (i, j+d3 ij ). C1 (i, j+d1) ij ) C3 (i, j+d3) ij ) When the color information is superimposed on the C2 (i, j) pixel, three pieces of color information are provided at the C2 (i, j) position, and the pixel information included at the superimposed C2 (i, j) position is C1 (i, j+d1ij), C2 (i, j), and C3 (i, j+d3ij). Where i represents the pixel abscissa and j represents the pixel ordinate.
As shown in fig. 2 and figs. 4 to 9, x1, x2, x3, x4 are coordinate indexes in the x direction (column direction) of an image, i.e., column x1, column x2, column x3, column x4, and so on; y1, y2, y3, y4 are coordinate indexes in the y direction (row direction), i.e., row y1, row y2, row y3, row y4. The filled W positions are matching point pairs found by stereo registration; that is, the pixel at coordinates (y1, x2) in W2 corresponds to coordinates (y1, x1) in W1 and to coordinates (y1, x3) in W3, so the offset pixels at (y1, x2) in the disparity maps d1 and d3 are:
d1(y1, x2) = x1 - x2
d3(y1, x2) = x3 - x2
the monochrome images corresponding to W1, W2, and W3 are A1, A2, and A3, respectively, and since the resolutions of W1, W2, and W3 are half of A1, A2, and A3, it is necessary to multiply the pixel coordinates and offset pixels by 2 to obtain offset pixels of A1 and A3 with respect to A2.
Also taking the position of W2 at (y 1, x 2) as an example, when the image size corresponding to A2 is to be obtained, the corresponding pixel coordinate becomes (2×y1,2×x2) and the corresponding offset pixel is also enlarged by 2 times as much as the original one:
d1 (2 x1, 2 x 2) is 2 x (x 1-x 2)
D3 (2 x y1,2 x 2) is 2 x (x 3-x 2)
Offset pixels at odd coordinate positions, such as (2·y1, 2·x2+1), (2·y1+1, 2·x2), and (2·y1+1, 2·x2+1), can be interpolated from the offset pixels at the even coordinate positions of D1 and D3.
Then, for each pixel coordinate (i, j) in A2, the two other color pixel values are fetched from A1 and A3 according to the offset pixels D1 and D3 respectively, giving three color pixel values at that coordinate position.
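A minimal sketch of promoting the half-resolution offsets to full resolution (nearest-even-neighbor filling stands in for the interpolation at odd coordinates; that choice is an assumption):

```python
import numpy as np

def upscale_offsets(d_half: np.ndarray) -> np.ndarray:
    """Double both the coordinates and the values of a half-resolution offset
    matrix; odd positions are copied from the nearest even position."""
    h, w = d_half.shape
    d_full = np.zeros((2 * h, 2 * w), dtype=d_half.dtype)
    d_full[0::2, 0::2] = 2 * d_half          # even coordinates: doubled values
    d_full[0::2, 1::2] = d_full[0::2, 0::2]  # fill odd columns
    d_full[1::2, :] = d_full[0::2, :]        # fill odd rows
    return d_full

# Hypothetical usage: D1, D3 = upscale_offsets(d1), upscale_offsets(d3)
```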
It should be noted that if the rows of the three monochromatic sensors are misaligned, the offset pixels also need to include a y-direction component.
In summary, according to the above embodiments of the present application, with the N monochromatic cameras arranged side by side in parallel and a first pixel added to each camera, pixel registration between the monochromatic camera images can be assisted, so finer color images are obtained; the single-color images of the N monochromatic cameras are merged using pixel registration into a finer multi-color image; and under dark-light conditions, fusing the multi-color image with the gray image yields a target image with more detail and a higher signal-to-noise ratio. The scheme improves color resolution and obtains more accurate image detail than a monocular camera, separating the colors across the N monochromatic cameras gives finer color reproduction, the N monochromatic cameras also enable binocular ranging, and for online live broadcast of pictures, cultural relics, and other exhibits, the color details of the foreground image or exhibit can be presented more prominently.
According to the image generation method provided by the embodiment of the application, the execution subject can be an image generation device. In the embodiment of the present application, an image generating apparatus provided in the embodiment of the present application will be described by taking an example in which the image generating apparatus executes an image generating method.
As shown in fig. 10, an embodiment of the present application further provides an image generating apparatus, which is applied to the above electronic apparatus, and includes:
a first obtaining module 801, configured to obtain N first images through N monochrome cameras;
a first processing module 802, configured to calculate, using one of the N first images as a reference, the other first images respectively, so as to obtain N-1 offset pixel matrices;
and the second processing module 803 is configured to perform pixel stacking on the pixels of the N first images according to the N-1 offset pixel matrices, so as to obtain a target image.
Optionally, the apparatus further includes:
the third processing module is used for, for each first image of the N first images, performing interpolation on the first pixels in that first image to obtain a first interpolation result;
the fourth processing module is used for replacing the first pixel with the first interpolation result to obtain a monochromatic image;
Wherein, the second processing module 803 is specifically configured to:
and according to the N-1 offset pixel matrixes, carrying out pixel superposition on the N monochromatic images to obtain a target image.
Optionally, the second processing module 803 is specifically configured to, when performing pixel stacking on N single-color images according to the N-1 offset pixel matrices to obtain a target image:
according to the N-1 offset pixel matrixes, one of the N single-color images is used as a reference image, and pixels in other single-color images are overlapped on corresponding pixels of the reference image, so that a multi-color image is obtained;
and carrying out fusion processing on the multicolor image and the reference image to obtain a target image.
Optionally, the second processing module 803 is specifically configured to, when performing fusion processing on the multi-color image and the reference image to obtain a target image:
processing the multi-color image and the reference image into images of the same resolution;
performing format conversion on the multicolor image with the same resolution as the reference image to obtain a second image in YUV format;
and carrying out fusion processing on the reference image with the same resolution as the multicolor image and the Y component in the second image to obtain a target image.
Optionally, the fusing the reference image with the same resolution as the multicolor image with the Y component in the second image includes:
replacing the Y component with a reference image of the same resolution as the multi-color image;
or decomposing a base layer and a detail layer of the Y component by adopting a Gaussian filter, and decomposing the base layer and the detail layer of a reference image with the same resolution as the multicolor image;
the detail layer of the Y component is replaced with the detail layer of a reference image of the same resolution as the multicolor image.
In summary, according to the above embodiments of the present application, with the N monochromatic cameras arranged side by side in parallel and a first pixel added to each camera, pixel registration between the monochromatic camera images can be assisted, so finer color images are obtained; the single-color images of the N monochromatic cameras are merged using pixel registration into a finer multi-color image; and under dark-light conditions, fusing the multi-color image with the gray image yields a target image with more detail and a higher signal-to-noise ratio. The scheme improves color resolution and obtains more accurate image detail than a monocular camera, separating the colors across the N monochromatic cameras gives finer color reproduction, the N monochromatic cameras also enable binocular ranging, and for online live broadcast of pictures, cultural relics, and other exhibits, the color details of the foreground image or exhibit can be presented more prominently.
The image generating device in the embodiments of the present application may be an electronic device, or a component in an electronic device such as an integrated circuit or a chip. The electronic device may be a terminal or a device other than a terminal. By way of example, the electronic device may be a mobile phone, tablet computer, notebook computer, palmtop computer, vehicle-mounted electronic device, mobile internet device (MID), augmented reality (AR)/virtual reality (VR) device, robot, wearable device, ultra-mobile personal computer (UMPC), netbook, or personal digital assistant (PDA), etc.; the embodiments of the present application are not specifically limited in this respect.
The image generating apparatus in the embodiments of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system; the embodiments of the present application are not specifically limited in this respect.
The image generating device provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 2 to 9, and in order to avoid repetition, a description is omitted here.
Optionally, as shown in fig. 11, the embodiment of the present application further provides an electronic device 900, including a processor 901 and a memory 902, where a program or an instruction capable of being executed on the processor 901 is stored in the memory 902, and the program or the instruction when executed by the processor 901 implements each step of the embodiment of the image generating method, and the steps can achieve the same technical effect, so that repetition is avoided, and no further description is given here.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 12 is a schematic hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1000 includes, but is not limited to: radio frequency unit 1001, network module 1002, audio output unit 1003, input unit 1004, sensor 1005, display unit 1006, user input unit 1007, interface unit 1008, memory 1009, and processor 1010.
Those skilled in the art will appreciate that the electronic device 1000 may also include a power source (e.g., a battery) for powering the various components, which may be logically connected to the processor 1010 by a power management system to perform functions such as managing charge, discharge, and power consumption by the power management system. The electronic device structure shown in fig. 12 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than illustrated, or may combine certain components, or may be arranged in different components, which are not described in detail herein.
The input unit 1004 acquires N first images through N monochromatic cameras;
a processor 1010, configured to calculate, using one of the N first images as a reference, the other first images respectively, to obtain N-1 offset pixel matrices;
and according to the N-1 offset pixel matrixes, performing pixel superposition on the pixels of the N first images to obtain a target image.
According to the above embodiment, N first images are acquired through the N monochromatic cameras; taking one of the N first images as the reference, each of the other first images is processed by a stereo registration algorithm to obtain N-1 offset pixel matrixes; and according to the N-1 offset pixel matrixes, the pixels of the N first images are superimposed to obtain the target image. Stereo registration between the monochromatic cameras yields the offset pixel matrixes, and merging the first images of the N monochromatic cameras according to these matrixes produces finer color images; moreover, the N monochromatic cameras gather more light, so the signal-to-noise ratio is better.
Optionally, before performing pixel stacking on the pixels of the N first images according to the N-1 offset pixel matrices to obtain the target image, the processor 1010 is further configured to:
for each first image of the N first images, performing interpolation on the first pixels in that first image to obtain a first interpolation result;
replacing the first pixel with the first interpolation result to obtain a monochromatic image;
the processor 1010 is specifically configured to, when performing pixel stacking on the pixels of the N first images according to the N-1 offset pixel matrices to obtain a target image:
and according to the N-1 offset pixel matrixes, carrying out pixel superposition on the N monochromatic images to obtain a target image.
Optionally, when the processor 1010 performs pixel stacking on N single-color images according to the N-1 offset pixel matrices to obtain a target image, the processor is specifically configured to:
according to the N-1 offset pixel matrixes, one of the N single-color images is used as a reference image, and pixels in other single-color images are overlapped on corresponding pixels of the reference image, so that a multi-color image is obtained;
and carrying out fusion processing on the multicolor image and the reference image to obtain a target image.
Optionally, when the processor 1010 performs fusion processing on the multi-color image and the reference image to obtain a target image, the method is specifically used for:
Processing the multi-color image and the reference image into images of the same resolution;
performing format conversion on the multicolor image with the same resolution as the reference image to obtain a second image in YUV format;
and carrying out fusion processing on the reference image with the same resolution as the multicolor image and the Y component in the second image to obtain a target image.
Optionally, the fusing the reference image with the same resolution as the multicolor image with the Y component in the second image includes:
replacing the Y component with a reference image of the same resolution as the multi-color image;
or decomposing a base layer and a detail layer of the Y component by adopting a Gaussian filter, and decomposing the base layer and the detail layer of a reference image with the same resolution as the multicolor image;
the detail layer of the Y component is replaced with the detail layer of a reference image of the same resolution as the multicolor image.
In summary, according to the above embodiments of the present application, with the N monochromatic cameras arranged side by side in parallel and a first pixel added to each camera, pixel registration between the monochromatic camera images can be assisted, so finer color images are obtained; the single-color images of the N monochromatic cameras are merged using pixel registration into a finer multi-color image; and under dark-light conditions, fusing the multi-color image with the gray image yields a target image with more detail and a higher signal-to-noise ratio. The scheme improves color resolution and obtains more accurate image detail than a monocular camera, separating the colors across the N monochromatic cameras gives finer color reproduction, the N monochromatic cameras also enable binocular ranging, and for online live broadcast of pictures, cultural relics, and other exhibits, the color details of the foreground image or exhibit can be presented more prominently.
It should be understood that, in the embodiment of the present application, the input unit 1004 may include a graphics processing unit (GPU) 10041 and a microphone 10042; the graphics processor 10041 processes image data of still pictures or videos obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 1006 may include a display panel 10061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 1007 includes at least one of a touch panel 10071 and other input devices 10072. The touch panel 10071, also referred to as a touch screen, may include two parts: a touch detection device and a touch controller. Other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail here.
The memory 1009 may be used to store software programs as well as various data. The memory 1009 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system, application programs or instructions required by at least one function (such as a sound playing function and an image playing function), and the like. Further, the memory 1009 may include volatile memory or nonvolatile memory, or both. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchlink DRAM (SLDRAM), or a direct Rambus RAM (DRRAM). The memory 1009 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 1010 may include one or more processing units; optionally, the processor 1010 integrates an application processor that primarily processes operations involving an operating system, user interface, application programs, and the like, and a modem processor that primarily processes wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into the processor 1010.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored, and when the program or the instruction is executed by a processor, the processes of the embodiment of the image generating method are implemented, and the same technical effects can be achieved, so that repetition is avoided, and no further description is given here.
Wherein the processor is a processor in the electronic device described in the above embodiment. The readable storage medium includes computer readable storage medium such as computer readable memory ROM, random access memory RAM, magnetic or optical disk, etc.
The embodiment of the application further provides a chip, the chip includes a processor and a communication interface, the communication interface is coupled with the processor, and the processor is used for running a program or an instruction, so as to implement each process of the embodiment of the image generation method, and achieve the same technical effect, so that repetition is avoided, and no redundant description is provided here.
It should be understood that the chips referred to in the embodiments of the present application may also be referred to as system-on-chip chips, chip systems, or system-on-chip chips, etc.
The embodiments of the present application provide a computer program product stored in a storage medium, where the program product is executed by at least one processor to implement the respective processes of the embodiments of the image generating method described above, and achieve the same technical effects, and are not repeated herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed, but may also include performing the functions in a substantially simultaneous manner or in an opposite order depending on the functions involved, e.g., the described methods may be performed in an order different from that described, and various steps may also be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solutions of the present application may be embodied essentially or in a part contributing to the prior art in the form of a computer software product stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk), comprising several instructions for causing a terminal (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those of ordinary skill in the art without departing from the spirit of the present application and the scope of the claims, which are also within the protection of the present application.

Claims (11)

1. A camera module, comprising:
N monochromatic cameras arranged in parallel, wherein N is a positive integer greater than 2;
the monochromatic camera comprises a pixel array, wherein the pixel array comprises a plurality of minimum repeating units, the minimum repeating units comprise first pixels and second pixels, the first pixels are first visible light pixels or invisible light pixels, and the second pixels are second visible light pixels;
wherein a wavelength band of light transmitted through the light transmitting portion of the first visible light pixel includes a wavelength band ranging from blue to red, and a wavelength band of light transmitted through the light transmitting portion of the second visible light pixel is narrower than a wavelength band of light transmitted through the light transmitting portion of the first visible light pixel.
2. The camera module of claim 1, wherein the parallel arrangement comprises: are arranged in parallel at equal intervals in the horizontal direction or are arranged in parallel at equal intervals in the vertical direction.
3. An electronic device comprising the camera module of any one of claims 1 to 2.
4. An image generation method, applied to the electronic device of claim 3, comprising:
Acquiring N first images through N monochromatic cameras;
taking one of the N first images as a reference, and respectively calculating other first images to obtain N-1 offset pixel matrixes;
and according to the N-1 offset pixel matrixes, performing pixel superposition on the pixels of the N first images to obtain a target image.
5. The method of claim 4, wherein the pixel stacking of the N first images according to the N-1 offset pixel matrices, before obtaining the target image, further comprises:
performing interpolation processing on first pixels in the first images aiming at each first image in the N first images to obtain a first interpolation result;
replacing the first pixel with the first interpolation result to obtain a monochromatic image;
the pixel stacking of the pixels of the N first images according to the N-1 offset pixel matrixes to obtain a target image includes:
and according to the N-1 offset pixel matrixes, carrying out pixel superposition on the N monochromatic images to obtain a target image.
6. The method according to claim 5, wherein the pixel stacking N of the monochrome images according to the N-1 offset pixel matrices to obtain a target image includes:
According to the N-1 offset pixel matrixes, one of the N single-color images is used as a reference image, and pixels in other single-color images are overlapped on corresponding pixels of the reference image, so that a multi-color image is obtained;
and carrying out fusion processing on the multicolor image and the reference image to obtain a target image.
7. The method of claim 6, wherein the fusing the multi-color image with the reference image to obtain the target image comprises:
processing the multi-color image and the reference image into images of the same resolution;
performing format conversion on the multicolor image with the same resolution as the reference image to obtain a second image in YUV format;
and carrying out fusion processing on the reference image with the same resolution as the multicolor image and the Y component in the second image to obtain a target image.
8. The method of claim 7, wherein the fusing the reference image of the same resolution as the multi-color image with the Y component in the second image comprises:
replacing the Y component with a reference image of the same resolution as the multi-color image;
Or decomposing a base layer and a detail layer of the Y component by adopting a Gaussian filter, and decomposing the base layer and the detail layer of a reference image with the same resolution as the multicolor image;
the detail layer of the Y component is replaced with the detail layer of a reference image of the same resolution as the multicolor image.
9. An image generating apparatus comprising the camera module according to any one of claims 1 to 2, the image generating apparatus further comprising:
the first acquisition module is used for acquiring N first images through N monochromatic cameras;
the first processing module is used for taking one of the N first images as a reference and calculating each of the other first images through a stereo registration algorithm to obtain N-1 offset pixel matrixes;
and the second processing module is used for carrying out pixel superposition on the pixels of the N first images according to the N-1 offset pixel matrixes to obtain a target image.
10. An electronic device comprising a processor, a memory and a camera module according to any one of claims 1-2, the memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the image generation method according to any one of claims 4-8.
11. A chip comprising a processor and a communication interface, the communication interface and the processor being coupled, the processor being configured to execute programs or instructions to implement the image generation method of any of claims 4-8.
CN202211728356.8A 2022-12-29 2022-12-29 Image generation method and device and electronic device Pending CN116055896A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211728356.8A CN116055896A (en) 2022-12-29 2022-12-29 Image generation method and device and electronic device


Publications (1)

Publication Number Publication Date
CN116055896A 2023-05-02

Family

ID=86132434


Country Status (1)

Country Link
CN (1) CN116055896A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination