WO2017152402A1 - Image processing method and apparatus for terminal, and terminal - Google Patents

Image processing method and apparatus for terminal, and terminal

Info

Publication number
WO2017152402A1
Authority: WO (WIPO, PCT)
Prior art keywords: component, black, image, luminance, fusion
Application number: PCT/CN2016/076017
Other languages: English (en), French (fr)
Inventors: 朱聪超, 罗巍
Original assignee: 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to CN201680025593.4A (CN107534735B)
Priority to PCT/CN2016/076017 (WO2017152402A1)
Priority to EP16893053.5A (EP3416369B1)
Priority to US16/083,428 (US10645268B2)
Publication of WO2017152402A1

Classifications

    (All classifications fall under H: Electricity; H04: Electric communication technique; H04N: Pictorial communication, e.g. television.)
    • H04N23/45: Cameras or camera modules comprising electronic image sensors; control thereof, for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N23/661: Remote control of cameras or camera parts, e.g. by remote control devices; transmitting camera control signals through networks, e.g. control via the Internet
    • H04N23/667: Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H04N23/673: Focus control based on electronic image sensor signals, based on contrast or high-frequency components of image signals, e.g. hill-climbing method
    • H04N23/74: Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H04N25/60: Circuitry of solid-state image sensors [SSIS]; noise processing, e.g. detecting, correcting, reducing or removing noise

Definitions

  • The present invention relates to image processing technologies, and in particular to an image processing method and apparatus for a terminal, and to a terminal.
  • To enable a camera to capture color images, a charge-coupled device (CCD) is usually installed in the camera module. The CCD is a special semiconductor device that outputs the brightness of light of different colors to an image processor as electrical signals, and the image processor finally generates a color image.
  • FIG. 1 is a schematic diagram of the working principle of a CCD in the prior art. The CCD is provided with a color filter array (CFA) and an array of photosensitive elements disposed under the color filter array. The color filter array includes filters of three colors (red, green, and blue) arranged in a preset order.
  • White natural light contains red, orange, yellow, green, blue, indigo, and violet light. The color filter array passes only the red, green, and blue light in natural light through the filters to the photosensitive element array, and the photosensitive element array records the intensity of the received light.
  • As shown in FIG. 1, each color filter corresponds to one photosensitive element, each photosensitive element corresponds to one pixel in the image, and the position of each pixel in the image is the same as the position of the corresponding photosensitive element in the photosensitive element array.
  • Taking a green filter in the color filter array as an example, because the green filter passes only green light, only the green light in the incident natural light signal (FIG. 1 shows only red, green, and blue light) can reach the photosensitive element, so the true color of that pixel cannot be obtained directly. The image processor therefore usually uses a demosaicing algorithm to synthesize the true color of the pixel: for each primary color missing at a pixel, the average of the digital signals of the neighboring pixels that have that primary color is used as the digital signal of that primary color at the pixel, and the true color of the pixel is finally obtained.
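  • A minimal sketch of the averaging-based demosaicing described above is given below, assuming a 2×2 RGGB Bayer layout (the layout and the 3×3 neighborhood are illustrative assumptions; the patent only says the missing primary is taken as the average of adjacent pixels that recorded it).

```python
import numpy as np
from scipy.signal import convolve2d

def demosaic_by_averaging(raw, pattern=("RG", "GB")):
    """Reconstruct an RGB image from single-CFA samples: keep each measured
    sample and fill each missing primary with the average of the 3x3
    neighbours that recorded it.  `raw` is the 2-D CFA image; `pattern`
    gives the assumed 2x2 Bayer layout."""
    h, w = raw.shape
    out = np.zeros((h, w, 3))
    window = np.ones((3, 3))                      # 8-neighbourhood plus centre
    for i, colour in enumerate("RGB"):
        mask = np.zeros((h, w), bool)             # where the CFA sampled this colour
        for dy in range(2):
            for dx in range(2):
                if pattern[dy][dx] == colour:
                    mask[dy::2, dx::2] = True
        total = convolve2d(np.where(mask, raw, 0.0), window, mode="same")
        count = convolve2d(mask.astype(float), window, mode="same")
        averaged = total / np.maximum(count, 1)   # mean of available neighbours
        out[..., i] = np.where(mask, raw, averaged)  # keep measured samples as-is
    return out
```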
  • However, each color filter in the color filter array can pass light of only one color and blocks light of the other colors, so only part of the light reaches the photosensitive elements, which makes the image noisy. It also lowers the sampling rate of light, reducing the resolution of the image, that is, the light information contained in all pixels of the image becomes smaller. As shown in FIG. 1, only half of the photosensitive elements in the photosensitive element array can receive green light, which ultimately degrades image quality.
  • Embodiments of the present invention provide an image processing method and apparatus for a terminal, and a terminal, to solve the technical problem that a color image captured by a conventional single-CCD camera has low resolution and high noise, resulting in poor image quality.
  • According to a first aspect, an embodiment of the present invention provides an image processing method for a terminal. The terminal includes a black-and-white camera and a color camera, and the black-and-white camera and the color camera are arranged side by side. The method includes: receiving a shooting instruction; controlling, according to the shooting instruction, the black-and-white camera and the color camera to capture the current scene simultaneously, to obtain a black-and-white image and a color image of the current scene; acquiring a luminance component and a chrominance component of the color image; fusing the luminance component and the black-and-white image to obtain a luminance fusion component; and obtaining a target color image according to the luminance fusion component and the chrominance component.
  • The method exploits the high resolution and low noise of the black-and-white image: fusing the black-and-white image into the color image gives the final target color image the same advantages, improving image quality compared with the color image captured by the color camera alone.
  • In a possible implementation, the fusion process specifically includes: dividing the luminance component into at least two luminance blocks, and registering the luminance component with the black-and-white image to obtain a registration result, where the registration result includes a black-and-white image block to be fused with each luminance block; and fusing the luminance component and the black-and-white image according to the registration result to obtain the luminance fusion component.
  • In a further implementation, the registration process specifically includes: determining, according to the distance between the black-and-white camera and the color camera, the focal length of the black-and-white camera, and the focal length of the color camera, a search area in the black-and-white image in which the luminance component is to be registered; and registering the luminance component and the black-and-white image within the search area to obtain the registration result.
  • By reducing the search domain used during registration, this improves registration speed and accuracy and thereby improves the quality of the subsequent image fusion.
  • In another possible implementation, fusing the luminance component and the black-and-white image to obtain the luminance fusion component includes: acquiring first high-frequency information and first low-frequency information of the luminance component, and acquiring second high-frequency information of the black-and-white image; fusing the first high-frequency information and the second high-frequency information to obtain high-frequency fusion information; and performing pixel-wise addition on the first low-frequency information and the high-frequency fusion information to obtain the luminance fusion component.
  • High-frequency information mainly records the edges and details of an image, so fusing the high-frequency component of the luminance component with the high-frequency component of the black-and-white image improves the fusion effect between the black-and-white image and the luminance component.
  • In a further implementation, before the pixel-wise addition of the first low-frequency information and the high-frequency fusion information, the method further includes: enhancing the high-frequency fusion information to obtain enhanced high-frequency fusion information. The enhancement makes the information contained in the high-frequency fusion information clearer in the luminance fusion component.
  • In another possible implementation, before the color image is decomposed into the luminance component and the chrominance component, the method further includes: performing brightness correction on the color image according to the black-and-white image to obtain a brightness-corrected color image.
  • In another possible implementation, before the target color image is obtained, the chrominance component is denoised to obtain a denoised chrominance component.
  • The brightness-consistency correction and the noise reduction can further improve the quality of the target color image.
  • The following describes an image processing apparatus for a terminal according to an embodiment of the present invention. The apparatus corresponds one-to-one with the method, is used to implement the image processing method in the foregoing embodiments, and has the same technical features and technical effects, which are not described again herein.
  • According to a second aspect, an embodiment of the present invention provides an image processing apparatus for a terminal, where the terminal includes a black-and-white camera and a color camera arranged side by side. The apparatus includes:
  • an input image acquisition module, configured to receive a shooting instruction and control, according to the shooting instruction, the black-and-white camera and the color camera to capture the current scene simultaneously, to obtain a black-and-white image and a color image of the current scene;
  • a component acquisition module, configured to acquire a luminance component and a chrominance component of the color image;
  • a fusion module, configured to fuse the luminance component and the black-and-white image to obtain a luminance fusion component; and
  • a target color image acquisition module, configured to obtain a target color image according to the luminance fusion component and the chrominance component.
  • In a possible implementation, the fusion module is specifically configured to divide the luminance component into at least two luminance blocks and register the luminance component with the black-and-white image to obtain a registration result, where the registration result includes a black-and-white image block to be fused with each luminance block, and to fuse the luminance component and the black-and-white image according to the registration result to obtain the luminance fusion component.
  • In a further implementation, the fusion module is specifically configured to determine, according to the distance between the black-and-white camera and the color camera, the focal length of the black-and-white camera, and the focal length of the color camera, a search area in the black-and-white image in which the luminance component is to be registered, and to register the luminance component and the black-and-white image within the search area to obtain the registration result.
  • In another possible implementation, the fusion module includes: a high-frequency information acquiring unit, configured to acquire first high-frequency information and first low-frequency information of the luminance component and to acquire second high-frequency information of the black-and-white image; a high-frequency fusion information acquiring unit, configured to fuse the first high-frequency information and the second high-frequency information according to the registration result to obtain high-frequency fusion information; and a luminance fusion component acquiring unit, configured to perform pixel-wise addition on the first low-frequency information and the high-frequency fusion information to obtain the luminance fusion component.
  • In a further implementation, the fusion module further includes an enhancement unit, configured to enhance the high-frequency fusion information to obtain enhanced high-frequency fusion information.
  • In another possible implementation, the image processing apparatus of the terminal further includes a brightness correction module, configured to perform brightness correction on the color image according to the black-and-white image to obtain a brightness-corrected color image.
  • In another possible implementation, the image processing apparatus of the terminal further includes a noise reduction module, configured to denoise the chrominance component to obtain a denoised chrominance component.
  • According to a third aspect, an embodiment of the present invention provides a terminal. The terminal includes a black-and-white camera, a color camera, and an image processor, and the black-and-white camera and the color camera are arranged side by side. The image processor is configured to receive a shooting instruction; control, according to the shooting instruction, the black-and-white camera and the color camera to capture the current scene simultaneously, to obtain a black-and-white image and a color image of the current scene; acquire a luminance component and a chrominance component of the color image; fuse the luminance component and the black-and-white image to obtain a luminance fusion component; and obtain a target color image according to the luminance fusion component and the chrominance component.
  • In a possible implementation, the image processor is specifically configured to divide the luminance component into at least two luminance blocks and register the luminance component with the black-and-white image to obtain a registration result, where the registration result includes a black-and-white image block to be fused with each luminance block, and to fuse the luminance component and the black-and-white image according to the registration result to obtain the luminance fusion component.
  • In a further implementation, the image processor is specifically configured to determine, according to the distance between the black-and-white camera and the color camera, the focal length of the black-and-white camera, and the focal length of the color camera, a search area in the black-and-white image in which the luminance component is to be registered, and to register the luminance component and the black-and-white image within the search area to obtain the registration result.
  • In another possible implementation, the image processor is specifically configured to acquire first high-frequency information and first low-frequency information of the luminance component, acquire second high-frequency information of the black-and-white image, fuse the first high-frequency information and the second high-frequency information according to the registration result to obtain high-frequency fusion information, and perform pixel-wise addition on the first low-frequency information and the high-frequency fusion information to obtain the luminance fusion component.
  • In a further implementation, the image processor is further configured to enhance the high-frequency fusion information to obtain enhanced high-frequency fusion information, and to perform pixel-wise addition on the first low-frequency information and the enhanced high-frequency fusion information to obtain the luminance fusion component.
  • In another possible implementation, the image processor is further configured to perform brightness correction on the color image according to the black-and-white image to obtain a brightness-corrected color image, and to acquire a luminance component and a chrominance component of the brightness-corrected color image.
  • In another possible implementation, the image processor is further configured to denoise the chrominance component to obtain a denoised chrominance component, and to obtain the target color image according to the luminance fusion component and the denoised chrominance component.
  • According to a fourth aspect, an embodiment of the present invention provides a storage medium. The storage medium is a computer-readable storage medium storing one or more programs, and the one or more programs include instructions. When the instructions are executed by a portable electronic device that includes a camera and a plurality of applications, the portable electronic device performs the image processing method in the first aspect or any one of the first to sixth possible implementations of the first aspect, where the camera includes a black-and-white camera and a color camera.
  • FIG. 1 is a schematic structural view of an embodiment of a color filter array in the prior art
  • FIG. 2 is a schematic structural diagram of a terminal according to the present invention.
  • FIG. 3 is a schematic flowchart of Embodiment 1 of an image processing method of a terminal according to the present invention;
  • FIG. 4 is a schematic flowchart of Embodiment 2 of an image processing method of a terminal according to the present invention;
  • FIG. 5 is a schematic flowchart of Embodiment 3 of an image processing method of a terminal according to the present invention.
  • FIG. 6 is a schematic diagram of a principle of a pyramid-based block matching algorithm according to the present invention.
  • FIG. 7 is a schematic flowchart diagram of Embodiment 4 of an image processing method of a terminal according to the present invention.
  • FIG. 8 is a schematic structural diagram of Embodiment 1 of an image processing apparatus of a terminal according to the present invention.
  • FIG. 9 is a schematic structural diagram of Embodiment 2 of an image processing apparatus of a terminal according to the present invention.
  • In view of the technical problem that a color image captured by a conventional single-CCD camera has low resolution and high noise, resulting in poor image quality, an embodiment of the present invention provides an image processing method for a terminal. The terminal includes a black-and-white camera and a color camera that are arranged side by side and capture the current scene simultaneously.
  • The terminal may be a camera, a mobile phone, a tablet, or the like; a camera is used as an example below, which is not a specific limitation.
  • FIG. 2 is a schematic structural diagram of a terminal embodiment of the present invention. Specifically, as shown in FIG. 2, the two cameras are independent of each other, their optical axes are parallel, and they acquire images synchronously, so that the camera can capture a black-and-white image and a color image of the same scene.
  • The image processing method provided by this embodiment fuses the black-and-white image of the scene, which has a high sampling rate of light and low noise, with the color image of the same scene, which has lower resolution and higher noise, to reduce the noise of the color image and improve its resolution.
  • In actual use, the two cameras of the terminal can be configured to work at the same time, and the image processing method provided by the present invention is then applied to obtain the fused target color image; alternatively, a single camera can be configured to work alone, and the color image or black-and-white image of the current scene captured by that single camera is directly stored or output.
  • The image processing method of the present invention is described in detail below using specific embodiments.
  • FIG. 3 is a schematic flowchart of Embodiment 1 of an image processing method of a terminal according to the present invention. The method is executed by an image processing apparatus, which may be implemented by any software and/or hardware; for example, it may be the image processor in FIG. 2, which is disposed in the image capturing device. As shown in FIG. 3, the method includes:
  • Step 301: Receive a shooting instruction, and control the black-and-white camera and the color camera to capture the current scene simultaneously according to the shooting instruction, to obtain a black-and-white image and a color image of the current scene.
  • Step 302: Acquire a luminance component and a chrominance component of the color image.
  • Step 303: Fuse the luminance component and the black-and-white image to obtain a luminance fusion component.
  • Step 304: Obtain a target color image according to the luminance fusion component and the chrominance component.
  • In the embodiments of the present invention, the images and components involved in the image processing are in matrix form and are stored as matrices; when the image processing ends, the images are displayed according to different image format standards.
  • Specifically, in step 301, after receiving the shooting instruction, the image processing apparatus controls the black-and-white camera and the color camera to capture the current scene simultaneously according to the shooting instruction, thereby obtaining a black-and-white image and a color image of the current scene. The shooting instruction may be triggered by the user pressing the shutter button on the camera. Because the black-and-white camera includes only photosensitive elements and no color filter or color filter array, it can receive all of the natural light signal: each photosensitive element records the intensity of all incident light simultaneously, and the light transmittance is high. Therefore, in the same shooting environment the black-and-white image has a higher sampling rate of light and lower noise than the color image. In addition, because each photosensitive element records only light intensity and no color information, the black-and-white image is itself a luminance image.
  • For example, when the black-and-white image M contains 720×480 pixels, it can be represented by a two-dimensional matrix M of size 720×480, where 720 is the total number of columns of the matrix M and 480 is the total number of rows of the matrix M.
  • In this embodiment, the color image output by the color camera is in the red-green-blue (RGB) color space and is represented by the three colors red, green, and blue. For example, when the color image contains 720×480 pixels, it is represented by a three-dimensional matrix C of size 720×480×3, where 720 is the total number of columns of the matrix C, 480 is the total number of rows, and 3 is the total number of pages. Each page is a 720×480 two-dimensional matrix for one color, denoted R, G, and B respectively: the two-dimensional matrix R contains the red-light intensity values of the color image, the two-dimensional matrix G the green-light intensity values, and the two-dimensional matrix B the blue-light intensity values. The final color of each pixel of the color image is jointly determined by the pixel's red-light, green-light, and blue-light intensity values.
  • To obtain the target color image, the black-and-white image needs to be fused with the luminance component of the color image, so the luminance component of the color image must be acquired first. Specifically, the color image in the red-green-blue color space can be converted, according to the color space conversion formulas, into a color image in the luminance-chrominance color space, represented by a luminance component and a chrominance component. The color space conversion formulas are:
  • Y(i,j) = 0.299×R(i,j) + 0.587×G(i,j) + 0.114×B(i,j);
  • U(i,j) = -0.1687×R(i,j) - 0.3313×G(i,j) + 0.5×B(i,j) + 128;
  • V(i,j) = 0.5×R(i,j) - 0.4187×G(i,j) - 0.0813×B(i,j) + 128,
  • where i is the column index of the pixel in the matrix, j is the row index of the pixel in the matrix, i ranges over [0, 720], and j ranges over [0, 360].
  • The converted color image is also a three-dimensional matrix, of size 720×360×3, containing three 720×360 two-dimensional matrices Y, U, and V. The matrix Y stores the luminance values of the pixels of the color image and is the luminance component of the color image; the other two matrices, U and V, store the chrominance values of the pixels and together form the chrominance component of the color image. After the conversion, the final color of each pixel of the color image is jointly determined by the pixel's luminance value and chrominance values.
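  • As an illustration of the conversion above, the sketch below implements the quoted formulas in Python/NumPy. The forward transform follows the equations exactly; the inverse transform (used conceptually in step 304 to go back to RGB) is the standard JPEG-style YCbCr inverse and is an assumption, since the description does not spell out the back-conversion.

```python
import numpy as np

def rgb_to_yuv(rgb):
    """Convert an H x W x 3 RGB image (float values in 0-255) to Y, U, V
    planes using the conversion formulas quoted in the description."""
    R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    Y = 0.299 * R + 0.587 * G + 0.114 * B
    U = -0.1687 * R - 0.3313 * G + 0.5 * B + 128
    V = 0.5 * R - 0.4187 * G - 0.0813 * B + 128
    return Y, U, V

def yuv_to_rgb(Y, U, V):
    """Inverse transform (standard JPEG/YCbCr inverse; assumed, not quoted)."""
    R = Y + 1.402 * (V - 128)
    G = Y - 0.344136 * (U - 128) - 0.714136 * (V - 128)
    B = Y + 1.772 * (U - 128)
    return np.clip(np.stack([R, G, B], axis=-1), 0, 255)
```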
  • In one specific fusion process of step 303, the luminance value of each pixel in the two-dimensional matrix Y representing the luminance component and the luminance value of the pixel at the same position in the two-dimensional matrix M representing the black-and-white image are averaged, and the average is used as the fused value; from these fused values a luminance fusion matrix L1 of size 720×360, that is, the luminance fusion component, is obtained.
  • In step 304, the target color image is obtained from the luminance fusion component and the chrominance component: specifically, the two-dimensional matrices U and V representing the chrominance component and the luminance fusion matrix L1 are combined as the pages of one matrix, and the resulting three-dimensional matrix C2 is the target color image. The camera finally stores or outputs the target color image.
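  • A minimal end-to-end sketch of this embodiment, using the helpers above, is shown below. The array names rgb (the color image C) and mono (the black-and-white image M) are placeholders, and the snippet assumes the two images are already the same size and aligned, which the later embodiments handle explicitly through registration.

```python
import numpy as np

Y, U, V = rgb_to_yuv(rgb.astype(np.float64))
L1 = 0.5 * (Y + mono.astype(np.float64))   # step 303: per-pixel average of Y and M
target = yuv_to_rgb(L1, U, V)              # step 304: recombine L1 with U and V
```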
  • The image processing method provided by this embodiment of the invention acquires the luminance component and the chrominance component of the color image, fuses the black-and-white image with the luminance component to obtain a luminance fusion component, and finally obtains the target color image from the chrominance component and the luminance fusion component. Because the black-and-white image has high resolution and low noise, fusing it into the color image gives the final target color image the same advantages, improving image quality compared with the color image captured by the color camera alone.
  • FIG. 4 is a schematic flowchart of Embodiment 2 of an image processing method of a terminal according to the present invention. As shown in FIG. 4, the method includes:
  • Step 401: Divide the luminance component into at least two luminance blocks, and register the luminance component with the black-and-white image to obtain a registration result, where the registration result includes a black-and-white image block to be fused with each luminance block.
  • Step 402: Fuse the luminance component and the black-and-white image according to the registration result to obtain a luminance fusion component.
  • Specifically, the luminance component is first divided into at least two luminance blocks, and the luminance component and the black-and-white image are then registered to obtain a registration result. The registration processing of the luminance component and the black-and-white image may specifically be block matching: for each luminance block of the luminance component, a black-and-white image block to be fused with it is determined in the black-and-white image.
  • From the registration result, the second pixel in the black-and-white image fused with each first pixel of the luminance component can also be determined. Specifically, the block matching process finds, for a luminance block E in the luminance component, the black-and-white image block F with the smallest pixel difference; for a first pixel and the second pixel that can be fused with it, the position of the first pixel in the luminance block E is the same as the position of the second pixel in the black-and-white image block F. Here the position of a pixel refers to its row and column in the pixel array of the image to which it belongs.
  • The registration result further includes a first fusion weight for each first pixel in the luminance component and a second fusion weight for each second pixel in the black-and-white image. A preferred fusion process is, for any first pixel X of the luminance component, to fuse the first pixel X and the second pixel Y according to the luminance value X1 of the first pixel X, its first fusion weight a, the luminance value Y1 of the second pixel Y in the black-and-white image that is fused with the first pixel X, and the second fusion weight b of the second pixel Y, thereby obtaining the luminance fusion component.
  • FIG. 5 is a schematic flowchart of Embodiment 3 of the image processing method of the terminal of the present invention, including:
  • Step 501: Divide the luminance component into at least two luminance blocks, and perform a search in the black-and-white image for each luminance block to obtain the black-and-white image block with the smallest pixel difference from that luminance block.
  • Step 502: Obtain the fusion weight of each luminance block and the fusion weight of its black-and-white image block according to the pixel difference between them, where the pixel difference is inversely proportional to the fusion weight of the black-and-white image block.
  • Step 503: Obtain, according to the fusion weight of each luminance block and the fusion weight of its black-and-white image block, the first fusion weight of each first pixel in the luminance component and the second fusion weight of each second pixel fused with each first pixel, where the position of a first pixel in its luminance block is the same as the position of the corresponding second pixel in the black-and-white image block.
  • Specifically, the luminance component is divided into at least two luminance blocks; the division may be even or uneven. Suppose a luminance block E contains m×n pixels, where m and n are positive integers. A search box of size m×n pixels is moved over the black-and-white image to obtain a number of black-and-white image blocks of size m×n, and the luminance block E is then compared with each of these blocks to obtain a pixel difference. Specifically, the luminance values of the pixels at corresponding positions in the luminance block E and the black-and-white image block are subtracted to obtain m×n differences, and the m×n differences are summed to obtain the pixel difference.
  • The black-and-white image block F with the smallest pixel difference from the luminance block E is then determined. Those skilled in the art will understand that the smaller the pixel difference, the more similar the luminance block E and the black-and-white image block F are; a pixel difference of 0 indicates that the luminance block E is identical to the black-and-white image block F.
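  • A minimal sketch of this block search is given below. The description only says that the per-pixel differences are summed; taking the absolute difference (a sum of absolute differences) is an assumption made here so that positive and negative differences do not cancel.

```python
import numpy as np

def match_block(E, mono, search_area):
    """Slide an m x n window over `search_area` of the black-and-white image
    (given as (row0, row1, col0, col1)) and return the top-left corner and
    score of the block with the smallest pixel difference from the luminance
    block E."""
    m, n = E.shape
    r0, r1, c0, c1 = search_area
    E = E.astype(np.float64)
    best_score, best_pos = np.inf, (r0, c0)
    for r in range(r0, r1 - m + 1):
        for c in range(c0, c1 - n + 1):
            diff = np.abs(E - mono[r:r + m, c:c + n]).sum()
            if diff < best_score:
                best_score, best_pos = diff, (r, c)
    return best_pos, best_score
```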
  • FIG. 6 is a schematic diagram of the principle of the pyramid-based block matching algorithm of the present invention. As shown in FIG. 6, the pyramid includes layer 0, layer 1, and layer 2 as an example. In layer 2, the luminance component containing P×Q pixels is divided into four luminance blocks: block 1, block 2, block 3, and block 4. In layer 1, the four luminance blocks of layer 2 (block 1, block 2, block 3, and block 4) are subdivided; for example, each is divided into four luminance blocks, so that block 1 includes block 11, block 12, block 13, and block 14. In layer 0, the luminance blocks of layer 1 are subdivided again; for example, block 11 is divided into block 111, block 112, block 113, and block 114.
  • Matching then proceeds from coarse to fine. First, block 1 of layer 2 is matched in the black-and-white image, and the black-and-white image block with the smallest pixel difference, block 01, is determined. Next, any luminance block of layer 1 belonging to block 1, such as block 11, is matched; at this point it is only necessary to search within block 01 of the black-and-white image to determine the black-and-white image block with the smallest pixel difference, block 011. Finally, any luminance block of layer 0 belonging to block 11, such as block 111, is matched; now it is only necessary to search within block 011 to determine the black-and-white image block with the smallest pixel difference, block 0111.
  • The way the above pyramid is built is merely an exemplary division and is not intended to limit the invention. This coarse-to-fine, pyramid-based block matching significantly reduces the search area of the search box, speeds up matching, and improves matching accuracy.
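  • The coarse-to-fine search can be sketched as repeated calls to the match_block() helper above, each time restricting the search area to the region matched at the previous, coarser layer. The block positions, the three halving levels, and the array names Y and mono follow the FIG. 6 example and are illustrative assumptions.

```python
def refine(Y, mono, block_pos, block_size, search_area):
    """Match one luminance block inside `search_area` and return the matched
    black-and-white region as the search area for the next, finer layer."""
    r, c = block_pos
    m, n = block_size
    (br, bc), _ = match_block(Y[r:r + m, c:c + n], mono, search_area)
    return (br, br + m, bc, bc + n)

Hy, Wy = Y.shape
Hm, Wm = mono.shape
area = (0, Hm, 0, Wm)                                     # layer 2: search the whole image
area = refine(Y, mono, (0, 0), (Hy // 2, Wy // 2), area)  # block 1   -> block 01
area = refine(Y, mono, (0, 0), (Hy // 4, Wy // 4), area)  # block 11  -> block 011
area = refine(Y, mono, (0, 0), (Hy // 8, Wy // 8), area)  # block 111 -> block 0111
```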
  • After the black-and-white image block F fused with a luminance block E is determined, the fusion weights are set such that the pixel difference is inversely proportional to the fusion weight of the black-and-white image block F, and the fusion weight of the black-and-white image block F plus the fusion weight of the luminance block E equals 1. The first fusion weight of any first pixel in the luminance block E is the fusion weight of the luminance block E, and the second fusion weight of any second pixel in the black-and-white image block F is the fusion weight of the black-and-white image block F.
  • Preferably, the fusion weight of a pixel on the edge of a luminance block may be obtained by averaging the fusion weights of the luminance blocks to which all of its neighboring pixels belong; the fusion weights of edge pixels in the black-and-white image block F are handled similarly and are not described again.
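  • The description states only that the pixel difference is inversely related to the weight of the black-and-white block and that the two block weights sum to 1; the concrete mapping below, and its scale parameter, are illustrative assumptions.

```python
def block_fusion_weights(pixel_diff, scale=1.0):
    """Map the pixel difference between a luminance block E and its matched
    black-and-white block F to the two block weights: the larger the
    difference, the smaller F's weight, and the two weights sum to 1."""
    weight_F = 1.0 / (1.0 + scale * pixel_diff)
    weight_E = 1.0 - weight_F
    return weight_E, weight_F
```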
  • Before the block matching, a search area for the registration processing may also be determined in the black-and-white image. This can be achieved by the following feasible implementations. In one feasible implementation, the search area for the registration processing is determined in the black-and-white image for the luminance component according to the distance between the black-and-white camera and the color camera, the focal length of the black-and-white camera, and the focal length of the color camera, and the luminance component and the black-and-white image are registered within the search area to obtain the registration result. From these quantities, the offset between the positions of the same content in the black-and-white image and in the color image can be determined, and the search area of the block matching search is then determined from the offset, which reduces the search domain of the block matching process and improves block matching speed and accuracy.
  • Specifically, the offset may be determined by querying an offset table that records the correspondence between the distance between the two cameras, the respective focal lengths, and the offsets.
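  • A sketch of this table-driven variant is given below. The table contents, its keys, and the margin added around the offset are all hypothetical; the description only says that an offset table keyed by the camera distance and the two focal lengths exists.

```python
# Hypothetical offset table: (baseline_mm, mono_focal_mm, color_focal_mm) -> pixel offset
OFFSET_TABLE = {
    (10.0, 3.8, 3.8): 24,
    (12.0, 3.8, 4.2): 31,
}

def search_area_for_block(block_pos, block_size, mono_shape,
                          baseline_mm, f_mono_mm, f_color_mm, margin=8):
    """Centre the search area on the block position shifted by the tabulated
    offset, pad it by a small margin, and clip it to the image bounds."""
    dx = OFFSET_TABLE[(baseline_mm, f_mono_mm, f_color_mm)]
    r, c = block_pos
    m, n = block_size
    H, W = mono_shape
    r0, r1 = max(0, r - margin), min(H, r + m + margin)
    c0, c1 = max(0, c + dx - margin), min(W, c + dx + n + margin)
    return (r0, r1, c0, c1)
```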
  • In another feasible implementation, feature point matching is performed on the black-and-white image and the color image to obtain the positional correspondence between the feature points of the two images. The offset between the positions of the same content in the two images is determined from this correspondence, and the search area of the block matching search is then determined from the offset, again reducing the search domain of the block matching process and improving block matching speed and accuracy.
  • FIG. 7 is a schematic flowchart of Embodiment 4 of an image processing method of a terminal of the present invention, including:
  • Step 701: Acquire first high-frequency information and first low-frequency information of the luminance component, and acquire second high-frequency information of the black-and-white image.
  • Step 702: Fuse the first high-frequency information and the second high-frequency information according to the registration result to obtain high-frequency fusion information.
  • Step 703: Perform pixel-wise addition on the first low-frequency information and the high-frequency fusion information to obtain a luminance fusion component.
  • Specifically, the luminance component is low-pass filtered so that only the low-frequency information in the luminance component passes through the low-pass filter; this yields the low-frequency information of the luminance component, that is, the first low-frequency information. The low-frequency value of each pixel of the first low-frequency information is then subtracted from the luminance value of the corresponding pixel of the luminance component before filtering, which yields the high-frequency information of the luminance component, that is, the first high-frequency information. The second high-frequency information of the black-and-white image is obtained in the same way and is not described again.
  • The first high-frequency information and the second high-frequency information are then fused to obtain high-frequency fusion information; the specific fusion method may be the same as the fusion method in the foregoing embodiment and is not repeated here.
  • Preferably, the high-frequency fusion information may be enhanced: the high-frequency value of each pixel in the high-frequency fusion information is multiplied by an enhancement coefficient to amplify it. The degree of enhancement can be set according to the sensitivity of the image currently captured by the image pickup apparatus. For example, when the sensitivity is low, the noise is small and a higher degree of enhancement can be used to improve sharpness; when the sensitivity is high, a lower degree of enhancement can be used to prevent visible noise.
  • Finally, the first low-frequency information of the luminance component and the high-frequency fusion information are added pixel by pixel to obtain the luminance fusion component.
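  • A compact sketch of this embodiment is shown below. A Gaussian filter stands in for the unspecified low-pass filter, the per-pixel weight maps weights_Y and weights_M are assumed to come from the registration step, the high-frequency planes are assumed to be already aligned, and the gain parameter corresponds to the sensitivity-dependent enhancement coefficient.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def split_frequencies(img, sigma=3.0):
    """Low-pass filter the image and take the residual as its high-frequency
    information (the description does not name a specific low-pass filter)."""
    low = gaussian_filter(img.astype(np.float64), sigma)
    return low, img - low

def fuse_luminance_embodiment4(Y, mono, weights_Y, weights_M, gain=1.0):
    """Blend the high-frequency parts of the luminance component and the
    black-and-white image with the registration weights, optionally amplify
    the blend, and add it back onto the low-frequency part of Y."""
    low_Y, high_Y = split_frequencies(Y)
    _, high_M = split_frequencies(mono)
    high_fused = weights_Y * high_Y + weights_M * high_M
    return low_Y + gain * high_fused        # luminance fusion component
```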
  • In this embodiment, the luminance component and the black-and-white image are each low-pass filtered to obtain their high-frequency and low-frequency information, and the high-frequency information of the black-and-white image is fused with that of the luminance component to obtain high-frequency fusion information, so that the luminance fusion component incorporates the high-frequency information of the black-and-white image. Because the light transmittance of the black-and-white camera differs from that of the color camera, the brightness of the two images is inconsistent, that is, one image is brighter than the other overall, or some content of one image is brighter or darker than in the other. High-frequency information mainly records the edges and details of the image and does not carry this brightness information, and the human eye is more sensitive to the edges and details in an image. Therefore, this embodiment fuses the high-frequency information of the luminance component with the high-frequency information of the black-and-white image, which preserves the low noise and high resolution of the black-and-white image, avoids the brightness inconsistency problem described above, and improves the fusion effect between the black-and-white image and the luminance component.
  • Preferably, before the luminance component and the chrominance component of the color image are acquired, brightness correction may be performed on the color image according to the black-and-white image; the luminance component and the chrominance component of the brightness-corrected color image are then acquired and fused as described in any of the foregoing embodiments to obtain the target color image.
  • The brightness correction may specifically be: comparing the average brightness of the black-and-white image and of the color image to obtain a brightness correction coefficient for the color image, and multiplying the luminance component of the color image by the correction coefficient to adjust its brightness; alternatively, the color components of the color image may each be multiplied by preset correction coefficients corresponding to the respective color components. This gives the color image and the black-and-white image consistent brightness and thereby improves the accuracy of the subsequent registration processing.
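  • A sketch of the first brightness-correction variant is shown below. Using the ratio of the two mean brightness values as the correction coefficient is an assumption (the text only says the average brightness values are compared); rgb_to_yuv and yuv_to_rgb refer to the earlier conversion sketch.

```python
def correct_brightness(rgb, mono):
    """Scale the colour image's luminance so that its mean matches the mean
    brightness of the black-and-white image."""
    Y, U, V = rgb_to_yuv(rgb.astype("float64"))
    coeff = mono.mean() / max(Y.mean(), 1e-9)   # avoid division by zero
    return yuv_to_rgb(Y * coeff, U, V)
```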
  • Preferably, the chrominance component may be denoised to obtain a denoised chrominance component, and the target color image is then obtained from the luminance fusion component and the denoised chrominance component.
  • Specifically, the chrominance component may be denoised with a joint bilateral filtering algorithm or a guided filtering algorithm. Because the chrominance component contains little edge information, the edge information in the luminance fusion component is used to set the filtering factor of the joint bilateral filtering algorithm or the guided filtering algorithm, so that after filtering the edge information in the chrominance component is preserved while the noise of the chrominance component is removed, improving image quality.
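  • The sketch below denoises one chrominance plane with a plain single-channel guided filter, using the luminance fusion component as the guide so that edges present in the guide are preserved. Guided filtering is one of the two filters named in the text; the radius and the regularization eps are illustrative values for 8-bit data.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_denoise(chroma, guide, radius=8, eps=(0.1 * 255) ** 2):
    """Guided filtering of a chrominance plane (U or V) with the luminance
    fusion component as the guide image."""
    box = lambda x: uniform_filter(x, size=2 * radius + 1)
    I, p = guide.astype(np.float64), chroma.astype(np.float64)
    mean_I, mean_p = box(I), box(p)
    cov_Ip = box(I * p) - mean_I * mean_p
    var_I = box(I * I) - mean_I * mean_I
    a = cov_Ip / (var_I + eps)          # per-pixel linear coefficients
    b = mean_p - a * mean_I
    return box(a) * I + box(b)          # filtered chrominance plane
```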
  • In addition, the images captured by the camera are strongly affected by the objective environment and by the camera's components and may contain considerable noise; the weaker the illumination, the larger the noise in the image. Preferably, multiple pairs of black-and-white images and color images may be captured first and then processed with the above method to obtain a higher-quality color image.
  • FIG. 8 is a schematic structural diagram of Embodiment 1 of an image processing apparatus of a terminal according to the present invention. As shown in FIG. 8, the apparatus includes:
  • an input image acquisition module 801, configured to receive a shooting instruction and control, according to the shooting instruction, the black-and-white camera and the color camera to capture the current scene simultaneously, to obtain a black-and-white image and a color image of the current scene;
  • a component acquisition module 802, configured to acquire a luminance component and a chrominance component of the color image;
  • a fusion module 803, configured to fuse the luminance component and the black-and-white image to obtain a luminance fusion component; and
  • a target color image acquisition module 804, configured to obtain a target color image according to the luminance fusion component and the chrominance component.
  • In this apparatus, the component acquisition module splits the color image into a luminance component and a chrominance component, the fusion module then fuses the black-and-white image with the luminance component to obtain a luminance fusion component, and the target color image acquisition module finally combines the chrominance component and the luminance fusion component to obtain the target color image. Because the black-and-white image has high resolution and low noise, fusing it into the color image gives the final target color image the same advantages, improving image quality compared with the color image captured by the color camera alone.
  • The fusion module is specifically configured to divide the luminance component into at least two luminance blocks and register the luminance component with the black-and-white image to obtain a registration result, where the registration result includes a black-and-white image block to be fused with each luminance block, and to fuse the luminance component and the black-and-white image according to the registration result to obtain the luminance fusion component.
  • The fusion module is specifically configured to determine, according to the distance between the black-and-white camera and the color camera, the focal length of the black-and-white camera, and the focal length of the color camera, a search area in the black-and-white image in which the luminance component is to be registered, and to register the luminance component and the black-and-white image within the search area to obtain the registration result.
  • FIG. 9 is a schematic structural diagram of Embodiment 2 of an image processing apparatus of a terminal according to the present invention. As shown in FIG. 9, the fusion module 803 includes:
  • a high-frequency information acquiring unit 901, configured to acquire first high-frequency information and first low-frequency information of the luminance component and to acquire second high-frequency information of the black-and-white image;
  • a high-frequency fusion information acquiring unit 902, configured to fuse the first high-frequency information and the second high-frequency information according to the registration result to obtain high-frequency fusion information; and
  • a luminance fusion component acquiring unit 903, configured to perform pixel-wise addition on the first low-frequency information and the high-frequency fusion information to obtain the luminance fusion component.
  • Optionally, the fusion module 803 further includes an enhancement unit, configured to enhance the high-frequency fusion information to obtain enhanced high-frequency fusion information; in this case the luminance fusion component acquiring unit 903 is specifically configured to perform pixel-wise addition on the first low-frequency information and the enhanced high-frequency fusion information to obtain the luminance fusion component.
  • Optionally, the image processing apparatus of the terminal further includes a brightness correction module, configured to perform brightness correction on the color image according to the black-and-white image to obtain a brightness-corrected color image; in this case the component acquisition module 802 is specifically configured to acquire the luminance component and the chrominance component of the brightness-corrected color image.
  • Optionally, the image processing apparatus of the terminal further includes a noise reduction module, configured to denoise the chrominance component to obtain a denoised chrominance component; in this case the target color image acquisition module 804 is specifically configured to obtain the target color image according to the luminance fusion component and the denoised chrominance component.
  • Another aspect of the present invention provides a terminal, including a black and white camera, a color camera, and an image processor.
  • the black and white camera and the color camera are arranged side by side.
  • the terminal is used to execute the image processing method in any of the above embodiments, and has the same technical features and technical effects.
  • The image processor of the terminal is configured to receive a shooting instruction; control, according to the shooting instruction, the black-and-white camera and the color camera to capture the current scene simultaneously, to obtain a black-and-white image and a color image of the current scene; acquire a luminance component and a chrominance component of the color image; fuse the luminance component and the black-and-white image to obtain a luminance fusion component; and obtain a target color image according to the luminance fusion component and the chrominance component.
  • The image processor is specifically configured to divide the luminance component into at least two luminance blocks and register the luminance component with the black-and-white image to obtain a registration result, where the registration result includes a black-and-white image block to be fused with each luminance block, and to fuse the luminance component and the black-and-white image according to the registration result to obtain the luminance fusion component.
  • The image processor is specifically configured to determine, according to the distance between the black-and-white camera and the color camera, the focal length of the black-and-white camera, and the focal length of the color camera, a search area in the black-and-white image in which the luminance component is to be registered, and to register the luminance component and the black-and-white image within the search area to obtain the registration result.
  • The image processor is specifically configured to acquire first high-frequency information and first low-frequency information of the luminance component, acquire second high-frequency information of the black-and-white image, fuse the first high-frequency information and the second high-frequency information according to the registration result to obtain high-frequency fusion information, and perform pixel-wise addition on the first low-frequency information and the high-frequency fusion information to obtain the luminance fusion component.
  • The image processor is further configured to enhance the high-frequency fusion information to obtain enhanced high-frequency fusion information, and to perform pixel-wise addition on the first low-frequency information and the enhanced high-frequency fusion information to obtain the luminance fusion component.
  • the image processor is further configured to perform brightness correction processing on the color image according to the black and white image to obtain a brightness corrected color image; and obtain a luminance component and a chrominance component of the brightness corrected color image.
  • the image processor is further configured to perform noise reduction processing on the chrominance component to obtain a chrominance component after the noise reduction processing; and obtain a target color image according to the luminance fusion component and the chrominance component after the noise reduction processing.
  • A storage medium is further provided, which is a computer-readable storage medium storing one or more programs, and the one or more programs include instructions. When the instructions are executed by a portable electronic device that includes a camera and a plurality of applications, the portable electronic device performs the image processing method in any of the foregoing method embodiments, where the camera includes a black-and-white camera and a color camera.
  • the disclosed systems, devices, and methods may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • The division into units is only a division by logical function; in actual implementation there may be other division manners, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the functions may be stored in a computer readable storage medium if implemented in the form of a software functional unit and sold or used as a standalone product.
  • The technical solutions of the present invention essentially, or the part contributing to the prior art, or a part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present invention.
  • The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Abstract

An embodiment of the present invention provides an image processing method and apparatus for a terminal, and a terminal. The terminal includes a black-and-white camera and a color camera, and the black-and-white camera and the color camera are arranged side by side. The method includes: receiving a shooting instruction, and controlling, according to the shooting instruction, the black-and-white camera and the color camera to capture the current scene simultaneously, to obtain a black-and-white image and a color image of the current scene; acquiring a luminance component and a chrominance component of the color image; fusing the luminance component and the black-and-white image to obtain a luminance fusion component; and obtaining a target color image according to the luminance fusion component and the chrominance component. The image processing method and apparatus for a terminal and the terminal provided by the embodiments of the present invention fuse the black-and-white image into the color image, so that the target color image obtained after fusion has high resolution and low noise, improving image quality compared with the color image captured by the color camera.

Description

Image processing method and apparatus for terminal, and terminal
Technical Field
The present invention relates to image processing technologies, and in particular to an image processing method and apparatus for a terminal, and to a terminal.
Background
To enable a camera to capture color images, a charge-coupled device (CCD) is usually installed in the camera module. The CCD is a special semiconductor device that outputs the brightness of light of different colors to an image processor as electrical signals, and the image processor finally generates a color image.
FIG. 1 is a schematic diagram of the working principle of a CCD in the prior art. The CCD is provided with a color filter array (CFA) and an array of photosensitive elements disposed under the color filter array. The color filter array includes filters of three colors (red, green, and blue) arranged in a preset order. White natural light contains light of seven colors: red, orange, yellow, green, blue, indigo, and violet. The color filter array passes only the red, green, and blue light in natural light through the filters to the photosensitive element array, and the photosensitive element array records the intensity of the received light. As shown in FIG. 1, each color filter corresponds to one photosensitive element, each photosensitive element corresponds to one pixel in the image, and the position of each pixel in the image is the same as the position of the corresponding photosensitive element in the photosensitive element array. Taking a green filter in the color filter array as an example, because the green filter passes only green light, only the green light in the incident natural light signal (FIG. 1 shows only red, green, and blue light) can reach the photosensitive element, so the true color of that pixel cannot be obtained directly. In this case, the image processor usually uses a demosaicing algorithm to synthesize the true color of the pixel: specifically, for each primary color missing at a pixel, the average of the digital signals of the neighboring pixels that have that primary color is used as the digital signal of that primary color at the pixel, and the true color of the pixel is finally obtained.
However, each color filter in the color filter array can pass light of only one color and blocks light of the other colors, so that only part of the light reaches the photosensitive elements, which makes the image noisy. At the same time, it lowers the sampling rate of light, reducing the resolution of the image, that is, the light information contained in all pixels of the image becomes smaller; as shown in FIG. 1, only half of the photosensitive elements in the photosensitive element array can receive green light, which ultimately affects image quality.
Summary
Embodiments of the present invention provide an image processing method and apparatus for a terminal, and a terminal, to solve the technical problem that a color image captured by an existing single-CCD camera has low resolution and high noise, resulting in poor image quality.
According to a first aspect, an embodiment of the present invention provides an image processing method for a terminal. The terminal includes a black-and-white camera and a color camera, and the black-and-white camera and the color camera are arranged side by side. The method includes:
receiving a shooting instruction, and controlling, according to the shooting instruction, the black-and-white camera and the color camera to capture the current scene simultaneously, to obtain a black-and-white image and a color image of the current scene; acquiring a luminance component and a chrominance component of the color image; fusing the luminance component and the black-and-white image to obtain a luminance fusion component; and obtaining a target color image according to the luminance fusion component and the chrominance component.
The above method exploits the high resolution and low noise of the black-and-white image and fuses the black-and-white image into the color image, so that the target color image finally obtained after fusion has high resolution and low noise, improving image quality compared with the color image captured by the color camera.
With reference to the first aspect, in a first possible implementation of the first aspect, the fusion process specifically includes:
dividing the luminance component into at least two luminance blocks, and registering the luminance component with the black-and-white image to obtain a registration result, where the registration result includes a black-and-white image block to be fused with each luminance block; and fusing the luminance component and the black-and-white image according to the registration result to obtain the luminance fusion component.
With reference to the first possible implementation of the first aspect, in a second possible implementation of the first aspect, the registration process specifically includes:
determining, according to the distance between the black-and-white camera and the color camera, the focal length of the black-and-white camera, and the focal length of the color camera, a search area in the black-and-white image in which the luminance component is to be registered; and registering the luminance component and the black-and-white image within the search area to obtain the registration result.
By reducing the search domain used during registration, the above method improves the speed and accuracy of registration and thereby improves the quality of the subsequent image fusion.
With reference to the first aspect, in a third possible implementation of the first aspect, fusing the luminance component and the black-and-white image to obtain the luminance fusion component includes:
acquiring first high-frequency information and first low-frequency information of the luminance component, and acquiring second high-frequency information of the black-and-white image; fusing the first high-frequency information and the second high-frequency information to obtain high-frequency fusion information; and performing pixel-wise addition on the first low-frequency information and the high-frequency fusion information to obtain the luminance fusion component.
High-frequency information mainly records the edges and details of an image; by fusing the high-frequency component of the luminance component with the high-frequency component of the black-and-white image, the fusion effect between the black-and-white image and the luminance component is improved.
With reference to the third possible implementation of the first aspect, in a fourth possible implementation of the first aspect, before the pixel-wise addition of the first low-frequency information image and the high-frequency fused image to obtain the luminance fused image, the method further includes:
enhancing the high-frequency fusion information to obtain enhanced high-frequency fusion information.
The above enhancement makes the information contained in the high-frequency fusion information clearer in the luminance fusion component.
With reference to the first aspect or any one of the first to fourth possible implementations of the first aspect, in a fifth possible implementation of the first aspect, before the color image is decomposed into the luminance component and the chrominance component, the method further includes:
performing brightness correction on the color image according to the black-and-white image to obtain a brightness-corrected color image.
With reference to the first aspect or any one of the first to fifth possible implementations of the first aspect, in a sixth possible implementation of the first aspect, before the target color image is obtained according to the luminance fusion component and the chrominance component, the method further includes:
denoising the chrominance component to obtain a denoised chrominance component.
The brightness-consistency correction and the noise reduction can further improve the quality of the target color image.
The following describes an image processing apparatus for a terminal according to an embodiment of the present invention. The apparatus corresponds one-to-one with the method, is used to implement the image processing method in the foregoing embodiments, and has the same technical features and technical effects, which are not described again herein.
According to a second aspect, an embodiment of the present invention provides an image processing apparatus for a terminal, where the terminal includes a black-and-white camera and a color camera, and the black-and-white camera and the color camera are arranged side by side. The apparatus includes:
an input image acquisition module, configured to receive a shooting instruction and control, according to the shooting instruction, the black-and-white camera and the color camera to capture the current scene simultaneously, to obtain a black-and-white image and a color image of the current scene;
a component acquisition module, configured to acquire a luminance component and a chrominance component of the color image;
a fusion module, configured to fuse the luminance component and the black-and-white image to obtain a luminance fusion component; and
a target color image acquisition module, configured to obtain a target color image according to the luminance fusion component and the chrominance component.
With reference to the second aspect, in a first possible implementation of the second aspect, the fusion module is specifically configured to divide the luminance component into at least two luminance blocks and register the luminance component with the black-and-white image to obtain a registration result, where the registration result includes a black-and-white image block to be fused with each luminance block, and to fuse the luminance component and the black-and-white image according to the registration result to obtain the luminance fusion component.
With reference to the first possible implementation of the second aspect, in a second possible implementation of the second aspect, the fusion module is specifically configured to determine, according to the distance between the black-and-white camera and the color camera, the focal length of the black-and-white camera, and the focal length of the color camera, a search area in the black-and-white image in which the luminance component is to be registered, and to register the luminance component and the black-and-white image within the search area to obtain the registration result.
With reference to the second aspect, in a third possible implementation of the second aspect, the fusion module includes:
a high-frequency information acquiring unit, configured to acquire first high-frequency information and first low-frequency information of the luminance component and to acquire second high-frequency information of the black-and-white image;
a high-frequency fusion information acquiring unit, configured to fuse the first high-frequency information and the second high-frequency information according to the registration result to obtain high-frequency fusion information; and
a luminance fusion component acquiring unit, configured to perform pixel-wise addition on the first low-frequency information and the high-frequency fusion information to obtain the luminance fusion component.
With reference to the third possible implementation of the second aspect, in a fourth possible implementation of the second aspect, the fusion module further includes:
an enhancement unit, configured to enhance the high-frequency fusion information to obtain enhanced high-frequency fusion information.
With reference to the second aspect or any one of the first to fourth possible implementations of the second aspect, in a fifth possible implementation of the second aspect, the image processing apparatus of the terminal further includes:
a brightness correction module, configured to perform brightness correction on the color image according to the black-and-white image to obtain a brightness-corrected color image.
With reference to the second aspect or any one of the first to fifth possible implementations of the second aspect, in a sixth possible implementation of the second aspect, the image processing apparatus of the terminal further includes:
a noise reduction module, configured to denoise the chrominance component to obtain a denoised chrominance component.
The following describes a terminal according to an embodiment of the present invention. The terminal corresponds one-to-one with the method, is used to implement the image processing method in the foregoing embodiments, and has the same technical features and technical effects, which are not described again herein.
According to a third aspect, an embodiment of the present invention provides a terminal. The terminal includes a black-and-white camera, a color camera, and an image processor, and the black-and-white camera and the color camera are arranged side by side.
The image processor is configured to receive a shooting instruction; control, according to the shooting instruction, the black-and-white camera and the color camera to capture the current scene simultaneously, to obtain a black-and-white image and a color image of the current scene; acquire a luminance component and a chrominance component of the color image; fuse the luminance component and the black-and-white image to obtain a luminance fusion component; and obtain a target color image according to the luminance fusion component and the chrominance component.
With reference to the third aspect, in a first possible implementation of the third aspect, the image processor is specifically configured to divide the luminance component into at least two luminance blocks and register the luminance component with the black-and-white image to obtain a registration result, where the registration result includes a black-and-white image block to be fused with each luminance block, and to fuse the luminance component and the black-and-white image according to the registration result to obtain the luminance fusion component.
With reference to the first possible implementation of the third aspect, in a second possible implementation of the third aspect, the image processor is specifically configured to determine, according to the distance between the black-and-white camera and the color camera, the focal length of the black-and-white camera, and the focal length of the color camera, a search area in the black-and-white image in which the luminance component is to be registered, and to register the luminance component and the black-and-white image within the search area to obtain the registration result.
With reference to the third aspect, in a third possible implementation of the third aspect, the image processor is specifically configured to acquire first high-frequency information and first low-frequency information of the luminance component, acquire second high-frequency information of the black-and-white image, fuse the first high-frequency information and the second high-frequency information according to the registration result to obtain high-frequency fusion information, and perform pixel-wise addition on the first low-frequency information and the high-frequency fusion information to obtain the luminance fusion component.
With reference to the third possible implementation of the third aspect, in a fourth possible implementation of the third aspect, the image processor is further configured to enhance the high-frequency fusion information to obtain enhanced high-frequency fusion information, and to perform pixel-wise addition on the first low-frequency information and the enhanced high-frequency fusion information to obtain the luminance fusion component.
With reference to the third aspect or any one of the first to fourth possible implementations of the third aspect, in a fifth possible implementation of the third aspect, the image processor is further configured to perform brightness correction on the color image according to the black-and-white image to obtain a brightness-corrected color image, and to acquire a luminance component and a chrominance component of the brightness-corrected color image.
With reference to the third aspect or any one of the first to fifth possible implementations of the third aspect, in a sixth possible implementation of the third aspect, the image processor is further configured to denoise the chrominance component to obtain a denoised chrominance component, and to obtain the target color image according to the luminance fusion component and the denoised chrominance component.
The following describes a storage medium according to an embodiment of the present invention. The storage medium is used to implement the image processing method in the first aspect or any one of the first to sixth possible implementations of the first aspect, and has the same technical features and technical effects, which are not described again herein.
According to a fourth aspect, an embodiment of the present invention provides a storage medium. The storage medium is a computer-readable storage medium storing one or more programs, and the one or more programs include instructions. When the instructions are executed by a portable electronic device that includes a camera and a plurality of applications, the portable electronic device performs the image processing method in the first aspect or any one of the first to sixth possible implementations of the first aspect, where the camera includes a black-and-white camera and a color camera.
Brief Description of Drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments or the prior art. Apparently, the accompanying drawings in the following description show some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from these accompanying drawings without creative effort.
FIG. 1 is a schematic structural diagram of a color filter array embodiment in the prior art;
FIG. 2 is a schematic structural diagram of a terminal according to the present invention;
FIG. 3 is a schematic flowchart of Embodiment 1 of the image processing method of a terminal according to the present invention;
FIG. 4 is a schematic flowchart of Embodiment 2 of the image processing method of a terminal according to the present invention;
FIG. 5 is a schematic flowchart of Embodiment 3 of the image processing method of a terminal according to the present invention;
FIG. 6 is a schematic diagram of the principle of the pyramid-based block matching algorithm according to the present invention;
FIG. 7 is a schematic flowchart of Embodiment 4 of the image processing method of a terminal according to the present invention;
FIG. 8 is a schematic structural diagram of Embodiment 1 of the image processing apparatus of a terminal according to the present invention;
FIG. 9 is a schematic structural diagram of Embodiment 2 of the image processing apparatus of a terminal according to the present invention.
Detailed Description
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments. Apparently, the described embodiments are some but not all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
To resolve the technical problem that a color image captured by an existing single-CCD camera has relatively low resolution and relatively large noise, resulting in poor image quality, the embodiments of the present invention provide an image processing method of a terminal. The terminal includes a black-and-white camera and a color camera that are arranged side by side and photograph the current scene simultaneously. The terminal may be a camera, a mobile phone, a tablet, or the like; the following description uses a camera as an example of the terminal, which is not a specific limitation. FIG. 2 is a schematic structural diagram of a terminal embodiment of the present invention. Specifically, as shown in FIG. 2, the two cameras are independent of each other, their optical axes are parallel, and they capture images synchronously, so that the camera can obtain a black-and-white image and a color image of the same scene. The image processing method provided in this embodiment fuses the black-and-white image of a scene, which has a high light-sampling rate and low noise, with the lower-resolution, noisier color image of the same scene, to reduce the noise of the color image and increase its resolution. In actual use, the two cameras of the terminal may be set to work at the same time, and the image processing method provided by the present invention is then applied to obtain the fused target color image; alternatively, a single camera may be set to work alone, and the color image or black-and-white image it captures of the current scene is stored or output directly. The following describes the image processing method of the present invention in detail by using specific embodiments.
FIG. 3 is a schematic flowchart of Embodiment 1 of the image processing method of a terminal according to the present invention. The method is performed by an image processing apparatus, which may be implemented by any software and/or hardware, for example, the image processor in FIG. 2, which is disposed in the photographing device. As shown in FIG. 3, the method includes:
Step 301: Receive a shooting instruction, and control, according to the shooting instruction, the black-and-white camera and the color camera to photograph a current scene simultaneously, to obtain a black-and-white image and a color image of the current scene.
Step 302: Obtain a luminance component and a chrominance component of the color image.
Step 303: Fuse the luminance component and the black-and-white image to obtain a luminance fusion component.
Step 304: Obtain a target color image according to the luminance fusion component and the chrominance component.
In the embodiments of the present invention, the images and components used during image processing are all in matrix form and stored as matrices; when the image processing ends, the image is displayed according to the applicable image format standard.
Specifically, in step 301, after receiving a shooting instruction, the image processing apparatus controls the black-and-white camera and the color camera according to the shooting instruction to photograph the current scene simultaneously, thereby obtaining a black-and-white image and a color image of the current scene. The shooting instruction may be triggered by the user pressing the shutter button on the camera. Because the black-and-white camera includes only photosensitive elements and no color filter or color filter array, it can receive the entire natural light signal, so each photosensitive element records the intensity of all light simultaneously and the light transmittance is relatively high. Under the same shooting conditions, a black-and-white image therefore has a higher light-sampling rate and lower noise than a color image. In addition, because each photosensitive element records only light intensity and no color information, the black-and-white image is itself a luminance image. For example, when a black-and-white image M contains 720×480 pixels, it can be represented by a two-dimensional matrix M of size 720×480, where 720 is the total number of columns of M and 480 is the total number of rows of M.
In this embodiment, the color image output by the color camera is in the red-green-blue (RGB) color space and is represented by the three colors red, green, and blue. For example, when the color image contains 720×480 pixels, it is represented by a three-dimensional matrix C of size 720×480×3, where 720 is the total number of columns of C, 480 is the total number of rows of C, and 3 is the total number of planes of C. Each plane is a 720×480 two-dimensional matrix for one color, denoted R, G, and B respectively: R holds the red-light intensity values of the color image, G the green-light intensity values, and B the blue-light intensity values. The color finally presented by each pixel of the color image is jointly determined by the red, green, and blue intensity values of that pixel.
To obtain the target color image, the black-and-white image needs to be fused with the luminance component of the color image, so the luminance component of the color image must be obtained first. Specifically, according to a color space conversion formula, the color image in the RGB color space can be converted into a color image in a luminance-chrominance color space that is represented by a luminance component and a chrominance component.
The color space conversion formulas are:
Y(i,j)=0.299×R(i,j)+0.587×G(i,j)+0.114×B(i,j);
U(i,j)=-0.1687×R(i,j)-0.3313×G(i,j)+0.5×B(i,j)+128;
V(i,j)=0.5×R(i,j)-0.4187×G(i,j)-0.0813×B(i,j)+128。
where i is the column index of the pixel in the matrix, j is the row index of the pixel in the matrix, i ranges over [0, 720], and j ranges over [0, 480].
The converted color image is likewise a three-dimensional matrix of size 720×480×3 containing three 720×480 two-dimensional matrices Y, U, and V. The matrix Y stores the luminance value of each pixel of the color image and is the luminance component of the color image; the other two matrices U and V store the chrominance values of the pixels and together form the chrominance component of the color image. The color finally presented by each pixel of the color image is now jointly determined by the luminance value and the chrominance values of that pixel.
A person skilled in the art can understand that the foregoing color space conversion formulas merely illustrate one way of obtaining the luminance component; other feasible manners are not limited in this embodiment.
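As a side illustration (not part of the claimed method), the conversion formulas above translate directly into a small vectorized routine; the 8-bit input range is an assumption.

```python
import numpy as np

def rgb_to_yuv(rgb):
    """Apply the conversion formulas above to an H x W x 3 RGB image (0-255),
    returning the luminance plane Y and the chrominance planes U and V."""
    r, g, b = (rgb[..., k].astype(np.float32) for k in range(3))
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.1687 * r - 0.3313 * g + 0.5 * b + 128.0
    v = 0.5 * r - 0.4187 * g - 0.0813 * b + 128.0
    return y, u, v
```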
In step 303, in the specific fusion of the luminance component and the black-and-white image, the luminance value of each pixel in the two-dimensional matrix Y representing the luminance component is averaged with the luminance value of the pixel at the same position in the two-dimensional matrix M representing the black-and-white image; the average is used as the fusion value, and from all the fusion values a 720×480 luminance fusion matrix L1, that is, the luminance fusion component, is obtained.
In step 304, obtaining the target color image according to the luminance fusion component and the chrominance component may specifically be: taking the two-dimensional matrices U and V representing the chrominance component, together with the luminance fusion matrix L1, as the three planes of a new three-dimensional matrix C2 of size 720×480×3; the resulting matrix C2 is the target color image. The camera finally stores or outputs the target color image.
In the image processing method provided by this embodiment of the present invention, the luminance component and the chrominance component of the color image are obtained, the black-and-white image is fused with the luminance component to obtain a luminance fusion component, and the target color image is finally obtained from the chrominance component and the luminance fusion component. Taking advantage of the high resolution and low noise of the black-and-white image, the method fuses the black-and-white image into the color image, so that the target color image obtained after fusion also has high resolution and low noise, and image quality is improved compared with the color image captured by the color camera alone.
The following describes, with reference to FIG. 4, the process of fusing the luminance component and the black-and-white image to obtain the luminance fusion component. FIG. 4 is a schematic flowchart of Embodiment 2 of the image processing method of a terminal according to the present invention. As shown in FIG. 4, the process includes:
Step 401: Divide the luminance component into at least two luminance blocks, and register the luminance component with the black-and-white image to obtain a registration result, where the registration result includes the black-and-white image block to be fused with each luminance block.
Step 402: Fuse the luminance component and the black-and-white image according to the registration result, to obtain the luminance fusion component.
Specifically, in step 401, the luminance component is divided into at least two luminance blocks, and then the luminance component is registered with the black-and-white image to obtain a registration result. The registration of the luminance component and the black-and-white image may specifically be block matching: for each luminance block of the luminance component, one black-and-white image block is determined in the black-and-white image to be fused with it. Optionally, the registration result may also determine, for each first pixel in the luminance component, the second pixel in the black-and-white image to be fused with it. Specifically, if block matching finds that luminance block E in the luminance component and black-and-white image block F in the black-and-white image are the pair with the smallest pixel difference, then for a first pixel and a second pixel that can be fused, the position of the first pixel in luminance block E is the same as the position of the second pixel in black-and-white image block F, where the position of a pixel means the row and column it occupies in the pixel array of the image it belongs to. Optionally, the registration result further includes a first fusion weight for each first pixel in the luminance component and a second fusion weight for each second pixel in the black-and-white image.
In step 402, the specific fusion process may preferably be: for any first pixel X in the luminance component, fuse the first pixel X and the second pixel Y according to the luminance value X1 and first fusion weight a of the first pixel X, and the luminance value Y1 and second fusion weight b of the second pixel Y in the black-and-white image that is to be fused with the first pixel X, to finally obtain the luminance fusion component.
For example, during fusion, the first fusion weight a and the second fusion weight b usually take values in [0, 1] with a + b = 1. The luminance value of any pixel Z in the luminance fusion component, that is, the fusion value, is obtained by fusing the first pixel X and the second pixel Y located at the same position as Z; an example fusion formula is: fusion value = a × X1 + b × Y1.
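A minimal sketch of this per-pixel weighted fusion follows; it assumes the block-level weights have already been expanded into per-pixel weight maps, which is an assumption made here for brevity.

```python
import numpy as np

def fuse_luminance(y_color, y_mono, a, b):
    """Per-pixel weighted fusion, fusion value = a * X1 + b * Y1, where
    y_color is the color image's luminance plane, y_mono the registered
    black-and-white image, and a, b are weight maps with a + b = 1."""
    return a * y_color.astype(np.float32) + b * y_mono.astype(np.float32)
```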
The following describes the block matching process in the embodiment shown in FIG. 4 in detail with reference to FIG. 5. FIG. 5 is a schematic flowchart of Embodiment 3 of the image processing method of a terminal according to the present invention, and includes:
Step 501: Divide the luminance component into at least two luminance blocks, and for each luminance block, search the black-and-white image to obtain the black-and-white image block with the smallest pixel difference from that luminance block.
Step 502: Obtain, according to the pixel difference between each luminance block and its black-and-white image block, the fusion weight of the luminance block and the fusion weight of the black-and-white image block, where the pixel difference is inversely proportional to the fusion weight of the black-and-white image block.
Step 503: Obtain, according to the fusion weight of each luminance block and the fusion weight of its black-and-white image block, the first fusion weight of each first pixel in the luminance component and the fusion weight of the second pixel in the black-and-white image to be fused with that first pixel, where the position of the first pixel in the luminance block is the same as the position of the second pixel in the black-and-white image block.
In step 501, the luminance component is divided into at least two luminance blocks; the division may be uniform or non-uniform. Consider any luminance block E obtained by the division, for example one containing m×n pixels, where m and n are positive integers. For block E, a search window of size m×n pixels is moved over the black-and-white image in a preset order, for example from left to right and from top to bottom, yielding multiple black-and-white image blocks of size m×n. Luminance block E is then compared with each m×n black-and-white image block to obtain a pixel difference, which may specifically be computed by subtracting the luminance values of the corresponding pixels of E and the black-and-white block to obtain m×n differences and summing them. Among all the pixel differences obtained, the black-and-white image block F with the smallest pixel difference from luminance block E is selected. A person skilled in the art can understand that the smaller the pixel difference, the more similar E and F are; when the pixel difference is 0, E and F are identical.
A person skilled in the art can understand that, to speed up block matching, the block matching algorithm in this embodiment may specifically be a pyramid-based block matching algorithm. FIG. 6 is a schematic diagram of the principle of the pyramid-based block matching algorithm of the present invention. As shown in FIG. 6, take a pyramid with three levels (level 0, level 1, and level 2) as an example. At level 2, the luminance component containing P×Q pixels is divided into four luminance blocks: block 1, block 2, block 3, and block 4. At level 1, the four luminance blocks of level 2 are subdivided; for example, block 1, block 2, block 3, and block 4 are each divided into four luminance blocks, with block 1 containing block 11, block 12, block 13, and block 14. At level 0, the luminance blocks of level 1 are subdivided again; for example, block 11 is divided into block 111, block 112, block 113, and block 114. During block matching, any luminance block of level 2, such as block 1, is matched first: the black-and-white image block with the smallest pixel difference from block 1 is determined in the black-and-white image, for example block 01. Next, any luminance block of level 1 that belongs to block 1, such as block 11, is matched; the search now needs to be performed only within block 01 of the black-and-white image, and the block with the smallest pixel difference, block 011, is found. Finally, any luminance block of level 0 that belongs to block 11, such as block 111, is matched; the search needs to be performed only within block 011, and the block with the smallest pixel difference, block 0111, is found. The pyramid construction above is only an illustrative division and does not limit the present invention. Obviously, the coarse-to-fine pyramid-based block matching algorithm significantly reduces the search area of the search window, speeds up matching, and improves matching accuracy.
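To make the search concrete, here is a sketch of exhaustive block matching within a restricted window; in the pyramid scheme, the match found at a coarser level would fix `top_left` and shrink `search_radius` for the sub-blocks at the next finer level. Using the sum of absolute differences as the "pixel difference" and the search-radius parameterization are assumptions for illustration, not the exact procedure of the embodiment.

```python
import numpy as np

def best_match(block, mono, top_left, search_radius):
    """Find the black-and-white block with the smallest pixel difference
    (sum of absolute differences) from `block`, searching only positions
    within `search_radius` of `top_left` (row, col) in the image `mono`."""
    m, n = block.shape
    H, W = mono.shape
    r0, c0 = top_left
    block = block.astype(np.float32)
    best_diff, best_pos = np.inf, (r0, c0)
    for dr in range(-search_radius, search_radius + 1):
        for dc in range(-search_radius, search_radius + 1):
            r, c = r0 + dr, c0 + dc
            if r < 0 or c < 0 or r + m > H or c + n > W:
                continue                                  # candidate window falls outside the image
            cand = mono[r:r + m, c:c + n].astype(np.float32)
            diff = np.abs(cand - block).sum()             # pixel difference of this candidate pair
            if diff < best_diff:
                best_diff, best_pos = diff, (r, c)
    return best_pos, best_diff
```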
In step 502, the pixel difference is inversely proportional to the fusion weight of black-and-white image block F: the smaller the pixel difference, the larger the fusion weight of F and the smaller the fusion weight of luminance block E. Therefore, from the pixel difference between any luminance block and its black-and-white image block, the fusion weight of the luminance block and the fusion weight of the black-and-white image block can be obtained. Optionally, the fusion weight of F plus the fusion weight of E equals 1.
In step 503, the first fusion weight of any first pixel in luminance block E is the fusion weight of E, and the second fusion weight of any second pixel in black-and-white image block F is the fusion weight of F. Optionally, for an edge pixel located at the boundary of luminance block E, its first fusion weight may be obtained by averaging the fusion weights of the luminance blocks to which all of its neighboring pixels belong. The fusion weights of edge pixels in black-and-white image block F are handled similarly and are not described again.
Optionally, in the embodiment shown in FIG. 4 or FIG. 5, to increase registration speed, the search region used to register against the black-and-white image may be determined first. Specifically, this may be implemented in the following feasible manners:
In one feasible implementation, a search region for registration is determined in the black-and-white image for the luminance component according to the distance between the black-and-white camera and the color camera, the focal length of the black-and-white camera, and the focal length of the color camera; the luminance component is then registered with the black-and-white image within the search region to obtain the registration result. For example, from the distance between the two cameras and their focal lengths, the offset between the positions of the same content in the black-and-white image and in the color image can be determined, and the search region for block matching is then determined from this offset, reducing the search area during block matching and improving matching speed and accuracy. Optionally, the offset may be determined by looking up an offset table that records the correspondence between the inter-camera distance, the respective focal lengths, and the offset.
In another feasible implementation, feature point matching is performed on the black-and-white image and the color image to obtain the positional correspondence of the feature points of the two images. From this correspondence, the offset between the positions of the same content in the two images is determined, and the search region for block matching is then determined from the offset, again reducing the search area during block matching and improving matching speed and accuracy.
Further, on the basis of the embodiment of FIG. 4 or FIG. 5, to improve the fusion effect, the high-frequency information of the luminance component is fused with the high-frequency information of the black-and-white image. The specific implementation is described in detail below with reference to FIG. 7. FIG. 7 is a schematic flowchart of Embodiment 4 of the image processing method of a terminal according to the present invention, and includes:
Step 701: Obtain first high-frequency information and first low-frequency information of the luminance component, and obtain second high-frequency information of the black-and-white image.
Step 702: Fuse the first high-frequency information and the second high-frequency information according to the registration result, to obtain high-frequency fusion information.
Step 703: Perform pixel-wise addition on the first low-frequency information and the high-frequency fusion information, to obtain the luminance fusion component.
In this embodiment, the luminance component is low-pass filtered so that only the low-frequency information in the luminance component passes through the low-pass filter, yielding the low-frequency information of the luminance component, that is, the first low-frequency information. The luminance value of each pixel of the luminance component before low-pass filtering is then reduced by the low-frequency value of the corresponding pixel of the first low-frequency information obtained after filtering, which gives the high-frequency information of the luminance component, that is, the first high-frequency information. The second high-frequency information is obtained in the same way and is not described again.
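A sketch of this high/low-frequency split is shown below, assuming a Gaussian low-pass filter and a placeholder `sigma`; the original text does not name a particular filter, so these are illustrative choices only.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def split_frequencies(plane, sigma=3.0):
    """Low-frequency part = low-pass filtered plane; high-frequency part =
    original plane minus its low-frequency part."""
    plane = plane.astype(np.float32)
    low = gaussian_filter(plane, sigma=sigma)
    return low, plane - low
```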
During the fusion in this embodiment, the first high-frequency information and the second high-frequency information are fused to obtain the high-frequency fusion information; the specific fusion method may be the same as that in the embodiment shown in FIG. 4 and is not described again.
Optionally, to make the information contained in the high-frequency fusion information clearer in the luminance fusion component, high-frequency enhancement may be performed on the high-frequency fusion information. Specifically, during the enhancement, the high-frequency value of each pixel in the high-frequency fusion information may be multiplied by an enhancement coefficient to amplify it. The degree of enhancement may be set according to the ISO sensitivity of the image currently being captured by the photographing device: when the sensitivity is low, the noise is small, and a stronger enhancement may be applied to improve sharpness; when the sensitivity is high, a weaker enhancement may be applied to prevent visible noise.
Finally, pixel-wise addition is performed on the first low-frequency information of the luminance component and the high-frequency fusion information, to obtain the luminance fusion component.
In this embodiment, the luminance component and the black-and-white image are low-pass filtered to obtain high-frequency and low-frequency information, and the high-frequency information of the black-and-white image and of the luminance component are fused to obtain high-frequency fusion information, so that the high-frequency part of the luminance fusion component incorporates the high-frequency information of the black-and-white image. Because the light transmittance of the black-and-white camera differs from that of the color camera, the brightness of the two images may be inconsistent: one image may be brighter overall than the other, or part of its content may be brighter and part darker. In this case, directly fusing the black-and-white image into the luminance component could introduce inconsistent brightness changes into the resulting luminance fusion component. High-frequency information mainly records the edges and details of an image and does not contain this brightness information, and the human eye is more sensitive to edges and details. Therefore, this embodiment fuses the high-frequency information of the luminance component with the high-frequency information of the black-and-white image, which preserves the low-noise, high-resolution advantages of the black-and-white image, avoids the brightness problem described above, and improves the fusion of the black-and-white image and the luminance component.
Optionally, on the basis of any of the foregoing embodiments, before the luminance component and the chrominance component of the color image are obtained, luminance correction may be performed on the color image according to the black-and-white image; the luminance component and the chrominance component of the luminance-corrected color image are then obtained, and the fusion method of any of the foregoing embodiments is applied to obtain the target color image.
The luminance correction may specifically be: comparing the average brightness of the black-and-white image and the color image to obtain a luminance correction coefficient for the color image, and multiplying the luminance component of the color image by the correction coefficient to raise its brightness; alternatively, each color component of the color image may be multiplied by a preset correction coefficient corresponding to that color component. This gives the color image and the black-and-white image consistent brightness and thereby improves the accuracy of the subsequent registration.
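One way to realize the mean-brightness comparison described above is sketched below; matching the global means and clipping to an 8-bit range are assumptions, not the only form the correction coefficient may take.

```python
import numpy as np

def correct_luminance(y_color, y_mono, eps=1e-6):
    """Scale the color image's luminance so its average brightness matches
    the black-and-white image's average brightness."""
    gain = float(y_mono.mean()) / max(float(y_color.mean()), eps)
    return np.clip(y_color.astype(np.float32) * gain, 0.0, 255.0)
```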
Optionally, on the basis of any of the foregoing embodiments, before the target color image is obtained according to the luminance fusion component and the chrominance component, noise reduction may first be performed on the chrominance component to obtain a noise-reduced chrominance component, and the target color image is then obtained according to the luminance fusion component and the noise-reduced chrominance component.
In a specific implementation, the chrominance component may be denoised with a joint bilateral filter or a guided filter. Because the chrominance component contains little edge information, the edge information in the luminance fusion component is used to set the filtering factor of the joint bilateral filter or guided filter, so that after filtering, the edge information of the chrominance component is preserved while its noise is removed, improving image quality.
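As one concrete option, the guided filter mentioned above can be sketched as follows, with the luminance fusion component as the guide and a chrominance plane as the input. The window radius and regularization `eps` are placeholders, and the planes are assumed to be normalized to [0, 1]; this is the standard guided filter, not necessarily the exact filter settings of the embodiment.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, src, radius=8, eps=1e-2):
    """Edge-preserving denoising of a chrominance plane `src`, steered by the
    edges of the luminance fusion component `guide`."""
    guide = guide.astype(np.float32)
    src = src.astype(np.float32)
    box = lambda x: uniform_filter(x, size=2 * radius + 1)   # local box average
    mean_g, mean_s = box(guide), box(src)
    var_g = box(guide * guide) - mean_g * mean_g
    cov_gs = box(guide * src) - mean_g * mean_s
    a = cov_gs / (var_g + eps)            # per-window linear coefficient
    b = mean_s - a * mean_g
    return box(a) * guide + box(b)        # averaged coefficients applied to the guide
```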
Optionally, on the basis of any of the foregoing embodiments, under low illumination, the image captured by the camera is strongly affected by the environment and by the camera's components and may contain a large amount of noise; the weaker the illumination, the larger the noise. To improve image sharpness and reduce noise, when the black-and-white image and the color image of the current scene are captured with the two cameras, multiple pairs of black-and-white and color images may be captured first. Temporal noise reduction is performed on the multiple black-and-white frames to obtain one black-and-white image with lower noise, and temporal noise reduction is performed on the multiple color frames to obtain one color image with lower noise. The lower-noise black-and-white image and color image are then processed according to the method provided by the present invention to obtain a higher-quality color image.
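A minimal sketch of this multi-frame temporal noise reduction is given below; it assumes the burst frames are already aligned, and plain averaging is only one possible form of temporal noise reduction.

```python
import numpy as np

def temporal_denoise(frames):
    """Average a burst of aligned frames of the same scene to suppress
    zero-mean temporal noise before the black-and-white/color fusion."""
    return np.stack([f.astype(np.float32) for f in frames]).mean(axis=0)
```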
Another aspect of the embodiments of the present invention provides an image processing apparatus of a terminal, where the terminal includes a black-and-white camera and a color camera that are arranged side by side. The apparatus is configured to perform the image processing method in any of the foregoing embodiments and has the same technical features and technical effects. FIG. 8 is a schematic structural diagram of Embodiment 1 of the image processing apparatus of a terminal according to the present invention. As shown in FIG. 8, the apparatus includes:
an input image obtaining module 801, configured to receive a shooting instruction, and control, according to the shooting instruction, the black-and-white camera and the color camera to photograph a current scene simultaneously, to obtain a black-and-white image and a color image of the current scene;
a component obtaining module 802, configured to obtain a luminance component and a chrominance component of the color image;
a fusion module 803, configured to fuse the luminance component and the black-and-white image to obtain a luminance fusion component; and
a target color image obtaining module 804, configured to obtain a target color image according to the luminance fusion component and the chrominance component.
In the image processing apparatus of a terminal provided by this embodiment of the present invention, the component obtaining module splits the color image into a luminance component and a chrominance component, the fusion module fuses the black-and-white image with the luminance component to obtain a luminance fusion component, and the target color image obtaining module obtains the target color image from the chrominance component and the luminance fusion component. Taking advantage of the high resolution and low noise of the black-and-white image, the apparatus fuses the black-and-white image into the color image, so that the target color image obtained after fusion has high resolution and low noise, and image quality is improved compared with the color image captured by the color camera alone.
The following describes the image processing apparatus of a terminal of the present invention in detail by using specific embodiments.
Optionally, the fusion module is specifically configured to: divide the luminance component into at least two luminance blocks, and register the luminance component with the black-and-white image to obtain a registration result, where the registration result includes the black-and-white image block to be fused with each luminance block; and fuse the luminance component and the black-and-white image according to the registration result, to obtain the luminance fusion component.
Further, the fusion module is specifically configured to: determine, according to the distance between the black-and-white camera and the color camera, the focal length of the black-and-white camera, and the focal length of the color camera, a search region in the black-and-white image within which the luminance component is registered; and register the luminance component with the black-and-white image within the search region, to obtain the registration result.
FIG. 9 is a schematic structural diagram of Embodiment 2 of the image processing apparatus of a terminal according to the present invention. As shown in FIG. 9, the fusion module 803 includes:
a high-frequency information obtaining unit 901, configured to obtain first high-frequency information and first low-frequency information of the luminance component, and obtain second high-frequency information of the black-and-white image;
a high-frequency fusion information obtaining unit 902, configured to fuse the first high-frequency information and the second high-frequency information according to the registration result, to obtain high-frequency fusion information; and
a luminance fusion component obtaining unit 903, configured to perform pixel-wise addition on the first low-frequency information and the high-frequency fusion information, to obtain the luminance fusion component.
Further, on the basis of the embodiment shown in FIG. 9, the fusion module 803 further includes:
an enhancement unit, configured to perform enhancement processing on the high-frequency fusion information, to obtain enhanced high-frequency fusion information; and
the luminance fusion component obtaining unit 903 is specifically configured to perform pixel-wise addition on the first low-frequency information and the enhanced high-frequency fusion information, to obtain the luminance fusion component.
Optionally, on the basis of any of the foregoing embodiments, the image processing apparatus of the terminal further includes:
a luminance correction module, configured to perform luminance correction on the color image according to the black-and-white image, to obtain a luminance-corrected color image; and
the component obtaining module 802 is specifically configured to obtain a luminance component and a chrominance component of the luminance-corrected color image.
Optionally, on the basis of any of the foregoing embodiments, the image processing apparatus of the terminal further includes:
a noise reduction module, configured to perform noise reduction on the chrominance component, to obtain a noise-reduced chrominance component; and
the target color image obtaining module 804 is specifically configured to obtain the target color image according to the luminance fusion component and the noise-reduced chrominance component.
Another aspect of the embodiments of the present invention provides a terminal, including a black-and-white camera, a color camera, and an image processor, where the black-and-white camera and the color camera are arranged side by side. The terminal is configured to perform the image processing method in any of the foregoing embodiments and has the same technical features and technical effects.
The image processor of the terminal is configured to: receive a shooting instruction, and control, according to the shooting instruction, the black-and-white camera and the color camera to photograph a current scene simultaneously, to obtain a black-and-white image and a color image of the current scene; obtain a luminance component and a chrominance component of the color image; fuse the luminance component and the black-and-white image to obtain a luminance fusion component; and obtain a target color image according to the luminance fusion component and the chrominance component.
Optionally, the image processor is specifically configured to: divide the luminance component into at least two luminance blocks, and register the luminance component with the black-and-white image to obtain a registration result, where the registration result includes the black-and-white image block to be fused with each luminance block; and fuse the luminance component and the black-and-white image according to the registration result, to obtain the luminance fusion component.
Further, the image processor is specifically configured to: determine, according to the distance between the black-and-white camera and the color camera, the focal length of the black-and-white camera, and the focal length of the color camera, a search region in the black-and-white image within which the luminance component is registered; and register the luminance component with the black-and-white image within the search region, to obtain the registration result.
Optionally, the image processor is specifically configured to: obtain first high-frequency information and first low-frequency information of the luminance component, and obtain second high-frequency information of the black-and-white image; fuse the first high-frequency information and the second high-frequency information according to the registration result, to obtain high-frequency fusion information; and perform pixel-wise addition on the first low-frequency information and the high-frequency fusion information, to obtain the luminance fusion component.
Optionally, the image processor is further configured to: perform enhancement processing on the high-frequency fusion information, to obtain enhanced high-frequency fusion information; and perform pixel-wise addition on the first low-frequency information and the enhanced high-frequency fusion information, to obtain the luminance fusion component.
Optionally, the image processor is further configured to: perform luminance correction on the color image according to the black-and-white image, to obtain a luminance-corrected color image; and obtain a luminance component and a chrominance component of the luminance-corrected color image.
Optionally, the image processor is further configured to: perform noise reduction on the chrominance component, to obtain a noise-reduced chrominance component; and obtain the target color image according to the luminance fusion component and the noise-reduced chrominance component.
Another aspect of the embodiments of the present invention provides a storage medium. The storage medium is a computer-readable storage medium storing one or more programs, the one or more programs include instructions, and when the instructions are executed by a portable electronic device that includes a camera and multiple application programs, the portable electronic device is enabled to perform the image processing method in any of the foregoing method embodiments, where the camera includes a black-and-white camera and a color camera.
A person of ordinary skill in the art may be aware that the units and algorithm steps of the examples described with reference to the embodiments disclosed herein can be implemented by electronic hardware or a combination of computer software and electronic hardware. Whether the functions are performed by hardware or software depends on the particular application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementation should not be considered to be beyond the scope of the present invention.
It can be clearly understood by a person skilled in the art that, for convenience and brevity of description, for the specific working processes of the systems, apparatuses, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described again here.
In the several embodiments provided in this application, it should be understood that the disclosed system, apparatus, and method may be implemented in other manners. For example, the described apparatus embodiments are merely illustrative; the unit division is merely logical function division, and other division manners may be used in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the displayed or discussed mutual couplings, direct couplings, or communication connections may be implemented through some interfaces, and the indirect couplings or communication connections between the apparatuses or units may be electrical, mechanical, or in other forms.
The terms "first", "second", "third", "fourth", and so on (if any) in the specification, claims, and accompanying drawings of the present invention are used to distinguish between similar objects and are not necessarily used to describe a specific order or sequence. It should be understood that data used in this way are interchangeable in appropriate circumstances, so that the embodiments of the present invention described here can be implemented, for example, in orders other than those illustrated or described here. Moreover, the terms "include" and "have" and any variants thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to those steps or units expressly listed, but may include other steps or units not expressly listed or inherent to such a process, method, product, or device.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the present invention essentially, or the part contributing to the prior art, or part of the technical solutions may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for instructing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present invention. The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Finally, it should be noted that the foregoing embodiments are merely intended to describe the technical solutions of the present invention rather than to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that modifications may still be made to the technical solutions described in the foregoing embodiments, or equivalent replacements may be made to some or all of their technical features, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (22)

  1. An image processing method of a terminal, wherein the terminal comprises a black-and-white camera and a color camera that are arranged side by side, and the method comprises:
    receiving a shooting instruction, and controlling, according to the shooting instruction, the black-and-white camera and the color camera to photograph a current scene simultaneously, to obtain a black-and-white image and a color image of the current scene;
    obtaining a luminance component and a chrominance component of the color image;
    fusing the luminance component and the black-and-white image to obtain a luminance fusion component; and
    obtaining a target color image according to the luminance fusion component and the chrominance component.
  2. The method according to claim 1, wherein fusing the luminance component and the black-and-white image to obtain the luminance fusion component comprises:
    dividing the luminance component into at least two luminance blocks, and registering the luminance component with the black-and-white image to obtain a registration result, wherein the registration result comprises the black-and-white image block to be fused with each luminance block; and
    fusing the luminance component and the black-and-white image according to the registration result, to obtain the luminance fusion component.
  3. The method according to claim 2, wherein registering the luminance component with the black-and-white image to obtain the registration result comprises:
    determining, according to a distance between the black-and-white camera and the color camera, a focal length of the black-and-white camera, and a focal length of the color camera, a search region in the black-and-white image within which the luminance component is registered; and
    registering the luminance component with the black-and-white image within the search region, to obtain the registration result.
  4. The method according to claim 1, wherein fusing the luminance component and the black-and-white image to obtain the luminance fusion component comprises:
    obtaining first high-frequency information and first low-frequency information of the luminance component, and obtaining second high-frequency information of the black-and-white image;
    fusing the first high-frequency information and the second high-frequency information to obtain high-frequency fusion information; and
    performing pixel-wise addition on the first low-frequency information and the high-frequency fusion information, to obtain the luminance fusion component.
  5. The method according to claim 4, wherein before performing pixel-wise addition on the first low-frequency information and the high-frequency fusion information to obtain the luminance fusion component, the method further comprises:
    performing enhancement processing on the high-frequency fusion information, to obtain enhanced high-frequency fusion information; and
    performing pixel-wise addition on the first low-frequency information and the high-frequency fusion information to obtain the luminance fusion component comprises:
    performing pixel-wise addition on the first low-frequency information and the enhanced high-frequency fusion information, to obtain the luminance fusion component.
  6. The method according to any one of claims 1 to 5, wherein before obtaining the luminance component and the chrominance component of the color image, the method further comprises:
    performing luminance correction on the color image according to the black-and-white image, to obtain a luminance-corrected color image; and
    obtaining the luminance component and the chrominance component of the color image comprises:
    obtaining a luminance component and a chrominance component of the luminance-corrected color image.
  7. The method according to any one of claims 1 to 6, wherein before obtaining the target color image according to the luminance fusion component and the chrominance component, the method further comprises:
    performing noise reduction on the chrominance component, to obtain a noise-reduced chrominance component; and
    obtaining the target color image according to the luminance fusion component and the chrominance component comprises:
    obtaining the target color image according to the luminance fusion component and the noise-reduced chrominance component.
  8. An image processing apparatus of a terminal, wherein the terminal comprises a black-and-white camera and a color camera that are arranged side by side, and the apparatus comprises:
    an input image obtaining module, configured to receive a shooting instruction, and control, according to the shooting instruction, the black-and-white camera and the color camera to photograph a current scene simultaneously, to obtain a black-and-white image and a color image of the current scene;
    a component obtaining module, configured to obtain a luminance component and a chrominance component of the color image;
    a fusion module, configured to fuse the luminance component and the black-and-white image to obtain a luminance fusion component; and
    a target color image obtaining module, configured to obtain a target color image according to the luminance fusion component and the chrominance component.
  9. The apparatus according to claim 8, wherein the fusion module is specifically configured to:
    divide the luminance component into at least two luminance blocks, and register the luminance component with the black-and-white image to obtain a registration result, wherein the registration result comprises the black-and-white image block to be fused with each luminance block; and
    fuse the luminance component and the black-and-white image according to the registration result, to obtain the luminance fusion component.
  10. The apparatus according to claim 9, wherein the fusion module is specifically configured to:
    determine, according to a distance between the black-and-white camera and the color camera, a focal length of the black-and-white camera, and a focal length of the color camera, a search region in the black-and-white image within which the luminance component is registered; and register the luminance component with the black-and-white image within the search region, to obtain the registration result.
  11. The apparatus according to claim 8, wherein the fusion module comprises:
    a high-frequency information obtaining unit, configured to obtain first high-frequency information and first low-frequency information of the luminance component, and obtain second high-frequency information of the black-and-white image;
    a high-frequency fusion information obtaining unit, configured to fuse the first high-frequency information and the second high-frequency information according to the registration result, to obtain high-frequency fusion information; and
    a luminance fusion component obtaining unit, configured to perform pixel-wise addition on the first low-frequency information and the high-frequency fusion information, to obtain the luminance fusion component.
  12. The apparatus according to claim 11, wherein the fusion module further comprises:
    an enhancement unit, configured to perform enhancement processing on the high-frequency fusion information, to obtain enhanced high-frequency fusion information; and
    the luminance fusion component obtaining unit is specifically configured to perform pixel-wise addition on the first low-frequency information and the enhanced high-frequency fusion information, to obtain the luminance fusion component.
  13. The apparatus according to any one of claims 8 to 12, further comprising:
    a luminance correction module, configured to perform luminance correction on the color image according to the black-and-white image, to obtain a luminance-corrected color image; and
    the component obtaining module is specifically configured to obtain a luminance component and a chrominance component of the luminance-corrected color image.
  14. The apparatus according to any one of claims 8 to 13, further comprising:
    a noise reduction module, configured to perform noise reduction on the chrominance component, to obtain a noise-reduced chrominance component; and
    the target color image obtaining module is specifically configured to obtain the target color image according to the luminance fusion component and the noise-reduced chrominance component.
  15. A terminal, wherein the terminal comprises a black-and-white camera, a color camera, and an image processor, and the black-and-white camera and the color camera are arranged side by side;
    the image processor is configured to receive a shooting instruction, and control, according to the shooting instruction, the black-and-white camera and the color camera to photograph a current scene simultaneously, to obtain a black-and-white image and a color image of the current scene; and
    the image processor is further configured to obtain a luminance component and a chrominance component of the color image; fuse the luminance component and the black-and-white image to obtain a luminance fusion component; and obtain a target color image according to the luminance fusion component and the chrominance component.
  16. The terminal according to claim 15, wherein the image processor is specifically configured to:
    divide the luminance component into at least two luminance blocks, and register the luminance component with the black-and-white image to obtain a registration result, wherein the registration result comprises the black-and-white image block to be fused with each luminance block; and
    fuse the luminance component and the black-and-white image according to the registration result, to obtain the luminance fusion component.
  17. The terminal according to claim 16, wherein the image processor is specifically configured to:
    determine, according to a distance between the black-and-white camera and the color camera, a focal length of the black-and-white camera, and a focal length of the color camera, a search region in the black-and-white image within which the luminance component is registered; and
    register the luminance component with the black-and-white image within the search region, to obtain the registration result.
  18. The terminal according to claim 15, wherein the image processor is specifically configured to:
    obtain first high-frequency information and first low-frequency information of the luminance component, and obtain second high-frequency information of the black-and-white image;
    fuse the first high-frequency information and the second high-frequency information according to the registration result, to obtain high-frequency fusion information; and
    perform pixel-wise addition on the first low-frequency information and the high-frequency fusion information, to obtain the luminance fusion component.
  19. The terminal according to claim 18, wherein the image processor is further configured to:
    perform enhancement processing on the high-frequency fusion information, to obtain enhanced high-frequency fusion information; and
    perform pixel-wise addition on the first low-frequency information and the enhanced high-frequency fusion information, to obtain the luminance fusion component.
  20. The terminal according to any one of claims 15 to 19, wherein the image processor is further configured to:
    perform luminance correction on the color image according to the black-and-white image, to obtain a luminance-corrected color image; and obtain a luminance component and a chrominance component of the luminance-corrected color image.
  21. The terminal according to any one of claims 15 to 20, wherein the image processor is further configured to:
    perform noise reduction on the chrominance component, to obtain a noise-reduced chrominance component; and
    obtain the target color image according to the luminance fusion component and the noise-reduced chrominance component.
  22. A storage medium, wherein the storage medium is a computer-readable storage medium storing one or more programs, the one or more programs comprise instructions, and when the instructions are executed by a portable electronic device comprising a camera and multiple application programs, the portable electronic device is enabled to perform the method according to any one of claims 1 to 7, wherein the camera comprises a black-and-white camera and a color camera.
PCT/CN2016/076017 2016-03-09 2016-03-09 终端的图像处理方法、装置和终端 WO2017152402A1 (zh)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201680025593.4A CN107534735B (zh) 2016-03-09 2016-03-09 终端的图像处理方法、装置和终端
PCT/CN2016/076017 WO2017152402A1 (zh) 2016-03-09 2016-03-09 终端的图像处理方法、装置和终端
EP16893053.5A EP3416369B1 (en) 2016-03-09 2016-03-09 Image processing method and apparatus for terminal, and terminal
US16/083,428 US10645268B2 (en) 2016-03-09 2016-03-09 Image processing method and apparatus of terminal, and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/076017 WO2017152402A1 (zh) 2016-03-09 2016-03-09 终端的图像处理方法、装置和终端

Publications (1)

Publication Number Publication Date
WO2017152402A1 true WO2017152402A1 (zh) 2017-09-14

Family

ID=59789996

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/076017 WO2017152402A1 (zh) 2016-03-09 2016-03-09 终端的图像处理方法、装置和终端

Country Status (4)

Country Link
US (1) US10645268B2 (zh)
EP (1) EP3416369B1 (zh)
CN (1) CN107534735B (zh)
WO (1) WO2017152402A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111866476A (zh) * 2020-08-31 2020-10-30 维沃移动通信有限公司 图像拍摄方法、装置及电子设备
CN113271393A (zh) * 2021-07-19 2021-08-17 浙江华睿科技股份有限公司 一种多波段的平场校正方法、装置及计算机可读介质
WO2023011197A1 (zh) * 2021-08-05 2023-02-09 中兴通讯股份有限公司 图像处理方法、电子设备、以及计算机可读存储介质

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106993136B (zh) * 2017-04-12 2021-06-15 深圳市知赢科技有限公司 移动终端及其基于多摄像头的图像降噪方法和装置
CN108389165B (zh) * 2018-02-02 2022-06-21 成都西纬科技有限公司 一种图像去噪方法、装置、终端系统和存储器
KR102571243B1 (ko) * 2018-02-20 2023-08-25 삼성디스플레이 주식회사 색채측정 장치 및 방법
CN108717691B (zh) * 2018-06-06 2022-04-15 成都西纬科技有限公司 一种图像融合方法、装置、电子设备及介质
CN110876016B (zh) * 2018-08-31 2021-03-16 珠海格力电器股份有限公司 图像处理方法、装置和存储介质
JP7336773B2 (ja) * 2018-10-29 2023-09-01 パナソニックIpマネジメント株式会社 情報提示方法、情報提示装置、及び、情報提示システム
JP7374600B2 (ja) * 2019-03-22 2023-11-07 ソニー・オリンパスメディカルソリューションズ株式会社 医療用画像処理装置及び医療用観察システム
CN112822465B (zh) * 2019-11-15 2023-06-16 北京小米移动软件有限公司 一种终端设备及图像处理方法
CN112907704B (zh) * 2021-02-04 2024-04-12 浙江大华技术股份有限公司 一种图像融合方法、计算机设备以及装置
CN112801908B (zh) * 2021-02-05 2022-04-22 深圳技术大学 图像去噪方法、装置、计算机设备和存储介质
CN113129400B (zh) * 2021-03-17 2023-02-24 维沃移动通信有限公司 图像处理方法、装置、电子设备和可读存储介质
CN113240609A (zh) * 2021-05-26 2021-08-10 Oppo广东移动通信有限公司 图像去噪方法、装置及存储介质
CN113409205B (zh) * 2021-06-10 2023-11-24 Oppo广东移动通信有限公司 图像处理方法、图像处理装置、存储介质与电子设备
CN116156334A (zh) * 2023-02-28 2023-05-23 维沃移动通信有限公司 拍摄方法、装置、电子设备和可读存储介质

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101493893A (zh) * 2008-12-11 2009-07-29 中山大学 一种图像数据融合方法
CN103595982A (zh) * 2013-11-07 2014-02-19 天津大学 基于灰度和彩色两颗传感器的彩色图像采集装置
CN104363375A (zh) * 2014-11-28 2015-02-18 广东欧珀移动通信有限公司 一种照片降噪的方法、装置及终端
CN105049718A (zh) * 2015-07-06 2015-11-11 深圳市金立通信设备有限公司 一种图像处理方法及终端

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6788338B1 (en) 2000-11-20 2004-09-07 Petko Dimitrov Dinev High resolution video camera apparatus having two image sensors and signal processing
CN101527033A (zh) 2008-03-04 2009-09-09 河海大学 超分辨率重建和自动配准的工业ccd彩色成像系统
JP5816015B2 (ja) * 2011-07-15 2015-11-17 株式会社東芝 固体撮像装置及びカメラモジュール
US20140320602A1 (en) * 2011-12-02 2014-10-30 Nokia Corporation Method, Apparatus and Computer Program Product for Capturing Images
JP2013183353A (ja) * 2012-03-02 2013-09-12 Toshiba Corp 画像処理装置
JP2015197745A (ja) * 2014-03-31 2015-11-09 キヤノン株式会社 画像処理装置、撮像装置、画像処理方法及びプログラム
CN103986875A (zh) 2014-05-29 2014-08-13 宇龙计算机通信科技(深圳)有限公司 一种图像获取装置、方法、终端及视频获取方法
US9344639B2 (en) * 2014-08-12 2016-05-17 Google Technology Holdings LLC High dynamic range array camera
US10558867B2 (en) * 2015-09-28 2020-02-11 Kyocera Corporation Image processing apparatus, stereo camera apparatus, vehicle, and image processing method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101493893A (zh) * 2008-12-11 2009-07-29 中山大学 一种图像数据融合方法
CN103595982A (zh) * 2013-11-07 2014-02-19 天津大学 基于灰度和彩色两颗传感器的彩色图像采集装置
CN104363375A (zh) * 2014-11-28 2015-02-18 广东欧珀移动通信有限公司 一种照片降噪的方法、装置及终端
CN105049718A (zh) * 2015-07-06 2015-11-11 深圳市金立通信设备有限公司 一种图像处理方法及终端

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3416369A4 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111866476A (zh) * 2020-08-31 2020-10-30 维沃移动通信有限公司 图像拍摄方法、装置及电子设备
CN113271393A (zh) * 2021-07-19 2021-08-17 浙江华睿科技股份有限公司 一种多波段的平场校正方法、装置及计算机可读介质
CN113271393B (zh) * 2021-07-19 2021-10-15 浙江华睿科技股份有限公司 一种多波段的平场校正方法、装置及计算机可读介质
WO2023011197A1 (zh) * 2021-08-05 2023-02-09 中兴通讯股份有限公司 图像处理方法、电子设备、以及计算机可读存储介质

Also Published As

Publication number Publication date
CN107534735A (zh) 2018-01-02
US10645268B2 (en) 2020-05-05
US20190098188A1 (en) 2019-03-28
EP3416369A1 (en) 2018-12-19
EP3416369A4 (en) 2019-02-27
CN107534735B (zh) 2019-05-03
EP3416369B1 (en) 2020-10-28

Similar Documents

Publication Publication Date Title
WO2017152402A1 (zh) 终端的图像处理方法、装置和终端
US10547772B2 (en) Systems and methods for reducing motion blur in images or video in ultra low light with array cameras
EP2642757B1 (en) Imaging systems with clear filter pixels
TWI407800B (zh) 馬賽克影像的改良處理
EP1594321A2 (en) Extended dynamic range in color imagers
JP6059359B2 (ja) 画像処理方法および装置ならびにイメージングデバイス
JP2009153013A (ja) 撮像装置、色ノイズ低減方法および色ノイズ低減プログラム
JP2003284084A (ja) 画像処理装置および方法、並びに画像処理装置の製造方法
JPH0823543A (ja) 撮像装置
WO2023016146A1 (zh) 图像传感器、图像采集装置、图像处理方法及图像处理器
WO2020119505A1 (zh) 一种图像处理方法和系统
JP2003108999A (ja) 電子カラー画像の色を補正する画像処理方法
CN112991245A (zh) 双摄虚化处理方法、装置、电子设备和可读存储介质
US9654756B1 (en) Method and apparatus for interpolating pixel colors from color and panchromatic channels to color channels
TW200822764A (en) Imaging device and signal processing method
EP2717155A1 (en) Color distortion correction method and device for imaging systems and image output systems
CN107786857B (zh) 一种图像还原方法及装置
Lukac Single-sensor digital color imaging fundamentals
JP2020120204A (ja) 画像処理装置、画像処理方法およびプログラム
US8068146B2 (en) Opponent color detail enhancement for saturated colors
KR20080039522A (ko) 화이트 밸런스 통계를 위한 개선된 크로미넌스 필터
CN105323568A (zh) 一种数码相机中色彩滤镜阵列的彩色重建方法
JP7203441B2 (ja) 画像生成装置及び撮像装置
CN115719311A (zh) 处理图像的方法和相关设备
TWI617198B (zh) 具有透明濾波器像素之成像系統

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2016893053

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2016893053

Country of ref document: EP

Effective date: 20180911

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16893053

Country of ref document: EP

Kind code of ref document: A1