WO2023151210A1 - Image processing method, electronic device and computer-readable storage medium - Google Patents

Image processing method, electronic device and computer-readable storage medium

Info

Publication number
WO2023151210A1
WO2023151210A1 (PCT/CN2022/099249; CN2022099249W)
Authority
WO
WIPO (PCT)
Prior art keywords
image
pixel
processed
value
information
Prior art date
Application number
PCT/CN2022/099249
Other languages
English (en)
French (fr)
Inventor
刘晴晴 (LIU Qingqing)
Original Assignee
上海闻泰信息技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海闻泰信息技术有限公司
Publication of WO2023151210A1

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 — Image enhancement or restoration
    • G06T5/90 — Dynamic range modification of images or parts thereof
    • G06T5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T5/77 — Retouching; Inpainting; Scratch removal
    • G06T2207/00 — Indexing scheme for image analysis or image enhancement
    • G06T2207/10 — Image acquisition modality
    • G06T2207/10004 — Still image; Photographic image
    • G06T2207/20 — Special algorithmic details
    • G06T2207/20212 — Image combination
    • G06T2207/20221 — Image fusion; Image merging

Definitions

  • the present disclosure relates to an image processing method, an electronic device, and a computer-readable storage medium.
  • an image processing method an electronic device, and a computer-readable storage medium are provided.
  • An image processing method, comprising: acquiring an image to be processed, and a dark frame image and a grayscale image corresponding to the image to be processed, wherein the image to be processed is captured based on a first exposure value, the dark frame image is captured based on a second exposure value, and the first exposure value is greater than the second exposure value; determining the purple-fringe area in the image to be processed; extracting first image information of the purple-fringe area from the dark frame image, and extracting second image information of the purple-fringe area from the grayscale image; fusing the first image information with the second image information to obtain fused image information; and processing the purple-fringe area in the image to be processed according to the fused image information to obtain a target image in which the purple-fringe area is eliminated.
  • the acquiring of the image to be processed, and of the dark frame image and grayscale image corresponding to the image to be processed, includes: controlling the camera device to shoot at the first exposure value and at the second exposure value, respectively, to obtain the image to be processed and its corresponding dark frame image; and performing grayscale processing on the image to be processed to obtain the grayscale image.
  • performing grayscale processing on the image to be processed to obtain the grayscale image includes: performing binarization processing on the image to be processed to obtain a binarized image; and the extracting of the second image information of the purple-fringe area from the grayscale image includes: extracting the second image information of the purple-fringe area from the binarized image.
  • the three color-channel values of each pixel in the image to be processed are processed separately to obtain the gray value corresponding to each pixel; the method includes: processing the three color-channel values of a first pixel to obtain three brightness values corresponding to the three color channels, wherein the first pixel is any pixel in the image to be processed; and then determining the gray value of the first pixel in one of four ways: determining any one of the three brightness values as a first gray value corresponding to the first pixel; or comparing the three brightness values and determining the maximum brightness value as a second gray value corresponding to the first pixel; or averaging the three brightness values and determining the resulting mean as a third gray value corresponding to the first pixel; or computing a weighted average of the three brightness values, using three weights respectively corresponding to them, and determining the resulting target brightness value as a fourth gray value corresponding to the first pixel.
  • the three channel values of each pixel in the image to be processed are processed separately to obtain the gray value corresponding to each pixel, including: processing the three channel values of the first pixel to obtain three brightness values corresponding to the first pixel, wherein the first pixel is any pixel in the image to be processed; and taking any one of the three brightness values, or the maximum brightness value, or the average brightness value, or the weighted-average brightness value as the gray value corresponding to the first pixel.
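The four grayscale strategies listed in the claims above can be sketched as a single function. The function name and the default weights (the common ITU-R BT.601 luma coefficients) are illustrative assumptions; the patent only requires "three weights respectively corresponding to the three brightness values" without specifying values.

```python
import numpy as np

def to_grayscale(img, method="weighted", weights=(0.299, 0.587, 0.114)):
    """Convert an H x W x 3 RGB image to a grayscale map.

    Strategies, mirroring the four claimed variants:
    - "single":   any one channel (here: red) as the gray value
    - "max":      the maximum of the three channel values
    - "mean":     the arithmetic mean of the three channel values
    - "weighted": a weighted average of the three channel values
    """
    img = np.asarray(img, dtype=np.float64)
    if method == "single":
        gray = img[..., 0]
    elif method == "max":
        gray = img.max(axis=-1)
    elif method == "mean":
        gray = img.mean(axis=-1)
    elif method == "weighted":
        gray = img @ np.asarray(weights, dtype=np.float64)
    else:
        raise ValueError(f"unknown method: {method}")
    return gray
```

Any of the four methods yields a single gray value per pixel, from which the grayscale image corresponding to the image to be processed is assembled.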
  • the determining the purple fringe area in the image to be processed includes: extracting the first chromaticity information corresponding to each pixel in the image to be processed; using The first chromaticity information corresponding to each pixel determines a purple fringe area in the image to be processed.
  • the first chromaticity information includes first target chromaticity information and second target chromaticity information
  • the extracting of the first chromaticity information corresponding to each pixel in the image to be processed includes: obtaining a first red intensity value, a first green intensity value, and a first blue intensity value of a first pixel in the image to be processed, wherein the first pixel is any pixel in the image to be processed; determining first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first green intensity value of the first pixel; and determining second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first green intensity value of the first pixel.
  • determining the first target chromaticity information includes: determining a first difference or first ratio between the first red intensity value and the first green intensity value of the first pixel as the first target chromaticity information of the first pixel in the image to be processed; and determining a second difference or second ratio between the first blue intensity value and the first green intensity value of the first pixel as the second target chromaticity information of the first pixel in the image to be processed.
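These green-referenced chromaticity definitions translate directly into code. The function name is an illustrative choice, and the epsilon guard in the ratio branch is an added safeguard against division by zero that the patent does not mention.

```python
import numpy as np

def chromaticity_info(rgb, use_ratio=False):
    """First/second target chromaticity per the claims above:
    red-vs-green and blue-vs-green, as differences or ratios.

    rgb: array whose last axis is (R, G, B) intensity values.
    Returns (first_target, second_target) chromaticity arrays.
    """
    rgb = np.asarray(rgb, dtype=np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    if use_ratio:
        eps = 1e-9  # avoid dividing by zero on pure-dark pixels (assumption)
        return r / (g + eps), b / (g + eps)
    return r - g, b - g
```

The luminance-referenced variant described below is identical in shape, with the first brightness value taking the place of the green intensity value.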
  • the first chromaticity information includes first target chromaticity information and second target chromaticity information
  • the extracting of the first chromaticity information corresponding to each pixel in the image to be processed includes: obtaining a first red intensity value, a first blue intensity value, and a first brightness value of the first pixel in the image to be processed, wherein the first pixel is any pixel in the image to be processed; determining first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first brightness value of the first pixel; and determining second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first brightness value of the first pixel.
  • determining the first target chromaticity information includes: determining a third difference or third ratio between the first red intensity value and the first brightness value of the first pixel as the first target chromaticity information of the first pixel in the image to be processed; and determining a fourth difference or fourth ratio between the first blue intensity value and the first brightness value of the first pixel as the second target chromaticity information of the first pixel in the image to be processed.
  • the determining of the purple-fringe area in the image to be processed using the first chromaticity information includes: determining, according to the first target chromaticity information and the second target chromaticity information corresponding to each pixel in the image to be processed, the red channel value and the blue channel value corresponding to each pixel; and determining the pixels in the image to be processed whose red channel value falls within a first preset range and whose blue channel value falls within a second preset range as belonging to the purple-fringe area.
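The range test on the two channel values amounts to a boolean mask over the image. The concrete range endpoints below are hypothetical placeholders; the patent specifies only a "first preset range" and a "second preset range".

```python
import numpy as np

def purple_fringe_mask(red_chan, blue_chan,
                       red_range=(0.1, 0.6), blue_range=(0.1, 0.6)):
    """Mark pixels whose red channel value lies in red_range AND whose
    blue channel value lies in blue_range as purple-fringe pixels.

    red_chan, blue_chan: per-pixel channel-value arrays of equal shape.
    Returns a boolean mask of the same shape.
    """
    r_lo, r_hi = red_range
    b_lo, b_hi = blue_range
    return ((red_chan >= r_lo) & (red_chan <= r_hi) &
            (blue_chan >= b_lo) & (blue_chan <= b_hi))
```

Because both conditions must hold, a pixel that is strongly red but not correspondingly blue (or vice versa) is excluded, which matches the intuition that purple fringing shows elevated red and blue relative to green.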
  • determining the red channel value corresponding to each pixel according to its first target chromaticity information and second target chromaticity information includes: calculating a first product of the second target chromaticity information and a first preset threshold; if the first product is smaller than the first target chromaticity information, determining the first product as the red channel value corresponding to the pixel; and if the first product is greater than the first target chromaticity information, determining the first target chromaticity information as the red channel value corresponding to the pixel.
  • determining the blue channel value corresponding to each pixel according to its first target chromaticity information and second target chromaticity information includes: calculating a second product of the first target chromaticity information and a second preset threshold; if the second product is smaller than the second target chromaticity information, determining the second product as the blue channel value corresponding to the pixel; and if the second product is greater than the second target chromaticity information, determining the second target chromaticity information as the blue channel value corresponding to the pixel.
  • the extracting of the first image information of the purple-fringe area from the dark frame image includes: aligning and registering the image to be processed with the dark frame image; determining a first image area in the dark frame image corresponding to the purple-fringe area; and extracting the first image information corresponding to the first image area.
  • the extracting of the second image information of the purple-fringe area from the grayscale image includes: determining, according to the coordinate position of each pixel of the purple-fringe area in the image to be processed, the target pixel in the grayscale image corresponding to that coordinate position; determining a second image area from the target pixels; and extracting the second image information corresponding to the second image area.
  • the extracting of the first image information of the purple-fringe area from the dark frame image, the extracting of the second image information of the purple-fringe area from the grayscale image, and the fusing of the first image information with the second image information to obtain the fused image information include: extracting second chromaticity information of the purple-fringe area from the dark frame image, extracting brightness information of the purple-fringe area from the grayscale image, and fusing the second chromaticity information with the brightness information to obtain fused pixel information corresponding to the purple-fringe area.
  • the processing of the purple-fringe area in the image to be processed to obtain a target image in which the purple-fringe area is eliminated includes: determining the initial pixel information corresponding to the purple-fringe area in the image to be processed; and replacing that initial pixel information with the fused pixel information to obtain the target image in which the purple-fringe area is eliminated.
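The replacement step is a masked write: inside the purple-fringe mask the fused pixels overwrite the initial pixels, and everything else is left untouched. A minimal sketch, with illustrative names:

```python
import numpy as np

def eliminate_purple_fringe(image, mask, fused_pixels):
    """Produce the target image by replacing the initial pixel
    information in the purple-fringe area with the fused pixel info.

    image:        H x W x 3 image to be processed
    mask:         H x W boolean purple-fringe mask
    fused_pixels: H x W x 3 fused pixel information
    """
    out = image.copy()          # keep the original image intact
    out[mask] = fused_pixels[mask]
    return out
```

Copying before writing preserves the image to be processed, so the method can be re-run with different thresholds without re-capturing either frame.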
  • An electronic device, which may include:
  • an acquisition module configured to acquire an image to be processed, and a dark frame image and a grayscale image corresponding to the image to be processed, wherein the image to be processed is an image captured based on a first exposure value, the dark frame image is an image captured based on a second exposure value, and the first exposure value is greater than the second exposure value;
  • a purple-fringe determination module configured to determine the purple-fringe area in the image to be processed;
  • an information extraction module configured to extract first image information of the purple-fringe area from the dark frame image, and to extract second image information of the purple-fringe area from the grayscale image;
  • a processing module configured to fuse the first image information with the second image information to obtain fused image information, and to process the purple-fringe area in the image to be processed according to the fused image information to obtain a target image in which the purple-fringe area is eliminated.
  • the acquisition module includes: a photographing unit configured to capture, at the first exposure value and the second exposure value respectively, the image to be processed and the dark frame image corresponding to the image to be processed; and a grayscale processing unit configured to perform grayscale processing on the image to be processed to obtain the grayscale image.
  • the grayscale processing unit is further configured to perform binarization processing on the image to be processed to obtain a binarized image; and the information extraction module is configured to extract the second image information of the purple-fringe area from the binarized image.
  • the grayscale processing unit is further configured to process the three color-channel values of each pixel in the image to be processed separately to obtain the gray value corresponding to each pixel, and to obtain, from the gray value corresponding to each pixel, the grayscale image corresponding to the image to be processed.
  • the grayscale processing unit is further configured to process the three color-channel values of the first pixel to obtain three brightness values corresponding to the three color channels, wherein the first pixel is any pixel in the image to be processed; and to determine any one of the three brightness values as a first gray value corresponding to the first pixel; or to compare the three brightness values and determine the maximum brightness value as a second gray value corresponding to the first pixel; or to average the three brightness values and determine the resulting mean as a third gray value corresponding to the first pixel; or to compute a weighted average of the three brightness values, using three weights respectively corresponding to them, and determine the resulting target brightness value as a fourth gray value corresponding to the first pixel.
  • the purple-fringe determination module includes: a chromaticity extraction unit configured to extract the first chromaticity information corresponding to each pixel in the image to be processed; and a purple-fringe determination unit configured to use the first chromaticity information corresponding to each pixel to determine the purple-fringe area in the image to be processed.
  • the first chromaticity information includes first target chromaticity information and second target chromaticity information; and the information extraction module is further configured to obtain the first red intensity value, the first green intensity value, and the first blue intensity value of the first pixel in the image to be processed, wherein the first pixel is any pixel in the image to be processed; to determine the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first green intensity value of the first pixel; and to determine the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first green intensity value of the first pixel.
  • the information extraction module includes: a chromaticity determination unit configured to determine the first difference or first ratio between the first red intensity value and the first green intensity value of the first pixel as the first target chromaticity information of the first pixel in the image to be processed, and to determine the second difference or second ratio between the first blue intensity value and the first green intensity value of the first pixel as the second target chromaticity information of the first pixel in the image to be processed.
  • the first chromaticity information includes first target chromaticity information and second target chromaticity information; and the information extraction module is further configured to obtain the first red intensity value, the first blue intensity value, and the first brightness value of the first pixel in the image to be processed, wherein the first pixel is any pixel in the image to be processed; to determine the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first brightness value of the first pixel; and to determine the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first brightness value of the first pixel.
  • the chromaticity determination unit is further configured to determine the third difference or third ratio between the first red intensity value and the first brightness value of the first pixel as the first target chromaticity information of the first pixel in the image to be processed, and to determine the fourth difference or fourth ratio between the first blue intensity value and the first brightness value of the first pixel as the second target chromaticity information of the first pixel in the image to be processed.
  • the information extraction module further includes: a purple-fringe pixel determination unit configured to determine, according to the first target chromaticity information and the second target chromaticity information corresponding to each pixel in the image to be processed, the red channel value and the blue channel value corresponding to each pixel, and to determine the pixels in the image to be processed whose red channel value falls within the first preset range and whose blue channel value falls within the second preset range as belonging to the purple-fringe area.
  • the purple-fringe pixel determination unit is further configured to calculate the first product of the second target chromaticity information and the first preset threshold; if the first product is smaller than the first target chromaticity information, to determine the first product as the red channel value corresponding to the pixel; and if the first product is greater than the first target chromaticity information, to determine the first target chromaticity information as the red channel value corresponding to the pixel.
  • the purple-fringe pixel determination unit is further configured to calculate the second product of the first target chromaticity information and the second preset threshold; if the second product is smaller than the second target chromaticity information, to determine the second product as the blue channel value corresponding to the pixel; and if the second product is greater than the second target chromaticity information, to determine the second target chromaticity information as the blue channel value corresponding to the pixel.
  • the information extraction module is further configured to align and register the image to be processed with the dark frame image, to determine the first image area in the dark frame image corresponding to the purple-fringe area, and to extract the first image information corresponding to the first image area.
  • the information extraction module is further configured to determine, according to the coordinate position of each pixel of the purple-fringe area in the image to be processed, the target pixel in the grayscale image corresponding to that coordinate position; to determine the second image area from the target pixels; and to extract the second image information corresponding to the second image area.
  • the information extraction module is further configured to extract the second chromaticity information of the purple-fringe area from the dark frame image and to extract the brightness information of the purple-fringe area from the grayscale image; and the processing module is further configured to fuse the second chromaticity information with the brightness information to obtain the fused pixel information corresponding to the purple-fringe area.
  • the processing module includes: a pixel determination unit configured to determine the initial pixel information corresponding to the purple-fringe area in the image to be processed; and a pixel processing unit configured to replace that initial pixel information with the fused pixel information to obtain the target image in which the purple-fringe area is eliminated.
  • An electronic device comprising a memory and one or more processors, the memory storing computer-readable instructions; when the computer-readable instructions are executed by the one or more processors, the one or more processors execute the steps of any one of the image processing methods described above.
  • One or more non-volatile storage media storing computer-readable instructions which, when executed by one or more processors, cause the one or more processors to execute the steps of any one of the image processing methods described above.
  • FIG. 1a is a flowchart of the steps of an image processing method provided by one or more embodiments of the present disclosure.
  • FIG. 1b is a schematic diagram of an embodiment of determining a target image provided by one or more embodiments of the present disclosure.
  • FIG. 2 is a flowchart of the steps of an image processing method provided by one or more embodiments of the present disclosure.
  • FIG. 3 is a structural block diagram of an electronic device in one or more embodiments of the present disclosure.
  • FIG. 4 is an internal structural diagram of an electronic device in one or more embodiments of the present disclosure.
  • terms such as "first" and "second" in the specification and claims of the present disclosure are used to distinguish different objects, not to describe a specific order of objects.
  • the terms "first image information" and "second image information" are used to distinguish different image information, not to describe a specific sequence of image information.
  • words such as "exemplary" or "for example" are used to give an example, illustration, or explanation. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present disclosure shall not be construed as preferred or advantageous over other embodiments or designs; rather, the use of such words is intended to present related concepts in a concrete manner. In addition, in the description of the embodiments of the present disclosure, unless otherwise specified, "plurality" means two or more.
  • the image processing method provided in the present disclosure can be applied to an image processing system.
  • the image processing system includes an electronic device.
  • the electronic device obtains the image to be processed through the camera device, and obtains the dark frame image and the grayscale image corresponding to the image to be processed, wherein the image to be processed is an image captured based on the first exposure value, the dark frame image is an image captured based on the second exposure value, and the first exposure value is greater than the second exposure value. The electronic device then determines the purple-fringe area in the image to be processed, extracts the first image information of the purple-fringe area from the dark frame image and the second image information of the purple-fringe area from the grayscale image, and fuses the two to obtain the fused image information. Finally, the electronic device processes the purple-fringe area in the image to be processed according to the fused image information, so as to obtain the target image in which the purple-fringe area is eliminated.
  • the electronic devices may be, but are not limited to, various personal computers, notebook computers, smartphones, tablet computers, and portable wearable devices.
  • FIG. 1a is a flowchart of steps of an image processing method provided by one or more embodiments of the present disclosure, which may include:
  • the image to be processed is an image captured by the electronic device based on a first exposure value (Exposure Value, EV), the dark frame image is an image captured by the electronic device based on a second exposure value, and the first exposure value is greater than the second exposure value.
  • the image to be processed is a color (Red Green Blue, RGB) image.
  • the electronic device can obtain the image to be processed in real time and immediately eliminate the purple-fringe area in it; alternatively, it can first save the obtained image to be processed in memory and eliminate the purple-fringe area later, when the user uses the image.
  • the electronic device can obtain the image to be processed captured by the camera device based on the first exposure value. When the brightness contrast of the scene around the subject is high and the first exposure value is relatively large, color spots appear at the junction between the highlight and low-light parts of the subject, so the image to be processed acquired by the electronic device may contain fairly obvious purple fringing. Purple fringing degrades the image quality of the image to be processed, resulting in a poor imaging effect.
  • the above-mentioned first exposure value is any exposure value in the normal exposure value range, i.e., EV0.
  • the normal exposure value range is an interval formed by a first preset exposure threshold and a second preset exposure threshold, and the first preset exposure threshold is smaller than the second preset exposure threshold.
  • the electronic device can acquire the dark frame image captured by the camera device. To do so, the electronic device may first lower the exposure value of the camera device so that the adjusted second exposure value is smaller than the first exposure value, and then capture, based on the second exposure value, the dark frame image corresponding to the image to be processed. Even if the brightness contrast of the scene around the subject is large, the low second exposure value means that no color spots appear at the junction between the highlight and low-light parts of the subject; that is, the dark frame image acquired by the electronic device contains no purple fringing.
  • the above-mentioned second exposure value EV0' is smaller than EV0; moreover, the second exposure value is smaller than the minimum threshold of the normal exposure value range, that is, smaller than the first preset exposure threshold.
  • the grayscale image is the image obtained after the electronic device performs grayscale processing on the image to be processed. Because grayscale processing alters the chromaticity information of the image to be processed, the chromaticity information of the grayscale image differs from that of the image to be processed, while the deviation between the brightness information of the grayscale image and that of the image to be processed is relatively small.
  • the electronic device may use the first chromaticity information corresponding to each pixel to determine the purple fringing area in the image to be processed.
  • the purple fringe area refers to a color speckle area that appears at a junction between a high-light part and a low-light part of an object to be photographed in the image to be processed.
  • the first chromaticity information may at least include first target chromaticity information and second target chromaticity information.
  • the first target chromaticity information may be red chromaticity information
  • the second target chromaticity information may be blue chromaticity information.
  • the electronic device can use the two kinds of target chromaticity information to determine the red channel value and blue channel value corresponding to each pixel, and then judge, from those two channel values, whether each pixel belongs to the purple-fringe area, thereby effectively determining the purple-fringe area in the image to be processed.
  • the electronic device may first align and register the image to be processed with the dark frame image, then determine, from the purple-fringe area in the image to be processed, the first image area in the dark frame image corresponding to the purple-fringe area, and then extract the first image information corresponding to the first image area.
• the electronic device can determine, according to the coordinate position of each pixel of the purple-fringed area in the image to be processed, the target pixel at the corresponding coordinate position in the grayscale image; determine the second image area from those target pixels; and then extract the second image information corresponding to the second image area.
  • the first image information is different from the second image information.
  • the first image information may be second chromaticity information
  • the electronic device may extract the second chromaticity information corresponding to each pixel in the first image area.
  • the first target pixel is any pixel among the pixels in the first image area.
  • the second chromaticity information of the first target pixel may at least include third target chromaticity information of the first target pixel and fourth target chromaticity information of the first target pixel.
  • the third target chromaticity information may be red chromaticity information
  • the fourth target chromaticity information may be blue chromaticity information.
  • the second image information may be brightness information
  • the electronic device may extract brightness information corresponding to each pixel in the second image area.
  • the brightness information may include a brightness value.
  • the pixel information of the image includes chrominance information and luminance information.
• the electronic device can fuse the second chromaticity information extracted from the dark frame image according to the purple-fringed area with the brightness information extracted from the grayscale image according to the purple-fringed area to obtain new pixel information corresponding to the purple-fringed area; that is, the fused image information is obtained.
• because the new pixel information corresponding to the purple-fringed area is not affected by highlights and contrast, it facilitates the electronic device's subsequent processing of the purple-fringed area in the image to be processed.
• the electronic device can replace the image information corresponding to the purple-fringed area in the image to be processed with the fused image information, so as to eliminate the purple-fringed area. That is to say, in the process of using the fused image information to eliminate the purple-fringed area in the image to be processed, the electronic device replaces, rather than discards, the pixel information of the purple-fringed area, retaining the new pixel information corresponding to that area, so that the image content corresponding to the purple-fringed area is not lost and the integrity of the image to be processed is ensured.
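The fusion-and-replacement step described above can be sketched as follows. This is a minimal illustration rather than the claimed implementation: it assumes 8-bit RGB arrays, a boolean purple-fringe mask, and a BT.601-style YCbCr luma/chroma separation, none of which the disclosure mandates; `fuse_and_replace` is a hypothetical helper name.

```python
import numpy as np

def fuse_and_replace(image, dark_frame, gray, mask):
    """Replace the masked (purple-fringed) region of `image` with pixels
    fused from the dark frame's chromaticity and the grayscale image's
    brightness."""
    # Chromaticity (Cb, Cr) taken from the dark frame.
    df = dark_frame.astype(float)
    r, g, b = df[..., 0], df[..., 1], df[..., 2]
    y_dark = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 + 0.564 * (b - y_dark)
    cr = 128.0 + 0.713 * (r - y_dark)
    # Brightness taken from the grayscale image.
    y = gray.astype(float)
    # Invert YCbCr -> RGB to obtain the fused pixel values.
    fused = np.stack([
        y + 1.402 * (cr - 128.0),
        y - 0.344 * (cb - 128.0) - 0.714 * (cr - 128.0),
        y + 1.772 * (cb - 128.0),
    ], axis=-1)
    out = image.astype(float).copy()
    out[mask] = fused[mask]  # replace only the purple-fringed pixels
    return np.clip(out, 0, 255).astype(np.uint8)
```

Because only the masked pixels are rewritten, the rest of the image to be processed is untouched, matching the integrity argument above.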
  • FIG. 1 b it is a schematic diagram of an embodiment of determining a target image in an embodiment of the present disclosure.
• the electronic device extracts the first image information corresponding to the purple-fringed area from the dark frame image according to the purple-fringed area in the image to be processed, and extracts the second image information corresponding to the purple-fringed area from the grayscale image; the electronic device fuses the first image information with the second image information to obtain fused image information; then, the electronic device replaces the purple-fringed area in the image to be processed with the fused image information, and obtains the target image with the purple-fringed area eliminated.
• the electronic device acquires the image to be processed based on the first exposure value. Since the first exposure value is relatively large, the brightness contrast of the scene around the object to be photographed is relatively large, so purple fringing appears in the image to be processed; the purple fringing degrades the image quality of the image to be processed, resulting in a poor imaging effect. The dark frame image corresponds to the image to be processed, and since the second exposure value is relatively small, the brightness contrast of the scene around the object to be photographed is relatively small, so that there is no purple fringing in the dark frame image; the electronic device also obtains the grayscale image corresponding to the image to be processed. Then, the electronic device determines the purple-fringed area in the image to be processed. The electronic device can extract the first image information of the purple-fringed area in the dark frame image and the second image information of the purple-fringed area in the grayscale image, and fuse the first image information with the second image information so that the obtained fused image information is not affected by highlights and contrast. Finally, the electronic device processes the purple-fringed area in the image to be processed according to the fused image information, which can effectively eliminate the purple-fringed area and yield a more accurate and complete target image.
  • FIG. 2 is a flow chart of the steps of the image processing method provided by one or more embodiments of the present disclosure, which may include:
  • Control the camera device to obtain an image to be processed and a dark frame image corresponding to the image to be processed by shooting according to the first exposure value and the second exposure value respectively.
  • the electronic device may control the camera device to obtain an image to be processed by taking pictures according to the first exposure value.
  • the electronic device then adjusts the first exposure value to obtain a second exposure value, and controls the camera to capture a dark frame image corresponding to the image to be processed according to the second exposure value.
  • the first exposure value is greater than the second exposure value. There may be purple fringing in the image to be processed, but there is no purple fringing in the dark frame image.
• the electronic device processes the three color channel values of each pixel in the image to be processed to obtain the grayscale value corresponding to each pixel, and then obtains the grayscale image corresponding to the image to be processed according to the grayscale value corresponding to each pixel.
  • the three color channel values respectively include: a red (Red, R) channel value, a green (Green, G) channel value and a blue (Blue, B) channel value.
  • the first pixel is any pixel in each pixel.
• the manner in which the electronic device processes the three color channel values of each pixel in the image to be processed to obtain the grayscale value corresponding to each pixel may include, but is not limited to, at least one of the following implementations:
• Implementation Mode 1: The electronic device uses the component method to process the three color channel values of the first pixel in the image to be processed to obtain the first grayscale value corresponding to the first pixel.
  • the three color channel values of the first pixel correspond to three brightness values respectively.
  • the luminance value corresponding to the R channel value is L1
  • the luminance value corresponding to the G channel value is L2
  • the luminance value corresponding to the B channel value is L3.
  • the electronic device may use L1 as the first grayscale value corresponding to the first pixel; or, the electronic device may use L2 as the first grayscale value corresponding to the first pixel; or, the electronic device may use L3 as the first grayscale value corresponding to the first pixel.
• which brightness value is used as the first grayscale value corresponding to the first pixel is not specifically limited here. That is to say, the electronic device may determine any brightness value among L1, L2 and L3 as the first grayscale value corresponding to the first pixel.
• Implementation Mode 2: The electronic device uses the maximum value method to process the three color channel values of the first pixel in the image to be processed to obtain the second grayscale value corresponding to the first pixel.
• the electronic device can compare L1, L2 and L3: if L1>L2 and L1>L3, the electronic device uses the brightness value corresponding to the R channel as the second grayscale value corresponding to the first pixel; if L2>L1 and L2>L3, the electronic device uses the brightness value corresponding to the G channel as the second grayscale value corresponding to the first pixel; if L3>L1 and L3>L2, the electronic device uses the brightness value corresponding to the B channel as the second grayscale value corresponding to the first pixel. That is to say, after comparing L1, L2 and L3, the electronic device determines the maximum brightness value as the second grayscale value corresponding to the first pixel.
• Implementation Mode 3: The electronic device uses the average value method to process the three color channel values of the first pixel in the image to be processed to obtain the third grayscale value corresponding to the first pixel. The electronic device calculates the average brightness value L = (L1 + L2 + L3) / 3 and determines L as the third grayscale value corresponding to the first pixel.
• Implementation Mode 4: The electronic device uses the weighted average method to process the three color channel values of the first pixel in the image to be processed to obtain the fourth grayscale value corresponding to the first pixel.
  • the three color channel values of the first pixel correspond to three weights respectively, and these three weights may be the same or different, and these three weights are respectively a, b and c.
• the electronic device obtains the target brightness value by using the first formula, i.e. the weighted average a·L1 + b·L2 + c·L3, and then determines the target brightness value as the fourth grayscale value corresponding to the first pixel.
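The four implementation modes above can be collected into one helper. A sketch only: the disclosure leaves the weights a, b and c open, so the common BT.601 coefficients are used here purely as a placeholder default, and `to_gray` is a hypothetical name.

```python
import numpy as np

def to_gray(image, mode="weighted", weights=(0.299, 0.587, 0.114), component=0):
    """Convert an RGB image to grayscale using one of the four modes:
    component, maximum value, average value, or weighted average."""
    img = image.astype(float)
    if mode == "component":    # Mode 1: take one channel's brightness value
        gray = img[..., component]
    elif mode == "max":        # Mode 2: maximum of the three brightness values
        gray = img.max(axis=-1)
    elif mode == "average":    # Mode 3: (L1 + L2 + L3) / 3
        gray = img.mean(axis=-1)
    elif mode == "weighted":   # Mode 4: a*L1 + b*L2 + c*L3
        a, b, c = weights
        gray = a * img[..., 0] + b * img[..., 1] + c * img[..., 2]
    else:
        raise ValueError(f"unknown mode: {mode}")
    return np.clip(gray, 0, 255).astype(np.uint8)
```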
  • performing grayscale processing on the image to be processed by the electronic device to obtain the grayscale image may include: performing binarization processing on the image to be processed by the electronic device to obtain a binarized image.
  • the binarized image is an image composed of a pixel value of 0 and a pixel value of 1, wherein a pixel value of 0 represents black, and a pixel value of 1 represents white.
  • the electronic device can acquire the grayscale value corresponding to each pixel in the image to be processed, and compare each grayscale value with a preset grayscale threshold. Then, the pixel points whose grayscale value is greater than or equal to the preset grayscale threshold value are recorded as 1, and the pixel points whose grayscale value is smaller than the preset grayscale threshold value are recorded as 0, thereby obtaining a binarized image.
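The thresholding rule above is a one-liner in NumPy; the preset grayscale threshold of 128 used as a default below is only an illustrative value, since the disclosure does not fix it.

```python
import numpy as np

def binarize(gray, threshold=128):
    """Record pixels whose grayscale value is greater than or equal to
    the preset threshold as 1 (white) and the rest as 0 (black)."""
    return (gray >= threshold).astype(np.uint8)
```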
• whether the electronic device performs grayscale processing or binarization on the image to be processed, the purpose is to facilitate the extraction of image information in subsequent steps.
  • the first pixel is any pixel in the image to be processed
  • the first chromaticity information includes first target chromaticity information and second target chromaticity information
• the manner in which the electronic device extracts the first chromaticity information corresponding to each pixel may include, but is not limited to, one of the following implementations:
• Implementation 1: The electronic device acquires the first red intensity value, the first green intensity value and the first blue intensity value of the first pixel in the image to be processed; the electronic device determines the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first green intensity value; and determines the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first green intensity value.
  • the RGB space of the image to be processed includes three kinds of components, which are R component, G component and B component respectively.
  • the red intensity value refers to the intensity value of the R component in the range of (0, 255);
  • the green intensity value refers to the intensity value of the G component in the range of (0, 255);
• the blue intensity value refers to the intensity value of the B component in the range of (0, 255).
• the electronic device determining the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first green intensity value of the first pixel may include: the electronic device determines the first difference or the first ratio between the first red intensity value and the first green intensity value of the first pixel as the first target chromaticity information of the first pixel in the image to be processed.
• after the electronic device acquires the first red intensity value and the first green intensity value of the first pixel, the electronic device subtracts the first green intensity value from the first red intensity value to obtain the first difference, or divides the first red intensity value by the first green intensity value to obtain the first ratio; the electronic device then determines the first difference or the first ratio as the first target chromaticity information of the first pixel in the image to be processed.
• the electronic device needs to perform grayscale processing on the first pixel corresponding to the first target chromaticity information, or directly record the first target chromaticity information as 0.
• the electronic device determining the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first green intensity value of the first pixel includes: the electronic device determines the second difference or the second ratio between the first blue intensity value and the first green intensity value of the first pixel as the second target chromaticity information of the first pixel in the image to be processed.
• after the electronic device obtains the first blue intensity value and the first green intensity value of the first pixel, the electronic device subtracts the first green intensity value from the first blue intensity value to obtain the second difference, or divides the first blue intensity value by the first green intensity value to obtain the second ratio; the electronic device then determines the second difference or the second ratio as the second target chromaticity information of the first pixel in the image to be processed.
• the electronic device needs to perform grayscale processing on the first pixel corresponding to the second target chromaticity information, or directly record the second target chromaticity information as 0.
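Implementation 1 can be sketched as follows; whether differences or ratios are used is left open above, so both variants appear, and the clamp to 0 reflects the "record as 0" note. The function name is hypothetical.

```python
import numpy as np

def rgb_chromaticity(image, use_ratio=False):
    """Per pixel, compute the first target (red) and second target (blue)
    chromaticity information against the green channel."""
    img = image.astype(float)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    if use_ratio:
        eps = 1e-6                       # guard against division by zero
        red_chroma = r / (g + eps)       # first ratio
        blue_chroma = b / (g + eps)      # second ratio
    else:
        red_chroma = np.maximum(r - g, 0.0)   # first difference, clamped at 0
        blue_chroma = np.maximum(b - g, 0.0)  # second difference, clamped at 0
    return red_chroma, blue_chroma
```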
• Implementation 2: The electronic device converts the RGB space of the image to be processed into the YCbCr space, in which brightness and chromaticity are separated, and obtains the first red intensity value, the first blue intensity value and the first brightness value of the first pixel in the image to be processed; the electronic device determines the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first brightness value of the first pixel; and determines the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first brightness value of the first pixel.
  • Y represents the luminance value
  • Cb refers to the blue chrominance component
  • Cr refers to the red chrominance component
• the electronic device determining the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first brightness value of the first pixel may include: the electronic device determines the third difference between the first red intensity value of the first pixel and the first brightness value as the first target chromaticity information of the first pixel in the image to be processed.
• after the electronic device obtains the first red intensity value and the first brightness value of the first pixel, the electronic device subtracts the first brightness value from the first red intensity value to obtain the third difference; the electronic device then determines the third difference as the first target chromaticity information of the first pixel in the image to be processed.
• the electronic device needs to perform grayscale processing on the first pixel corresponding to the first target chromaticity information, or directly record the first target chromaticity information as 0.
• the electronic device determining the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first brightness value of the first pixel includes: the electronic device determines the fourth difference between the first blue intensity value of the first pixel and the first brightness value as the second target chromaticity information of the first pixel in the image to be processed.
• after the electronic device acquires the first blue intensity value and the first brightness value of the first pixel, the electronic device subtracts the first brightness value from the first blue intensity value to obtain the fourth difference; the electronic device then determines the fourth difference as the second target chromaticity information of the first pixel in the image to be processed.
• the electronic device needs to perform grayscale processing on the first pixel corresponding to the second target chromaticity information, or directly record the second target chromaticity information as 0.
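The YCbCr-based variant can be sketched in the same way: separate brightness from chromaticity and take R − Y and B − Y as the third and fourth differences. BT.601 luma weights are assumed here because the disclosure does not pin down the exact YCbCr definition; the function name is hypothetical.

```python
import numpy as np

def ycbcr_chromaticity(image):
    """Per pixel, return the first brightness value Y together with the
    third difference (R - Y) and fourth difference (B - Y)."""
    img = image.astype(float)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma: the first brightness value
    return y, r - y, b - y                  # (Y, red chroma, blue chroma)
```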
  • the timing of determining the first target chromaticity information and determining the second target chromaticity information by the electronic device is not limited.
• the electronic device can use the first chromaticity information corresponding to each pixel to determine the pixels belonging to the purple-fringed area in the image to be processed, and then, according to those pixels, determine the purple-fringed area in the image to be processed more accurately.
• the electronic device using the first chromaticity information corresponding to each pixel to determine the purple-fringed area in the image to be processed may include: the electronic device determines the red channel value and the blue channel value corresponding to each pixel according to the first target chromaticity information and the second target chromaticity information; the electronic device determines that the pixels in the image to be processed whose red channel value is within the first preset range and whose blue channel value is within the second preset range belong to the purple-fringed area.
• the electronic device determining the red channel value corresponding to each pixel according to the first target chromaticity information and the second target chromaticity information corresponding to each pixel in the image to be processed may include: the electronic device calculates the first product value of the second target chromaticity information and the first preset threshold; the electronic device compares the first product value with the first target chromaticity information; if the first product value is smaller than the first target chromaticity information, the first product value is determined as the red channel value corresponding to each pixel; if the first product value is greater than the first target chromaticity information, the first target chromaticity information is determined as the red channel value corresponding to each pixel.
  • the first preset threshold may be set before the electronic device leaves the factory, or may be customized by the user according to requirements, which is not specifically limited here.
• the electronic device determining the blue channel value corresponding to each pixel according to the first target chromaticity information and the second target chromaticity information corresponding to each pixel in the image to be processed may include: the electronic device calculates the second product value of the first target chromaticity information and the second preset threshold; the electronic device compares the second product value with the second target chromaticity information; if the second product value is smaller than the second target chromaticity information, the second product value is determined as the blue channel value corresponding to each pixel; if the second product value is greater than the second target chromaticity information, the second target chromaticity information is determined as the blue channel value corresponding to each pixel.
• the second preset threshold can be set before the electronic device leaves the factory, or can be customized by the user according to requirements; the second preset threshold may be the same as or different from the first preset threshold, which is not specifically limited here.
  • the first preset range and the second preset range are set by the electronic device before leaving the factory, and the first preset range and the second preset range may be the same or different. No specific limitation is made here.
• after the electronic device acquires the red channel value and the blue channel value corresponding to each pixel in the image to be processed, if the electronic device is to determine the purple-fringed area in the image to be processed, it needs to judge the red channel value and the blue channel value corresponding to each pixel to determine the pixels belonging to the purple-fringed area, so that the purple-fringed area of the image to be processed can be accurately determined.
  • the method may further include: the electronic device sets the green channel value corresponding to the pixel belonging to the purple-fringed area to 0.
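The channel-value construction, range test and green-channel zeroing above can be sketched together. The factors k1 and k2 (standing in for the first and second preset thresholds) and both preset ranges are illustrative placeholders, since the disclosure leaves them to factory settings or user configuration; the function name is hypothetical.

```python
import numpy as np

def detect_purple_fringe(image, red_chroma, blue_chroma,
                         k1=0.8, k2=0.8,
                         red_range=(20.0, 255.0), blue_range=(20.0, 255.0)):
    """Flag purple-fringe pixels and zero their green channel.

    The red channel value is the smaller of k1 * (blue chromaticity) and
    the red chromaticity; the blue channel value is formed symmetrically.
    A pixel belongs to the purple-fringed area when both values fall
    inside their preset ranges."""
    red_channel = np.minimum(k1 * blue_chroma, red_chroma)
    blue_channel = np.minimum(k2 * red_chroma, blue_chroma)
    mask = ((red_channel >= red_range[0]) & (red_channel <= red_range[1]) &
            (blue_channel >= blue_range[0]) & (blue_channel <= blue_range[1]))
    out = image.copy()
    out[mask, 1] = 0   # set the green channel of purple-fringe pixels to 0
    return mask, out
```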
• the electronic device can extract the red chromaticity information and blue chromaticity information corresponding to each pixel of the purple-fringed area in the dark frame image, and can also extract the brightness value corresponding to each pixel of the purple-fringed area in the dark frame image.
• the electronic device can correspondingly combine the second chromaticity information of each pixel extracted from the purple-fringed area in the dark frame image with the brightness information of each pixel extracted from the purple-fringed area in the grayscale image to obtain new pixel information, that is, the fused image information.
• after the electronic device determines the purple-fringed area in the image to be processed, it can obtain the first chromaticity information and initial brightness information corresponding to each pixel of the purple-fringed area in the image to be processed, that is, it acquires the initial pixel information corresponding to each pixel in the image to be processed.
• the electronic device can replace the original pixel information corresponding to the purple-fringed area with the new pixel information, so as to eliminate the purple-fringed area in the image to be processed. That is to say, in the process of using the new pixel information to eliminate the purple-fringed area in the image to be processed, the electronic device replaces, rather than discards, the pixel information of the purple-fringed area, retaining the new pixel information corresponding to that area, so that the image content corresponding to the purple-fringed area is not lost and the integrity of the image to be processed is ensured.
• the electronic device acquires the image to be processed based on the first exposure value. Since the first exposure value is relatively large, the brightness contrast of the scene around the object to be photographed is relatively large, so purple fringing appears in the image to be processed; the purple fringing degrades the image quality of the image to be processed, resulting in a poor imaging effect. The dark frame image corresponds to the image to be processed, and since the second exposure value is relatively small, the brightness contrast of the scene around the object to be photographed is relatively small, so that there is no purple fringing in the dark frame image; the electronic device also obtains the grayscale image corresponding to the image to be processed. Then, the electronic device determines the purple-fringed area in the image to be processed. The electronic device can extract the second chromaticity information of the purple-fringed area in the dark frame image and the luminance information of the purple-fringed area in the grayscale image, and fuse the second chromaticity information with the luminance information so that the obtained fused pixel information is not affected by highlights and contrast. Finally, the electronic device replaces the purple-fringed area in the image to be processed with the fused pixel information, which can effectively eliminate the purple-fringed area in the image to be processed and yield a more accurate and complete target image.
• although the steps in FIG. 1a, FIG. 1b and FIG. 2 are shown sequentially as indicated by the arrows, these steps are not necessarily executed sequentially in the order indicated by the arrows. Unless otherwise specified herein, there is no strict order restriction on the execution of these steps, and these steps can be executed in other orders. Moreover, at least some of the steps in FIG. 1a, FIG. 1b and FIG. 2 may include multiple sub-steps or multiple stages; these sub-steps or stages are not necessarily performed at the same time, but may be performed at different times, and their execution order is not necessarily sequential; they may be executed in turn or alternately with at least a part of other steps, or of the sub-steps or stages of other steps.
• the embodiment of the present disclosure also provides an electronic device. This electronic device embodiment corresponds to the foregoing method embodiments; for ease of reading, it does not repeat the details of the foregoing method embodiments one by one, but it should be clear that the electronic device in this embodiment can correspondingly implement all the content of the foregoing method embodiments.
  • FIG. 3 is a structural block diagram of an electronic device provided in an embodiment of the present disclosure. As shown in FIG. 3 , the electronic device 300 provided in this embodiment includes:
• the acquisition module 301 is configured to acquire an image to be processed, and a dark frame image and a grayscale image corresponding to the image to be processed, wherein the image to be processed is an image captured based on a first exposure value, the dark frame image is an image captured based on a second exposure value, and the first exposure value is greater than the second exposure value;
• the purple fringe determination module 302 is configured to determine the purple-fringed area in the image to be processed;
• the information extraction module 303 is configured to extract the first image information of the purple-fringed area in the dark frame image, and to extract the second image information of the purple-fringed area in the grayscale image;
• the processing module 304 is configured to fuse the first image information and the second image information to obtain fused image information, and to process the purple-fringed area in the image to be processed according to the fused image information to obtain the target image with the purple-fringed area eliminated.
  • the obtaining module 301 includes:
  • the photographing unit 3011 is configured to photograph the image to be processed and the dark frame image corresponding to the image to be processed according to the first exposure value and the second exposure value respectively;
• the grayscale processing unit 3012 is configured to perform grayscale processing on the image to be processed to obtain the grayscale image.
• the grayscale processing unit 3012 is also configured to binarize the image to be processed to obtain a binarized image; the information extraction module 303 is also configured to extract the second image information of the purple-fringed area in the binarized image.
• the grayscale processing unit 3012 is further configured to process the three color channel values of each pixel in the image to be processed to obtain the grayscale value corresponding to each pixel, and to obtain, according to the grayscale value corresponding to each pixel, the grayscale image corresponding to the image to be processed.
• the grayscale processing unit 3012 is further configured to: process the three color channel values of the first pixel to obtain three brightness values corresponding to the three color channel values, wherein the first pixel is any pixel in the image to be processed; determine any one of the three brightness values as the first grayscale value corresponding to the first pixel; or compare the three brightness values and determine the maximum brightness value as the second grayscale value corresponding to the first pixel; or calculate the average of the three brightness values by the average value method and determine the average brightness value as the third grayscale value corresponding to the first pixel; or calculate the target brightness value by the weighted average method according to the three weights respectively corresponding to the three brightness values, and determine the target brightness value as the fourth grayscale value corresponding to the first pixel.
  • the purple fringing determination module 302 includes:
• the chromaticity extraction unit 3021 is configured to extract the first chromaticity information corresponding to each pixel in the image to be processed; the purple fringing determination unit 3022 is configured to determine the purple-fringed area in the image to be processed by using the first chromaticity information corresponding to each pixel.
• the first chromaticity information includes first target chromaticity information and second target chromaticity information; the information extraction module 303 is also configured to: obtain the first red intensity value, the first green intensity value and the first blue intensity value of the first pixel in the image to be processed, wherein the first pixel is any pixel in the image to be processed; determine the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first green intensity value; and determine the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first green intensity value.
• the information extraction module 303 includes a chromaticity determination unit 3031; the chromaticity determination unit 3031 is configured to determine the first difference or the first ratio between the first red intensity value and the first green intensity value of the first pixel as the first target chromaticity information of the first pixel in the image to be processed, and to determine the second difference or the second ratio between the first blue intensity value and the first green intensity value of the first pixel as the second target chromaticity information of the first pixel in the image to be processed;
  • the first chromaticity information includes first target chromaticity information and second target chromaticity information; the information extraction module 303 is also configured to obtain the first red intensity value, the first blue intensity value, and the first brightness value of the first pixel in the image to be processed, where the first pixel is any pixel in the image to be processed; to determine the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first brightness value; and to determine the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first brightness value.
  • the chromaticity determination unit 3031 is further configured to determine the third difference or third ratio between the first red intensity value and the first brightness value of the first pixel as the first target chromaticity information of the first pixel in the image to be processed, and to determine the fourth difference or fourth ratio between the first blue intensity value and the first brightness value of the first pixel as the second target chromaticity information of the first pixel in the image to be processed.
  • the information extraction module further includes: a purple-fringe pixel determination unit 3032, configured to determine the red channel value and the blue channel value corresponding to each pixel according to the first target chromaticity information and the second target chromaticity information of each pixel in the image to be processed, and to determine the pixels in the image to be processed whose red channel value lies within a first preset range and whose blue channel value lies within a second preset range as pixels belonging to the purple-fringe area.
  • the purple-fringe pixel determination unit 3032 is further configured to calculate a first product value of the second target chromaticity information and a first preset threshold; if the first product value is smaller than the first target chromaticity information, the first product value is determined as the red channel value corresponding to each pixel; if the first product value is greater than the first target chromaticity information, the first target chromaticity information is determined as the red channel value corresponding to each pixel.
  • the purple-fringe pixel determination unit 3032 is further configured to calculate a second product value of the first target chromaticity information and a second preset threshold; if the second product value is smaller than the second target chromaticity information, the second product value is determined as the blue channel value corresponding to each pixel; if the second product value is greater than the second target chromaticity information, the second target chromaticity information is determined as the blue channel value corresponding to each pixel.
  • the information extraction module 303 is further configured to align and register the image to be processed with the dark-frame image, determine the first image area in the dark-frame image corresponding to the purple-fringe area, and extract the first image information corresponding to the first image area.
  • the information extraction module 303 is also configured to determine, according to the coordinate position of each pixel of the purple-fringe area in the image to be processed, the target pixel in the grayscale image corresponding to that coordinate position; to determine a second image area according to the target pixels; and to extract the second image information corresponding to the second image area.
  • the information extraction module 303 is further configured to extract the second chromaticity information of the purple-fringe area in the dark-frame image and the brightness information of the purple-fringe area in the grayscale image; and the processing module 304 is also configured to fuse the second chromaticity information with the brightness information to obtain fused pixel information corresponding to the purple-fringe area.
  • the processing module 304 includes: a pixel determination unit 3041, configured to determine the initial pixel information corresponding to the purple-fringe area in the image to be processed; and a pixel processing unit 3042, configured to replace the initial pixel information corresponding to the purple-fringe area in the image to be processed with the fused pixel information, so as to obtain a target image with the purple-fringe area eliminated.
  • the electronic device provided in this embodiment can execute the image processing method provided in the foregoing method embodiment, and its implementation principle and technical effect are similar, and will not be repeated here.
  • Each module in the above-mentioned electronic device can be implemented in whole or in part by software, hardware, or a combination thereof.
  • the above-mentioned modules can be embedded in or independent of the processor in the electronic device in the form of hardware, and can also be stored in the memory of the electronic device in the form of software, so that the processor can invoke and execute the corresponding operations of the above-mentioned modules.
  • an electronic device is provided.
  • the electronic device may be a terminal device, and its internal structure may be as shown in FIG. 4 .
  • the electronic device includes a processor, a memory, a communication interface, a database, a display screen and an input device connected through a system bus.
  • the processor of the electronic device provides computing and control capabilities.
  • the memory of the electronic device includes a non-volatile storage medium and an internal memory.
  • the non-volatile storage medium stores an operating system and computer readable instructions.
  • the internal memory provides an environment for the execution of the operating system and computer readable instructions in the non-volatile storage medium.
  • the communication interface of the electronic device is configured to communicate with an external terminal in a wired or wireless manner; the wireless manner can be realized through Wi-Fi, a carrier network, near-field communication (NFC), or other technologies.
  • when the computer-readable instructions are executed by the processor, the image processing method provided by the above-mentioned embodiments is realized.
  • the display screen of the electronic device may be a liquid crystal display or an electronic ink display; the input device of the electronic device may be a touch layer covering the display screen, a button, trackball, or touchpad provided on the housing of the electronic device, or an external keyboard, touchpad, or mouse.
  • FIG. 4 is only a block diagram of a partial structure related to the disclosed solution, and does not constitute a limitation on the electronic device to which the disclosed solution is applied.
  • a specific electronic device may include more or fewer components than shown in the figures, combine certain components, or have a different arrangement of components.
  • the electronic device provided by the present disclosure may be implemented in the form of computer-readable instructions, and the computer-readable instructions may run on the electronic device shown in FIG. 4 .
  • Various program modules constituting the electronic device can be stored in the memory of the electronic device, for example, the acquisition module 301 and the purple fringe determination module 302 shown in FIG. 3 .
  • the computer-readable instructions constituted by the various program modules cause the processor to execute the steps in the image processing methods of the various embodiments of the present disclosure described in this specification.
  • an electronic device is provided, including a memory and one or more processors, the memory being configured to store computer-readable instructions; when the computer-readable instructions are executed by the processors, the one or more processors execute the steps of the image processing method described in the above method embodiments.
  • the electronic device provided in this embodiment can implement the image processing method provided in the above method embodiment, and its implementation principle and technical effect are similar, and will not be repeated here.
  • One or more non-volatile storage media storing computer-readable instructions are provided; when the computer-readable instructions are executed by one or more processors, the one or more processors execute the steps of the image processing method described in any one of the above.
  • the computer-readable instructions stored on the computer-readable storage medium provided by this embodiment can implement the image processing method provided by the above-mentioned method embodiment, and its implementation principle and technical effect are similar, and will not be repeated here.
  • Non-volatile memory may include read-only memory (Read-Only Memory, ROM), magnetic tape, floppy disk, flash memory or optical memory, etc.
  • Volatile memory can include random access memory (Random Access Memory, RAM) or external cache memory.
  • By way of illustration and not limitation, RAM is available in many forms, such as static random access memory (SRAM) and dynamic random access memory (DRAM).
  • the image processing method provided by the present disclosure can effectively improve the shooting effect of the electronic device when the object is photographed by the camera device, effectively eliminate the purple fringe area in the image to be processed, so as to obtain a more accurate and complete target image, and has the advantages of Strong industrial applicability.


Abstract

Embodiments of the present disclosure disclose an image processing method, an electronic device, and a computer-readable storage medium, which can effectively eliminate the purple-fringe area in an image to be processed and improve image quality. The method of the embodiments of the present disclosure includes: acquiring an image to be processed, and a dark-frame image and a grayscale image corresponding to the image to be processed, wherein the image to be processed is an image captured with a first exposure value, the dark-frame image is an image captured with a second exposure value, and the first exposure value is greater than the second exposure value; determining the purple-fringe area in the image to be processed; extracting first image information of the purple-fringe area in the dark-frame image and second image information of the purple-fringe area in the grayscale image, and fusing the first image information with the second image information to obtain fused image information; and processing the purple-fringe area in the image to be processed according to the fused image information to obtain a target image with the purple-fringe area eliminated.

Description

Image processing method, electronic device, and computer-readable storage medium
Cross-reference to related applications
The present disclosure claims priority to the Chinese patent application filed with the China National Intellectual Property Administration on February 10, 2022, with application number 2022101243236 and entitled "Image processing method, electronic device and computer-readable storage medium", the entire contents of which are incorporated into the present disclosure by reference.
Technical field
The present disclosure relates to an image processing method, an electronic device, and a computer-readable storage medium.
Background
At present, when an electronic device photographs an object through a camera device, the large brightness contrast of the scene around the photographed object easily causes a fairly obvious purple fringe at the boundary between highlight and low-light parts. In addition, the formation of the purple fringe is also related to the lens parameters of the camera device and the internal interpolation algorithm in the electronic device. As a result, the formation of the purple fringe cannot be completely avoided in the images captured by the electronic device, which affects the image quality and visual effect of the images to varying degrees.
However, among related image processing approaches, neither adjusting the lens parameters of the camera device nor optimizing the internal interpolation algorithm in the electronic device can effectively eliminate the purple fringe in the image.
Summary
(1) Technical problem to be solved
In the prior art, when an electronic device photographs an object through a camera device, a fairly obvious purple fringe appears at the boundary between highlight and low-light parts, affecting the image quality and visual effect of the image.
(2) Technical solution
According to various embodiments disclosed in the present disclosure, an image processing method, an electronic device, and a computer-readable storage medium are provided.
An image processing method, the method including:
acquiring an image to be processed, and a dark-frame image and a grayscale image corresponding to the image to be processed, wherein the image to be processed is an image captured with a first exposure value, the dark-frame image is an image captured with a second exposure value, and the first exposure value is greater than the second exposure value;
determining the purple-fringe area in the image to be processed;
extracting first image information of the purple-fringe area in the dark-frame image, extracting second image information of the purple-fringe area in the grayscale image, and fusing the first image information with the second image information to obtain fused image information;
processing the purple-fringe area in the image to be processed according to the fused image information to obtain a target image with the purple-fringe area eliminated.
As an optional implementation of the embodiments of the present disclosure, acquiring the image to be processed and the dark-frame image and grayscale image corresponding to the image to be processed includes: controlling a camera device to capture, with the first exposure value and the second exposure value respectively, the image to be processed and the dark-frame image corresponding to the image to be processed; and performing grayscale processing on the image to be processed to obtain the grayscale image.
As an optional implementation of the embodiments of the present disclosure, performing grayscale processing on the image to be processed to obtain the grayscale image includes: performing binarization on the image to be processed to obtain a binarized image; and extracting the second image information of the purple-fringe area in the grayscale image includes: extracting the second image information of the purple-fringe area in the binarized image.
As an optional implementation of the embodiments of the present disclosure, processing the three color channel values of each pixel in the image to be processed to obtain the grayscale value corresponding to each pixel includes: processing the three color channel values of a first pixel to obtain three brightness values corresponding to the three color channel values, where the first pixel is any pixel in the image to be processed; and determining any one of the three brightness values as a first grayscale value of the first pixel; or comparing the three brightness values and determining the maximum brightness value as a second grayscale value of the first pixel; or computing the average of the three brightness values and determining the brightness average as a third grayscale value of the first pixel; or computing a target brightness value by weighted averaging according to three weights corresponding to the three brightness values, and determining the target brightness value as a fourth grayscale value of the first pixel.
As an optional implementation of the embodiments of the present disclosure, processing the three channel values of each pixel in the image to be processed to obtain the grayscale value corresponding to each pixel includes: processing the three channel values of a first pixel to obtain three brightness values of the first pixel, where the first pixel is any pixel in the image to be processed; and taking any one of the three brightness values, or the maximum brightness value, or the average brightness value, or the weighted-average brightness value as the grayscale value of the first pixel.
As an optional implementation of the embodiments of the present disclosure, determining the purple-fringe area in the image to be processed includes: extracting first chromaticity information corresponding to each pixel in the image to be processed; and determining the purple-fringe area in the image to be processed using the first chromaticity information corresponding to each pixel.
As an optional implementation of the embodiments of the present disclosure, the first chromaticity information includes first target chromaticity information and second target chromaticity information, and extracting the first chromaticity information corresponding to each pixel in the image to be processed includes: obtaining a first red intensity value, a first green intensity value, and a first blue intensity value of a first pixel in the image to be processed, where the first pixel is any pixel in the image to be processed; determining the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first green intensity value of the first pixel; and determining the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first green intensity value of the first pixel.
As an optional implementation of the embodiments of the present disclosure, determining the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first green intensity value of the first pixel includes: determining a first difference or first ratio between the first red intensity value and the first green intensity value of the first pixel as the first target chromaticity information of the first pixel in the image to be processed; and determining the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first green intensity value of the first pixel includes: determining a second difference or second ratio between the first blue intensity value and the first green intensity value of the first pixel as the second target chromaticity information of the first pixel in the image to be processed.
As an optional implementation of the embodiments of the present disclosure, the first chromaticity information includes first target chromaticity information and second target chromaticity information, and extracting the first chromaticity information corresponding to each pixel in the image to be processed includes: obtaining a first red intensity value, a first blue intensity value, and a first brightness value of a first pixel in the image to be processed, where the first pixel is any pixel in the image to be processed; determining the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first brightness value of the first pixel; and determining the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first brightness value of the first pixel.
As an optional implementation of the embodiments of the present disclosure, determining the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first brightness value of the first pixel includes: determining a third difference or third ratio between the first red intensity value and the first brightness value of the first pixel as the first target chromaticity information of the first pixel in the image to be processed; and determining the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first brightness value of the first pixel includes: determining a fourth difference or fourth ratio between the first blue intensity value and the first brightness value of the first pixel as the second target chromaticity information of the first pixel in the image to be processed.
As an optional implementation of the embodiments of the present disclosure, determining the purple-fringe area in the image to be processed using the first chromaticity information includes: determining a red channel value and a blue channel value corresponding to each pixel according to the first target chromaticity information and the second target chromaticity information corresponding to each pixel in the image to be processed; and determining the pixels in the image to be processed whose red channel value lies within a first preset range and whose blue channel value lies within a second preset range as pixels belonging to the purple-fringe area.
As an optional implementation of the embodiments of the present disclosure, determining the red channel value corresponding to each pixel according to the first target chromaticity information and the second target chromaticity information corresponding to each pixel in the image to be processed includes: computing a first product value of the second target chromaticity information and a first preset threshold; if the first product value is smaller than the first target chromaticity information, determining the first product value as the red channel value corresponding to each pixel; and if the first product value is greater than the first target chromaticity information, determining the first target chromaticity information as the red channel value corresponding to each pixel.
As an optional implementation of the embodiments of the present disclosure, determining the blue channel value corresponding to each pixel according to the first target chromaticity information and the second target chromaticity information corresponding to each pixel in the image to be processed includes: computing a second product value of the first target chromaticity information and a second preset threshold; if the second product value is smaller than the second target chromaticity information, determining the second product value as the blue channel value corresponding to each pixel; and if the second product value is greater than the second target chromaticity information, determining the second target chromaticity information as the blue channel value corresponding to each pixel.
As an optional implementation of the embodiments of the present disclosure, extracting the first image information of the purple-fringe area in the dark-frame image includes: aligning and registering the image to be processed with the dark-frame image, determining a first image area in the dark-frame image corresponding to the purple-fringe area, and extracting the first image information corresponding to the first image area.
As an optional implementation of the embodiments of the present disclosure, extracting the second image information of the purple-fringe area in the grayscale image includes: determining, according to the coordinate position of each pixel of the purple-fringe area in the image to be processed, a target pixel in the grayscale image corresponding to the coordinate position of each pixel; determining a second image area according to the target pixels; and extracting the second image information corresponding to the second image area.
As an optional implementation of the embodiments of the present disclosure, extracting the first image information of the purple-fringe area in the dark-frame image, extracting the second image information of the purple-fringe area in the grayscale image, and fusing the first image information with the second image information to obtain the fused image information includes: extracting second chromaticity information of the purple-fringe area in the dark-frame image and extracting brightness information of the purple-fringe area in the grayscale image; and fusing the second chromaticity information with the brightness information to obtain fused pixel information corresponding to the purple-fringe area.
As an optional implementation of the embodiments of the present disclosure, processing the purple-fringe area in the image to be processed according to the fused image information to obtain the target image with the purple-fringe area eliminated includes: determining initial pixel information corresponding to the purple-fringe area in the image to be processed; and replacing the initial pixel information corresponding to the purple-fringe area in the image to be processed with the fused pixel information to obtain the target image with the purple-fringe area eliminated.
An electronic device may include:
an acquisition module, configured to acquire an image to be processed, and a dark-frame image and a grayscale image corresponding to the image to be processed, wherein the image to be processed is an image captured with a first exposure value, the dark-frame image is an image captured with a second exposure value, and the first exposure value is greater than the second exposure value;
a purple-fringe determination module, configured to determine the purple-fringe area in the image to be processed;
an information extraction module, configured to extract first image information of the purple-fringe area in the dark-frame image, and to extract second image information of the purple-fringe area in the grayscale image;
a processing module, configured to fuse the first image information with the second image information to obtain fused image information, and to process the purple-fringe area in the image to be processed according to the fused image information to obtain a target image with the purple-fringe area eliminated.
As an optional implementation of the embodiments of the present disclosure, the acquisition module includes: a capture unit, configured to capture, with the first exposure value and the second exposure value respectively, the image to be processed and the dark-frame image corresponding to the image to be processed; and a grayscale processing unit, configured to perform grayscale processing on the image to be processed to obtain the grayscale image.
As an optional implementation of the embodiments of the present disclosure, the grayscale processing unit is further configured to binarize the image to be processed to obtain a binarized image; and the information extraction module is configured to extract the second image information of the purple-fringe area in the binarized image.
As an optional implementation of the embodiments of the present disclosure, the grayscale processing unit is further configured to process the three color channel values of each pixel in the image to be processed to obtain the grayscale value corresponding to each pixel, and to obtain the grayscale image corresponding to the image to be processed according to the grayscale value corresponding to each pixel.
As an optional implementation of the embodiments of the present disclosure, the grayscale processing unit is further configured to process the three color channel values of the first pixel to obtain three brightness values corresponding to the three color channel values, where the first pixel is any pixel in the image to be processed; and to determine any one of the three brightness values as a first grayscale value of the first pixel; or to compare the three brightness values and determine the maximum brightness value as a second grayscale value of the first pixel; or to compute the average of the three brightness values and determine the brightness average as a third grayscale value of the first pixel; or to compute a target brightness value by weighted averaging according to three weights corresponding to the three brightness values and determine the target brightness value as a fourth grayscale value of the first pixel.
As an optional implementation of the embodiments of the present disclosure, the purple-fringe determination module includes: a chromaticity extraction unit, configured to extract the first chromaticity information corresponding to each pixel in the image to be processed; and a purple-fringe determination unit, configured to determine the purple-fringe area in the image to be processed using the first chromaticity information corresponding to each pixel.
As an optional implementation of the embodiments of the present disclosure, the first chromaticity information includes first target chromaticity information and second target chromaticity information; the information extraction module is further configured to obtain the first red intensity value, the first green intensity value, and the first blue intensity value of the first pixel in the image to be processed, where the first pixel is any pixel in the image to be processed; to determine the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first green intensity value of the first pixel; and to determine the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first green intensity value of the first pixel.
As an optional implementation of the embodiments of the present disclosure, the information extraction module includes: a chromaticity determination unit, configured to determine the first difference or first ratio between the first red intensity value and the first green intensity value of the first pixel as the first target chromaticity information of the first pixel in the image to be processed, and to determine the second difference or second ratio between the first blue intensity value and the first green intensity value of the first pixel as the second target chromaticity information of the first pixel in the image to be processed.
As an optional implementation of the embodiments of the present disclosure, the first chromaticity information includes first target chromaticity information and second target chromaticity information; the information extraction module is further configured to obtain the first red intensity value, the first blue intensity value, and the first brightness value of the first pixel in the image to be processed, where the first pixel is any pixel in the image to be processed; to determine the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first brightness value of the first pixel; and to determine the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first brightness value of the first pixel.
As an optional implementation of the embodiments of the present disclosure, the chromaticity determination unit is further configured to determine the third difference or third ratio between the first red intensity value and the first brightness value of the first pixel as the first target chromaticity information of the first pixel in the image to be processed, and to determine the fourth difference or fourth ratio between the first blue intensity value and the first brightness value of the first pixel as the second target chromaticity information of the first pixel in the image to be processed.
As an optional implementation of the embodiments of the present disclosure, the information extraction module further includes: a purple-fringe pixel determination unit, configured to determine the red channel value and blue channel value corresponding to each pixel according to the first target chromaticity information and the second target chromaticity information corresponding to each pixel in the image to be processed, and to determine the pixels in the image to be processed whose red channel value lies within a first preset range and whose blue channel value lies within a second preset range as pixels belonging to the purple-fringe area.
As an optional implementation of the embodiments of the present disclosure, the purple-fringe pixel determination unit is further configured to compute a first product value of the second target chromaticity information and a first preset threshold; if the first product value is smaller than the first target chromaticity information, to determine the first product value as the red channel value corresponding to each pixel; and if the first product value is greater than the first target chromaticity information, to determine the first target chromaticity information as the red channel value corresponding to each pixel.
As an optional implementation of the embodiments of the present disclosure, the purple-fringe pixel determination unit is further configured to compute a second product value of the first target chromaticity information and a second preset threshold; if the second product value is smaller than the second target chromaticity information, to determine the second product value as the blue channel value corresponding to each pixel; and if the second product value is greater than the second target chromaticity information, to determine the second target chromaticity information as the blue channel value corresponding to each pixel.
As an optional implementation of the embodiments of the present disclosure, the information extraction module is further configured to align and register the image to be processed with the dark-frame image, determine a first image area in the dark-frame image corresponding to the purple-fringe area, and extract the first image information corresponding to the first image area.
As an optional implementation of the embodiments of the present disclosure, the information extraction module is further configured to determine, according to the coordinate position of each pixel of the purple-fringe area in the image to be processed, a target pixel in the grayscale image corresponding to that coordinate position; to determine a second image area according to the target pixels; and to extract the second image information corresponding to the second image area.
As an optional implementation of the embodiments of the present disclosure, the information extraction module is further configured to extract second chromaticity information of the purple-fringe area in the dark-frame image and brightness information of the purple-fringe area in the grayscale image; and the processing module is further configured to fuse the second chromaticity information with the brightness information to obtain fused pixel information corresponding to the purple-fringe area.
As an optional implementation of the embodiments of the present disclosure, the processing module includes: a pixel determination unit, configured to determine the initial pixel information corresponding to the purple-fringe area in the image to be processed; and a pixel processing unit, configured to replace the initial pixel information corresponding to the purple-fringe area in the image to be processed with the fused pixel information, obtaining a target image with the purple-fringe area eliminated.
An electronic device, including a memory and one or more processors, the memory being configured to store computer-readable instructions; when the computer-readable instructions are executed by the processors, the one or more processors execute the steps of the image processing method described in any one of the above.
One or more non-volatile storage media storing computer-readable instructions; when the computer-readable instructions are executed by one or more processors, the one or more processors execute the steps of the image processing method described in any one of the above.
Other features and advantages of the present disclosure will be set forth in the following description and in part become apparent from the description, or be understood by practicing the present disclosure. The objectives and other advantages of the present disclosure are realized and obtained by the structures particularly pointed out in the description, the claims, and the accompanying drawings; details of one or more embodiments of the present disclosure are presented in the following drawings and description.
To make the above objectives, features, and advantages of the present disclosure more comprehensible, optional embodiments are described in detail below with reference to the accompanying drawings.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
In order to explain the technical solutions of the embodiments of the present disclosure or the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, a person of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
FIG. 1a is a flowchart of the steps of an image processing method provided by one or more embodiments of the present disclosure;
FIG. 1b is a schematic diagram of one embodiment of determining a target image provided by one or more embodiments of the present disclosure;
FIG. 2 is a flowchart of the steps of an image processing method provided by one or more embodiments of the present disclosure;
FIG. 3 is a structural block diagram of an electronic device in one or more embodiments of the present disclosure;
FIG. 4 is an internal structure diagram of an electronic device in one or more embodiments of the present disclosure.
Detailed description
In order to understand the above objectives, features, and advantages of the present disclosure more clearly, the solutions of the present disclosure are further described below. It should be noted that, in the absence of conflict, the embodiments of the present disclosure and the features in the embodiments can be combined with one another.
Many specific details are set forth in the following description to facilitate a full understanding of the present disclosure, but the present disclosure can also be implemented in ways different from those described herein; obviously, the embodiments in the description are only a part of the embodiments of the present disclosure, not all of them.
The terms "first" and "second" and the like in the description and claims of the present disclosure are used to distinguish different objects, not to describe a specific order of objects. For example, the first image information and the second image information are used to distinguish different image information, not to describe a specific order of image information.
In the embodiments of the present disclosure, words such as "exemplary" or "for example" are used to mean serving as an example, illustration, or explanation. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present disclosure should not be construed as preferable or advantageous over other embodiments or designs. Rather, the use of such words is intended to present related concepts in a concrete manner. In addition, in the description of the embodiments of the present disclosure, unless otherwise stated, "multiple" means two or more.
The image processing method provided by the present disclosure can be applied to an image processing system. The image processing system includes an electronic device. The electronic device acquires an image to be processed through a camera device, and acquires a dark-frame image and a grayscale image corresponding to the image to be processed, where the image to be processed is an image captured with a first exposure value, the dark-frame image is an image captured with a second exposure value, and the first exposure value is greater than the second exposure value; the electronic device then determines the purple-fringe area in the image to be processed, extracts first image information of the purple-fringe area in the dark-frame image and second image information of the purple-fringe area in the grayscale image, and fuses the first image information with the second image information to obtain fused image information; according to the fused image information, the electronic device processes the purple-fringe area in the image to be processed, thereby obtaining a target image with the purple-fringe area eliminated. The electronic device can be, but is not limited to, various personal computers, laptops, smartphones, tablets, and portable wearable devices.
Referring to FIG. 1a, FIG. 1a is a flowchart of the steps of an image processing method provided by one or more embodiments of the present disclosure, which may include:
101. Acquire an image to be processed, and a dark-frame image and a grayscale image corresponding to the image to be processed.
Here, the image to be processed is an image captured by the electronic device with a first exposure value (Exposure Value, EV), the dark-frame image is an image captured by the electronic device with a second exposure value, and the first exposure value is greater than the second exposure value.
In some embodiments, the image to be processed is a color (Red Green Blue, RGB) image. The electronic device can acquire the image to be processed in real time and immediately eliminate the purple-fringe area in it; it can also first save the acquired image to be processed in the memory and eliminate the purple-fringe area in the image when the user uses it.
The electronic device can acquire the image to be processed captured by the camera device with the first exposure value. Because the brightness contrast of the scene around the photographed object is large and the first exposure value is high, color blotches appear at the boundary between the highlight and low-light parts of the photographed object, so a fairly obvious purple fringe may exist in the image to be processed acquired by the electronic device. The purple fringe affects the image quality of the image to be processed, resulting in a poor imaging effect.
The first exposure value is any exposure value in the normal exposure value range, i.e., EV0. The normal exposure value range is an interval formed by a first preset exposure threshold and a second preset exposure threshold, where the first preset exposure threshold is smaller than the second preset exposure threshold.
The electronic device can acquire the dark-frame image captured by the camera device. The electronic device can first adjust the exposure value of the camera device so that the adjusted second exposure value is smaller than the first exposure value, and then capture, through the camera device with the second exposure value, a dark-frame image corresponding to the image to be processed. Even if the brightness contrast of the scene around the photographed object is large, because the adjusted second exposure value is low, no color blotches appear at the boundary between the highlight and low-light parts of the photographed object; that is, no purple fringe exists in the dark-frame image acquired by the electronic device. The brightness information in the dark-frame image deviates from that in the image to be processed, while the deviation between the chromaticity information in the dark-frame image and that in the image to be processed is small.
The second exposure value EV0' is smaller than EV0; at the same time, the second exposure value is smaller than the minimum exposure threshold of the normal exposure value range, i.e., smaller than the first preset exposure threshold.
The grayscale image refers to the image obtained after the electronic device performs grayscale processing on the image to be processed. Because the electronic device adjusts the chromaticity information of the image to be processed, the chromaticity information in the resulting grayscale image differs from that in the image to be processed, while the deviation between the brightness information in the grayscale image and that in the image to be processed is small.
It should be noted that the order in which the electronic device acquires the dark-frame image and the grayscale image is not limited.
102. Determine the purple-fringe area in the image to be processed.
In some embodiments, the electronic device can use the first chromaticity information corresponding to each pixel to determine the purple-fringe area in the image to be processed.
The purple-fringe area refers to the color-blotch area appearing at the boundary between the highlight and low-light parts of the photographed object in the image to be processed.
The first chromaticity information can include at least first target chromaticity information and second target chromaticity information. The first target chromaticity information can be red chromaticity information, and the second target chromaticity information can be blue chromaticity information. The electronic device can use these two pieces of target chromaticity information to determine the red channel value and blue channel value corresponding to each pixel, and then judge, according to the red channel value and blue channel value of each pixel, whether each pixel belongs to the purple-fringe area, thereby effectively determining the purple-fringe area in the image to be processed.
103. Extract first image information of the purple-fringe area in the dark-frame image, and extract second image information of the purple-fringe area in the grayscale image.
In some embodiments, the electronic device can first align and register the image to be processed with the dark-frame image, then determine, according to the purple-fringe area in the image to be processed, a first image area in the dark-frame image corresponding to the purple-fringe area, and then extract the first image information corresponding to the first image area.
In some embodiments, the electronic device can determine, according to the coordinate position of each pixel of the purple-fringe area in the image to be processed, a target pixel in the grayscale image corresponding to the coordinate position of each pixel, then determine a second image area from these target pixels, and then extract the second image information corresponding to the second image area.
The first image information is different from the second image information.
Optionally, the first image information can be second chromaticity information, and the electronic device can extract the second chromaticity information corresponding to each pixel in the first image area.
Here, the first target pixel is any one of the pixels in the first image area. The second chromaticity information of the first target pixel can include at least third target chromaticity information and fourth target chromaticity information of the first target pixel. The third target chromaticity information can be red chromaticity information, and the fourth target chromaticity information can be blue chromaticity information.
Optionally, the second image information can be brightness information, and the electronic device can extract the brightness information corresponding to each pixel in the second image area. The brightness information can include brightness values.
It should be noted that the order in which the electronic device extracts the first image information and the second image information is not limited.
104. Fuse the first image information with the second image information to obtain fused image information.
In some embodiments, the pixel information of an image includes chromaticity information and brightness information. The electronic device can fuse the second chromaticity information extracted for the purple-fringe area from the dark-frame image with the brightness information extracted for the purple-fringe area from the grayscale image, obtaining new pixel information corresponding to the purple-fringe area, i.e., the fused image information.
Because the new pixel information corresponding to the purple-fringe area is not affected by highlights and contrast, it facilitates the electronic device's subsequent processing of the purple-fringe area in the image to be processed.
105. Process the purple-fringe area in the image to be processed according to the fused image information to obtain a target image with the purple-fringe area eliminated.
In some embodiments, because the fused image information is not affected by highlights and contrast, the electronic device can replace the image information corresponding to the purple-fringe area in the image to be processed with the fused image information to eliminate the purple-fringe area. That is, in the process of using the fused image information to eliminate the purple-fringe area in the image to be processed, the electronic device replaces the pixel information of the purple-fringe area, i.e., keeps the new pixel information corresponding to the purple-fringe area, so that the image content corresponding to the purple-fringe area is not lost, thereby ensuring the integrity of the image to be processed.
Exemplarily, FIG. 1b is a schematic diagram of one embodiment of determining a target image in the embodiments of the present disclosure. In FIG. 1b, the electronic device extracts, according to the purple-fringe area in the image to be processed, the first image information corresponding to the purple-fringe area from the dark-frame image and the second image information corresponding to the purple-fringe area from the grayscale image; the electronic device fuses the first image information with the second image information to obtain fused image information; then, the electronic device replaces the purple-fringe area in the image to be processed with the fused image information, obtaining a target image with the purple-fringe area eliminated.
In the embodiments of the present disclosure, the electronic device acquires an image to be processed captured with a first exposure value; because the first exposure value is large, the brightness contrast of the scene around the photographed object is large, so a purple fringe exists in the image to be processed, affecting its image quality and resulting in a poor imaging effect. The electronic device acquires a dark-frame image captured with a second exposure value, corresponding to the image to be processed; because the second exposure value is small, the brightness contrast of the scene around the photographed object is small, so no purple fringe exists in the dark-frame image. The electronic device also acquires a grayscale image corresponding to the image to be processed. The electronic device then determines the purple-fringe area in the image to be processed. Next, the electronic device can extract the first image information of the purple-fringe area in the dark-frame image and the second image information of the purple-fringe area in the grayscale image, and fuse the first image information with the second image information, so that the resulting fused image information is not affected by highlights and contrast. Finally, according to the fused image information, the electronic device processes the purple-fringe area in the image to be processed, which can effectively eliminate the purple-fringe area in the image to be processed and obtain a more accurate and complete target image.
Referring to FIG. 2, FIG. 2 is a flowchart of the steps of an image processing method provided by one or more embodiments of the present disclosure, which may include:
201. Control the camera device to capture, with a first exposure value and a second exposure value respectively, an image to be processed and a dark-frame image corresponding to the image to be processed.
In some embodiments, the electronic device can control the camera device to capture the image to be processed with the first exposure value. The electronic device then adjusts the first exposure value to obtain the second exposure value, and controls the camera device to capture the dark-frame image corresponding to the image to be processed with the second exposure value.
The first exposure value is greater than the second exposure value. A purple fringe may exist in the image to be processed, while no purple fringe exists in the dark-frame image.
202. Perform grayscale processing on the image to be processed to obtain a grayscale image.
In some embodiments, the electronic device processes the three color channel values of each pixel in the image to be processed to obtain the grayscale value corresponding to each pixel, and then obtains the grayscale image corresponding to the image to be processed according to the grayscale value corresponding to each pixel.
The three color channel values are the red (Red, R) channel value, the green (Green, G) channel value, and the blue (Blue, B) channel value.
Optionally, the first pixel is any one of the pixels. The electronic device processing the three color channel values of each pixel in the image to be processed to obtain the grayscale value corresponding to each pixel can include, but is not limited to, at least one of the following implementations:
Implementation 1: the electronic device uses the component method to process the three color channel values of the first pixel in the image to be processed, obtaining the first grayscale value corresponding to the first pixel.
In one embodiment, the three color channel values of the first pixel correspond to three brightness values: the brightness value corresponding to the R channel value is L1, that corresponding to the G channel value is L2, and that corresponding to the B channel value is L3. The electronic device can take L1, L2, or L3 as the first grayscale value corresponding to the first pixel, which is not specifically limited here. That is, the electronic device can determine any one of L1, L2, and L3 as the first grayscale value corresponding to the first pixel.
Implementation 2: the electronic device uses the maximum-value method to process the three color channel values of the first pixel in the image to be processed, obtaining the second grayscale value corresponding to the first pixel.
In one embodiment, the electronic device can compare L1, L2, and L3. If L1 > L2 and L1 > L3, the electronic device takes the brightness value corresponding to the R channel value as the grayscale value of the first pixel; if L2 > L1 and L2 > L3, it takes the brightness value corresponding to the G channel value; if L3 > L1 and L3 > L2, it takes the brightness value corresponding to the B channel value. That is, after comparing L1, L2, and L3, the electronic device determines the maximum brightness value as the grayscale value corresponding to the first pixel.
Implementation 3: the electronic device uses the average-value method to process the three color channel values of the first pixel in the image to be processed, obtaining the third grayscale value corresponding to the first pixel.
In one embodiment, the electronic device can compute the brightness average L of L1, L2, and L3, i.e., L = (L1 + L2 + L3) / 3, and determine the brightness average L as the grayscale value corresponding to the first pixel.
Implementation 4: the electronic device uses the weighted-average method to process the three color channel values of the first pixel in the image to be processed, obtaining the fourth grayscale value corresponding to the first pixel.
In one embodiment, the three color channel values of the first pixel correspond to three weights, which can be the same or different, denoted a, b, and c respectively. The electronic device uses the first formula to obtain the target brightness value and determines the target brightness value as the grayscale value corresponding to the first pixel. The first formula is L' = aL1 + bL2 + cL3, where L' denotes the target brightness value.
It should be noted that whichever of implementations 1-4 the electronic device executes, a fairly complete grayscale image can be obtained.
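The four per-pixel grayscale strategies of implementations 1-4 (component, maximum, average, weighted average) can be sketched as follows. This is an illustrative sketch only: the patent does not fix the weights a, b, c, so the common BT.601-style weights 0.299/0.587/0.114 are assumed here as one possible choice.

```python
def gray_component(r, g, b, channel=0):
    # Implementation 1: take any single channel's brightness as the gray value.
    return (r, g, b)[channel]

def gray_max(r, g, b):
    # Implementation 2: compare the three brightness values, keep the maximum.
    return max(r, g, b)

def gray_average(r, g, b):
    # Implementation 3: arithmetic mean, L = (L1 + L2 + L3) / 3.
    return (r + g + b) / 3

def gray_weighted(r, g, b, weights=(0.299, 0.587, 0.114)):
    # Implementation 4: weighted average L' = a*L1 + b*L2 + c*L3.
    a, b_, c = weights
    return a * r + b_ * g + c * b
```

Applying any one of these functions to every pixel of the RGB image yields the grayscale image of step 202.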
Optionally, the electronic device performing grayscale processing on the image to be processed to obtain the grayscale image can include: the electronic device performing binarization on the image to be processed to obtain a binarized image.
In one embodiment, a binarized image is an image composed of pixel values 0 and 1, where a pixel value of 0 represents black and a pixel value of 1 represents white.
The electronic device can obtain the grayscale value corresponding to each pixel in the image to be processed and compare each grayscale value with a preset grayscale threshold. Pixels whose grayscale value is greater than or equal to the preset grayscale threshold are recorded as 1, and pixels whose grayscale value is smaller than the preset grayscale threshold are recorded as 0, thereby obtaining the binarized image.
It should be noted that whether the electronic device performs grayscale processing or binarization on the image to be processed, the purpose is to facilitate the subsequent acquisition of image information from the image.
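The binarization step described above can be sketched as below; the threshold value 128 is an arbitrary illustrative stand-in for the unspecified preset grayscale threshold.

```python
def binarize(gray_image, threshold=128):
    # Pixels whose gray value reaches the threshold become 1 (white),
    # all other pixels become 0 (black).
    return [[1 if px >= threshold else 0 for px in row] for row in gray_image]
```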
203. Extract first chromaticity information corresponding to each pixel in the image to be processed.
Optionally, the first pixel is any pixel in the image to be processed, and the first chromaticity information includes first target chromaticity information and second target chromaticity information. The electronic device extracting the first chromaticity information corresponding to each pixel in the image to be processed can include, but is not limited to, one of the following implementations:
Implementation 1: the electronic device obtains the first red intensity value, the first green intensity value, and the first blue intensity value of the first pixel in the image to be processed; it determines the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first green intensity value of the first pixel; and it determines the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first green intensity value of the first pixel.
The RGB space of the image to be processed includes three components: the R component, the G component, and the B component. The red intensity value refers to the intensity value of the R component in the range (0, 255); the green intensity value refers to the intensity value of the G component in the range (0, 255); and the blue intensity value refers to the intensity value of the B component in the range (0, 255).
Optionally, the electronic device determining the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first green intensity value can include: determining the first difference or first ratio between the first red intensity value and the first green intensity value of the first pixel as the first target chromaticity information of the first pixel in the image to be processed.
In some embodiments, after obtaining the first red intensity value and the first green intensity value of the first pixel, the electronic device subtracts the first green intensity value from the first red intensity value to obtain the first difference, or divides the first red intensity value by the first green intensity value to obtain the first ratio; the electronic device then determines the first difference or the first ratio as the first target chromaticity information of the first pixel in the image to be processed.
If the first difference or the first ratio is negative, i.e., the first target chromaticity information is negative, the electronic device needs to perform grayscale processing on the first pixel corresponding to the first target chromaticity information, or directly record the first target chromaticity information as 0.
Optionally, the electronic device determining the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first green intensity value includes: determining the second difference or second ratio between the first blue intensity value and the first green intensity value of the first pixel as the second target chromaticity information of the first pixel in the image to be processed.
In some embodiments, after obtaining the first blue intensity value and the first green intensity value of the first pixel, the electronic device subtracts the first green intensity value from the first blue intensity value to obtain the second difference, or divides the first blue intensity value by the first green intensity value to obtain the second ratio; the electronic device then determines the second difference or the second ratio as the second target chromaticity information of the first pixel in the image to be processed.
If the second difference or the second ratio is negative, i.e., the second target chromaticity information is negative, the electronic device needs to perform grayscale processing on the first pixel corresponding to the second target chromaticity information, or directly record the second target chromaticity information as 0.
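Implementation 1 of step 203 (chromaticity against the green channel, with negative values recorded as 0) can be sketched per pixel as below; the function name and the choice of returning the two values as a pair are illustrative, not from the patent.

```python
def target_chromaticities(r, g, b, use_ratio=False):
    """Return (first, second) target chromaticity information of one pixel.

    `first` compares the red channel against green, `second` compares the
    blue channel against green, using either the difference or the ratio.
    """
    if use_ratio:
        first = r / g if g else 0.0
        second = b / g if g else 0.0
    else:
        first = r - g
        second = b - g
    # A negative value is directly recorded as 0, as described above.
    return max(first, 0), max(second, 0)
```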
Implementation 2: the electronic device converts the RGB space of the image to be processed into the luma-chroma-separated YCbCr space, and obtains the first red intensity value, the first blue intensity value, and the first brightness value of the first pixel in the image to be processed; it determines the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first brightness value of the first pixel; and it determines the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first brightness value of the first pixel.
In the YCbCr space, Y represents the brightness value, Cb refers to the blue chroma component, and Cr refers to the red chroma component.
Optionally, the electronic device determining the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first brightness value can include: the electronic device determining the third difference between the first red intensity value and the first brightness value of the first pixel as the first target chromaticity information of the first pixel in the image to be processed.
In some embodiments, after obtaining the first red intensity value and the first brightness value of the first pixel, the electronic device subtracts the first brightness value from the first red intensity value to obtain the third difference; the electronic device then determines the third difference as the first target chromaticity information of the first pixel in the image to be processed.
If the third difference is negative, i.e., the first target chromaticity information is negative, the electronic device needs to perform grayscale processing on the first pixel corresponding to the first target chromaticity information, or directly record the first target chromaticity information as 0.
Optionally, the electronic device determining the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first brightness value includes: the electronic device determining the fourth difference between the first blue intensity value and the first brightness value of the first pixel as the second target chromaticity information of the first pixel in the image to be processed.
In some embodiments, after obtaining the first blue intensity value and the first brightness value of the first pixel, the electronic device subtracts the first brightness value from the first blue intensity value to obtain the fourth difference; the electronic device then determines the fourth difference as the second target chromaticity information of the first pixel in the image to be processed.
If the fourth difference is negative, i.e., the second target chromaticity information is negative, the electronic device needs to perform grayscale processing on the first pixel corresponding to the second target chromaticity information, or directly record the second target chromaticity information as 0.
It should be noted that, in both implementation 1 and implementation 2, the order in which the electronic device determines the first target chromaticity information and the second target chromaticity information is not limited.
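Implementation 2 of step 203 (chromaticity against a brightness value) can be sketched as below. The patent leaves the luminance formula open; this sketch assumes the common BT.601 luma as a concrete stand-in and clamps negative differences to 0 as described.

```python
def luma_bt601(r, g, b):
    # One common luminance definition (BT.601); the exact Y is an assumption here.
    return 0.299 * r + 0.587 * g + 0.114 * b

def target_chromaticities_y(r, g, b):
    y = luma_bt601(r, g, b)
    first = max(r - y, 0.0)   # third difference: red intensity minus brightness
    second = max(b - y, 0.0)  # fourth difference: blue intensity minus brightness
    return first, second
```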
204. Determine the purple-fringe area in the image to be processed using the first chromaticity information corresponding to each pixel.
In some embodiments, after extracting the first chromaticity information corresponding to each pixel in the image to be processed, the electronic device can use the first chromaticity information corresponding to each pixel to determine the pixels belonging to the purple-fringe area in the image to be processed, and then, based on the pixels belonging to the purple-fringe area, determine the purple-fringe area in the image to be processed fairly accurately.
Optionally, the electronic device determining the purple-fringe area in the image to be processed using the first chromaticity information corresponding to each pixel can include: the electronic device determining the red channel value and blue channel value corresponding to each pixel according to the first target chromaticity information and second target chromaticity information corresponding to each pixel in the image to be processed; and the electronic device determining the pixels in the image to be processed whose red channel value lies within a first preset range and whose blue channel value lies within a second preset range as pixels belonging to the purple-fringe area.
Optionally, the electronic device determining the red channel value corresponding to each pixel according to the first target chromaticity information and second target chromaticity information can include: the electronic device computing the first product value of the second target chromaticity information and a first preset threshold; the electronic device comparing the first product value with the first target chromaticity information, and if the first product value is smaller than the first target chromaticity information, determining the first product value as the red channel value corresponding to each pixel; if the first product value is greater than the first target chromaticity information, determining the first target chromaticity information as the red channel value corresponding to each pixel.
The first preset threshold can be set before the electronic device leaves the factory, or customized by the user as needed, and is not specifically limited here.
Optionally, the electronic device determining the blue channel value corresponding to each pixel according to the first target chromaticity information and second target chromaticity information can include: the electronic device computing the second product value of the first target chromaticity information and a second preset threshold; the electronic device comparing the second product value with the second target chromaticity information, and if the second product value is smaller than the second target chromaticity information, determining the second product value as the blue channel value corresponding to each pixel; if the second product value is greater than the second target chromaticity information, determining the second target chromaticity information as the blue channel value corresponding to each pixel.
The second preset threshold can be set before the electronic device leaves the factory or customized by the user as needed; it can be the same as or different from the first preset threshold, and is not specifically limited here.
In some embodiments, the first preset range and the second preset range are set before the electronic device leaves the factory; they can be the same or different, and are not specifically limited here.
Because the electronic device obtains the red channel value and blue channel value corresponding to each pixel in the image to be processed, to determine the purple-fringe area in the image to be processed it needs to judge the red channel value and blue channel value corresponding to each pixel to determine the pixels belonging to the purple-fringe area, so that the purple-fringe area of the image to be processed can be determined accurately.
In some embodiments, after the electronic device determines the pixels belonging to the purple-fringe area, the method can further include: the electronic device setting the green channel value corresponding to the pixels belonging to the purple-fringe area to 0.
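Step 204 — deriving the red and blue channel values as the smaller of the scaled opposite chromaticity and the pixel's own chromaticity, then testing them against two preset ranges — can be sketched as below. The thresholds `k1`, `k2` and the two ranges are illustrative placeholders for the patent's unspecified presets.

```python
def purple_fringe_mask(chroma, k1=1.0, k2=1.0,
                       red_range=(30, 255), blue_range=(30, 255)):
    """chroma: 2-D list of (first, second) target-chromaticity pairs.

    The red channel value is min(k1 * second, first), the blue channel
    value is min(k2 * first, second); a pixel belongs to the purple-fringe
    area when both values fall within their preset ranges.
    """
    mask = []
    for row in chroma:
        mask_row = []
        for first, second in row:
            red = min(k1 * second, first)
            blue = min(k2 * first, second)
            mask_row.append(red_range[0] <= red <= red_range[1]
                            and blue_range[0] <= blue <= blue_range[1])
        mask.append(mask_row)
    return mask
```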
205. Extract second chromaticity information of the purple-fringe area in the dark-frame image, and extract brightness information of the purple-fringe area in the grayscale image.
In some embodiments, the electronic device can extract the red chromaticity information and blue chromaticity information corresponding to each pixel of the purple-fringe area in the dark-frame image, and can also extract the brightness value corresponding to each pixel of the purple-fringe area in the grayscale image.
206. Fuse the second chromaticity information with the brightness information to obtain fused pixel information corresponding to the purple-fringe area.
In some embodiments, the electronic device can combine the second chromaticity information of each pixel extracted for the purple-fringe area from the dark-frame image with the brightness information of each pixel extracted for the purple-fringe area from the grayscale image, obtaining new pixel information, i.e., the fused image information.
207. Determine initial pixel information corresponding to the purple-fringe area in the image to be processed.
In some embodiments, after determining the purple-fringe area in the image to be processed, the electronic device can obtain, in the image to be processed, the first chromaticity information and initial brightness information corresponding to each pixel of the purple-fringe area, i.e., obtain the initial pixel information corresponding to each of those pixels in the image to be processed.
208. Replace the initial pixel information corresponding to the purple-fringe area in the image to be processed with the fused pixel information to obtain a target image with the purple-fringe area eliminated.
In some embodiments, because the new pixel information is not affected by highlights and contrast, the electronic device can replace the initial pixel information corresponding to the purple-fringe area with the new pixel information to eliminate the purple-fringe area in the image to be processed. That is, in the process of using the new pixel information to eliminate the purple-fringe area in the image to be processed, the electronic device replaces the pixel information of the purple-fringe area, i.e., keeps the new pixel information corresponding to the purple-fringe area, so that the image content corresponding to the purple-fringe area is not lost, thereby ensuring the integrity of the image to be processed.
In the embodiments of the present disclosure, the electronic device acquires an image to be processed captured with a first exposure value; because the first exposure value is large, the brightness contrast of the scene around the photographed object is large, so a purple fringe exists in the image to be processed, affecting its image quality and resulting in a poor imaging effect. The electronic device acquires a dark-frame image captured with a second exposure value, corresponding to the image to be processed; because the second exposure value is small, the brightness contrast of the scene around the photographed object is small, so no purple fringe exists in the dark-frame image. The electronic device also acquires a grayscale image corresponding to the image to be processed. The electronic device then determines the purple-fringe area in the image to be processed. Next, the electronic device can extract the second chromaticity information of the purple-fringe area in the dark-frame image and the brightness information of the purple-fringe area in the grayscale image, and fuse the second chromaticity information with the brightness information, so that the resulting fused pixel information is not affected by highlights and contrast. Finally, the electronic device replaces the purple-fringe area in the image to be processed with the fused pixel information, which can effectively eliminate the purple-fringe area in the image to be processed and obtain a more accurate and complete target image.
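Steps 205-208 (extract chroma from the dark frame and brightness from the grayscale image, fuse them, and overwrite the purple-fringe pixels) can be sketched end to end as below. The patent does not name a concrete chroma model for the fusion; full-range BT.601 YCbCr is assumed here purely for illustration.

```python
def fuse_and_replace(image, dark, gray, mask):
    """Replace each masked pixel of `image` (RGB rows of (r, g, b) tuples)
    with a pixel whose chroma comes from `dark` (aligned RGB dark frame)
    and whose brightness comes from `gray` (grayscale values)."""
    for y, row in enumerate(mask):
        for x, is_purple in enumerate(row):
            if not is_purple:
                continue
            r, g, b = dark[y][x]
            # Chroma of the aligned dark-frame pixel (full-range BT.601).
            cb = -0.168736 * r - 0.331264 * g + 0.5 * b
            cr = 0.5 * r - 0.418688 * g - 0.081312 * b
            luma = gray[y][x]  # brightness of the corresponding grayscale pixel
            # Fuse: rebuild RGB from (luma, cb, cr) and overwrite the pixel.
            nr = luma + 1.402 * cr
            ng = luma - 0.344136 * cb - 0.714136 * cr
            nb = luma + 1.772 * cb
            image[y][x] = tuple(min(255, max(0, round(v))) for v in (nr, ng, nb))
    return image
```

Pixels outside the mask keep their initial pixel information, which matches the replacement-only processing of step 208.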
It should be understood that although the steps in the flowcharts of FIG. 1a, FIG. 1b, and FIG. 2 are displayed in sequence as indicated by the arrows, these steps are not necessarily executed in that order. Unless explicitly stated herein, the execution of these steps is not strictly limited in order, and they can be executed in other orders. Moreover, at least some of the steps in FIG. 1a, FIG. 1b, and FIG. 2 can include multiple sub-steps or stages, which are not necessarily completed at the same moment but can be executed at different moments; the execution order of these sub-steps or stages is not necessarily sequential, and they can be executed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Based on the same inventive concept, as an implementation of the above method, an embodiment of the present disclosure further provides an electronic device. This device embodiment corresponds to the foregoing method embodiment. For ease of reading, this device embodiment does not repeat the details of the foregoing method embodiment one by one, but it should be clear that the electronic device in this embodiment can correspondingly implement all the contents of the foregoing method embodiment.
FIG. 3 is a structural block diagram of the electronic device provided by an embodiment of the present disclosure. As shown in FIG. 3, the electronic device 300 provided by this embodiment includes:
an acquisition module 301, configured to acquire an image to be processed, and a dark-frame image and a grayscale image corresponding to the image to be processed, wherein the image to be processed is an image captured with a first exposure value, the dark-frame image is an image captured with a second exposure value, and the first exposure value is greater than the second exposure value;
a purple-fringe determination module 302, configured to determine the purple-fringe area in the image to be processed;
an information extraction module 303, configured to extract first image information of the purple-fringe area in the dark-frame image, and to extract second image information of the purple-fringe area in the grayscale image;
a processing module 304, configured to fuse the first image information with the second image information to obtain fused image information, and to process the purple-fringe area in the image to be processed according to the fused image information to obtain a target image with the purple-fringe area eliminated.
As an optional implementation of the embodiments of the present disclosure, the acquisition module 301 includes:
a capture unit 3011, configured to capture, with the first exposure value and the second exposure value respectively, the image to be processed and the dark-frame image corresponding to the image to be processed; and a grayscale processing unit 3012, configured to perform grayscale processing on the image to be processed to obtain the grayscale image.
As an optional implementation of the embodiments of the present disclosure, the grayscale processing unit 3012 is further configured to binarize the image to be processed to obtain a binarized image, and the information extraction module 303 is further configured to extract the second image information of the purple-fringe area in the binarized image.
As an optional implementation of the embodiments of the present disclosure, the grayscale processing unit 3012 is further configured to process the three color channel values of each pixel in the image to be processed to obtain the grayscale value corresponding to each pixel, and to obtain the grayscale image corresponding to the image to be processed according to the grayscale value corresponding to each pixel.
As an optional implementation of the embodiments of the present disclosure, the grayscale processing unit 3012 is further configured to process the three color channel values of the first pixel to obtain three brightness values corresponding to the three color channel values, where the first pixel is any pixel in the image to be processed; and to determine any one of the three brightness values as a first grayscale value of the first pixel; or to compare the three brightness values and determine the maximum brightness value as a second grayscale value of the first pixel; or to compute the average of the three brightness values and determine the brightness average as a third grayscale value of the first pixel; or to compute a target brightness value by weighted averaging according to three weights corresponding to the three brightness values and determine the target brightness value as a fourth grayscale value of the first pixel.
As an optional implementation of the embodiments of the present disclosure, the purple-fringe determination module 302 includes:
a chromaticity extraction unit 3021, configured to extract the first chromaticity information corresponding to each pixel in the image to be processed; and a purple-fringe determination unit 3022, configured to determine the purple-fringe area in the image to be processed using the first chromaticity information corresponding to each pixel.
As an optional implementation of the embodiments of the present disclosure, the first chromaticity information includes first target chromaticity information and second target chromaticity information; the information extraction module 303 is further configured to obtain the first red intensity value, the first green intensity value, and the first blue intensity value of the first pixel in the image to be processed, where the first pixel is any pixel in the image to be processed; to determine the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first green intensity value of the first pixel; and to determine the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first green intensity value of the first pixel.
As an optional implementation of the embodiments of the present disclosure, the information extraction module 303 includes: a chromaticity determination unit 3031, configured to determine the first difference or first ratio between the first red intensity value and the first green intensity value of the first pixel as the first target chromaticity information of the first pixel in the image to be processed, and to determine the second difference or second ratio between the first blue intensity value and the first green intensity value of the first pixel as the second target chromaticity information of the first pixel in the image to be processed.
As an optional implementation of the embodiments of the present disclosure, the first chromaticity information includes first target chromaticity information and second target chromaticity information; the information extraction module 303 is further configured to obtain the first red intensity value, the first blue intensity value, and the first brightness value of the first pixel in the image to be processed, where the first pixel is any pixel in the image to be processed; to determine the first target chromaticity information of the first pixel in the image to be processed according to the first red intensity value and the first brightness value of the first pixel; and to determine the second target chromaticity information of the first pixel in the image to be processed according to the first blue intensity value and the first brightness value of the first pixel.
As an optional implementation of the embodiments of the present disclosure, the chromaticity determination unit 3031 is further configured to determine the third difference or third ratio between the first red intensity value and the first brightness value of the first pixel as the first target chromaticity information of the first pixel in the image to be processed, and to determine the fourth difference or fourth ratio between the first blue intensity value and the first brightness value of the first pixel as the second target chromaticity information of the first pixel in the image to be processed.
As an optional implementation of the embodiments of the present disclosure, the information extraction module further includes: a purple-fringe pixel determination unit 3032, configured to determine the red channel value and blue channel value corresponding to each pixel according to the first target chromaticity information and second target chromaticity information corresponding to each pixel in the image to be processed, and to determine the pixels in the image to be processed whose red channel value lies within a first preset range and whose blue channel value lies within a second preset range as pixels belonging to the purple-fringe area.
As an optional implementation of the embodiments of the present disclosure, the purple-fringe pixel determination unit 3032 is further configured to compute the first product value of the second target chromaticity information and a first preset threshold; if the first product value is smaller than the first target chromaticity information, to determine the first product value as the red channel value corresponding to each pixel; and if the first product value is greater than the first target chromaticity information, to determine the first target chromaticity information as the red channel value corresponding to each pixel.
As an optional implementation of the embodiments of the present disclosure, the purple-fringe pixel determination unit 3032 is further configured to compute the second product value of the first target chromaticity information and a second preset threshold; if the second product value is smaller than the second target chromaticity information, to determine the second product value as the blue channel value corresponding to each pixel; and if the second product value is greater than the second target chromaticity information, to determine the second target chromaticity information as the blue channel value corresponding to each pixel.
As an optional implementation of the embodiments of the present disclosure, the information extraction module 303 is further configured to align and register the image to be processed with the dark-frame image, determine a first image area in the dark-frame image corresponding to the purple-fringe area, and extract the first image information corresponding to the first image area.
As an optional implementation of the embodiments of the present disclosure, the information extraction module 303 is further configured to determine, according to the coordinate position of each pixel of the purple-fringe area in the image to be processed, a target pixel in the grayscale image corresponding to that coordinate position; to determine a second image area according to the target pixels; and to extract the second image information corresponding to the second image area.
As an optional implementation of the embodiments of the present disclosure, the information extraction module 303 is further configured to extract second chromaticity information of the purple-fringe area in the dark-frame image and brightness information of the purple-fringe area in the grayscale image; and the processing module 304 is further configured to fuse the second chromaticity information with the brightness information to obtain fused pixel information corresponding to the purple-fringe area.
As an optional implementation of the embodiments of the present disclosure, the processing module 304 includes: a pixel determination unit 3041, configured to determine the initial pixel information corresponding to the purple-fringe area in the image to be processed; and a pixel processing unit 3042, configured to replace the initial pixel information corresponding to the purple-fringe area in the image to be processed with the fused pixel information, obtaining a target image with the purple-fringe area eliminated.
The electronic device provided in this embodiment can perform the image processing method provided in the foregoing method embodiments; its implementation principles and technical effects are similar and are not repeated here. Each of the above modules in the electronic device may be implemented wholly or partly in software, hardware, or a combination thereof. The modules may be embedded in or independent of a processor of the electronic device in hardware form, or stored in a memory of the electronic device in software form, so that the processor can invoke and execute the operations corresponding to each module.
In one embodiment, an electronic device is provided. The electronic device may be a terminal device, and its internal structure may be as shown in FIG. 4. The electronic device includes a processor, a memory, a communication interface, a database, a display, and an input apparatus connected via a system bus. The processor of the electronic device is configured to provide computing and control capabilities. The memory of the electronic device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and computer-readable instructions. The internal memory provides an environment for running the operating system and the computer-readable instructions in the non-volatile storage medium. The communication interface of the electronic device is configured to communicate with external terminals in a wired or wireless manner; the wireless manner may be implemented via Wi-Fi, a carrier network, near-field communication (NFC), or other technologies. When executed by the processor, the computer-readable instructions implement the image processing method provided in the foregoing embodiments. The display of the electronic device may be a liquid-crystal display or an electronic-ink display, and the input apparatus may be a touch layer covering the display, a key, trackball, or touchpad provided on the housing of the electronic device, or an external keyboard, touchpad, or mouse.
Those skilled in the art will understand that the structure shown in FIG. 4 is merely a block diagram of the partial structure relevant to the solution of the present disclosure and does not limit the electronic device to which the solution of the present disclosure is applied; a specific electronic device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, the electronic device provided by the present disclosure may be implemented in the form of computer-readable instructions, which may run on an electronic device as shown in FIG. 4. The memory of the electronic device may store the program modules constituting the electronic device, for example the acquisition module 301 and the purple-fringe determination module 302 shown in FIG. 3. The computer-readable instructions constituted by the program modules cause the processor to perform the steps of the image processing methods of the embodiments of the present disclosure described in this specification.
In one embodiment, an electronic device is provided, including a memory and one or more processors, the memory being configured to store computer-readable instructions; when executed by the one or more processors, the computer-readable instructions cause the one or more processors to perform the steps of the image processing method described in the foregoing method embodiments.
The electronic device provided in this embodiment can implement the image processing method provided in the foregoing method embodiments; its implementation principles and technical effects are similar and are not repeated here.
One or more non-volatile storage media storing computer-readable instructions are provided; when executed by one or more processors, the computer-readable instructions cause the one or more processors to perform the steps of any of the image processing methods described above.
The computer-readable instructions stored on the computer-readable storage medium provided in this embodiment can implement the image processing method provided in the foregoing method embodiments; its implementation principles and technical effects are similar and are not repeated here.
Those of ordinary skill in the art will understand that all or part of the procedures of the foregoing method embodiments may be accomplished by computer-readable instructions directing the relevant hardware. The computer-readable instructions may be stored in a non-volatile computer-readable storage medium and, when executed, may include the procedures of the embodiments of the foregoing methods. Any reference to a memory, database, or other medium used in the embodiments provided by the present disclosure may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, and the like. Volatile memory may include random-access memory (RAM) or an external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static random-access memory (SRAM) and dynamic random-access memory (DRAM).
The technical features of the foregoing embodiments may be combined arbitrarily. For brevity, not all possible combinations of the technical features in the foregoing embodiments are described; however, as long as a combination of these technical features involves no contradiction, it should be regarded as falling within the scope of this specification.
The foregoing embodiments express only several implementations of the present disclosure, and their descriptions are relatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent. It should be noted that those of ordinary skill in the art may make several variations and improvements without departing from the concept of the present disclosure, all of which fall within the protection scope of the present disclosure. Therefore, the protection scope of this disclosure shall be determined by the appended claims.
Industrial Applicability
The image processing method provided by the present disclosure can effectively improve the imaging quality when an electronic device photographs a subject with a camera apparatus, effectively eliminating the purple-fringe region in the image to be processed to obtain a more accurate and complete target image, and thus has strong industrial applicability.

Claims (20)

  1. An image processing method, comprising:
    obtaining an image to be processed, and a dark-frame image and a grayscale image corresponding to the image to be processed, wherein the image to be processed is an image captured based on a first exposure value, the dark-frame image is an image captured based on a second exposure value, and the first exposure value is greater than the second exposure value;
    determining a purple-fringe region in the image to be processed;
    extracting first image information of the purple-fringe region in the dark-frame image, extracting second image information of the purple-fringe region in the grayscale image, and fusing the first image information with the second image information to obtain fused image information;
    processing the purple-fringe region in the image to be processed according to the fused image information, to obtain a target image in which the purple-fringe region is eliminated.
  2. The method according to claim 1, wherein the obtaining an image to be processed, and a dark-frame image and a grayscale image corresponding to the image to be processed comprises:
    controlling a camera apparatus to capture, according to a first exposure value and a second exposure value respectively, the image to be processed and the dark-frame image corresponding to the image to be processed;
    performing grayscale processing on the image to be processed to obtain the grayscale image.
  3. The method according to claim 2, wherein the performing grayscale processing on the image to be processed to obtain the grayscale image comprises:
    performing binarization on the image to be processed to obtain a binarized image;
    and the extracting second image information of the purple-fringe region in the grayscale image comprises:
    extracting second image information of the purple-fringe region in the binarized image.
  4. The method according to claim 2, wherein the performing grayscale processing on the image to be processed to obtain the grayscale image comprises:
    processing the three color-channel values of each pixel in the image to be processed respectively, to obtain a grayscale value corresponding to each pixel;
    obtaining the grayscale image corresponding to the image to be processed according to the grayscale value corresponding to each pixel.
  5. The method according to claim 4, wherein the processing the three color-channel values of each pixel in the image to be processed respectively, to obtain a grayscale value corresponding to each pixel comprises:
    processing the three color-channel values of a first pixel to obtain three luminance values corresponding to the three color-channel values, wherein the first pixel is any pixel in the image to be processed;
    determining any one of the three luminance values as a first grayscale value corresponding to the first pixel; or
    comparing the three luminance values and determining the largest luminance value as a second grayscale value corresponding to the first pixel; or
    computing a mean of the three luminance values by an averaging method and determining the mean luminance value as a third grayscale value corresponding to the first pixel; or
    computing a target luminance value by a weighted-average method according to three weights respectively corresponding to the three luminance values, and determining the target luminance value as a fourth grayscale value corresponding to the first pixel.
  6. The method according to claim 1, wherein the determining a purple-fringe region in the image to be processed comprises:
    extracting first chroma information corresponding to each pixel in the image to be processed;
    determining the purple-fringe region in the image to be processed using the first chroma information corresponding to each pixel.
  7. The method according to claim 6, wherein the first chroma information comprises first target chroma information and second target chroma information, and the extracting first chroma information corresponding to each pixel in the image to be processed comprises:
    obtaining a first red intensity value, a first green intensity value, and a first blue intensity value of a first pixel in the image to be processed, wherein the first pixel is any pixel in the image to be processed;
    determining the first target chroma information of the first pixel in the image to be processed according to the first red intensity value and the first green intensity value of the first pixel;
    determining the second target chroma information of the first pixel in the image to be processed according to the first blue intensity value and the first green intensity value of the first pixel.
  8. The method according to claim 7, wherein the determining the first target chroma information of the first pixel in the image to be processed according to the first red intensity value and the first green intensity value of the first pixel comprises:
    determining a first difference or first ratio between the first red intensity value and the first green intensity value of the first pixel as the first target chroma information of the first pixel in the image to be processed;
    and the determining the second target chroma information of the first pixel in the image to be processed according to the first blue intensity value and the first green intensity value of the first pixel comprises:
    determining a second difference or second ratio between the first blue intensity value and the first green intensity value of the first pixel as the second target chroma information of the first pixel in the image to be processed.
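The difference and ratio variants of claim 8 can be sketched per pixel as follows. The small epsilon guarding division by zero in the ratio form is an implementation assumption, not part of the claim; purple-fringe pixels are characterized by both red and blue being elevated relative to green, so both values come out large there.

```python
def target_chroma(r, g, b, use_ratio=False, eps=1e-6):
    """Compute the two target chroma values of a pixel against green.

    Difference form: chroma1 = r - g,  chroma2 = b - g
    Ratio form:      chroma1 = r / g,  chroma2 = b / g
    """
    if use_ratio:
        denom = g if g != 0 else eps  # assumed guard against zero green
        return r / denom, b / denom
    return r - g, b - g
```

The luminance-based variant of claims 9 and 10 is the same computation with the pixel's luminance value substituted for the green intensity value.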
  9. The method according to claim 6, wherein the first chroma information comprises first target chroma information and second target chroma information, and the extracting first chroma information corresponding to each pixel in the image to be processed comprises:
    obtaining a first red intensity value, a first blue intensity value, and a first luminance value of a first pixel in the image to be processed, wherein the first pixel is any pixel in the image to be processed;
    determining the first target chroma information of the first pixel in the image to be processed according to the first red intensity value and the first luminance value of the first pixel;
    determining the second target chroma information of the first pixel in the image to be processed according to the first blue intensity value and the first luminance value of the first pixel.
  10. The method according to claim 9, wherein the determining the first target chroma information of the first pixel in the image to be processed according to the first red intensity value and the first luminance value of the first pixel comprises:
    determining a third difference or third ratio between the first red intensity value and the first luminance value of the first pixel as the first target chroma information of the first pixel in the image to be processed;
    and the determining the second target chroma information of the first pixel in the image to be processed according to the first blue intensity value and the first luminance value of the first pixel comprises:
    determining a fourth difference or fourth ratio between the first blue intensity value and the first luminance value of the first pixel as the second target chroma information of the first pixel in the image to be processed.
  11. The method according to any one of claims 7-10, wherein the determining the purple-fringe region in the image to be processed using the first chroma information comprises:
    determining a red channel value and a blue channel value corresponding to each pixel according to the first target chroma information and the second target chroma information corresponding to each pixel in the image to be processed;
    determining pixels in the image to be processed whose red channel value falls within a first preset range and whose blue channel value falls within a second preset range as pixels belonging to the purple-fringe region.
  12. The method according to claim 11, wherein the determining the red channel value corresponding to each pixel according to the first target chroma information and the second target chroma information corresponding to each pixel in the image to be processed comprises:
    computing a first product of the second target chroma information and a first preset threshold;
    if the first product is less than the first target chroma information, determining the first product as the red channel value corresponding to the pixel;
    if the first product is greater than the first target chroma information, determining the first target chroma information as the red channel value corresponding to the pixel.
  13. The method according to claim 11, wherein the determining the blue channel value corresponding to each pixel according to the first target chroma information and the second target chroma information corresponding to each pixel in the image to be processed comprises:
    computing a second product of the first target chroma information and a second preset threshold;
    if the second product is less than the second target chroma information, determining the second product as the blue channel value corresponding to the pixel;
    if the second product is greater than the second target chroma information, determining the second target chroma information as the blue channel value corresponding to the pixel.
  14. The method according to any one of claims 1-10, wherein the extracting first image information of the purple-fringe region in the dark-frame image comprises:
    aligning and registering the image to be processed with the dark-frame image, and determining a first image region in the dark-frame image corresponding to the purple-fringe region;
    extracting first image information corresponding to the first image region.
  15. The method according to any one of claims 1-10, wherein the extracting second image information of the purple-fringe region in the grayscale image comprises:
    determining, according to the coordinate position of each pixel of the purple-fringe region in the image to be processed, a target pixel in the grayscale image corresponding to the coordinate position of each pixel of the purple-fringe region;
    determining a second image region according to the target pixels;
    extracting second image information corresponding to the second image region.
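The coordinate lookup in claim 15 amounts to sampling the grayscale image at the purple-fringe coordinates. A minimal sketch, assuming the grayscale image is pixel-aligned with the image to be processed (which holds when it is derived from that image, as in claim 2); the dict-of-coordinates return shape is an illustrative choice:

```python
def second_image_info(gray_image, fringe_coords):
    """Collect the grayscale (luminance) value at each purple-fringe
    coordinate; the resulting mapping stands in for the second image
    information of the purple-fringe region.

    gray_image:    nested list, gray_image[row][col] -> luminance value
    fringe_coords: list of (row, col) positions of the purple-fringe region
    """
    return {(y, x): gray_image[y][x] for (y, x) in fringe_coords}
```

The dark-frame extraction of claim 14 follows the same pattern once the dark-frame image has been registered to the image to be processed.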
  16. The method according to any one of claims 1-10, wherein the extracting first image information of the purple-fringe region in the dark-frame image, extracting second image information of the purple-fringe region in the grayscale image, and fusing the first image information with the second image information to obtain fused image information comprises:
    extracting second chroma information of the purple-fringe region in the dark-frame image, and extracting luminance information of the purple-fringe region in the grayscale image;
    fusing the second chroma information with the luminance information to obtain fused pixel information corresponding to the purple-fringe region.
  17. The method according to claim 16, wherein the processing the purple-fringe region in the image to be processed according to the fused image information, to obtain a target image in which the purple-fringe region is eliminated comprises:
    determining initial pixel information corresponding to the purple-fringe region in the image to be processed;
    replacing the initial pixel information corresponding to the purple-fringe region in the image to be processed with the fused pixel information, to obtain the target image in which the purple-fringe region is eliminated.
  18. An electronic device, comprising:
    an acquisition module, configured to obtain an image to be processed, and a dark-frame image and a grayscale image corresponding to the image to be processed, wherein the image to be processed is an image captured based on a first exposure value, the dark-frame image is an image captured based on a second exposure value, and the first exposure value is greater than the second exposure value;
    a purple-fringe determination module, configured to determine a purple-fringe region in the image to be processed;
    an information extraction module, configured to extract first image information of the purple-fringe region in the dark-frame image and second image information of the purple-fringe region in the grayscale image;
    a processing module, configured to fuse the first image information with the second image information to obtain fused image information, and to process the purple-fringe region in the image to be processed according to the fused image information, to obtain a target image in which the purple-fringe region is eliminated.
  19. An electronic device, comprising a memory and one or more processors, the memory storing computer-readable instructions; when executed by the one or more processors, the computer-readable instructions cause the one or more processors to perform the steps of the image processing method according to any one of claims 1-17.
  20. One or more non-volatile computer-readable storage media storing computer-readable instructions; when executed by one or more processors, the computer-readable instructions cause the one or more processors to perform the steps of the image processing method according to any one of claims 1-17.
PCT/CN2022/099249 2022-02-10 2022-06-16 Image processing method, electronic device, and computer-readable storage medium WO2023151210A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210124323.6A CN114511461A (zh) 2022-02-10 2022-02-10 Image processing method, electronic device, and computer-readable storage medium
CN202210124323.6 2022-02-10

Publications (1)

Publication Number Publication Date
WO2023151210A1 true WO2023151210A1 (zh) 2023-08-17

Family

ID=81552296

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/099249 WO2023151210A1 (zh) 2022-02-10 2022-06-16 Image processing method, electronic device, and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN114511461A (zh)
WO (1) WO2023151210A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114511461A (zh) * 2022-02-10 2022-05-17 上海闻泰信息技术有限公司 Image processing method, electronic device, and computer-readable storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100231603A1 (en) * 2009-03-13 2010-09-16 Dolby Laboratories Licensing Corporation Artifact mitigation method and apparatus for images generated using three dimensional color synthesis
CN111199524A (zh) * 2019-12-26 2020-05-26 浙江大学 Image purple-fringe correction method for an optical system with an adjustable aperture
CN112367465A (zh) * 2020-10-30 2021-02-12 维沃移动通信有限公司 Image output method, apparatus, and electronic device
CN112887693A (zh) * 2021-01-12 2021-06-01 浙江大华技术股份有限公司 Image purple-fringe elimination method, device, and storage medium
CN113905183A (zh) * 2021-08-25 2022-01-07 珠海全志科技股份有限公司 Chromatic aberration correction method and apparatus for wide-dynamic-range images
CN114511461A (zh) * 2022-02-10 2022-05-17 上海闻泰信息技术有限公司 Image processing method, electronic device, and computer-readable storage medium


Also Published As

Publication number Publication date
CN114511461A (zh) 2022-05-17


Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22925569

Country of ref document: EP

Kind code of ref document: A1