WO2019148912A1 - Image processing method, apparatus, electronic device and storage medium - Google Patents

Image processing method, apparatus, electronic device and storage medium

Info

Publication number
WO2019148912A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
map
yuv
determining
rgb
Prior art date
Application number
PCT/CN2018/112877
Other languages
English (en)
Chinese (zh)
Inventor
高浩然
Original Assignee
杭州海康威视数字技术股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 杭州海康威视数字技术股份有限公司 (Hangzhou Hikvision Digital Technology Co., Ltd.)
Publication of WO2019148912A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/40 Image enhancement or restoration using histogram techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T5/92 Dynamic range modification of images or parts thereof based on global image properties
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/46 Colour picture communication systems
    • H04N1/56 Processing of colour picture signals
    • H04N1/60 Colour correction or control
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30168 Image quality inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle

Definitions

  • the present application relates to the field of image processing technologies, and in particular, to an image processing method, apparatus, electronic device, and storage medium.
  • In scenes that are backlit or poorly lit, the license plate often appears too bright or too dark, its background color looks unrealistic, or its contrast is insufficient, which leads to license plate recognition errors or outright recognition failure.
  • An object of the embodiments of the present application is to provide an image processing method, apparatus, electronic device, and storage medium to improve monitoring image quality and improve the accuracy of license plate recognition.
  • the specific technical solutions are as follows:
  • an embodiment of the present application provides an image processing method, where the method includes:
  • the first YUV map and the second YUV map are fused to obtain a fused image.
  • The step of determining, according to the second RGB map, the matrices for enhancing the second RGB map, and enhancing the second RGB map with those matrices to obtain a second YUV map, includes:
  • the step of determining a corresponding gamma curve according to the average brightness of the Y component in the first YUV map comprises:
  • the step of converting the bayer map into a first RGB map comprises:
  • interpolating into the RGB domain to obtain a first RGB map.
  • the step of fusing the first YUV image and the second YUV image to obtain a fused image includes:
  • the fusion weights of the pixels in the sub-regions are used as the fusion weights of the second YUV map, and the first YUV map and the second YUV map are merged to obtain a fused image.
  • the step of determining a fusion weight of each pixel in each sub-area includes:
  • the method further includes:
  • Optionally, before the converting of the bayer map into the first RGB map, the method further includes:
  • removing the black level value from each channel's data of the bayer map.
  • an embodiment of the present application provides an image processing apparatus, where the apparatus includes:
  • An extraction module configured to extract a bayer map including the target object in the image to be processed, and convert the image to be processed into a first YUV map;
  • a statistic module configured to count the average brightness of the Y component in the first YUV map; and, when the average brightness is outside a preset range, determine a corresponding gamma curve according to the average brightness of the Y component in the first YUV map;
  • a conversion module configured to convert the bayer map into a first RGB image; and map the first RGB image according to the gamma curve to obtain a second RGB image;
  • a processing module configured to determine, according to the second RGB image, each matrix that performs enhancement processing on the second RGB image; and perform enhancement processing on the second RGB image by using each matrix to obtain a second YUV image;
  • a first fusion module configured to fuse the first YUV map and the second YUV map to obtain a fused image.
  • the processing module includes:
  • a first determining submodule configured to perform histogram statistics on the second RGB image, and determine variance, saturation information, and color information corresponding to the second RGB image;
  • a second determining submodule configured to determine, according to the variance, saturation information, and color information corresponding to the second RGB map, a target color enhancement matrix for color-enhancing the second RGB map, a target contrast enhancement matrix for contrast-enhancing the second RGB map, and a target saturation enhancement matrix for saturation-enhancing the second RGB map;
  • An enhancement processing submodule configured to perform color enhancement processing on the second RGB image according to the target color enhancement matrix
  • an adjusting submodule configured to adjust the contrast and saturation of the color-enhanced second RGB map according to the target contrast enhancement matrix and the target saturation enhancement matrix, respectively, to obtain a second YUV map.
  • the extraction module is specifically configured to:
  • the RGB domain is interpolated to obtain a first RGB map.
  • the first fusion module includes:
  • a sub-module configured to determine a to-be-fused region of the second YUV map, and divide the to-be-fused region into a plurality of sub-regions;
  • a third determining submodule configured to determine a fusion weight of each pixel in each subregion
  • a fusion sub-module configured to use the fusion weight of each pixel in each sub-region as the fusion weight of the second YUV map, and fuse the first YUV map and the second YUV map to obtain a fused image.
  • the third determining submodule is specifically configured to:
  • the device further includes:
  • a determining module configured to determine a fusion weight of the first YUV map according to a difference between the fused image and YUV data corresponding to each pixel of the first YUV map;
  • a second fusion module configured to fuse the fused image with the first YUV image according to a fusion weight of the first YUV map to obtain a target image.
  • the device further includes:
  • a judging module configured to judge whether the black level of the bayer map is greater than a preset threshold;
  • a removing module configured to remove the black level value from each channel's data of the bayer map when the judging module's result is YES.
  • an electronic device including:
  • a processor, a memory, a communication interface, and a bus;
  • the processor, the memory, and the communication interface are connected by the bus and complete communication with each other;
  • the memory stores executable program code
  • the processor reads the executable program code stored in the memory and runs the program corresponding to it, so as to perform the image processing method described in the first aspect above.
  • An embodiment of the present application provides a computer readable storage medium storing a computer program which, when executed by a processor, implements the image processing method described in the first aspect above.
  • The embodiment of the present application further provides a computer program product comprising instructions which, when executed on a computer, cause the computer to perform the image processing method according to the first aspect above.
  • An embodiment of the present application provides an image processing method, apparatus, electronic device, and storage medium. The method includes: acquiring an image to be processed; extracting a bayer map containing the target object from the image to be processed, and converting the image to be processed into a first YUV map; counting the average brightness of the Y component in the first YUV map; when the average brightness is outside a preset range, determining a corresponding gamma curve according to the average brightness of the Y component in the first YUV map; converting the bayer map into a first RGB map, and mapping the first RGB map according to the gamma curve to obtain a second RGB map; determining, according to the second RGB map, the matrices for enhancing the second RGB map, and enhancing the second RGB map with those matrices to obtain a second YUV map; and fusing the first YUV map and the second YUV map to obtain a fused image.
  • In this solution, the target object area can be enhanced from both the bayer map and the YUV map. Because the bayer map has only undergone exposure processing of its brightness, it contains no overexposure caused by additional brightening; the solution therefore avoids abnormal license plate recognition caused by scenes that are too bright or too dark and improves the accuracy of license plate recognition.
  • FIG. 1 is a flowchart of an image processing method according to an embodiment of the present application
  • FIG. 3 is another flowchart of an image processing method according to an embodiment of the present application.
  • FIG. 5 is another flowchart of an image processing method according to an embodiment of the present application.
  • FIG. 6 is another flowchart of an image processing method according to an embodiment of the present application.
  • FIG. 7a is a schematic diagram of an image to be processed in the embodiment of the present application.
  • FIG. 7b is a schematic diagram of a fusion region in an embodiment of the present application.
  • FIG. 7c is a schematic diagram of an image fusion sub-area divided in the embodiment of the present application.
  • FIG. 8a is a schematic diagram of an effect after image fusion using the current technology
  • FIG. 8b is a schematic diagram of an effect after image fusion using the method provided by the embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 1 illustrates a flow of an image processing method according to an embodiment of the present application.
  • the method may include the following steps:
  • The image capturing device is a device for collecting monitoring images, such as a dome camera or a bullet camera;
  • the electronic device may be any device having a data processing function, such as a desktop computer, a portable computer, a smart mobile terminal, or the like.
  • The image processing method provided by the embodiment of the present application is described below by taking as an example its execution on an electronic device other than the image capturing device.
  • In this case, a communication connection may be established between the electronic device and the image capturing device, so that after the image capturing device collects a monitoring image, it can send the image to the electronic device for processing.
  • the electronic device can establish a wired or wireless connection with the image capture device.
  • For example, a wired connection may be established between the electronic device and the image capture device; or a wireless connection may be established between them through any long-range wireless connection; or a short-range wireless connection method, such as NFC (Near Field Communication) or Bluetooth, may establish the wireless connection between the electronic device and the image capturing device, which is not limited in this embodiment of the present application.
  • During operation, the image capturing device collects monitoring images. For example, it may collect them periodically at a preset time interval, such as 1 millisecond, 5 milliseconds, or 10 milliseconds; alternatively, trigger conditions may be preset, for example a target object entering a set area, and the image capturing device captures a monitoring image whenever it detects that a trigger condition is met.
  • the target object may be a person, a vehicle, or the like.
  • the image processing method of the embodiment of the present application is described by taking the above-mentioned target object as a vehicle as an example.
  • After the image capturing device collects the monitoring image, it can send the monitoring image to the electronic device as the image to be processed, so as to enhance the monitoring image, improve the image quality, and improve the accuracy of license plate recognition.
  • the electronic device may extract a bayer map containing the target object in the image to be processed.
  • the electronic device may extract a bayer map containing only the license plate area in the image to be processed.
  • In this embodiment, the bayer map is a raw, uninterpolated image that has been processed only by exposure, without any other ISP (Image Signal Processing). Simple exposure generally does not brighten the image, so even if a strong light source shines on the license plate in the scene, the plate will not be overexposed. The image therefore has no overexposure caused by additional brightening.
  • the electronic device may further convert the image to be processed into a first YUV map, that is, convert the image to be processed into a YUV domain.
  • YUV is a color encoding method adopted by European television systems. It is the color space used by analog color television systems such as PAL (Phase Alternating Line) and SECAM (Séquentiel Couleur à Mémoire).
  • A three-tube color camera or a color CCD (Charge Coupled Device) camera is usually used for image acquisition. The acquired color image signals are color-separated and separately amplified to obtain RGB (red, green, blue) signals, which are then passed through a matrix conversion circuit to obtain the luminance signal Y and two color-difference signals, B-Y (i.e., U) and R-Y (i.e., V). Finally, the transmitter encodes the luminance signal and the two color-difference signals separately and sends them over the same channel. This representation of color is the so-called YUV color space representation.
  • the importance of using the YUV color space is that its luminance signal Y and chrominance signals U, V are separated.
  • In this embodiment, the first YUV map mainly reflects the effect of the full image. Since the image to be processed is a frame of a video, the ISP pipeline from bayer to RGB to YUV is tuned for the best overall video stream effect, and the coordinates of the ROI (region of interest) are given in the YUV map. However, settings that are best for the video as a whole may be lacking within the license plate ROI, which can degrade the subjective or objective recognition effect.
  • In this step, the electronic device counts the average brightness of the Y component in the first YUV map, for example the average of the luminance values of the pixels in the license plate area. Specifically, it can count the number of pixels in the license plate area of the first YUV map and the Y component value of each pixel, sum the Y component values, and divide that sum by the number of pixels to obtain the average brightness of the Y component, as sketched below.
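As a concrete illustration, a minimal numpy sketch of this statistic might look as follows; the H x W x 3 array layout with Y in channel 0 and the `roi` convention are assumptions for illustration, not taken from the patent text.

```python
import numpy as np

def average_y_brightness(yuv: np.ndarray, roi: tuple) -> float:
    """Mean Y value over a region of interest (e.g., the license plate area).

    Assumes yuv is an H x W x 3 array whose channel 0 is Y, and roi is
    (x, y, w, h) in pixel coordinates.
    """
    x, y, w, h = roi
    y_plane = yuv[y:y + h, x:x + w, 0].astype(np.float64)
    # Sum of the Y component values divided by the number of pixels,
    # exactly as described in the text above.
    return float(y_plane.sum() / y_plane.size)
```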
  • the average luminance of the Y component may identify the luminance information of the first YUV map, and the greater the average luminance, the higher the luminance of the first YUV map.
  • the brightness of the image to be processed may be adjusted based on the average brightness of the Y component in the first YUV image.
  • Specifically, the electronic device judges whether the average brightness of the Y component is within a preset range. If it is, the brightness of the image to be processed is already good and needs no adjustment; when the average brightness is outside the preset range, the brightness of the image to be processed is poor and needs to be adjusted.
  • the electronic device may determine a corresponding gamma curve according to the average brightness of the Y component in the first YUV map to perform brightness adjustment on the image to be processed.
  • the gamma curve can be used to represent the correspondence between the brightness of the input image and the output image, and the gamma curve can be used to map the brightness.
  • The abscissa of the gamma curve is the brightness value of the input image, and the ordinate is the brightness value of the output image.
  • the brightness value of each pixel of the input image is mapped by the gamma curve to obtain the brightness value of the output image, and the brightness adjustment of the input image is completed.
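For illustration, a gamma curve can be realized as a 256-entry lookup table. The sketch below builds such a table from a power-law curve (one possible curve shape, assumed here; the patent does not fix the functional form) and maps an 8-bit image through it.

```python
import numpy as np

def build_gamma_lut(gamma: float) -> np.ndarray:
    """256-entry LUT for out = 255 * (in / 255) ** gamma.

    gamma < 1 brightens (every ordinate above its abscissa);
    gamma > 1 darkens (every ordinate below its abscissa).
    """
    x = np.arange(256) / 255.0
    return np.clip(255.0 * x ** gamma, 0, 255).astype(np.uint8)

def apply_gamma(image: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Map every 8-bit pixel value of the input image through the LUT."""
    return lut[image]
```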
  • In one embodiment, two reference gamma curves may be preset. The ordinate of any point on the first gamma curve is greater than its abscissa, and the ordinate of any point on the second gamma curve is smaller than its abscissa, so that the first gamma curve increases the brightness of the image to be processed and the second gamma curve reduces it. When the average brightness of the Y component in the first YUV map is large, for example greater than a preset brightness threshold, the image to be processed is too bright, and the gamma curve can be determined as the second gamma curve to reduce its brightness; when the average brightness is small, for example less than the preset brightness threshold, the image to be processed is too dark, and the gamma curve can be determined as the first gamma curve to increase its brightness.
  • In another embodiment, three reference gamma curves may be built in advance: a linear gamma curve, a brightening gamma curve, and a darkening gamma curve, together with a plurality of brightness thresholds; the corresponding gamma curve is then obtained by interpolating the reference gamma curves according to how the average brightness compares with those thresholds. For any gamma curve, the abscissa is the input brightness and the ordinate is the output brightness. The output brightness of the linear gamma curve equals the input brightness; the output brightness of the brightening gamma curve is greater than the input brightness; and the output brightness of the darkening gamma curve is less than the input brightness.
  • Specifically, the corresponding gamma curve can be determined as follows: when the average brightness is greater than a preset first brightness threshold and less than a preset second brightness threshold, the corresponding gamma curve is the interpolation result of the preset brightening gamma curve and the preset linear gamma curve; when the average brightness is not less than a preset fourth brightness threshold, the corresponding gamma curve is the preset darkening gamma curve. Here the first brightness threshold is smaller than the second brightness threshold, the second is smaller than the third, and the third is smaller than the fourth.
  • the corresponding gamma may be determined according to the following formula:
  • When the average brightness is not greater than the first brightness threshold, the gamma curve is determined as the brightening gamma curve, which greatly increases the brightness of the image to be processed. When the average brightness lies between the first and second brightness thresholds, the gamma curve is determined as the interpolation of the brightening gamma curve and the linear gamma curve, which slightly increases the brightness. When it lies between the second and third thresholds, the gamma curve is determined as the linear gamma curve, leaving the brightness essentially unchanged. When it lies between the third and fourth thresholds, the gamma curve is determined as the interpolation of the darkening gamma curve and the linear gamma curve, which slightly reduces the brightness. When it is not less than the fourth threshold, the gamma curve is determined as the darkening gamma curve, which greatly reduces the brightness of the image to be processed.
  • the gamma curve for processing the image to be processed can be determined by more reference gamma curves.
  • the specific process is similar to the above process of determining the gamma curve of the image to be processed by two or three reference gamma curves, which is not described in this embodiment.
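A minimal sketch of the three-curve selection just described, assuming the reference curves are stored as 256-entry LUTs and that blending between adjacent reference curves is linear in the average brightness (the patent elides the exact interpolation formula):

```python
import numpy as np

def select_gamma_lut(avg_y: float, thresholds: tuple,
                     lut_brighten: np.ndarray,
                     lut_linear: np.ndarray,
                     lut_darken: np.ndarray) -> np.ndarray:
    """Pick or interpolate a gamma LUT from three reference curves.

    thresholds = (t1, t2, t3, t4) with t1 < t2 < t3 < t4; the linear
    blend factors are an assumption for illustration.
    """
    t1, t2, t3, t4 = thresholds
    if avg_y <= t1:
        return lut_brighten                      # greatly brighten
    if avg_y < t2:                               # blend brighten + linear
        w = (avg_y - t1) / (t2 - t1)
        return ((1 - w) * lut_brighten + w * lut_linear).astype(np.uint8)
    if avg_y <= t3:
        return lut_linear                        # leave brightness as-is
    if avg_y < t4:                               # blend linear + darken
        w = (avg_y - t3) / (t4 - t3)
        return ((1 - w) * lut_linear + w * lut_darken).astype(np.uint8)
    return lut_darken                            # greatly darken
```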
  • After determining the gamma curve, the electronic device may convert the bayer map into a first RGB map; that is, the bayer map can be converted into the RGB domain.
  • any image conversion method can be used to directly convert the bayer map to the RGB domain to obtain the first RGB map.
  • In an implementation, the Bayer image may first be subjected to denoising and AWB (Automatic White Balance) processing, and then interpolated into the RGB domain to obtain the first RGB image.
  • For example, for each pixel of the Bayer map, the values of its R, G, and B components are obtained from the pixel values within its 3×3 neighborhood, as in the sketch below.
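The following sketch shows one naive way to realize such a 3×3 interpolation, assuming an RGGB mosaic layout; the exact kernel and Bayer pattern are not specified in the patent.

```python
import numpy as np

def demosaic_3x3(bayer: np.ndarray) -> np.ndarray:
    """Naive demosaic: for each channel, average the same-color pixels
    inside every pixel's 3x3 neighborhood (RGGB layout assumed)."""
    h, w = bayer.shape

    def box3(a: np.ndarray) -> np.ndarray:
        # 3x3 box sum via zero padding and shifted slices.
        p = np.pad(a, 1)
        return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

    masks = []
    for sites in (((0, 0),),            # R at even rows, even columns
                  ((0, 1), (1, 0)),     # G at the two complementary sites
                  ((1, 1),)):           # B at odd rows, odd columns
        m = np.zeros((h, w), bool)
        for r, c in sites:
            m[r::2, c::2] = True
        masks.append(m)

    out = np.empty((h, w, 3), np.float64)
    for ch, m in enumerate(masks):
        vals = np.where(m, bayer, 0).astype(np.float64)
        # Average only the same-color samples present in each window.
        out[..., ch] = box3(vals) / np.maximum(box3(m.astype(np.float64)), 1.0)
    return out
```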
  • the first RGB image may be mapped according to the gamma curve obtained in step S103 to obtain a second RGB image.
  • the luminance value of each pixel of the first RGB image may be used as an input value, and the output luminance corresponding to each pixel is determined according to the correspondence between the input luminance and the output luminance in the gamma curve, thereby obtaining a second RGB image.
  • S105 Determine, according to the second RGB image, each matrix that performs enhancement processing on the second RGB image; and perform enhancement processing on the second RGB image by using each matrix to obtain a second YUV image.
  • the electronic device may further enhance the second RGB image to improve the image quality of the second RGB image, thereby improving the accuracy of the license plate recognition.
  • In this step, the electronic device may determine, according to image features of the second RGB map such as brightness and pixel values, the matrices used to enhance it, such as an RGB2RGB matrix and an RGB2YUV matrix, and then enhance the second RGB map with these matrices, that is, multiply each pixel of the second RGB map by the matrices, to obtain the second YUV map.
  • the first YUV map and the second YUV map are merged to obtain a fused image.
  • the first YUV map and the second YUV map may be fused to obtain a fused image. That is to say, the image of the processed license plate area is merged with the original entire image to obtain a fused image.
  • When the second YUV map contains only the license plate area, it may be fused directly with the first YUV map; when the second YUV map is the entire image to be processed, a YUV map containing only the license plate area can be extracted according to the coordinates of the license plate area in the second YUV map, and the extracted map fused with the first YUV map.
  • During fusion, the electronic device may simply replace the license plate area of the first YUV map with the second YUV map to obtain the fused image; or, to improve the quality of the fused image, the electronic device may set fusion weights for the first YUV map and the second YUV map in advance for the license plate area. For example, both fusion weights may be set to 0.5, and the first and second YUV maps are then combined according to these weights to obtain the fused image, as in the sketch below.
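A fixed-weight fusion of this kind reduces to a per-pixel weighted average; a minimal sketch, assuming both maps cover the same license plate region and use 8-bit samples:

```python
import numpy as np

def blend_fixed(first_yuv: np.ndarray, second_yuv: np.ndarray,
                w2: float = 0.5) -> np.ndarray:
    """Weighted fusion: w2 weights the enhanced second map, 1 - w2 the first."""
    out = (1.0 - w2) * first_yuv.astype(np.float64) \
        + w2 * second_yuv.astype(np.float64)
    return np.clip(out, 0, 255).astype(np.uint8)
```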
  • In this solution, the target object area can be enhanced from both the bayer map and the YUV map. Because the bayer map has only undergone exposure processing of its brightness, it contains no overexposure caused by additional brightening; the solution therefore avoids abnormal license plate recognition caused by scenes that are too bright or too dark and improves the accuracy of license plate recognition.
  • the electronic device determines each matrix for performing enhancement processing on the second RGB image, and performs enhancement processing on the second RGB image by using each matrix to obtain a second YUV image.
  • the steps can include:
  • S201 Perform histogram statistics on the second RGB image, and determine variance, saturation information, and color information corresponding to the second RGB image.
  • the electronic device may perform histogram statistics on the second RGB image, and determine variance, saturation information, and color information corresponding to the second RGB map.
  • the electronic device may perform statistics on the brightness of each pixel of the second RGB image, and then obtain a histogram in which the abscissa is luminance and the ordinate is the number of pixels. Then, the variance, saturation information, and color information are determined based on the obtained histogram.
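As an illustration, the statistics step might be sketched as below; using BT.601 luma weights is an assumption, since the patent only says that per-pixel brightness is histogrammed and that the variance is derived from the histogram.

```python
import numpy as np

def luminance_stats(rgb: np.ndarray):
    """Brightness histogram of an RGB image plus the variance derived from it."""
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    hist, _ = np.histogram(luma, bins=256, range=(0.0, 256.0))
    levels = np.arange(256, dtype=np.float64)
    total = hist.sum()
    mean = (hist * levels).sum() / total
    variance = (hist * (levels - mean) ** 2).sum() / total  # spread of brightness
    return hist, float(variance)
```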
  • the second RGB image may be subjected to color enhancement, saturation enhancement, and contrast enhancement.
  • For example, the electronic device may determine, according to the variance, saturation information, and color information corresponding to the second RGB map, an RGB2RGB matrix, i.e., a target color enhancement matrix for color-enhancing the second RGB map, and RGB2YUV matrices, i.e., a target contrast enhancement matrix for contrast-enhancing the second RGB map and a target saturation enhancement matrix for saturation-enhancing the second RGB map.
  • the electronic device may determine the target color enhancement matrix according to the determined color information and saturation information, and the correspondence between different saturation levels and color enhancement matrices in each of the pre-saved colors.
  • the electronic device may pre-store three basic matrices for each color license plate, respectively, an identity matrix, a small intensity enhanced background color matrix, and a large intensity enhanced background color matrix; the corresponding levels are 0, 50, 100, respectively.
  • With a built-in base level of 50, a new intensity level can be mapped from the determined license plate color and saturation information, and the corresponding matrix obtained for color enhancement, so that the license plate color is strengthened without affecting the overall color. Each basic matrix is a 3×3 matrix.
  • the process of determining the target color enhancement matrix may include:
  • Specifically, the candidate color enhancement matrices of different saturation levels corresponding to the color information of the second RGB map may first be looked up among the pre-saved color enhancement matrices of different saturation levels for each color.
  • For example, the pre-saved color enhancement matrices of different saturation levels for each color may be as follows: for the color "red", a level-0 color enhancement matrix a, a level-50 color enhancement matrix b, and a level-100 color enhancement matrix c; for the color "yellow", a level-0 color enhancement matrix d, a level-50 color enhancement matrix e, and a level-100 color enhancement matrix f; for the color "blue", a level-0 color enhancement matrix g, a level-50 color enhancement matrix h, and a level-100 color enhancement matrix j.
  • When the color information of the second RGB map is "red", the candidate color enhancement matrices of the different saturation levels corresponding to that color information may be determined as: the level-0 color enhancement matrix a, the level-50 color enhancement matrix b, and the level-100 color enhancement matrix c.
  • the electronic device may calculate the target color level according to the saturation information of the second RGB image, the preset reference saturation, and the predetermined color conversion level.
  • the new target color level lvl_new can be calculated according to the following formula:
  • the electronic device may determine, in the candidate color enhancement matrix, two levels of candidate color enhancement matrices adjacent to the target color level.
  • For example, if the calculated target color level is 20, the two candidate color enhancement matrices of the levels adjacent to it may be determined: the candidate color enhancement matrix a corresponding to level 0 and the candidate color enhancement matrix b corresponding to level 50.
  • In this step, the electronic device may determine the weights of the two levels' candidate color enhancement matrices according to the distances from the target color level to the two levels, and interpolate the two candidate matrices with those weights to obtain the target color enhancement matrix. The greater the distance from the target color level to a level, the smaller the weight of that level's candidate matrix.
  • For example, when the candidate color enhancement matrices of the two levels adjacent to the target color level 20 are the matrix a corresponding to level 0 and the matrix b corresponding to level 50, the distance from level 0 to the target color level 20 is 20 and the distance from level 50 to the target color level 20 is 30; the weight of candidate matrix a is determined as A and the weight of candidate matrix b as B; then, using weights A and B, matrices a and b are interpolated to obtain the target color enhancement matrix.
  • Assuming the candidate color enhancement matrix corresponding to level 0 is matrix a with weight A, and the candidate color enhancement matrix corresponding to level 50 is matrix b with weight B, the target color enhancement matrix X is:
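Since the patent elides the exact formula, the sketch below assumes the natural linear form in which each matrix's weight is the normalized distance to the other level; the same helper applies unchanged to the contrast and saturation interpolations described next.

```python
import numpy as np

def interpolate_matrices(m0, m1, lvl0: float, lvl1: float,
                         target: float) -> np.ndarray:
    """Distance-weighted blend of two 3x3 enhancement matrices.

    The farther the target level is from a level, the smaller that
    level's weight, as the text states; the linear normalization is
    an assumption.
    """
    d0, d1 = abs(target - lvl0), abs(target - lvl1)
    w0 = d1 / (d0 + d1)  # e.g., target 20 between levels 0 and 50 -> w0 = 0.6
    w1 = d0 / (d0 + d1)
    return w0 * np.asarray(m0, float) + w1 * np.asarray(m1, float)
```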
  • the electronic device may determine the target contrast enhancement matrix according to the determined variance and the correspondence between the different contrast levels previously saved and the contrast enhancement matrix.
  • the electronic device may pre-store a corresponding contrast enhancement matrix for different contrast levels.
  • the contrast level may be, for example, 0, 20, 50, 100, and the built-in level is 50.
  • Each of the contrast enhancement matrices is a 3×3 matrix.
  • the process of determining the target contrast enhancement matrix may include:
  • the electronic device may calculate the target contrast level according to the variance of the second RGB image, the preset reference variance, and the predetermined contrast level.
  • the new target contrast level con_lvl_new can be calculated according to the following formula:
  • S402. Determine, in a contrast enhancement matrix corresponding to different contrast levels stored in advance, two levels of candidate contrast enhancement matrices adjacent to the target contrast level.
  • the electronic device may determine two levels of candidate contrast enhancement matrices adjacent to the target contrast level in the contrast enhancement matrix corresponding to the different contrast levels stored in advance.
  • For example, the candidate contrast enhancement matrices of the two levels adjacent to the target contrast level are, respectively, the candidate contrast enhancement matrix c corresponding to level 20 and the candidate contrast enhancement matrix d corresponding to level 50.
  • In this step, the electronic device may determine the weights of the two levels' candidate contrast enhancement matrices according to the distances from the target contrast level to the two levels, and interpolate the two candidate matrices with those weights to obtain the target contrast enhancement matrix. The greater the distance from the target contrast level to a level, the smaller the weight of that level's candidate matrix.
  • For example, when the target contrast level is 25, the distance from level 20 to it is 5 and the distance from level 50 to it is 25; the weight of the candidate contrast enhancement matrix c corresponding to level 20 is determined as C and the weight of the candidate contrast enhancement matrix d corresponding to level 50 as D; then, using weights C and D, matrices c and d are interpolated to obtain the target contrast enhancement matrix.
  • Assuming the candidate contrast enhancement matrix corresponding to level 20 is matrix c with weight C, and the candidate contrast enhancement matrix corresponding to level 50 is matrix d with weight D, the target contrast enhancement matrix Y is:
  • the electronic device may determine the target saturation enhancement matrix according to the determined saturation information and the correspondence between the different saturation levels and the saturation enhancement matrix stored in advance.
  • the electronic device may pre-store a corresponding saturation enhancement matrix for different saturation levels.
  • The saturation level can be, for example, in [0-100], the built-in level is 10, and the saturation levels of the saved saturation enhancement matrices can be: 10, 20, 30, 40, 50, 60, 70, 80, 90, 100.
  • Each of the saturation enhancement matrices is a 3×3 matrix.
  • the process of determining a target saturation enhancement matrix may include:
  • S501 Calculate a target saturation level according to the saturation information, a preset reference saturation, and a predetermined saturation conversion level.
  • the electronic device may calculate the target saturation level according to the saturation information of the second RGB map, the preset reference saturation, and the predetermined saturation conversion level.
  • the new target saturation level lvl_new_yuv can be calculated according to the following formula:
  • the electronic device may determine two levels of candidate saturation enhancement matrices adjacent to the target saturation level in the saturation enhancement matrix corresponding to the different saturation levels stored in advance.
  • For example, the candidate saturation enhancement matrices of the two levels adjacent to the target saturation level are, respectively, the candidate saturation enhancement matrix e corresponding to level 10 and the candidate saturation enhancement matrix f corresponding to level 20.
  • In this step, the electronic device may determine the weights of the two levels' candidate saturation enhancement matrices according to the distances from the target saturation level to the two levels, and interpolate the two candidate matrices with those weights to obtain the target saturation enhancement matrix. The greater the distance from the target saturation level to a level, the smaller the weight of that level's candidate matrix.
  • For example, when the candidate saturation enhancement matrices of the two levels adjacent to the target saturation level 15 are the matrix e corresponding to level 10 and the matrix f corresponding to level 20, the distance from level 10 to the target saturation level 15 is 5 and the distance from level 20 to it is also 5; the weight of candidate matrix e is determined as E and the weight of candidate matrix f as F; then, using weights E and F, matrices e and f are interpolated to obtain the target saturation enhancement matrix.
  • Assuming the candidate saturation enhancement matrix corresponding to level 10 is matrix e with weight E, and the candidate saturation enhancement matrix corresponding to level 20 is matrix f with weight F, the target saturation enhancement matrix Z is:
  • the electronic device may perform color enhancement processing on the second RGB image according to the target color enhancement matrix.
  • the electronic device can perform matrix multiplication on each pixel of the second RGB map using the target color enhancement matrix.
  • S204 Perform contrast and saturation adjustment on the second RGB image after the color enhancement process according to the target contrast enhancement matrix and the target saturation enhancement matrix, respectively, to obtain a second YUV image.
  • the electronic device may further perform contrast and saturation adjustment on the second RGB image after the color enhancement process according to the target contrast enhancement matrix and the target saturation enhancement matrix to obtain a second YUV map.
  • the electronic device may perform matrix multiplication on each pixel of the second RGB image after the color enhancement process using the target contrast enhancement matrix and the target saturation enhancement matrix to obtain a second YUV map.
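Per-pixel matrix multiplication of this kind vectorizes naturally; a minimal sketch, where the clipping range and the chained order of application are assumptions consistent with the text (color first, then contrast and saturation):

```python
import numpy as np

def apply_enhancement_matrix(image: np.ndarray, m: np.ndarray) -> np.ndarray:
    """Multiply every 3-component pixel of the image by a 3x3 matrix."""
    out = np.einsum('ij,hwj->hwi', np.asarray(m, float),
                    image.astype(np.float64))
    return np.clip(out, 0, 255)

# Chained use in the order described above, with X, Y, Z being the
# target color, contrast, and saturation enhancement matrices:
# second_yuv = apply_enhancement_matrix(
#     apply_enhancement_matrix(apply_enhancement_matrix(rgb2, X), Y), Z)
```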
  • In this embodiment, the second RGB map can be subjected to color enhancement, saturation enhancement, and contrast enhancement according to its own statistics, ensuring the image quality of the resulting second YUV map and thereby improving the accuracy of license plate recognition.
  • the electronic device combines the first YUV image and the second YUV image, and the step of obtaining the fused image may include:
  • the area to be fused of the second YUV map may be first determined, that is, the specific fusion range is determined.
  • the area to be fused is the rectangular area 701 in FIG. 7b.
  • the area to be fused may be divided into multiple sub-areas to respectively converge the plurality of sub-areas.
  • For the region to be fused in FIG. 7b, it can be divided into 9 sub-regions as shown in FIG. 7c, namely sub-regions 1, 2, 3, 4, 5, 6, 7, 8, and 9.
  • the electronic device may further determine the fusion weight of each pixel in each sub-region.
  • the electronic device may determine the fusion weight of each pixel in each sub-area as follows:
  • For example, the central sub-region is sub-region 5, and it is set to the highest weight, which can be set to 1; that is, during image fusion the enhanced result is used entirely there. The sub-regions directly above and below the central sub-region 5, i.e., sub-regions 2 and 8, have the same column weight as sub-region 5 (the highest), so only their row weights need to be calculated to determine their fusion weights. Similarly, the sub-regions to its left and right, i.e., sub-regions 4 and 6, share sub-region 5's row weight, so only their column weights need to be calculated. The four corner sub-regions, i.e., sub-regions 1, 3, 7, and 9, fall outside the range of sub-region 5 in both directions, so their row and column weights are calculated separately and the smaller of the two determines the fusion weight.
  • When calculating the vertical weight of any pixel in a sub-region, it may be computed from the pixel's distance to the sub-region's horizontal boundary; when calculating the horizontal weight of any pixel, it may be computed from the pixel's distance to the sub-region's vertical boundary. The horizontal weight (or the vertical weight) may then be squared, and the squared value used as the fusion weight of the pixel.
  • The farther a pixel is from the boundary, the greater the weight given to the enhanced second YUV map. In this way the enhanced result dominates far from the boundary while the original image dominates in the edge area, ensuring that the final full image looks natural and shows no stitching traces; a minimal weight-map sketch follows.
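A sketch consistent with this description, assuming the weights ramp linearly over an assumed border width before squaring (the ramp width is not given in the patent):

```python
import numpy as np

def fusion_weight_map(h: int, w: int, border: int) -> np.ndarray:
    """Fusion weights for the enhanced map over an h x w region to be fused.

    Row and column weights grow with distance from the region boundary,
    the smaller of the two is taken (as for the corner sub-regions), and
    the result is squared to smooth the transition.
    """
    row = np.clip(np.minimum(np.arange(h), np.arange(h)[::-1]) / border, 0.0, 1.0)
    col = np.clip(np.minimum(np.arange(w), np.arange(w)[::-1]) / border, 0.0, 1.0)
    weight = np.minimum(row[:, None], col[None, :])
    return weight ** 2  # 1.0 in the central sub-region, 0.0 at the boundary
```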
  • The fusion weight of each pixel in each sub-region is used as the fusion weight of the second YUV map, and the first YUV map and the second YUV map are merged to obtain a fused image.
  • the fusion weight of the area to be fused can be calculated, and the fused image is generated, thereby ensuring that the fused image looks natural, without texture splicing marks, and improving the subjective visual experience.
  • the fused image may be secondarily fused, that is, the fused image and the original image are merged again.
  • Specifically, the fusion weight of the first YUV map may be determined according to the difference between the fused image and the first YUV map in the YUV data of each corresponding pixel; the fused image is then fused with the first YUV map according to this weight to obtain the target image.
  • the absolute difference Dif corresponding to each pixel point can be calculated as:
  • Different values of Dif correspond to different weights: the smaller Dif is, the larger the weight. When Dif is greater than the threshold, the weight tends to a fixed value favoring the merged image.
  • the fused image and the original image may be secondarily fused to ensure that the fused boundary is more natural and smooth.
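A minimal sketch of this second fusion pass; the linear weight ramp and the `thresh`/`floor` constants are assumptions, since the patent only states that smaller Dif means a larger weight and that the weight becomes fixed above a threshold.

```python
import numpy as np

def secondary_fuse(fused: np.ndarray, first_yuv: np.ndarray,
                   thresh: float = 32.0, floor: float = 0.2) -> np.ndarray:
    """Second fusion pass between the fused image and the original YUV map."""
    a = fused.astype(np.float64)
    b = first_yuv.astype(np.float64)
    # Dif: per-pixel absolute difference of the YUV data.
    dif = np.abs(a - b).sum(axis=-1)
    # Smaller Dif -> larger weight for the original; above thresh -> fixed floor.
    w_orig = np.where(dif > thresh, floor, 1.0 - (1.0 - floor) * dif / thresh)
    out = w_orig[..., None] * b + (1.0 - w_orig[..., None]) * a
    return np.clip(out, 0, 255).astype(np.uint8)
```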
  • Referring to FIG. 8a, which shows the original image of FIG. 7a after its luma has been raised directly, the brightness is higher than in the original image of FIG. 7a; FIG. 8b shows the image fused using the method of this embodiment. It can be seen from FIG. 8a and FIG. 8b that the method of this embodiment effectively improves the image quality while keeping the image fusion boundary smooth and natural.
  • In an implementation of this embodiment, before converting the bayer map into the first RGB map, the electronic device may also judge whether the black level of the bayer map is greater than a preset threshold; if so, it removes the black level value from each channel's data of the bayer map, which further improves image quality.
  • The black level is the video signal level that produces no light output on a calibrated display device; it defines the signal level corresponding to image data 0. Adjusting the black level does not change the signal's amplification factor; it only shifts the signal up or down. Raising the black level darkens the image, while lowering it brightens the image. With a black level of 0, any level at or below 0 V is converted to image data 0, and levels above 0 V are converted according to the magnification defined by the gain, up to a maximum value of 255. A minimal sketch of the removal step follows.
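A sketch of this check-and-subtract step, assuming a single measured black level applied uniformly; the patent speaks of removing the value from each channel's data, so a per-channel variant would apply the same subtraction with one level per Bayer channel.

```python
import numpy as np

def remove_black_level(bayer: np.ndarray, black: int,
                       threshold: int) -> np.ndarray:
    """Subtract the black level from Bayer data when it exceeds a preset threshold."""
    if black <= threshold:
        return bayer  # black level acceptable; leave the data untouched
    lifted = bayer.astype(np.int32) - black
    return np.clip(lifted, 0, None).astype(bayer.dtype)
```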
  • the embodiment of the present application further provides an image processing apparatus.
  • the apparatus includes:
  • the obtaining module 910 is configured to obtain an image to be processed.
  • An extraction module 920 configured to extract a bayer map including the target object in the image to be processed, and convert the image to be processed into a first YUV map;
  • a statistic module 930 configured to count the average brightness of the Y component in the first YUV map and, when the average brightness is outside a preset range, determine a corresponding gamma curve according to the average brightness of the Y component in the first YUV map;
  • a conversion module 940 configured to convert the bayer map into a first RGB image; and map the first RGB image according to the gamma curve to obtain a second RGB image;
  • a processing module 950 configured to determine, according to the second RGB map, the matrices for enhancing the second RGB map, and to enhance the second RGB map with those matrices to obtain a second YUV map;
  • the first fusion module 960 is configured to fuse the first YUV map and the second YUV map to obtain a fused image.
  • In this solution, the target object area can be enhanced from both the bayer map and the YUV map. Because the bayer map has only undergone exposure processing of its brightness, it contains no overexposure caused by additional brightening; the solution therefore avoids abnormal license plate recognition caused by scenes that are too bright or too dark and improves the accuracy of license plate recognition.
  • the processing module 950 includes:
  • a first determining submodule configured to perform histogram statistics on the second RGB image, and determine variance, saturation information, and color information corresponding to the second RGB image;
  • a second determining submodule configured to determine, according to the variance, saturation information, and color information corresponding to the second RGB map, a target color enhancement matrix for color-enhancing the second RGB map, a target contrast enhancement matrix for contrast-enhancing the second RGB map, and a target saturation enhancement matrix for saturation-enhancing the second RGB map;
  • An enhancement processing submodule configured to perform color enhancement processing on the second RGB image according to the target color enhancement matrix
  • an adjusting submodule configured to adjust the contrast and saturation of the color-enhanced second RGB map according to the target contrast enhancement matrix and the target saturation enhancement matrix, respectively, to obtain a second YUV map.
  • the second determining submodule includes:
  • a first determining subunit configured to determine a target color enhancement matrix according to the color information and the saturation information, and the correspondence between the different saturation levels and the color enhancement matrix in each of the pre-saved colors;
  • a second determining subunit configured to determine a target contrast enhancement matrix according to the variance, and a correspondence between different contrast levels saved in advance and a contrast enhancement matrix
  • a third determining subunit configured to determine a target saturation enhancement matrix according to the saturation information and a correspondence between different saturation levels and a saturation enhancement matrix stored in advance.
  • the first determining subunit is specifically configured to:
  • interpolating the candidate color enhancement matrices to obtain a target color enhancement matrix.
  • the second determining subunit is specifically configured to:
  • the candidate contrast enhancement matrix is interpolated to obtain a target contrast enhancement matrix.
  • the third determining subunit is specifically configured to:
  • the two levels of candidate saturation enhancement matrices are interpolated to obtain a target saturation enhancement matrix.
  • the statistic module is specifically configured to:
  • the extraction module 920 is specifically configured to:
  • interpolating into the RGB domain to obtain a first RGB map.
  • the first fusion module 960 includes:
  • a sub-module configured to determine a to-be-fused region of the second YUV map, and divide the to-be-fused region into a plurality of sub-regions;
  • a third determining submodule configured to determine a fusion weight of each pixel in each subregion
  • a fusion sub-module configured to use the fusion weight of each pixel in each sub-region as the fusion weight of the second YUV map, and fuse the first YUV map and the second YUV map to obtain a fused image.
  • the third determining submodule is specifically configured to:
  • the device further includes:
  • a determining module configured to determine a fusion weight of the first YUV map according to a difference between the fused image and YUV data corresponding to each pixel of the first YUV map;
  • a second fusion module configured to fuse the fused image with the first YUV image according to a fusion weight of the first YUV map to obtain a target image.
  • the device further includes:
  • a judging module configured to judge whether the black level of the bayer map is greater than a preset threshold;
  • a removing module configured to remove the black level value from each channel's data of the bayer map when the judging module's result is YES.
  • the embodiment of the present application further provides an electronic device, as shown in FIG. 10, including:
  • a processor 1010, a memory 1020, a communication interface 1030, and a bus 1040;
  • the processor 1010, the memory 1020, and the communication interface 1030 are connected by the bus 1040 and complete communication with each other;
  • the memory 1020 stores executable program code
  • the processor 1010 reads the executable program code stored in the memory 1020 and runs the program corresponding to it, so as to perform the image processing method according to an embodiment of the present application, wherein
  • the image processing method includes:
  • the first YUV map and the second YUV map are fused to obtain a fused image.
  • In this solution, the target object area can be enhanced from both the bayer map and the YUV map. Because the bayer map has only undergone exposure processing of its brightness, it contains no overexposure caused by additional brightening; the solution therefore avoids abnormal license plate recognition caused by scenes that are too bright or too dark and improves the accuracy of license plate recognition.
  • the communication bus mentioned in the above computer equipment may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus.
  • the communication bus can be divided into an address bus, a data bus, a control bus, and the like. For ease of representation, only one line is shown in the figure, but it does not mean that there is only one bus or one type of bus.
  • the communication interface is used for communication between the above computer device and other devices.
  • the memory may include a random access memory (RAM), and may also include a non-volatile memory, such as at least one disk storage.
  • the memory may also be at least one storage device located away from the aforementioned processor.
  • The processor may be a general-purpose processor, including a central processing unit (CPU) or a network processor (NP); it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • The embodiment of the present application further provides a computer readable storage medium storing a computer program which, when executed by the processor, implements the image processing method of any of FIG. 1 to FIG. 8 above.
  • In this solution, the target object area can be enhanced from both the bayer map and the YUV map. Because the bayer map has only undergone exposure processing of its brightness, it contains no overexposure caused by additional brightening; the solution therefore avoids abnormal license plate recognition caused by scenes that are too bright or too dark and improves the accuracy of license plate recognition.
  • The embodiment of the present application further provides a computer program product comprising instructions which, when executed on a computer, cause the computer to execute the image processing method described in any of FIG. 1 to FIG. 8 above.
  • In this solution, the target object area can be enhanced from both the bayer map and the YUV map. Because the bayer map has only undergone exposure processing of its brightness, it contains no overexposure caused by additional brightening; the solution therefore avoids abnormal license plate recognition caused by scenes that are too bright or too dark and improves the accuracy of license plate recognition.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

The present invention relates to an image processing method, an apparatus, an electronic device, and a storage medium. The method comprises: acquiring an image to be processed (S101); extracting, from the image to be processed, a Bayer map containing a target object, and converting the image to be processed into a first YUV map (S102); collecting statistics on the average brightness of a Y component in the first YUV map, and, when the average brightness falls outside a preset range, determining a corresponding gamma curve according to the average brightness of the Y component in the first YUV map (S103); converting the Bayer map into a first RGB map, and mapping the first RGB map according to the gamma curve so as to obtain a second RGB map (S104); determining, according to the second RGB map, matrices for enhancing the second RGB map, and enhancing the second RGB map by means of the matrices so as to obtain a second YUV map (S105); and fusing the first YUV map and the second YUV map to obtain a fused image (S106). The method can improve the quality of a monitored image and further improve the accuracy of vehicle license plate recognition.
PCT/CN2018/112877 2018-02-02 2018-10-31 Image processing method, apparatus, electronic device and storage medium WO2019148912A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810104828.XA CN110136071B (zh) 2018-02-02 2018-02-02 Image processing method, apparatus, electronic device and storage medium
CN201810104828.X 2018-02-02

Publications (1)

Publication Number Publication Date
WO2019148912A1 true WO2019148912A1 (fr) 2019-08-08

Family

ID=67477869

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/112877 WO2019148912A1 (fr) 2018-02-02 2018-10-31 Image processing method, apparatus, electronic device and storage medium

Country Status (2)

Country Link
CN (1) CN110136071B (fr)
WO (1) WO2019148912A1 (fr)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111028190A (zh) * 2019-12-09 2020-04-17 Oppo广东移动通信有限公司 Image processing method and apparatus, storage medium, and electronic device
CN111126493B (zh) * 2019-12-25 2023-08-01 东软睿驰汽车技术(沈阳)有限公司 Deep learning model training method and apparatus, electronic device, and storage medium
CN111260600B (zh) * 2020-01-21 2023-08-22 维沃移动通信有限公司 Image processing method, electronic device, and medium
CN111428732B (zh) * 2020-03-03 2023-10-17 平安科技(深圳)有限公司 YUV image recognition method, system, and computer device
CN111860380A (zh) * 2020-07-27 2020-10-30 平安科技(深圳)有限公司 Face image generation method, apparatus, server, and storage medium
CN111898532A (zh) * 2020-07-30 2020-11-06 杭州海康威视数字技术股份有限公司 Image processing method and apparatus, electronic device, and monitoring system
CN112381836B (zh) * 2020-11-12 2023-03-31 贝壳技术有限公司 Image processing method and apparatus, computer-readable storage medium, and electronic device
CN113313661A (zh) * 2021-05-26 2021-08-27 Oppo广东移动通信有限公司 Image fusion method and apparatus, electronic device, and computer-readable storage medium
CN113793470A (zh) * 2021-08-09 2021-12-14 上海腾盛智能安全科技股份有限公司 Detection apparatus based on dynamic image detection and analysis
CN113781370A (zh) * 2021-08-19 2021-12-10 北京旷视科技有限公司 Image enhancement method and apparatus, and electronic device
CN113920042B (zh) * 2021-09-24 2023-04-18 深圳市资福医疗技术有限公司 Image processing system and capsule endoscope
CN113792708B (zh) * 2021-11-10 2022-03-18 湖南高至科技有限公司 ARM-based long-range target clear imaging system and method

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100425080C (zh) * 2005-05-25 2008-10-08 凌阳科技股份有限公司 Edge enhancement method and apparatus for Bayer images, and color image capture system
KR101366596B1 (ko) * 2007-08-03 2014-03-14 삼성전자주식회사 Method and system for generating a sense of immersion for a two-dimensional still image, and factor adjustment, image content analysis, and scaling parameter prediction methods for generating that immersion
CN101115211A (zh) * 2007-08-30 2008-01-30 四川长虹电器股份有限公司 Color-independent enhancement processing method
CN101990081B (zh) * 2010-11-11 2012-02-22 宁波大学 Copyright protection method for virtual viewpoint images
KR101217476B1 (ko) * 2011-03-07 2013-01-02 주식회사 코아로직 Apparatus and method for image signal processing
CN102231206B (zh) * 2011-07-14 2012-11-28 浙江理工大学 Brightness enhancement method for color night-vision images suitable for automotive driver-assistance systems
CN104156921B (zh) * 2014-08-08 2017-02-22 大连理工大学 Adaptive image enhancement method for low-illumination or unevenly lit images
CN105184757B (zh) * 2015-06-11 2018-03-06 西安电子科技大学 Food image color enhancement method based on color space characteristics
CN105160355B (zh) * 2015-08-28 2018-05-15 北京理工大学 Remote sensing image change detection method based on regional correlation and visual words
CN106023129A (zh) * 2016-05-26 2016-10-12 西安工业大学 Anti-halation video image processing method for vehicles based on infrared and visible light image fusion

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040263651A1 (en) * 2003-06-30 2004-12-30 Shin Kazunobu Method and system for printing images captured by a mobile camera telephone
CN102202163A (zh) * 2011-05-13 2011-09-28 成都西图科技有限公司 Adaptive enhancement method and apparatus for surveillance video
CN104899845A (zh) * 2015-05-10 2015-09-09 北京工业大学 Multi-exposure image fusion method based on lαβ-space scene transfer
CN105847703A (zh) * 2016-03-28 2016-08-10 联想(北京)有限公司 Image processing method and electronic device
CN107438178A (zh) * 2016-05-25 2017-12-05 掌赢信息科技(上海)有限公司 Image color correction method and electronic device
CN106454014A (zh) * 2016-11-04 2017-02-22 安徽超远信息技术有限公司 Method and apparatus for improving the quality of vehicle snapshot images in backlit scenes

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110910830A (zh) * 2019-11-29 2020-03-24 京东方科技集团股份有限公司 Display brightness adjustment method, display system, computer device, and medium
CN110910830B (zh) * 2019-11-29 2021-02-12 京东方科技集团股份有限公司 Display brightness adjustment method, display system, computer device, and medium
CN111246190A (zh) * 2020-01-21 2020-06-05 北京中科核安科技有限公司 Image processing method and apparatus based on a nuclear radiation locator, and electronic device
CN111724456A (zh) * 2020-06-18 2020-09-29 Oppo广东移动通信有限公司 Image display method, device, and computer-readable storage medium
CN111724456B (zh) * 2020-06-18 2024-05-24 Oppo广东移动通信有限公司 Image display method, device, and computer-readable storage medium
CN112017174A (zh) * 2020-09-03 2020-12-01 湖南省华芯医疗器械有限公司 Image processing method, apparatus, electronic device, and storage medium
CN112017174B (zh) * 2020-09-03 2024-05-31 湖南省华芯医疗器械有限公司 Image processing method, apparatus, electronic device, and storage medium
WO2022105019A1 (fr) * 2020-11-20 2022-05-27 罗普特科技集团股份有限公司 Snapshot quality evaluation method and apparatus for a vehicle checkpoint (bayonet) device, and readable medium
CN117710365A (zh) * 2024-02-02 2024-03-15 中国电建集团华东勘测设计研究院有限公司 Method and apparatus for processing images of defective pipelines, and electronic device
CN117710365B (zh) * 2024-02-02 2024-05-03 中国电建集团华东勘测设计研究院有限公司 Method and apparatus for processing images of defective pipelines, and electronic device
CN117974414B (zh) * 2024-03-28 2024-06-07 中国人民解放军国防科技大学 Digital watermark signature verification method, apparatus, and device based on fused news material

Also Published As

Publication number Publication date
CN110136071A (zh) 2019-08-16
CN110136071B (zh) 2021-06-25

Similar Documents

Publication Publication Date Title
WO2019148912A1 (fr) Image processing method, apparatus, electronic device and storage medium
CN111028189B (zh) Image processing method and apparatus, storage medium, and electronic device
EP3308537B1 (fr) Calibration of defective image sensor elements
US7764319B2 (en) Image processing apparatus, image-taking system, image processing method and image processing program
EP3554070A1 (fr) Photographing method and apparatus, terminal, and storage medium
EP2854389A1 (fr) Scene recognition method and apparatus
CN107633252B (zh) Skin color detection method and apparatus, and storage medium
TWI532385B (zh) White balance processing method and processing apparatus thereof
CN108932696B (zh) Halo suppression method and apparatus for signal lights
WO2018068300A1 (fr) Image processing method and device
WO2021088639A1 (fr) Image brightness processing method and apparatus, and image processing method and apparatus
US20210067695A1 (en) Image processing apparatus, output information control method, and program
CN109922325B (zh) Image processing device, control method thereof, and computer-readable storage medium
CN107644437B (zh) Block-based color cast detection system and method
CN108615030B (zh) Title consistency detection method and apparatus, and electronic device
CN107682611B (zh) Focusing method and apparatus, computer-readable storage medium, and electronic device
CN106773453B (zh) Camera exposure method and apparatus, and mobile terminal
JP5395650B2 (ja) Subject area extraction apparatus and control method thereof, subject tracking apparatus, and program
CN111970501A (zh) AE color processing method for solid-color scenes, apparatus, electronic device, and storage medium
JP2007166028A (ja) Exposure control apparatus and exposure control method
CN116309224A (zh) Image fusion method and apparatus, terminal, and computer-readable storage medium
US20110149069A1 (en) Image processing apparatus and control method thereof
CN109510936B (zh) Continuous autofocus method and system
CN109327655B (zh) Continuous autofocus method and system
US20130163862A1 (en) Image processing method and device for redeye correction

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18904146

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18904146

Country of ref document: EP

Kind code of ref document: A1