WO2022227394A1 - Image processing method and apparatus, and device, storage medium and program - Google Patents

Image processing method and apparatus, and device, storage medium and program

Info

Publication number
WO2022227394A1
Authority
WO
WIPO (PCT)
Prior art keywords: image, area, processed, guide, regional
Prior art date
Application number
PCT/CN2021/120905
Other languages: French (fr), Chinese (zh)
Inventors: 吴佳飞, 李亘杰, 张广程
Original Assignee: 上海商汤智能科技有限公司 (Shanghai SenseTime Intelligent Technology Co., Ltd.)
Application filed by 上海商汤智能科技有限公司
Priority to KR1020227027309A (published as KR20220149514A)
Publication of WO2022227394A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T5/70 Denoising; Smoothing
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G06T7/40 Analysis of texture
    • G06T7/97 Determining parameters from multiple pictures
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Definitions

  • the present disclosure relates to the field of computer vision technology, and in particular, to an image processing method, apparatus, device, storage medium, and program.
  • edge protection filtering is one of the most commonly used image filtering techniques.
  • there are two methods of edge protection filtering: local filtering and global optimization.
  • global optimization has a better processing effect, but its computational complexity is too high and its efficiency is poor.
  • the embodiment of the present disclosure proposes an image processing technical solution.
  • An embodiment of the present disclosure provides an image processing method, the method is executed by an electronic device, and the method includes:
  • the linear estimation parameters of any image position in the image to be processed are associated with the regional variance value and the regional entropy value of the first image area in the guide image, where the first image area includes an image area in the guide image centered on the image position and of a preset size.
  • the determining of linear estimation parameters of multiple image positions of the to-be-processed image according to the to-be-processed image and the guide image of the to-be-processed image includes: for any image position of the to-be-processed image, determining a first image area and a second image area corresponding to the image position, the second image area including an image area in the to-be-processed image centered on the image position and of a preset size; respectively determining the area parameters of the first image area and the second image area, the area parameters including the area variance value, the area entropy value, and the first area mean of the pixels in the first image area, the second area mean of the pixels in the second image area, and the third area mean of the fusion area of the first image area and the second image area; and determining the linear estimation parameters of each image position according to the area parameters. In this way, the linear estimation parameters of an image position can be determined based on the area parameters of the guide image area corresponding to that position.
  • the respectively determining the area parameters of the first image area and the second image area includes: fusing the first image area and the second image area to obtain a fusion area; and determining the third area mean of the pixels in the fusion area. In this way, the regional means of the areas can be fused to realize the association between the guide image and the image to be processed.
  • the respectively determining the area parameters of the first image area and the second image area includes: determining, according to a brightness histogram of the guide image, the occurrence probability of each pixel in the first image area in the guide image; and determining the region entropy value according to the occurrence probability of each pixel in the first image region. In this way, the regional entropy value of the image region can be obtained for subsequent calculation of the linear estimation parameters.
  • the linear estimation parameters include a first parameter and a second parameter, and the determining of the linear estimation parameters of the plurality of image positions according to the area parameters includes: determining the first parameter of the image position according to the regional variance value, the regional entropy value, the first area mean, the second area mean, and the third area mean; and determining the second parameter of the image position according to the first area mean, the second area mean, and the first parameter.
  • the method further includes: performing linear filtering on the to-be-processed image to obtain a guide image of the to-be-processed image.
  • in the guide image, information such as noise in the image to be processed has been filtered out, so that the image is smoother and can guide the subsequent processing of the image to be processed.
  • the method further includes: decomposing the image to be processed to obtain a texture map of the image to be processed.
  • an edge protection filtering algorithm can be used to decompose the to-be-processed image into a base layer and a detail layer, and the detail layer can be used as the texture map of the to-be-processed image, so that the image processing result can be obtained by using the texture map subsequently.
  • Embodiments of the present disclosure provide an image processing apparatus, including:
  • a parameter determination module configured to determine linear estimation parameters of multiple image positions of the to-be-processed image according to the to-be-processed image and a guide image of the to-be-processed image;
  • a structure diagram acquisition module configured to obtain the structure diagram of the to-be-processed image according to the guide image and the linear estimation parameters of the positions of the multiple images;
  • a result determination module configured to fuse the structure map with the texture map of the to-be-processed image to obtain an image processing result of the to-be-processed image.
  • the linear estimation parameters of any image position in the image to be processed are associated with the regional variance value and the regional entropy value of the first image area in the guide image, where the first image area includes an image area in the guide image centered on the image position and of a preset size.
  • the parameter determination module includes:
  • an area determination sub-module configured to determine, for any image position of the image to be processed, a first image area and a second image area corresponding to the image position, the second image area including an image area in the image to be processed centered on the image position and of a preset size;
  • a region parameter determination sub-module configured to respectively determine the region parameters of the first image region and the second image region, the region parameters including the region variance value, the region entropy value, and the first area mean of the pixels in the first image region, the second area mean of the pixels in the second image region, and the third area mean of the fusion region of the first image region and the second image region;
  • the linear parameter determination submodule is configured to determine linear estimation parameters of the plurality of image positions according to the region parameters.
  • the region parameter determination sub-module is configured to fuse the first image region and the second image region to obtain a fusion region, and to determine the third area mean of the pixels in the fusion region.
  • the region parameter determination sub-module is configured to determine, according to a brightness histogram of the guide image, the occurrence probability of each pixel in the first image region in the guide image, and to determine the region entropy value according to the occurrence probability of each pixel in the first image region.
  • the linear estimation parameters include a first parameter and a second parameter, and the linear parameter determination sub-module is configured to determine the first parameter of the image position according to the regional variance value, the regional entropy value, the first area mean, the second area mean, and the third area mean, and to determine the second parameter of the image position according to the first area mean, the second area mean, and the first parameter.
  • the apparatus further includes: a linear filtering module configured to perform linear filtering on the to-be-processed image to obtain a guide image of the to-be-processed image.
  • the apparatus further includes: an image decomposition module configured to decompose the to-be-processed image to obtain a texture map of the to-be-processed image.
  • An embodiment of the present disclosure provides an electronic device, including: a processor; a memory for storing instructions executable by the processor; wherein the processor is configured to invoke the instructions stored in the memory to execute the above method.
  • Embodiments of the present disclosure provide a computer-readable storage medium, on which computer program instructions are stored, and when the computer program instructions are executed by a processor, the foregoing method is implemented.
  • An embodiment of the present disclosure also provides a computer program, the computer program includes computer-readable codes, and when the computer-readable codes are executed in an electronic device, a processor of the electronic device executes the above method.
  • the linear estimation parameters can be determined according to the image to be processed and the guide image; the structure diagram of the image to be processed can be obtained according to the guide image and the linear estimation parameters; the image processing result can be obtained by fusing the structure diagram and the texture image.
  • the estimated parameters are related to the regional variance value and the regional entropy value, which can better maintain the sharpness of the image edge and improve the image processing effect.
  • FIG. 1 shows a flowchart of an image processing method according to an embodiment of the present disclosure
  • FIG. 2 shows a schematic diagram of a system architecture to which an image processing method according to an embodiment of the present disclosure can be applied;
  • FIG. 3 shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure
  • FIG. 4 shows a block diagram of an electronic device according to an embodiment of the present disclosure
  • FIG. 5 shows a block diagram of an electronic device according to an embodiment of the present disclosure.
  • "multiple" or "a plurality of" in the embodiments of the present disclosure may refer to at least two.
  • Guided filtering is a local filtering algorithm of edge protection filtering technology.
  • in guided filtering, a linear regression model is fitted, and a regularization constraint is added to the linear regression model to reduce the overfitting problem.
  • the weight of the regularization constraint is fixed, which is prone to artifacts in edge regions.
  • in the embodiments of the present disclosure, the linear estimation parameters of an image position can be determined based on the regional variance value and the regional entropy value of the guide image region corresponding to that position, and the weight of the regularization constraint can be adjusted automatically through an entropy-variance joint adaptive processing method, thereby reducing possible artifacts in the edge areas of the image and improving the image processing effect.
  • FIG. 1 shows a flowchart of an image processing method according to an embodiment of the present disclosure. As shown in FIG. 1 , the image processing method includes:
  • in step S11, linear estimation parameters of multiple image positions of the to-be-processed image are determined according to the to-be-processed image and the guide image of the to-be-processed image;
  • in step S12, a structure diagram of the to-be-processed image is obtained according to the guide image and the linear estimation parameters of the multiple image positions;
  • in step S13, the structure map and the texture map of the to-be-processed image are fused to obtain an image processing result of the to-be-processed image.
  • the linear estimation parameters of any image position in the image to be processed are associated with the regional variance value and the regional entropy value of the first image area in the guide image, where the first image area includes an image area in the guide image centered on the image position and of a preset size.
  • the image processing method may be executed by an electronic device such as a terminal device or a server
  • the terminal device may be a user equipment (User Equipment, UE), a mobile device, a user terminal, a terminal, a cellular phone, a cordless phone, a personal digital assistant (Personal Digital Assistant, PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, etc.
  • the method can be implemented by the processor calling the computer-readable instructions stored in the memory.
  • the method may be performed by a server.
  • the image to be processed may be any image, such as a scene image collected by an image acquisition device, an image downloaded from a network, and the like.
  • the image processing task for the image to be processed can be of any category, such as image dehazing, dark-light enhancement, contrast enhancement, tone mapping, high dynamic range (HDR) imaging, etc.; the specific category is not limited.
  • an edge protection filtering algorithm can be used to decompose the image to be processed into two layers, called a base layer and a detail layer, where the base layer includes the overall structural information of the image (such as the position and layout of objects in the image, etc. ), the detail layer includes the texture detail information of the image (for example, the texture, direction, etc. of the object in the image).
  • the image to be processed X(p) can be expressed as:

    X(p) = Z(p) + D(p)    (1)

  • where Z(p) represents the base layer, D(p) represents the detail layer, and p represents any image position in the image.
  • the base layer can be estimated by a linear transformation of the guide image:

    Z(p) = a_{p'} · G(p) + b_{p'}    (2)

  • where formula (2) represents the linear-transformation estimate of the base layer, G(p) represents the guide image, and a_{p'} and b_{p'} represent the linear estimation parameters of the linear regression model at the image position p'.
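The base/detail decomposition of formula (1) can be sketched in a few lines of numpy; the `box_filter` helper, its radius, and the random test image are illustrative assumptions, not the patent's prescribed smoothing filter:

```python
import numpy as np

def box_filter(img, radius):
    """Mean filter over a (2*radius+1)^2 window, zero-padded at the borders.
    A simple linear filter, standing in for any edge-protection filter."""
    k = 2 * radius + 1
    padded = np.pad(img, radius, mode="constant")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

# Decompose the image to be processed into base + detail layers:
# X(p) = Z(p) + D(p)  -- formula (1)
X = np.random.rand(32, 32)
Z = box_filter(X, radius=1)   # base layer (overall structure)
D = X - Z                     # detail layer (texture)
```

By construction, summing the two layers recovers the original image exactly, which is what makes the later structure/texture fusion step lossless when equal weights are used.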
  • the guide image is used to filter out target components in the image to be processed (eg, to filter out noise in the image to be processed).
  • linear filtering such as block filtering, mean filtering, or Gaussian filtering, etc.
  • the image to be processed itself can also be used as a guide image.
  • the present disclosure does not limit the source and processing method of the guide image.
  • the linear estimation parameters at each image position of the image to be processed may be determined respectively according to the image to be processed and the guide image.
  • the linear estimation parameters a_{p'} and b_{p'} for the image position p can be calculated by the following linear regression model:

    (a_{p'}, b_{p'}) = argmin Σ_{p ∈ ω₁(p')} [ (a_{p'} · G(p) + b_{p'} − X(p))² + (λ / Γ_G(p')) · a_{p'}² ]    (3)

  • where λ represents a regularization weight; ω₁(p') represents a rectangular window centered on the image position p' with side length ω₁ (for example, 3, 5, 7, 11, 17, etc.); and Γ_G(p') represents the entropy-variance joint adaptive factor, which is the normalized processing result given by formula (4).
  • the cross entropy-variance factor of formula (5) is determined from σ²_G(p'), the regional variance value (or local variance value) of the pixels within the rectangular window ω₁(p') in the guide image, and H_G(p'), the regional entropy value (or local entropy value) of the pixels within the rectangular window ω₁(p') in the guide image.
  • in this way, the linear estimation parameters a_{p'} and b_{p'} of the image position p can be obtained, and the obtained parameters are associated with the regional variance value and the regional entropy value of the rectangular window ω₁(p') in the guide image, which may be referred to as the first image region.
  • the first image area includes an image area of a preset size (e.g., 3×3, 5×5, etc.) centered on the image position p in the guide image.
  • for edge regions of the image, the area variance value is usually much larger than the area entropy value; for smooth areas inside the image (such as texture detail areas or background areas with noise), the area entropy value is usually larger than the area variance value. Therefore, the factor Γ_G(p') for edge regions will be larger than the factor Γ_G(p') for smooth regions. In this case, halo artifacts near edges and over-smoothing of details are reduced, resulting in better edge sharpness and an improved image processing effect.
  • a structure diagram of the to-be-processed image may be obtained according to the guide image and linear estimation parameters of multiple image positions. That is, after obtaining the linear estimation parameters a p' and b p' of the image position p, the linear estimation result of the image position p can be calculated according to formula (2).
  • a linear estimation result of the entire image to be processed can be obtained, that is, the base layer of the image to be processed.
  • the base layer includes the overall structure information of the image, which can be called the structure diagram of the image to be processed.
  • an edge protection filtering algorithm can be used to decompose the image to be processed into a base layer and a detail layer.
  • the detail layer includes texture detail information of the image, which may be called a texture map of the image to be processed.
  • in step S13, the structure map of the image to be processed and the texture map may be fused to obtain an image processing result of the image to be processed.
  • the structure map and the texture map may be directly summed to obtain a processed image as an image processing result.
  • different weights may be set for the structure map and the texture map according to the category of the image processing task (for example, in an image enhancement task, the weights of the structure map and the texture map may be set to 1 and 2 respectively, in order to enhance image texture details). The weighted summation of the structure map and the texture map is then performed to obtain a processed image, which is used as the image processing result.
  • the present disclosure does not limit the fusion method of the structure map and the texture map.
  • the entropy-variance joint adaptive factor introduced in the embodiment of the present disclosure can better distinguish the edge region of the image from the smooth region inside the image, and also has better anti-noise capability. Therefore, the image processing result obtained in step S13 can not only retain the detail information of the original image, but also reduce the halo artifacts near the edge of the image, and improve the sharpness of the edge of the image.
  • the image processing result obtained in step S13 has better visual quality and quantitative performance.
  • the linear estimation parameters can be determined according to the image to be processed and the guide image; the structure diagram of the image to be processed can be obtained according to the guide image and the linear estimation parameters; the image processing result can be obtained by fusing the structure diagram and the texture image.
  • the estimated parameters are associated with the regional variance value and the regional entropy value, which can better maintain the sharpness of the image edge, thereby improving the image processing effect.
  • FIG. 2 shows a schematic diagram of a system architecture to which an image processing method according to an embodiment of the present disclosure can be applied; as shown in FIG. 2 , the system architecture includes an image acquisition terminal 201 , a network 202 and a processing terminal 203 .
  • the image acquisition terminal 201 and the processing terminal 203 establish a communication connection through the network 202; the image acquisition terminal 201 reports the image to be processed to the processing terminal 203 through the network 202, and the processing terminal 203, in response to the received image to be processed, performs the following operations:
  • the linear estimation parameters of the multiple image positions of the to-be-processed image are determined;
  • the structure diagram of the to-be-processed image is obtained according to the guide image and the linear estimation parameters of the multiple image positions;
  • the structure map and the texture map of the image to be processed are fused to obtain an image processing result of the image to be processed.
  • the processing terminal 203 uploads the image processing result to the network 202 and sends it to the image acquisition terminal 201 through the network 202 .
  • the image acquisition terminal 201 may include an image acquisition device, and the processing terminal 203 may include a visual processing device or a remote server with visual information processing capability.
  • Network 202 may employ wired or wireless connections.
  • the image acquisition terminal 201 can communicate with the visual processing device through a wired connection, such as data communication through a bus; when the processing terminal 203 is a remote server, the image acquisition terminal 201 can Data exchange with remote server through wireless network.
  • the image acquisition terminal 201 may be a vision processing device with a video acquisition module, or a host with a camera.
  • the image processing method of the embodiment of the present disclosure may also be executed by the image acquisition terminal 201 itself, in which case the above-mentioned system architecture may not include the network 202 and the processing terminal 203.
  • the image processing method according to the embodiment of the present disclosure may further include: performing linear filtering on the to-be-processed image to obtain a guide image of the to-be-processed image.
  • the guide image of the image to be processed can be obtained by means of linear filtering, and the information such as noise in the image to be processed is filtered from the guide image, so that the information in the image is smoother, so as to guide the subsequent processing of the image to be processed.
  • linear filtering may be, for example, block filtering, mean filtering, Gaussian filtering, etc., which is not limited in the present disclosure.
  • in step S11, linear estimation parameters of multiple image positions of the to-be-processed image may be determined according to the to-be-processed image and the guide image.
  • step S11 may include:
  • for any image position of the image to be processed, determining a first image area and a second image area corresponding to the image position, the second image area including an image area in the image to be processed centered on the image position and of a preset size;
  • respectively determining the area parameters of the first image area and the second image area, the area parameters including the area variance value, the area entropy value, and the first area mean of the pixels in the first image area, the second area mean of the pixels in the second image area, and the third area mean of the fusion area of the first image area and the second image area;
  • determining the linear estimation parameters of the image position according to the area parameters.
  • a window area corresponding to the image position may be determined first, including a first image area and a second image area.
  • the first image area includes an image area of a preset size centered on the image position in the guide image;
  • the second image area includes an image area of a preset size centered on the image position in the image to be processed.
  • the preset size may be set to, for example, 3×3, 5×5, 7×7, 11×11, 17×17, etc. The present disclosure does not limit the specific value of the preset size.
  • for windows at image positions near the border, the part beyond the image range may be filled by zero padding.
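The window extraction with zero padding described above might be sketched as follows (the helper name, window size, and test image are assumptions for illustration):

```python
import numpy as np

def window_at(img, y, x, size):
    """Extract the size x size image area centered at (y, x),
    zero-padding any part that falls outside the image range."""
    r = size // 2
    padded = np.pad(img, r, mode="constant")  # zero padding beyond the border
    return padded[y:y + size, x:x + size]

img = np.arange(25, dtype=float).reshape(5, 5)
w_center = window_at(img, 2, 2, 3)
w_corner = window_at(img, 0, 0, 3)   # 5 of its 9 entries fall outside the image
```

The same helper would serve for both the first image area (cut from the guide image) and the second image area (cut from the image to be processed), since both windows share the center position and preset size.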
  • the area parameters of the first image area and the second image area may be determined separately in order to calculate the linear estimation parameters a p' and b p' of the image position p.
  • the area parameters include the area variance value, the area entropy value and the first area mean of the pixels in the first image area, the second area mean of the pixels in the second image area, and the first image area and the second area mean The third regional mean of the fused regions of the image regions.
  • the area variance value of the pixels in the first image area, that is, σ²_G(p') in formula (5), can be expressed in normalized form as:

    σ̄²_G(p') = σ²_G(p') / ( (1/N) · Σ_{j=1}^{N} σ²_G(j) )    (6)

  • where σ²_G(p') represents the regional variance of the pixels in the rectangular window ω₁(p') corresponding to the image position p in the guide image; N represents the number of pixels in the guide image; j represents any image position in the guide image; and σ²_G(j) represents the regional variance of the pixels in the rectangular window corresponding to the image position j in the guide image. According to formula (6), σ̄²_G(p') is the normalized processing result of σ²_G(p').
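The normalized regional variance described around formula (6) can be approximated with a sliding-window computation; this is a hedged sketch, where the uniform window, zero padding, and random test image are implementation assumptions:

```python
import numpy as np

def local_variance(img, size):
    """Regional variance of each size x size window: var = E[g^2] - (E[g])^2."""
    r = size // 2
    padded = np.pad(img, r, mode="constant")
    mean = np.zeros(img.shape, dtype=float)
    mean_sq = np.zeros(img.shape, dtype=float)
    for dy in range(size):
        for dx in range(size):
            patch = padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
            mean += patch
            mean_sq += patch ** 2
    n = size * size
    mean /= n
    mean_sq /= n
    return mean_sq - mean ** 2

G = np.random.rand(16, 16)            # guide image (illustrative)
var = local_variance(G, 3)            # regional variance at every position
var_norm = var / var.mean()           # divide by the mean variance over all N positions
```

Dividing each window's variance by the average variance over all positions makes the normalized values dimensionless, so variance and entropy can be compared on a common scale.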
  • the step of respectively determining the area parameters of the first image area and the second image area may include:
  • the brightness histogram of the guide image determine the occurrence probability of each pixel in the first image area in the guide image
  • the region entropy value is determined according to the occurrence probability of each pixel in the first image region.
  • the regional entropy value of the pixels in the first image region, that is, H_G(p') in formula (5), can be expressed as:

    H_G(p') = − Σ_{k=1}^{K} P_k · log(P_k)    (7)

  • where K represents the number of pixels in the first image area, and P_k represents the occurrence probability of the k-th pixel in the first image area.
  • the guide image may be processed to obtain a brightness histogram of the guide image, and the present disclosure does not limit the specific processing manner.
  • based on the brightness histogram, the brightness value of each pixel in the first image area, and the occurrence probability of that brightness value in the guide image, can be determined; the area entropy value of the first image area can then be calculated by formula (7).
  • the regional entropy value of the image region can be obtained for subsequent calculation of linear estimation parameters.
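A possible sketch of this regional entropy computation: per-pixel occurrence probabilities are read from the global brightness histogram, then summed per window as in formula (7). The bin count, edge padding, and epsilon are illustrative assumptions:

```python
import numpy as np

def local_entropy(gray, size, bins=256):
    """Regional entropy per window: H = -sum_k P_k * log(P_k), where P_k is the
    occurrence probability (from the global brightness histogram) of the k-th
    pixel's brightness value in the window."""
    # occurrence probability of each brightness level over the whole guide image
    hist, _ = np.histogram(gray, bins=bins, range=(0, bins))
    prob_of_level = hist / gray.size
    prob_img = prob_of_level[gray]          # per-pixel occurrence probability
    r = size // 2
    padded = np.pad(prob_img, r, mode="edge")  # border handling is an assumption
    H = np.zeros(gray.shape, dtype=float)
    for dy in range(size):
        for dx in range(size):
            p = padded[dy:dy + gray.shape[0], dx:dx + gray.shape[1]]
            H -= p * np.log(p + 1e-12)      # small epsilon guards against log(0)
    return H

gray = np.random.randint(0, 256, size=(16, 16))  # illustrative brightness image
H = local_entropy(gray, 3)
```

Smooth regions full of common brightness values accumulate large probability terms and hence high entropy, which is exactly the behavior the entropy-variance factor relies on to separate smooth areas from edges.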
  • the pixel values in the first image area and the pixel values in the second image area may be averaged respectively to obtain the first area mean value and the second area mean value in the area parameters.
  • the step of respectively determining the area parameters of the first image area and the second image area may include:
  • a third area mean of the pixels in the fusion area is determined.
  • the pixel values at the corresponding positions of the first image area and the second image area can be dot-multiplied to obtain a fusion area; the pixel values in the fusion area are averaged to obtain the third area mean value in the area parameters.
  • the regional mean values of the regions can be fused to realize the association between the guiding image and the image to be processed.
  • the linear estimation parameters include a first parameter a p' and a second parameter b p' .
  • the step of determining the linear estimation parameter of the image position according to the area parameter may include:
  • the first parameter of the image position is determined according to the regional variance value, the regional entropy value, the first area mean, the second area mean, and the third area mean; and the second parameter of the image position is determined based on the first area mean, the second area mean, and the first parameter.
  • in formulas (8) and (9), μ_G(p') represents the first area mean, μ_X(p') represents the second area mean, μ_{G∘X}(p') represents the third area mean, λ represents the regularization weight, and the operator ∘ represents the dot product:

    a_{p'} = ( μ_{G∘X}(p') − μ_G(p') · μ_X(p') ) / ( σ²_G(p') + λ / Γ_G(p') )    (8)
    b_{p'} = μ_X(p') − a_{p'} · μ_G(p')    (9)

  • that is, the factor Γ_G(p') can be obtained based on formula (4); according to the factor Γ_G(p'), the first area mean μ_G(p'), the second area mean μ_X(p'), and the third area mean μ_{G∘X}(p'), the first parameter a_{p'} of the image position can be determined based on formula (8); then, according to the first area mean μ_G(p'), the second area mean μ_X(p'), and the first parameter a_{p'}, the second parameter b_{p'} of the image position can be determined based on formula (9).
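At a single image position, formulas (8) and (9) reduce to a few scalar operations. In this sketch the adaptive factor Γ_G(p') (`gamma`) and the regularization weight (`lam`) are placeholder values, since the patent derives the factor from the regional variance and entropy of the guide window:

```python
import numpy as np

rng = np.random.default_rng(0)
G_win = rng.random((3, 3))        # first image area (guide image window)
X_win = rng.random((3, 3))        # second image area (to-be-processed window)

mu_G = G_win.mean()               # first area mean
mu_X = X_win.mean()               # second area mean
mu_GX = (G_win * X_win).mean()    # third area mean: mean of the dot-product fusion
var_G = G_win.var()               # regional variance of the guide window

gamma = 1.0                       # placeholder adaptive factor (assumption)
lam = 1e-2                        # placeholder regularization weight (assumption)

a = (mu_GX - mu_G * mu_X) / (var_G + lam / gamma)   # formula (8)
b = mu_X - a * mu_G                                  # formula (9)

# the linear estimate of formula (2) for this window (part of the structure map)
Z_win = a * G_win + b
```

A larger `gamma` (edge region) shrinks the regularization term, letting `a` stay close to the local covariance ratio and preserving the edge; a smaller `gamma` (smooth region) pulls `a` toward zero, smoothing noise.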
  • linear estimation parameters of any image position can be determined respectively, and the linear estimation parameters are associated with the regional variance value and the regional entropy value, which can improve the image processing effect.
  • the above processing can be applied to each image position in the image to obtain the linear estimation parameters of the entire image to be processed; in step S12, the structure diagram of the image to be processed can then be calculated based on formula (2) according to the guide image and the linear estimation parameters.
  • the image processing method according to the embodiments of the present disclosure may further include:
  • the to-be-processed image is decomposed to obtain a texture map of the to-be-processed image.
  • an edge protection filtering algorithm can be used to decompose the image to be processed into a base layer and a detail layer, and the detail layer can be used as a texture map of the image to be processed.
  • the image to be processed may also be directly filtered to remove information such as noise in the image to be processed, and to retain detailed texture information in the image to be processed, to obtain a texture map of the image to be processed.
  • the filtering method may be linear reducible filtering, and the present disclosure does not limit the specific filtering method.
  • after the texture map of the image to be processed is obtained, in step S13, the structure map of the image to be processed and the texture map may be fused to obtain an image processing result of the image to be processed.
  • the structure map and the texture map may be directly summed to obtain a processed image as an image processing result.
  • Different weights can also be set for the structure map and texture map according to the category of the image processing task (for example, in the image enhancement task, the weights of the structure map and texture map are set to 1 and 2, respectively).
  • the weighted summation of the structure map and the texture map is performed to obtain the processed image as the image processing result.
  • the present disclosure does not limit the fusion method of the structure map and the texture map.
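A minimal sketch of the fusion step, using the example weights 1 and 2 mentioned above as defaults; the function name and signature are illustrative, and with both weights set to 1 it reduces to the direct-summation case.

```python
def fuse(structure, texture, w_s=1.0, w_t=2.0):
    """Weighted fusion of a structure map and a texture map.

    w_s = w_t = 1 gives the direct summation case; the defaults
    1 and 2 match the image-enhancement example in the text.
    Works elementwise on NumPy arrays or on scalars.
    """
    return w_s * structure + w_t * texture
```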
  • an extended guided filtering method is provided, which can determine the linear estimation parameter of an image position based on the regional variance value and the regional entropy value of the guide image area corresponding to that image position;
  • an entropy-variance joint adaptive processing method is used to automatically adjust the weight of the regularization constraint term of the linear estimation parameters, so that the regularization weight can be adjusted to different degrees for different image structures, thereby overcoming the artifact problem caused by the fixed regularization weight in guided filtering.
  • the image processing method according to the embodiment of the present disclosure can be applied to the fields of artificial intelligence, image processing, machine vision, etc., to realize image processing such as image dehazing, dark light enhancement, contrast enhancement, tone mapping, high dynamic range HDR imaging, image stitching, etc. It can reduce the halo artifacts near the edge of the image and the over-smoothing problem on the details, and can better maintain the edge sharpness and improve the image processing effect.
  • the present disclosure also provides image processing apparatuses, electronic devices, computer-readable storage media, and programs, all of which can be used to implement any image processing method provided by the present disclosure.
  • FIG. 3 shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure. As shown in FIG. 3 , the apparatus includes:
  • the parameter determination module 31 is configured to determine linear estimation parameters of multiple image positions of the to-be-processed image according to the to-be-processed image and the guide image of the to-be-processed image;
  • the structure map acquisition module 32 is configured to obtain the structure map of the to-be-processed image according to the guide image and the linear estimation parameters of the multiple image positions;
  • the result determination module 33 is configured to fuse the structure map with the texture map of the to-be-processed image to obtain an image processing result of the to-be-processed image,
  • the linear estimation parameter of any image position in the image to be processed is associated with the regional variance value and the regional entropy value of the first image area in the guide image, and the first image area includes an image area in the guide image that is centered on the image position and has a preset size.
  • the parameter determination module 31 includes:
  • an area determination sub-module configured to determine, for any image position of the image to be processed, a first image area and a second image area corresponding to the image position, the second image area including an image area in the image to be processed that is centered on the image position and has a preset size;
  • a region parameter determination sub-module configured to respectively determine the area parameters of the first image area and the second image area, the area parameters including the regional variance value, the regional entropy value and the first area mean of the pixels in the first image area, the second area mean of the pixels in the second image area, and the third area mean of the fusion area of the first image area and the second image area;
  • the linear parameter determination submodule is configured to determine linear estimation parameters of the plurality of image positions according to the region parameters.
  • the region parameter determination submodule is configured to fuse the first image area and the second image area to obtain a fusion area, and to determine the third area mean of the pixels in the fusion area.
  • the region parameter determination submodule is configured to determine, according to a brightness histogram of the guide image, the occurrence probability of each pixel in the first image region in the guide image;
  • the region entropy value is determined according to the occurrence probability of each pixel in the first image region.
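The two steps just described can be sketched as follows. As the text specifies, the pixel probabilities come from the luminance histogram of the whole guide image, while the entropy is accumulated only over the pixels of the window (the first image area). The 256-bin histogram over an 8-bit value range, and the function name, are assumptions for illustration.

```python
import numpy as np

def region_entropy(region, guide, bins=256):
    """Regional entropy of a window of the guide image.

    Each pixel's occurrence probability is looked up in the
    luminance histogram of the entire guide image; the entropy
    sums -p * log2(p) over the pixels of the window.
    """
    hist, _ = np.histogram(guide, bins=bins, range=(0, bins))
    prob = hist / hist.sum()                     # global occurrence probabilities
    idx = np.clip(region.astype(int), 0, bins - 1)
    p = prob[idx].ravel()
    p = p[p > 0]                                 # guard against empty bins
    return float(-np.sum(p * np.log2(p)))
```

A window drawn from a perfectly flat guide image has zero entropy, while windows covering rarer luminance values score higher, which is what lets the factor react to image structure.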
  • the linear estimation parameter includes a first parameter and a second parameter;
  • the linear parameter determination sub-module is configured to determine the first parameter of the image position according to the regional variance value, the regional entropy value, the first area mean, the second area mean and the third area mean; and to determine the second parameter of the image position according to the first area mean, the second area mean and the first parameter.
  • the apparatus further includes: a linear filtering module configured to perform linear filtering on the to-be-processed image to obtain a guide image of the to-be-processed image.
  • the apparatus further includes: an image decomposition module configured to decompose the to-be-processed image to obtain a texture map of the to-be-processed image.
  • the functions or modules included in the apparatuses provided in the embodiments of the present disclosure may be used to execute the methods described in the above method embodiments.
  • Embodiments of the present disclosure further provide a computer-readable storage medium, on which computer program instructions are stored, and when the computer program instructions are executed by a processor, the foregoing method is implemented.
  • Computer-readable storage media can be volatile or non-volatile computer-readable storage media.
  • An embodiment of the present disclosure further provides an electronic device, including: a processor; a memory for storing instructions executable by the processor; wherein the processor is configured to invoke the instructions stored in the memory to execute the above method.
  • An embodiment of the present disclosure also provides a computer program, the computer program includes computer-readable codes, and when the computer-readable codes are executed in an electronic device, the processor of the electronic device executes the above method.
  • Embodiments of the present disclosure also provide another computer program product, including computer-readable codes, or a non-volatile computer-readable storage medium carrying computer-readable codes, when the computer-readable codes are processed in an electronic device When running in the device, the processor in the electronic device executes the above method.
  • the electronic device may be provided as a terminal, server or other form of device.
  • FIG. 4 shows a block diagram of an electronic device 800 according to an embodiment of the present disclosure.
  • the electronic device 800 may be a terminal such as a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, fitness device, or personal digital assistant.
  • an electronic device 800 may include one or more of the following components: a processing component 802, a memory 804, a power supply component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812 , sensor component 814 , and communication component 816 .
  • the processing component 802 generally controls the overall operation of the electronic device 800, such as operations associated with display, phone calls, data communications, camera operations, and recording operations.
  • the processing component 802 can include one or more processors 820 to execute instructions to perform all or some of the steps of the methods described above.
  • processing component 802 may include one or more modules that facilitate interaction between processing component 802 and other components.
  • processing component 802 may include a multimedia module to facilitate interaction between multimedia component 808 and processing component 802.
  • Memory 804 is configured to store various types of data to support operation at electronic device 800 . Examples of such data include instructions for any application or method operating on electronic device 800, contact data, phonebook data, messages, pictures, videos, and the like.
  • the memory 804 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as Static Random-Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk.
  • Power supply assembly 806 provides power to various components of electronic device 800 .
  • Power supply components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power to electronic device 800 .
  • Multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and the user.
  • the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user.
  • the touch panel includes one or more touch sensors to sense touch, swipe, and gestures on the touch panel. The touch sensor may not only sense the boundaries of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe action.
  • the multimedia component 808 includes a front-facing camera and/or a rear-facing camera. When the electronic device 800 is in an operation mode, such as a shooting mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each of the front and rear cameras can be a fixed optical lens system or have focal length and optical zoom capability.
  • Audio component 810 is configured to output and/or input audio signals.
  • the audio component 810 includes a microphone (Microphone, MIC) configured to receive external audio signals when the electronic device 800 is in an operating mode, such as a calling mode, a recording mode, and a voice recognition mode.
  • the received audio signal may be further stored in memory 804 or transmitted via communication component 816 .
  • the audio component 810 further includes a speaker for outputting audio signals.
  • the I/O interface 812 provides an interface between the processing component 802 and a peripheral interface module, which may be a keyboard, a click wheel, a button, or the like. These buttons may include, but are not limited to: home button, volume buttons, start button, and lock button.
  • Sensor assembly 814 includes one or more sensors for providing status assessment of various aspects of electronic device 800 .
  • the sensor assembly 814 can detect the on/off state of the electronic device 800 and the relative positioning of components, such as the display and keypad of the electronic device 800; the sensor assembly 814 can also detect a change in position of the electronic device 800 or one of its components, the presence or absence of user contact with the electronic device 800, the orientation or acceleration/deceleration of the electronic device 800, and changes in the temperature of the electronic device 800.
  • Sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects in the absence of any physical contact.
  • Sensor assembly 814 may also include a light sensor, such as a Complementary Metal Oxide Semiconductor (CMOS) or Charge-Coupled Device (CCD) image sensor, for use in imaging applications.
  • the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • Communication component 816 is configured to facilitate wired or wireless communication between electronic device 800 and other devices.
  • the electronic device 800 can access a wireless network based on a communication standard, such as Wireless Fidelity (Wi-Fi), second-generation mobile communication technology (2G) or third-generation mobile communication technology (3G), or a combination thereof.
  • the communication component 816 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel.
  • the communication component 816 also includes a Near Field Communication (NFC) module to facilitate short-range communication.
  • the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wide Band (UWB) technology, Bluetooth (BT) technology and other technologies.
  • the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above method.
  • a non-volatile computer-readable storage medium such as a memory 804 comprising computer program instructions executable by the processor 820 of the electronic device 800 to perform the above method is also provided.
  • FIG. 5 shows a block diagram of an electronic device 1900 according to an embodiment of the present disclosure.
  • the electronic device 1900 may be provided as a server.
  • the electronic device 1900 includes a processing component 1922, which may include one or more processors, and a memory resource, represented by memory 1932, for storing instructions, such as applications, executable by the processing component 1922.
  • An application program stored in memory 1932 may include one or more modules, each corresponding to a set of instructions.
  • the processing component 1922 is configured to execute instructions to perform the above-described methods.
  • the electronic device 1900 may also include a power supply assembly 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an I/O interface 1958.
  • the electronic device 1900 can operate based on an operating system stored in the memory 1932, such as the Microsoft server operating system (Windows Server™), Apple's graphical-user-interface-based operating system (Mac OS X™), the multi-user multi-process computer operating system (Unix™), the free and open-source Unix-like operating system (Linux™), the open-source Unix-like operating system (FreeBSD™), or the like.
  • a non-volatile computer-readable storage medium such as memory 1932 comprising computer program instructions executable by processing component 1922 of electronic device 1900 to perform the above-described method.
  • the present disclosure may be a system, method and/or computer program product.
  • the computer program product may include a computer-readable storage medium having computer-readable program instructions loaded thereon for causing a processor to implement various aspects of the present disclosure.
  • a computer-readable storage medium may be a tangible device that can hold and store instructions for use by the instruction execution device.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • Computer-readable storage media include: portable computer disks, hard disks, random access memory (RAM), ROM, EPROM or flash memory, SRAM, portable compact disc read-only memory (CD-ROM), digital video discs (DVD), memory sticks, floppy disks, mechanical encoding devices such as punched cards or raised structures in grooves on which instructions are stored, and any suitable combination of the above.
  • Computer-readable storage media, as used herein, are not to be construed as transient signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through waveguides or other transmission media (e.g., light pulses through fiber-optic cables), or electrical signals transmitted through electrical wires.
  • the computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to the respective computing/processing devices, or downloaded to an external computer or external storage device over a network such as the Internet, a local area network, a wide area network, and/or a wireless network.
  • the network may include copper transmission cables, fiber-optic transmission, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards them for storage in a computer-readable storage medium in the respective computing/processing device.
  • Computer program instructions for carrying out the operations of the present disclosure may be assembly instructions, Instruction Set Architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" language or similar programming languages.
  • the computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or it can be connected to an external computer (e.g., through the Internet using an Internet service provider).
  • in some embodiments, electronic circuits such as programmable logic circuits, FPGAs, or Programmable Logic Arrays (PLAs) can be personalized by utilizing state information of the computer-readable program instructions, and these electronic circuits can execute the computer-readable program instructions to implement various aspects of the present disclosure.
  • These computer-readable program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • These computer-readable program instructions may also be stored in a computer-readable storage medium; these instructions cause a computer, programmable data processing apparatus, and/or other equipment to operate in a specific manner, so that the computer-readable medium storing the instructions comprises an article of manufacture including instructions that implement various aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • Computer-readable program instructions can also be loaded onto a computer, other programmable data processing apparatus, or other equipment, so that a series of operational steps are performed on the computer, other programmable apparatus, or other equipment to produce a computer-implemented process, whereby the instructions executing on the computer, other programmable apparatus, or other device implement the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • each block in the flowcharts or block diagrams may represent a module, segment, or portion of instructions that comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by dedicated hardware-based systems that perform the specified functions or actions, or by a combination of dedicated hardware and computer instructions.
  • the computer program product can be specifically implemented by hardware, software or a combination thereof.
  • in one optional embodiment, the computer program product is embodied as a computer storage medium; in another optional embodiment, the computer program product is embodied as a software product, such as a Software Development Kit (SDK).
  • Embodiments of the present disclosure relate to an image processing method, apparatus, device, storage medium, and program.
  • the method includes: determining linear estimation parameters of multiple image positions of the to-be-processed image according to the to-be-processed image and a guide image of the to-be-processed image; obtaining a structure map of the to-be-processed image according to the guide image and the linear estimation parameters of the multiple image positions; and fusing the structure map with a texture map of the to-be-processed image to obtain an image processing result of the to-be-processed image, wherein the linear estimation parameter of any image position in the to-be-processed image is associated with the regional variance value and the regional entropy value of a first image area in the guide image, and the first image area includes an image area in the guide image that is centered on the image position and has a preset size.
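Putting the pieces together, the structure-map computation can be sketched end to end under standard guided-filter assumptions: window means implemented with a box filter stand in for the first, second and third area means, and an all-ones adaptive factor is used when Γ is not supplied. Variable names, the box-filter choice and the default `lam` are illustrative, since formula (2) and the patent's exact statistics are not reproduced here.

```python
import numpy as np

def box_filter(img, r):
    """Mean filter over a (2r+1) x (2r+1) window with edge padding."""
    pad = np.pad(img.astype(float), r, mode='edge')
    h, w = img.shape
    k = 2 * r + 1
    out = np.zeros((h, w))
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + h, dx:dx + w]
    return out / (k * k)

def structure_map(I, G, r=2, lam=1e-2, gamma=None):
    """Structure map of to-be-processed image I under guide image G.

    gamma stands in for the adaptive entropy/variance factor and
    defaults to all ones; how it enters the regularization term is
    an assumption, as are the parameter names and defaults.
    """
    if gamma is None:
        gamma = np.ones_like(I, dtype=float)
    mu_g = box_filter(G, r)              # first area means (guide windows)
    mu_i = box_filter(I, r)              # second area means (input windows)
    mu_gi = box_filter(G * I, r)         # third area means (fused G o I windows)
    var_g = box_filter(G * G, r) - mu_g ** 2
    a = (mu_gi - mu_g * mu_i) / (var_g + lam / gamma)
    b = mu_i - a * mu_g
    # smooth the per-window parameters, then apply the linear model
    return box_filter(a, r) * G + box_filter(b, r)
```

With a very small regularization weight and the image used as its own guide, the output closely reproduces the input, which is the expected edge-preserving behavior of this family of filters.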


Abstract

An image processing method and apparatus, and a device, a storage medium and a program. The method comprises: according to an image to be processed and a guide image of the image to be processed, determining linear estimation parameters of a plurality of image positions of the image to be processed (S11); according to the guide image and the linear estimation parameters of the plurality of image positions, obtaining a structure map of the image to be processed (S12); and fusing the structure map with a texture map of the image to be processed, so as to obtain an image processing result of the image to be processed (S13), wherein a linear estimation parameter of any image position in the image to be processed is associated with a regional variance value and a regional entropy value of a first image region in the guide image, and the first image region comprises an image region in the guide image, which region takes the image position as the center and has a preset size. By means of the method, an image processing effect can be improved.

Description

Image processing method, apparatus, device, storage medium and program

CROSS-REFERENCE TO RELATED APPLICATIONS

This patent application claims priority to Chinese patent application No. 202110469110.2, filed on April 28, 2021 by the applicant Shanghai SenseTime Intelligent Technology Co., Ltd. and entitled "Image processing method and apparatus, electronic device and storage medium", the entirety of which is incorporated into this application by reference.
Technical Field

The present disclosure relates to the field of computer vision technology, and in particular, to an image processing method, apparatus, device, storage medium, and program.

Background

In various image processing tasks and computer vision tasks (such as image dehazing, dark light enhancement, and image stitching), it is often necessary to filter images to achieve the corresponding processing. Among these techniques, edge protection filtering is one of the most commonly used image filtering techniques.

At present, edge protection filtering has two approaches: local filtering and global optimization. Global optimization achieves a better processing effect, but its computational complexity is too high and its efficiency is low; local filtering is simple to compute, but it is prone to problems such as artifacts, which degrade the processing effect.
SUMMARY OF THE INVENTION

The embodiments of the present disclosure propose a technical solution for image processing.

An embodiment of the present disclosure provides an image processing method, the method being executed by an electronic device and including:

determining linear estimation parameters of multiple image positions of the to-be-processed image according to the to-be-processed image and a guide image of the to-be-processed image;

obtaining a structure map of the to-be-processed image according to the guide image and the linear estimation parameters of the multiple image positions; and

fusing the structure map with the texture map of the to-be-processed image to obtain an image processing result of the to-be-processed image,

wherein the linear estimation parameter of any image position in the to-be-processed image is associated with the regional variance value and the regional entropy value of a first image area in the guide image, and the first image area includes an image area in the guide image that is centered on the image position and has a preset size.
In some embodiments of the present disclosure, determining the linear estimation parameters of multiple image positions of the to-be-processed image according to the to-be-processed image and the guide image of the to-be-processed image includes: for any image position of the to-be-processed image, determining a first image area and a second image area corresponding to the image position, the second image area including an image area in the to-be-processed image that is centered on the image position and has a preset size; respectively determining the area parameters of the first image area and the second image area, the area parameters including the regional variance value, the regional entropy value and the first area mean of the pixels in the first image area, the second area mean of the pixels in the second image area, and the third area mean of the fusion area of the first image area and the second image area; and determining the linear estimation parameters of the multiple image positions according to the area parameters. In this way, the linear estimation parameters of an image position can be determined based on the area parameters of the guide image area corresponding to that image position.

In some embodiments of the present disclosure, respectively determining the area parameters of the first image area and the second image area includes: fusing the first image area and the second image area to obtain a fusion area; and determining the third area mean of the pixels in the fusion area. In this way, the area means of the regions can be fused to realize the association between the guide image and the to-be-processed image.

In some embodiments of the present disclosure, respectively determining the area parameters of the first image area and the second image area includes: determining, according to the luminance histogram of the guide image, the occurrence probability of each pixel in the first image area in the guide image; and determining the regional entropy value according to the occurrence probability of each pixel in the first image area. In this way, the regional entropy value of the image area can be obtained for subsequent calculation of the linear estimation parameters.

In some embodiments of the present disclosure, the linear estimation parameters include a first parameter and a second parameter, and determining the linear estimation parameters of the multiple image positions according to the area parameters includes: determining the first parameter of the image position according to the regional variance value, the regional entropy value, the first area mean, the second area mean and the third area mean; and determining the second parameter of the image position according to the first area mean, the second area mean and the first parameter. In this way, two linear estimation parameters of any image position can be determined respectively; since the linear estimation parameters are associated with the regional variance value and the regional entropy value, the image processing effect can be improved.
在本公开的一些实施例中,所述方法还包括:对所述待处理图像进行线性滤波,得到所述待处理图像的引导图像。如此,引导图像中滤除了待处理图像中的噪声等信息,使得图像中的信息更为平滑,以便引导待处理图像的后续处理。In some embodiments of the present disclosure, the method further includes: performing linear filtering on the to-be-processed image to obtain a guide image of the to-be-processed image. In this way, information such as noise in the image to be processed is filtered out in the guide image, so that the information in the image is smoother, so as to guide the subsequent processing of the image to be processed.
在本公开的一些实施例中,所述方法还包括:对所述待处理图像进行分解,得到所述待处理图像的纹理图。如此,可采用边缘保护滤波算法将待处理图像分解成基础层和细节层,将细节层作为待处理图像的纹理图,以便后续采用该纹理图得到图像处理结果。In some embodiments of the present disclosure, the method further includes: decomposing the image to be processed to obtain a texture map of the image to be processed. In this way, an edge protection filtering algorithm can be used to decompose the to-be-processed image into a base layer and a detail layer, and the detail layer can be used as the texture map of the to-be-processed image, so that the image processing result can be obtained by using the texture map subsequently.
For descriptions of the effects of the following apparatuses, electronic devices, etc., refer to the description of the above image processing method.
Embodiments of the present disclosure provide an image processing apparatus, including:
a parameter determination module, configured to determine linear estimation parameters of a plurality of image positions of an image to be processed according to the image to be processed and a guide image of the image to be processed;
a structure map acquisition module, configured to obtain a structure map of the image to be processed according to the guide image and the linear estimation parameters of the plurality of image positions; and
a result determination module, configured to fuse the structure map with a texture map of the image to be processed to obtain an image processing result of the image to be processed,
wherein the linear estimation parameter of any image position in the image to be processed is associated with a regional variance value and a regional entropy value of a first image area in the guide image, and the first image area includes an image area of a preset size centered on the image position in the guide image.
In some embodiments of the present disclosure, the parameter determination module includes:
an area determination submodule, configured to determine, for any image position of the image to be processed, a first image area and a second image area corresponding to the image position, the second image area including an image area of a preset size centered on the image position in the image to be processed;
an area parameter determination submodule, configured to determine area parameters of the first image area and the second image area respectively, the area parameters including a regional variance value, a regional entropy value and a first area mean of the pixels in the first image area, a second area mean of the pixels in the second image area, and a third area mean of a fused area of the first image area and the second image area; and
a linear parameter determination submodule, configured to determine the linear estimation parameters of the plurality of image positions according to the area parameters.
In some embodiments of the present disclosure, the area parameter determination submodule is configured to fuse the first image area and the second image area to obtain a fused area, and determine the third area mean of the pixels in the fused area.
In some embodiments of the present disclosure, the area parameter determination submodule is configured to determine, according to a luminance histogram of the guide image, an occurrence probability of each pixel in the first image area within the guide image, and determine the regional entropy value according to the occurrence probability of each pixel in the first image area.
In some embodiments of the present disclosure, the linear estimation parameters include a first parameter and a second parameter, and the linear parameter determination submodule is configured to determine the first parameter of the image position according to the regional variance value, the regional entropy value, the first area mean, the second area mean and the third area mean, and determine the second parameter of the image position according to the first area mean, the second area mean and the first parameter.
In some embodiments of the present disclosure, the apparatus further includes: a linear filtering module, configured to perform linear filtering on the image to be processed to obtain a guide image of the image to be processed.
In some embodiments of the present disclosure, the apparatus further includes: an image decomposition module, configured to decompose the image to be processed to obtain a texture map of the image to be processed.
An embodiment of the present disclosure provides an electronic device, including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to invoke the instructions stored in the memory to execute the above method.
Embodiments of the present disclosure provide a computer-readable storage medium on which computer program instructions are stored, and when the computer program instructions are executed by a processor, the above method is implemented.
An embodiment of the present disclosure further provides a computer program, the computer program including computer-readable code, and when the computer-readable code runs in an electronic device, a processor of the electronic device executes the above method.
In the embodiments of the present disclosure, the linear estimation parameters can be determined according to the image to be processed and the guide image; the structure map of the image to be processed can be obtained according to the guide image and the linear estimation parameters; and the image processing result can be obtained by fusing the structure map with the texture map. By associating the linear estimation parameters with the regional variance value and the regional entropy value, the sharpness of image edges can be better maintained and the image processing effect improved.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure. Other features and aspects of the present disclosure will become clear from the following detailed description of exemplary embodiments with reference to the accompanying drawings.
Description of the Drawings
The accompanying drawings here are incorporated into and constitute a part of this specification. These drawings show embodiments consistent with the present disclosure and, together with the specification, serve to explain the technical solutions of the present disclosure.
FIG. 1 shows a flowchart of an image processing method according to an embodiment of the present disclosure;
FIG. 2 shows a schematic diagram of a system architecture to which the image processing method of an embodiment of the present disclosure can be applied;
FIG. 3 shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure;
FIG. 4 shows a block diagram of an electronic device according to an embodiment of the present disclosure;
FIG. 5 shows a block diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Various exemplary embodiments, features and aspects of the present disclosure will be described in detail below with reference to the accompanying drawings. The same reference numerals in the drawings denote elements with the same or similar functions. Although various aspects of the embodiments are shown in the drawings, the drawings are not necessarily drawn to scale unless otherwise indicated.
The word "exemplary" is used exclusively herein to mean "serving as an example, embodiment or illustration". Any embodiment described herein as "exemplary" is not necessarily to be construed as superior to or better than other embodiments.
The term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may indicate three cases: A exists alone, both A and B exist, and B exists alone. In addition, the term "at least one" herein indicates any one of a plurality, or any combination of at least two of a plurality; for example, including at least one of A, B and C may indicate including any one or more elements selected from the set consisting of A, B and C.
In the embodiments of the present disclosure, "a plurality of" or "multiple kinds of" may refer to at least two or at least two kinds, respectively.
In addition, numerous specific details are given in the following detailed description in order to better illustrate the present disclosure. Those skilled in the art should understand that the present disclosure can also be implemented without certain specific details. In some embodiments, methods, means, elements and circuits well known to those skilled in the art are not described in detail, so as to highlight the gist of the present disclosure.
Guided filtering is a local filtering algorithm in edge-preserving filtering technology. Its basic idea is that each pixel in an image can be fitted with a local M×M window (M is an integer greater than 1, for example 3, 5, 7, etc.) and a linear regression model, and a regularization constraint term is added to the linear regression model to reduce the overfitting problem. However, the weight of this regularization constraint term is fixed, which easily produces artifacts in edge regions.
According to the image processing method of the embodiments of the present disclosure, the linear estimation parameters of an image position can be determined based on the regional variance value and the regional entropy value of the guide-image region corresponding to that image position, and the weight of the regularization constraint term can be automatically adjusted through an entropy-variance joint adaptive processing manner, thereby reducing the artifacts that may appear in the edge regions of the image and improving the image processing effect.
FIG. 1 shows a flowchart of an image processing method according to an embodiment of the present disclosure. As shown in FIG. 1, the image processing method includes:
In step S11, linear estimation parameters of a plurality of image positions of an image to be processed are determined according to the image to be processed and a guide image of the image to be processed;
In step S12, a structure map of the image to be processed is obtained according to the guide image and the linear estimation parameters of the plurality of image positions;
In step S13, the structure map is fused with a texture map of the image to be processed to obtain an image processing result of the image to be processed,
wherein the linear estimation parameter of any image position in the image to be processed is associated with a regional variance value and a regional entropy value of a first image area in the guide image, and the first image area includes an image area of a preset size centered on the image position in the guide image.
In some embodiments of the present disclosure, the image processing method may be executed by an electronic device such as a terminal device or a server. The terminal device may be user equipment (UE), a mobile device, a user terminal, a terminal, a cellular phone, a cordless phone, a personal digital assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, etc. The method may be implemented by a processor invoking computer-readable instructions stored in a memory. Alternatively, the method may be executed by a server.
In some embodiments of the present disclosure, the image to be processed may be any image, for example a scene image collected by an image acquisition device or an image downloaded from a network. The image processing task for the image to be processed may be of any category, for example image dehazing, dark-light enhancement, contrast enhancement, tone mapping, high dynamic range (HDR) imaging, etc. The present disclosure does not limit the type or source of the image to be processed, nor the specific category of the image processing task.
In some embodiments of the present disclosure, an edge-preserving filtering algorithm may be used to decompose the image to be processed into two layers, called a base layer and a detail layer. The base layer includes the overall structural information of the image (for example, the position and layout of objects in the image), and the detail layer includes the texture detail information of the image (for example, the texture and orientation of objects in the image).
In this case, the image to be processed X(p) can be expressed as:
X(p) = Z(p) + D(p)         Formula (1);
In formula (1), Z(p) denotes the base layer; D(p) denotes the detail layer; and p denotes any image position in the image. When guided filtering is used for image processing, the base layer can be estimated through a linear transformation of the guide image:
Ẑ(p) = a_p'·G(p) + b_p'         Formula (2);
In formula (2), Ẑ(p) denotes the linear-transformation estimate of the base layer; G(p) denotes the guide image; and a_p' and b_p' denote the linear estimation parameters of the linear regression model at image position p.
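The decomposition of formula (1) can be sketched as follows. This is a minimal illustration, not the patent's method: a plain box mean stands in for the edge-preserving filter, and the function names are invented.

```python
import numpy as np

def mean_filter(x, size=7):
    # Box mean over a size x size window, with edge-replicated borders.
    r = size // 2
    xp = np.pad(x.astype(np.float64), r, mode='edge')
    out = np.zeros(x.shape, dtype=np.float64)
    for dy in range(size):
        for dx in range(size):
            out += xp[dy:dy + x.shape[0], dx:dx + x.shape[1]]
    return out / (size * size)

def decompose(x, size=7):
    base = mean_filter(x, size)   # Z(p): overall structure (base layer)
    detail = x - base             # D(p): texture details, so X = Z + D holds
    return base, detail
```

Reconstructing the input as `base + detail` is exact by construction, which is what makes the later fusion of structure map and texture map meaningful.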
In some embodiments of the present disclosure, the guide image is used to filter out a target component in the image to be processed (for example, to filter out noise in the image to be processed). Linear filtering (for example, box filtering, mean filtering or Gaussian filtering) may be performed on the image to be processed to obtain the guide image; an image corresponding to the image processing task (for example, an image that has undergone dark-light enhancement) may also be selected as the guide image based on the category of the image processing task; or the image to be processed itself may be used as the guide image. The present disclosure does not limit the source or processing manner of the guide image.
In some embodiments of the present disclosure, in step S11, the linear estimation parameters at each image position of the image to be processed may be determined according to the image to be processed and the guide image. The linear estimation parameters a_p' and b_p' of image position p can be obtained by minimizing the following linear regression cost:
E(a_p', b_p') = Σ_{p'∈Ω_ζ1(p')} [ (a_p'·G(p') + b_p' − X(p'))² + (λ/Γ_G(p'))·a_p'² ]         Formula (3);
In formula (3), Ω_ζ1(p') denotes a rectangular window centered on image position p with side length ζ1 (for example 3, 5, 7, 11, 17, etc.); the rectangular window includes K = ζ1 × ζ1 image positions, and p' denotes any pixel in the rectangular window; (λ/Γ_G(p'))·a_p'² denotes a regularization constraint term used to prevent the overfitting problem; λ/Γ_G(p') denotes the weight of the regularization constraint term, where λ denotes the regularization coefficient (for example 0.1, 0.01, etc.); and Γ_G(p') may be called the "entropy-variance joint adaptive factor", which is used to automatically adjust the magnitude of the regularization coefficient.
In some embodiments of the present disclosure, Γ_G(p') can be expressed as:
Γ_G(p') = (1/N)·Σ_{i=1}^{N} χ_G(p')/χ_G(i)         Formula (4);
In formula (4), χ_G(p') denotes the cross entropy-variance factor of the rectangular window Ω_ζ1(p') in the guide image; N denotes the number of pixels in the guide image; i denotes any position in the guide image; and χ_G(i) denotes the cross entropy-variance factor of the rectangular window corresponding to position i in the guide image. As can be seen from formula (4), Γ_G(p') is the normalized result of χ_G(p').
In some embodiments of the present disclosure, the cross entropy-variance factor χ_G(p') can be expressed as:
χ_G(p') = σ²_{G,ζ1}(p')·H_{G,ζ1}(p')         Formula (5);
In formula (5), σ²_{G,ζ1}(p') denotes the regional variance value (also called the local variance value) of the pixels within the rectangular window Ω_ζ1(p') in the guide image; and H_{G,ζ1}(p') denotes the regional entropy value (also called the local entropy value) of the pixels within the rectangular window Ω_ζ1(p') in the guide image.
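One possible reading of formulas (4) and (5) in code follows. The window size, the histogram-based entropy estimate, and the small stabilizing constant `eps` are assumptions of this sketch rather than details fixed by the patent:

```python
import numpy as np

def _window_mean(x, size):
    # Mean over a size x size window, edge-replicated borders.
    r = size // 2
    xp = np.pad(x, r, mode='edge')
    out = np.zeros_like(x)
    for dy in range(size):
        for dx in range(size):
            out += xp[dy:dy + x.shape[0], dx:dx + x.shape[1]]
    return out / (size * size)

def adaptive_factor(g, size=3, bins=256, eps=1e-6):
    g = g.astype(np.float64)
    # Regional variance per window: E[G^2] - (E[G])^2.
    m = _window_mean(g, size)
    var = np.maximum(_window_mean(g * g, size) - m * m, 0.0)
    # Regional entropy per window: per-pixel occurrence probability from
    # the luminance histogram, Shannon terms averaged over the window.
    hist, _ = np.histogram(g, bins=bins, range=(0.0, 1.0))
    prob = hist / g.size
    p = prob[np.clip((g * bins).astype(int), 0, bins - 1)]
    ent = _window_mean(-p * np.log2(p + 1e-12), size)
    # Formula (5): cross entropy-variance factor; formula (4): normalize
    # against the factor at all N positions.
    chi = (var + eps) * (ent + eps)
    return chi * np.mean(1.0 / chi)
```

Windows with a large variance-entropy product (edges) receive a factor above the image-wide average, which shrinks the regularization weight λ/Γ there.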
By substituting formulas (4) and (5) into formula (3), the linear estimation parameters a_p' and b_p' of image position p can be solved. The obtained linear estimation parameters a_p' and b_p' are associated with the regional variance value and the regional entropy value of the rectangular window Ω_ζ1(p') in the guide image (which may be called the first image area). The first image area includes an image area of a preset size (for example 3×3, 5×5, etc.) centered on image position p in the guide image.
For a region near an image edge, the regional variance value is usually much larger than the regional entropy value; for a smooth region inside the image (for example a texture detail region or a noisy background region), the regional entropy value is usually larger than the regional variance value. Therefore, the factor Γ_G(p') in an edge region is larger than the factor Γ_G(p') in a smooth region. In this case, halo artifacts near edges and over-smoothing of details are reduced, so that edge sharpness can be better maintained and the image processing effect improved.
In some embodiments of the present disclosure, in step S12, the structure map of the image to be processed can be obtained according to the guide image and the linear estimation parameters of the plurality of image positions. That is, after the linear estimation parameters a_p' and b_p' of image position p are obtained, the linear estimation result of image position p can be calculated according to formula (2). By processing each image position separately, the linear estimation result of the entire image to be processed, that is, the base layer of the image to be processed, can be obtained. This base layer includes the overall structural information of the image and may be called the structure map of the image to be processed.
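The step S12 computation can be sketched with the standard guided-filter closed form. The box-mean window solution and the final averaging of the per-window coefficients are conventions borrowed from ordinary guided filtering, not details fixed by the patent; `gamma` stands for the adaptive factor of formula (4), taken here as a given array:

```python
import numpy as np

def _window_mean(x, size):
    # Mean over a size x size window, edge-replicated borders.
    r = size // 2
    xp = np.pad(x, r, mode='edge')
    out = np.zeros_like(x)
    for dy in range(size):
        for dx in range(size):
            out += xp[dy:dy + x.shape[0], dx:dx + x.shape[1]]
    return out / (size * size)

def structure_map(x, g, gamma, size=3, lam=0.01):
    x = x.astype(np.float64)
    g = g.astype(np.float64)
    mg, mx = _window_mean(g, size), _window_mean(x, size)
    cov = _window_mean(g * x, size) - mg * mx   # covariance of G and X
    var = _window_mean(g * g, size) - mg * mg   # variance of G
    a = cov / (var + lam / gamma)               # first parameter a_p'
    b = mx - a * mg                             # second parameter b_p'
    # Average the per-window parameters, then apply formula (2).
    return _window_mean(a, size) * g + _window_mean(b, size)
```

Where `gamma` is large (edge windows), the weight λ/Γ shrinks, so `a` stays near the unregularized least-squares slope and edges are kept; where it is small (smooth windows), `a` is pulled toward zero, that is, toward plain averaging.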
As described above, an edge-preserving filtering algorithm may be used to decompose the image to be processed into a base layer and a detail layer. The detail layer includes the texture detail information of the image and may be called the texture map of the image to be processed.
In some embodiments of the present disclosure, in step S13, the structure map of the image to be processed may be fused with the texture map to obtain the image processing result of the image to be processed.
In some embodiments of the present disclosure, the structure map and the texture map may be directly summed to obtain the processed image as the image processing result.
In some embodiments of the present disclosure, different weights may also be set for the structure map and the texture map according to the category of the image processing task (for example, in an image enhancement task, the weights of the structure map and the texture map may be set to 1 and 2 respectively, so as to enhance the texture details of the image). The structure map and the texture map are then weighted and summed to obtain the processed image as the image processing result. The present disclosure does not limit the fusion manner of the structure map and the texture map.
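Both fusion variants described above reduce to a weighted sum; the weights 1 and 2 below simply echo the enhancement example in the text, and the clipping to a [0, 1] display range is an added assumption of this sketch:

```python
import numpy as np

def fuse(structure, detail, w_structure=1.0, w_detail=2.0):
    # Weighted sum of structure map and texture (detail) map; with both
    # weights set to 1.0 this reduces to the direct sum.
    out = w_structure * structure + w_detail * detail
    return np.clip(out, 0.0, 1.0)  # keep the result in a valid display range
```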
The entropy-variance joint adaptive factor introduced in the embodiments of the present disclosure can better distinguish the edge regions of an image from the smooth regions inside the image, and also has better noise resistance. Therefore, the image processing result obtained in step S13 can both retain the detail information of the original image and reduce halo artifacts near the image edges, improving the sharpness of the image edges.
For example, when applied to image enhancement tasks such as contrast enhancement, the image processing result obtained in step S13 has better visual quality and quantitative performance.
According to the embodiments of the present disclosure, the linear estimation parameters can be determined according to the image to be processed and the guide image; the structure map of the image to be processed can be obtained according to the guide image and the linear estimation parameters; and the image processing result can be obtained by fusing the structure map with the texture map. By associating the linear estimation parameters with the regional variance value and the regional entropy value, the sharpness of image edges can be better maintained, thereby improving the image processing effect.
FIG. 2 shows a schematic diagram of a system architecture to which the image processing method of the embodiments of the present disclosure can be applied. As shown in FIG. 2, the system architecture includes an image acquisition terminal 201, a network 202 and a processing terminal 203. To support an exemplary application, the image acquisition terminal 201 and the processing terminal 203 establish a communication connection through the network 202, and the image acquisition terminal 201 reports the image to be processed to the processing terminal 203 through the network 202. In response to the image to be processed, the processing terminal 203 first determines linear estimation parameters of a plurality of image positions of the image to be processed according to the image to be processed and a guide image of the image to be processed; secondly, it obtains a structure map of the image to be processed according to the guide image and the linear estimation parameters of the plurality of image positions; and then it fuses the structure map with a texture map of the image to be processed to obtain an image processing result of the image to be processed. Finally, the processing terminal 203 uploads the image processing result to the network 202 and sends it to the image acquisition terminal 201 through the network 202.
As an example, the image acquisition terminal 201 may include an image acquisition device, and the processing terminal 203 may include a vision processing device with visual information processing capability or a remote server. The network 202 may use a wired or wireless connection. When the processing terminal 203 is a vision processing device, the image acquisition terminal 201 may be communicatively connected to the vision processing device through a wired connection, for example performing data communication through a bus; when the processing terminal 203 is a remote server, the image acquisition terminal 201 may exchange data with the remote server through a wireless network.
Alternatively, in some scenarios, the image acquisition terminal 201 may be a vision processing device with a video acquisition module, or a host with a camera. In this case, the image processing method of the embodiments of the present disclosure may be executed by the image acquisition terminal 201, and the above system architecture need not include the network 202 and the processing terminal 203.
The image processing method according to the embodiments of the present disclosure is described in detail below.
In some embodiments of the present disclosure, before step S11, the image processing method according to the embodiments of the present disclosure may further include: performing linear filtering on the image to be processed to obtain a guide image of the image to be processed.
That is, the guide image of the image to be processed may be obtained by means of linear filtering. Information such as noise in the image to be processed is filtered out of the guide image, making the information in the image smoother, so as to guide the subsequent processing of the image to be processed.
The linear filtering manner may be, for example, box filtering, mean filtering, Gaussian filtering, etc., which is not limited in the present disclosure.
In step S11, the linear estimation parameters of a plurality of image positions of the image to be processed may be determined according to the image to be processed and the guide image. In some embodiments of the present disclosure, step S11 may include:
for any image position of the image to be processed, determining a first image area and a second image area corresponding to the image position, the second image area including an image area of a preset size centered on the image position in the image to be processed;
determining area parameters of the first image area and the second image area respectively, the area parameters including a regional variance value, a regional entropy value and a first area mean of the pixels in the first image area, a second area mean of the pixels in the second image area, and a third area mean of a fused area of the first image area and the second image area; and
determining the linear estimation parameters of the image position according to the area parameters.
In some embodiments of the present disclosure, for any image position in the to-be-processed image, a window area corresponding to the image position may be determined first, including a first image area and a second image area. The first image area includes an image area of a preset size centred on the image position in the guide image; the second image area includes an image area of the same preset size centred on the image position in the to-be-processed image. The preset size may be set to, for example, 3×3, 5×5, 7×7, 11×11, 17×17, etc.; the present disclosure does not limit the specific value of the preset size.
In some embodiments of the present disclosure, in a case where the image position is near an image edge, if the first image area or the second image area exceeds the range of the guide image or the to-be-processed image, the part beyond the image range may be zero-padded.
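A minimal sketch of extracting such a window with zero padding beyond the image range, assuming NumPy; the helper name `window_at` is illustrative:

```python
import numpy as np

def window_at(image, row, col, size):
    """Extract the size x size window centred at (row, col); out-of-range cells are zero-padded."""
    r = size // 2
    padded = np.pad(image, r, mode="constant")
    # In padded coordinates the centre (row, col) maps to (row + r, col + r),
    # so the window starts at (row, col).
    return padded[row:row + size, col:col + size]

img = np.arange(25, dtype=float).reshape(5, 5)
corner = window_at(img, 0, 0, 3)    # window at the top-left corner, padded with zeros
center = window_at(img, 2, 2, 3)    # fully interior window, no padding needed
```

The same extraction applies to both the guide image (first image area) and the to-be-processed image (second image area).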
In some embodiments of the present disclosure, the area parameters of the first image area and the second image area may be determined respectively, so as to calculate the linear estimation parameters a_p' and b_p' of the image position p'. The area parameters include the area variance value, the area entropy value and the first area mean of the pixels in the first image area, the second area mean of the pixels in the second image area, and the third area mean of the fusion area of the first image area and the second image area.
In some embodiments of the present disclosure, the area variance value of the pixels in the first image area, i.e. the term σ̄²_ζ1(p') in formula (5), can be expressed as:

σ̄²_ζ1(p') = σ²_ζ1(p') / ( (1/N) · Σ_j σ²_ζ1(j) )    (6)

In formula (6), σ²_ζ1(p') may represent the area variance of the pixels in the rectangular window Ω_ζ1(p') corresponding to the image position p' in the guide image; N may represent the number of pixels in the guide image; j may represent any image position in the guide image; σ²_ζ1(j) may represent the area variance of the pixels in the rectangular window corresponding to the image position j in the guide image. It can be seen from formula (6) that σ̄²_ζ1(p') is the normalization result of σ²_ζ1(p').
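Under this reading of formula (6), the area variance at each position is normalized by the average area variance over all positions. A sketch in NumPy (the brute-force loop and the names below are assumptions for illustration):

```python
import numpy as np

def region_variance_map(guide, size):
    """Per-position variance of the size x size window around each pixel (zero padding at borders)."""
    r = size // 2
    padded = np.pad(guide, r, mode="constant")
    h, w = guide.shape
    var = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            var[i, j] = padded[i:i + size, j:j + size].var()
    return var

guide = np.random.default_rng(1).random((16, 16))
var = region_variance_map(guide, 3)
norm_var = var / var.mean()   # formula-(6)-style normalization: divide by the mean variance
```

By construction the normalized map averages to 1, so windows with above-average variance (edges, texture) stand out from flat regions.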
In some embodiments of the present disclosure, the step of respectively determining the area parameters of the first image area and the second image area may include:
determining, according to a brightness histogram of the guide image, the occurrence probability of each pixel in the first image area in the guide image;
determining the area entropy value according to the occurrence probability of each pixel in the first image area.
The area entropy value of the pixels in the first image area, i.e. the term H_ζ1(p') in formula (5), can be expressed as:

H_ζ1(p') = − Σ_{k=1..K} P_k · log P_k    (7)

In formula (7), K may represent the number of pixels in the first image area; P_k may represent the occurrence probability of the k-th pixel in the first image area.
In some embodiments of the present disclosure, the guide image may be processed to obtain the brightness histogram of the guide image; the present disclosure does not limit the specific processing manner.
In some embodiments of the present disclosure, the occurrence probability, in the guide image, of the brightness value of each pixel in the first image area can be determined according to the brightness histogram; the area entropy value of the first image area can then be calculated by formula (7).
In this way, the area entropy value of an image area can be obtained for the subsequent calculation of the linear estimation parameters.
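A hedged sketch of this entropy computation, assuming a 256-bin brightness histogram of the guide image; the names `region_entropy` and `probs` are illustrative:

```python
import numpy as np

def region_entropy(window, probs):
    """Entropy of a window: -sum over its pixels of p*log2(p), where p is the
    occurrence probability of each pixel's brightness value in the guide image."""
    p = probs[window.ravel()]              # look up each pixel's probability
    return float(-np.sum(p * np.log2(p + 1e-12)))  # small epsilon guards log(0)

guide = np.random.default_rng(2).integers(0, 256, (16, 16))
hist = np.bincount(guide.ravel(), minlength=256)   # brightness histogram
probs = hist / guide.size                          # occurrence probabilities
h = region_entropy(guide[0:3, 0:3], probs)         # entropy of one first image area
```

The histogram (and thus each P_k) is computed once per guide image, after which the per-window entropy is a cheap lookup-and-sum.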
In some embodiments of the present disclosure, the pixel values in the first image area and the pixel values in the second image area may be averaged respectively to obtain the first area mean and the second area mean among the area parameters.
In some embodiments of the present disclosure, the step of respectively determining the area parameters of the first image area and the second image area may include:
fusing the first image area and the second image area to obtain a fusion area;
determining a third area mean of the pixels in the fusion area.
That is, the pixel values at corresponding positions of the first image area and the second image area may be multiplied element-wise to obtain the fusion area; the pixel values in the fusion area are then averaged to obtain the third area mean among the area parameters.
In this way, the area mean of the fusion area can be obtained, realizing the association between the guide image and the to-be-processed image.
In some embodiments of the present disclosure, the linear estimation parameters include a first parameter a_p' and a second parameter b_p'. The step of determining the linear estimation parameters of the image position according to the area parameters may include:
determining the first parameter of the image position according to the area variance value, the area entropy value, the first area mean, the second area mean and the third area mean;
determining the second parameter of the image position according to the first area mean, the second area mean and the first parameter.
As described above, by substituting formulas (4) to (6) into formula (3), the linear estimation parameters a_p' and b_p' of the image position can be obtained, expressed as:

a_p' = ( μ_GI(p') − μ_G(p') ∘ μ_I(p') ) / ( σ̄²_ζ1(p') + ε / Γ_G(p') )    (8)

b_p' = μ_I(p') − a_p' ∘ μ_G(p')    (9)

In formulas (8) and (9), μ_G(p') may represent the first area mean; μ_I(p') may represent the second area mean; μ_GI(p') may represent the third area mean; ε may represent the regularization coefficient; the operator ∘ may represent the element-wise (dot) product.
In some embodiments of the present disclosure, the factor Γ_G(p') can be obtained based on formula (4) according to the area variance value and the area entropy value; the first parameter a_p' of the image position can then be determined based on formula (8) according to the factor Γ_G(p'), the first area mean, the second area mean and the third area mean; further, the second parameter b_p' of the image position can be determined based on formula (9) according to the first area mean, the second area mean and the first parameter a_p'.
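One plausible reading of this step, patterned after weighted guided filtering, is sketched below for a single position; the regularization coefficient `eps`, the use of the raw window variance in the denominator, and all names are assumptions, since the published formulas are only available as images:

```python
import numpy as np

def linear_params(win_g, win_i, gamma, eps=0.01):
    """Sketch of formulas (8)-(9) for one position:
    a = (mean(G*I) - mean(G)*mean(I)) / (var(G) + eps / gamma)
    b = mean(I) - a * mean(G)
    gamma is the entropy-variance factor of formula (4)."""
    mu_g = win_g.mean()             # first area mean (guide window)
    mu_i = win_i.mean()             # second area mean (input window)
    mu_gi = (win_g * win_i).mean()  # third area mean (fused window)
    a = (mu_gi - mu_g * mu_i) / (win_g.var() + eps / gamma)
    b = mu_i - a * mu_g
    return a, b

g = np.random.default_rng(3).random((5, 5))
a, b = linear_params(g, g, gamma=1.0, eps=1e-8)  # identical windows: a near 1, b near 0
```

With a near-zero regularization term and identical guide and input windows, the linear model reduces to the identity, which is a quick sanity check on the structure of the formulas.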
In this way, the two linear estimation parameters of any image position can be determined respectively. Since the linear estimation parameters are associated with the area variance value and the area entropy value, the image processing effect can be improved.
In some embodiments of the present disclosure, each image position in the image may be processed in the above manner to obtain the linear estimation parameters of the entire to-be-processed image. In step S12, the structure map of the to-be-processed image can be calculated based on formula (2) according to the guide image and the linear estimation parameters.
In some embodiments of the present disclosure, the image processing method according to the embodiments of the present disclosure may further include:
decomposing the to-be-processed image to obtain a texture map of the to-be-processed image.
As described above, an edge-preserving filtering algorithm may be used to decompose the to-be-processed image into a base layer and a detail layer, and the detail layer may be used as the texture map of the to-be-processed image.
In some embodiments of the present disclosure, the to-be-processed image may also be filtered directly, removing information such as noise in the to-be-processed image while retaining the texture detail information, so as to obtain the texture map of the to-be-processed image. The filtering may be linearly invertible filtering; the present disclosure does not limit the specific filtering manner.
In some embodiments of the present disclosure, after the texture map of the to-be-processed image is obtained, the structure map and the texture map of the to-be-processed image may be fused in step S13 to obtain an image processing result of the to-be-processed image.
In some embodiments of the present disclosure, the structure map and the texture map may be summed directly, and the resulting image is taken as the image processing result. Alternatively, different weights may be set for the structure map and the texture map according to the category of the image processing task (for example, in an image enhancement task, the weights of the structure map and the texture map may be set to 1 and 2, respectively), and a weighted sum of the structure map and the texture map is taken as the image processing result. The present disclosure does not limit the manner of fusing the structure map and the texture map.
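A minimal sketch of this fusion step, assuming a simple weighted sum of the structure map and the texture map; the function name and default weights are illustrative:

```python
import numpy as np

def fuse(structure, texture, w_s=1.0, w_t=1.0):
    """Weighted fusion of the structure map and the texture map;
    with the default weights this reduces to a direct sum."""
    return w_s * structure + w_t * texture

s = np.full((4, 4), 0.5)   # toy structure map
t = np.full((4, 4), 0.1)   # toy texture map
plain = fuse(s, t)                          # direct sum
enhanced = fuse(s, t, w_s=1.0, w_t=2.0)     # weights from the enhancement example
```

Raising the texture weight amplifies detail relative to the smoothed structure, which matches the enhancement example in the text.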
According to the image processing method of the embodiments of the present disclosure, an extended guided filtering manner is proposed, in which the linear estimation parameters of an image position can be determined based on the area variance value and the area entropy value of the guide-image area corresponding to the image position. Through joint entropy-variance adaptive processing, the weight of the regularization constraint term of the linear estimation parameters is adjusted automatically, so that the regularization weight can be adjusted to different degrees for different image structures, thereby overcoming the artifact problem caused by a fixed regularization weight in guided filtering.
The image processing method according to the embodiments of the present disclosure can be applied to fields such as artificial intelligence, image processing and machine vision, to implement image processing tasks such as image dehazing, low-light enhancement, contrast enhancement, tone mapping, high-dynamic-range (HDR) imaging and image stitching. It can reduce halo artifacts near image edges and over-smoothing of details, better preserve edge sharpness, and improve the image processing effect.
It can be understood that the above method embodiments mentioned in the present disclosure can be combined with each other to form combined embodiments without violating principles and logic; details are not repeated herein due to space limitations. Those skilled in the art can understand that, in the above methods of the specific implementations, the specific execution order of the steps should be determined by their functions and possible internal logic.
In addition, the present disclosure further provides an image processing apparatus, an electronic device, a computer-readable storage medium and a program, all of which can be used to implement any image processing method provided by the present disclosure. For the corresponding technical solutions and descriptions, refer to the corresponding records in the method section; details are not repeated herein.
FIG. 3 shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure. As shown in FIG. 3, the apparatus includes:
a parameter determination module 31, configured to determine linear estimation parameters of multiple image positions of a to-be-processed image according to the to-be-processed image and a guide image of the to-be-processed image;
a structure map obtaining module 32, configured to obtain a structure map of the to-be-processed image according to the guide image and the linear estimation parameters of the multiple image positions; and
a result determination module 33, configured to fuse the structure map with a texture map of the to-be-processed image to obtain an image processing result of the to-be-processed image,
wherein the linear estimation parameters of any image position in the to-be-processed image are associated with the area variance value and the area entropy value of a first image area in the guide image, the first image area including an image area of a preset size centred on the image position in the guide image.
In some embodiments of the present disclosure, the parameter determination module 31 includes:
an area determination sub-module, configured to determine, for any image position of the to-be-processed image, a first image area and a second image area corresponding to the image position, the second image area including an image area of a preset size centred on the image position in the to-be-processed image;
an area parameter determination sub-module, configured to respectively determine area parameters of the first image area and the second image area, the area parameters including an area variance value, an area entropy value and a first area mean of the pixels in the first image area, a second area mean of the pixels in the second image area, and a third area mean of a fusion area of the first image area and the second image area; and
a linear parameter determination sub-module, configured to determine the linear estimation parameters of the multiple image positions according to the area parameters.
In some embodiments of the present disclosure, the area parameter determination sub-module is configured to fuse the first image area and the second image area to obtain a fusion area, and determine a third area mean of the pixels in the fusion area.
In some embodiments of the present disclosure, the area parameter determination sub-module is configured to determine, according to a brightness histogram of the guide image, the occurrence probability of each pixel in the first image area in the guide image, and determine the area entropy value according to the occurrence probability of each pixel in the first image area.
In some embodiments of the present disclosure, the linear estimation parameters include a first parameter and a second parameter, and the linear parameter determination sub-module is configured to determine the first parameter of the image position according to the area variance value, the area entropy value, the first area mean, the second area mean and the third area mean, and determine the second parameter of the image position according to the first area mean, the second area mean and the first parameter.
In some embodiments of the present disclosure, the apparatus further includes: a linear filtering module, configured to perform linear filtering on the to-be-processed image to obtain a guide image of the to-be-processed image.
In some embodiments of the present disclosure, the apparatus further includes: an image decomposition module, configured to decompose the to-be-processed image to obtain a texture map of the to-be-processed image.
In some embodiments, the functions of, or the modules included in, the apparatus provided in the embodiments of the present disclosure may be used to execute the methods described in the above method embodiments. For specific implementations, reference may be made to the descriptions of the above method embodiments; details are not repeated herein for brevity.
The embodiments of the present disclosure further provide a computer-readable storage medium on which computer program instructions are stored, where the computer program instructions, when executed by a processor, implement the above method. The computer-readable storage medium may be a volatile or non-volatile computer-readable storage medium.
The embodiments of the present disclosure further provide an electronic device, including: a processor; and a memory for storing processor-executable instructions, where the processor is configured to invoke the instructions stored in the memory to execute the above method.
The embodiments of the present disclosure further provide a computer program, where the computer program includes computer-readable code, and when the computer-readable code runs in an electronic device, a processor of the electronic device executes the above method.
The embodiments of the present disclosure further provide another computer program product, including computer-readable code, or a non-volatile computer-readable storage medium carrying computer-readable code; when the computer-readable code runs in a processor of an electronic device, the processor in the electronic device executes the above method.
The electronic device may be provided as a terminal, a server or a device in another form.
FIG. 4 shows a block diagram of an electronic device 800 according to an embodiment of the present disclosure. For example, the electronic device 800 may be a terminal such as a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device or a personal digital assistant.
Referring to FIG. 4, the electronic device 800 may include one or more of the following components: a processing component 802, a memory 804, a power supply component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls the overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communication, camera operations and recording operations. The processing component 802 may include one or more processors 820 to execute instructions so as to complete all or some of the steps of the above method. In addition, the processing component 802 may include one or more modules to facilitate interaction between the processing component 802 and other components. For example, the processing component 802 may include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support the operation of the electronic device 800. Examples of such data include instructions of any application program or method operated on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and the like. The memory 804 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as a static random-access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk or an optical disc.
The power supply component 806 supplies power to the various components of the electronic device 800. The power supply component 806 may include a power management system, one or more power supplies, and other components associated with generating, managing and distributing power for the electronic device 800.
The multimedia component 808 includes a screen providing an output interface between the electronic device 800 and a user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe action. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. When the electronic device 800 is in an operation mode such as a shooting mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operation mode such as a call mode, a recording mode or a voice recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments of the present disclosure, the audio component 810 further includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and a peripheral interface module, which may be a keyboard, a click wheel, buttons, or the like. These buttons may include, but are not limited to, a home button, volume buttons, a start button and a lock button.
The sensor component 814 includes one or more sensors for providing state evaluations of various aspects of the electronic device 800. For example, the sensor component 814 may detect the on/off state of the electronic device 800 and the relative positioning of components, for example the display and keypad of the electronic device 800; the sensor component 814 may further detect a change in position of the electronic device 800 or a component of the electronic device 800, the presence or absence of user contact with the electronic device 800, the orientation or acceleration/deceleration of the electronic device 800, and a change in temperature of the electronic device 800. The sensor component 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 814 may further include an optical sensor, such as a complementary metal-oxide-semiconductor (CMOS) or charge-coupled device (CCD) image sensor, for use in imaging applications. In some embodiments, the sensor component 814 may further include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as Wi-Fi, second-generation (2G) or third-generation (3G) mobile communication technology, or a combination thereof. In an exemplary embodiment, the communication component 816 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a near-field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors or other electronic components, for executing the above method.
In an exemplary embodiment, a non-volatile computer-readable storage medium is further provided, for example a memory 804 including computer program instructions, where the computer program instructions may be executed by the processor 820 of the electronic device 800 to complete the above method.
图5示出根据本公开实施例的一种电子设备1900的框图。例如,电子设备1900可以被提供为一服务器。参照图5,电子设备1900包括处理组件1922,其可以包括一个或多个处 理器,以及由存储器1932所代表的存储器资源,用于存储可由处理组件1922的执行的指令,例如应用程序。存储器1932中存储的应用程序可以包括一个或一个以上的每一个对应于一组指令的模块。此外,处理组件1922被配置为执行指令,以执行上述方法。FIG. 5 shows a block diagram of an electronic device 1900 according to an embodiment of the present disclosure. For example, the electronic device 1900 may be provided as a server. 5, the electronic device 1900 includes a processing component 1922, which may include one or more processors, and a memory resource, represented by memory 1932, for storing instructions, such as applications, executable by the processing component 1922. An application program stored in memory 1932 may include one or more modules, each corresponding to a set of instructions. Additionally, the processing component 1922 is configured to execute instructions to perform the above-described methods.
The electronic device 1900 may also include a power supply component 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input/output (I/O) interface 1958. The electronic device 1900 may operate based on an operating system stored in the memory 1932, such as the Microsoft server operating system (Windows Server™), the graphical-user-interface-based operating system from Apple (Mac OS X™), the multi-user multi-process computer operating system (Unix™), the free and open-source Unix-like operating system (Linux™), the open-source Unix-like operating system (FreeBSD™), or the like.
In an exemplary embodiment, a non-volatile computer-readable storage medium is also provided, such as the memory 1932 comprising computer program instructions, which are executable by the processing component 1922 of the electronic device 1900 to perform the above method.
The present disclosure may be a system, a method, and/or a computer program product. The computer program product may include a computer-readable storage medium carrying computer-readable program instructions for causing a processor to implement various aspects of the present disclosure.
A computer-readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer-readable storage medium may be, for example, but is not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of computer-readable storage media include: a portable computer disk, a hard disk, a Random Access Memory (RAM), a ROM, an EPROM or flash memory, an SRAM, a portable Compact Disc Read-Only Memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punched card or a raised structure in a groove on which instructions are stored, and any suitable combination of the foregoing. A computer-readable storage medium, as used herein, is not to be construed as a transient signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (for example, a light pulse through a fiber-optic cable), or an electrical signal transmitted through a wire.
The computer-readable program instructions described herein may be downloaded to respective computing/processing devices from a computer-readable storage medium, or to an external computer or external storage device over a network such as the Internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, optical-fiber transmission, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers. A network adapter card or network interface in each computing/processing device receives the computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium within the respective computing/processing device.
Computer program instructions for carrying out operations of the present disclosure may be assembly instructions, Instruction Set Architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, an electronic circuit, such as a programmable logic circuit, an FPGA, or a Programmable Logic Array (PLA), is personalized by utilizing state information of the computer-readable program instructions, and the electronic circuit may execute the computer-readable program instructions, thereby implementing various aspects of the present disclosure.
Aspects of the present disclosure are described herein with reference to flowcharts and/or block diagrams of methods, apparatuses (systems), and computer program products according to embodiments of the disclosure. It should be understood that each block of the flowcharts and/or block diagrams, and combinations of blocks in the flowcharts and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams. These computer-readable program instructions may also be stored in a computer-readable storage medium, where the instructions cause a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions that implement various aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
Computer-readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device, causing a series of operational steps to be performed on the computer, other programmable apparatus, or other device to produce a computer-implemented process, such that the instructions executed on the computer, other programmable apparatus, or other device implement the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two successive blocks may in fact be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or acts, or by a combination of dedicated hardware and computer instructions.
The computer program product may be implemented in hardware, software, or a combination thereof. In one optional embodiment, the computer program product is embodied as a computer storage medium; in another optional embodiment, the computer program product is embodied as a software product, such as a Software Development Kit (SDK).
The embodiments of the present disclosure have been described above. The foregoing description is exemplary rather than exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application, or the improvement over technologies in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
Industrial Applicability
Embodiments of the present disclosure relate to an image processing method, apparatus, device, storage medium, and program. The method includes: determining linear estimation parameters of multiple image positions of an image to be processed according to the image to be processed and a guide image of the image to be processed; obtaining a structure map of the image to be processed according to the guide image and the linear estimation parameters of the multiple image positions; and fusing the structure map with a texture map of the image to be processed to obtain an image processing result of the image to be processed, wherein the linear estimation parameter of any image position in the image to be processed is associated with a regional variance value and a regional entropy value of a first image area in the guide image, and the first image area includes an image area in the guide image that is centered on the image position and has a preset size.
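The pipeline summarized above closely resembles guided image filtering, with the usual regularization term additionally driven by a regional entropy statistic of the guide image. The disclosure does not spell out the exact formulas, so the sketch below is only one illustrative reading under assumed definitions: box windows of radius `r`, a hypothetical entropy-adaptive regularizer `eps = lam * (1 + entropy)`, and the simple decomposition `texture = image - structure`. All function and parameter names (`structure_map`, `lam`) are illustrative, not taken from the patent.

```python
import numpy as np

def box_mean(img, r):
    """Mean over each (2r+1)x(2r+1) window, computed with edge padding."""
    k = 2 * r + 1
    padded = np.pad(img, r, mode="edge")
    out = np.zeros_like(img, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def regional_entropy(guide, r, bins=256):
    """Per-window entropy; pixel probabilities come from the guide's global histogram."""
    levels = np.clip(np.round(guide * (bins - 1)).astype(int), 0, bins - 1)
    counts = np.bincount(levels.ravel(), minlength=bins)
    prob = counts / counts.sum()
    p = prob[levels]                  # p > 0 for every level that occurs in the image
    return box_mean(-p * np.log2(p), r)

def structure_map(image, guide, r=2, lam=0.02):
    """Guided-filter-style linear estimate; entropy modulates the regularizer (assumed)."""
    mean_I = box_mean(guide, r)            # first-area mean (guide window)
    mean_p = box_mean(image, r)            # second-area mean (input window)
    mean_Ip = box_mean(guide * image, r)   # fusion-area mean
    var_I = box_mean(guide * guide, r) - mean_I ** 2
    eps = lam * (1.0 + regional_entropy(guide, r))      # hypothetical adaptive eps
    a = (mean_Ip - mean_I * mean_p) / (var_I + eps)     # first parameter
    b = mean_p - a * mean_I                             # second parameter
    return box_mean(a, r) * guide + box_mean(b, r)

rng = np.random.default_rng(0)
img = rng.random((16, 16))
s = structure_map(img, img)   # here the image serves as its own guide
texture = img - s             # simple base/detail decomposition
result = s + texture          # fusing structure and texture restores the input
```

By construction, adding the residual texture back onto the structure map reproduces the input exactly, which is why such structure/texture decompositions preserve detail while allowing the structure branch to be smoothed aggressively.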

Claims (11)

  1. An image processing method, performed by an electronic device, the method comprising:
    determining linear estimation parameters of multiple image positions of an image to be processed according to the image to be processed and a guide image of the image to be processed;
    obtaining a structure map of the image to be processed according to the guide image and the linear estimation parameters of the multiple image positions; and
    fusing the structure map with a texture map of the image to be processed to obtain an image processing result of the image to be processed,
    wherein the linear estimation parameter of any image position in the image to be processed is associated with a regional variance value and a regional entropy value of a first image area in the guide image, the first image area comprising an image area in the guide image that is centered on the image position and has a preset size.
  2. The method according to claim 1, wherein determining the linear estimation parameters of the multiple image positions of the image to be processed according to the image to be processed and the guide image of the image to be processed comprises:
    for any image position of the image to be processed, determining a first image area and a second image area corresponding to the image position, the second image area comprising an image area in the image to be processed that is centered on the image position and has a preset size;
    determining area parameters of the first image area and the second image area respectively, the area parameters comprising a regional variance value, a regional entropy value, and a first regional mean of pixels in the first image area, a second regional mean of pixels in the second image area, and a third regional mean of a fusion area of the first image area and the second image area; and
    determining the linear estimation parameters of the multiple image positions according to the area parameters.
  3. The method according to claim 2, wherein determining the area parameters of the first image area and the second image area respectively comprises:
    fusing the first image area with the second image area to obtain the fusion area; and
    determining the third regional mean of pixels in the fusion area.
  4. The method according to claim 2, wherein determining the area parameters of the first image area and the second image area respectively comprises:
    determining, according to a brightness histogram of the guide image, an occurrence probability of each pixel in the first image area in the guide image; and
    determining the regional entropy value according to the occurrence probability of each pixel in the first image area.
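Claim 4 derives the regional entropy from the guide image's brightness histogram: each pixel's occurrence probability is looked up in the global histogram, and the entropy is accumulated over the pixels of the window. The following is a minimal sketch under that reading; the window radius, bin count, base-2 logarithm, and summing (rather than averaging) over the window are all assumptions not fixed by the claim.

```python
import numpy as np

def window_entropy(guide, cy, cx, r=2, bins=256):
    """Entropy of the (2r+1)x(2r+1) window centered at (cy, cx).
    Probabilities come from the whole guide image's brightness histogram."""
    levels = np.clip(np.round(guide * (bins - 1)).astype(int), 0, bins - 1)
    counts = np.bincount(levels.ravel(), minlength=bins)
    prob = counts / counts.sum()          # occurrence probability of each level
    win = prob[levels[cy - r:cy + r + 1, cx - r:cx + r + 1]]
    return float(np.sum(-win * np.log2(win)))  # win > 0: these levels occur in guide

guide = np.full((8, 8), 0.5)              # uniform brightness: every pixel has p = 1
h_flat = window_entropy(guide, 4, 4)      # each term is -1 * log2(1), so total 0.0

guide2 = np.where(np.arange(8)[None, :] < 4, 0.25, 0.75) * np.ones((8, 8))
h_mixed = window_entropy(guide2, 4, 4)    # two equally likely levels, p = 0.5 each
```

A perfectly flat window contributes zero entropy, while a window drawn from a two-level half-and-half image contributes 0.5 bits per pixel, so the entropy term grows in informative regions of the guide.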
  5. The method according to claim 2, wherein the linear estimation parameters comprise a first parameter and a second parameter, and determining the linear estimation parameters of the multiple image positions according to the area parameters comprises:
    determining the first parameter of the image position according to the regional variance value, the regional entropy value, the first regional mean, the second regional mean, and the third regional mean; and
    determining the second parameter of the image position according to the first regional mean, the second regional mean, and the first parameter.
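Claim 5 fixes only the dependency structure: the first parameter is a function of the variance, entropy, and the three regional means, and the second parameter follows from the two means and the first parameter. In guided-filter terms, one natural (but assumed) instantiation is a covariance divided by an entropy-regularized variance, with the classic offset b = mean_p - a * mean_I; the formula for `first_parameter` and the constant `lam` below are hypothetical.

```python
def first_parameter(var_I, entropy, mean_I, mean_p, mean_Ip, lam=0.02):
    """a = cov(I, p) / (var(I) + entropy-modulated regularizer) -- assumed form."""
    cov_Ip = mean_Ip - mean_I * mean_p   # covariance via the third (fusion-area) mean
    return cov_Ip / (var_I + lam * (1.0 + entropy))

def second_parameter(mean_I, mean_p, a):
    """b depends only on the two means and the first parameter, as in claim 5."""
    return mean_p - a * mean_I

# Toy values: cov = 0.26 - 0.25 = 0.01, denominator = 0.03 + 0.02 = 0.05.
a = first_parameter(var_I=0.03, entropy=0.0, mean_I=0.5, mean_p=0.5, mean_Ip=0.26)
b = second_parameter(0.5, 0.5, a)
```

Note that the linear estimate a * mean_I + b recovers mean_p exactly, which is the property that makes the structure map track the local brightness of the input.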
  6. The method according to any one of claims 1 to 5, further comprising:
    performing linear filtering on the image to be processed to obtain the guide image of the image to be processed.
  7. The method according to any one of claims 1 to 6, further comprising:
    decomposing the image to be processed to obtain the texture map of the image to be processed.
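Claims 6 and 7 leave the preprocessing open: any linear filter may produce the guide image, and any decomposition may yield the texture map. One conventional choice (an assumption on our part, not mandated by the claims) is a box filter for the guide and the residual image minus structure for the texture:

```python
import numpy as np

def box_filter(img, r=1):
    """A simple linear filter: mean over (2r+1)x(2r+1) windows with edge padding."""
    k = 2 * r + 1
    padded = np.pad(img, r, mode="edge")
    out = np.zeros_like(img, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

rng = np.random.default_rng(1)
image = rng.random((12, 12))
guide = box_filter(image)    # claim 6: linear filtering of the input image
texture = image - guide      # claim 7: residual as one possible texture map
```

The residual decomposition is lossless (guide + texture reproduces the input), and the smoothing of the guide reduces its variance relative to the input, which is the intended separation of coarse structure from fine texture.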
  8. An image processing apparatus, comprising:
    a parameter determination module, configured to determine linear estimation parameters of multiple image positions of an image to be processed according to the image to be processed and a guide image of the image to be processed;
    a structure map acquisition module, configured to obtain a structure map of the image to be processed according to the guide image and the linear estimation parameters of the multiple image positions; and
    a result determination module, configured to fuse the structure map with a texture map of the image to be processed to obtain an image processing result of the image to be processed,
    wherein the linear estimation parameter of any image position in the image to be processed is associated with a regional variance value and a regional entropy value of a first image area in the guide image, the first image area comprising an image area in the guide image that is centered on the image position and has a preset size.
  9. An electronic device, comprising:
    a processor; and
    a memory for storing processor-executable instructions,
    wherein the processor is configured to invoke the instructions stored in the memory to perform the method according to any one of claims 1 to 7.
  10. A computer-readable storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, implement the method according to any one of claims 1 to 7.
  11. A computer program, comprising computer-readable code which, when run in an electronic device, causes a processor of the electronic device to perform the image processing method according to any one of claims 1 to 7.
PCT/CN2021/120905 2021-04-28 2021-09-27 Image processing method and apparatus, and device, storage medium and program WO2022227394A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020227027309A 2021-04-28 2021-09-27 Image processing method and apparatus, and device, storage medium and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110469110.2 2021-04-28
CN202110469110.2A CN113139947A (en) 2021-04-28 2021-04-28 Image processing method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
WO2022227394A1 (en)

Family

ID=76816396

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/120905 WO2022227394A1 (en) 2021-04-28 2021-09-27 Image processing method and apparatus, and device, storage medium and program

Country Status (3)

Country Link
KR (1) KR20220149514A (en)
CN (1) CN113139947A (en)
WO (1) WO2022227394A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113139947A (en) * 2021-04-28 2021-07-20 Shanghai SenseTime Intelligent Technology Co., Ltd. Image processing method and device, electronic equipment and storage medium
CN113610817B (en) * 2021-08-11 2024-03-26 China Tobacco Guizhou Industrial Co., Ltd. Characteristic peak identification method, computing device and storage medium
CN113640445A (en) * 2021-08-11 2021-11-12 China Tobacco Guizhou Industrial Co., Ltd. Characteristic peak identification method based on image processing, computing equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160292824A1 (en) * 2013-04-12 2016-10-06 Agency For Science, Technology And Research Method and System for Processing an Input Image
CN109903272A * 2019-01-30 2019-06-18 Xi'an Tianwei Electronic System Engineering Co., Ltd. Object detection method, device, equipment, computer equipment and storage medium
CN110675404A (en) * 2019-09-03 2020-01-10 RealMe重庆移动通信有限公司 Image processing method, image processing apparatus, storage medium, and terminal device
CN113139947A (en) * 2021-04-28 2021-07-20 上海商汤智能科技有限公司 Image processing method and device, electronic equipment and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6390173B2 * 2014-06-02 2018-09-19 NEC Corporation Information processing apparatus, information processing system, image processing method, and program
CN106780417B * 2016-11-22 2019-12-13 Beijing Jiaotong University Method and system for enhancing uneven-illumination image
CN107680054B * 2017-09-26 2021-05-18 Changchun University of Science and Technology Multi-source image fusion method in haze environment
CN110189309B * 2019-05-21 2021-06-15 Shanghai SenseTime Intelligent Technology Co., Ltd. Image processing method and device, electronic equipment and storage medium


Also Published As

Publication number Publication date
KR20220149514A (en) 2022-11-08
CN113139947A (en) 2021-07-20

Similar Documents

Publication Publication Date Title
WO2022227394A1 (en) Image processing method and apparatus, and device, storage medium and program
WO2020155711A1 (en) Image generating method and apparatus, electronic device, and storage medium
JP6710255B2 (en) Electronic device and communication method thereof
TWI777112B (en) Method, apparatus and electronic device for image processing and storage medium
US20200380661A1 (en) Image processing method and apparatus, electronic device, and storage medium
US11410342B2 (en) Method for adding special effect to video, electronic device and storage medium
CN111340733B (en) Image processing method and device, electronic equipment and storage medium
CN111445414B (en) Image processing method and device, electronic equipment and storage medium
CN108154466B (en) Image processing method and device
US20220130023A1 (en) Video denoising method and apparatus, terminal, and storage medium
CN111640114B (en) Image processing method and device
CN111583142B (en) Image noise reduction method and device, electronic equipment and storage medium
CN111369482B (en) Image processing method and device, electronic equipment and storage medium
WO2023071167A1 (en) Image processing method and apparatus, and electronic device, storage medium and program product
WO2022021932A1 (en) De-noising method and apparatus, and electronic device, storage medium and computer program product
CN107730443B (en) Image processing method and device and user equipment
CN110675355B (en) Image reconstruction method and device, electronic equipment and storage medium
CN110728180A (en) Image processing method, device and storage medium
CN107451972B (en) Image enhancement method, device and computer readable storage medium
CN112651880B (en) Video data processing method and device, electronic equipment and storage medium
CN112200745A (en) Method and device for processing remote sensing image, electronic equipment and storage medium
CN110896492B (en) Image processing method, device and storage medium
CN111369456B (en) Image denoising method and device, electronic device and storage medium
CN111583144A (en) Image noise reduction method and device, electronic equipment and storage medium
CN111583145B (en) Image noise reduction method and device, electronic equipment and storage medium

Legal Events

ENP Entry into the national phase (Ref document number: 2022538754; Country of ref document: JP; Kind code of ref document: A)
121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 21938858; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: PCT application non-entry in European phase (Ref document number: 21938858; Country of ref document: EP; Kind code of ref document: A1)