WO2022166907A1 - Image processing method, apparatus, device and readable storage medium - Google Patents
- Publication number: WO2022166907A1 (PCT/CN2022/075130)
- Authority: WO - WIPO (PCT)
- Prior art keywords: image, pixel, sketch, map, grayscale
- Prior art date
Classifications
- G06T3/04—Context-preserving transformations, e.g. by using an importance map
- G06T11/60—Editing figures and text; Combining figures or text
- G06T11/001—Texturing; Colouring; Generation of texture or colour
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T5/70—Denoising; Smoothing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
- G06V10/54—Extraction of image or video features relating to texture
- G06V10/56—Extraction of image or video features relating to colour
- G06T2207/10024—Color image
- G06T2207/20032—Median filtering
- G06T2207/20081—Training; Learning
- G06T2207/20221—Image fusion; Image merging
- G06T2207/30201—Face
Definitions
- the embodiments of the present disclosure relate to the technical field of image processing, and in particular, to an image processing method, apparatus, device, and readable storage medium.
- images of other styles can be obtained by performing style conversion on the original image.
- a deep learning model is trained in advance using artificially drawn cartoon samples and image samples such as portraits, and the model is then used to convert the style of the original image to obtain a cartoon-style image.
- Sketching is also an image style that is loved by users.
- the prior art does not support converting the source image into a sketch-style image.
- Embodiments of the present disclosure provide an image processing method, apparatus, device, and readable storage medium that obtain a sketch image by means of grayscale processing, Gaussian blur, and the like. The approach retains details to a large extent, makes the sketch image realistic, requires no deep learning process, and is highly efficient.
- an embodiment of the present disclosure provides an image processing method, including:
- generating a grayscale image based on a source image
- performing Gaussian blur on the grayscale image to obtain a Gaussian blurred image
- generating a first sketch image corresponding to the source image based on the Gaussian blurred image and the grayscale image.
- an image processing apparatus including:
- a grayscale module for generating a grayscale image based on the source image
- a processing module for performing Gaussian blur on the grayscale image to obtain a Gaussian blurred image
- a generating module configured to generate a first sketch image corresponding to the source image based on the Gaussian blur map and the grayscale map.
- embodiments of the present disclosure provide an electronic device, including: at least one processor and a memory;
- the memory stores computer-executable instructions
- the at least one processor executes the computer-executable instructions stored in the memory, causing the at least one processor to perform the image processing method described in the first aspect and various possible designs of the first aspect above.
- embodiments of the present disclosure provide a computer-readable storage medium storing computer-executable instructions which, when executed by a processor, implement the image processing method described in the first aspect and the various possible designs of the first aspect.
- an embodiment of the present disclosure provides a computer program product, including a computer program stored in a readable storage medium; at least one processor of an electronic device reads the computer program from the readable storage medium, and execution of the computer program by the at least one processor causes the electronic device to perform the method described in the first aspect above.
- an embodiment of the present disclosure provides a computer program stored in a readable storage medium; at least one processor of an electronic device reads the computer program from the readable storage medium, and execution of the computer program by the at least one processor causes the electronic device to perform the method of the first aspect above.
- after the electronic device obtains the source image, it generates a grayscale image based on the source image and performs Gaussian blur on the grayscale image to obtain a Gaussian blurred image. The electronic device then generates a first sketch image using the Gaussian blurred image and the grayscale image.
- the electronic device can obtain the first sketch image by means of grayscale processing, Gaussian blurring, and the like, which retains details to a large extent, makes the first sketch image realistic, requires no deep learning process, and is highly efficient.
- FIG. 1 is a schematic diagram of a network architecture of an image processing method provided by an embodiment of the present disclosure
- FIG. 2 is a flowchart of an image processing method provided by an embodiment of the present application.
- FIG. 3A is a schematic diagram of a process of generating a sketch image in an image processing method provided by an embodiment of the present disclosure
- FIG. 3B is a schematic diagram of a process of generating a sketch image in an image processing method provided by an embodiment of the present disclosure
- FIG. 4 is a schematic diagram of a process of correcting a sketch image based on saturation in the image processing method provided by an embodiment of the present disclosure
- FIG. 5 is a schematic diagram of incorporating texture in an image processing method provided by an embodiment of the present disclosure.
- FIG. 6 is a schematic diagram of a texture map in an image processing method provided by an embodiment of the present disclosure.
- FIG. 7 is a schematic diagram of an effect curve of a fusion effect in an image processing method provided by an embodiment of the present disclosure
- FIG. 8A is a schematic diagram of a coloring process in an image processing method provided by an embodiment of the present disclosure.
- FIG. 8B is a schematic diagram of a coloring process in an image processing method provided by an embodiment of the present disclosure.
- FIG. 9 is a flowchart of an image processing method provided by an embodiment of the present application.
- FIG. 10 is a structural block diagram of an image processing apparatus according to an embodiment of the present disclosure.
- FIG. 11 is a structural block diagram of another image processing apparatus provided by an embodiment of the present disclosure.
- FIG. 12 is a schematic structural diagram of an electronic device for implementing an embodiment of the present disclosure.
- Two-dimensional comics are an art form that focuses on freehand expression; comic generation is usually based on deep learning, training a deep learning model with artificially drawn comic samples and image samples such as portraits.
- the drawing style of the comic samples is the same.
- the deep learning model is deployed on an electronic device such as a user's mobile terminal or a server. After the electronic device obtains the original image, the original image is input into the deep learning model, which outputs the comic. Many details, such as facial features, cannot be preserved during model training. However, comics do not have high requirements for realism; even if many details are ignored in the comics produced by the deep learning model, the comics can still achieve a good freehand effect and be accepted by users.
- pencil sketches are more realistic and require a faithful rendering of the content, so they pay more attention to details.
- the original image is an image of a person
- the pencil sketch obtained from the original image is considered qualified only if the face, facial features, and the like in the pencil sketch match the image of the person well.
- the above deep learning method is not suitable for the acquisition of pencil sketches.
- FIG. 1 is a schematic diagram of a network architecture of an image processing method provided by an embodiment of the present disclosure.
- the network architecture includes a terminal device 1 , a server 2 and a network 3 , and the terminal device 1 and the server 2 establish a network connection through the network 3 .
- Network 3 includes various types of connections, such as wired links, wireless communication links, or fiber optic cables.
- the user uses the terminal device 1 to interact with the server 2 through the network 3 to receive or send messages and the like.
- Various communication client applications are installed on the terminal device 1, such as video playback applications, shopping applications, search applications, instant communication tools, email clients, social platform software, and the like.
- the terminal device 1 may be hardware or software.
- the terminal device 1 is, for example, a mobile phone, a tablet computer, an e-book reader, a laptop computer, a desktop computer, or the like.
- if the terminal device 1 is software, it can be installed in the hardware devices listed above; in this case, the terminal device 1 is, for example, multiple software modules or a single software module, which is not limited by the embodiments of the present disclosure.
- the server 2 is a server capable of providing various services, such as receiving the source image sent by the terminal device, performing style conversion on the source image, and obtaining a pencil sketch, such as a black-and-white sketch image or a colored pencil sketch.
- the server 2 may be hardware or software.
- the server 2 is hardware, the server 2 is a single server or a distributed server cluster composed of multiple servers.
- the server 2 is software, it may be multiple software modules or a single software module, etc., which is not limited in the embodiment of the present disclosure.
- the numbers of terminal devices 1, servers 2, and networks 3 in FIG. 1 are only illustrative. In actual implementation, any number of terminal devices 1, servers 2, and networks 3 are deployed according to actual requirements.
- the server 2 and the network 3 in the above-mentioned FIG. 1 may not exist.
- FIG. 2 is a flowchart of an image processing method provided by an embodiment of the present application.
- the execution subject of this embodiment is an electronic device, and the electronic device is, for example, the terminal device or the server in the above-mentioned FIG. 1 .
- This embodiment includes:
- the electronic device obtains the source image locally, or obtains the source image from the Internet.
- the source image is also called (image_src).
- the source image is a red green blue (RGB) image, a black-and-white photo, etc., which is not limited in the embodiment of the present disclosure.
- the electronic device performs grayscale processing on each pixel of the source image, thereby converting the source image into a grayscale image. For example, the electronic device averages the values of R, G, and B for each pixel, and uses the average value as a grayscale value. For another example, for each pixel in the source image, the electronic device determines the maximum value and the minimum value of R, G, and B of the pixel, and takes the average value of the maximum value and the minimum value as the gray value. For another example, the electronic device uses a binary image method, a weighted average method, etc., to generate a grayscale image for the source image.
- the grayscale image is also referred to as (image_gray).
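The three per-pixel grayscale conversions described above can be sketched as follows. This is an illustrative Python sketch; the function names, and the BT.601 luminance weights used for the weighted-average variant, are assumptions rather than values taken from this disclosure:

```python
def gray_average(r, g, b):
    # Mean of the three channels.
    return (r + g + b) / 3.0

def gray_minmax(r, g, b):
    # Midpoint of the brightest and darkest channel.
    return (max(r, g, b) + min(r, g, b)) / 2.0

def gray_weighted(r, g, b):
    # Weighted average (ITU-R BT.601 luminance weights, assumed here).
    return 0.299 * r + 0.587 * g + 0.114 * b

print(gray_average(90, 120, 150))   # 120.0
print(gray_minmax(90, 120, 150))    # 120.0
print(gray_weighted(90, 120, 150))  # ≈ 114.45
```

Note how the three methods can agree on some pixels (as above) yet diverge on strongly colored ones.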
- the electronic device determines a Gaussian kernel (kernel) corresponding to the Gaussian blur according to Gaussian convolution or the like, and uses the Gaussian kernel to perform Gaussian blurring on the grayscale image to obtain a Gaussian blurred image.
- the Gaussian kernel size is, for example, 255.
- the electronic device uses different Gaussian kernels to perform Gaussian blurring on grayscale images, and the resulting Gaussian blurred images are different.
- the Gaussian blur image is also called (image_blur).
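The disclosure does not spell out how the Gaussian kernel is computed; one common construction is a normalized 1-D kernel applied along rows and then columns (the helper name and the use of separability are assumptions, not the patent's stated method):

```python
import math

def gaussian_kernel(size, sigma):
    """Normalized 1-D Gaussian kernel of odd `size`; a 2-D blur can
    apply it along rows and then columns because the Gaussian is
    separable."""
    r = size // 2
    k = [math.exp(-(i * i) / (2.0 * sigma * sigma)) for i in range(-r, r + 1)]
    total = sum(k)
    return [v / total for v in k]

k = gaussian_kernel(5, 1.0)
print(round(sum(k), 6))  # 1.0 - the weights are normalized
print(k[2] == max(k))    # True - the center tap is the largest
```

A larger kernel (or sigma) smears edges over a wider area, which is why different kernels yield different blurred images below.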
- the electronic device uses the Gaussian blur image and the grayscale image as materials to generate a first sketch image.
- the electronic device performs fusion processing on the pixels of the Gaussian blur image and the corresponding pixels in the grayscale image, thereby generating the first sketch image.
- the first sketch image is also called (image_target).
- the electronic device uses different Gaussian kernels to perform Gaussian blurring on the grayscale image, the obtained Gaussian blurred images are different. Therefore, the electronic device generates different sketch images based on the grayscale image and different Gaussian blurred images. For example, please refer to FIG. 3A and FIG. 3B .
- FIG. 3A is a schematic diagram of a process of generating a first sketch image in an image processing method provided by an embodiment of the present disclosure.
- when the Gaussian kernel 1 is relatively small, the first sketch image generated based on the Gaussian blur map 1 and the grayscale image is lighter.
- FIG. 3B is a schematic diagram of a process of generating a sketch image in an image processing method provided by an embodiment of the present disclosure.
- when the Gaussian kernel 2 is relatively large, the first sketch image generated based on the Gaussian blurred image 2 and the grayscale image is relatively heavy.
- the source image is, for example, a color image or a black-and-white image, which is not limited by the embodiment of the present disclosure.
- the first sketch images in FIGS. 3A and 3B are also referred to as first sketch images at different levels. That is, the first sketch image in FIG. 3A is equivalent to a semi-finished product in the process of simulating a painter's drawing, and the first sketch image in FIG. 3B is equivalent to the finished product in that drawing process.
- after acquiring the source image, the electronic device generates a grayscale image based on the source image and performs Gaussian blur on the grayscale image to obtain a Gaussian blurred image. The electronic device then generates a first sketch image using the Gaussian blurred image and the grayscale image.
- the electronic device can obtain the first sketch image by means of grayscale processing, Gaussian blurring, and the like, which retains details to a large extent, makes the first sketch image realistic, requires no deep learning process, and is highly efficient.
- after acquiring the source image, the electronic device performs noise reduction on it. For example, the electronic device performs median filtering on the source image to remove possible noise. The electronic device can also perform noise reduction by means of mean filtering, adaptive Wiener filtering, and the like.
- the present disclosure does not limit the specific manner of noise reduction processing.
- the electronic device suppresses or eliminates noise in the source image by performing noise reduction processing on the source image, thereby achieving the purpose of improving the quality of the source image.
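A minimal sketch of the median filtering step on a single-channel image (pure Python, illustrative only; replicate padding at the borders is an assumption the disclosure does not specify):

```python
def median_filter(img, k=3):
    """Median filter over a k x k window; img is a 2-D list of ints.
    Border pixels are clamped to the edge (replicate padding)."""
    h, w = len(img), len(img[0])
    r = k // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            window = []
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    window.append(img[yy][xx])
            window.sort()
            out[y][x] = window[len(window) // 2]
    return out

noisy = [[10, 10, 10],
         [10, 255, 10],  # a single "salt" pixel
         [10, 10, 10]]
print(median_filter(noisy)[1][1])  # 10 - the outlier is removed
```

The median is robust to isolated outliers, which is why it suits salt-and-pepper noise better than a mean filter.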
- in the process of generating the first sketch image based on the Gaussian blurred image and the grayscale image, the electronic device determines the pixel value of each pixel in the first sketch image according to the pixel values of the corresponding pixels in the grayscale image and the Gaussian blurred image. The grayscale image, the Gaussian blurred image, and the first sketch image have the same size, and their pixels are in one-to-one correspondence. The first sketch image is then generated from the determined pixel values of its pixels.
- the size of the source image, the grayscale image, the Gaussian blurred image, and the first sketch image are the same, and their pixels are in one-to-one correspondence. Therefore, the electronic device can determine the pixel value of the corresponding pixel in the sketch image from the pixel values of the pixels in the grayscale image and the Gaussian blurred image, and thereby obtain the first sketch image. For example, after the Gaussian blurred image is generated, the electronic device extracts the effect using the color-dodge mode, that is, using the following formula (1):
- image_target = (image_gray / image_blur1) × 255    Formula (1)
- the dodge based on different Gaussian kernels can obtain the effect of the first sketch image at different levels.
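Formula (1), the color-dodge extraction, can be sketched per pixel as follows. The zero-division guard and the clamp to 255 are assumptions added for safety; the text does not address them:

```python
def dodge(gray, blur):
    """Color-dodge from formula (1): target = gray / blur * 255,
    clamped to 255, with a guard for blur == 0 (an assumption)."""
    h, w = len(gray), len(gray[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            b = blur[y][x]
            v = 255.0 if b == 0 else gray[y][x] / b * 255.0
            out[y][x] = int(min(v, 255))
    return out

gray = [[100, 200]]
blur = [[200, 200]]
print(dodge(gray, blur))  # [[127, 255]]
```

Where the blurred value is close to the grayscale value (smooth regions), the ratio approaches 1 and the result is near white; near edges the ratio drops, leaving dark sketch lines.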
- if the source image includes a person's hair region, the color of the hair region in the first sketch image obtained in step 104 is relatively bright and white. In practice, however, the hair in such a sketch should be largely black. Therefore, the first sketch image needs to be corrected to obtain a second sketch image.
- the electronic device converts the source image into a hue-saturation-value (HSV) map. The electronic device then corrects the first sketch image according to the saturation of each pixel in the HSV map.
- the electronic device converts the source image from the RGB mode to the HSV mode, thereby obtaining an HSV image, and according to the HSV image, the saturation of each pixel in the HSV image can be obtained, and the saturation is also called an S channel value.
- the electronic device determines the highlighted area, that is, the area to be corrected, from the first sketch image according to the saturation of each pixel in the HSV image. For example, the electronic device determines a saturation threshold, and if the saturation of a pixel in the HSV image exceeds this preset saturation threshold, the pixel is regarded as belonging to the highlighted area. The highlighted areas in the first sketch image are then corrected to obtain the second sketch image.
- the electronic device normalizes the saturation of each pixel in the HSV map to 0-1.
- the highlight area to be corrected is determined from the first sketch image according to the saturation of each pixel in the normalized HSV map. For example, if the source image is a 640 × 480 image and, after conversion to an HSV image, the normalized saturation of 20,000 pixels in the HSV image exceeds the preset threshold, then the 20,000 corresponding pixels are determined in the first sketch image and their color is darkened, thereby correcting the hair area included in the first sketch image.
- the electronic device determines and corrects the pixels to be corrected from the first sketch image according to the saturation value of each pixel in the HSV map of the source image, thereby improving the quality of the first sketch image.
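A hedged sketch of selecting highlight pixels by normalized HSV saturation, using Python's standard `colorsys` for the RGB-to-HSV conversion (the 0.5 threshold follows the example in the text; the function name is illustrative):

```python
import colorsys

def highlight_mask(rgb_img, s_thresh=0.5):
    """Mark pixels whose normalized HSV saturation reaches s_thresh.
    rgb_img is a 2-D list of (R, G, B) tuples in 0-255."""
    mask = []
    for row in rgb_img:
        mask_row = []
        for (r, g, b) in row:
            _, s, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
            mask_row.append(s >= s_thresh)
        mask.append(mask_row)
    return mask

img = [[(255, 0, 0), (128, 120, 120)]]  # saturated red vs. near-gray
print(highlight_mask(img))  # [[True, False]]
```

Because `colorsys` already returns saturation in [0, 1], no separate normalization step is needed in this sketch.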
- when the electronic device determines the highlighted area from the first sketch image according to the saturation of the pixels in the HSV image, the electronic device determines from the HSV image the pixels whose normalized saturation exceeds the preset threshold, obtaining a first pixel set.
- the electronic device then determines a second pixel set from the first sketch image according to the first pixel set; the pixels in the first pixel set and the second pixel set are in one-to-one correspondence, the corresponding pixels have the same positions, and the second pixel set forms the highlighted area.
- the electronic device performs weighted fusion on the pixels in the second pixel set and the corresponding pixels in the third pixel set to correct the first sketch image.
- the normalized saturation of each pixel of the HSV image lies in the range 0-1.
- the electronic device can determine the pixels whose normalized saturation is greater than or equal to 0.5 from the HSV map, and use these pixels as the first pixel set.
- the electronic device can determine the second pixel set from the first sketch image, and determine the third pixel set from the grayscale image, and the pixels in the three pixel sets are in one-to-one correspondence.
- the one-to-one correspondence means that the position of pixel a in the HSV image is the same as the position of pixel b in the first sketch image, and the same as the position of pixel c in the grayscale image.
- after determining the pixels to be corrected, the electronic device performs weighted fusion of the pixels in the first sketch image with the corresponding pixels in the grayscale image, so as to correct the first sketch image.
- when the electronic device performs weighted fusion of the pixels in the second pixel set with the corresponding pixels in the third pixel set to correct the first sketch image, the electronic device first determines, for each first pixel in the first pixel set, its normalized saturation as a first weight, and determines a second weight according to the first weight, where the second weight is the difference between 1 and the first weight.
- the electronic device determines the fourth pixel after weighted fusion according to the pixels in the second pixel set, the pixels in the third pixel set, the first weight, and the second weight.
- the electronic device replaces the second pixel in the first sketch image with the fourth pixel to obtain a second sketch image.
- the pixels in the first pixel set, the second pixel set, and the third pixel set are in one-to-one correspondence; a triplet of corresponding pixels is referred to as the first pixel, the second pixel, and the third pixel. After determining the first pixel set, the electronic device uses the normalized saturation of each first pixel as the first weight of the corresponding second pixel in the second pixel set, and uses the difference between 1 and the first weight as the second weight, which is the weight of the corresponding third pixel in the third pixel set. The electronic device then performs weighted fusion of the second pixel and the third pixel according to the first weight and the second weight to obtain a fourth pixel.
- the normalized saturation of the first pixel is 0.6, which is greater than the preset threshold of 0.5.
- alternatively, the normalized saturation of each first pixel in the first pixel set is used as the first weight of the corresponding third pixel in the third pixel set, and the difference between 1 and the first weight is used as the second weight, which is the weight of the corresponding second pixel in the second pixel set.
- the electronic device performs weighted fusion on the second pixel and the third pixel according to the first weight and the second weight, so as to obtain a fourth pixel.
- the normalized saturation of the first pixel is 0.6, which is greater than the preset threshold of 0.5.
- after determining the corresponding fourth pixel for each pixel in the second pixel set, the electronic device replaces each second pixel in the first sketch image with the corresponding fourth pixel, thereby obtaining the second sketch image.
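The weighted fusion of a second (sketch) pixel and a third (grayscale) pixel can be sketched as below. The text describes both weight assignments; this sketch uses the variant in which the grayscale pixel carries the saturation weight `s`, so strongly saturated highlight pixels such as hair are pulled toward the darker grayscale value:

```python
def fuse(s, sketch_px, gray_px):
    """Fourth pixel = s * gray + (1 - s) * sketch. Swapping the two
    weights gives the other variant described in the text."""
    return s * gray_px + (1 - s) * sketch_px

# A hair highlight: bright in the sketch (240), dark in grayscale (60),
# normalized saturation 0.6 (above the 0.5 threshold in the example).
print(fuse(0.6, 240, 60))  # 132.0
```

With `s = 0` the sketch pixel is kept unchanged, so unsaturated regions are untouched by the correction.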
- FIG. 4 is a schematic diagram of a process of correcting a first sketch image based on saturation in an image processing method provided by an embodiment of the present disclosure.
- in FIG. 4, from left to right are the source image, the first sketch image, and the second sketch image.
- the whitened hair in the first sketch image is not a normal sketch effect; after correction, the hair-highlight whitening problem is improved.
- the source image in FIG. 4 may also be a color image, which is not limited by the embodiment of the present disclosure.
- the electronic device determines the weights related to the weighted fusion according to the normalized saturation in the HSV image, so as to achieve the purpose of accurately correcting the first sketch image.
- in the process of correcting the first sketch image according to the normalized saturation of each pixel in the HSV image, only highlighted areas such as hair can be corrected; that is, this improves local areas but cannot globally correct the entire image. Therefore, after completing the above correction, the electronic device also adds shading, also called adding texture, to the corrected black-and-white pencil drawing, so that the sketch image looks more like a hand-painted drawing with pencil texture. In addition, the above correction sometimes cannot completely resolve local highlighting, so after the correction the electronic device further corrects the second sketch image by adding shading. In the process of adding shading, the electronic device fuses a preset texture map with the corrected pencil sketch according to the pixel value of each pixel in the grayscale image.
- the electronic device determines a grayscale threshold, determines from the grayscale image the pixels whose grayscale values exceed the threshold, determines the corresponding pixels in the second sketch image, and adds a lighter texture to the area formed by these pixels and a darker texture to the area formed by the remaining pixels.
- the electronic device normalizes the pixel value of each pixel in the grayscale image to 0-1, sets a threshold, and divides the pixels in the grayscale image into multiple sets according to the threshold; the pixels in different sets have corresponding pixels in the first corrected sketch image.
- the electronic device fuses the texture map with the pencil sketch after the first correction, so as to incorporate the pencil-drawing texture into it.
- the electronic device determines a weight according to the normalized pixel value in the grayscale image, and blends the pencil-drawing texture into the first-corrected pencil sketch according to the weight.
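Formula (2) itself is not reproduced in this excerpt, so the exact blend is unknown; the sketch below only illustrates the described idea of weighting the texture by the normalized grayscale value (the function name and the linear blend form are assumptions):

```python
def add_texture(sketch_px, texture_px, gray_px):
    """Blend a texture pixel into the sketch pixel, weighted by the
    normalized grayscale value: bright regions keep the sketch, dark
    regions take on more texture. Illustrative only - the actual
    formula (2) may differ."""
    w = gray_px / 255.0
    return w * sketch_px + (1 - w) * texture_px

print(add_texture(250, 180, 255))  # 250.0 - bright region keeps the sketch
print(add_texture(40, 180, 0))     # 180.0 - dark region takes the texture
```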
- FIG. 5 is a schematic diagram of incorporating textures in an image processing method provided by an embodiment of the present disclosure.
- the hat and face are both white, and the hair is darker in comparison.
- the hair, hat, and face are all brightly colored in the first sketch image after style transfer.
- when the electronic device performs the correction based on the saturation of the HSV map, the pixels in the second pixel set, that is, the pixels of the hair region, are determined from the first sketch image. After the first correction, the hair area is noticeably darker.
- the electronic device fuses the texture map and the second sketch image, thereby incorporating the pencil-drawn texture into the second sketch image.
- the electronic device adds texture to the second sketch image by adding texture, so as to realize the purpose of incorporating the pencil drawing texture into the second sketch image and the purpose of further correcting the second sketch image.
- the preset texture map includes a first texture map and a second texture map; the texture direction of the first texture map is perpendicular to that of the second texture map, and the texture color of the first texture map is darker than the texture color of the second texture map.
- FIG. 6 is a schematic diagram of texture maps in an image processing method provided by an embodiment of the present disclosure. Referring to FIG. 6, the left side is the first texture map and the right side is the second texture map; the color of the first texture map is dark and the color of the second texture map is light. The texture directions of the first texture map and the second texture map are as shown by the arrows in the figure; the two directions are perpendicular (orthogonal) to each other.
- the electronic device performs texture fusion according to the following formula (2):
- where Result represents the normalized pixel value (the pixel value divided by 255) of a pixel in the second sketch image after texture fusion
- Sketch represents the normalized pixel value of the corresponding pixel in the second sketch image before fusion
- Mask1 represents the normalized pixel value of the corresponding pixel in the first texture map
- Mask2 represents the normalized pixel value of the corresponding pixel in the second texture map.
- the electronic device reasonably sets three thresholds, and fuses the second sketch image, the first texture map and the second texture map according to the normalized grayscale value of each pixel in the grayscale map, so as to incorporate the pencil-drawing texture and make the lines of the second sketch image smoother.
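The texture-fusion step above can be sketched as follows. Formula (2) itself is not reproduced in this excerpt, so the multiplicative blend and the threshold values below are illustrative assumptions (the text mentions three thresholds; two are used here for brevity). The text only states that the fusion is driven by the normalized grayscale map and that dark regions receive the darker texture (Mask1) while bright regions receive the lighter one (Mask2).

```python
import numpy as np

def add_texture(sketch, gray, mask1, mask2, t_dark=0.3, t_light=0.7):
    """All inputs are 2-D arrays normalized to [0, 1] (pixel value / 255).

    t_dark / t_light and the multiplicative blend are assumptions standing
    in for formula (2); mask1 is the darker texture map, mask2 the lighter.
    """
    result = sketch.copy()
    dark = gray < t_dark             # darker regions: overlay the dark texture
    light = gray > t_light           # brighter regions: overlay the light texture
    result[dark] = sketch[dark] * mask1[dark]
    result[light] = sketch[light] * mask2[light]
    return result                    # still normalized; multiply by 255 to display
```

Mid-tone pixels are left untouched in this sketch, which keeps the lines of the sketch image intact while the highlight and shadow regions pick up the pencil grain.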
- a third sketch image, that is, a color pencil sketch, can be further obtained.
- the electronic device determines the coloring weight of each pixel in the first sketch image according to the pixel value of that pixel, and colorizes the first sketch image according to the pixel value of each pixel in the first sketch image, the coloring weight, and the pixel value of the corresponding pixel in the source image, to obtain a third sketch image.
- the electronic device uses the first sketch image and the source image as materials, and obtains a mask by designing a piecewise function.
- the source image and the first sketch image are fused pixel by pixel based on the mask, so that the first sketch image is colored to obtain the third sketch image.
- the electronic device determines the piecewise function according to the following formula (3):
- P[i][j] represents the pixel value of a pixel in the black and white pencil drawing
- the value range is an integer from 0 to 255
- i represents the abscissa of the pixel in the black and white pencil drawing
- j represents the vertical coordinate of the pixel in the black and white pencil drawing
- the parameter thresh is 128, for example.
- the above Mask value is the weight.
- the electronic device performs pixel-by-pixel fusion on the source image and the first sketch image according to formula (4):
- image_color = image_target × (1 − Mask) + image_src × Mask    Formula (4).
- FIG. 7 is a schematic diagram of an effect curve of a fusion effect in an image processing method provided by an embodiment of the present disclosure.
- the horizontal axis is the pixel value of the pixel in the first sketch image, and the value range is 0-255.
- the vertical axis is the value of the weight in the fusion process, and the value range is 0-1.
- the effect curve is a three-segment piecewise function that is continuous as a whole. From the perspective of the fusion effect: when the pixel value of the pixel in the first sketch image is smaller than thresh, the weight of the pixel in the first sketch image is larger, close to 1, while the corresponding pixel in the source image is weighted less; in this segment the weight decays slowly.
- when the pixel value is between 128 and 180, the weight of the pixel in the first sketch image remains unchanged at 0.9, as shown in the 128-180 portion of the figure. After that, when the pixel value of the pixel in the first sketch image is greater than 180, the weight decreases sharply and decays rapidly to 0. In this way, the coloring effect of the hair, eyeballs and other parts is well maintained, while white highlight areas are not affected by the source image.
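The effect curve and formula (4) can be sketched together as below. Formula (3) is not reproduced in this excerpt, so the exact segment shapes (a slow linear decay toward 0.9 below thresh, a flat 0.9 segment for 128-180, and a sharp linear decay to 0 above 180) are assumptions chosen only to match the described curve; which operand of formula (4) receives the mask is taken literally from the formula as printed.

```python
import numpy as np

def coloring_mask(p, thresh=128):
    """Piecewise weight over sketch pixel values in [0, 255].

    Segment shapes are assumptions standing in for formula (3):
    ~1 below thresh (slow decay), flat 0.9 on [128, 180],
    then a sharp decay to 0 at 255. The function is continuous.
    """
    p = np.asarray(p, dtype=np.float64)
    return np.where(
        p < thresh, 1.0 - 0.1 * p / thresh,                    # ~1, decaying slowly to 0.9
        np.where(p <= 180, 0.9,                                # flat segment
                 np.maximum(0.9 * (255.0 - p) / 75.0, 0.0)))   # sharp decay to 0

def colorize(sketch, src, thresh=128):
    """Formula (4): image_color = image_target*(1 - Mask) + image_src*Mask."""
    mask = coloring_mask(sketch, thresh)
    return sketch * (1.0 - mask) + src * mask
```

With this reading, dark sketch pixels (hair, eyeballs) take their color mostly from the source image, while near-white sketch pixels keep the sketch value, matching the described coloring effect.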
- the coloring of the first sketch image is used as an example for explanation.
- the embodiments of the present disclosure are not limited to this: the second sketch image may be colorized, or the second sketch image after texture has been added may be colorized.
- FIG. 8A is a schematic diagram of a coloring process in an image processing method provided by an embodiment of the present disclosure. Referring to FIG. 8A , from left to right are a source image (color), a first sketch image and a third sketch image.
- FIG. 8B is a schematic diagram of a coloring process in an image processing method provided by an embodiment of the present disclosure. Referring to FIG. 8B, from left to right are a source image (color), a first sketch image, a second sketch image, and a third sketch image.
- the above-mentioned image processing method can also be modified into another method.
- please refer to FIG. 9.
- FIG. 9 is a flowchart of an image processing method provided by an embodiment of the present application.
- the execution subject of this embodiment is an electronic device, and the electronic device is, for example, the terminal device or the server in the above-mentioned FIG. 1 .
- This embodiment includes:
- the electronic device performs RGB channel separation on the source image, splitting it into three channels, represented respectively as the R channel image (img_r), the G channel image (img_g), and the B channel image (img_b).
- the electronic device determines the Gaussian kernel corresponding to each of the R, G, and B channels according to Gaussian convolution, and uses the kernels to Gaussian-blur the R channel image (img_r) to obtain the R channel Gaussian blur map (r_blur), the G channel image to obtain the G channel Gaussian blur map (g_blur), and the B channel image to obtain the B channel Gaussian blur map (b_blur).
- the R pencil sketch map can also be represented as r_target
- the G pencil sketch map can also be represented as g_target
- the B pencil sketch map can also be represented as b_target.
- the electronic device performs channel fusion on b_target, g_target, and r_target to obtain a final result.
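The per-channel pipeline above (split, blur each channel, derive r/g/b_target, merge) can be sketched as follows. The per-channel sketch formula of the patent (formula (1)) lies outside this excerpt, so the color-dodge blend `ch * 255 / (256 - blur)` used here is an assumption — a common pencil-sketch construction — and the kernel size and sigma are illustrative; only NumPy is used to keep the sketch self-contained.

```python
import numpy as np

def gaussian_blur(ch, sigma=2.0, radius=4):
    """Separable Gaussian blur with reflected borders (minimal NumPy version)."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    k /= k.sum()
    pad = np.pad(ch, radius, mode="reflect")
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, pad)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, tmp)

def dodge(ch, blur):
    """Color-dodge style blend (an assumed stand-in for formula (1));
    256 in the denominator avoids division by zero at blur == 255."""
    return np.clip(ch.astype(np.float64) * 255.0 / (256.0 - blur), 0, 255)

def color_pencil_sketch(rgb):
    """Split RGB, blur and dodge each channel (r/g/b_target), then merge."""
    targets = [dodge(rgb[..., c], gaussian_blur(rgb[..., c].astype(np.float64)))
               for c in range(3)]
    return np.dstack(targets)
```

Because each channel is processed independently, the merged result is already a color pencil sketch, which is the flexibility the text points out.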
- by splitting the source image into RGB channels, the electronic device can directly convert a color source image into a color pencil sketch, with high flexibility.
- FIG. 10 is a structural block diagram of an image processing apparatus according to an embodiment of the present disclosure.
- the image processing apparatus 100 includes: an acquisition module 11 , a grayscale module 12 , a processing module 13 and a generation module 14 .
- an acquisition module 11 for acquiring a source image
- a grayscale module 12 configured to generate a grayscale image based on the source image
- a processing module 13 configured to perform Gaussian blur on the grayscale image to obtain a Gaussian blurred image
- the generating module 14 is configured to generate a first sketch image corresponding to the source image based on the Gaussian blur map and the grayscale map.
- FIG. 11 is a structural block diagram of another image processing apparatus provided by an embodiment of the present disclosure.
- the image processing apparatus 100 provided in this example further includes:
- the correction module 15 is configured to, after the generation module 14 generates the first sketch image corresponding to the source image based on the Gaussian blur map and the grayscale map, convert the source image into a hue-saturation-value (HSV) map, determine a highlighted area from the first sketch image according to the saturation of the pixels in the HSV map, and correct the highlighted area to obtain the second sketch image.
- the correction module 15 is configured to, when determining the highlighted area from the first sketch image according to the saturation of the pixels in the HSV map, determine the pixels in the HSV map whose normalized saturation exceeds a preset threshold to obtain a first pixel set, and determine a second pixel set from the first sketch image according to the first pixel set, where the pixels in the first pixel set and the second pixel set are in one-to-one correspondence and the second pixel set forms the highlighted area. When correcting the highlighted area to obtain the second sketch image, the correction module 15 is configured to determine a third pixel set from the grayscale map according to the first pixel set, where the pixels in the first pixel set and the third pixel set are in one-to-one correspondence, and to perform weighted fusion on the pixels in the second pixel set and the corresponding pixels in the third pixel set to correct the first sketch image.
- when the correction module 15 performs weighted fusion on the pixels in the second pixel set and the corresponding pixels in the third pixel set to correct the first sketch image to obtain the second sketch image, it is configured to determine a first weight according to the normalized saturation of the first pixel in the first pixel set, where the first weight is positively correlated with the normalized saturation of the first pixel; determine a second weight according to the first weight, where the second weight is the difference between the preset value and the first weight; determine the fourth pixel after weighted fusion according to the second pixel in the second pixel set, the third pixel in the third pixel set, the first weight, and the second weight; and replace the second pixel in the first sketch image with the fourth pixel to obtain the second sketch image.
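The highlight-correction flow of the correction module can be sketched as below. The text specifies only that the first weight grows with the normalized saturation and that the second weight is the preset value minus the first; the preset value of 1, the proportional weight, the saturation threshold, and the choice to attach the first weight to the grayscale pixel (pulling over-bright sketch regions toward the grayscale map) are all assumptions.

```python
import numpy as np

def correct_highlights(sketch, gray, rgb, sat_thresh=0.3):
    """Blend over-bright sketch regions back toward the grayscale map.

    sketch, gray: float arrays in [0, 255]; rgb: float array in [0, 255].
    sat_thresh and the linear weight are illustrative assumptions.
    """
    rgbn = rgb / 255.0
    mx, mn = rgbn.max(axis=-1), rgbn.min(axis=-1)
    sat = np.where(mx > 0, (mx - mn) / np.maximum(mx, 1e-6), 0.0)  # HSV saturation
    w1 = np.where(sat > sat_thresh, sat, 0.0)  # first weight: grows with saturation
    w2 = 1.0 - w1                              # second weight: preset value (1) - w1
    return sketch * w2 + gray * w1             # weighted fusion -> second sketch image
```

Pixels whose normalized saturation stays below the threshold keep their sketch value unchanged, so the correction stays local, as the description requires.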
- the above-mentioned image processing apparatus 100 further includes:
- the fusion module 16 is configured to fuse the preset texture map and the second sketch image according to the pixel values of the pixels in the grayscale image after the correction module 15 corrects the highlighted area to obtain the second sketch image .
- the preset texture map includes a first texture map and a second texture map; the texture of the first texture map and the texture of the second texture map are perpendicular to each other, and the texture color of the first texture map is darker than the texture color of the second texture map.
- the above-mentioned image processing apparatus 100 further includes: a coloring module 17, configured to, after the generating module 14 generates the first sketch image based on the Gaussian blur map and the grayscale map, determine the coloring weight of the corresponding pixel in the first sketch image according to the pixel value of each pixel in the first sketch image, and colorize the first sketch image according to the pixel value and coloring weight of each pixel in the first sketch image and the pixel value of the corresponding pixel in the source image, to obtain a third sketch image.
- the generating module 14 is configured to determine the pixel value of each pixel in the first sketch image according to the pixel value of the corresponding pixel in the grayscale map and the pixel value of the corresponding pixel in the Gaussian blur map, where the grayscale map, the Gaussian blur map and the first sketch image have the same size and their pixels are in one-to-one correspondence, and to generate the first sketch image according to the determined pixel values of the pixels in the first sketch image.
- the above-mentioned grayscale module 12 is configured to perform noise reduction processing on the source image, and convert the noise reduction processed source image into the grayscale image.
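The grayscale module's two steps (noise reduction, then grayscale conversion) can be sketched as below. The patent does not specify the denoising method or the grayscale weights, so the 3x3 mean filter and the BT.601 luma coefficients are assumptions chosen for illustration.

```python
import numpy as np

def denoise(img):
    """3x3 mean filter as a minimal noise-reduction stand-in (border pixels kept)."""
    out = img.astype(np.float64).copy()
    h, w = img.shape[0], img.shape[1]
    out[1:-1, 1:-1] = sum(
        img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx].astype(np.float64)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
    ) / 9.0
    return out

def to_grayscale(rgb):
    """BT.601 luma: 0.299 R + 0.587 G + 0.114 B (an assumed, standard choice)."""
    return rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114
```

The grayscale map produced here is the input to the Gaussian blur and sketch-generation steps described earlier.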
- FIG. 12 is a schematic structural diagram of an electronic device for implementing an embodiment of the present disclosure.
- the electronic device 200 may be a terminal device or a server.
- the terminal equipment may include, but is not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, personal digital assistants (PDAs), tablet computers (PADs), portable multimedia players (PMPs), and in-vehicle terminals (such as in-vehicle navigation terminals), as well as fixed terminals such as digital TVs and desktop computers.
- the electronic device shown in FIG. 12 is only an example, and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
- the electronic device 200 may include a processing device (such as a central processing unit, a graphics processor, etc.) 201, which may execute various appropriate actions and processes according to a program stored in a read-only memory (ROM) 202 or a program loaded from a storage device 208 into a random access memory (RAM) 203.
- in the RAM 203, various programs and data required for the operation of the electronic device 200 are also stored.
- the processing device 201, the ROM 202, and the RAM 203 are connected to each other through a bus 204.
- An input/output (I/O) interface 205 is also connected to the bus 204 .
- the following devices can be connected to the I/O interface 205: input devices 206 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 207 including, for example, a liquid crystal display (LCD), speaker, vibrator, etc.; storage devices 208 including, for example, magnetic tape, hard disk, etc.; and a communication device 209.
- the communication device 209 may allow the electronic device 200 to communicate wirelessly or by wire with other devices to exchange data. While FIG. 12 shows the electronic device 200 having various means, it should be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
- embodiments of the present disclosure include a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for performing the method illustrated in the flowchart.
- the computer program may be downloaded and installed from the network via the communication device 209, or from the storage device 208, or from the ROM 202. When the computer program is executed by the processing device 201, the above-mentioned functions defined in the methods of the embodiments of the present disclosure are executed.
- Embodiments of the present disclosure include a computer program containing program code for performing the method shown in the flowchart.
- the computer-readable medium mentioned above in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the above two.
- a computer-readable storage medium can be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the above. More specific examples of computer-readable storage media may include, but are not limited to, electrical connections with one or more wires, portable computer disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), fiber optics, portable compact disk read-only memory (CD-ROM), optical storage devices, magnetic storage devices, or any suitable combination of the foregoing.
- a computer-readable storage medium may be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
- a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with computer-readable program code embodied thereon. Such propagated data signals may take a variety of forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
- a computer-readable signal medium can also be any computer-readable medium other than a computer-readable storage medium that can transmit, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device .
- Program code embodied on a computer readable medium may be transmitted using any suitable medium including, but not limited to, electrical wire, optical fiber cable, RF (radio frequency), etc., or any suitable combination of the foregoing.
- the above-mentioned computer-readable medium may be included in the above-mentioned electronic device; or may exist alone without being assembled into the electronic device.
- the aforementioned computer-readable medium carries one or more programs, and when the aforementioned one or more programs are executed by the electronic device, causes the electronic device to execute the methods shown in the foregoing embodiments.
- Computer program code for carrying out the operations of the present disclosure may be written in one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
- the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
- the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or can be connected to an external computer (for example, through the Internet using an Internet service provider).
- each block in the flowchart or block diagrams may represent a module, segment, or portion of code that contains one or more executable instructions for implementing the specified logical functions.
- the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
- each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by dedicated hardware-based systems that perform the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
- the units involved in the embodiments of the present disclosure may be implemented in a software manner, and may also be implemented in a hardware manner.
- the name of the unit does not constitute a limitation on the unit itself under certain circumstances, for example, the first obtaining unit may also be described as "a unit for obtaining at least two Internet Protocol addresses".
- exemplary types of hardware logic components include: Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on Chips (SOCs), Complex Programmable Logical Devices (CPLDs) and more.
- a machine-readable medium may be a tangible medium that may contain or store a program for use by or in connection with the instruction execution system, apparatus or device.
- the machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium.
- Machine-readable media may include, but are not limited to, electronic, magnetic, optical, electromagnetic, infrared, or semiconductor systems, devices, or devices, or any suitable combination of the foregoing.
- machine-readable storage media would include one or more wire-based electrical connections, portable computer disks, hard disks, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM or flash memory), fiber optics, compact disk read only memory (CD-ROM), optical storage, magnetic storage, or any suitable combination of the foregoing.
- an image processing method is provided, including: acquiring a source image; generating a grayscale map based on the source image; performing Gaussian blur on the grayscale map to obtain a Gaussian blur map; and generating, based on the Gaussian blur map and the grayscale map, a first sketch image corresponding to the source image.
- the method further includes: converting the source image into a hue-saturation-value (HSV) map; determining a highlighted area from the first sketch image according to the saturation of the pixels in the HSV map; and correcting the highlighted area to obtain a second sketch image.
- determining a highlighted area from the first sketch image according to the saturation of pixels in the HSV map includes: determining the pixels in the HSV map whose normalized saturation exceeds a preset threshold to obtain a first pixel set; and determining a second pixel set from the first sketch image according to the first pixel set, where the pixels in the first pixel set and the second pixel set are in one-to-one correspondence and the second pixel set forms the highlighted area.
- correcting the highlighted area to obtain the second sketch image includes: determining a third pixel set from the grayscale map according to the first pixel set, where the pixels in the first pixel set and the third pixel set are in one-to-one correspondence; and performing weighted fusion on the pixels in the second pixel set and the corresponding pixels in the third pixel set to correct the first sketch image to obtain the second sketch image.
- performing weighted fusion on the pixels in the second pixel set and the corresponding pixels in the third pixel set to correct the first sketch image to obtain the second sketch image includes: determining a first weight according to the normalized saturation of the first pixel in the first pixel set, where the first weight is positively correlated with the normalized saturation of the first pixel; determining a second weight according to the first weight, where the second weight is the difference between the preset value and the first weight; determining the fourth pixel after weighted fusion according to the second pixel in the second pixel set, the third pixel in the third pixel set, the first weight, and the second weight; and replacing the second pixel in the first sketch image with the fourth pixel to obtain the second sketch image.
- the method further includes: fusing a preset texture map and the second sketch image according to the pixel values of the pixels in the grayscale map.
- the preset texture map includes a first texture map and a second texture map; the texture of the first texture map and the texture of the second texture map are perpendicular to each other, and the texture color of the first texture map is darker than the texture color of the second texture map.
- the method further includes: determining the coloring weight of the corresponding pixel in the first sketch image according to the pixel value of each pixel in the first sketch image; and coloring the first sketch image according to the pixel value of each pixel in the first sketch image, the coloring weight, and the pixel value of the corresponding pixel in the source image, to obtain a third sketch image.
- generating the first sketch image based on the Gaussian blur map and the grayscale map includes: determining the pixel value of each pixel in the first sketch image according to the pixel value of the corresponding pixel in the grayscale map and the pixel value of the corresponding pixel in the Gaussian blur map, where the grayscale map, the Gaussian blur map and the first sketch image have the same size and their pixels are in one-to-one correspondence; and generating the first sketch image according to the determined pixel values of the pixels in the first sketch image.
- converting the source image into a grayscale map includes: performing noise reduction processing on the source image; and converting the noise-reduction-processed source image into the grayscale map.
- an image processing apparatus is provided, including:
- an acquisition module for acquiring a source image
- a grayscale module for generating a grayscale image based on the source image
- a processing module for performing Gaussian blur on the grayscale image to obtain a Gaussian blurred image
- a generating module configured to generate a first sketch image corresponding to the source image based on the Gaussian blur map and the grayscale map.
- the above-mentioned apparatus further includes: a correction module, configured to, after the generation module generates the first sketch image corresponding to the source image based on the Gaussian blur map and the grayscale map, convert the source image into a hue-saturation-value (HSV) map, determine a highlighted area from the first sketch image according to the saturation of the pixels in the HSV map, and correct the highlighted area to obtain the second sketch image.
- the correction module is configured to, when determining the highlighted area from the first sketch image according to the saturation of the pixels in the HSV map, determine the pixels in the HSV map whose normalized saturation exceeds a preset threshold to obtain a first pixel set, and determine a second pixel set from the first sketch image according to the first pixel set, where the pixels in the first pixel set and the second pixel set are in one-to-one correspondence and the second pixel set forms the highlighted area;
- when the correction module corrects the highlighted area to obtain the second sketch image, it is configured to determine a third pixel set from the grayscale map according to the first pixel set, where the pixels in the first pixel set and the third pixel set are in one-to-one correspondence, and to perform weighted fusion on the pixels in the second pixel set and the corresponding pixels in the third pixel set to correct the first sketch image to obtain the second sketch image.
- when the correction module performs weighted fusion on the pixels in the second pixel set and the corresponding pixels in the third pixel set to correct the first sketch image to obtain the second sketch image, it is configured to determine a first weight according to the normalized saturation of the first pixel in the first pixel set, where the first weight is positively correlated with the normalized saturation of the first pixel; determine a second weight according to the first weight, where the second weight is the difference between the preset value and the first weight; determine the fourth pixel after weighted fusion according to the second pixel in the second pixel set, the third pixel in the third pixel set, the first weight, and the second weight; and replace the second pixel in the first sketch image with the fourth pixel to obtain the second sketch image.
- the above-mentioned apparatus further includes: a fusion module, configured to, after the correction module corrects the highlighted area to obtain the second sketch image, fuse the preset texture map with the second sketch image according to the pixel values of the pixels in the grayscale map.
- the preset texture map includes a first texture map and a second texture map; the texture of the first texture map and the texture of the second texture map are perpendicular to each other, and the texture color of the first texture map is darker than the texture color of the second texture map.
- the above-mentioned apparatus further includes: a coloring module, configured to, after the generating module generates the first sketch image based on the Gaussian blur map and the grayscale map, determine the coloring weight of the corresponding pixel in the first sketch image according to the pixel value of each pixel in the first sketch image, and colorize the first sketch image according to the pixel value and coloring weight of each pixel in the first sketch image and the pixel value of the corresponding pixel in the source image, to obtain a third sketch image.
- the generating module is configured to determine the pixel value of each pixel in the first sketch image according to the pixel value of the corresponding pixel in the grayscale map and the pixel value of the corresponding pixel in the Gaussian blur map, where the grayscale map, the Gaussian blur map and the first sketch image have the same size and their pixels are in one-to-one correspondence, and to generate the first sketch image according to the determined pixel values of the pixels in the first sketch image.
- the grayscale module is configured to perform noise reduction processing on the source image and convert the noise-reduced source image into the grayscale map.
- an electronic device comprising: at least one processor and a memory;
- the memory stores computer-executable instructions
- the at least one processor executes the computer-executable instructions stored in the memory to cause the at least one processor to perform the image processing method as described above.
- a computer-readable storage medium, wherein computer-executable instructions are stored in the computer-readable storage medium, and when a processor executes the computer-executable instructions, the image processing method as described above is implemented.
- a computer program product comprising a computer program, the computer program being stored in a readable storage medium; at least one processor of an electronic device reads the computer program from the readable storage medium, and the at least one processor executes the computer program to cause the electronic device to implement the image processing method as described above.
- a computer program is provided, the computer program being stored in a readable storage medium; at least one processor of an electronic device reads the computer program from the readable storage medium, and the at least one processor executes the computer program so that the electronic device implements the image processing method as described above.
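The pipeline these modules describe (grayscale conversion, Gaussian blur of the grayscale map, pixel-wise combination of the two maps) can be sketched in a few lines of NumPy. The publication does not disclose the exact combination formula, so the division step below is an assumption borrowed from the common dodge/divide pencil-sketch technique; `gaussian_kernel`, `gaussian_blur` and `to_sketch` are illustrative names, not names from the patent.

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    """1-D normalized Gaussian kernel."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x ** 2) / (2.0 * sigma ** 2))
    return k / k.sum()

def gaussian_blur(img, sigma=3.0):
    """Separable Gaussian blur with edge padding (NumPy only)."""
    r = int(3 * sigma)
    k = gaussian_kernel(sigma, r)
    padded = np.pad(img, r, mode="edge")
    # Convolve rows, then columns; "valid" mode undoes the padding.
    rows = np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 0, rows)

def to_sketch(src_rgb):
    """Source image -> grayscale map -> Gaussian blur map -> first sketch image."""
    gray = src_rgb @ np.array([0.299, 0.587, 0.114])  # BT.601 luma weights
    blur = gaussian_blur(gray)
    # Assumed combination: divide the grayscale map by its blur, so flat
    # regions come out near-white and darker-than-surroundings pixels
    # become dark strokes.
    sketch = np.clip(gray * 255.0 / (blur + 1.0), 0, 255)
    return sketch.astype(np.uint8)
```

On a uniform input this yields an almost-white page; structure appears only where a pixel differs from its blurred neighbourhood, which is the hand-drawn look the method aims for.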
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Image Processing (AREA)
Abstract
Description
Claims (14)
- An image processing method, comprising: acquiring a source image; converting the source image into a grayscale map; performing Gaussian blur on the grayscale map to obtain a Gaussian blur map; and generating, based on the Gaussian blur map and the grayscale map, a first sketch image corresponding to the source image.
- The method according to claim 1, wherein after the generating of the first sketch image based on the Gaussian blur map and the grayscale map, the method further comprises: converting the source image into a hue-saturation-value (HSV) map; determining a highlighted area from the first sketch image according to the saturation of pixels in the HSV map; and correcting the highlighted area to obtain a second sketch image.
- The method according to claim 2, wherein the determining of the highlighted area from the first sketch image according to the saturation of pixels in the HSV map comprises: determining pixels in the HSV map whose normalized saturation exceeds a preset threshold, to obtain a first pixel set; and determining a second pixel set from the first sketch image according to the first pixel set, the pixels in the first pixel set and in the second pixel set corresponding one-to-one, the second pixel set forming the highlighted area; and wherein the correcting of the highlighted area to obtain the second sketch image comprises: determining a third pixel set from the grayscale map according to the first pixel set, the pixels in the first pixel set and in the third pixel set corresponding one-to-one; and performing weighted fusion on the pixels in the second pixel set and the corresponding pixels in the third pixel set, so as to correct the first sketch image and obtain the second sketch image.
- The method according to claim 3, wherein the performing of weighted fusion on the pixels in the second pixel set and the corresponding pixels in the third pixel set, so as to correct the first sketch image and obtain the second sketch image, comprises: determining a first weight according to the normalized saturation of a first pixel in the first pixel set, the first weight being positively correlated with the normalized saturation of the first pixel; determining a second weight according to the first weight, the second weight being the difference between a preset value and the first weight; determining a weighted-fused fourth pixel according to a second pixel in the second pixel set, a third pixel in the third pixel set, the first weight and the second weight; and replacing the second pixel in the first sketch image with the fourth pixel to obtain the second sketch image.
- The method according to any one of claims 2-4, wherein after the correcting of the highlighted area to obtain the second sketch image, the method further comprises: fusing a preset texture map with the second sketch image according to the pixel values of pixels in the grayscale map.
- The method according to claim 5, wherein the preset texture map includes a first texture map and a second texture map, the textures of the first texture map and the second texture map run in opposite directions and are orthogonal to each other, and the texture color of the first texture map is darker than that of the second texture map.
- The method according to any one of claims 1-4, wherein after the generating of the first sketch image based on the Gaussian blur map and the grayscale map, the method further comprises: determining the weight of each pixel in the first sketch image according to the pixel value of the pixel; determining the coloring weight of the corresponding pixel in the first sketch image according to the pixel value of each pixel in the first sketch image; and coloring the first sketch image according to the pixel value and coloring weight of each pixel in the first sketch image and the pixel value of the corresponding pixel in the source image, to obtain a third sketch image.
- The method according to any one of claims 1-4, wherein the generating of the first sketch image based on the Gaussian blur map and the grayscale map comprises: determining the pixel values of the pixels in the first sketch image according to the pixel values of the pixels in the grayscale map and the pixel values of the pixels in the Gaussian blur map, the grayscale map, the Gaussian blur map and the first sketch image having the same size and their pixels corresponding one-to-one; and generating the first sketch image according to the determined pixel values of the pixels in the first sketch image.
- The method according to any one of claims 1-4, wherein the converting of the source image into a grayscale map comprises: performing noise reduction processing on the source image; and converting the noise-reduced source image into the grayscale map.
- An image processing apparatus, comprising: an acquiring module configured to acquire a source image; a grayscale module configured to generate a grayscale map based on the source image; a processing module configured to perform Gaussian blur on the grayscale map to obtain a Gaussian blur map; and a generating module configured to generate, based on the Gaussian blur map and the grayscale map, a first sketch image corresponding to the source image.
- An electronic device, comprising: at least one processor and a memory; wherein the memory stores computer-executable instructions, and the at least one processor executes the computer-executable instructions stored in the memory, causing the at least one processor to perform the image processing method according to any one of claims 1-9.
- A computer-readable storage medium, wherein computer-executable instructions are stored in the computer-readable storage medium, and when a processor executes the computer-executable instructions, the image processing method according to any one of claims 1-9 is implemented.
- A computer program product, comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1-9.
- A computer program which, when executed by a processor, implements the method according to any one of claims 1-9.
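The saturation-based highlight correction of claims 3 and 4 can be sketched as follows. Two details are assumptions, since the claims leave them open: the preset value is taken as 1, and the first weight is taken to be the normalized saturation itself (the claims only require positive correlation).

```python
import numpy as np

def correct_highlights(sketch, gray, saturation, thresh=0.5):
    """Blend grayscale values back into highly saturated sketch pixels."""
    # First pixel set: pixels whose normalized saturation exceeds the threshold.
    mask = saturation > thresh
    # First weight: positively correlated with saturation (identity assumed).
    w1 = saturation[mask]
    # Second weight: preset value (assumed to be 1) minus the first weight.
    w2 = 1.0 - w1
    out = sketch.astype(np.float64)
    # Fourth pixel: weighted fusion of the grayscale and sketch pixels,
    # replacing the corresponding pixel of the first sketch image.
    out[mask] = w1 * gray[mask] + w2 * sketch[mask]
    return np.clip(out, 0, 255).astype(np.uint8)
```

The effect is that over-bright, strongly colored regions of the sketch are pulled back toward their grayscale values, with more correction where the source color is more saturated.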
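For the texture fusion of claims 5 and 6, the sketch below picks one of two hatch textures per pixel by the grayscale value and combines it with the second sketch image using a multiply blend. Both the per-pixel selection rule and the multiply blend are assumptions; the claims only state that the fusion depends on grayscale pixel values and that the first texture map is the darker one.

```python
import numpy as np

def fuse_texture(sketch, gray, tex_dark, tex_light, split=128):
    """Overlay one of two hatch textures on the second sketch image."""
    # Assumption: darker source regions get the darker (first) texture map.
    tex = np.where(gray < split, tex_dark, tex_light)
    # Multiply blend: white paper stays white, textured strokes darken the page.
    out = sketch.astype(np.float64) * tex / 255.0
    return np.clip(out, 0, 255).astype(np.uint8)
```

With the two texture maps drawn as orthogonal stroke patterns, dark regions end up cross-hatched more heavily than light ones, mimicking pencil shading.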
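The coloring step of claim 7 can be sketched by deriving a per-pixel weight from the sketch value and mixing in the source color. Taking the weight to be the pixel's darkness is an assumption; the claim only says the coloring weight is determined from the sketch pixel value.

```python
import numpy as np

def colorize_sketch(sketch, src_rgb):
    """Third sketch image: blend source colors into the first sketch image."""
    # Coloring weight assumed to grow with stroke darkness, so white paper
    # stays white and dark strokes pick up the source image's color.
    w = (1.0 - sketch / 255.0)[..., None]
    out = w * src_rgb + (1.0 - w) * sketch[..., None]
    return np.clip(out, 0, 255).astype(np.uint8)
```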
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023547662A JP2024505713A (ja) | 2021-02-05 | 2022-01-29 | 画像処理方法、装置、機器及び可読記憶媒体 |
EP22749186.7A EP4276733A4 (en) | 2021-02-05 | 2022-01-29 | IMAGE PROCESSING METHOD AND DEVICE AS WELL AS DEVICE AND READABLE STORAGE MEDIUM |
US18/264,402 US20240046537A1 (en) | 2021-02-05 | 2022-01-29 | Image processing method and apparatus, device and readable storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110164141.7 | 2021-02-05 | ||
CN202110164141.7A CN112819691B (zh) | 2021-02-05 | 2021-02-05 | 图像处理方法、装置、设备及可读存储介质 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022166907A1 true WO2022166907A1 (zh) | 2022-08-11 |
Family
ID=75861947
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/075130 WO2022166907A1 (zh) | 2021-02-05 | 2022-01-29 | 图像处理方法、装置、设备及可读存储介质 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240046537A1 (zh) |
EP (1) | EP4276733A4 (zh) |
JP (1) | JP2024505713A (zh) |
CN (1) | CN112819691B (zh) |
WO (1) | WO2022166907A1 (zh) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112819691B (zh) * | 2021-02-05 | 2023-06-20 | 北京字跳网络技术有限公司 | 图像处理方法、装置、设备及可读存储介质 |
CN117474820B (zh) * | 2023-10-12 | 2024-06-18 | 书行科技(北京)有限公司 | 图像处理方法、装置、电子设备及存储介质 |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101587593A (zh) * | 2009-06-19 | 2009-11-25 | 西安交通大学 | 一种基于真实图像素描风格化的方法 |
CN103021002A (zh) * | 2011-09-27 | 2013-04-03 | 康佳集团股份有限公司 | 彩色素描图像生成方法 |
CN105528765A (zh) * | 2015-12-02 | 2016-04-27 | 小米科技有限责任公司 | 处理图像的方法及装置 |
CN107170032A (zh) * | 2017-03-23 | 2017-09-15 | 南京信息工程大学 | 基于线性素描图像的力渲染方法 |
CN107864337A (zh) * | 2017-11-30 | 2018-03-30 | 广东欧珀移动通信有限公司 | 素描图像处理方法、装置及设备 |
CN108682040A (zh) * | 2018-05-21 | 2018-10-19 | 努比亚技术有限公司 | 一种素描图像生成方法、终端及计算机可读存储介质 |
CN109300099A (zh) * | 2018-08-29 | 2019-02-01 | 努比亚技术有限公司 | 一种图像处理方法、移动终端及计算机可读存储介质 |
CN109741243A (zh) * | 2018-12-27 | 2019-05-10 | 深圳云天励飞技术有限公司 | 彩色素描图像生成方法及相关产品 |
US20200320756A1 (en) * | 2019-04-02 | 2020-10-08 | Adobe Inc. | Automatic illustrator guides |
CN112819691A (zh) * | 2021-02-05 | 2021-05-18 | 北京字跳网络技术有限公司 | 图像处理方法、装置、设备及可读存储介质 |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
RU2445677C1 (ru) * | 2010-09-13 | 2012-03-20 | Корпорация "САМСУНГ ЭЛЕКТРОНИКС Ко., Лтд." | Способ черновой печати посредством преобразования растровых изображений в эскизы (варианты) |
2021
- 2021-02-05: CN application CN202110164141.7A, published as CN112819691B (Active)
2022
- 2022-01-29: JP application JP2023547662A, published as JP2024505713A (Pending)
- 2022-01-29: US application US18/264,402, published as US20240046537A1 (Pending)
- 2022-01-29: EP application EP22749186.7A, published as EP4276733A4 (Pending)
- 2022-01-29: WO application PCT/CN2022/075130, published as WO2022166907A1 (Application Filing)
Non-Patent Citations (1)
Title |
---|
See also references of EP4276733A4 |
Also Published As
Publication number | Publication date |
---|---|
CN112819691A (zh) | 2021-05-18 |
US20240046537A1 (en) | 2024-02-08 |
JP2024505713A (ja) | 2024-02-07 |
EP4276733A4 (en) | 2023-11-22 |
CN112819691B (zh) | 2023-06-20 |
EP4276733A1 (en) | 2023-11-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230401682A1 (en) | Styled image generation method, model training method, apparatus, device, and medium | |
US20190130208A1 (en) | Local tone mapping to reduce bit depth of input images to high-level computer vision tasks | |
WO2022166907A1 (zh) | 图像处理方法、装置、设备及可读存储介质 | |
CN111368685A (zh) | 关键点的识别方法、装置、可读介质和电子设备 | |
US20240095981A1 (en) | Video generation method and apparatus, device and readable storage medium | |
CN111784568A (zh) | 人脸图像处理方法、装置、电子设备及计算机可读介质 | |
US20240104810A1 (en) | Method and apparatus for processing portrait image | |
CN110070495B (zh) | 图像的处理方法、装置和电子设备 | |
CN110070499A (zh) | 图像处理方法、装置和计算机可读存储介质 | |
CN111340921B (zh) | 染色方法、装置和计算机系统及介质 | |
CN111583103B (zh) | 人脸图像处理方法、装置、电子设备及计算机存储介质 | |
WO2023143229A1 (zh) | 图像处理方法、装置、设备及存储介质 | |
CN113989717A (zh) | 视频图像处理方法、装置、电子设备及存储介质 | |
CN113610720A (zh) | 视频去噪方法及装置、计算机可读介质和电子设备 | |
CN110335195A (zh) | 车身颜色调整方法、装置、电子设备及存储介质 | |
CN112581635A (zh) | 一种通用的快速换脸方法、装置、电子设备和存储介质 | |
CN113066020A (zh) | 图像处理方法及装置、计算机可读介质和电子设备 | |
CN112419179A (zh) | 修复图像的方法、装置、设备和计算机可读介质 | |
CN116320597A (zh) | 直播图像帧处理方法、装置、设备、可读存储介质及产品 | |
CN114331823A (zh) | 图像处理方法、装置、电子设备及存储介质 | |
CN115953597B (zh) | 图像处理方法、装置、设备及介质 | |
CN110555799A (zh) | 用于处理视频的方法和装置 | |
CN111833262A (zh) | 图像降噪方法、装置及电子设备 | |
CN111402159A (zh) | 图像处理方法、装置、电子设备及计算机可读介质 | |
CN116188314A (zh) | 一种图像处理方法、装置、电子设备及存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 22749186; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | WIPO information: entry into national phase | Ref document number: 18264402; Country of ref document: US |
| WWE | WIPO information: entry into national phase | Ref document number: 2023547662; Country of ref document: JP |
| ENP | Entry into the national phase | Ref document number: 2022749186; Country of ref document: EP; Effective date: 20230810 |
| NENP | Non-entry into the national phase | Ref country code: DE |