CN117745605A - Image processing method, device, electronic equipment and storage medium - Google Patents

Image processing method, device, electronic equipment and storage medium

Info

Publication number
CN117745605A
CN117745605A
Authority
CN
China
Prior art keywords
image
color image
gray
texture
pixel point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310629034.6A
Other languages
Chinese (zh)
Inventor
汪洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiaohongshu Technology Co ltd
Original Assignee
Xiaohongshu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiaohongshu Technology Co ltd filed Critical Xiaohongshu Technology Co ltd
Priority to CN202310629034.6A priority Critical patent/CN117745605A/en
Publication of CN117745605A publication Critical patent/CN117745605A/en
Pending legal-status Critical Current

Landscapes

  • Image Processing (AREA)

Abstract

The embodiment of the application provides an image processing method, an image processing device, electronic equipment and a storage medium. The method comprises the following steps: acquiring a first texture image; performing texture sampling on the first texture image according to the texture coordinates of each pixel point in the first texture image to obtain a first color image; acquiring a target color image corresponding to the first color image, wherein the target color image comprises a special effect area, and the color values of the pixel points in the special effect area vary in accordance with a feathering rule; and fusing the first color image and the target color image to obtain a target image corresponding to the first texture image. The embodiment of the application provides a new special effect and improves the user experience.

Description

Image processing method, device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method, an image processing device, an electronic device, and a storage medium.
Background
With the development of technology, and in particular the progress of image processing technology, more and more application software, such as short-video applications, has entered users' lives and gradually enriched their leisure time. Users can record their lives as videos, photos and the like, upload them to a short-video application and share them with other users.
These applications already offer many special effects built on image algorithms and rendering techniques, and such effects attract more and more users. However, the existing special effects are limited in variety; for example, only a few transition effects are provided to users, so the set of available effects and creative materials needs continuous expansion.
How to increase the diversity of special effects and provide users with a richer set of effects is a technical problem to be solved urgently.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device, electronic equipment and a storage medium, wherein a special effect is added to a texture image to generate a target image containing a special effect area that varies in accordance with a feathering rule, which expands the set of available special effects and can provide users with more diverse effects.
In a first aspect, embodiments of the present application provide an image processing method, including:
acquiring a first texture image;
according to the texture coordinates of each pixel point in the first texture image, performing texture sampling on the first texture image to obtain a first color image;
acquiring a target color image corresponding to the first color image, wherein the target color image comprises a special effect area, and the color values of the pixel points in the special effect area vary in accordance with a feathering rule;
and fusing the first color image and the target color image to obtain a target image corresponding to the first texture image.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including an acquisition unit and a processing unit;
the acquisition unit is used for acquiring a first texture image;
the processing unit is used for performing texture sampling on the first texture image according to the texture coordinates of each pixel point in the first texture image to obtain a first color image;
acquiring a target color image corresponding to the first color image, wherein the target color image comprises a special effect area, and the color values of the pixel points in the special effect area vary in accordance with a feathering rule;
and fusing the first color image and the target color image to obtain a target image corresponding to the first texture image.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor and a memory, the processor being connected to the memory, the memory being for storing a computer program, the processor being for executing the computer program stored in the memory to cause the electronic device to perform the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program, the computer program causing a computer to perform the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform the method according to the first aspect.
The implementation of the embodiment of the application has the following beneficial effects:
It can be seen that, in the embodiment of the present application, when a user needs to add a special effect to an image, texture sampling may be performed on the first texture image based on the texture coordinates of each pixel point in the first texture image to obtain a first color image. Then, a target color image corresponding to the first texture image is generated, wherein the target color image comprises a special effect area in which the color value of each pixel point varies in accordance with a feathering rule; that is, the target color image contains a feathered special effect area. Finally, the first color image is fused with the target color image to generate the target image; in other words, the feathered special effect area of the target color image is superimposed on the first color image. A feathered special effect area is thereby added to the first texture image, providing users with a new feathering special effect, expanding the set of available special effects, and improving the user experience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that a person of ordinary skill in the art may obtain other drawings from them without inventive effort.
Fig. 1 is a schematic diagram of a special effect generation scene provided in an embodiment of the present application;
fig. 2 is a schematic flow chart of an image processing method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a synthesized target color image according to an embodiment of the present disclosure;
fig. 4 is a schematic flow chart of synthesizing a target image according to an embodiment of the present application;
FIG. 5 is a schematic diagram of blurring a first texture image according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of different special effect areas of a multi-frame texture image according to an embodiment of the present application;
fig. 7 is a functional unit block diagram of an image processing apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
The terms "first," "second," "third," and "fourth" and the like in the description and in the claims of this application and in the drawings, are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will appreciate, both explicitly and implicitly, that the embodiments described herein may be combined with other embodiments.
In order to facilitate an understanding of the embodiments of the present application, the description and explanation of the related art terms referred to in this application will first be provided.
Feathering of images: feathering is an image processing technique that makes an image gradually blur and fade from its center toward its edges, so that the transition between the image and the background looks more natural. The degree of feathering depends on the selected feathering range, i.e., the feathering distance. In general, the larger the feathering distance, the softer the edge. In linear feathering, the weight of a pixel is determined by its distance from the image edge: the closer a pixel is to the edge, the smaller its weight.
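As a minimal sketch of the linear feathering weight described above (an illustration only, not the method of this application; the function name and the use of NumPy are assumptions):

```python
import numpy as np

def linear_feather_weight(dist_to_edge: np.ndarray, feather_distance: float) -> np.ndarray:
    # The weight rises linearly from 0 at the edge to 1 once a pixel is a
    # full feathering distance inside, so the image fades out toward the edge.
    return np.clip(dist_to_edge / feather_distance, 0.0, 1.0)
```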
Referring to fig. 1, fig. 1 is a schematic diagram of a special effect generating scene provided in an embodiment of the present application.
As shown in fig. 1, a special-effect virtual function button, which may be named "feathering effect", i.e., the special effect of the present application, is displayed on the display interface of the image processing apparatus 10. When the user needs to add a feathering effect to a texture image, the user can add the effect to the image to be processed by touching or clicking this virtual function button.
As shown in fig. 1, the image processing apparatus 10 first performs texture sampling on a first texture image according to the texture coordinates of each pixel point in the first texture image to obtain a first color image. Then, a target color image corresponding to the first texture image is generated based on the image processing method, wherein the target color image comprises a special effect area in which the pixel values of the pixel points vary in accordance with a feathering rule. Finally, the first color image is fused with the target color image to obtain a target image corresponding to the first texture image. The target image can be understood as the first texture image with a feathered special effect area added; the first texture image is thus converted into one frame of target image containing a special effect area whose variation follows the feathering rule, which adds the feathering special effect to the first texture image.
Referring to fig. 2, fig. 2 is a flow chart of an image processing method according to an embodiment of the present application. The method is applied to the image processing device. The method includes, but is not limited to, the following steps:
201: a first texture image is acquired.
Alternatively, the first texture image may be a single-frame texture image, or may be any frame of texture image in a video. It should be appreciated that if the first texture image is a single-frame texture image, the user's need is to add a feathering effect to that image alone; if the first texture image is any frame of a video comprising multiple frames of texture images, the user needs the feathering effect added to each frame, so that the synthesized special-effect video contains the feathering effect. This application mainly takes synthesizing a special-effect video, i.e., generating a target image corresponding to the first texture image, as an example for description.
202: performing texture sampling on the first texture image according to the texture coordinates of each pixel point in the first texture image to obtain a first color image.
Illustratively, texture sampling is performed on the first texture image in texture space according to the texture coordinates of each pixel point in the first texture image to obtain the RGB components of each pixel point; the first color image is then obtained based on these RGB components. Here, each pixel point refers to each pixel point in the first texture image. It should be appreciated that the relative positions and number of the pixel points do not change across image types; only the pixel values of the pixel points differ between the different types of images.
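As a minimal sketch of this sampling step, assuming nearest-neighbour lookup in a NumPy array (GPU texture samplers typically interpolate bilinearly instead; all names here are illustrative):

```python
import numpy as np

def sample_texture(texture: np.ndarray, uv: np.ndarray) -> np.ndarray:
    """Sample an H x W x 3 texture at normalized texture coordinates
    uv (shape N x 2, values in [0, 1]) and return RGB components."""
    h, w = texture.shape[:2]
    cols = np.clip((uv[:, 0] * (w - 1)).round().astype(int), 0, w - 1)
    rows = np.clip((uv[:, 1] * (h - 1)).round().astype(int), 0, h - 1)
    return texture[rows, cols]
```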
203: acquiring a target color image corresponding to the first color image, wherein the target color image comprises a special effect area, and the pixel values of the pixel points in the special effect area vary in accordance with a feathering rule.
Illustratively, the first color image is subjected to gray processing to obtain a first gray image. Optionally, the gray processing performed on the first color image can be expressed by formula (1):
GrayScale = dot(BaseColor(R, G, B), vec(a1, a2, a3))    formula (1);
where GrayScale is the first gray image, BaseColor(R, G, B) is the three-dimensional vector formed by the RGB components of each pixel point in the first color image, a1, a2 and a3 are preset weights for the R, G and B dimensions, vec(a1, a2, a3) is the three-dimensional vector formed by a1, a2 and a3, and dot denotes the dot product between vectors.
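As a sketch of formula (1), taking the standard Rec. 601 luma weights as one plausible choice for the preset weights a1, a2 and a3, which the text leaves unspecified:

```python
import numpy as np

# Rec. 601 luma weights, used here only as an assumed preset.
GRAY_WEIGHTS = np.array([0.299, 0.587, 0.114])

def to_gray(color: np.ndarray) -> np.ndarray:
    # Per-pixel dot product of (R, G, B) with (a1, a2, a3), as in formula (1);
    # color is an H x W x 3 float image, the result an H x W gray image.
    return color @ GRAY_WEIGHTS
```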
Then, a gray threshold value and a preset radius corresponding to the first texture image are acquired, and the special effect area corresponding to the first gray image, i.e., the position of the special effect area in the first gray image, is determined according to the gray threshold value and the preset radius.
Specifically, a gray value interval is determined based on the gray threshold value and the preset radius, where the left end point of the interval is the gray threshold value and its length is the preset radius. Then, the gray value of each pixel point in the first gray image is acquired, and the pixel points whose gray values fall into this interval form the special effect area in the first gray image.
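As a sketch, the special effect area can be expressed as a boolean mask over the gray image (names are illustrative):

```python
import numpy as np

def effect_region_mask(gray: np.ndarray, threshold: float, radius: float) -> np.ndarray:
    # Pixels whose gray values fall into the interval
    # [threshold, threshold + radius] form the special effect area.
    return (gray >= threshold) & (gray <= threshold + radius)
```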
Further, the first reference parameter is determined according to the gray threshold value and the preset radius. Illustratively, the first reference parameter may be expressed by formula (2):
halfRadius = 1/2 × Radius    formula (2);
where halfRadius is the first reference parameter and Radius is the preset radius.
Then, the feathering intensity of each pixel point is determined according to the gray value of each pixel point of the special effect area in the first gray image, the gray threshold value and the first reference parameter. For example, if a pixel point is outside the special effect area, its feathering intensity is set to 0; if the pixel point is inside the special effect area, the difference among its gray value in the first gray image, the gray threshold value and the first reference parameter is acquired, and the absolute value of the ratio of this difference to the first reference parameter is taken as the feathering intensity of the pixel point.
Therefore, the feathering intensity of each pixel point can be expressed by formula (3):
linearPos = abs((GrayScale - adjThreshold - halfRadius) / halfRadius)    formula (3);
where linearPos is the feathering intensity of any pixel point, GrayScale is the gray value of that pixel point in the first gray image, adjThreshold is the gray threshold value, halfRadius is the first reference parameter, and abs denotes the absolute value.
It can be seen from formula (3) that, for the pixel points in the special effect area, the closer a pixel point's gray value is to an end point of the gray value interval, the greater its feathering intensity, and the closer it is to the middle of the interval, the smaller its feathering intensity, so that the feathering intensity of the pixel points in the special effect area varies in accordance with the feathering rule.
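Formulas (2) and (3), together with the zero intensity outside the special effect area, can be sketched as follows (a non-authoritative reading; names are illustrative):

```python
import numpy as np

def feather_intensity(gray: np.ndarray, threshold: float, radius: float) -> np.ndarray:
    half_radius = 0.5 * radius                                           # formula (2)
    inside = (gray >= threshold) & (gray <= threshold + radius)          # special effect area
    linear_pos = np.abs((gray - threshold - half_radius) / half_radius)  # formula (3)
    return np.where(inside, linear_pos, 0.0)  # intensity is 0 outside the area
```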
Finally, a preset color image (i.e., a base map; for example, a pure black image can be used) is fused with the first color image according to the feathering intensity of each pixel point to obtain the target color image. Specifically, the RGB components of each pixel point in the preset color image and in the first color image are acquired; then, for each pixel point, the product of the RGB components of the preset color image and the feathering intensity is added to the RGB components of the first color image, and the sum is taken as the RGB components of that pixel point in the target color image, thereby obtaining the target color image.
By way of example, the target color image may be represented by formula (4):
GlowColor = FillColor × linearPos + BaseColor    formula (4);
where GlowColor is the target color image, FillColor is the preset color image, and BaseColor is the first color image.
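A sketch of formula (4), assuming H x W x 3 float images and an H x W intensity map; how the per-pixel intensity is broadcast to the channels is an implementation choice not stated in the text:

```python
def target_color_image(base_color, fill_color, linear_pos):
    # formula (4): GlowColor = FillColor * linearPos + BaseColor;
    # linear_pos gets a trailing channel axis so the same per-pixel
    # feathering intensity scales all three RGB components.
    return fill_color * linear_pos[..., None] + base_color
```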
It should be understood that, since the feathering intensity of each pixel point in the special effect area conforms to the feathering rule, the variation of the color values (i.e., RGB components) of the pixel points of the special effect area in the target color image also conforms to the feathering rule.
For example, as shown in fig. 3, the upper body of the user is selected as the special effect area through the preset radius and the gray threshold value. The feathering intensity of the pixel points at the edges of the special effect area is relatively high, i.e., their brightness is relatively high, while the feathering intensity in the flat parts is relatively small, so that the feathering intensity of the pixel points in the fused target color image varies in accordance with the feathering rule.
204: fusing the first color image and the target color image to obtain a target image corresponding to the first texture image.
Illustratively, the first color image and the target color image are fused according to a preset parameter to obtain the target image corresponding to the first texture image. The target image can be expressed by formula (5):
FinalColor = BaseColor × (1 - alpha) + GlowColor    formula (5);
where FinalColor is the target image, GlowColor is the target color image, BaseColor is the first color image, and alpha is the preset parameter.
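A sketch of formula (5); the default value of alpha below is a placeholder, since the text only describes alpha as a preset parameter:

```python
def final_image(base_color, glow_color, alpha=0.5):
    # formula (5): FinalColor = BaseColor * (1 - alpha) + GlowColor
    return base_color * (1.0 - alpha) + glow_color
```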
It can be understood that, since the target image is obtained by fusing in the target color image, the target image also contains the special effect area, and the color values of the pixel points in that area vary in accordance with the feathering rule. For example, as shown in fig. 4, the target color image and the first color image are fused, so that the color values of the first color image are retained in the target image while the color values of the pixel points of the special effect area in the target image vary in accordance with the feathering rule.
It can be seen that, in the embodiment of the present application, when a user needs to add a special effect to an image, texture sampling may be performed on the first texture image based on the texture coordinates of each pixel point in the first texture image to obtain a first color image. Then, a target color image corresponding to the first texture image is generated, wherein the target color image comprises a special effect area in which the color value of each pixel point varies in accordance with a feathering rule; that is, the target color image contains a feathered special effect area. Finally, the first color image is fused with the target color image to generate the target image; in other words, the feathered special effect area of the target color image is superimposed on the first color image. A feathered special effect area is thereby added to the first texture image, providing users with a new feathering special effect, expanding the set of available special effects, and improving the user experience.
In one embodiment of the present application, before the gray processing is performed on the first color image, blurring processing may be performed on the first texture image to obtain a second texture image. Then, texture sampling is performed on the second texture image in texture space according to the texture coordinates of each pixel point in the second texture image to obtain the RGB components of each pixel point, and a second color image is obtained based on these RGB components. Finally, gray processing is performed on the second color image to obtain the first gray image. That is, in the present application, the first gray image may be obtained either by performing gray processing directly on the first color image, or by first blurring the first texture image and deriving the first gray image from the second texture image. The manner in which the first gray image is acquired is not limited in this application.
It should be noted that blurring processing is performed on the first texture image because, if the first color image sampled directly from the first texture image is grayed and the resulting first gray image is used to determine the feathering intensity of each pixel point when synthesizing the target image, very obvious noise points appear at the edges, since the edges of the first texture image have not been blurred. As shown in the left graph of fig. 5, obvious noise, i.e., a burr phenomenon, appears at the edge of the face, which degrades the user's viewing experience. Therefore, blurring processing may be performed on the first texture image, i.e., its edges may be smoothed, to reduce the burr phenomenon at the edges. As shown in the middle graph of fig. 5, after the first texture image is smoothed with smoothing parameter k1, the edges of the finally synthesized target image are relatively smooth and the burr phenomenon is reduced. Further, as shown in the right graph of fig. 5, after the first texture image is smoothed with smoothing parameter k2 (k2 > k1), the edges of the finally synthesized target image are smoother still and the burr phenomenon is further reduced. It can therefore be seen that blurring the first texture image and acquiring the feathering intensity of each pixel point from the blurred texture image, i.e., the second texture image, reduces the number of noise points in the target image, that is, avoids the burr phenomenon in the target image.
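As a sketch of this preprocessing step, assuming a Gaussian blur via SciPy as one plausible realization of the "blurring processing" (the text names only a smoothing parameter k):

```python
from scipy.ndimage import gaussian_filter

def blur_texture(texture, sigma=2.0):
    # sigma plays the role of the smoothing parameter k: a larger sigma
    # smooths the edges more strongly. The channel axis is not blurred.
    return gaussian_filter(texture, sigma=(sigma, sigma, 0))
```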
In one embodiment of the present application, the first texture image is any one frame of a multi-frame texture sequence, where the multiple frames of texture images are the frames of an original video. The gray threshold value and the preset radius corresponding to the first texture image are acquired according to the position of the first texture image among the multiple frames, i.e., its position in the original video. Specifically, a mapping relation among the position of a texture image, the gray threshold value and the preset radius is preset; the gray threshold value and the preset radius corresponding to the first texture image are then determined according to the position of the first texture image among the multiple frames and this mapping relation. It can be understood that the gray value interval and the preset radius corresponding to the first texture image correspond to its position in the original video, so the determined special effect area also corresponds to that position. In this way, the gray threshold value and the preset radius corresponding to each frame of texture image can be acquired, and the target image corresponding to each frame, each containing a special effect area, can be determined. Finally, the multiple target images corresponding to the multiple frames of texture images are synthesized to obtain the special-effect video; that is, the original video is converted into a special-effect video containing the feathering effect. It should be understood that the special effect area differs from frame to frame and corresponds to each frame's position in the original video, so after the multiple frames are synthesized into the special-effect video, the special effect area changes dynamically in a scanning manner. For example, the special effect areas in the three frames of texture images shown in fig. 6 correspond to the positions of those frames in the original video, so after the three frames are synthesized into the special-effect video, the special effect area changes dynamically in a scanning manner.
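One hypothetical form such a position-to-parameter mapping could take is to sweep the gray threshold with the frame position, so that the gray value interval, and hence the special effect area, scans through the gray levels; this mapping is entirely an assumption, as the text does not specify it:

```python
def params_for_frame(frame_idx: int, num_frames: int, radius: float = 0.1):
    # The threshold moves across [0, 1 - radius] as the video plays,
    # so the special effect area "scans" frame by frame.
    threshold = (frame_idx / max(num_frames - 1, 1)) * (1.0 - radius)
    return threshold, radius
```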
Referring to fig. 7, fig. 7 is a block diagram illustrating functional units of an image processing apparatus according to an embodiment of the present application. The image processing apparatus 700 includes: an acquisition unit 701 and a processing unit 702;
an acquisition unit 701 for acquiring a first texture image;
a processing unit 702, configured to perform texture sampling on the first texture image according to texture coordinates of each pixel point in the first texture image, so as to obtain a first color image;
acquiring a target color image corresponding to the first color image, wherein the target color image comprises a special effect area, and the color values of the pixel points in the special effect area vary in accordance with a feathering rule;
and fusing the first color image and the target color image to obtain a target image corresponding to the first texture image.
In one embodiment of the present application, the processing unit 702 is specifically configured to, when acquiring the target color image corresponding to the first color image:
gray processing is carried out on the first color image to obtain a first gray image;
acquiring a gray threshold value and a preset radius corresponding to the first texture image;
determining the special effect area in the first gray level image according to the gray level threshold value and the preset radius;
and fusing a preset color image and the first color image according to the gray threshold value, the preset radius and the gray value of each pixel point in the special effect area in the first gray image to obtain the target color image.
In one embodiment of the present application, in terms of fusing a preset color image and the first color image according to the gray threshold, the preset radius, and the gray value of each pixel point in the special effect area in the first gray image, the processing unit 702 is specifically configured to:
determining a first reference parameter according to the gray threshold and the preset radius;
determining the feathering intensity of each pixel point according to the gray value of each pixel point of the special effect area in the first gray image, the gray threshold value and the first reference parameter;
and fusing the preset color image and the first color image according to the feathering intensity of each pixel point to obtain the target color image.
In one embodiment of the present application, in terms of determining the feathering intensity of each pixel point according to the gray value of each pixel point of the special effect area in the first gray image, the gray threshold value and the first reference parameter, the processing unit 702 is specifically configured to:
if the pixel point is located outside the special effect area, set the feathering intensity of the pixel point to 0;
if the pixel point is located inside the special effect area, acquire the difference among the gray value of the pixel point in the first gray image, the gray threshold value and the first reference parameter;
and take the absolute value of the ratio of the difference to the first reference parameter as the feathering intensity of the pixel point.
In one embodiment of the present application, before performing gray-scale processing on the first color image to obtain a first gray-scale image, the processing unit 702 is further configured to:
blurring processing is carried out on the first texture image to obtain a second texture image;
according to the texture coordinates of each pixel point in the second texture image, performing texture sampling on the second texture image to obtain a second color image;
in terms of performing gray-scale processing on the first color image to obtain a first gray-scale image, the processing unit 702 is specifically configured to: and carrying out gray scale processing on the second color image to obtain the first gray scale image.
In one embodiment of the present application, the first texture image is any one frame of multi-frame texture images; the processing unit 702 is further configured to:
acquiring a multi-frame target image corresponding to the multi-frame texture image;
and synthesizing the multi-frame target images to obtain the special effect video.
In one embodiment of the present application, the special effect area in the multi-frame target images changes in a scanning manner.
Referring to fig. 8, fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 8, the electronic device 800 includes a transceiver 801, a processor 802, and a memory 803. Which are connected by a bus 804. The memory 803 is used to store computer programs and data, and the data stored in the memory 803 can be transferred to the processor 802. The electronic device 800 may be the image processing apparatus 700 described above.
The processor 802 is configured to read a computer program in the memory 803 to perform the following operations:
controlling the transceiver 801 to acquire a first texture image;
according to the texture coordinates of each pixel point in the first texture image, performing texture sampling on the first texture image to obtain a first color image;
acquiring a target color image corresponding to the first color image, wherein the target color image comprises a special effect area, and the color values of the pixel points in the special effect area vary in accordance with a feathering rule;
and fusing the first color image and the target color image to obtain a target image corresponding to the first texture image.
In one embodiment of the present application, the processor 802 is specifically configured to, when acquiring the target color image corresponding to the first color image, perform the following operations:
gray processing is carried out on the first color image to obtain a first gray image;
acquiring a gray threshold value and a preset radius corresponding to the first texture image;
determining the special effect area in the first gray level image according to the gray level threshold value and the preset radius;
and fusing a preset color image and the first color image according to the gray threshold value, the preset radius and the gray value of each pixel point in the special effect area in the first gray image to obtain the target color image.
In one embodiment of the present application, the processor 802 is specifically configured to, in terms of fusing a preset color image and the first color image to obtain the target color image according to the gray threshold, the preset radius, and the gray value of each pixel point in the special effect area in the first gray image:
determining a first reference parameter according to the gray threshold and the preset radius;
determining the feathering intensity of each pixel point according to the gray value of each pixel point of the special effect area in the first gray image, the gray threshold value and the first reference parameter;
and fusing the preset color image and the first color image according to the feathering intensity of each pixel point to obtain the target color image.
In one embodiment of the present application, in terms of determining the feathering intensity of each pixel point according to the gray value of each pixel point of the special effect area in the first gray image, the gray threshold value and the first reference parameter, the processor 802 is specifically configured to perform the following operations:
if the pixel point is located outside the special effect area, setting the feathering intensity of the pixel point to 0;
if the pixel point is located inside the special effect area, acquiring the difference among the gray value of the pixel point in the first gray image, the gray threshold value and the first reference parameter;
and taking the absolute value of the ratio of the difference to the first reference parameter as the feathering intensity of the pixel point.
In one embodiment of the present application, before performing gray-scale processing on the first color image to obtain a first gray-scale image, the processor 802 is further configured to perform the following operations:
blurring processing is carried out on the first texture image to obtain a second texture image;
according to the texture coordinates of each pixel point in the second texture image, performing texture sampling on the second texture image to obtain a second color image;
in terms of performing gray-scale processing on the first color image to obtain a first gray-scale image, the processor 802 is specifically configured to perform the following operations: and carrying out gray scale processing on the second color image to obtain the first gray scale image.
In one embodiment of the present application, the first texture image is any one frame of multi-frame texture images; the processor 802 is also configured to perform the following operations:
acquiring a multi-frame target image corresponding to the multi-frame texture image;
and synthesizing the multi-frame target images to obtain the special effect video.
In one embodiment of the present application, the special effect area in the multi-frame target images changes in a scanning manner.
Specifically, the transceiver 801 may be the acquisition unit 701 of the image processing apparatus 700 of the embodiment illustrated in fig. 7, and the processor 802 may be the processing unit 702 of the image processing apparatus 700 of the embodiment illustrated in fig. 7.
It should be understood that the electronic device in the present application may include a smart phone (such as an Android phone, an iOS phone or a Windows Phone device), a tablet computer, a palmtop computer, a notebook computer, a mobile internet device (MID) or a wearable device, etc. The above electronic devices are merely examples; the list is not exhaustive and includes, but is not limited to, the devices mentioned. In practical applications, the electronic device may further include intelligent vehicle-mounted terminals, computer devices and the like.
The present application also provides a computer-readable storage medium storing a computer program that is executed by a processor to implement some or all of the steps of any one of the image processing methods as described in the above method embodiments.
The present application also provides a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform part or all of the steps of any one of the image processing methods as described in the method embodiments above.
It should be noted that, for simplicity of description, the foregoing method embodiments are all expressed as a series of action combinations, but it should be understood by those skilled in the art that the present application is not limited by the order of actions described, as some steps may be performed in other order or simultaneously in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all alternative embodiments, and that the acts and modules referred to are not necessarily required in the present application.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, such as the division of the units, merely a logical function division, and there may be additional manners of dividing the actual implementation, such as multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, or may be in electrical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units described above may be implemented either in hardware or in software program modules.
The integrated units, if implemented in the form of software program modules, may be stored in a computer-readable memory for sale or use as a stand-alone product. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a memory, including several instructions for causing a computer device (which may be a personal computer, a server or a network device, etc.) to perform all or part of the steps of the method described in the embodiments of the present application. And the aforementioned memory includes: a U-disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a removable hard disk, a magnetic disk, or an optical disk, or other various media capable of storing program codes.
Those of ordinary skill in the art will appreciate that all or a portion of the steps in the various methods of the above embodiments may be implemented by a program that instructs associated hardware, and the program may be stored in a computer readable memory, which may include: flash disk, read-Only Memory (ROM), random access Memory (Random Access Memory, RAM), magnetic disk or optical disk.
The foregoing has outlined rather broadly the more detailed description of embodiments of the present application, wherein specific examples are provided herein to illustrate the principles and embodiments of the present application, the above examples being provided solely to assist in the understanding of the methods of the present application and the core ideas thereof; meanwhile, as those skilled in the art will have modifications in the specific embodiments and application scope in accordance with the ideas of the present application, the present description should not be construed as limiting the present application in view of the above.

Claims (10)

1. An image processing method, the method comprising:
acquiring a first texture image;
according to the texture coordinates of each pixel point in the first texture image, performing texture sampling on the first texture image to obtain a first color image;
acquiring a target color image corresponding to the first color image, wherein the target color image comprises a special effect area, and the color values of the pixel points in the special effect area vary in accordance with a feathering rule;
and fusing the first color image and the target color image to obtain a target image corresponding to the first texture image.
2. The method of claim 1, wherein the acquiring a target color image corresponding to the first color image comprises:
gray processing is carried out on the first color image to obtain a first gray image;
acquiring a gray threshold value and a preset radius corresponding to the first texture image;
determining the special effect area in the first gray level image according to the gray level threshold value and the preset radius;
and fusing a preset color image and the first color image according to the gray threshold value, the preset radius and the gray value of each pixel point in the special effect area in the first gray image to obtain the target color image.
3. The method according to claim 2, wherein the fusing the preset color image and the first color image to obtain the target color image according to the gray threshold, the preset radius, and the gray value of each pixel point in the special effect area in the first gray image includes:
determining a first reference parameter according to the gray threshold and the preset radius;
determining the feathering intensity of each pixel point according to the gray value of each pixel point of the special effect area in the first gray image, the gray threshold value and the first reference parameter;
and fusing the preset color image and the first color image according to the feathering intensity of each pixel point to obtain the target color image.
4. The method according to claim 3, wherein the determining the feathering intensity of each pixel point according to the gray value of each pixel point of the special effect area in the first gray image, the gray threshold value and the first reference parameter comprises:
if the pixel point is located outside the special effect area, setting the feathering intensity of the pixel point to 0;
if the pixel point is located inside the special effect area, acquiring the difference among the gray value of the pixel point in the first gray image, the gray threshold value and the first reference parameter;
and taking the absolute value of the ratio of the difference to the first reference parameter as the feathering intensity of the pixel point.
5. The method of any one of claims 2-4, wherein prior to gray processing the first color image to obtain a first gray image, the method further comprises:
blurring processing is carried out on the first texture image to obtain a second texture image;
according to the texture coordinates of each pixel point in the second texture image, performing texture sampling on the second texture image to obtain a second color image;
the gray processing is performed on the first color image to obtain a first gray image, which includes:
and carrying out gray scale processing on the second color image to obtain the first gray scale image.
6. The method according to any one of claims 3 to 5, wherein,
the first texture image is any frame in a multi-frame texture image; the method further comprises the steps of:
acquiring a multi-frame target image corresponding to the multi-frame texture image;
and synthesizing the multi-frame target images to obtain the special effect video.
7. The method of claim 6, wherein the special effect area in the multi-frame target images changes in a scanning manner.
8. An image processing apparatus, characterized in that the apparatus comprises an acquisition unit and a processing unit;
the acquisition unit is used for acquiring a first texture image;
the processing unit is used for performing texture sampling on the first texture image according to the texture coordinates of each pixel point in the first texture image to obtain a first color image;
acquiring a target color image corresponding to the first color image, wherein the target color image comprises a special effect area, and the color values of the pixel points in the special effect area vary in accordance with a feathering rule;
and fusing the first color image and the target color image to obtain a target image corresponding to the first texture image.
9. An electronic device, comprising: a processor and a memory, the processor being connected to the memory, the memory being for storing a computer program, the processor being for executing the computer program stored in the memory to cause the electronic device to perform the method of any one of claims 1-7.
10. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program, which is executed by a processor to implement the method of any of claims 1-7.
CN202310629034.6A 2023-05-30 2023-05-30 Image processing method, device, electronic equipment and storage medium Pending CN117745605A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310629034.6A CN117745605A (en) 2023-05-30 2023-05-30 Image processing method, device, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN117745605A (en) 2024-03-22

Family

ID=90256823

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310629034.6A Pending CN117745605A (en) 2023-05-30 2023-05-30 Image processing method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117745605A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination