CN117745546A - Video processing method and device and electronic equipment - Google Patents


Publication number
CN117745546A
Authority
CN
China
Prior art keywords
image frame
image
pixel value
video
weighting parameter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211112270.2A
Other languages
Chinese (zh)
Inventor
李磊
詹亘
徐天春
赵世杰
李军林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Douyin Vision Co Ltd
Lemon Inc Cayman Island
Original Assignee
Douyin Vision Co Ltd
Lemon Inc Cayman Island
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Douyin Vision Co Ltd, Lemon Inc Cayman Island filed Critical Douyin Vision Co Ltd
Priority to CN202211112270.2A
Publication of CN117745546A

Landscapes

  • Image Processing (AREA)

Abstract

The disclosure provides a video processing method, a video processing device, and an electronic device. The method includes: acquiring image distribution information of a plurality of first image frames in a first video; for any first image frame, determining a second image frame associated with the first image frame based on the image distribution information of the first image frame and the image distribution information of the first N image frames preceding it, where N is an integer greater than or equal to 1; and generating a second video based on the second image frames associated with the plurality of first image frames, the second video being the first video after enhancement processing. In this way, the display effect of the enhanced video is improved.

Description

Video processing method and device and electronic equipment
Technical Field
The embodiment of the disclosure relates to the technical field of image processing, in particular to a video processing method, a video processing device and electronic equipment.
Background
The video enhancement processing technology can improve the display effect of a video. For example, when a video shot by a user has dull color, the color, brightness, contrast, and the like of the video can be improved through video enhancement processing.
At present, a video can be processed through a histogram algorithm to improve its display effect. For example, the pixels of each frame of the video are adjusted through the histogram algorithm, improving parameters such as the color, brightness, and contrast of each frame and thus the display effect of the video. However, when the image distribution between video frames is discontinuous, flickering occurs between frames after video enhancement processing, so the display effect after enhancement is poor.
Disclosure of Invention
The disclosure provides a video processing method, a video processing device and electronic equipment, which are used for solving the technical problem in the prior art that the display effect after video enhancement processing is poor.
In a first aspect, the present disclosure provides a video processing method, the method comprising:
acquiring image distribution information of a plurality of first image frames in a first video;
for any one first image frame, determining a second image frame associated with the first image frame based on image distribution information of the first image frame and image distribution information of the first N image frames of the first image frame, wherein N is an integer greater than or equal to 1;
generating a second video based on a second image frame associated with the plurality of first image frames, the second video being a video after the first video enhancement process.
In a second aspect, the present disclosure provides a video processing apparatus, including an acquisition module, a determination module, and a generation module, wherein:
the acquisition module is used for acquiring image distribution information of a plurality of first image frames in the first video;
the determining module is used for determining, for any one first image frame, a second image frame associated with the first image frame based on image distribution information of the first image frame and image distribution information of the first N image frames of the first image frame, wherein N is an integer greater than or equal to 1;
The generating module is configured to generate a second video based on a second image frame associated with the plurality of first image frames, where the second video is a video after the first video enhancement processing.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: a processor and a memory;
the memory stores computer-executable instructions;
the processor executes the computer-executable instructions stored in the memory, to cause the processor to perform the video processing method described in the first aspect and the various possible implementations of the first aspect.
In a fourth aspect, embodiments of the present disclosure provide a computer-readable storage medium having stored therein computer-executable instructions which, when executed by a processor, implement the video processing method as described in the first aspect and the various possible aspects of the first aspect above.
In a fifth aspect, embodiments of the present disclosure provide a computer program product comprising a computer program which, when executed by a processor, implements the video processing method as described above in the first aspect and the various possible aspects of the first aspect.
The disclosure provides a video processing method, a video processing device, and an electronic device. Image distribution information of a plurality of first image frames in a first video is acquired; for any first image frame, a second image frame associated with it is determined based on the image distribution information of the first image frame and of the first N image frames preceding it, where N is an integer greater than or equal to 1; and a second video is generated based on the second image frames associated with the plurality of first image frames, the second video being the first video after enhancement processing. In this method, the electronic device enhances the current frame through the image distribution information of the current frame and of the previous frames, combining image distribution information among video frames, so that flickering between frames after image enhancement can be avoided, thereby improving the display effect of the enhanced video.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the solutions in the prior art, a brief description will be given below of the drawings that are needed in the embodiments or the description of the prior art, it being obvious that the drawings in the following description are some embodiments of the present disclosure, and that other drawings may be obtained from these drawings without inventive effort to a person of ordinary skill in the art.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present disclosure;
fig. 2 is a schematic flow chart of a video processing method according to an embodiment of the disclosure;
fig. 3 is a schematic diagram of image distribution information according to an embodiment of the disclosure;
FIG. 4 is a schematic diagram of a process for determining a pixel distribution interval according to an embodiment of the disclosure;
FIG. 5 is a schematic diagram of acquiring a first pixel value and a second pixel value according to an embodiment of the disclosure;
fig. 6 is a schematic diagram of a method for acquiring a second image frame according to an embodiment of the disclosure;
fig. 7 is a process schematic diagram of a video processing method according to an embodiment of the disclosure;
fig. 8 is a schematic structural diagram of a video processing apparatus according to an embodiment of the disclosure; and
Fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
In order to facilitate understanding, concepts related to the embodiments of the present disclosure are described below.
Electronic equipment: is a device with wireless receiving and transmitting function. The electronic device may be deployed on land, including indoors or outdoors, hand-held, wearable, or vehicle-mounted; can also be deployed on the water surface (such as a ship, etc.). The electronic device may be a mobile phone (mobile phone), a tablet (Pad), a computer with wireless transceiving function, a Virtual Reality (VR) electronic device, an augmented reality (augmented reality, AR) electronic device, a wireless terminal in industrial control (industrial control), a vehicle-mounted electronic device, a wireless terminal in unmanned driving (self driving), a wireless electronic device in remote medical (remote medical), a wireless electronic device in smart grid (smart grid), a wireless electronic device in transportation security (transportation safety), a wireless electronic device in smart city, a wireless electronic device in smart home (smart home), a wearable electronic device, etc. The electronic device according to the embodiments of the present disclosure may also be referred to as a terminal, a User Equipment (UE), an access electronic device, a vehicle-mounted terminal, an industrial control terminal, a UE unit, a UE station, a mobile station, a remote electronic device, a mobile device, a UE electronic device, a wireless communication device, a UE proxy, a UE apparatus, or the like. The electronic device may also be stationary or mobile.
Video enhancement: the video enhancement algorithm can improve the display effect of the video. For example, video enhancement processing may improve the brightness, contrast, and color of video, thereby improving the display of the video. Optionally, the video enhancement algorithm may process each frame of image in the video, thereby improving the display effect of the video. For example, the video enhancement algorithm may adjust pixels in each frame of image in the video, and improve brightness, contrast, and color of each frame of image, thereby improving the display effect of the video.
Histogram algorithm: the histogram algorithm may adjust the pixel distribution in the image. For example, the electronic device may obtain histogram information of R, G, B three channels in the image through a histogram algorithm, and adjust pixels in a dark region to a bright region, so as to improve brightness, contrast and color of the image.
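As an illustration of the histogram step, the sketch below computes per-channel (R, G, B) histograms with NumPy. This is a minimal sketch under stated assumptions: function and variable names are illustrative and not taken from the patent, and an 8-bit H x W x 3 image layout is assumed.

```python
import numpy as np

def channel_histograms(frame, bins=256):
    """Per-channel (R, G, B) histograms of an 8-bit H x W x 3 image.

    Illustrative names; not the patent's implementation.
    """
    hists = []
    for c in range(3):
        # One 256-bin histogram per color channel over pixel values 0..255.
        hist, _ = np.histogram(frame[:, :, c], bins=bins, range=(0, 256))
        hists.append(hist)
    return hists

# A synthetic 4x4 frame whose red channel is uniformly bright.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[:, :, 0] = 200
hists = channel_histograms(frame)
```

From histograms like these, the dark-to-bright pixel adjustment described above can be driven per channel.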
In the related art, when the display effect of a video obtained by the electronic device is poor (e.g., dull color, low brightness), the electronic device may improve it through video enhancement processing. Currently, video can be processed by a histogram algorithm. For example, the electronic device may adjust the pixels of each frame of the video through the histogram algorithm so that pixels in dark regions move toward bright regions, improving parameters such as the color, brightness, and contrast of each frame and thus the display effect of the video. However, when the pixel distributions of adjacent video frames differ greatly, flicker occurs between those two frames after each frame is processed independently by the histogram algorithm, so the display effect after video enhancement is poor.
In order to solve the technical problems in the related art, an embodiment of the present disclosure provides a video processing method, in which an electronic device obtains image distribution information of a plurality of first image frames in a first video, determines, for any one first image frame, a first pixel distribution interval associated with the first image frame based on the image distribution information of the first image frame, determines N second pixel distribution intervals of N image frames based on the image distribution information of the first N image frames of the first image frame, processes the first image frame based on the first pixel distribution interval and the N second pixel distribution intervals, obtains a second image frame, and generates a second video after a first video enhancement process based on the plurality of second image frames. In the method, the electronic device performs enhancement processing on the image of the frame through the pixel distribution interval of the frame and the pixel distribution interval of the previous frame, and combines the pixel distribution information among the video frames, so that the problem of flickering between adjacent video frames with large pixel distribution difference after the image enhancement processing can be avoided when the pixels are adjusted, and the display effect after video enhancement is further improved.
Next, an application scenario of the present disclosure will be described with reference to fig. 1.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present disclosure. Referring to fig. 1, it includes an electronic device and a first video. The first video comprises an image A and an image B, where image A is the 1st frame of the first video and image B is the 2nd frame. Images A and B are input to the electronic device, which acquires the pixel distribution information of image A and of image B, and processes image B based on both to obtain a second video comprising image A and image C, where image C is image B after enhancement processing. Thus, when enhancing each frame of the first video, the electronic device can combine pixel distribution information between adjacent video frames, avoiding flicker between adjacent frames with large pixel distribution differences after image enhancement and improving the display effect after video enhancement.
The following describes the technical solutions of the present disclosure and how the technical solutions of the present disclosure solve the above technical problems in detail with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present disclosure will be described below with reference to the accompanying drawings.
Fig. 2 is a flow chart of a video processing method according to an embodiment of the disclosure. Referring to fig. 2, the method may include:
s201, acquiring image distribution information of a plurality of first image frames in a first video.
The execution body of the embodiment of the disclosure may be an electronic device, or may be a video processing apparatus provided in the electronic device. The video processing device may be implemented by software, or may be implemented by a combination of software and hardware.
Alternatively, the first video may be a video acquired by the electronic device. For example, the electronic device may acquire the first video in real time through the image capturing device, the electronic device may acquire the stored first video in the database, and the electronic device may also receive the first video sent by other devices, which is not limited in the embodiments of the present disclosure.
Optionally, the first image frame is a video frame in the first video, and the image distribution information indicates pixel distribution information of the first image frame. For example, the image distribution information may be pixel distribution information of the image frames in the first video on three channels R, G, B. Optionally, the electronic device may acquire the image distribution information of the first image frame through a preset algorithm. For example, the electronic device may process the first image frame through a histogram algorithm, so as to obtain image distribution information of the first image frame.
Optionally, before the electronic device obtains the image distribution information of the first image frame through a preset algorithm, the electronic device may perform color enhancement processing on the first image frame through a color enhancement algorithm, and obtain the image distribution information of the first image frame after color enhancement, so that accuracy of obtaining the image distribution information may be improved.
Next, image distribution information of the first image frame will be described with reference to fig. 3.
Fig. 3 is a schematic diagram of image distribution information according to an embodiment of the disclosure. Referring to fig. 3, it shows a pixel distribution curve corresponding to the first image frame. The abscissa of the pixel distribution curve is the pixel value, and the ordinate is the number of pixels. From the pixel distribution curve corresponding to the first image frame, the number of darker pixels and the number of brighter pixels in the first image frame can be determined.
S202, for any one of the first image frames, determining a second image frame associated with the first image frame based on the image distribution information of the first image frame and the image distribution information of the first N image frames of the first image frame.
Optionally, N is an integer greater than or equal to 1. For example, the electronic device may acquire image distribution information of a first image frame, and image distribution information of a first 1 image frame of the first image frame; the electronic device may obtain the distribution information of the first image frame and the image distribution information of the first 2 image frames of the first image frame, which is not limited by the embodiments of the present disclosure.
Optionally, the second image frame is the image frame obtained after performing enhancement processing on the first image frame. For example, the electronic device performs color, contrast, and brightness enhancement on the first image frame using the image distribution information of the first image frame and of its previous image frame, to obtain the second image frame.
Alternatively, the electronic device may determine the second image frame associated with the first image frame as follows: determining a first pixel distribution interval of the first image frame based on its image distribution information, determining N second pixel distribution intervals of the N preceding image frames based on their image distribution information, and processing the first image frame based on the first pixel distribution interval and the N second pixel distribution intervals to obtain the second image frame.
Optionally, the pixel distribution interval is used to indicate an interval of valid pixels of the image frame. For example, in the practical application process, after the electronic device processes the image frame through the histogram algorithm, pixel distribution information of the image frame is obtained, where pixels in the pixel distribution information are arranged according to tone values, and the electronic device may determine a pixel distribution interval based on the tone value of each pixel.
Next, a process of determining a pixel distribution section will be described with reference to fig. 4.
Fig. 4 is a schematic diagram of a process for determining a pixel distribution interval according to an embodiment of the disclosure. Referring to fig. 4, a pixel distribution curve is shown, whose abscissa is the pixel value and whose ordinate is the number of pixels. The pixel value A corresponding to the pixel at the 5% position of the total number of pixels in the pixel distribution curve is determined, and the pixel value B corresponding to the pixel at the 95% position is determined, thereby obtaining the pixel distribution interval. The left end point of the pixel distribution interval is pixel value A, and the right end point is pixel value B.
It should be noted that, 5% and 95% in the embodiment shown in fig. 4 may be any set values, which is not limited in this embodiment of the disclosure. With the embodiment shown in fig. 4, the electronic device may acquire a first pixel distribution section of a first image frame and N second pixel distribution sections of N image frames.
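The percentile-based interval described above can be read off a histogram's cumulative distribution. The sketch below assumes the 5%/95% fractions from the example (any configured fractions work); the function name and the convention for locating the cumulative position are illustrative assumptions.

```python
import numpy as np

def pixel_distribution_interval(hist, low_frac=0.05, high_frac=0.95):
    """Left/right endpoint pixel values bounding the 'valid' pixels.

    Reads the endpoints off the cumulative distribution at the low_frac
    and high_frac positions. Illustrative names, not the patent's code.
    """
    cdf = np.cumsum(hist).astype(np.float64)
    total = cdf[-1]
    # First pixel value whose cumulative count reaches each fraction.
    left = int(np.searchsorted(cdf, low_frac * total))
    right = int(np.searchsorted(cdf, high_frac * total))
    return left, right

# Toy histogram: a few dark outliers, the bulk at value 100, a few bright ones.
hist = np.zeros(256, dtype=np.int64)
hist[10] = 5
hist[100] = 90
hist[240] = 5
left, right = pixel_distribution_interval(hist)
```

Here the interval endpoints land at the dark outliers (value 10) and at the value where the cumulative count first reaches 95% (value 100), clipping the bright outliers out of the "valid" range.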
Optionally, the processing is performed on the first image frame based on the first pixel distribution interval and the N second pixel distribution intervals to obtain a second image frame, specifically: and acquiring a first pixel value of a left end point and a second pixel value of a right end point of the first pixel distribution interval. For example, if the first pixel distribution section includes 1% -99% of pixels in the histogram information associated with the first image frame, a pixel value corresponding to a 1% position is acquired in the histogram information and determined as a first pixel value, and a pixel value corresponding to a 99% position is acquired in the histogram information and determined as a second pixel value.
And acquiring a third pixel value of the left end point and a fourth pixel value of the right end point of the second pixel distribution interval. For example, if the second pixel distribution section includes 5% -95% of pixels in the histogram information associated with the image frame (for example, N is 1) of the previous image frame, the pixel value corresponding to the 5% position is obtained in the histogram information and is determined as the third pixel value, and the pixel value corresponding to the 95% position is obtained in the histogram information and is determined as the fourth pixel value.
Next, a process of acquiring the first pixel value and the second pixel value will be described with reference to fig. 5.
Fig. 5 is a schematic diagram of acquiring a first pixel value and a second pixel value according to an embodiment of the disclosure. Referring to fig. 5, a first pixel distribution interval is shown. The pixel value corresponding to the left end point of the first pixel distribution interval is determined as the first pixel value of the first image frame, and the pixel value corresponding to the right end point is determined as the second pixel value of the first image frame. Thus, the endpoint pixel values corresponding to each image frame can be determined from the image distribution information corresponding to that image frame.
The first image frame is then processed based on the first pixel value, the second pixel value, the third pixel value, and the fourth pixel value to obtain the second image frame. For example, adaptive curve stretching processing is performed on the first image frame through the first, second, third, and fourth pixel values to obtain the second image frame.
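As a minimal stand-in for the stretching step, the sketch below applies a plain linear stretch that maps an interval [p_min, p_max] onto the full output range. The patent's actual adaptive curve, built from all four pixel values and the weighting parameters introduced later, may differ; this only illustrates the underlying idea.

```python
import numpy as np

def stretch_frame(frame, p_min, p_max, out_min=0, out_max=255):
    """Linear stretch mapping [p_min, p_max] onto [out_min, out_max].

    A sketch, not the patent's adaptive curve.
    """
    frame = frame.astype(np.float64)
    scale = (out_max - out_min) / max(p_max - p_min, 1)  # guard divide-by-zero
    stretched = (frame - p_min) * scale + out_min
    # Clamp to the output range and return to 8-bit pixels.
    return np.clip(stretched, out_min, out_max).astype(np.uint8)

frame = np.array([[10, 100], [200, 50]], dtype=np.uint8)
out = stretch_frame(frame, p_min=10, p_max=200)
```

Pixels at the interval endpoints map to 0 and 255; everything in between is spread linearly, which raises contrast without inventing detail.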
S203, generating a second video based on the second image frames associated with the plurality of first image frames.
Optionally, the second video is a video after enhancement processing of the first video. For example, the electronic device performs brightness, color, and contrast enhancement processing on each first image frame in the first video, and then obtains the second video.
The embodiment of the disclosure provides a video processing method. The electronic device obtains image distribution information of a plurality of first image frames in a first video. For any first image frame, the electronic device can determine a first pixel distribution interval associated with the first image frame based on its image distribution information, determine N second pixel distribution intervals based on the image distribution information of the first N image frames preceding it, process the first image frame through the pixel values associated with the left and right end points of the first pixel distribution interval and of the second pixel distribution intervals to obtain a second image frame, and generate the enhanced second video based on the plurality of second image frames. In this method, the electronic device enhances the current frame through the pixel distribution interval of the current frame and that of the previous frames, combining pixel distribution information among video frames, so that when pixels are adjusted, flickering between adjacent video frames with large pixel distribution differences after image enhancement can be avoided, thereby improving the display effect after video enhancement.
In the following, taking N as 1 as an example, in connection with fig. 6, a method procedure of processing a first image frame to obtain a second image frame based on a first pixel value, a second pixel value, a third pixel value and a fourth pixel value in the above video processing method will be described based on the embodiment shown in fig. 2.
Fig. 6 is a schematic diagram of a method for acquiring a second image frame according to an embodiment of the disclosure. Referring to fig. 6, the method includes:
s601, determining a first weighting parameter and a second weighting parameter based on the first pixel value, the second pixel value, the third pixel value, and the fourth pixel value.
Optionally, the first weighting parameter and the second weighting parameter are used for performing image enhancement processing on the first image frame. For example, a stretching curve corresponding to the first image frame can be obtained through the first weighting parameter and the second weighting parameter, and then image enhancement processing is performed on the first image frame through the stretching curve.
Alternatively, the electronic device may determine the first weighting parameter and the second weighting parameter according to the following possible implementation manner: scene-cut information is determined based on the first pixel value and the third pixel value, or scene-cut information is determined based on the second pixel value and the fourth pixel value, and first weighting parameters and second weighting parameters are determined based on the scene-cut information, the first pixel value, the second pixel value, the third pixel value, and the fourth pixel value.
Optionally, the scene change information is used to indicate whether a scene switch occurs between the first image frame and the previous image frame of the first image frame. For example, if the first image frame is the first frame of a sky-material video and the previous image frame of the first image frame is the last frame of an ocean-material video, a scene switch has occurred between the first image frame and its previous image frame.
Alternatively, there are two possible implementations for determining the scene switching information between the first image frame and its previous image frame:
one possible implementation:
scene cut information is determined based on the first pixel value and the third pixel value. For example, if the absolute value of the difference between the first pixel value and the third pixel value is greater than a first preset threshold, it is determined that a scene switch occurs between the first image frame and its previous image frame, and the scene switching information indicates so; if the absolute value of the difference is less than or equal to the first preset threshold, it is determined that no scene switch occurs between the first image frame and its previous image frame, and the scene switching information indicates that no scene switch occurs.
Alternatively, the scene switching information may be determined according to the following formula:
|p_min_n - p_min_{n-1}| > A
where p_min_n is the first pixel value, p_min_{n-1} is the third pixel value, n is the serial number of the first image frame in the first video, and A is the first preset threshold. If |p_min_n - p_min_{n-1}| > A, it is determined that a scene switch occurs between the first image frame and the previous image frame; if |p_min_n - p_min_{n-1}| ≤ A, it is determined that no scene switch occurs.
Another possible implementation is:
scene cut information is determined based on the second pixel value and the fourth pixel value. For example, if the absolute value of the difference between the second pixel value and the fourth pixel value is greater than a second preset threshold, it is determined that a scene switch occurs between the first image frame and its previous image frame, and the scene switching information indicates so; if the absolute value of the difference is less than or equal to the second preset threshold, it is determined that no scene switch occurs between the first image frame and its previous image frame. It should be noted that the first preset threshold and the second preset threshold may be the same or different, which is not limited in the embodiment of the present disclosure. For example, the first preset threshold may be 10 and the second preset threshold may be 10; alternatively, the first preset threshold is 10 and the second preset threshold is 15.
Alternatively, the scene cut information may be determined according to the following possible formula:
|p_max_n - p_max_{n-1}| > B
where p_max_n is the second pixel value; p_max_{n-1} is the fourth pixel value; n is the sequence number of the first image frame in the first video; and B is the second preset threshold. If |p_max_n - p_max_{n-1}| > B, it is determined that a scene cut occurs between the first image frame and the previous image frame; if |p_max_n - p_max_{n-1}| ≤ B, it is determined that no scene cut occurs between the first image frame and the previous image frame.
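The two threshold tests above can be sketched as follows (an illustrative sketch only; the function name, the combination of the two tests with a logical OR, and the default thresholds A = B = 10 are assumptions, since the disclosure presents the min-based and max-based tests as alternatives):

```python
def scene_cut(p_min_n, p_min_prev, p_max_n, p_max_prev, a=10, b=10):
    """Detect a scene cut between frame n and frame n-1 by comparing the
    left endpoints (|p_min_n - p_min_{n-1}| > A) or the right endpoints
    (|p_max_n - p_max_{n-1}| > B) of their pixel distribution intervals."""
    return abs(p_min_n - p_min_prev) > a or abs(p_max_n - p_max_prev) > b

# A large jump in either distribution endpoint signals a scene cut.
print(scene_cut(10, 80, 200, 250))   # endpoints jump -> True
print(scene_cut(10, 12, 200, 198))   # endpoints stable -> False
```

In practice the endpoint values would come from the pixel distribution intervals of consecutive frames, as described above.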
Optionally, the first weighting parameter and the second weighting parameter are determined based on the scene-cut information, the first pixel value, the second pixel value, the third pixel value, and the fourth pixel value, where there are two cases:
case 1: the scene cut information indicates that a scene cut occurs between the first image frame and the previous image frame of the first image frame.
If the scene cut information indicates that a scene cut occurs between the first image frame and the previous image frame of the first image frame, the first pixel value is determined as the first weighting parameter and the second pixel value is determined as the second weighting parameter. For example, if a scene cut occurs between the first image frame and the previous image frame, and the first pixel value of the first image frame is 10 and the second pixel value is 200, then the first weighting parameter is 10 and the second weighting parameter is 200. For example, the first weighting parameter and the second weighting parameter may be determined by the following formulas:
p_min_ExpW_n = p_min_n
p_max_ExpW_n = p_max_n
where p_min_ExpW_n is the first weighting parameter; p_max_ExpW_n is the second weighting parameter; p_min_n is the first pixel value; p_max_n is the second pixel value; and n is the sequence number of the first image frame in the first video.
Case 2: the scene cut information indicates that the first image frame and a previous image frame of the first image frame have not been scene cut.
If the scene cut information indicates that no scene cut occurs between the first image frame and the previous image frame of the first image frame, a third weighting parameter and a fourth weighting parameter of the previous image frame are acquired, and the first weighting parameter and the second weighting parameter are determined based on the first pixel value, the second pixel value, the third weighting parameter, and the fourth weighting parameter.
Optionally, the third weighting parameter and the fourth weighting parameter are used for performing image enhancement processing on the previous image frame. For example, a stretching curve corresponding to the previous image frame can be obtained from the third weighting parameter and the fourth weighting parameter, and image enhancement processing is then performed on the previous image frame using that stretching curve.
Alternatively, the third weighting parameter and the fourth weighting parameter may be obtained as follows: when a scene cut occurs between the previous image frame and the image frame before it, the third pixel value is determined as the third weighting parameter and the fourth pixel value is determined as the fourth weighting parameter. For example, when determining the second image frame corresponding to the 3rd frame in the video, if a scene cut occurs between the 1st frame and the 2nd frame, the third weighting parameter is the third pixel value corresponding to the 2nd frame, and the fourth weighting parameter is the fourth pixel value corresponding to the 2nd frame.
When no scene cut occurs between the previous image frame and the image frame before it, the third weighting parameter and the fourth weighting parameter are determined from the weighting parameters of that earlier image frame together with the third pixel value and the fourth pixel value corresponding to the previous image frame. Since the 1st frame of the video has no previous frame, the 1st frame is treated as a scene cut, and the weighting parameters corresponding to the 1st frame are the pixel value of the left endpoint and the pixel value of the right endpoint of its pixel distribution interval.
Alternatively, the first weighting parameter and the second weighting parameter may be determined by the following formulas:
p_min_ExpW_n = α × p_min_ExpW_{n-1} + β × p_min_n
p_max_ExpW_n = α × p_max_ExpW_{n-1} + β × p_max_n
where p_min_ExpW_n is the first weighting parameter; α and β are weighting coefficients; p_min_n is the first pixel value; p_min_ExpW_{n-1} is the third weighting parameter; n is the sequence number of the first image frame in the first video; p_max_ExpW_n is the second weighting parameter; p_max_n is the second pixel value; and p_max_ExpW_{n-1} is the fourth weighting parameter.
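Cases 1 and 2 together amount to an exponentially weighted update of the distribution endpoints, reset whenever a scene cut occurs. A minimal sketch (the function name and the default values α = 0.9, β = 0.1 are assumptions; the disclosure does not fix the weighting coefficients):

```python
def update_weights(p_min_n, p_max_n, prev_weights, cut, alpha=0.9, beta=0.1):
    """Compute (p_min_ExpW_n, p_max_ExpW_n).

    Case 1 (scene cut): the weights reset to the current frame's endpoints.
    Case 2 (no cut):    the weights blend the previous frame's weights
    with the current endpoints:
        p_min_ExpW_n = alpha * p_min_ExpW_{n-1} + beta * p_min_n
        p_max_ExpW_n = alpha * p_max_ExpW_{n-1} + beta * p_max_n
    """
    if cut or prev_weights is None:   # the 1st frame is treated as a cut
        return float(p_min_n), float(p_max_n)
    p_min_w, p_max_w = prev_weights
    return alpha * p_min_w + beta * p_min_n, alpha * p_max_w + beta * p_max_n

print(update_weights(10, 200, None, cut=True))            # reset case
print(update_weights(10, 200, (20.0, 180.0), cut=False))  # blended case
```

With α close to 1, the weights track the previous frame's parameters and change only gradually, which is what suppresses frame-to-frame flicker.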
S602, processing the first image frame based on the first weighting parameter and the second weighting parameter to obtain a second image frame.
Alternatively, the first image frame may be processed to obtain the second image frame according to the following possible implementation: a first difference between the first weighting parameter and the second weighting parameter is obtained. For example, if the first weighting parameter is 40 and the second weighting parameter is 100, the first difference is 60.
A second difference between the pixel value associated with each pixel and the first weighting parameter is obtained, yielding a plurality of second differences. That is, for each pixel in the first image frame, the difference between its pixel value and the first weighting parameter is computed. For example, if the pixel value of a pixel in the first image frame is 200 and the first weighting parameter is 100, the second difference is 100.
And obtaining a second image frame based on the first difference value, the preset value and the plurality of second difference values. For example, the preset value may be 255, and the second image frame is obtained by adjusting the pixel corresponding to each second difference value through the first difference value, the preset value and each second difference value.
Optionally, the electronic device may adjust each pixel value in the first image frame to obtain the second image frame by the following formula:
I_out = (I_sat - p_min_ExpW) × C / (p_max_ExpW - p_min_ExpW)
where I_sat is the pixel value of a pixel in the first image frame; I_out is the adjusted pixel value of that pixel; p_min_ExpW is the first weighting parameter; p_max_ExpW is the second weighting parameter; and C is the preset value.
It should be noted that, through the above formula, the pixel value of each pixel in the first image frame can be smoothly adjusted, performing adaptive curve stretching on the first image frame to obtain the second image frame. For example, if the first image frame includes a pixel A and a pixel B, the pixel values of pixel A and pixel B are adjusted by the above method to obtain the second image frame.
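The stretch formula above can be sketched as follows (illustrative only; the NumPy implementation, the function name, and the clipping of out-of-range results to [0, C] are assumptions beyond the disclosed formula):

```python
import numpy as np

def stretch_frame(frame, p_min_w, p_max_w, c=255.0):
    """Apply the linear stretch
        I_out = (I_sat - p_min_ExpW) * C / (p_max_ExpW - p_min_ExpW)
    to every pixel, clipping the result to the valid [0, C] range."""
    out = (frame.astype(np.float64) - p_min_w) * c / (p_max_w - p_min_w)
    return np.clip(out, 0.0, c).astype(np.uint8)

frame = np.array([[40, 100], [160, 220]], dtype=np.uint8)
print(stretch_frame(frame, 40.0, 220.0))  # maps [40, 220] onto [0, 255]
```

Pixels at the left weighting parameter map to 0 and pixels at the right weighting parameter map to C, so the occupied pixel range is stretched to the full dynamic range.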
The embodiments of the present disclosure provide a method for acquiring a second image frame. Scene cut information is determined based on the first pixel value and the third pixel value, or based on the second pixel value and the fourth pixel value. If the scene cut information indicates that a scene cut occurs between the first image frame and the previous image frame of the first image frame, the first image frame is processed based on the first pixel value and the second pixel value to obtain the second image frame. If the scene cut information indicates that no scene cut occurs between the first image frame and the previous image frame, the weighting parameters of the previous image frame are acquired based on the third pixel value and the fourth pixel value, and the second image frame is determined based on the first pixel value, the second pixel value, and those weighting parameters. In this way, the electronic device can adjust the pixel values in the first image frame in different ways depending on the scene cut information, which improves the flexibility of video enhancement processing. Because the electronic device smoothly enhances the brightness, contrast, and color of each frame by combining pixel distribution information across video frames, flicker between adjacent video frames with large pixel distribution differences after image enhancement can be avoided, further improving the display effect after video enhancement.
On the basis of any one of the above embodiments, a procedure of the above video processing method will be described below with reference to fig. 7.
Fig. 7 is a schematic process diagram of a video processing method according to an embodiment of the disclosure. Referring to fig. 7, a first video is shown. The first video includes an image A and an image B, where image A is the 1st frame in the first video and image B is the 2nd frame in the first video. Image distribution information of image A is acquired, and a pixel distribution curve A is obtained from it; image distribution information of image B is acquired, and a pixel distribution curve B is obtained from it.
Referring to fig. 7, a third pixel value corresponding to a pixel at a position of 5% of the total number of pixels is determined in the pixel distribution curve a, and a fourth pixel value corresponding to a pixel at a position of 95% of the total number of pixels is determined in the pixel distribution curve a. A first pixel value corresponding to a pixel at a position of 5% of the total number of pixels is determined in the pixel distribution curve B, and a second pixel value corresponding to a pixel at a position of 95% of the total number of pixels is determined in the pixel distribution curve B.
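The 5% and 95% endpoints can be read off the cumulative pixel distribution, for example (an illustrative sketch; the function name and the histogram-based computation are assumptions):

```python
import numpy as np

def distribution_endpoints(frame, lo=0.05, hi=0.95):
    """Return the pixel values at the 5% and 95% positions of the
    cumulative pixel distribution, i.e. the left and right endpoints
    of the frame's pixel distribution interval."""
    hist = np.bincount(frame.ravel(), minlength=256)  # 8-bit histogram
    cdf = np.cumsum(hist) / frame.size                # cumulative fraction
    p_min = int(np.searchsorted(cdf, lo))             # first value with cdf >= 5%
    p_max = int(np.searchsorted(cdf, hi))             # first value with cdf >= 95%
    return p_min, p_max

frame = np.arange(256, dtype=np.uint8).reshape(16, 16)  # uniform histogram
print(distribution_endpoints(frame))
```

Using the 5%/95% positions rather than the absolute minimum and maximum makes the endpoints robust to a few outlier pixels.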
Referring to fig. 7, it is determined that scene switching does not occur between the image a and the image B through the first pixel value and the third pixel value, and thus, the stretching curve of the image B is determined through the first pixel value, the second pixel value, the third pixel value, and the fourth pixel value. Wherein the abscissa of the stretch curve is the bit depth of the color. It should be noted that, the process of determining the stretch curve may refer to step S602, which is not described in detail in the embodiments of the present disclosure.
Referring to fig. 7, the second video is obtained by processing the image B through a stretching curve. The second video comprises an image A and an image C, wherein the image C is an image after the image B is subjected to enhanced display processing. Therefore, when each frame of image in the first video is enhanced, the electronic equipment can combine pixel distribution information between adjacent video frames, so that the problem of flickering between adjacent video frames with large pixel distribution difference after the image enhancement processing can be avoided, and the display effect after the video enhancement is improved.
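The process of fig. 7 — endpoint extraction, scene cut test, weighted endpoints, and curve stretching — can be combined into one minimal sketch (illustrative only; the function name, the parameter defaults, and the guard against a flat pixel distribution are assumptions):

```python
import numpy as np

def enhance_video(frames, a=10, alpha=0.9, beta=0.1, c=255.0):
    """Sketch of the pipeline: per-frame distribution endpoints,
    scene cut test, exponentially weighted endpoints, linear stretch."""
    out, w, prev_min = [], None, None
    for f in frames:
        hist = np.bincount(f.ravel(), minlength=256)
        cdf = np.cumsum(hist) / f.size
        p_min = int(np.searchsorted(cdf, 0.05))   # left endpoint (5%)
        p_max = int(np.searchsorted(cdf, 0.95))   # right endpoint (95%)
        cut = w is None or abs(p_min - prev_min) > a
        # Scene cut: reset the weights; otherwise blend with the previous weights.
        w = ((float(p_min), float(p_max)) if cut
             else (alpha * w[0] + beta * p_min, alpha * w[1] + beta * p_max))
        prev_min = p_min
        denom = max(w[1] - w[0], 1e-6)            # guard flat distributions
        stretched = np.clip((f.astype(np.float64) - w[0]) * c / denom, 0.0, c)
        out.append(stretched.astype(np.uint8))
    return out

frames = [np.tile(np.arange(256, dtype=np.uint8), (16, 1))]
second_video = enhance_video(frames)
print(second_video[0].min(), second_video[0].max())  # stretched to 0 255
```

Within a scene, the stretch curve drifts only slowly from frame to frame, which is what avoids the flicker described above.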
Fig. 8 is a schematic structural diagram of a video processing apparatus according to an embodiment of the disclosure. Referring to fig. 8, the video processing apparatus 80 includes an acquisition module 81, a determination module 82, and a generation module 83, wherein:
the acquiring module 81 is configured to acquire image distribution information of a plurality of first image frames in a first video;
the determining module 82 is configured to determine, for any one first image frame, a second image frame associated with the first image frame based on image distribution information of the first image frame and image distribution information of first N image frames of the first image frame, where N is an integer greater than or equal to 1;
The generating module 83 is configured to generate a second video based on a second image frame associated with the plurality of first image frames, where the second video is a video after the first video enhancement process.
In one possible implementation, the determining module 82 is specifically configured to:
determining a first pixel distribution interval of the first image frame based on the image distribution information of the first image frame;
determining N second pixel distribution intervals of the N image frames based on the image distribution information of the first N image frames;
and processing the first image frame based on the first pixel distribution interval and the N second pixel distribution intervals to obtain the second image frame.
In one possible implementation, the determining module 82 is specifically configured to:
acquiring a first pixel value of a left end point and a second pixel value of a right end point of the first pixel distribution interval;
acquiring a third pixel value of a left end point and a fourth pixel value of a right end point of the second pixel distribution interval;
and processing the first image frame based on the first pixel value, the second pixel value, the third pixel value and the fourth pixel value to obtain the second image frame.
In one possible implementation, the determining module 82 is specifically configured to:
determining a first weighting parameter and a second weighting parameter based on the first pixel value, the second pixel value, the third pixel value and the fourth pixel value, wherein the first weighting parameter and the second weighting parameter are used for performing image enhancement processing on the first image frame;
and processing the first image frame based on the first weighting parameter and the second weighting parameter to obtain the second image frame.
In one possible implementation, the determining module 82 is specifically configured to:
determining scene switching information based on the first pixel value and the third pixel value, or determining scene switching information based on the second pixel value and the fourth pixel value, the scene switching information indicating whether or not a scene switching occurred for the first image frame and a previous image frame of the first image frame;
the first weighting parameter and the second weighting parameter are determined based on the scene cut information, the first pixel value, the second pixel value, the third pixel value, and the fourth pixel value.
In one possible implementation, the determining module 82 is specifically configured to:
If the scene switching information indicates that the first image frame and the last image frame of the first image frame are subjected to scene switching, determining the first pixel value as a first weighting parameter and determining the second pixel value as a second weighting parameter;
and if the scene switching information indicates that scene switching does not occur in the first image and the last image frame of the first image frame, acquiring a third weighting parameter and a fourth weighting parameter of the last image frame, and determining the first weighting parameter and the second weighting parameter based on the first pixel value, the second pixel value, the third weighting parameter and the fourth weighting parameter, wherein the third weighting parameter and the fourth weighting parameter are used for performing image enhancement processing on the last image frame.
In one possible implementation, the determining module 82 is specifically configured to:
acquiring a first difference value of the first weighting parameter and the second weighting parameter;
acquiring a second difference value between the pixel value associated with each pixel and the first weighting parameter to obtain a plurality of second difference values;
and obtaining the second image frame based on the first difference value, a preset value and the second difference values.
The video processing device provided in this embodiment may be used to execute the technical solution of the foregoing method embodiment, and its implementation principle and technical effects are similar, and this embodiment will not be described herein again.
Fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure. Referring to fig. 9, a schematic diagram of an electronic device 900 suitable for implementing embodiments of the present disclosure is shown, where the electronic device 900 may be a terminal device or a server. The terminal device may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a personal digital assistant (Personal Digital Assistant, PDA for short), a tablet (Portable Android Device, PAD for short), a portable multimedia player (Portable Media Player, PMP for short), an in-vehicle terminal (e.g., an in-vehicle navigation terminal), and the like, and a fixed terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 9 is merely an example, and should not impose any limitations on the functionality and scope of use of embodiments of the present disclosure.
As shown in fig. 9, the electronic apparatus 900 may include a processing device (e.g., a central processor, a graphics processor, or the like) 901, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 902 or a program loaded from a storage device 908 into a random access Memory (Random Access Memory, RAM) 903. In the RAM 903, various programs and data necessary for the operation of the electronic device 900 are also stored. The processing device 901, the ROM 902, and the RAM 903 are connected to each other through a bus 904. An input/output (I/O) interface 905 is also connected to the bus 904.
In general, the following devices may be connected to the I/O interface 905: input devices 906 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, and the like; an output device 907 including, for example, a liquid crystal display (Liquid Crystal Display, LCD for short), a speaker, a vibrator, and the like; storage 908 including, for example, magnetic tape, hard disk, etc.; and a communication device 909. The communication means 909 may allow the electronic device 900 to communicate wirelessly or by wire with other devices to exchange data. While fig. 9 shows an electronic device 900 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network via the communication device 909, or installed from the storage device 908, or installed from the ROM 902. When executed by the processing device 901, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the methods shown in the above-described embodiments.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including an object oriented programming language such as Java, smalltalk, C ++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (Local Area Network, LAN for short) or a wide area network (Wide Area Network, WAN for short), or it may be connected to an external computer (e.g., connected via the internet using an internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. The name of the unit does not in any way constitute a limitation of the unit itself, for example the first acquisition unit may also be described as "unit acquiring at least two internet protocol addresses".
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It should be noted that references to "a", "an", "the", and "a plurality" in this disclosure are illustrative rather than limiting; those of ordinary skill in the art will appreciate that these terms are to be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
It will be appreciated that prior to using the technical solutions disclosed in the embodiments of the present disclosure, the user should be informed and authorized of the type, usage range, usage scenario, etc. of the personal information related to the present disclosure in an appropriate manner according to the relevant legal regulations.
For example, in response to receiving an active request from a user, a prompt is sent to the user to explicitly prompt the user that the operation it is requesting to perform will require personal information to be obtained and used with the user. Thus, the user can autonomously select whether to provide personal information to software or hardware such as an electronic device, an application program, a server or a storage medium for executing the operation of the technical scheme of the present disclosure according to the prompt information.
As an alternative but non-limiting implementation, in response to receiving an active request from a user, the manner in which the prompt information is sent to the user may be, for example, a popup, in which the prompt information may be presented in a text manner. In addition, a selection control for the user to select to provide personal information to the electronic device in a 'consent' or 'disagreement' manner can be carried in the popup window.
It will be appreciated that the above-described notification and user authorization process is merely illustrative and not limiting of the implementations of the present disclosure, and that other ways of satisfying relevant legal regulations may be applied to the implementations of the present disclosure.
It will be appreciated that the data (including but not limited to the data itself, the acquisition or use of the data) involved in the present technical solution should comply with the corresponding legal regulations and the requirements of the relevant regulations. The data may include information, parameters, messages, etc., such as tangential flow indication information.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the technical principles employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to the specific combinations of features described above, but also covers other embodiments formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, embodiments formed by substituting the above features with (but not limited to) technical features having similar functions disclosed in the present disclosure.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (11)

1. A video processing method, comprising:
acquiring image distribution information of a plurality of first image frames in a first video;
For any one first image frame, determining a second image frame associated with the first image frame based on image distribution information of the first image frame and image distribution information of the first N image frames of the first image frame, wherein N is an integer greater than or equal to 1;
generating a second video based on a second image frame associated with the plurality of first image frames, the second video being a video after the first video enhancement process.
2. The method of claim 1, wherein the determining the second image frame associated with the first image frame based on the image distribution information of the first image frame and the image distribution information of the first N image frames of the first image frame comprises:
determining a first pixel distribution interval of the first image frame based on the image distribution information of the first image frame;
determining N second pixel distribution intervals of the N image frames based on the image distribution information of the first N image frames;
and processing the first image frame based on the first pixel distribution interval and the N second pixel distribution intervals to obtain the second image frame.
3. The method of claim 2, wherein N is 1, and wherein the processing the first image frame based on the first pixel distribution interval and the N second pixel distribution intervals to obtain the second image frame comprises:
acquiring a first pixel value of a left end point and a second pixel value of a right end point of the first pixel distribution interval;
acquiring a third pixel value of a left end point and a fourth pixel value of a right end point of the second pixel distribution interval;
and processing the first image frame based on the first pixel value, the second pixel value, the third pixel value and the fourth pixel value to obtain the second image frame.
4. The method of claim 3, wherein the processing the first image frame based on the first pixel value, the second pixel value, the third pixel value and the fourth pixel value to obtain the second image frame comprises:
determining a first weighting parameter and a second weighting parameter based on the first pixel value, the second pixel value, the third pixel value and the fourth pixel value, wherein the first weighting parameter and the second weighting parameter are used for performing image enhancement processing on the first image frame;
and processing the first image frame based on the first weighting parameter and the second weighting parameter to obtain the second image frame.
5. The method of claim 4, wherein the determining a first weighting parameter and a second weighting parameter based on the first pixel value, the second pixel value, the third pixel value, and the fourth pixel value comprises:
determining scene switching information based on the first pixel value and the third pixel value, or determining scene switching information based on the second pixel value and the fourth pixel value, the scene switching information indicating whether a scene switch occurs between the first image frame and the previous image frame of the first image frame; and
the first weighting parameter and the second weighting parameter are determined based on the scene cut information, the first pixel value, the second pixel value, the third pixel value, and the fourth pixel value.
6. The method of claim 5, wherein the determining the first weighting parameter and the second weighting parameter based on the scene cut information, the first pixel value, the second pixel value, the third pixel value, and the fourth pixel value comprises:
if the scene switching information indicates that a scene switch occurs between the first image frame and the previous image frame of the first image frame, determining the first pixel value as the first weighting parameter and determining the second pixel value as the second weighting parameter; and
if the scene switching information indicates that no scene switch occurs between the first image frame and the previous image frame of the first image frame, acquiring a third weighting parameter and a fourth weighting parameter of the previous image frame, and determining the first weighting parameter and the second weighting parameter based on the first pixel value, the second pixel value, the third weighting parameter and the fourth weighting parameter, wherein the third weighting parameter and the fourth weighting parameter are used for performing image enhancement processing on the previous image frame.
7. The method according to any one of claims 4-6, wherein processing the first image frame based on the first weighting parameter and the second weighting parameter to obtain the second image frame includes:
acquiring a first difference value between the first weighting parameter and the second weighting parameter;
acquiring a second difference value between each pixel value of the first image frame and the first weighting parameter, to obtain a plurality of second difference values; and
obtaining the second image frame based on the first difference value, a preset value and the plurality of second difference values.
9. A video processing device, comprising an acquisition module, a determination module and a generation module, wherein:
the acquisition module is configured to acquire image distribution information of a plurality of first image frames in a first video;
the determination module is configured to determine, for any one first image frame, a second image frame associated with the first image frame based on image distribution information of the first image frame and image distribution information of the N image frames preceding the first image frame, wherein N is an integer greater than or equal to 1; and
the generation module is configured to generate a second video based on the second image frames associated with the plurality of first image frames, the second video being a video obtained by performing enhancement processing on the first video.
9. An electronic device, comprising: a processor and a memory;
the memory stores computer-executable instructions;
the processor executing computer-executable instructions stored in the memory, causing the processor to perform the video processing method of any one of claims 1 to 7.
10. A computer readable storage medium having stored therein computer executable instructions which, when executed by a processor, implement the video processing method of any of claims 1 to 7.
11. A computer program product comprising a computer program which, when executed by a processor, implements the video processing method according to any one of claims 1 to 7.
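Read together, claims 1 through 7 describe a temporally smoothed contrast-stretching pipeline: each frame's pixel distribution interval is estimated, endpoint jumps against the previous frame signal a scene switch, the weighting parameters are either reset (on a switch) or blended with the previous frame's parameters (no switch), and the frame is linearly stretched by the parameters. The sketch below is an illustrative reading, not the patented implementation: the percentile clip, the scene-cut threshold `CUT_THRESHOLD`, the smoothing factor `ALPHA`, and the preset value 255 are all assumptions the claims leave open.

```python
import numpy as np

CLIP_LO, CLIP_HI = 1.0, 99.0   # percentiles defining the pixel distribution interval (assumed)
CUT_THRESHOLD = 40.0           # endpoint jump taken to indicate a scene switch (assumed)
ALPHA = 0.25                   # temporal smoothing factor for the weighting parameters (assumed)
PRESET = 255.0                 # "preset value" of claim 7, here the full 8-bit range

def distribution_interval(frame: np.ndarray) -> tuple[float, float]:
    """Left/right end-point pixel values of a frame's pixel distribution interval."""
    lo, hi = np.percentile(frame, [CLIP_LO, CLIP_HI])
    return float(lo), float(hi)

def enhance_video(frames: list[np.ndarray]) -> list[np.ndarray]:
    """Produce the second image frames (enhanced video) from the first image frames."""
    out = []
    w1 = w2 = prev_lo = prev_hi = None
    for frame in frames:
        lo, hi = distribution_interval(frame)
        # Claim 5: decide on a scene switch from the end-point differences.
        cut = (prev_lo is None
               or abs(lo - prev_lo) > CUT_THRESHOLD
               or abs(hi - prev_hi) > CUT_THRESHOLD)
        if cut:
            # Claim 6, scene switch: use this frame's own end points.
            w1, w2 = lo, hi
        else:
            # Claim 6, no switch: blend with the previous frame's parameters.
            w1 = ALPHA * lo + (1.0 - ALPHA) * w1
            w2 = ALPHA * hi + (1.0 - ALPHA) * w2
        # Claim 7: stretch each (pixel - w1) by PRESET / (w2 - w1), then clip.
        stretched = (frame.astype(np.float64) - w1) * PRESET / max(w2 - w1, 1e-6)
        out.append(np.clip(stretched, 0, 255).astype(np.uint8))
        prev_lo, prev_hi = lo, hi
    return out
```

Blending the weighting parameters across frames (rather than stretching every frame independently) is what suppresses the flicker that per-frame histogram stretching would otherwise introduce, while the scene-cut reset avoids smearing parameters across unrelated shots.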
CN202211112270.2A 2022-09-13 2022-09-13 Video processing method and device and electronic equipment Pending CN117745546A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211112270.2A CN117745546A (en) 2022-09-13 2022-09-13 Video processing method and device and electronic equipment


Publications (1)

Publication Number Publication Date
CN117745546A true CN117745546A (en) 2024-03-22

Family

ID=90257805



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination