CN110786002B - Video processing method, device and computer readable storage medium - Google Patents


Publication number
CN110786002B
CN110786002B (application CN201880039251.7A)
Authority
CN
China
Prior art keywords: video frame; target video; target; processing; specified pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201880039251.7A
Other languages
Chinese (zh)
Other versions
CN110786002A (en)
Inventor
李菊梅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dajiang Mutual Entertainment Technology Beijing Co ltd
SZ DJI Technology Co Ltd
Original Assignee
Dajiang Mutual Entertainment Technology Beijing Co ltd
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dajiang Mutual Entertainment Technology Beijing Co ltd and SZ DJI Technology Co Ltd
Publication of CN110786002A
Application granted
Publication of CN110786002B
Legal status: Active

Classifications

    • H ELECTRICITY · H04 ELECTRIC COMMUNICATION TECHNIQUE · H04N PICTORIAL COMMUNICATION, e.g. TELEVISION · H04N 5/00 Details of television systems · H04N 5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N 5/2621 Cameras specially adapted for the electronic generation of special effects during image pickup, e.g. digital cameras, camcorders, video cameras having integrated special effects capability
    • H04N 5/268 Signal distribution or switching

Abstract

Embodiments of the invention provide a video processing method, a video processing device, and a computer-readable storage medium. The video processing method includes: during the transition from a first video to a second video, determining a fade factor corresponding to each target video frame in the first video and/or a fade factor corresponding to each target video frame in the second video; and performing floodlight processing on each target video frame according to the specified pixel characteristics of the pixels of the target video frame and the fade factor corresponding to that target video frame.

Description

Video processing method, device and computer readable storage medium
Technical Field
The present invention relates to the field of video processing technologies, and in particular, to a video processing method, a video processing device, and a computer-readable storage medium.
Background
Video transition is one type of video processing. A video file usually comprises multiple videos that differ in shot or scene. Switching the shot or scene directly appears abrupt, so a transition needs to be made between the two videos.
In related video transition methods, a video is usually modified using uniform transition materials such as blinds, fly-ins, checkerboards, and fades. Because a uniform transition material produces the same effect for every transition, the transition cannot be personalized to the individual videos, and the transition effect is poor.
Disclosure of Invention
The invention provides a video processing method, a video processing device, and a computer-readable storage medium that can achieve a personalized video transition with a better transition effect.
In a first aspect of the embodiments of the present invention, a video processing method is provided, including:
determining a fade factor corresponding to each target video frame in the first video and/or a fade factor corresponding to each target video frame in the second video during the transition from the first video to the second video; and
performing floodlight processing on the target video frame according to the specified pixel characteristics of the pixels of the target video frame and the fade factor corresponding to the target video frame.
In a second aspect of the embodiments of the present invention, there is provided an electronic device, including: a memory and a processor;
the memory for storing program code;
the processor, configured to invoke the program code, and when executed, configured to:
determining a fade factor corresponding to each target video frame in the first video and/or a fade factor corresponding to each target video frame in the second video during the transition from the first video to the second video; and
performing floodlight processing on the target video frame according to the specified pixel characteristics of the pixels of the target video frame and the fade factor corresponding to the target video frame.
In a third aspect of the embodiments of the present invention, a computer-readable storage medium is provided, where computer instructions are stored on the computer-readable storage medium, and when the computer instructions are executed, the video processing method according to the first aspect of the embodiments of the present invention is implemented.
Based on the above technical scheme, in the embodiments of the invention, a fade factor corresponding to each target video frame is determined during the transition from the first video to the second video, and each target video frame is processed according to the specified pixel characteristics of its pixels and its corresponding fade factor, that is, according to the personalized information of that frame. A personalized video transition can therefore be achieved, with a better transition effect. In addition, because the target video frame undergoes floodlight processing, bright areas produce a halo effect, making the transition effect better still.
Drawings
To illustrate the technical solutions of the embodiments of the invention more clearly, the drawings required by the embodiments are briefly described below. The drawings described below show only some embodiments of the invention; those skilled in the art can obtain other drawings from them.
FIG. 1 is a flow chart of a video processing method according to an embodiment of the invention;
FIG. 2 is a flow chart illustrating a video frame flood process according to an embodiment of the present invention;
FIGS. 3-6 are schematic diagrams of the floodlight effect of a video frame during a transition according to an embodiment of the present invention;
fig. 7 is a block diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. The described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those skilled in the art without creative effort based on the embodiments given herein fall within the protection scope of the invention. In addition, the features of the embodiments described below may be combined with one another provided they do not conflict.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein and in the claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should be understood that the term "and/or" as used herein is meant to encompass any and all possible combinations of one or more of the associated listed items.
Although the terms first, second, third, etc. may be used herein to describe various information, the information should not be limited by these terms, which are used only to distinguish one type of information from another. For example, first information may also be referred to as second information and, similarly, second information as first information, without departing from the scope of the invention. Depending on the context, the word "if" as used herein may be interpreted as "upon", "when", or "in response to determining".
The embodiments of the invention provide a video processing method that can be applied to an electronic device. The electronic device may be any device with a video processing function, such as a computer or an embedded device, without limitation. It may include a display device for presenting the video processing result, or it may lack a display and instead transmit the processed video to another device for display.
For example, the electronic device may be a mobile phone. Video editing on mobile phones is increasingly in demand, so the video processing method of the embodiments can be widely used there. Because video processing involves a large amount of data, when the phone provides resources such as OpenGL (Open Graphics Library) and a GPU (Graphics Processing Unit), the phone's video editor can implement the method with OpenGL on the GPU. This makes good use of the phone's limited resources, enabling real-time video preview together with a good processing effect. Of course, this is only one possible platform and implementation of the method, and is not limiting.
The video processing method of the embodiments mainly concerns video transitions. In a related transition approach, the video is processed with transition materials; for example, the pixel characteristics of specified areas of the video frames are modified to uniform colors during the transition, a checkerboard pattern appears in the frame and then disappears in subsequent frames, and the transition completes. Because the specified areas and colors are the same for all videos, a personalized transition effect cannot be achieved for the different videos that require a transition.
In the embodiments of the invention, during the transition from the first video to the second video, the fade factor corresponding to each target video frame is determined first, and the target video frame is then processed according to the specified pixel characteristics of its pixels and its corresponding fade factor, that is, according to the personalized information of each target video frame. A personalized transition can therefore be achieved, with a better effect; moreover, the floodlight processing gives bright areas a halo effect, improving the transition further.
The following describes the video processing method according to the embodiment of the present invention more specifically, but not by way of limitation.
Referring to fig. 1, in one embodiment, a video processing method may include the steps of:
s100: determining a gradual change factor corresponding to each target video frame in the first video and/or a gradual change factor corresponding to each target video frame in the second video in the process of transferring from the first video to the second video;
s200: and carrying out floodlight processing on the target video frame according to the specified pixel characteristics of the pixels of the target video frame and the gradual change factor corresponding to the target video frame.
Specifically, the execution subject of the video processing method may be an electronic device, and further one or more processors of the electronic device; each processor may be a general-purpose or a special-purpose processor.
In step S100, the process of transitioning from the first video to the second video is the transition between the two videos, and the target video frames determined in this process are the video frames involved in the transition.
The target video frames in the first video may be its last few frames, for example its last 15 frames; when the transition is implemented using only target video frames of the first video, the second video need not contribute any target video frames. Correspondingly, the target video frames in the second video may be its first few frames, for example its first 15 frames; when the transition uses only target video frames of the second video, the first video need not contribute any target video frames.
The first video and the second video may be two sub-segments of the same video that require a transition, or two different videos, which may be processed first and then spliced, or spliced into one video first and then processed, according to the video processing method of the embodiments.
Preferably, both the first video and the second video contain target video frames for the transition; that is, during the transition, the last frames of the first video and the first frames of the second video are both processed accordingly.
During the transition from the first video to the second video, the fade factor corresponding to each target video frame can be determined, that is, the fade factor of each target video frame in the first video and/or the fade factor of each target video frame in the second video.
The fade factors of different target video frames may be the same or different, as long as each target video frame has a corresponding fade factor. The fade factors may be prestored in the memory of the electronic device and retrieved by the processor during the transition, or they may be computed by the processor during the transition; neither option is limiting.
In step S200, for each target video frame whose fade factor has been determined, the processor performs floodlight processing on the frame according to the specified pixel characteristics of its pixels and its corresponding fade factor.
For each target video frame, the specified pixel characteristics of its pixels and its corresponding fade factor are the personalized information of that frame. Floodlighting each frame according to this information therefore yields floodlight effects that differ from frame to frame, so the transitions of different videos are, as a whole, personalized and distinct.
Because the floodlight effect of each target video frame during the transition differs, the transition portion of the final video presents a dynamically changing floodlight effect, unlike the static floodlight effect of a single image.
The fade factor determines the degree of floodlight processing required in a target video frame; when the fade factor changes gradually, the degree of floodlighting also changes gradually between frames, producing a gradual transition effect.
In one embodiment, the fade factor of each target video frame in the first video decreases, while the fade factor of each target video frame in the second video increases. The fade factor is a numerical value whose range depends on the value range of the specified pixel characteristics it is compared against.
For example, the fade factor of each target video frame in the first video may decrease from 1 to 0, and the fade factor of each target video frame in the second video may increase from 0 to 1.
Specifically, for a 2-second video transition, the fade factors of the target video frames in the first video decrease as 1, 1 - 1/fps, 1 - 2/fps, ..., down to 0, and the fade factors of the target video frames in the second video increase as 0, 1/fps, 2/fps, ..., up to 1, where fps is the frame rate.
Alternatively, the fade factor of each target video frame in the first video may decrease to a set fade factor and then remain unchanged, while the fade factors of the first N target video frames in the second video hold the set fade factor and those of the remaining target video frames increase. The set fade factor may, for example, be 0, without limitation. N may be any integer not less than 0 and less than the total number of target video frames in the second video.
For example, the fade factor of each target video frame in the first video may decrease from 1 to 0 and then remain at 0, while the fade factors of the first N target video frames in the second video hold 0 and those of the remaining frames gradually increase to 1.
Preferably, N equals the number of target video frames in the first video whose fade factor is the set fade factor, but this is not limiting.
The trend of the fade factors across the target video frames may be symmetric or asymmetric. Since the fade factor is compared against the specified pixel characteristics of the pixels in the corresponding target video frame, its value is determined by the value range of those characteristics and need not lie between 0 and 1.
Preferably, the fade factor of each target video frame in the first video decreases linearly, that is, by the same amount each time, and the fade factor of each target video frame in the second video increases linearly, by the same increment each time. Linear decrease and increase are not required, however; nonlinear decreases and increases are also possible.
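As a rough illustration (not part of the patent), the linear 1/fps schedules described above can be sketched as follows; the helper name and the clamping to [0, 1], which mirrors the "decrease then hold the set factor 0" variant, are assumptions:

```python
def fade_factors(num_frames, fps, decreasing=True):
    """Per-frame fade factors following the 1/fps step described above.

    Frame i of the first video gets max(1 - i/fps, 0) (decreasing);
    frame i of the second video gets min(i/fps, 1) (increasing).
    """
    if decreasing:
        return [max(1.0 - i / fps, 0.0) for i in range(num_frames)]
    return [min(i / fps, 1.0) for i in range(num_frames)]
```

With fps = 4 and five target frames, the first video's factors run 1, 0.75, 0.5, 0.25, 0 and the second video's run 0, 0.25, 0.5, 0.75, 1.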
For a target video frame whose pixel values are color values, the specified pixel characteristics may include at least one of an R channel characteristic, a G channel characteristic, and a B channel characteristic. If the pixel values are other attribute values, the specified pixel characteristics are adjusted accordingly; for example, for a grayscale target video frame, the specified pixel characteristic may be a grayscale characteristic.
The specified pixel characteristics are thus not limited. The following embodiments take the R channel, G channel, and B channel characteristics as examples, without limitation.
The R, G, and B channel characteristics may directly be the RGB values of the pixel (the R, G, and B channel values), in which case the corresponding fade factor ranges from 0 to 255; or they may be normalized RGB values, in which case the fade factor ranges from 0 to 1. The following embodiments assume normalized RGB values and a fade factor between 0 and 1.
In one embodiment, in step S200, performing floodlight processing on the target video frame according to the specified pixel characteristics of its pixels and its corresponding fade factor includes the following step:
S201: for each target video frame, if a specified pixel characteristic of a pixel of the target video frame is higher than the fade factor corresponding to the frame, stretching that specified pixel characteristic, as shown in fig. 2.
In step S201, for each target video frame, the processor stretches the specified pixel characteristics of the pixels that lie above the fade factor, making the bright portions of the frame brighter and thereby floodlighting the frame.
The processor may first determine whether the specified pixel characteristic of a pixel in the target video frame is higher than the frame's fade factor, stretch the characteristic whenever a qualifying pixel is encountered, and then continue to the next pixel until all pixels of the frame have been traversed.
Specifically, the R, G, and B channel characteristics of each pixel of the target video frame are compared with the fade factor of the frame to decide whether stretching is required: if the R channel characteristic of the pixel is greater than the fade factor, the R channel characteristic is stretched; if the G channel characteristic is greater than the fade factor, the G channel characteristic is stretched; and if the B channel characteristic is greater than the fade factor, the B channel characteristic is stretched.
In one embodiment, if any one of the R, G, and B channel characteristics of a pixel of the target video frame is greater than the fade factor, all channel characteristics of that pixel are stretched.
As mentioned above, the fade factor of each target video frame in the first video decreases and the fade factor of each target video frame in the second video increases; several frames in the middle may also hold the set fade factor.
During the transition, for target video frames with a decreasing fade factor, the number of pixels whose specified pixel characteristics exceed the fade factor grows frame by frame, so the floodlit area grows frame by frame; brighter objects are floodlit first and darker objects later. For target video frames with an increasing fade factor, the number of such pixels shrinks frame by frame, so the floodlit area shrinks frame by frame.
In a video, darker objects are generally the more important ones, such as people, trees, and houses, while brighter objects such as the sky or the sun are generally less important. In the transition, therefore, the more important objects of the first video disappear last and the more important objects of the second video appear first. Which objects are more important differs between videos and even between target video frames, but the more important objects persist longest during the transition. Stretching according to the specified pixel characteristics and fade factor of each target video frame thus realizes a gradual floodlight during the transition while highlighting the more important objects.
Preferably, in step S201, stretching the specified pixel characteristic of a pixel of the target video frame includes the following steps:
S2011: determining a target stretch value using a set exponential function and the specified pixel characteristic of the pixel;
S2012: stretching the specified pixel characteristic of the pixel to the target stretch value.
In step S2011, the processor may substitute the specified pixel characteristic of the pixel into the set exponential function and compute the target stretch value for that characteristic.
In step S2012, the processor stretches the specified pixel characteristic of the pixel to the computed target stretch value, that is, it replaces the characteristic with the target stretch value.
Because the dependent variable must be a stretched version of the independent variable, the base of the set exponential function is greater than 0, and the specified pixel characteristic of the pixel is substituted as the independent variable. Since the R, G, and B channel characteristics here are normalized RGB values, the target stretch value computed by the exponential function must not exceed 1; many exponential functions satisfy these conditions, and they are not enumerated here.
Stretching the specified pixel characteristics with an exponential function stretches every characteristic above the fade factor while preserving a certain brightness difference: the set exponential function increases monotonically over the value range of the characteristics, so darker objects map to lower target stretch values and brighter objects to higher ones, increasing the brightness difference between dark and bright objects.
The exponential function used is not specifically limited, as long as it stretches the brightness and maintains a brightness difference. Nor is the stretching method limited to an exponential function; any method that stretches the specified pixel characteristics above the fade factor may be used.
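As a sketch only, one exponential function meeting these conditions (the concrete formula and base are assumptions, not the patent's) is f(x) = (b^x - 1)/(b - 1) with 0 < b < 1: it increases monotonically on [0, 1], maps 0 to 0 and 1 to 1, and satisfies f(x) > x in between, so every characteristic above the fade factor is brightened without exceeding 1. Combined with the any-channel rule of the embodiment above:

```python
import numpy as np

def flood_stretch(frame, fade_factor, base=0.25):
    """Stretch bright pixels of a normalized RGB frame (values in [0, 1]).

    If any channel of a pixel exceeds fade_factor, all three channels
    are replaced by f(x) = (base**x - 1)/(base - 1), a monotonically
    increasing exponential stretch with f(x) > x on (0, 1).
    base (0 < base < 1) is an illustrative choice.
    """
    frame = frame.astype(np.float64)
    # any-channel rule: one bright channel triggers stretching of all three
    bright = (frame > fade_factor).any(axis=-1, keepdims=True)
    stretched = (base ** frame - 1.0) / (base - 1.0)
    return np.where(bright, stretched, frame)
```

A pixel below the fade factor in all channels passes through unchanged, while brighter pixels move toward 1, which is the frame-by-frame floodlight growth the text describes.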
In one embodiment, with reference to fig. 2, in step S200, performing floodlight processing on the target video frame according to the specified pixel characteristics of its pixels and its corresponding fade factor further includes:
S203: blurring the stretched target video frame.
Step S203 may be executed after step S201; the processor blurs the entire target video frame, which makes the transition effect better.
The specific blurring method is not limited; examples include mean blur, Gaussian blur, median blur, binary blur, and the like, and the blurring operation may combine several blurring methods rather than use only one.
Preferably, in step S203, blurring the stretched target video frame includes performing Gaussian blurring on the stretched target video frame in the horizontal and/or vertical direction.
Because the image is two-dimensional, a separable Gaussian blur is used to reduce the computation: the target video frame is Gaussian-blurred in the horizontal and vertical directions separately. The frame may first be blurred horizontally row by row and then the horizontally blurred frame blurred vertically column by column, or blurred vertically first and horizontally second.
Of course, the Gaussian blurring may also be performed only horizontally or only vertically, without limitation.
Blurring brings the specified pixel characteristics of neighboring pixels in the target video frame closer together, so the human eye cannot distinguish the differences between pixels and the transition appears more natural.
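A minimal sketch of the separable (rows-then-columns) Gaussian blur described above; the kernel radius, sigma, and edge replication at the borders are illustrative choices not fixed by the text:

```python
import numpy as np

def gaussian_kernel(radius, sigma):
    """Normalized 1-D Gaussian kernel of length 2*radius + 1."""
    x = np.arange(-radius, radius + 1, dtype=np.float64)
    k = np.exp(-(x * x) / (2.0 * sigma * sigma))
    return k / k.sum()

def separable_blur(image, radius=2, sigma=1.0):
    """2-D Gaussian blur as two 1-D passes: rows first, then columns."""
    k = gaussian_kernel(radius, sigma)
    # horizontal pass, row by row (edge-replicated padding keeps the size)
    padded = np.pad(image, ((0, 0), (radius, radius)), mode="edge")
    horiz = np.stack([np.convolve(row, k, mode="valid") for row in padded])
    # vertical pass, column by column, on the horizontally blurred frame
    padded = np.pad(horiz, ((radius, radius), (0, 0)), mode="edge")
    return np.stack([np.convolve(col, k, mode="valid") for col in padded.T]).T
```

Two 1-D passes touch 2·(2r+1) samples per pixel instead of (2r+1)² for a full 2-D kernel, which is the computational saving the text refers to.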
In one embodiment, with continued reference to fig. 2, before the blurring process is performed on the target video frame after the stretching process, the method further includes:
S202: down-sampling the stretched target video frame;
after the blurring process is performed on the target video frame which is subjected to the stretching process, the method further comprises the following steps:
S204: up-sampling the blurred target video frame so that the up-sampled target video frame has the same size as the target video frame before the down-sampling.
As shown in fig. 2, the processor may execute the steps in order from step S201 to step S204.
In step S201, the processor stretches the specified pixel characteristics of the pixels that are higher than the gradual change factor of the target video frame. Step S202 is then executed: the processor down-samples the stretched target video frame. The purpose of the down-sampling is to reduce the amount of computation in the subsequent Gaussian blur processing, and the size after down-sampling is not limited; for example, the down-sampling may keep one pixel out of every two.
After the down-sampling is completed, the processor performs the blurring processing on the target video frame in step S203; reference may be made to the foregoing description of the blurring processing, and details are not repeated here.
Next, in step S204, the processor up-samples the blurred target video frame so that the up-sampled target video frame has the same size as the target video frame before the down-sampling. For example, when the down-sampling keeps one pixel out of every two, each pixel is interpolated to obtain two pixels.
Up-sampling and down-sampling with OpenGL are computationally inexpensive, yet the down-sampling reduces the amount of blurring computation; moreover, the up-sampling process itself further blurs the entire target video frame.
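A minimal sketch of the down-sample, blur, up-sample pipeline of steps S202 to S204 (NumPy as a stand-in for the OpenGL sampling; the 2x factor and the nearest-neighbour up-sampling are assumptions for illustration):

```python
import numpy as np

def blur_at_half_resolution(frame, blur):
    """Down-sample 2x (keep one pixel out of every two), apply the supplied
    `blur` callable at the reduced size, then up-sample back so the result
    has the same size as the frame before down-sampling."""
    h, w = frame.shape
    small = frame[::2, ::2]                                   # step S202
    small = blur(small)                                       # step S203, on 1/4 the pixels
    big = np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)   # step S204: each pixel -> 2x2
    return big[:h, :w]                                        # crop in case of odd sizes
```

Because the blur runs on a quarter of the pixels, its cost drops roughly fourfold, and the crude up-sampling smears the result slightly further, consistent with the observation above.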
In one embodiment, before the upsampling the blurred target video frame, the method further includes the following steps:
S205: for the specified pixel feature of each pixel in the target video frame, determining a target stretching value for the specified pixel feature using the specified pixel feature and a set exponential function, and stretching the specified pixel feature of the pixel to the target stretching value, so as to stretch the target video frame as a whole.
Step S205, executed between steps S203 and S204, may perform further stretching on the basis of the stretching in step S201. The difference is that in step S205 the specified pixel characteristics of all pixels in the target video frame are stretched, that is, the target video frame is stretched as a whole.
The set exponential function may be the same as that in step S201, or may be different. If it is the same, then after the stretching processing with the exponential function, the maximum of the specified pixel characteristics of all pixels in the target video frame is a value close to 1 but not equal to 1; to produce the white-light effect, further stretching is required. The further stretching may be implemented by weighting, for example, so that the specified pixel characteristic of at least some pixels in the target video frame reaches 1.
In other words, the exponential function in step S205 may be a function obtained by weighting the exponential function in step S201. The function may be a monotonically increasing function or a piecewise function; it is not specifically limited, as long as the overall stretching can be achieved so that the specified pixel feature of at least some pixels in the target video frame is 1.
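One hypothetical way to realize the weighted overall stretch of step S205 (the power curve standing in for the exponential function, and the peak normalization standing in for the "weighting", are both assumptions, not fixed by this description):

```python
import numpy as np

def stretch_whole_frame(frame, gamma=0.5):
    """Stretch the specified pixel feature of every pixel with a monotonically
    increasing power curve, then weight the result so that the specified pixel
    feature of at least some pixels becomes exactly 1."""
    stretched = np.power(np.clip(frame, 0.0, 1.0), gamma)
    peak = stretched.max()
    if peak > 0.0:
        stretched = stretched / peak   # the "weighting": push the maximum to 1
    return stretched
```

Any monotonically increasing curve followed by such a normalization satisfies the stated requirement that at least some pixels reach 1.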
In one embodiment, in step S201, the stretching the specified pixel characteristics of the pixels of the target video frame includes:
S2013: if the gradient factor of the target video frame is the set gradient factor, modifying the specified pixel characteristics of the pixels of the target video frame to the set maximum value, so that the target video frame presents a full white picture.
For example, when the set gradient factor is 0, the specified pixel characteristics of the pixels of the target video frame are all higher than the gradient factor of the target video frame, and the specified pixel characteristics of the pixels of the target video frame are all set to the maximum value (for example, the R channel feature, the G channel feature, and the B channel feature are all stretched to 1), so that the target video frame presents a full white picture.
During the transition from the first video to the second video, the last frame or frames of the first video and/or the first frame or frames of the second video are thus presented as full white pictures, so that the second video appears only after the first video has completely disappeared, making the connection more natural.
In one embodiment, the method further comprises the steps of:
S300: for each target video frame, if the specified pixel characteristics of the pixels of the target video frame are lower than the gradual change factor corresponding to the target video frame, modifying the specified pixel characteristics of the pixels of the target video frame to the set minimum value.
If the specified pixel characteristic of a pixel of the target video frame is lower than the gradient factor corresponding to the target video frame, the pixel corresponds to a darker object that does not need floodlight processing; the specified pixel characteristic of that pixel is therefore modified to the set minimum value, for example, the R channel feature, the G channel feature, and the B channel feature are all pulled down to 0, which further darkens the frame.
Step S300 may be executed before or after step S201, or simultaneously with step S201; this is not specifically limited.
Of course, the specified pixel characteristics of pixels lower than the gradual change factor may also be kept unchanged rather than modified to the set minimum value.
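Steps S201 and S300 together amount to a per-pixel threshold on the fade factor; a sketch, with the stretching curve left as a caller-supplied callable since its exact form is not fixed here:

```python
import numpy as np

def flood_threshold(features, fade_factor, stretch):
    """Pixels whose specified pixel feature is higher than the fade factor
    are stretched (step S201); pixels at or below it are pulled down to the
    set minimum value 0 (step S300)."""
    out = np.zeros_like(features)          # set minimum value = 0
    mask = features > fade_factor
    out[mask] = stretch(features[mask])
    return out
```

Note that as the fade factor decreases toward 0, more and more pixels pass the threshold and get stretched, which is what grows the floodlit area frame by frame.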
In one embodiment, after performing the floodlight processing on the target video frame according to the specified pixel characteristics of the pixels of the target video frame and the corresponding gradient factor of the target video frame, the method further includes the following steps:
S400: fusing the target video frame that has not undergone floodlight processing with the video frame obtained after the target video frame undergoes the floodlight processing.
The target video frame that has not undergone floodlight processing may be the original target video frame that has not undergone any processing.
Step S400 may be executed after step S200, or after steps S300 and S200 have both been executed. The processor fuses the target video frame that has not undergone floodlight processing with the video frame obtained after the floodlight processing, and the fused video frame serves as the video frame finally presented during the transition; it can be displayed on the display device in real time once fused.
Preferably, the fusing the target video frame which is not subjected to the floodlighting processing and the video frame obtained after the target video frame is subjected to the floodlighting processing includes:
and superposing the target video frame which is not subjected to floodlight processing and the pixels at the same position in the video frame obtained after the target video frame is subjected to floodlight processing, and acquiring the larger one of the specified pixel characteristics of each pixel during superposition.
Comparing the specified pixel characteristics of the pixels at corresponding positions in the target video frame which is not subjected to the floodlighting processing with the specified pixel characteristics of the pixels at corresponding positions in the target video frame which is subjected to the floodlighting processing, taking the example of performing superposition on the target video frame which is not subjected to the floodlighting processing, if the specified pixel characteristics of the pixels of the target video frame which is not subjected to the floodlighting processing are larger, the specified pixel characteristics of the pixels of the target video frame which is not subjected to the floodlighting processing are not modified, and if the specified pixel characteristics of the pixels of the target video frame which is subjected to the floodlighting processing are larger, the specified pixel characteristics of the pixels of the target video frame which is not subjected to the floodlighting processing are modified into the specified pixel characteristics of the pixels of the target video frame which is subjected to the floodlighting processing. Finally, these larger specified pixel features are fused onto one frame of the video frame.
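The max-superposition fusion of step S400 described above reduces to a per-pixel maximum; a one-line NumPy sketch:

```python
import numpy as np

def fuse_max(original, flooded):
    """Fuse the frame that has not undergone floodlight processing with its
    flood-processed version by keeping, at every pixel position, the larger
    of the two specified pixel features."""
    return np.maximum(original, flooded)
```

This preserves the bright flood highlights while keeping any detail in the original frame that is brighter than the blurred flood layer.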
Fig. 3 to fig. 6 show four video frames (in chronological order) processed by the video processing method of the embodiment of the present invention during the transition from a first video to a second video. The floodlight areas of the two video frames in the first video (fig. 3 and fig. 4) increase frame by frame, with the important objects (persons, etc.) disappearing last; the floodlight areas of the two video frames in the second video (fig. 5 and fig. 6) decrease frame by frame, with the important objects (persons, etc.) appearing first. In this way, the objects of most interest to the user are retained on screen for the longest time.
Based on the same concept as the method described above, referring to fig. 7, an embodiment of the present invention further provides an electronic device 100, including: memory 101 and processor 102 (e.g., one or more processors);
in one embodiment, the memory is to store program code; the processor, configured to invoke the program code, and when executed, configured to:
determining a gradual change factor corresponding to each target video frame in the first video and/or a gradual change factor corresponding to each target video frame in the second video in the process of transferring from the first video to the second video;
and carrying out floodlight processing on the target video frame according to the specified pixel characteristics of the pixels of the target video frame and the gradual change factor corresponding to the target video frame.
Preferably, the fade factor of each target video frame in the first video is decreased; the gradual change factor of each target video frame in the second video is increased; alternatively,
the gradual change factor of each target video frame in the first video is decreased to the set gradual change factor and then is kept unchanged; and the gradual change factors of the first N target video frames in the second video maintain the set gradual change factors, and the gradual change factors of the rest target video frames are increased progressively.
Preferably, N is equal to the number of target video frames in the first video whose fade factors are the set fade factors.
Preferably, the gradual change factor of each target video frame in the first video decreases in a linear decreasing manner;
the gradual change factor of each target video frame in the second video is increased in a linear increasing mode when being increased.
Preferably, the processor, when performing the floodlight processing on the target video frame according to the specified pixel characteristics of the pixels of the target video frame and the gradient factor corresponding to the target video frame, is specifically configured to:
and for each target video frame, if the specified pixel characteristics of the pixels of the target video frame are higher than the gradient factors corresponding to the target video frame, stretching the specified pixel characteristics of the pixels of the target video frame.
Preferably, when the processor performs stretching processing on the specified pixel feature of the pixel of the target video frame, the processor is specifically configured to:
determining a target stretching value by using a set exponential function and the specified pixel characteristics of the pixels;
stretching the specified pixel characteristic of the pixel to the target stretch value.
Preferably, when performing the floodlight processing on the target video frame according to the specified pixel characteristics of the pixels of the target video frame and the gradient factor corresponding to the target video frame, the processor is further specifically configured to:
and carrying out fuzzy processing on the stretched target video frame.
Preferably, when the processor performs the blurring processing on the target video frame after the stretching processing, the processor is specifically configured to:
and performing Gaussian blur processing on the target video frame after stretching processing in the horizontal and/or vertical direction.
Preferably, before the blurring process is performed on the target video frame after the stretching process, the processor is further configured to:
down-sampling the stretched target video frame;
the processor, after blurring the target video frame that has been stretched, is further configured to:
and upsampling the target video frame after the blurring processing so that the target video frame after the upsampling has the same size as the target video frame before the downsampling.
Preferably, before the upsampling the blurred target video frame, the processor is further configured to:
and aiming at the specified pixel feature of each pixel in the target video frame, determining a target stretching value of the specified pixel feature by using the specified pixel feature and a set exponential function, and stretching the specified pixel feature of the pixel to the target stretching value so as to stretch the target video frame as a whole.
Preferably, when the processor performs stretching processing on the specified pixel feature of the pixel of the target video frame, the processor is specifically configured to:
and if the gradient factor of the target video frame is the set gradient factor, modifying the specified pixel characteristic of the pixel of the target video frame to be the set maximum value so that the target video frame presents a full white picture.
Preferably, the processor is further configured to:
and for each target video frame, if the specified pixel characteristics of the pixels of the target video frame are lower than the gradual change factor corresponding to the target video frame, modifying the specified pixel characteristics of the pixels of the target video frame to the set minimum value.
Preferably, the processor, after performing the floodlight processing on the target video frame according to the specified pixel characteristics of the pixels of the target video frame and the gradient factor corresponding to the target video frame, is further configured to:
and fusing the target video frame which is not subjected to floodlight processing with the video frame obtained after the target video frame is subjected to floodlight processing.
Preferably, when the processor fuses the target video frame which is not subjected to the floodlighting processing and the video frame obtained after the target video frame is subjected to the floodlighting processing, the processor is specifically configured to:
and superposing the target video frame which is not subjected to floodlight processing and the pixels at the same position in the video frame obtained after the target video frame is subjected to floodlight processing, and acquiring the larger one of the specified pixel characteristics of each pixel during superposition.
Preferably, the specified pixel characteristics include at least one of: an R channel feature, a G channel feature, and a B channel feature.
Based on the same inventive concept as the method, an embodiment of the present invention further provides a computer-readable storage medium, where computer instructions are stored, and when the computer instructions are executed, the video processing method in the foregoing embodiment is implemented.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by an article of manufacture with certain functionality. A typical implementation device is a computer, which may take the form of a personal computer, laptop computer, cellular telephone, camera phone, smart phone, personal digital assistant, media player, navigation device, email messaging device, game console, tablet computer, wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functions of the units may be implemented in the same software and/or hardware or in a plurality of software and/or hardware when implementing the invention.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Furthermore, these computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only an example of the present invention, and is not intended to limit the present invention. Various modifications and alterations to this invention will become apparent to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the claims of the present invention.

Claims (29)

1. A video processing method, comprising:
determining a gradual change factor corresponding to each target video frame in the first video and/or a gradual change factor corresponding to each target video frame in the second video in the process of transferring from the first video to the second video;
floodlight processing is carried out on a target video frame according to the designated pixel characteristics of the pixels of the target video frame and the gradient factor corresponding to the target video frame;
the floodlight processing is carried out on the target video frame according to the specified pixel characteristics of the pixels of the target video frame and the gradual change factor corresponding to the target video frame, and the floodlight processing comprises the following steps:
and for each target video frame, if the specified pixel characteristics of the pixels of the target video frame are higher than the gradient factors corresponding to the target video frame, stretching the specified pixel characteristics of the pixels of the target video frame.
2. The video processing method of claim 1, wherein a fade factor for each target video frame in the first video is decremented; the gradual change factor of each target video frame in the second video is increased; alternatively,
the gradual change factor of each target video frame in the first video is decreased to the set gradual change factor and then is kept unchanged; and the gradual change factors of the first N target video frames in the second video maintain the set gradual change factors, and the gradual change factors of the rest target video frames are increased progressively.
3. The video processing method according to claim 2, wherein N is equal to the number of target video frames in the first video for which the fade factor is the set fade factor.
4. The video processing method according to claim 2, wherein the fade factor of each target video frame in the first video decreases in a linearly decreasing manner;
the gradual change factor of each target video frame in the second video is increased in a linear increasing mode when being increased.
5. The video processing method of claim 1, wherein said stretching specified pixel characteristics of said pixels of the target video frame comprises:
determining a target stretching value by using a set exponential function and the specified pixel characteristics of the pixels;
stretching the specified pixel characteristic of the pixel to the target stretch value.
6. The video processing method of claim 1, wherein the floodlighting process is performed on the target video frame according to the specified pixel characteristics of the pixels of the target video frame and the corresponding gradient factor of the target video frame, further comprising:
and carrying out fuzzy processing on the stretched target video frame.
7. The video processing method according to claim 6, wherein the blurring the target video frame after the stretching process comprises:
and performing Gaussian blur processing on the target video frame after stretching processing in the horizontal and/or vertical direction.
8. The video processing method according to claim 6 or 7, wherein before the blurring process is performed on the target video frame after the stretch process, further comprising:
down-sampling the stretched target video frame;
after the blurring process is performed on the target video frame which is subjected to the stretching process, the method further comprises the following steps:
and upsampling the target video frame after the blurring processing so that the target video frame after the upsampling has the same size as the target video frame before the downsampling.
9. The video processing method of claim 8, wherein prior to said upsampling the blurry processed target video frame, further comprising:
and aiming at the specified pixel feature of each pixel in the target video frame, determining a target stretching value of the specified pixel feature by using the specified pixel feature and a set exponential function, and stretching the specified pixel feature of the pixel to the target stretching value so as to stretch the target video frame as a whole.
10. The video processing method of claim 1, wherein said stretching specified pixel characteristics of said pixels of the target video frame comprises:
and if the gradient factor of the target video frame is the set gradient factor, modifying the specified pixel characteristic of the pixel of the target video frame to be the set maximum value so that the target video frame presents a full white picture.
11. The video processing method of claim 1, wherein the method further comprises:
and for each target video frame, if the specified pixel characteristics of the pixels of the target video frame are lower than the gradual change factor corresponding to the target video frame, modifying the specified pixel characteristics of the pixels of the target video frame to the set minimum value.
12. The video processing method of claim 1, wherein after performing the floodlight processing on the target video frame according to the specified pixel characteristics of the pixels of the target video frame and the corresponding gradient factor of the target video frame, further comprising:
and fusing the target video frame which is not subjected to floodlight processing with the video frame obtained after the target video frame is subjected to floodlight processing.
13. The video processing method of claim 12, wherein the fusing the target video frame that is not subjected to the floodlighting processing with the video frame obtained after the target video frame is subjected to the floodlighting processing comprises:
and superposing the target video frame which is not subjected to floodlight processing and the pixels at the same position in the video frame obtained after the target video frame is subjected to floodlight processing, and acquiring the larger one of the specified pixel characteristics of each pixel during superposition.
14. The video processing method of claim 1, wherein the specified pixel characteristics comprise at least one of: an R channel feature, a G channel feature, and a B channel feature.
15. An electronic device, comprising: a memory and a processor;
the memory for storing program code;
the processor, configured to invoke the program code, and when executed, configured to:
determining a gradual change factor corresponding to each target video frame in the first video and/or a gradual change factor corresponding to each target video frame in the second video in the process of transferring from the first video to the second video;
floodlight processing is carried out on a target video frame according to the designated pixel characteristics of the pixels of the target video frame and the gradient factor corresponding to the target video frame;
the processor is specifically configured to, when performing floodlight processing on a target video frame, according to specified pixel characteristics of pixels of the target video frame and a gradient factor corresponding to the target video frame:
and for each target video frame, if the specified pixel characteristics of the pixels of the target video frame are higher than the gradient factors corresponding to the target video frame, stretching the specified pixel characteristics of the pixels of the target video frame.
16. The electronic device of claim 15, wherein a fade factor for each target video frame in the first video is decremented; the gradual change factor of each target video frame in the second video is increased; alternatively,
the gradual change factor of each target video frame in the first video is decreased to the set gradual change factor and then is kept unchanged; and the gradual change factors of the first N target video frames in the second video maintain the set gradual change factors, and the gradual change factors of the rest target video frames are increased progressively.
17. The electronic device of claim 16, wherein the N is equal to a number of target video frames in the first video for which a fade factor is the set fade factor.
18. The electronic device of claim 16, wherein the fade factor for each target video frame in the first video decreases in a linearly decreasing manner as it decreases;
the gradual change factor of each target video frame in the second video is increased in a linear increasing mode when being increased.
19. The electronic device of claim 15, wherein the processor, when performing stretching processing on the specified pixel characteristics of the pixels of the target video frame, is specifically configured to:
determining a target stretching value by using a set exponential function and the specified pixel characteristics of the pixels;
stretching the specified pixel characteristic of the pixel to the target stretch value.
20. The electronic device of claim 15, wherein the processor, when performing the floodlighting process on the target video frame according to the specified pixel characteristics of the pixels of the target video frame and the corresponding gradient factor of the target video frame, is specifically configured to:
and carrying out fuzzy processing on the stretched target video frame.
21. The electronic device of claim 20, wherein the processor, when performing the blurring processing on the target video frame after the stretching processing, is specifically configured to:
and performing Gaussian blur processing on the target video frame after stretching processing in the horizontal and/or vertical direction.
22. The electronic device of claim 20 or 21, wherein the processor, prior to blurring the target video frame after stretch processing, is further configured to:
down-sampling the stretched target video frame;
the processor, after blurring the target video frame that has been stretched, is further configured to:
and upsampling the target video frame after the blurring processing so that the target video frame after the upsampling has the same size as the target video frame before the downsampling.
23. The electronic device of claim 22, wherein the processor, prior to the upsampling the blurry processed target video frame, is further configured to:
and aiming at the specified pixel feature of each pixel in the target video frame, determining a target stretching value of the specified pixel feature by using the specified pixel feature and a set exponential function, and stretching the specified pixel feature of the pixel to the target stretching value so as to stretch the target video frame as a whole.
24. The electronic device of claim 15, wherein the processor, when performing stretching processing on the specified pixel characteristics of the pixels of the target video frame, is specifically configured to:
and if the gradient factor of the target video frame is the set gradient factor, modifying the specified pixel characteristic of the pixel of the target video frame to be the set maximum value so that the target video frame presents a full white picture.
25. The electronic device of claim 15, wherein the processor is further to:
and for each target video frame, if the specified pixel characteristics of the pixels of the target video frame are lower than the gradual change factor corresponding to the target video frame, modifying the specified pixel characteristics of the pixels of the target video frame to the set minimum value.
26. The electronic device of claim 15, wherein the processor, after performing the floodlight processing on the target video frame according to the specified pixel characteristics of the pixels of the target video frame and the gradient factor corresponding to the target video frame, is further configured to:
fusing the target video frame that has not undergone floodlight processing with the video frame obtained after the floodlight processing of the target video frame.
27. The electronic device of claim 26, wherein the processor, when fusing the target video frame that has not undergone floodlight processing with the video frame obtained after the floodlight processing of the target video frame, is specifically configured to:
superimposing pixels at the same positions in the target video frame that has not undergone floodlight processing and in the video frame obtained after the floodlight processing, and taking the larger of the two specified pixel characteristics for each pixel during the superposition.
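The fusion in claim 27 amounts to a per-pixel, per-characteristic maximum of the unprocessed frame and the floodlight-processed frame, a "lighten" blend that lets the halo show only where it is brighter than the original image. As a sketch (the function name is an assumption):

```python
import numpy as np

def fuse_max(original: np.ndarray, floodlit: np.ndarray) -> np.ndarray:
    """Superimpose two equal-sized frames, keeping the larger specified
    pixel characteristic at each position (per-channel maximum)."""
    return np.maximum(original, floodlit)
```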
28. The electronic device of claim 15, wherein the specified pixel characteristics comprise at least one of: an R channel characteristic, a G channel characteristic, and a B channel characteristic.
29. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon computer instructions which, when executed, implement the video processing method of any one of claims 1-14.
CN201880039251.7A 2018-07-24 2018-07-24 Video processing method, device and computer readable storage medium Active CN110786002B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/096847 WO2020019164A1 (en) 2018-07-24 2018-07-24 Video processing method and device, and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN110786002A CN110786002A (en) 2020-02-11
CN110786002B true CN110786002B (en) 2022-02-25

Family

ID=69181054

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880039251.7A Active CN110786002B (en) 2018-07-24 2018-07-24 Video processing method, device and computer readable storage medium

Country Status (2)

Country Link
CN (1) CN110786002B (en)
WO (1) WO2020019164A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112312201B (en) * 2020-04-09 2023-04-07 北京沃东天骏信息技术有限公司 Method, system, device and storage medium for video transition
CN113542847B (en) * 2020-04-21 2023-05-02 抖音视界有限公司 Image display method, device, equipment and storage medium
CN112749613B (en) * 2020-08-27 2024-03-26 腾讯科技(深圳)有限公司 Video data processing method, device, computer equipment and storage medium
EP4248396A4 (en) * 2021-05-31 2024-04-24 Huawei Tech Co Ltd Apparatus and method for generating a bloom effect
CN114998479B (en) * 2022-08-08 2022-10-18 江阴市晶磁电子有限公司 Gradual change image scene machine control processing system and method

Citations (1)

Publication number Priority date Publication date Assignee Title
JP2009027550A (en) * 2007-07-20 2009-02-05 Funai Electric Co Ltd Video signal processing apparatus

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
US5812216A (en) * 1994-04-19 1998-09-22 Avid Technology, Inc. Method and apparatus for creating motion picture transitions according to non-linear light response
US20020126224A1 (en) * 2000-12-28 2002-09-12 Rainer Lienhart System for detection of transition and special effects in video
CN1547382A (en) * 2003-12-02 2004-11-17 深圳创维-Rgb电子有限公司 A method of television set channel switching
CN103024559B (en) * 2012-11-09 2016-04-13 小米科技有限责任公司 A kind of method of Switch Video, device and equipment
CN103971713B (en) * 2014-05-07 2018-02-16 厦门美图之家科技有限公司 A kind of filter processing method of video file
CN103984553B (en) * 2014-05-26 2017-10-24 中科创达软件股份有限公司 A kind of 3D desktop display methods and system
CN104144301A (en) * 2014-07-30 2014-11-12 厦门美图之家科技有限公司 Method for transition special effects on basis of mixed modes
CN106210531B (en) * 2016-07-29 2019-05-03 Oppo广东移动通信有限公司 Video generation method, device and mobile terminal
CN107124624B (en) * 2017-04-21 2022-09-23 腾讯科技(深圳)有限公司 Method and device for generating video data
CN108024071B (en) * 2017-11-24 2022-03-08 腾讯数码(天津)有限公司 Video content generation method, video content generation device, and storage medium
CN108038896B (en) * 2017-12-06 2021-06-11 北京像素软件科技股份有限公司 Halo gradual change realization method and device


Also Published As

Publication number Publication date
CN110786002A (en) 2020-02-11
WO2020019164A1 (en) 2020-01-30

Similar Documents

Publication Publication Date Title
CN110786002B (en) Video processing method, device and computer readable storage medium
US10410327B2 (en) Shallow depth of field rendering
CN104469179A (en) Method for combining dynamic pictures into mobile phone video
JP2008501170A (en) Method and system for viewing and enhancing images
CN109413335B (en) Method and device for synthesizing HDR image by double exposure
CN108022223B (en) Tone mapping method based on logarithm mapping function blocking processing fusion
CN112184877B (en) Method and system for rendering optimization of glow effect
Yee et al. Segmentation and adaptive assimilation for detail-preserving display of high-dynamic range images
US20140078170A1 (en) Image processing apparatus and method, and program
CN114581355A (en) Method, terminal and electronic device for reconstructing HDR image
CN106603885B (en) Method of video image processing and device
JP4879363B1 (en) Image processing system
Kim et al. ITM-CNN: Learning the inverse tone mapping from low dynamic range video to high dynamic range displays using convolutional neural networks
KR20230074136A (en) Salience-based capture or image processing
Zhang et al. Multi-scale-based joint super-resolution and inverse tone-mapping with data synthesis for UHD HDR video
CN111161685B (en) Virtual reality display equipment and control method thereof
CN112991163B (en) Panoramic image acquisition method, device and equipment
CN109859303B (en) Image rendering method and device, terminal equipment and readable storage medium
KR20030066511A (en) Apparatus and method for real-time brightness control of moving images
CN115293994B (en) Image processing method, image processing device, computer equipment and storage medium
CN113538211A (en) Image quality enhancement device and related method
CN110706162A (en) Image processing method and device and computer storage medium
CN113256785B (en) Image processing method, apparatus, device and medium
CN111696034B (en) Image processing method and device and electronic equipment
US11328494B2 (en) Image processing apparatus, image processing method, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant