CN113259745B - Video playing page processing method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN113259745B
CN113259745B (grant publication of application CN202110524435.6A)
Authority
CN
China
Legal status: Active
Application number
CN202110524435.6A
Other languages
Chinese (zh)
Other versions
CN113259745A
Inventor
逄增耀
李琳科
杜英豪
王学兵
肖锋
胡滨
冯飞
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority: CN202110524435.6A
Published as CN113259745A (application publication); granted and published as CN113259745B

Classifications

    • H04N 21/4312: Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/47205: End-user interface for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • H04N 21/4782: Web browsing, e.g. WebTV

Abstract

The disclosure provides a method and apparatus for processing a video playing page, an electronic device, and a storage medium, and relates to the field of image processing. The specific implementation scheme is as follows: determining a plurality of first colors respectively corresponding to a plurality of frame images in a target video; determining, based on the plurality of first colors, a first color sequence and the number of frames corresponding to each color in the first color sequence; smoothing the first color sequence according to the number of frames corresponding to each color to obtain a second color sequence; and, in the process of playing the target video in the playing page, rendering a target area of the playing page based on the second color sequence. The technique of the embodiments of the disclosure can improve the sense of immersion of the playing page of the target video.

Description

Video playing page processing method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technology, and more particularly, to the field of image processing.
Background
With the popularization of mobile terminals and the increase of network speeds, short videos have become one of the main information carriers in people's daily lives. Short videos are typically played in pages of an application. In a short-video playback scenario, there may be areas of the page not covered by the video; for example, there may be an options area above the page for displaying clickable button options. In general, the background of these uncovered regions is filled with black, white, the theme color of the video cover, or the like, or filled with the result of Gaussian-blurring each frame image of the video.
Disclosure of Invention
The disclosure provides a processing method and device for a video playing page, electronic equipment and a storage medium.
According to an aspect of the present disclosure, a method for processing a video playing page is provided, including:
determining a plurality of first colors respectively corresponding to a plurality of frame images in a target video;
determining a first color sequence and a frame number corresponding to each color in the first color sequence based on a plurality of first colors;
according to the frame number corresponding to each color in the first color sequence, smoothing the first color sequence to obtain a second color sequence;
and in the process of playing the target video in the playing page, rendering the target area in the playing page based on the second color sequence.
According to another aspect of the present disclosure, there is provided a processing apparatus for a video playback page, including:
the color determining module is used for determining a plurality of first colors respectively corresponding to a plurality of frames of images in the target video;
the first sequence module is used for determining a first color sequence and the frame number corresponding to each color in the first color sequence based on a plurality of first colors;
the second sequence module is used for smoothing the first color sequence according to the frame number corresponding to each color in the first color sequence to obtain a second color sequence;
and the rendering module is used for rendering the target area in the playing page based on the second color sequence in the process of playing the target video in the playing page.
According to another aspect of the present disclosure, there is provided an electronic device including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method of any of the embodiments of the present disclosure.
According to another aspect of the present disclosure, there is provided a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform a method in any embodiment of the present disclosure.
According to another aspect of the present disclosure, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the method in any of the embodiments of the present disclosure.
In the technical scheme of the disclosure, during playback of the target video, the second color sequence used for rendering the target area is obtained from the plurality of first colors corresponding to the multi-frame images of the target video, which improves the immersion and integration of the playing page. Moreover, because the first color sequence is first obtained from the plurality of first colors, and the second color sequence used for rendering is then obtained from the number of frames corresponding to each color, the display of the target area is kept from becoming too disordered, and color flicker in the target area is reduced.
It should be understood that the statements in this section are not intended to identify key or critical features of the embodiments of the present disclosure, nor are they intended to limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 is a schematic diagram of a method of processing a video playback page according to one embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a method of processing a video playback page according to another embodiment of the present disclosure;
FIG. 3 is a diagram illustrating an example of determining a first color based on linear color difference according to the present disclosure;
FIG. 4 is a diagram illustrating a first color determination based on a color difference and a brightness difference in an exemplary application of the present disclosure;
FIG. 5 is a schematic diagram of a processing device for video playback pages according to one embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a processing method of a video playback page according to one embodiment of the present disclosure;
fig. 7 is a block diagram of an electronic device for implementing a method for processing a video playback page according to an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1 shows a schematic diagram of a processing method of a video playing page according to an embodiment of the present disclosure. As shown in fig. 1, the method includes:
s11, determining a plurality of first colors respectively corresponding to a plurality of frame images in a target video;
step S12, determining a first color sequence and the frame number corresponding to each color in the first color sequence based on a plurality of first colors;
step S13, smoothing the first color sequence according to the frame number corresponding to each color in the first color sequence to obtain a second color sequence;
and S14, in the process of playing the target video in the playing page, rendering the target area in the playing page based on the second color sequence.
In the embodiment of the present disclosure, the target video is a video to be played. By way of example and not limitation, the target video may include a video to be played in various apps, such as a short video App (Application), a community App, and an instant messaging App.
Illustratively, the multi-frame images in the target video may include every frame image of the target video, or only part of them. For example, FFmpeg (Fast Forward MPEG) may be used to extract frames from the target video; every extracted frame may be used as the above multi-frame images, or the multi-frame images may be taken from the odd-numbered or even-numbered frames of the target video.
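The frame-extraction step above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the function names, the output filename pattern, and the choice of the `-vsync 0` passthrough flag are assumptions, and FFmpeg must be installed for the command itself to run.

```python
from typing import List

def ffmpeg_extract_cmd(video_path: str, out_pattern: str = "frame_%05d.png") -> List[str]:
    """Build an FFmpeg command line that dumps every frame of the video as an image."""
    return ["ffmpeg", "-i", video_path, "-vsync", "0", out_pattern]

def select_frames(frames: List[str], mode: str = "all") -> List[str]:
    """Use all frames, or only the odd- or even-numbered frames (1-based),
    as the 'multi-frame images' described above."""
    if mode == "odd":
        return frames[::2]   # 1st, 3rd, 5th, ...
    if mode == "even":
        return frames[1::2]  # 2nd, 4th, 6th, ...
    return frames
```

The command would typically be run via `subprocess.run(ffmpeg_extract_cmd("video.mp4"))`; selecting only odd or even frames halves the per-frame color computations at little visual cost.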
For example, a theme color may be determined for each frame of the multiple frames of images, and the theme color may be used as the corresponding first color of the image. The theme color may be extracted from the image, or may be obtained by performing statistics based on the color of each pixel in the image.
In practical applications, RGB (Red-Green-Blue) color values can be used to represent colors; RGB color values may be written as hexadecimal numbers. Taking RGB color values as the representation of the first colors, the N first colors corresponding to N frame images can be as shown in Table 1:

    Image sequence number    First color (RGB color value)
    1                        A6B1FF
    2                        A6B1FF
    3                        33D122
    …                        …
    N                        E1A355

    TABLE 1
For example, adjacent and same colors in the plurality of first colors may be combined to obtain the first color sequence and the number of frames corresponding to each color in the first color sequence. Here, the number of frames is the number of corresponding images.
For example, if the 4 first colors corresponding to 4 frame images are red, light red, light red, and black, the first color sequence is {red, light red, black}, where the number of frames corresponding to red is 1, to light red is 2, and to black is 1.
For another example, corresponding to Table 1 above, the first color sequence may be as shown in Table 2:

    Color in the first color sequence (RGB color value)    Number of frames
    A6B1FF                                                 2
    33D122                                                 4
    E1A355                                                 10

    TABLE 2
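The merging of adjacent, identical first colors into sequence entries with frame counts is a run-length encoding, and can be sketched as follows (the function name is hypothetical):

```python
from typing import List, Tuple

def build_first_color_sequence(first_colors: List[str]) -> List[Tuple[str, int]]:
    """Merge adjacent identical colors into (color, frame_count) runs,
    giving the first color sequence and the number of frames per color."""
    runs: List[List] = []
    for color in first_colors:
        if runs and runs[-1][0] == color:
            runs[-1][1] += 1          # same color as previous frame: extend the run
        else:
            runs.append([color, 1])   # new color: start a new run
    return [(c, n) for c, n in runs]
```

For the colors of Table 1, the first two A6B1FF frames collapse into a single entry with a frame count of 2, exactly as in Table 2.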
For example, in step S13, a color in the first color sequence whose number of frames is less than a threshold may be combined with the color before or after it in the sequence, thereby smoothing the first color sequence and reducing color flicker. The combination may neutralize the two colors, or replace the short-lived color with the color having more frames.
For example, suppose the first color sequence includes three consecutive colors, light red, deep red, and orange, whose corresponding frame numbers are 10, 2, and 13 respectively. Because the number of frames corresponding to deep red is small, deep red can be combined with the preceding light red, and the red obtained by neutralizing the two is taken as the combined color. This yields a second color sequence containing two consecutive colors, red and orange, whose frame numbers are 12 and 13 respectively.
For example, when the target video is played in the playing page, the rendering duration of each color may be determined according to its number of frames, and the target area of the playing page is then rendered with each color in turn according to those durations. The target area may be an area of the playing page not covered by the video, for example a button options area above the playing page or a share-and-like area below it.
For example, assuming that the second color sequence is shown in table 2, rendering durations corresponding to the first two colors A6B1FF and 33D122 are the playing duration of 2 frames of images and the playing duration of 4 frames of images, respectively, the color A6B1FF is used to render the target region while the 1 st to 2 nd frames of images are played, and the color 33D122 is used to render the target region while the 3 rd to 6 th frames of images are played.
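Mapping frame counts to rendering durations can be sketched as follows; the frame rate, the schedule structure, and the function name are illustrative assumptions.

```python
from typing import List, Tuple, Dict

def render_schedule(second_seq: List[Tuple[str, int]], fps: float = 30.0) -> List[Dict]:
    """Convert each (color, frame_count) entry of the second color sequence
    into a start time and duration (in seconds) for rendering the target area."""
    schedule, t = [], 0.0
    for color, n in second_seq:
        duration = n / fps            # rendering duration = playing time of n frames
        schedule.append({"color": color, "start": t, "duration": duration})
        t += duration
    return schedule
```

For the Table 2 sequence, the first entry covers the playing time of frames 1 to 2 and the second covers frames 3 to 6, mirroring the example above.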
According to the method of the embodiment of the disclosure, in the process of playing the target video, the second color sequence used for rendering the target area is obtained from the plurality of first colors corresponding to the multi-frame images of the target video. Compared with filling the target area with an unchanging pure color such as black, white, or the theme color of the video cover, this improves the sense of immersion and integration of the playing page. Moreover, because the first color sequence is obtained from the plurality of first colors, and the second color sequence used for rendering is then obtained from the number of frames corresponding to each color, the display of the target area is kept from becoming too disordered, and color flicker in the target area is reduced. Compared with filling the target area with the Gaussian-blur result of each frame image of the video, this also improves the consistency of the playing page and avoids degrading the display of text in the target area.
As an exemplary embodiment, referring to fig. 2, the step S11 of determining a plurality of first colors respectively corresponding to a plurality of frame images in the target video includes:
Step S21: obtaining a second color of the i-th frame image based on the colors of a plurality of target pixels in the i-th frame image of the multi-frame images, where i is an integer greater than or equal to 2;
Step S22: determining the first color of the i-th frame image according to color difference information between the second color of the i-th frame image and the first color of the (i-1)-th frame image in the multi-frame images.
For example, the processing manner for the ith frame image may be applied to each frame image from the 2 nd frame image in the plurality of frame images. For the 1 st frame image in the multi-frame image, the second color of the 1 st frame image can be obtained based on the colors of the target pixels in the 1 st frame image, and the second color is the first color of the 1 st frame image.
For example, the first color of the 1st frame image may be determined first, and a second color may be determined for each frame image from the 2nd frame onward. Each frame image is then traversed starting from the 2nd frame. On reaching the i-th frame image, its second color is compared with the first color of the (i-1)-th frame image to obtain color difference information, and the first color of the i-th frame image is determined from that information. For example, when the color difference information is greater than a threshold, the second color of the i-th frame image is retained as its first color; when the color difference information is less than or equal to the threshold, the first color of the (i-1)-th frame image is used as the first color of the i-th frame image, merging the colors of the two frames. Proceeding by analogy yields the first color of every frame image.
The color of each frame of image is smoothed according to the color difference of each frame of image, so that the color flicker in the target area can be further reduced, the consistency effect of the played page is improved, and the watching experience of a user is optimized.
In some exemplary embodiments, the multi-frame image may be pre-processed, and then the smoothing process based on the color difference in the steps S21 and S22 is performed. Exemplarily, before determining a plurality of first colors respectively corresponding to a plurality of frame images in the target video in step S11, the method may further include:
determining a reference area adjacent to the target area in the ith frame of image based on a preset size;
a plurality of pixels in the reference region are determined as a plurality of target pixels in the ith frame image.
For example, the processing manner for the ith frame image may be applied to each frame image in a plurality of frame images. The preset size may be a preset pixel size or a size ratio in each frame image. For example, the preset size may be 50 pixels, or 1/6 of the height of the image per frame.
For example, if the target area is an option area above a video in the playing page, and the preset size is 1/6 of the height of each frame of image, the upper 1/6 of each frame of image may be used as a reference area, and each pixel in the reference area is a target pixel for determining the second color and the first color corresponding to the image.
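The reference-region selection can be sketched as below, with the image represented as rows of pixels. The rounding choice and the function name are assumptions; the 1/6 default follows the example above.

```python
from typing import List, Tuple

Pixel = Tuple[int, int, int]

def reference_region(image_rows: List[List[Pixel]], ratio: float = 1 / 6) -> List[Pixel]:
    """Flatten the top `ratio` of the image's pixel rows into the list of
    target pixels, i.e. the region adjacent to an options area above the video."""
    h = max(1, round(len(image_rows) * ratio))  # keep at least one row
    return [px for row in image_rows[:h] for px in row]
```

A fixed preset size in pixels, such as the 50 pixels mentioned above, would simply replace `round(len(image_rows) * ratio)` with that constant.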
By preprocessing the images, determining a reference region adjacent to the target area, and determining each image's color from the target pixels in that region, the color of the target area can be made closer to that of its adjacent region, improving the immersion and integration of the page.
For example, on the basis of the foregoing embodiments, the step S21 of obtaining the second color of the ith frame image based on the colors of the multiple target pixels in the ith frame image in the multiple frame images may include:
determining the number of target pixels respectively corresponding to a plurality of preset colors based on the colors of a plurality of target pixels in the ith frame of image in the multi-frame of images;
determining scores of the plurality of preset colors based on the number of target pixels respectively corresponding to the plurality of preset colors;
and determining the second color of the ith frame image from a plurality of preset colors based on the scores.
In the above manner, the second color corresponding to the image is determined by quantizing the colors of a plurality of target pixels in the image. In practice, color quantization exploits the perceptual inertia of the human eye by merging similar, less important colors in the original image into a single color, thereby obtaining the set of colors making up the image and the proportion of each. For example, ten preset colors may be set, and the color of each target pixel in the i-th frame image is classified into one of them; the target pixels corresponding to each of the ten preset colors are thus obtained, and their numbers can be counted. As another example, an octree algorithm may be used to determine the number of target pixels corresponding to each preset color based on the colors of the target pixels in the i-th frame image.
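The per-preset-color counting can be sketched with a simple nearest-neighbor assignment. This is a stand-in for the octree quantization mentioned above, chosen for brevity; the function name and the squared-distance metric are assumptions.

```python
from collections import Counter
from typing import List, Tuple

Pixel = Tuple[int, int, int]

def quantize_counts(pixels: List[Pixel], presets: List[Pixel]) -> Counter:
    """Assign each target pixel (r, g, b) to its nearest preset color and
    count how many target pixels fall on each preset color."""
    def nearest(px: Pixel) -> Pixel:
        # squared Euclidean distance in RGB space
        return min(presets, key=lambda p: sum((a - b) ** 2 for a, b in zip(px, p)))
    return Counter(nearest(px) for px in pixels)
```

The counts feed directly into the color-quantization scores of the preset colors described above.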
The scores of the preset colors are determined based on the number of target pixels corresponding to each of them in the i-th frame image, and the second color of the i-th frame image is selected from the preset colors based on those scores. The second color is therefore the dominant color among the target pixels of the i-th frame image. Obtaining the first and second color sequences from these dominant colors, and rendering the target area based on the second color sequence, improves the color consistency between the target area and the displayed video, further improving the immersion and integration of the page.
In practical application, in addition to the number of target pixels corresponding to each preset color, the score of each preset color can be determined by combining other factors, so as to improve the accuracy of color extraction.
For example, a color quantization score of each preset color may be determined based on the number of target pixels corresponding to each preset color, a final score of each preset color may be obtained by combining the region saliency score and/or the hue saliency score of each preset color, and a color with the highest score may be determined as the second color of the ith frame image.
Illustratively, the obtaining manner of the region saliency score may include: determining a salient region in the ith frame of image by adopting an LC (Luminance Contrast) algorithm; determining a saliency normalization score for a plurality of target pixels in the ith frame image based on the saliency region in the ith frame image; and obtaining the region saliency score of each preset color based on the saliency normalization score of the target pixel corresponding to each preset color in the plurality of preset colors. The LC algorithm simulates the visual characteristics of a human through an image algorithm, and the salient region can be accurately determined.
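The luminance-contrast idea can be sketched as below: a pixel's saliency is its total gray-level distance to all other pixels, computed efficiently from the global histogram. Normalizing the result to [0, 1] is an assumption about the "saliency normalization score"; the function name is hypothetical.

```python
from typing import List

def lc_saliency(gray: List[int]) -> List[float]:
    """Luminance-contrast saliency for a flat list of 8-bit gray values.
    Uses the histogram so the cost is O(256^2 + N) instead of O(N^2)."""
    hist = [0] * 256
    for v in gray:
        hist[v] += 1
    # dist[l] = sum over all gray levels k of hist[k] * |l - k|
    dist = [sum(hist[k] * abs(l - k) for k in range(256)) for l in range(256)]
    sal = [dist[v] for v in gray]
    m = max(sal) or 1                 # avoid division by zero on flat images
    return [s / m for s in sal]       # normalized saliency scores in [0, 1]
```

Pixels whose gray value is far from the bulk of the image receive high scores, approximating a salient region without any learned model.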
Illustratively, the hue saliency score may be obtained as follows: each preset color is classified into one of five levels, white, light, color, dark, and black, according to its hue, saturation, and brightness, and the hue saliency score of each preset color is obtained from the score corresponding to its level.
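A five-level classification might look like the sketch below. The patent does not disclose the level boundaries, so every threshold here is an illustrative assumption, as is the function name.

```python
def tone_level(h: float, s: float, b: float) -> str:
    """Classify a color (hue in degrees, saturation and brightness in percent)
    into one of the five levels: white, light, color, dark, black.
    All thresholds are assumed for illustration only."""
    if b < 15:
        return "black"              # very low brightness
    if s < 10 and b > 90:
        return "white"              # near-achromatic and very bright
    if b > 75 and s < 40:
        return "light"              # bright, weakly saturated
    if b < 40:
        return "dark"               # low brightness
    return "color"                  # saturated mid-brightness hues
```

Each level would then map to a fixed hue saliency score, with the "color" level typically scored highest.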
For example, after the second color of each image is obtained, it may be adjusted to meet the display requirements of the playing page, e.g. through reasonable selection of hue and control of saturation. For example, at least one of the hue, brightness, and saturation of the second color of each image is adjusted, and the first color of each image is obtained from the adjusted second color.
In the above step S22, the first color of the i-th frame image is determined according to the color difference information between the second color of the i-th frame image and the first color of the (i-1)-th frame image; the first color may be determined based on multiple kinds of color difference information. Illustratively, the color difference information includes at least one of a linear color difference, a hue difference, and a brightness difference.
The linear color difference may include the color difference calculated by the Delta-E 2000 algorithm after converting the colors to be compared (the second color of the i-th frame image and the first color of the (i-1)-th frame image) into linear color values; it characterizes the difference between the colors as a whole. The hue difference may include the difference between Hues calculated in the HSB (Hue-Saturation-Brightness) color space. The brightness difference may include the difference between Brightness values calculated in the HSB color space.
Color smoothing is carried out on the adjacent frame images by combining at least one of linear chromatic aberration, chromatic aberration and brightness difference, and the instability of inter-frame colors can be effectively overcome.
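The hue and brightness differences can be computed with Python's standard-library `colorsys` module, as sketched below. The full CIEDE2000 linear color difference is deliberately omitted here (implementations exist, e.g. `deltaE_ciede2000` in scikit-image); the function name and degree/percent scaling are assumptions.

```python
import colorsys
from typing import Tuple

def hsb_diffs(rgb1: Tuple[int, int, int], rgb2: Tuple[int, int, int]) -> Tuple[float, float]:
    """Hue difference (circular, in degrees) and brightness difference
    (in percent) between two RGB colors, computed in HSB/HSV space."""
    h1, _, v1 = colorsys.rgb_to_hsv(*(c / 255 for c in rgb1))
    h2, _, v2 = colorsys.rgb_to_hsv(*(c / 255 for c in rgb2))
    dh = abs(h1 - h2) * 360
    diff_h = min(dh, 360 - dh)       # hue wraps around the color wheel
    diff_b = abs(v1 - v2) * 100
    return diff_h, diff_b
```

Note the wrap-around: hues of 350 and 10 degrees are 20 degrees apart, not 340, which matters when thresholding Diff_H.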
Illustratively, one way to combine the linear color difference, hue difference, and brightness difference is as follows: when at least one of them is greater than or equal to a preset threshold, the second color of the i-th frame image is taken as the first color of the i-th frame image; when the linear color difference, hue difference, and brightness difference are all smaller than the preset threshold, the first color of the (i-1)-th frame image is taken as the first color of the i-th frame image.
Another exemplary way to combine the linear color difference, hue difference, and brightness difference is to perform step S22 iteratively: during each iteration, the first color is determined based on one or more kinds of color difference information; the first color of each image is then used as that image's new second color, and the next iteration is performed.
Specifically, the step S22 of determining the first color of the i-th frame image according to the color difference information between the second color of the i-th frame image and the first color of the i-1-th frame image may include:
traversing each frame image in the multi-frame images, and determining the first color of the ith frame image according to the linear color difference between the second color of the ith frame image and the first color of the (i-1) th frame image when the ith frame image is traversed; determining a first color of the ith frame image as a new second color of the ith frame image;
and traversing each frame image in the multi-frame images again, and, when the i-th frame image is reached, determining the first color of the i-th frame image according to the hue difference and the brightness difference between the second color of the i-th frame image and the first color of the (i-1)-th frame image.
In practical applications, the first color of the (i-1)-th frame image may be used as a reference color, and the color difference information between each frame image and the reference color is calculated. Fig. 3 shows a schematic diagram of an application example of determining the first color based on the linear color difference. To better show the initialization step, i.e. the processing of the 1st frame image, Fig. 3 depicts the processing of the m-th frame image, where m = i-1 and m is numbered from 1. In practice, m may also be numbered from 0, i.e. the multi-frame images include the 0th frame image, the 1st frame image, the 2nd frame image, and so on.
As shown in fig. 3, at initialization m = 1, and it is determined whether a reference color exists; if not, the first color of the 1st frame image (the same as its second color) is selected as the reference color. The linear color difference between the reference color and the second color of the 2nd frame image is then calculated; if it is greater than a preset threshold M, the second color of the 2nd frame image is retained as its first color, and that first color becomes the new reference color. Then m is incremented (m = m + 1), and it is determined whether m is less than the total number of frames; if so, the procedure returns to the step of checking whether a reference color exists. At this point m = 2 and a reference color exists, namely the first color of the 2nd frame image. The linear color difference between the reference color and the second color of the 3rd frame image is then calculated. If the linear color difference is less than or equal to the preset threshold M, the reference color, i.e. the first color of the 2nd frame image, is used as the first color of the 3rd frame image; if it is greater than M, the second color of the 3rd frame image is retained as its first color and becomes the new reference color. This continues until m equals the total number of frames, at which point the step of determining the first color based on the linear color difference ends.
It can be seen that, under the above logic, each frame image initially corresponds to a second color. When the linear color difference between the second color of the current frame image and the reference color is less than or equal to M, the second color of the current frame is replaced by the already-determined first color of the previous frame, which becomes the first color of the current frame. Likewise, when the linear color difference between the second color of the (m+1)-th frame image and the reference color is greater than M, the second color of the (m+1)-th frame image is retained as its first color and used as the new reference color. The linear color difference between the second color of the (m+2)-th frame image and the reference color is then calculated, the logic is repeated, and the color merging based on the linear color difference is finally completed.
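The Fig. 3 loop can be sketched generically: keep a reference color, and only let a frame keep its own color when it differs enough from the reference. The function name and the pluggable `diff_fn` are illustrative; with `diff_fn` set to the linear color difference this is the first pass, and with a hue-and-brightness predicate it becomes the Fig. 4 pass.

```python
from typing import Callable, List, TypeVar

Color = TypeVar("Color")

def merge_by_difference(second_colors: List[Color],
                        diff_fn: Callable[[Color, Color], float],
                        threshold: float) -> List[Color]:
    """One smoothing pass over per-frame colors: if a frame's second color
    differs from the reference color by more than `threshold`, keep it and
    make it the new reference; otherwise reuse the reference color."""
    first_colors: List[Color] = []
    ref = None
    for color in second_colors:
        if ref is None or diff_fn(ref, color) > threshold:
            ref = color               # large difference: keep own color, update reference
        first_colors.append(ref)      # small difference: reuse reference color
    return first_colors
```

With scalar stand-in colors and absolute difference as `diff_fn`, nearby values collapse onto the reference while jumps larger than the threshold start a new run.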
In the HSB color space, two adjacent colors may have nearly the same hue but a large difference in brightness, which also causes color flickering. For example, a pale green may have a hue H value of 189, a saturation S value of 26.6, and a brightness B value of 31, while a dark green has a hue H value of 186, a saturation S value of 26.9, and a brightness B value of 40.8; the brightness difference between the two produces visible flicker. Therefore, after the first color is determined based on the linear color difference, the first color corresponding to each frame image may be used as a new second color, and the first color corresponding to each frame image may then be re-determined based on the hue difference and the brightness difference.
Fig. 4 shows an application example of determining the first color based on the hue difference and the brightness difference. To better reflect the initialization process in the program, i.e., the processing of the 1st frame image, fig. 4 shows the processing of the n-th frame image, where n = i - 1 and n is numbered starting from 1. In practical applications, n may also be numbered from 0, i.e., the multiple frame images would include the 0th frame image, the 1st frame image, the 2nd frame image, and so on.
As shown in fig. 4, at initialization n = 1 and it is determined whether a reference color exists; if not, the first color of the 1st frame image (which is the same as its second color) is selected as the reference color. The hue difference (Diff_H) and the brightness difference (Diff_B) between the reference color and the second color of the 2nd frame image are then calculated; if Diff_H is greater than the hue threshold N and Diff_B is greater than the brightness threshold K, the second color of the 2nd frame image is retained as the first color of the 2nd frame image, and this first color becomes the new reference color. Then n is increased by one, i.e., n = n + 1, and it is determined whether n is less than the total number of frames; if so, the process returns to the step of determining whether a reference color exists. At this point n = 2 and a reference color exists, namely the first color of the 2nd frame image. The hue difference (Diff_H) and the brightness difference (Diff_B) between the reference color and the second color of the 3rd frame image are then calculated; if Diff_H is less than or equal to the hue threshold N and/or Diff_B is less than or equal to the brightness threshold K, the reference color, i.e., the first color of the 2nd frame image, is taken as the first color of the 3rd frame image; if Diff_H is greater than N and Diff_B is greater than K, the second color of the 3rd frame image is retained as its first color and becomes the new reference color. This continues until n equals the total number of frames, at which point the step of determining the first color based on the hue difference and the brightness difference ends.
It can be seen that, under the above logic, each frame image first corresponds to a second color. When the hue difference or the brightness difference between the second color of the current frame image and the reference color is less than or equal to the corresponding threshold, the second color of the current frame image is replaced with the already-determined first color of the previous frame image, which becomes the first color of the current frame image. This continues until, for some (n+1)-th frame image, both the hue difference and the brightness difference between its second color and the reference color are larger than the corresponding thresholds; the second color of the (n+1)-th frame image is then retained as its first color, and this first color becomes the new reference color. The hue difference and brightness difference between the second color of the (n+2)-th frame image and the reference color are calculated next, the same logic is repeated, and the merging of colors based on the hue and brightness differences is finally completed.
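The same loop structure applies to this fig. 4 variant; only the comparison changes. A hedged sketch of the keep/merge predicate follows, where the threshold values N and K are placeholders and hue wraparound is ignored for simplicity:

```python
def differs_in_hue_and_brightness(ref, color, hue_threshold, brightness_threshold):
    """ref and color are (H, S, B) triples. A new reference color is
    started only when BOTH the hue difference exceeds the hue threshold
    N AND the brightness difference exceeds the brightness threshold K;
    otherwise the current frame reuses the reference color.
    Note: abs() on hue ignores the circularity of the hue axis."""
    diff_h = abs(ref[0] - color[0])
    diff_b = abs(ref[2] - color[2])
    return diff_h > hue_threshold and diff_b > brightness_threshold
```

With the pale green and dark green values from the text, the small hue difference keeps the predicate false, so the two colors are merged and the brightness flicker is removed.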
Based on the processes shown in fig. 3 and fig. 4, the merging of the inter-frame similar colors can be basically completed, and finally the first color corresponding to each frame of image is obtained, which can be converted into the first color sequence. For example, when the first color sequence is recorded by using a list, a start frame corresponding to each color may be recorded as shown in the following table 3:
Table 3 (image not reproduced): the first color sequence recorded as a list, with each color mapped to the start frame at which that color begins.
The process of determining the first color of each frame image based on the multiple color difference information is described above by taking the example of sequentially executing the flows shown in fig. 3 and 4. It should be noted that, in practical applications, the implementation order of fig. 3 and fig. 4 is not limited, and the process of fig. 4 may be executed first, then the first color of each frame image is determined as the new second color, and then the process of fig. 3 is executed. The flows of fig. 3 and 4 may alternatively be performed, such as performing only the flow of fig. 3 or performing only the flow of fig. 4.
Exemplarily, after obtaining the first color sequence, the step S13 performs a smoothing process on the first color sequence according to the number of frames corresponding to each color in the first color sequence to obtain a second color sequence, including:
determining a color set to be combined in the first color sequence according to the frame number corresponding to each color in the first color sequence and a preset threshold;
determining the combined color corresponding to the color set to be combined according to the frame number corresponding to each color in the color set to be combined;
and combining the colors in the color set to be combined into the combined color to obtain a second color sequence.
For example, in a case that the number of frames corresponding to a color is less than a preset threshold, a color set to be merged may be determined based on the color and at least one preceding color or based on the color and at least one succeeding color. I.e. the color is combined with the preceding or succeeding color. And the color with the maximum number of frames in the color set is used as the combined color to obtain a second color sequence.
For example, suppose the first color sequence includes three consecutive colors, light red, dark red, and orange, with corresponding frame numbers of 10, 2, and 13, respectively. Because the number of frames corresponding to dark red is small, dark red can be combined with the preceding light red, and light red, which has the larger number of corresponding frames, is used as the combined color. This yields a second color sequence containing two consecutive colors, light red and orange, with corresponding frame numbers of 12 and 13, respectively.
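A minimal sketch of this merge step, assuming the first color sequence is stored as (color, frame_count) pairs and that a sub-threshold run is merged into the preceding run, with the run that has more frames supplying the merged color (per the rule stated above). This is an illustration, not the patented implementation:

```python
def smooth_once(sequence, min_frames):
    """sequence: list of (color, frame_count) pairs.
    Merge any run shorter than min_frames into the preceding run;
    the color with the larger frame count becomes the merged color."""
    result = []
    for color, count in sequence:
        if result and count < min_frames:
            prev_color, prev_count = result.pop()
            merged = prev_color if prev_count >= count else color
            result.append((merged, prev_count + count))
        else:
            result.append((color, count))
    return result
```

The text also allows merging with a succeeding color; this sketch handles only the preceding case, and a sub-threshold run at the very start of the sequence is left as-is.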
According to the embodiment, the color flicker of the target area can be further reduced, and the visual effect is optimized.
In practical applications, the color sets to be merged may also be determined in a variety of different ways. In one example, determining a color set to be merged in the first color sequence according to the number of frames corresponding to each color in the first color sequence and a preset threshold includes:
under the condition that the frame number corresponding to the jth color in the first color sequence is smaller than a first frame number threshold, adding the jth color and the jth-1 color in the first color sequence into the same color set to be combined; wherein j is an integer of 2 or more.
That is, if the number of frames corresponding to the j-th color is smaller than the first frame number threshold, for example 5, the j-th color and the (j-1)-th color are added to the same color set to be merged. Furthermore, if the (j-1)-th color and the (j-2)-th color are already in the same color set to be merged, the j-th color is added to that same set as well.
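One way to realize this chaining, sketched under the assumption that sets are built left to right and a sub-threshold color always joins the most recently opened set:

```python
def build_merge_sets(frame_counts, first_threshold):
    """Group consecutive color indices into merge sets. A color whose
    frame count is below the threshold joins its predecessor's set,
    so chains of short runs end up in one set transitively."""
    sets = []  # list of lists of color indices in the first color sequence
    for j, count in enumerate(frame_counts):
        if j > 0 and count < first_threshold:
            sets[-1].append(j)   # join the predecessor's (possibly chained) set
        else:
            sets.append([j])     # start a new singleton set
    return sets
```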
According to the embodiment, the color flicker of the target area can be further reduced, and the visual effect is optimized. And the color set to be combined is determined directly based on the frame number, so that the calculation can be simplified, and the calculation amount is reduced.
In another example, the determining, according to the number of frames corresponding to each color in the first color sequence and a preset threshold, a color set to be merged in the first color sequence includes:
under the condition that the frame number corresponding to the kth color in the first color sequence is smaller than a second frame number threshold, if the similarity between the initial frame image corresponding to the kth color and the previous frame image of the initial frame image is smaller than a similarity threshold, adding the kth color and the (k-1) th color in the first color sequence into the same color set to be combined; wherein k is an integer of 2 or more.
That is, if the number of frames corresponding to the k-th color is smaller than the second frame number threshold, for example 15, and the similarity between the starting frame image of the k-th color and the preceding frame image is smaller than the similarity threshold, the k-th color and the (k-1)-th color are added to the same color set to be merged. Furthermore, if the (k-1)-th color and the (k-2)-th color are already in the same color set to be merged, the k-th color is added to that same set as well.
According to the embodiment, the color flicker of the target area can be further reduced, and the visual effect is optimized. And determining the color set to be merged based on the combination of the frame number and the similarity, thereby increasing the smoothing effect.
Illustratively, the similarity between the image of the starting frame and the image of the previous frame can be calculated based on the color quantization results of the two images. Specifically, it can be calculated according to the following formula:
similarity(f_i, f_{i+1}) = count( diff(c_p^i, c_p^{i+1}) < Y ) / count_pixel

where f_{i+1} is the starting frame image, f_i is the frame image preceding it, and similarity(f_i, f_{i+1}) is the similarity between the two. As explained above, color quantization merges similar, less important colors in an image into one color, so the quantized color corresponding to each pixel can be determined, yielding a color set per frame. In the formula, Set_i denotes the color set of the previous frame image and c_p^i ∈ Set_i the quantized color of pixel p in that image; Set_{i+1} denotes the color set of the starting frame image and c_p^{i+1} ∈ Set_{i+1} the quantized color of pixel p in that image. diff(c_p^i, c_p^{i+1}) is the linear color difference between the two colors, count(·) is the number of pixels for which this difference is less than the color difference threshold Y, and count_pixel is the number of pixels in each frame image. That is, the similarity between the two images is the percentage of pixels whose linear color difference is smaller than Y out of the total number of pixels. When the similarity is smaller than the similarity threshold, for example 80%, the corresponding colors may be added to the same color set to be merged.
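The pixel-wise similarity above can be sketched as follows; RGB Euclidean distance stands in for the unspecified linear color difference, and the inputs are assumed to be per-pixel quantized colors listed in identical pixel order for both frames:

```python
def linear_diff(a, b):
    # RGB Euclidean distance, one possible "linear color difference"
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def frame_similarity(colors_prev, colors_next, y_threshold):
    """Fraction of pixel positions whose quantized colors in two
    consecutive frames differ by less than the threshold Y."""
    close = sum(1 for a, b in zip(colors_prev, colors_next)
                if linear_diff(a, b) < y_threshold)
    return close / len(colors_prev)
```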
Illustratively, the above-described approaches using the first frame number threshold and the second frame number threshold may be applied in combination, where the second frame number threshold is greater than the first frame number threshold. For example, a color may be added to the color set to be merged if it satisfies at least one of the following two conditions:

score < P, or
score < Q and similarity(f_i, f_{i+1}) < K

where score represents the number of frames corresponding to the color in the first color sequence, P is the first frame number threshold, Q is the second frame number threshold, similarity(f_i, f_{i+1}) is the similarity between the starting frame image corresponding to the color and its preceding frame image, and K is the similarity threshold.
For example, the operation of smoothing the color sequence in step S13 may be performed iteratively: after step S13 is executed once on the first color sequence, the resulting second color sequence is used as a new first color sequence and step S13 is executed again, until no color in the second color sequence has a frame number less than the preset threshold. A smooth second color sequence is finally obtained. Once the second color sequence has been determined, the target area may be rendered according to it during video playback.
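The iteration can be sketched by repeating a single smoothing pass until a fixed point; the `smooth_once` helper below follows the merge-into-predecessor rule described earlier and is an illustrative assumption, not the patented routine:

```python
def smooth_once(sequence, min_frames):
    # Merge each sub-threshold run into its predecessor; the run
    # with more frames supplies the merged color.
    result = []
    for color, count in sequence:
        if result and count < min_frames:
            prev_color, prev_count = result.pop()
            merged = prev_color if prev_count >= count else color
            result.append((merged, prev_count + count))
        else:
            result.append((color, count))
    return result

def smooth_until_stable(sequence, min_frames):
    """Repeat the smoothing pass until no run is shorter than the
    threshold, or until the sequence stops changing."""
    while len(sequence) > 1 and any(c < min_frames for _, c in sequence):
        new_sequence = smooth_once(sequence, min_frames)
        if new_sequence == sequence:
            break
        sequence = new_sequence
    return sequence
```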
As an implementation of the foregoing methods, an embodiment of the present disclosure further provides a processing apparatus for a video playback page, and referring to fig. 5, the apparatus includes:
a color determination module 510, configured to determine a plurality of first colors respectively corresponding to a plurality of frames of images in a target video;
a first sequence module 520, configured to determine, based on a plurality of first colors, a first color sequence and a number of frames corresponding to each color in the first color sequence;
a second sequence module 530, configured to perform smoothing on the first color sequence according to the frame number corresponding to each color in the first color sequence to obtain a second color sequence;
and a rendering module 540, configured to render the target area in the play page based on the second color sequence in the process of playing the target video in the play page.
Illustratively, as shown in fig. 6, the color determination module 510 includes:
a second color unit 511, configured to obtain a second color of an ith frame image based on colors of a plurality of target pixels in the ith frame image in the multi-frame image; wherein i is an integer greater than or equal to 2;
the first color unit 512 is configured to determine a first color of the i-th frame image according to color difference information between the second color of the i-th frame image and the first color of the i-1 th frame image in the multi-frame image.
Illustratively, as shown in fig. 6, the processing device for a video playing page further includes:
a region determining module 610, configured to determine, based on a preset size, a reference region adjacent to the target region in the ith frame of image;
a pixel determining module 620, configured to determine a plurality of pixels in the reference region as a plurality of target pixels in the ith frame image.
Illustratively, the second color unit 511 is for:
determining the number of target pixels respectively corresponding to a plurality of preset colors based on the colors of the target pixels in the ith frame of image in the multi-frame image;
determining scores of a plurality of preset colors based on the number of target pixels respectively corresponding to the plurality of preset colors;
and determining the second color of the ith frame image from a plurality of preset colors based on the scores.
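A toy sketch of the second-color unit's logic; the channel-snapping quantizer and the use of raw pixel counts as scores are placeholders, since the text does not specify the preset colors or the scoring rule:

```python
from collections import Counter

def quantize(color):
    # Toy quantizer: snap each RGB channel to 0 or 255 as a stand-in
    # for mapping a pixel to the nearest preset color.
    return tuple(0 if ch < 128 else 255 for ch in color)

def second_color(target_pixel_colors):
    """Count how many target pixels map to each preset color, score
    each preset color by that count, and take the highest-scoring
    preset color as the frame's second color."""
    counts = Counter(quantize(c) for c in target_pixel_colors)
    return counts.most_common(1)[0][0]
```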
Illustratively, the color difference information includes at least one of a linear color difference, a color difference, and a brightness difference.
Illustratively, the second sequence module 530 includes:
a set determining unit 531, configured to determine a color set to be combined in the first color sequence according to a frame number corresponding to each color in the first color sequence and a preset threshold;
a color selecting unit 532, configured to determine, according to a frame number corresponding to each color in the color set to be merged, a merged color corresponding to the color set to be merged;
the merging unit 533 is configured to merge the colors in the color set to be merged into a merged color, so as to obtain a second color sequence.
Exemplarily, the set determining unit 531 is configured to:
under the condition that the frame number corresponding to the jth color in the first color sequence is smaller than a first frame number threshold, adding the jth color and the jth-1 color in the first color sequence into the same color set to be combined; wherein j is an integer of 2 or more.
Exemplarily, the set determining unit 531 is configured to:
under the condition that the frame number corresponding to the kth color in the first color sequence is smaller than a second frame number threshold, if the similarity between the initial frame image corresponding to the kth color and the previous frame image of the initial frame image is smaller than a similarity threshold, adding the kth color and the (k-1) th color in the first color sequence into the same color set to be combined; wherein k is an integer of 2 or more.
The functions of each unit, module or sub-module in each device in the embodiments of the present disclosure may refer to the corresponding description in the above method embodiments, and are not described herein again.
The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.
FIG. 7 shows a schematic block diagram of an example electronic device 700 that may be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 7, the electronic device 700 includes a computing unit 701, which may perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 702 or a computer program loaded from a storage unit 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data necessary for the operation of the electronic device 700 can be stored. The computing unit 701, the ROM 702, and the RAM 703 are connected to each other by a bus 704. An input/output (I/O) interface 705 is also connected to bus 704.
A plurality of components in the electronic device 700 are connected to the I/O interface 705, including: an input unit 706 such as a keyboard, a mouse, or the like; an output unit 707 such as various types of displays, speakers, and the like; a storage unit 708 such as a magnetic disk, optical disk, or the like; and a communication unit 709 such as a network card, a modem, a wireless communication transceiver, etc. The communication unit 709 allows the electronic device 700 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
Computing unit 701 may be a variety of general purpose and/or special purpose processing components with processing and computing capabilities. Some examples of the computing unit 701 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The computing unit 701 executes the respective methods and processes described above, such as the processing method of the video playback page. For example, in some embodiments, the processing of video playback pages may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as storage unit 708. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 700 via the ROM 702 and/or the communication unit 709. When the computer program is loaded into the RAM 703 and executed by the computing unit 701, one or more steps of the method of processing a video playback page described above may be performed. Alternatively, in other embodiments, the computing unit 701 may be configured by any other suitable means (e.g., by means of firmware) to perform the processing method of the video playback page.
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on a chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/acts specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
It should be understood that various forms of the flows shown above, reordering, adding or deleting steps, may be used. For example, the steps described in the present disclosure may be executed in parallel or sequentially or in different orders, and are not limited herein as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the protection scope of the present disclosure.

Claims (16)

1. A processing method of a video playing page comprises the following steps:
determining a plurality of first colors respectively corresponding to a plurality of frame images in a target video;
determining a first color sequence and a frame number corresponding to each color in the first color sequence based on the plurality of first colors;
according to the frame number corresponding to each color in the first color sequence, smoothing the first color sequence to obtain a second color sequence;
in the process of playing the target video in a playing page, rendering a target area in the playing page based on the second color sequence;
the smoothing processing is performed on the first color sequence according to the frame number corresponding to each color in the first color sequence to obtain a second color sequence, and the smoothing processing includes:
determining a color set to be combined in the first color sequence according to the frame number corresponding to each color in the first color sequence and a preset threshold;
determining the combined color corresponding to the color set to be combined according to the frame number corresponding to each color in the color set to be combined;
and combining the colors in the color set to be combined into the combined color to obtain a second color sequence.
2. The method of claim 1, wherein the determining a plurality of first colors corresponding to respective ones of a plurality of frames of images in a target video comprises:
obtaining a second color of an ith frame image based on colors of a plurality of target pixels in the ith frame image in the multi-frame image; wherein i is an integer greater than or equal to 2;
and determining the first color of the ith frame image according to color difference information between the second color of the ith frame image and the first color of the (i-1) th frame image in the multi-frame image.
3. The method of claim 2, further comprising:
determining a reference area adjacent to the target area in the ith frame of image based on a preset size;
determining a plurality of pixels in the reference region as a plurality of target pixels in the ith frame image.
4. The method of claim 2, wherein said deriving a second color of an ith frame image of the multi-frame image based on colors of a plurality of target pixels in the ith frame image comprises:
determining the number of target pixels respectively corresponding to a plurality of preset colors based on the colors of the target pixels in the ith frame of image in the multi-frame of images;
determining scores of a plurality of preset colors based on the number of target pixels respectively corresponding to the preset colors;
and determining a second color of the ith frame image from the plurality of preset colors based on the scores.
5. The method of claim 2, wherein the color difference information comprises at least one of a linear color difference, a color difference, and a brightness difference.
6. The method according to claim 1, wherein the determining the color set to be combined in the first color sequence according to the frame number corresponding to each color in the first color sequence and a preset threshold comprises:
under the condition that the frame number corresponding to the jth color in the first color sequence is smaller than a first frame number threshold, adding the jth color and the jth-1 color in the first color sequence into the same color set to be combined; wherein j is an integer of 2 or more.
7. The method according to claim 1, wherein the determining a color set to be combined in the first color sequence according to the number of frames corresponding to each color in the first color sequence and a preset threshold comprises:
under the condition that the frame number corresponding to the kth color in the first color sequence is smaller than a second frame number threshold, if the similarity between the initial frame image corresponding to the kth color and the previous frame image of the initial frame image is smaller than a similarity threshold, adding the kth color and the (k-1) th color in the first color sequence into the same color set to be combined; wherein k is an integer of 2 or more.
8. A processing device for video playing pages, comprising:
the color determining module is used for determining a plurality of first colors respectively corresponding to a plurality of frames of images in the target video;
the first sequence module is used for determining a first color sequence and the number of frames corresponding to each color in the first color sequence based on the plurality of first colors;
the second sequence module is used for carrying out smoothing processing on the first color sequence according to the frame number corresponding to each color in the first color sequence to obtain a second color sequence;
the rendering module is used for rendering a target area in a playing page based on the second color sequence in the process of playing the target video in the playing page;
wherein the second sequence module comprises:
a set determining unit, configured to determine a color set to be combined in the first color sequence according to a frame number corresponding to each color in the first color sequence and a preset threshold;
the color selection unit is used for determining the combined color corresponding to the color set to be combined according to the frame number corresponding to each color in the color set to be combined;
and the merging unit is used for merging the colors in the color set to be merged into the merged color to obtain a second color sequence.
9. The apparatus of claim 8, wherein the color determination module comprises:
a second color unit, configured to obtain a second color of the i-th frame image based on colors of a plurality of target pixels in the i-th frame image among the multi-frame images, wherein i is an integer greater than or equal to 2;
and a first color unit, configured to determine the first color of the i-th frame image according to color difference information between the second color of the i-th frame image and the first color of the (i-1)-th frame image in the multi-frame images.
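Claim 9's two units amount to hysteresis on the page color: the i-th frame's candidate ("second") color only replaces the (i-1)-th frame's final ("first") color when the color difference is large enough. A hedged sketch, using a plain per-channel RGB distance as a stand-in for the claim's color difference information, with an assumed threshold:

```python
def color_difference(c1, c2):
    """Sum of per-channel RGB differences (a simple stand-in for the
    hue/chroma/brightness comparison the patent may use)."""
    return sum(abs(a - b) for a, b in zip(c1, c2))

def first_color(second_color_i, first_color_prev, threshold=30):
    """Keep the previous frame's color unless the candidate differs enough.
    The threshold value is an illustrative assumption."""
    if color_difference(second_color_i, first_color_prev) < threshold:
        return first_color_prev
    return second_color_i
```

This suppresses small frame-to-frame color jitter while still following genuine color changes in the video.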
10. The apparatus of claim 9, further comprising:
a region determining module, configured to determine a reference region adjacent to the target area in the i-th frame image based on a preset size;
and a pixel determining module, configured to determine a plurality of pixels in the reference region as the plurality of target pixels in the i-th frame image.
11. The apparatus of claim 9, wherein the second color unit is to:
determining numbers of target pixels respectively corresponding to a plurality of preset colors based on colors of the plurality of target pixels in the i-th frame image among the multi-frame images;
determining scores of the plurality of preset colors based on the numbers of target pixels respectively corresponding to the preset colors;
and determining the second color of the i-th frame image from the plurality of preset colors based on the scores.
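The scoring step of claim 11 can be sketched as a palette histogram: assign each target pixel to its nearest preset color, count the pixels per color, and take the best score. The preset palette, the nearest-color assignment, and the use of raw counts as scores are assumptions for illustration; the claim does not fix a concrete scoring rule.

```python
# Illustrative preset palette; the patent does not enumerate the preset colors.
PRESET_COLORS = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (0, 0, 0)]

def nearest_preset(pixel):
    """Assign a pixel to the preset color with the smallest squared RGB distance."""
    return min(PRESET_COLORS, key=lambda c: sum((a - b) ** 2 for a, b in zip(pixel, c)))

def second_color(target_pixels):
    """Score each preset color by its pixel count and return the best-scoring one."""
    counts = {c: 0 for c in PRESET_COLORS}
    for p in target_pixels:
        counts[nearest_preset(p)] += 1
    return max(counts, key=counts.get)
```

In practice the target pixels would be sampled from the reference region of claim 10, adjacent to the rendered target area.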
12. The apparatus of claim 9, wherein the color difference information comprises at least one of a linear color difference, a color difference, and a brightness difference.
13. The apparatus of claim 8, wherein the set determination unit is to:
in a case where the frame number corresponding to the j-th color in the first color sequence is smaller than a first frame number threshold, adding the j-th color and the (j-1)-th color in the first color sequence into a same color set to be merged; wherein j is an integer greater than or equal to 2.
14. The apparatus of claim 8, wherein the set determination unit is to:
in a case where the frame number corresponding to the k-th color in the first color sequence is smaller than a second frame number threshold, if a similarity between the starting frame image corresponding to the k-th color and the frame image preceding the starting frame image is smaller than a similarity threshold, adding the k-th color and the (k-1)-th color in the first color sequence into a same color set to be merged; wherein k is an integer greater than or equal to 2.
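Claims 13 and 14 (mirroring claims 6 and 7 of the method) give two conditions under which a color run joins the preceding run's merge set. A hypothetical sketch with placeholder thresholds, taking the claim text at face value (a run below the second threshold merges when its starting frame has low similarity to the frame before it):

```python
def should_merge(frame_count, frame_similarity,
                 first_threshold=2, second_threshold=5, sim_threshold=0.9):
    """Decide whether a color run merges with its predecessor.

    Claim 13: runs shorter than the first threshold always merge.
    Claim 14: runs shorter than the (assumed larger) second threshold merge
    when the run's starting frame has low similarity to the preceding frame.
    All threshold values here are illustrative assumptions.
    """
    if frame_count < first_threshold:
        return True
    if frame_count < second_threshold and frame_similarity < sim_threshold:
        return True
    return False
```

The frame-similarity function itself is left open by the claims; any standard frame comparison (e.g. a histogram or pixel-difference measure) could supply the `frame_similarity` value.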
15. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-7.
16. A non-transitory computer readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of claims 1-7.
CN202110524435.6A 2021-05-13 2021-05-13 Video playing page processing method and device, electronic equipment and storage medium Active CN113259745B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110524435.6A CN113259745B (en) 2021-05-13 2021-05-13 Video playing page processing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113259745A CN113259745A (en) 2021-08-13
CN113259745B true CN113259745B (en) 2022-11-15

Family

ID=77181787

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110524435.6A Active CN113259745B (en) 2021-05-13 2021-05-13 Video playing page processing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113259745B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106101810A (en) * 2016-08-15 2016-11-09 青岛海信电器股份有限公司 Interface subject alternative approach, device and intelligent television for intelligent television
CN106406504A (en) * 2015-07-27 2017-02-15 常州市武进区半导体照明应用技术研究院 Atmosphere rendering system and method of man-machine interaction interface
CN109783178A (en) * 2019-01-24 2019-05-21 北京字节跳动网络技术有限公司 A kind of color adjustment method of interface assembly, device, equipment and medium
WO2020074303A1 (en) * 2018-10-09 2020-04-16 Signify Holding B.V. Determining dynamicity for light effects based on movement in video content
CN111679877A (en) * 2020-05-27 2020-09-18 浙江大华技术股份有限公司 Method and device for changing background of terminal equipment and electronic equipment
CN111897619A (en) * 2020-08-14 2020-11-06 百度时代网络技术(北京)有限公司 Browser page display method and device, electronic equipment and storage medium
CN112328345A (en) * 2020-11-02 2021-02-05 百度(中国)有限公司 Method and device for determining theme color, electronic equipment and readable storage medium


Similar Documents

Publication Publication Date Title
Khan et al. A tone-mapping technique based on histogram using a sensitivity model of the human visual system
US11113795B2 (en) Image edge processing method, electronic device, and computer readable storage medium
CN110971929B (en) Cloud game video processing method, electronic equipment and storage medium
US10783837B2 (en) Driving method and driving device of display device, and related device
CN111654746B (en) Video frame insertion method and device, electronic equipment and storage medium
US11409794B2 (en) Image deformation control method and device and hardware device
US9704227B2 (en) Method and apparatus for image enhancement
CN110996174B (en) Video image quality enhancement method and related equipment thereof
US10810462B2 (en) Object detection with adaptive channel features
CN113518185A (en) Video conversion processing method and device, computer readable medium and electronic equipment
CN115022679B (en) Video processing method, device, electronic equipment and medium
CN115345968B (en) Virtual object driving method, deep learning network training method and device
CN112541868A (en) Image processing method, image processing device, computer equipment and storage medium
CN113989174B (en) Image fusion method and training method and device of image fusion model
CN113259745B (en) Video playing page processing method and device, electronic equipment and storage medium
CN111833262A (en) Image noise reduction method and device and electronic equipment
CN109308690B (en) Image brightness balancing method and terminal
CN113988294A (en) Method for training prediction network, image processing method and device
CN114782249A (en) Super-resolution reconstruction method, device and equipment for image and storage medium
CN114092359A (en) Screen-splash processing method and device and electronic equipment
CN113762016A (en) Key frame selection method and device
CN112513940A (en) Alpha value determination device, alpha value determination method, program, and data structure of image data
CN111915529A (en) Video dim light enhancement method and device, mobile terminal and storage medium
CN114219744B (en) Image generation method, device, equipment and storage medium
CN116957983A (en) Image enhancement method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant