CN112887694A - Video playing method, device and equipment and readable storage medium - Google Patents

Video playing method, device and equipment and readable storage medium

Info

Publication number
CN112887694A
CN112887694A
Authority
CN
China
Prior art keywords
color
video
theme
target
sampling
Prior art date
Legal status
Granted
Application number
CN202110114533.2A
Other languages
Chinese (zh)
Other versions
CN112887694B (en)
Inventor
赖师悦
董祁恩
Current Assignee
Tencent Music Entertainment Technology Shenzhen Co Ltd
Original Assignee
Tencent Music Entertainment Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Music Entertainment Technology Shenzhen Co Ltd filed Critical Tencent Music Entertainment Technology Shenzhen Co Ltd
Priority to CN202110114533.2A priority Critical patent/CN112887694B/en
Publication of CN112887694A publication Critical patent/CN112887694A/en
Application granted granted Critical
Publication of CN112887694B publication Critical patent/CN112887694B/en
Status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/64: Circuits for processing colour signals
    • H04N 9/643: Hue control means, e.g. flesh tone control

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

The application discloses a video playing method, apparatus, device, and readable storage medium. The method includes: extracting the theme color corresponding to each video frame in a target video; performing normalization adjustment on the color components of the theme color to obtain a target theme color; and, when the target video is played, rendering the peripheral area of the video playing area with the target theme color corresponding to each video frame. Because the normalized target theme colors differ only slightly between frames, the color of the peripheral area follows the color of the played video without frequent jitter, which improves the stability of the video's magic color. The method thus avoids the eye discomfort caused by frequent magic-color jumps and improves the user experience.

Description

Video playing method, device and equipment and readable storage medium
Technical Field
The present application relates to the field of computer application technologies, and in particular, to a video playing method, apparatus, device, and readable storage medium.
Background
When a video is played, the video playing area usually does not cover the whole display interface, so as to preserve the viewing effect and the human-computer interaction experience. That is, outside the video playing area there is a peripheral area, in which functional areas such as bullet comments (danmaku), comments, and tipping, or displays of video-related content, may be placed.
At present, in order to enhance the visual effect of the video, a magic color technique may be used to process the peripheral area, so that its color changes along with the color of the played video. In practical applications, however, the magic color (the color displayed in the peripheral area) may jump frequently, causing discomfort to the eyes and degrading the user experience.
In summary, how to effectively solve problems such as the frequent jumping of the magic color is a technical problem that urgently needs to be solved by those skilled in the art.
Disclosure of Invention
The application aims to provide a video playing method, apparatus, device, and readable storage medium that can avoid the eye discomfort caused by frequent jumping of the magic color and can improve the user experience.
In order to solve the technical problem, the application provides the following technical scheme:
in a first aspect, the present application provides a video playing method, including:
extracting theme colors corresponding to all video frames in a target video;
carrying out normalization adjustment on the color components of the theme color to obtain a target theme color;
and when the target video is played, rendering the peripheral area of the video playing area by using the target theme color corresponding to each video frame.
In a possible implementation manner, performing normalization adjustment on the color components of the theme color to obtain a target theme color includes:
performing color space conversion on the theme color to obtain the color component; the color component comprises a hue component and a saturation component;
and carrying out normalization adjustment on the hue components, and/or carrying out normalization adjustment on the saturation components to obtain the target theme color.
In another possible embodiment, the normalization adjustment of the hue component includes:
determining a tone interval corresponding to the tone component;
and adjusting the tone component to a normalized tone value corresponding to the tone interval.
In another possible embodiment, the normalization adjustment of the saturation component includes:
under the condition that the saturation component is in a weakening interval, adjusting the saturation component to be a preset saturation;
in the case where the saturation component is not in the weakened interval, the saturation component is kept unchanged.
In another possible implementation manner, the extracting the theme color corresponding to each video frame in the target video includes:
sampling pixel points in the video frame to obtain a plurality of sampling pixel points;
calculating the average color of a plurality of sampling pixel points;
determining the mean color as the theme color.
In another possible implementation, sampling the pixel points in the video frame to obtain a plurality of sampled pixel points includes:
carrying out down-sampling scaling on the video frame to obtain a reduced image;
and uniformly sampling the reduced image to obtain a plurality of sampling pixel points.
In another possible implementation, the uniformly sampling the reduced image to obtain a plurality of sampling pixel points includes:
acquiring a plurality of uniform sampling points of the reduced graph in the horizontal and vertical directions;
and calculating the pixel mean value of the neighborhood pixel points of each uniform sampling point to obtain a plurality of sampling pixel points.
In a second aspect, the present application further provides a video playing apparatus, including:
the theme color extraction module is used for extracting the theme colors corresponding to the video frames in the target video;
the theme color adjusting module is used for carrying out normalization adjustment on the color components of the theme color to obtain a target theme color;
and the video playing module is used for rendering the peripheral area of the video playing area by using the target theme color corresponding to each video frame when the target video is played.
In a third aspect, the present application further provides an electronic device, including:
a memory for storing a computer program;
and the processor is used for realizing the steps of the video playing method when executing the computer program.
In a fourth aspect, the present application further provides a readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the steps of the video playing method.
By applying the method provided by the embodiment of the application, the theme color corresponding to each video frame in the target video is extracted; carrying out normalization adjustment on the color components of the theme color to obtain a target theme color; when the target video is played, the peripheral area of the video playing area is rendered by using the target theme color corresponding to each video frame.
In the method, the theme color of each video frame in the target video is extracted first. Since video has a certain continuity, the theme color changes only within a limited range from frame to frame. Based on this, to avoid color jitter, the color components of the theme colors are normalized, yielding target theme colors with only small differences between frames. When the target video is played, the target theme color corresponding to each video frame is rendered in the peripheral area of the video playing area. The color of the peripheral area therefore follows the color of the played video without jittering frequently, which improves the stability of the video's magic color. The method thus avoids the eye discomfort caused by frequent magic-color jumps and improves the user experience.
Accordingly, embodiments of the present application further provide a video playing apparatus, an electronic device, and a readable storage medium corresponding to the video playing method, which have the above technical effects and are not described herein again.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application or in the related art, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below illustrate only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart illustrating an implementation of a video playing method in an embodiment of the present application;
FIG. 2 is a schematic diagram of theme color extraction according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a video playback device according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device in an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
In order that those skilled in the art may better understand the present disclosure, a detailed description is given below with reference to the accompanying drawings. It is to be understood that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art without creative effort shall fall within the protection scope of the present application.
The video playing method provided by the application can be applied to devices that use a magic color technique to enhance video playback, such as PCs and mobile terminals. A magic color is a color, extracted by an algorithm, that approximates the current picture; the video magic color is the collection of the magic colors of every frame in the video. In the related art there is no processing strategy for stabilizing magic color extraction, so the stability of the video magic color is poor: the magic colors of adjacent frames in the video jump frequently as the picture changes, causing eye discomfort. The video playing method provided by the embodiments of the present application solves this stability problem, preventing the magic colors of adjacent frames from jumping frequently with picture changes and keeping the overall color change relatively stable and soft.
Specifically, please refer to fig. 1 for a specific implementation process of the video playing method, where fig. 1 is a flowchart of a video playing method in an embodiment of the present application, and the method includes the following steps:
s101, extracting the theme color corresponding to each video frame in the target video.
The target video can be any video which needs to be played, and a peripheral area exists outside the video playing area. For example, the target video may be embodied as a movie, a television show, a short video, a live video, and the like.
In the present embodiment, the theme color corresponds to the magic color in the magic color technique. The magic color is the color similar to the current picture extracted by the algorithm, and is called as the magic color of the current picture. That is, the theme color in the embodiment of the present application is a color similar to the video frame.
In this embodiment, color quantization algorithms (such as octree extraction or median cut), clustering algorithms (such as K-Means), or color modeling algorithms may be used to extract the theme color from the video frames of the target video.
Of course, in practical applications the extraction algorithm can be accelerated, for example by sampling before processing or by parallel computation, so that the theme color of a video frame can be extracted efficiently in real time.
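As an illustration only (not code from the patent), the sample-then-average extraction described in the claims, i.e. downscale the frame, take uniform sampling points, average each point's neighborhood, then average the samples, might be sketched as follows. The grid size, the 3x3 neighborhood, and the stride-based reduction are assumptions for the sketch:

```python
def extract_theme_color(frame, grid=(8, 8)):
    """Sketch of the claimed extraction: reduce the frame, take a uniform
    grid of sampling points, average each point's 3x3 neighborhood, then
    average those samples into one theme color. `frame` is a list of rows
    of (R, G, B) tuples; grid and neighborhood sizes are illustrative."""
    # Down-sample / reduce: every other pixel (a stand-in for a proper
    # scaling filter).
    small = [row[::2] for row in frame[::2]]
    sh, sw = len(small), len(small[0])

    def mean(colors):
        n = len(colors)
        return tuple(sum(c[i] for c in colors) / n for i in range(3))

    samples = []
    for gy in range(grid[0]):
        for gx in range(grid[1]):
            # Uniform sampling point, kept one pixel inside the border so
            # the 3x3 neighborhood stays within the image.
            y = 1 + gy * (sh - 3) // max(grid[0] - 1, 1)
            x = 1 + gx * (sw - 3) // max(grid[1] - 1, 1)
            # Pixel mean over the 3x3 neighborhood of the sampling point.
            neigh = [small[y + dy][x + dx]
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            samples.append(mean(neigh))

    # The theme color is the mean color over all sampled points.
    return mean(samples)
```

In a real implementation the per-sample loop would be vectorized or parallelized, as the acceleration strategies above suggest.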
S102, carrying out normalization adjustment on the color components of the theme color to obtain the target theme color.
The video itself has a certain continuity: scene content is often similar across consecutive frames, especially adjacent ones, so the variation of the extracted theme color is limited to a certain range. For example, in a video shot of green grass, the theme colors of the video frames are typically shades of green; incoherent colors such as red or black do not appear.
Based on this, in order to avoid the occurrence of color jitter, the subject color may be subjected to normalization adjustment. And typically characterizing color is expressed in terms of color components. Therefore, in this embodiment, the color components of the theme colors can be normalized and adjusted, and thus the target theme colors with small differences can be obtained.
The specific adjustment may be to divide the range of a color component into intervals by thresholds, and normalize all color components within a given interval to one specific value. Thus, even if M theme colors corresponding to M different video frames are obtained, after normalization they collapse into N theme colors, where N is clearly smaller than M. N depends on the normalization granularity: the larger the intervals used in the normalization adjustment, the smaller N becomes.
It should be noted that the number and types of color components vary between color spaces. In practical applications, normalizing the theme color therefore means normalizing some or all of its color components in a chosen color space, so as to weaken the differences between frames and avoid frequent flicker of the theme color. This embodiment does not limit which color space the normalization is performed in, nor the type and number of the corresponding color components, nor the details of the normalization adjustment.
After normalization adjustment, target theme colors with small difference can be obtained. And the target theme color still corresponds to the video frame corresponding to the theme color before normalization adjustment.
S103, when the target video is played, rendering the peripheral area of the video playing area by using the target theme color corresponding to each video frame.
The peripheral area may be specifically all or part of the area in the visual interface except the video playing area. The geometric shapes of the peripheral area and the video playing area and the geometric relationship between the peripheral area and the video playing area are not limited. For example, the video playing area may be in a regular shape, such as a common rectangle, or in an irregular shape, such as a heart shape; the peripheral area and the video playing area can be adjacent or separated.
When the target video is played, the peripheral area can be rendered by using the target theme color corresponding to each video frame. That is, the peripheral region is made to continuously display the target theme color corresponding to the currently played video frame. Specifically, for how to render the peripheral area by using the target theme color, reference may be made to rendering related technologies, which is not limited in the embodiment of the present application.
Of course, in practical applications, in order to increase a cool and dazzling effect, a special video effect may be added in a video playing area and a peripheral area.
When the video playing method provided by the application is applied to a mobile terminal, extracting the theme color of every single frame may occupy a large amount of computing resources and affect the playback experience. To reduce resource consumption while preserving the magic color effect, when the frame rate is high the theme color may be extracted from the video frames at intervals. Alternatively, the currently extracted theme colors can be monitored, and when their fluctuation is determined to be below a preset threshold, the theme color is extracted from the video frames of the target video by sampling. In this way the theme color does not need to be extracted frame by frame (for example, it is extracted only once every certain number of frames), and the peripheral area does not need to be re-rendered at high frequency based on the target theme color.
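A minimal sketch of the interval-based extraction just described, assuming a simple step rule derived from the frame rate (the patent does not specify the rule; `max_fps` and the step computation are assumptions):

```python
def theme_colors_at_interval(frames, extract, fps, max_fps=30):
    """When the frame rate is high, extract the theme color only every
    `step` frames and reuse the last extracted color in between.
    `extract` is any per-frame theme-color function."""
    step = max(1, round(fps / max_fps))
    colors = []
    last = None
    for i, frame in enumerate(frames):
        if last is None or i % step == 0:
            last = extract(frame)  # extract only at interval boundaries
        colors.append(last)        # intermediate frames reuse the color
    return colors
```

At 60 fps with `max_fps=30`, every second frame reuses the previous theme color, halving both extraction and re-rendering work.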
In practical applications, the theme colors corresponding to several video frames can also be normalized into a single target theme color. Then, while those video frames are played and displayed, the peripheral area is rendered with just that one target theme color, which saves rendering passes and reduces the magic color jump frequency.
By applying the video playing method provided by the embodiment of the application, the theme color corresponding to each video frame in the target video is extracted; carrying out normalization adjustment on the color components of the theme color to obtain a target theme color; when the target video is played, the peripheral area of the video playing area is rendered by using the target theme color corresponding to each video frame.
In the method, the theme color corresponding to each video frame in the target video is extracted. Since video has a certain continuity, the theme color changes only within a limited range. To avoid color jitter, the color components of the theme colors are normalized, yielding target theme colors with only small differences. When the target video is played, the target theme color corresponding to each video frame is rendered in the peripheral area of the video playing area, so the color of the peripheral area follows the color of the played video without jittering frequently, which improves the stability of the video's magic color. The method thus avoids the eye discomfort caused by frequent magic-color jumps and improves the user experience.
It should be noted that, based on the above embodiments, the embodiments of the present application also provide corresponding improvements. In the preferred/improved embodiment, the same steps as those in the above embodiment or corresponding steps may be referred to each other, and corresponding advantageous effects may also be referred to each other, which are not described in detail in the preferred/improved embodiment herein.
In a specific embodiment of the present application, the step S102 of performing normalization adjustment on the color component of the theme color to obtain the target theme color may specifically include:
step one, performing color space conversion on the theme color to obtain a color component.
The color component comprises a hue component and a saturation component.
Generally, the color space of video is RGB. The RGB color space superimposes the three primary colors R (red), G (green), and B (blue) to different degrees to produce a rich and wide range of colors, and is therefore commonly called the three-primary-color model. For RGB, the color components are R, G, and B. However, when any one RGB component changes, the theme color changes to a markedly different color, and it is difficult to control the adjustment so that the result stays close to the original color.
Thus, the theme color can be converted into another color space in which it is relatively easy to keep an adjusted color close to the original, such as the HSV color space (Hue, Saturation, Value). The color components of the HSV color space are the hue component (H), the saturation component (S), and the value (brightness) component (V).
In the following, the HSV color space is taken as an example to describe how the color components are obtained; other color spaces are handled similarly.
First, the extracted theme color is converted from the RGB color space to the HSV color space. Research shows that, among the value, hue, and saturation components, the value (brightness) component is relatively stable, while the hue and saturation components are the main factors that make the theme colors of different video frames differ, rendering the theme color unstable. Based on this, when normalizing the theme color, a color space conversion can be performed to convert it into the HSV color space, from which the hue and saturation components can be extracted. For how to convert a theme color from the RGB color space to the HSV color space, reference may be made to the definitions and conversion rules of the two color spaces, which are not repeated here. After the conversion, the hue and saturation components of the theme color can be read directly from the HSV representation.
It should be noted that, since the hue and saturation components are the main causes of the differences between theme colors of different video frames, only these two components need to be obtained from the HSV color space after the conversion.
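For illustration, the conversion can be done with Python's standard colorsys module; this is one convenient choice, not a library the patent prescribes. colorsys works with all components in [0, 1], so the sketch rescales to the [0, 360] hue range used in this document:

```python
import colorsys

def theme_color_to_hsv(r, g, b):
    """Convert an RGB theme color (0-255 per channel) to (H, S, V) with
    H in [0, 360] and S, V in [0, 1], via the standard colorsys module."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 360.0, s, v
```

The hue and saturation values returned here are the components that the normalization steps below operate on.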
And step two, carrying out normalization adjustment on the hue component, and/or carrying out normalization adjustment on the saturation component to obtain the target theme color.
After the color components are obtained, normalization adjustment can be performed on the hue components or on the saturation components, or the hue components and the saturation components can be simultaneously normalized and adjusted, and then the target theme color is obtained based on the color components after normalization adjustment. Specifically, the color tone component and the saturation component are normalized and adjusted, and the core idea is to normalize and classify scattered color components.
Specifically, the tone components are normalized and adjusted, and a binning concept (i.e., variable discretization) can be adopted for processing. The specific implementation process may include the following steps:
step 1, determining a tone interval corresponding to a tone component;
and 2, adjusting the tone component to a normalized tone value corresponding to the tone interval.
For convenience of description, the above 2 steps will be described in combination.
For the hue component, the full range in the HSV color space is [0, 360]; the hue of a theme color can take any value in this interval. To avoid frequent flicker between color components, a few standard values can be taken from [0, 360] to represent the actual values. Specifically, [0, 360] is divided in advance into several hue intervals, and a normalized hue value is set for each interval. The normalized hue value may be any value within the corresponding interval, and the intervals may have equal or different widths. In practice, so that the normalized hue values are regular and clearly distinct from one another, they can be chosen uniformly for each interval, for example as the interval's start value, end value, a specific middle value, or its mean value.
That is, using the binning concept, the hue range [0, 360] can be divided into M hue intervals of width W = 360 / M, and all hue components falling into the same interval are replaced by the same normalized hue value, for example the interval's start value: H_new = floor(H / W) × W, where floor rounds down to the nearest integer. The larger M is, the finer the granularity of the normalized hue component.
For example, suppose there are 6 hue intervals: [0, 60), [60, 120), [120, 180), [180, 240), [240, 300), [300, 360], with normalized hue values 30, 80, 150, 210, 270, 330 respectively. Suppose 10 theme colors are extracted, with hue components 25, 12, 44, 72, 82, 96, 89, 267, 298, 152. Then 25, 12, and 44 fall in [0, 60) and are all replaced by 30; 72, 82, 96, and 89 fall in [60, 120) and are all replaced by 80; 267 and 298 fall in [240, 300) and are replaced by 270; and 152 falls in [120, 180) and is replaced by 150. The adjusted hue components are thus 30, 30, 30, 80, 80, 80, 80, 270, 270, 150. Before adjustment the 10 hue components changed value 10 times; after adjustment they change value only 3 times.
As another example, suppose there are 10 hue intervals: [0, 36), [36, 72), [72, 108), [108, 144), [144, 180), [180, 216), [216, 252), [252, 288), [288, 324), [324, 360], with normalized hue values 36, 72, 108, 144, 180, 216, 252, 288, 324, 360 respectively. For the same 10 hue components 25, 12, 44, 72, 82, 96, 89, 267, 298, 152: 25 and 12 fall in [0, 36) and are replaced by 36; 44 falls in [36, 72) and is replaced by 72; 72, 82, 96, and 89 fall in [72, 108) and are replaced by 108; 267 falls in [252, 288) and is replaced by 288; 298 falls in [288, 324) and is replaced by 324; and 152 falls in [144, 180) and is replaced by 180. The adjusted hue components are thus 36, 36, 72, 108, 108, 108, 108, 288, 324, 180: after adjustment the 10 hue components change value only 5 times.
It can be seen that the larger the number of hue intervals, the closer the normalized hue values stay to the original values, and the choice of normalized hue value determines how much each hue changes during normalization. In practical applications, therefore, the number of hue intervals and the normalized hue value of each interval can be set according to specific requirements. That is, this embodiment limits neither the number of hue intervals nor the normalized hue values; experiments show, however, that a better visual effect is obtained when the number of hue intervals is 10.
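The binning rule above can be sketched as follows. Using each interval's start value as its normalized hue (the H_new = floor(H / W) × W form) is one of the choices the text allows; a midpoint would work equally well, and the default of 10 intervals mirrors the value the text reports as visually best:

```python
def normalize_hue(h, num_intervals=10):
    """Bin the hue component h (in [0, 360]) into equal-width intervals
    and replace it by the start value of its interval:
    H_new = floor(H / W) * W with W = 360 / num_intervals."""
    width = 360.0 / num_intervals
    # Clamp so that h == 360 falls into the last interval instead of a
    # nonexistent extra one.
    idx = min(int(h // width), num_intervals - 1)
    return idx * width
```

With 10 intervals, hues 25, 12, 44 all normalize to 0.0 and 72, 82, 96, 89 all normalize to 72.0, so adjacent frames whose hues drift within one interval no longer cause any color change.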
Specifically, the normalization adjustment of the saturation component may include:
step one, judging whether the saturation component is larger than a saturation maximum threshold value;
step two, if yes, determining that the saturation component is in the weakening interval, and adjusting the saturation component to be a preset saturation;
and step three, if not, determining that the saturation component is not in the weakening interval, and keeping the saturation component unchanged.
For convenience of description, the above three steps will be described in combination.
For the saturation component S, a weakening operation may be performed to reduce the color jitter caused by saturation. Specifically, a saturation maximum threshold S_T may be selected. The saturation component of a theme color then falls into one of the following two cases:
case 1: the saturation component is in the weakening interval;
case 2: the saturation component is not in the weakened section.
Different processing is used for each case: for case 1, the saturation component is adjusted to a preset saturation; for case 2, the saturation component is kept unchanged. That is, saturation components not exceeding the saturation maximum threshold are retained, while saturation values above the threshold are weakened down to it (i.e., the preset saturation equals the threshold). The formula is:
S_new = S, if S ≤ S_T
S_new = S_T, if S > S_T
That is, the saturation interval above S_T is the weakening interval. When the saturation component is in the weakening interval, it can be adjusted directly to a preset saturation, which may be S_T itself or a value smaller than S_T. When the saturation component is not in the weakening interval, it may be kept unchanged.
The embodiments of the present application limit neither the specific value of S_T nor the specific value of the preset saturation. Experimental verification shows, however, that a better visual effect is obtained when S_T is set to 0.4.
Taking S_T = 0.4 with the preset saturation equal to S_T as an example: if the saturation component of the theme color is 0.5, it is adjusted to 0.4, that is, the saturation component of the corresponding target theme color is 0.4; if the saturation component of the theme color is 0.2, it is kept at 0.2, that is, the saturation component of the corresponding target theme color is 0.2.
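The weakening operation described above amounts to clamping the saturation component at the threshold. A minimal sketch in Python, assuming (as in the example values given in the text) that the preset saturation equals S_T = 0.4:

```python
S_T = 0.4  # saturation maximum threshold; 0.4 is the example value from the text

def weaken_saturation(s: float, s_t: float = S_T, preset: float = S_T) -> float:
    """Weaken a saturation component: values above s_t fall in the
    weakening interval and are adjusted to the preset saturation;
    values at or below s_t are kept unchanged."""
    return preset if s > s_t else s

# The worked example from the text:
print(weaken_saturation(0.5))  # -> 0.4 (in the weakening interval)
print(weaken_saturation(0.2))  # -> 0.2 (kept unchanged)
```

Because the preset saturation here equals S_T, the operation is simply `min(s, S_T)`; a preset smaller than S_T would introduce a discontinuity at the threshold, which the text also permits.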
It should be noted that, depending on the chosen reference values and the hue and saturation adjustment settings, the color components of the target theme color may be identical to those of the theme color, or may differ from them (e.g., at least one of the hue component and the saturation component is changed).
After the color components of the theme color are normalized and adjusted, the adjusted target theme color may be converted back into the RGB color space, so that the peripheral area can be rendered with a rendering technique for the RGB color space. Alternatively, the color space conversion back may be omitted, and the peripheral area rendered based on the target theme color directly with a rendering technique for the converted color space.
In this embodiment, the hue component and/or the saturation component are normalized and adjusted, so that the difference between the hue component and the saturation component of different theme colors can be correspondingly weakened, and a target theme color capable of avoiding frequent jumping can be obtained.
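A compact sketch of the hue and saturation normalization in Python, using the standard-library `colorsys` module. HSV as the working color space, 12 hue intervals, and interval midpoints as the normalized hue values are illustrative assumptions — the text fixes none of these:

```python
import colorsys

S_T = 0.4      # saturation maximum threshold (example value from the text)
HUE_BINS = 12  # number of hue intervals -- an illustrative assumption

def normalize_theme_color(r: float, g: float, b: float):
    """Normalize a theme color (RGB components in [0, 1]): snap the hue
    to its interval's normalized value, weaken the saturation, and
    convert back to RGB for rendering."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    # Hue normalization: adjust the hue to the midpoint of its interval,
    # so nearby hues collapse to the same normalized hue value.
    interval = int(h * HUE_BINS) % HUE_BINS
    h = (interval + 0.5) / HUE_BINS
    # Saturation weakening: clamp at S_T (preset saturation = S_T here).
    s = min(s, S_T)
    return colorsys.hsv_to_rgb(h, s, v)
```

With this sketch, two theme colors whose hues fall in the same interval yield target theme colors with an identical hue component, which is what suppresses frequent jumps between consecutive frames.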
In a specific embodiment of the present application, the theme color may also be extracted by sampling the video frame, so as to reduce the occupied computing resources and facilitate implementing the video playing method provided by the present application on a mobile terminal.
For magic color extraction, algorithms based on color quantization, clustering, and color modeling have been proposed. Combined with acceleration strategies such as parallel computation or sampled computation, these algorithms can achieve efficient real-time extraction on a PC (personal computer). However, they cannot perform real-time extraction on a mobile terminal; that is, the magic color video technology cannot be used to optimize video playing on such terminals.
On mobile terminals there is a large demand for video playing, for example in movie and television apps, short video apps, and live broadcast apps. Because a mobile terminal has fewer computing resources than a PC, those resources can hardly support the relevant magic color extraction algorithms. Therefore, on the basis of any of the above embodiments, a manner of sampling a video frame to extract a theme color is proposed to obtain a color similar to the video frame, so that the video magic color technology can be implemented on the mobile terminal to optimize video playing. Of course, when this approach is applied to the PC end, it likewise reduces the occupied computing resources and realizes real-time magic color.
Specifically, the step S101 of extracting the theme color corresponding to each video frame in the target video includes:
step one, sampling pixel points in a video frame to obtain a plurality of sampling pixel points.
The video frame is one frame of image of the video and contains a large number of pixel points. In order to obtain the theme color quickly, the pixel points in the video frame can be sampled to obtain a plurality of sampling pixel points. Specifically, the sampling pixel points can be obtained by randomly sampling the pixel points in the video frame, or by uniformly sampling them. Obviously, the more pixel points are sampled, the closer the finally obtained theme color is to the video frame, but the more computing resources are consumed. Therefore, in practical application, a more random or more uniform sampling algorithm can be selected to sample the pixel points in the video frame, so as to obtain more representative sampling pixel points.
The number of sampling pixel points can be set according to the computing power and resources of the terminal in the practical application. For example, if the method is applied to a PC, more sampling pixel points can be set; if it is applied to a mobile terminal, fewer sampling pixel points can be set. For example, the number of sampling pixel points may be set to 100 on a mobile terminal, and to 200 or even more on a PC.
Sampling the pixel points in the video frame to obtain a plurality of sampling pixel points may specifically include:
step 1, carrying out downsampling and scaling on the video frame to obtain a reduced image.
By down-sampling and scaling the video frame, a reduced image can be obtained. This shrinks the sampling space for the subsequent collection of pixel points and allows the pixel points to be sampled more uniformly.
And 2, uniformly sampling the reduced image to obtain a plurality of sampling pixel points.
The reduced image is uniformly sampled, so that a plurality of sampling pixel points obtained by collection can uniformly cover the video frame, and the sampling pixel points are representative.
It should be noted that, in order to prevent a sampling pixel point from being too isolated (for example, completely different from its surrounding pixels), which would make the final theme color dissimilar to the color of the video frame, step 2 may be detailed as the following steps:
step 2.1, acquiring a plurality of uniform sampling points of the reduced image in the horizontal and vertical directions;
and 2.2, calculating the pixel mean value of the neighborhood pixel points of each uniform sampling point to obtain a plurality of sampling pixel points.
That is to say, when acquiring the sampling pixel points, instead of directly collecting a plurality of pixel points from the reduced image, a plurality of uniform sampling points may be acquired in the reduced image along the horizontal and vertical directions, each uniform sampling point corresponding to one sampling pixel point. Then, the pixel mean value of the neighborhood pixel points of each uniform sampling point is taken as the pixel value of the corresponding sampling pixel point. In this way, each sampling pixel point represents the pixel condition of the neighborhood region around its uniform sampling point.
When the neighborhood mean value is computed for each uniform sampling point to determine the pixel value of the sampling pixel point, the computation can be decomposed, for speed, into averaging first in the horizontal direction and then in the vertical direction, or first in the vertical direction and then in the horizontal direction.
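The decomposition can be sketched as follows for a grayscale neighborhood stored as a 2-D list; averaging each row of the window first (horizontal pass) and then averaging the row means (vertical pass) gives exactly the direct 2-D window mean, since the window is rectangular. The helper name and grayscale simplification are illustrative, not from the patent:

```python
def window_mean_separable(img, cx, cy, k=1):
    """Mean of the (2k+1) x (2k+1) neighborhood centered at (cx, cy),
    computed separably: first average within each row of the window
    (horizontal direction), then average those row means (vertical
    direction). img is a 2-D list of grayscale values; the window is
    assumed to lie fully inside the image."""
    row_means = []
    for y in range(cy - k, cy + k + 1):
        row = img[y][cx - k: cx + k + 1]
        row_means.append(sum(row) / len(row))   # horizontal average
    return sum(row_means) / len(row_means)      # vertical average
```

For a color image the same two passes are applied per channel; the benefit of the decomposition is that intermediate row sums can be shared between adjacent sampling points.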
After a plurality of sampling pixel points are obtained, the theme color can be determined based on the plurality of sampling pixel points.
And step two, calculating the average color of the plurality of sampling pixel points.
The mean color of the plurality of sampling pixel points can be calculated directly. When calculating the mean color, the sampling pixel points can be accumulated sequentially, or accumulated in parallel along the horizontal or vertical direction to speed up the calculation.
And step three, determining the average value color as the theme color.
The mean color is obtained from the video frame by sampling and mean calculation, and therefore approximates the overall color of the video frame. It can thus be directly determined as the theme color, which serves as an approximation of the magic color.
Referring to fig. 2, fig. 2 is a schematic diagram of extracting theme colors according to an embodiment of the present application, showing, from left to right, a video frame with a pixel size of 720 × 720, a reduced image with a pixel size of 200 × 200, 10 × 10 sampling pixel points, and the 1 × 1 theme color. That is, the video frame may be down-sampled and scaled, preserving the aspect ratio, so that the short side reaches a specific number of pixels (200 pixels as shown in fig. 2, i.e., the size of the reduced image is 200 × 200 pixels). Then, the reduced image is uniformly sampled in the horizontal and vertical directions (for example, 10 uniform sampling points in each direction, 100 uniform sampling points in total). For each uniform sampling point, the pixel mean value of a window whose neighborhood has a preset size (such as 3) is computed, finally yielding sampling pixel points whose number is consistent with that of the uniform sampling points (for example, 100 uniform sampling points yield 100 sampling pixel points). The mean of the colors of these sampling pixel points is then computed to finally obtain the theme color, i.e., the magic color.
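The whole pipeline of fig. 2 can be sketched in pure Python as below. Nearest-neighbor down-scaling, placing the uniform sampling points at grid-cell centers, and a 3 × 3 neighborhood (k = 1) are illustrative implementation choices; the patent fixes only the overall scheme (reduce, uniformly sample, neighborhood mean, mean color):

```python
def extract_theme_color(frame, short_side=200, grid=10, k=1):
    """Sketch of the sampling pipeline: down-sample `frame` (a list of
    rows of (r, g, b) tuples) so its short side has `short_side`
    pixels, take a grid x grid set of uniform sampling points, replace
    each by its (2k+1) x (2k+1) neighborhood mean, then average all
    sampling pixel points into a single theme color."""
    h, w = len(frame), len(frame[0])
    scale = short_side / min(h, w)
    sh, sw = int(h * scale), int(w * scale)
    # Nearest-neighbor down-sampling (an implementation choice).
    small = [[frame[int(y / scale)][int(x / scale)] for x in range(sw)]
             for y in range(sh)]

    samples = []
    for gy in range(grid):
        for gx in range(grid):
            # Uniform sampling point at each grid-cell center, kept
            # k pixels away from the border so the window fits.
            cy = min(max(int((gy + 0.5) * sh / grid), k), sh - 1 - k)
            cx = min(max(int((gx + 0.5) * sw / grid), k), sw - 1 - k)
            # Neighborhood mean over a (2k+1) x (2k+1) window.
            acc = [0.0, 0.0, 0.0]
            n = (2 * k + 1) ** 2
            for y in range(cy - k, cy + k + 1):
                for x in range(cx - k, cx + k + 1):
                    for c in range(3):
                        acc[c] += small[y][x][c]
            samples.append(tuple(v / n for v in acc))

    # Mean color of all sampling pixel points = theme color.
    m = len(samples)
    return tuple(sum(s[c] for s in samples) / m for c in range(3))
```

With the fig. 2 parameters (short_side=200, grid=10), only 100 sampling pixel points and their 3 × 3 windows are touched per frame, which is what keeps the per-frame cost low enough for mobile terminals.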
By obtaining the theme color in this specific implementation manner, the amount and complexity of computation can be greatly reduced and the theme color extraction can be accelerated, allowing implementation on mobile terminals with weak computing resources and filling the gap that has made real-time video magic color difficult on mobile terminals.
Corresponding to the above method embodiments, the present application further provides a video playing apparatus, and the video playing apparatus described below and the video playing method described above may be referred to correspondingly.
Referring to fig. 3, the apparatus includes the following modules:
the theme color extraction module 101 is configured to extract a theme color corresponding to each video frame in the target video;
the theme color adjusting module 102 is configured to perform normalization adjustment on the color components of the theme color to obtain a target theme color;
the video playing module 103 is configured to render a peripheral area of the video playing area by using a target theme color corresponding to each video frame when the target video is played.
By applying the device provided by the embodiment of the application, the theme color corresponding to each video frame in the target video is extracted; carrying out normalization adjustment on the color components of the theme color to obtain a target theme color; when the target video is played, the peripheral area of the video playing area is rendered by using the target theme color corresponding to each video frame.
In the device, a theme color corresponding to each video frame in a target video is extracted. Considering that the video itself has a certain continuity, the change of the theme color between video frames is limited to a certain range. On this basis, in order to avoid color jitter, the color components of the theme colors can be normalized and adjusted to obtain target theme colors with small differences. Therefore, when the target video is played, the peripheral area of the video playing area can be rendered with the target theme color corresponding to each video frame. In this way, the color of the peripheral area changes along with the color of the played video without frequent jitter, improving the magic color stability of the video. The device thus avoids the discomfort to human eyes caused by frequent magic color jumps, and user experience can be improved.
In a specific embodiment of the present application, the theme color adjustment module 102 specifically includes:
the color component obtaining unit is used for carrying out color space conversion on the theme color to obtain a color component; the color component comprises a hue component and a saturation component;
and the color component adjusting unit is used for carrying out normalization adjustment on the hue component and/or carrying out normalization adjustment on the saturation component to obtain the target theme color.
In a specific embodiment of the present application, the color component adjusting unit is specifically configured to determine a hue interval corresponding to a hue component; and adjusting the tone component to a normalized tone value corresponding to the tone interval.
In a specific embodiment of the present application, the color component adjusting unit is specifically configured to adjust the saturation component to a preset saturation when the saturation component is in the weakened section; in the case where the saturation component is not in the weakened section, the saturation component is kept unchanged.
In a specific embodiment of the present application, the theme color extraction module 101 specifically includes:
the sampling unit is used for sampling pixel points in the video frame to obtain a plurality of sampling pixel points;
the mean color calculation unit is used for calculating the mean color of the sampling pixel points;
and the theme color determining unit is used for determining the average value color as the theme color.
In an embodiment of the present application, the sampling unit is specifically configured to perform downsampling and scaling on a video frame to obtain a reduced image; and uniformly sampling the reduced image to obtain a plurality of sampling pixel points.
In an embodiment of the present application, the sampling unit is specifically configured to obtain a plurality of uniform sampling points of the reduced image in horizontal and vertical directions; and calculating the pixel mean value of the neighborhood pixel points of each uniform sampling point to obtain a plurality of sampling pixel points.
Corresponding to the above method embodiment, an embodiment of the present application further provides an electronic device, and the electronic device described below and the video playing method described above may be referred to correspondingly.
Referring to fig. 4, the electronic device includes:
a memory 332 for storing a computer program;
the processor 322 is configured to implement the steps of the video playing method of the above-mentioned method embodiment when executing the computer program.
Specifically, referring to fig. 5, fig. 5 is a schematic structural diagram of an electronic device provided in this embodiment, which may generate relatively large differences due to different configurations or performances, and may include one or more processors (CPUs) 322 (e.g., one or more processors) and a memory 332, where the memory 332 stores one or more computer applications 342 or data 344. Memory 332 may be, among other things, transient or persistent storage. The program stored in memory 332 may include one or more modules (not shown), each of which may include a sequence of instructions operating on a data processing device. Still further, the central processor 322 may be configured to communicate with the memory 332 to execute a series of instruction operations in the memory 332 on the electronic device 301.
The electronic device 301 may also include one or more power sources 326, one or more wired or wireless network interfaces 350, one or more input-output interfaces 358, and/or one or more operating systems 341.
The steps in the video playing method described above may be implemented by the structure of the electronic device.
Corresponding to the above method embodiment, the present application further provides a readable storage medium, and a readable storage medium described below and a video playing method described above may be referred to in correspondence.
A readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the steps of the video playing method of the above-mentioned method embodiment.
The readable storage medium may be a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and various other readable storage media capable of storing program codes.
Those of skill would further appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.

Claims (10)

1. A video playback method, comprising:
extracting theme colors corresponding to all video frames in a target video;
carrying out normalization adjustment on the color components of the theme color to obtain a target theme color;
and when the target video is played, rendering the peripheral area of the video playing area by using the target theme color corresponding to each video frame.
2. The video playing method according to claim 1, wherein the normalizing the color components of the theme color to obtain the target theme color comprises:
performing color space conversion on the theme color to obtain the color component; the color component comprises a hue component and a saturation component;
and carrying out normalization adjustment on the hue components, and/or carrying out normalization adjustment on the saturation components to obtain the target theme color.
3. The video playback method of claim 1, wherein the normalizing the hue component comprises:
determining a tone interval corresponding to the tone component;
and adjusting the tone component to a normalized tone value corresponding to the tone interval.
4. The video playing method according to claim 1, wherein the normalization adjustment of the saturation component comprises:
under the condition that the saturation component is in a weakening interval, adjusting the saturation component to be a preset saturation;
in the case where the saturation component is not in the weakened interval, the saturation component is kept unchanged.
5. The video playing method according to any one of claims 1 to 4, wherein the extracting the theme color corresponding to each video frame in the target video includes:
sampling pixel points in the video frame to obtain a plurality of sampling pixel points;
calculating the average color of a plurality of sampling pixel points;
determining the mean color as the theme color.
6. The video playing method according to claim 5, wherein sampling the pixels in the video frame to obtain a plurality of sampled pixels comprises:
carrying out down-sampling scaling on the video frame to obtain a reduced image;
and uniformly sampling the reduced image to obtain a plurality of sampling pixel points.
7. The video playing method according to claim 6, wherein uniformly sampling the reduced image to obtain a plurality of sampling pixels, comprises:
acquiring a plurality of uniform sampling points of the reduced graph in the horizontal and vertical directions;
and calculating the pixel mean value of the neighborhood pixel points of each uniform sampling point to obtain a plurality of sampling pixel points.
8. A video playback apparatus, comprising:
the theme color extraction module is used for extracting the theme colors corresponding to the video frames in the target video;
the theme color adjusting module is used for carrying out normalization adjustment on the color components of the theme color to obtain a target theme color;
and the video playing module is used for rendering the peripheral area of the video playing area by using the target theme color corresponding to each video frame when the target video is played.
9. An electronic device, comprising:
a memory for storing a computer program;
a processor for implementing the steps of the video playback method according to any one of claims 1 to 7 when executing the computer program.
10. A readable storage medium, having stored thereon a computer program which, when being executed by a processor, carries out the steps of the video playback method according to any one of claims 1 to 7.
CN202110114533.2A 2021-01-26 2021-01-26 Video playing method, device and equipment and readable storage medium Active CN112887694B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110114533.2A CN112887694B (en) 2021-01-26 2021-01-26 Video playing method, device and equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110114533.2A CN112887694B (en) 2021-01-26 2021-01-26 Video playing method, device and equipment and readable storage medium

Publications (2)

Publication Number Publication Date
CN112887694A true CN112887694A (en) 2021-06-01
CN112887694B CN112887694B (en) 2023-03-10

Family

ID=76053564

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110114533.2A Active CN112887694B (en) 2021-01-26 2021-01-26 Video playing method, device and equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN112887694B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114095751A (en) * 2021-09-30 2022-02-25 广州方硅信息技术有限公司 Background processing method of live related page, electronic device and storage medium
CN114547436A (en) * 2021-12-31 2022-05-27 北京达佳互联信息技术有限公司 Page display method and device, electronic equipment and storage medium
CN115145442A (en) * 2022-06-07 2022-10-04 杭州海康汽车软件有限公司 Environment image display method and device, vehicle-mounted terminal and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1577336A (en) * 2003-07-04 2005-02-09 三菱电机株式会社 Method and apparatus for representing a group of images
CN106406504A (en) * 2015-07-27 2017-02-15 常州市武进区半导体照明应用技术研究院 Atmosphere rendering system and method of man-machine interaction interface
CN107135420A (en) * 2017-04-28 2017-09-05 歌尔科技有限公司 Video broadcasting method and system based on virtual reality technology
CN109166159A (en) * 2018-10-12 2019-01-08 腾讯科技(深圳)有限公司 Obtain the method, apparatus and terminal of the dominant hue of image
CN110322520A (en) * 2019-07-04 2019-10-11 厦门美图之家科技有限公司 Image key color extraction method, apparatus, electronic equipment and storage medium
CN110597589A (en) * 2019-09-11 2019-12-20 北京达佳互联信息技术有限公司 Page coloring method and device, electronic equipment and storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114095751A (en) * 2021-09-30 2022-02-25 广州方硅信息技术有限公司 Background processing method of live related page, electronic device and storage medium
CN114547436A (en) * 2021-12-31 2022-05-27 北京达佳互联信息技术有限公司 Page display method and device, electronic equipment and storage medium
CN115145442A (en) * 2022-06-07 2022-10-04 杭州海康汽车软件有限公司 Environment image display method and device, vehicle-mounted terminal and storage medium
CN115145442B (en) * 2022-06-07 2024-06-11 杭州海康汽车软件有限公司 Method and device for displaying environment image, vehicle-mounted terminal and storage medium

Also Published As

Publication number Publication date
CN112887694B (en) 2023-03-10

Similar Documents

Publication Publication Date Title
CN112887694B (en) Video playing method, device and equipment and readable storage medium
US9691139B2 (en) Adaptive contrast in image processing and display
JP7508135B2 (en) IMAGE PROCESSING METHOD, IMAGE PROCESSING APPARATUS, ELECTRONIC DEVICE, AND COMPUTER PROGRAM
CN113518185B (en) Video conversion processing method and device, computer readable medium and electronic equipment
EP2525561A1 (en) Data-generating device, data-generating method, data-generating program, and recording medium
US9041773B2 (en) Conversion of 2-dimensional image data into 3-dimensional image data
WO2019018434A1 (en) Actor/person centric auto thumbnail
CN113709560B (en) Video editing method, device, equipment and storage medium
US11409794B2 (en) Image deformation control method and device and hardware device
KR101985880B1 (en) Display device and control method thereof
JP2024523865A (en) Screensaver interaction method, device, electronic device, and storage medium
WO2022156129A1 (en) Image processing method, image processing apparatus, and computer device
CN112565887A (en) Video processing method, device, terminal and storage medium
CN114449362B (en) Video cover selection method, device, equipment and storage medium
CN106603885B (en) Method of video image processing and device
CN115689882A (en) Image processing method and device and computer readable storage medium
CN110858389B (en) Method, device, terminal and transcoding equipment for enhancing video image quality
CN108769825B (en) Method and device for realizing live broadcast
CN112788234B (en) Image processing method and related device
CN115311321A (en) Background replacing method, device, electronic equipment and storage medium
CN115604410A (en) Video processing method, device, equipment and computer readable storage medium
CN115706860A (en) Image light supplementing method, device, equipment and storage medium
CN108876800B (en) Information processing method and equipment
EP2614489B1 (en) Method and system for obtaining a control information related to a digital image
WO2020206356A1 (en) High dynamic range video format detection

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant