CN113301414B - Interface generation processing method and device, electronic equipment and computer storage medium


Info

Publication number
CN113301414B
CN113301414B (application CN202010648544.4A)
Authority
CN
China
Prior art keywords
video
image
video frame
characteristic information
color value
Prior art date
Legal status
Active
Application number
CN202010648544.4A
Other languages
Chinese (zh)
Other versions
CN113301414A (en)
Inventor
李小康 (Li Xiaokang)
Current Assignee
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd
Priority to CN202010648544.4A
Publication of CN113301414A
Application granted
Publication of CN113301414B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 8/00 Arrangements for software engineering
    • G06F 8/30 Creation or generation of source code
    • G06F 8/38 Creation or generation of source code for implementing user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/485 End-user interface for client configuration
    • H04N 21/4858 End-user interface for client configuration for modifying screen layout parameters, e.g. fonts, size of the windows

Abstract

The embodiments of the invention provide an interface generation processing method and apparatus, an electronic device, and a computer storage medium. The interface includes a video area and a background area, and the method comprises: acquiring image feature information of a video frame image of the video area; and adjusting the display content of the background area according to the image feature information. The method enables the display content of the background area to change along with the video frame images shown in the video area.

Description

Interface generation processing method and device, electronic equipment and computer storage medium
Technical Field
The embodiments of the invention relate to the field of computer technology, and in particular to an interface generation processing method and apparatus, an electronic device, and a computer storage medium.
Background
In the prior art, a video may occupy different proportions of the player interface while the user is watching it. For example, a video played in full-screen mode fills 100% of the player interface, in which case the user sees no interface elements outside the video frame images of the video.
In other cases, the video frame image does not fill the entire player interface, so the player interface also displays a background area. When the user can see the background area, its display content may not match the video frame images, which degrades the user's viewing experience.
Disclosure of Invention
In view of the above, embodiments of the present invention provide an interface generation processing scheme to at least partially solve the above problem.
According to a first aspect of the embodiments of the present invention, there is provided an interface generation processing method, wherein the interface includes a video area and a background area, the method comprising: acquiring image feature information of a video frame image of the video area; and adjusting the display content of the background area according to the image feature information.
According to a second aspect of the embodiments of the present invention, there is provided an interface generation processing apparatus, comprising: an acquisition module configured to acquire image feature information of a video frame image of the video area; and an adjustment module configured to adjust the display content of the background area according to the image feature information.
According to a third aspect of the embodiments of the present invention, there is provided an electronic device, comprising: a display screen configured to display the video frame images and the display content of the background area; and a processor configured to acquire the image feature information of a video frame image of the video area and adjust the display content of the background area according to the image feature information.
According to a fourth aspect of the embodiments of the present invention, there is provided a computer storage medium having stored thereon a computer program which, when executed by a processor, implements the interface generation processing method according to the first aspect.
According to the interface generation processing scheme provided by the embodiments of the present invention, the display content of the background area outside the video area is adjusted according to the image feature information of the video frame images, so that the display content of the background area changes along with the video frame images displayed in the video area. This makes the background area's display content richer and less monotonous, allows it to change automatically, improves the automation of display-content adjustment, and reduces manual workload.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings required for the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention; a person of ordinary skill in the art can derive other drawings from them without creative effort.
FIG. 1a is a flowchart of the steps of an interface generation processing method according to a first embodiment of the present invention;
FIG. 1b is a schematic diagram of a player interface according to the first embodiment of the present invention;
FIG. 1c is a schematic diagram of another player interface according to the first embodiment of the present invention;
FIG. 1d is a schematic diagram of interface changes in a usage scenario according to the first embodiment of the present invention;
FIG. 2 is a flowchart of the steps of an interface generation processing method according to a second embodiment of the present invention;
FIG. 3a is a flowchart of the steps of an interface generation processing method according to a third embodiment of the present invention;
FIG. 3b is a flowchart of a usage scenario according to the third embodiment of the present invention;
FIG. 4 is a block diagram of an interface generation processing apparatus according to a fourth embodiment of the present invention;
FIG. 5 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention.
Detailed Description
To help understand the technical solutions in the embodiments of the present invention, the following describes these solutions clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments derived by a person skilled in the art from the embodiments of the present invention shall fall within the protection scope of the embodiments of the present invention.
The implementation of the embodiments of the present invention will be further described below with reference to the accompanying drawings.
Embodiment One
Referring to FIG. 1a, there is shown a flowchart of the steps of an interface generation processing method according to the first embodiment of the present invention.
In this embodiment, the interface generation processing method is performed by a terminal device; in other embodiments, it may also be performed by a server.
The interface generation processing method of the embodiment comprises the following steps:
step S102: and acquiring image characteristic information of a video frame image of the video area.
In this embodiment, a video is displayed through an interface that includes a video area and a background area. The video area displays the video frame images of the video and may also display other content as required. The background area is the portion of the interface outside the video area. If the video area is in a full-screen state, i.e. it completely covers the interface, the background area is invisible; otherwise the background area is visible. The video may be a long video, a short video, a video generated from an electronic album, or the like, or even plain text information displayed as images.
This embodiment describes the case where a user watches a video through a web page player or an application player on a terminal device. As shown in FIG. 1b, position A indicates the display screen of the terminal device, position B indicates the video area occupied by the video frame images, and the area outside position B is the background area. Besides the background area's display content and the video frame images, other content may be displayed in the interface as required, such as advertisements, knowledge related to the video, recommendations related to the video, or information about people or works related to the video.
During playback, the video frame images displayed in the video area keep changing. To keep the light-dark contrast between the display content of the background area (such as an image or a color) and the displayed video frame images consistently small, the display content of the background area needs to be adjusted automatically and dynamically according to the displayed video frame images.
In this embodiment, the light-dark contrast may be determined from the hue of the background area's display content and the hue of the video frame image, or jointly from their respective hues, saturations, and brightnesses; this embodiment does not limit how the contrast is computed.
To adjust the display content of the background area automatically and dynamically, the image feature information of the displayed video frame image must be acquired, for example at least one of the following: the background partial image of the video frame image, the foreground partial image of the video frame image, and the color value information of the video frame image. The display content of the background area can then be adjusted in subsequent steps based on this image feature information, reducing the light-dark contrast.
The video frame image may be the one shown in the video area at the current time, for example the image shown at B in FIG. 1b or at B in FIG. 1c. The background partial image of a video frame image may be the portion other than the figure in the foreground; correspondingly, the foreground partial image may be the figure portion of the foreground. The color value information may be obtained by extracting color values from the whole video frame image using any suitable existing method, or by extracting color values from the background partial image only.
In this embodiment, the color value information includes at least one of RGB, RGBA, HSL, HEX, and ARGB color value information.
For example, for a video frame image played in an HTML5-based web page player, the color value information may be at least one of the RGB, HSL, and HEX color value information recognizable by the HTML5 web page player.
For another example, for video frame images played in an application player, the color value information may be ARGB color value information or the like recognizable by that player.
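As an illustration of the kind of processing involved, the sketch below derives a single representative color from a decoded frame's RGBA pixel buffer (such as the array returned by an HTML5 canvas `getImageData()`) and formats it as a HEX color value. The function names are illustrative; the patent does not prescribe a particular extraction method.

```javascript
// Hypothetical sketch: derive one representative color from a frame's RGBA
// pixel buffer by averaging the red, green, and blue channels.
function averageColor(rgba) {
  let r = 0, g = 0, b = 0;
  const pixels = rgba.length / 4; // each pixel is 4 bytes: R, G, B, A
  for (let i = 0; i < rgba.length; i += 4) {
    r += rgba[i];
    g += rgba[i + 1];
    b += rgba[i + 2];
  }
  return {
    r: Math.round(r / pixels),
    g: Math.round(g / pixels),
    b: Math.round(b / pixels),
  };
}

// Format the result as the HEX color value a web page player can consume.
function toHex({ r, g, b }) {
  const h = (v) => v.toString(16).padStart(2, '0');
  return `#${h(r)}${h(g)}${h(b)}`;
}
```

The same representative color could equally be emitted as an `rgb(...)` string for a web page player or packed into an ARGB integer for an application player.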
Step S104: adjusting the display content of the background area according to the image feature information.
For different image feature information, the display content of the background area can be configured correspondingly, so that the light-dark contrast between the background area and the video area remains small.
For example, if the image feature information includes the background partial image of the video frame image, the display content of the background area may be set to that background partial image, or to an image obtained by applying soft-light, color-blending, or similar processing to it, so that the light-dark contrast between the background area's display content and the video frame image is less than or equal to a first light-dark threshold. This improves the visual consistency of the background area and the video area and avoids the visual fatigue caused by prolonged viewing.
For another example, if the image feature information is the color value information of the video frame image, the color value of the background area's display content may be adjusted according to that information so that it equals or approximates the color value of the background partial image of the video frame image, again ensuring the light-dark contrast between the background area and the video frame image is less than or equal to the first light-dark threshold. The first light-dark threshold may be chosen as needed; this embodiment does not limit it.
Of course, in other embodiments, the display content of the background area may be adjusted in other suitable manners; this embodiment does not limit them.
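The comparison against the first light-dark threshold might be sketched as follows. The patent does not fix a brightness formula; the BT.601 luma weights used here, and all names, are assumptions for illustration.

```javascript
// Assumed brightness measure: relative luma per ITU-R BT.601 weights.
function luma({ r, g, b }) {
  return 0.299 * r + 0.587 * g + 0.114 * b;
}

// Check whether a candidate background color keeps the light-dark contrast
// with the frame's representative color at or below the first threshold.
function withinContrast(bgColor, frameColor, firstThreshold) {
  return Math.abs(luma(bgColor) - luma(frameColor)) <= firstThreshold;
}
```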
The implementation process of the method of this embodiment is described in detail below in conjunction with a specific usage scenario:
In this usage scenario, the user watches a video through a web page player on a terminal device. Since the video frame image does not occupy the entire interface (which may be the full display interface), the background area of the player interface is visible to the user. In the prior art this background area is typically a solid color, such as white or black, or a developer presets some other color or image as its display content. In that case the background area is monotonous, and if its light-dark contrast with the displayed video frame images is too large, the result is visually unpleasant, easily causes the user visual fatigue, and may harm eyesight.
To solve this problem, in the present usage scenario, the currently displayed video frame image (denoted image frame 1 for convenience) is processed in real time to obtain its image feature information, for example its color value information. The color value of the background area's display content is then adjusted according to this information, for example set to the color value carried in the image feature information; the adjusted player interface is shown as interface 1 in FIG. 1c.
After image frame 1 has been displayed, the video frame image currently displayed in the video area of the player interface becomes image frame 2. Similarly, the image feature information of image frame 2, such as its color value information, is acquired, and the background area's display content is adjusted so that its color value matches that of the background partial image of image frame 2. In this way the display content of the background area of the player interface is adjusted according to the image feature information of whichever video frame image is displayed; the adjusted player interface is shown as interface 2 in FIG. 1c.
According to this embodiment, the display content of the background area outside the video area is adjusted according to the image feature information of the video frame images, so that it changes along with the video frame images displayed in the video area. The background area's display content becomes richer and less monotonous, changes automatically, the automation of display-content adjustment improves, and manual workload is reduced.
Embodiment Two
Referring to FIG. 2, there is shown a flowchart of the steps of an interface generation processing method according to the second embodiment of the present invention.
In this embodiment, the method is described as being performed by a terminal device. The interface generation processing method includes steps S102 to S104.
To adjust the display content of the background area automatically and reduce its contrast with the video frame images of the video area during playback, step S102 may be implemented as: periodically acquiring the image feature information of the video frame images of the video area according to a preset acquisition period.
The acquisition period may be chosen as needed. For example, since a video is a time-ordered sequence of video frame images, the period may be one acquisition per set number of video frame images (the set number chosen as needed, such as 1 or 3), or one acquisition per video frame image.
The durations of two adjacent acquisition periods may be the same or different. For example, the period may be a fixed duration, as above, acquiring once per set number of video frame images. Alternatively, the period may be non-fixed: for example, when the contrast between the displayed video frame image and the previous one is greater than or equal to a second threshold (chosen as needed), the background area's display content needs replacing and the image feature information is acquired at that moment, making the period's duration non-fixed.
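A minimal sketch of the non-fixed acquisition period: feature information is re-collected only when the new frame differs from the last sampled frame by at least the second threshold. Representing a frame by a single brightness value is a simplification, and all names are illustrative.

```javascript
// Hypothetical sampler implementing a non-fixed acquisition period:
// re-sample only when the frame's brightness changed enough.
function makeSampler(secondThreshold) {
  let lastLuma = null;
  return function shouldSample(frameLuma) {
    if (lastLuma === null || Math.abs(frameLuma - lastLuma) >= secondThreshold) {
      lastLuma = frameLuma;
      return true; // refresh the background area's display content
    }
    return false; // frame close enough to the last sample; skip this one
  };
}
```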
For each acquisition period, the displayed video frame image may be processed in real time to obtain the image feature information. Alternatively, the video frame images may be processed in advance and the resulting image feature information stored; when the information is needed, it is read directly from the storage space. This embodiment does not limit the choice.
Optionally, to meet the personalized needs of different users and adjust the background area's display content according to user requirements, acquiring the image feature information of the video frame images of the video area may be implemented as: acquiring, from the video frame images of the video area, the image feature information corresponding to the adjustment style option currently in effect.
The web page player or application player of the terminal device provides an adjustment switch option and adjustment style options. By operating the adjustment switch option the user controls whether the display content is adjusted at all, which gives the user more choice. For example, the user sets the switch to the on state when the background area's display content should be adjusted, or to the off state when no adjustment is desired.
The adjustment style options correspond to different processing operations on the display content and hence to different display styles. The user selects an option according to their needs, and the selected option becomes the effective adjustment style option.
The adjustment style options may be configured as required, for example a blur style, a nostalgic style, a black-and-white style, and so on; this embodiment does not limit them.
Thus, while a video is played through the web page player or application player with the adjustment switch option on, the image feature information corresponding to the effective adjustment style option is acquired. For example, if the effective option is the blur style, the background partial image may be obtained from the video frame image, and so on.
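A possible mapping from the effective adjustment style option to the feature information collected for it might look like the following; the style names and the mapping itself are assumptions for illustration, not terms defined by the patent.

```javascript
// Assumed mapping: which feature information each style option needs.
const FEATURE_FOR_STYLE = {
  blur: 'background-image',  // blur style works on the background partial image
  nostalgic: 'color-value',  // color-based styles only need color values
  monochrome: 'color-value',
};

function featureForStyle(effectiveStyle) {
  // Fall back to the cheapest feature when the style is unrecognized.
  return FEATURE_FOR_STYLE[effectiveStyle] || 'color-value';
}
```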
Corresponding to step S102, step S104 may be implemented as: periodically adjusting the display content of the background area according to the periodically acquired image feature information.
After the image feature information of the displayed video frame image is acquired in each acquisition period, the background area's display content is adjusted accordingly. The display content is thus adjusted automatically at intervals and becomes richer, and the light-dark contrast between it and the displayed video frame image stays less than or equal to the first light-dark threshold, avoiding visual fatigue and protecting the user's health. Moreover, no display content needs to be preset manually, which reduces manual workload.
According to this embodiment, the image feature information can be acquired periodically and the background area's display content adjusted accordingly, so the display content automatically follows the changes of the played video frame images. This solves the monotony and dullness of the background area and removes the need to preset display content manually, reducing manual workload.
Embodiment Three
Referring to FIG. 3a, there is shown a flowchart of the steps of an interface generation processing method according to the third embodiment of the present invention.
In this embodiment, the interface generation processing method includes steps S102 to S104, which may be implemented as in the first or second embodiment.
To improve adaptability, so that the background area's display content can be adjusted well for different players, network delays, and so on, step S102 includes sub-steps S1021 to S1022.
Sub-step S1021: determining an acquisition policy for the image feature information according to at least one of: the available hardware computing resources of the playback device playing the video, whether the video is live content, and the network delay.
Taking a user playing video on a terminal device as an example, the available hardware computing resources include the device's available memory, available CPU, available GPU, and so on.
In this embodiment, different acquisition policies may be determined for different available hardware computing resources, playback environments, network delays, and so on, ensuring that the chosen policy suits the current state of the playback device. This keeps video playback smooth and prevents the device from stuttering or overheating.
In a specific implementation, the acquisition policy consists of two sub-policies: a content policy and a mode policy.
The content policy indicates what the acquired image feature information should contain: only the background partial image, only the foreground partial image, only the color value information, or some combination of background partial image, foreground partial image, and color value information.
The mode policy indicates how the image feature information is acquired: by processing the displayed video frame image locally in real time, by reading image feature information computed in advance from the storage space, or by having the server process the video frame image in real time.
In a specific implementation, sub-step S1021 may determine the two sub-policies separately; it then includes:
Step I: determining, according to the available hardware computing resources of the playback device, a preset second resource threshold, and a preset third resource threshold, whether the image feature information indicated by the acquisition policy is at least two of the background partial image, the foreground partial image, and the color value information; or only the background partial image; or only the foreground partial image; or only the color value information.
The second and third resource thresholds may be chosen as needed, with the second resource threshold greater than the third resource threshold.
Taking the second resource threshold value as 70% and the third resource threshold value as 50% as an example, if the currently available hardware computing resource is greater than or equal to the second resource threshold value, which indicates that the available hardware computing resource is more, determining that the content policy indication image information in the acquisition policy is at least two of the background partial image, the foreground partial image and the color value information.
Or if the currently available hardware computing resources are smaller than the second resource threshold and greater than or equal to the third resource threshold, indicating that the available hardware computing resources are moderately sufficient, it is determined that the content policy in the acquisition policy indicates that the image characteristic information is a background partial image or a foreground partial image.
Because acquiring the background partial image or the foreground partial image occupies more hardware computing resources, the image characteristic information contains a background partial image or a foreground partial image only when hardware computing resources are sufficient, so that the subsequent display content of the background area is richer than a mere color change.
In addition, the background partial image or the foreground partial image may be blurred, sharpened, cropped, or otherwise processed as required. If the display screen of the playing device has a resolution greater than or equal to the first resolution, acquisition of the background partial image or the foreground partial image in png format may be indicated, so that more image details are retained. Or, if the resolution of the display screen is smaller than the first resolution, acquisition in jpg format may be indicated, so that some image details are discarded, thereby reducing the occupation of hardware computing resources.
Or if the currently available hardware computing resources are smaller than the third resource threshold, indicating that the available hardware computing resources are relatively scarce, it is determined that the content policy in the acquisition policy indicates that the image characteristic information is color value information.
Because color value information occupies fewer hardware computing resources than a background partial image or a foreground partial image, only the color value information may be acquired when hardware computing resources are scarce.
Different color value formats may be employed for different types of players. For example, for a web page player, the color value information may be rgb, rgba, hsl, or hex color value information; for an application player, it may be argb color value information.
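The threshold-based content policy above can be sketched as follows. This is a minimal, hypothetical Python sketch: the function names, the use of a 0-1 available-resource ratio, and the 50%/20% defaults standing in for the second and third resource thresholds are illustrative assumptions, not values from this embodiment.

```python
def choose_content_policy(available_ratio, second_threshold=0.5, third_threshold=0.2):
    """Pick which image characteristic information to acquire based on the
    fraction of hardware computing resources currently available.
    Threshold defaults are illustrative, not from the patent."""
    if available_ratio >= second_threshold:
        # Plenty of resources: acquire partial images and color values together.
        return {"background_image", "foreground_image", "color_value"}
    if available_ratio >= third_threshold:
        # Moderately sufficient: acquire a partial image only.
        return {"background_image"}
    # Scarce resources: fall back to lightweight color value information.
    return {"color_value"}

def color_format_for(player_type):
    """Map player type to one supported color value format (illustrative)."""
    return "rgba" if player_type == "web" else "argb"
```

A caller would feed this a resource measurement (e.g. free-memory ratio) and the player type, then acquire only the indicated items.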
Step II: determining, according to whether the video is live content, the network time delay, and a preset time delay threshold, whether to process the displayed video frame image locally in real time to obtain the image characteristic information, or to obtain the image characteristic information from a server.
If the video is live content, the displayed video frame image is acquired in real time and the permissible delay is small, so the displayed video frame image needs to be processed in real time. If the network time delay is additionally greater than or equal to the preset time delay threshold, the network condition is poor and the video frame image needs to be processed locally; the mode policy in the acquisition policy is therefore determined based on the network time delay, so that the video frame image is processed locally in real time to obtain the image characteristic information.
Or, if the video is live content and the network time delay is smaller than the time delay threshold, the determined mode policy may indicate that the video frame image is processed in real time by the server, and the processed image characteristic information is obtained from the server.
Or, if the video is recorded content, it may be processed in real time or preprocessed, by a server or locally, as required, which is not limited in this embodiment.
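The mode policy decision of Step II can be sketched as below; the function name, the returned labels, and the 200 ms default standing in for the preset time delay threshold are illustrative assumptions.

```python
def choose_mode_policy(is_live, network_delay_ms, delay_threshold_ms=200):
    """Decide where video frame images are processed to obtain the image
    characteristic information. Labels and the default threshold are
    illustrative, not from the patent."""
    if is_live:
        if network_delay_ms >= delay_threshold_ms:
            # Poor network: process the displayed frame locally in real time.
            return "local_realtime"
        # Good network: the server processes frames and returns the result.
        return "server_realtime"
    # Recorded content: real-time or preprocessed, locally or on a server.
    return "local_or_server"
```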
Sub-step S1022: and acquiring the image characteristic information of the video frame image of the video area according to the determined acquisition strategy.
In a specific implementation, the acquisition policy indicates that the video frame image is to be processed locally in real time to obtain the image characteristic information, where the image characteristic information comprises at least one of a background partial image, a foreground partial image, and color value information.
When a background partial image or a foreground partial image is to be acquired, foreground-background segmentation may be performed on the displayed video frame image to separate the foreground partial image from the background partial image.
In one case, the background partial image and/or the foreground partial image may be used directly as part of the image characteristic information. In another case, the background partial image and/or the foreground partial image may first be processed appropriately, for example by blurring or filling in blank portions, and the processed image then used as part of the image characteristic information.
When color value information is to be acquired, it may be extracted from the background partial image or the foreground partial image, or directly from the complete video frame image. The extracted color value information is then used as part of the image characteristic information.
Of course, in other embodiments, the image characteristic information may be obtained in other suitable manners as needed, which is not limited in this embodiment.
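As one hypothetical way to extract color value information from a video frame image (or from a separated background or foreground partial image), a simple average-color computation over sampled pixels can serve; a real implementation might instead use a dominant-color or histogram-based method.

```python
def average_color(pixels):
    """Average a list of (r, g, b) pixel tuples, e.g. sampled from a video
    frame or its background partial image. Illustrative only."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) // n
    g = sum(p[1] for p in pixels) // n
    b = sum(p[2] for p in pixels) // n
    return (r, g, b)

def to_hex(rgb):
    """Render an rgb triple as hex color value information."""
    return "#{:02x}{:02x}{:02x}".format(*rgb)
```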
After the image characteristic information is obtained, the display content of the background area can be adjusted according to the state of the playing device and the content contained in the image characteristic information.
For example, step S104 includes substep S1041 or substep S1042.
Substep S1041: if the available hardware computing resources of the playing device that plays the video are greater than or equal to a first resource threshold, adjusting the background area to display the background partial image in the image characteristic information.
Because rendering an image consumes more hardware computing resources, the display content of the background area is adjusted to the background partial image or the foreground partial image only when the available hardware computing resources are greater than or equal to the first resource threshold, so that the display effect is rich while stuttering of the playing device caused by occupying too many hardware computing resources is avoided.
The first resource threshold may be determined as needed, which is not limited in this embodiment, and may be 20%, for example.
Sub-step S1042: if the available hardware computing resources of the playing device that plays the video are smaller than the first resource threshold, adjusting the color value of the display content of the background area according to the color value information in the image characteristic information.
When the available hardware computing resources are smaller than the first resource threshold, hardware computing resources are scarce. To save resources, the color value of the display content of the background area is adjusted according to the color value information in the image characteristic information, so that the color of the display content of the background area matches the color of the background partial image or the foreground partial image of the video frame image in the video area, avoiding the visual fatigue that an excessive brightness contrast would cause the user.
It should be noted that, in addition to setting the display content of the background area through substep S1041 or substep S1042 above, if the image characteristic information includes only the background partial image or the foreground partial image, the display content of the background area may be adjusted directly according to that partial image; or, if the image characteristic information includes only the color value information, the display content of the background area may be set directly according to the color value information.
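Combining substeps S1041/S1042 with the fallback cases above, the adjustment decision can be sketched as follows. The dictionary keys and function name are illustrative assumptions; the 20% default standing in for the first resource threshold follows the example value mentioned earlier.

```python
def adjust_background(available_ratio, feature_info, first_threshold=0.2):
    """Return the action for the background area given the image
    characteristic information. Keys/names are illustrative."""
    has_image = "background_image" in feature_info
    has_color = "color_value" in feature_info
    # If only one kind of information is present, use it directly.
    if has_image and not has_color:
        return ("show_image", feature_info["background_image"])
    if has_color and not has_image:
        return ("set_color", feature_info["color_value"])
    # Both present: pick by available hardware computing resources.
    if available_ratio >= first_threshold:
        return ("show_image", feature_info["background_image"])
    return ("set_color", feature_info["color_value"])
```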
The implementation process of the method is described below with reference to a specific usage scenario:
As shown in fig. 3b, one adjustment pass of the display content of the background area in the interface generation processing method under this usage scenario is illustrated.
In this usage scenario, during the process of playing video by the user through the playing device (such as a mobile phone):
process a: it is detected whether the video area is in a full screen state.
It should be noted that the full-screen state here does not necessarily mean that the web page player or the application player itself is full screen, but rather that the displayed video frame image occupies the entire display screen. For example, even if the web page player is full screen, if the video frame image shown in it does not occupy the entire display screen, it is still determined not to be in the full-screen state.
If in the full-screen state, no processing is performed; or, if not in the full-screen state, the display content of the background area needs to be set, and process B is executed.
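The full-screen check of process A — whether the displayed video frame image occupies the entire display screen, rather than whether the player window is maximized — can be sketched as below; the function name and dimension-based comparison are illustrative assumptions.

```python
def is_frame_fullscreen(frame_w, frame_h, screen_w, screen_h):
    """True if the displayed video frame image covers the whole display
    screen. A maximized player showing a smaller frame is NOT full screen."""
    return frame_w >= screen_w and frame_h >= screen_h
```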
Process B: if the video frame image is not full screen, entering a policy center and generating a decision according to the available hardware computing resources of the playing device, whether the video is live content, the network time delay, and the like.
In the policy center, available hardware computing resources of the playback device, whether the video is live content, network latency, whether the player is a web page player or an application player, etc. are analyzed to determine an acquisition policy.
For example, if the available memory among the available hardware computing resources of the playing device is greater than or equal to the second resource threshold, the player is a web page player, and the resolution of the display screen is greater than the first resolution, then in the determined acquisition policy the content policy indicates that the image characteristic information to be acquired includes a background partial image and color value information, the background partial image is in png format, and the color value information may be at least one of rgb, rgba, hex, and hsl color value information.
Meanwhile, if the video is live content and the network time delay is greater than the time delay threshold, the mode policy in the acquisition policy indicates that the displayed video frame image is processed locally in real time.
Of course, in other usage scenarios, the background portion image may be an image of other suitable formats, such as GIF.
Process C: acquiring the image characteristic information from the video frame image according to the acquisition policy.
The background partial image and the color value information are acquired from the video frame image in any suitable manner according to the acquisition policy. The background partial image may also be referred to as a frame map.
Process D: and adjusting the display content of the background area according to the image characteristic information.
For a web page player, the video area is usually a playback control drawn on a web page; the control and the web page do not belong to the same layer, and the control may occlude part of the web page, so the entire web page may be set when setting the display content of the background area.
For an application player, the background area may be the area outside the video area, that is, the area not used for displaying video frame images, and its display content may be adjusted directly.
If the image characteristic information includes only the background partial image or only the color value information, the background partial image contained in the image characteristic information is set in the background area for display, or the color value of the display content of the background area is set to the color value corresponding to the color value information.
If the image characteristic information includes both the background partial image and the color value information, it is determined whether the available hardware computing resources of the playing device are greater than or equal to the first resource threshold; if so, the display content of the background area is adjusted according to the background partial image; otherwise, it is adjusted according to the color value information.
Through this adjustment, the displayed effect is that the display content of the background area outside the video area changes along with the video frame image displayed in the video area. Because the atmosphere background formed by the display content of the background area is displayed synchronously with, and transitions smoothly to, the video frame image displayed in the video area, the brightness contrast between the two is reduced, producing an immersive viewing experience in which visual fatigue is unlikely to arise. In addition, intelligently setting the display content of the background area based on the displayed video greatly reduces the workload of the staff of video websites and video applications, improves operation and maintenance efficiency, avoids the problem that a manually configured or preset theme may be irrelevant to the played video, and enables the display content of the background area to follow changes in the displayed video frame image.
According to this embodiment, the display content of the background area outside the video area is adjusted according to the image characteristic information of the video frame image, so that the display content of the background area changes along with the video frame image displayed in the video area. This makes the display content of the background area richer and less monotonous, enables it to change automatically, improves the automation of display content adjustment, and reduces the labor intensity of staff.
Example IV
Referring to fig. 4, a block diagram of an interface generation processing apparatus according to a fourth embodiment of the present invention is shown.
In this embodiment, the interface generation processing apparatus includes: an acquiring module 402, configured to acquire image feature information of a video frame image of a video area; and the adjusting module 404 is configured to adjust the display content of the background area according to the image feature information.
Optionally, the acquiring module 402 is configured to periodically acquire image feature information of a video frame image of the video area according to a preset acquisition period; the adjustment module 404 is configured to periodically adjust the display content of the background area according to the periodically acquired image feature information.
Optionally, the image characteristic information includes at least one of: background portion images in the video frame images, foreground portion images of the video frame images, and color value information of the video frame images.
Optionally, the adjusting module 404 includes: a background image adjustment module 4041, configured to adjust the background area to display a background part image in the image feature information if available hardware computing resources of a playback device that plays the video are greater than or equal to a first resource threshold; or a color value adjustment module 4042, configured to adjust, if the available hardware computing resource of the playback device that plays back the video is smaller than the first resource threshold, the color value of the display content in the background area according to the color value information of the image feature information.
Optionally, the color value information includes at least one of rgb color value information, rgba color value information, hsl color value information, hex color value information, and argb color value information.
Optionally, the acquiring module 402 includes: a policy determining module 4021, configured to determine an acquisition policy of the image feature information according to at least one of available hardware computing resources of a playback device that plays back the video, whether the video is live content, and a network time delay; the information obtaining module 4022 is configured to obtain the image feature information of the video frame image of the video area according to the determined obtaining policy.
Optionally, the policy determining module 4021 is configured to determine, according to the available hardware computing resources of the playing device that plays the video, a preset second resource threshold, and a third resource threshold, that the image characteristic information indicated in the acquisition policy is at least two of a background partial image, a foreground partial image, and color value information, or is a background partial image, or a foreground partial image, or color value information.
Optionally, the policy determining module 4021 is configured to determine, according to whether the video is live content, the network time delay, and a preset time delay threshold, whether to process the displayed video frame image locally in real time to obtain the image characteristic information, or to obtain the image characteristic information from a server.
Optionally, the obtaining module 402 is configured to obtain, from the video frame image of the video area, image feature information corresponding to the validated adjustment style option according to the validated adjustment style option.
The interface generation processing apparatus of this embodiment is configured to implement the corresponding interface generation processing methods in the foregoing method embodiments and has the beneficial effects of the corresponding method embodiments, which are not repeated here. In addition, for the functional implementation of each module in the interface generation processing apparatus of this embodiment, reference may be made to the description of the corresponding portions in the foregoing method embodiments, which is likewise not repeated here.
Example five
Referring to fig. 5, a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention is shown, and the specific embodiment of the present invention is not limited to the specific implementation of the electronic device.
As shown in fig. 5, the electronic device may include: a processor 502, a communication interface (Communications Interface) 504, a memory 506, and a communication bus 508 and a display screen.
Wherein:
processor 502, communication interface 504, and memory 506 communicate with each other via communication bus 508.
A communication interface 504 for communicating with other electronic devices or servers.
The processor 502 is configured to execute the program 510, and may specifically execute relevant steps in the embodiment of the interface generation processing method.
The display screen is used for displaying video images and contents displayed in the background area.
In particular, program 510 may include program code including computer-operating instructions.
The processor 502 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present invention. The one or more processors included in the smart device may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs and one or more ASICs.
A memory 506 for storing a program 510. Memory 506 may comprise high-speed RAM memory or may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
The interface includes a video area and a background area, and the program 510 may be specifically configured to cause the processor 502 to: acquiring image characteristic information of a video frame image of a video area; and adjusting the display content of the background area according to the image characteristic information.
In an alternative embodiment, the program 510 is further configured to, when the processor 502 acquires the image feature information of the video frame image of the video area, periodically acquire the image feature information of the video frame image of the video area according to a preset acquisition period; the adjusting the display content of the background area according to the image characteristic information comprises the following steps: and according to the periodically acquired image characteristic information, periodically adjusting the display content of the background area.
In an alternative embodiment, the image characteristic information includes at least one of: background portion images in the video frame images, foreground portion images of the video frame images, and color value information of the video frame images.
In an optional embodiment, the program 510 is further configured to, when the processor 502 adjusts the display content of the background area according to the image feature information, adjust the background area to display a background part image in the image feature information if an available hardware computing resource of a playing device that plays the video is greater than or equal to a first resource threshold; or if the available hardware computing resource of the playing equipment for playing the video is smaller than the first resource threshold, adjusting the color value of the display content of the background area according to the color value information of the image characteristic information.
In an alternative embodiment, the color value information includes at least one of rgb color value information, rgba color value information, hsl color value information, hex color value information, and argb color value information.
In an alternative embodiment, the program 510 is further configured to, when obtaining the image feature information of the video frame image of the video area, cause the processor 502 to determine an obtaining policy of the image feature information according to at least one of available hardware computing resources of a playing device that plays the video, whether the video is live content, and a network delay; and acquiring the image characteristic information of the video frame image of the video area according to the determined acquisition strategy.
In an alternative embodiment, the program 510 is further configured to, when determining the acquisition policy of the image feature information according to at least one of the available hardware computing resources of the playing device that plays the video, whether the video is live content, and the network time delay, determine, according to the available hardware computing resources of the playing device, a preset second resource threshold, and a third resource threshold, that the image feature information indicated in the acquisition policy is at least two of a background partial image, a foreground partial image, and color value information, or is a background partial image, or a foreground partial image, or color value information.
In an alternative embodiment, the program 510 is further configured to, when determining the acquisition policy of the image feature information according to at least one of the available hardware computing resources of the playing device that plays the video, whether the video is live content, and the network time delay, determine, according to whether the video is live content, the network time delay, and a preset time delay threshold, whether to process the displayed video frame image locally in real time to obtain the image feature information, or to obtain the image feature information from a server.
In an alternative embodiment, the program 510 is further configured to, when acquiring the image feature information of the video frame image of the video area, cause the processor 502 to acquire, from the video frame image of the video area, the image feature information corresponding to the validated adjustment style option according to the validated adjustment style option.
The specific implementation of each step in the program 510 may refer to corresponding steps and corresponding descriptions in the units in the embodiment of the interface generation processing method, which are not described herein. It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the apparatus and modules described above may refer to corresponding procedure descriptions in the foregoing method embodiments, which are not repeated herein.
It should be noted that, according to implementation requirements, each component/step described in the embodiments of the present invention may be split into more components/steps, or two or more components/steps or part of operations of the components/steps may be combined into new components/steps, so as to achieve the objects of the embodiments of the present invention.
The above-described methods according to embodiments of the present invention may be implemented in hardware or firmware, or as software or computer code storable in a recording medium such as a CD-ROM, RAM, floppy disk, hard disk, or magneto-optical disk, or as computer code originally stored in a remote recording medium or a non-transitory machine-readable medium and downloaded through a network to be stored in a local recording medium, so that the methods described herein may be processed by such software on a recording medium using a general-purpose computer, a special-purpose processor, or programmable or dedicated hardware such as an ASIC or FPGA. It is understood that the computer, processor, microprocessor controller, or programmable hardware includes a storage component (e.g., RAM, ROM, flash memory, etc.) that can store or receive software or computer code that, when accessed and executed by the computer, processor, or hardware, implements the interface generation processing method described herein. Further, when a general-purpose computer accesses code for implementing the interface generation processing method shown herein, execution of that code converts the general-purpose computer into a special-purpose computer for executing the method.
Those of ordinary skill in the art will appreciate that the elements and method steps of the examples described in connection with the embodiments disclosed herein can be implemented as electronic hardware, or as a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present invention.
The above embodiments are only for illustrating the embodiments of the present invention, but not for limiting the embodiments of the present invention, and various changes and modifications may be made by one skilled in the relevant art without departing from the spirit and scope of the embodiments of the present invention, so that all equivalent technical solutions also fall within the scope of the embodiments of the present invention, and the scope of the embodiments of the present invention should be defined by the claims.

Claims (11)

1. A generation processing method of an interface, executed by a terminal device, the interface including a video area and a background area, the method comprising:
acquiring image characteristic information of a video frame image of a video area, wherein the image characteristic information comprises at least one of the following: background partial images in the video frame images, foreground partial images of the video frame images, and color value information of the video frame images;
Adjusting the display content of the background area according to the image characteristic information, wherein the adjusting the display content of the background area according to the image characteristic information comprises the following steps: if the available hardware computing resources of the playing equipment for playing the video are greater than or equal to a first resource threshold, adjusting the background area to display a background part image in the image characteristic information; or if the available hardware computing resource of the playing device for playing the video is smaller than the first resource threshold, adjusting the color value of the display content of the background area according to the color value information of the image characteristic information, wherein the playing device is the terminal device.
2. The method of claim 1, wherein the acquiring image feature information of the video frame image of the video region comprises:
periodically acquiring image characteristic information of video frame images of a video area according to a preset acquisition period;
the adjusting the display content of the background area according to the image characteristic information comprises the following steps: and according to the periodically acquired image characteristic information, periodically adjusting the display content of the background area.
3. The method of claim 1, wherein the color value information comprises at least one of rgb color value information, rgba color value information, hsl color value information, hex color value information, and argb color value information.
4. The method according to claim 1 or 2, wherein the acquiring image feature information of the video frame image of the video area comprises:
determining an acquisition strategy of the image characteristic information according to at least one of available hardware computing resources of playing equipment for playing the video, whether the video is live broadcast content and network time delay;
and acquiring the image characteristic information of the video frame image of the video area according to the determined acquisition strategy.
5. The method of claim 4, wherein the determining the image characteristic information acquisition policy based on at least one of available hardware computing resources of a playback device that plays the video, whether the video is live content, and a network time delay comprises:
and determining that the acquired image characteristic information indicated in the acquisition strategy is at least two of background part image, foreground part image and color value information, or is background part image, or is foreground part image or is color value information according to available hardware computing resources of playing equipment for playing the video, a preset second resource threshold and a preset third resource threshold.
6. The method of claim 4, wherein the determining the image characteristic information acquisition policy based on at least one of available hardware computing resources of a playback device that plays the video, whether the video is live content, and a network time delay comprises:
And determining to locally process the displayed video frame image in real time to obtain the image characteristic information according to whether the video is live content, network time delay and a preset time delay threshold value, or obtaining the image characteristic information from a server.
7. The method of claim 1, wherein the acquiring image feature information of the video frame image of the video region comprises:
and according to the validated adjustment style options, acquiring image characteristic information corresponding to the validated adjustment style options from video frame images of the video area.
8. An interface generation processing device configured in a terminal device, where the interface includes a video area and a background area, the device includes:
an acquisition module, configured to acquire image feature information of a video frame image of a video area, where the image feature information includes at least one of: background partial images in the video frame images, foreground partial images of the video frame images, and color value information of the video frame images;
the adjustment module is configured to adjust the display content of the background area according to the image feature information, and includes: if the available hardware computing resources of the playing equipment for playing the video are greater than or equal to a first resource threshold, adjusting the background area to display a background part image in the image characteristic information; or if the available hardware computing resource of the playing device for playing the video is smaller than the first resource threshold, adjusting the color value of the display content of the background area according to the color value information of the image characteristic information, wherein the playing device is the terminal device.
9. An electronic device, implemented as a terminal device, comprising:
a display screen, configured to display the video frame image and the display content of the background area, wherein the image characteristic information of the video frame image comprises at least one of: a background partial image of the video frame image, a foreground partial image of the video frame image, and color value information of the video frame image;
a processor, configured to acquire the image characteristic information of the video frame image of the video area and to adjust the display content of the background area according to the image characteristic information, wherein the adjusting comprises: if the available hardware computing resources of the playback device that plays the video are greater than or equal to a first resource threshold, adjusting the background area to display the background partial image in the image characteristic information; or, if the available hardware computing resources of the playback device that plays the video are smaller than the first resource threshold, adjusting the color value of the display content of the background area according to the color value information in the image characteristic information, wherein the playback device is the terminal device.
10. The electronic device of claim 9, further comprising a memory, wherein the memory is configured to store at least the video and the acquired image characteristic information.
11. A computer storage medium having stored thereon a computer program which, when executed by a processor, implements the interface generation processing method as claimed in any one of claims 1 to 7.
CN202010648544.4A 2020-07-07 2020-07-07 Interface generation processing method and device, electronic equipment and computer storage medium Active CN113301414B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010648544.4A CN113301414B (en) 2020-07-07 2020-07-07 Interface generation processing method and device, electronic equipment and computer storage medium

Publications (2)

Publication Number Publication Date
CN113301414A CN113301414A (en) 2021-08-24
CN113301414B true CN113301414B (en) 2023-06-02

Family

ID=77318339

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010648544.4A Active CN113301414B (en) 2020-07-07 2020-07-07 Interface generation processing method and device, electronic equipment and computer storage medium

Country Status (1)

Country Link
CN (1) CN113301414B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115442657B (en) * 2021-10-15 2023-12-26 佛山欧神诺云商科技有限公司 Method, equipment, medium and product for dynamically adjusting resolution of image picture
CN115145442A (en) * 2022-06-07 2022-10-04 杭州海康汽车软件有限公司 Environment image display method and device, vehicle-mounted terminal and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7742059B2 (en) * 2006-06-29 2010-06-22 Scientific-Atlanta, Llc Filling blank spaces of a display screen
US10257487B1 (en) * 2018-01-16 2019-04-09 Qualcomm Incorporated Power efficient video playback based on display hardware feedback
CN109413352B (en) * 2018-11-08 2020-06-23 北京微播视界科技有限公司 Video data processing method, device, equipment and storage medium
CN110852938B (en) * 2019-10-28 2024-03-19 腾讯科技(深圳)有限公司 Display picture generation method, device and storage medium

Also Published As

Publication number Publication date
CN113301414A (en) 2021-08-24

Similar Documents

Publication Publication Date Title
CN108600781B (en) Video cover generation method and server
US8421819B2 (en) Pillarboxing correction
CN109729405B (en) Video processing method and device, electronic equipment and storage medium
WO2017016171A1 (en) Window display processing method, apparatus, device and storage medium for terminal device
CN113301414B (en) Interface generation processing method and device, electronic equipment and computer storage medium
CN107179889B (en) Interface color adjusting method, webpage color adjusting method and webpage color adjusting device
CN107948733B (en) Video image processing method and device and electronic equipment
WO2019153723A1 (en) Video frame display method and device, television and storage medium
JP7295950B2 (en) Video enhancement control method, device, electronic device and storage medium
CN112102422B (en) Image processing method and device
WO2016160638A1 (en) User sliders for simplified adjustment of images
CN113034509A (en) Image processing method and device
CN110650352B (en) Video processing method of IPTV browser
CN108989872B (en) Android television background fast switching method, framework, server and storage medium
CN113709949A (en) Control method and device of lighting equipment, electronic equipment and storage medium
CN112929682B (en) Method, device and system for transparently processing image background and electronic equipment
CN113691737B (en) Video shooting method and device and storage medium
CN111414221B (en) Display method and device
CN110378973B (en) Image information processing method and device and electronic equipment
CN110941413B (en) Display screen generation method and related device
CN110225177B (en) Interface adjusting method, computer storage medium and terminal equipment
CN113507572A (en) Video picture display method, device, terminal and storage medium
CN108540824B (en) Video rendering method and device
CN113411553A (en) Image processing method, image processing device, electronic equipment and storage medium
CN108235144B (en) Playing content obtaining method and device and computing equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant