CN115314754A - Display control method and device of interactive control and electronic equipment - Google Patents

Display control method and device of interactive control and electronic equipment

Info

Publication number
CN115314754A
Authority
CN
China
Prior art keywords
video
image
control
area
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210692473.7A
Other languages
Chinese (zh)
Inventor
杨泽伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202210692473.7A
Publication of CN115314754A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/47 End-user applications
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/47217 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a display control method and apparatus for an interactive control, and an electronic device. The method comprises the following steps: in response to the display of a first video, determining a target sub-area containing the interactive control within the video display area of the first video, where the interactive control is used to implement a specified function associated with the first video and is displayed on top of the video image in the target sub-area; determining control display parameters for the interactive control based on the video image in the target sub-area, the control display parameters differing from the display parameters of the video image in the target sub-area by a specified distance; and controlling the display of the interactive control based on the control display parameters. The method improves the visibility of the interactive control and lets the user quickly locate it, so that the user can perform video-related interactive operations at any time, improving the user's video viewing experience.

Description

Display control method and device of interactive control and electronic equipment
Technical Field
The invention relates to the technical field of interactive controls, and in particular to a display control method and apparatus for an interactive control, and an electronic device.
Background
When a user watches a video on a terminal device, the user may perform interactive operations on the video, such as going back, pausing, adjusting the playback progress, or adjusting the playback volume. In many video playback scenarios, the interactive controls for these operations are displayed inside the video playback area. Video content varies widely and the colors of the video images change constantly, while interactive controls are generally displayed in a fixed color. During playback the interactive control therefore alternately stands out from and blends into the picture; when the color of the video image is close to that of the interactive control, the control is hard to see, the user has difficulty locating it, the user cannot promptly perform video-related interactive operations, and the viewing experience suffers.
Disclosure of Invention
In view of this, an object of the present invention is to provide a display control method and apparatus for an interactive control, and an electronic device, so as to improve the visibility of the interactive control and allow a user to quickly locate it, so that the user can perform video-related interactive operations at any time, thereby improving the user's video viewing experience.
In a first aspect, an embodiment of the present invention provides a display control method for an interactive control, the method comprising: in response to the display of a first video, determining a target sub-area containing the interactive control within the video display area of the first video, where the interactive control is used to implement a specified function associated with the first video and is displayed on top of the video image in the target sub-area; determining control display parameters for the interactive control based on the video image in the target sub-area, the control display parameters differing from the display parameters of the video image in the target sub-area by a specified distance; and controlling the display of the interactive control based on the control display parameters.
The step of determining the target sub-area containing the interactive control in the video display area of the first video in response to the display of the first video includes: loading and displaying the first video in response to a load instruction for the first video; rendering the target sub-region in the video display region of the first video based on the predetermined position parameter of the target sub-region; the position parameters of the target sub-area are determined in advance based on the position of the interactive control in the video display area, and the target sub-area is displayed in a transparent mode.
The step of determining the control display parameters of the interactive control based on the video image in the target sub-region includes: acquiring a video image in a target sub-area; extracting a first tone image of the video image, and determining a second tone image of the interactive control based on the first tone image; wherein the contrast between the second tone image and the first tone image is greater than a preset contrast threshold; and determining control display parameters of the interaction control based on the second hue image.
The step of acquiring a video image in the target sub-region includes: creating a canvas object matched with the target sub-region; intercepting a region image corresponding to a target sub-region in a video display region of a first video through a canvas object; and setting the intercepted area image in the target sub-area to obtain a video image in the target sub-area.
The step of extracting the first tone image of the video image includes: blurring the video image to obtain a first image; enlarging the first image through the pseudo-element of the target sub-area to obtain a second image; and displaying a designated image area of the second image in the target sub-area and determining the designated image area as the first tone image of the video image.
Before the step of enlarging the first image through the pseudo-element of the target sub-area, the method further includes: setting a pseudo-element in the page element corresponding to the target sub-area; wherein the size of the pseudo-element matches the size of the target sub-area, and the pseudo-element is used to scale the image in the target sub-area with the size of the pseudo-element as a reference.
After the step of displaying the designated image area in the second image in the target sub-area and determining the designated image area as the first tone image of the video image, the method further comprises: concealing an image area other than the designated image area in the second image.
The step of determining the second tone image of the interactive control based on the first tone image includes: inverting the first tone image to obtain the second tone image.
The step of determining the control display parameters of the interactive control based on the second hue image includes: and extracting color information of the specified pixel from the second tone image, and determining the color information as a control display parameter of the interactive control.
The step of controlling and displaying the interactive control based on the control display parameter includes: acquiring page elements corresponding to the interactive controls; setting the background color of the page elements based on the control display parameters, and displaying the interactive control based on the background color.
After the step of displaying the interactive control based on the background color, the method further includes: setting the layer level of the page element carrying the interactive control to be higher than the layer level of the first video, and setting the layer level of the target sub-area to be lower than the layer level of the first video.
The method further includes: during the playing of the first video, executing the step of determining a target sub-area containing the interactive control in the video display area of the first video at specified time intervals, until playback of the first video finishes.
In a second aspect, an embodiment of the present invention provides a display control apparatus for an interactive control, where the apparatus includes: the area determination module is used for responding to the display of the first video and determining a target sub-area containing the interactive control in the video display area of the first video; wherein the interaction control is to: implementing a specified function associated with the first video; the interactive control is displayed on the video image in the target subregion; the parameter determining module is used for determining control display parameters of the interactive control based on the video image in the target sub-area; the display parameters of the control and the display parameters of the video image in the target sub-area have a specified distance; and the display control module is used for controlling the display of the interactive control based on the control display parameter.
In a third aspect, an embodiment of the present invention provides an electronic device, which includes a processor and a memory, where the memory stores machine executable instructions that can be executed by the processor, and the processor executes the machine executable instructions to implement the display control method for the interactive control described above.
In a fourth aspect, embodiments of the present invention provide a machine-readable storage medium storing machine-executable instructions, which when invoked and executed by a processor, cause the processor to implement the above-mentioned display control method for an interactive control.
The embodiment of the invention has the following beneficial effects:
The display control method and apparatus for an interactive control and the electronic device respond to the display of a first video by determining a target sub-area containing the interactive control within the video display area of the first video, where the interactive control is used to implement a specified function associated with the first video and is displayed on top of the video image in the target sub-area; determine control display parameters for the interactive control based on the video image in the target sub-area, the control display parameters differing from the display parameters of the video image in the target sub-area by a specified distance; and control the display of the interactive control based on the control display parameters. In this way, the control display parameters of the interactive control are determined from the video image and kept a specified distance away from the display parameters of the video image, which avoids the situation in which the interactive control cannot be displayed clearly because its display parameters are close to those of the video image. This improves the visibility of the interactive control, allows the user to quickly locate it and perform video-related interactive operations at any time, and improves the user's video viewing experience.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a flowchart of a display control method for an interactive control according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a target sub-region according to an embodiment of the present invention;
fig. 3 is a schematic diagram of an interaction control according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating another interactive control provided by an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a display control apparatus of an interactive control according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
First, key terms related to the present embodiment will be described.
1. DOM
DOM (Document Object Model) is a standard programming interface for processing markup-language documents. It is a platform- and language-independent application programming interface through which programs and scripts can dynamically access a document and update its content, structure and style.
2. Canvas
Canvas is an HTML drawing element and API used to draw graphics on a web page.
When a video is played in a webpage or an application, interactive controls related to the video are displayed in the video playing area, for example a back control, which is usually placed near the upper-left corner of the video playing area. Interactive controls are typically displayed in a fixed color, while the video content keeps changing and may show all kinds of colors. If the color of the video content differs from that of the interactive control, the control can be displayed clearly; but if the two colors are similar, the interactive control merges into the video content, no accurate click hot zone can be offered to the user, the user cannot easily see where the control is, and operating the control becomes difficult.
Based on the foregoing, embodiments of the present invention provide a display control method and apparatus for an interactive control, and an electronic device. The display control method for the interactive control provided by the embodiment can be applied to Web application, android application and iOS application.
In one embodiment of the present invention, the display control method of the interactive control may be executed in a local terminal device or a server, and the local terminal device may be a touch terminal device. When the display control method of the interactive control runs on the server, the method can be implemented and executed based on a cloud interactive system, wherein the cloud interactive system comprises the server and the client device.
In an optional embodiment, various cloud applications may run under the cloud interaction system, for example cloud video or cloud gaming. Taking a cloud game as an example, a cloud game is a game mode based on cloud computing. In this mode, the entity that runs the game program is separated from the entity that presents the game picture: the storage and execution of the display control method of the interactive control are completed on the cloud game server, while the client device is used to send and receive data and present the game picture. The client device may, for example, be a display device with data transmission capability close to the user side, such as a mobile terminal, a television, a computer or a handheld computer, while the device that performs the information processing is the cloud game server in the cloud. During play, the player operates the client device to send operation instructions to the cloud game server; the cloud game server runs the game according to these instructions, encodes and compresses data such as the game picture, and returns the data to the client device over the network; finally the client device decodes the data and outputs the game picture.
In an optional implementation manner, taking a game as an example, the local terminal device stores a game program and is used for presenting a game screen. The local terminal device is used for interacting with the player through a graphical user interface, namely, a game program is downloaded and installed and operated through the electronic device conventionally. The manner in which the local terminal device provides the graphical user interface to the player may include a variety of ways, for example, it may be rendered for display on a display screen of the terminal or provided to the player through holographic projection. For example, the local terminal device may include a display screen for presenting a graphical user interface including a game screen and a processor for running the game, generating the graphical user interface, and controlling display of the graphical user interface on the display screen.
In a possible implementation manner, an embodiment of the present invention provides a method for controlling display of an interaction control, where a graphical user interface is provided by a terminal device, where the terminal device may be the aforementioned local terminal device, and may also be the aforementioned client device in a cloud interaction system. A graphical user interface is provided through the terminal device, and interface contents such as game scene pictures, communication interaction windows and the like can be displayed on the graphical user interface according to the type of the started application program.
As shown in fig. 1, the method for controlling display of an interactive control can be implemented by a terminal device, which may be the local terminal device described above or the client device described above. The method comprises the following steps:
step S102, responding to the display of the first video, and determining a target sub-area containing the interactive control in the video display area of the first video; wherein the interaction control is to: implementing a specified function associated with the first video; the interactive control is displayed on the video image in the target subregion;
here, the first video may be a video displayed on a web page or a page of an application program, for example, a short video, a live video, or the like, or may be a scene screen video captured by a virtual camera in a game scene during a game, and in this case, the first video is displayed on a graphical user interface of the game. The present embodiment does not specifically limit the type, source, and the like of the first video.
In the video playing process, the video content is constantly changing, and thus the display parameters such as color and brightness of the video image are also constantly changing. The embodiment aims to adjust the display parameters of the interactive control according to the video image, so that the display parameters of the interactive control displayed in the video image are changed along with the change of the video image, and even if the video image is changed continuously, a user can see the position of the interactive control clearly in real time.
Based on the above, when the first video is displayed, the target sub-area containing the interaction control is determined in the video display area of the first video. The video display area of the first video is used for displaying the first video, and if the first video is located on the webpage, the video display area occupies a part of the display area of the webpage. In the video display area, a target sub-area containing the interactive control is determined, and the target sub-area is contained in the video display area, that is, the target sub-area is a partial area of the video display area, or the target sub-area is the video display area.
In practical implementation, the target sub-area may be directly divided from the video display area, or a region object may be generated in the video display area and superimposed on the video display area, and the region within the region object is the target sub-area. The target sub-region may also be understood as a click hotspot of the interaction control.
The interactive control is contained in the target sub-region, and the position of the target sub-region relative to the video display region may be determined based on the position of the interactive control in the video display region. It can be understood that the interaction control is displayed on the video image in the target sub-area, that is, the display level of the interaction control is higher than that of the video image, and the display definition of the interaction control is obviously affected by the video image in the target sub-area. The functionality of the interaction control is typically associated with the first video. For example, if the first video is a short video, the interactive control may be a return control, a pause control, a play progress adjusting control, a play sound adjusting control, or the like; if the first video is a game scene, the interactive control may be a game-related control, such as a skill control, a displacement control, and so forth.
Step S104, determining control display parameters of the interactive control based on the video image in the target sub-area; the control display parameters and the display parameters of the video image in the target sub-area have a specified distance;
the display parameters of the video image may include color parameters, brightness parameters, and the like. In practical implementation, the display parameters of the video image can be extracted, and then the control display parameters are generated. The aforementioned specified distance may be used to characterize the difference between the control display parameters and the display parameters of the video image. In order to make the control display parameter have a specified distance from the display parameter of the video image, a mapping relation or a mapping algorithm of the display parameter having the specified distance may be set in advance.
In a specific implementation manner, multiple sets of display parameters and control display parameters of the video images may be preset, and the display parameters and the control display parameters of the video images in each set have a specified distance. In another mode, a mapping algorithm may be set, the display parameters of the video image are obtained, the display parameters are input to the mapping algorithm, and the control display parameters are obtained through calculation. The specified distance may include a distance between display colors and may also include a distance between display luminances. It can be appreciated that the greater the specified distance, the greater the difference between the display parameters of the control and the display parameters of the video image, and the clearer the display of the interactive control.
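As a rough, hedged illustration of such a mapping algorithm (this sketch is not taken from the patent; the function name and the choice of hue rotation are assumptions), the dominant hue of the video image could be mapped to a control hue that keeps a large hue distance from it:

    // Illustrative sketch only: map the dominant hue of the video image
    // (0-360 degrees) to a control hue with a large, fixed hue distance.
    function mapToControlHue(videoHue) {
      // Rotating by 180 degrees yields the complementary hue, i.e. the
      // largest possible hue distance on the color wheel.
      return (videoHue + 180) % 360;
    }
    // Example: a reddish video region (hue 10) maps to a cyan-like control hue (190).
    console.log(mapToControlHue(10)); // 190

The inversion-based approach described later in this embodiment is one concrete instance of such a mapping.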
And step S106, controlling and displaying the interactive control based on the control display parameter.
And controlling the display of the interactive control through the control display parameter. For example, if the control display parameters include a display color, the interactive control may be set to the display color. In addition, the control display parameters can also control the display of the positions of the background, the edge and the like of the interactive control. Because the display parameters of the control and the display parameters of the video image of the target sub-area have larger differences, the display effect of the interactive control and the display effect of the video image of the target sub-area also have larger differences.
It should be noted that, in the playing process of the first video, the foregoing steps S102 to S106 may be executed multiple times in consideration of the continuous change of the video image, so that the control display parameter changes instantly with the change of the video image, for example, a certain time interval may be set, and the foregoing steps S102 to S106 are executed once each time the time interval is reached.
The display control method for an interactive control responds to the display of a first video by determining a target sub-area containing the interactive control within the video display area of the first video, where the interactive control is used to implement a specified function associated with the first video and is displayed on top of the video image in the target sub-area; determining control display parameters for the interactive control based on the video image in the target sub-area, the control display parameters differing from the display parameters of the video image in the target sub-area by a specified distance; and controlling the display of the interactive control based on the control display parameters. In this way, the control display parameters of the interactive control are determined from the video image and kept a specified distance away from the display parameters of the video image, which avoids the situation in which the interactive control cannot be displayed clearly because its display parameters are close to those of the video image. This improves the visibility of the interactive control, allows the user to quickly locate it and perform video-related interactive operations at any time, and improves the user's video viewing experience.
The following embodiments continue to describe specific implementations of determining a target sub-region.
Loading and displaying the first video in response to a load instruction for the first video; rendering the target sub-region in the video display region of the first video based on the predetermined position parameter of the target sub-region; the position parameters of the target sub-area are determined in advance based on the position of the interactive control in the video display area, and the target sub-area is displayed in a transparent mode.
When the first video is played on the terminal device, the first video needs to be loaded in advance. Taking the video on the webpage as an example, when the user accesses the webpage containing the first video or clicks to play the first video, the terminal device loads the video data of the first video and displays the first video. In the page data of the webpage containing the first video, position parameters of the target sub-area, such as reference coordinates, size and other parameters, can be preset; when the target sub-region is a rectangle, the reference coordinate may be a vertex or a center point of the rectangle, and the size parameter may include a length and a width. After the first video is loaded and displayed, a video display area of the first video can be determined, and then the target sub-area is rendered in the video display area based on the position parameter of the target sub-area.
In a specific implementation manner, the position parameter of the target sub-region may be based on the video display region, for example, a reference coordinate in the position parameter, and a coordinate system may be established with a certain point in the video display region as an origin, and the reference coordinate is expressed by using a coordinate of the coordinate system. As mentioned above, the target sub-region includes the interaction control, and the position of the interaction control in the video display region may also be preset, for example, the interaction control is near the upper left corner of the video display region, and specifically, the position of the interaction control may be represented by coordinates in the foregoing coordinate system.
In the webpage, the target sub-region may specifically be a page element, and after the first video is loaded, the target sub-region is rendered on the video display region of the first video.
As an example, in fig. 2, the target sub-area is disposed at the upper left corner of the video display area, the upper left corner of the target sub-area is used as a reference coordinate, the abscissa is X, the ordinate is Y, and the width and the height are both W, and at this time, the coordinate area of the target sub-area can be represented by [ X, X + W ] and [ Y, Y + W ].
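A minimal layout sketch under the assumptions of FIG. 2 (the class names and the concrete values of X, Y and W are hypothetical; the patent only defines the coordinate area [X, X+W] and [Y, Y+W]):

    // Sketch: place a transparent W x W hot zone at (X, Y) inside the video
    // display area, which is assumed to be a position: relative container.
    const X = 16, Y = 16, W = 48; // example values only
    const hotZone = document.createElement('div');
    hotZone.className = 'hot-zone';
    Object.assign(hotZone.style, {
      position: 'absolute',
      left: X + 'px',            // reference coordinate: upper-left corner
      top: Y + 'px',
      width: W + 'px',
      height: W + 'px',
      background: 'transparent'  // the target sub-area is initially transparent
    });
    document.querySelector('.video-display-area').appendChild(hotZone);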
The target sub-area may be laid out and rendered using HTML and CSS. To prevent the parent element of the target sub-area from collapsing (and thus being hidden) because the first video has not yet stretched it to its size, this embodiment first waits for the video metadata of the first video to be loaded and only then renders the target sub-area.
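A minimal sketch of this ordering, assuming a standard HTML video element (the selector and the renderTargetSubArea() helper are hypothetical names): the target sub-area is rendered only after the video metadata is available.

    // Render the hot zone only once the video's metadata (and hence its
    // intrinsic size) has been loaded, so the parent element is not collapsed.
    const video = document.querySelector('video.first-video');
    video.addEventListener('loadedmetadata', () => {
      renderTargetSubArea(); // hypothetical helper that performs the layout above
    });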
The following embodiments continue to describe specific implementations for determining control display parameters for an interactive control.
Acquiring a video image in a target sub-area; extracting a first tone image of the video image, and determining a second tone image of the interactive control based on the first tone image; wherein the contrast between the second tone image and the first tone image is greater than a preset contrast threshold; and determining the second tone image as a control display parameter of the interaction control.
A plurality of colors may be included in the video image, and the first color tone image may be understood as an image containing the main colors of the video image. The second hue image may be understood as an image containing the dominant color of the interaction control. In practical implementation, a filter may be used to filter out detail information or color information with a small content in the video image, so as to obtain a first-tone image of the video image. And presetting contrast relations among the tone images, and obtaining a tone image with the contrast larger than the contrast threshold value with the first tone image, namely a second tone image based on the contrast relations after the first tone image is obtained. The second tone image is the control display parameter of the interactive control.
Considering that the human eye is sensitive to changes in hue, this approach uses hue as the display parameter, so that the contrast between the hue of the interactive control and the hue of the video image is large; the display effect of the interactive control therefore differs strongly from that of the video image, and the interactive control is displayed clearly.
In the webpage, as can be seen from the foregoing embodiment, after the target sub-region is rendered, the target sub-region is transparently displayed in an initial state. In order to obtain a video image in a target sub-region, firstly creating a canvas object matched with the target sub-region; intercepting a region image corresponding to a target sub-region in a video display region of a first video through the canvas object; and setting the intercepted area image in the target sub-area to obtain a video image in the target sub-area.
The canvas object may specifically be a Canvas object that is implicitly created using the createElement() function of the DOM and has the same size as the target sub-area. The canvas object does not need to be inserted into the HTML of the webpage. Using the canvas object, the area image corresponding to the target sub-area in the video display area can be captured in real time and converted into a Base64-encoded image. Specifically, the getContext() function of the canvas object is used to obtain a 2D drawing context, the drawImage() function is then called to capture the region image corresponding to the target sub-area, the toDataURL() function is called to convert that region image into Base64 image data, and the image data is declared as the background image of the target sub-area using background-image, thereby obtaining the video image in the target sub-area.
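A minimal sketch of this capture step, reusing the video, hotZone, X, Y and W names from the earlier sketches (those names are assumptions); only the Canvas calls named in the text are used:

    // Capture the region of the current video frame that lies under the
    // target sub-area and declare it as the sub-area's background image.
    const canvas = document.createElement('canvas'); // never inserted into the page
    canvas.width = W;
    canvas.height = W;
    const ctx = canvas.getContext('2d');
    // Copy the (X, Y, W, W) region of the video frame into the canvas.
    ctx.drawImage(video, X, Y, W, W, 0, 0, W, W);
    const frameData = canvas.toDataURL('image/png'); // Base64-encoded image data
    hotZone.style.backgroundImage = `url(${frameData})`;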
When extracting the first tone image of the video image, the video image is first blurred to obtain a first image; the first image is then enlarged through the pseudo-element of the target sub-area to obtain a second image; a designated image area of the second image is displayed in the target sub-area and is determined as the first tone image of the video image.
In one approach, the video image may be blurred using a filter function, for example the native filter: blur() function provided by the browser. This function applies a Gaussian blur to the overall color of the video image; the effect is to reduce image noise and the level of detail by convolving the video image with a circular box blur, producing a more accurate out-of-focus imaging effect. In short, inconspicuous details are weakened and prominent details are reinforced. In this way, the detail level of the dominant hue is enhanced, the detail levels of the other hues are weakened, a dynamic processing effect is achieved, and the first image of the video image is obtained.
Processing the first image requires the use of a pseudo-element. A pseudo-element is set in the page element corresponding to the target sub-area; the size of the pseudo-element matches the size of the target sub-area, and the pseudo-element is used to scale the image in the target sub-area with the size of the pseudo-element as a reference.
The pseudo-element can be understood as a child of the target sub-area. Its size can be the same as that of the target sub-area, for example by declaring its width and height to be consistent with the width and height of the target sub-area; in this way, with the size of the target sub-area fixed, the scaling of the pseudo-element is controlled with the size of the target sub-area as a reference. In the initial state, the background of the pseudo-element is declared with the background shorthand property, whose abbreviated order is image path (background-image), image repetition mode (background-repeat), image position (background-position) and image size mode (background-size); this shorthand can declare the background image of the pseudo-element as non-repeating and freely fitted in size.
After the pseudo-element is set, transform: scale() is declared with the transform origin at the center, and the first image is enlarged by a specified factor, for example three times, to obtain a second image. It can be understood that the larger the magnification, the more uniform the subsequently obtained first tone image; the magnification also affects the shade of the first tone image.
The size of the second image is larger than the size of the target sub-area, so that the target sub-area can only display a partial area of the second image, and in actual implementation, which area in the second image is displayed can be set. For example, through a "center" instruction, the target sub-area may be controlled to display a central area of the second image, the central area being the aforementioned designated image area; similarly, the target sub-area can be controlled to display other area images of the second image through other instructions. The designated image area displayed in the target sub-area is the first tone image of the video image.
For the target sub-area, it may be stated in advance in the element corresponding to the target sub-area that the image area overflowing the target sub-area is hidden, and since the second image is an image enlarged by the dummy element, the size of the second image is larger than that of the target sub-area, and at this time, the image area in the second image except for the designated image area is hidden through the target sub-area.
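The blurring, enlargement, centering and overflow clipping described above can be sketched with a few CSS declarations, injected here from JavaScript so the sketches stay in a single language (the class name, the blur radius and the use of a --frame custom property are assumptions; the patent itself only names the properties involved):

    // Sketch: style the hot zone and its pseudo-element so the captured frame
    // is blurred, enlarged three times and clipped to the hot zone.
    const style = document.createElement('style');
    style.textContent = `
      .hot-zone { overflow: hidden; }              /* hide the overflowing image area */
      .hot-zone::after {
        content: '';
        display: block;
        width: 100%; height: 100%;                 /* same size as the target sub-area */
        background: var(--frame) no-repeat center / auto;
        filter: blur(10px);                        /* weaken minor hues, keep the dominant one */
        transform: scale(3);                       /* enlarge to make the hue more uniform */
      }
    `;
    document.head.appendChild(style);
    // Hand the captured frame from the previous sketch to the pseudo-element.
    hotZone.style.setProperty('--frame', `url("${frameData}")`);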
In order to make the contrast between the second tone image and the first tone image greater than the contrast threshold, in this embodiment the first tone image is inverted to obtain the second tone image when determining the second tone image of the interactive control. In actual implementation, an inversion parameter with a value range of 0-100% may be preset: when the parameter is zero the hue is unchanged, and when it is 100% the hue is completely inverted. Specifically, the filter: invert() function may be applied to the first tone image with the 100% inversion parameter to obtain the second tone image. In one example, the first tone image is pink, and after inversion the second tone image is dark green. Pink and dark green are in sharp contrast, so a dark green interactive control displayed on a pink video image can be seen clearly.
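Continuing the sketch above, the inversion can be expressed by appending invert(100%) to the pseudo-element's filter chain (combining blur and invert in a single declaration is an assumption; the patent names filter: invert() on its own):

    // Sketch: obtain the second tone image by inverting the first tone image.
    // The later rule overrides the earlier filter declaration in the cascade.
    style.textContent += `
      .hot-zone::after { filter: blur(10px) invert(100%); }
    `;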
For ease of understanding, FIG. 3 and FIG. 4 are used as examples: in FIG. 3 the video image has a lighter color and the interactive control is darker; in FIG. 4 the video image has a darker tone (represented in the figure by shading), and the interactive control is lighter.
The second tone image of the interactive control is obtained by inverting the first tone image of the video image, so that the two tone images contrast sharply; the interactive control is therefore displayed clearly on the video image, making it easy for the player to find the control and perform interactive operations.
Further, color information of a designated pixel is extracted from the second tone image, and this color information is determined as the control display parameter of the interactive control. The designated pixel may be a pixel at a specific position in the second tone image, for example the center pixel. Specifically, a Canvas object implicitly created with the createElement() function of the DOM may be reused; the width and height of the canvas object are set to be the same as the target sub-area, and the Canvas object does not need to be inserted into the HTML (HyperText Markup Language) code of the webpage. The getContext() function of the canvas object is called to obtain a 2D drawing context, and the drawImage() function is then called to capture the second tone image in the target sub-area and convert it into image data. The pixel information at the middle coordinates of the image data is then analyzed, the R, G and B values are obtained from it and converted into HEX, i.e. a hexadecimal color, to obtain the aforementioned color information. For example, the computed color information may be #0e5042 in hexadecimal notation.
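A minimal sketch of this pixel-reading step. The patent text names createElement(), getContext() and drawImage(); reading the pixel with getImageData() and re-applying the blur/invert chain through the canvas context's filter property are assumptions made here, because a CSS-filtered pseudo-element cannot be captured by a canvas directly.

    // Sketch: read the center pixel of the second tone image and convert its
    // R, G, B values into a hexadecimal color string such as "#0e5042".
    const pickCanvas = document.createElement('canvas'); // same size as the hot zone
    pickCanvas.width = W;
    pickCanvas.height = W;
    const pickCtx = pickCanvas.getContext('2d');
    pickCtx.filter = 'blur(10px) invert(100%)';  // assumption: reproduce the CSS chain
    pickCtx.drawImage(video, X, Y, W, W, 0, 0, W, W);
    const [r, g, b] = pickCtx.getImageData(Math.floor(W / 2), Math.floor(W / 2), 1, 1).data;
    const toHex = (v) => v.toString(16).padStart(2, '0');
    const controlColor = '#' + toHex(r) + toHex(g) + toHex(b);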
When the interactive control is controlled to be displayed based on the control display parameter, acquiring a page element corresponding to the interactive control; and setting the background color of the page element based on the control display parameter, and displaying the interactive control based on the background color. Specifically, a page element for displaying the interactive control is arranged on a parent element of a video element of the first video, and background-color is used to declare the background color of the page element as the color information in the display parameter of the control.
Further, the layer level of the page element carrying the interactive control is set to be higher than the layer level of the first video, and the layer level of the target sub-area is set to be lower than the layer level of the first video. In a webpage, the layer hierarchy can be set with z-index. In the initial state, after the target sub-area is rendered, it floats above the page element of the first video, for example with its layer level z-index set to 9. In actual use the target sub-area does not need to be visible: once the interactive control is displayed through the control display parameters, the interactive control floats above the page element of the first video, for example with its z-index set to 9, while the z-index of the target sub-area is set to -9. The target sub-area then sits below the first video and is hidden by layer overlapping.
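A minimal sketch of these last two steps, reusing controlColor and hotZone from the earlier sketches (the control's selector is a hypothetical name):

    // Declare the computed color as the control's background color and swap
    // the layer levels so the hot zone disappears behind the video.
    const control = document.querySelector('.interactive-control');
    control.style.backgroundColor = controlColor; // e.g. "#0e5042"
    control.style.zIndex = '9';                   // control floats above the first video
    hotZone.style.zIndex = '-9';                  // hot zone is hidden below the first video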
In addition, considering that the video image is continuously changed and the color tone of the video image is continuously changed in the playing process of the first video, based on this, in the playing process of the first video, the step of determining the target sub-area containing the interactive control in the video display area of the first video is executed at intervals of specified duration until the first video is completely played. The specified time period may be set in advance, for example, 500ms.
It should be noted that after the step of determining the target sub-region including the interactive control in the video display region of the first video is executed, the subsequent other steps are automatically executed until the updated control display parameter of the interactive control is obtained, and the interactive control is displayed based on the updated control display parameter of the interactive control.
Since the video content of the first video changes dynamically over time, the control display parameter obtained from the first frame cannot accommodate subsequent changes, so a new control display parameter needs to be generated whenever the video content changes. Assuming the total duration of the first video is T, the control display parameter may be updated every 500 ms; the number of updates is T/500 rounded down, which can be expressed as Math.floor(T/500). The control display parameters of the interactive control of a first video with total duration T therefore have Math.floor(T/500) possibilities. In this way the video content is dynamically identified and the color of the interactive control is controlled, so that the interactive control always remains in a conspicuous state in which the player can accurately find it.
In a specific implementation, a timer may be set to execute the above procedure every 500 ms; the timer starts from an initial count D, is decremented by 1 on each execution, and stops when D reaches 0. This generates [1, D] possible control display parameters, each holding for 500 ms. Because a display parameter could otherwise jump abruptly to another value, which would look very unnatural, transition: all 500ms is declared, so that when the background color of the interactive control changes it does so through a smooth 500 ms transition, compensating for the abrupt color change and making the change of each display parameter look more natural.
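A minimal sketch of this periodic update, where updateControlDisplay() is a hypothetical name standing for the whole pipeline of the earlier sketches (capture, blur, invert, pick color, apply):

    // Re-run the extraction pipeline every 500 ms until the video has played
    // through, and let CSS smooth each background-color change over 500 ms.
    control.style.transition = 'all 500ms';
    let d = Math.floor((video.duration * 1000) / 500); // initial count D
    const timer = setInterval(() => {
      updateControlDisplay();          // hypothetical: steps S102 to S106 above
      if (--d <= 0) clearInterval(timer);
    }, 500);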
The display control method of the interactive control provided by the embodiment has the following advantages:
1. The color of the interactive control adapts well to the color of the video content, solving the problem of the control blending in and out of the picture. The layer-level difference between image layers is used to dynamically generate the inverse dominant hue of the click hot zone and assign it to the background-color property of the interactive control; because the video content keeps changing, the inverse dominant hue is generated continuously and fed to the background-color property of the interactive control at all times, achieving dynamic control of the control's background color.
2. The whole process runs entirely in the browser without any participation of a server. The control display parameters of the interactive control are generated and the control is displayed using the browser's built-in capabilities, with no server-side script execution or data storage. If dynamic processing were done through a server-developed interface, the latency of generating the dynamic effect would grow and the interface would come under pressure as video playback volume increases. Since there is no interaction with a server, there is no need to develop server scripts, use a database, or depend on a storage object; the dominant hue of the image is generated dynamically, the server's load is relieved, and the browser completes the work.
3. The browser is a GUI (Graphical User Interface) processing tool whose built-in filter family of functions is highly optimized for rendering. Compared with running open-source or paid third-party image color-quantization algorithms on a server, this approach is more stable, more efficient and cheaper; there is no need to maintain an image dominant-hue extraction pipeline or to separately develop a series of costly interfaces to achieve the same effect.
4. The overall integration cost is small: only three Canvas-object APIs (Application Programming Interfaces) and four lines of core CSS (Cascading Style Sheets) code are used, namely overflow: hidden, filter: blur(), filter: invert() and transform: scale(). Compared with a server, which would need hundreds of lines of core code to complete the above process, the development cost is extremely low.
5. If the above process were implemented on a server, every frame of the video would have to be read, the inverse dominant hue extracted at the relevant position of every frame to obtain the interactive control for the current frame, and the interactive control then merged back into the video, which is an extremely costly procedure. The technical solution of the invention is client-oriented: all steps are completed dynamically by combining Canvas and CSS on top of the browser's rendering capabilities, giving it clear rendering and execution-cost advantages over a server.
6. The matrix-transformation characteristic of the transform property is used to improve image rendering performance during extraction. Specifically, the cached record is not changed while transform executes: during layer composition the nodes of the current layer are traversed and their new coordinates are computed with a matrix formula. Because these bitmap-based new coordinates can be regarded as a transformation independent of the layer's content, the bitmap cache generated the first time the elements in the layer are rendered can be reused, which improves rendering performance.
Corresponding to the above method embodiment, referring to fig. 5, a schematic structural diagram of a display control apparatus of an interactive control is shown, where the apparatus includes:
a region determining module 50, configured to determine, in response to the display of the first video, a target sub-region containing the interactive control in the video display region of the first video; wherein the interaction control is to: implementing a specified function associated with the first video; the interactive control is displayed on the video image in the target sub-area;
the parameter determining module 52 is configured to determine a control display parameter of the interactive control based on the video image in the target sub-region; the display parameters of the control and the display parameters of the video image in the target sub-area have a specified distance;
and a display control module 54 for controlling the display of the interactive control based on the control display parameter.
The display control apparatus for an interactive control responds to the display of a first video by determining a target sub-area containing the interactive control within the video display area of the first video, where the interactive control is used to implement a specified function associated with the first video and is displayed on top of the video image in the target sub-area; determines control display parameters for the interactive control based on the video image in the target sub-area, the control display parameters differing from the display parameters of the video image in the target sub-area by a specified distance; and controls the display of the interactive control based on the control display parameters. In this way, the control display parameters of the interactive control are determined from the video image and kept a specified distance away from the display parameters of the video image, which avoids the situation in which the interactive control cannot be displayed clearly because its display parameters are close to those of the video image. This improves the visibility of the interactive control, allows the user to quickly locate it and perform video-related interactive operations at any time, and improves the user's video viewing experience.
The area determining module is further configured to: responding to a loading instruction aiming at the first video, and loading and displaying the first video; rendering the target sub-region in the video display region of the first video based on the predetermined position parameter of the target sub-region; the position parameters of the target sub-area are determined in advance based on the position of the interactive control in the video display area, and the target sub-area is displayed in a transparent mode.
The parameter determining module is further configured to: acquiring a video image in a target sub-area; extracting a first tone image of the video image, and determining a second tone image of the interactive control based on the first tone image; wherein the contrast between the second tone image and the first tone image is greater than a preset contrast threshold; and determining control display parameters of the interaction control based on the second hue image.
The parameter determining module is further configured to: creating a canvas object matched with the target sub-region; intercepting a region image corresponding to a target sub-region in a video display region of a first video through a canvas object; and setting the intercepted area image in the target sub-area to obtain a video image in the target sub-area.
The parameter determining module is further configured to: carrying out fuzzy processing on the video image to obtain a first image; amplifying the first image through the pseudo elements of the target subarea to obtain a second image; and displaying a designated image area in the second image in the target sub-area, and determining the designated image area as a first tone image of the video image.
The apparatus further comprises an element setting module configured to: setting a pseudo element in the page element corresponding to the target sub-region; wherein the size of the dummy element matches the size of the target sub-region; the dummy element has the following functions: and carrying out scaling processing on the image in the target sub-area by taking the size of the dummy element as a reference.
The apparatus further comprises a concealment module configured to: concealing an image area other than the designated image area in the second image.
The parameter determining module is further configured to: perform inversion processing on the first tone image to obtain the second tone image.
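For illustration, the inversion step can be realised by inverting each RGB channel of the first tone image; the sketch below operates on ImageData and is only one possible realisation (an equivalent pure-CSS route would be appending invert(1) to the pseudo-element's filter chain). The function name is an assumption.

```typescript
// A minimal sketch of obtaining the second tone image by inverting the first tone image.
function invertImageData(first: ImageData): ImageData {
  const inverted = new ImageData(
    new Uint8ClampedArray(first.data), first.width, first.height
  );
  for (let i = 0; i < inverted.data.length; i += 4) {
    inverted.data[i] = 255 - inverted.data[i];         // R
    inverted.data[i + 1] = 255 - inverted.data[i + 1]; // G
    inverted.data[i + 2] = 255 - inverted.data[i + 2]; // B
    // The alpha channel (i + 3) is left unchanged.
  }
  return inverted;
}
```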
The parameter determining module is further configured to: and extracting color information of the specified pixel from the second tone image, and determining the color information as a control display parameter of the interactive control.
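To turn the second tone image into a concrete control display parameter, the colour of one pixel can be read out and formatted as a CSS colour value. Treating the centre pixel as the specified pixel, and the pickControlColor name, are assumptions for illustration.

```typescript
// A minimal sketch of extracting colour information of a specified (here: centre) pixel.
function pickControlColor(secondTone: ImageData): string {
  const x = Math.floor(secondTone.width / 2);
  const y = Math.floor(secondTone.height / 2);
  const i = (y * secondTone.width + x) * 4;
  const [r, g, b] = [secondTone.data[i], secondTone.data[i + 1], secondTone.data[i + 2]];
  return `rgb(${r}, ${g}, ${b})`; // used as the control display parameter
}
```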
The display control module is further configured to: acquiring page elements corresponding to the interactive controls; and setting the background color of the page element based on the control display parameter, and displaying the interactive control based on the background color.
The apparatus further comprises a hierarchy setting module for: the layer level of the page structure is set to be higher than the layer level of the first video, and the layer level of the target sub-region is set to be lower than the layer level of the first video.
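The display control and hierarchy setting modules described above might look like the following sketch, which sets the background colour of the control's page element and arranges the layer levels; the element ids and the particular z-index values are hypothetical.

```typescript
// A minimal sketch, assuming the control, video, and sub-area elements already exist.
function displayInteractionControl(color: string): void {
  const control = document.getElementById("play-pause-control");
  const video = document.getElementById("video");
  const region = document.getElementById("target-sub-region");
  if (!control || !video || !region) return;

  control.style.backgroundColor = color; // control display parameter as background colour
  video.style.zIndex = "10";
  control.style.zIndex = "20";           // page structure above the first video
  region.style.zIndex = "5";             // target sub-area below the first video
}
```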
The apparatus further comprises an execution module configured to: during playback of the first video, execute the step of determining a target sub-area containing the interactive control in the video display area of the first video at a specified time interval, until the first video finishes playing.
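A possible way to re-run the determination step at a specified time interval until the first video finishes playing is sketched below; the 1000 ms interval and the updateControlColor callback are illustrative assumptions.

```typescript
// A minimal sketch of periodically refreshing the control display parameters.
function scheduleControlUpdates(
  video: HTMLVideoElement,
  updateControlColor: () => void,
  intervalMs = 1000
): void {
  const timer = window.setInterval(() => {
    if (!video.paused && !video.ended) {
      updateControlColor(); // re-run the capture / tone extraction / colour steps
    }
  }, intervalMs);

  // Stop updating once the first video has finished playing.
  video.addEventListener("ended", () => window.clearInterval(timer), { once: true });
}
```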
The present embodiment further provides an electronic device comprising a processor and a memory, the memory storing machine-executable instructions executable by the processor; the processor executes the machine-executable instructions to implement the display control method of the interactive control. The electronic device may be a server or a terminal device.
Referring to fig. 6, the electronic device includes a processor 100 and a memory 101, where the memory 101 stores machine executable instructions capable of being executed by the processor 100, and the processor 100 executes the machine executable instructions to implement the display control method of the interactive control.
Further, the electronic device shown in fig. 6 further includes a bus 102 and a communication interface 103, and the processor 100, the communication interface 103, and the memory 101 are connected through the bus 102.
The memory 101 may include a high-speed Random Access Memory (RAM), and may also include a non-volatile memory, such as at least one magnetic disk memory. The communication connection between this network element of the system and at least one other network element is realized through at least one communication interface 103 (which may be wired or wireless), and the Internet, a wide area network, a local area network, a metropolitan area network, and the like may be used. The bus 102 may be an ISA bus, a PCI bus, an EISA bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one double-headed arrow is shown in FIG. 6, but this does not indicate only one bus or one type of bus.
The processor 100 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 100. The processor 100 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The methods, steps, and logic blocks disclosed in the embodiments of the present invention may be implemented or performed by such a processor. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or a register. The storage medium is located in the memory 101, and the processor 100 reads the information in the memory 101 and completes the steps of the method of the foregoing embodiment in combination with its hardware.
The processor in the electronic device may implement the following operations in the display control method of the interactive control by executing the machine executable instruction:
in response to the display of the first video, determining a target sub-area containing the interactive control in a video display area of the first video; wherein the interactive control is configured to implement a specified function associated with the first video, and the interactive control is displayed on the video image in the target sub-area; determining control display parameters of the interactive control based on the video image in the target sub-area, the control display parameters having a specified distance from the display parameters of the video image in the target sub-area; and controlling the interactive control to be displayed based on the control display parameters.
In the above manner, the control display parameters of the interactive control are determined according to the video image and have the specified distance from the display parameters of the video image, which avoids the situation in which the interactive control cannot be clearly displayed because its display parameters are close to those of the video image. The visibility of the interactive control is thereby improved, the user can quickly find the position of the interactive control and perform video-related interactive operations at any time, and the video watching experience of the user is improved.
Loading and displaying the first video in response to a load instruction for the first video; rendering the target sub-region in the video display region of the first video based on the predetermined position parameter of the target sub-region; the position parameters of the target sub-area are determined in advance based on the position of the interactive control in the video display area, and the target sub-area is displayed in a transparent mode.
Acquiring a video image in the target sub-area; extracting a first tone image of the video image, and determining a second tone image of the interactive control based on the first tone image; wherein the contrast between the second tone image and the first tone image is greater than a preset contrast threshold; and determining control display parameters of the interactive control based on the second tone image.
In this manner, the control display parameters of the interactive control are determined through the tone image. Using tone as the display parameter makes the contrast between the tone of the interactive control and the tone of the video image relatively large, so that the display effect of the interactive control differs noticeably from the display of the video image and the interactive control is displayed clearly.
Creating a canvas object matched with the target sub-region; intercepting a region image corresponding to a target sub-region in a video display region of a first video through a canvas object; and setting the intercepted area image in the target sub-area to obtain a video image in the target sub-area.
Performing blurring processing on the video image to obtain a first image; magnifying the first image through the pseudo-element of the target sub-area to obtain a second image; and displaying a designated image area of the second image in the target sub-area, the designated image area being determined as the first tone image of the video image.
Setting a pseudo-element in the page element corresponding to the target sub-area; wherein the size of the pseudo-element matches the size of the target sub-area, and the pseudo-element is used to scale the image in the target sub-area with the size of the pseudo-element as a reference.
Concealing an image area other than the designated image area in the second image.
And performing inversion processing on the first tone image to obtain the second tone image.
In the above manner, the second tone image of the interactive control is obtained by performing inversion processing on the first tone image of the video image, so that the first tone image and the second tone image form a sharp contrast, the interactive control can be clearly displayed on the video image, and the player can easily find the interactive control and perform interactive operations.
And extracting color information of the specified pixel from the second tone image, and determining the color information as a control display parameter of the interactive control.
Acquiring page elements corresponding to the interactive controls; setting the background color of the page elements based on the control display parameters, and displaying the interactive control based on the background color.
The layer level of the page structure is set to be higher than the layer level of the first video, and the layer level of the target sub-region is set to be lower than the layer level of the first video.
In the playing process of the first video, executing the step of determining a target sub-area containing the interactive control in a video display area of the first video at a specified time interval until the first video is played completely.
In this manner, the display of the interactive control is updated periodically during video playback, so that the interactive control is always kept in a conspicuous state in which the player can accurately find it.
The present embodiments also provide a machine-readable storage medium storing machine-executable instructions that, when invoked and executed by a processor, cause the processor to implement the display control method of the interactive control described above.
When invoked and executed by a processor, the machine-executable instructions stored in the machine-readable storage medium implement the following operations in the display control method of the interactive control:
in response to the display of the first video, determining a target sub-area containing the interactive control in a video display area of the first video; wherein the interactive control is configured to implement a specified function associated with the first video, and the interactive control is displayed on the video image in the target sub-area; determining control display parameters of the interactive control based on the video image in the target sub-area, the control display parameters having a specified distance from the display parameters of the video image in the target sub-area; and controlling the display of the interactive control based on the control display parameters.
In the above manner, the control display parameters of the interactive control are determined according to the video image and have the specified distance from the display parameters of the video image, which avoids the situation in which the interactive control cannot be clearly displayed because its display parameters are close to those of the video image. The visibility of the interactive control is thereby improved, the user can quickly find the position of the interactive control and perform video-related interactive operations at any time, and the video watching experience of the user is improved.
Loading and displaying the first video in response to a load instruction for the first video; rendering the target sub-region in the video display region of the first video based on the predetermined position parameter of the target sub-region; the position parameters of the target sub-area are determined in advance based on the position of the interactive control in the video display area, and the target sub-area is displayed in a transparent mode.
Acquiring a video image in the target sub-area; extracting a first tone image of the video image, and determining a second tone image of the interactive control based on the first tone image; wherein the contrast between the second tone image and the first tone image is greater than a preset contrast threshold; and determining control display parameters of the interactive control based on the second tone image.
In this manner, the control display parameters of the interactive control are determined through the tone image. Using tone as the display parameter makes the contrast between the tone of the interactive control and the tone of the video image relatively large, so that the display effect of the interactive control differs noticeably from the display of the video image and the interactive control is displayed clearly.
Creating a canvas object matched with the target sub-region; intercepting a region image corresponding to a target sub-region in a video display region of a first video through a canvas object; and setting the intercepted area image in the target sub-area to obtain a video image in the target sub-area.
Performing blurring processing on the video image to obtain a first image; magnifying the first image through the pseudo-element of the target sub-area to obtain a second image; and displaying a designated image area of the second image in the target sub-area, the designated image area being determined as the first tone image of the video image.
Setting a pseudo-element in the page element corresponding to the target sub-area; wherein the size of the pseudo-element matches the size of the target sub-area, and the pseudo-element is used to scale the image in the target sub-area with the size of the pseudo-element as a reference.
Concealing an image area other than the designated image area in the second image.
And performing inversion processing on the first tone image to obtain the second tone image.
In the above manner, the second tone image of the interactive control is obtained by performing inversion processing on the first tone image of the video image, so that the first tone image and the second tone image form a sharp contrast, the interactive control can be clearly displayed on the video image, and the player can easily find the interactive control and perform interactive operations.
And extracting color information of the specified pixel from the second tone image, and determining the color information as a control display parameter of the interactive control.
Acquiring page elements corresponding to the interactive controls; setting the background color of the page elements based on the control display parameters, and displaying the interactive control based on the background color.
The layer level of the page structure is set to be higher than the layer level of the first video, and the layer level of the target sub-region is set to be lower than the layer level of the first video.
During the playing process of the first video, executing the step of determining a target sub-area containing the interactive control in a video display area of the first video at a specified time interval, until the first video finishes playing.
In this manner, the display of the interactive control is updated periodically during video playback, so that the interactive control is always kept in a conspicuous state in which the player can accurately find it.
The computer program product of the display control method and apparatus of an interactive control and of the electronic device provided in the embodiments of the present invention includes a computer-readable storage medium storing program code. The instructions included in the program code may be used to execute the method described in the foregoing method embodiments; for specific implementations, reference may be made to the method embodiments, which are not repeated here.
It can be clearly understood by those skilled in the art that, for convenience and simplicity of description, the specific working process of the system and the apparatus described above may refer to the corresponding process in the foregoing method embodiment, and details are not described herein again.
In addition, in the description of the embodiments of the present invention, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly, for example, as a fixed connection, a removable connection, or an integral connection; as a mechanical connection or an electrical connection; or as a direct connection, an indirect connection through an intermediate medium, or an internal communication between two elements. For those skilled in the art, the specific meaning of the above terms in the present invention can be understood according to the specific circumstances.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the foregoing embodiments are merely specific implementations used to illustrate the technical solutions of the present invention rather than to limit them, and the protection scope of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that any person skilled in the art can still modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions of some technical features within the technical scope disclosed by the present invention; such modifications, changes, or substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present invention, and shall all fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (15)

1. A display control method of an interactive control, the method comprising:
in response to the display of a first video, determining a target sub-area containing an interactive control in a video display area of the first video; wherein the interactive control is configured to implement a specified function associated with the first video, and the interactive control is displayed on the video image in the target sub-area;
determining control display parameters of the interactive control based on the video image in the target sub-area; wherein the control display parameters have a specified distance from the display parameters of the video image in the target sub-area;
and controlling the interactive control to be displayed based on the control display parameter.
2. The method of claim 1, wherein the step of determining a target sub-region containing an interactive control in a video display region of a first video in response to display of the first video comprises:
loading and displaying the first video in response to a load instruction for the first video;
rendering a target sub-region in a video display region of the first video based on a predetermined position parameter of the target sub-region; the position parameter of the target sub-area is determined in advance based on the position of the interaction control in the video display area, and the target sub-area is displayed in a transparent mode.
3. The method of claim 1, wherein the step of determining control display parameters of the interactive control based on the video image in the target sub-region comprises:
acquiring a video image in the target sub-area;
extracting a first tone image of the video image, and determining a second tone image of the interactive control based on the first tone image; wherein a contrast between the second tone image and the first tone image is greater than a preset contrast threshold;
and determining control display parameters of the interactive control based on the second tone image.
4. The method of claim 3, wherein the step of acquiring the video image in the target sub-region comprises:
creating a canvas object matching the target sub-region;
intercepting a region image corresponding to the target sub-region in a video display region of the first video through the canvas object; and setting the intercepted area image in the target sub-area to obtain a video image in the target sub-area.
5. The method of claim 3, wherein the step of extracting a first tone image of the video image comprises:
performing blurring processing on the video image to obtain a first image;
magnifying the first image through the pseudo-element of the target sub-region to obtain a second image; and displaying a designated image area of the second image in the target sub-region, and determining the designated image area as a first tone image of the video image.
6. The method of claim 5, wherein before the step of magnifying the first image through the pseudo-element of the target sub-region, the method further comprises:
setting a pseudo-element in the page element corresponding to the target sub-region; wherein the size of the pseudo-element matches the size of the target sub-region, and the pseudo-element is used to scale the image in the target sub-region with the size of the pseudo-element as a reference.
7. The method of claim 5, wherein after the step of displaying the designated image area of the second image in the target sub-region and determining the designated image area as the first tone image of the video image, the method further comprises: concealing an image area other than the designated image area in the second image.
8. The method of claim 3, wherein determining the second tone image of the interactive control based on the first tone image comprises: performing inversion processing on the first tone image to obtain the second tone image.
9. The method of claim 3, wherein determining control display parameters for the interactive control based on the second tone image comprises:
and extracting color information of a specified pixel from the second tone image, and determining the color information as a control display parameter of the interactive control.
10. The method of claim 1, wherein controlling the display of the interactive control based on the control display parameter comprises:
acquiring page elements corresponding to the interactive controls;
and setting the background color of the page element based on the control display parameter, and displaying the interactive control based on the background color.
11. The method of claim 10, wherein after the step of displaying the interaction control based on the background color, the method further comprises:
and setting the layer level of the page structure to be higher than the layer level of the first video, and setting the layer level of the target sub-region to be lower than the layer level of the first video.
12. The method of claim 1, further comprising:
during the playing process of the first video, executing the step of determining a target sub-area containing an interactive control in a video display area of the first video at a specified time interval, until the first video finishes playing.
13. An apparatus for controlling display of an interactive control, the apparatus comprising:
the area determination module is used for determining, in response to the display of a first video, a target sub-area containing an interactive control in a video display area of the first video; wherein the interactive control is configured to implement a specified function associated with the first video, and the interactive control is displayed on the video image in the target sub-area;
the parameter determining module is used for determining control display parameters of the interactive control based on the video image in the target sub-area; wherein the control display parameters have a specified distance from the display parameters of the video image in the target sub-area;
and the display control module is used for controlling the display of the interactive control based on the control display parameter.
14. An electronic device comprising a processor and a memory, the memory storing machine executable instructions executable by the processor, the processor executing the machine executable instructions to implement the method of display control of an interactive control of any of claims 1-12.
15. A machine-readable storage medium having stored thereon machine-executable instructions which, when invoked and executed by a processor, cause the processor to implement the method of display control of an interaction control of any of claims 1-12.
CN202210692473.7A 2022-06-17 2022-06-17 Display control method and device of interactive control and electronic equipment Pending CN115314754A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210692473.7A CN115314754A (en) 2022-06-17 2022-06-17 Display control method and device of interactive control and electronic equipment


Publications (1)

Publication Number Publication Date
CN115314754A true CN115314754A (en) 2022-11-08

Family

ID=83854518

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210692473.7A Pending CN115314754A (en) 2022-06-17 2022-06-17 Display control method and device of interactive control and electronic equipment

Country Status (1)

Country Link
CN (1) CN115314754A (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170161862A1 (en) * 2015-12-04 2017-06-08 Le Holdings(Beijing)Co., Ltd. Method and electronic device for adding watermark to video
CN108737878A (en) * 2017-04-18 2018-11-02 谷歌有限责任公司 The method and system of user interface color is changed for being presented in conjunction with video
CN113660514A (en) * 2017-04-18 2021-11-16 谷歌有限责任公司 Method and system for modifying user interface color in conjunction with video presentation
US20210193184A1 (en) * 2018-12-07 2021-06-24 Tencent Technology (Shenzhen) Company Limited Image information processing method and apparatus, and storage medium
CN111050202A (en) * 2019-11-22 2020-04-21 北京达佳互联信息技术有限公司 Video processing method, video processing device, electronic equipment and medium
US20210168473A1 (en) * 2019-11-28 2021-06-03 Beijing Dajia Internet Information Technology Co., Ltd. Method and Apparatus for Synthesizing Video
CN114327214A (en) * 2022-01-05 2022-04-12 北京有竹居网络技术有限公司 Interaction method, interaction device, electronic equipment, storage medium and computer program product


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination