CN115098012A - Display method, electronic device, and computer program product - Google Patents

Display method, electronic device, and computer program product Download PDF

Info

Publication number
CN115098012A
CN115098012A · Application CN202210730428.6A
Authority
CN
China
Prior art keywords
window
content
user
trigger
response
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210730428.6A
Other languages
Chinese (zh)
Inventor
付聪 (Fu Cong)
张树悦 (Zhang Shuyue)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202210730428.6A priority Critical patent/CN115098012A/en
Publication of CN115098012A publication Critical patent/CN115098012A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present disclosure provide video display methods, electronic devices, and computer program products. The method may include displaying first content within a first window. The method may also include, in response to receiving a first trigger from the user, displaying a second window in floating-window form on top of the displayed first window, where the second window displays second content different from the first content. Further, the method may include updating a presentation state of the second window in response to a trigger of a target event associated with the first content. According to embodiments of the present disclosure, the user can still be informed of key events in the first content of the first window while watching the application content of the second window, thereby improving the user experience.

Description

Display method, electronic device, and computer program product
Technical Field
Embodiments of the present disclosure relate to the field of computers, and more particularly, to a display method, an electronic device, and a computer program product.
Background
With the rise of 5G technology and improvements in computing device performance, user equipment can now provide users with real-time interactive systems that display multiple video streams. For example, current user equipment typically supports a multi-window display mode. However, for application scenarios in which the display content on each user's equipment must be changed in real time in response to control instructions from one or more users, how to optimize the display and operation interface of the user equipment is an urgent problem to be solved.
Disclosure of Invention
Embodiments of the present disclosure provide a plurality of display schemes.
In a first aspect of the present disclosure, a display method is provided. The method may include displaying first content within a first window. The method may also include, in response to receiving a first trigger of a user, displaying a second window in floating window form on the displayed first window, wherein the second window is for displaying second content different from the first content. Further, the method may include updating a presentation state of the second window in response to a trigger of a target event associated with the first content.
In a second aspect of the present disclosure, an electronic device is provided, comprising a processor; and a memory coupled with the processor, the memory having instructions stored therein that, when executed by the processor, cause the electronic device to perform acts comprising: displaying first content in a first window; in response to receiving a first trigger of a user, displaying a second window in a floating window form on the displayed first window, wherein the second window is used for displaying second content different from the first content; and updating the presentation state of the second window in response to a trigger of a target event associated with the first content.
In a third aspect of the disclosure, there is provided a computer program product tangibly stored on a computer-readable medium and comprising machine executable instructions that, when executed, cause a machine to perform any of the steps of the method according to the first aspect.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the disclosure, nor is it intended to be used to limit the scope of the disclosure.
Drawings
The foregoing and other objects, features and advantages of the disclosure will be apparent from the following more particular descriptions of exemplary embodiments of the disclosure as illustrated in the accompanying drawings wherein like reference numbers generally represent the same or similar parts throughout the exemplary embodiments of the disclosure. In the drawings:
FIG. 1 shows a schematic diagram of an example environment, in accordance with embodiments of the present disclosure;
FIG. 2 shows a flow diagram of a process for video display according to an embodiment of the present disclosure;
FIGS. 3A-3E illustrate schematic diagrams of a plurality of states of a window according to an embodiment of the present disclosure;
FIG. 4 shows a schematic diagram of a system aggregating a plurality of different players, in accordance with an embodiment of the present disclosure; and
FIG. 5 illustrates a block diagram of an example device that can be used to implement embodiments of the present disclosure.
Detailed Description
The principles of the present disclosure will be described below with reference to a number of example embodiments shown in the drawings.
The term "include" and variations thereof as used herein are inclusive and open-ended, i.e., "including but not limited to". The term "or" means "and/or" unless specifically stated otherwise. The term "based on" means "based at least in part on". The terms "one example embodiment" and "one embodiment" mean "at least one example embodiment". The term "another embodiment" means "at least one additional embodiment". The terms "first," "second," and the like may refer to different or the same objects. Other explicit and implicit definitions may also be included below.
Further, a "window" as used herein denotes an area of the display interface of a user device in which corresponding application content is displayed. The "content" referred to herein is a video stream generated on the user equipment from the display data of the corresponding application. It will be appreciated that the display of the video stream is typically real-time, or delayed only briefly. Further, a user's "trigger" refers to input data generated by the user, for example by touching the display screen or by operating an analog device used to simulate a field device.
As discussed above, consider an application scenario that requires changing the display content on each user's device in real time in response to control instructions from one or more users. When a user is performing an operation in an application (e.g., a game application) and wishes to start a player embedded in that application (e.g., a live video player) and watch the video content it plays, the user device typically loads the corresponding player in response to the user's selection. For example, the user device may pause the process of the application and start the process of the player. At that point, the user can no longer see the video content of the application, and the user device may not even be able to obtain real-time data associated with the application.
In a conventional application scenario such as remote control, a user typically performs live operations through a first window used for remote control. If the user needs to open a second window for a teleconference in order to communicate requirements or provide operation guidance, the user equipment usually suspends the process of the first window and starts the process of the second window. When the user later needs to view the video content of the first window, the user device typically reloads the first window and presents the real-time video content. However, the loading process is time-consuming, and the user may miss all live information from the period during which the process of the first window was paused.
Similarly, in a scenario such as a game application, a user typically plays a real-time game by operating a first window that displays a game interface such as a battle interface. If, at a moment when the user does not need to respond quickly and accurately to events in the game, the user wishes to watch live video by opening a second window for playing it, the user device will typically pause the process of the first window and start the process of the second window. When the user needs to return to the game content in the first window, the user device typically reloads the first window and presents the real-time content. Alternatively, the user may leave the game running, but may then miss game events in the first window while viewing the content of the second window.
To at least partially solve the above problems, embodiments of the present disclosure provide a content display scheme. First, when a user performs an interactive operation through a first window, if the user instructs to open a second window in a floating window form, the user device may display the second window in the floating window form on top of the first window. In other words, the second window may cover at least a portion of the first window at a predetermined transparency (e.g., opaque or translucent). At this time, the user may view the application content displayed in the second window, and the user device is further configured to receive real-time data associated with the first content and change a presentation state of the second window based on the real-time data. It should be understood that the "first window" may be a window on the user device for outputting video and audio information of an application, and the "second window" may be a window of a player plug-in of the application, or may also be a window for outputting video and audio information of other applications. At this time, the user may perform an interactive operation on the second window, and the first window may not receive an interactive input from the user, but when an event associated with the first content is triggered, the user may know the event through a change in a display state of the second window.
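The window lifecycle described above can be summarized as a small state machine. The following is a minimal, hypothetical sketch (not taken from the patent's claims); the state names and method names are assumptions chosen to mirror the states introduced later in the description:

```python
from dataclasses import dataclass
from enum import Enum, auto

class WindowState(Enum):
    """Hypothetical presentation states of the second (floating) window."""
    HIDDEN = auto()
    FLOATING = auto()      # small floating window over the first window
    LARGE_SCREEN = auto()  # covers most of the first window, edges still visible
    PLAY_DETAILS = auto()  # fully covers the first window

@dataclass
class SecondWindow:
    state: WindowState = WindowState.HIDDEN

    def on_first_trigger(self) -> None:
        # User taps the trigger region: open as a floating window.
        if self.state is WindowState.HIDDEN:
            self.state = WindowState.FLOATING

    def on_maximize(self) -> None:
        # User taps the maximize control inside the floating window.
        if self.state is WindowState.FLOATING:
            self.state = WindowState.LARGE_SCREEN

    def on_outside_click(self) -> None:
        # Clicking the uncovered part of the first window shrinks the window.
        if self.state is WindowState.LARGE_SCREEN:
            self.state = WindowState.FLOATING
```

In this sketch the first window never stops running; only the second window's presentation state changes, which is the core of the scheme.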
Through the above operations, the user can still be informed of key events in the first content of the first window while watching the application content of the second window, can switch purposefully between two or more windows, and thus enjoys a markedly improved user experience.
FIG. 1 illustrates a schematic diagram of an example environment 100 in which devices and/or processes according to embodiments of the present disclosure may be implemented. As shown in FIG. 1, example environment 100 may include a first user 110, a second user 120, a first user device 111, a second user device 121, a first user interface 112, a second user interface 122, a network 130, and a server 140.
In FIG. 1, the first user equipment 111 and the second user equipment 121 are each communicatively connected to the network 130, and the network 130 is communicatively connected to the server 140, forming a communication path between the user equipment and the server. The first user 110 interacts with the first user device 111 via the first user interface 112. As an example, the first user interface 112 may be touched or clicked by the first user 110 to receive instructions, and may display a corresponding video or user operation interface to the first user 110 to enable interaction with the device. Similarly, the second user 120 interacts with the second user device 121 through the second user interface 122; the second user interface 122 may likewise be touched or clicked to receive instructions from the second user 120 and may display a corresponding video or user operation interface.
In some embodiments, each of the first user device 111 and the second user device 121 has at least a first application and a player embedded in the first application, and each of the first user 110 and the second user 120 is a user of the first application.
In some embodiments, the first user device 111 and the second user device 121 may be any device having computing and communication capabilities. As non-limiting examples, the first user device 111 and the second user device 121 may be any type of fixed, mobile, or portable computing device, including but not limited to mobile phones, desktop computers, laptop computers.
In some embodiments, server 140 may be any device having computing and communication capabilities. All or some of the components of server 140 may be distributed in the cloud. The server 140 and its connected nodes may employ a cloud-edge architecture.
In certain embodiments, server 140 may include memory for at least storing the processed, structured data. These memories may be replaced by other various types of devices with storage capabilities, including, but not limited to, hard disks (HDDs), Solid State Disks (SSDs), removable disks, any other magnetic storage device, and any other optical storage device, or any combination thereof.
Fig. 1 is intended to illustrate only some concepts of the disclosure, and is not intended to limit the scope of the disclosure.
A process of data processing according to an embodiment of the present disclosure will be described in detail below with reference to fig. 2. For ease of understanding, specific data mentioned in the following description are exemplary and are not intended to limit the scope of the present disclosure. It is to be understood that the embodiments described below may also include additional acts not shown and/or may omit acts shown, as the scope of the disclosure is not limited in this respect.
Fig. 2 shows a flow chart of a process of video display according to an embodiment of the present disclosure. In some embodiments, process 200 may be implemented in first user device 111 or second user device 121 (hereinafter collectively referred to as "user devices") in fig. 1. A process 200 for video display according to an embodiment of the disclosure is now described with reference to fig. 2. For ease of understanding, the specific examples set forth in the following description are intended to be illustrative, and are not intended to limit the scope of the disclosure.
As shown in fig. 2, at 202, a user device may display first content within a first window. It should be appreciated that the first content displayed in the first window may change as a result of user interaction input. As an example, the first content may be a target game play for a certain game application, and the operation of displaying the first content within the first window may be displaying at least part of a game interface of the target game play within the first window. To describe this process more clearly, the display interface of the first window is described first with reference to fig. 3A. Fig. 3A illustrates a schematic diagram of a display state of a first window 310 according to an embodiment of the present disclosure. It should be understood that although an example of two windows is shown in fig. 3A or other related figures, the number of windows is not limited thereto and embodiments of the present disclosure may include more windows.
As shown in fig. 3A, the first window 310 may include at least control regions 320 and 321 and a trigger region 330 of the second window. Almost all of the area on the first window 310 may be used to display the first content. The control areas 320, 321 are used to receive user interactive input, and the user may implement interactive input such as touch, click, drag, etc. at a specific location in the control areas 320, 321. Alternatively or additionally, the user may also complete the interactive input through an external device connected to the user equipment.
In some embodiments, the first window corresponds to a remote operation application, and the user's interactive input is performed by the user operating an external device connected to the user equipment. In other embodiments, the first window corresponds to a game application, and the user may perform an interactive input by touching, clicking, or dragging a corresponding position in the control area 320, 321, so that a virtual object in the game application may be controlled to perform a corresponding operation. It should be appreciated that data associated with this operation may be used to update the first content.
Further, the trigger area 330 may be arranged as a reminder icon embedded on the first window 310, and the user may open an additional window for presenting the plug-in player or other application by touching or clicking on the trigger area 330.
Returning to fig. 2, at 204, the user device may determine in real-time whether a first trigger of the user is received, and if it is determined that the first trigger of the user is received, proceed to 206.
At 206, the user device may display a second window in floating window form on the displayed first window 310. In some embodiments, the second window is for displaying second content different from the first content, and the second window is in a floating window state. To more clearly describe this process, a display interface including a second window is described first with reference to FIG. 3B. Fig. 3B shows a schematic diagram of a floating window state 340 of a second window, in accordance with an embodiment of the present disclosure.
Similar to FIG. 3A, FIG. 3B includes the first window 310, the control regions 320 and 321, and the trigger region 330 of the second window, and additionally includes the second window itself. It should be appreciated that the second window may be arranged in floating-window form over any region of the first window 310; as shown in FIG. 3B, the floating-window state 340 does not prevent the user from viewing and operating the first window 310. Further, the second window contains second content different from the first content, and almost the entire area of the floating-window state 340 may be used to display that content. Alternatively or additionally, the user may perform some operations on the floating-window state 340. As an example, the user may adjust its size and position as desired, and may perform operations such as sound adjustment, muting, and window maximization by touching or clicking the control region 341 within the floating-window state 340.
Returning to FIG. 2, at 208, the user device may determine in real-time whether a target event associated with the first content is triggered. If it is determined that the target event is triggered, 210 is entered.
At 210, the user device may update the presentation status of the second window. In some embodiments, when the target event message is detected in the first content, a presentation state of a preset region of the second window is updated, the presentation state including at least one of color, brightness, and flicker of the preset region. In some embodiments, the second window may be in a large screen state in the form of a floating window, i.e., the size of the second window is more than 70% of the size of the first window and does not completely cover the first window. To more clearly describe this process, a display interface including a second window is described first with reference to FIG. 3C. Fig. 3C shows a schematic diagram of a large screen state 350 of a second window, in accordance with an embodiment of the present disclosure. It should be appreciated that the large screen state 350 may be considered a deformation of the floating window state 340, which is itself still a floating window.
As shown in FIG. 3C, in this scenario the user is watching live video content in the second window in the large screen state 350 even though a game session is in progress. When the user equipment detects the target event message in the first content, which is updated in real time, it may remind the user in any of the manners described herein, so that the user can return directly to the game application displayed in the first window by clicking a control of the second window or the portion of the game interface of the first window visible at the edge of the second window.
In some embodiments, the user device may determine the position of the target event in a corresponding map of the target game play (e.g., its approximate direction, such as toward the left side of the first window), determine a preset region for the reminder based on the relationship between that position and the second window or the virtual object controlled by the user, and update the presentation state of that preset region of the second window in any of the manners described above. As an example, in the scenario shown in FIG. 3C, when the user device detects a hostile virtual object in the game space in a certain direction relative to the target virtual object controlled by the user, it may make the upper edge or another position of the second window blink (e.g., the window edge on the side from which the hostile virtual object appears), or place a prompt at that position. Further, the user device may monitor the user's operations in real time; if no operation input is received, the second window may be reduced to the floating-window state 340 or smaller so that the game interface in the first window is presented.
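The mapping from an event's direction to the window edge that should blink can be sketched as follows. This is an illustrative assumption (the patent does not specify angles or quadrant boundaries); here 0° points up and angles increase clockwise:

```python
def edge_to_highlight(event_angle_deg: float) -> str:
    """Map the direction of a target event (relative to the player-controlled
    virtual object, 0 degrees = up, clockwise) to the second-window edge
    whose presentation state should change (blink, color, brightness)."""
    a = event_angle_deg % 360
    if a >= 315 or a < 45:
        return "top"
    if a < 135:
        return "right"
    if a < 225:
        return "bottom"
    return "left"
```

With this mapping, a hostile object approaching from the player's left would cause the left edge of the floating window to blink, hinting at the direction without leaving the second content.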
In some embodiments, alternatively or additionally, if a target event message is detected in the first content, the user device may present an interactive interface over the second content and present third content in it, the third content being associated with the target event. As an example, as shown in FIG. 3D, when another user sends a friend request that appears in the first content of the first window 310, the user equipment may detect the corresponding event message in the first content. The user device may then present the interactive interface 352 in the large screen state 350 of the second window, showing the other user's friend request together with options such as accept, defer, and decline for the user to choose from. In other embodiments, the user device may present the same interactive interface in the floating-window state 340 of the second window. In still other embodiments, the target event may be, among other things, a team-up invitation from a game friend rather than a friend request. In this way, the user can receive prompt messages from the first content while watching the second content in the second window and respond to them without pausing playback, thereby improving the user experience.
In some embodiments, as shown in FIG. 3D, after the third content is presented in the interactive interface 352, if a trigger such as accept, defer, or decline is received from the user within the interactive interface, the user device closes the interactive interface 352 and sends the operation corresponding to the trigger to the first content. As an example, after presenting third content in the form of a team-up message from a game friend, if the user clicks the confirm button in the message, the interactive interface is closed and information associated with the confirmation is sent to the first content so that the target virtual object controlled by the user joins the friend's team. Alternatively or additionally, the user device closes the interactive interface 352 if no trigger is received from the user within a preset length of time. In addition, in the game scenario above, when the game play in the first window has finished and the user is watching live video in the second window, if an invitation to the next game play arrives in the first window, an interactive interface asking whether to accept it may be displayed over the second window; after the user accepts, the interface is closed, and the user may continue watching the live video in the second window until the matchmaking stage of the game play is finished. At that point, the interactive interface may be opened again in the second window to prompt the user to enter the game play, or the second window may be reduced directly to a small window such as the floating-window state 340 to show the user the page in the first window for entering the session. Throughout these processes, the user can essentially keep watching the second content of the second window without interruption, thereby improving the user experience.
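The interactive interface's lifecycle (close on user response, or close silently after a preset timeout) can be sketched as below. This is a minimal illustration under stated assumptions; the class and option names are hypothetical, and the clock is injected so the timeout can be tested deterministically:

```python
import time

class InteractiveInterface:
    """Sketch of the overlay (e.g., interface 352) shown over the second
    content. It closes either when the user responds or after `timeout_s`
    seconds without any input."""

    def __init__(self, timeout_s: float, now=time.monotonic):
        self.timeout_s = timeout_s
        self._now = now
        self.opened_at = now()
        self.open = True
        self.response = None  # e.g. "accept", "defer", "decline"

    def respond(self, choice: str) -> None:
        # User trigger inside the interface: record it and close; the caller
        # would forward the corresponding operation to the first content.
        if self.open:
            self.response = choice
            self.open = False

    def tick(self) -> None:
        # Polled periodically: close silently if no response arrived in time.
        if self.open and self._now() - self.opened_at >= self.timeout_s:
            self.open = False
```

A caller would check `response` after the interface closes: a value means an operation must be sent to the first content; `None` means the prompt timed out.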
In some embodiments, after updating the presentation state of the second window, if the user trigger is not received within a preset time period, the type of the target event is determined, and when the target event is of the first type (for example, an enemy virtual object in the game is close to the target virtual object controlled by the user), the second window is adjusted to a first preset size, for example, the second window is minimized.
In some embodiments, after the presentation state of the second window has been updated, a trigger by the user in the first window may be received, and the second window is then adjusted to the first preset size. For example, while the user is watching live video in the second window, if the user wishes to return to the game application in the first window, the user may touch a minimize control of the second window or any area outside the second window, so that the first content of the first window becomes the primary display.
In some embodiments, after displaying the second window in the floating window form on the displayed first window, the user equipment may receive a trigger of the user on the second window, and adjust the second window to a second preset size. As an example, the user may touch the control area 341 in the second window to perform a window maximization operation on the second window to make the second window reach a second preset size. It is understood that the second preset size may be set to more than 70% of the size of the first window, such as the large screen state 350 in fig. 3C. In this manner, the user may view the application content in the large screen state 350 of the second window while still receiving the display information of the first window 310, thereby improving the user experience.
In some embodiments, the user device may determine in real-time whether a user click operation is received on an area not covered by the large screen state 350. If it is determined that a user click operation on an area not covered by the large-screen state 350 is received, the second window is transitioned from the large-screen state 350 to the floating-window state 340. In this way, the user can effect the transition of the second window from the large screen state 350 to the floating window state 340 by a simple operation.
In some embodiments, the user device may also determine whether a particular event is triggered in the first window 310 when the second window is in the floating window state 340 or the large screen state 350, and if so, send a reminder to the user. As an example, the specific event may be a prompt from a live worker in a remote operation scenario, or a game value above or below a predetermined threshold in a game application scenario. In this way, the user can quickly return to view the first window and make a response action according to the prompt, and therefore the user experience is improved.
In some embodiments, the alert may include at least one of a flashing of a border of the first window 310, a flashing of a border of the large screen state 350 of the second window, a shaking of a frame of the first window 310, a shaking of a frame of the large screen state 350 of the second window, an operation to transition the second window from the large screen state 350 to the floating window state 340, and the like.
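The threshold-based trigger mentioned above (a game value above or below a predetermined threshold) can be sketched as a simple predicate; the function name and parameters are illustrative only.

```typescript
// Hypothetical sketch: decide whether a game value crossing a
// predetermined threshold range should trigger a reminder to the user.
function shouldRemind(value: number, min: number, max: number): boolean {
  // A value strictly below the lower threshold or strictly above the
  // upper threshold triggers the reminder for the first window.
  return value < min || value > max;
}
```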
In some embodiments, the user device may also transition the second window to a play detail state in response to receiving a user trigger, the play detail state containing at least the second application content and one or more video setting options. To more clearly describe this process, a display interface including a second window is described first with reference to fig. 3E. Fig. 3E shows a schematic diagram of a play detail state 360 of the second window according to an embodiment of the present disclosure.
As shown in fig. 3E, the play details state 360 of the second window completely covers the first window, and the play details state 360 may include at least a video play area 361 and video activity areas 362, 363. The play details state 360 may be configured to display play details of the played application content, such as various play settings. In addition, the play details state 360 can also be used for interactive operations such as bullet-screen (danmaku) messages and comments.
In some embodiments, the second window loads the second application content only once when transitioning between the floating-window state 340, the large-screen state 350, and the play details state 360. As an example, the same front-end page may be used for the second window in all three states, with the page layout switched according to the modality set by the user or by the client. In this way, the second window does not need to be reloaded when switching among these states, which reduces waiting time and improves the user experience.
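The load-once behavior can be sketched as a front-end page that caches its content across modality switches. This is a minimal illustration under assumed names (`FrontPage`, `Modality`, `loadContent`); the real front-end page is not specified in the disclosure.

```typescript
// Hypothetical sketch: one front-end page loads the second application
// content once and only switches its layout per display modality.
type Modality = "floating" | "largeScreen" | "playDetails";

class FrontPage {
  private loads = 0;
  private content: string | null = null;

  // The expensive load (player, media session, etc.) happens at most once.
  private loadContent(): string {
    this.loads++;
    return "second application content";
  }

  // Switching modality only changes the layout; the content is reused.
  show(modality: Modality): string {
    if (this.content === null) this.content = this.loadContent();
    return `[${modality}] ${this.content}`;
  }

  get loadCount(): number { return this.loads; }
}
```

Cycling through all three modalities with this design leaves `loadCount` at 1, which is the waiting-time reduction the paragraph describes.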
In some embodiments, the generating operation of the first window and the generating operation of the second window are performed in different processes. It should be understood that, in order to avoid memory overflow in the first application's process, the process performing the image generation operation for the second window should be independent of the process performing the image generation operation for the first window. To achieve this, the hosting container (such as a WebView) for the second window needs to run in a separate process.
The play details state 360 can be handled with normal processing, because it is full-screen and completely covers the play interface of the first application. For the floating window state 340 and the large screen state 350, while the main process runs the application, an independent secondary process can run in the background to complete the rendering operation, and the rendered picture is drawn into the display area of the main process. As an example, the initialization of the secondary process may be initiated by the main process during an initialization phase. When the user indicates to open the second window, the main process starts the rendering operation of the secondary process, and the processed data is returned to the main process to complete the drawing operation. During this process, the user device may periodically (e.g., every 1000/30 ms, i.e., approximately every 33 ms) determine whether to stop drawing; when the user indicates to close the second window, the secondary process may stop data processing and destroy the corresponding data, and the second window is finally removed. In this way, the data processing of the first window is independent of that of the second window, so that the running of the application is not affected by opening the second window, thereby improving the user experience.
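The open/tick/close lifecycle above can be sketched in miniature. For illustration the two processes are modeled as plain objects rather than real OS processes, and all names (`MainProcess`, `SecondaryRenderer`, `tick`) are hypothetical.

```typescript
// Hypothetical sketch of the main/secondary split: the secondary renderer
// produces frames, the main process draws them into its display area.
class SecondaryRenderer {
  render(frame: number): string {
    return `frame-${frame}`; // stand-in for a rendered picture
  }
  destroy(): void {
    // release rendering resources and destroy corresponding data
  }
}

class MainProcess {
  private renderer: SecondaryRenderer | null = null;
  drawn: string[] = [];

  // User indicates to open the second window: start the secondary renderer.
  openSecondWindow(): void {
    this.renderer = new SecondaryRenderer();
  }

  // Called roughly every 1000/30 ms (~33 ms, i.e. 30 fps) while open;
  // a null renderer means drawing has stopped.
  tick(frame: number): void {
    if (this.renderer) this.drawn.push(this.renderer.render(frame));
  }

  // User indicates to close the second window: stop and clean up.
  closeSecondWindow(): void {
    this.renderer?.destroy();
    this.renderer = null;
  }
}
```

The periodic `tick` is where the device "determines whether to stop drawing": once the renderer is torn down, subsequent ticks draw nothing, so the first window's own processing is unaffected.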
With the above embodiments, the user can receive the update prompt of the first content from the first window while the second window is in the floating-window state, so that the user can attend to two or more windows at once. In addition, the user can respond to a specific event in the first window while the second window is in the floating window state or the large screen state, without interrupting playback of the video content in the second window, thereby improving the user experience. Moreover, by setting a preset reminder condition for the first window, the user can quickly return to view the first window and respond according to the prompt. Finally, the processing of the display data of the first window is independent of the processing of the display data of the second window, so that the two applications do not interfere with each other and can be switched quickly.
Fig. 4 shows a schematic diagram of a system 400 for aggregating a plurality of different players in accordance with an embodiment of the present disclosure. It should be understood that an embedded player, such as the second window, may aggregate players from multiple different playback platforms. As shown in fig. 4, the system 400 may include a plurality of players, such as a first player 410, a second player 420, a third player 430, and so on. The players come from different video playing platforms. Due to differences between platforms (such as API differences), the first player 410, the second player 420, the third player 430, and so on are integrated with the inline playing page 440, and the inline playing page 440 is encapsulated in the multi-platform aggregation front-end page 450. Thus, the second application window 460 may interact with the multi-platform aggregation front-end page 450 to perform control operations such as pause, resume, and mute on video from the first player 410, the second player 420, the third player 430, and so on.
Fig. 5 illustrates a block diagram of an example device 500 that may be used to implement embodiments of the present disclosure. For example, the electronic device 500 may be used to implement the first user device 111 or the second user device 121 in fig. 1. As shown, the electronic device 500 includes a central processing unit (CPU) 501 that may perform various suitable actions and processes according to computer program instructions stored in a read-only memory (ROM) 502 or computer program instructions loaded from a storage unit 508 into a random access memory (RAM) 503. The RAM 503 may also store various programs and data required for the operation of the device 500. The CPU 501, the ROM 502, and the RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
A number of components in the device 500 are connected to the I/O interface 505, including: an input unit 506 such as a keyboard, a mouse, or the like; an output unit 507 such as various types of displays, speakers, and the like; a storage unit 508, such as a magnetic disk, optical disk, or the like; and a communication unit 509 such as a network card, modem, wireless communication transceiver, etc. The communication unit 509 allows the device 500 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The processing unit 501 performs the various methods and processes described above, such as the process 200. For example, in some embodiments, the various methods and processes described above may be implemented as a computer software program or computer program product tangibly embodied in a machine-readable medium, such as storage unit 508. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 500 via the ROM 502 and/or the communication unit 509. When the computer program is loaded into RAM 503 and executed by CPU 501, one or more steps of any of the processes described above may be performed. Alternatively, in other embodiments, CPU 501 may be configured to perform processes such as process 200 in any other suitable manner (e.g., by way of firmware).
The present disclosure may be methods, apparatus, systems, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for carrying out various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, any non-transitory memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical encoding device, such as punch cards or in-groove raised structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk or C++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry such as a programmable logic circuit, a field-programmable gate array (FPGA), or a programmable logic array (PLA) can execute the computer-readable program instructions, utilizing state information of the computer-readable program instructions to personalize the electronic circuitry, in order to implement aspects of the present disclosure.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (19)

1. A display method, comprising:
displaying first content in a first window;
in response to receiving a first trigger of a user, displaying a second window in a floating window form on the displayed first window, wherein the second window is used for displaying second content different from the first content; and
updating a presentation state of the second window in response to a trigger of a target event associated with the first content.
2. The method of claim 1, wherein updating the presentation state of the second window in response to the triggering of the target event comprises:
in response to detecting the target event message in the first content, updating the display state of a preset area of the second window, wherein the display state comprises at least one of color, brightness and flicker of the preset area.
3. The method of claim 2, wherein the first content is a targeted game play, wherein displaying the first content within the first window comprises:
displaying at least a portion of a game interface of the target game play within the first window,
wherein updating the presentation state in response to detecting the target event message comprises:
determining a location of the target event in a corresponding map of the target game play;
determining the preset area based on a position relation between the position and the second window or the virtual object controlled by the user; and
updating the display state of the preset area of the second window.
4. The method of claim 1, wherein updating the presentation state of the second window in response to the triggering of the target event comprises:
in response to detecting the targeted event message in the first content, presenting an interactive interface on the second content; and
presenting third content in the interactive interface, wherein the third content is associated with the target event.
5. The method of claim 4, after presenting the third content in the interactive interface, the method further comprising:
in response to receiving a second trigger of the user in the interactive interface, closing the interactive interface, and sending an operation corresponding to the second trigger to the first content; or
closing the interactive interface in response to not receiving a second trigger of the user in the interactive interface within a preset time length.
6. The method of claim 1, after updating the presentation state of the second window, the method further comprising:
in response to not receiving a second trigger of the user within a preset time length, determining a type of the target event; and
when the target event is of a first type, adjusting the second window to a first preset size.
7. The method of claim 1, after updating the presentation state of the second window, the method further comprising:
and receiving a third trigger of the user in the first window, and adjusting the second window to be a first preset size.
8. The method of claim 1, after displaying a second window in floating window form on the displayed first window, the method further comprising:
and receiving a fourth trigger of the user in the second window, and adjusting the second window to a second preset size.
9. The method of claim 8, wherein the second preset size is greater than 70% of the size of the first window.
10. An electronic device, comprising:
a processor; and
a memory coupled with the processor, the memory having instructions stored therein that, when executed by the processor, cause the electronic device to perform acts comprising:
displaying first content in a first window;
in response to receiving a first trigger of a user, displaying a second window in a floating window form on the displayed first window, wherein the second window is used for displaying second content different from the first content; and
updating a presentation state of the second window in response to a trigger of a target event associated with the first content.
11. The apparatus of claim 10, wherein updating the presentation state of the second window in response to the triggering of the target event comprises:
in response to detecting the target event message in the first content, updating the display state of a preset area of the second window, wherein the display state comprises at least one of color, brightness and flicker of the preset area.
12. The apparatus of claim 11, wherein the first content is a targeted game play, wherein displaying the first content within the first window comprises:
displaying at least a portion of a game interface of the target game play within the first window,
wherein updating the presentation state in response to detecting the target event message comprises:
determining a location of the target event in a corresponding map of the target game play;
determining the preset area based on a position relation between the position and the second window or a virtual object controlled by the user; and
updating the display state of the preset area of the second window.
13. The apparatus of claim 10, wherein updating the presentation state of the second window in response to the triggering of the target event comprises:
in response to detecting the targeted event message in the first content, presenting an interactive interface on the second content; and
and displaying third content in the interactive interface, wherein the third content is associated with the target event.
14. The device of claim 13, wherein after presenting the third content in the interactive interface, the acts further comprise:
in response to receiving a second trigger of the user in the interactive interface, closing the interactive interface, and sending an operation corresponding to the second trigger to the first content; or
closing the interactive interface in response to not receiving a second trigger of the user in the interactive interface within a preset time length.
15. The apparatus of claim 10, wherein after updating the presentation state of the second window, the acts further comprise:
in response to not receiving a second trigger of the user within a preset time length, determining a type of the target event; and
when the target event is of a first type, adjusting the second window to a first preset size.
16. The apparatus of claim 10, wherein after updating the presentation state of the second window, the acts further comprise:
and receiving a third trigger of the user in the first window, and adjusting the second window to be a first preset size.
17. The device of claim 10, wherein after displaying a second window in floating window form on the displayed first window, the acts further comprise:
and receiving a fourth trigger of the user in the second window, and adjusting the second window to a second preset size.
18. The apparatus of claim 17, wherein the second preset size is greater than 70% of the size of the first window.
19. A computer program product tangibly stored on a computer-readable medium and comprising machine executable instructions that, when executed, cause a machine to perform the method of any of claims 1 to 9.
CN202210730428.6A 2022-06-24 2022-06-24 Display method, electronic device, and computer program product Pending CN115098012A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210730428.6A CN115098012A (en) 2022-06-24 2022-06-24 Display method, electronic device, and computer program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210730428.6A CN115098012A (en) 2022-06-24 2022-06-24 Display method, electronic device, and computer program product

Publications (1)

Publication Number Publication Date
CN115098012A true CN115098012A (en) 2022-09-23

Family

ID=83293431

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210730428.6A Pending CN115098012A (en) 2022-06-24 2022-06-24 Display method, electronic device, and computer program product

Country Status (1)

Country Link
CN (1) CN115098012A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115408093A (en) * 2022-10-31 2022-11-29 统信软件技术有限公司 Remote connection method, remote connection system, computing device, and storage medium

Citations (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102185856A (en) * 2011-03-22 2011-09-14 北京朗玛数联科技有限公司 Team organizing video method, device and system used in team organizing game
CN104808907A (en) * 2015-05-20 2015-07-29 腾讯科技(深圳)有限公司 Method and device for displaying content in same screen, and terminal equipment
CN105187939A (en) * 2015-09-21 2015-12-23 合一网络技术(北京)有限公司 Method and device of playing video in webpage game
CN105554424A (en) * 2015-12-24 2016-05-04 北京奇虎科技有限公司 Method and apparatus for video playing in application
CN106060574A (en) * 2016-06-21 2016-10-26 北京奇虎科技有限公司 Method and device for showing live video stream in game
CN106101855A (en) * 2016-06-29 2016-11-09 北京奇虎科技有限公司 A kind for the treatment of method and apparatus of games page
EP3091748A1 (en) * 2015-05-05 2016-11-09 Facebook, Inc. Methods and systems for viewing embedded videos
CN106406998A (en) * 2016-09-28 2017-02-15 北京奇虎科技有限公司 Method and device for processing user interface
CN107551555A (en) * 2017-08-24 2018-01-09 网易(杭州)网络有限公司 Game picture display methods and device, storage medium, terminal
CN107626105A (en) * 2017-08-24 2018-01-26 网易(杭州)网络有限公司 Game picture display methods and device, storage medium, electronic equipment
CN107656671A (en) * 2017-09-29 2018-02-02 珠海市魅族科技有限公司 Suspend small window control method and device, terminal installation and computer-readable recording medium
CN107899241A (en) * 2017-11-22 2018-04-13 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
US20190060745A1 (en) * 2017-08-22 2019-02-28 Netease (Hangzhou) Network Co.,Ltd. Information Processing Method and Apparatus, Storage Medium, and Electronic Device
CN109525851A (en) * 2018-11-12 2019-03-26 咪咕互动娱乐有限公司 Live broadcasting method, device and storage medium
US20190238908A1 (en) * 2016-12-28 2019-08-01 Tencent Technology (Shenzhen) Company Limited Information processing method, terminal, system, and computer storage medium
CN110278475A (en) * 2018-03-16 2019-09-24 优酷网络技术(北京)有限公司 The display methods and device of barrage information
CN111408145A (en) * 2020-02-28 2020-07-14 网易(杭州)网络有限公司 Method and device for playing live content during game, electronic equipment and storage medium
CN111443848A (en) * 2020-03-24 2020-07-24 腾讯科技(深圳)有限公司 Information display method and device, storage medium and electronic device
CN111459598A (en) * 2020-04-02 2020-07-28 上海极链网络科技有限公司 Information display method and device, electronic equipment and storage medium
CN111488107A (en) * 2020-03-26 2020-08-04 北京小米移动软件有限公司 Multitask interaction control method, multitask interaction control device and storage medium
US20200286449A1 (en) * 2017-09-07 2020-09-10 Huawei Technologies Co., Ltd. Interface Display Method and Apparatus
CN111672111A (en) * 2020-05-28 2020-09-18 腾讯科技(深圳)有限公司 Interface display method, device, equipment and storage medium
CN111760266A (en) * 2020-07-01 2020-10-13 网易(杭州)网络有限公司 Game live broadcast method and device and electronic equipment
CN112057848A (en) * 2020-09-10 2020-12-11 网易(杭州)网络有限公司 Information processing method, device, equipment and storage medium in game
CN112114722A (en) * 2020-09-16 2020-12-22 北京嘀嘀无限科技发展有限公司 Suspension window control method and system
CN112584224A (en) * 2020-12-08 2021-03-30 北京字节跳动网络技术有限公司 Information display and processing method, device, equipment and medium
CN112704883A (en) * 2020-12-30 2021-04-27 腾讯科技(深圳)有限公司 Method, device, terminal and storage medium for grouping virtual objects in virtual environment
CN112791388A (en) * 2021-01-22 2021-05-14 网易(杭州)网络有限公司 Information control method and device and electronic equipment
CN113426114A (en) * 2021-07-07 2021-09-24 网易(杭州)网络有限公司 Game information prompting method and device, readable storage medium and electronic equipment
CN113750522A (en) * 2021-09-07 2021-12-07 网易(杭州)网络有限公司 Game skill processing method and device and electronic equipment
CN113893540A (en) * 2021-09-30 2022-01-07 腾讯科技(深圳)有限公司 Information prompting method and device, storage medium and electronic equipment
CN113938748A (en) * 2021-10-15 2022-01-14 腾讯科技(成都)有限公司 Video playing method, device, terminal, storage medium and program product
CN113941149A (en) * 2021-09-26 2022-01-18 网易(杭州)网络有限公司 Game behavior data processing method, nonvolatile storage medium and electronic device
CN113975806A (en) * 2021-10-28 2022-01-28 北京完美赤金科技有限公司 In-game interface interaction method and device, storage medium and computer equipment
CN114053697A (en) * 2021-11-17 2022-02-18 北京字节跳动网络技术有限公司 Cloud game interaction method and device, readable medium and electronic equipment
CN114461106A (en) * 2021-07-02 2022-05-10 北京字跳网络技术有限公司 Display method and device and electronic equipment
CN114470771A (en) * 2022-01-14 2022-05-13 网易(杭州)网络有限公司 Information processing method and device
CN114500717A (en) * 2022-01-26 2022-05-13 维沃移动通信有限公司 Information display method and device

Patent Citations (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102185856A (en) * 2011-03-22 2011-09-14 北京朗玛数联科技有限公司 Team organizing video method, device and system used in team organizing game
EP3091748A1 (en) * 2015-05-05 2016-11-09 Facebook, Inc. Methods and systems for viewing embedded videos
CN104808907A (en) * 2015-05-20 2015-07-29 腾讯科技(深圳)有限公司 Method and device for displaying content in same screen, and terminal equipment
CN105187939A (en) * 2015-09-21 2015-12-23 合一网络技术(北京)有限公司 Method and device of playing video in webpage game
CN105554424A (en) * 2015-12-24 2016-05-04 北京奇虎科技有限公司 Method and apparatus for video playing in application
WO2017107962A1 (en) * 2015-12-24 2017-06-29 北京奇虎科技有限公司 Method of playing video in application and device
CN106060574A (en) * 2016-06-21 2016-10-26 北京奇虎科技有限公司 Method and device for showing live video stream in game
CN106101855A (en) * 2016-06-29 2016-11-09 北京奇虎科技有限公司 A kind for the treatment of method and apparatus of games page
CN106406998A (en) * 2016-09-28 2017-02-15 北京奇虎科技有限公司 Method and device for processing user interface
US20190238908A1 (en) * 2016-12-28 2019-08-01 Tencent Technology (Shenzhen) Company Limited Information processing method, terminal, system, and computer storage medium
US20190060745A1 (en) * 2017-08-22 2019-02-28 Netease (Hangzhou) Network Co.,Ltd. Information Processing Method and Apparatus, Storage Medium, and Electronic Device
CN107551555A (en) * 2017-08-24 2018-01-09 网易(杭州)网络有限公司 Game picture display methods and device, storage medium, terminal
CN107626105A (en) * 2017-08-24 2018-01-26 网易(杭州)网络有限公司 Game picture display methods and device, storage medium, electronic equipment
US20200286449A1 (en) * 2017-09-07 2020-09-10 Huawei Technologies Co., Ltd. Interface Display Method and Apparatus
CN107656671A (en) * 2017-09-29 2018-02-02 珠海市魅族科技有限公司 Suspend small window control method and device, terminal installation and computer-readable recording medium
CN107899241A (en) * 2017-11-22 2018-04-13 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
CN110278475A (en) * 2018-03-16 2019-09-24 优酷网络技术(北京)有限公司 The display methods and device of barrage information
CN109525851A (en) * 2018-11-12 2019-03-26 咪咕互动娱乐有限公司 Live broadcasting method, device and storage medium
CN111408145A (en) * 2020-02-28 2020-07-14 网易(杭州)网络有限公司 Method and device for playing live content during game, electronic equipment and storage medium
CN111443848A (en) * 2020-03-24 2020-07-24 腾讯科技(深圳)有限公司 Information display method and device, storage medium and electronic device
CN111488107A (en) * 2020-03-26 2020-08-04 北京小米移动软件有限公司 Multitask interaction control method, multitask interaction control device and storage medium
CN111459598A (en) * 2020-04-02 2020-07-28 上海极链网络科技有限公司 Information display method and device, electronic equipment and storage medium
CN111672111A (en) * 2020-05-28 2020-09-18 腾讯科技(深圳)有限公司 Interface display method, device, equipment and storage medium
CN111760266A (en) * 2020-07-01 2020-10-13 网易(杭州)网络有限公司 Game live broadcast method and device and electronic equipment
CN112057848A (en) * 2020-09-10 2020-12-11 网易(杭州)网络有限公司 Information processing method, device, equipment and storage medium in game
CN112114722A (en) * 2020-09-16 2020-12-22 北京嘀嘀无限科技发展有限公司 Floating window control method and system
CN112584224A (en) * 2020-12-08 2021-03-30 北京字节跳动网络技术有限公司 Information display and processing method, device, equipment and medium
CN112704883A (en) * 2020-12-30 2021-04-27 腾讯科技(深圳)有限公司 Method, device, terminal and storage medium for grouping virtual objects in virtual environment
CN112791388A (en) * 2021-01-22 2021-05-14 网易(杭州)网络有限公司 Information control method and device and electronic equipment
CN114461106A (en) * 2021-07-02 2022-05-10 北京字跳网络技术有限公司 Display method and device and electronic equipment
CN113426114A (en) * 2021-07-07 2021-09-24 网易(杭州)网络有限公司 Game information prompting method and device, readable storage medium and electronic equipment
CN113750522A (en) * 2021-09-07 2021-12-07 网易(杭州)网络有限公司 Game skill processing method and device and electronic equipment
CN113941149A (en) * 2021-09-26 2022-01-18 网易(杭州)网络有限公司 Game behavior data processing method, nonvolatile storage medium and electronic device
CN113893540A (en) * 2021-09-30 2022-01-07 腾讯科技(深圳)有限公司 Information prompting method and device, storage medium and electronic equipment
CN113938748A (en) * 2021-10-15 2022-01-14 腾讯科技(成都)有限公司 Video playing method, device, terminal, storage medium and program product
CN113975806A (en) * 2021-10-28 2022-01-28 北京完美赤金科技有限公司 In-game interface interaction method and device, storage medium and computer equipment
CN114053697A (en) * 2021-11-17 2022-02-18 北京字节跳动网络技术有限公司 Cloud game interaction method and device, readable medium and electronic equipment
CN114470771A (en) * 2022-01-14 2022-05-13 网易(杭州)网络有限公司 Information processing method and device
CN114500717A (en) * 2022-01-26 2022-05-13 维沃移动通信有限公司 Information display method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Anonymous: "Honor of Kings S25 season update: revamped support equipment; Honor of Kings new-season 'Luo Zi Wu Hui' update contents and video player optimization _ Ali213 mobile games", pages 22, Retrieved from the Internet <URL:https://m.ali213.net/news/gl2109/695389_22.html> *

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN115408093A (en) * 2022-10-31 2022-11-29 统信软件技术有限公司 Remote connection method, remote connection system, computing device, and storage medium
CN115408093B (en) * 2022-10-31 2023-05-02 统信软件技术有限公司 Remote connection method, remote connection system, computing device, and storage medium

Similar Documents

Publication Publication Date Title
US20210064317A1 (en) Operational mode-based settings for presenting notifications on a user display
US11712624B2 (en) User immersion context-based notifications on a user display
US20200220920A1 (en) Platform-independent content generation for thin client applications
CN107113468B (en) Mobile computing equipment, implementation method and computer storage medium
WO2017113856A1 (en) Barrage display method and device
CN115509398A (en) Method for displaying emoticons using instant messaging service and user device thereof
CN103747362A (en) Method and device for cutting out video clip
JP7426496B2 (en) Video interaction methods, apparatus, electronic devices, storage media, computer program products and computer programs
US11070894B2 (en) Methods, systems, and media for presenting interactive elements within video content
CN112242947B (en) Information processing method, device, equipment and medium
CN111790148B (en) Information interaction method and device in game scene and computer readable medium
US11890549B2 (en) Summarizing notifications on a user display
CN109819268B (en) Live broadcast room play control method, device, medium and equipment in video live broadcast
US11870827B2 (en) Methods, systems, and media for navigating through a stream of content items
WO2024078486A1 (en) Content presentation method and apparatus, and device and storage medium
US11750879B2 (en) Video content display method, client, and storage medium
CN115098012A (en) Display method, electronic device, and computer program product
US20210326010A1 (en) Methods, systems, and media for navigating user interfaces
US9525905B2 (en) Mapping visual display screen to portable touch screen
US20230370686A1 (en) Information display method and apparatus, and device and medium
US10862946B1 (en) Media player supporting streaming protocol libraries for different media applications on a computer system
WO2023030292A1 (en) Multimedia file playback method and apparatus
WO2023104102A1 (en) Live broadcasting comment presentation method and apparatus, and device, program product and medium
US20220091717A1 (en) Methods, systems, and media for presenting offset content
CN113970966A (en) Interaction control method and device, electronic equipment and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination