CN113542886A - Video playing method and device for playing video - Google Patents

Video playing method and device for playing video

Info

Publication number
CN113542886A
CN113542886A (application CN202010292066.8A)
Authority
CN
China
Prior art keywords
video
floating window
user
key
playing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010292066.8A
Other languages
Chinese (zh)
Other versions
CN113542886B (en)
Inventor
姜盛禄
杨斌
刘贲
周冰洋
赵涵思
张华婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Sogou Technology Development Co Ltd
Original Assignee
Beijing Sogou Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Sogou Technology Development Co Ltd filed Critical Beijing Sogou Technology Development Co Ltd
Priority to CN202010292066.8A priority Critical patent/CN113542886B/en
Publication of CN113542886A publication Critical patent/CN113542886A/en
Application granted granted Critical
Publication of CN113542886B publication Critical patent/CN113542886B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD], including the following subgroups:
    • H04N 21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/4312 Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/4316 Generation of visual interfaces for content selection or interaction, for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H04N 21/4788 Supplemental services communicating with other users, e.g. chatting
    • H04N 21/812 Monomedia components thereof involving advertisement data
    • H04N 21/8586 Linking data to content, e.g. by linking a URL to a video object, by creating a hotspot, by using a URL

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the application disclose a video playing method, a video playing apparatus, and a device for playing video. One embodiment of the method includes: when it is detected that a user triggers a video playing function in an input method interface, presenting a floating window and loading video data; and playing a video in the floating window based on the video data. This implementation improves the convenience with which a user operates the device and the continuity of video playback.

Description

Video playing method and device for playing video
Technical Field
The embodiments of the application relate to the field of computer technology, and in particular to a video playing method, a video playing apparatus, and a device for playing video.
Background
With the development of Internet technology, video applications (APPs) keep increasing in number, such as short-video applications, live-streaming applications, and film and television applications. A user may watch videos by installing such an application on a device.
In the prior art, a user has to open an installed video application to watch a video. If another application is needed while watching, for example to reply to a message in an instant messaging application or to write a document in a document editing application, the user must switch from the video application to that application and, after finishing, switch back to resume playback. This conventional approach therefore requires frequent application switching, so operation convenience is low; moreover, the video cannot keep playing while another application is in use, so playback continuity is poor.
Disclosure of Invention
The embodiments of the application provide a video playing method, a video playing apparatus, and a device for playing video, to solve the technical problems of low operation convenience and poor video playback continuity in the prior art.
In a first aspect, an embodiment of the application provides a video playing method, the method including: when it is detected that a user triggers a video playing function in an input method interface, presenting a floating window and loading video data; and playing a video in the floating window based on the video data.
In a second aspect, an embodiment of the application provides a video playing apparatus, the apparatus including: a presentation unit, adapted to present a floating window and load video data when it is detected that a user triggers a video playing function in an input method interface; and a playing unit, adapted to play a video in the floating window based on the video data.
In a third aspect, an embodiment of the present application provides an apparatus for playing video, comprising a memory, and one or more programs, wherein the one or more programs are stored in the memory, and the one or more programs are configured to be executed by the one or more processors and include instructions for: when detecting that a user triggers a video playing function in an input method interface, presenting a floating window and loading video data; the video is played in the floating window based on the video data.
In a fourth aspect, embodiments of the present application provide a computer-readable medium on which a computer program is stored, which when executed by a processor, implements the method as described in the first aspect above.
According to the video playing method, the video playing apparatus, and the device for playing video provided by the embodiments of the application, when it is detected that the user triggers the video playing function in the input method interface, the floating window is presented and the video data is loaded, so that a video is played in the floating window based on the video data. A video can therefore keep playing while the user uses different applications on the terminal device, without frequent application switching, which improves both operation convenience and the continuity of video playback.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is a flow diagram of one embodiment of a video playback method according to the present application;
FIG. 2 is a schematic diagram of a floating window rendering process according to the present application;
FIG. 3 is a schematic view of a floating window stowing process according to the present application;
FIG. 4 is a schematic diagram of a video transmission process according to the present application;
FIG. 5 is a flow diagram of yet another embodiment of a video playback method according to the present application;
FIG. 6 is a schematic block diagram of an embodiment of a video playback device according to the present application;
FIG. 7 is a schematic diagram of an apparatus for playing video according to the present application;
FIG. 8 is a schematic diagram of a server in accordance with some embodiments of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Referring to fig. 1, a flow 100 of one embodiment of a video playing method according to the present application is shown. The video playing method may be executed by various electronic devices, including but not limited to: a server, a smartphone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, a car computer, a desktop computer, a set-top box, a smart TV, a wearable device, and so on.
The input method application mentioned in the embodiments of the application may support various input methods. An input method is an encoding scheme used to enter symbols into electronic devices such as computers and mobile phones, and a user can conveniently enter a desired character or character string into an electronic device using the input method application. It should be noted that, besides common Chinese input methods (such as Pinyin, Wubi, Zhuyin, phonetic, and handwriting input), the input method may also support other languages (such as an English input method, a Japanese hiragana input method, or a Korean input method); the embodiments place no limitation on the input method or its language category.
The video playing method in this embodiment may include the following steps:
Step 101, when detecting that a user triggers a video playing function in an input method interface, presenting a floating window and loading video data.
In this embodiment, various types of client applications may be installed on the execution body of the video playing method (such as the electronic device described above), for example an input method application, an instant messaging application, a document editing application, an audio/video application, and the like. The input method application may be configured with a video playing function. The video playing function supports calling a video service and playing videos through the input method application, so that the user can watch videos while inputting in various scenarios. Video here includes, but is not limited to, short video, film and television, live streaming, and the like.
In this embodiment, when it is detected that the user triggers the video playing function in the input method interface, the execution body may present a floating window in the current interface. Here, the input method interface is the interface of the input method application. The input method interface may include a keyboard area and various function keys, such as a voice input key, an applet key, a search key, an emoji input key, and a video playing key. When the user triggers (e.g., clicks) the video playing key, the video playing function of the input method application is triggered.
It should be noted that the user may also trigger the video playing function in other ways. As an example, the user may trigger it by entering content in the input method application; for instance, when the user enters target content such as "video playing" through coded input or voice input, the video playing function may be triggered, where the target content is used to indicate video playback. In addition, the video playing function of the input method application may be triggered by detecting the user's input intention: when it is detected that the user intends to play, send, or share a video, the function is triggered. As an example, if the current input scenario is an instant messaging scenario and, after the user sends an instant message, the time spent waiting for the other party's reply exceeds a preset duration, the video playing function may be triggered. Before the function is triggered, the user may be asked whether to trigger it. The triggering path of the video playing function may be preset as needed and is not limited here.
As an example, fig. 2 shows a schematic diagram of the floating window presentation process. As shown in fig. 2, a user communicates with another user in an instant messaging application and uses the input method application to enter content during the conversation. The conversation interface with the other user is shown at reference numeral 201. While content is being entered, an input method interface 202 is presented in the conversation interface. The input method interface includes a keyboard area and various function keys, such as an emoji input key, a keyboard input key, a voice input key, an applet key, and a video playing key. When the user triggers (e.g., clicks) the video playing key in the input method interface (shown at reference numeral 203), the video playing function of the input method application is triggered, and a floating window (shown at reference numeral 204) is presented in the current interface.
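For illustration only (the patent does not specify an implementation), the following Kotlin sketch shows one way an Android input method could present such a floating window over the current interface using the system overlay window type; the FloatingVideoWindow class and its method names are assumptions.

```kotlin
import android.content.Context
import android.graphics.PixelFormat
import android.view.Gravity
import android.view.View
import android.view.WindowManager

// Hypothetical helper: shows a floating video window when the keyboard's
// video-play key is triggered. Assumes the floating window ("display over
// other apps") permission has already been granted.
class FloatingVideoWindow(private val context: Context) {

    private val windowManager =
        context.getSystemService(Context.WINDOW_SERVICE) as WindowManager
    private var contentView: View? = null

    fun show(videoView: View) {
        if (contentView != null) return  // already visible
        val params = WindowManager.LayoutParams(
            WindowManager.LayoutParams.WRAP_CONTENT,
            WindowManager.LayoutParams.WRAP_CONTENT,
            WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,  // survives app switches
            WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,        // keyboard keeps input focus
            PixelFormat.TRANSLUCENT                               // allows a semi-transparent background
        ).apply {
            gravity = Gravity.TOP or Gravity.END                  // default: upper right corner
        }
        windowManager.addView(videoView, params)
        contentView = videoView
    }

    fun dismiss() {
        contentView?.let { windowManager.removeView(it) }
        contentView = null
    }
}
```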
In this embodiment, the floating window may be used to present the video. The floating window may be configured as a fixed-position floating window or as a movable floating window. When configured with a fixed position, the floating window may be placed where it is unlikely to occlude the screen content, such as the upper right corner of the screen. When configured as movable, the user can drag the floating window to any position on the screen as needed, so that content in the current input scenario is not blocked. In the movable case, the electronic device may also automatically identify the position of content (such as text) on the screen and adjust the position of the floating window to minimize occlusion.
In addition, the floating window may take various forms. For example, its background may be semi-transparent or a solid color, and the background color or pattern may be set by the user. By default, the background may be semi-transparent black, which reduces occlusion of interface content by the floating window while keeping the function keys in the window recognizable against different interfaces. The background color may also be adapted to the color of the current interface, so that the function keys remain recognizable when the user switches applications and the floating window does not block content.
The floating window may have a fixed or a variable size. When the size is fixed, to prevent the floating window from blocking the user during input (such as chatting with the other party in an instant messaging application), the width of the floating window may be limited to 50% of the screen width, so that the user can still read the instant messages sent by the other party. Meanwhile, the height of the floating window may be set based on the position of the input method interface, so that the distance between the bottom of the floating window and the top of the input method interface leaves at least one line of messages, allowing the user to read the latest message in an instant messaging scenario. When the size is variable, the user can resize the floating window by stretching it, and the electronic device may automatically adjust the size to minimize occlusion by detecting the position of the content in the current interface.
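The fixed-size constraints just described (width capped at 50% of the screen, bottom kept at least one message line above the input method panel) can be sketched as follows; all names and the 16:9 default are illustrative assumptions, not part of the patent.

```kotlin
// Illustrative only: encodes the 50%-width and one-message-line constraints.
data class FloatingWindowBounds(val width: Int, val height: Int, val x: Int, val y: Int)

fun computeBounds(
    screenWidth: Int,
    imeTopY: Int,            // y coordinate of the top edge of the input method panel
    messageLineHeight: Int,  // approximate height of one chat message line
    aspectRatio: Float = 16f / 9f
): FloatingWindowBounds {
    val width = screenWidth / 2                   // "limited to 50% of the screen width"
    val height = (width / aspectRatio).toInt()
    val y = imeTopY - messageLineHeight - height  // keep one line of messages visible below the window
    val x = screenWidth - width                   // default to the right-hand side
    return FloatingWindowBounds(width, height, x, y.coerceAtLeast(0))
}
```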
In this embodiment, the execution body may be communicatively connected to a server through the input method application. The server here may be any server that provides video data, such as an input method server or a server of a third-party video platform. When it is detected that the user triggers the video playing function in the input method interface, the execution body may send a video data acquisition request to the server to obtain video data and then load the obtained video data.
In practice, each video data acquisition request may return one or more (e.g., 10) video addresses. A video address may include the URL (Uniform Resource Locator) of the video cover and the URL of the video data. After obtaining a video address, the execution body may access it to fetch the cover and the video data and store them in a cache. Finally, the cover and video data of one video in the cache are loaded and the video is played. During this process, the cover of the next video may be read in advance so that it can be presented promptly when the user switches videos, avoiding a stalled picture.
It should be noted that the execution body may send a new video data acquisition request to the server each time a preset number of videos has been played, or when it detects that the number of unplayed videos in the cache is smaller than a preset value. The requested video data may come from a video source the user follows or may be random videos; this embodiment does not limit it.
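A minimal sketch of the batching, caching, and cover-prefetch behaviour described above, assuming a hypothetical fetchBatch callback that asks the server for a batch of video addresses; none of these names come from the patent.

```kotlin
import java.net.URL
import java.util.ArrayDeque

// Each address pairs a cover URL with a video data URL, as described above.
data class VideoAddress(val coverUrl: String, val videoUrl: String)

class VideoPrefetchCache(
    private val fetchBatch: () -> List<VideoAddress>,  // e.g. asks the server for ~10 addresses
    private val lowWatermark: Int = 3                  // refill when few unplayed videos remain
) {
    private val queue = ArrayDeque<VideoAddress>()
    private val coverCache = mutableMapOf<String, ByteArray>()

    fun next(): VideoAddress {
        if (queue.size < lowWatermark) queue.addAll(fetchBatch())
        val current = queue.poll() ?: error("no video data available")
        queue.peek()?.let { prefetchCover(it) }        // read the next cover in advance
        return current
    }

    private fun prefetchCover(address: VideoAddress) {
        coverCache.getOrPut(address.coverUrl) {
            URL(address.coverUrl).readBytes()          // blocking here only for brevity
        }
    }
}
```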
In some optional implementations of this embodiment, when the user triggers the video playing function, it may first be detected whether the user has granted the floating window permission to the input method application. If the permission has been granted, the floating window is presented directly. If not, a prompt may be presented asking the user to enable the permission, for example in a pop-up such as: 'The floating window permission is required for the video service. Please enable "Floating window" or "Display over other apps".' In addition, since the permission is usually not yet granted the first time the user triggers the video playing function, the prompt may be presented directly on that first trigger.
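On Android, this permission check could look roughly like the following; the helper name and simplified prompt flow are assumptions, while Settings.canDrawOverlays and ACTION_MANAGE_OVERLAY_PERMISSION are the platform's standard floating-window permission APIs.

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri
import android.provider.Settings

// Returns true if the floating window can be shown; otherwise sends the user
// to the system page where "display over other apps" can be enabled.
fun ensureFloatingWindowPermission(context: Context): Boolean {
    if (Settings.canDrawOverlays(context)) return true
    val intent = Intent(
        Settings.ACTION_MANAGE_OVERLAY_PERMISSION,
        Uri.parse("package:${context.packageName}")
    ).addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
    context.startActivity(intent)
    return false
}
```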
In some optional implementations of this embodiment, when the user triggers the video playing function for the first time, a guide page may first be displayed in the floating window after it is presented, explaining how to use the floating window and its function keys, such as how to mute, switch videos, enter full screen, and collapse the floating window. This helps the user operate the video playing function correctly on first use.
Step 102, playing the video in the floating window based on the video data.
In this embodiment, based on the loaded video data, the execution body may play the video frames of the video data in the floating window, thereby realizing video playback.
If the user does not switch videos and the current video finishes playing in the floating window, the video data of the next video may be loaded automatically and the next video played automatically. Alternatively, the current video may be replayed until a video switching operation by the user is detected, at which point the video data of the next video is loaded and played.
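This end-of-playback choice reduces to a simple branch; the sketch below uses hypothetical callbacks standing in for the actual loading and playback code.

```kotlin
enum class EndOfPlaybackBehaviour { AUTO_NEXT, LOOP_CURRENT }

fun onPlaybackCompleted(
    behaviour: EndOfPlaybackBehaviour,
    playNext: () -> Unit,  // load the next video's data and play it
    replay: () -> Unit     // replay the current video
) {
    when (behaviour) {
        EndOfPlaybackBehaviour.AUTO_NEXT -> playNext()
        EndOfPlaybackBehaviour.LOOP_CURRENT -> replay()  // until a switch operation is detected
    }
}
```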
In some optional implementations of this embodiment, during video playback, if it is detected that the user exits the input method interface, operates within the input method interface, exits the current application, switches from the current application to another application, operates within the current application, or touches a physical or virtual return key on the device, the floating window may continue to be presented and the video may continue to play in it. Thus, while the floating window is playing a video, the user can perform input operations and operate the current application as usual, and playback continues even when the user leaves or switches the current input scenario, which improves playback continuity. It should be noted that during voice input the user may be reminded that a video is playing; the playing video may be automatically muted or paused, or it may continue to play normally.
In some optional implementations of this embodiment, the floating window may be configured with a key frame. The key frame may be an internal key frame located inside the floating window, an external key frame located outside the floating window, or both. Function keys may be displayed in the key frame, including but not limited to at least one of: a video switching key, a mute key, a zoom key, a full-screen key, a floating window closing key, and a floating window retracting key. The video switching key may include a key for switching to the previous video and a key for switching to the next video.
During video playback, when the execution body detects that the user touches a function key in the key frame, the touched function key is taken as the target function key and the function corresponding to it is triggered. For example, when the user clicks the closing key, the floating window is closed; when the user clicks the full-screen key, the floating window switches to full-screen mode and the playing video is displayed full screen.
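A sketch of how the touched key could be mapped to its function; the enum values and the controller interface are illustrative names rather than anything defined by the patent.

```kotlin
enum class FunctionKey { PREVIOUS, NEXT, MUTE, ZOOM, FULL_SCREEN, CLOSE, COLLAPSE }

interface FloatingPlayerController {
    fun playPrevious()
    fun playNext()
    fun toggleMute()
    fun enterZoomMode()
    fun enterFullScreen()
    fun close()
    fun collapseToIcon()
}

// The touched key becomes the "target function key" and its function is triggered.
fun onFunctionKeyTouched(key: FunctionKey, controller: FloatingPlayerController) {
    when (key) {
        FunctionKey.PREVIOUS -> controller.playPrevious()
        FunctionKey.NEXT -> controller.playNext()
        FunctionKey.MUTE -> controller.toggleMute()
        FunctionKey.ZOOM -> controller.enterZoomMode()
        FunctionKey.FULL_SCREEN -> controller.enterFullScreen()
        FunctionKey.CLOSE -> controller.close()
        FunctionKey.COLLAPSE -> controller.collapseToIcon()
    }
}
```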
In some optional implementations of this embodiment, when the floating window is in full-screen mode, if it is detected that the user touches a physical return key (e.g., a home key) or a virtual return key on the device, the full-screen mode may be switched to a non-full-screen mode. It should be noted that, in full-screen mode, the floating window may also include a zoom key; when the user is detected touching the zoom key, the full-screen mode is switched to a zoomed mode.
In some optional implementations of this embodiment, when the floating window is in non-full-screen mode, if it is detected that the user touches a physical or virtual return key on the device, the non-full-screen mode is maintained (i.e., the floating window is not closed). At the same time, in response to the touch on the physical or virtual return key, the operation of exiting the current application interface may be performed. In this way, the floating window remains presented and the video keeps playing in it even when the user exits the current input scenario, improving playback continuity.
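The back-key behaviour in these two modes amounts to a small state machine, sketched below under the assumption that the floating window is notified of back presses; the class and mode names are illustrative.

```kotlin
enum class DisplayMode { FULL_SCREEN, ZOOMED, NON_FULL_SCREEN }

class BackKeyPolicy(var mode: DisplayMode = DisplayMode.NON_FULL_SCREEN) {
    /** Returns true if the floating window consumed the back press. */
    fun onBackPressed(): Boolean = when (mode) {
        DisplayMode.FULL_SCREEN -> {
            mode = DisplayMode.NON_FULL_SCREEN  // leave full screen, keep playing
            true
        }
        else -> false  // keep the floating window; the host app handles "back" normally
    }
}
```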
In some optional implementations of this embodiment, the floating window may have a stow (retraction) function. When it is detected that the user triggers the retraction function, the execution body may pause the video, retract the floating window, and switch it into an icon. The icon may be fixed in position or movable. When it is detected that the user touches the icon, the icon is switched back into the floating window and the video continues to play. The floating window can thus be temporarily collapsed when it blocks content in the display interface, and restored to continue playback once the user has finished browsing the page content, avoiding frequent closing or moving of the floating window and improving operation convenience.
As an example, fig. 3 shows a schematic diagram of the floating window stowing process. As shown in fig. 3, a user communicates with another user in an instant messaging application while watching a video played in the floating window. The key frame of the floating window includes a floating window retracting key, shown at reference numeral 301. After the user clicks the retracting key, the floating window is retracted and switched into a movable icon, shown at reference numeral 302, so the user can read the content in the area previously covered by the floating window. When the user clicks the movable icon, the floating window is restored and the video continues to play.
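The stow/restore behaviour can be sketched with hypothetical callbacks standing in for the actual pause/resume and view-switching code; this is an assumption-laden illustration, not the patent's implementation.

```kotlin
class StowController(
    private val pause: () -> Unit,
    private val resume: () -> Unit,
    private val showIcon: () -> Unit,    // replace the floating window with a (movable) icon
    private val showWindow: () -> Unit   // bring the floating window back
) {
    private var stowed = false

    fun onRetractKey() {
        if (stowed) return
        pause()      // pause the video
        showIcon()
        stowed = true
    }

    fun onIconTouched() {
        if (!stowed) return
        showWindow()
        resume()     // continue playing from where it stopped
        stowed = false
    }
}
```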
In some optional implementations of this embodiment, when the current input scenario is an instant messaging scenario (e.g., a one-on-one chat or a group chat), if it is detected that the user performs a preset operation on the video played in the floating window, the related information of the video is sent or shared as an instant message. Optionally, the preset operation may include, but is not limited to, at least one of: long-pressing the video and dragging it into the input box of the input method interface, triggering a key for sending or sharing the video, and inputting a voice command for sending or sharing the video. Optionally, the related information may include, but is not limited to, at least one of: the video data, a page link of the video (e.g., an H5 page), and the cover of the video.
As an example, fig. 4 shows a schematic diagram of the video sending process. As shown in fig. 4, a user communicates with another user in an instant messaging application while watching a video played in the floating window. After the user long-presses the video played in the floating window and drags it into the input box of the input method interface, the video can be sent automatically as an instant message. It should be noted that, after the video is dragged into the input box, its related information may be sent automatically, or only after the user clicks a confirmation or send key; this embodiment does not limit it.
After the opposite-end user receives the instant message, they can watch the video by clicking it in the message. In practice, after clicking the video, the opposite-end user may watch it as an H5 page; alternatively, the video playing function of the opposite-end user's input method application may be triggered automatically and the video played in a floating window in their current interface.
In this way, the user can input while watching videos and can also send or share the video being watched to other users during input, which enriches the functions of the input method and improves the convenience of sending or sharing videos.
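As one illustrative and simplified realisation of sending the related information as an instant message, an input method could commit a shareable page link into the host application's input box; the data class and field names below are assumptions, and richer clients could attach the cover or the video data itself.

```kotlin
import android.view.inputmethod.InputConnection

// Illustrative payload for the "related information" described above.
data class VideoShareInfo(
    val videoUrl: String,  // the video data
    val pageUrl: String,   // a page (e.g. H5) link for the video
    val coverUrl: String   // the video cover
)

// Simplified sharing step: commit the page link as text into the host IM
// application's input box, so it is sent as an ordinary instant message.
fun shareAsInstantMessage(info: VideoShareInfo, inputConnection: InputConnection) {
    inputConnection.commitText(info.pageUrl, 1)  // 1 = place the cursor after the text
}
```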
According to the method provided by this embodiment of the application, when it is detected that the user triggers the video playing function in the input method interface, the floating window is presented and the video data is loaded, so that a video is played in the floating window based on the video data. A video can therefore keep playing while the user uses different applications on the terminal device, without frequent application switching, which improves both operation convenience and the continuity of video playback.
With further reference to fig. 5, a flow 500 of yet another embodiment of a video playback method is shown. The process 500 of the video playing method includes the following steps:
Step 501, when detecting that a user triggers a video playing function in an input method interface, presenting a floating window and loading video data.
Step 502, playing a video in the floating window based on the video data.
Steps 501 to 502 in this embodiment can refer to steps 101 to 102 in the corresponding embodiment of fig. 1, and are not described herein again.
Step 503, in the case that the video is a promotion video, if it is detected that the user clicks the video, presenting a promotion information area, and displaying promotion information in the promotion information area.
In this embodiment, the execution body of the video playing method may detect whether the currently played video is a promotion video. Here, a video whose content is promotion content is a promotion video. In practice, video data usually contains a category identifier indicating the video category, and the execution body may detect whether the currently played video is a promotion video based on this identifier.
In this embodiment, when the currently played video is a promotion video, if it is detected that the user clicks the video, a promotion information area may be presented and promotion information displayed in it. The promotion information area may be located in any region of the floating window, such as the bottom region.
The promotion information may include at least one of: a deep link, a web page link, and a download link. The deep link may be a Uniform Resource Identifier (URI) that links to a target page in a target application. The web page link may be a link to a web page (e.g., an H5 page). The download link may be a link to the download address of an application.
It should be noted that, if the played video is not a promotion video and it is detected that the user clicks the video, playback of the video may be paused.
Step 504, when it is detected that the user clicks the deep link, jumping to the target page of the target application.
In this embodiment, the deep link can enable a one-click jump from a content page of one application to another application, so that when it is detected that the user clicks the deep link, the device jumps from the input method application to the target page of the target application.
For example, if the deep link points to a product page in an e-commerce application, the user jumps directly to that product page when clicking the deep link, without having to manually switch to, launch, or search within the application. Richer information is thus provided to the user while operation convenience is preserved.
And 505, when it is detected that the user clicks the webpage link, presenting a webpage corresponding to the webpage link.
In this embodiment, when it is detected that the user clicks the web page link, the execution main body may present a web page corresponding to the web page link. The webpage corresponding to the webpage link is presented by clicking the webpage link, so that more information can be provided for the user, and the information richness is improved.
And step 506, downloading the program package of the target application when the user clicking the downloading link is detected.
In this embodiment, the execution subject may download the package of the target application when it is detected that the user clicks the download link. Therefore, more required applications can be provided for the user in the scene that the user watches the video while inputting, and the user can download and use the application conveniently.
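A sketch of how the three kinds of promotion information in steps 504 to 506 might be dispatched on Android; the PromotionLink model is hypothetical, and the download link is simply opened with a standard VIEW intent here rather than a dedicated downloader.

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

sealed class PromotionLink {
    data class DeepLink(val uri: String) : PromotionLink()     // e.g. a URI handled by the target app
    data class WebLink(val url: String) : PromotionLink()      // e.g. an H5 landing page
    data class DownloadLink(val url: String) : PromotionLink() // download address of the target app's package
}

fun onPromotionClicked(context: Context, link: PromotionLink) {
    val uri = when (link) {
        is PromotionLink.DeepLink -> Uri.parse(link.uri)       // jump to the target page of the target app
        is PromotionLink.WebLink -> Uri.parse(link.url)        // present the linked web page
        is PromotionLink.DownloadLink -> Uri.parse(link.url)   // fetch the target app's program package
    }
    context.startActivity(
        Intent(Intent.ACTION_VIEW, uri).addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
    )
}
```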
As can be seen from fig. 5, compared with the embodiment corresponding to fig. 1, the flow 500 of the video playing method in this embodiment adds the step of presenting promotion information when the user clicks a promotion video. More information can therefore be provided in the scenario where the user watches videos while inputting, enabling rich information push.
With further reference to fig. 6, as an implementation of the methods shown in the above-mentioned figures, the present application provides an embodiment of a video playing apparatus, which corresponds to the embodiment of the method shown in fig. 1, and which can be applied to various electronic devices.
As shown in fig. 6, the video playing apparatus 600 according to the present embodiment includes: the presentation unit 601 is configured to present the floating window and load video data when detecting that a user triggers a video playing function in the input method interface; a playing unit 602, adapted to play video in the floating window based on the video data.
In some optional implementations of this embodiment, the apparatus further includes: a first detection unit configured to: when detecting that the user executes the following operations, continuously presenting the floating window and continuously playing the video in the floating window: exiting the input method interface, performing an operation in the input method interface, exiting the current application, switching the current application to another application, performing an operation in the current application, touching a physical return key or a virtual return key in the device.
In some optional implementations of this embodiment, the apparatus further includes: a second detection unit configured to: when detecting that a user triggers the retraction function of the floating window, pausing playing the video, retracting the floating window and switching the floating window into an icon; and when detecting that the user touches the icon, switching the icon into the floating window, and continuously playing the video.
In some optional implementations of this embodiment, the apparatus further includes: a transmitting unit configured to: and when the current input scene is an instant messaging scene, if the fact that the user executes preset operation on the video played in the floating window is detected, sending or sharing the related information of the video as an instant messaging message.
In some optional implementations of this embodiment, the preset operation includes at least one of: the video is long-pressed and dragged to an input frame in the input method interface, a key used for sending or sharing the video is triggered, and voice used for sending or sharing the video is input; the related information includes at least one of: the video data, the page link of the video, and the cover page of the video.
In some optional implementations of this embodiment, the apparatus further includes: a third detection unit configured to: when the floating window is in a full-screen mode, if detecting that a user touches an entity return key or a virtual return key in equipment, switching the full-screen mode into a non-full-screen mode; and when the floating window is in the non-full screen mode, if detecting that a user touches an entity return key or a virtual return key in the equipment, keeping the non-full screen mode.
In some optional implementations of this embodiment, the apparatus further includes: a fourth detection unit configured to: after the video playing is finished, the video is played again; and when detecting that the user executes the video switching operation, loading the video data of the next video and playing the next video.
In some optional implementations of this embodiment, the video playing function is triggered by the following steps: when detecting that a user triggers a video playing function key in an input method interface, triggering a video playing function; or when detecting that a user inputs target content in a coding input mode or a voice input mode, triggering a video playing function, wherein the target content is used for indicating video playing; or when the user is detected to have the intention of playing, sending or sharing the video, the video playing function is triggered.
In some optional implementations of this embodiment, the apparatus further includes: an identification unit configured to: during the process of playing the video, identifying at least one of the following items in real time: the position of the content in the screen and the color of the current input interface; adjusting the position and the size of the suspension window based on the position of the content; and adjusting the background color of the floating window based on the color of the current input interface.
In some optional implementations of this embodiment, the floating window is configured with a key frame, the key frame includes an internal key frame located inside the floating window and/or an external key frame located outside the floating window, the key frame displays function keys, and the function keys include at least one of: the device comprises a video switching key, a mute key, a zoom key, a full screen key, a suspended window closing key and a suspended window retracting key; and, the above-mentioned apparatus also includes: a fifth detection unit configured to: and when detecting that the user touches the function key in the key frame, taking the function key touched by the user as a target function key, and triggering a function corresponding to the target function key.
In some optional implementations of this embodiment, the apparatus further includes: a sixth detection unit configured to: under the condition that the video is not the promotion video, if the fact that the user clicks the video is detected, pausing playing the video; under the condition that the video is the promotion video, if the fact that a user clicks the video is detected, a promotion information area is presented, and promotion information is displayed in the promotion information area, wherein the promotion information comprises at least one of the following items: the system comprises a deep link, a webpage link and a download link, wherein the deep link is a uniform resource identifier used for linking to the target page in the target application; when the fact that the user clicks the deep link is detected, jumping to a target page of a target application; when the condition that the user clicks the webpage link is detected, displaying a webpage corresponding to the webpage link; and when the user is detected to click the download link, downloading the program package of the target application.
According to the device provided by the embodiment of the application, when the situation that the user triggers the video playing function in the input method interface is detected, the floating window is presented, and the video data is loaded, so that the video is played in the floating window based on the video data. Therefore, the video can be continuously played in the process that the user uses different applications in the terminal equipment, frequent application switching is not needed, and convenience in operation and continuity of video playing are improved.
Fig. 7 is a block diagram illustrating an apparatus 700 for playing video according to an exemplary embodiment, where the apparatus 700 may be an intelligent terminal or a server. For example, the apparatus 700 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 7, apparatus 700 may include one or more of the following components: a processing component 702, a memory 704, a power component 706, a multimedia component 708, an audio component 710, an input/output (I/O) interface 712, a sensor component 714, and a communication component 716.
The processing component 702 generally controls overall operation of the device 700, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing element 702 may include one or more processors 720 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 702 may include one or more modules that facilitate interaction between the processing component 702 and other components. For example, the processing component 702 may include a multimedia module to facilitate interaction between the multimedia component 708 and the processing component 702.
The memory 704 is configured to store various types of data to support operations at the apparatus 700. Examples of such data include instructions for any application or method operating on device 700, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 704 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 706 provides power to the various components of the device 700. The power components 706 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 700.
The multimedia component 708 includes a screen that provides an output interface between the device 700 and a user as described above. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of the touch or slide action but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 708 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 700 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 710 is configured to output and/or input audio signals. For example, audio component 710 includes a Microphone (MIC) configured to receive external audio signals when apparatus 700 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in the memory 704 or transmitted via the communication component 716. In some embodiments, audio component 710 also includes a speaker for outputting audio signals.
The I/O interface 712 provides an interface between the processing component 702 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 714 includes one or more sensors for providing status assessment of various aspects of the apparatus 700. For example, sensor assembly 714 may detect an open/closed state of device 700, the relative positioning of components, such as a display and keypad of apparatus 700, the change in position of apparatus 700 or a component of apparatus 700, the presence or absence of user contact with apparatus 700, the orientation or acceleration/deceleration of apparatus 700, and the change in temperature of apparatus 700. The sensor assembly 714 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 714 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 714 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 716 is configured to facilitate wired or wireless communication between the apparatus 700 and other devices. The apparatus 700 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 716 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 716 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 700 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 704 comprising instructions, executable by the processor 720 of the device 700 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Fig. 8 is a schematic diagram of a server in some embodiments of the present application. The server 800, which may vary significantly depending on configuration or performance, may include one or more Central Processing Units (CPUs) 822 (e.g., one or more processors) and memory 832, one or more storage media 830 (e.g., one or more mass storage devices) storing applications 842 or data 844. Memory 832 and storage medium 830 may be, among other things, transient or persistent storage. The program stored in the storage medium 830 may include one or more modules (not shown), each of which may include a series of instruction operations for the server. Still further, a central processor 822 may be provided in communication with the storage medium 830 for executing a series of instruction operations in the storage medium 830 on the server 800.
The server 800 may also include one or more power supplies 826, one or more wired or wireless network interfaces 850, one or more input/output interfaces 858, one or more keyboards 856, and/or one or more operating systems 841, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, etc.
A non-transitory computer readable storage medium having instructions therein which, when executed by a processor of a device (smart terminal or server), enable the device to perform a video playback method, the method comprising: when detecting that a user triggers a video playing function in an input method interface, presenting a floating window and loading video data; playing a video in the floating window based on the video data.
Optionally, the apparatus described above being configured to execute the one or more programs by the one or more processors includes instructions for: continuously presenting the floating window and continuing to play the video in the floating window upon detecting that a user performs the following: exiting the input method interface, performing an operation in the input method interface, exiting the current application, switching the current application to another application, performing an operation in the current application, touching a physical return key or a virtual return key in the device.
Optionally, the apparatus described above being configured to execute the one or more programs by the one or more processors includes instructions for: when detecting that a user triggers a retraction function of the floating window, pausing playing the video, retracting the floating window and switching the floating window into an icon; and when the icon is detected to be touched by the user, switching the icon into the floating window, and continuing to play the video.
Optionally, the apparatus described above being configured to execute the one or more programs by the one or more processors includes instructions for: when the floating window is in a full-screen mode, if an entity return key or a virtual return key in a user touch device is detected, switching the full-screen mode into a non-full-screen mode; and when the floating window is in the non-full screen mode, if detecting that a user touches an entity return key or a virtual return key in the equipment, keeping the non-full screen mode.
Optionally, the apparatus described above being configured to execute the one or more programs by the one or more processors includes instructions for: and when the current input scene is an instant messaging scene, if the fact that a user executes preset operation on the video played in the floating window is detected, sending or sharing the related information of the video as an instant messaging message.
Optionally, the preset operation includes at least one of: the video is pressed for a long time and dragged to an input frame in the input method interface, a key used for sending or sharing the video is triggered, and voice used for sending or sharing the video is input; the related information includes at least one of: the video data, page links of the video, and cover pages of the video.
Optionally, the apparatus described above being configured to execute the one or more programs by the one or more processors includes instructions for: after the video playing is finished, the video is played again; and when detecting that the user executes the video switching operation, loading the video data of the next video and playing the next video.
Optionally, the video playing function is triggered by the following steps: when detecting that a user triggers a video playing function key in an input method interface, triggering a video playing function; or when detecting that a user inputs target content in a coding input mode or a voice input mode, triggering a video playing function, wherein the target content is used for indicating video playing; or when the user is detected to have the intention of playing, sending or sharing the video, the video playing function is triggered.
Optionally, the apparatus described above being configured to execute the one or more programs by the one or more processors includes instructions for: during the process of playing the video, identifying at least one of the following items in real time: the position of the content in the screen and the color of the current input interface; adjusting the position and the size of the floating window based on the position of the content; and adjusting the background color of the floating window based on the color of the current input interface.
Optionally, the floating window is configured with a key frame, the key frame including an internal key frame located inside the floating window and/or an external key frame located outside the floating window, and function keys are displayed in the key frame, the function keys including at least one of: a video switching key, a mute key, a zoom key, a full-screen key, a floating window closing key, and a floating window retracting key; and the one or more programs of the apparatus described above, configured to be executed by the one or more processors, further include instructions for: when it is detected that the user touches a function key in the key frame, taking the function key touched by the user as a target function key and triggering the function corresponding to the target function key.
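For illustration, the dispatch from a touched function key to its corresponding action can be modeled as a simple lookup; the FunctionKey and KeyFrameDispatcher names are hypothetical:

```kotlin
// Hypothetical sketch: the function key touched by the user becomes the
// target function key, and the action mapped to it is triggered.

enum class FunctionKey { SWITCH_VIDEO, MUTE, ZOOM, FULL_SCREEN, CLOSE_WINDOW, RETRACT_WINDOW }

class KeyFrameDispatcher(private val actions: Map<FunctionKey, () -> Unit>) {
    fun onKeyTouched(targetKey: FunctionKey) {
        actions[targetKey]?.invoke()   // trigger the function of the target key, if mapped
    }
}
```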
Optionally, the one or more programs of the apparatus described above, configured to be executed by the one or more processors, further include instructions for: when the video is not a promotional video, pausing playback of the video if it is detected that the user clicks the video; when the video is a promotional video, presenting a promotion information area if it is detected that the user clicks the video, and displaying promotion information in the promotion information area, the promotion information including at least one of: a deep link, a web page link, and a download link, the deep link being a uniform resource identifier for linking to a target page in a target application; jumping to the target page of the target application when it is detected that the user clicks the deep link; presenting a web page corresponding to the web page link when it is detected that the user clicks the web page link; and downloading a program package of the target application when it is detected that the user clicks the download link.
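A hedged sketch of this click handling for promotional and non-promotional videos follows, using hypothetical PromotionLink and VideoClickHandler types and callback parameters for the jump, web-page, and download actions:

```kotlin
// Hypothetical sketch: a non-promotional video is simply paused on click,
// while a promotional video opens a promotion area whose links lead to a
// target page (deep link), a web page, or a package download.

sealed class PromotionLink {
    data class DeepLink(val uri: String) : PromotionLink()       // URI into the target application
    data class WebLink(val url: String) : PromotionLink()
    data class DownloadLink(val url: String) : PromotionLink()
}

class VideoClickHandler(
    private val pauseVideo: () -> Unit,
    private val showPromotionArea: (List<PromotionLink>) -> Unit,
    private val openTargetPage: (String) -> Unit,
    private val openWebPage: (String) -> Unit,
    private val downloadPackage: (String) -> Unit
) {
    fun onVideoClicked(isPromotion: Boolean, links: List<PromotionLink>) {
        if (!isPromotion) pauseVideo() else showPromotionArea(links)
    }

    fun onLinkClicked(link: PromotionLink) = when (link) {
        is PromotionLink.DeepLink -> openTargetPage(link.uri)        // jump to the target page
        is PromotionLink.WebLink -> openWebPage(link.url)            // present the web page
        is PromotionLink.DownloadLink -> downloadPackage(link.url)   // download the program package
    }
}
```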
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations that follow the general principles of the present application and include such departures from the present disclosure as come within known or customary practice in the art to which the application pertains. It is intended that the specification and examples be considered exemplary only, with the true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.
The video playing method, the video playing apparatus, and the apparatus for playing video provided by the present application have been described in detail above. Specific examples are used herein to illustrate the principles and embodiments of the present application, and the descriptions of the foregoing embodiments are only intended to help understand the method and its core ideas. Meanwhile, a person skilled in the art may, based on the ideas of the present application, make changes to the specific embodiments and the scope of application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. A video playback method, the method comprising:
when detecting that a user triggers a video playing function in an input method interface, presenting a floating window and loading video data;
playing a video in the floating window based on the video data.
2. The method of claim 1, further comprising:
continuously presenting the floating window and continuing to play the video in the floating window when it is detected that the user performs any of the following operations: exiting the input method interface, performing an operation in the input method interface, exiting the current application, switching from the current application to another application, performing an operation in the current application, or touching a physical return key or a virtual return key of the device.
3. The method of claim 1, further comprising:
when it is detected that the user triggers a retraction function of the floating window, pausing the video, retracting the floating window, and switching the floating window into an icon;
and when it is detected that the user touches the icon, switching the icon back into the floating window and continuing to play the video.
4. The method of claim 1, further comprising:
when the current input scene is an instant messaging scene, sending or sharing related information of the video as an instant messaging message if it is detected that the user performs a preset operation on the video played in the floating window.
5. The method of claim 1, further comprising:
when the floating window is in a full-screen mode, switching the full-screen mode into a non-full-screen mode if it is detected that the user touches a physical return key or a virtual return key of the device;
and when the floating window is in the non-full-screen mode, keeping the non-full-screen mode if it is detected that the user touches the physical return key or the virtual return key of the device.
6. The method of claim 1, wherein the floating window is configured with a key frame, the key frame comprises an internal key frame located inside the floating window and/or an external key frame located outside the floating window, function keys are displayed in the key frame, and the function keys comprise at least one of: a video switching key, a mute key, a zoom key, a full-screen key, a floating window closing key, and a floating window retracting key; and
the method further comprises:
when it is detected that the user touches a function key in the key frame, taking the function key touched by the user as a target function key, and triggering a function corresponding to the target function key.
7. The method of claim 1, wherein after the playing of the video in the floating window based on the video data, the method further comprises:
when the video is not a promotional video, pausing playback of the video if it is detected that the user clicks the video;
when the video is a promotional video, presenting a promotion information area if it is detected that the user clicks the video, and displaying promotion information in the promotion information area, wherein the promotion information comprises at least one of: a deep link, a web page link, and a download link, and the deep link is a uniform resource identifier for linking to a target page in a target application;
jumping to the target page of the target application when it is detected that the user clicks the deep link;
presenting a web page corresponding to the web page link when it is detected that the user clicks the web page link;
and downloading a program package of the target application when it is detected that the user clicks the download link.
8. A video playback apparatus, comprising:
a display unit configured to present a floating window and load video data when it is detected that a user triggers a video playing function in an input method interface; and
a playback unit configured to play a video in the floating window based on the video data.
9. An apparatus for playing video, comprising a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs comprising instructions for:
when detecting that a user triggers a video playing function in an input method interface, presenting a floating window and loading video data;
playing a video in the floating window based on the video data.
10. A computer-readable medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1 to 7.
CN202010292066.8A 2020-04-14 2020-04-14 Video playing method and device for playing video Active CN113542886B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010292066.8A CN113542886B (en) 2020-04-14 2020-04-14 Video playing method and device for playing video


Publications (2)

Publication Number Publication Date
CN113542886A (en) 2021-10-22
CN113542886B (en) 2023-09-22

Family

ID=78119998

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010292066.8A Active CN113542886B (en) 2020-04-14 2020-04-14 Video playing method and device for playing video

Country Status (1)

Country Link
CN (1) CN113542886B (en)



Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110214084A1 (en) * 2008-10-02 2011-09-01 Youn Soo Kim Letter input method
CN101593075A (en) * 2009-07-10 2009-12-02 张向阳 A kind of advertising display device that is integrated in the input method
CN102147665A (en) * 2010-02-05 2011-08-10 北京搜狗科技发展有限公司 Method and device for displaying information in input process and input method system
CN102314462A (en) * 2010-06-30 2012-01-11 北京搜狗科技发展有限公司 Method and system for obtaining navigation result on input method platform
CN102789366A (en) * 2012-07-27 2012-11-21 上海量明科技发展有限公司 Implementation method, client and system for rich media input method tool
WO2016192068A1 (en) * 2015-06-04 2016-12-08 武克易 Smart television shopping method, apparatus, and device
CN105045466A (en) * 2015-07-17 2015-11-11 百度在线网络技术(北京)有限公司 Information provision method and apparatus
CN108366298A (en) * 2018-03-28 2018-08-03 广东欧珀移动通信有限公司 Video broadcasting method, mobile terminal and computer readable storage medium
CN109819305A (en) * 2018-12-28 2019-05-28 深圳豪客互联网有限公司 Video playing control method and device in a kind of application program
CN110333814A (en) * 2019-05-31 2019-10-15 华为技术有限公司 A kind of method and electronic equipment of sharing contents

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113873316A (en) * 2021-11-04 2021-12-31 上海哔哩哔哩科技有限公司 Live broadcast room video playing method and device
CN114040243A (en) * 2021-11-04 2022-02-11 上海哔哩哔哩科技有限公司 Live broadcast room video playing method and device
CN114040243B (en) * 2021-11-04 2023-08-04 上海哔哩哔哩科技有限公司 Live broadcasting room video playing method and device
CN113873316B (en) * 2021-11-04 2024-02-27 上海哔哩哔哩科技有限公司 Live broadcasting room video playing method and device
CN115086739A (en) * 2022-06-07 2022-09-20 北京字跳网络技术有限公司 Video processing method, device, equipment and storage medium
CN115086739B (en) * 2022-06-07 2024-01-30 北京字跳网络技术有限公司 Video processing method, device, equipment and storage medium
CN115695889A (en) * 2022-09-30 2023-02-03 聚好看科技股份有限公司 Display device and floating window display method
CN117319755A (en) * 2023-11-28 2023-12-29 深圳大智软件技术有限公司 Short video communication information sharing method and system
CN117319755B (en) * 2023-11-28 2024-05-17 深圳大智软件技术有限公司 Short video communication information sharing method and system

Also Published As

Publication number Publication date
CN113542886B (en) 2023-09-22

Similar Documents

Publication Publication Date Title
CN107153541B (en) Browsing interaction processing method and device
CN105955607B (en) Content sharing method and device
CN113542886B (en) Video playing method and device for playing video
CN106294609B (en) Page loading method and device
CN116055610B (en) Method for displaying graphical user interface and mobile terminal
US20210405952A1 (en) Screen projection method, screen projection device, and storage medium
CN111680521A (en) Translation processing method and device and translation processing device
US11704001B2 (en) Method and device for displaying web page content
CN111381739B (en) Application icon display method and device, electronic equipment and storage medium
EP3561691A1 (en) Method and apparatus for displaying webpage content
CN105786507B (en) Display interface switching method and device
CN106528735B (en) Method and device for controlling browser to play media resources
CN109358785B (en) Theme preview method and device
CN110968364B (en) Method and device for adding shortcut plugins and intelligent device
CN112584222A (en) Video processing method and device for video processing
CN107729098B (en) User interface display method and device
CN109521938B (en) Method and device for determining data evaluation information, electronic device and storage medium
CN110321042B (en) Interface information display method and device and electronic equipment
CN112068764B (en) Language switching method and device for language switching
CN111859209A (en) Content display method, device and storage medium
CN106325712B (en) Terminal display control method and device and terminal
CN108829473B (en) Event response method, device and storage medium
CN111092971A (en) Display method and device for displaying
CN107168631B (en) Application program closing method and device and terminal electronic equipment
CN112115947A (en) Text processing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant