CN113542886B - Video playing method and device for playing video


Info

Publication number: CN113542886B
Authority: CN (China)
Prior art keywords: video, floating window, user, key, playing
Legal status: Active
Application number: CN202010292066.8A
Other languages: Chinese (zh)
Other versions: CN113542886A
Inventors: 姜盛禄, 杨斌, 刘贲, 周冰洋, 赵涵思, 张华婷
Current Assignee: Beijing Sogou Technology Development Co Ltd
Original Assignee: Beijing Sogou Technology Development Co Ltd
Application filed by Beijing Sogou Technology Development Co Ltd
Priority to CN202010292066.8A
Publication of CN113542886A
Application granted
Publication of CN113542886B

Classifications

    • H04N 21/472 — End-user interface for requesting content, additional data or services; end-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N 21/4312 — Generation of visual interfaces for content selection or interaction involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/4316 — Generation of visual interfaces for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H04N 21/4788 — Supplemental services communicating with other users, e.g. chatting
    • H04N 21/812 — Monomedia components involving advertisement data
    • H04N 21/8586 — Linking data to content, e.g. by linking a URL to a video object, by creating a hotspot, by using a URL

Landscapes: Engineering & Computer Science; Multimedia; Signal Processing; Business, Economics & Management; Marketing; Databases & Information Systems; Human Computer Interaction; General Engineering & Computer Science; User Interface Of Digital Computer

Abstract

The embodiments of the present application disclose a video playing method, a video playing device, and a device for playing video. An embodiment of the method comprises: when detecting that a user triggers a video playing function in an input method interface, presenting a floating window and loading video data; and playing video in the floating window based on the video data. This embodiment improves the convenience with which the user operates the device and the continuity of video playing.

Description

Video playing method and device for playing video
Technical Field
The embodiments of the present application relate to the field of computer technology, and in particular to a video playing method, a video playing device, and a device for playing video.
Background
With the development of Internet technology, video applications (apps) are becoming increasingly common, such as short-video applications, live-streaming applications, on-demand video applications, and the like. A user may watch video by installing such an application on a device.
In the prior art, a user needs to open an installed video application to watch a video. If another application needs to be used while the video is being watched, for example, to reply to a message in an instant messaging application or to write a document in a document editing application, the user has to switch from the video application to that application and, after finishing with it, switch back to the video application to continue playing the video. This conventional approach therefore requires frequent application switching by the user, which makes operation inconvenient; it also cannot keep the video playing while other applications are in use, so the continuity of video playing is poor.
Disclosure of Invention
The embodiments of the present application provide a video playing method, a video playing device, and a device for playing video, which are used to solve the technical problems of inconvenient operation and poor continuity of video playing in the prior art.
In a first aspect, an embodiment of the present application provides a video playing method, where the method includes: when detecting that a user triggers a video playing function in an input method interface, presenting a floating window and loading video data; video is played in the floating window based on the video data.
In a second aspect, an embodiment of the present application provides a video playing device, including: a presentation unit configured to present a floating window and load video data when detecting that a user triggers a video playing function in an input method interface; and a playing unit configured to play video in the floating window based on the video data.
In a third aspect, embodiments of the present application provide an apparatus for playing video, comprising a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by one or more processors, the one or more programs comprising instructions for: when detecting that a user triggers a video playing function in an input method interface, presenting a floating window and loading video data; video is played in the floating window based on the video data.
In a fourth aspect, embodiments of the present application provide a computer readable medium having stored thereon a computer program which, when executed by a processor, implements a method as described in the first aspect above.
According to the video playing method, the video playing device, and the device for playing video provided by the embodiments of the present application, when it is detected that a user triggers the video playing function in an input method interface, a floating window is presented and video data is loaded, so that video is played in the floating window based on the video data. Therefore, the video can be played continuously while the user uses different applications on the terminal device, without frequent application switching, which improves the convenience of operation and the continuity of video playing.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the accompanying drawings in which:
FIG. 1 is a flow chart of one embodiment of a video playback method in accordance with the present application;
FIG. 2 is a schematic diagram of a floating window presentation process according to the present application;
FIG. 3 is a schematic diagram of a floating window stowing process according to the present application;
FIG. 4 is a schematic diagram of a video transmission process according to the present application;
FIG. 5 is a flowchart of yet another embodiment of a video playing method according to the present application;
FIG. 6 is a schematic structural diagram of an embodiment of a video playing device according to the present application;
FIG. 7 is a schematic structural diagram of an apparatus for playing video according to the present application;
FIG. 8 is a schematic diagram of a server in some embodiments according to the present application.
Detailed Description
The application is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be noted that, for convenience of description, only the portions related to the present application are shown in the drawings.
It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other. The application will be described in detail below with reference to the drawings in connection with embodiments.
Referring to fig. 1, a flow 100 of one embodiment of a video playing method according to the present application is shown. The video playing method can be run on various electronic devices, including but not limited to: servers, smartphones, tablet computers, e-book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop computers, car computers, desktop computers, set-top boxes, smart televisions, wearable devices, and the like.
The input method application mentioned in the embodiments of the present application can support a plurality of input methods. An input method is an encoding method used to input various symbols into electronic devices such as computers and mobile phones, and a user can conveniently input a desired character or character string into the electronic device using an input method application. It should be noted that, in the embodiments of the present application, in addition to common Chinese input methods (such as the Pinyin input method, the Wubi input method, the Zhuyin input method, the voice input method, and the handwriting input method), the input method may also support input methods for other languages (such as an English input method, a Japanese hiragana input method, and a Korean input method); neither the input method nor its language type is limited here.
The video playing method in this embodiment may include the following steps:
step 101, when detecting that a user triggers a video playing function in an input method interface, presenting a floating window and loading video data.
In the present embodiment, various types of client applications, such as an input method application, an instant messaging application, a document editing application, an audio-video application, and the like, may be installed in the execution body of the video playback method (e.g., the above-described electronic device). The input method application can be configured with a video playing function. The video playing function supports calling video service through the input method application and playing video, so that a user can watch the video in the input process of various scenes. Videos here include, but are not limited to, short videos, movie dramas, live broadcasts, and the like.
In this embodiment, when it is detected that the user triggers the video playing function in the input method interface, the executing body may present a floating window in the current interface. The input method interface is an interface of the input method application. The input method interface may include a keyboard region and various function keys, such as a voice input function key, an applet function key, a search function key, an expression input function key, a video playing function key, and the like. When the user triggers (e.g. clicks) the video playing function key, the video playing function of the input method application can be triggered.
It should be noted that the user may trigger the video playing function in other manners. As an example, the user may trigger the video playing function by inputting content in the input method application: if the user inputs target content, such as "video playing", through an encoding input mode or a voice input mode, the video playing function can be triggered, where the target content is used to indicate video playback. In addition, the video playing function of the input method application can be triggered by detecting the input intention of the user; the function is triggered when an intention of the user to play, send, or share a video is detected. As an example, when the current input scenario is an instant messaging scenario and the time the user has waited for the opposite end to reply after sending an instant message exceeds a preset duration, the video playing function can be triggered. Before triggering the video playing function, the user may be asked whether triggering is needed. The triggering path of the video playing function can be preset as required and is not limited here.
As an example, fig. 2 shows a schematic diagram of a floating window presentation process. As shown in fig. 2, a user communicates with another user in an instant messaging application. And in the communication process, the input method is used for inputting the content. The session interface with the user is shown at 201. Upon content input, an input method interface 202 is presented in the session interface. The input method interface comprises a keyboard area and various function keys, such as an expression input function key, a keyboard input function key, a voice input function key, a small program function key, a video playing function key and the like. When the user triggers (e.g., clicks) a video play function button (as shown by reference numeral 203) in the input method interface, the video play function of the input method application may be triggered, so that a floating window (as shown by reference numeral 204) may be presented in the current interface.
In this embodiment, a floating window may be used to present the video. The floating window may be configured either as a fixed-position floating window or as a movable floating window. When configured as a fixed-position floating window, the floating window may be positioned at a location on the screen where it is unlikely to obscure the screen content, such as the upper right corner of the screen. When configured as a movable floating window, the user can drag the floating window to any position on the screen as desired, thereby avoiding obscuring content in the current input scene. In addition, when the floating window is configured as movable, the electronic device can automatically identify the position of the content (such as text information) on the screen and automatically adjust the position of the floating window to avoid blocking the on-screen content as much as possible.
In addition, the floating window may take various forms. For example, the background of the floating window may be set to a translucent color or a solid color, and the background color or pattern may also be personalized by the user. By default, the background color can be set to semi-transparent black, which prevents the floating window from obscuring the interface content while ensuring that the function keys in the floating window remain recognizable over different interfaces. In addition, the background color of the floating window can be adaptively adjusted according to the color of the current interface, so that the function keys in the floating window remain recognizable when applications are switched and the floating window does not obscure content.
The floating window may be set to a fixed size or to a variable size. When the size of the floating window is fixed, in order to prevent the floating window from blocking what the user is viewing during input (such as chatting with the opposite-end user in an instant messaging application), the width of the floating window can be limited to 50% of the screen width, so that the user can conveniently check the instant messaging messages sent by the opposite-end user. Meanwhile, the height of the floating window can be set based on the position of the input method interface, so that at least one line of information is kept between the bottom of the floating window and the top of the input method interface, allowing the user to conveniently read the last line of information in an instant messaging scene. When the size of the floating window is variable, the user can change the size of the floating window through a stretching operation, and the electronic device can automatically adjust the size of the floating window by detecting the position of the content in the current interface, so as to avoid blocking the on-screen content as much as possible.
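As a minimal illustrative sketch only, the following Kotlin snippet shows how such a floating window might be created on Android; the platform choice, the helper name showFloatingWindow, and the specific dimensions and colors are assumptions for illustration, not part of the disclosed method.

```kotlin
import android.content.Context
import android.graphics.Color
import android.graphics.PixelFormat
import android.view.Gravity
import android.view.View
import android.view.WindowManager

// Hypothetical helper: attaches a translucent overlay window whose width is limited to
// roughly 50% of the screen, anchored near the upper right corner by default.
fun showFloatingWindow(context: Context, content: View): View {
    val wm = context.getSystemService(Context.WINDOW_SERVICE) as WindowManager
    val screenWidth = context.resources.displayMetrics.widthPixels
    val params = WindowManager.LayoutParams(
        screenWidth / 2,                                      // width limited to ~50% of the screen
        WindowManager.LayoutParams.WRAP_CONTENT,
        WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,  // requires the floating window permission
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,        // input method keeps keyboard focus
        PixelFormat.TRANSLUCENT
    ).apply { gravity = Gravity.TOP or Gravity.END }
    content.setBackgroundColor(Color.argb(128, 0, 0, 0))      // default semi-transparent black background
    wm.addView(content, params)
    return content
}
```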
In this embodiment, the execution body may be communicatively connected to the server through an input method application. The server herein may be a server for providing video data, such as an input method server, a server of a third party video platform, etc. When detecting that the user triggers the video playing function in the input method interface, the executing body can send a video data obtaining request to the server to obtain video data. After the video data is acquired, the video data can be loaded.
In practice, each time the executing body sends a video data acquisition request, one or more (e.g., 10) video addresses may be acquired. A video address may include a URL (Uniform Resource Locator) of the cover of the video and a URL of the video data. When the executing body acquires a video address, it can access the video address and acquire the cover and the video data. The acquired cover and video data may then be stored in a cache. Finally, the cover and video data of a video in the cache may be loaded, thereby playing the video. During this process, the cover of the next video can be read in advance, so that the cover of the next video can be displayed in time when the user switches videos, avoiding picture stuttering.
It should be noted that, the executing body may send a video data acquisition request to the server after playing a preset number of videos or after detecting that the number of videos not played in the buffer is smaller than a preset value, so as to acquire new video data. The requested video data may be video data of a video source of interest to the user, or may be random video, which is not limited in this embodiment.
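A minimal Kotlin sketch of the batching and prefetching behaviour described above; the data class VideoAddress, the batch size, and the fetchBatch callback are assumptions (the actual request format and server interface are not specified by the description), and networking and threading concerns are omitted.

```kotlin
import java.net.URL
import java.util.ArrayDeque

// Hypothetical shape of one "video address": a cover URL plus a video data URL.
data class VideoAddress(val coverUrl: String, val videoUrl: String)

class VideoCache(private val fetchBatch: () -> List<VideoAddress>) {
    private val queue = ArrayDeque<VideoAddress>()
    private val coverCache = mutableMapOf<String, ByteArray>()

    // Request a new batch (e.g. 10 addresses) when the number of unplayed videos
    // in the cache drops below a preset value.
    fun ensureBuffered(minRemaining: Int = 3) {
        if (queue.size < minRemaining) queue.addAll(fetchBatch())
    }

    // Return the next video to play and pre-read the cover of the one after it,
    // so its cover can be shown immediately when the user switches videos.
    fun next(): VideoAddress? {
        ensureBuffered()
        val current = queue.poll() ?: return null
        queue.peek()?.let { upcoming ->
            coverCache.getOrPut(upcoming.coverUrl) { URL(upcoming.coverUrl).readBytes() }
        }
        return current
    }
}
```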
In some optional implementations of this embodiment, when the user triggers the video playing function, it may first be detected whether the user has granted the floating window permission to the input method application. If it has been granted, the floating window can be presented directly. If not, a prompt message can be presented to ask the user to enable the floating window permission. For example, the prompt may be presented as a pop-up, such as "The floating window permission needs to be enabled for the video service. Please check [floating window permission] or [allow display over other apps]." In addition, since the floating window permission has usually not been granted when the user triggers the video playing function for the first time, the prompt message can be presented directly on the first trigger.
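On Android, the permission check and prompt described above could look roughly like the following sketch; Settings.canDrawOverlays and ACTION_MANAGE_OVERLAY_PERMISSION are the standard Android APIs for the "display over other apps" permission, while the prompt wording is only illustrative.

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri
import android.provider.Settings
import android.widget.Toast

// Check the floating window ("display over other apps") permission before presenting
// the floating window; if it has not been granted, prompt the user and open settings.
fun ensureOverlayPermission(context: Context): Boolean {
    if (Settings.canDrawOverlays(context)) return true
    Toast.makeText(
        context,
        "The floating window permission needs to be enabled for the video service.",
        Toast.LENGTH_LONG
    ).show()
    val intent = Intent(
        Settings.ACTION_MANAGE_OVERLAY_PERMISSION,
        Uri.parse("package:" + context.packageName)
    ).addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
    context.startActivity(intent)
    return false
}
```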
In some optional implementations of this embodiment, when the user triggers the video playing function for the first time, after the floating window is presented, a guide page may first be displayed in the floating window to introduce the floating window and the usage of each function key in it, such as how to mute, how to switch videos, how to enter full screen, how to stow the floating window, and so on. This helps the user operate the video playing function correctly on first use.
Step 102, playing video in a floating window based on the video data.
In this embodiment, the executing body may play the video frames in the video data in the floating window based on the loaded video data, so as to realize video playing.
Here, if the video currently playing in the floating window finishes and the user has not switched videos, the video data of the next video can be loaded automatically and the next video played automatically. Alternatively, the finished video may be replayed until it is detected that the user performs a video switching operation, at which point the video data of the next video is loaded and the next video is played.
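The completion behaviour described here could be wired up as in the following Kotlin sketch, assuming an Android VideoView is used as the player inside the floating window (a platform assumption); nextVideoUri and autoAdvance are hypothetical names.

```kotlin
import android.net.Uri
import android.widget.VideoView

// When the current video finishes and the user has not switched videos, either load
// and play the next video automatically or replay the current one until the user
// performs a video switching operation.
fun bindCompletionBehaviour(player: VideoView, nextVideoUri: () -> Uri?, autoAdvance: Boolean) {
    player.setOnCompletionListener {
        if (autoAdvance) {
            nextVideoUri()?.let { uri ->
                player.setVideoURI(uri)   // load the video data of the next video
                player.start()            // automatically play the next video
            }
        } else {
            player.start()                // replay the finished video
        }
    }
}
```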
In some optional implementations of this embodiment, during video playing, if it is detected that the user exits the input method interface, performs operations in the input method interface, exits the current application, switches from the current application to another application, performs operations in the current application, or touches an entity return key or a virtual return key on the device, the floating window may continue to be presented and the video may continue to be played in the floating window. Therefore, while the floating window plays the video, the user can perform input operations normally and can also operate within the current application, and the video keeps playing even if the user exits the current input scene or switches input scenes, which improves the continuity of video playing. It should be noted that, when the user performs voice input, the user may be prompted that a video is playing; meanwhile, the playing video may be automatically muted or paused, or it may simply continue to play normally.
In some alternative implementations of the present embodiment, the floating window may be configured with a key frame. The key frame can be an internal key frame located inside the floating window, an external key frame located outside the floating window, or both an internal and an external key frame at the same time. Function keys may be displayed in the key frame. The function keys may include, but are not limited to, at least one of: a video switching key, a mute key, a zoom key, a full screen key, a floating window close key, and a floating window stow key. The video switching keys here may include a key for switching to the previous video and a key for switching to the next video.
During video playing, when the executing body detects that the user touches a function key in the key frame, it can take the touched function key as a target function key and trigger the function corresponding to the target function key. For example, when the user clicks the close key, the floating window may be closed. When the user clicks the full screen key, the floating window may be switched to full-screen mode, so that the playing video is displayed full screen.
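A minimal Kotlin sketch of dispatching touches on the function keys in the key frame; the enum values, the FloatingWindowController interface, and its method names are illustrative assumptions rather than part of the disclosure.

```kotlin
// Hypothetical identifiers for the function keys displayed in the key frame.
enum class FloatingKey { PREVIOUS, NEXT, MUTE, ZOOM, FULL_SCREEN, CLOSE, STOW }

// Hypothetical controller interface for the floating window.
interface FloatingWindowController {
    fun playPrevious()
    fun playNext()
    fun toggleMute()
    fun zoom()
    fun enterFullScreen()
    fun close()
    fun stowToIcon()
}

// The touched function key is taken as the target function key and its function is triggered.
fun onFloatingKeyTouched(key: FloatingKey, controller: FloatingWindowController) {
    when (key) {
        FloatingKey.PREVIOUS    -> controller.playPrevious()
        FloatingKey.NEXT        -> controller.playNext()
        FloatingKey.MUTE        -> controller.toggleMute()
        FloatingKey.ZOOM        -> controller.zoom()
        FloatingKey.FULL_SCREEN -> controller.enterFullScreen()  // display the video full screen
        FloatingKey.CLOSE       -> controller.close()            // close the floating window
        FloatingKey.STOW        -> controller.stowToIcon()       // stow the floating window into an icon
    }
}
```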
In some optional implementations of this embodiment, when the floating window is in full-screen mode, if it is detected that the user touches an entity return key (such as a home key) or a virtual return key in the device, the full-screen mode may be switched to non-full-screen mode. It should be noted that, when the floating window is in the full screen mode, the floating window may further include a zoom key. When the user is detected to touch the zoom key, the full-screen mode may also be switched to the zoom mode.
In some optional implementations of this embodiment, when the floating window is in the non-full screen mode, if an entity return key or a virtual return key in the user touch device is detected, the non-full screen mode may be maintained (i.e., the floating window is not closed). Meanwhile, in response to an operation of the user touching the entity return key or the virtual return key, an operation of exiting the current application interface may be performed. Therefore, under the condition that the user exits from the current input scene, the presentation of the floating window and the playing of the video in the floating window can be maintained, and the continuity of video playing is improved.
In some alternative implementations of the present embodiment, the floating window may have a stow function. When detecting that the user triggers the stow function, the executing body can pause the video, stow the floating window, and switch the floating window into an icon. The icon may be a fixed-position icon or a movable icon. In addition, when the user touches the icon, the icon may be switched back into the floating window and the video may continue to play. In this way, the floating window can be temporarily stowed when it blocks content in the display interface and presented again to continue playing the video once the user has browsed the page content, which avoids frequently closing or moving the floating window and improves the convenience of operation.
By way of example, fig. 3 shows a schematic diagram of a floating window stowing process. As shown in fig. 3, a user communicates with another user in an instant messaging application. And watch the video played in the floating window in the communication process. The floating window key frame includes a floating window retracting key as indicated by reference numeral 301. After the user clicks the floating window retract button, the floating window may be retracted and switched to a movable icon, as indicated by reference numeral 302. Thus, the user can browse the content in the area covered by the floating window. When the user clicks the movable icon, the floating window can be restored, and the video can be played continuously.
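The stow/restore behaviour illustrated in FIG. 3 could be sketched as follows in Kotlin, again assuming Android views; the class and callback names are hypothetical.

```kotlin
import android.view.View

// Stow: pause playback and swap the floating window for a small (movable) icon.
// Restore: hide the icon, show the floating window again, and continue playing.
class StowController(
    private val floatingWindow: View,
    private val icon: View,
    private val pauseVideo: () -> Unit,
    private val resumeVideo: () -> Unit
) {
    fun stow() {
        pauseVideo()
        floatingWindow.visibility = View.GONE
        icon.visibility = View.VISIBLE
    }

    fun restore() {
        icon.visibility = View.GONE
        floatingWindow.visibility = View.VISIBLE
        resumeVideo()
    }
}
```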
In some optional implementations of this embodiment, when the current input scenario is an instant communication scenario (such as a scenario of chatting with an opposite user, group chat, etc.), if it is detected that the user performs a preset operation on the video played in the floating window, relevant information of the video is sent or shared as an instant communication message. Optionally, the preset operations may include, but are not limited to, at least one of: long-pressing the video, dragging the video to an input box in the input method interface, triggering a key for sending or sharing the video, and inputting voice for sending or sharing the video. Optionally, the related information may include, but is not limited to, at least one of the following: the video data, a page (e.g., H5 page) link of the video, and a cover of the video.
As an example, fig. 4 shows a schematic diagram of a video transmission process. As shown in fig. 4, a user communicates with another user in an instant messaging application. And watch the video played in the floating window in the communication process. After a user presses the video played in the floating window for a long time and drags the video to an input box in the input method interface, the video can be automatically sent as an instant communication message. It should be noted that, after dragging the video to the input box, the related information of the video may be automatically sent, or the related information of the video may be sent after the user clicks the confirm button or sends the button, which is not limited in this embodiment.
In addition, after the opposite terminal user receives the instant communication message, the video can be watched by clicking the video in the message. In practice, after clicking the video, the opposite terminal user can watch the video in the form of an H5 page; the method can also automatically trigger the video playing function of the input method application of the opposite terminal user, and play the video in the current interface of the opposite terminal user in a floating window mode.
Therefore, the user can input and watch the video simultaneously, and simultaneously, the watched video can be sent or shared to other users in the input process, so that the functions of the input method are enriched, and the convenience of sending or sharing the video is improved.
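Because the floating window is driven by the input method, the sending step could be sketched with the standard Android InputMethodService/InputConnection APIs as below; committing the video's page link as text and triggering the editor's send action are assumptions about one possible implementation, not the only form the related information may take.

```kotlin
import android.inputmethodservice.InputMethodService
import android.view.inputmethod.EditorInfo

// When the preset operation (e.g. dragging the video to the input box) is detected in an
// instant messaging scene, commit the video's page link into the current input box so that
// it can be sent as an instant message.
fun InputMethodService.shareVideoAsMessage(videoPageUrl: String, autoSend: Boolean) {
    val ic = currentInputConnection ?: return
    ic.commitText(videoPageUrl, 1)   // e.g. the H5 page link of the video
    if (autoSend) {
        // Whether to send automatically or wait for the user to tap a confirm/send
        // button is left open by the description.
        ic.performEditorAction(EditorInfo.IME_ACTION_SEND)
    }
}
```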
According to the method provided by the embodiment of the application, when the video playing function in the input method interface is triggered by the user, the floating window is presented, and video data is loaded, so that video is played in the floating window based on the video data. Therefore, the video can be continuously played in the process that the user uses different applications in the terminal equipment, frequent application switching is not needed, and the convenience of operation and the continuity of video playing are improved.
With further reference to fig. 5, a flow 500 of yet another embodiment of a video playback method is shown. The video playing method 500 includes the following steps:
step 501, when detecting that a user triggers a video playing function in an input method interface, presenting a floating window and loading video data.
Step 502, playing video in a floating window based on video data.
Step 501 to step 502 in the present embodiment can refer to step 101 to step 102 in the corresponding embodiment of fig. 1, and are not described herein.
Step 503, if the video is a promotion video and it is detected that the user clicks the video, a promotion information area is presented, and promotion information is presented in the promotion information area.
In this embodiment, the executing body of the video playing method may detect whether the currently played video is a promotion video, that is, a video whose content is promotional content. In practice, video data typically contains a category identification indicating the video category, and the executing body can detect whether the currently played video is a promotion video based on this category identification.
In this embodiment, if it is detected that the user clicks the video when the currently played video is a promotion video, a promotion information area may be presented, and promotion information may be presented in the promotion information area. The promotional information area may be located in any area of the floating window, such as the bottom area, etc.
The promotion information may include at least one of the following: a deep link, a web page link, and a download link. The deep link may be a uniform resource identifier (Uniform Resource Identifier, URI) for linking to a target page in a target application. The web page link may be a link to a web page (e.g., an H5 page). The download link may be a link to the download address of an application.
It should be noted that, in the case that the video played is not a promotional video, if it is detected that the user clicks the video, the playing of the video may be paused.
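A minimal Kotlin sketch of this branching on a click; the field names (categoryId, promotionInfo) and the category value are assumptions about how the category identification might be carried in the video data.

```kotlin
// Hypothetical shape of the loaded video data and its promotion information.
data class PromotionInfo(val deepLink: String?, val webUrl: String?, val downloadUrl: String?)
data class VideoData(val categoryId: String, val promotionInfo: PromotionInfo?)

const val CATEGORY_PROMOTION = "promotion"   // assumed category identification value

// Promotion video: present the promotion information area; otherwise: pause playback.
fun onVideoClicked(
    video: VideoData,
    pausePlayback: () -> Unit,
    showPromotionArea: (PromotionInfo) -> Unit
) {
    val promo = video.promotionInfo
    if (video.categoryId == CATEGORY_PROMOTION && promo != null) {
        showPromotionArea(promo)
    } else {
        pausePlayback()
    }
}
```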
Step 504, when detecting that the user clicks on the deep link, jumping to a target page of the target application.
In this embodiment, the deep link enables a one-click jump from any application to a content page of another application. Therefore, upon detecting that the user clicks the deep link, a jump may be made from the input method application to the target page of the target application.
For example, if the deep link indicates a link to a product page in an e-commerce application, the user may jump directly to the product page when clicking on the deep link. In the process, the user does not need to manually switch, open, search and other operations of the application. The convenience of operation is ensured while richer information is provided for the user.
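On Android, such a deep-link jump is commonly performed with an ACTION_VIEW intent, as in the sketch below; resolving the URI this way and falling back to the web page link when no application can handle it are implementation assumptions.

```kotlin
import android.content.ActivityNotFoundException
import android.content.Context
import android.content.Intent
import android.net.Uri

// Jump from the input method application to the target page of the target application;
// if the deep link cannot be handled, optionally fall back to the web page link.
fun openDeepLink(context: Context, deepLink: String, fallbackWebUrl: String? = null) {
    val intent = Intent(Intent.ACTION_VIEW, Uri.parse(deepLink))
        .addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
    try {
        context.startActivity(intent)   // e.g. jump directly to a product page
    } catch (e: ActivityNotFoundException) {
        fallbackWebUrl?.let { url ->
            context.startActivity(
                Intent(Intent.ACTION_VIEW, Uri.parse(url)).addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
            )
        }
    }
}
```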
Step 505, when it is detected that the user clicks the web page link, a web page corresponding to the web page link is presented.
In this embodiment, when the execution body detects that the user clicks on the web page link, the execution body may present the web page corresponding to the web page link. By clicking the webpage links to present the webpages corresponding to the webpage links, more information can be provided for the user, and the richness of the information is improved.
Step 506, when detecting that the user clicks the download link, downloading the package of the target application.
In this embodiment, the executing body may download the package of the target application when detecting that the user clicks the download link. Therefore, more applications required by the user can be provided for the user under the scene that the user inputs and views the video, and the user can download and use conveniently.
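One way to realize this on Android is the system DownloadManager, as in the following sketch; using DownloadManager and the chosen destination directory are assumptions, since the description only states that the package of the target application is downloaded.

```kotlin
import android.app.DownloadManager
import android.content.Context
import android.net.Uri
import android.os.Environment

// Enqueue the download of the target application's package when the user clicks the download link.
fun downloadTargetPackage(context: Context, downloadUrl: String, fileName: String) {
    val dm = context.getSystemService(Context.DOWNLOAD_SERVICE) as DownloadManager
    val request = DownloadManager.Request(Uri.parse(downloadUrl))
        .setTitle(fileName)
        .setNotificationVisibility(DownloadManager.Request.VISIBILITY_VISIBLE_NOTIFY_COMPLETED)
        .setDestinationInExternalPublicDir(Environment.DIRECTORY_DOWNLOADS, fileName)
    dm.enqueue(request)
}
```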
As can be seen from fig. 5, compared with the embodiment corresponding to fig. 1, the flow 500 of the video playing method in this embodiment relates to a step of presenting promotion information when the user clicks the video in the case that the video being played is a promotion video. Therefore, more information can be provided for the user under the scene that the user inputs and watches the video, and rich information pushing is realized.
With further reference to fig. 6, as an implementation of the method shown in the foregoing figures, the present application provides an embodiment of a video playing device, where the embodiment of the device corresponds to the embodiment of the method shown in fig. 1, and the device is specifically applicable to various electronic devices.
As shown in fig. 6, the video playing device 600 according to the present embodiment includes: the presenting unit 601 is configured to present a floating window and load video data when detecting that a user triggers a video playing function in an input method interface; a playing unit 602 configured to play video in the floating window based on the video data.
In some optional implementations of this embodiment, the apparatus further includes: a first detection unit configured to: continuously presenting the floating window and continuously playing the video in the floating window when the following operation is detected by a user: the method comprises the steps of exiting an input method interface, executing operation in the input method interface, exiting a current application, switching the current application to other applications, executing operation in the current application, and touching an entity return key or a virtual return key in the device.
In some optional implementations of this embodiment, the apparatus further includes: a second detection unit configured to: when detecting that the user triggers the function of folding the floating window, pausing playing the video, folding the floating window and switching the floating window into an icon; when the user touches the icon, the icon is switched to the floating window, and the video is continuously played.
In some optional implementations of this embodiment, the apparatus further includes: a transmission unit configured to: when the current input scene is an instant communication scene, if the user is detected to execute the preset operation on the video played in the floating window, the relevant information of the video is sent or shared as an instant communication message.
In some optional implementations of this embodiment, the preset operations include at least one of: long-pressing the video, dragging the video to an input box in the input method interface, triggering a key for sending or sharing the video, and inputting voice for sending or sharing the video; the above-mentioned related information includes at least one of the following: the video data, the page links of the video, and the cover of the video.
In some optional implementations of this embodiment, the apparatus further includes: a third detection unit configured to: when the floating window is in a full-screen mode, if an entity return key or a virtual return key in the touch equipment of the user is detected, switching the full-screen mode into a non-full-screen mode; and when the floating window is in the non-full-screen mode, if the physical return key or the virtual return key in the touch equipment of the user is detected, the non-full-screen mode is maintained.
In some optional implementations of this embodiment, the apparatus further includes: a fourth detection unit configured to: after the video is played, the video is played again; when the user is detected to execute the video switching operation, video data of the next video is loaded, and the next video is played.
In some optional implementations of this embodiment, the video playing function is triggered by the following steps: triggering a video playing function when detecting that a user triggers a video playing function key in an input method interface; or when detecting that the user inputs the target content in a coding input mode or a voice input mode, triggering a video playing function, wherein the target content is used for indicating video playing; or when the intention of the user to play, send or share the video is detected, triggering the video playing function.
In some optional implementations of this embodiment, the apparatus further includes: an identification unit configured to: during the process of playing the video, identify at least one of the following in real time: the position of the content on the screen and the color of the current input interface; adjust the position and the size of the floating window based on the position of the content; and adjust the background color of the floating window based on the color of the current input interface.
In some optional implementations of this embodiment, the floating window configures a key frame, where the key frame includes an internal key frame located inside the floating window and/or an external key frame located outside the floating window, a function key is displayed in the key frame, and the function key includes at least one of: video switching key, mute key, zoom key, full screen key, floating window closing key, floating window stowing key; and, the above-mentioned device further includes: a fifth detection unit configured to: when the user touches the function keys in the key frame, the function keys touched by the user are used as target function keys, and the functions corresponding to the target function keys are triggered.
In some optional implementations of this embodiment, the apparatus further includes: a sixth detection unit configured to: if the video is detected to be clicked by the user under the condition that the video is not the popularization video, the playing of the video is paused; and under the condition that the video is a promotion video, if the video is clicked by a user, presenting a promotion information area and presenting promotion information in the promotion information area, wherein the promotion information comprises at least one of the following items: deep links, web links, download links, the deep links being uniform resource identifiers for linking to the target pages in the target applications; when the user clicking the deep link is detected, jumping to a target page of a target application; when the user clicking the webpage link is detected, presenting a webpage corresponding to the webpage link; and when detecting that the user clicks the download link, downloading the program package of the target application.
According to the device provided by the embodiment of the application, when the video playing function in the input method interface is triggered by the user, the floating window is presented, and video data is loaded, so that video is played in the floating window based on the video data. Therefore, the video can be continuously played in the process that the user uses different applications in the terminal equipment, frequent application switching is not needed, and the convenience of operation and the continuity of video playing are improved.
Fig. 7 is a block diagram illustrating an apparatus 700 for playing video according to an exemplary embodiment; the apparatus 700 may be an intelligent terminal or a server. For example, apparatus 700 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, exercise device, personal digital assistant, or the like.
Referring to fig. 7, an apparatus 700 may include one or more of the following components: a processing component 702, a memory 704, a power component 706, a multimedia component 708, an audio component 710, an input/output (I/O) interface 712, a sensor component 714, and a communication component 716.
The processing component 702 generally controls overall operation of the apparatus 700, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing element 702 may include one or more processors 720 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 702 can include one or more modules that facilitate interaction between the processing component 702 and other components. For example, the processing component 702 may include a multimedia module to facilitate interaction between the multimedia component 708 and the processing component 702.
The memory 704 is configured to store various types of data to support operations at the apparatus 700. Examples of such data include instructions for any application or method operating on the apparatus 700, contact data, phonebook data, messages, pictures, videos, and the like. The memory 704 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply component 706 provides power to the various components of the device 700. The power components 706 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 700.
The multimedia component 708 includes a screen that provides an output interface between the device 700 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only a boundary of a touch or a sliding action but also a duration and a pressure related to the touch or the sliding operation. In some embodiments, the multimedia component 708 includes a front-facing camera and/or a rear-facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the device 700 is in an operational mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 710 is configured to output and/or input audio signals. For example, the audio component 710 includes a Microphone (MIC) configured to receive external audio signals when the device 700 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 704 or transmitted via the communication component 716. In some embodiments, the audio component 710 further includes a speaker for outputting audio signals.
The I/O interface 712 provides an interface between the processing component 702 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 714 includes one or more sensors for providing status assessment of various aspects of the apparatus 700. For example, the sensor assembly 714 may detect an on/off state of the device 700, a relative positioning of the assemblies, such as the display and keypad of the apparatus 700, a change in position of the apparatus 700 or one of the assemblies of the apparatus 700, the presence or absence of user contact with the apparatus 700, an orientation or acceleration/deceleration of the apparatus 700, and a change in temperature of the apparatus 700. The sensor assembly 714 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 714 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 714 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 716 is configured to facilitate communication between the apparatus 700 and other devices in a wired or wireless manner. The apparatus 700 may access a wireless network based on a communication standard, such as WiFi,2G or 3G, or a combination thereof. In one exemplary embodiment, the communication component 716 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 716 described above further includes a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, ultra Wideband (UWB) technology, bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 700 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for executing the methods described above.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as memory 704, including instructions executable by processor 720 of apparatus 700 to perform the above-described method. For example, the non-transitory computer readable storage medium may be ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
Fig. 8 is a schematic diagram of a server in some embodiments of the application. The server 800 may vary considerably in configuration or performance and may include one or more central processing units (CPUs) 822 (e.g., one or more processors), memory 832, and one or more storage media 830 (e.g., one or more mass storage devices) storing applications 842 or data 844. The memory 832 and the storage medium 830 may be transitory or persistent storage. The program stored in the storage medium 830 may include one or more modules (not shown), each of which may include a series of instruction operations on the server. Still further, the central processor 822 may be configured to communicate with the storage medium 830 to execute the series of instruction operations in the storage medium 830 on the server 800.
The server 800 may also include one or more power supplies 826, one or more wired or wireless network interfaces 850, one or more input/output interfaces 858, one or more keyboards 856, and/or one or more operating systems 841, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
A non-transitory computer readable storage medium is provided, in which instructions, when executed by a processor of an apparatus (a smart terminal or a server), cause the apparatus to perform a video playing method, the method comprising: when detecting that a user triggers a video playing function in an input method interface, presenting a floating window and loading video data; and playing video in the floating window based on the video data.
Optionally, the above-described device configured to be executed by the one or more processors, the one or more programs comprising instructions for: continuously presenting the floating window when the user is detected to execute the following operations, and continuously playing the video in the floating window: the method comprises the steps of exiting an input method interface, executing operation in the input method interface, exiting a current application, switching the current application to other applications, executing operation in the current application, and touching an entity return key or a virtual return key in the device.
Optionally, the above-described device configured to be executed by the one or more processors, the one or more programs comprising instructions for: when detecting that a user triggers the function of folding the floating window, pausing playing the video, folding the floating window and switching the floating window into an icon; and when the icon is detected to be touched by the user, switching the icon into the floating window, and continuing to play the video.
Optionally, the above-described device configured to be executed by the one or more processors, the one or more programs comprising instructions for: when the floating window is in a full-screen mode, if an entity return key or a virtual return key in the user touch equipment is detected, switching the full-screen mode into a non-full-screen mode; and when the floating window is in the non-full-screen mode, if the physical return key or the virtual return key in the user touch equipment is detected, the non-full-screen mode is maintained.
Optionally, the above-described device configured to be executed by the one or more processors, the one or more programs comprising instructions for: when the current input scene is an instant communication scene, if the user is detected to execute preset operation on the video played in the floating window, the relevant information of the video is sent or shared as an instant communication message.
Optionally, the preset operation includes at least one of: long-pressing the video, dragging the video to an input box in the input method interface, triggering a key for sending or sharing the video, and inputting voice for sending or sharing the video; the related information includes at least one of: the video data, the page links of the video, the cover of the video.
Optionally, the above-described device configured to be executed by the one or more processors, the one or more programs comprising instructions for: after the video is played, replaying the video; and when detecting that the user executes the video switching operation, loading video data of the next video and playing the next video.
Optionally, the video playing function is triggered by the following steps: triggering a video playing function when detecting that a user triggers a video playing function key in an input method interface; or when detecting that the user inputs the target content in a coding input mode or a voice input mode, triggering a video playing function, wherein the target content is used for indicating video playing; or when the intention of the user to play, send or share the video is detected, triggering the video playing function.
Optionally, the above-described device configured to be executed by the one or more processors, the one or more programs comprising instructions for: during the process of playing the video, at least one of the following is identified in real time: the position of the content in the screen and the color of the current input interface; adjusting the position and the size of the floating window based on the position of the content; and adjusting the background color of the floating window based on the color of the current input interface.
Optionally, the floating window is configured with a key frame, the key frame includes an internal key frame located inside the floating window and/or an external key frame located outside the floating window, a function key is displayed in the key frame, and the function key includes at least one of the following: video switching key, mute key, zoom key, full screen key, floating window closing key, floating window stowing key; and, the above-described device configured to be executed by the one or more processors, the one or more programs including instructions for: when the fact that the user touches the function keys in the key frame is detected, the function keys touched by the user are used as target function keys, and functions corresponding to the target function keys are triggered.
Optionally, the one or more programs further comprise instructions for: if the video is not a promotion video, pausing the playing of the video when it is detected that the user clicks the video; if the video is a promotion video, presenting a promotion information area and presenting promotion information in the promotion information area when it is detected that the user clicks the video, wherein the promotion information comprises at least one of the following: a deep link, a web page link and a download link, the deep link being a uniform resource identifier for linking to a target page in a target application; when detecting that the user clicks the deep link, jumping to the target page of the target application; when detecting that the user clicks the web page link, presenting the web page corresponding to the web page link; and when detecting that the user clicks the download link, downloading the program package of the target application.
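As a hedged sketch of the promotion-link handling on Android (not the claimed implementation), the three link types could be handled with standard intents and the system DownloadManager; all function names are hypothetical:

```kotlin
// Hypothetical sketch of the promotion-link handling with standard Android intents and the
// system DownloadManager; function names and the link strings are illustrative assumptions.
import android.app.DownloadManager
import android.content.Context
import android.content.Intent
import android.net.Uri

private fun view(context: Context, uri: String) = context.startActivity(
    Intent(Intent.ACTION_VIEW, Uri.parse(uri)).addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
)

// Deep link: a URI the target application resolves to one of its internal pages.
fun openDeepLink(context: Context, deepLink: String) = view(context, deepLink)

// Web page link: presented in the browser (or an in-app web view).
fun openWebPage(context: Context, webLink: String) = view(context, webLink)

// Download link: fetch the target application's package via DownloadManager.
fun downloadPackage(context: Context, downloadLink: String) {
    val dm = context.getSystemService(Context.DOWNLOAD_SERVICE) as DownloadManager
    dm.enqueue(DownloadManager.Request(Uri.parse(downloadLink)))
}
```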
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It is to be understood that the application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.
The foregoing description of the preferred embodiments of the application is not intended to limit the application to the precise form disclosed, and any such modifications, equivalents, and alternatives falling within the spirit and scope of the application are intended to be included within the scope of the application.
The foregoing describes the video playing method, apparatus and device for playing video provided by the present application, using specific examples to illustrate its principles and embodiments; the description of these examples is only intended to help understand the method and its core ideas. Meanwhile, since those skilled in the art may vary the specific embodiments and the application scope in accordance with the ideas of the present application, this description should not be construed as limiting the present application.

Claims (18)

1. A video playing method, the method comprising:
when detecting that a user triggers a video playing function in an input method interface, presenting a floating window and loading video data;
playing video in the floating window based on the video data;
the method further comprises the steps of:
continuing to present the floating window and continuing to play the video in the floating window when it is detected that the user performs any of the following operations: exiting the input method interface, performing an operation in the input method interface, exiting the current application, switching from the current application to another application, performing an operation in the current application, or touching a physical return key or a virtual return key of the device;
the video playing function in the input method interface is triggered by the following steps:
triggering the video playing function in the input method interface when detecting that the user triggers a video playing function key in the input method interface; or,
triggering the video playing function in the input method interface when detecting that the user inputs target content through a voice input mode, wherein the target content is used for indicating video playing.
2. The method according to claim 1, wherein the method further comprises:
When detecting that a user triggers the function of folding the floating window, pausing playing the video, folding the floating window and switching the floating window into an icon;
and when the icon is detected to be touched by the user, switching the icon into the floating window, and continuing to play the video.
3. The method according to claim 1, wherein the method further comprises:
when the current input scene is an instant messaging scene, if it is detected that the user performs a preset operation on the video played in the floating window, sending or sharing the related information of the video as an instant messaging message.
4. The method according to claim 3, wherein the preset operation comprises at least one of the following: long-pressing the video, dragging the video to an input box in the input method interface, triggering a key for sending or sharing the video, and inputting a voice instruction for sending or sharing the video; and
the related information comprises at least one of the following: the video data, a page link of the video, and a cover of the video.
5. The method according to claim 1, wherein the method further comprises:
when the floating window is in a full-screen mode, if it is detected that the user touches a physical return key or a virtual return key of the device, switching the full-screen mode into a non-full-screen mode;
and when the floating window is in the non-full-screen mode, if it is detected that the user touches the physical return key or the virtual return key of the device, maintaining the non-full-screen mode.
6. The method according to claim 1, wherein the method further comprises:
identifying, in real time during the process of playing the video, at least one of the following: the position of content in the screen and the color of the current input interface;
adjusting the position and the size of the floating window based on the position of the content;
and adjusting the background color of the floating window based on the color of the current input interface.
7. The method of claim 1, wherein the floating window is configured with a key frame, the key frame comprises an internal key frame located inside the floating window and/or an external key frame located outside the floating window, and function keys are displayed in the key frame, the function keys comprising at least one of the following: a video switching key, a mute key, a zoom key, a full-screen key, a floating window closing key, and a floating window folding key; and
the method further comprises:
when detecting that the user touches a function key in the key frame, taking the function key touched by the user as a target function key and triggering the function corresponding to the target function key.
8. The method of claim 1, wherein after the playing video in the floating window based on the video data, the method further comprises:
if the video is not a promotion video, pausing the playing of the video when it is detected that the user clicks the video;
if the video is a promotion video, presenting a promotion information area and presenting promotion information in the promotion information area when it is detected that the user clicks the video, wherein the promotion information comprises at least one of the following: a deep link, a web page link and a download link, the deep link being a uniform resource identifier for linking to a target page in a target application;
when detecting that the user clicks the deep link, jumping to the target page of the target application;
when detecting that the user clicks the web page link, presenting the web page corresponding to the web page link;
and when detecting that the user clicks the download link, downloading the program package of the target application.
9. A video playback device, the device comprising:
a presentation unit, configured to present a floating window and load video data when detecting that a user triggers a video playing function in an input method interface;
a playback unit adapted to play back video in the floating window based on the video data;
the apparatus further comprises:
a first detection unit, configured to continue presenting the floating window and continue playing the video in the floating window when it is detected that the user performs any of the following operations: exiting the input method interface, performing an operation in the input method interface, exiting the current application, switching from the current application to another application, performing an operation in the current application, or touching a physical return key or a virtual return key of the device;
the video playing function in the input method interface is triggered by the following steps:
triggering the video playing function in the input method interface when detecting that the user triggers a video playing function key in the input method interface; or,
triggering the video playing function in the input method interface when detecting that the user inputs target content through a voice input mode, wherein the target content is used for indicating video playing.
10. The apparatus of claim 9, wherein the apparatus further comprises:
a second detection unit configured to:
when detecting that a user triggers the function of folding the floating window, pausing playing the video, folding the floating window and switching the floating window into an icon;
and when the icon is detected to be touched by the user, switching the icon into the floating window, and continuing to play the video.
11. The apparatus of claim 9, wherein the apparatus further comprises:
a sending unit, configured to, when the current input scene is an instant messaging scene and it is detected that the user performs a preset operation on the video played in the floating window, send or share the related information of the video as an instant messaging message.
12. The apparatus of claim 11, wherein the preset operation comprises at least one of the following: long-pressing the video, dragging the video to an input box in the input method interface, triggering a key for sending or sharing the video, and inputting a voice instruction for sending or sharing the video; and
the related information comprises at least one of the following: the video data, a page link of the video, and a cover of the video.
13. The apparatus of claim 9, wherein the apparatus further comprises:
a third detection unit configured to:
when the floating window is in a full-screen mode, if it is detected that the user touches a physical return key or a virtual return key of the device, switching the full-screen mode into a non-full-screen mode;
and when the floating window is in the non-full-screen mode, if it is detected that the user touches the physical return key or the virtual return key of the device, maintaining the non-full-screen mode.
14. The apparatus of claim 9, wherein the apparatus further comprises:
an identification unit configured to:
identifying, in real time during the process of playing the video, at least one of the following: the position of content in the screen and the color of the current input interface;
adjusting the position and the size of the floating window based on the position of the content;
and adjusting the background color of the floating window based on the color of the current input interface.
15. The apparatus of claim 9, wherein the floating window is provided with a key frame, the key frame comprises an internal key frame located inside the floating window and/or an external key frame located outside the floating window, and function keys are displayed in the key frame, the function keys comprising at least one of the following: a video switching key, a mute key, a zoom key, a full-screen key, a floating window closing key, and a floating window folding key; and
the apparatus further comprises:
a fifth detection unit, configured to, when detecting that the user touches a function key in the key frame, take the function key touched by the user as a target function key and trigger the function corresponding to the target function key.
16. The apparatus of claim 9, wherein the apparatus further comprises:
a sixth detection unit configured to:
if the video is not a promotion video, pausing the playing of the video when it is detected that the user clicks the video;
if the video is a promotion video, presenting a promotion information area and presenting promotion information in the promotion information area when it is detected that the user clicks the video, wherein the promotion information comprises at least one of the following: a deep link, a web page link and a download link, the deep link being a uniform resource identifier for linking to a target page in a target application;
when detecting that the user clicks the deep link, jumping to the target page of the target application;
when detecting that the user clicks the web page link, presenting the web page corresponding to the web page link;
and when detecting that the user clicks the download link, downloading the program package of the target application.
17. An apparatus for playing video, comprising a memory, and one or more programs, wherein the one or more programs are stored in the memory, which when executed by one or more processors, implement the steps of the method of any of claims 1-8.
18. A computer readable medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the method according to any of claims 1-8.
CN202010292066.8A 2020-04-14 2020-04-14 Video playing method and device for playing video Active CN113542886B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010292066.8A CN113542886B (en) 2020-04-14 2020-04-14 Video playing method and device for playing video

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010292066.8A CN113542886B (en) 2020-04-14 2020-04-14 Video playing method and device for playing video

Publications (2)

Publication Number Publication Date
CN113542886A CN113542886A (en) 2021-10-22
CN113542886B true CN113542886B (en) 2023-09-22

Family

ID=78119998

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010292066.8A Active CN113542886B (en) 2020-04-14 2020-04-14 Video playing method and device for playing video

Country Status (1)

Country Link
CN (1) CN113542886B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114040243B (en) * 2021-11-04 2023-08-04 上海哔哩哔哩科技有限公司 Live broadcasting room video playing method and device
CN113873316B (en) * 2021-11-04 2024-02-27 上海哔哩哔哩科技有限公司 Live broadcasting room video playing method and device
CN115086739B (en) * 2022-06-07 2024-01-30 北京字跳网络技术有限公司 Video processing method, device, equipment and storage medium
CN115695889A (en) * 2022-09-30 2023-02-03 聚好看科技股份有限公司 Display device and floating window display method
CN117319755B (en) * 2023-11-28 2024-05-17 深圳大智软件技术有限公司 Short video communication information sharing method and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101044679B1 (en) * 2008-10-02 2011-06-29 (주)아이티버스 Characters input method

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101593075A (en) * 2009-07-10 2009-12-02 张向阳 A kind of advertising display device that is integrated in the input method
CN102147665A (en) * 2010-02-05 2011-08-10 北京搜狗科技发展有限公司 Method and device for displaying information in input process and input method system
CN102314462A (en) * 2010-06-30 2012-01-11 北京搜狗科技发展有限公司 Method and system for obtaining navigation result on input method platform
CN102789366A (en) * 2012-07-27 2012-11-21 上海量明科技发展有限公司 Implementation method, client and system for rich media input method tool
WO2016192068A1 (en) * 2015-06-04 2016-12-08 武克易 Smart television shopping method, apparatus, and device
CN105045466A (en) * 2015-07-17 2015-11-11 百度在线网络技术(北京)有限公司 Information provision method and apparatus
CN108366298A (en) * 2018-03-28 2018-08-03 广东欧珀移动通信有限公司 Video broadcasting method, mobile terminal and computer readable storage medium
CN109819305A (en) * 2018-12-28 2019-05-28 深圳豪客互联网有限公司 Video playing control method and device in a kind of application program
CN110333814A (en) * 2019-05-31 2019-10-15 华为技术有限公司 A kind of method and electronic equipment of sharing contents

Also Published As

Publication number Publication date
CN113542886A (en) 2021-10-22

Similar Documents

Publication Publication Date Title
CN107153541B (en) Browsing interaction processing method and device
CN113542886B (en) Video playing method and device for playing video
CN105955607B (en) Content sharing method and device
CN109683761B (en) Content collection method, device and storage medium
US10949490B2 (en) Method and apparatus for displaying webpage content
US11704001B2 (en) Method and device for displaying web page content
CN116055610B (en) Method for displaying graphical user interface and mobile terminal
CN111680521A (en) Translation processing method and device and translation processing device
CN110968364B (en) Method and device for adding shortcut plugins and intelligent device
CN111381739A (en) Application icon display method and device, electronic equipment and storage medium
CN112584222A (en) Video processing method and device for video processing
CN107729098B (en) User interface display method and device
CN110413169B (en) Information display method, device and medium
CN109521938B (en) Method and device for determining data evaluation information, electronic device and storage medium
CN111859209A (en) Content display method, device and storage medium
CN110321042B (en) Interface information display method and device and electronic equipment
CN112068764B (en) Language switching method and device for language switching
CN108874758B (en) Note processing method and device, and device for note processing
CN108829473B (en) Event response method, device and storage medium
CN111092971A (en) Display method and device for displaying
CN109976549B (en) Data processing method, device and machine readable medium
CN112312220B (en) Webpage video playing method and device
CN111880696B (en) Encyclopedic-based data processing method and device
WO2023072251A1 (en) Interaction method, interaction apparatus, electronic device, and computer-readable storage medium
CN109388328B (en) Input method, device and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant