WO2019090653A1 - Method, apparatus and system for live video streaming - Google Patents

Method, apparatus and system for live video streaming

Info

Publication number
WO2019090653A1
WO2019090653A1 (PCT/CN2017/110341)
Authority
WO
WIPO (PCT)
Prior art keywords
event
video
live
time
server
Prior art date
Application number
PCT/CN2017/110341
Other languages
English (en)
Chinese (zh)
Inventor
肖融
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 filed Critical 腾讯科技(深圳)有限公司
Priority to PCT/CN2017/110341 priority Critical patent/WO2019090653A1/fr
Priority to CN201780055514.9A priority patent/CN110024412B/zh
Publication of WO2019090653A1 publication Critical patent/WO2019090653A1/fr

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/238Interfacing the downstream path of the transmission network, e.g. adapting the transmission rate of a video stream to network bandwidth; Processing of multiplex streams
    • H04N21/2387Stream processing in response to a playback request from an end-user, e.g. for trick-play

Definitions

  • The embodiments of the present invention relate to the field of Internet technologies, and in particular, to a method, an apparatus, and a system for video live broadcast.
  • Live broadcast refers to producing and publishing information simultaneously as events occur and develop, including on-site live broadcasts, studio interview live broadcasts, live text and pictures, live video and audio, or live broadcasts relayed from sources provided by third parties. Live content can include sports events, galas, conferences, celebrations, and more.
  • During a live broadcast, if the user wants to look back at events that have just occurred, such as goals or penalty kicks in a football match, or dunks and blocks in a basketball game, the user can only manually drag the viewing axis (also known as the playback progress bar, or progress bar) backward to search, and cannot quickly locate the position of the event on the playback progress bar.
  • The embodiments of the present application provide a method, an apparatus, and a system for video live broadcast, which can provide annotation data for live events, so that event nodes that have occurred can be marked on the playback axis of the live broadcast, presenting the events of the live broadcast in an intuitive form.
  • a method for video live broadcast may be applied to a video clip server, and the method includes:
  • For the description information of each acquired event, the playing time of the event is determined according to the occurrence time of the event, and the playing time of the event is provided to the video live server; the playing time of the event is provided by the video live server to a client receiving the live stream of the video program, for marking the event in a play progress bar of the video program in the client.
  • a method for video live broadcast can be applied to a video client, and the method includes:
  • a method for video live broadcast can be applied to a video live broadcast system, and the method includes:
  • The video clip server acquires description information of one or more events associated with the live video program, where the description information of an event includes the occurrence time of the event; for each piece of acquired description information, the video clip server determines the playing time of the event according to the occurrence time of the event in the description information, and provides the playing time of the event to the video live broadcast server;
  • The live video server obtains the live media stream of the video program and provides it to the video client for playing; the playing time of the event is also provided to the video client;
  • the video client plays a live media stream of the video program, and marks the event in a play progress bar of the video program according to a play time of the event.
  • a video live broadcast system may include: a video clip server, a video live server, and a video client;
  • The live video server is configured to obtain a live media stream of the video program and provide it to the video client for playing, and to provide the playing time of the event to the video client; and
  • the video client is configured to play a live media stream of the video program, and mark the event in a play progress bar of the video program according to a play time of the event.
  • Embodiments of the present application also provide a computer readable storage medium storing computer readable instructions.
  • When the instructions are deployed on a device in the live video system of the present application, they may be executed by the processor to perform the processing flow of that device in each embodiment.
  • Information about events in the real scene corresponding to the current live content is obtained in real time, processed into annotation data, and provided to the client, so that event nodes that have occurred can be marked on the playback axis during the live broadcast, presenting the events of the live broadcast in an intuitive form.
  • FIGS. 1b and 1c are flowcharts of a video live broadcast method of a video live broadcast system according to an embodiment of the present application;
  • FIGS. 2a and 2b are flowcharts of a video live broadcast method of a video clip server according to an embodiment of the present application;
  • FIG. 2c is a schematic structural diagram of a video clip server according to an embodiment of the present application.
  • the video client 13 refers to a device or a program for requesting a video live broadcast service to the video service system 11 and presenting the live video screen on the user interface.
  • video client 13 may be a web application running in a terminal device, such as a browser, and the like.
  • The video client 13 may also be an application specifically designed to communicate with the video service system 11 to obtain video data, such as a dedicated video app.
  • the video live broadcast platform 11 may include a video clip server 15 and a video live broadcast server 16.
  • video clip server 15, video live server 16 may each include one or more physical devices (eg, computing devices).
  • the live video server 16 can obtain the live media stream of the video program and provide it to the video client 13 for playing; and provide the playing time of the event to the video client.
  • The video client 13 can play the live media stream of the video program, determine the location corresponding to the playing time in the play progress bar of the video program according to the playing time of the event, and display prompt information at that location to mark the event.
  • the live video program may refer to a live video program of some activity currently in progress.
  • the activities mentioned here can include sports events, parties, conferences, celebrations, etc.
  • the events associated with the video program may refer to goals, substitutions, fouls, start of a game, etc. in the sports event, the start of a certain program in the party, the issuance of an award, the speech of someone in the meeting, and so on.
  • the description information of an event may refer to information used to describe various characteristics of an event, such as information describing the time, location, related person, event content, and the like of the event.
  • The description information can be unstructured data, such as text, or structured data, such as data with multiple fields stored in a preset format, data structures, databases, or files in a markup language (such as XML or HTML). The description information of the event can be obtained through a predetermined route.
  • the description information can be obtained from a preset network device (such as a database of a data service provider, a device of an event organizer, etc.).
  • the video live broadcast platform 11 may periodically or irregularly access the network device to read event description information.
  • the description information of the event may be pushed to the live broadcast platform 11 by the network device in real time after the event occurs. In other embodiments, the video live broadcast platform 11 can also obtain the description information manually input.
  • the time at which the event occurs may refer to the time at which the event occurred during the live event, that is, the time at which the live data of the event was collected.
  • The occurrence time may include a relative time, such as the Mth minute of the first half, and/or an absolute time, such as 15:30 on May 8, 2017.
  • the playing time of the event may refer to the time when the video content corresponding to the event is displayed to the user during the live broadcast.
  • the method of extracting the occurrence time of the event from the description information may be determined according to the form of the description information. For example, when the description information is text, the occurrence time of the event may be extracted by analyzing the text; when the description information is structured data in a preset format, the occurrence time of the event may be extracted from the preset field.
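The two extraction paths above might be sketched as follows in Python; the "Nth minute" text pattern and the `occurrence_time` field name are illustrative assumptions, not part of the embodiment:

```python
import re
import xml.etree.ElementTree as ET

def occurrence_time_from_text(text):
    # Unstructured path: analyze free text to find the occurrence time.
    # The "Nth minute" pattern is a hypothetical convention.
    m = re.search(r"(\d+)(?:st|nd|rd|th)?\s+minute", text)
    return int(m.group(1)) if m else None

def occurrence_time_from_structured(xml_str):
    # Structured path: read the time from a preset field
    # ("occurrence_time" is a hypothetical field name).
    node = ET.fromstring(xml_str).find("occurrence_time")
    return int(node.text) if node is not None else None

print(occurrence_time_from_text("Goal in the 23rd minute of the first half"))  # 23
print(occurrence_time_from_structured("<event><occurrence_time>23</occurrence_time></event>"))  # 23
```

Either path yields the same occurrence time; which one runs would depend on the form in which the description information arrives.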
  • The configuration of the video live broadcast platform 11 shown in FIG. 1a is only used to explain the functions of the video live broadcast platform 11, and is not intended to limit the video live broadcast platform 11 to this structural framework.
  • the video live broadcast platform 11 can also be implemented by other devices in different organizational manners.
  • FIG. 1b is a flowchart of a method for video live broadcast of a live video system according to an embodiment of the present application.
  • The execution order of the method steps is not fixed; in different cases, for example, when the frequency of events or the time at which the client accesses the live broadcast differs, the execution order of the steps may be adjusted according to actual conditions.
  • the live video method may include the following steps.
  • Step S101 The video live broadcast server acquires and stores live media data of the video program from the data source. This step can be continuously performed during the live broadcast.
  • Step S102 The video client requests live broadcast data of the video program from the video live server.
  • Step S103 The video live server provides the live media stream of the video program to the video client.
  • video data is transmitted to the client as a stream of media data. As long as the client does not quit the live broadcast, it will continue to receive subsequent video data of the live broadcast.
  • Step S103a The client plays the live media stream in the user interface.
  • Step S104a the video clip server acquires description information of at least one event associated with the live video program, and determines a play time of the event.
  • Step S105a The video clip server sends the event information to the live video server.
  • the event information can include the play time of the event. In some embodiments, the event information may also include other information about the event, such as type, text description, time of occurrence, person involved, and the like.
  • Step S106a The video live server stores the received event information locally.
  • Step S108a The video live server pushes the received event information to the video clients that are receiving the video program.
  • The above calibration can be performed by acquiring the deviation between the occurrence time on site and the playing time (that is, the deviation between the playing time and the collection time of the content in the live broadcast, referred to simply as the live broadcast delay).
  • the live broadcast delay of the video program can be obtained, and the play time of the event is determined according to the live broadcast delay and the occurrence time of the event.
  • the live broadcast delay may be a deviation of the play time of the video program from the acquisition time.
  • FIG. 2b is a flowchart of a video live broadcast method of the video clip server in an embodiment of the present application. The method may include the following steps.
  • Step S31 Obtain description information of a preset event in the video program.
  • Step S32 Acquire an occurrence time of the preset event from the description information of the preset event.
  • Step S33 Acquire a play time of the preset event.
  • Step S34 The time difference between the occurrence time of the preset event and the playing time is used as a live broadcast delay.
  • Step S35 Determine, for the first event associated with the video program, a play time of the event according to the live broadcast delay and the occurrence time of the first event.
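Steps S31–S35 can be sketched minimally as follows, assuming occurrence and play times are available as `datetime` values (the concrete timestamps are illustrative):

```python
from datetime import datetime

def live_delay(preset_occurrence, preset_play):
    # S31-S34: the live broadcast delay is the difference between the
    # playing time and the occurrence time of a known preset event.
    return preset_play - preset_occurrence

def event_play_time(occurrence, delay):
    # S35: the play time of a later event is its occurrence time
    # shifted by the same live broadcast delay.
    return occurrence + delay

# Hypothetical calibration: kickoff occurred on site at 15:00:00 and was
# played in the stream at 15:00:42, giving a 42-second live delay.
delay = live_delay(datetime(2017, 5, 8, 15, 0, 0), datetime(2017, 5, 8, 15, 0, 42))
goal_occurred = datetime(2017, 5, 8, 15, 23, 10)
print(event_play_time(goal_occurred, delay))  # 2017-05-08 15:23:52
```

The key assumption is that the delay measured once for the preset event stays roughly constant for later events in the same broadcast.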
  • Event types that do not need to be stored may also be preset. When the type of the event does not belong to the preset event types that do not need to be stored, the video clip server 15 performs the step of storing the annotation data of the event into the data storage device; that is, when the type of the event is one of the preset event types that do not need to be stored, the annotation data of the event is not stored.
  • Other event screening criteria may also be employed; event types are used here for illustration only.
  • The video clip server 15 may evict events already stored in the data storage device 17 while storing new annotation data.
  • the annotation data of the event includes the type of the event
  • Stored annotation data may be evicted according to the type of the event and the preset priority corresponding to each type.
  • The priority corresponding to the type of the new event to be stored is obtained according to a preset correspondence between event types and priorities, and is taken as the priority of the event.
  • When at least one stored event has a lower priority than the event to be stored, the annotation data of the new event is stored into the data storage device, and the annotation data of the at least one lower-priority event is deleted from the data storage device.
  • The at least one event having a lower priority than the event to be stored may refer to all stored events with a lower priority than the event, or to one or more of those events.
  • When the priorities of multiple events are the same, events with a later playing time (i.e., newer events) may be retained.
  • All events with a lower priority than the event may be deleted, or only as many events as are to be stored may be deleted.
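One way the priority-based eviction described above might look in code; the `type`/`play_time` keys, the priority mapping, and the fixed capacity are assumptions for illustration, not the embodiment's actual storage scheme:

```python
def store_with_eviction(storage, new_event, priority_of, capacity):
    # Store an event's annotation data, evicting a lower-priority stored
    # event when the storage is full.
    if len(storage) < capacity:
        storage.append(new_event)
        return True
    new_prio = priority_of[new_event["type"]]
    lower = [e for e in storage if priority_of[e["type"]] < new_prio]
    if not lower:
        return False  # no lower-priority event to evict; drop the new one
    # Evict the lowest-priority candidate; on ties, evict the one with the
    # earlier play time so that newer events are retained.
    victim = min(lower, key=lambda e: (priority_of[e["type"]], e["play_time"]))
    storage.remove(victim)
    storage.append(new_event)
    return True

priorities = {"goal": 3, "foul": 2, "substitution": 1}
events = [{"type": "substitution", "play_time": 100},
          {"type": "foul", "play_time": 200}]
store_with_eviction(events, {"type": "goal", "play_time": 300}, priorities, capacity=2)
print([e["type"] for e in events])  # ['foul', 'goal']
```

Here only one stored event is evicted per new event, matching the variant that deletes "only as many events as are to be stored".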
  • the video clip server 15 may include a processor 21, a memory 23, and a network interface 27.
  • the memory 23 includes an operating system 24, a network communication module 25, and an annotation module 26.
  • the labeling module 26 can include: an obtaining module 261, a time extracting module 262, a time adjusting module 264, and an annotation providing module 265.
  • the obtaining module 261 can acquire description information of an event associated with the video program currently being broadcasted.
  • the time extraction module 262 can extract the time of occurrence of the event from the description information.
  • The time adjustment module 264 can determine the playing time of the event according to the occurrence time of the event.
  • The annotation providing module 265 can provide the playing time of the event to the live video server.
  • the annotation module 26 can also include a delay acquisition module 263.
  • the delay acquisition module 263 can acquire a live broadcast delay of the video program.
  • The data acquisition server 14 may determine at least one piece of description information of the event according to the original description data, the at least one piece of description information including the occurrence time of the event, and provide the determined description information, together with the identifier of the video program to which the event belongs, to the video clip server 15.
  • data acquisition server 14 may extract the type of event and the time of occurrence of the event from the original description data.
  • the event type and the occurrence time of the event can be obtained by analyzing the text or by reading the corresponding field of the structured data.
  • the data acquisition server 14 may also obtain a preset icon material associated with the event type.
  • different event types may be pre-configured with corresponding icon material for annotating events of that type.
  • For example, when the event type is a goal, the icon material may be an icon representing a soccer ball; when the event type is a red card, the icon material may be a red rectangle representing a red card.
  • the data acquisition server 14 can provide the description information including the event type, the occurrence time of the event, and the icon material to the video clip server 15.
  • The video clip server 15 may determine, according to the event type, whether to provide the playing time and icon material of the event to the client, so that the client displays the icon material at the location of the play progress bar corresponding to the playing time to mark the event.
  • the at least one description information provided by the data acquisition server 14 may further include a description text of the event. The descriptive text can be extracted from the original description data.
  • FIG. 3a is a schematic structural diagram of a video client according to an embodiment of the present application.
  • the video client 13 can include a processor 31, a memory 33, a display device 38, and a network interface 37.
  • the memory 33 includes an operating system 34, a network communication module 35, and a video playback module 36.
  • the video playing module 36 may include: a live stream playing module 361 and an event labeling module 362.
  • FIG. 3b is a flowchart of a method for providing annotation data according to an embodiment of the present application. As shown in FIG. 3b, the method 40 can include the following steps.
  • the video client 13 can pull the annotation data from the live video server 16 at predetermined time intervals.
  • The video live server 16 may be queried at a preset time interval as to whether the video program has annotation data of events that have not yet been acquired; when it is determined that such annotation data exists, the annotation data of the unacquired events is pulled from the video live server, the annotation data including the playing time of the event.
  • the video live server 16 may push the annotation data to the client at predetermined time intervals or when there is updated annotation data.
  • the video client 13 can acquire the annotation data using data communication based on the HTTP interface.
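The pull-based variant can be sketched as a polling loop. Here `fetch` stands in for the HTTP request to the video live server 16, and the `seq` and `play_time` fields are assumed shapes for illustration, not the patent's actual protocol:

```python
import time

def poll_annotations(fetch, program_id, interval=5.0, rounds=3):
    # Query the server at a preset interval for annotation data of events
    # not yet acquired, tracking the newest sequence number seen so far.
    latest = -1
    collected = []
    for _ in range(rounds):
        for ann in fetch(program_id, after_seq=latest):
            if ann["seq"] > latest:
                collected.append(ann)
                latest = ann["seq"]
        time.sleep(interval)
    return collected

# Stubbed server responses standing in for the HTTP interface.
responses = iter([
    [{"seq": 0, "play_time": 120}],   # first poll: one new event
    [],                               # second poll: nothing new
    [{"seq": 1, "play_time": 305}],   # third poll: another event
])
fetch = lambda program_id, after_seq: next(responses, [])
print(poll_annotations(fetch, "match-42", interval=0, rounds=3))
```

In the push-based variant described above, the same bookkeeping would run in a handler invoked by the server instead of a timer loop.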
  • Step S43 For the playing time of each acquired event, the event labeling module 362 may determine the location corresponding to the playing time in the play progress bar of the video program, and display prompt information at that location to mark the event.
  • the video client 13 can display the prompt information of the event anywhere in the user interface of the live video.
  • the video client 13 may present the prompt information of the event in the form of a list of information in the sidebar.
  • the video client 13 may scroll the presentation information of each event in the form of subtitles.
  • the video client 13 can display the indicia directly at the location of the playback time of the corresponding event in the playback progress bar.
  • FIG. 3c is a schematic diagram of a user interface of a live video broadcast according to an embodiment of the present application. In the play progress bar 41, the video client 13 uses the flag 44 to indicate the current live broadcast progress, and uses the markers 42 and 43 to indicate the locations of two events that have occurred in the live broadcast.
  • the annotation data may include information of the icon material, such as image data of the icon material, a unique identifier of the icon material, a URL of the icon material, and the like; the video client 13 may obtain the icon material by using the acquired information of the icon material. In other examples, the video client 13 may acquire the icon material corresponding to the event from the video live server.
  • FIG. 3d is a schematic diagram of a live broadcast play interface according to an embodiment of the present application. As shown in FIG. 3d, as the video program (e.g., a football game) progresses, in addition to the current broadcast progress flag 44, the progress bar 41 displays some prompt information (in this example, icons). Different icons represent different event types, such as first-half start 411, red card 412, goal 413, first-half end 414, second-half start 415, substitution 416, and the like.
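Placing a marker such as 412 or 413 reduces to mapping the event's play time onto the bar. The pixel arithmetic below is an illustrative assumption about the client's layout, not the embodiment's actual rendering code:

```python
def marker_offset(play_time_s, live_edge_s, bar_width_px):
    # Map an event's play time to a horizontal pixel offset on a progress
    # bar that spans from the broadcast start (0 s) to the current live edge.
    if live_edge_s <= 0:
        return 0
    frac = min(max(play_time_s / live_edge_s, 0.0), 1.0)
    return round(frac * bar_width_px)

# A goal played 23:00 into a broadcast currently 46:00 long sits at the
# midpoint of a 600 px bar.
print(marker_offset(23 * 60, 46 * 60, 600))  # 300
```

Because the live edge keeps advancing, marker positions would be recomputed as the bar grows.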
  • the video client 13 may present relevant information of the event corresponding to the prompt information in the user interface.
  • the information of the presented event may include, but is not limited to, the time of occurrence of the event, the person involved, the description text, the representative image, and the like.
  • the description text of the event can be obtained from the annotation data of the event.
  • The representative image of the event may be captured by the client from the live video content (for example, a frame image corresponding to the playing time of the event), and the image or its thumbnail may be used directly as the representative image of the event.
  • The video client 13 may display the playing time 47, the screenshot 46, and the description text 48 of the event corresponding to the prompt information on the play interface (e.g., "Ronaldo scores from the penalty spot"), so that users can quickly understand the content of the event.
  • The display may be triggered by an operation on the prompt information (e.g., mouse hovering).
  • video playback module 36 may be implemented by computer readable instructions, that is, as computer readable instructions corresponding to modules 361 and 362 described above. These instructions may cause processor 31 to perform the operations of the video client of various embodiments of the present application.
  • the video live broadcast platform 11 may provide the video client with information of events matching the user information according to the user information associated with the video client 13.
  • The video live server 16 can determine the user tag according to the information of the video client 13, determine the annotation data of at least one event matching the user tag in the data storage device 17, and provide the annotation data of the at least one event to the video client 13.
  • The video live broadcast server 16 may obtain the user identifier from the event acquisition request sent by the video client 13, or obtain the user identifier corresponding to the video client from the stored correspondence between video clients and user identifiers, and then obtain the user label corresponding to the user identifier from the user database.
  • video client 13 may determine the user's user tag and add the user tag to the event acquisition request; video live server 16 may obtain the user tag from the event acquisition request.
  • the video live server 16 may determine a user tag according to the event acquisition request, and determine, in the data storage device 17, annotation data of at least one event that matches the user tag, Annotation data for at least one event is provided to the client.
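The matching step can be sketched as a tag-intersection filter; the `tags` field on annotation data and the set-overlap rule are assumed shapes for illustration:

```python
def annotations_matching_user(storage, user_tags):
    # Return annotation data of events whose tags overlap the user's tags.
    return [a for a in storage if user_tags & set(a.get("tags", ()))]

storage = [
    {"event": "goal", "play_time": 1380, "tags": {"team-a", "football"}},
    {"event": "substitution", "play_time": 2100, "tags": {"team-b"}},
]
print(annotations_matching_user(storage, {"team-a"}))  # only the goal matches
```

A real deployment might rank matches rather than filter, but the principle (intersect user tags with event tags) is the same.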
  • The data acquisition server 14 may acquire multiple pieces of original description information of the same event from one or more data sources, extract multiple description texts from them, and provide the description information of the event, together with the occurrence time of the event, to the video clip server 15.
  • Multiple pieces of descriptive text for the same event may be text that describes the event in a different manner. Different ways may include, but are not limited to, different preferences, different languages, different perspectives, different description styles, and the like.
  • the video clip server 15 may extract a plurality of description texts from the description information, and add a description vector composed of the plurality of description texts to the annotation data.
  • the data acquisition server 14 may extract the tag information of each description text from the original description information, or determine the tag information describing the text according to the data source of the description text.
  • the data acquisition server 14 can provide the plurality of description texts of the event and the tag information of each description text to the video clip server 15.
  • the video clip server 15 adds this information to the tag data of the event.
  • the video client 13 may determine a user tag of the local user, look up the description text matching the user tag in the tag information of the plurality of description texts, and display the description text matching the user tag.
  • the video client 13 may also display multiple description texts of the same event in turn, for example, at predetermined time intervals, along with description texts of other events.
  • The timing at which the video client 13 displays the description text of an event may include: in response to an operation on the prompt information in the play progress bar, in response to acquisition of the description text, or in response to the end of the live broadcast of the video program.
  • The hardware modules in the embodiments may be implemented in hardware, or by a hardware platform plus software.
  • the above software includes machine readable instructions stored in a non-volatile storage medium.
  • embodiments can also be embodied as software products.
  • the hardware can be made up of specialized hardware or hardware that executes machine readable instructions.
  • the hardware can be a specially designed permanent circuit or logic device (such as a dedicated processor such as an FPGA or ASIC) for performing a particular operation.
  • the hardware may also include programmable logic devices or circuits (such as including general purpose processors or other programmable processors) that are temporarily configured by software for performing particular operations.
  • the machine readable instructions corresponding to the modules in the figures may cause an operating system or the like operating on a computer to perform some or all of the operations described herein.
  • the non-transitory computer readable storage medium may be inserted into a memory provided in an expansion board within the computer or written to a memory provided in an expansion unit connected to the computer.
  • The CPU or the like installed on the expansion board or in the expansion unit can perform part or all of the actual operations according to the instructions.
  • The non-transitory computer readable storage medium includes a floppy disk, a hard disk, a magneto-optical disk, an optical disk (such as a CD-ROM, CD-R, CD-RW, DVD-ROM, DVD-RAM, DVD-RW, or DVD+RW), a magnetic tape, a non-volatile memory card, and a ROM.
  • the program code can be downloaded from the server computer by the communication network.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The invention relates to a method, apparatus and system for live video streaming. The method comprises: obtaining description information of one or more events associated with a video program being live streamed, the description information of an event comprising the occurrence time of the event; for the obtained description information of the event, determining the playing time of the event according to the occurrence time of the event in the description information; and providing the playing time of the event to the live video streaming server, the playing time of the event being provided by the live video streaming server to a client receiving the live stream of the video program, for marking the event in a play progress bar of the video program in the client.
PCT/CN2017/110341 2017-11-10 2017-11-10 Method, apparatus and system for live video streaming WO2019090653A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2017/110341 WO2019090653A1 (fr) 2017-11-10 2017-11-10 Method, apparatus and system for live video streaming
CN201780055514.9A CN110024412B (zh) 2017-11-10 2017-11-10 Method, apparatus and system for live video streaming

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/110341 WO2019090653A1 (fr) 2017-11-10 2017-11-10 Method, apparatus and system for live video streaming

Publications (1)

Publication Number Publication Date
WO2019090653A1 true WO2019090653A1 (fr) 2019-05-16

Family

ID=66437369

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/110341 WO2019090653A1 (fr) 2017-11-10 2017-11-10 Method, apparatus and system for live video streaming

Country Status (2)

Country Link
CN (1) CN110024412B (fr)
WO (1) WO2019090653A1 (fr)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190342621A1 (en) * 2018-05-07 2019-11-07 Apple Inc. User interfaces for viewing live video feeds and recorded video
US10635303B2 (en) 2016-06-12 2020-04-28 Apple Inc. User interface for managing controllable external devices
US10779085B1 (en) 2019-05-31 2020-09-15 Apple Inc. User interfaces for managing controllable external devices
US11079913B1 (en) 2020-05-11 2021-08-03 Apple Inc. User interface for status indicators
US11363071B2 (en) 2019-05-31 2022-06-14 Apple Inc. User interfaces for managing a local network
CN115103213A (zh) * 2022-06-10 2022-09-23 咪咕视讯科技有限公司 Information processing method, apparatus, device and computer-readable storage medium
US11589010B2 (en) 2020-06-03 2023-02-21 Apple Inc. Camera and visitor user interfaces
US11657614B2 (en) 2020-06-03 2023-05-23 Apple Inc. Camera and visitor user interfaces
US11785277B2 (en) 2020-09-05 2023-10-10 Apple Inc. User interfaces for managing audio for media items

Families Citing this family (10)

Publication number Priority date Publication date Assignee Title
CN111565334B (zh) * 2020-04-30 2021-12-28 广州酷狗计算机科技有限公司 Live broadcast playback method, apparatus, terminal, server and storage medium
CN111757147B (zh) * 2020-06-03 2022-06-24 苏宁云计算有限公司 Method, apparatus and system for structuring sports event video
CN114189699A (zh) * 2020-09-15 2022-03-15 阿里巴巴集团控股有限公司 Government service information providing method, apparatus and electronic device
CN112231517A (zh) * 2020-11-04 2021-01-15 支付宝(杭州)信息技术有限公司 Data query method and apparatus
CN112533008A (zh) * 2020-11-16 2021-03-19 北京达佳互联信息技术有限公司 Video playback method, apparatus, electronic device and storage medium
CN112417209A (zh) * 2020-11-20 2021-02-26 青岛以萨数据技术有限公司 Browser-based real-time video annotation method, system, terminal and medium
CN112423113A (zh) * 2020-11-20 2021-02-26 广州欢网科技有限责任公司 Method and apparatus for marking TV program time points, and electronic terminal
CN113423000B (zh) * 2021-06-11 2024-01-09 完美世界征奇(上海)多媒体科技有限公司 Video generation method and apparatus, storage medium and electronic apparatus
CN114095791A (zh) * 2021-11-15 2022-02-25 广州博冠信息科技有限公司 Live broadcast playback method, apparatus, electronic device and storage medium
CN115134631B (zh) * 2022-07-25 2024-01-30 北京达佳互联信息技术有限公司 Video processing method and video processing apparatus

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2690879B1 (fr) * 2012-07-23 2016-09-07 LG Electronics, Inc. Mobile terminal and control method thereof
KR102161230B1 (ko) * 2013-05-28 2020-09-29 삼성전자주식회사 User interface method and apparatus for searching multimedia content
CN104284249A (zh) * 2013-07-11 2015-01-14 腾讯科技(深圳)有限公司 Video playback method and apparatus
US9727215B2 (en) * 2013-11-11 2017-08-08 Htc Corporation Method for performing multimedia management utilizing tags, and associated apparatus and associated computer program product
CN104038848A (zh) * 2014-05-30 2014-09-10 无锡天脉聚源传媒科技有限公司 Video processing method and apparatus
CN104219571B (zh) * 2014-09-17 2019-05-28 传线网络科技(上海)有限公司 Method and apparatus for automatically providing video highlights
CN105302906A (zh) * 2015-10-29 2016-02-03 小米科技有限责任公司 Information annotation method and apparatus
CN105611413B (zh) * 2015-12-24 2018-10-02 小米科技有限责任公司 Method and apparatus for adding category tags to video segments
US10219040B2 (en) * 2015-12-28 2019-02-26 The Directv Group, Inc. Video frame bookmarking user interface component
CN105812941A (zh) * 2016-03-31 2016-07-27 北京金山安全软件有限公司 Video playback method, apparatus, and electronic device
CN106375860B (zh) * 2016-09-30 2020-03-03 腾讯科技(深圳)有限公司 Video playback method and apparatus, terminal, and server

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130198005A1 (en) * 2012-01-27 2013-08-01 Sony Network Entertainment International Llc System, method, and infrastructure for real-time live streaming content
CN103763581A (zh) * 2013-05-02 2014-04-30 乐视网信息技术(北京)股份有限公司 Method and system for implementing live-stream replay
CN104469512A (zh) * 2013-09-25 2015-03-25 浙江大华技术股份有限公司 Video player and method for controlling video playback thereof
CN103763626A (zh) * 2013-12-19 2014-04-30 华为软件技术有限公司 Information pushing method, device, and system
CN105007533A (zh) * 2015-07-28 2015-10-28 米科互动教育科技(北京)有限公司 Live-course playback method, apparatus, and system
CN105916035A (zh) * 2015-12-15 2016-08-31 乐视致新电子科技(天津)有限公司 Display method and apparatus for quickly locating a playback time point
CN105376588A (zh) * 2015-12-18 2016-03-02 北京金山安全软件有限公司 Live video streaming method, apparatus, and electronic device

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10635303B2 (en) 2016-06-12 2020-04-28 Apple Inc. User interface for managing controllable external devices
US10904628B2 (en) * 2018-05-07 2021-01-26 Apple Inc. User interfaces for viewing live video feeds and recorded video
US20190342621A1 (en) * 2018-05-07 2019-11-07 Apple Inc. User interfaces for viewing live video feeds and recorded video
US10820058B2 (en) 2018-05-07 2020-10-27 Apple Inc. User interfaces for viewing live video feeds and recorded video
US11785387B2 (en) 2019-05-31 2023-10-10 Apple Inc. User interfaces for managing controllable external devices
US10904029B2 (en) 2019-05-31 2021-01-26 Apple Inc. User interfaces for managing controllable external devices
US11363071B2 (en) 2019-05-31 2022-06-14 Apple Inc. User interfaces for managing a local network
US10779085B1 (en) 2019-05-31 2020-09-15 Apple Inc. User interfaces for managing controllable external devices
US11824898B2 (en) 2019-05-31 2023-11-21 Apple Inc. User interfaces for managing a local network
US11079913B1 (en) 2020-05-11 2021-08-03 Apple Inc. User interface for status indicators
US11513667B2 (en) 2020-05-11 2022-11-29 Apple Inc. User interface for audio message
US11589010B2 (en) 2020-06-03 2023-02-21 Apple Inc. Camera and visitor user interfaces
US11657614B2 (en) 2020-06-03 2023-05-23 Apple Inc. Camera and visitor user interfaces
US11937021B2 (en) 2020-06-03 2024-03-19 Apple Inc. Camera and visitor user interfaces
US11785277B2 (en) 2020-09-05 2023-10-10 Apple Inc. User interfaces for managing audio for media items
CN115103213A (zh) * 2022-06-10 2022-09-23 咪咕视讯科技有限公司 Information processing method, apparatus, device, and computer-readable storage medium
CN115103213B (zh) * 2022-06-10 2023-10-17 咪咕视讯科技有限公司 Information processing method, apparatus, device, and computer-readable storage medium

Also Published As

Publication number Publication date
CN110024412B (zh) 2020-12-25
CN110024412A (zh) 2019-07-16

Similar Documents

Publication Publication Date Title
WO2019090653A1 (fr) Method, apparatus, and system for live video streaming
US11805291B2 (en) Synchronizing media content tag data
US20200065322A1 (en) Multimedia content tags
US9633696B1 (en) Systems and methods for automatically synchronizing media to derived content
EP2901631B1 (fr) Enrichment of electronic media relating to electronic messaging
US20170019452A1 (en) Video-Production System With Social-Media Features
US20140129570A1 (en) Crowdsourcing Supplemental Content
JP2006155384A (ja) Video comment input and display method, apparatus, program, and storage medium storing the program
KR101246917B1 (ko) Method and system for sharing information between users of a media playback system
US10484756B2 (en) Presenting advertisements during media content seek
US20200133984A1 (en) Video-Production System With Social-Media Features
CN105230035A (zh) Processing of social media for selected time-shifted multimedia content
US11778286B2 (en) Systems and methods for summarizing missed portions of storylines
JP2009239729A (ja) Apparatus, method, and program for notifying of scene appearances in content
CN111512635A (zh) Method and system for selectively skipping media content
US9619123B1 (en) Acquiring and sharing content extracted from media content
US20180020034A1 (en) Video-Production System With Social-Media Features
JP2008283409A (ja) Metadata-related information generation apparatus, metadata-related information generation method, and metadata-related information generation program
KR101181732B1 (ko) Method for generating video markup data based on video fingerprint information, and information providing method and system using the same
WO2017008498A1 (fr) Program search method and device
WO2008087742A1 (fr) Movie playback system, information terminal device, and information display method
JP2013150221A (ja) 情報処理装置、情報処理方法、及びプログラム
TWI497959B (zh) Scene extraction and playback system, method, and recording medium therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17931633

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17931633

Country of ref document: EP

Kind code of ref document: A1