WO2006045061A2 - Method of synchronizing events with streamed data - Google Patents

Method of synchronizing events with streamed data

Info

Publication number
WO2006045061A2
Authority
WO
WIPO (PCT)
Prior art keywords
stream
client
data
metadata
received
Prior art date
Application number
PCT/US2005/037951
Other languages
English (en)
Other versions
WO2006045061A3 (fr)
Inventor
Samuel Sergio Tenembaum
Ivan A. Ivanoff
Jorge A. Estevez
Original Assignee
Porto Ranelli, Sa
Pi Trust
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Porto Ranelli, Sa and Pi Trust
Publication of WO2006045061A2
Publication of WO2006045061A3

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/43074 Synchronising the rendering of multiple content streams or additional data on devices, of additional data with content streams on the same device, e.g. of EPG data or interactive icon with a TV program
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40 Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/43 Querying
    • G06F16/438 Presentation of query results
    • G06F16/4387 Presentation of query results by the use of playlists
    • G06F16/4393 Multimedia presentations, e.g. slide shows, multimedia albums
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/10 Office automation; Time management
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/235 Processing of additional data, e.g. scrambling of additional data or processing content descriptors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/435 Processing of additional data, e.g. decrypting of additional data, reconstructing software from modules extracted from the transport stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45 Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/462 Content or additional data management, e.g. creating a master electronic program guide from data received from the Internet and a Head-end, controlling the complexity of a video stream by scaling the resolution or bit-rate based on the client capabilities
    • H04N21/4622 Retrieving content or additional data from different sources, e.g. from a broadcast channel and the Internet
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83 Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/84 Generation or processing of descriptive data, e.g. content descriptors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/83 Generation or processing of protective or descriptive data associated with content; Content structuring
    • H04N21/845 Structuring of content, e.g. decomposing content into time segments
    • H04N21/8455 Structuring of content, e.g. decomposing content into time segments involving pointers to the content, e.g. pointers to the I-frames of the video stream
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/858 Linking data to content, e.g. by linking an URL to a video object, by creating a hotspot

Definitions

  • the present invention relates generally to a method for synchronizing events on a computer with a stream of data being received by such computer from another computer through a network.
  • the resulting synchronization of streamed data and pre-programmed events can be used to trigger actions on the local computer based on the amount of data being received.
  • the invention enables the presentation of offers and interactivity in streamed content, matching it to specific points within the content.
  • By clicking on a hyperlink ("link") with a mouse, users effectively request the transmission of one or more data files to their local computers. Until a few years ago, before streaming of files was available, the client computer could not start accessing data in the downloaded file until the entire file had been transferred. This made large files, such as video or audio, impractical for most users, unless they were willing to wait 10 minutes or more after clicking on something on a screen before seeing the results.
  • Streaming files changed that by organizing data into a format that can be interpreted by the receiving machine as it arrives, in real time. Among other things, this permits video signals to be broadcast via a network, with the client machine rendering the media in real time as the data arrives, without having to wait for the entire file to arrive.
  • Many streaming formats have been developed by various consortiums and private entities, the most famous being part of the QuickTime (Apple), RealMedia (Real Networks) and Windows Media (Microsoft) multimedia platforms.
  • streaming formats solved the problem of accessing large linear files via slower connections, making the distribution of video and music through the Internet a viable enterprise. Nevertheless, the nature of streaming files prevents the client computer from verifying the integrity of the data received, since it has to process each packet and move on to the next incoming one. If data is lost during transmission, it is lost, and any synchronicity between elements is lost with it. With streaming files, some information can be expected to be lost.
  • the present invention solves this problem by using two parallel connections: the data stream, and an additional connection which relays metadata about the stream, such as how much has been transmitted by the streaming server.
  • the current invention functions by calculating the sync points between the stream and the programmed events based on the data sent by the server, not the data received by the client.
  • the invention utilizes two independent timelines:
  • the data stream, and an event-sync connection.
  • the data stream which is decoded and rendered as it arrives.
  • This "media timeline" is completely linear: information is displayed as data arrives; the data is used to generate the media (audio and video, for example).
  • a separate event-sync connection is established for sync purposes.
  • the information coming from this alternate connection is used on the client side to skip along the "events timeline".
  • This second timeline is independent from the stream and non-linear, meaning that the system can access any event at any time.
  • a parallel connection to the streaming server is used to report metadata (in the case of video: the amount of time, the frames per second, etc.).
  • This metadata is used by the client-side application to trigger events based on what the server indicates it has streamed, not based on what the client has received in the stream.
  • a client side component performs two parallel tasks: it receives the media stream and renders it, while at the same time triggering events based on metadata received from the streaming server via a separate link.
  • the currently preferred embodiment uses Macromedia Flash for the client side component and Macromedia Flash Server for the server side.
  • the client computer sets a mark in time (T1), effectively starting a stopwatch.
  • the client component uses the other connection to determine how much data the server has pushed.
  • the client computer can then accurately trigger events scheduled for T1 + 3 seconds.
  • Time is perceived by humans to move in a linear, sequential manner. T0 comes before T1, which comes before T2, which comes before T3, and so on. Video is presented to observers in the same way: the first frame precedes the second frame, which in turn precedes the third frame, and so on.
  • the sequence of frames presented to a user in order is defined as a timeline: the linear arrangement of frames used to represent passing time.
  • Table A shows the way in which REAL TIME and the VIDEO TIMELINE relate.
  • Table A shows a perfect match between elapsed time and presented frames. For every passing time unit the video timeline renders a unique and matching video frame. Since real time and the video timeline match, it would be possible to synchronize any event to the video timeline by using real time as a reference. Should one want to match an event to the image presented on frame 4, all that needs to be done is to instruct the program to trigger the event on second 4 (T4). If this were a real-world scenario, all that would be needed to synchronize events would be to trigger them based on elapsed real time.
  • Table B shows a case in which time and the video timeline lose their correspondence. If frame 3 is delayed during transmission and arrives at the client computer a second late, any event synchronized to it will trigger early.
  • the table shows no frame being rendered on T3, which causes a misstep and leaves the video timeline lagging behind elapsed time, placing event D on frame 3 instead of frame 4. T4 now matches frame 3, so events synched to T4 will take place not on frame 4 but on frame 3.
  • the way to match frames and events while allowing for data loss is to match the events to the transmitted data, not to elapsed time. This is achieved using a sync signal: a parallel connection between the client and the server that serves as a control stream, albeit an intermittent one.
  • the current embodiment of the invention, used to synchronize streamed video and client-based events, is built using technologies available from Macromedia and Adobe, among others.
  • Macromedia Flash is used to program a client-side module that requests and receives the stream of video while simultaneously connecting to the Flash Content Server and triggering events based on the data from this intermittent connection. In order for this to work, several steps need to take place.
  • in order for the video stream timeline and the event timeline to be synchronized as they play, they need to be matched during authoring. This is done by using the video as a guide for building the events timeline. Since the events will be programmed using Flash in the current embodiment, a video format compatible with Flash must be used.
  • the FLV streaming format is used in this example, since it is the same format that will be streamed.
  • to turn video into an FLV file, we use ADOBE AFTER EFFECTS. The video is imported into After Effects, where adjustments can be made to its size, frame rate, duration, quality, compression, etc. Once the video is of the desired size and quality, the FLV file is generated.
  • the FLV file is imported into Flash, where a key step is matching the video properties in the Flash file to those of the FLV file. The frame rate must be the same in both; otherwise the procedure will not work.
  • the resulting code will preferably look something like this (although alternative coding will be apparent to those skilled in the art):
  • FIG. 1 is a block diagram showing the computers involved and data transmitted between them, where block A is the client computer, running a web browser displaying an HTML document which holds a Flash file (swf).
  • Block B is a web server and block C is a Flash Content Server.
  • the first thing that takes place, as represented by data flow 1, is that the HTML document requests the SWF file from the web server.
  • in data flow 2, the web server returns the SWF, which is executed and requests a connection to the Flash Content Server (FCS) in data flow 3.
  • the connection is established via data flow 4, and through it the SWF requests the video stream as seen in data flow 5.
  • the SWF and the FCS remain in intermittent communication via data flow 7.
  • using T0 as a reference, along with the information in data flow 7 regarding the amount of data transmitted, the SWF file triggers events matched to their corresponding frames. In other words, the events timeline does not run linearly; it jumps from one frame to another based on the question "how much information has the FCS sent?" rather than "how much time has elapsed?"
  • onBWDone = function (p_bw) { trace("onBWDone: " + p_bw); bnc.close(); };
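The server-driven triggering described above (fire events according to how much the server reports having sent, not according to elapsed time or data received by the client) can be sketched as follows. This is a minimal illustration in JavaScript rather than the patent's actual Flash/ActionScript embodiment, and all names in it (`EventSynchronizer`, `scheduleAtFrame`, `onServerProgress`) are hypothetical:

```javascript
// Sketch of the two-timeline idea: events are keyed to frames on a
// non-linear events timeline, and fire based on the server's own report
// of how much it has streamed (received over the parallel metadata
// connection), never on locally elapsed time.
class EventSynchronizer {
  constructor(frameRate) {
    this.frameRate = frameRate; // must match the FLV's frame rate
    this.events = [];           // pending { frame, action }, sorted by frame
    this.fired = [];            // frames whose events have already fired
  }

  // Schedule an action against a frame on the events timeline.
  scheduleAtFrame(frame, action) {
    this.events.push({ frame, action });
    this.events.sort((a, b) => a.frame - b.frame);
  }

  // Called whenever the sync connection reports server-side progress,
  // e.g. seconds of video the server says it has pushed.
  onServerProgress(secondsStreamed) {
    const framesSent = Math.floor(secondsStreamed * this.frameRate);
    // Fire every event whose frame the server has already transmitted;
    // a delayed or lost packet on the media stream cannot shift this.
    while (this.events.length && this.events[0].frame <= framesSent) {
      const ev = this.events.shift();
      ev.action();
      this.fired.push(ev.frame);
    }
  }
}
```

For example, with a 1 fps stream, an event scheduled at frame 4 stays pending while the server reports 3 seconds sent, and fires as soon as it reports 4, regardless of whether the client has rendered frame 4 yet.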

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Strategic Management (AREA)
  • Databases & Information Systems (AREA)
  • Entrepreneurship & Innovation (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Development Economics (AREA)
  • Finance (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Accounting & Taxation (AREA)
  • Tourism & Hospitality (AREA)
  • General Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Operations Research (AREA)
  • Game Theory and Decision Science (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The present invention relates to a method for synchronizing events on a computer with data streamed into the same computer from a server in real time. The method enables the coordination, organization and presentation of information, graphics, video, e-commerce and any other computer event with the data arriving at the computer as a streamed file, as it arrives. The commercial possibilities are diverse, ranging from products and services offered according to what the user is currently watching, to the reinforcement of the multimedia experience by other means. The invention enables the synchronization of multiple timelines.
PCT/US2005/037951 2004-10-19 2005-10-19 Method of synchronizing events with streamed data WO2006045061A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US62020704P 2004-10-19 2004-10-19
US60/620,207 2004-10-19

Publications (2)

Publication Number Publication Date
WO2006045061A2 true WO2006045061A2 (fr) 2006-04-27
WO2006045061A3 WO2006045061A3 (fr) 2006-06-22

Family

ID=36203718

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2005/037951 WO2006045061A2 (fr) 2004-10-19 2005-10-19 Method of synchronizing events with streamed data

Country Status (1)

Country Link
WO (1) WO2006045061A2 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090158147A1 (en) * 2007-12-14 2009-06-18 Amacker Matthew W System and method of presenting media data

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030088511A1 (en) * 2001-07-05 2003-05-08 Karboulonis Peter Panagiotis Method and system for access and usage management of a server/client application by a wireless communications appliance
US20030229899A1 (en) * 2002-05-03 2003-12-11 Matthew Thompson System and method for providing synchronized events to a television application
US6701383B1 (en) * 1999-06-22 2004-03-02 Interactive Video Technologies, Inc. Cross-platform framework-independent synchronization abstraction layer
US6788333B1 (en) * 2000-07-07 2004-09-07 Microsoft Corporation Panoramic video

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6701383B1 (en) * 1999-06-22 2004-03-02 Interactive Video Technologies, Inc. Cross-platform framework-independent synchronization abstraction layer
US6788333B1 (en) * 2000-07-07 2004-09-07 Microsoft Corporation Panoramic video
US20030088511A1 (en) * 2001-07-05 2003-05-08 Karboulonis Peter Panagiotis Method and system for access and usage management of a server/client application by a wireless communications appliance
US20030229899A1 (en) * 2002-05-03 2003-12-11 Matthew Thompson System and method for providing synchronized events to a television application

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090158147A1 (en) * 2007-12-14 2009-06-18 Amacker Matthew W System and method of presenting media data
US9275056B2 (en) * 2007-12-14 2016-03-01 Amazon Technologies, Inc. System and method of presenting media data
US10248631B2 (en) 2007-12-14 2019-04-02 Amazon Technologies, Inc. System and method of presenting media data

Also Published As

Publication number Publication date
WO2006045061A3 (fr) 2006-06-22

Similar Documents

Publication Publication Date Title
JP6783293B2 (ja) Synchronizing multiple over-the-top streaming clients
US9171545B2 (en) Browsing and retrieval of full broadcast-quality video
US20050154679A1 (en) System for inserting interactive media within a presentation
CN111010614A (zh) Method, apparatus, server and medium for displaying live-streaming subtitles
US20090106357A1 (en) Synchronized Media Playback Using Autonomous Clients Over Standard Internet Protocols
US8737804B2 (en) System for delayed video viewing
EP1126714A2 (fr) Appareil de réception de données, méthode de réception de données, méthode de transmission de données et support de stockage de données
KR20170074866A (ko) Receiving device, transmitting device, and data processing method
Boronat et al. HbbTV-compliant platform for hybrid media delivery and synchronization on single-and multi-device scenarios
Van Deventer et al. Standards for multi-stream and multi-device media synchronization
US20130057759A1 (en) Live Audio Track Additions to Digital Streams
CN104604245B (zh) Presentation timing control
CN109756744B (zh) Data processing method, electronic device and computer storage medium
WO2019088853A1 (fr) Live audio replacement in a digital stream
US20230336842A1 (en) Information processing apparatus, information processing method, and program for presenting reproduced video including service object and adding additional image indicating the service object
CN111669605B (zh) Method and apparatus for synchronizing multimedia data with its associated interactive data
KR101520788B1 (ko) Method for synchronized playback of video
CN114697712B (zh) Media stream downloading method, apparatus, device and storage medium
WO2006045061A2 (fr) Method of synchronizing events with streamed data
Concolato et al. Live HTTP streaming of video and subtitles within a browser
CN106537930B (zh) Client for implementing a multimedia stream service presentation method
van Deventer et al. Media synchronisation for television services through HbbTV
US11689776B2 (en) Information processing apparatus, information processing apparatus, and program
US20080148319A1 (en) Coordinating web media with time-shifted broadcast
KR102273795B1 (ko) System for video synchronization processing and control method thereof

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application
32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: COMMUNICATION UNDER RULE 69 EPC ( EPO FORM 1205A DATED 14/09/07 )

122 Ep: pct application non-entry in european phase

Ref document number: 05811829

Country of ref document: EP

Kind code of ref document: A2