WO2010082199A1 - Video-associated objects - Google Patents

Video-associated objects

Info

Publication number
WO2010082199A1
Authority
WO
WIPO (PCT)
Prior art keywords
event
video
displaying
computerized system
media player
Prior art date
Application number
PCT/IL2010/000037
Other languages
French (fr)
Inventor
Izhak Zvi Netter
Tal Chalozin
Zack Zigdon
Original Assignee
Innovid Inc.
Priority date
Filing date
Publication date
Application filed by Innovid Inc.
Priority to EP10731118A (published as EP2387850A4)
Publication of WO2010082199A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/70 Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F 16/74 Browsing; Visualisation therefor
    • G06F 16/748 Hypervideo
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/90 Details of database functions independent of the retrieved data types
    • G06F 16/95 Retrieval from the web
    • G06F 16/958 Organisation or management of web site content, e.g. publishing, maintaining pages or automatic linking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/02 Marketing; Price estimation or determination; Fundraising

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Multimedia (AREA)
  • Databases & Information Systems (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Strategic Management (AREA)
  • Development Economics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Game Theory and Decision Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • Information Transfer Between Computers (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A companion object to a media player, such as a video player, is responsive to an event associated with the video player. The event may be associated with the data stream displayed by the video player. The event may be associated with an object displayed by the video player. The companion object is displayed outside the display layout of the video player. The companion object and the video player may be displayed by a web browser in a web page. The companion object and the video player may be executed in a securely separated manner.

Description

VIDEO-ASSOCIATED OBJECTS
CROSS-REFERENCE TO RELATED APPLICATIONS
The present application claims the benefit of U.S. Provisional Application 61/144,477 titled "Multiple components manager", filed January 14, 2009, the contents of which are hereby incorporated by reference herein.
BACKGROUND
The present disclosure relates to enriching video content in general, and to enriching video content using external objects, in particular. Modern technology enables serving video content to clients over a computerized network, such as the Internet. Additional objects may be introduced into the video in order to improve the user experience, to provide value to video owners, such as by serving commercial advertisements, and the like.
PCT Publication number WO2009/101623 titled "INSERTING INTERACTIVE OBJECTS INTO VIDEO CONTENT", which is hereby incorporated by reference, discloses dynamic matching of objects to a video, to increase utility of the inserted object.
PCT Publication number WO2009/101624 titled "APPARATUS AND METHOD FOR MANIPULATING AN OBJECT INSERTED TO VIDEO CONTENT", which is hereby incorporated by reference, discloses dynamic insertion of objects to a video.
Other methods, systems and products for matching objects to videos and inserting the objects into a video include static matching, static insertion, utilization of predetermined applications such as object-specific video players, and the like.
BRIEF SUMMARY OF THE INVENTION
One exemplary embodiment of the disclosed subject matter is a computerized system comprising: a media player configured to display a data stream, the media player having a display layout; an event detector configured to identify an event associated with the data stream displayed by the media player; and an object configured to be displayed in a location outside the display layout, wherein the object is responsive to the event identified by the event detector.
Optionally, the media player and the object are configured to be executed in a securely separated manner.
Optionally, the media player and the object are configured to be executed by a separate virtual machine.
Optionally, the object is an interactive object.
Optionally, the object is configured to perform a predetermined animation in response to the event.
Optionally, the media player comprises a second object, the second object being configured to perform a second predetermined animation in response to the event; and the computerized system further comprises a coordinating module configured to coordinate the predetermined animation and the second predetermined animation.
Optionally, the media player comprises a second object, the second object being configured to perform a predetermined animation in response to the event.
Optionally, the predetermined animation is based upon the location of the object.
Optionally, the event is selected from the group consisting of an interaction with a second object, a tracking event of an entity in the video, a placement event of an entity in the video, a frame event, a keyword event and an ambient event of the video.
Optionally, the event comprises a frame identification and a target object identification, wherein the target object identification is associated with the object.
Optionally, the media player and the object are configured to be displayed by a web browser.
Optionally, the media player and the object are configured to be displayed in a web page.
Another exemplary embodiment of the disclosed subject matter is a method for utilizing a video-associated object in a computerized environment, the method comprising: displaying a data stream in a display layout by a media player; identifying an event associated with the data stream; and, in response to the event, displaying an object in a location outside the display layout.
Optionally, the method further comprises, in response to the event, displaying a second object in the display layout, wherein the displaying of the second object comprises performing a predetermined animation.
Optionally, the performing the predetermined animation comprises at least one of the following: playing a media content; displaying an image; displaying an animated image; displaying a message; and modifying a current display to a modified display.
Optionally, the predetermined animation is performed in accordance with a relative position between the display layout and the location. Optionally, the displaying the object comprises performing a second predetermined animation; wherein the predetermined animation and the second predetermined animation are coordinated.
Optionally, the displaying the data stream and the displaying the object are performed in separated resource environments. Optionally, the displaying the data stream comprises displaying the data stream in a web page; and the displaying the object comprises displaying the displayable object in the web page.
Yet another exemplary embodiment of the disclosed subject matter is a computer program product comprising: a computer readable medium; a first program instruction for displaying a data stream in a display layout by a media player; a second program instruction for identifying an event associated with the data stream; and a third program instruction for displaying an object in a location in response to the event, the location being outside the display layout; wherein the first, second and third program instructions are stored on the computer readable medium.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
The present disclosed subject matter will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings in which corresponding or like numerals or characters indicate corresponding or like components. Unless indicated otherwise, the drawings provide exemplary embodiments or aspects of the disclosure and do not limit the scope of the disclosure. In the drawings:
Fig. 1 shows a computerized environment in which the disclosed subject matter is used, in accordance with some exemplary embodiments of the subject matter; Figs. 2A and 2B show a web page, in accordance with some exemplary embodiments of the disclosed subject matter;
Fig. 3 shows a block diagram of a computerized system, in accordance with some exemplary embodiments of the disclosed subject matter; and
Fig. 4 shows a flowchart diagram of a method, in accordance with some exemplary embodiments of the disclosed subject matter.
DETAILED DESCRIPTION
The disclosed subject matter is described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the subject matter. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
One technical problem dealt with by the disclosed subject matter is to enable an interaction between a video player and an object. The object may be external to the video player. The video player may be enriched, enhanced or otherwise extended with another object, such as an interactive object. The video player and the object may be executed in a securely separated manner, such as in a different sandbox or by a virtual machine.
Another technical problem dealt with by the disclosed subject matter is to provide an infrastructure for such interaction between an object and a video matched on the fly. Yet another technical problem dealt with by the disclosed subject matter is to coordinate between animation of the object and animation of the video player, such as, for example, to provide a synchronized animation, a video-context-dependent animation by the object, or the like.
One technical solution is to provide an event detector to identify an event associated with the video player or an object presented by the video player. The event detector may dispatch the event to the external object. The external object may be responsive to the event.
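By way of a non-limiting illustration, the following TypeScript sketch shows one way such an event detector and its dispatching could be realized; all names (VideoEvent, ExternalObject, EventDetector) and the frame-annotation scheme are hypothetical and are not taken from the disclosure.

```typescript
// Hypothetical event model: a detected event carries a type and may carry
// a target-object identification (see the discussion of identifications below).
interface VideoEvent {
  type: "interaction" | "frame" | "keyword" | "ambient";
  frame?: number;
  targetObjectIds?: string[];
  payload?: unknown;
}

// An external (companion) object exposes a handler so that it can be
// responsive to events identified by the detector.
interface ExternalObject {
  id: string;
  onEvent(event: VideoEvent): void;
}

// A minimal event detector: it watches the current frame of the media
// player and emits pre-annotated events to every registered external object.
class EventDetector {
  private listeners: ExternalObject[] = [];

  // Frames for which an event should be raised, e.g. frames that were
  // annotated with keywords or hot spots ahead of time.
  constructor(private annotatedFrames: Map<number, VideoEvent>) {}

  register(obj: ExternalObject): void {
    this.listeners.push(obj);
  }

  // Called by the media player whenever a new frame is rendered.
  onFrame(frame: number): void {
    const event = this.annotatedFrames.get(frame);
    if (event) {
      this.dispatch({ ...event, frame });
    }
  }

  dispatch(event: VideoEvent): void {
    for (const listener of this.listeners) {
      listener.onEvent(event);
    }
  }
}
```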
Another technical solution is to provide a coordinating module to coordinate predetermined animation associated with the video player, the external object or other components.
Yet another technical solution is to have a predetermined animation associated with an object. The predetermined animation may be parameter-dependent; for example, it may be modified based on a relative location of the video player and the external object.
One technical effect of utilizing the disclosed subject matter is providing a modified display based on an identification of an event. Another technical effect is achieving synchronized or otherwise coordinated animation between two components, such as the video player and the external object. The components may be executed separately, such as by a virtual machine, and still be coordinated.
Referring now to Figure 1 showing a computerized environment in which the disclosed subject matter is used, in accordance with some exemplary embodiments of the subject matter. A computerized environment 100 may comprise a computerized network 110. The computerized network 110 may connect one or more computerized clients. The computerized network 110 may be the Internet, an Intranet, a WAN, a LAN, a WiFi-based network, a cellular network, or the like.
In some exemplary embodiments, a user 150 may utilize a client 140 to consume a data stream retrieved from the network 110. In some exemplary embodiments, the data stream may represent a video having a plurality of frames, a video having a soundtrack, an interactive video, or the like. The user 150 may consume the data stream by, for example, viewing the video represented by the data stream. In some exemplary embodiments, the client 140 may be a personal computer, a laptop, a Personal Digital Assistant (PDA), a mobile device, a cellular phone, or the like. The client 140 may comprise a web browser configured to display the data stream retrieved from the network 110.
In some exemplary embodiments, the user 150 may view a web page, such as one served by a web server 120. The web server 120 may provide the client 140 with the web page including a video player, such as, for example, a flash-based video player. The web page may include a video player configured to play a predetermined video, such as one selected by a content server 130. The content server 130 may select a video or other content from a video storage 135. The content server 130 may select the content based on user input of the user 150, based on the IP address of the client 140, based on a history of the user 150 or the client 140, or the like. Other parameters, rules and configurations may apply to the selection of the video from the video storage 135.
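As a purely illustrative sketch of such a selection rule, the following TypeScript function chooses a video based on explicit user input, then on viewing history, then on a fallback; the request fields and the ordering of the rules are assumptions rather than part of the disclosure.

```typescript
// A hypothetical selection rule for a content server: prefer explicit user
// input, then a video the client has not yet watched, then any stored video.
interface VideoRequest {
  requestedVideoId?: string; // explicit user input, if any
  clientIp: string;          // could drive regional defaults (not shown)
  history: string[];         // identifiers of previously watched videos
}

function selectVideo(
  request: VideoRequest,
  storage: Map<string, { id: string }>
): string | undefined {
  if (request.requestedVideoId && storage.has(request.requestedVideoId)) {
    return request.requestedVideoId;
  }
  for (const id of storage.keys()) {
    if (!request.history.includes(id)) {
      return id; // something this client has not yet watched
    }
  }
  return storage.keys().next().value; // fall back to any stored video
}
```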
In some exemplary embodiments, the web page provided by the web server 120 may comprise additional content, such as objects selected from a container storage 138. The container storage 138 may retain objects, such as interactive objects, displayable objects or the like. Some of the objects may be inserted into a video, such as, for example, as a ticker, in a seamless manner, in accordance with a perspective of a frame of the video, or the like. Some of the objects may be displayed outside a video, such as a companion to the video.
There may be many different companion objects. Some companion objects may provide additional information to that detailed in the video. Some companion objects may display an advertisement. Some companion objects may be associated with an object, where the object may be inserted to the video. Some companion objects may be predeterminedly associated with a video. Some companion objects may be predeterminedly associated with another object. Some companion objects may be associated with metadata of a video, such as for example, a keyword, a predetermined frame or the like. Some companion objects may be associated with a video on-the-fly. In some exemplary embodiments, the companion may display additional information in response to an event associated with a video. In some exemplary embodiments, the companion may display additional information in response to an interaction of the user with the video or with an object displayed in the video, such as an object dynamically inserted to the video as an overlay on the video.
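One possible, non-authoritative way to model such companion objects is as declarative descriptors recording an identification, layout parameters and the manner of association with a video; the TypeScript sketch below is illustrative only and all field names are assumptions.

```typescript
// A hypothetical descriptor for a companion object. The association field
// records how the companion is matched to a video: by video, by keyword,
// by a predetermined frame, by another object, or on-the-fly.
interface CompanionDescriptor {
  id: string;
  // Layout parameters: size and position outside the video player.
  layout: { width: number; height: number; x: number; y: number };
  association:
    | { kind: "video"; videoId: string }
    | { kind: "keyword"; keyword: string }
    | { kind: "frame"; videoId: string; frame: number }
    | { kind: "object"; objectId: string }
    | { kind: "on-the-fly" };
  // Content shown when the companion is activated by an event,
  // e.g. an advertisement image or additional information.
  contentUrl: string;
}

// Example: a companion advertisement associated with the keyword "apple"
// (the URL is a placeholder).
const appleCompanion: CompanionDescriptor = {
  id: "companion-240",
  layout: { width: 300, height: 250, x: 660, y: 0 },
  association: { kind: "keyword", keyword: "apple" },
  contentUrl: "https://example.com/apple-ad.png",
};
```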
Figures 2A and 2B disclose an exemplary usage of the disclosed subject matter. Figures 2A and 2B both show a web page in accordance with some exemplary embodiments of the disclosed subject matter. In some exemplary embodiments, the web page may be defined by a markup language, such as, for example, HTML. The web page may be embedded with scripts, applets, programs or the like, as is known in the art. In some exemplary embodiments, the web page may be presented by a web browser, by an email client, by a word processor, or by another program configured to display a page defined by the markup language.
Referring now to Figure 2A, a web browser 200 may display a web page. The web page may comprise a video player 210 and an object 240. The video player 210 may be a flash-based video player, a proprietary video player, an open-source video player or the like. In some exemplary embodiments, the video player 210 may be a standard video player extended by an extension to provide the functionality in accordance with the disclosed subject matter. The object 240 may be displayed side-by-side to the video player 210, above or below it, or elsewhere. In some exemplary embodiments, the object 240 is displayed in an external position relative to the video player 210. The object 240 may be displayed or may be in an undisplayed state.
In Figure 2A, the object 240 is displayed and its content is a picture of pans. The pans may be displayed as part of an advertisement. In Figure 2A, the video player 210 displays a video of a person cooking. The person can be seen cutting an apple. A user, such as 150 of Figure 1, may utilize a pointing device, such as a computer mouse, a touch screen or the like, to interact with the video, such as by pointing to an object using a cursor 220. The user may interact with an object such as an apple 215. The user may click on the apple 215, hover above it, or otherwise interact with the apple 215. It will be noted that the apple 215 may be an object, such as a dynamically inserted object, may be associated with a hot spot, or may otherwise be defined for the exemplary interaction disclosed in Figure 2A. It will be noted that a hot spot, as is known in the art, is an area within one or more frames of a media item, such as a video, that enables an interaction by the user. In an exemplary embodiment, a hot spot may be defined as an area of the apple 215 in the video, and upon a user interaction, such as by clicking in the hot spot, an event may be triggered.
In response to the interaction of the user with the apple 215, an animation 230 may be displayed. The animation 230 may be an image of an apple moving from the apple 215 to the object 240. The animation 230 may depend on various parameters, such as the location of the interaction, a relative location of the object 240 to the video player 210, a frame of the video being displayed in the video player, the object with which the user interacted (e.g., the apple 215), or the like. The animation 230 may comprise an image of an apple similar to the apple 215, animated from the location of the apple 215 to the location of the object 240.
Referring now to Figure 2B, the web page is displayed in the web browser 200 after the animation 230 of Figure 2A has ended. The object 240 may display another animation, an image or the like. The object 240 displays, for example, a commercial associated with the apple 215.
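The hot-spot mechanism described above for the apple 215 can be illustrated with a short TypeScript sketch: a hot spot active over a range of frames is hit-tested against a user click in order to trigger an event. The data structures and names are hypothetical.

```typescript
// A hypothetical hot spot: an area within a range of frames that enables
// an interaction, as described for the apple 215.
interface HotSpot {
  objectId: string;                  // e.g. the dynamically inserted apple object
  frames: { from: number; to: number };
  area: { x: number; y: number; width: number; height: number };
}

// Hit-test a click against the hot spots that are active for the frame
// currently being displayed; return the identification of the matching object.
function hitTest(
  hotSpots: HotSpot[],
  frame: number,
  clickX: number,
  clickY: number
): string | undefined {
  for (const spot of hotSpots) {
    const inFrames = frame >= spot.frames.from && frame <= spot.frames.to;
    const inArea =
      clickX >= spot.area.x &&
      clickX <= spot.area.x + spot.area.width &&
      clickY >= spot.area.y &&
      clickY <= spot.area.y + spot.area.height;
    if (inFrames && inArea) {
      return spot.objectId; // an interaction event may now be triggered
    }
  }
  return undefined;
}
```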
In some exemplary embodiments, the video player 210 may pause the video being displayed in response to an interaction of the user via the cursor 220. In some exemplary embodiments, the video player 210 may resume the video upon predetermined action. In other exemplary embodiments, the video player 210 may continue to show the video while the animation 230 is displayed.
In some exemplary embodiments, there may be an animation associated with the object 240, without any additional animation related to the video, such as the animation 230 of Figure 2A.
In some exemplary embodiments, the video displayed in the video player 210 may be a pre-roll commercial, such as a video commercial displayed before displaying a requested video. The video may be a post-roll, a video displayed during commercial intermission or the like.
In some exemplary embodiments, the object 240, which may be referred to as a companion object, or a video orbit, may be described by an identification and layout parameters, such as size, position and the like. The object 240 may be responsive to events associated with the identification. For example, in some cases multiple objects similar to the object 240 may be presented, each associated with a different identification. Some events may induce a functionality of some of the multiple objects. The disclosed subject matter may dispatch the event to objects using their identification. In some exemplary embodiments, the event may comprise an identification, which may be used to determine to which objects the event may be dispatched or which objects may be responsive to the event.
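A minimal TypeScript sketch of such identification-based dispatching is given below; the registry, the event shape and the handler signature are assumptions used only to illustrate routing an event to the companion objects whose identifications it carries.

```typescript
// Hypothetical routing of events by identification: when several companion
// objects are present, an event carrying target object identifications is
// dispatched only to the matching objects.
interface IdentifiedEvent {
  frame: number;
  targetObjectIds: string[];
  payload?: unknown;
}

class CompanionRegistry {
  private companions = new Map<string, (e: IdentifiedEvent) => void>();

  register(id: string, handler: (e: IdentifiedEvent) => void): void {
    this.companions.set(id, handler);
  }

  dispatch(event: IdentifiedEvent): void {
    for (const id of event.targetObjectIds) {
      const handler = this.companions.get(id);
      if (handler) {
        handler(event); // only the identified companions respond
      }
    }
  }
}
```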
Referring now to Figure 3 showing a block diagram of a computerized system, in accordance with some exemplary embodiments of the disclosed subject matter. A video-serving device 300, such as comprised by a web server 120 of Figure 1, may be configured to serve a video to a user, such as 150 of Figure 1, in accordance with the disclosed subject matter. The video-serving device 300 may comprise a video player 310, an event detector 320, a coordinating module 340, an I/O module 305 and a processor 302.
In some exemplary embodiments, the video-serving device 300 may comprise a processor 302. The processor 302 may be a Central Processing Unit (CPU), a microprocessor, an electronic circuit, an Integrated Circuit (IC) or the like. The processor 302 may be utilized to perform computations required by the video-serving device 300 or any of its subcomponents.
In some exemplary embodiments, the video-serving device 300 may comprise an Input/Output (I/O) module 305.
In some exemplary embodiments, the I/O module 305 may provide an interface to a content server 360, such as 130 of Figure 1. The content server 360 may provide a video from a video storage 365, such as 135 of Figure 1. The video storage 365 may be a database, a storage server, or the like.
In some exemplary embodiments, the I/O module 305 may provide an interface to a container storage 370, such as 138 of Figure 1. The container storage 370 may be a database, a storage server or the like. The container storage 370 may retain objects, such as an object 380. The object 380 may be a displayable object, such as the apple 215 of Figure 2A or the object 240 of Figure 2A. The object 380 may be an interactive object. The object 380 may be associated with a predetermined video, a set of videos, a plurality of videos having a common characteristic or the like.
In some exemplary embodiments, the I/O module 305 may provide an interface to a web browser 330. The I/O module 305 may enable serving a web page to a client, such as 140 of Figure 1, via a network, such as 110 of Figure 1.
In some exemplary embodiments, the I/O module 305 may comprise a communication device to enable communication over the network.
In some exemplary embodiments, the video player 310 may be configured to play a video. The video may be retrieved by the content server 360. The video player 310 may be configured to play a video on the web browser 330. The video player 310 may, for example, be a downloadable Java program, a flash video player, an applet or the like. The video player 310 may enable functionality in accordance with the disclosed subject matter. The video player 310 may be associated with a display layout, such as a location within a web page when displayed by the web browser 330.
In some exemplary embodiments, the event detector 320 may be configured to identify an event associated with the video player 310. The event may be an interactive event, such as a click by a user, a cursor interaction, a keyboard stroke or the like. The event may be associated with a data stream representative of a video being displayed by the video player 310. The event may be a tracking event based on a determined movement of an object within the video. The event may be a placement event based on a determined placement of an entity in the video. The event may be a frame event, such as playing a predetermined frame or set of frames. The event may be a keyword event, such as a keyword being associated with a frame being played. The keyword may be predeterminedly associated with the frame, such as by manual or automatic processing. The keyword may be dynamically determined, such as by face recognition, computerized vision or other machine-implemented algorithms. The event may be an ambient event representative of a change in ambiance in a video, such as determined based on sound, color scheme, manual determination or the like. It will be noted that other events may be utilized. The events may utilize other characteristics of the metadata associated with frames or videos, other types of user inputs and the like.
In some exemplary embodiments, the coordinating module 340 may be configured to coordinate animation of two or more elements. The coordinating module 340 may coordinate animation of an object, such as 240 of Figure 2A, and the video being played by the video player 310. For example, the coordinating module 340 may synchronize the object such that the object may be animated in accordance with a tempo associated with a soundtrack of the video. The coordinating module 340 may synchronize an animation, such as 230 of Figure 2A, and an object, such as 240 of Figure 2B, such that an order between different animation elements may be determined. For example, a predetermined order may induce that the animation 230 is first performed, and once the element animated in 230 reaches the object 240, or almost reaches the object 240, the image in the object 240 is modified.
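The ordering behaviour of the coordinating module 340 can be illustrated by the following TypeScript sketch, in which animation steps are queued and played sequentially so that the companion is only modified once the preceding animation has completed; the class and the placeholder animation functions are hypothetical.

```typescript
// A hypothetical coordinating module: it enforces a predetermined order
// between animation steps, e.g. first animate an apple image from the hot
// spot toward the companion object, and only then modify the image shown
// by the companion, as in Figures 2A and 2B.
type AnimationStep = () => Promise<void>;

class CoordinatingModule {
  private steps: AnimationStep[] = [];

  add(step: AnimationStep): void {
    this.steps.push(step);
  }

  // Play the queued steps sequentially, so that a later animation only
  // starts once the previous one has finished.
  async play(): Promise<void> {
    for (const step of this.steps) {
      await step();
    }
  }
}

// Illustrative usage with placeholder animation functions.
declare function animateAppleTowardCompanion(): Promise<void>;
declare function showCommercialInCompanion(): Promise<void>;

async function onAppleClicked(): Promise<void> {
  const coordinator = new CoordinatingModule();
  coordinator.add(animateAppleTowardCompanion);
  coordinator.add(showCommercialInCompanion);
  await coordinator.play();
}
```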
In some exemplary embodiments, the web browser 330 is configured to display a web page in accordance with the disclosed subject matter. The web browser 330 may be a standard web browser, such as Google Chrome, Mozilla Firefox or the like. The web browser 330 may be executed by a client, such as 140 of Figure 1. The web browser 330 may utilize a virtual machine 335 to provide a separation between executed elements of the web page, such as the video player 310 and the object 380. The separation may be performed as is known in the art. In some exemplary embodiments, the virtual machine 335 may provide a unique sandbox for each applet. The virtual machine 335 may be a virtual machine associated with a predetermined computing language, such as, for example, Java. Other exemplary embodiments may utilize different devices to achieve separation.
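In a present-day browser, one way to approximate such securely separated execution, substituted here for the Java virtual machine sandbox named above, is to host the player and the companion in separate cross-origin iframes that communicate only via postMessage. The TypeScript sketch below is an assumption-laden illustration; the origins, element identifiers and message handling are placeholders.

```typescript
// Hypothetical origins; in practice these would be the real origins of the
// embedding page, the player iframe and the companion iframe.
const PAGE_ORIGIN = "https://publisher.example.com";
const PLAYER_ORIGIN = "https://player.example.com";
const COMPANION_ORIGIN = "https://companion.example.com";

// Runs in the embedding web page: relay messages coming from the player
// frame to the companion frame, so neither frame accesses the other directly.
function relayEventsInEmbeddingPage(): void {
  window.addEventListener("message", (msg: MessageEvent) => {
    if (msg.origin !== PLAYER_ORIGIN) {
      return; // only accept events originating in the player sandbox
    }
    const companion =
      document.getElementById("companion") as HTMLIFrameElement | null;
    companion?.contentWindow?.postMessage(msg.data, COMPANION_ORIGIN);
  });
}

// Runs inside the companion iframe: react to events relayed by the page.
function listenForEventsInCompanion(onEvent: (data: unknown) => void): void {
  window.addEventListener("message", (msg: MessageEvent) => {
    if (msg.origin !== PAGE_ORIGIN) {
      return;
    }
    onEvent(msg.data); // e.g. start the companion's predetermined animation
  });
}
```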
Referring now to Figure 4 showing a flowchart diagram of a method, in accordance with some exemplary embodiments of the disclosed subject matter.
In step 400, a video may be displayed in a video player. The video player may be 310 of Figure 3. The video player may be displayed in a display layout. The video player may be displayed by a web browser, such as 330 of Figure 3.
In step 410, an event may be detected. The event may be detected by an event detector 320 of Figure 3. Step 410 may comprise step 415 of detecting an interaction of a user with a first object. The first object may be inside the video, such as, for example, a hot spot object above the apple 215 of Figure 2A. Step 410 may comprise step 420 of detecting metadata associated with a frame of the video being displayed. Other events may be similarly detected.
In step 430, a predetermined animation may be coordinated. The animation may be coordinated by a coordinating module 340 of Figure 3. The coordination may induce an order of elements within the predetermined animation, between elements from different animations, between elements of animation and frames of the video, and the like. In some cases, the coordination may induce a synchronization of the animation.
In step 440, a second object may be displayed outside the display layout of the video player. The second object may be previously displayed, and in response to the event of step 410, the display of the second object may be modified.
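The steps of Figure 4 can be tied together as in the following self-contained TypeScript sketch, in which all declared functions are placeholders standing in for the player, detector and coordination facilities described above; the sketch is illustrative and not a definitive implementation.

```typescript
// A compact sketch of the method of Figure 4. None of the declared
// functions is an actual API; they stand in for a real player, event
// detector and coordinating module.
declare function playVideoInLayout(videoUrl: string): void;                     // step 400
declare function onUserInteraction(handler: (objectId: string) => void): void;  // steps 410/415
declare function runAnimationToward(companionId: string): Promise<void>;        // steps 430/450
declare function updateCompanion(companionId: string, contentUrl: string): void; // step 440

function serveVideoWithCompanion(videoUrl: string, companionId: string): void {
  // Step 400: display the video in its display layout.
  playVideoInLayout(videoUrl);

  // Steps 410 and 415: detect an interaction with an object in the video.
  onUserInteraction(async (objectId: string) => {
    // Steps 430 and 450: coordinate the animation so that the companion is
    // only updated after the in-video animation has reached it.
    await runAnimationToward(companionId);

    // Step 440: modify the companion displayed outside the layout; the
    // content path is a placeholder.
    updateCompanion(companionId, `/content/${objectId}.png`);
  });
}
```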
In step 450, coordination may be performed in a similar manner to that in step 430. In some exemplary embodiments, the disclosed subject matter may be utilized to display in the second object information regarding elements displayed in the video. For example, in case the video shows a plurality of products, the object may list each product in response to a pertinent event. Such an exemplary embodiment may be useful to increase sales. The products may be listed with other relevant information, such as price, weight or the like. The products may be displayed with additional data which may be retrieved from other sources, such as Internet sites. A sketch of one such product listing is given after the following paragraph.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of program code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
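As referenced above, the product-listing companion could, for example, be sketched as follows in TypeScript; the product metadata, the frame-to-product index and the rendering details are hypothetical.

```typescript
// Hypothetical product metadata associated with frames of the video.
interface Product {
  name: string;
  price: number;    // assumed currency units
  weightKg?: number;
}

// A frame-to-products index, e.g. prepared by manual or automatic
// annotation of the video; the entries are placeholders.
const productsByFrame = new Map<number, Product[]>([
  [120, [{ name: "Frying pan", price: 39.9, weightKg: 1.2 }]],
  [480, [{ name: "Chef's knife", price: 24.5 }]],
]);

// On a pertinent frame event, render the products (with price and weight)
// as a simple list inside the companion element.
function listProductsInCompanion(frame: number, companion: HTMLElement): void {
  const products = productsByFrame.get(frame) ?? [];
  companion.innerHTML = products
    .map((p) => {
      const weight = p.weightKg !== undefined ? `, ${p.weightKg} kg` : "";
      return `<li>${p.name}: ${p.price.toFixed(2)}${weight}</li>`;
    })
    .join("");
}
```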
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As will be appreciated by one skilled in the art, the disclosed subject matter may be embodied as a system, method or computer program product. Accordingly, the disclosed subject matter may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, the present invention may take the form of a computer program product embodied in any tangible medium of expression having computer-usable program code embodied in the medium.
Any combination of one or more computer usable or computer readable medium(s) may be utilized. The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CDROM), an optical storage device, a transmission media such as those supporting the Internet or an intranet, or a magnetic storage device. Note that the computer-usable or computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory. In the context of this document, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-usable medium may include a propagated data signal with the computer-usable program code embodied therewith, either in baseband or as part of a carrier wave. The computer usable program code may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, and the like.

Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The embodiment was chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.

Claims

CLAIMS
What is claimed is:
1. A computerized system comprising: a media player configured to display a data stream, wherein said media player having a display layout; an event detector configured to identify an event associated with the data stream displayed by said media player; and an object, wherein said object is configured to be displayed in a location, the location is outside the display layout, wherein said object is responsive to the event identified by said event detector.
2. The computerized system of claim 1, wherein said media player and said object are configured to be executed in a securely separated manner.
3. The computerized system of claim 2, wherein said media player and said object are configured to be executed by a separate virtual machine.
4. The computerized system of claim 1, wherein said object is an interactive object.
5. The computerized system of claim 1, wherein said object is configured to perform a predetermined animation in response to the event.
6. The computerized system of claim 5, wherein said media player comprising a second object, the second object is configured to perform a second predetermined animation in response to the event; and the computerized system further comprising a coordinating module configured to coordinate the predetermined animation and the second predetermined animation.
7. The computerized system of claim 1, wherein said media player comprising a second object, the second object is configured to perform a predetermined animation in response to the event.
8. The computerized system of claim 7, wherein the predetermined animation is based upon the location of said object.
9. The computerized system of claim 1, wherein the event is selected from the group consisting of an interaction with a second object, a tracking event of an entity in the video, a placement event of an entity in the video, a frame event, a keyword event and an ambient event of the video.
10. The computerized system of claim 1, wherein the event comprising a frame identification; and a target object identification; wherein the target object identification is associated with said object.
11. The computerized system of claim 1, wherein said media player and said object are configured to be displayed by a web browser.
12. The computerized system of claim 1, wherein said media player and said object are configured to be displayed in a web page.
13. A method for utilizing a video-associated object in a computerized environment, said method comprising: displaying a data stream in a display layout by a media player; identifying an event associated with the data stream; in response to the event, displaying an object in a location, the location is outside the display layout.
14. The method of claim 13, further comprising in response to the event, displaying a second object in the display layout; said displaying the second object comprises performing a predetermined animation.
15. The method of claim 14, wherein said performing the predetermined animation comprises at least one of the following: playing a media content; displaying an image; displaying an animated image; displaying a message; and modifying a current display to a modified display.
16. The method of claim 14, wherein the predetermined animation is performed in accordance with a relative position between the display layout and the location.
17. The method of claim 14, wherein said displaying the object comprises performing a second predetermined animation; wherein the predetermined animation and the second predetermined animation are coordinated.
18. The method of claim 13, wherein said displaying the data stream and said displaying the object are performed in separated resource environments.
19. The method of claim 13, wherein said displaying the data stream comprises displaying the data stream in a web page; and said displaying the object comprises displaying the displayable object in the web page.
20. A computer program product comprising: a computer readable medium; a first program instruction for displaying a data stream in a display layout by a media player; a second program instruction for identifying an event associated with the data stream; and a third program instruction for displaying an object in a location in response to the event, the location is outside the display; wherein said first, second and third program instructions are stored on said computer readable medium.
PCT/IL2010/000037 2009-01-14 2010-01-14 Video-associated objects WO2010082199A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP10731118A EP2387850A4 (en) 2009-01-14 2010-01-14 Video-associated objects

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14447709P 2009-01-14 2009-01-14
US61/144,477 2009-01-14

Publications (1)

Publication Number Publication Date
WO2010082199A1 true WO2010082199A1 (en) 2010-07-22

Family

ID=42318746

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2010/000037 WO2010082199A1 (en) 2009-01-14 2010-01-14 Video-associated objects

Country Status (3)

Country Link
US (2) US9665965B2 (en)
EP (1) EP2387850A4 (en)
WO (1) WO2010082199A1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007109162A2 (en) * 2006-03-17 2007-09-27 Viddler, Inc. Methods and systems for displaying videos with overlays and tags
US11227315B2 (en) 2008-01-30 2022-01-18 Aibuy, Inc. Interactive product placement system and method therefor
US8312486B1 (en) 2008-01-30 2012-11-13 Cinsay, Inc. Interactive product placement system and method therefor
US20110191809A1 (en) 2008-01-30 2011-08-04 Cinsay, Llc Viral Syndicated Interactive Product System and Method Therefor
WO2011003014A1 (en) * 2009-07-02 2011-01-06 Huntley Stafford Ritter Attracting viewer attention to advertisements embedded in media
KR101357262B1 (en) 2010-08-13 2014-01-29 주식회사 팬택 Apparatus and Method for Recognizing Object using filter information
US20120326993A1 (en) * 2011-01-26 2012-12-27 Weisman Jordan K Method and apparatus for providing context sensitive interactive overlays for video
KR20130029579A (en) * 2011-09-15 2013-03-25 주식회사 팬택 Text based dynamic effect apparatus and method
US20130263182A1 (en) * 2012-03-30 2013-10-03 Hulu Llc Customizing additional content provided with video advertisements
US9747727B2 (en) * 2014-03-11 2017-08-29 Amazon Technologies, Inc. Object customization and accessorization in video content
CN110971955B (en) * 2018-09-30 2022-06-07 北京京东尚科信息技术有限公司 Page processing method and device, electronic equipment and storage medium
CN110166842B (en) * 2018-11-19 2020-10-16 深圳市腾讯信息技术有限公司 Video file operation method and device and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5590262A (en) * 1993-11-02 1996-12-31 Magic Circle Media, Inc. Interactive video interface and method of creation thereof
US6570587B1 (en) * 1996-07-26 2003-05-27 Veon Ltd. System and method and linking information to a video
US7158676B1 (en) * 1999-02-01 2007-01-02 Emuse Media Limited Interactive system
WO2009101623A2 (en) 2008-02-13 2009-08-20 Innovid Inc. Inserting interactive objects into video content

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7721307B2 (en) * 1992-12-09 2010-05-18 Comcast Ip Holdings I, Llc Method and apparatus for targeting of interactive virtual objects
US5708845A (en) * 1995-09-29 1998-01-13 Wistendahl; Douglass A. System for mapping hot spots in media content for interactive digital media program
US6154771A (en) * 1998-06-01 2000-11-28 Mediastra, Inc. Real-time receipt, decompression and play of compressed streaming video/hypervideo; with thumbnail display of past scenes and with replay, hyperlinking and/or recording permissively intiated retrospectively
US6196917B1 (en) * 1998-11-20 2001-03-06 Philips Electronics North America Corp. Goal directed user interface
US7089579B1 (en) * 1998-12-20 2006-08-08 Tvworks, Llc System for transporting MPEG video as streaming video in an HTML web page
US6774908B2 (en) * 2000-10-03 2004-08-10 Creative Frontier Inc. System and method for tracking an object in a video and linking information thereto
US20020073149A1 (en) * 2000-10-11 2002-06-13 Young Christopher Tyler Dynamic content linking
US20020083469A1 (en) * 2000-12-22 2002-06-27 Koninklijke Philips Electronics N.V. Embedding re-usable object-based product information in audiovisual programs for non-intrusive, viewer driven usage
US20020161909A1 (en) * 2001-04-27 2002-10-31 Jeremy White Synchronizing hotspot link information with non-proprietary streaming video
US6904561B1 (en) * 2001-07-19 2005-06-07 Microsoft Corp. Integrated timeline and logically-related list view
US20040021684A1 (en) * 2002-07-23 2004-02-05 Dominick B. Millner Method and system for an interactive video system
US20050046630A1 (en) * 2003-08-29 2005-03-03 Kurt Jacob Designable layout animations
TWI241824B (en) * 2003-10-31 2005-10-11 Benq Corp Mobile phone and related method for displaying text message with various background images
TWI288362B (en) * 2005-10-21 2007-10-11 Ming-Jang Chen Visualized animation elements generative method of computer executable
US20070124792A1 (en) * 2005-11-30 2007-05-31 Bennett James D Phone based television remote control
US20070250775A1 (en) * 2006-04-19 2007-10-25 Peter Joseph Marsico Methods, systems, and computer program products for providing hyperlinked video
US20080177630A1 (en) * 2007-01-19 2008-07-24 Babak Maghfourian Method apparatus, system, media, and signals for billing a sponsor of an object link in interactive sequenced media
KR100900794B1 (en) * 2007-04-12 2009-06-02 누리엔소프트웨어 주식회사 Method for dance game and the recording media therein readable by computer
US8619038B2 (en) * 2007-09-04 2013-12-31 Apple Inc. Editing interface
US7941758B2 (en) * 2007-09-04 2011-05-10 Apple Inc. Animation of graphical objects
US8381086B2 (en) * 2007-09-18 2013-02-19 Microsoft Corporation Synchronizing slide show events with audio
US9113214B2 (en) * 2008-05-03 2015-08-18 Cinsay, Inc. Method and system for generation and playback of supplemented videos
US20090327893A1 (en) * 2008-06-25 2009-12-31 Paul Terry Coordinated video presentation methods and apparatus

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5590262A (en) * 1993-11-02 1996-12-31 Magic Circle Media, Inc. Interactive video interface and method of creation thereof
US6570587B1 (en) * 1996-07-26 2003-05-27 Veon Ltd. System and method and linking information to a video
US7158676B1 (en) * 1999-02-01 2007-01-02 Emuse Media Limited Interactive system
WO2009101623A2 (en) 2008-02-13 2009-08-20 Innovid Inc. Inserting interactive objects into video content
WO2009101624A2 (en) 2008-02-13 2009-08-20 Innovid Inc. Apparatus and method for manipulating an object inserted to video content

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2387850A4 *

Also Published As

Publication number Publication date
EP2387850A4 (en) 2012-07-18
US20170263035A1 (en) 2017-09-14
US20100177122A1 (en) 2010-07-15
EP2387850A1 (en) 2011-11-23
US9665965B2 (en) 2017-05-30

Similar Documents

Publication Publication Date Title
US20170263035A1 (en) Video-Associated Objects
US10491958B2 (en) Live video stream with interactive shopping interface
US10547909B2 (en) Electronic commerce functionality in video overlays
US11356746B2 (en) Dynamic overlay video advertisement insertion
US9832253B2 (en) Content pre-render and pre-fetch techniques
US9532116B2 (en) Interactive video advertisement in a mobile browser
US10440436B1 (en) Synchronizing interactive content with a live video stream
US10620804B2 (en) Optimizing layout of interactive electronic content based on content type and subject matter
US9930311B2 (en) System and method for annotating a video with advertising information
US9374411B1 (en) Content recommendations using deep data
US20110001758A1 (en) Apparatus and method for manipulating an object inserted to video content
AU2010256367A1 (en) Ecosystem for smart content tagging and interaction
CN103384253B (en) The play system and its construction method of multimedia interaction function are presented in video
US10191624B2 (en) System and method for authoring interactive media assets
CN114025188B (en) Live advertisement display method, system, device, terminal and readable storage medium
US10091556B1 (en) Relating items to objects detected in media
US8845429B2 (en) Interaction hint for interactive video presentations
CN115190366B (en) Information display method, device, electronic equipment and computer readable medium
WO2018036493A1 (en) Information processing method and apparatus, display terminal, and storage medium
CN111667313A (en) Advertisement display method and device, client device and storage medium
KR102541981B1 (en) Conversion of static content items to interactive content items
US20180365268A1 (en) Data structure, system and method for interactive media
CN106331790A (en) Information display method and information display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10731118

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

REEP Request for entry into the european phase

Ref document number: 2010731118

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2010731118

Country of ref document: EP