WO2011041056A2 - Video content-aware advertisement placement - Google Patents
Video content-aware advertisement placement
- Publication number
- WO2011041056A2 (PCT/US2010/047198)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- advertisement
- video content
- media file
- trajectory
- locations
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/812—Monomedia components thereof involving advertisement data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/91—Television signal processing therefor
Definitions
- In data-searching systems preceding the Web, and on the Web since its inception, search engines have employed a variety of tools to aid in organizing and presenting advertisements in tandem with search results and other online content, such as digital images and streaming video. These tools are also leveraged to optimize the revenue received by the search engine, where optimizing revenue may be facilitated by selecting advertisements that are relevant to a user and by placing the selected advertisements in a noticeable location.
- Companies that advertise strive to develop advertisements that are attention-capturing, frequently selected by search engines for display, and, once displayed, readily perceived by users of those search engines. If these three objectives are achieved, a company is more likely to succeed in selling a particular item or service.
- An eye-catching advertisement placed in a top-center banner position on a web page will likely receive more attention from a user and thus generate more revenue for the search engine and the company than a bland advertisement positioned in a lower portion of the web page. That is, because the advertisement is noticed by the user, the likelihood that the user will take action (e.g., visit a website of the advertiser) based on the advertisement is increased.
- Embodiments of the present invention generally relate to computer-readable media and computerized methods for identifying and tracking an object within video content of a media file (e.g., digital video) such that an awareness of characteristics of the video content is developed. This awareness can then be used for manipulating how and when an advertisement is overlaid on the video content. For instance, the advertisement may be manipulated to visually interact with the identified object.
- the step of developing the awareness of video-content characteristics is carried out by an offline authoring process.
- This offline authoring process is implemented to identify an object within the video content with which an advertisement will visually interact.
- Tracking may include the steps of targeting a patch within the object appearing in the video content of the media file and tracking the movement of the patch over a sequence of frames within the media file.
- a "patch" generally refers to a prominent set of pixels within the object that exhibits an identifiable texture (e.g., an eye of a person or animal).
- locations of the patch within the sequence of frames are written to a trajectory.
- a trajectory includes a list of patch locations, configured as X and Y coordinates, that are each associated with a particular frame in the sequence of frames.
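- To make the shape of this data concrete, the following Python sketch shows one way such a trajectory could be held in memory; the `Trajectory` and `TrajectoryPoint` names, and the choice of a frame-indexed dictionary, are illustrative assumptions rather than the patent's implementation.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

@dataclass
class TrajectoryPoint:
    """Location of the tracked patch in one frame (pixel coordinates)."""
    x: float
    y: float
    timestamp: Optional[float] = None  # seconds into the media file, if recorded

@dataclass
class Trajectory:
    """Per-frame patch locations for one object in a media file."""
    points: Dict[int, TrajectoryPoint] = field(default_factory=dict)  # frame index -> location

    def add(self, frame: int, x: float, y: float, timestamp: Optional[float] = None) -> None:
        self.points[frame] = TrajectoryPoint(x, y, timestamp)

    def location(self, frame: int) -> Optional[Tuple[float, float]]:
        point = self.points.get(frame)
        return (point.x, point.y) if point else None

# Record the patch at three frames of the sequence, then look one frame up.
traj = Trajectory()
traj.add(frame=0, x=120.0, y=64.0, timestamp=0.00)
traj.add(frame=1, x=128.5, y=60.2, timestamp=0.04)
traj.add(frame=2, x=137.0, y=57.1, timestamp=0.08)
print(traj.location(1))  # (128.5, 60.2)
```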
- the step of manipulating how and when an advertisement is overlaid on the video content is performed by the online rendering process.
- the online rendering process is carried out upon initiating play of the media file. Accordingly, several steps are typically performed before the online rendering process is invoked, such as receiving a plurality of advertisements that are each designed with consideration of the trajectory and choosing one of the received advertisements for rendering based on a selection scheme (e.g., revenue optimizing, rotational, and the like).
- Upon choosing an advertisement and receiving an indication (e.g., a user-initiated selection of a representation of the media file on a web page) to invoke the online rendering process, the online rendering process conducts the following procedures: generating an ad-overlay that accommodates a container to hold the video advertisement; positioning the container within the ad-overlay according to the trajectory; and inserting the chosen advertisement into the container. Accordingly, the ad-overlay is rendered on top of the video content when playing the media file such that the advertisement appears to visually interact with the object or other video content.
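- As a rough illustration of that flow (an assumed structure, not the patented implementation), the sketch below resolves a per-frame container rectangle from the trajectory and reports where the chosen advertisement would be drawn; `build_ad_overlay` and `render_frame` are hypothetical helpers and the numbers are made up.

```python
from typing import Dict, Tuple

def build_ad_overlay(trajectory: Dict[int, Tuple[float, float]],
                     ad_size: Tuple[int, int],
                     offset: Tuple[int, int] = (0, 0)) -> Dict[int, Tuple[int, int, int, int]]:
    """Return, per frame, a container rectangle (left, top, width, height)
    positioned according to the trajectory plus an optional fixed offset."""
    width, height = ad_size
    dx, dy = offset
    return {frame: (int(x) + dx, int(y) + dy, width, height)
            for frame, (x, y) in trajectory.items()}

def render_frame(frame_index: int,
                 overlay: Dict[int, Tuple[int, int, int, int]],
                 ad_id: str) -> str:
    """Stand-in for the compositing step: report where the chosen ad is drawn."""
    rect = overlay.get(frame_index)
    if rect is None:
        return f"frame {frame_index}: video only"
    return f"frame {frame_index}: draw {ad_id} at {rect}"

# Usage with a tiny, made-up trajectory.
trajectory = {0: (120.0, 64.0), 1: (128.5, 60.2), 2: (137.0, 57.1)}
overlay = build_ad_overlay(trajectory, ad_size=(80, 40), offset=(10, -20))
for frame in range(3):
    print(render_frame(frame, overlay, ad_id="chosen-ad"))
```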
- FIG. 1 is a block diagram of an exemplary computing environment suitable for use in implementing embodiments of the present invention;
- FIG. 2 is a block diagram illustrating a distributed computing environment, suitable for use in implementing embodiments of the present invention, that is configured to utilize awareness of video content within a media file to select and place an advertisement;
- FIG. 3 is a diagrammatic view of a sequence of frames of a media file with an object being tracked therein, in accordance with an embodiment of the present invention;
- FIG. 4 is a diagrammatic view of a user interface (UI) display illustrating an object, within the video content, with a patch selected thereon, in accordance with an embodiment of the present invention;
- FIG. 5 depicts progressive UI displays illustrating stages of placing an advertisement on top of the video content within a sequence of frames, in accordance with an embodiment of the present invention;
- FIG. 6 is a diagrammatic view of animation of an advertisement being incorporated into video content via an ad-overlay, in accordance with an embodiment of the present invention;
- FIG. 7 is a flow diagram illustrating an overall method for performing an offline authoring process to generate a trajectory, in accordance with an embodiment of the present invention.
- FIG. 8 is a flow diagram illustrating an overall method for performing an online rendering process upon initiating play of the media file, in accordance with an embodiment of the present invention.
- The present invention relates to computer-executable instructions, embodied on one or more computer-readable media, that perform a method for dynamically placing an advertisement on top of video content in a media file, based on movement of an object therein.
- the method involves performing an offline authoring process for generating a trajectory.
- the offline authoring process includes the steps of targeting a patch within the object appearing in the video content of the media file, tracking the movement of the patch over a sequence of frames within the media file, and, based on the tracked movement of the patch, writing locations of the patch within the sequence of frames to the trajectory.
- the term "patch" is not meant to be limiting but may encompass any segment of the object that can be consistently identified within a predefined sequence of frames within the media file.
- the term patch may refer to a prominent set of pixels (e.g., eyes) within the object (e.g., bear) that exhibits an identifiable texture. See FIGS. 4 and 6 for a more detailed explanation of how the eyes of a bear may be utilized as a patch to establish a trajectory.
- The term patch may broadly refer to any feature within any sequence of frames in the media file that appears in a substantial number of the frames in that sequence.
- the method involves performing an online rendering process upon initiating play of the media file.
- the online rendering process includes the steps of automatically selecting the advertisement and, while the media file is playing, dynamically placing the selected advertisement on top of the video content as a function of the locations within the trajectory. Accordingly, the advertisement and media file are rendered in a synchronized manner such that the advertisement appears to visually interact within the object, or at least some portion of the video content.
- aspects of the present invention involve a computerized method, implemented at one or more processing units, for utilizing an awareness of video content within a media file to select and to place a video advertisement therein.
- the method includes abstracting one or more coordinate locations of an object appearing in the video content of the media file.
- The term "object" is not meant to be limiting, but may encompass an expansive scope of items, elements, lines, points, figures, or other aspects of the video content being presented upon playing the media file.
- The object may represent the most impressive figure or item within the video content. As shown in FIGS. 3 and 5, the object may be a football; in FIGS. 4 and 6, the object may be the bear. As such, the thing that initially draws the attention of a viewer of the media file may be selected as the object.
- The object may be determined by monitoring and collecting both the less intrusive and the most intrusive aspects of the video content, and ascertaining the object as the appropriate vehicle within the video content to be associated with the advertisement. For instance, if the media file is a video clip of a football game and it is determined that a football being thrown is the most impressive figure, the advertisement may be placed on the football and/or on the jersey of the player receiving the football, which is not as prominent yet still captures the user's attention.
- the computerized method continues with, at least temporarily, storing coordinate locations of the object on a trajectory.
- The coordinate locations are stored in association with a sequence of frames comprising the media file.
- the trajectory is utilized to generate an ad-overlay that accommodates a container that holds the video advertisement.
- the container is positioned within the ad-overlay according to the coordinate locations stored in the trajectory. For instance, the container may be placed on top of the coordinate locations.
- placing the container on top of the coordinate locations involves inserting the video advertisement within the container and positioning the video advertisement on top of a football, which was previously determined to be the object.
- the computerized method includes the steps of receiving the video advertisement, inserting the video advertisement into the container, and rendering the ad-overlay on top of the video content when playing the media file.
- Embodiments of the present invention provide for the selection and presentation of advertisements, such as the video advertisement.
- the term "advertisement" or the phrase "video advertisement” is not meant to be limiting.
- advertisements could relate to a promotional communication between a seller offering goods or services to a prospective purchaser of such goods or services.
- the advertisement could contain any type or amount of data that is capable of being communicated for the purpose of generating interest in, or sale of, goods or services, such as text, animation, executable information, video, audio, and other various forms.
- the advertisement may be configured as a digital image that is published within an advertisement space allocated within a UI display.
- the UI display is rendered by a web browser or other application running on a client device.
- the video advertisement may be specifically designed to visually interact with the object within the video content of the media file.
- the design of the video advertisement may be performed by an administrator associated with the web browser, a third-party advertising company, or any other entity capable of generating video content. Further, the design of the video advertisement may be based on the trajectory, the timestamps associated with locations of the object, a theme of the media file, an identity of the object, or any other useful criteria.
- the video advertisement may be developed in such a way as to visually interact with the video content when played.
- the present invention encompasses a computer system for abstracting information from the video content of a media file and for placing the advertisement within the video content to visually interact therewith.
- the abstracted information allows for developing the visually interactive advertisement, as discussed immediately above.
- the computer system includes a first processing unit and a second processing unit.
- The first processing unit is configured to accomplish at least the following steps: access the media file; track locations of an object appearing in the video content of the media file on a frame-by-frame basis; and write the tracked locations to a trajectory.
- the second processing unit is configured to accomplish the following steps: access the advertisement that is developed utilizing the trajectory; dynamically place the advertisement in a position on top of the video content based on the tracked locations; and render the video content in synchronization with animating the advertisement placed thereon. Accordingly, the animated advertisement appears to visually interact with the video content.
- These steps may be performed by a single processing unit or by a multitude of processing units that are communicatively coupled (e.g., a server cloud). Further, some of the steps may be carried out in an offline authoring process, while other steps may be carried out in real time as the video content is streaming online.
- the first processing unit may operate offline while the second processing unit may operate online.
- An exemplary operating environment for implementing embodiments of the present invention is shown and designated generally as computing device 100.
- Computing device 100 is but one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the invention. Neither should the computing device 100 be interpreted as having any dependency or requirement relating to any one or combination of components illustrated.
- the invention may be described in the general context of computer code or machine-useable instructions, including computer-executable instructions such as program components, being executed by a computer or other machine, such as a personal data assistant or other handheld device.
- program components including routines, programs, objects, components, data structures, and the like, refer to code that performs particular tasks or implements particular abstract data types.
- Embodiments of the present invention may be practiced in a variety of system configurations, including handheld devices, consumer electronics, general-purpose computers, specialty computing devices, etc.
- Embodiments of the invention may also be practiced in distributed computing environments where tasks are performed by remote-processing devices that are linked through a communications network.
- computing device 100 includes a bus 110 that directly or indirectly couples the following devices: memory 112, one or more processors 114, one or more presentation components 116, input/output (I/O) ports 118, I/O components 120, and an illustrative power supply 122.
- Bus 110 represents what may be one or more busses (such as an address bus, data bus, or combination thereof).
- FIG. 1 is merely illustrative of an exemplary computing device that can be used in connection with one or more embodiments of the present invention. Distinction is not made between such categories as “workstation,” “server,” “laptop,” “handheld device,” etc., as all are contemplated within the scope of FIG. 1 and reference to "computer” or “computing device.”
- Computing device 100 typically includes a variety of computer-readable media.
- computer-readable media may comprise Random Access Memory (RAM); Read Only Memory (ROM); Electronically Erasable Programmable Read Only Memory (EEPROM); flash memory or other memory technologies; CDROM, digital versatile disks (DVDs), or other optical or holographic media; magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices; or any other medium that can be used to encode desired information and be accessed by computing device 100.
- Memory 112 includes computer-storage media in the form of volatile and/or nonvolatile memory.
- The memory may be removable, non-removable, or a combination thereof.
- Exemplary hardware devices include solid-state memory, hard drives, optical-disc drives, etc.
- Computing device 100 includes one or more processors that read data from various entities such as memory 112 or I/O components 120.
- Presentation component(s) 116 present data indications to a user or other device.
- Exemplary presentation components include a display device, speaker, printing component, vibrating component, etc.
- I/O ports 118 allow computing device 100 to be logically coupled to other devices including I/O components 120, some of which may be built in.
- Illustrative components include a microphone, joystick, game pad, satellite dish, scanner, printer, wireless device, etc.
- FIG. 2 is a block diagram illustrating a distributed computing environment 200 suitable for use in implementing embodiments of the present invention.
- the exemplary computing environment 200 includes a first processing unit 210, a second processing unit 220, at least one data store 230, a display device 250, and a network (not shown) that interconnects each of these items.
- Each of the first processing unit 210 and the second processing unit 220, shown in FIG. 2, may take the form of various types of computing devices, such as, for example, the computing device 100 described above with reference to FIG. 1.
- The first processing unit 210 and the second processing unit 220 may each be a personal computer, desktop computer, laptop computer, consumer electronic device, handheld device (e.g., personal digital assistant), various servers, processing equipment, and the like. It should be noted, however, that the invention is not limited to implementation on such computing devices but may be implemented on any of a variety of different types of computing devices within the scope of embodiments of the present invention.
- each of the first processing unit 210 and the second processing unit 220 includes, or is linked to, some form of a computing unit (e.g., central processing unit, microprocessor, etc.) to support operations of the component(s) running thereon.
- the phrase “computing unit” generally refers to a dedicated computing device with processing power and storage memory, which supports operating software that underlies the execution of software, applications, and computer programs thereon.
- the computing unit is configured with tangible hardware elements, or machines, that are integral, or operably coupled, to the first processing unit 210 and the second processing unit 220 in order to enable each device to perform communication-related processes and other operations (e.g., executing an offline authoring process 215 or an online rendering process 225).
- the computing unit may encompass a processor (not shown) coupled to the computer-readable medium accommodated by each of the first processing unit 210 and the second processing unit 220.
- the computer-readable medium includes physical memory that stores, at least temporarily, a plurality of computer software components that are executable by the processor.
- the term "processor” is not meant to be limiting and may encompass any elements of the computing unit that act in a computational capacity. In such capacity, the processor may be configured as a tangible article that processes instructions. In an exemplary embodiment, processing may involve fetching, decoding/interpreting, executing, and writing back instructions.
- the processor may transfer information to and from other resources that are integral to, or disposed on, the first processing unit 210 and the second processing unit 220.
- resources refer to software components or hardware mechanisms that enable the first processing unit 210 and the second processing unit 220 to perform a particular function.
- a resource accommodated by the first processing unit 210 includes a component to conduct the offline authoring process 215, while a resource accommodated by the second processing unit includes a component to conduct the online rendering process 225.
- The second processing unit 220 may be integral to a computer that has a monitor to serve as the display device 250. In these embodiments, the computer may include an input device (not shown).
- the input device is provided to receive input(s) affecting, among other things, a media file 205, such as invoking the play of its video content 290, or altering properties of the video content being surfaced at a graphical user interface (GUI) 260 display.
- Illustrative input devices include a mouse, joystick, key pad, microphone, I/O components 120 of FIG. 1, or any other component capable of receiving a user input and communicating an indication of that input to the second processing unit 220.
- the display device 250 is configured to render and/or present the GUI 260 thereon.
- the display device 250 which is operably coupled to an output of the second processing unit 220, may be configured as any presentation component that is capable of presenting information to a user, such as a digital monitor, electronic display panel, touch-screen, analog set-top box, plasma screen, Braille pad, and the like.
- the display device 250 is configured to present rich content, such as the advertisement 270 embedded within video content 290 and/or digital images.
- the display device 250 is capable of rendering other forms of media (e.g., audio signals).
- the data store 230 is generally configured to store information associated with the advertisement 270 and the media file 205 that may be selected for concurrent presentation.
- Such information may include, without limitation, the advertisement 270; the media file 205; a description file 255 to be passed to an ad-designer entity 240; a group of advertisements (a compilation of advertisements developed specifically for presentation in tandem with the media file 205) associated with the media file 205; and a trajectory 265.
- The data store 230 may be configured to be searchable for suitable access to the stored advertisement 270 and the stored media file(s) 205. For instance, the data store 230 may be searchable for one or more of the advertisements within the group that are targeted toward interests of a user, relevant to the video content 290, and/or associated with the media file 205.
- The information stored in the data store 230 may be configurable and may include any information relevant to the storage of, access to, and retrieval of the advertisement 270 for placement within the video content 290 of the media file 205 and for rendering the integrated advertisement 270 and media file 205 on the GUI 260.
- the content and volume of such information are not intended to limit the scope of embodiments of the present invention in any way.
- the data store 230 may, in fact, be a plurality of databases, for instance, a database cluster, portions of which may reside on the first processing unit 210, the second processing unit 220, another external computing device (not shown), and/or any combination thereof.
- This distributed computing environment 200 is but one example of a suitable environment that may be implemented to carry out aspects of the present invention and is not intended to suggest any limitation as to the scope of use or functionality of the invention. Neither should the illustrated distributed computing environment 200 be interpreted as having any dependency or requirement relating to any one or combination of the devices 210, 220, and 250, the data store 230, nor the components for carrying out the processes 215 and 225 as illustrated. In some embodiments, the components may be implemented as stand-alone devices. In other embodiments, one or more of the components may be integrated directly into the processing units 210 and 220. It will be appreciated and understood that the components for implementing the processes 215 and 225 are exemplary in nature and in number and should not be construed as limiting.
- any number of components and devices may be employed to achieve the desired functionality within the scope of embodiments of the present invention.
- Although the various components and devices of FIG. 2 are shown with lines for the sake of clarity, in reality, delineating various components is not so clear and, metaphorically, the lines would more accurately be grey or fuzzy.
- Although some components and devices of FIG. 2 are depicted as single blocks, the depictions are exemplary in nature and in number and are not to be construed as limiting (e.g., although individual processing units 210 and 220 are shown, the steps and operations performed by each may be performed by a single processing unit or other type of computing device).
- the devices 210, 220, and 250, and the data store 230, of the exemplary system architecture may be interconnected by any method known in the relevant field. For instance, they may be operably coupled via a distributed computing environment that includes multiple computing devices coupled with one another via one or more networks (not shown).
- the networks may include, without limitation, one or more local area networks (LANs) and/or wide area networks (WANs).
- Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet. Accordingly, the network is not further described herein.
- the components are designed to perform the offline authoring process 215 and the online rendering process 225.
- the offline authoring process 215 includes a plurality of discrete steps that may include the following: targeting a patch within an object appearing in the video content 290 of the media file 205; tracking movement of the patch over a sequence of frames within the media file 205; based on the tracked movement of the patch, abstracting coordinate locations of the patch within the video content 290; and writing locations of the patch within the sequence of frames to the trajectory 265.
- the phrase "media file” is not meant to be construed as limiting, but may encompass any general structure for time-based multimedia, such as video and audio.
- The media file 205 may be configured in any known file format (e.g., container formats such as MP4 and 3GP) that facilitates interchange, management, editing, and presentation of the video content 290.
- the presentation may be local, via a network, or other streaming delivery mechanism.
- the media file may be a digital video that is configured to play upon receiving a user-initiated selection (during an online computing session) directed thereto.
- the media file 205 may be accessed at a variety of storage locations.
- These storage locations may reside locally on the first processing unit 210, in the possession of a user (e.g., internal folders, CD memory, external flash drives, etc.), in online space accommodated by remote web servers responsible for managing media, on a networking site, or in a public database for hosting a media collection.
- Upon retrieving the media file 205, the offline authoring process 215 abstracts information from the media file 205 to generate a trajectory 265 and/or a description file 255.
- the "trajectory" 265 essentially serves as a vehicle to store the abstracted information in a logical format.
- the trajectory may assume a form of an XML file that stores the locations as metadata.
- the trajectory 265 may be distinct from the media file 205, or may comprise data appended to the media file 205 such that media file 205 includes the abstracted information, yet the video content 290 remains unaltered.
- the trajectory 265 may include timestamps associated with each of the locations of the object abstracted from the media file 205, where the timestamps are potentially utilized for developing the advertisement 270 and for starting and stopping play of the advertisement 270 in a manner that synchronizes the presentation of it and the media file 205. Consequently, in this instance, the trajectory 265 persists a robust set of information for accurately describing a location and timing of the object's existence at the location within the media file 205.
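- The exact layout of such an XML trajectory is not specified in the description; the sketch below writes and reads one plausible layout, where the element and attribute names (`trajectory`, `location`, `frame`, `x`, `y`, `timestamp`) and the media-file name are assumptions.

```python
import xml.etree.ElementTree as ET

def trajectory_to_xml(points):
    """Serialize per-frame locations and timestamps into a hypothetical XML layout."""
    root = ET.Element("trajectory", mediaFile="bear_in_stream.mp4")  # illustrative name
    for frame, (x, y, timestamp) in sorted(points.items()):
        ET.SubElement(root, "location", frame=str(frame),
                      x=f"{x:.1f}", y=f"{y:.1f}", timestamp=f"{timestamp:.3f}")
    return ET.tostring(root, encoding="unicode")

def xml_to_trajectory(xml_text):
    """Read the same layout back into a frame -> (x, y, timestamp) mapping."""
    root = ET.fromstring(xml_text)
    return {int(loc.get("frame")): (float(loc.get("x")), float(loc.get("y")),
                                    float(loc.get("timestamp")))
            for loc in root.findall("location")}

points = {0: (120.0, 64.0, 0.0), 1: (128.5, 60.2, 0.04), 2: (137.0, 57.1, 0.08)}
xml_text = trajectory_to_xml(points)
print(xml_text)
assert xml_to_trajectory(xml_text) == points  # round-trips cleanly
```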
- One embodiment of abstracting information from the media file 205 is shown in FIG. 3.
- a diagrammatic view 300 of a sequence of frames 301, 302, 303, 311, and 312 of the media file 205 is illustrated with an object 320 being tracked therein, in accordance with an embodiment of the present invention.
- the object 320 is represented as a football. As discussed above, however, the object 320 may be any identifiable item occurring in the video content 290.
- the sequence of frames 301, 302, 303, 311, and 312 is analyzed to find the object 320 within the video content 290.
- analyzing involves selecting key frames, shown at reference numerals 301, 302, and 303, and labeling them as such.
- Locations 341, 343, and 345 of positions of the object 320 within the key frames 301, 302, and 303, respectively, are manually ascertained. These locations 341, 343, and 345 may be retained in a listing of locations within the trajectory 265 and may be associated with their respective key frames 301, 302, and 303.
- The locations 341, 343, and 345 of positions of the object 320 are X 335 and Y 330 coordinates of the object 320 relative to the key frames 301, 302, and 303.
- A mechanism is applied to automatically interpolate movement of the object 320 on intermediate frames, shown at reference numerals 311 and 312, that are in-between the key frames 301, 302, and 303.
- The mechanism may comprise a video or vision computing algorithm (e.g., various research algorithms used to understand the video content 290 and recognize the object 320 therein) to review the locations 341, 343, and 345 of the object 320 in the key frames 301, 302, and 303 and to interpolate predicted locations 342 and 344 for the intermediate frames 311 and 312, respectively.
- Interpolation may be carried out by deducing a difference in location of the object 320 from one frame to the next, and identifying the predicted locations 342 and 344 within the difference, thereby linking the locations 341, 343, and 345 into a continuous movement pattern of the object 320. Accordingly, a semiautomatic procedure is conducted for accurately pulling locations 341, 342, 343, 344, and 345 from the video content 290.
- this semiautomatic procedure is scalable to accommodate abstracting accurate locations from large media files because it is not necessary to manually recognize and record a location of the object therein for each frame of a selected sequence of frames in which the advertisements will be placed.
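- The description leaves the interpolation mechanism open; a minimal sketch assuming simple linear interpolation between the manually labeled key-frame locations is shown below.

```python
def interpolate_locations(key_locations):
    """Linearly interpolate object locations for the intermediate frames that sit
    between manually labeled key frames. key_locations maps frame index -> (x, y)."""
    frames = sorted(key_locations)
    predicted = dict(key_locations)
    for start, end in zip(frames, frames[1:]):
        x0, y0 = key_locations[start]
        x1, y1 = key_locations[end]
        span = end - start
        for frame in range(start + 1, end):
            t = (frame - start) / span
            predicted[frame] = (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
    return predicted

# Key frames 0, 2, and 4 are labeled by hand; frames 1 and 3 are predicted.
keys = {0: (100.0, 200.0), 2: (140.0, 160.0), 4: (180.0, 200.0)}
print(interpolate_locations(keys))
```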
- an additional algorithm is executed for automatically tuning the predicted locations generated by interpolation.
- the tuning process may involve automatically locating the object 320 using known characteristics of the object 320, such as shape, color, size, and predicted location at a particular frame. Further, known characteristics may include an identifiable texture associated with a patch on the object 320, as discussed more fully below with reference to FIG. 4.
- The predicted locations 342 and 344 may be tuned to correspond with an actual position of the object. These tuned locations are indicated by reference numerals 351 and 352. Accordingly, an accurate arc-shaped path of the object 320, which follows the true movement of a football through the air, is saved to the trajectory 265, thereby correcting deficiencies of the interpolation process.
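- One plausible realization of this tuning step (an assumption, since the description does not name a specific algorithm) is a small template-matching search around each predicted location using the patch's known texture.

```python
def tune_location(frame, patch, predicted, search_radius=3):
    """Refine a predicted (x, y) location by searching a small window around it for
    the best match (lowest sum of absolute differences) to the patch texture.
    `frame` and `patch` are 2-D lists of grayscale values indexed as [y][x]."""
    patch_h, patch_w = len(patch), len(patch[0])
    px, py = int(predicted[0]), int(predicted[1])
    best_cost, best_xy = None, (px, py)
    for dy in range(-search_radius, search_radius + 1):
        for dx in range(-search_radius, search_radius + 1):
            x, y = px + dx, py + dy
            if x < 0 or y < 0 or y + patch_h > len(frame) or x + patch_w > len(frame[0]):
                continue
            cost = sum(abs(frame[y + r][x + c] - patch[r][c])
                       for r in range(patch_h) for c in range(patch_w))
            if best_cost is None or cost < best_cost:
                best_cost, best_xy = cost, (x, y)
    return best_xy

# Tiny illustration: a single bright pixel that the predicted location should snap to.
frame = [[0] * 8 for _ in range(8)]
frame[5][6] = 255
print(tune_location(frame, patch=[[255]], predicted=(4, 4)))  # -> (6, 5)
```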
- FIG. 4 is a diagrammatic view of a user interface (UI) display 400 illustrating an object 405, within video content 415, with a patch 410 selected thereon, in accordance with an embodiment of the present invention.
- The patch 410 is an area of the object 405 (bear) that is generally easy to identify and frequently appears within the sequence of frames. Further, it is preferred that the patch 410 be of substantially consistent shape and color, have invariant feature points, and be somewhat prominent. As shown, the patch 410 covers a rectangular area (a 5 x 5 window of pixels 420) with a distinct texture (contrasting dark and light colors of the eye as compared against the brown fur of the bear) that is targeted within the object 405, or at least associated with the object 405. As such, the window of pixels 420 can be used to manually or automatically identify a location of the object 405, or specific portions thereof.
- the window of pixels 420 can be used to manually or automatically identify a vector 425 established by the window, or set, of pixels 420 that are designated as the patch 410.
- attributes of the identified vector 425 are maintained in the trajectory 265. These attributes may include a radial direction and origin.
- the attributes in the trajectory 265 are employed to render an advertisement at positions within the video content 290 that consistently intersect the identified vector 425.
- The vector 425 is based on a feature of the object 405 that naturally provides a linear subspace. For instance, as illustrated in FIG. 4, identifying attributes of the vector 425 involves ascertaining a line-of-sight originating from one or more eyes of the object 405. In operation, employing the attributes in the trajectory 265 to render an advertisement at a position within the video content 290 that consistently intersects the vector 425 involves placing the advertisement in a position that intersects the line-of-sight of the object 405, or bear. As such, because the bear appears to be looking at the advertisement upon placement, the attention of a viewer of the media file 205 would likely gravitate toward the advertisement as well.
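- Assuming the vector is stored as the origin and radial direction described above, a placement that intersects the line-of-sight could be computed as in this sketch; the eye coordinates, angle, and distance are made up for illustration.

```python
import math

def place_along_gaze(eye_xy, gaze_angle_deg, distance):
    """Return a point on the line-of-sight ray that starts at the eye patch (the
    vector's origin) and points along gaze_angle_deg (its radial direction)."""
    angle = math.radians(gaze_angle_deg)
    x = eye_xy[0] + distance * math.cos(angle)
    y = eye_xy[1] + distance * math.sin(angle)
    return (round(x, 1), round(y, 1))

# The bear's eye is at (310, 140) and it appears to look down and to the right.
print(place_along_gaze((310.0, 140.0), gaze_angle_deg=30.0, distance=120.0))  # (413.9, 200.0)
```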
- Returning to FIG. 2, the first processing unit 210 may conduct the offline authoring process 215 that includes generating a description file 255.
- Generating the description file 255 may involve analyzing the video content 290 to determine a theme of the media file 205, and combining the media-file theme with the trajectory 265 to form the description file 255.
- the phrase "description file” is not meant to be limiting, but may encompass a broad range of vehicles for carrying information related to the video content 290 to an ad-designer entity 240 in order to assist in developing the advertisement 270.
- the description file 255 may include some or all data from the trajectory 265, such as coordinate locations and timestamps of positions of an object, as well as a theme or topic of the media file 205 and an identity of the object.
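- The format of the description file is not fixed by the description; purely as an illustration, its contents might be organized as in the following sketch, where the field names, file name, and values are hypothetical.

```python
import json

# Hypothetical contents of a description file: trajectory data plus context that
# helps an ad designer create an advertisement that visually interacts with the video.
description_file = {
    "mediaFile": "bear_in_stream.mp4",                  # illustrative file name
    "theme": "wildlife / bear fishing in a stream",
    "object": {"identity": "bear", "patch": "eye"},
    "trajectory": [
        {"frame": 0, "x": 310.0, "y": 140.0, "timestamp": 0.00},
        {"frame": 1, "x": 312.5, "y": 141.0, "timestamp": 0.04},
    ],
}
print(json.dumps(description_file, indent=2))
```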
- the ad-designer entity 240 uses some or all information carried by the description file 255 to create the advertisement 270.
- The creation of the advertisement 270 may be based on a concept of a bear in a stream, as illustrated in FIG. 4, such that an appropriate subject of the advertisement 270 may be an animated fish or other water-based animation that is relevant to the bear in a stream. (This example is depicted in FIG. 6.)
- the advertisement 270 may be created in a way that visually interacts within the context of the video content 290 and appears more natural or sophisticated.
- As shown in FIG. 2, only one ad-designer entity 240 is illustrated. However, it should be appreciated that a plurality of advertisement designers may have access to the description file 255 and may create a variety of advertisements that are relevant to the theme of the media file 205 and that can be placed in the video content 290. Accordingly, in embodiments, these relevant advertisements are joined to a group associated with the media file 205. As used herein, the phrase "group" generally refers to a compilation of advertisements developed specifically for being presented in tandem with the media file 205. In operation, the group may be stored on the data store 230 and may be accessible by the second processing unit 220 for gathering a relevant advertisement to be placed on the video content 290 during the online rendering process 225.
- The online rendering process 225 applies a selection scheme that provides rules for choosing one of the relevant advertisements within the group (e.g., on a rotational basis). Further, the online rendering process 225 carries out a plurality of steps for placing the advertisement 270 on top of the video content 290.
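- The rotational and revenue-optimizing schemes mentioned above could be sketched as follows; the advertisement identifiers and revenue figures are invented for illustration.

```python
import itertools

def rotational_selector(group):
    """Cycle through the group of advertisements associated with the media file."""
    return itertools.cycle(group)

def revenue_optimizing_pick(group, expected_revenue):
    """Pick the advertisement with the highest expected revenue."""
    return max(group, key=lambda ad: expected_revenue.get(ad, 0.0))

group = ["fish-swim-ad", "flag-ad", "honey-brand-ad"]
picker = rotational_selector(group)
print([next(picker) for _ in range(4)])   # round-robin across successive plays
print(revenue_optimizing_pick(group, {"fish-swim-ad": 0.8, "flag-ad": 1.3}))
```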
- the trigger for implementing the online rendering process 225 involves a user selection of a representation of the media file 205. This user selection may involve a user-initiated click action directed toward a uniform resource locator (URL) linked to the media file 205. Or, the user selection may involve launching a web browser that is configured to present the media file 205. In yet other embodiments, the user selection involves receiving an indication that a user-initiated selection occurred with respect to a visual representation of the advertisement 270.
- the variety of steps performed by the online rendering process 225 include one or more of the following, in no particular order: selecting the advertisement 270; generating an ad-overlay that accommodates a container to hold the advertisement 270, where the container is positioned within the ad-overlay according to the trajectory 265; inserting the advertisement 270 into the container; and rendering the ad-overlay on top of the video content 290 when playing the media file 205.
- A particular embodiment of performing these steps is depicted in FIG. 5. In particular, FIG. 5 depicts progressive UI displays illustrating stages of placing an advertisement 510 (a flag waving in the wind) on top of the object 320 (the football) within the sequence of frames 302, 312, and 303, in accordance with an embodiment of the present invention.
- the advertisement 510 can be placed on the object 320 in such a manner that the flag remains on the football throughout the movement of the football through the air.
- attention is drawn to the advertisement 510.
- a trajectory associated with the object 320 allows for creation and placement of the advertisement 510 such that it visually interacts with the video content.
- the trajectory provides an advertisement designer with a concept of a path of the object 320 allowing the advertisement designer to animate the advertisement 510 in a meaningful way.
- the flag (advertisement 510) is blowing in a direction as if it were attached to the football (object 320) as it travels through the air.
- the trajectory allows the online rendering process to dynamically place the advertisement 510 on top of the video content by rendering the advertisement 510 at positions within the video content that substantially correspond to the locations of the object 320, or patch, written to the trajectory. Accordingly, the flag may be placed, based on X and Y coordinate locations of the football, along its entire path.
- an interesting map that records locations of significant objects embedded within the video content may be applied.
- the phrase "interesting map” relates to information gathered from the sequence of frames that may be employed for positioning the advertisement 510 (flag) on top of the object 320 (football).
- the interesting map may include information about another object 520 (receiver) within the video content.
- the position of the advertisement 510 may be adjusted by an offset 550 so that it does not obscure the object 520 when being placed.
- the interesting map allows for building freedom into the placement of the advertisement 510 about the locations in the trajectory. This freedom provides the ability to rotate or translate laterally/vertically the advertisement 510 to avoid blocking any significant object (e.g., the object 520) or other critical aspects in the video content.
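- A minimal sketch of using such an interesting map is shown below: the advertisement's rectangle is nudged laterally until it no longer covers any recorded significant object. The rectangle representation, step size, and coordinates are assumptions.

```python
def rects_overlap(a, b):
    """Axis-aligned overlap test for rectangles given as (left, top, width, height)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def adjust_placement(ad_rect, interesting_rects, step=10, max_tries=20):
    """Shift the ad to the right until it no longer overlaps any rectangle in the
    interesting map; give up and return the original placement after max_tries."""
    x, y, w, h = ad_rect
    for i in range(max_tries):
        candidate = (x + i * step, y, w, h)
        if not any(rects_overlap(candidate, other) for other in interesting_rects):
            return candidate
    return ad_rect

# The flag would cover the receiver, so its placement is offset to the right.
receiver = (200, 100, 60, 120)
print(adjust_placement((210, 110, 80, 40), [receiver]))  # -> (260, 110, 80, 40)
```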
- FIG. 6 is a diagrammatic view 600 of animation of an advertisement 610 being incorporated into video content 415 via an ad-overlay 620, in accordance with an embodiment of the present invention.
- the advertisement 610 is created to include animation that visually interacts with the video content 415.
- the fish (advertisement 610) is created to swim within a stream (video content 415).
- the ad-overlay 620 is prepared with a container 615.
- the container 615 is placed within the ad-overlay 620 as a function of the locations of the object 405 or a vector 425 originating from the object 405.
- the container 615 is placed at an intersection with the vector 425 using X' and Y' coordinate locations of the vector 425.
- the container 615 may be placed in proximity with X and Y coordinate locations of the object 405 itself. Accordingly, the use of the container 615 to dynamically place the advertisement 610 within the video content 415 provides a suitable mechanism for positioning the advertisement 610 based on the trajectory, thereby generating the visual interaction between the advertisement 610 and the object 405.
- FIG. 7 illustrates a high-level overview of techniques for performing an offline authoring process to generate a trajectory, in accordance with an embodiment of the present invention.
- Although the term "steps" may be used herein to connote different elements of the methods employed, the term should not be interpreted as implying any particular order among or between various steps herein disclosed unless and except when the order of individual steps is explicitly described.
- the exemplary flow diagram 700 commences with targeting a patch within an object appearing in video content of the media file, as indicated at block 710.
- the patch 410 may cover an identifiable texture (e.g., eye) of the object (e.g., bear).
- the movement of the patch is tracked over a sequence of frames (e.g., employing the offline authoring process 215 of FIG. 2), as indicated at block 720.
- Tracking the movement of the patch, or object, may involve the following steps: selecting key frames within the sequence of frames (see block 730); manually inputting the locations of the patch within the key frames into the trajectory (see block 740); and utilizing the input locations to automatically interpolate movement of the patch on intermediate frames that reside in-between the key frames (see block 750).
- The process of tracking movement of the patch may further include the following steps: partitioning the interpolated movement into predicted locations that are each associated with the intermediate frames, respectively (see block 760); and tuning the predicted locations based on perceived locations of an identifiable texture associated with the patch (see block 770). Based on the tracked movement of the patch, the locations of the patch may be written to a trajectory, as indicated at block 780.
- FIG. 8 is a flow diagram illustrating an overall method 800 for performing an online rendering process (e.g., the online rendering process 225 performed by the second processing unit 220 of FIG. 2) upon initiating play of the media file, in accordance with an embodiment of the present invention.
- the method 800 includes automatically selecting an advertisement (e.g., utilizing a selection scheme), as indicated at block 810.
- the method 800 involves dynamically placing the selected advertisement on top of the video content of the media file.
- the advertisement is dynamically placed as a function of the locations saved to the trajectory.
- the process of dynamically placing includes the following steps: creating an ad-overlay that includes a container that is positioned within the ad-overlay based on the trajectory (see block 830); inserting the selected advertisement into the container (see block 840); and rendering the ad-overlay and the media file in a synchronized manner such that the container is layered on top of the video content (see block 850).
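- The synchronized rendering can be pictured as a per-frame loop in which the container follows the trajectory while the advertisement's own animation frames advance in step; the sketch below is an assumed simplification that only reports what would be composited on each frame.

```python
def play_with_overlay(total_frames, trajectory, ad_frames):
    """Walk the media file frame by frame, layering the ad-overlay on top: the
    container follows the trajectory while the advertisement's animation frames
    advance in lockstep, keeping the two presentations synchronized."""
    for frame in range(total_frames):
        location = trajectory.get(frame)
        if location is None:
            yield f"frame {frame}: video only"
            continue
        ad_frame = ad_frames[frame % len(ad_frames)]
        yield f"frame {frame}: video + '{ad_frame}' at {location}"

trajectory = {1: (128, 60), 2: (137, 57)}
for line in play_with_overlay(3, trajectory, ad_frames=["flag-frame-a", "flag-frame-b"]):
    print(line)
```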
- the process of dynamically placing may further include rendering the selected advertisement at positions within the video content that substantially correspond to the locations of a patch written to the trajectory, as indicated at block 860. As such, the advertisement will appear to visually interact with an object in the video content and draw a user's attention to the advertisement.
Landscapes
- Business, Economics & Management (AREA)
- Engineering & Computer Science (AREA)
- Finance (AREA)
- Accounting & Taxation (AREA)
- Development Economics (AREA)
- Strategic Management (AREA)
- Marketing (AREA)
- Economics (AREA)
- Game Theory and Decision Science (AREA)
- Entrepreneurship & Innovation (AREA)
- Physics & Mathematics (AREA)
- General Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Processing Or Creating Images (AREA)
- Information Transfer Between Computers (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Priority Applications (10)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020127008172A KR101760962B1 (ko) | 2009-09-30 | 2010-08-31 | Video content-aware advertisement placement method |
| RU2012112228/08A RU2542923C2 (ru) | 2009-09-30 | 2010-08-31 | Video content-aware advertisement placement |
| SG2012007225A SG178220A1 (en) | 2009-09-30 | 2010-08-31 | Video content-aware advertisement placement |
| JP2012532091A JP5570607B2 (ja) | 2009-09-30 | 2010-08-31 | Video content-aware advertisement placement |
| EP10820993.3A EP2483858A4 (en) | 2009-09-30 | 2010-08-31 | VIDEO-CONSIDERED ADVERTISING PLACEMENT |
| BR112012007127A BR112012007127A2 (pt) | 2009-09-30 | 2010-08-31 | Video content-aware advertisement placement |
| CN201080044011.XA CN102576441B (zh) | 2009-09-30 | 2010-08-31 | Video content-aware advertisement placement |
| AU2010301005A AU2010301005B2 (en) | 2009-09-30 | 2010-08-31 | Video content-aware advertisement placement |
| CA2771167A CA2771167C (en) | 2009-09-30 | 2010-08-31 | Video content-aware advertisement placement |
| MX2012003327A MX2012003327A (es) | 2009-09-30 | 2010-08-31 | Video content-aware advertisement placement |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US24737509P | 2009-09-30 | 2009-09-30 | |
| US61/247,375 | 2009-09-30 | ||
| US12/633,609 US9111287B2 (en) | 2009-09-30 | 2009-12-08 | Video content-aware advertisement placement |
| US12/633,609 | 2009-12-08 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2011041056A2 true WO2011041056A2 (en) | 2011-04-07 |
| WO2011041056A3 WO2011041056A3 (en) | 2011-06-16 |
Family
ID=43781716
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2010/047198 WO2011041056A2 (en) (Ceased) | Video content-aware advertisement placement | 2009-09-30 | 2010-08-31 |
Country Status (13)
| Country | Link |
|---|---|
| US (1) | US9111287B2 |
| EP (1) | EP2483858A4 |
| JP (1) | JP5570607B2 |
| KR (1) | KR101760962B1 |
| CN (1) | CN102576441B |
| AU (1) | AU2010301005B2 |
| BR (1) | BR112012007127A2 |
| CA (1) | CA2771167C |
| MX (1) | MX2012003327A |
| RU (1) | RU2542923C2 |
| SG (1) | SG178220A1 |
| TW (1) | TWI521456B |
| WO (1) | WO2011041056A2 |
Family Cites Families (34)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7721307B2 (en) | 1992-12-09 | 2010-05-18 | Comcast Ip Holdings I, Llc | Method and apparatus for targeting of interactive virtual objects |
| US5970504A (en) * | 1996-01-31 | 1999-10-19 | Mitsubishi Denki Kabushiki Kaisha | Moving image anchoring apparatus and hypermedia apparatus which estimate the movement of an anchor based on the movement of the object with which the anchor is associated |
| US6724915B1 (en) * | 1998-03-13 | 2004-04-20 | Siemens Corporate Research, Inc. | Method for tracking a video object in a time-ordered sequence of image frames |
| US6366701B1 (en) | 1999-01-28 | 2002-04-02 | Sarnoff Corporation | Apparatus and method for describing the motion parameters of an object in an image sequence |
| US6381362B1 (en) * | 1999-04-08 | 2002-04-30 | Tata America International Corporation | Method and apparatus for including virtual ads in video presentations |
| US7248300B1 (en) * | 1999-06-03 | 2007-07-24 | Fujifilm Corporation | Camera and method of photographing good image |
| JP2000350123A (ja) * | 1999-06-04 | 2000-12-15 | Fuji Photo Film Co Ltd | Image selection device, camera, image selection method, and recording medium |
| US6424370B1 (en) * | 1999-10-08 | 2002-07-23 | Texas Instruments Incorporated | Motion based event detection system and method |
| GB2361370A (en) * | 2000-04-14 | 2001-10-17 | Allan Plaskett | Analysing movement of objects |
| JP2002032590A (ja) | 2000-06-22 | 2002-01-31 | Internatl Business Mach Corp <Ibm> | Advertising method, advertising system, advertising space trading method, advertising space trading system, and recording medium |
| US6774908B2 (en) | 2000-10-03 | 2004-08-10 | Creative Frontier Inc. | System and method for tracking an object in a video and linking information thereto |
| US7003061B2 (en) | 2000-12-21 | 2006-02-21 | Adobe Systems Incorporated | Image extraction from complex scenes in digital video |
| KR20020065250A (ko) | 2001-02-06 | 2002-08-13 | 강용희 | Method for overlay processing of video and content, e-mail processing method using the same, and computer-readable recording medium storing a program for executing the methods |
| JP2003242410A (ja) | 2002-02-19 | 2003-08-29 | Fujitsu Ltd | Information distribution method and computer program |
| EP1416727A1 (en) * | 2002-10-29 | 2004-05-06 | Accenture Global Services GmbH | Moving virtual advertising |
| US20040116183A1 (en) | 2002-12-16 | 2004-06-17 | Prindle Joseph Charles | Digital advertisement insertion system and method for video games |
| US7116342B2 (en) * | 2003-07-03 | 2006-10-03 | Sportsmedia Technology Corporation | System and method for inserting content into an image sequence |
| US7979877B2 (en) * | 2003-12-23 | 2011-07-12 | Intellocity Usa Inc. | Advertising methods for advertising time slots and embedded objects |
| US7746378B2 (en) * | 2004-10-12 | 2010-06-29 | International Business Machines Corporation | Video analysis, archiving and alerting methods and apparatus for a distributed, modular and extensible video surveillance system |
| US20060105841A1 (en) * | 2004-11-18 | 2006-05-18 | Double Fusion Ltd. | Dynamic advertising system for interactive games |
| US8413182B2 (en) | 2006-08-04 | 2013-04-02 | Aol Inc. | Mechanism for rendering advertising objects into featured content |
| JP2008077173A (ja) | 2006-09-19 | 2008-04-03 | Sony Computer Entertainment Inc | Content display processing device and method for displaying advertisements within content |
| US20080195468A1 (en) * | 2006-12-11 | 2008-08-14 | Dale Malik | Rule-Based Contiguous Selection and Insertion of Advertising |
| JP4809201B2 (ja) | 2006-12-12 | 2011-11-09 | ヤフー株式会社 | Information providing device, information providing method, and computer program |
| US20080295129A1 (en) | 2007-05-21 | 2008-11-27 | Steven Laut | System and method for interactive video advertising |
| US8417037B2 (en) | 2007-07-16 | 2013-04-09 | Alexander Bronstein | Methods and systems for representation and matching of video content |
| US8091103B2 (en) | 2007-07-22 | 2012-01-03 | Overlay.Tv Inc. | Server providing content directories of video signals and linkage to content information sources |
| CN101098344A (zh) * | 2007-07-23 | 2008-01-02 | 王文钢 | Method for displaying video advertisements |
| JP2009094980A (ja) | 2007-10-12 | 2009-04-30 | Ask.Jp Co Ltd | Posted video distribution server and posted video distribution method |
| RU73115U1 (ru) * | 2007-10-19 | 2008-05-10 | Андрей Петрович Сутормин | Visual and audio advertising installation |
| KR101111726B1 (ko) | 2007-10-31 | 2012-03-08 | 주식회사 소프닉스 | Method for providing an interactive advertisement information file authoring service, and recording medium storing an interactive advertisement information file authoring program |
| US20090307722A1 (en) * | 2007-12-10 | 2009-12-10 | Jesse Ernest Gross | System to deliver targeted advertisements in a live video stream |
| US20090171787A1 (en) | 2007-12-31 | 2009-07-02 | Microsoft Corporation | Impressionative Multimedia Advertising |
| TWI375177B (en) * | 2008-09-10 | 2012-10-21 | Univ Nat Taiwan | System and method for inserting advertising content |
- 2009
  - 2009-12-08 US US12/633,609 patent/US9111287B2/en active Active
- 2010
  - 2010-08-19 TW TW099127767A patent/TWI521456B/zh not_active IP Right Cessation
  - 2010-08-31 MX MX2012003327A patent/MX2012003327A/es active IP Right Grant
  - 2010-08-31 CA CA2771167A patent/CA2771167C/en active Active
  - 2010-08-31 RU RU2012112228/08A patent/RU2542923C2/ru not_active IP Right Cessation
  - 2010-08-31 SG SG2012007225A patent/SG178220A1/en unknown
  - 2010-08-31 BR BR112012007127A patent/BR112012007127A2/pt not_active IP Right Cessation
  - 2010-08-31 JP JP2012532091A patent/JP5570607B2/ja not_active Expired - Fee Related
  - 2010-08-31 EP EP10820993.3A patent/EP2483858A4/en not_active Ceased
  - 2010-08-31 AU AU2010301005A patent/AU2010301005B2/en not_active Ceased
  - 2010-08-31 WO PCT/US2010/047198 patent/WO2011041056A2/en not_active Ceased
  - 2010-08-31 KR KR1020127008172A patent/KR101760962B1/ko not_active Expired - Fee Related
  - 2010-08-31 CN CN201080044011.XA patent/CN102576441B/zh active Active
Non-Patent Citations (1)
| Title |
|---|
| See references of EP2483858A4 * |
Also Published As
| Publication number | Publication date |
|---|---|
| CA2771167A1 (en) | 2011-04-07 |
| US20110078623A1 (en) | 2011-03-31 |
| EP2483858A2 (en) | 2012-08-08 |
| JP2013506907A (ja) | 2013-02-28 |
| KR20120091033A (ko) | 2012-08-17 |
| CN102576441B (zh) | 2014-12-17 |
| TW201113825A (en) | 2011-04-16 |
| SG178220A1 (en) | 2012-03-29 |
| US9111287B2 (en) | 2015-08-18 |
| CA2771167C (en) | 2017-01-10 |
| RU2542923C2 (ru) | 2015-02-27 |
| BR112012007127A2 (pt) | 2016-07-12 |
| KR101760962B1 (ko) | 2017-07-24 |
| WO2011041056A3 (en) | 2011-06-16 |
| TWI521456B (zh) | 2016-02-11 |
| MX2012003327A (es) | 2012-04-20 |
| AU2010301005A1 (en) | 2012-03-01 |
| AU2010301005B2 (en) | 2014-06-05 |
| JP5570607B2 (ja) | 2014-08-13 |
| CN102576441A (zh) | 2012-07-11 |
| EP2483858A4 (en) | 2015-05-20 |
| RU2012112228A (ru) | 2013-10-10 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9111287B2 (en) | | Video content-aware advertisement placement |
| US11184676B2 (en) | | Automated process for ranking segmented video files |
| US20230086529A1 (en) | | Method for serving interactive content to a user |
| US12200306B1 (en) | | Computer program product for allowing a user to create an interactive object in a content owner's digital media file |
| US8369686B2 (en) | | Intelligent overlay for video advertising |
| US20100164989A1 (en) | | System and method for manipulating adverts and interactive |
| JP2013506907A5 (ja) | | |
| US20180197575A1 (en) | | Methods for serving interactive content to a user |
| US20170277404A1 (en) | | Optimizing Layout of Interactive Electronic Content Based on Content Type and Subject Matter |
| KR20030051708A (ko) | | Advertising method and system using an Internet browser with a book-shaped interface |
| WO2013138370A1 (en) | | Interactive overlay object layer for online media |
| US20230419366A1 (en) | | Method for serving interactive content to a user |
| US9772752B1 (en) | | Multi-dimensional online advertisements |
| KR101381966B1 (ko) | | Method for providing additional information during video playback |
| TR201709269T4 (tr) | | Device for tracking elements in a video stream. |
| Jain et al. | | "Ad you like it" advertisement sourcing and selection technique across multiple heterogeneous applications |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 201080044011.X; Country of ref document: CN |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10820993; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2010820993; Country of ref document: EP |
| | WWE | Wipo information: entry into national phase | Ref document number: 2010301005; Country of ref document: AU |
| | WWE | Wipo information: entry into national phase | Ref document number: 2771167; Country of ref document: CA |
| | WWE | Wipo information: entry into national phase | Ref document number: 2012532091; Country of ref document: JP |
| | ENP | Entry into the national phase | Ref document number: 2010301005; Country of ref document: AU; Date of ref document: 20100831; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 1975/CHENP/2012; Country of ref document: IN |
| | WWE | Wipo information: entry into national phase | Ref document number: MX/A/2012/003327; Country of ref document: MX |
| | ENP | Entry into the national phase | Ref document number: 20127008172; Country of ref document: KR; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 2012112228; Country of ref document: RU |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112012007127; Country of ref document: BR |
| | ENP | Entry into the national phase | Ref document number: 112012007127; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20120329 |