US20100153861A1 - Interactive events - Google Patents

Interactive events

Info

Publication number
US20100153861A1
US20100153861A1 (application US 12/568,666)
Authority
US
United States
Prior art keywords
event
interactive
embodiment
client
audience
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/568,666
Inventor
Jeffrey David Henshaw
Alexander Irvin Hopmann
Christopher Andrew Evans
Daniel Evan Socolof
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Deep Rock Drive Partners Inc
Original Assignee
Deep Rock Drive Partners Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US10070108P
Priority to US10070408P
Priority to US10070308P
Priority to US10070608P
Application filed by Deep Rock Drive Partners Inc
Priority to US12/568,666
Publication of US20100153861A1
Application status: Abandoned

Classifications

    • H04L 51/046: Real-time or near real-time messaging, e.g. instant messaging [IM], interacting with other applications or services
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G06Q 10/02: Reservations, e.g. for tickets, services or events
    • G06Q 10/0637: Strategic management or analysis
    • G06Q 10/0246: Determination of advertisement effectiveness (traffic)
    • G06Q 10/1093: Calendar-based scheduling for a person or group
    • G06Q 30/0252: Targeted advertisement based on events or environment, e.g. weather or festivals
    • G06Q 30/0282: Business establishment or product rating or recommendation
    • G06Q 30/0601: Electronic shopping
    • H04N 7/155: Conference systems involving storage of or access to video conference sessions

Abstract

An interactive event allows clients to provide feedback to the performing artist and/or producers relative to the event being observed. Feedback options include shout outs, emotapplause, and voting.

Description

    TECHNICAL FIELD
  • Embodiments of the present disclosure generally relate to data evaluation, categorization, and presentation. More particularly, the embodiments of the present disclosure relate to systems which deliver live entertainment online via the Internet in various media forms and allow both the performer and the observer to interact.
  • BACKGROUND
  • Attempts to display media on computers date back to the earliest days of computing. However, little progress was made for several decades, primarily due to the high cost and limited capabilities of available computer hardware, and to a lesser extent to compatibility issues. Recently, consumer-grade personal computers have become powerful enough to display various types of media, including high quality audio and/or video streams.
  • Streaming multimedia represents one method of media distribution. In essence, streaming multimedia is multimedia that is broadcast by a streaming provider to an end-user; it is constantly received by, and normally presented to, the end-user while it is being delivered. Generally, the term streaming refers specifically to the delivery method of the data rather than to the content. Unfortunately, streaming typically requires tremendous bandwidth, or incurs significant latency while the data is cached locally. Recent advances in computer networks, combined with powerful home computers and modern operating systems, have made the near universal distribution of streaming media practical and affordable for ordinary consumers.
  • A stream of media can be on demand or live. On demand streams are stored on a server for a long period of time and are available to be transmitted at a user's request. Live streams may still use a server to broadcast the event, but are typically only available at one particular time, such as a video stream of a live sporting event, a political debate, an educational lecture, or a concert. Live streams may be edited and converted into on demand streams for later consumption. Current on demand and live streams, however, offer no channel for constructive feedback from the streaming audience. Essentially, live online presentations to large streaming audiences provide only unidirectional information, in a manner that makes observer participation difficult. On demand performances are presented after the fact, preventing the presenter and/or observer(s) from directly altering the previously recorded presentation.
  • SUMMARY
  • In view of the problems in the state of the art, embodiments of the invention are based on the technical problem of optimizing interactive live events, categorization, and presentation in an online environment. While the internet already supports many services for one-way communication and event broadcast, there have been no options for providing real-time, two-way interactivity between audience members and the people creating the event. Systems and methods presented in this disclosure provide this type of interactivity to create truly compelling live events on the internet.
  • One illustrated and described method provides large scale, real-time interactivity between distributed audience members on the internet and performers in an interactive event. Multiple types of interaction are possible, including direct text communication in the form of shoutouts as well as non-verbal communication, such as Emotapplause, that represents real-world feedback mechanisms like applause, fists in the air, peace signs, or throwing kisses, among others.
  • A system, suitable to solve the problems which at least one embodiment of the invention is based on, generates an interactive online event forum, such as a widget that may be used to solicit feedback from observer(s) and facilitate feedback response by the event presenter and/or publisher, to produce events for consumption by online audiences. The events could be live music performances, sporting events, political meetings, education or informational lectures, news broadcasts, travel logs, game shows, or any other type of event where the “performers” are at a central location or multiple locations and the audience is distributed across the internet, or in any number of locations with internet capable devices for communicating.
  • Feedback may be generated via a client side module to input data on relative performance quality and relative emotional response from the perspective of the observer. After receiving input, the rank-value of a particular monitored response can be calculated and presented back to the performer. In one embodiment, that ranking can be displayed directly or used to sort lists. Monitored responses may include a playlist compiled of proposed songs/material for the artist to consider, subject matter for further discussion, desired topics for debate, questions regarding covered material or stated positions, and relative emotional responses to the content currently being presented. These customized lists or relative feedback of the monitored responses may be used to attract additional observers and/or alter the performance of the presenter(s).
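The rank-value step above can be sketched as a simple weighted tally: each observer submits an (item, weight) pair, the totals determine each item's rank-value, and the result sorts the list shown back to the performer. The function name, the input shape, and summed weights as the rank-value are illustrative assumptions, not details from the disclosure.

```python
from collections import defaultdict

def rank_responses(votes):
    """Aggregate (item, weight) feedback pairs into a ranked list.

    `votes` is an iterable of (item, weight) tuples, e.g. a proposed
    song title paired with an observer's rating. Returns items sorted
    by total weight, highest first.
    """
    totals = defaultdict(float)
    for item, weight in votes:
        totals[item] += weight
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

playlist = rank_responses([
    ("Song A", 3), ("Song B", 5), ("Song A", 4), ("Song C", 1),
])
# playlist[0] is the top-ranked request
```

The same tally works for debate topics or lecture questions; only the item labels change.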
  • Various types of relative feedback options may include shout outs, emotapplause, and voting. Shout outs are text messages sent from audience members to the performers and to other audience members. The intent of the shout out is to let audience members send a directed message or question to the performers. In addition to the performer seeing the message at the performance venue, the audience members also see a subset of the messages, providing a sense of community among all of the audience members. Because the audience for a worldwide internet event could be very large, there is no guarantee that every message will be presented to the performers; however, the mechanism for transferring shout out messages ensures that a good random sampling of messages from all audience members is presented to both the performers and the other audience members.
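The disclosure does not specify how the random sampling of shout outs is obtained; one standard technique that fits a very large, unbounded message stream is reservoir sampling, sketched below as an assumption. It yields a uniform random subset of up to k messages without holding the whole stream in memory.

```python
import random

def sample_shoutouts(messages, k, rng=random):
    """Pick a uniform random sample of up to k shout outs (reservoir sampling).

    Works on a stream of unknown length, so the service never needs to
    hold every message from a very large audience in memory at once.
    If fewer than k messages arrive, all of them are returned.
    """
    reservoir = []
    for i, msg in enumerate(messages):
        if i < k:
            reservoir.append(msg)
        else:
            # Each later message replaces a reservoir slot with
            # probability k / (i + 1), keeping the sample uniform.
            j = rng.randint(0, i)
            if j < k:
                reservoir[j] = msg
    return reservoir
```

The same sample can be shown to the performers at the venue and to the other audience members, matching the community effect described above.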
  • Emotapplause is a mechanism for sending non-verbal communication from the audience members to the performers. By clicking graphical representations of the emotapplause (such as clapping hands, a heart, etc.), a message is sent to a centralized service that aggregates all of the feedback from the audience. The performers then see a graphical representation of the aggregated feedback. The actual experience for the performer changes based on how many audience members are using a given emotapplause image at that moment: if 70% of the audience was ‘clapping’ and 10% of the audience was sending kisses, the visualization might include very large clapping hands, or perhaps many clapping hands and a smaller representation of kissing lips.
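The aggregation step can be sketched as counting clicks per emote in the current window and converting the counts to proportions, which a renderer could map to icon size (70% clapping producing very large clapping hands). The function name and the dict-of-shares output are assumptions for illustration.

```python
from collections import Counter

def aggregate_emotapplause(events):
    """Turn raw emotapplause clicks into proportions for visualization.

    `events` is an iterable of emote names, one per audience click in
    the current window. Returns {emote: share_of_total}, which a
    renderer can map to the relative size of each emote's graphic.
    """
    counts = Counter(events)
    total = sum(counts.values())
    if total == 0:
        return {}
    return {emote: n / total for emote, n in counts.items()}

shares = aggregate_emotapplause(["clap"] * 7 + ["kiss"] * 1 + ["heart"] * 2)
# shares["clap"] == 0.7, matching the 70% clapping example above
```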
  • One of the best ways to keep an audience engaged in an event is to give them some control over how the event unfolds. A voting mechanism lets the audience decide what song is played next, what topic is covered next, or the outcome of some sporting event, among any number of other mechanisms for shaping the flow of the event by popular vote.
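A minimal tally for such a poll might look like the following sketch; one vote per audience member, with ties broken toward the option listed first, is an assumed rule rather than one stated in the disclosure.

```python
from collections import Counter

def close_poll(ballots, choices):
    """Tally one-vote-per-member ballots and return the winning choice.

    `ballots` maps member id -> chosen option; votes for options not in
    `choices` are ignored. Ties resolve to the option listed first in
    `choices`, a simple deterministic rule.
    """
    tally = Counter(v for v in ballots.values() if v in choices)
    return max(choices, key=lambda c: tally[c])

next_song = close_poll(
    {"fan1": "Song 1", "fan2": "Song 2", "fan3": "Song 2"},
    ["Song 1", "Song 2"],
)
# next_song == "Song 2"
```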
  • An event producer may use the ranked lists generated for the client/user interface in a variety of ways including to present audience feedback for the performer and to alter a previously recorded presentation for the individual observer. A producer may solicit feedback by an observer and/or performer according to a variety of factors including prior event experience, available repertoire, relative quality of received feedback, and real time education of audience trends. Producers may also maintain and/or improve participant (performer and/or audience) satisfaction by reviewing the event rankings and comparing the responses with comparable events to identify trends. By decreasing or eliminating any discrepancies with participant expectations, the producer will likely increase the quality of the participant experience and thereby reduce marketing costs associated with bringing new events to market.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Non-limiting and non-exhaustive exemplary embodiments of the present disclosure are described with reference to the following drawings in which:
  • FIG. 1 illustrates a block diagram view of computer systems in an online interactive event environment in accordance with at least one embodiment;
  • FIG. 2 illustrates a block diagram view of components contained in an interactive client system configured in accordance with at least one embodiment;
  • FIG. 3 illustrates a block/flow diagram view of a portion of computer systems to filter feedback in an exemplary online interactive event environment in accordance with at least one embodiment;
  • FIG. 4 illustrates a flow diagram view of a method of a portion of operation for interactive event data evaluation, categorization, and presentation in accordance with at least one embodiment;
  • FIGS. 5A-5D illustrate block diagram views of portions of user interfaces, each generated in an interactive feedback system configured for compelling live event quality via relative interactivity in accordance with various embodiments;
  • FIG. 6 illustrates a block diagram view of a portion of an interactive client interface of an online interactive event environment during presentation of the event in accordance with various embodiments of the present disclosure; and
  • FIG. 7 illustrates a block diagram view of a portion of an interactive client interface of an online interactive event environment for various after party presentations associated with an online interactive event in accordance with various embodiments of the present disclosure.
  • DETAILED DESCRIPTION
  • In the following detailed description, reference is made to the accompanying drawings which form a part hereof wherein like numerals designate like parts throughout, and in which are shown, by way of illustration, specific embodiments in which the disclosure may be practiced. It is to be understood that other embodiments may be utilized and structural or logical changes may be made without departing from the scope of the present disclosure. Therefore, the following detailed description is not to be taken in a limiting sense, and the scope of a portion of the present disclosure is defined by the appended claims and their equivalents.
  • Throughout the specification and claims, the following terms take at least the meanings explicitly associated herein, unless the context clearly dictates otherwise. The meanings identified below are not intended to limit the terms, but merely provide illustrative examples for use of the terms. The meaning of “a,” “an,” and “the” may include reference to both the singular and the plural. Reference in the specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure. The meaning of “in” may include “in” and “on.” The appearances of the phrases “in one embodiment” or “in an embodiment” in various places in the specification do not necessarily all refer to the same embodiment, but it may. The term “connected” may mean a direct electrical, electro-magnetic, mechanical, logical, or other connection between the items connected, without any electrical, mechanical, logical or other intermediary therebetween. The term “coupled” can mean a direct connection between items, an indirect connection through one or more intermediaries, or communication between items in a manner that may not constitute a connection. The term “circuit” or “circuitry” as used in any embodiment described herein, can mean a single component or a plurality of components, active and/or passive, discrete or integrated, that are coupled together to provide a desired function and may include, for example, singly or in any combination, hardwired circuitry, programmable circuitry, state machine circuitry, and/or firmware that stores instructions executed by programmable circuitry. The term “signal” can mean at least one current, voltage, charge, data, or other such identifiable quantity.
  • In an effort to clarify comparative phrases used in the specification and the claims of this disclosure, please note that the following phrases take at least the meanings indicated and associated herein, unless the context clearly dictates otherwise. The phrase “A/B” means “A or B”. The phrase “A and/or B” means “(A), (B), or (A and B)”. The phrase “at least one of A, B and C” means “(A), (B), (C), (A and B), (A and C), (B and C) or (A, B and C)”. The phrase “(A) B” means “(A B) or (B)”, that is “A” is optional.
  • In addition, various embodiments depicted in FIG. 1 through FIG. 4 are represented by block diagrams and flow diagrams that illustrate in more detail the scope of the present disclosure. The block diagrams often illustrate certain embodiments of modules for performing various functions of the present invention. In general, the represented modules include therein executable and operational data for operation within a system as depicted in FIG. 1 and/or FIG. 2 in accordance with embodiments of the present disclosure. Various operations of the system may be described as multiple discrete operations in turn, in a manner that may be helpful in understanding embodiments of the present disclosure; however, the order of description should not be construed to imply that these operations are order dependent.
  • As used herein, the term executable code, or merely “executable,” is intended to include any type of computer instruction and computer executable code that may be located within a memory device and/or transmitted as electronic signals over a system bus or network. An identified module of executable code may, for instance, comprise one or more physical or logical blocks of computer instructions which may, for instance, be organized as an object, procedure, or function. Nevertheless, the executables of an identified module need not be located together, but may comprise disparate instructions stored in different locations which together comprise the module and achieve the purpose stated for the module. Indeed, an executable may be a single instruction, or many instructions, and may even be distributed over several different code segments, among different programs, and across several memory devices. Similarly, operational data may be identified and illustrated herein within modules, and may be embodied in any suitable form and organized within any suitable type of data structure. The operational data may be collected as a single data set, or may be distributed over different locations including over different storage devices, and may at least partially exist merely as electronic signals on a system bus or network.
  • Referring now to FIG. 1, a block diagram of various computer systems in an online interactive event environment 100 is shown. Computer systems useful for data evaluation, categorization, and presentation of interactive events are shown in accordance with various embodiments of the present disclosure. The online interactive event environment 100 includes both a variety of operating environments and a variety of network devices. Operating environments within the online interactive event environment 100 may include, but are not limited to, multiple interactive client endpoints that may attach via a communication network, such as the internet, to a production center and/or one or more performance studios. In one embodiment, the production center includes network operations and a datacenter. In one embodiment, the performance studio includes an event studio, an event database, an event interface, and an interactive display. The production center and performance studio may be separately connected via a private communication network or via a virtual private network across a public communication network, such as the internet.
  • An interactive client endpoint may represent a variety of consumer devices including, but not limited to, general purpose computer systems, personal digital assistants, digital media players, mobile telephones, video equipment, application specific devices, and other digital communication devices.
  • Performance studios provide executable code and operational data to the interactive client endpoints, directly and indirectly via the production center. Interactive client endpoints, in accordance with various embodiments, can be visitors of the event website, people who own or purchase a ticket, employees of the production company running the web site, or any other people or devices that may participate in the interactive event. Various multimedia devices may be used to upload a rich variety of media information for or about an event to the event profile. For example, multiple cameras or webcams may be used to collect video images of an event, conduct separate web interviews, and/or provide a video preview of an event. Likewise, multiple microphones may be used to collect sound from the event and/or associated interviews or advertisements.
  • In one embodiment, the audience member at the interactive client endpoint joins an ongoing event and initiates interactivity with the event by typing a message, clicking or otherwise choosing an emotapplause image, voting for event presentation lists, selecting a camera angle, or some other method of indicating the message they would like to send. The messages are then sent to a centralized internet web service that adds user information about that audience member such as their name, image, location, source, etc. That information is then stored in a central database or data store such that the web service can index, search, log and recall each request, or aggregated totals of requests.
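The enrich-and-store step described above can be sketched with a small in-memory stand-in for the central data store; the class and field names are assumptions, and a real deployment would use a database or indexed data store behind a web service as the text describes.

```python
import time
import uuid

class ShoutoutStore:
    """Minimal in-memory stand-in for the central shout-out data store."""

    def __init__(self):
        self._messages = []

    def submit(self, text, member):
        """Enrich a raw client message with member metadata and persist it.

        Mirrors the step where the web service adds the audience
        member's name, image, location, and source before storage.
        """
        record = {
            "id": str(uuid.uuid4()),
            "text": text,
            "name": member.get("name"),
            "image": member.get("image"),
            "location": member.get("location"),
            "source": member.get("source"),
            "received_at": time.time(),
        }
        self._messages.append(record)
        return record["id"]

    def recent(self, max_age_seconds=30.0, now=None):
        """Recall messages no older than the given freshness window."""
        now = time.time() if now is None else now
        return [m for m in self._messages
                if now - m["received_at"] <= max_age_seconds]
```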
  • Interactive client applications can then periodically issue requests for the current summary state of the interactivity information. That information includes a set of recent shout out messages and their related metadata, the current aggregate information for emotapplause items, current voting topics and voting choices, and any other status information that helps the client process this data. Because of the potential quantity of requests coming from audience members, various caching mechanisms can be used to reduce the overhead spent gathering this information on every request. To maintain relevancy, it is important that the information sent out to clients be very current, so as to preserve the feeling of interactivity at the event. In one embodiment, shout out messages are not allowed to be more than about 30 seconds old (measured from the time they were sent by the audience member) and preferably represent the most recent messages received by the system. The response to the interactive client may be encoded in at least one of a variety of formats, including but not limited to XML, JSON, CSV, and the like.
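The polling response can be sketched as a JSON payload combining fresh shout outs, emote aggregates, and the active poll, with a short time-based cache so a flood of client requests does not rebuild the same payload. JSON, the 30-second freshness window, and newest-first ordering come from the text; the field names and the 2-second cache TTL are assumptions.

```python
import json
import time

SHOUTOUT_MAX_AGE = 30.0   # seconds; older messages are dropped (per the text)
CACHE_TTL = 2.0           # seconds; assumed: one build serves many pollers

_cache = {"built_at": 0.0, "body": None}

def summary_state(shoutouts, emote_counts, poll, now=None):
    """Build the JSON summary that polling clients fetch.

    `shoutouts` is a list of {"text", "sent_at"} dicts, `emote_counts`
    maps emote name -> current aggregate count, and `poll` describes
    the active vote. Within CACHE_TTL the previously built body is
    returned as-is, trading a little staleness for far less work.
    """
    now = time.time() if now is None else now
    if _cache["body"] is not None and now - _cache["built_at"] < CACHE_TTL:
        return _cache["body"]
    fresh = [m for m in shoutouts if now - m["sent_at"] <= SHOUTOUT_MAX_AGE]
    fresh.sort(key=lambda m: m["sent_at"], reverse=True)  # newest first
    body = json.dumps({
        "shoutouts": fresh,
        "emotapplause": emote_counts,
        "poll": poll,
    })
    _cache["built_at"], _cache["body"] = now, body
    return body
```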
  • In one embodiment, when the interactive audience client or performance studio client initially receives the data, they present the information to the performers or audience members in an appropriate way. For the performers, that may be showing the name of the audience member, their image, location and the shoutout message itself in an interesting animation. Some additional options for emotapplause and shoutouts are described below with reference to FIG. 5A and FIG. 5C.
  • Referring now to FIG. 2, a computer system for implementing at least one embodiment of the invention is shown. The system includes a computing device 200 in which executable and operational data may be hosted and transmitted to one or more interactive stations via a communication network of the previously described online interactive event environment 100. Computing device 200 may be configured as a client, a server, a mobile device, or any other computing device that interacts with data in a network based collaboration system.
  • In a very basic configuration, computing device 200 typically includes at least one processing unit 220. In one embodiment, the processing unit 220 includes at least one processor. As such, the term “processor”, as used herein, should be interpreted to mean an individual processor, firmware logic, reconfigurable logic, a hardware description language logic configuration, a state machine, an application-specific integrated circuit, a processing core co-disposed in an integrated circuit package with at least one other processing core and/or component, or combinations thereof.
  • The processing unit 220 may be operably connected to system memory 210. Depending on the exact configuration and type of computing device, system memory 210 may be non-volatile memory 211 (such as ROM, flash memory, etc.), volatile memory 214 (such as RAM), or some combination of the two. System memory 210 typically includes Basic Input/Output System (BIOS) firmware code 212, an operating system 215, one or more applications 216, and may include program modules and data 217. A configuration library 218 (e.g., registries), which contain code and data to be shared and changed in a modular or database fashion to provide services to applications 216 and programs 217 is also often included in system memory 210.
  • Computing device 200 may have additional features or functionality. For example, computing device 200 may also have a dedicated graphics rendering device, such as video adapter 230 coupled with at least one display monitor 235. Computing device 200 may also have a variety of human input device(s) (HID) 259 such as keyboard, mouse, pen, voice input device, touch input device, and the like. In a broader sense, human input device (HID) 259 may also include various output devices such as a display monitor 235, speakers, printer, and the like. Computing device 200 may utilize a variety of ports via port interface 250 to share data including wireless ports 253, parallel ports 255, and serial ports 257. Each of these port types may include further varieties, for example serial ports may include a Universal Serial Bus (USB) port and/or a FireWire/IEEE 1394 port.
  • In various embodiments, computing device 200 may also include a storage drive interface 240 for communication with additional data storage devices (removable and/or non-removable) such as, for example, magnetic disk drives 242, optical disk drives 243, hard disk drives 244, tape drives, and other storage devices. Such additional storage is illustrated in FIG. 2 by removable magnetic storage 241 and removable optical storage 249 and non-removable storage (hard disk drive 244).
  • Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. System memory 210, removable storage and non-removable storage are all examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 200. Any such computer storage media may be used to store desired information, such as operating system 245, one or more applications 246, programs 247, and/or registries and configuration libraries 248 accessible to computing device 200.
  • Computing device 200 may also contain a communication connection via port interface 250 and/or network interface card 260 that allows the device 200 to communicate with other remote computing devices 280, such as over a communication network. The communication network may comprise a local area network (LAN) and/or a wide area network (WAN). Each network may be wired or wireless or combination thereof. The communication network may also comprise other large scale networks including, but not limited to, intranets and extranets, or combinations thereof. In one embodiment the communication network is an interconnected system of networks, one particular example of which is the Internet and the World Wide Web supported on the Internet.
  • A variety of configurations may be used to connect the computing device 200 to the remote computing devices 280. For example, although modem 265 is illustrated as connecting to the remote computing device 280, a remote server, via a WAN, and network interface 260 is illustrated as connecting via a LAN, both the network interface 260 and the modem 265 may just as well be coupled to other large scale networks including, but not limited to, a global system of interconnected computer networks (the Internet), various intranets and extranets, or combinations thereof.
  • The information transmitted as data across the previously discussed communication connections is an example of communication media. Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. The term computer readable media as used herein includes both storage media and communication media.
  • Although many of the examples refer to computing devices with a single operating system, file system and configuration library, the concepts, principles, and examples disclosed below may be extended to provide interactive event functionality across several or many operating systems, file systems, and/or configurations libraries (e.g., registries). Accordingly, it is contemplated that the principles described herein may be applied to these and other computing systems and devices, both existing and yet to be developed, using the methods and principles disclosed herein.
  • Turning now to FIGS. 3 and 4, methods and various operations of the interactive event system, in accordance with at least one embodiment, are described in terms of firmware, software, and/or hardware with reference to flowcharts and/or flow diagrams. More specifically, FIG. 3 is a block and flow diagram schematically showing a portion of various computer systems configured to filter feedback in an exemplary online interactive event environment in accordance with at least one embodiment. FIG. 4 is a flow diagram showing a portion of a method of operation for interactive event data evaluation, categorization, and presentation in accordance with at least one embodiment. Describing a method and/or various operations by reference to a flowchart enables one skilled in the art to develop programs, including instructions to carry out the methods on suitably configured computer systems and electronic devices. In various embodiments, portions of the operations to be performed by an electronic device or computer system may constitute circuits, general purpose processors (e.g., micro-processors, micro-controllers, or digital signal processors (DSPs)), special purpose processors (e.g., application specific integrated circuits or ASICs), firmware (e.g., firmware that is used by a processor such as a micro-processor, a micro-controller, and/or a digital signal processor), state machines, hardware arrays, reconfigurable hardware, and/or software made up of executable instructions. The executable instructions may be embodied in firmware logic, reconfigurable logic, a hardware description language, a state machine, an application-specific integrated circuit (ASIC), or combinations thereof.
  • With respect to various embodiments using a software implementation (e.g., a hardware simulator), at least one of the processors of a suitably configured electronic communication device, such as a computer, executes the instructions from a storage medium. The computer-executable instructions may be written in a computer programming language or executable code. If written in a programming language conforming to a recognized standard, such instructions may be executed on a variety of hardware platforms and may interface with a variety of operating systems. Although the various embodiments are not described with reference to any particular programming language, it will be appreciated that a variety of programming languages may be used to implement the teachings of the embodiments as described herein. Furthermore, it is common in the art to speak of software in one form or another (e.g., program, procedure, process, application, etc.) as taking an action or causing a result. Such expressions are merely a shorthand way of saying that execution of the software by a device causes the processor of the computer to perform an action or produce a result.
  • Referring now to FIG. 3, a block and flow diagram view of a portion of a system 300 configured to filter feedback in an exemplary online interactive event environment is illustrated in accordance with at least one embodiment. The system 300 includes one or more performance studios for producing the underlying content for the event, a production center to produce and monitor the interactive event, and a plurality of interactive client endpoints to generate the interactive content associated with the underlying content of the event.
  • Each of the one or more interactive client endpoints is configured to receive the data transmitted from the performance studio and to transmit user-generated interactive feedback associated with the interactive event back to the production center and/or the performance studio. In one embodiment, the various multimedia streams received by the client include camera-captured footage of the performing artists and fans in the studio audience. Accordingly, the user-generated interactive content transmitted by the client may include voting results, shout outs, emotapplause, and other feedback solicited and/or generated from the watching audience.
  • In various embodiments, the performance studio may be a customized interactive studio, such as a Deep Rock Drive certified performance studio or a traditional performance studio upgraded with interactive equipment. In one embodiment, each performance studio includes at least one interactive display to receive interactive content, such as voting results, shout outs, emotapplause, and other feedback from the watching audience.
  • The at least one production center is configured to control a variety of network operations and provide a datacenter for the interactive event. In one embodiment, the production center monitors the flow of content to the interactive clients to maintain a log of the event and ensure quality reception of the content sent to the client. Quality levels may be adjusted in a variety of ways including bandwidth throttling, data compression, refresh rate manipulation, and adjustment of packet size and/or frequency. In one embodiment, the production center may also receive the content transmitted by the interactive client for additional processing, including interactive content sampling, filtering, and transformation.
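The quality adjustments named above (bandwidth throttling, compression, refresh rate manipulation, packet size/frequency) can be sketched as a simple control loop. The patent names the knobs but not the policy, so the thresholds, field names, and `adjust_quality` function below are illustrative assumptions only:

```python
# Illustrative sketch of production-center quality adjustment. The packet-loss
# thresholds and setting names are assumptions, not part of the disclosure.

def adjust_quality(packet_loss: float, settings: dict) -> dict:
    """Return updated stream settings based on observed packet loss."""
    updated = dict(settings)
    if packet_loss > 0.05:            # degraded link: trade fidelity for delivery
        updated["bitrate_kbps"] = max(250, settings["bitrate_kbps"] // 2)
        updated["refresh_hz"] = max(15, settings["refresh_hz"] - 10)
        updated["packet_bytes"] = max(512, settings["packet_bytes"] // 2)
    elif packet_loss < 0.01:          # healthy link: restore quality gradually
        updated["bitrate_kbps"] = min(4000, settings["bitrate_kbps"] + 250)
        updated["refresh_hz"] = min(60, settings["refresh_hz"] + 5)
    return updated

settings = {"bitrate_kbps": 2000, "refresh_hz": 30, "packet_bytes": 1400}
degraded = adjust_quality(0.08, settings)   # lossy link: halve bitrate, slow refresh
healthy = adjust_quality(0.005, settings)   # clean link: step quality back up
```

A real production center would drive such a loop from per-client reception reports rather than a single loss figure.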
  • In one embodiment, content may be filtered prior to transmission to the performance studio. Filtered content may merely be removed from the feedback stream. Alternatively, filtered content may be replaced with alternative content expressing a similar intent, but in a more acceptable manner. Another form of filtering includes the relative weighting of received responses from the interactive client endpoints. This allows the performer to get a feel for the response of the audience. It will be appreciated by those of ordinary skill in the art and others, that a wide variety of alternate and/or equivalent implementations of interactive content filtration may be substituted for the specific embodiment of filtering as shown.
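The three filtering styles described above — removal, replacement with alternative content expressing a similar intent, and relative weighting of responses — can be sketched as follows. The word lists, identifiers, and per-client weighting cap are illustrative assumptions:

```python
# Minimal sketch of the filtering embodiment: remove, replace, or weight
# feedback before it reaches the performance studio. Word lists and the
# weighting scheme are assumptions for illustration.

REPLACEMENTS = {"this sucks": "not feeling it", "boo": "not my favorite"}
BLOCKED = {"spamlink.example"}

def filter_feedback(messages):
    """Remove blocked content, soften replaceable content, pass the rest."""
    out = []
    for msg in messages:
        text = msg.lower()
        if any(blocked in text for blocked in BLOCKED):
            continue                       # removed from the feedback stream
        out.append(REPLACEMENTS.get(text, msg))  # replaced or passed through
    return out

def weighted_response(clicks):
    """Weight responses so the performer gets a feel for the whole audience:
    here, each client's contribution is capped regardless of click count."""
    per_client = {}
    for client_id in clicks:
        per_client[client_id] = per_client.get(client_id, 0) + 1
    return sum(min(1.0, count) for count in per_client.values())

clean = filter_feedback(["Boo", "great set!", "visit spamlink.example now"])
score = weighted_response(["a", "a", "a", "b"])   # two distinct clients
```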
  • Moreover, the illustrated configuration of the system 300 may also include a wide variety of alternate and/or equivalent implementations. For example, in at least one embodiment, the performance studio and the production center may be the same location. Moreover, in at least one embodiment, some of the interactive clients may also be co-located at the performance studio and/or the production center.
  • Referring now to FIG. 4, a flow diagram view of a portion of a method of operation 400 for interactive event data evaluation, categorization, and presentation is illustrated in accordance with at least one embodiment. Initially, the event is established in block 410. The established event may include information about the performers at the event, size (number of available tickets), ticket sales thresholds, anticipated playlists, online location of the event, and other particulars about the event. In block 420, tickets or admission codes for the event are issued based on event information.
  • Once the event opens in block 430, such as at the beginning of a performance, the method 400 begins to determine which interactive clients may have access to the data being transmitted. Query block 440 handles this by determining whether the soliciting client has a ticket or admission code. If not, the soliciting client is encouraged to purchase a ticket in block 420. If the client has a ticket, then they are allowed into the event in block 450. Upon registering with the event coordinators, the interactive client will be allowed to receive the event stream in block 460, including at least one integrated multimedia audio and video stream from the performance studio. In one embodiment, the integrated multimedia audio and video stream includes multiple synchronized streams, one for each camera angle.
  • Monitoring block 470 determines whether the event has concluded. If not concluded, the method 400 continues to accept and process interactive inputs from the interactive client, such as requests to change camera angles 482, voting information 484 including votes regarding upcoming playlists, emotapplause 486, and shout outs 488. If the event has concluded, the method 400 directs interactive clients towards after party presentations 490 associated with the event, which may include post videos 494, post photos 496, post notes 498, and other post event offerings. In one embodiment, the post videos 494 may include the entire event stream for review by the interactive client. In one embodiment, the post photos 496 may include a collection of images from the event and/or publicity shots of the performers at the event. In one embodiment, the post notes 498 may include links to additional information about the performers at the event, including future concerts that may be available.
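The admission-and-interaction flow of method 400 (blocks 410-490) can be sketched as follows. The function and field names are assumptions for illustration; the patent specifies the flow, not an API:

```python
# Sketch of method 400: gate clients on tickets (block 440), admit and
# stream (blocks 450-460), and direct clients to after party presentations
# when the event concludes (block 490). All identifiers are illustrative.

def run_event(clients, event_over):
    admitted = []
    for client in clients:
        if not client.get("ticket"):            # query block 440: no ticket
            client["offer"] = "purchase ticket"  # send back to block 420
            continue
        admitted.append(client)                  # block 450: allowed in
        client["stream"] = "event stream"        # block 460: receive stream
    if event_over:                               # monitoring block 470
        for client in admitted:                  # block 490: after party
            client["after_party"] = ["videos", "photos", "notes"]
    return admitted

clients = [{"id": 1, "ticket": True}, {"id": 2, "ticket": False}]
admitted = run_event(clients, event_over=True)
```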
  • Referring now to FIGS. 5A-5D, block diagram views of portions of user interfaces (510, 520, 530, 540) are illustrated. Each user interface is generated in an interactive feedback system configured to provide compelling live event quality via relative interactivity in accordance with various embodiments.
  • In FIG. 5A, portions of user interface 510 are shown illustrating the solicitation on an interactive client of an event attendee to select and transmit emotapplause via an emoticon indicative of a current emotional state of the event attendee. Emotapplause is a mechanism of sending non-verbal communication from the audience members to the performers. By clicking graphical representations of the emotapplause (such as clapping hands, a heart, etc.), a message is sent to a centralized service that aggregates all of the feedback from the audience. The performers then see a graphical representation of the aggregated feedback. The actual experience seen by the performer changes based on how many audience members are using that emotapplause image at that moment; so, if 70% of the audience was ‘clapping’ and 10% of the audience was sending kisses, the visualization might include very large clapping hands, or perhaps many clapping hands and a smaller representation of kissing lips. Other sample emoticons include a lighter, a unity or rock-on fist, a hang-loose or horned devil hand sign, a virtual bra, and clapping. It will be appreciated by those of ordinary skill in the art and others, that a wide variety of alternate and/or equivalent implementations of emotapplause may be substituted for the specific embodiment of emotapplause shown. For example, emotapplause messages may be displayed to the performers based on statistical aggregation of the number of times each emotapplause item is clicked by audience members in accordance with one embodiment. It may be appreciated by those of ordinary skill in the art and others that a variety of algorithms may be used to determine the quantity, size, and intensity of the animation that is presented to the performers.
For example, if a statistically larger percentage of the audience is clicking one icon in the most recent set of data received from the interactive clients, the associated animation may be larger than the other animations for the less used emotapplause at that moment. Alternatively, in one embodiment, if one form of emotapplause is trending up in total number of clicks over a number of recent requests for data from the service, the corresponding animations may also grow in size, quantity, and/or intensity. Similarly, if a trend is downward, the corresponding animations could shrink in size, quantity, and/or intensity. In one embodiment, a different animation may be displayed when the detected emotapplause from the audience hits a designated milestone in number or reaches a threshold gauging the relative intensity of user actions. In one embodiment, multiple animations may be shown simultaneously, and/or different display surfaces may show different sets of animations, where the placement of the display surfaces could indicate a higher or lower priority to the performer or audience. In one embodiment, the audience member's interface could also show similar animations based on the activity of the overall audience, so they will be able to see how active different emotapplause items are. Various embodiments enable animations to be overlaid on the video stream to allow audience members to see exactly what the performers are seeing.
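The aggregation described above — tally the most recent clicks per icon, convert them to audience shares, scale each animation accordingly, and compare against the previous window for trend — can be sketched as follows. The scaling constants and function names are illustrative assumptions:

```python
# Sketch of emotapplause aggregation: shares of recent clicks drive relative
# animation size, and window-over-window comparison drives grow/shrink trends.
from collections import Counter

def aggregate_emotapplause(clicks, max_size=100):
    """Map recent icon clicks to relative animation sizes (0-max_size)."""
    tally = Counter(clicks)
    total = sum(tally.values()) or 1
    # animation size is proportional to each icon's share of all clicks
    return {icon: round(max_size * count / total) for icon, count in tally.items()}

def trend(previous, current):
    """Grow an animation when its icon trends up between windows, else shrink."""
    return {icon: "grow" if current.get(icon, 0) > previous.get(icon, 0)
            else "shrink"
            for icon in set(previous) | set(current)}

recent = ["clap"] * 7 + ["kiss"] * 1 + ["lighter"] * 2
sizes = aggregate_emotapplause(recent)     # clap dominates: largest animation
direction = trend({"clap": 5, "kiss": 3}, {"clap": 7, "kiss": 1})
```

With 70% of recent clicks on the clapping icon, its animation is scaled to seven times the kissing-lips animation, matching the 70%/10% example in the text.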
  • In FIG. 5B, portions of user interface 520 are shown illustrating the solicitation on an interactive client of an event attendee to select and transmit a prioritized interactive playlist. One of the best ways to keep an audience engaged in an event is to give them some control of how the event unfolds. Providing a voting mechanism allows them to decide what song is played next, what topic is covered next, or the outcome of some sporting event, among any number of other mechanisms for impacting the flow of the event based on popular vote. Voting can be presented as a list of choices below some header describing what is currently being voted on. Each choice has an option for the audience member to make or change their choice. When they make a choice, it is sent to the service, which tallies the votes and provides summary information in the client data requests. It will be appreciated by those of ordinary skill in the art and others, that a wide variety of alternate and/or equivalent implementations of voting mechanisms may be substituted for the specific embodiment of voting on a presented playlist as shown. For example, the questions to be voted on can be sent in real time by an administrator, based on input by the performers. In one embodiment, the voting results can be presented in real-time to performers and/or audience members. One embodiment allows past ballot results or voting history to be saved for later use and review.
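The voting service described above — each audience member may make or change a choice, and the service tallies current votes for the summary returned in client data requests — can be sketched as follows. The class and method names are illustrative assumptions:

```python
# Sketch of the playlist voting mechanism: one current ballot per client,
# where a revote replaces the previous choice. Names are illustrative.
from collections import Counter

class PlaylistVote:
    def __init__(self, choices):
        self.choices = choices
        self.ballots = {}                      # client_id -> current choice

    def cast(self, client_id, choice):
        """Make or change this client's choice."""
        if choice in self.choices:
            self.ballots[client_id] = choice   # a revote replaces the old one

    def summary(self):
        """Tally returned in client data requests (and to the performers)."""
        tally = Counter(self.ballots.values())
        return {choice: tally.get(choice, 0) for choice in self.choices}

vote = PlaylistVote(["Song A", "Song B"])
vote.cast("fan1", "Song A")
vote.cast("fan2", "Song B")
vote.cast("fan1", "Song B")                    # fan1 changes their choice
```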
  • In FIG. 5C, portions of user interface 530 are shown illustrating the solicitation on an interactive client of an event attendee to provide and transmit a virtual shout out to the performer. Shout outs are text messages sent from the audience members to the performers and audience members. The intent of the shout out is for the audience members to be able to send a directed message or question to the performers. In addition to the performer seeing the message at the performance venue, the audience members also see a subset of the messages, thus providing a sense of community among all of the audience members. Because the number of audience members could be very large for a worldwide internet event, there is no guarantee that all messages will be presented to the performers, but due to the mechanism of transferring shout out messages, a good random sampling of messages from all audience members will be presented to both the performers and other audience members. It will be appreciated by those of ordinary skill in the art and others, that a wide variety of alternate and/or equivalent implementations of selective instant messaging mechanisms may be substituted for the specific shout out embodiment shown. For example, in one embodiment, messages from audience members may be filtered if the same message is sent multiple times in a row to prevent “spamming” of messages to the participants. Moreover, messages from audience members may also be filtered based on content and length. In one embodiment, the audience and/or performers may be shielded from inappropriate content or specific topics. In one embodiment, a message can be filtered if it is too long, to prevent situations where information download would be slowed by extra-long messages. One variation allows long messages to be parsed and resent separately, while another throws out long messages. Determining which action should be taken may be based in part on the content of the message.
  • In one embodiment, specific audience members can be blocked from sending messages if they are found to be consistently sending inappropriate messages and/or “spamming” messages. When messages are blocked, various embodiments allow the audience member to still see their messages as if they had been sent, so that they are unaware that the messages they send have been blocked.
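The shout-out moderation described above — duplicate ("spamming") suppression, length filtering, random sampling for display, and shadow blocking where a blocked sender still sees their own messages as sent — can be sketched as follows. The length limit and all identifiers are illustrative assumptions:

```python
# Sketch of shout-out moderation. submit() returns what the *sender* sees
# (True = appears sent), which may differ from whether the message is queued.
import random

class ShoutOutService:
    def __init__(self, max_len=140):
        self.max_len = max_len
        self.blocked = set()                   # consistently abusive senders
        self.last_message = {}                 # sender -> previous message text
        self.accepted = []                     # messages eligible for display

    def submit(self, sender, text):
        if sender in self.blocked:
            return True                        # shadow block: sender is unaware
        if text == self.last_message.get(sender):
            return True                        # repeated message suppressed
        self.last_message[sender] = text
        if len(text) > self.max_len:
            return False                       # over-long message thrown out
        self.accepted.append((sender, text))
        return True

    def sample(self, k):
        """Random sampling of accepted messages for performers and audience."""
        return random.sample(self.accepted, min(k, len(self.accepted)))

svc = ShoutOutService()
svc.blocked.add("troll")
svc.submit("troll", "first!")                  # appears sent, never queued
svc.submit("fan", "play Freebird!")
svc.submit("fan", "play Freebird!")            # duplicate, suppressed
```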
  • In one embodiment, messages that are displayed to audience members and/or performers are displayed for a period of time related to the length of the message, so that longer messages are displayed longer while short messages go by faster. This helps the audience and/or artist both read and comprehend messages before they disappear. For example, messages like “yay!” take less time to comprehend than more complex messages like “That was amazing, what were you thinking when you wrote that song?” In one embodiment, the message animations at the event location may be overlaid on the video stream to allow audience members to see exactly what the performers are seeing.
  • In one embodiment, when the incoming content is slow, for example from a low-attendance event, the client may show messages from farther back in time. However, one embodiment monitors and limits the length of time that an old message may be used, to prevent displayed messages from seeming out of context due to the latency since the message was originally sent.
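The two display policies above — display time scaling with message length, and a staleness cutoff on old messages — can be sketched as follows. The constants (base time, per-character rate, cap, and maximum age) are illustrative assumptions:

```python
# Sketch of message display timing and staleness limiting. All constants
# are assumptions; the patent describes the behavior, not the numbers.

def display_seconds(text, base=2.0, per_char=0.05, cap=10.0):
    """Longer messages stay on screen longer, up to a cap."""
    return min(cap, base + per_char * len(text))

def usable(message_age_s, max_age_s=120):
    """Reject messages too old to still make sense in the event's context."""
    return message_age_s <= max_age_s

short = display_seconds("yay!")                       # quick to comprehend
long = display_seconds("That was amazing, what were "
                       "you thinking when you wrote that song?")
```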
  • In FIG. 5D, portions of user interface 540 are shown illustrating the solicitation on an interactive client of an event attendee to select and transmit a desired camera angle to the producer. Audience members may have a specialized interest in the performing band and camera angle selection allows the event attendee to choose the position of their virtual seat in the performance hall.
  • Referring now to FIG. 6, a block diagram view of a portion of an interactive client interface 600 of an online interactive event environment is illustrated showing portions of the presentation during an interactive event in accordance with various embodiments of the present disclosure. The interactive client interface 600, in one embodiment, may include a video presentation of the event, an audio presentation of the event, or some combination thereof. The illustrated interactive client interface 600 incorporates each of the previously discussed user interfaces 510, 520, 530, and 540 into the event presentation. In addition, the illustrated embodiment also shows event sponsorship of the event. Accordingly, this sponsorship may be sold in accordance with a variety of advertising mechanisms, including but not limited to per event, per song, per minute, per impression, or some combination thereof. In one embodiment, an event sponsor may present customized logos and marketing material targeted for the audience of the event. One embodiment provides promotional links on the presentation page of the event. When clicked, another window may open without interrupting the stream. Alternatively, a sponsorship link may change the look of the event interface. Other, more subtle methods of promotion also considered within the scope of the disclosure include the use of a watermark and/or background images and/or desktop/window wallpaper of promotional material.
  • Referring now to FIG. 7, a block diagram view of a portion of an interactive client interface 700 of after party presentations associated with an online interactive event environment is illustrated in accordance with various embodiments of the present disclosure. The interface 700 shows various after party presentations including links to websites associated with the presenter, sponsors, upcoming events, topical news, photo and video archives, discussion boards, and other information associated with the online interactive event. In various embodiments, a playback of the event, such as a highlight reel, may also be available at the after party interactive client interface 700.
  • The above specification, examples and data provide a description of the manufacture and use of the composition of the invention. Although specific embodiments have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art and others, that a wide variety of alternate and/or equivalent implementations may be substituted for the specific embodiments shown and described without departing from the spirit and scope of the present disclosure. This application is intended to cover any adaptations or variations of the embodiments discussed herein. Therefore, it is manifested and intended that the disclosure be limited only by the claims and the equivalents thereof.

Claims (3)

1. A method for participation in interactive online events, comprising:
requesting admittance to an event from an event production center;
obtaining a ticket for the event from the event production center;
receiving a stream of the event from the event performance studio;
displaying the stream of the event for user observation;
transmitting user initiated feedback messages of an interactive client directed to at least one of the event production center, the event performance studio, other audience clients of the interactive event, and/or back to the interactive client; and
receiving and displaying a stream of the event from the event performance studio.
2. An interactive event system comprising:
a performance studio having at least one interactive display;
a production center for producing an interactive event based on event material received from the performance studio; and
an interactive client having at least one interactive display and configured to generate and transmit event feedback to at least one of the production center, the performance studio, other audience clients of the interactive event, and/or the interactive client.
3. A method of performing for an interactive event, comprising:
a performance studio having at least one interactive display;
a production center for producing an interactive event based on event material received from the performance studio;
an interactive client having at least one interactive display and configured to generate and transmit event feedback to at least one of the production center, the performance studio, other audience clients of the interactive event, and/or the interactive client;
requesting admittance to an event from an event production center;
obtaining a ticket for the event from the event production center;
receiving a stream of the event from the event performance studio;
displaying the stream of the event for user observation;
transmitting user initiated feedback messages of an interactive client directed to at least one of the event production center, the event performance studio, other audience clients of the interactive event, and/or back to the interactive client; and
receiving and displaying a stream of the event from the event performance studio.
US12/568,666 2008-09-26 2009-09-28 Interactive events Abandoned US20100153861A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US10070108P true 2008-09-26 2008-09-26
US10070408P true 2008-09-26 2008-09-26
US10070308P true 2008-09-26 2008-09-26
US10070608P true 2008-09-26 2008-09-26
US12/568,666 US20100153861A1 (en) 2008-09-26 2009-09-28 Interactive events

Publications (1)

Publication Number Publication Date
US20100153861A1 true US20100153861A1 (en) 2010-06-17


Family Applications (6)

Application Number Title Priority Date Filing Date
US12/586,920 Active US8442424B2 (en) 2008-09-26 2009-09-28 Interactive live political events
US12/586,922 Abandoned US20100088128A1 (en) 2008-09-26 2009-09-28 Ticket scarcity management for interactive events
US12/586,921 Abandoned US20100094686A1 (en) 2008-09-26 2009-09-28 Interactive live events
US12/586,923 Active 2032-02-14 US9548950B2 (en) 2008-09-26 2009-09-28 Switching camera angles during interactive events
US12/568,666 Abandoned US20100153861A1 (en) 2008-09-26 2009-09-28 Interactive events
US13/894,269 Active 2029-11-23 US9160692B2 (en) 2008-09-26 2013-05-14 Interactive live political events

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130005465A1 (en) * 2011-06-29 2013-01-03 EarDish Corporation Audio playlist selections and related entertainment systems and methods
US20130097635A1 (en) * 2011-10-13 2013-04-18 Gface Gmbh Interactive remote participation in live entertainment
EP2621165A1 (en) 2012-01-25 2013-07-31 Alcatel Lucent, S.A. Videoconference method and device
US20150052238A1 (en) * 2013-08-19 2015-02-19 Google Inc. Device Compatibility Management
US20160127288A1 (en) * 2014-11-04 2016-05-05 Calay Venture S.à r.l. System and method for inviting users to participate in activities based on interactive recordings

Families Citing this family (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130218688A1 (en) * 2007-09-26 2013-08-22 Aq Media, Inc. Audio-visual navigation and communication dynamic memory architectures
US20090138554A1 (en) * 2007-11-26 2009-05-28 Giuseppe Longobardi Controlling virtual meetings with a feedback history
US9892028B1 (en) 2008-05-16 2018-02-13 On24, Inc. System and method for debugging of webcasting applications during live events
US8442424B2 (en) * 2008-09-26 2013-05-14 Deep Rock Drive Partners Inc. Interactive live political events
US20140176665A1 (en) * 2008-11-24 2014-06-26 Shindig, Inc. Systems and methods for facilitating multi-user events
WO2010080639A2 (en) * 2008-12-18 2010-07-15 Band Crashers, Llc Media systems and methods for providing synchronized multiple streaming camera signals of an event
US8655383B2 (en) * 2009-06-15 2014-02-18 Alpine Electronics, Inc Content delivery system and method
US8706812B2 (en) 2010-04-07 2014-04-22 On24, Inc. Communication console with component aggregation
US8667533B2 (en) * 2010-04-22 2014-03-04 Microsoft Corporation Customizing streaming content presentation
US20110276882A1 (en) 2010-05-04 2011-11-10 Kai Buehler Automatic grouping for users experiencing a specific broadcast media
US8606073B2 (en) 2010-05-12 2013-12-10 Woodman Labs, Inc. Broadcast management system
US20110289539A1 (en) * 2010-05-19 2011-11-24 Kim Sarubbi Multimedia content production and distribution platform
US20120004950A1 (en) * 2010-07-01 2012-01-05 Effective Measure System and method for integrated offline audience validation
US20120116789A1 (en) * 2010-11-09 2012-05-10 International Business Machines Corporation Optimizing queue loading through variable admittance fees
US9009194B2 (en) * 2010-12-01 2015-04-14 Democrasoft, Inc. Real time and dynamic voting
US20120191774A1 (en) * 2011-01-25 2012-07-26 Vivek Bhaskaran Virtual dial testing and live polling
US9246957B2 (en) * 2011-03-04 2016-01-26 Viafoura Systems and methods for interactive content generation
AU2012225536B9 (en) 2011-03-07 2014-01-09 Kba2, Inc. Systems and methods for analytic data gathering from image providers at an event or geographic location
US8845429B2 (en) * 2011-05-27 2014-09-30 Microsoft Corporation Interaction hint for interactive video presentations
US8768139B2 (en) 2011-06-27 2014-07-01 First Principles, Inc. System for videotaping and recording a musical group
US8732739B2 (en) 2011-07-18 2014-05-20 Viggle Inc. System and method for tracking and rewarding media and entertainment usage including substantially real time rewards
US9870552B2 (en) * 2011-10-19 2018-01-16 Excalibur Ip, Llc Dynamically updating emoticon pool based on user targeting
US20130144716A1 (en) * 2011-12-06 2013-06-06 Sony Network Entertainment International Llc Advertising opportunities for live streaming contents and services
EP2793344A4 (en) * 2011-12-14 2015-07-01 Kyocera Corp Display terminal, power control system, and display method
US20130304575A1 (en) * 2012-05-09 2013-11-14 Michael Fetyko Systems, methods and machine-readable media for providing and managing an interface between fans and celebrities within a social network environment
US20140156752A1 (en) * 2012-05-11 2014-06-05 iConnectUS LLC Software applications for interacting with live events and methods of use thereof and creating custom content streams
EP2670156A1 (en) 2012-06-01 2013-12-04 Thomson Licensing Interactive audio/video broadcast system, method for operating the same and user device for operation in the interactive audio/video broadcast system
US20150127734A1 (en) * 2012-06-22 2015-05-07 Sony Corporation Information processing device, information processing method and terminal device
WO2014044898A1 (en) * 2012-09-18 2014-03-27 Nokia Corporation Apparatus, method and computer program product for providing access to a content
US20140089401A1 (en) * 2012-09-24 2014-03-27 Google Inc. System and method for camera photo analytics
US20140094153A1 (en) * 2012-10-02 2014-04-03 Alpine Audio Now Digital, LLC System and method of interacting with a broadcaster via an application
US8554873B1 (en) * 2012-10-05 2013-10-08 Google Inc. Custom event and attraction suggestions
US8606872B1 (en) * 2012-10-22 2013-12-10 HotSpots U, Inc. Method and apparatus for organizing, packaging, and sharing social content and social affiliations
WO2014120951A1 (en) * 2013-01-30 2014-08-07 FREILICHER, David, Joel Methods and systems for providing online events
US9264474B2 (en) 2013-05-07 2016-02-16 KBA2 Inc. System and method of portraying the shifting level of interest in an object or location
US9953085B1 (en) 2013-05-31 2018-04-24 Google Llc Feed upload for search entity based content selection
US20150113403A1 (en) * 2013-10-20 2015-04-23 Eric A. Harvey Simultaneously presenting media on devices
CN104596929B (en) * 2013-10-31 2017-06-23 国际商业机器公司 Method and apparatus for determining air quality
US20150170061A1 (en) * 2013-12-17 2015-06-18 Scott J. Glosserman Web ticketing event system
US9900362B2 (en) 2014-02-11 2018-02-20 Kiswe Mobile Inc. Methods and apparatus for reducing latency shift in switching between distinct content streams
US20150350733A1 (en) * 2014-06-02 2015-12-03 Grid News Bureau, LLC Systems and methods for opinion sharing related to live events
US9832500B2 (en) 2014-07-05 2017-11-28 TiltedGlobe LLC System for enabling a virtual theater
CA2998482A1 (en) 2014-09-12 2016-03-17 Kiswe Mobile Inc. Methods and apparatus for content interaction
US10116714B2 (en) 2015-06-15 2018-10-30 At&T Intellectual Property I, L.P. Apparatus and method for on-demand multi-device social network experience sharing
US20160379441A1 (en) * 2015-06-23 2016-12-29 Mark Oley Application for enhancing a sport viewing experience on an electronic device
US9817557B2 (en) * 2015-07-22 2017-11-14 Enthrall Sports LLC Interactive audience communication for events
US10176486B2 (en) 2015-11-23 2019-01-08 Tesla Laboratories, LLC System and method for using a mobile device as an input device for surveys at a live event
US10178710B2 (en) 2015-11-23 2019-01-08 Fevr Tech Llc System and method for using a mobile device as an input device for surveys at a live event
US9959689B2 (en) 2015-11-23 2018-05-01 Tesla Laboratories Llc System and method for creation of unique identification for use in gathering survey data from a mobile device at a live event
US9843768B1 (en) * 2016-09-23 2017-12-12 Intel Corporation Audience engagement feedback systems and techniques

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6519771B1 (en) * 1999-12-14 2003-02-11 Steven Ericsson Zenith System for interactive chat without a keyboard
US7680912B1 (en) * 2000-05-18 2010-03-16 thePlatform, Inc. System and method for managing and provisioning streamed data
US20100094686A1 (en) * 2008-09-26 2010-04-15 Deep Rock Drive Partners Inc. Interactive live events
US20110090347A1 (en) * 2008-12-18 2011-04-21 Band Crashers LLC. Media Systems and Methods for Providing Synchronized Multiple Streaming Camera Signals of an Event

Family Cites Families (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5226177A (en) * 1990-03-27 1993-07-06 Viewfacts, Inc. Real-time wireless audience response system
US20040261127A1 (en) * 1991-11-25 2004-12-23 Actv, Inc. Digital interactive system for providing full interactivity with programming events
EP0823179B1 (en) * 1995-04-24 2004-08-11 United Video Properties, Inc. Electronic television program guide schedule system and method with remote product ordering
WO1998011494A1 (en) * 1996-09-16 1998-03-19 Advanced Research Solutions, Llc Data correlation and analysis tool
US6317881B1 (en) * 1998-11-04 2001-11-13 Intel Corporation Method and apparatus for collecting and providing viewer feedback to a broadcast
US20020120925A1 (en) * 2000-03-28 2002-08-29 Logan James D. Audio and video program recording, editing and playback systems using metadata
AU2918601A (en) * 1999-11-05 2001-05-14 Donald A. Glaser A method and system for audience participation and selective viewing of various aspects of a theatrical performance, whether opera, symphonic, drama or dance or combinations and variations thereof
AU6472301A (en) * 2000-05-18 2001-11-26 Imove Inc Multiple camera video system which displays selected images
US7782363B2 (en) * 2000-06-27 2010-08-24 Front Row Technologies, Llc Providing multiple video perspectives of activities through a data network to a remote multimedia server for selective display by remote viewing audiences
CN100397899C (en) * 2000-10-11 2008-06-25 联合视频制品公司 System and methods for providing storage of data on servers in on-demand media delivery system
US7796162B2 (en) * 2000-10-26 2010-09-14 Front Row Technologies, Llc Providing multiple synchronized camera views for broadcast from a live venue activity to remote viewers
US7149549B1 (en) * 2000-10-26 2006-12-12 Ortiz Luis M Providing multiple perspectives for a venue activity through an electronic hand held device
US20040064838A1 (en) * 2002-01-08 2004-04-01 Lykke Olesen Method and device for viewing a live performance
US20020108125A1 (en) * 2001-02-07 2002-08-08 Joao Raymond Anthony Apparatus and method for facilitating viewer or listener interaction
US20020115453A1 (en) * 2001-02-16 2002-08-22 Poulin Ronald Leon Method and system for location based wireless communication services
US20030208613A1 (en) * 2002-05-02 2003-11-06 Envivio.Com, Inc. Managing user interaction for live multimedia broadcast
US7603321B2 (en) * 2002-05-22 2009-10-13 Gurvey Amy R Electronic system and method coupling live event ticketing and interactive entries with the sale, distribution and transmission of event recordings, mastering system and intelligent terminal designs
US20050024488A1 (en) * 2002-12-20 2005-02-03 Borg Andrew S. Distributed immersive entertainment system
US20070070210A1 (en) * 2003-04-11 2007-03-29 Piccionelli Gregory A Video production with selectable camera angles
US7428000B2 (en) * 2003-06-26 2008-09-23 Microsoft Corp. System and method for distributed meetings
TWI256257B (en) * 2004-03-17 2006-06-01 Era Digital Media Company Ltd Real-time interactive video management system
US20060104600A1 (en) * 2004-11-12 2006-05-18 Sfx Entertainment, Inc. Live concert/event video system and method
US7478334B2 (en) * 2005-01-20 2009-01-13 International Business Machines Corporation Folding text in side conversations
KR20070105348A (en) * 2005-02-28 2007-10-30 인라이브 인터랙티브 리미티드 Method and apparatus for conducting real time dialogues with mass viewer audiences during live programs
MX2007011675A (en) * 2005-03-22 2008-11-04 Ticketmaster Apparatus and methods for providing queue messaging over a network.
US20070028272A1 (en) * 2005-08-01 2007-02-01 Airplay Network, Inc. Live television show utilizing real-time input from a viewing audience
US20070233785A1 (en) * 2006-03-30 2007-10-04 International Business Machines Corporation Communicating using collaboration spaces
US8019815B2 (en) * 2006-04-24 2011-09-13 Keener Jr Ellis Barlow Interactive audio/video method on the internet
US20080046910A1 (en) * 2006-07-31 2008-02-21 Motorola, Inc. Method and system for affecting performances
US8850464B2 (en) * 2006-10-09 2014-09-30 Verizon Patent And Licensing Inc. Systems and methods for real-time interactive television polling
US20080098417A1 (en) * 2006-10-19 2008-04-24 Mehdi Hatamian Viewer participatory television shows in conjunction with a system and method for real-time data collection and statistical assessment
US20080271082A1 (en) * 2007-04-27 2008-10-30 Rebecca Carter User controlled multimedia television broadcast on single channel
US20090052645A1 (en) * 2007-08-22 2009-02-26 Ravi Prakash Bansal Teleconference system with participant feedback
US9131016B2 (en) * 2007-09-11 2015-09-08 Alan Jay Glueckman Method and apparatus for virtual auditorium usable for a conference call or remote live presentation with audience response thereto
US9060094B2 (en) * 2007-09-30 2015-06-16 Optical Fusion, Inc. Individual adjustment of audio and video properties in network conferencing
US9584564B2 (en) * 2007-12-21 2017-02-28 Brighttalk Ltd. Systems and methods for integrating live audio communication in a live web event
US9661275B2 (en) * 2008-06-13 2017-05-23 Scott Gordon Dynamic multi-perspective interactive event visualization system and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Timmy Curran Launches Interactive Website on Modlife.com to Connect with Music and Surfing Fans, August 6, 2008, retrieved via Internet at http://www.surfline.com/surf-news/press-release/timmy-curran-launches-interactive-webs *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130005465A1 (en) * 2011-06-29 2013-01-03 EarDish Corporation Audio playlist selections and related entertainment systems and methods
US20130097635A1 (en) * 2011-10-13 2013-04-18 Gface Gmbh Interactive remote participation in live entertainment
US9066144B2 (en) * 2011-10-13 2015-06-23 Crytek Gmbh Interactive remote participation in live entertainment
EP2621165A1 (en) 2012-01-25 2013-07-31 Alcatel Lucent, S.A. Videoconference method and device
US20150052238A1 (en) * 2013-08-19 2015-02-19 Google Inc. Device Compatibility Management
US20160127288A1 (en) * 2014-11-04 2016-05-05 Calay Venture S.à r.l. System and method for inviting users to participate in activities based on interactive recordings

Also Published As

Publication number Publication date
US20140237381A1 (en) 2014-08-21
US20100088128A1 (en) 2010-04-08
US20120123811A1 (en) 2012-05-17
US9548950B2 (en) 2017-01-17
US8442424B2 (en) 2013-05-14
US20100094686A1 (en) 2010-04-15
US9160692B2 (en) 2015-10-13
US20100088159A1 (en) 2010-04-08

Similar Documents

Publication Publication Date Title
US9367862B2 (en) Asynchronous advertising placement based on metadata
US7937740B2 (en) Method and apparatus for interactive programming using captioning
US8769589B2 (en) System and method to create a media content summary based on viewer annotations
Gillespie The politics of ‘platforms’
US8151194B1 (en) Visual presentation of video usage statistics
Chan-Olmsted et al. From on-air to online world: Examining the content and structures of broadcast TV stations' web sites
JP5301425B2 (en) System and method for organizing a group content presentation and group communication within the group content presentation
US9491525B2 (en) Interactive media display across devices
US20070094083A1 (en) Matching ads to content and users for time and space shifted media network
US7133837B1 (en) Method and apparatus for providing communication transmissions
US20070118425A1 (en) User device agent for asynchronous advertising in time and space shifted media network
US8700641B2 (en) Detecting repeating content in broadcast media
US20080083003A1 (en) System for providing promotional content as part of secondary content associated with a primary broadcast
KR101108866B1 (en) User programmed media delivery service
CN101529909B (en) Method and device for assigning advertisements and/or content to multimedia devices
US20070094081A1 (en) Resolution of rules for association of advertising and content in a time and space shifted media network
US20080082922A1 (en) System for providing secondary content based on primary broadcast
US20060294571A1 (en) Collaborative video via distributed storage and blogging
Thorson et al. YouTube, Twitter and the Occupy movement: Connecting content and circulation practices
AU2008341052B2 (en) Social broadcasting
US8543622B2 (en) Method and system for meta-tagging media content and distribution
US8990328B1 (en) Facilitating media streaming with social interaction
US20090063983A1 (en) System and method for representing content, user presence and interaction within virtual world advertising environments
US8019815B2 (en) Interactive audio/video method on the internet
US8799005B2 (en) Systems and methods for capturing event feedback