WO2023168086A1 - Systems and methods for producing content - Google Patents


Info

Publication number: WO2023168086A1
Authority: WO (WIPO/PCT)
Prior art keywords: content clip, content, processor, clip, event
Application number: PCT/US2023/014510
Other languages: English (en)
Inventor: Kevin MCREYNOLDS
Original assignee: Mcreynolds Kevin
Application filed by Mcreynolds Kevin
Publication of WO2023168086A1 (published in French)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G06V20/42 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items of sport video content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/46 Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • G06V20/47 Detecting features for summarising video content
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21 Server components or server architectures
    • H04N21/218 Source of audio or video content, e.g. local disk arrays
    • H04N21/2187 Live feed
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/4147 PVR [Personal Video Recorder]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/854 Content authoring

Definitions

  • the present disclosure generally relates to capturing and presenting content to viewers, and to systems and methods to facilitate the same.
  • Events such as sporting events (e.g., baseball games, basketball games, football games, soccer games, hockey games, tennis matches, and/or the like), may be viewed on a device (e.g., a television, computer screen, smart phone, tablet, etc.).
  • the viewing of such events is traditionally continuous, during which the entire event is presented to the viewer for the duration that the event takes place (with commercial or advertisement breaks periodically throughout).
  • viewing can be time-consuming and unengaging, for example, during timeouts, huddles in football, pitching changes in baseball, or other periods during which progress of the actual event is paused, or during which nothing of consequence occurs.
  • Disclosed herein is a system for creating and presenting entertainment content.
  • the system can be configured to perform operations including receiving, by a processor, a first recording of a first event from a first view, wherein the first recording can be recorded on a first user device physically present at the first event; editing, by the processor, the first recording to generate a first content clip; receiving, by the processor, a second recording of the first event from a second view, wherein the second recording can be recorded on a second user device physically present at the first event; editing, by the processor, the second recording to generate a second content clip; transmitting, by the processor, at least one of the first content clip or the second content clip for presentation on a viewer device; and/or presenting, by the processor, at least one of the first content clip or the second content clip on the viewer device.
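The claimed flow (receive recordings from devices physically present at an event, edit each into a content clip, then transmit for presentation) can be sketched as follows. This is purely an illustrative sketch: the names `Recording`, `edit_clip`, and `produce_clips` are hypothetical and do not appear in the disclosure, and the editing step is stubbed out.

```python
from dataclasses import dataclass

@dataclass
class Recording:
    event_id: str
    view: str        # e.g., "baseline" or "half-court"
    frames: list     # stand-in for captured image/video/audio data

def edit_clip(recording: Recording) -> dict:
    """Edit a raw recording into a content clip (stubbed: a real system
    would trim, reformat, or otherwise process the recording)."""
    return {"event_id": recording.event_id,
            "view": recording.view,
            "frames": recording.frames}

def produce_clips(recordings: list) -> list:
    """Receive one or more recordings of the same event and edit each
    into a content clip ready for transmission to a viewer device."""
    return [edit_clip(r) for r in recordings]

# Two recordings of the same event from two views, as in the claim.
first = Recording("game-1", "baseline", ["f1", "f2"])
second = Recording("game-1", "half-court", ["f3"])
clips = produce_clips([first, second])
```

Either clip (or both) could then be transmitted to a viewer device, per the claim language.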
  • the first event can be a sports game or a play at a sports game.
  • the operations can further comprise selecting, by the processor, a content clip between the first content clip and the second content clip based on a criterion, and wherein transmitting at least one of the first content clip or the second content clip for presentation can comprise transmitting, by the processor, the selected content clip.
  • the operations can further comprise receiving, by the processor, a selection from the viewer device indicating a selected content clip between the first content clip and the second content clip, and wherein transmitting at least one of the first content clip or the second content clip for presentation can comprise transmitting, by the processor, the selected content clip.
  • transmitting at least one of the first content clip or the second content clip for presentation comprises transmitting the first content clip and the second content clip.
  • the operations can further comprise presenting, by the processor, the first content clip and the second content clip on a display screen on the viewer device; receiving, by the processor, a selection from the viewer device indicating a selected content clip between the first content clip and the second content clip; enlarging, by the processor, the selected content clip on the display screen relative to an unselected content clip between the first content clip and the second content clip; and/or continuing to present, by the processor, the selected content clip at least one of during or after the enlarging the selected content clip.
  • the operations can further comprise continuing to present, by the processor, the unselected content clip on the display screen during and/or after the enlarging the selected content clip. In various examples, the operations can further comprise receiving, by the processor, a subsequent selection of another content clip; and/or presenting, by the processor, the other content clip after presenting the at least one of the first content clip or the second content clip.
  • a duration of the first content clip can be substantially equal to a duration of the first recording recorded on the first user device, and/or a duration of the second content clip can be substantially equal to a duration of the second recording recorded on the second user device.
  • FIG. 1 illustrates a block diagram of an exemplary content production system, in accordance with various examples.
  • FIG. 2 illustrates a top view of a sports court from which content can be captured by a content production system, in accordance with various examples.
  • FIG. 3 illustrates an exemplary display screen on a user device depicting a GUI provided by the content production system, in accordance with various examples.
  • FIG. 4 illustrates a flowchart depicting an exemplary method for producing and presenting content by utilizing a content production system, in accordance with various examples.
  • System 100 can allow the capturing of content for an event(s) by image, video, and/or audio, and transmit such captured content to an end user for viewing on an end user device.
  • the event can be captured using system 100 through the recording of a series of recordings or content clips.
  • Each content clip or recording can capture a portion of the overarching event (e.g., a play in a sports game or match).
  • a content clip or recording can include any suitable content, such as an image(s), text, video, and/or audio.
  • the series of content clips can capture various plays within a sports game, and a viewer can view the content clips of plays on an end user device.
  • System 100 can allow a viewer to view only desired portions of an event or multiple events (e.g., game(s)), for example, cutting out or avoiding timeouts, stoppage, inconsequential sequences, and/or the like.
  • system 100 can allow a player to review content clips of event portions (e.g., plays), for example, by allowing control of the playing of such content clips (e.g., allowing the viewer to slow down, pause, rewind, fast forward, and/or the like, for example, to review a play in-depth).
  • system 100 can comprise an application server 130 and various user devices (e.g., operator device 160, recorder device 140, and/or viewer device 150).
  • any or all of the components of system 100 can be integrated, and/or in electronic communication, with one another via one or more application programming interfaces (APIs).
  • System 100 and/or any of the components comprised therein can be computer-based, and can comprise a processor (e.g., processor 190), a tangible non-transitory computer-readable memory, and/or a network interface, along with other suitable system software and hardware components. Instructions stored on the tangible non-transitory memory may allow system 100, or the components therein, to perform various functions, as described herein.
  • one or more processors can control, perform, and/or facilitate the functions of all components of system 100, receive inputs from users of the user devices through input devices (e.g., physical or electronic buttons, voice commands, and/or the like), execute actions or commands in response to the inputs, and/or present/play/display content on a display screen of a user device (or cause data and information to be presented, played, and/or displayed), such as video.
  • application server 130 can incorporate hardware and/or software components. Instructions stored on the tangible non-transitory memory can allow system 100 to perform various functions, as described herein.
  • the application server 130 can be configured as a central network element or hub to access various systems, engines, and components of system 100.
  • the application server 130 can comprise a network, a computer-based system, and/or software components configured to provide an access point to various systems, engines, and components of system 100.
  • the application server 130 can be in operative and/or electronic communication with user devices (e.g., recorder device 140, viewer device 150, and/or operator device 160) via a network.
  • the application server 130 can allow communication from the user devices to systems, engines, and components of system 100 (such as, for example, those within application server 130).
  • the application server 130 can receive commands and/or metadata from the user devices and can pass replies to, or execute actions on, the user devices.
  • application server 130 can include one or more computing devices described above, rack mounted servers, and/or virtual machines providing load balancing, application services, web services, data query services, data transfer services, reverse proxy services, or otherwise facilitating the delivery and receipt of data across networks.
  • application server 130 can comprise a server appliance running a suitable server operating system (e.g., MICROSOFT INTERNET INFORMATION SERVICES, or “IIS”) and having database software (e.g., ORACLE) installed thereon.
  • Application server 130 can be in electronic communication with processor 190 and/or various user devices (e.g., recorder device 140, viewer device 150, and/or operator device 160).
  • application server 130 can comprise content database 132, content processing system 134, and/or content capturing system 136.
  • application server 130 can be configured to receive recordings and/or content clips recorded on a recorder device (e.g., recorder device 140), edit or reformat recordings (e.g., to produce content clips), store content clips, select between content clips, and/or transmit content clips for viewing by a user (e.g., of viewer device 150).
  • content database 132 can be configured to store content clips and recordings that are recorded by content capturing system 136, and/or content edited and/or created from the recorded content clips (e.g., by content processing system 134).
  • application server 130 can store the content clip, as discussed, and/or can transmit an access link to GUI 156 presented to the viewer via viewer device 150 that is configured to allow the viewer to view the stored content clip.
  • The access link (e.g., a thumbnail, image, content window displaying content, text, etc.) can comprise a link to the desired content. In response to processor 190 receiving the viewer’s selection, application server 130 can retrieve the stored content and present it (or transmit it to viewer device 150 to be presented) to the user on GUI 156.
  • content capturing system 136 can incorporate hardware and/or software components.
  • content capturing system 136 can comprise a server appliance running a suitable server operating system (e.g., MICROSOFT INTERNET INFORMATION SERVICES, or “IIS”) and having database software (e.g., ORACLE) installed thereon.
  • Content capturing system 136 can comprise content API 138, through which content capturing system 136 can integrate and electronically communicate with one or more user devices (e.g., with recorder device 140 and/or camera system 144 thereon).
  • content capturing system 136 can be configured to respond to commands from processor 190 (which can be in response to inputs received from recorder device 140 (or through a GUI on display screen 142) from a user), and record recordings of content (e.g., image, audio, video, and/or the like) using camera 144 of recorder device 140.
  • the recording can be transmitted to be stored in content database 132.
  • Processor 190 can instruct that the recording be marked with one or more tags, markers, identifiers, and/or metadata to indicate a characteristic of the recording (e.g., to indicate an event type, play type, teams or persons depicted, text or other graphics to be added to the video, and/or the like).
  • Application server 130 and/or content capturing system 136 can mark the recordings accordingly. In various examples, such marking can take place via content processing system 134.
  • content processing system 134 can be configured to edit recordings recorded by content capturing system 136 and/or received by application server 130.
  • Processor 190 can be configured to receive a recording and take various actions. For example, processor 190 can analyze the recording and the associated data to determine one or more characteristics of the recording, such as the event type, play type, viewing angle, length, the specific point during the event (e.g., during a specific quarter or period of the game, or a specific time based on comparison with a game clock), teams or persons depicted, and/or the like. Processor 190 can mark the recording (or resulting content clip) with a marker, identifier, metadata, or the like to indicate the determined characteristic.
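The marking step described above (analyze a recording, determine characteristics, attach them as metadata) can be sketched like this. The sketch is hypothetical: the `detect_*` callables stand in for the image/audio analysis the disclosure leaves unspecified, and the tag names are illustrative only.

```python
def mark_recording(recording: dict, detectors: dict) -> dict:
    """Run each detector over the recording and attach any detected
    characteristics as metadata tags, per the marking step described
    in the disclosure."""
    tags = {}
    for name, detect in detectors.items():
        value = detect(recording)
        if value is not None:   # only record characteristics actually found
            tags[name] = value
    return {**recording, "tags": tags}

# Hypothetical detectors mirroring the examples in the text: equipment
# identifying the sport, and movement identifying the play type.
detectors = {
    "sport": lambda r: "football" if "field goal post" in r["objects"] else None,
    "play_type": lambda r: r.get("movement"),  # e.g., "pass", "pitch", "block"
}

clip = mark_recording({"objects": ["field goal post"], "movement": "pass"},
                      detectors)
# clip["tags"] == {"sport": "football", "play_type": "pass"}
```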
  • processor 190 can detect/identify a certain ball or sport equipment, and can mark the recording or content clip accordingly (e.g., identifying the presence of a football or field goal post, thus marking the recording or content clip with a football marker).
  • processor 190 can identify the sex of players in the recording or content clip, and mark the recording or content clip accordingly (e.g., for a men’s or women’s sporting event).
  • processor 190 can identify some movement type in the recording or content clip (e.g., identifying a baseball pitch or football pass being thrown, or a basketball shot or block occurring), and mark the recording or content clip accordingly.
  • the processor can identify a jersey type and/or colors (thus determining that the content clip depicts a certain sports team), and mark the recording or content clip accordingly.
  • the characteristics of a recording or content clip received by application server 130 can be received by processor 190 via an input from a user (e.g., from operator device 160 or recorder device 140), and processor 190 can mark the respective recording or content clip(s) accordingly.
  • recordings and content clips can be grouped according to identifiers and offered to viewers with text or images, video, or other identifiers identifying such characteristic.
  • a viewer can be able to choose to view content clips having certain characteristics, such as those for a certain team, player, game, play type, and/or the like.
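The grouping described above, where tagged clips are bucketed by a characteristic so a viewer can browse a certain team, player, or play type, can be sketched as a simple index over clip metadata. Field names here are assumptions for illustration, not from the disclosure.

```python
from collections import defaultdict

def group_clips(clips: list, key: str) -> dict:
    """Group content clips by the value of one metadata tag (e.g., team),
    so each group can be offered to viewers under that identifier."""
    groups = defaultdict(list)
    for clip in clips:
        value = clip.get("tags", {}).get(key)
        if value is not None:   # untagged clips are simply not grouped
            groups[value].append(clip)
    return dict(groups)

clips = [
    {"id": 1, "tags": {"team": "Eagles", "play_type": "pass"}},
    {"id": 2, "tags": {"team": "Hawks", "play_type": "shot"}},
    {"id": 3, "tags": {"team": "Eagles", "play_type": "run"}},
]
by_team = group_clips(clips, "team")
# by_team["Eagles"] contains clips 1 and 3
```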
  • content processing system 134 can also edit recordings or content clips.
  • content processing system 134 (or the processor associated therewith or therein) can format a recording or content clip into a desired format, change the speed of playing video, change the volume of audio, adjust audio (e.g., remove or lessen background noise such as cheering, emphasize a certain player’s speech to make it more audible, and/or the like), adjust resolution or lighting, resize the content clip, add a filter, or take any other action to edit the content clip.
  • a team’s logo or other identifying image or text can be added to a recording or content clip (e.g., a video) depicting a team or a player therefrom. Therefore, when a viewer views the edited content clip, the content clip will reflect the implemented changes from content processing system 134.
  • Such changes can be in response to the processor receiving a command to make such a change (e.g., from a user of operator device 160).
  • such changes can be in response to the processor detecting a characteristic of a received recording or content clip, marking the recording or content clip with the appropriate identifier or metadata, and making edits to the recording or clip associated with such identifier.
  • a recording or content clip can be marked with an identifier for a certain sports team, so the processor can automatically add text or an image associated with the team to the recording or content clip.
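The rule-driven editing described above (an edit is applied automatically whenever a clip carries a given identifier, such as adding a team’s image) can be sketched as follows. The rule table, identifier strings, and the "overlays" field are all hypothetical stand-ins for real video compositing.

```python
# Edits registered against identifiers; applied automatically when a clip
# is marked with that identifier. (Illustrative rule: add a team overlay.)
EDIT_RULES = {
    "team:Eagles": lambda clip: {**clip,
                                 "overlays": clip.get("overlays", [])
                                 + ["eagles_logo.png"]},
}

def auto_edit(clip: dict) -> dict:
    """Apply every edit rule whose identifier appears on the clip."""
    for identifier in clip.get("identifiers", []):
        rule = EDIT_RULES.get(identifier)
        if rule:
            clip = rule(clip)
    return clip

edited = auto_edit({"identifiers": ["team:Eagles"]})
# edited["overlays"] == ["eagles_logo.png"]
```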
  • content processing system 134 can be configured to control the displaying, playing, or otherwise presenting content clips or other content to viewers on their respective user devices. For example, in response to receiving a command, the processor can present or play a content clip on a viewer device 150. The processor can pause, fast forward, rewind, slow, or otherwise control the playing of the content clip in accordance with commands received from viewer device 150.
  • a viewer may want to review a play during a sports game, and therefore send a command to the processor to pause the content clip (e.g., by pressing an electronic button on a touch screen), or advance or rewind the content clip (e.g., by pressing another button, or by holding the viewer’s finger on the screen and moving it back and forth, or up and down, to command the processor to advance the content clip forward or backward, or pause in response to the finger ceasing to move on the touchpad or screen).
  • Such processing by content processing system 134 can occur at any suitable time, such as in response to the recordings or content clips being received by application server 130, being stored (and/or during storage) in content database 132, in response to the processor instructing content processing system 134 to process recordings or content clips, and/or the like.
  • components of application server 130 and/or a processor can be comprised in any component, or all components of system 100, for example, in any of recorder device 140, viewer device 150, operator device 160, and/or separate therefrom.
  • a user device can incorporate hardware and/or software components.
  • a user device can comprise a server appliance running a suitable server operating system (e.g., MICROSOFT INTERNET INFORMATION SERVICES, or “IIS”).
  • a user device can be any device that allows a user to communicate with a network (e.g., a personal computer, personal digital assistant (e.g., IPHONE®, BLACKBERRY®, GALAXY®), tablet, cellular phone, kiosk, and/or the like).
  • a user device can be in electronic communication with application server 130 and/or any modules or systems therein.
  • a user device can comprise an input device (e.g., a physical or digital button).
  • a user device can allow the user of system 100 to interact with the other components of system 100.
  • a user device can comprise a display screen, which can display a GUI provided by application server 130. The display screen on the respective user device can allow the user to select input device(s) to communicate to system 100 a desired action by system 100.
  • a user device includes any device (e.g., personal computer, mobile device, etc.) which communicates via any network, such as those discussed herein.
  • a user device can comprise and/or run a browser, such as MICROSOFT® INTERNET EXPLORER®, MOZILLA® FIREFOX®, GOOGLE® CHROME®, APPLE® Safari, or any other of the myriad software packages available for browsing the internet.
  • the browser can communicate with a server via network by using Internet browsing software installed in the browser.
  • the browser can comprise Internet browsing software installed within a computing unit or a system to conduct online transactions and/or communications.
  • These computing units or systems can take the form of a computer or set of computers, although other types of computing units or systems can be used, including laptops, notebooks, tablets, handheld computers, personal digital assistants, set-top boxes, workstations, computer-servers, mainframe computers, mini-computers, PC servers, pervasive computers, network sets of computers, personal computers, such as IPADS®, IMACS®, and MACBOOKS®, kiosks, terminals, point of sale (POS) devices and/or terminals, televisions, or any other device capable of receiving data over a network.
  • a browser can be configured to display an electronic channel.
  • system 100 can comprise one or more user devices (e.g., a recorder device 140, a viewer device 150, and/or an operator device 160).
  • System 100 can comprise multiple recorder devices 140, viewer devices 150, and/or operator devices 160.
  • a user device can comprise software and/or hardware in communication with the system 100 via a network comprising hardware and/or software configured to allow a user, and/or the like, access to application server 130 and/or content stored thereon or provided therefrom.
  • the user device can comprise any suitable device that is configured to allow a user to communicate with a network and the system 100.
  • the user device can allow a user to transmit commands and requests to the system 100.
  • a user device e.g., as part of system 100 described herein can run a web application or native application to communicate with application server 130.
  • a native application can be installed on the user device via download, physical media, or an app store, for example.
  • the native application can utilize the development code base provided for use with the operating system and capable of performing system calls to manipulate the stored and displayed data on the user device and communicates with application server 130.
  • a web application can be web browser compatible and written specifically to run on a web browser. The web application can thus be a browser-based application that operates in conjunction with application server 130.
  • the native application running on the user device can be in communication with the application server 130 to support real-time updates.
  • data and/or content pertaining to events happening in real time or near real time can be received by application server 130, and system 100 can synchronize across the various user devices used by any number of users interacting with the application server 130 and/or system 100.
  • the application server 130 can serve data from system 100 to any or all of the user devices and can serve commands from the user devices to the application server 130.
  • application server 130 can apply access permissions to restrict the data transmitted between the various components of system 100.
  • Users can be authenticated on the native or web application, for example, via a username, password, dual factor authentication, private cryptographic key, one-time password, security question, biometrics, or other suitable authentication techniques know to those skilled in the art.
  • system 100 can allow the sharing on social media of content clips (e.g., Facebook®, Twitter®, Snapchat®, TikTok®, and/or the like).
  • system 100 can comprise one or multiple recorder devices 140.
  • a recorder device 140 can be any device with the ability to record image, video, and/or audio (e.g., a personal computing device, a smartphone, tablet, camera, and/or the like).
  • Each recorder device 140 can be operated by a recording user, who will capture content (e.g., image, video, and/or audio content) of an event.
  • recorder device 140 can comprise a camera system 144, which can comprise a camera and/or microphone, which can be configured to digitally capture an image(s) and/or audio from the event.
  • An “image,” as described herein, may include static images as well as video.
  • Recorder device 140 can also include a display screen 142 which can display the images being captured by camera system 144, whether or not such images are being recorded (i.e., display screen 142 can display things at which the camera is pointing). Display screen 142 can also be configured to display captured and/or recorded content, for example, for the user of recorder device 140 to review such content.
  • a content clip or other content recorded or captured by recorder device 140 can be transmitted to application server 130 for processing. Transmission of a recording or content clip can occur automatically upon completion of the recording or capture, and/or upon the user of recorder device 140 selecting a button or other input device to transmit the same (the recorder device can send the content to application server 130, and/or application server 130 can retrieve the content clip from recorder device 140).
  • system 100 can utilize multiple recorder devices 240 (similar to recorder devices 140 in FIG. 1) to capture recordings of an event from multiple angles or perspectives.
  • multiple recorder devices can be positioned around a playing field or court.
  • For example, at a basketball game, recorder devices 240 (and users thereof) can be positioned at various locations around the basketball court 290.
  • one recorder device 240 can be positioned at a baseline 292 or under the basket or at half court 294.
  • recorder devices can be positioned at different heights relative to a playing surface to capture recordings from different perspectives (e.g., on the ground or floor level versus elevated).
  • a portion of an event can be recorded by multiple recorder devices 240, thus providing multiple perspectives of an event.
  • Recorder devices 240 can be user devices of spectators or other attendees of an event.
  • users of recorder devices 240 can be spectators or attendees (e.g., fans, volunteers, students on the sideline of a school game, and/or the like) taking recordings via a personal user device such as their personal smartphone or the like.
  • the multiple content clips can be transmitted to application server 130 for processing.
  • application server 130 can select between multiple recordings or content clips of the same event portion or play to send the best or desired view or angle to the viewer. Such a selection can be received in response to a user of system 100 (e.g., the operator of operator device 160) selecting the desired recording and/or content clip.
  • content processing system 134 can automatically select between recordings and/or content clips of the same event portion or play based on certain criteria. For example, closer proximity to a play, or a certain perspective for a certain play type (e.g., an elevated view may be more desirable than a ground view for a pass play in football), or a certain view that has better visibility of what is happening, can cause the processor to select one recording and/or content clip over another.
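The automatic selection just described can be sketched as a scoring heuristic over candidate clips of the same play. The disclosure names the criteria (proximity, perspective, visibility) but no formula, so the weights and field names below are assumptions for illustration only.

```python
def score_clip(clip: dict, play_type: str) -> float:
    """Score one candidate clip of a play; higher is better.
    Weights are illustrative assumptions, not from the disclosure."""
    score = 0.0
    score += 1.0 / max(clip["distance_m"], 1.0)  # closer proximity is better
    score += clip["visibility"]                  # 0.0-1.0 visibility estimate
    if play_type == "pass" and clip["elevated"]:
        score += 0.5   # e.g., prefer an elevated view for a pass play
    return score

def select_clip(clips: list, play_type: str) -> dict:
    """Pick the best-scoring clip among recordings of the same play."""
    return max(clips, key=lambda c: score_clip(c, play_type))

clips = [
    {"id": "baseline", "distance_m": 5.0, "visibility": 0.9, "elevated": False},
    {"id": "stands", "distance_m": 20.0, "visibility": 0.8, "elevated": True},
]
best = select_clip(clips, "pass")
# For a pass play the elevated "stands" view wins despite being farther away;
# for a ground-level play type the closer "baseline" view would win.
```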
  • system 100 can comprise one or multiple operator devices 160.
  • An operator device 160 can comprise a display screen and an input device, such that an operator using operator device 160 can communicate and interact with application server 130. For example, an operator can preview, review, or edit recordings and/or content clips before publishing to viewers.
  • viewer device 150 can comprise a display screen configured to display a graphical user interface (GUI) 156.
  • GUI 156 can take different forms and/or comprise various features.
  • Input device(s) can be an aspect of GUI 156 displayed on the display screen which, in response to being selected, allows a user of system 100 to produce an input signal received by processor 190, which can command processor 190 to perform or facilitate performance of an operation.
  • an input device can be a digital button displayed on the display screen of user device 150 (e.g., a touch screen) which can be selected by tapping the screen on a touch screen or selecting the input device with a computer mouse, and/or the input device can be a physical button to input information.
  • a viewer can view content clips on GUI 156 by selecting an input device (e.g., a digital or physical button, a link, and/or the like) associated with a content clip.
  • an associated content clip can be presented or displayed. For example, a thumbnail version of the content clip can be displayed on GUI 156 (i.e., a smaller version giving a preview of the associated content clip).
  • application server 130 and/or content processing system 134 can receive a command to present or play the associated content clip (e.g., by playing a larger version of the video clip).
  • a content clip in a certain position in the GUI can be automatically played until the user selects another content clip.
  • a user can tap the thumbnail or image of another content clip, and the content clip that was playing will cease to play, and the selected content clip will play.
  • the user can swipe the currently-playing content clip to the side, for example, by moving the user’s finger along a touchscreen or touchpad, which moves another content clip into a center or prominent position for playing.
  • the content clip moved into a center or prominent position can automatically begin playing in response to such moving or can begin playing in response to a command to do so (e.g., the user selecting a “play” input device).
  • a GUI can present multiple content clips (e.g., and input devices associated therewith). For example, a GUI can display multiple previews or smaller versions of content clips for a viewer to view. Thus, a viewer can select one of the multiple content clips (or input devices) to view a desired content clip (e.g., showing a desired game, play, etc.).
  • a GUI 302 on a viewer device 300 can display multiple content windows 310A-310D each having one or more respective content clips. Each content window can comprise content, such as video, audio, image, and/or text content, and/or an input device (e.g., a link) to further content.
  • the content windows displayed on GUI 302 can depict different angles of the same event or play, different plays, different games, and/or the like.
  • the selected content window can enlarge (e.g., relative to the other content clips) to show a larger view of the selected content window.
  • at least portions of the other content windows 310A, 310B, and 310D can still be visible even after selection and enlargement of a selected content window (e.g., content window 310C), thus allowing the viewer to continue to view portions or previews of other content windows.
  • the viewer can select another content window, which can enlarge and/or take the place of the previously enlarged content window.
  • a user can select one of the other content windows 310A, 310B, or 310D.
  • the selected content window can enlarge and/or the previously enlarged content clip 310C can shrink.
  • a user can swipe away from content window 310C (e.g., on a touchscreen or touchpad) in a direction of a desired content window.
  • the selected content window can enlarge and/or the previously enlarged content window 310C can shrink (e.g., swiping leftward from content window 310C can result in content window 310A enlarging and/or playing).
  • a user can swipe through content windows (e.g., swipe right to move from content window 310D to content window 310B), and content windows can remain substantially the same size.
  • the centered or emphasized content window can automatically begin playing a video.
  • Video playing can cease in response to the user swiping to cause another content window to be centered or emphasized on GUI 302.
  • a user can swipe through content windows, which can play in response to the user selecting a content window.
  • a GUI can offer a user various content types.
  • a first row (e.g., the top row comprising content windows 310A and 310C) can offer one type of content, and a second row (e.g., the bottom row comprising content windows 310B and 310D) can offer another type of content.
  • Content windows in a row can comprise any suitable content, including, for example, content clips of various happenings during an event (e.g., plays in sequential order during a game, possession, or the like).
  • a user can scroll or swipe through GUI 302 (e.g., up or down) to view additional content windows.
  • application server 130 and/or content processing system 134 can periodically change the content window(s) displayed on a GUI to a viewer (e.g., changing the content clips to watch in real time or near real time to keep up with happenings during an event or game as it progresses).
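The periodic refreshing of displayed content windows described above could be sketched as follows, under the assumption that each clip carries a capture timestamp; the window count and ordering policy are illustrative assumptions:

```python
# Hypothetical sketch: keep the GUI near real time by displaying the
# N most recently captured clips, newest first. Field names are assumed.
def refresh_windows(all_clips: list[dict], n_windows: int = 4) -> list[dict]:
    """Return the n_windows most recent clips, newest first, for display."""
    return sorted(all_clips, key=lambda c: c["captured_at"], reverse=True)[:n_windows]
```

A server could call this on a timer (or on each new clip) and push the result to the viewer device.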
  • one section of content on a GUI (e.g., a row of content windows) can be configured for one purpose, and another section of content on the GUI can be configured to allow a user to follow an event (e.g., a game) in real time.
  • one section of content on a GUI can offer content clips (e.g., videos), and another section of content on a GUI can offer stories or articles about a certain topic (e.g., about a game, team, etc.).
  • a user can select desired content by selecting an associated input device (e.g., by tapping on a touchscreen link, selecting a link via a mouse or physical button, and/or the like).
  • System 100 can also present a viewer with paid content (such as paid advertising) by inserting a paid content clip into at least a portion of the GUI 156.
  • a benefit of system 100 and its functionality is the ability of a viewer to tune into an event and watch various content clips as desired by the viewer.
  • the viewer may elect to view only content clips of interest. For example, a viewer may want to decrease or eliminate viewing the time during a basketball game from defensive rebound to crossing half court and transition to offense because such time is largely uneventful. Therefore, the viewer can select and view only content clips presenting offensive plays. Also, a viewer may be able to skip other uneventful or inconsequential events that occur in the normal course of an event or game, such as huddles, time outs, warm-ups, etc. Thus, the viewer can simply watch content clips showing plays.
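One way to sketch this viewer-side filtering, assuming each content clip carries a hypothetical play-type marker (the marker names below are not from the application), is:

```python
# Illustrative sketch: the viewer's feed keeps only clips whose marker
# is a play type of interest, skipping huddles, timeouts, and the like.
def filter_clips(clips: list[dict], wanted_types: set[str]) -> list[dict]:
    """Keep only clips whose play_type marker the viewer wants to see."""
    return [c for c in clips if c["play_type"] in wanted_types]


feed = [
    {"id": 1, "play_type": "offense"},
    {"id": 2, "play_type": "huddle"},
    {"id": 3, "play_type": "timeout"},
    {"id": 4, "play_type": "offense"},
]
offense_only = filter_clips(feed, {"offense"})
```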
  • the viewer can navigate between content clips, while doing other activities on or off of the viewer device (e.g., going to other apps, messaging, etc.), and view an event or game of interest at the viewer’s leisure.
  • This allows a viewer to view an event or game in smaller “chunks” or content clips rather than as one continuous video (thus shortening the time commitment for watching an event like a sports game), to view multiple events or games, and/or allows application server 130 to insert other content for the viewer to see (e.g., other suggested content clips, paid advertising, and/or the like).
  • the viewer can select a perspective of a play of multiple offered perspectives, and/or control the content within the content clip as desired to better view and/or analyze a play or other happening (e.g., moving video forward, backward, or pausing by movement of a viewer’s finger to better view a play, for example, to analyze a penalty call or scoring event).
  • FIG. 4 depicts a method 400 for producing and presenting content using system 100, in accordance with various examples.
  • processor 190 may perform the functions of system 100 and/or the steps of method 400 (of FIG. 4).
  • each component of system 100 can have a separate processor performing functions, or processor 190 can be located in another component of system 100, or processor 190 can be a separate component of system 100.
  • one or more user devices can be positioned around a playing field or court, with different perspectives of such an area and the happenings occurring therein.
  • the recorder devices can capture recordings (step 402) of content depicting happenings on the playing field.
  • the recorder devices can be personal user devices of spectators or attendees of the subject event. For example, the recorder devices can record video clips of plays during a game from various perspectives and angles.
  • the recording can be transmitted from recorder device 140 to application server 130, and application server 130 can receive the recording (step 404).
  • the recordings received by application server 130 can be processed.
  • a recording can be edited (step 406) (e.g., adding an image or text to a video clip, reformatting, resizing, etc.).
  • Such editing can create a content clip that can be configured for user viewing.
  • the durations of the recordings received by application server 130 can be substantially equal to the durations of the resulting content clips (in this context, the term “substantially” means within plus or minus five or ten percent of the duration).
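A minimal sketch of that “substantially equal” duration check, with the five-or-ten-percent tolerance exposed as a parameter, might look like:

```python
# Sketch of the duration comparison described above: the content clip's
# duration is within plus or minus tol (e.g., 5% or 10%) of the source
# recording's duration. Function and parameter names are illustrative.
def substantially_equal(recording_s: float, clip_s: float, tol: float = 0.10) -> bool:
    """True if clip duration is within +/- tol of the recording duration."""
    return abs(clip_s - recording_s) <= tol * recording_s
```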
  • the editing to create content clips can comprise a minimal amount of editing to allow rapid production of content clips (e.g., to provide real time or near real time available content).
  • Part of the recording editing or processing can comprise adding markers or identifiers to the recordings to indicate a certain characteristic of the resulting content clip.
  • a selection between multiple recordings or content clips can be made (e.g., by an operator or automatically by application server 130). Such a selection can determine a desired perspective, quality, or other characteristic between content clips, and which content clip has the desired characteristic, or a certain level thereof, relative to other content clips. Content clips can be presented on a viewer device 150 GUI 156 for viewing by a user.
  • a viewer on a viewer device 150 can select a content clip(s) to view (e.g., by selecting an input device associated with the content clip).
  • Application server 130 can receive the selection of a content clip (step 408).
  • Such a selection can be of a single content clip, or of a group of content clips having a certain characteristic (as indicated by the markers/identifiers associated with the content clips). For example, a viewer can select content clips for a certain team, game, play type, or the like.
  • application server 130 can transmit the selected content clip(s) for presentation on viewer device 150 and viewing by the viewer (step 410).
  • the transmitted content clip can be presented (step 412) on a GUI for viewing.
  • Such display can be as a preview or thumbnail, and therefore, the displaying of a content clip can be in response to a selection of a specific input device associated with a content clip. For example, multiple content clips (or previews or thumbnails thereof) can be displayed on GUI 156 to view for selection. In response to selecting one of the content clips, such content clip can be enlarged relative to the other displayed content clips for improved viewing of the selected content clip.
  • application server 130 can receive a subsequent selection of a content clip (step 414). Such a selection can occur before, during, or after another content clip is being displayed or played (e.g., a previously selected or automatically-played content clip).
  • application server 130 can transmit and/or present the subsequent content clip (step 416).
  • the viewer, through viewer device 150 and its connection to application server 130 (e.g., through a web or native application), can view any desired content clips offered by application server 130 at any suitable time, in any suitable order.
  • method 400 can be repeated in response to the recording of each content clip and receipt thereof by application server 130, and in response to receiving a selection of a content clip(s) or a group thereof.
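The flow of method 400 (receive a recording, edit it into a content clip, receive a selection, transmit and present) could be sketched end to end as plain functions. Everything below is an illustrative pseudostructure under assumed field names, not the application's implementation:

```python
# Hypothetical end-to-end sketch of method 400's steps as functions.
def receive_recording(recording: dict) -> dict:  # step 404
    return dict(recording)


def edit_recording(recording: dict) -> dict:  # step 406 (e.g., add text overlay)
    clip = dict(recording)
    clip["overlay_text"] = f"{recording['team']} - play {recording['play_no']}"
    return clip


def handle_selection(clips: list[dict], selected_id: str) -> dict:  # step 408
    return next(c for c in clips if c["id"] == selected_id)


def transmit_and_present(clip: dict) -> str:  # steps 410/412
    return f"PLAYING clip {clip['id']}: {clip['overlay_text']}"


recordings = [
    {"id": "r1", "team": "Home", "play_no": 7},
    {"id": "r2", "team": "Home", "play_no": 8},
]
clips = [edit_recording(receive_recording(r)) for r in recordings]
output = transmit_and_present(handle_selection(clips, "r2"))
```

Steps 414/416 (a subsequent selection) would simply repeat the last two calls with a new clip id.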
  • content means any image, video, and/or audio content captured, created, recorded, and/or presented.
  • “satisfy”, “meet”, “match”, “associated with” or similar phrases may include an identical match, a partial match, meeting certain criteria, matching a subset of data, a correlation, satisfying certain criteria, a correspondence, an association, an algorithmic relationship and/or the like.
  • Terms and phrases similar to “associate” and/or “associating” may include tagging, flagging, marking, correlating, using a look-up table or any other method or system for indicating or creating a relationship between elements, such as, for example, (i) a content clip and/or (ii) a characteristic identifier or marker.
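Associating content clips with characteristic identifiers via a look-up table, as described above, can be sketched as an inverted index from marker to clip ids. The marker strings below are illustrative assumptions:

```python
# Hypothetical sketch: a look-up table mapping each characteristic marker
# to the ids of clips tagged with it, so a group of clips sharing a
# characteristic (team, play type, etc.) can be retrieved together.
from collections import defaultdict


def build_index(clips: list[dict]) -> dict[str, list[str]]:
    index: dict[str, list[str]] = defaultdict(list)
    for clip in clips:
        for marker in clip["markers"]:
            index[marker].append(clip["id"])
    return index


index = build_index([
    {"id": "c1", "markers": ["team:eagles", "type:pass"]},
    {"id": "c2", "markers": ["team:eagles", "type:run"]},
])
```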
  • the associating may occur at any point, in response to any suitable action, event, or period of time.
  • the associating may occur at pre-determined intervals, periodic, randomly, once, more than once, or in response to a suitable request or action. Any of the information may be distributed and/or accessed via a software enabled link, wherein the link may be sent via an email, text, post, social network input and/or any other method known in the art.
  • system and method may be described herein in terms of functional block components, screen shots, optional selections and various processing steps. It should be appreciated that such functional blocks may be realized by any number of hardware and/or software components configured to perform the specified functions.
  • the system may employ various integrated circuit components, e.g., memory elements, processing elements, logic elements, look-up tables, and the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices.
  • the software elements of the system may be implemented with any programming or scripting language such as C, C++, C#, JAVA®, JAVASCRIPT, VBScript, Macromedia Cold Fusion, COBOL, MICROSOFT® Active Server Pages, assembly, PERL, PHP, awk, Python, Visual Basic, SQL Stored Procedures, PL/SQL, any UNIX shell script, and extensible markup language (XML) with the various algorithms being implemented with any combination of data structures, objects, processes, routines or other programming elements.
  • the system may employ any number of conventional techniques for data transmission, signaling, data processing, network control, and the like.
  • the system could be used to detect or prevent security issues with a client-side scripting language, such as JAVASCRIPT, VBScript or the like.
  • the system may be embodied as a customization of an existing system, an add-on product, a processing apparatus executing upgraded software, a standalone system, a distributed system, a method, a data processing system, a device for data processing, and/or a computer program product. Accordingly, any portion of the system or a module may take the form of a processing apparatus executing code, an internet-based embodiment, an entirely hardware embodiment, or an embodiment combining aspects of the internet, software and hardware. Furthermore, the system may take the form of a computer program product on a computer- readable storage medium having computer-readable program code means embodied in the storage medium. Any suitable computer-readable storage medium may be utilized, including hard disks, CD-ROM, optical storage devices, magnetic storage devices, and/or the like.
  • These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions that execute on the computer or other programmable data processing apparatus create means for implementing the functions specified in the flowchart block or blocks.
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart block or blocks.
  • the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer- implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
  • steps described herein may be embodied in any number of configurations including the use of WINDOWS®, webpages, web forms, popup WINDOWS®, prompts and the like. It should be further appreciated that the multiple steps as illustrated and described may be combined into single webpages and/or WINDOWS® but have been expanded for the sake of simplicity. In other cases, steps illustrated and described as single process steps may be separated into multiple webpages and/or WINDOWS® but have been combined for simplicity.
  • “transmit” may include sending electronic data from one system component to another over a network connection.
  • “data” may include information such as commands, queries, files, data for storage, and the like in digital or any other form.
  • the term “network” includes any cloud, cloud computing system or electronic communications system or method which incorporates hardware and/or software components. Communication among the parties may be accomplished through any suitable communication channels, such as, for example, a telephone network, an extranet, an intranet, Internet, point of interaction device (point of sale device, personal digital assistant (e.g., IPHONE®, BLACKBERRY®), cellular phone, kiosk, etc.), online communications, satellite communications, off-line communications, wireless communications, transponder communications, local area network (LAN), wide area network (WAN), virtual private network (VPN), networked or linked devices, keyboard, mouse and/or any suitable communication or data input modality.
  • although the system is frequently described herein as being implemented with TCP/IP communications protocols, the system may also be implemented using IPX, APPLE®talk, IP-6, NetBIOS®, OSI, any tunneling protocol (e.g. IPsec, SSH), or any number of existing or future protocols.
  • non-transitory is to be understood to remove only propagating transitory signals per se from the claim scope and does not relinquish rights to all standard computer-readable media that are not only propagating transitory signals per se. Stated another way, the meaning of the terms “non-transitory computer-readable medium” and “non-transitory computer-readable storage medium” should be construed to exclude only those types of transitory computer-readable media which were found in In Re Nuijten to fall outside the scope of patentable subject matter under 35 U.S.C. § 101.
  • where the disclosure includes a method, it is contemplated that it may be embodied as computer program instructions on a tangible computer-readable carrier, such as a magnetic or optical memory or a magnetic or optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Databases & Information Systems (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

A method of producing and/or presenting content can comprise receiving a first recording of a first event from a first view, the first recording being recorded on a first user device physically present at the first event; editing the first recording to generate a first content clip; receiving a second recording of the first event from a second view, the second recording being recorded on a second user device physically present at the first event; editing the second recording to generate a second content clip; transmitting the first content clip and/or the second content clip for presentation on a viewer device; and/or presenting the first content clip and/or the second content clip on the viewer device.
PCT/US2023/014510 2022-03-04 2023-03-03 Systems and methods for producing content WO2023168086A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263316913P 2022-03-04 2022-03-04
US63/316,913 2022-03-04

Publications (1)

Publication Number Publication Date
WO2023168086A1 true WO2023168086A1 (fr) 2023-09-07

Family

ID=87884174

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/014510 WO2023168086A1 (fr) 2022-03-04 2023-03-03 Systems and methods for producing content

Country Status (1)

Country Link
WO (1) WO2023168086A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190273837A1 (en) * 2015-09-30 2019-09-05 Amazon Technologies, Inc. Video ingestion and clip creation
US20200322592A1 (en) * 2015-10-29 2020-10-08 Oy Vulcan Vision Corporation Video imaging an area of interest using networked cameras

Similar Documents

Publication Publication Date Title
US11899637B2 (en) Event-related media management system
JP5499331B2 (ja) ストリーミングメディアのトリックプレー
US9386339B2 (en) Tagging product information
US20210004131A1 (en) Highlights video player
US10848831B2 (en) Methods, systems, and media for providing media guidance
US11383164B2 (en) Systems and methods for creating a non-curated viewing perspective in a video game platform based on a curated viewing perspective
US20180314758A1 (en) Browsing videos via a segment list
CA3012143A1 (fr) Procede et systeme d'enregistrement en boucles programmables
US11998828B2 (en) Method and system for presenting game-related information
WO2008087742A1 (fr) Système de reproduction de film, dispositif terminal d'information et procédé d'affichage d'information
WO2023168086A1 (fr) Systèmes et procédés de production de contenu
US20240211453A1 (en) Event-related media management system
Moon Interactive Football Summarization

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23763983

Country of ref document: EP

Kind code of ref document: A1