WO2019070609A1 - Method and apparatus for editing media content


Info

Publication number
WO2019070609A1
WO2019070609A1 (PCT/US2018/053807)
Authority
WO
WIPO (PCT)
Prior art keywords
media
trigger
segments
capture
data
Prior art date
Application number
PCT/US2018/053807
Other languages
French (fr)
Inventor
Christopher A. Wiklof
Geoffery D. OSLER
Timothy M. Londergan
Original Assignee
Howl Holdings Llc
Priority date
Filing date
Publication date
Priority to US 62/567,054 (filed October 2, 2017)
Priority to US 62/572,157 (filed October 13, 2017)
Application filed by Howl Holdings Llc filed Critical Howl Holdings Llc
Publication of WO2019070609A1
Priority claimed from US 16/821,379 (published as US 2020/0286523 A1)

Classifications

    • G11B27/309 Table of contents
    • H04N21/44213 Monitoring of end-user related data
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G10L15/08 Speech classification or search
    • G10L15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L25/51 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00-G10L21/00, specially adapted for comparison or discrimination
    • G11B27/34 Indicating arrangements
    • G11B33/025 Portable cases
    • H04M1/72412
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N21/4667 Processing of monitored end-user data, e.g. trend analysis based on the log file of viewer selections
    • H04N21/6582 Data stored in the client, e.g. viewing habits, hardware capabilities, credit card number
    • H04N21/8547 Content authoring involving timestamps for synchronizing content
    • H04N5/232 Devices for controlling television cameras, e.g. remote control; control of cameras comprising an electronic image sensor
    • H04N5/765 Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77 Interface circuits between a recording apparatus and a television camera
    • H04N5/91 Television signal processing for television signal recording
    • H04N9/8205 Transformation of the television signal for recording, involving the multiplexing of an additional signal and the colour video signal
    • G10L2015/088 Word spotting
    • G10L2015/223 Execution procedure of a spoken command
    • H04L67/42 Protocols for client-server architectures

Abstract

A system presents a media playback. The system receives a plurality of time stamped trigger data. The system reads a plurality of media segments from a media capture device. Each media segment corresponds to a respective time stamped trigger datum. The system outputs the plurality of media segments as a playback including the plurality of media segments.

Description

METHOD AND APPARATUS FOR EDITING MEDIA CONTENT

CROSS REFERENCE TO RELATED APPLICATIONS

The present application claims priority benefit from co-pending U.S. Provisional Patent Application No. 62/572,157, entitled "METHOD AND APPARATUS FOR EDITING MEDIA CONTENT," filed October 13, 2017 (docket number 3051-003-02). The present application also claims priority benefit from co-pending U.S. Provisional Patent Application No. 62/567,054, entitled "RETROSPECTIVE CAPTURE TRIGGER," filed October 2, 2017 (docket number 3051-002-02). Each priority document, to the extent not inconsistent with the disclosure herein, is incorporated by reference.

SUMMARY

According to an embodiment, a method for presenting a media playback includes receiving a plurality of time stamped trigger data, reading a plurality of media segments from a media capture device, each media segment corresponding to a respective time stamped trigger datum, and outputting the plurality of media segments as a playback comprising the plurality of media segments.

According to an embodiment, a method for presenting a media playback includes receiving a plurality of time stamped trigger data, and reading at least a thumbnail from each of a plurality of media segments from a media capture device. Each media segment corresponds to a respective time stamped trigger datum. The method includes outputting a plurality of thumbnails arranged along a timeline at a location corresponding to the moment of each trigger datum. The method includes receiving a selection of a portion of the thumbnails via a user interface (UI). The method includes outputting a portion of the plurality of media segments, the portion corresponding to the selected thumbnails, as a media playback.

According to an embodiment, a non-transitory computer-readable medium has a plurality of computer-executable instructions which, when executed by a processor, provide a media processing system. The media processing system includes a data reading module configured to read captured media and trigger data, the trigger data including a plurality of time stamps each indicating a real time associated with a trigger event. The media processing system also includes a media processing module configured to generate a plurality of media segments. The media processing system also includes an interface module configured to present a media playback including the media segments to a user.

According to an embodiment, a media editing system includes a media capture device configured to capture media. The system includes a trigger device configured to generate a plurality of trigger data, each trigger datum including a time stamp. The system includes a media processing system configured to receive the captured media and the trigger data and to output a plurality of media segments each corresponding to a time stamp.

According to an embodiment, a method includes capturing media data, storing the media data in temporary storage, and receiving a plurality of trigger data from a trigger device, each trigger datum including a time stamp corresponding to the real time at which the trigger device generated the trigger datum. The method also includes, each time a trigger datum is received, transferring the media data from the temporary storage to a permanent storage as a media segment. The method also includes uploading the media segments to a media processing system.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram of a media presentation system, according to an embodiment.

FIG. 2 is a block diagram of a media presentation system, according to an embodiment.

FIG. 3 is an illustration of thumbnails representing media output files, according to an embodiment.

FIG. 4 is an illustration of thumbnails representing segments of captured media, according to an embodiment.

FIG. 5 is an illustration of thumbnails representing segments of captured media from multiple media capture devices, according to an embodiment.

FIG. 6 is an illustration of a media playback interface including a timeline of a captured media file with markers indicating times at which trigger data were generated, according to an embodiment.

FIG. 7 is an illustration of a media presentation system, according to an embodiment.

FIG. 8 is an illustration of a portion of a system including multiple media capture devices and a trigger device, according to an embodiment.

FIG. 9 is a flow diagram of a process for presenting a media playback, according to an embodiment.

FIG. 10 is a flow diagram of a process for presenting a media playback, according to an embodiment.

FIG. 11 is a block diagram of a media processing system, according to an embodiment.

DETAILED DESCRIPTION

In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar symbols typically identify similar components, unless context dictates otherwise. Other embodiments may be used and/or other changes may be made without departing from the spirit or scope of the disclosure.

FIG. 1 is a block diagram of a media presentation system 100, according to an embodiment. The media presentation system 100 includes a media capture device 102, a trigger device 104, a media processing system 106, and a personal electronic device 108. These components of the media presentation system 100 are configured to cooperate to assist in editing and presenting captured media. According to embodiments, the media presentation system 100 may include more or fewer components than shown in FIG. 1. For example, a single component may perform the functions of multiple components shown in FIG. 1. Additionally, or alternatively, a single component shown in FIG. 1 may include multiple components.

According to an embodiment, the media capture device 102, the trigger device 104, the media processing system 106, and the personal electronic device 108 communicate with each other via the network 101. The network 101 may include multiple networks including the Internet, wireless area networks, cellular networks, local area networks, or other types of networks.

According to an embodiment, the media capture device 102 is configured to capture media. The media captured by the media capture device 102 may include video media, audio media, a combination of video and audio media, and/or individual images. Accordingly, the media capture device 102 may include a camera configured to capture one or more of a video stream, an audio stream, and still images.

According to an embodiment, the media capture device 102 is configured to be coupled to a body of the user of the media capture device 102. For example, the media capture device 102 may include a head mounted camera. The head mounted camera may be configured to mount on a helmet worn by the user, or to be worn on the head of the user without being mounted to a helmet. The user may wear a head mounted media capture device 102 while participating in activities such as riding a bicycle, operating a motorcycle, kayaking, canoeing, rafting, skiing, waterskiing, skydiving, rock climbing, hiking, or attending an event such as a concert, or in any other circumstance in which the user may wish to capture media related to the activity in which the user is participating.

According to an embodiment, the media capture device 102 may be configured to be worn by a user. The media capture device 102 may be configured to be worn on an article of clothing of the user, such as on the shirt or jacket of the user. The media capture device 102 may be a body camera worn by a police officer or other individuals.

According to an embodiment, the media capture device 102 may be configured to be mounted to an item of sporting equipment or a vehicle. The media capture device 102 may be configured to be mounted to a bicycle, a raft, a kayak, a canoe, a motorcycle, an automobile, a jet ski, an ATV, or in another location.

According to an embodiment, the media capture device 102 may be configured to be mounted in a permanent or stationary location. The media capture device 102 may be configured to be mounted at a location to capture media related to an athletic competition, an athletic activity, a concert, a gathering, a park, a home, a secure location such as a bank, a military installation, a police installation, or in another stationary location.

According to an embodiment, as the media capture device 102 captures media, or while the media capture device 102 is ready to capture media, notable events of particular interest may occur. For example, during an athletic activity a participant may perform an outstanding maneuver, during a family activity a comical instance may occur, during a crowded gathering an injury or a crime may occur, or during an outdoor excursion a wild animal may be seen or encountered. In circumstances such as these, it may be desirable to single out such notable events from captured media. For example, if a media capture device 102 continuously captures a large amount of video during several hours of an outdoor activity, it may be desirable to be able to single out those notable events from the large amount of video. Additionally, or alternatively, participants may not desire to capture a large amount of video, but rather to capture media only related to those notable events.

According to an embodiment, the trigger device 104 enables the media presentation system 100 to readily identify the times of occurrence of notable events. In particular, the trigger device 104 is configured to generate and output time stamped trigger data that identifies the time of occurrence of each notable event. To this end, the trigger device 104 includes a real-time clock and a trigger circuit. The real-time clock keeps track of the real time. The trigger circuit includes a mechanism for generating a trigger signal upon the occurrence of a notable event. The trigger circuit may generate the trigger signal upon receiving trigger input from a user. In an embodiment, the trigger circuit may include a mechanism that enables the trigger circuit to generate the trigger signal without intentional intervention by the user.

According to an embodiment, a user of the trigger device 104 may intentionally provide trigger inputs to the trigger device 104 upon the occurrence of notable events. For example, when the user of the trigger device 104 sees or hears that a notable event has occurred, the user may operate a switch, such as by pressing a button, flipping a toggle, manipulating a slider, or otherwise providing trigger input to the trigger device 104 indicating that a trigger event has occurred. When the trigger device 104 receives the trigger input from the user, the trigger device 104 generates a trigger signal.

In one embodiment, the trigger device 104 is configured to generate trigger signals in response to stimuli other than intentional user input. For example, the trigger device 104 may be configured to generate a trigger signal based on the tone or intensity of audio signals received by the trigger device 104. The trigger device 104 may be configured to generate a trigger signal based on motions sensed by the trigger device 104. The trigger device 104 may be configured to generate a trigger signal based on the intensity of light or based on sudden changes in lighting. Accordingly, the trigger device 104 may include sensors such as microphones, motion sensors, photo sensors, or other kinds of sensors, and may be configured to generate trigger signals based on sensed parameters.

According to an embodiment, when the trigger circuit of the trigger device 104 generates a trigger signal, the trigger device 104 reads the real time from the real-time clock and generates time stamped trigger data that indicates the occurrence of a trigger event and the real time at which the trigger event occurred. Each time the trigger circuit generates a trigger signal, the trigger device 104 generates a trigger datum including a time stamp indicating the time at which the trigger event occurred. Thus, the trigger device 104 generates trigger data indicating the times of one or more trigger events.

In one embodiment, the trigger data includes location data corresponding to the location of the trigger device 104 that generates the trigger data. Thus, the trigger data not only includes a time stamp that indicates a time at which the trigger data was generated, but also location data indicating the location of the trigger device 104 when the trigger data was generated.

In one embodiment, the trigger data may include an audio signal corresponding in time to each time stamped trigger datum. Thus, when the trigger device 104 generates a trigger signal, the trigger device 104 also records sound from an environment of the trigger device 104. Accordingly, the trigger device 104 may include a microphone capable of recording sounds. Each trigger datum may include a time stamp and an audio signal.
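Taken together, the preceding embodiments describe a trigger datum as a time stamp optionally accompanied by location and audio. The sketch below is purely illustrative; the field names, types, and JSON serialization are assumptions, not part of the disclosure.

```python
import json
import time
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TriggerDatum:
    """Hypothetical trigger record: a real-time stamp plus optional context."""
    timestamp: float                                 # real time from the device's clock (epoch seconds)
    location: Optional[Tuple[float, float]] = None   # (latitude, longitude), if available
    audio_clip: Optional[bytes] = None               # short recording from the trigger device's microphone

def make_trigger_datum(location=None, audio_clip=None) -> TriggerDatum:
    """Generate a trigger datum when the trigger circuit fires."""
    return TriggerDatum(timestamp=time.time(), location=location, audio_clip=audio_clip)

# Serialize the metadata (audio omitted) for wireless transmission to a capture device.
datum = make_trigger_datum(location=(47.6062, -122.3321))
payload = json.dumps({"timestamp": datum.timestamp, "location": datum.location})
```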

In one embodiment, the trigger device 104 is configured to wirelessly communicate with the media capture device 102. In particular, the trigger device 104 is configured to transmit the trigger data to the media capture device 102, for example via Bluetooth, Wi-Fi, Zigbee, or another wireless communication protocol. In one embodiment, the trigger device 104 is configured to wirelessly transmit the trigger data to the media capture device 102 immediately upon generating new trigger data. Thus, when the trigger device 104 generates a trigger signal indicating the occurrence of a notable event, the trigger device 104 transmits the resulting trigger data to the media capture device 102. Each time the trigger device 104 generates new trigger data, the trigger device 104 transmits the new trigger data to the media capture device 102.

In one embodiment, the trigger device 104 is configured to wirelessly transmit the trigger data to the media capture device 102 after the end of an activity for which the media capture device 102 captured media. For example, the user may manipulate the trigger device 104 to indicate that the trigger device 104 should now transmit all trigger data to the media capture device 102. Additionally, or alternatively, the media capture device 102 may communicate with the trigger device 104 and request all trigger data from the trigger device 104. Additionally, or alternatively, after the end of an activity, the trigger device 104 and the media capture device 102 may be coupled together in a wired connection, and the trigger device 104 may provide the trigger data to the media capture device 102.

In one embodiment, the media capture device 102 is a retrospective media capture device. Rather than permanently capturing a large amount of media during a prolonged activity or a prolonged amount of time, the media capture device 102 may be configured to permanently capture only selected events. Accordingly, the media capture device 102 may be configured to continuously capture media data, such as a stream of video and/or audio data, and to store the media data in a temporary storage. The temporary storage may include a circular buffer, a first in first out (FIFO) memory, or another type of temporary storage. As the temporary storage fills up with media data, the oldest media data is removed from the temporary storage and the newest media data is written to the temporary storage. Thus, at any given moment, the temporary storage contains captured media data from the immediately preceding several seconds or minutes. When the media capture device 102 receives trigger data from the trigger device 104, the media capture device 102 transfers the media data in the temporary storage to a permanent storage which will not be overwritten as the media capture device 102 continues to capture media data. The media capture device 102 may be configured to continue to store newly captured media data in the permanent storage for a selected amount of time after the trigger data has been received from the trigger device 104. In this way, the media capture device 102 may retrospectively capture media data, such as audio and/or video data, from a period of time immediately preceding the receipt of a trigger signal.
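A minimal sketch of retrospective capture, assuming media arrives as fixed-size chunks and using an in-memory deque as the circular buffer; the class and method names are illustrative, not the patent's implementation.

```python
from collections import deque

class RetrospectiveRecorder:
    """Keeps only the most recent media until a trigger makes it permanent."""

    def __init__(self, buffer_chunks: int):
        # A deque with maxlen acts as a circular buffer:
        # once full, appending silently drops the oldest chunk.
        self.buffer = deque(maxlen=buffer_chunks)
        self.permanent = []          # stand-in for permanent storage
        self.chunks_to_keep = 0      # chunks still to record after a trigger

    def on_chunk(self, chunk: bytes) -> None:
        """Handle every captured media chunk (e.g., one chunk per second)."""
        self.buffer.append(chunk)
        if self.chunks_to_keep > 0:
            self.permanent.append(chunk)   # keep recording after the trigger
            self.chunks_to_keep -= 1

    def on_trigger(self, post_trigger_chunks: int) -> None:
        """On trigger receipt: preserve the buffered past, keep recording a while.

        Deduplication between closely spaced triggers is omitted for brevity.
        """
        self.permanent.extend(self.buffer)         # retrospective portion
        self.chunks_to_keep = post_trigger_chunks  # prospective portion
```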

In one embodiment, throughout the course of an activity, the media capture device 102 may receive multiple trigger data from the trigger device 104 at various times. Each time the media capture device 102 receives new trigger data from the trigger device 104, the media capture device 102 retrospectively captures media content to the permanent storage. Thus, the media capture device 102 may be operated in a mode that does not cause the media capture device 102 to permanently record hours of continuous data. Instead, the media capture device 102 may be operated in a retrospective capture mode in which the media capture device 102 continuously stores new captured media in a relatively small temporary storage and only permanently records media data upon the receipt of trigger data from the trigger device 104.

In one embodiment, the media capture device 102 generates a media file that includes only segments of media data permanently and retrospectively captured in response to the receipt of trigger data from the trigger device 104. For example, at the conclusion of an activity, a user of the media capture device 102 may manipulate the media capture device 102 to indicate that the activity has concluded and that the media capture device 102 should now finalize the captured media. In response, the media capture device 102 may generate multiple individual media files each corresponding to a portion of media data surrounding the receipt of a trigger event. Additionally, or alternatively, the media capture device 102 may generate a single media file that includes multiple segments of media data each corresponding to a time of receipt of a trigger datum from the trigger device 104.

In one embodiment, the media capture device 102 is configured to continuously capture media data, such as video data and/or audio data, throughout the course of an activity or a selected period of time. While the media capture device 102 captures media data, a user of the trigger device 104 may cause the trigger device 104 to generate trigger data at various times corresponding to notable events. The trigger device 104 may output the trigger data to the media capture device 102. The media capture device 102 may mark the captured media data in accordance with the time stamp of the trigger data to indicate in the captured media data those times at which trigger data was received. The media capture device 102 may generate and output a media file that includes the captured media data and markers indicating the times at which trigger data was received. A user of the media capture device 102 may view the media file on an external device, such as the personal electronic device 108, and may easily identify the locations of notable events in the media data file.
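One way to place such markers is to convert each trigger's real-time stamp into an offset within the media file, assuming the capture device records when the recording started; a sketch with illustrative names:

```python
def trigger_offsets(recording_start: float, trigger_timestamps: list[float]) -> list[float]:
    """Convert real-time trigger stamps into offsets (seconds) within the media file.

    recording_start: epoch time at which the capture device began recording.
    trigger_timestamps: epoch times reported by the trigger device.
    Stamps that fall before the recording began are dropped.
    """
    return [t - recording_start for t in trigger_timestamps if t >= recording_start]

# Example: a recording that started at epoch 1000.0 with triggers 45 s and 90 s in.
markers = trigger_offsets(1000.0, [1045.0, 1090.0])  # -> [45.0, 90.0]
```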

In one embodiment, the media capture device 102 is configured to output multiple media files at the conclusion of an activity for which the media capture device 102 captured media data. The media capture device 102 may be configured to generate a media file that includes all of the captured media with markers indicating times at which trigger data was received. The media capture device 102 may be configured to generate a second media file that includes the compilation of segments, each corresponding to the receipt of trigger data. The media capture device 102 may be configured to generate a plurality of individual media files each for a segment of captured media corresponding to receipt of trigger data.

In one embodiment, the trigger device 104 is configured to communicate with a media processing system 106. For example, the trigger device 104 may be configured to transfer trigger data to the media processing system 106 at the conclusion of an activity, event, or period of time during which the media capture device 102 captured media. The trigger device 104 transfers the trigger data, including trigger data for each trigger event that occurred during the activity. Thus, the trigger device 104 may transfer trigger data for a plurality of trigger events to the media processing system 106.

In one embodiment, the trigger device 104 may be configured to automatically upload trigger data to the media processing system 106 when the trigger device 104 is connected to a wireless network. Many kinds of activities may take place at locations where the trigger device 104 cannot connect to a wireless network, such as for outdoor activities at remote locations. In these cases, the trigger device 104 will not transmit the trigger data to the media processing system 106 until the trigger device 104 is connected to a wireless network. Once the trigger device 104 is again connected to a wireless network, the trigger device 104 uploads the trigger data to the media processing system 106.
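A minimal sketch of this store-and-forward behavior, assuming a hypothetical connectivity probe and upload endpoint; neither the URL, the JSON format, nor the polling cadence is specified in the disclosure.

```python
import json
import time
import urllib.request

PENDING = [{"timestamp": 1045.0}, {"timestamp": 1090.0}]   # queued trigger data
UPLOAD_URL = "https://example.invalid/api/triggers"         # hypothetical endpoint

def is_connected() -> bool:
    """Crude connectivity probe; a real device would query its network stack."""
    try:
        urllib.request.urlopen("https://example.invalid", timeout=2)
        return True
    except OSError:
        return False

def upload_when_connected(poll_seconds: int = 60) -> None:
    """Hold trigger data until a network is available, then upload it all."""
    while PENDING:
        if is_connected():
            body = json.dumps(PENDING).encode()
            req = urllib.request.Request(UPLOAD_URL, data=body,
                                         headers={"Content-Type": "application/json"})
            urllib.request.urlopen(req)
            PENDING.clear()
        else:
            time.sleep(poll_seconds)   # retry once a network is available
```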

In one embodiment, the trigger device 104 may include a software application associated with the media processing system 106. The trigger device 104 may store the software application in a computer readable medium, i.e., a memory, of the trigger device 104. The trigger device 104 may execute the software application using one or more processors of the trigger device 104. The software application may cause the trigger device 104 to transmit trigger data to the media processing system 106.

In one embodiment, the media capture device 102 is configured to upload captured media data to the media processing system 106 upon completion of a media capture activity. For example, the media capture device 102 may be configured to upload the entirety of the captured media data, individual captured media segments, or both.

In one embodiment, the media capture device 102 may include a software application associated with the media processing system 106. The media capture device 102 may store the software application in a computer readable medium, i.e., a memory, of the media capture device 102. The media capture device 102 may execute the software application using one or more processors of the media capture device 102. The software application may cause the media capture device 102 to transmit captured media data to the media processing system 106.

In one embodiment, the media processing system 106 is configured to receive the trigger data from the trigger device 104 and the captured media data from the media capture device 102. The media processing system 106 is configured to generate edited media data from the captured media data based on the trigger data. The media processing system 106 may output the edited media data to an external device such as the personal electronic device 108 or the media capture device 102.

In one embodiment, the media processing system 106 is configured to generate a plurality of media segments from the captured media data. Each media segment corresponds to a portion of the captured media data at a time associated with a respective trigger event from the trigger data. The media processing system 106 may output the plurality of media segments to an external device.
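As one illustration of segment generation, the sketch below cuts a window around each trigger offset with the ffmpeg command-line tool; the use of ffmpeg, and the pre-roll and post-roll lengths, are assumptions for the sketch, not the patent's method.

```python
import subprocess

def extract_segment(source: str, trigger_offset: float, out_path: str,
                    pre_roll: float = 10.0, post_roll: float = 20.0) -> None:
    """Cut one media segment around a trigger offset (seconds into the file)."""
    start = max(0.0, trigger_offset - pre_roll)      # begin before the notable event
    duration = (trigger_offset - start) + post_roll
    subprocess.run([
        "ffmpeg", "-y",
        "-ss", str(start),       # seek to the segment start
        "-i", source,
        "-t", str(duration),     # segment length
        "-c", "copy",            # copy streams without re-encoding
        out_path,
    ], check=True)

# One segment per trigger offset (offsets as computed from the time stamps).
for i, offset in enumerate([45.0, 90.0]):
    extract_segment("full_capture.mp4", offset, f"segment_{i:02d}.mp4")
```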

In one embodiment, the media processing system 106 is configured to generate a thumbnail image for each media segment corresponding to a trigger event from the trigger data. The user may navigate to a selected thumbnail image to access the corresponding media segment.

In one embodiment, the media processing system 106 is configured to generate a media file that includes each media segment. The media processing system 106 is configured to present the media file with a timeline that includes markers indicating the segments corresponding to the trigger events. The media file may include the entirety of the captured media from the media capture device 102, together with markers on the timeline indicating the locations in the media file of the segments corresponding to the trigger events from the trigger data. Alternatively, or additionally, the media file may include a compilation of each of the segments corresponding to the trigger events from the trigger data. Each marker on the timeline may include an indication such as a thumbnail image from the corresponding segment. Additionally, or alternatively, a marker on the timeline may include indications other than a thumbnail image.

In one embodiment, the media processing system 106 is configured to output the edited media files to the personal electronic device 108. The personal electronic device 108 is configured to enable the user to access and view the edited media files generated by the media processing system 106. The personal electronic device 108 may include a mobile phone, a laptop computer, a desktop computer, a tablet, or another kind of personal electronic device that includes a display for displaying media content.

In one embodiment, the personal electronic device 108 includes a software application associated with the media processing system 106. The personal electronic device 108 may store the software application in a computer readable medium, i.e., a memory, of the personal electronic device 108. The personal electronic device 108 may execute the software application using one or more processors of the personal electronic device 108. The software application may cause the personal electronic device 108 to communicate with the media processing system 106 and to receive edited media content files from the media processing system 106.

In one embodiment, the media processing system 106 includes one or more servers executing media processing software on computing resources of the one or more servers. The media processing system 106 may include cloud-based processing resources that enable the media processing system 106 to execute resource intensive media editing operations. Users of the media processing system 106 may thus utilize the media processing system 106 to generate edited media files based on the trigger data from the trigger device 104 and the captured media from the media capture device 102. The media capture device 102, the trigger device 104, and the personal electronic device 108 may communicate with the media processing system 106 via the network 101. The network 101 can include one or more of the Internet, wireless networks, local area networks, or other communication networks configured to enable the components of the media presentation system 100 to communicate with each other.

In one embodiment, the personal electronic device 108 includes all or a portion of the media processing system 106. In this case, the personal electronic device 108 receives the trigger data and the captured media data from the trigger device 104 and the media capture device 102. The personal electronic device 108 executes media processing software associated with the media processing system 106 and generates edited media files based on the captured media data and the trigger data.

In one embodiment, the trigger device 104 can be part of the media capture device 102. In this case, the user may operate the media capture device 102 to generate trigger data and to cause the media capture device 102 to retrospectively capture media data.

In one embodiment, each of the plurality of media segments includes a portion beginning prior to the time stamp of the respective trigger data. For example, the media segment may include media captured a few seconds prior to the time stamp of the trigger data. In an embodiment, a media segment includes up to a minute prior to the time stamp of the trigger data. In another embodiment, a media segment may include more than a minute prior to the time stamp of the trigger data.

FIG. 2 is a block diagram of a media presentation system 200, according to an embodiment. The media presentation system 200 may include a trigger device 104 and a plurality of media capture devices 102. The trigger device 104 and the plurality of media capture devices 102 cooperate to assist in editing captured media. In an embodiment, a media capture device may be identified (e.g., dynamically identified) using local discovery of media capture devices proximal to an event experienced by the user of the trigger device. Accordingly, a trigger may cause a "harvesting" of captured media from media capture devices unknown to the user at a time prior to, or even contemporaneous with, the notable event marked by the user.

In one embodiment, the trigger device 104 is configured to generate and output time stamped trigger data. The time stamped trigger data identifies the time of occurrence of corresponding notable events. In particular, the trigger device 104 may include a real-time clock and a trigger circuit. The real-time clock keeps track of the real time. The trigger circuit includes a mechanism for generating a trigger signal upon an occurrence of a notable event. The trigger circuit can generate the trigger signal upon receiving trigger input from a user. In embodiments, the trigger circuit may include a mechanism that enables the trigger circuit to generate the trigger signal without intentional intervention by the user.

In one embodiment, each media capture device 102 is configured to capture media. The media captured by each media capture device 102 may include video media, audio media, a combination of video and audio media, and/or individual images. One or more media capture devices 102 may include a camera configured to capture one or more of a video stream, an audio stream, and still images. In one embodiment, one or more media capture devices 102 may include retrospective capture capabilities as described in relation to FIG. 1.

In one embodiment, the trigger device 104 is configured to communicate with each of the media capture devices 102. The trigger device 104 may transmit time stamped trigger data to each of the media capture devices 102. The trigger device 104 may communicate with the media capture devices 102 in order to identify media capture devices 102 that are present and in an operational state to capture media.

In one embodiment, when the trigger device 104 generates a trigger signal indicative of a notable event, the trigger device 104 outputs trigger data to one or more of the media capture devices 102. Each of the media capture devices 102 may receive the trigger data from the trigger device 104, or alternatively may be queried by a media processing system at a later time. One or more of the media capture devices 102 may be capable of retrospectively capturing media data on receipt of the trigger signal. Additionally or alternatively, one or more of the media capture devices 102 may add markers into the captured media indicating the location within the captured media corresponding to the receipt of the trigger data. Additionally or alternatively, the trigger signals generated by the trigger device 104 may cause a media processing system to parse captured media streams from the media capture devices to retrieve media segments corresponding to a notable event identified by the user.

In one embodiment, the plurality of media capture devices 102 include a predetermined linked group of media capture devices 102. A user, or users, of the media capture devices 102 may link the media capture devices 102 together prior to utilizing the media capture devices 102 in a situation to capture media. The media capture devices 102 each store data indicating which other media capture devices 102 are in the predetermined group.

In one embodiment, the predetermined group of media capture devices 102 is associated with the trigger device 104. When the trigger device 104 generates trigger data, the trigger device 104 may output the trigger data to each of the media capture devices 102 of the predetermined group of media capture devices 102. In one embodiment, the media capture devices 102 may pass trigger data to each other in case one of the media capture devices 102 is outside the range of the trigger device 104 but still within the range of another of the media capture devices 102.

In one embodiment, the trigger device 104 is configured to discover nearby media capture devices 102. The trigger device 104 may be configured to discover media capture devices 102 that have a media capture zone proximate to the trigger device 104. Thus, when the trigger device 104 generates time stamped trigger data, the trigger device 104 may output the time stamped trigger data to each of the media capture devices 102 that are in a position to capture media in an area local to the trigger device 104.
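The discovery mechanism is left open in the disclosure; the sketch below illustrates one possible realization using a UDP broadcast probe. The port number, probe message, and reply format are all invented for illustration, and a real system might instead use Bluetooth scanning or mDNS.

```python
import socket

DISCOVERY_PORT = 50505          # hypothetical port monitored by capture devices
PROBE = b"WHO_CAN_CAPTURE?"     # hypothetical probe message

def discover_capture_devices(timeout: float = 2.0) -> list[str]:
    """Broadcast a probe and collect addresses of capture devices that answer."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.settimeout(timeout)
    sock.sendto(PROBE, ("255.255.255.255", DISCOVERY_PORT))
    devices = []
    try:
        while True:
            data, (addr, _port) = sock.recvfrom(1024)
            if data.startswith(b"CAPTURE_DEVICE"):   # hypothetical reply format
                devices.append(addr)
    except socket.timeout:
        pass                                         # no more replies
    finally:
        sock.close()
    return devices
```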

In one embodiment, the media presentation system 200 includes a media processing system 106. The media processing system 106 may include a server or cloud-based computing system configured to read captured media from each of the media capture devices 102. The media processing system 106 may also be configured to receive trigger data from the trigger device 104. The media processing system 106 may process the captured media in accordance with the trigger data. The media processing system 106 may generate media segments from each media capture device 102 for each trigger event in the trigger data. The media processing system 106 may output edited media data including media segments from each of the media capture devices 102.

In one embodiment, the media processing system 106 may determine which media capture devices 102 were in the vicinity of the trigger device 104 each time trigger data was generated based on the location data included in the trigger signal, as well as location data stored by the media capture devices 102. The media processing system 106 may generate media segments from the captured media of each media capture device 102 determined to be within the vicinity of the trigger device 104.
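A sketch of the vicinity determination, assuming locations are (latitude, longitude) pairs and using the standard haversine great-circle distance; the 100 m radius is an illustrative parameter, not one given in the disclosure.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(a: tuple[float, float], b: tuple[float, float]) -> float:
    """Great-circle distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(h))   # Earth radius ~6371 km

def devices_in_vicinity(trigger_loc, device_locs, radius_m=100.0):
    """Select capture devices within radius_m of where the trigger fired."""
    return [dev for dev, loc in device_locs.items()
            if haversine_m(trigger_loc, loc) <= radius_m]

nearby = devices_in_vicinity((47.6062, -122.3321),
                             {"cam_1": (47.6063, -122.3320), "cam_2": (47.7000, -122.0000)})
# -> ["cam_1"]
```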

In one embodiment, the media processing system 106 is configured to output the plurality of media segments as a playback. The media processing system 106 is configured to assemble the plurality of media segments in the time sequence corresponding to the sequence of time stamps. The media processing system 106 is configured to output the sequence of segments as a playback. The playback may be continuous, with substantially no break between segments. Alternatively, when the media processing system 106 assembles the media segments, the media processing system 106 may insert a fade between successive media segments.
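A sketch of assembling such a playback, again assuming ffmpeg: segments are sorted by their trigger time stamps and joined with the concat demuxer. Inserting fades between segments would require re-encoding and is omitted here.

```python
import subprocess
import tempfile

def assemble_playback(segments: list[tuple[float, str]], out_path: str) -> None:
    """Concatenate media segments in time-stamp order into one playback file.

    segments: (trigger_timestamp, file_path) pairs.
    """
    ordered = [path for _, path in sorted(segments)]        # time sequence
    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
        f.writelines(f"file '{p}'\n" for p in ordered)      # concat demuxer list
        list_path = f.name
    subprocess.run(["ffmpeg", "-y", "-f", "concat", "-safe", "0",
                    "-i", list_path, "-c", "copy", out_path], check=True)

assemble_playback([(1090.0, "segment_01.mp4"), (1045.0, "segment_00.mp4")],
                  "highlights.mp4")
```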

In one embodiment, the media processing system 106 is configured to output the plurality of media segments as a playback by assembling at least a portion of the plurality of media segments as panes in a display list. The media processing system 106 may be configured to simultaneously output a plurality of panes from the display list on the user display. In one embodiment, the media processing system 106 is configured to output the plurality of media segments as a playback comprising the plurality of media segments by displaying media segments from respective media capture devices 102 corresponding to a single time stamp. In one embodiment, the media processing system 106 may be configured to output the plurality of media segments as a playback by outputting the plurality of media segments in a media stream.

In one embodiment, the media processing system 106 is configured to output the plurality of media segments as a media file.

In one embodiment, the media processing system 106 is configured to read a plurality of media segments from the media capture device 102 by reading a media file spanning a plurality of stamped times. The media processing system 106 then processes the media file into separate segments based on the stamped times.

In one embodiment, the media processing system 106 is configured to read a plurality of media segments from a media capture device 102 by excluding portions of the captured media not corresponding to a trigger. Thus, the media processing system 106 retrieves from the media capture device 102 only those portions of the captured media that correspond to a trigger event.

FIG. 3 is an illustration of how media files are output by the media processing system 106, according to an embodiment. The media playback may be presented on a display of the personal electronic device 108. The media processing system 106 has generated a file 310 that is a full video corresponding to an entirety of media captured by a media capture device 102. The media processing system 106 has also generated a file 312 that includes a compilation of segments from the media captured by the media capture device 102.

In one embodiment, when the media processing system 106 reads captured media data from a media capture device 102, the media processing system 106 may process the captured media so that the captured media can be conveniently viewed or otherwise consumed by a user.

In one example, in accordance with one embodiment, a user operates a media capture device 102 during an outdoor activity. The user operates the media capture device 102 in the mode that causes the media capture device 102 to capture media during the entirety of the activity. The duration of the activity is three hours, thirteen minutes, and seven seconds. In this example, in accordance with one embodiment, the user also operates a trigger device 104 to mark the occurrence of notable events during the outdoor activity. Each time a notable event occurs, the user manipulates the trigger device 104 to generate time stamped trigger data. During the outdoor activity, the user causes the trigger device 104 to generate twelve time stamps of trigger data corresponding to notable events that occurred during the outdoor activity.

In one embodiment, at the end of the outdoor activity, the media processing system 106 receives the captured media from the media capture device 102. The media processing system 106 also receives the trigger data, either from the trigger device 104, or from the media capture device 102. The media processing system 106 may receive the trigger data from the media capture device 102 as markers placed in the captured media, or as separate triggers.

In one embodiment, the media processing system 106 generates the full video file 310 from the captured media. The full video file 310 corresponds to the entirety of the captured media from the outdoor activity. The full video file 310 is three hours, thirteen minutes, and seven seconds long.

In one embodiment, the media processing system 106 also generates a highlight compilation 312 corresponding to a compilation of media segments from the captured media. Each media segment includes a segment of the captured media corresponding to a time stamp in the trigger data. The highlight compilation 312 includes twelve segments compiled together having a total duration of four minutes and twenty-seven seconds. The media processing system 106 may output the highlight compilation 312 via the personal electronic device 108.

In one embodiment, the media processing system 106 outputs only the highlight compilation file 312. In this case, the media processing system 106 either does not read the entirety of the captured media from the media capture device 102, or does read the entirety of the captured media but only generates and outputs the highlight compilation 312.

FIG. 4 is an illustration of how a media processing system 106 outputs media files in a media playback, according to an embodiment. The media playback may be presented on a display of the personal electronic device 108. The media processing system 106 displays a separate thumbnail for each of a plurality of segments of media captured by a media capture device 102. Each segment corresponds to a respective time stamp from time stamped trigger data generated by a trigger device 104.

In one example, in accordance with one embodiment, the media processing system 106 reads captured media from a media capture device 102. The media processing system 106 also reads time stamped trigger data generated by a trigger device 104. The time stamped trigger data includes three time stamps. The media processing system 106 generates three media segments from the captured media. The media processing system 106 outputs a thumbnail for each media segment. In particular, the media processing system 106 outputs a first thumbnail 414a corresponding to a first segment with a time stamp of 3:55 PM and a duration of 37 seconds. The media processing system 106 outputs a second thumbnail 414b corresponding to a second segment with a time stamp of 4:01 PM and a duration of 15 seconds. The media processing system 106 outputs a third thumbnail 414c corresponding to a third segment with a time stamp of 6:15 PM and a duration of one minute, three seconds. A user may select a thumbnail 414a, 414b, or 414c to play back the respective segment.

In one embodiment, each trigger datum may include a first time stamp indicating initiation of a notable event, and a second time stamp indicating the end of the notable event. A user may operate the trigger device 104 a first time to indicate the start of a notable event and a second time to indicate the end of the notable event. This can cause the media capture device 102 (or a media processing system in combination with continuously operating media capture devices 102) to retrospectively capture a notable event and to stop capture when the notable event has ended. Thus, each media segment may have a duration in accordance with a duration of the notable event. Alternatively, each trigger signal may cause the media capture device 102 to capture media for a preselected period of time. Alternatively, the media processing system 106 may generate segments of a preselected length for each trigger signal.
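The three duration policies described above (paired start/end stamps, a preselected capture window, or a preselected segment length) can be illustrated with one small helper; the default values are assumptions for the sketch.

```python
def segment_window(start_stamp: float, end_stamp: float | None = None,
                   pre_roll: float = 10.0, fixed_length: float = 30.0) -> tuple[float, float]:
    """Return (start, end) real times for one media segment.

    If the trigger carries both stamps, the segment spans the notable event
    (plus pre-roll); otherwise a preselected fixed length is used.
    """
    start = start_stamp - pre_roll
    end = end_stamp if end_stamp is not None else start_stamp + fixed_length
    return (start, end)

segment_window(1045.0, 1072.0)   # paired stamps -> (1035.0, 1072.0)
segment_window(1090.0)           # single stamp  -> (1080.0, 1120.0)
```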

FIG. 5 is an illustration of how a media processing system 106 outputs media files in a media playback, according to an embodiment. The media playback may be presented on a display of the personal electronic device 108. The media processing system 106 reads captured media from multiple media capture devices 102. The media processing system 106 also receives time stamped trigger data indicating a plurality of trigger events. The media processing system 106 generates, for each trigger event in the trigger data, a media segment for each media capture device 102. The media processing system 106 may optionally generate a highlight compilation that includes each of the segments from each media capture device 102. The media processing system 106 displays a separate thumbnail for each of the media segments and for the compilation.

In one example, in accordance with one embodiment, the media processing system 106 reads captured media from a first media capture device 102 and from a second media capture device 102. The media processing system 106 also reads trigger data either from a trigger device 104 or from the media capture devices 102. The trigger data includes three time stamps corresponding to trigger events. The first trigger event has a time stamp of 10:03 AM. The second trigger event has a time stamp of 10:08 AM. The third trigger event has a time stamp of 11:19 AM. The media processing system 106 generates a media segment with a thumbnail 514a, including media captured by the first media capture device 102 corresponding to the first trigger event. The media processing system 106 generates a media segment with a thumbnail 514b, including media captured by the first media capture device 102 corresponding to the second trigger event. The media processing system 106 generates a media segment with a thumbnail 514c, including media captured by the first media capture device 102 corresponding to the third trigger event. The media processing system 106 likewise generates media segments with thumbnails 516a, 516b, and 516c, including media captured by the second media capture device 102 corresponding to the first, second, and third trigger events, respectively.

In one embodiment, the media processing system 106 also generates a highlight compilation with a thumbnail 512, corresponding to a compilation of all of the media segments from each of the media capture devices 102.

In one embodiment, the user may select any of these thumbnails in order to view the respective media segment or compilation.

FIG. 6 is an illustration of how a media processing system 106 outputs media files in a media playback, according to an embodiment. The media playback may be presented on a display of the personal electronic device 108. The media processing system 106 reads captured media from a media capture device 102. The media processing system 106 also receives time stamped trigger data indicating a plurality of trigger events. The media processing system 106 generates a media file that includes the captured media. The media file includes markers on a timeline indicating locations within the media file corresponding to trigger events.

In one example, in accordance with one embodiment, the media processing system 106 reads captured media from a media capture device 102. The media processing system 106 also receives time stamped trigger data corresponding to four trigger events that occurred while the media capture device 102 captured media. The media processing system 106 outputs a media file 618, including the captured media from the media capture device 102. The media file 618 includes markers 620a-620d, each corresponding to a time stamp from the trigger data. The user may utilize the markers 620a-620d to quickly navigate to portions of the media file 618 that correspond to trigger events.
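A minimal Python sketch of this marker placement appears below, assuming the trigger time stamps and the recording start share a common clock; the class and field names are illustrative, not from the specification.

```python
from dataclasses import dataclass, field

@dataclass
class MarkedMediaFile:
    path: str
    recording_start: float                        # epoch seconds at capture start
    markers: list = field(default_factory=list)   # marker offsets in seconds

    def add_trigger(self, trigger_ts: float):
        # Convert a real-time trigger stamp into an offset within the file.
        offset = trigger_ts - self.recording_start
        if offset >= 0.0:
            self.markers.append(offset)

media = MarkedMediaFile("capture.mp4", recording_start=1_700_000_000.0)
for ts in (1_700_000_040.0, 1_700_000_125.0, 1_700_000_900.0, 1_700_002_300.0):
    media.add_trigger(ts)
print(media.markers)  # offsets a player could render as markers 620a-620d
```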

Additionally, or alternatively, the markers 620a-620d may include thumbnail images indicating the content at the marked location.

FIG. 7 is an illustration of a media presentation system 700, according to an embodiment. The media presentation system 700 includes a media capture device 102, a trigger device 104, and a personal electronic device 724.

In one embodiment, after a user has operated a media capture device 102 to capture media, and a trigger device 104 to generate trigger data, the user seeks to upload and process the captured media in accordance with the trigger data. The user outputs the captured media from the media capture device 102 to the personal electronic device 724. This may be done wirelessly, or via a wired connection 722. The personal electronic device 724 also receives trigger data, either from the trigger device 104 or from the media capture device 102. The personal electronic device 724 utilizes a media processing system 106 to process the captured media from the media capture device 102 in accordance with the trigger data from the trigger device 104. The media processing system 106 processes the captured media in accordance with the trigger data to generate and output one or more media playback files. The media playback files may include a plurality of segments in accordance with the trigger data, a compilation of the plurality of segments, and/or a full media file including portions of the captured media that do not correspond to a trigger event.

In one embodiment, the media processing system 106 includes a software application executed by the personal electronic device 724. The software application corresponds to instructions stored in a computer readable medium of the personal electronic device 724 and executed by one or more processors of the personal electronic device 724.

In one embodiment, the media capture device 102 outputs the captured media to a media processing system 106 hosted on one or more servers, or hosted in the cloud. The media processing system 106 receives the captured media and the trigger data, processes the captured media in accordance with the trigger data, and outputs the processed media for playback on the personal electronic device 724.

In one embodiment, the personal electronic device 724 includes a laptop computer, a tablet, a mobile phone, a desktop computer, or another kind of computing device.

FIG. 8 is an illustration of a system 800 including a plurality of media capture devices 102 operatively coupled to a trigger device 104, according to an embodiment. The system 800 includes three media capture devices 102 and a trigger device 104. Three users 826 each have a media capture device 102 mounted on a helmet. At least one of the users 826 is wearing a trigger device 104 around their wrist.

In one embodiment, as the users 826 participate in an activity, the media capture devices 102 are in retrospective capture mode, as described previously. Upon the occurrence of a notable event, the user 826 that is wearing the trigger device 104 causes the trigger device 104 to generate trigger data. The trigger device 104 outputs the trigger data to the media capture devices 102. The media capture devices 102 retrospectively capture media corresponding to a period of time beginning before the receipt of the trigger data and extending after the receipt of the trigger data for a selected period of time.
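By way of a non-limiting illustration, retrospective capture can be sketched in Python as a ring buffer of media chunks: buffered pre-trigger chunks are persisted when trigger data arrives, and capture then continues for a selected post-trigger period. The chunk granularity, buffer depth, and method names are assumptions for the sketch.

```python
from collections import deque

class RetrospectiveBuffer:
    def __init__(self, max_chunks: int):
        # Oldest chunks are discarded automatically once the ring is full.
        self.ring = deque(maxlen=max_chunks)
        self.saved = []

    def push(self, chunk):
        self.ring.append(chunk)

    def on_trigger(self, post_trigger_chunks):
        # Persist everything captured before the trigger...
        self.saved.extend(self.ring)
        self.ring.clear()
        # ...plus chunks arriving during the selected post-trigger period.
        self.saved.extend(post_trigger_chunks)

buf = RetrospectiveBuffer(max_chunks=5)
for i in range(8):
    buf.push(f"chunk_{i}")                     # chunks 0-2 age out of the ring
buf.on_trigger([f"chunk_{i}" for i in range(8, 10)])
print(buf.saved)                               # chunks 3-7, then 8-9
```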

In one embodiment, multiple users 826 wear trigger devices 104. Each trigger device 104 is configured to communicate with each of the media capture devices 102. Thus, any of these users may cause each of the media capture devices 102 to retrospectively capture media by manipulating the respective trigger device 104.

FIG. 9 is a flow diagram of a process 900, according to an embodiment.

At 902, a plurality of time stamped trigger data are received. At 904, a plurality of media segments are read from a media capture device 102, each segment corresponding to a respective time stamped trigger datum. At 906, the plurality of media segments are output as a playback including the plurality of media segments.

FIG. 10 is a flow diagram of a process 1000, according to an embodiment. At 1002, a plurality of time stamped trigger data are received. At 1004, at least a thumbnail from each of a plurality of media segments from a media capture device 102 is read, each segment corresponding to a respective time stamped trigger datum. At 1006, a plurality of thumbnails are output, arranged along a timeline at a location corresponding to the moment of each trigger datum. At 1008, a selection of a portion of the thumbnails is received via a user interface. At 1010, a portion of the plurality of media segments, corresponding to the selected thumbnails, is output as a media playback.
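A minimal sketch of steps 1006-1010 follows, assuming a pixel-based timeline; the function name, timeline scale, and selection representation are illustrative assumptions.

```python
def layout_thumbnails(trigger_times, timeline_start, timeline_span, width_px=1000):
    # Map each trigger moment to a horizontal position on the timeline.
    return {t: int((t - timeline_start) / timeline_span * width_px)
            for t in trigger_times}

triggers = [40.0, 125.0, 900.0]
positions = layout_thumbnails(triggers, timeline_start=0.0, timeline_span=1200.0)
print(positions)                     # {40.0: 33, 125.0: 104, 900.0: 750}

# Selection received via the user interface (e.g., tapped thumbnails);
# only the corresponding segments are output as the media playback.
selected = {40.0, 900.0}
playback_queue = [t for t in triggers if t in selected]
print(playback_queue)
```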

FIG. 11 is a block diagram of a media processing system 106, according to an embodiment. The media processing system 106 includes a data reading module 1102, a media processing module 1104, and an interface module 1106. The data reading module 1102, the media processing module 1104, and the interface module 1106 cooperate to generate and present a media playback for users.

In one embodiment, the data reading module 1102 is configured to read or receive captured media 1110 from one or more media capture devices 102. In one embodiment, the data reading module 1102 is configured to interface with one or more media capture devices 102 in order to read or receive the captured media 1110 from the media capture devices 102.

In one embodiment, the data reading module 1102 is configured to communicate with the media capture devices 102 via a network connection. For example, the data reading module 1102 may be configured to communicate with, and retrieve captured media 1110 from, the media capture devices 102 anytime the media capture devices 102 have an Internet connection. Additionally, or alternatively, the media capture devices 102 may be configured to communicate with the data reading module 1102 when the media capture devices 102 are connected to the media processing system 106 via a network other than the Internet. Additionally, or alternatively, the media capture devices 102 may be configured to communicate with the media processing system 106 when the media capture devices 102 are connected to the media processing system 106 with a wired connection.

The data reading module 1102 may query the media capture devices 102 in order to determine whether the media capture devices 102 have any captured media 1110 to upload to the media processing system 106. If the media capture devices 102 have captured media 1110 available for upload, the data reading module 1102 reads the captured media 1110 from the one or more media capture devices 102. In one embodiment, the media capture devices 102 may include software applications that are part of or are related to the media processing system 106. The media capture devices 102 may execute the software applications to cause communication with the media processing system 106 via a network connection.
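This query-and-upload exchange can be sketched as follows. The device-side API (list_captured, fetch) and the stub class are hypothetical, standing in for whatever transport a media capture device 102 actually exposes.

```python
class CaptureDeviceStub:
    """Stand-in for a media capture device 102 reachable over a network."""

    def __init__(self, name, clips):
        self.name = name
        self._clips = list(clips)

    def list_captured(self):
        # Answer the data reading module's query for pending uploads.
        return list(self._clips)

    def fetch(self, clip_id):
        return f"<bytes of {clip_id} from {self.name}>"

def read_captured_media(devices):
    uploaded = {}
    for device in devices:
        for clip_id in device.list_captured():   # query each device
            uploaded[(device.name, clip_id)] = device.fetch(clip_id)
    return uploaded

devices = [CaptureDeviceStub("helmet_cam_1", ["clip_a"]),
           CaptureDeviceStub("helmet_cam_2", ["clip_b", "clip_c"])]
print(sorted(read_captured_media(devices)))
```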

In one embodiment, the data reading module 1102 is configured to read captured media 1110 from one or more media capture devices 102 when a user of the media capture device 102 specifically elects to upload captured media 1110. For example, the media capture device 102 may include a software application associated with the media processing system 106. When the media capture device 102 executes the software application, the user may select to upload captured media 1110 to the media processing system 106 via a user interface of the media capture device 102.

In one embodiment, the data reading module 1102 is configured to read trigger data 1112. The trigger data 1112 may be read from a trigger device 104. Additionally, or alternatively, the trigger data 1112 may be read from a media capture device 102 that received the trigger data 1112 from a trigger device 104. The trigger data 1112 may include time stamps indicating each time that a user manipulated a trigger device 104 to generate a trigger signal. The time stamps indicate times at which trigger events occurred while a media capture device 102 was capturing media, or while a media capture device 102 was in a retrospective capture mode.

In one embodiment, the data reading module 1102 is configured to read or receive trigger data 1112 from a trigger device 104. In one embodiment, the data reading module 1102 is configured to interface with one or more trigger devices 104 in order to read or receive the trigger data 1112 from the trigger devices 104.

In one embodiment, the data reading module 1102 is configured to communicate with a trigger device 104 via a network connection. For example, the data reading module 1102 may be configured to communicate with, and retrieve trigger data 1112 from, a trigger device 104 anytime the trigger device 104 has an Internet connection. Additionally, or alternatively, the trigger device 104 may be configured to communicate with the data reading module 1102 when the trigger device 104 is connected to the media processing system 106 via a network other than the Internet. Additionally, or alternatively, the trigger device 104 may be configured to communicate with the media processing system 106 when the trigger device 104 is connected to the media processing system 106 with a wired connection.

In one embodiment, the media processing module 1104 generates processed media 1114 from the captured media 1110. The media processing module 1104 may process the captured media 1110 in accordance with the trigger data 1112. The media processing module 1104 may generate media segments 1120 from the captured media 1110 based on the trigger data 1112.

In one embodiment, the media processing module 1104 generates processed media 1114 from captured media 1110 that corresponds to a media file representing the entirety of captured media 1110 from a media capture device 102. For example, if a user of a media capture device 102 captured media for three hours, then in one embodiment the media processing module 1104 processes the three hours of captured media 1110 to generate multiple media segments 1120 based on the trigger data 1112. If the trigger data 1112 represents multiple trigger events, i.e., includes multiple time stamps, then the media processing module 1104 generates from the captured media 1110 a respective media segment 1120 for each time stamp in the trigger data 1112. The processed media 1114 includes the multiple media segments 1120.
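As one possible non-limiting sketch, such per-time-stamp segmentation of a long capture could be performed with ffmpeg invoked from Python, assuming ffmpeg is installed and the input file exists; the file names, offsets, lookback, and segment length are illustrative assumptions, not the specification's method.

```python
import subprocess

capture = "capture_3h.mp4"                 # hypothetical three-hour capture
trigger_offsets = [900.0, 4210.0, 7777.5]  # seconds into the capture
lookback, seg_len = 10.0, 40.0

for i, t in enumerate(trigger_offsets):
    start = max(0.0, t - lookback)
    # -ss seeks to the segment start; -t limits the output duration;
    # -c copy copies the streams without re-encoding.
    subprocess.run(
        ["ffmpeg", "-ss", str(start), "-i", capture,
         "-t", str(seg_len), "-c", "copy", f"segment_{i}.mp4"],
        check=True,
    )
```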

In one embodiment, the captured media 1110 received by the data reading module 1102 includes multiple segments 1120. In this case, the media processing module 1104 may generate the processed media 1114 by processing the previously generated media segments 1120 into a format that may be accessed by a user.

In one embodiment, the media processing module 1104 receives captured media 1110 from multiple media capture devices 102. The media processing module 1104 processes the captured media 1110 from the multiple media capture devices 102 and generates processed media 1114 including a plurality of media segments 1120 for each time stamp in the trigger data 1112. For each time stamp, the media processing module 1104 generates multiple media segments 1120, each from a different media capture device 102. Each media segment 1120 covers a same real-time span. If the captured media 1110 includes media from three media capture devices 102, and if the trigger data 1112 includes two trigger events, then the media processing module 1104 generates processed media 1114 that includes three media segments 1120 associated with the first trigger event, and three media segments 1120 associated with the second trigger event.

In one embodiment, the processed media 1114 includes a plurality of thumbnails representing various media segments 1120 generated by the media processing module 1104 or received from the media capture devices 102. In one embodiment, the processed media 1114 includes large media playback data 1118 in a format that includes user accessible markers indicating locations within the large media playback data file 1118 corresponding to trigger events.

In one embodiment, the interface module 1106 is configured to enable a user to interface with the media processing system 106. The user may interface with the media processing system 106 via a personal electronic device 108 of the user. The personal electronic device 108 may include a software application linked to the media processing system 106. Additionally, or alternatively, the personal electronic device 108 includes the media processing system 106 as software instructions executed by the personal electronic device 108.

Additionally, or alternatively, a portion of the media processing system 106 may be implemented on the personal electronic device 108, and a portion of the media processing system 106 may be implemented within one or more remote servers, or on a separate computing device. However the user accesses the media processing system 106, the interface module 1106 enables the user to interface with the media processing system 106.

In one embodiment, the interface module 1106 presents media playback data 1118 to the user. The interface module 1106 may present the media playback data 1118 to the user on a display of the personal electronic device 108. Alternatively, the interface module 1106 may present the media playback data 1118 to the user in a different manner.

In one embodiment, the media playback data 1118 includes a media playback of the processed media 1114 generated from the captured media 1110. Accordingly, the media playback data 1118 may include a plurality of media segments 1120 from the processed media 1114. The media playback data 1118 may also include unsegmented media files received from media capture devices 102. The media playback data 1118 may include thumbnails representing a plurality of media segments 1120 that the user can select to view. The media playback data 1118 may include large media files presented with a timeline that includes markers indicating the locations of media segments 1120 that correspond to trigger events. The markers may include thumbnail images showing a frame of captured media 1110 representing the media segment 1120.
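One illustrative shape for the media playback data 1118 is sketched below; the class and field names are assumptions chosen to mirror the elements described above, not a defined data format.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class MediaSegmentEntry:
    device: str
    offset: float          # seconds into the source media file
    thumbnail: str         # representative frame for selection

@dataclass
class MediaPlaybackData:
    segments: List[MediaSegmentEntry] = field(default_factory=list)
    full_files: List[str] = field(default_factory=list)   # unsegmented media
    compilation: Optional[str] = None                     # highlight file, if any
    markers: List[float] = field(default_factory=list)    # timeline markers

playback = MediaPlaybackData(
    segments=[MediaSegmentEntry("helmet_cam_1", 40.0, "thumb_514a.jpg")],
    full_files=["capture_3h.mp4"],
    markers=[40.0, 125.0],
)
print(playback)
```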

In one embodiment, the interface module 1106 receives selection data 1116 from a user. For example, the media playback data 1118 may include a plurality of media segment files and full media files. The user may provide selection data 1116 indicating which of the media segment files should be compiled into a highlight compilation. The selection data 1116 may indicate which segments, compilations, or full media files should be kept and which should be deleted. The selection data 1116 may indicate which media files should be downloaded from the media processing system 106 to the personal electronic device 108. The selection data 1116 may indicate which media files should be permanently saved in a backup storage, or a cloud-based storage. The selection data 1116 may indicate a media file that should be played immediately as part of the media playback data 1118. The media processing module 1104 may generate further processed media 1114 based on the selection data 1116.
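A minimal sketch of acting on such selection data follows; the action names mirror the options listed above, and the dictionary-based selection format is an assumption for illustration.

```python
def apply_selection(selection, library):
    """Apply per-file actions from selection data to a media library."""
    compiled, kept = [], []
    for item in selection:
        name, action = item["file"], item["action"]
        if action == "compile":
            compiled.append(name)   # include in the highlight compilation
        elif action == "keep":
            kept.append(name)       # retain (e.g., back up to cloud storage)
        elif action == "delete":
            library.discard(name)
    return compiled, kept

library = {"seg_a.mp4", "seg_b.mp4", "seg_c.mp4"}
selection = [
    {"file": "seg_a.mp4", "action": "compile"},
    {"file": "seg_b.mp4", "action": "delete"},
    {"file": "seg_c.mp4", "action": "keep"},
]
print(apply_selection(selection, library))
print(library)   # seg_b.mp4 removed
```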

While various aspects and embodiments have been disclosed herein, other aspects and embodiments are contemplated. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.

CLAIMS
What is claimed is:
1. A method for presenting a media playback, comprising:
receiving a plurality of time stamped trigger data;
reading a plurality of media segments from a media capture device, each media segment corresponding to a respective time stamped trigger datum; and
outputting the plurality of media segments as a playback including the plurality of media segments.
2. The method of claim 1, wherein each of the plurality of media segments includes a portion beginning prior to a time stamp of the respective time stamped trigger datum.
3. The method of claim 2, wherein each of the plurality of media segments includes a portion beginning a few seconds prior to the time stamp of the respective time stamped trigger datum.
4. The method of claim 2, wherein each of the plurality of media segments includes a portion beginning up to a minute prior to the time stamp of the respective time stamped trigger datum.
5. The method of claim 2, wherein each of the plurality of media segments includes a portion beginning more than a minute prior to the time stamp of the respective time stamped trigger datum.
6. The method of claim 1, wherein the media capture device comprises a plurality of media capture devices.
7. The method of claim 6, wherein the plurality of media capture devices includes a predetermined linked group of media capture devices.
8. The method of claim 1, wherein the plurality of time stamped trigger data each includes a location datum corresponding to a location of a trigger device that generates the trigger data in response to receiving a trigger state from a user interface.
9. The method of claim 8, further comprising:
discovering at least a portion of proximate media capture devices having a media capture zone proximate to the trigger device when the trigger state was generated by the user interface;
wherein reading the plurality of media segments includes reading at least one media segment from each of the proximate media capture devices.
10. The method of claim 1, wherein at least a portion of the time stamped trigger data includes an audio signal corresponding to each time stamped trigger datum.
11. The method of claim 10, wherein the audio signal is collected by a microphone disposed in a device that receives a trigger state from a user interface and generates the time stamped trigger datum responsive to receiving the trigger state.
12. The method of claim 1, wherein outputting the plurality of media segments as a playback comprising the plurality of media segments further comprises:
assembling the plurality of media segments in a time sequence corresponding to the sequence of time stamps; and
outputting the sequence of segments as a playback.
13. The method of claim 12, wherein the playback is continuous.
14. The method of claim 12, wherein assembling the plurality of media segments includes inserting a fade between successive media segments.
15. The method of claim 12, wherein outputting the plurality of media segments as a playback further comprises:
assembling at least a portion of the plurality of media segments as panes in a display list;
wherein outputting the plurality of media segments includes simultaneously outputting a plurality of panes from the display list on the user display.
16. The method of claim 1, wherein outputting the plurality of media segments as a playback comprising the plurality of media segments includes simultaneously displaying media segments from respective devices corresponding to a single time stamp.
17. The method of claim 1, wherein outputting the plurality of media segments as a playback comprising the plurality of media segments includes outputting a media stream.
18. The method of claim 1, wherein outputting the plurality of media segments as a playback comprising the plurality of media segments includes outputting a media file.
19. The method of claim 1, wherein reading a plurality of media segments from a media capture device includes reading a media file spanning a plurality of stamped times.
20. The method of claim 1, wherein reading a plurality of media segments from a media capture device includes reading captured media segments excluding times not corresponding to a trigger period.
21. The method of claim 20, wherein the trigger period extends to a moment in time prior to the stamped time.
22. The method of claim 20, wherein the trigger period extends to a moment in time after the stamped time.
23. The method of claim 1, wherein the media includes video.
24. The method of claim 1, wherein the media includes audio.
25. The method of claim 1, wherein the media includes sensor data other than video and audio.
26. A method for presenting a media playback, comprising:
receiving a plurality of time stamped trigger data;
reading at least a thumbnail from each of a plurality of media segments from a media capture device, each media segment corresponding to a respective time stamped trigger datum;
outputting a plurality of thumbnails arranged along a timeline at a location corresponding to the moment of each trigger datum;
receiving a selection of a portion of the thumbnails via a user interface; and
outputting a portion of the plurality of media segments, the portion corresponding to the selected thumbnails, as a media playback.
27. The method of claim 26, further comprising, upon receiving a selection of a portion of the thumbnails, reading a portion of media segments from the media capture device, each portion corresponding to a selected thumbnail.
28. The method of claim 26, wherein reading a portion of the media segments includes transmitting a command to transfer, from the media capture device, each portion of the media segments.
29. The method of claim 26, further comprising outputting, for each instance of the selection, a media segment including a portion captured prior to the respective time stamp.
30. A media processing system, comprising:
a data reading module configured to read captured media and trigger data, the trigger data including a plurality of time stamps, each indicating a real time associated with a trigger event;
a media processing module configured to generate a plurality of media segments; and
an interface module configured to present a media playback including the media segments to a user.
31. The system of claim 30, wherein the data reading module is configured to read the captured media from a media capture device.
32. The system of claim 31, wherein the data reading module is configured to read the captured media from multiple media capture devices.
33. The system of claim 32, wherein the media capture devices are part of a network of media capture devices.
34. The system of claim 31, wherein the media processing module is configured to generate the media segments by processing previously generated media segments included in the captured media.
35. The system of claim 31, wherein the data reading module is configured to read the trigger data from the media capture device.
36. The system of claim 35, wherein the trigger data is included in the captured media.
37. The system of claim 31, wherein the data reading module is configured to detect when the media capture device is connected to a network and to read the captured media from the media capture device upon detecting that the media capture device is connected to the network.
38. The system of claim 30, wherein the data reading module is configured to read the trigger data from a trigger device.
39. The system of claim 38, wherein the data reading module is configured to detect when the trigger device is connected to a network and to read the trigger data from the trigger device upon detecting that the trigger device is connected to the network.
40. The system of claim 30, wherein the interface module is configured to present the media playback on a display of a personal electronic device of the user.
41. The system of claim 30, wherein the media playback includes a compilation of the media segments in a single media file.
42. The system of claim 40, wherein the media playback includes the media segments and one or more full media files from which the media segments were taken.
43. The system of claim 40, wherein the media playback includes a media file that includes the segments and media captured between the segments and not included in the segments.
44. The system of claim 43, wherein the media playback includes a timeline of the media file and wherein the timeline includes markers indicating locations of the segments within the media file.
45. The system of claim 44, wherein the markers include thumbnail images.
46. The system of claim 30, wherein the media playback includes segments captured by and read from multiple media capture devices.
47. The system of claim 30, wherein the media playback includes, for each time stamp, multiple media segments each received from a respective media capture device.
48. The system of claim 30, wherein the interface module is configured to receive selection data from the user selecting one or more of the media segments.
49. The system of claim 30, wherein each media segment includes a portion beginning prior to a time stamp associated with the media segment.
50. The system of claim 30, wherein the plurality of trigger data includes, for each time stamp, a location datum corresponding to a location of a trigger device that generates the trigger data at a time that the trigger device generated the time stamp.
51. The system of claim 50, wherein the data reading module is configured to discover at least a portion of proximate media capture devices having a media capture zone proximate to the trigger device when the time stamp was generated by the user interface, and to read at least one media segment from each of the proximate media capture devices.
52. The system of claim 30, wherein the media processing module is configured to assemble the plurality of media segments in a time sequence corresponding to the sequence of time stamps, and wherein the interface module is configured to output the sequence of segments with the media playback.
53. The system of claim 52, wherein the media processing module is configured to insert a fade between successive media segments.
54. The system of claim 52, wherein the media playback includes assembling at least a portion of the plurality of media segments as panes in a display list and wherein the playback includes simultaneously outputting a plurality of panes from the display list on a user display.
55. The system of claim 30, wherein outputting the plurality of media segments as a playback comprising the plurality of media segments includes simultaneously displaying media segments from respective devices corresponding to a single time stamp.
56. The system of claim 30, wherein the captured media includes video.
57. The system of claim 30, wherein the captured media includes audio.
58. The system of claim 30, wherein the media includes sensor data other than video and audio.
59. A system, comprising:
a media capture device configured to capture media;
a trigger device configured to generate a plurality of trigger data, each including a time stamp; and
a media processing system configured to receive the captured media and the trigger data and to output a plurality of media segments, each corresponding to a time stamp.
60. The system of claim 59, wherein the media processing system is configured to output the plurality of media segments together in a single media file.
61. The system of claim 59, wherein the media processing system is configured to output the plurality of media segments in separate files.
62. The system of claim 59, wherein the media capture device is configured to output the captured media to the media processing system via the Internet.
63. The system of claim 62, wherein the media capture device is configured to operate in a retrospective capture mode by temporarily capturing media to a temporary storage, by continuously replacing the oldest temporarily captured media with new temporarily captured media when the temporary storage is full, and by transferring the temporarily captured media from the temporary storage to a permanent storage as the captured media in response to receiving trigger data from the trigger device.
64. The system of claim 63, wherein the media capture device is configured to output the captured media to the media processing system.
65. The system of claim 64, wherein the media capture device is configured to output the captured media to the media processing system upon connecting to the Internet.
66. The system of claim 59, wherein the trigger device is configured to output the trigger data to the media processing system upon connecting to the Internet.
67. The system of claim 59, further comprising multiple media capture devices each configured to capture media, to receive trigger data from the trigger device, and to output the captured media to the media processing system.
68. The system of claim 67, wherein the multiple media capture devices are part of a media capture device network.
69. The system of claim 68, wherein the trigger device is configured to output trigger data to each of the media capture devices in the media capture device network.
70. A method, comprising:
capturing media data;
storing the media data in temporary storage;
receiving a plurality of trigger data from a trigger device, each trigger data including a time stamp corresponding to a real time that the trigger device generated the trigger data;
each time a trigger datum is received, transferring the media data from the temporary storage to a permanent storage as a media segment; and
uploading the media segments to a media processing system.
71. The method of claim 70, wherein uploading the media segments includes uploading the media segments to a cloud based media processing system.
72. The method of claim 70, wherein uploading the media segments includes uploading the media segments via the Internet.
73. The method of claim 70, wherein uploading the media segments includes uploading the media segments to a media processing system executed on one or more servers.