US20200358979A1 - System and method for supporting selective backtracking data recording - Google Patents

System and method for supporting selective backtracking data recording Download PDF

Info

Publication number
US20200358979A1
US20200358979A1 (Application No. US 16/939,653)
Authority
US
United States
Prior art keywords
data
storage medium
processing system
data flow
control signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/939,653
Inventor
Sheldon Schwartz
Tao Wang
Mingyu Wang
Zisheng Cao
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Priority claimed from PCT/CN2014/093785 external-priority patent/WO2016095071A1/en
Priority claimed from PCT/CN2014/093786 external-priority patent/WO2016095072A1/en
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Priority to US16/939,653 priority Critical patent/US20200358979A1/en
Assigned to SZ DJI Technology Co., Ltd. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SCHWARTZ, SHELDON; WANG, TAO; WANG, MINGYU; CAO, ZISHENG
Publication of US20200358979A1 publication Critical patent/US20200358979A1/en

Classifications

    • H04N 5/9201: Transformation of the television signal for recording involving the multiplexing of an additional signal and the video signal
    • H04N 5/781: Television signal recording using magnetic recording on disks or drums
    • G11B 27/036: Electronic editing of digitised analogue information signals; Insert-editing
    • G11B 27/28: Indexing, addressing, timing or synchronising by using information signals recorded by the same method as the main recording
    • G11B 27/34: Indicating arrangements
    • G11B 31/006: Associated working of recording or reproducing apparatus with a video camera or receiver
    • H04N 5/2628: Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • H04N 5/76: Television signal recording
    • H04N 5/7605: Television signal recording on discs or drums
    • H04N 5/77: Interface circuits between a recording apparatus and a television camera
    • H04N 5/772: Interface circuits with the recording apparatus and the television camera placed in the same enclosure
    • H04N 5/91: Television signal processing for recording
    • H04N 5/9202: Multiplexing of an additional signal and the video signal, the additional signal being a sound signal
    • H04N 5/9205: Multiplexing of an additional signal and the video signal, the additional signal being at least another television signal
    • H04N 5/9206: Multiplexing of an additional signal and the video signal, the additional signal being a character code signal
    • H04N 7/0125: Conversion of standards, one of the standards being a high definition standard
    • H04N 7/015: High-definition television systems
    • H04N 9/8205: Multiplexing of an additional signal and the colour video signal
    • H04N 9/8233: Multiplexing of an additional signal and the colour video signal, the additional signal being a character code signal

Definitions

  • the disclosed embodiments relate generally to data processing and more particularly, but not exclusively, to multimedia data recording.
  • Data recording, such as video recording, is traditionally an action in one direction along the timeline, where the past has already been settled and the future is yet to come. Typically, the data can be stored as soon as the recording starts. However, critical data, such as memorable moments, may still be missed since the user may not be able to start the recording in time.
  • An unmanned aerial vehicle (UAV) may be used for a wide range of applications including surveillance, search and rescue operations, exploration, and other fields. As image technology develops, the amount of image data that needs to be recorded can grow very fast. Thus, excessive bandwidth may be consumed, which prohibits the downloading of the captured video from a UAV.
  • the data processing apparatus can include a data processor that is associated with a data capturing device on a stationary object and/or a movable object.
  • the data processor can receive data in a data flow from one or more data sources, wherein the data flow is configured based on a time sequence. Then, the data processor can receive a control signal, which is associated with a first timestamp, wherein the first timestamp indicates a first time. Furthermore, the data processor can determine a first data segment by applying the first timestamp on the data flow, wherein the first data segment is associated with a time period in the time sequence that includes the first time.
  • the data processing apparatus can include a data processor that is associated with a data capturing device on a stationary object and/or a movable object.
  • the data processor can store, in a memory, data received from one or more data sources in a time sequence. Then, the data processor can remove a portion of the data stored in the memory after receiving a synchronization signal. Furthermore, the data processor can forward the removed data to a storage, when a switch is turned on, for a time period in the time sequence.
  • the data processing apparatus can include a data processor that is associated with a data capturing device on a stationary object and/or a movable object.
  • the data processor can associate one or more tags with a data flow received from one or more data sources, wherein said one or more tags are associated with one or more timestamps. Then, the data processor can save the one or more timestamps in a record. Furthermore, the data processor can select a subset of the data in the received data flow based on the one or more timestamps saved in the record.
  • FIG. 1 is an exemplary illustration of a data recording system, in accordance with various embodiments of the present invention.
  • FIG. 2 is an exemplary illustration of selective data recording, in accordance with various embodiments of the present invention.
  • FIG. 3 is an exemplary illustration of using a buffer to support selective data recording, in accordance with various embodiments of the present invention.
  • FIG. 4 is an exemplary illustration of an asynchronous selective data recording, in accordance with various embodiments of the present invention.
  • FIG. 5 shows a flowchart of selective data recording, in accordance with various embodiments of the present invention.
  • FIG. 6 is an exemplary illustration of backtracking data recording, in accordance with various embodiments of the present invention.
  • FIG. 7 is an exemplary illustration of backtracking recording optimized in memory usage, in accordance with various embodiments of the present invention.
  • FIG. 8 is an exemplary illustration of backtracking recording optimized in computation load, in accordance with various embodiments of the present invention.
  • FIG. 9 shows a flowchart of backtracking data recording, in accordance with various embodiments of the present invention.
  • FIG. 10 is an exemplary illustration of data recording with tags, in accordance with various embodiments of the present invention.
  • FIG. 11 is an exemplary illustration of exporting from data recording with tags, in accordance with various embodiments of the present invention.
  • FIG. 12 shows a flowchart of data recording with tags, in accordance with various embodiments of the present invention.
  • FIG. 1 is an exemplary illustration of a data recording system, in accordance with various embodiments of the present invention.
  • a data processing system 100 can include a data processor 102 that receives various types of data from a data source 101 .
  • the data source 101 can be associated with a data capturing device 110 .
  • For example, an image sensor can be used for capturing the image or video information, while a microphone (MIC) can be used for capturing the audio information and a physical or virtual keyboard can be used for capturing the textual information.
  • a stationary object may include a fixed monitoring device
  • a movable object may include an unmanned aircraft, an unmanned vehicle, a hand held device (such as a camera), or a robot.
  • different communication protocols can be used to receive data from the one or more data sources.
  • the data source 101 can be physically connected to the data processor 102 .
  • the data source 101 can communicate with data processor 102 via wireless connection.
  • the traditional approach is to start recording after a need is identified, such as when critical data is identified. This approach may not be ideal since the success of the recording depends on how fast the user can start the recording process and how responsive the system is.
  • the data processing system 100 can support selective backtracking data recording.
  • the data capturing device 110 can be configured to keep on capturing data and push the captured data flow to the data processor 102 continuously.
  • the data processor 102 may choose not to store the received data (i.e. abandoning the received data), unless an instruction, to perform otherwise, is received from a user or is prescribed by the system.
  • the data processor 102 can export the selected data to a storage 103 .
  • the data processor 102 can save the data into a storage medium that is physically connected.
  • the data processor 102 may send the first data segment to a remote server via a network.
  • a user terminal 104 can be used for different purposes, such as controlling the data capturing device, and viewing the received data flow.
  • the data processor 102 can transmit a reduced set of the received data to a user terminal 104 for preview.
  • the received data may contain data in multiple data types, while the reduced set of the received data flow can only contain data in one data type.
  • the user terminal 104 such as a handheld device, can be used for editing data captured by the data capture devices 110 .
  • the user terminal 104 can be used for editing video captured by image capture devices on unmanned aerial vehicles (UAVs).
  • the UAV may have an image capture device that captures video at a high definition and transmits the captured video to the handheld device at a low definition.
  • the handheld device may receive and edit the captured video at the low definition and form a video edit request.
  • the video edit request may be small in size and contain edited video data information for an edited video.
  • the video edit request may then be transmitted to the UAV and/or image capture device.
  • the UAV and/or image capture device may analyze the edited video file, generate a video file corresponding to the edited video data information, and transmit the generated video file to the handheld device or another device.
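  • As a rough illustration of this workflow (not taken from the patent), a video edit request can be modeled as a short list of segment descriptors whose start and end times refer to the high-definition source kept on the UAV. The class and field names in the following Python sketch are hypothetical assumptions:

        from dataclasses import dataclass
        from typing import List

        @dataclass
        class EditSegment:
            # Times refer to the original high-definition recording kept on the UAV.
            start_s: float
            end_s: float

        @dataclass
        class VideoEditRequest:
            # Compact description of an edit, built on the handheld device from the
            # low-definition preview and sent back to the UAV / image capture device.
            source_id: str              # identifies the high-definition source file
            segments: List[EditSegment]

            def total_duration(self) -> float:
                return sum(seg.end_s - seg.start_s for seg in self.segments)

        # Example: keep two short moments out of a long flight recording.
        request = VideoEditRequest(
            source_id="flight_0042.mp4",
            segments=[EditSegment(12.0, 18.5), EditSegment(95.0, 101.0)],
        )
        print(request.total_duration())  # 12.5 seconds of high-definition video to generate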
  • FIG. 2 is an exemplary illustration of selective data recording, in accordance with various embodiments of the present invention.
  • a data processing system 200 can include a data processor 201 that can receive data in a data flow 202 from a data source 210 .
  • the data flow 202 may include different types of information.
  • the data in a data flow 202 may contain audio data that are captured using a microphone; video data that are captured using an image sensor; and/or textual data that are captured using a physical or virtual keyboard.
  • the data flow 202 can be configured based on a time sequence 206 .
  • the time sequence can be defined based on a clock time, and/or a system defined time (such as image frame counts).
  • the data processor 201 can receive a control signal 203 .
  • the control signal 203 may be triggered by a button being pushed, a gesture of a being, a movement of an object, and/or a change of a state, and generated by a user terminal, a motion detector, a state monitor, and/or a data analyzer.
  • the control signal 203 is associated with a timestamp 204, which indicates (or represents) a time 214 in the time sequence 206.
  • the data processor 201 can apply the timestamp 204 on the data flow 202 , to determine a data segment 211 that is associated with a time period 205 in the time sequence 206 .
  • the time period 205 includes the time 214 .
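  • A minimal sketch of this selection step, assuming the data flow is held as (time, sample) pairs and that the time period extends a fixed margin before and after the time indicated by the timestamp (the function name and margins are illustrative, not from the patent):

        def select_segment(data_flow, timestamp, before_s=5.0, after_s=5.0):
            # Return the samples whose times fall in a period that includes
            # the time indicated by the control signal's timestamp.
            start, end = timestamp - before_s, timestamp + after_s
            return [(t, sample) for (t, sample) in data_flow if start <= t <= end]

        # Data flow configured based on a time sequence: (time in seconds, payload)
        flow = [(t, f"frame-{t}") for t in range(0, 20)]
        segment = select_segment(flow, timestamp=9.0, before_s=3.0, after_s=2.0)
        # -> samples for t = 6..11, a time period that includes the first time (t = 9)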
  • the data processing system 200 can support selective backtracking data recording.
  • the data processing system 200 allows for the use of the reduced set of the received data flow, which may only contain data in one data type, for initiating the backtracking data recording.
  • the data processor 201 can export the selected data segment 211 to a storage.
  • the data processor 201 can save the data segment into a storage medium that is physically connected.
  • the data processor 201 may send the first data segment to a remote server via a network.
  • FIG. 3 is an exemplary illustration of using a buffer to support selective data recording, in accordance with various embodiments of the present invention.
  • a data processing system 300 can take advantage of a buffer 302 in a memory 310 for handling the received data 301 .
  • the data processing system 300 can use the buffer 302 for storing the received data 301 before removing it from the memory 310 .
  • the data processing system 300 can drop 303 (a portion of) the data stored in the buffer 302 periodically, e.g. when receiving a frame synchronization signal and the buffer 302 is full.
  • the size of the buffer 302 can be preconfigured.
  • the size of the buffer 302 can be configured dynamically, e.g. based on policies.
  • the data processing system 300 can determine a data segment 311 after receiving a control signal 306 for storing the related data.
  • the data segment 311 can include both past data stored in the buffer 302 and the data arriving as the time progresses.
  • the data processing system 300 can export the data segment 311 to a storage.
  • the data processing system 300 can save the data segment into a storage medium 304 that is physically connected.
  • the data processing system 300 may send the first data segment to a remote server 305 via a network 307 .
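  • A sketch of this buffering behaviour, assuming a fixed-size deque as the buffer: the oldest data is dropped as new data arrives, and when a control signal is received the exported segment combines the past data still in the buffer with the data that keeps arriving for the remainder of the period. The buffer size, period length, and class names are illustrative assumptions:

        from collections import deque

        class SelectiveRecorder:
            def __init__(self, buffer_frames=90):
                # Oldest frames are dropped automatically once the buffer is full.
                self.buffer = deque(maxlen=buffer_frames)
                self.storage = []          # stands in for a storage medium or remote server
                self.frames_to_export = 0  # > 0 while a requested segment is being completed

            def on_frame(self, frame):
                if self.frames_to_export > 0:
                    self.storage.append(frame)   # current portion of the segment
                    self.frames_to_export -= 1
                else:
                    self.buffer.append(frame)    # kept only temporarily

            def on_control_signal(self, extra_frames=60):
                # Past portion: whatever is still buffered; current portion: the next frames.
                self.storage.extend(self.buffer)
                self.buffer.clear()
                self.frames_to_export = extra_frames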
  • FIG. 4 is an exemplary illustration of an asynchronous selective data recording, in accordance with various embodiments of the present invention.
  • a data processing system 400 can take advantage of a buffer 402 in a memory 410 for handling the received data 401 .
  • the data processing system 400 can use the buffer 402 for storing the received data 401 before removing it from the memory 410 .
  • the data processing system 400 can drop 403 (a portion of) the data stored in the buffer 402 periodically, e.g. when receiving a frame synchronization signal and the buffer 402 is full.
  • the size of the buffer 402 can be preconfigured.
  • the size of the buffer 402 can be configured dynamically, e.g. based on policies.
  • the data processing system 400 can create a copy 412 in the memory 410 for the data segment 411 when the data processing system 400 receives a control signal 406 . Then, the data processing system 400 can export the copy of the data segment 411 at a later time (i.e. in an asynchronous fashion).
  • the data processing system 400 can export the data segment 411 to a storage.
  • the data processing system 400 can save the data segment into a storage medium 404 that is physically connected.
  • the data processing system 400 may send the first data segment to a remote server 405 via a network 407 .
  • the data processing system 400 may be configured to export multiple data segments (each of which is associated with a different time period) in one action.
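  • One way to read the asynchronous variant is sketched below: when the control signal arrives, the data segment is copied and handed to a background worker, so the capture path is not blocked by a possibly slow export. The threading arrangement and function names are assumptions for illustration only:

        import queue
        import threading

        export_queue: "queue.Queue[list]" = queue.Queue()

        def write_to_storage(segment):
            # Stands in for saving to a storage medium or sending to a remote server.
            print(f"exported {len(segment)} frames")

        def export_worker():
            # Runs in the background; writes copied segments at its own pace.
            while True:
                segment_copy = export_queue.get()
                if segment_copy is None:        # shutdown sentinel
                    break
                write_to_storage(segment_copy)
                export_queue.task_done()

        def on_control_signal(buffer):
            # Copy first, export later: the buffer can keep being overwritten meanwhile.
            export_queue.put(list(buffer))

        threading.Thread(target=export_worker, daemon=True).start()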
  • FIG. 5 shows a flowchart of selective data recording, in accordance with various embodiments of the present invention.
  • a data processor can receive data in a data flow from one or more data sources, wherein the data flow is configured based on a time sequence.
  • the data processor can receive a control signal, which is associated with a first timestamp, wherein the first timestamp indicates a first time.
  • the data processor can determine a first data segment by applying the first timestamp on the data flow, wherein the first data segment is associated with a time period in the time sequence that includes the first time.
  • FIG. 6 is an exemplary illustration of backtracking data recording, in accordance with various embodiments of the present invention.
  • a data processing system 600 can take advantage of a buffer 606 in a memory 602 for handling the data 611 , which are received from a data source 601 .
  • the received data 611 can be associated with a time sequence 610 .
  • the received data may contain one or more image frames, and the time sequence 610 can correspond to an image frame count.
  • the received data 611 can be stored in a memory 602 , which may be based on a first-in-first-out (FIFO) protocol.
  • other types of memory can be used without limitation.
  • the memory 602 can be based on a ring buffer or a list.
  • the data processing system 600 can remove a portion of the data (e.g. one or more image frames) stored in the memory 602 periodically, e.g. after receiving a synchronization signal 620 (i.e. when new data arrives and the buffer 606 is full). Furthermore, the data processing system 600 can include a switch 603 , which can be responsible for directing the data out of the memory 602 .
  • the data processing system 600 can turn on the switch 603 after receiving a control signal.
  • the data processing system 600 can export the data to a storage 605 .
  • the storage 605 can be based on a storage medium that connects to the memory 602 and/or a remote server that connects via a network.
  • When the switch 603 is turned off, the data processing system 600 can drop 604 the data without storing it.
  • For example, the data processing system 600 receives a control signal at a time, tg. Once the switch 603 is turned on, it can be maintained for a time period, T (i.e. from tg to tc in the time sequence 610).
  • T can be preconfigured or prescribed.
  • the time period, T can be dynamically configured based on one or more policies.
  • the data processing system 600 can use a dynamically determined time, e.g. when the volume of a voice signal exceeds a certain threshold or when certain moving objects are detected in the scene, as the starting point and the ending point for the time period, T, as long as the memory is sufficient.
  • the data processing system 600 can use video analysis technology to automatically identify the time when certain interesting object appears in the image. Then, the data processing system 600 can use this time as the starting point for the time period, T.
  • the data processing system 600 can support backtracking data recording.
  • the buffer 606 in the memory 602 can be used to store a number of image frames with variable sizes. There can be a number of images existing in the buffer 606 , which may be configured to have a maximum image frame number, M.
  • the stored data segment can include a past portion with a duration of Tr, which equals the maximum image frame number, M, divided by the frame rate, fps (i.e. Tr = M/fps). Additionally, the data segment can include a current portion with a duration of Tc, where Tc is the difference between the time period, T, and the duration of the past portion, Tr (i.e. Tc = T - Tr).
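  • With example numbers (assumed here, not given in the patent), the split between the past and current portions works out as follows:

        M = 120        # maximum number of image frames held in the buffer
        fps = 30       # frame rate of the captured video
        T = 10.0       # total recording period triggered by the control signal, in seconds

        Tr = M / fps   # past portion already in the buffer: 4.0 s
        Tc = T - Tr    # current portion still to be captured: 6.0 s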
  • the data processing system 600 can be optimized for different purposes, depending on how the memory 602 is configured.
  • the backtracking data recording can be configured for optimized memory usage or for optimized computation load (i.e. battery usage).
  • FIG. 7 is an exemplary illustration of backtracking recording optimized in memory usage, in accordance with various embodiments of the present invention.
  • a data processing system 700 can take advantage of a memory 708 for handling the data received from a data source.
  • the data processing system 700 can receive various image information from a sensor 701 . Then, the data processing system 700 can use an image signal processor (ISP) 703 for transforming the image information into raw video data.
  • the data processing system 700 can receive various analog voice information from a microphone (MIC) 702 . Then, the data processing system 700 can use an analog/digital (A/D) converter 704 to convert the analog voice information into digital audio data.
  • the data processing system 700 can use different encoders for encoding, or compressing, the received data before storing the encoded data in the memory 708.
  • For example, a video encoder 705 can be used for encoding the raw video data from the ISP 703, and an audio encoder 706 can be used for encoding the audio data from the A/D converter 704.
  • the data processing system 700 can use a multiplexer, e.g. AV Mux 707 , for combining the separate audio data and video data into an audio/video (A/V) data stream.
  • Here, the memory usage is optimized, since the received data are encoded or compressed before being stored in the memory 708 (on the other hand, more computation load may be required).
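  • A structural sketch of this memory-optimized ordering, with encoding and multiplexing performed before buffering; the encoder and multiplexer below are trivial placeholders (zlib compression and a length-prefixed concatenation), not the actual video/audio encoders or AV Mux of FIG. 7:

        from collections import deque
        import zlib

        def encode(raw: bytes) -> bytes:
            # Placeholder for a real video/audio encoder; compression stands in for encoding.
            return zlib.compress(raw)

        def mux(video_packet: bytes, audio_packet: bytes) -> bytes:
            # Placeholder for the multiplexer: combine audio and video into one A/V packet.
            return len(video_packet).to_bytes(4, "big") + video_packet + audio_packet

        buffer = deque(maxlen=300)  # holds compressed A/V packets, so the buffer stays small

        def on_capture(raw_frame: bytes, raw_audio: bytes):
            buffer.append(mux(encode(raw_frame), encode(raw_audio)))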
  • the memory 708 can be based on a first-in-first-out (FIFO) protocol.
  • other types of memory can be used without limitation.
  • the memory 708 can be based on a ring buffer or a list.
  • the data processing system 700 can remove a portion of the data (e.g. one or more image frames) stored in the memory 708 periodically, e.g. after receiving a synchronization signal 720 (i.e. when new data arrives and the buffer is full).
  • the data processing system 700 can include a switch 707 and a switch 709, which are responsible for directing the video data and the audio data out of the memory 708.
  • the data processing system 700 can turn on the switch 709 after receiving a control signal.
  • When the switch 709 is turned on, the data processing system 700 can direct the data to a storage 712. On the other hand, when the switch 709 is turned off, the data processing system 700 can drop 711 the data without storing it.
  • the data processing system 700 receives a control signal at a time, tg. Once the switch 709 is turned on, the data processing system 700 can maintain the switch 709 for a time period, T (i.e. from tg to tc in the time sequence 710 ).
  • a buffer in the memory 708 can be used for storing a number of image frames with variable sizes. There can be a number of images existing in the buffer, which may be configured to have a maximum image frame number, M.
  • the stored data segment can include a past portion with a duration of Tr, which equals the maximum image frame number, M, divided by the frame rate, fps (i.e. Tr = M/fps). Additionally, the data segment can include a current portion with a duration of Tc, where Tc is the difference between the time period, T, and the duration of the past portion, Tr (i.e. Tc = T - Tr).
  • Also, the data processing system 700 can direct the received data to a user terminal for live view 713.
  • FIG. 8 is an exemplary illustration of backtracking recording optimized in computation load, in accordance with various embodiments of the present invention.
  • a data processing system 800 can take advantage of a memory 805 for handling the data received from a data source.
  • the data processing system 800 can receive various image information from a sensor 801 . Then, the data processing system 800 can use an image signal processor (ISP) 803 for transforming the image information into raw video data.
  • the data processing system 800 can receive various analog voice information from a microphone (MIC) 802 . Then, the data processing system 800 can use an analog/digital (A/D) converter 804 to convert the analog voice information into raw digital audio data.
  • Here, both the raw video data and the raw digital audio data may be saved directly into the memory 805 (i.e. more memory is required). On the other hand, the data processing system 800 can perform various image processing operations on the raw video data and raw digital audio data (e.g. playing backwards), since the raw video data and raw digital audio data have not been encoded or compressed.
  • the data processing system 800 can remove a portion of the data (e.g. one or more image frames) stored in the memory 805 periodically, e.g. after receiving a synchronization signal 820 (i.e. when new data arrives and the buffer is full).
  • the data processing system 800 can include a switch 807 and a switch 808 , which are responsible for directing the video data and the audio data out of the memory 805 , respectively.
  • the data processing system 800 can turn on the switches 807 - 808 after receiving a control signal. When the switches 807 - 808 are turned on, the data processing system 800 can direct the data to a storage 814 . On the other hand, when the switches 807 - 808 are turned off, the data processing system 800 can drop 804 the data without storing it.
  • the data processing system 800 receives a control signal at a time, tg. Once the switches 807 - 808 are turned on, the data processing system 800 can maintain the switches 807 - 808 for a time period, T (i.e. from tg to tc in the time sequence 810 ).
  • the data processing system 800 can support backtracking data recording.
  • a buffer in the memory 805 can be used for storing a number of image frames with variable sizes. There can be a number of images existing in the buffer, which may be configured to have a maximum image frame number, M.
  • the stored data segment can include a past portion with a duration of Tr, which equals the maximum image frame number, M, divided by the frame rate, fps (i.e. Tr = M/fps). Additionally, the data segment can include a current portion with a duration of Tc, where Tc is the difference between the time period, T, and the duration of the past portion, Tr (i.e. Tc = T - Tr).
  • the data processing system 800 can use one or more encoders to encode the received data before actually saving the encoded data in the storage.
  • the storage 814 can be based on a storage medium that connects to the memory 805 , and/or a remote server that connects via a network.
  • the data processing system 800 can use a video encoder 805 for encoding the raw video data from the ISP 803 , and use an audio encoder 806 for encoding the audio data from the (A/D) converter 804 . Additionally, the data processing system 800 can use a multiplexer, e.g. AV Mux 813 , to combine the separate audio data and video data into an audio/video (A/V) data stream.
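  • For contrast, a sketch of the computation-optimized ordering: raw frames go straight into the memory (a larger footprint), and encoding and multiplexing are only paid for the frames that are actually exported. Again, the encoder here is a placeholder, not a specific codec API:

        from collections import deque
        import zlib

        raw_buffer = deque(maxlen=120)  # raw frames: larger, but cheap to store and to process

        def on_capture(raw_frame: bytes, raw_audio: bytes):
            raw_buffer.append((raw_frame, raw_audio))  # no per-frame encoding cost

        def export(storage: list):
            # Encoding happens only when a segment is actually kept.
            for raw_frame, raw_audio in raw_buffer:
                packet = zlib.compress(raw_frame) + zlib.compress(raw_audio)
                storage.append(packet)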
  • Also, the data processing system 800 can direct the received data to a user terminal for live view 806.
  • FIG. 9 shows a flowchart of backtracking data recording, in accordance with various embodiments of the present invention.
  • a data processor can store, in a memory, data received from one or more data sources, wherein the received data is associated with a time sequence.
  • the data processor can remove a portion of the data stored in the memory after receiving a synchronization signal.
  • the data processor can forward the removed data to a storage, when a switch is turned on, for a time period in the time sequence.
  • FIG. 10 is an exemplary illustration of data recording with tags, in accordance with various embodiments of the present invention.
  • a data processing system 1000 can receive a data flow 1008 from a data source.
  • the data processing system 1000 can receive various image information from a sensor 1001 . Then, the data processing system 1000 can use an image signal processor (ISP) 1003 for transforming the image information into raw video data.
  • the data processing system 1000 can receive various analog voice information from a microphone (MIC) 1002 . Then, the data processing system 1000 can use an analog/digital (A/D) converter 1004 to convert the analog voice information into digital audio data.
  • the data processing system 1000 can use various encoders for encoding, or compressing, the received digital data.
  • For example, a video encoder 1005 is used for encoding the raw video data from the ISP 1003, and an audio encoder 1006 is used for encoding the audio data from the A/D converter 1004.
  • the data processing system 1000 can use a multiplexer, such as an AV Mux 1007 , to combine the audio data and the video data into an audio/video (A/V) stream for the data flow 1008 .
  • the data processing system 1000 can export (or save) the audio/video (A/V) stream in the storage 1012 .
  • the storage 1012 can be based on a storage medium that connects to the memory, and/or a remote server that connects via a network.
  • the data processing system 1000 can take advantage of one or more tags 1030 for handling the data in a data flow 1008 that are received from a data source.
  • the one or more tags 1030 may be applied by the user at the time of recording.
  • the tags 1030 can be associated with one or more timestamps (e.g. t0 to tn), which can be defined in a time sequence 1010.
  • the time sequence 1010 can be defined using a clock time or based on a system defined time.
  • the data processing system 1000 can assign an image frame count, e.g. a recorded frame number, to each image frame in the received data flow.
  • the recorded frame numbers, which may be defined based on a received frame synchronization signal 1020, can be used as a reference for the system defined time. Also, the timestamp associated with a tag 1030 may correspond to one or more recorded frame numbers.
  • the data processing system 1000 may receive a control signal, e.g. a trigger signal. Alternatively, the data processing system 1000 may generate a control signal in response to a user request.
  • the data processing system 1000 can save the timestamps, t0 to tn, in a record 1011 (e.g. in a memory). Based on the timestamps saved in the record 1011, the data processing system 1000 can select a subset of the data in the received data flow 1008. Then, the data processing system 1000 can use the selected data for different purposes, such as sharing, playing and storing.
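  • A sketch of the tag record, assuming the timestamps are recorded frame numbers: each tag saves a frame count, and a subset of the recorded flow is selected later by looking up the frames around each saved count. The window sizes and names are illustrative assumptions:

        class TagRecord:
            def __init__(self):
                self.timestamps = []          # recorded frame numbers saved at tag time

            def tag(self, frame_number: int):
                self.timestamps.append(frame_number)

            def select(self, frames: dict, before=30, after=60):
                # Pick the subset of recorded frames around each saved timestamp.
                selected = {}
                for ts in self.timestamps:
                    for n in range(ts - before, ts + after + 1):
                        if n in frames:
                            selected[n] = frames[n]
                return selected

        record = TagRecord()
        record.tag(450)     # e.g. the user presses a tag button at frame 450
        record.tag(1200)
        # later: subset = record.select(recorded_frames)  # recorded_frames: frame number -> data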
  • the data processing system 1000 can direct data in the received data flow to a user terminal for live view 1009 .
  • FIG. 11 is an exemplary illustration of exporting from data recording with tags, in accordance with various embodiments of the present invention.
  • the data processing system 1100 can transcode data in the storage 1101 based on the stored timestamps 1111 in the record 1110. For example, the data processing system 1100 can create a data segment, such as a clip 1121 in a clip memory 1120, based on a saved timestamp 1111.
  • the data processing system 1100 can use a decoder 1102 to decode the recorded data in the storage 1101 , before creating the clip 1121 in the clip memory 1120 (e.g. in a memory).
  • the data processing system 1100 can employ a searching step 1105 for creating the clip 1121 , which may include one or more recorded image frames.
  • the timestamp 1111 may represent a time tg in the time sequence 1106 .
  • the data processing system 1100 can determine a time period, T (e.g. from T1 to T2), in the time sequence 1106 and the corresponding recorded frame counts. Then, the data processing system 1100 can determine the clip 1121, which includes one or more image frames recorded before the saved timestamp (i.e. the time period Tr), and/or one or more image frames recorded after the saved timestamp (i.e. the time period Tc).
  • the data processing system 1100 can traverse the different timestamps saved in the record, and select a data segment for each saved timestamp.
  • the data processing system 1100 can use an encoder 1103 to create a data stream, e.g. AV stream 1104 , based on the clips in the clip memory (i.e. the selected data segments).
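  • The export step can be read as the loop sketched below: for each saved timestamp, compute the frame range covering Tr before and Tc after it, gather those frames from the recorded data, and hand them to an encoder to form the output stream. The decode and encode_stream calls are placeholder functions standing in for the decoder 1102 and encoder 1103, not a specific codec API:

        def export_clips(saved_timestamps, recorded_frames, fps=30, Tr=2.0, Tc=3.0):
            # recorded_frames: dict mapping recorded frame number -> encoded frame (bytes)
            clips = []
            for tg in saved_timestamps:                    # tg is a recorded frame number
                start = max(0, tg - int(Tr * fps))         # frames recorded before the tag
                end = tg + int(Tc * fps)                   # frames recorded after the tag
                frames = [decode(recorded_frames[n])       # the searching step over the record
                          for n in range(start, end + 1) if n in recorded_frames]
                clips.append(frames)
            return [encode_stream(clip) for clip in clips]

        def decode(frame: bytes) -> bytes:                 # placeholder decoder
            return frame

        def encode_stream(frames) -> bytes:                # placeholder encoder / AV mux
            return b"".join(frames)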
  • FIG. 12 shows a flowchart of data recording with tags, in accordance with various embodiments of the present invention.
  • a data processor can associate one or more tags with a data flow received from one or more data sources, wherein said one or more tags are associated with one or more timestamps.
  • the data processor can save the one or more timestamps in a record.
  • the data processor can select a subset of the data in the received data flow based on the one or more timestamps saved in the record.
  • the methods and features described herein thus provide a data processing method comprising receiving data in a data flow from one or more data sources, wherein the data flow is configured based on a time sequence; receiving a control signal, which is associated with a first timestamp, wherein the first timestamp indicates a first time; determining, via a data processor, a first data segment by applying the first timestamp on the data flow, wherein the first data segment is associated with a time period in the time sequence that includes the first time.
  • the one or more data sources may be associated with a data capturing device on a stationary object and/or a movable object.
  • the data processing method may further comprise using different communication protocols to receive data from the one or more data sources.
  • the data flow may contain audio data that are captured using a microphone, video data that are captured using an image sensor, and/or textual data that are captured using a physical or virtual keyboard.
  • the time sequence may be defined based on a clock time and/or a system defined time.
  • the data processing method may further comprise storing the received data into a buffer in a memory, and removing a portion of the received data stored in the buffer periodically when new data arrives and the buffer is full.
  • a size of the buffer may be either preconfigured or dynamically configured based on one or more policies.
  • the data processing method may further comprise creating a copy in the memory for the first data segment, before exporting the first data segment.
  • the data processing method may further comprise exporting the first data segment by saving the first data segment into a storage medium that is physically connected to the memory, and/or sending the first data segment to a remote server via a network.
  • the data processing method may further comprise exporting a plurality of data segments in the data flow, wherein each said data segment is associated with a different time period.
  • control signal may be triggered by a button being pushed, a gesture of a being, a movement of an object, and/or a change of a state, and generated by a user terminal, a motion detector, a state monitor, and/or a data analyzer.
  • the data processing method may further comprise transmitting a reduced set of the received data flow to a user terminal for preview and/or initiating recording.
  • the received data flow may contain data in multiple data types, while the reduced set of the received data flow may contain data only in one data type.
  • the data processing method may further comprise configuring the data source to start transmitting the data flow.
  • the systems and features described herein thus provide a data processing apparatus comprising one or more microprocessors and a data processor running on the one or more microprocessors, wherein the data processor operates to receive data in a data flow from one or more data sources, wherein the data flow is configured based on a time sequence; receive a control signal, which is associated with a first timestamp, wherein the first timestamp indicates a first time; determine a first data segment by applying the first timestamp on the data flow, wherein the first data segment is associated with a time period in the time sequence that includes the first time.
  • the one or more data sources may be associated with a data capturing device on a stationary object and/or a movable object.
  • different communication protocols may be used to receive data from the one or more data sources.
  • the data flow may contain audio data that are captured using a microphone, video data that are captured using an image sensor, and/or textual data that are captured using a physical or virtual keyboard.
  • the time sequence may be defined based on a clock time and/or a system defined time.
  • the data processor may operate to store the received data into a buffer in a memory, and remove a portion of the data stored in the buffer periodically when new data arrives and the buffer is full.
  • the size of the buffer may be either preconfigured or dynamically configured based on one or more policies.
  • the data processor may operate to create a copy in the memory for the first data segment, before exporting the first data segment.
  • the data processor may operate to export the first data segment by saving the first data segment into a storage medium that is physically connected to the memory, and/or send the first data segment to a remote server via a network.
  • the data processor may operate to export a plurality of data segments in the data flow, wherein each said data segment is associated with a different time period.
  • control signal may be triggered by a button being pushed, a gesture of a being, a movement of an object, and/or a change of a state, and generated by a user terminal, a motion detector, a state monitor, and/or a data analyzer.
  • the data processor may operate to transmit a reduced set of the received data flow to a user terminal for preview and/or initiate recording.
  • the received data flow may contain data in multiple data types, while the reduced set of the received data flow may contain data only in one data type.
  • the data processor may operate to configure the one or more data sources to start transmitting the data flow.
  • the systems and features described herein also provide a non-transitory computer-readable medium with instructions stored thereon, that when executed by a processor, perform the steps comprising: receiving data in a data flow from one or more data sources, wherein the data flow is configured based on a time sequence; receiving a control signal, which is associated with a first timestamp, wherein the first timestamp indicates a first time; determining, via a processor, a first data segment by applying the first timestamp on the data flow, wherein the first data segment is associated with a time period in the time sequence that includes the first time.
  • the systems and features described herein also provide a data processing system comprising: a data capturing device on a stationary object and/or a movable object; and a data processor, running on one or more microprocessors, wherein the data processor operates to receive data in a data flow from one or more data sources, wherein the data flow is configured based on a time sequence; receive a control signal, which is associated with a first timestamp, wherein the first timestamp indicates a first time; determine a first data segment by applying the first timestamp on the data flow, wherein the first data segment is associated with a time period in the time sequence that includes the first time.
  • the methods and features described herein also provide a data processing method comprising: storing, in a memory, data received from one or more data sources, wherein the received data is associated with a time sequence; removing a portion of the data stored in the memory after receiving a synchronization signal; and forwarding the removed data to a storage, when a switch is turned on, for a time period in the time sequence.
  • the one or more data sources may be associated with one or more data capturing devices on a stationary object and/or a movable object.
  • the received data may contain video data captured using an image sensor, audio data captured using a microphone, and/or textual data captured using a keyboard.
  • the data processing method may further comprise using different communication protocols to receive data from the one or more data sources.
  • the memory may include one of a first-in-first-out (FIFO) protocol memory, a ring buffer, or a list.
  • the data processing method may further comprise using one or more encoders to encode the received data before storing the encoded data in the memory, and using a multiplexer to combine audio data and video data into an audio/video (A/V) data stream.
  • the data processing method may further comprise using one or more encoders to encode the received data before saving the encoded data in the storage, and using a multiplexer to combine audio data and video data into an audio/video (A/V) data stream.
  • the data processing method may further comprise performing one or more image processing operations on the received data before the encoding step.
  • the storage may be based on a storage medium that connects to the memory and/or a remote server that connects via a network.
  • the data processing method may further comprise turning on the switch when receiving a control signal, wherein the control signal is triggered by a button being pushed, a gesture of a being, a movement of an object, and/or a change of a state, and generated by a user terminal, a system detector, a state monitor, and/or a data analyzer.
  • the received data may contain one or more image frames, wherein the time sequence is associated with an image frame count.
  • the data processing method may further comprise providing a buffer in the memory, wherein the buffer is used to store a number of image frames with variable sizes.
  • the data processing method may further comprise creating a data segment, which includes a past portion and a current portion.
  • the data processing method may further comprise directing the received data to a user terminal for live view.
  • the systems and features described herein also provide a data processing apparatus comprising one or more microprocessors; a data processor, running on the one or more microprocessors, wherein the data processor operates to store, in a memory, data received from one or more data sources in a time sequence; remove a portion of the data stored in the memory after receiving a synchronization signal; and forward the removed data to a storage, when a switch is turned on, for a time period in the time sequence.
  • the one or more data sources may be associated with one or more data capturing devices on a stationary object and/or a movable object.
  • the received data may contain video data captured using an image sensor, audio data captured using a microphone, and/or textual data captured using a keyboard.
  • different communication protocols may be used to receive data from the one or more data sources.
  • the memory may be based on a first-in-first-out (FIFO) protocol, or the memory may be a ring buffer or a list.
  • one or more encoders may be used to encode the received data before storing the encoded data in the memory.
  • one or more encoders may be used to encode the received data before saving the encoded data in the storage.
  • a multiplexer may be used to combine audio data and video data into an audio/video (A/V) data stream.
  • the storage may be based on a storage medium that connects to the memory and/or a remote server that connects via a network.
  • the data processor may operate to turn on the switch when receiving a control signal, wherein the control signal is triggered by a button being pushed, a gesture of a being, a movement of an object, and/or a change of a state, and generated by a user terminal, a system detector, a state monitor, and/or a data analyzer.
  • the received data may contain one or more image frames, wherein the time sequence is associated with an image frame count.
  • the data processing apparatus may further comprise a buffer in the memory, wherein the buffer is used to store a number of image frames with variable sizes.
  • the data processor may operate to create a data segment, which includes a past portion and a current portion.
  • the data processor may operate to direct the received data to a user terminal for live view.
  • the systems and features described herein also provide a non-transitory computer-readable medium with instructions stored thereon, that when executed by a processor, perform the steps comprising: storing, in a memory, data received from one or more data sources, wherein the received data is associated with a time sequence; removing a portion of the data stored in the memory after receiving a synchronization signal; and forwarding the removed data to a storage, when a switch is turned on, for a time period in the time sequence.
  • the systems and features described herein also provide a data processing system comprising: a data capturing device on a stationary object and/or a movable object; and a data processor, running on one or more microprocessors, wherein the data processor operates to store, in a memory, data received from one or more data sources in a time sequence; remove a portion of the data stored in the memory after receiving a synchronization signal; and forward the removed data to a storage, when a switch is turned on, for a time period in the time sequence.
  • processors can include, without limitation, one or more general purpose microprocessors (for example, single or multi-core processors), application-specific integrated circuits, application-specific instruction-set processors, graphics processing units, physics processing units, digital signal processing units, coprocessors, network processing units, audio processing units, encryption processing units, and the like.
  • the storage medium can include, but is not limited to, any type of disk including floppy disks, optical discs, DVD, CD-ROMs, microdrive, and magneto-optical disks, ROMs, RAMs, EPROMs, EEPROMs, DRAMs, VRAMs, flash memory devices, magnetic or optical cards, nanosystems (including molecular memory ICs), or any type of media or device suitable for storing instructions and/or data.
  • features of the present invention can be incorporated in software and/or firmware for controlling the hardware of a processing system, and for enabling a processing system to interact with other mechanisms utilizing the results of the present invention.
  • software or firmware may include, but is not limited to, application code, device drivers, operating systems and execution environments/containers.
  • features of the invention may also be implemented in hardware using, for example, application specific integrated circuits (ASICs) and field-programmable gate array (FPGA) devices.
  • the present invention may be conveniently implemented using one or more conventional general purpose or specialized digital computers, computing devices, machines, or microprocessors, including one or more processors, memory, and/or computer readable storage media programmed according to the teachings of the present disclosure.
  • Appropriate software coding can readily be prepared by skilled programmers based on the teachings of the present disclosure, as will be apparent to those skilled in the software art.

Abstract

Systems and methods can support a data processing apparatus. The data processing apparatus can include a data processor that is associated with a data capturing device on a stationary object and/or a movable object. The data processor can receive data in a data flow from one or more data sources, wherein the data flow is configured based on a time sequence. Then, the data processor can receive a control signal, which is associated with a first timestamp, wherein the first timestamp indicates a first time. Furthermore, the data processor can determine a first data segment by applying the first timestamp on the data flow, wherein the first data segment is associated with a time period in the time sequence that includes the first time.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation application of International Application No. PCT/CN2015/076015, entitled “SYSTEM AND METHOD FOR SUPPORTING SELECTIVE BACKTRACKING DATA RECORDING,” filed on Apr. 7, 2015, which claims priority to International Application No. PCT/CN2014/093785, entitled “VIDEO PROCESSING METHOD, VIDEO PROCESSING DEVICE AND PLAYING DEVICE,” filed on Dec. 14, 2014, International Application No. PCT/CN2014/093786, entitled “VIDEO PROCESSING METHOD, VIDEO PROCESSING DEVICE AND DISPLAY DEVICE,” filed on Dec. 14, 2014, and International Application No. PCT/CN2015/075458, entitled “METHODS AND SYSTEMS OF VIDEO PROCESSING,” filed on Mar. 31, 2015. The contents of all four International Applications are hereby incorporated by reference in their entirety.
  • BACKGROUND OF THE INVENTION
  • The disclosed embodiments relate generally to data processing and more particularly, but not exclusively, to multimedia data recording.
  • Data recording, such as video recording, is traditionally an action in one direction along the timeline, where the past has already been settled and the future is yet to come. Typically, the data can be stored as soon as the recording starts. However, critical data, such as memorable moments, may still be missed since the user may not be able to start the recording in time.
  • On the other hand, there are physical limits that prevent a user from recording everything without qualification. For example, an unmanned aerial vehicle (UAV) may be used for a wide range of applications including surveillance, search and rescue operations, exploration, and other fields. As imaging technology develops, the amount of image data that needs to be recorded can grow very fast. Thus, excessive bandwidth may be consumed, which prohibits the downloading of the captured video from a UAV.
  • This is the general area that embodiments of the invention are intended to address.
  • BRIEF SUMMARY OF THE INVENTION
  • Described herein are systems and methods that can support a data processing apparatus. The data processing apparatus can include a data processor that is associated with a data capturing device on a stationary object and/or a movable object. The data processor can receive data in a data flow from one or more data sources, wherein the data flow is configured based on a time sequence. Then, the data processor can receive a control signal, which is associated with a first timestamp, wherein the first timestamp indicates a first time. Furthermore, the data processor can determine a first data segment by applying the first timestamp on the data flow, wherein the first data segment is associated with a time period in the time sequence that includes the first time.
  • Also described herein are systems and methods that can support a data processing apparatus. The data processing apparatus can include a data processor that is associated with a data capturing device on a stationary object and/or a movable object. The data processor can store, in a memory, data received from one or more data sources in a time sequence. Then, the data processor can remove a portion of the data stored in the memory after receiving a synchronization signal. Furthermore, the data processor can forward the removed data to a storage, when a switch is turned on, for a time period in the time sequence.
  • Also described herein are systems and methods that can support a data processing apparatus. The data processing apparatus can include a data processor that is associated with a data capturing device on a stationary object and/or a movable object. The data processor can associate one or more tags with a data flow received from one or more data sources, wherein said one or more tags are associated with one or more timestamps. Then, data processor can save the one or more timestamps in a record. Furthermore, the data processor can select a subset of the data in the received data flow based on the one or more timestamps saved in the record.
  • Other objects and features of the present invention will become apparent by a review of the specification, claims, and appended figures.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings briefly described herein.
  • FIG. 1 is an exemplary illustration of a data recording system, in accordance with various embodiments of the present invention.
  • FIG. 2 is an exemplary illustration of selective data recording, in accordance with various embodiments of the present invention.
  • FIG. 3 is an exemplary illustration of using a buffer to support selective data recording, in accordance with various embodiments of the present invention.
  • FIG. 4 is an exemplary illustration of an asynchronous selective data recording, in accordance with various embodiments of the present invention.
  • FIG. 5 shows a flowchart of selective data recording, in accordance with various embodiments of the present invention.
  • FIG. 6 is an exemplary illustration of backtracking data recording, in accordance with various embodiments of the present invention.
  • FIG. 7 is an exemplary illustration of backtracking recording optimized in memory usage, in accordance with various embodiments of the present invention.
  • FIG. 8 is an exemplary illustration of backtracking recording optimized in computation load, in accordance with various embodiments of the present invention.
  • FIG. 9 shows a flowchart of backtracking data recording, in accordance with various embodiments of the present invention.
  • FIG. 10 is an exemplary illustration of data recording with tags, in accordance with various embodiments of the present invention.
  • FIG. 11 is an exemplary illustration of exporting from data recording with tags, in accordance with various embodiments of the present invention.
  • FIG. 12 shows a flowchart of data recording with tags, in accordance with various embodiments of the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The invention is illustrated, by way of example and not by way of limitation, in the figures of the accompanying drawings in which like references indicate similar elements. It should be noted that references to “an” or “one” or “some” embodiment(s) in this disclosure are not necessarily to the same embodiment, and such references mean at least one.
  • The following description uses multimedia data recording as an example of data recording. It will be apparent to those skilled in the art that other types of data recording can be used without limitation.
  • An Exemplary Data Recording System
  • FIG. 1 is an exemplary illustration of a data recording system, in accordance with various embodiments of the present invention. As shown in FIG. 1, a data processing system 100 can include a data processor 102 that receives various types of data from a data source 101.
  • In accordance with various embodiments of the present invention, the data source 101 can be associated with a data capturing device 110. For example, an image sensor can be used for capturing the image or video information, while a microphone (MIC) can be used for capturing the audio information and a physical or virtual keyboard can be used for capturing the textual information.
  • Additionally, the data capturing device 110 can be placed on a stationary object and/or a movable object. For example, a stationary object may include a fixed monitoring device, while a movable object may include an unmanned aircraft, an unmanned vehicle, a hand held device (such as a camera), or a robot.
  • In accordance with various embodiments of the present invention, different communication protocols can be used to receive data from the one or more data sources. For example, the data source 101 can be physically connected to the data processor 102. Alternatively, the data source 101 can communicate with data processor 102 via wireless connection.
  • Different approaches can be used for controlling the data recording. The traditional approach is to start recording after a need is identified, such as when critical data is identified. This approach may not be ideal since the success of the recording depends on how fast the user can start the recording process and how responsive the system is.
  • Alternatively, the data processing system 100 can support selective backtracking data recording. For example, once started, the data capturing device 110 can be configured to keep capturing data and push the captured data flow to the data processor 102 continuously. In the meantime, the data processor 102 may choose not to store the received data (i.e. abandoning the received data), unless an instruction to do otherwise is received from a user or is prescribed by the system.
  • Then, the data processor 102 can export the selected data to a storage 103. For example, the data processor 102 can save the data into a storage medium that is physically connected. Alternatively, the data processor 102 may send the selected data to a remote server via a network.
  • Thus, the selective backtracking data recording approach is beneficial, since it does not depend on how fast the user can start the recording process or how responsive the system is.
  • Additionally, a user terminal 104 can be used for different purposes, such as controlling the data capturing device, and viewing the received data flow.
  • In accordance with various embodiments of the present invention, the data processor 102 can transmit a reduced set of the received data to a user terminal 104 for preview. For example, while the received data may contain data in multiple data types, the reduced set of the received data flow can only contain data in one data type.
  • Additionally, the user terminal 104, such as a handheld device, can be used for editing data captured by the data capture devices 110. For example, the user terminal 104 can be used for editing video captured by image capture devices on unmanned aerial vehicles (UAVs).
  • In some embodiments, the UAV may have an image capture device that captures video at a high definition and transmits the captured video to the handheld device at a low definition. The handheld device may receive and edit the captured video at the low definition and form a video edit request. The video edit request may be small in size and contain edited video data information for an edited video. The video edit request may then be transmitted to the UAV and/or image capture device. The UAV and/or image capture device may analyze the edited video file, generate a video file corresponding to the edited video data information, and transmit the generated video file to the handheld device or another device.
  • Therefore, the pressure on bandwidth for transmission and computation for video editing may be alleviated, since not all the high definition video is transmitted.
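  • As an illustration of this workflow, the following Python sketch shows one way a low-definition proxy could be produced for editing and the resulting edit decisions re-applied to the full-definition source. The names (EditRequest, make_proxy, cut_ranges) and the range-based edit format are assumptions made for illustration only, not part of the disclosed system.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class EditRequest:
    """Compact description of an edit, expressed as (start, end) time ranges in seconds."""
    ranges: List[Tuple[float, float]]

def make_proxy(hd_frames: List[bytes], step: int = 4) -> List[bytes]:
    # Downsample the high-definition capture for transmission to the handheld device.
    return hd_frames[::step]

def cut_ranges(hd_frames: List[bytes], fps: float, request: EditRequest) -> List[bytes]:
    # Re-apply the low-definition edit decisions to the full-definition source.
    out: List[bytes] = []
    for start, end in request.ranges:
        out.extend(hd_frames[int(start * fps):int(end * fps)])
    return out
```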
  • Selective Data Recording
  • FIG. 2 is an exemplary illustration of selective data recording, in accordance with various embodiments of the present invention. As shown in FIG. 2, a data processing system 200 can include a data processor 201 that can receive data in a data flow 202 from a data source 210.
  • In accordance with various embodiments of the present invention, the data flow 202 may include different types of information. For example, the data in a data flow 202 may contain audio data that are captured using a microphone; video data that are captured using an image sensor; and/or textual data that are captured using a physical or virtual keyboard.
  • Furthermore, the data flow 202 can be configured based on a time sequence 206. For example, the time sequence can be defined based on a clock time, and/or a system defined time (such as image frame counts).
  • As shown in FIG. 2, the data processor 201 can receive a control signal 203. The control signal 203 may be triggered by a button being pushed, a gesture of a being, a movement of an object, and/or a change of a state, and generated by a user terminal, a motion detector, a state monitor, and/or a data analyzer.
  • Furthermore, the control signal 203 is associated with a timestamp 204, which indicates (or represents) a time 214 in the time sequence 206. When the data processor 201 receives the control signal 203, the data processor 201 can apply the timestamp 204 on the data flow 202, to determine a data segment 211 that is associated with a time period 205 in the time sequence 206.
  • As shown in FIG. 2, the time period 205, from a starting time 207 to an ending time 208, includes the time 214. This indicates that the data segment 211 includes data received from both the time period before the time 214 (i.e. backtracking) and the time period after the time 214. Thus, the data processing system 200 can support selective backtracking data recording.
  • In accordance with various embodiments of the present invention, the data processing system 200 allows for the use of the reduced set of the received data flow, which may only contain data in one data type, for initiating the backtracking data recording.
  • Then, the data processor 201 can export the selected data segment 211 to a storage. For example, the data processor 201 can save the data segment into a storage medium that is physically connected. Alternatively, the data processor 201 may send the data segment 211 to a remote server via a network.
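  • A minimal sketch of this segment selection, assuming each frame carries its position in the time sequence and assuming example pre/post window lengths; select_segment and its defaults are illustrative, not the disclosed implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    t: float          # position in the time sequence, in seconds
    payload: bytes

def select_segment(flow: List[Frame], t_signal: float,
                   pre: float = 3.0, post: float = 7.0) -> List[Frame]:
    # Keep the frames in [t_signal - pre, t_signal + post]: data received both
    # before (backtracking) and after the control signal.
    start, end = t_signal - pre, t_signal + post
    return [f for f in flow if start <= f.t <= end]
```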
  • FIG. 3 is an exemplary illustration of using a buffer to support selective data recording, in accordance with various embodiments of the present invention. As shown in FIG. 3, a data processing system 300 can take advantage of a buffer 302 in a memory 310 for handling the received data 301.
  • The data processing system 300 can use the buffer 302 for storing the received data 301 before removing it from the memory 310. For example, the data processing system 300 can drop 303 (a portion of) the data stored in the buffer 302 periodically, e.g. when receiving a frame synchronization signal and the buffer 302 is full. Furthermore, the size of the buffer 302 can be preconfigured. Alternatively, the size of the buffer 302 can be configured dynamically, e.g. based on policies.
  • Furthermore, the data processing system 300 can determine a data segment 311 after receiving a control signal 306 for storing the related data. The data segment 311 can include both past data stored in the buffer 302 and the data arriving as the time progresses.
  • Then, the data processing system 300 can export the data segment 311 to a storage. For example, the data processing system 300 can save the data segment into a storage medium 304 that is physically connected. Alternatively, the data processing system 300 may send the data segment 311 to a remote server 305 via a network 307.
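  • The buffering behaviour described above can be sketched, under the assumption of a fixed maximum frame count, with a bounded deque whose oldest entries are discarded as new frames arrive; FrameBuffer is an illustrative name, not the disclosed implementation.

```python
from collections import deque
from typing import Deque, List

class FrameBuffer:
    """Bounded buffer: the oldest frame is discarded when a new one arrives and the buffer is full."""

    def __init__(self, max_frames: int) -> None:
        self._buf: Deque[bytes] = deque(maxlen=max_frames)

    def push(self, frame: bytes) -> None:
        self._buf.append(frame)          # drops the oldest frame automatically when full

    def snapshot(self) -> List[bytes]:
        # Past portion of a data segment: everything currently buffered.
        return list(self._buf)
```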
  • FIG. 4 is an exemplary illustration of an asynchronous selective data recording, in accordance with various embodiments of the present invention. As shown in FIG. 4, a data processing system 400 can take advantage of a buffer 402 in a memory 410 for handling the received data 401.
  • The data processing system 400 can use the buffer 402 for storing the received data 401 before removing it from the memory 410. For example, the data processing system 400 can drop 403 (a portion of) the data stored in the buffer 402 periodically, e.g. when receiving a frame synchronization signal and the buffer 402 is full. Furthermore, the size of the buffer 402 can be preconfigured. Alternatively, the size of the buffer 402 can be configured dynamically, e.g. based on policies.
  • As shown in FIG. 4, the data processing system 400 can create a copy 412 in the memory 410 for the data segment 411 when the data processing system 400 receives a control signal 406. Then, the data processing system 400 can export the copy of the data segment 411 at a later time (i.e. in an asynchronous fashion).
  • In accordance with various embodiments of the present invention, the data processing system 400 can export the data segment 411 to a storage. For example, the data processing system 400 can save the data segment into a storage medium 404 that is physically connected. Alternatively, the data processing system 400 may send the data segment 411 to a remote server 405 via a network 407.
  • Additionally, in order to save the overhead relating to establishing network connection, the data processing system 400 may be configured to export multiple data segments (each of which is associated with a different time period) in one action.
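  • One possible sketch of this asynchronous, batched export: segments are copied into a pending list when the control signal arrives and flushed to storage or a remote server in a single action later. upload_batch is a hypothetical placeholder for whatever transport is actually used.

```python
from typing import Callable, List

pending_segments: List[List[bytes]] = []

def queue_segment(segment: List[bytes]) -> None:
    pending_segments.append(list(segment))     # copy so the buffer can keep rotating

def flush(upload_batch: Callable[[List[List[bytes]]], None]) -> None:
    if pending_segments:
        upload_batch(pending_segments)         # one connection for many segments
        pending_segments.clear()
```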
  • FIG. 5 shows a flowchart of selective data recording, in accordance with various embodiments of the present invention. As shown in FIG. 5, at step 501, a data processor can receive data in a data flow from one or more data sources, wherein the data flow is configured based on a time sequence. Then, at step 502, the data processor can receive a control signal, which is associated with a first timestamp, wherein the first timestamp indicates a first time. Furthermore, at step 503, the data processor can determine a first data segment by applying the first timestamp on the data flow, wherein the first data segment is associated with a time period in the time sequence that includes the first time.
  • Backtracking Data Recording
  • FIG. 6 is an exemplary illustration of backtracking data recording, in accordance with various embodiments of the present invention. As shown in FIG. 6, a data processing system 600 can take advantage of a buffer 606 in a memory 602 for handling the data 611, which are received from a data source 601. Additionally, the received data 611 can be associated with a time sequence 610. For example, the received data may contain one or more image frames, and the time sequence 610 can correspond to an image frame count.
  • In accordance with various embodiments of the present invention, the received data 611 can be stored in a memory 602, which may be based on a first-in-first-out (FIFO) protocol. Alternatively, other types of memory can be used without limitation. For example, the memory 602 can be based on a ring buffer or a list.
  • As shown in FIG. 6, the data processing system 600 can remove a portion of the data (e.g. one or more image frames) stored in the memory 602 periodically, e.g. after receiving a synchronization signal 620 (i.e. when new data arrives and the buffer 606 is full). Furthermore, the data processing system 600 can include a switch 603, which can be responsible for directing the data out of the memory 602.
  • The data processing system 600 can turn on the switch 603 after receiving a control signal. When the switch 603 is turned on, the data processing system 600 can export the data to a storage 605. The storage 605 can be based on a storage medium that connects to the memory 602 and/or a remote server that connects via a network. On the other hand, when the switch 603 is turned off, the data processing system 600 can drop 604 the data without storing it.
  • As shown in FIG. 6, the data processing system 600 receives a control signal at a time, tg. Once the switch 603 is turned on, the switch 603 can be maintained for a time period, T (i.e. from tg to tc in the time sequence 610). Here, the time period, T, can be preconfigured or prescribed. Alternatively, the time period, T, can be dynamically configured based on one or more policies.
  • In accordance with various embodiments of the present invention, the data processing system 600 can use a dynamically determined time, e.g. when the volume of a voice signal exceeds a certain threshold or when certain moving objects are detected in the scene, as the starting point and the ending point for the time period, T, as long as the memory is sufficient.
  • For example, the data processing system 600 can use video analysis technology to automatically identify the time when certain interesting object appears in the image. Then, the data processing system 600 can use this time as the starting point for the time period, T.
  • In accordance with various embodiments of the present invention, the data processing system 600 can support backtracking data recording. As shown in FIG. 6, the buffer 606 in the memory 602 can be used to store a number of image frames with variable sizes. There can be a number of images existing in the buffer 606, which may be configured to have a maximum image frame number, M.
  • Thus, for the time period, T, the stored data segment can include a past portion with a duration of Tr, which equals the maximum image frame number, M, divided by the image speed, fps (i.e. M/fps). Additionally, the data segment can include a current portion with a duration of Tc, where Tc is the difference between the time period, T, and the duration of the past portion, Tr (i.e. T−Tr).
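  • A worked example of these durations, with assumed numbers: a buffer that holds at most M = 90 frames at fps = 30 frames per second, and a configured time period T = 10 s. Any real system would substitute its own values.

```python
# Assumed example values for illustration only.
M, fps, T = 90, 30, 10.0
Tr = M / fps        # past portion already sitting in the buffer: 90 / 30 = 3.0 s
Tc = T - Tr         # current portion recorded after the control signal: 7.0 s
```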
  • In accordance with various embodiments of the present invention, the data processing system 600 can be optimized for different purposes, depending on how the memory 602 is configured. For example, the backtracking data recording can be configured for optimized memory usage or for optimized computation load (i.e. battery usage).
  • FIG. 7 is an exemplary illustration of backtracking recording optimized in memory usage, in accordance with various embodiments of the present invention. As shown in FIG. 7, a data processing system 700 can take advantage of a memory 708 for handling the data received from a data source.
  • For example, the data processing system 700 can receive various image information from a sensor 701. Then, the data processing system 700 can use an image signal processor (ISP) 703 for transforming the image information into raw video data.
  • Additionally, the data processing system 700 can receive various analog voice information from a microphone (MIC) 702. Then, the data processing system 700 can use an analog/digital (A/D) converter 704 to convert the analog voice information into digital audio data.
  • Furthermore, the data processing system 700 can use different encoders for encoding, or compressing, the received data before storing the encoded data in the memory 708. For example, a video encoder 705 can be used for encoding the raw video data from the ISP 703, and an audio encoder 706 can be used for encoding the audio data from the (A/D) converter 704. Additionally, the data processing system 700 can use a multiplexer, e.g. AV Mux 707, for combining the separate audio data and video data into an audio/video (A/V) data stream.
  • Thus, memory usage is optimized, since the received data are encoded or compressed before being stored in the memory 708 (at the cost of a higher computation load).
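  • The memory-optimized ordering can be sketched as follows, assuming simple stand-in encoder and multiplexer stubs (encode_video, encode_audio, mux) in place of real codecs; only the compressed packets occupy the buffer. This is a sketch under those assumptions, not the disclosed implementation.

```python
from collections import deque
from typing import Deque

def encode_video(raw_frame: bytes) -> bytes:
    return raw_frame[: len(raw_frame) // 10]    # stand-in for a real video codec

def encode_audio(raw_audio: bytes) -> bytes:
    return raw_audio[: len(raw_audio) // 10]    # stand-in for a real audio codec

def mux(video_pkt: bytes, audio_pkt: bytes) -> bytes:
    return video_pkt + audio_pkt                # stand-in for an A/V multiplexer

buffer: Deque[bytes] = deque(maxlen=256)        # FIFO holding only compressed packets

def ingest(raw_frame: bytes, raw_audio: bytes) -> None:
    buffer.append(mux(encode_video(raw_frame), encode_audio(raw_audio)))
```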
  • In accordance with various embodiments of the present invention, the memory 708 can be based on a first-in-first-out (FIFO) protocol. Alternatively, other types of memory can be used without limitation. For example, the memory 708 can be based on a ring buffer or a list.
  • Furthermore, the data processing system 700 can remove a portion of the data (e.g. one or more image frames) stored in the memory 708 periodically, e.g. after receiving a synchronization signal 720 (i.e. when new data arrives and the buffer is full).
  • As shown in FIG. 7, the data processing system 700 can include a switch 707 and a switch 709, which are responsible for directing the video data and the audio data out of the memory 708. The data processing system 700 can turn on the switch 709 after receiving a control signal. When the switch 709 is turned on, the data processing system 700 can direct the data to a storage 712. On the other hand, when the switch 709 is turned off, the data processing system 700 can drop 711 the data without storing it.
  • As shown in FIG. 7, the data processing system 700 receives a control signal at a time, tg. Once the switch 709 is turned on, the data processing system 700 can maintain the switch 709 for a time period, T (i.e. from tg to tc in the time sequence 710).
  • Thus, the data processing system 700 can support backtracking data recording. A buffer in the memory 708 can be used for storing a number of image frames with variable sizes. There can be a number of images existing in the buffer, which may be configured to have a maximum image frame number, M.
  • Thus, for the time period, T, the stored data segment can include a past portion with a duration of Tr, which equals the maximum image frame number, M, divided by the image speed, fps (i.e. M/fps). Additionally, the data segment can include a current portion with a duration of Tc, where Tc is the difference between the time period, T, and the duration of the past portion, Tr (i.e. T−Tr).
  • Additionally, the data processing system 700 can direct the received data to a user terminal for live view 713.
  • FIG. 8 is an exemplary illustration of backtracking recording optimized in computation load, in accordance with various embodiments of the present invention. As shown in FIG. 8, a data processing system 800 can take advantage of a memory 805 for handling the data received from a data source.
  • For example, the data processing system 800 can receive various image information from a sensor 801. Then, the data processing system 800 can use an image signal processor (ISP) 803 for transforming the image information into raw video data.
  • Additionally, the data processing system 800 can receive various analog voice information from a microphone (MIC) 802. Then, the data processing system 800 can use an analog/digital (A/D) converter 804 to convert the analog voice information into raw digital audio data.
  • In order to optimize the computation load, both the raw video data and the raw digital audio data may be saved directly into the memory 805 (which requires more memory). Here, the data processing system 800 can perform various image processing operations on the raw video data and raw digital audio data (e.g. playing backwards), since the raw video data and raw digital audio data have not been encoded or compressed.
  • Furthermore, the data processing system 800 can remove a portion of the data (e.g. one or more image frames) stored in the memory 805 periodically, e.g. after receiving a synchronization signal 820 (i.e. when new data arrives and the buffer is full).
  • As shown in FIG. 8, the data processing system 800 can include a switch 807 and a switch 808, which are responsible for directing the video data and the audio data out of the memory 805, respectively.
  • The data processing system 800 can turn on the switches 807-808 after receiving a control signal. When the switches 807-808 are turned on, the data processing system 800 can direct the data to a storage 814. On the other hand, when the switches 807-808 are turned off, the data processing system 800 can drop 804 the data without storing it.
  • As shown in FIG. 8, the data processing system 800 receives a control signal at a time, tg. Once the switches 807-808 are turned on, the data processing system 800 can maintain the switches 807-808 for a time period, T (i.e. from tg to tc in the time sequence 810).
  • The data processing system 800 can support backtracking data recording. A buffer in the memory 805 can be used for storing a number of image frames with variable sizes. There can be a number of images existing in the buffer, which may be configured to have a maximum image frame number, M.
  • Thus, for the time period, T, the stored data segment can include a past portion with a duration of Tr, which equals the maximum image frame number, M, divided by the image speed, fps (i.e. M/fps). Additionally, the data segment can include a current portion with a duration of Tc, where Tc is the difference between the time period, T, and the duration of the past portion, Tr (i.e. T−Tr).
  • The data processing system 800 can use one or more encoders to encode the received data before actually saving the encoded data in the storage. Here, the storage 814 can be based on a storage medium that connects to the memory 805, and/or a remote server that connects via a network.
  • As shown in FIG. 8, the data processing system 800 can use a video encoder 805 for encoding the raw video data from the ISP 803, and use an audio encoder 806 for encoding the audio data from the (A/D) converter 804. Additionally, the data processing system 800 can use a multiplexer, e.g. AV Mux 813, to combine the separate audio data and video data into an audio/video (A/V) data stream.
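  • By contrast, the computation-optimized ordering can be sketched as follows: raw frames and raw audio are buffered directly, and the codec/muxer stage (here an encode_and_mux placeholder) runs only on the portion that is actually exported. The names and buffer size are assumptions for illustration.

```python
from collections import deque
from typing import Callable, Deque, Tuple

raw_buffer: Deque[Tuple[bytes, bytes]] = deque(maxlen=256)   # raw data: larger, but cheap to store

def ingest_raw(raw_frame: bytes, raw_audio: bytes) -> None:
    raw_buffer.append((raw_frame, raw_audio))                # no per-frame encoding cost

def export(encode_and_mux: Callable[[bytes, bytes], bytes],
           store: Callable[[bytes], None]) -> None:
    # The encoding cost is paid only for the data that is actually kept.
    for raw_frame, raw_audio in list(raw_buffer):
        store(encode_and_mux(raw_frame, raw_audio))
```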
  • Additionally, the data processing system 800 can direct the received data to a user terminal for live view 806.
  • FIG. 9 shows a flowchart of backtracking data recording, in accordance with various embodiments of the present invention. As shown in FIG. 9, at step 901, a data processor can store, in a memory, data received from one or more data sources, wherein the received data is associate with a time sequence. Then, at step 902, the data processor can remove a portion of the data stored in the memory after receiving a synchronization signal. Furthermore, at step 903, the data processor can forward the removed data to a storage, when a switch is turned on, for a time period in the time sequence.
  • Data Recording with Tags
  • FIG. 10 is an exemplary illustration of data recording with tags, in accordance with various embodiments of the present invention. As shown in FIG. 10, a data processing system 1000 can receive a data flow 1008 from a data source.
  • For example, the data processing system 1000 can receive various image information from a sensor 1001. Then, the data processing system 1000 can use an image signal processor (ISP) 1003 for transforming the image information into raw video data.
  • Additionally, the data processing system 1000 can receive various analog voice information from a microphone (MIC) 1002. Then, the data processing system 1000 can use an analog/digital (A/D) converter 1004 to convert the analog voice information into digital audio data.
  • Furthermore, the data processing system 1000 can use various encoders for encoding, or compressing, the received digital data. For example, a video encoder 1005 is used for encoding the raw video data from the ISP 1003, and an audio encoder 1006 is used for encoding the audio data from the (A/D) converter 1004. Then, the data processing system 1000 can use a multiplexer, such as an AV Mux 1007, to combine the audio data and the video data into an audio/video (A/V) stream for the data flow 1008.
  • Additionally, the data processing system 1000 can export (or save) the audio/video (A/V) stream in the storage 1012. For example, the storage 1012 can be based on a storage medium that connects to the memory, and/or a remote server that connects via a network.
  • In accordance with various embodiments of the present invention, the data processing system 1000 can take advantage of one or more tags 1030 for handling the data in a data flow 1008 that are received from a data source. For example, the one or more tags 1030 may be applied by the user at the time of recording.
  • As shown in FIG. 10, the tags 1030 can be associated with one or more timestamps (e.g. t0 to tn), which can be defined in a time sequence 1010. The time sequence 1010 can be defined using a clock time or based on a system defined time.
  • The data processing system 1000 can assign an image frame count, e.g. a recorded frame number, to each image frame in the received data flow. The recorded frame numbers, which may be defined based on a received frame synchronization signal 1020, can be used as a reference for the system defined time. Also, the timestamp associated with a tag 1030 may correspond to one or more recorded frame numbers.
  • When a user applies a tag 1030 on the received data flow 1008, the data processing system 1000 may receive a control signal, e.g. a trigger signal. Alternatively, the data processing system 1000 may generate a control signal in response to a user request.
  • Furthermore, the data processing system 1000 can save the timestamps, t0 to tn, in a record 1011 (e.g. in a memory). Based on the timestamps saved in the record 1011, the data processing system 1000 can select a subset of the data in the received data flow 1008. Then, the data processing system 1000 can use the selected data for different purposes, such as sharing, playing and storing.
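  • A minimal sketch of such a tag record, assuming the timestamps are saved as recorded frame numbers and assuming illustrative pre/post windows around each tag; TagRecord and its methods are hypothetical names, not the disclosed implementation.

```python
from typing import List

class TagRecord:
    def __init__(self) -> None:
        self.timestamps: List[int] = []        # tags saved as recorded frame numbers

    def on_control_signal(self, frame_count: int) -> None:
        self.timestamps.append(frame_count)    # e.g. triggered by a user applying a tag

    def select(self, flow: List[bytes], fps: float,
               pre_s: float, post_s: float) -> List[List[bytes]]:
        # Gather the frames recorded around every saved timestamp.
        clips = []
        for t in self.timestamps:
            lo, hi = t - int(pre_s * fps), t + int(post_s * fps)
            clips.append(flow[max(lo, 0):hi])
        return clips
```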
  • Additionally, the data processing system 1000 can direct data in the received data flow to a user terminal for live view 1009.
  • FIG. 11 is an exemplary illustration of exporting from data recording with tags, in accordance with various embodiments of the present invention. As shown in FIG. 11, the data processing system 1100 can transcode data in the storage 1101 based on the stored timestamps 1111 in the record 1110.
  • In accordance with various embodiments of the present invention, the data processing system 1100 can create a data segment, such as a clip 1121 in a clip memory 1120, based on a saved timestamp 1111. For example, the data processing system 1100 can use a decoder 1102 to decode the recorded data in the storage 1101, before creating the clip 1121 in the clip memory 1120 (e.g. in a memory).
  • As shown in FIG. 11, the data processing system 1100 can employ a searching step 1105 for creating the clip 1121, which may include one or more recorded image frames. For example, the timestamp 1111 may represent a time tg in the time sequence 1106. The data processing system 1100 can determine a time period, T (e.g. from T1 to T2), in the time sequence 1106 and the corresponding recorded frame counts. Then, the data processing system 1100 can determine the clip 1121, which includes one or more image frames recorded before the saved timestamp (i.e. the time period Tr), and/or one or more image frames recorded after the saved timestamp (i.e. the time period Tc).
  • Furthermore, the data processing system 1100 can traverse the different timestamps saved in the record, and select a data segment for each saved timestamp.
  • Then, the data processing system 1100 can use an encoder 1103 to create a data stream, e.g. AV stream 1104, based on the clips in the clip memory (i.e. the selected data segments).
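  • The export path of FIG. 11 can be sketched as follows, assuming the saved timestamp is a recorded frame number and using placeholder decode/encode callables in place of the decoder 1102 and encoder 1103; make_clip is an illustrative name.

```python
from typing import Callable, List

def make_clip(recorded: List[bytes], saved_frame: int, fps: float,
              Tr: float, Tc: float,
              decode: Callable[[bytes], bytes] = lambda p: p,
              encode: Callable[[bytes], bytes] = lambda f: f) -> List[bytes]:
    frames = [decode(p) for p in recorded]          # decoder stand-in
    t1 = max(saved_frame - int(Tr * fps), 0)        # frames recorded before the tag
    t2 = saved_frame + int(Tc * fps)                # frames recorded after the tag
    return [encode(f) for f in frames[t1:t2]]       # encoder stand-in
```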
  • FIG. 12 shows a flowchart of data recording with tags, in accordance with various embodiments of the present invention. As shown in FIG. 12, at step 1201, a data processor can associate one or more tags with a data flow received from one or more data sources, wherein said one or more tags are associated with one or more timestamps. Then, at step 1202, the data processor can save the one or more timestamps in a record. Furthermore, at step 1203, the data processor can select a subset of the data in the received data flow based on the one or more timestamps saved in the record.
  • In some embodiments, the methods and features described herein thus provide a data processing method comprising receiving data in a data flow from one or more data sources, wherein the data flow is configured based on a time sequence; receiving a control signal, which is associated with a first timestamp, wherein the first timestamp indicates a first time; determining, via a data processor, a first data segment by applying the first timestamp on the data flow, wherein the first data segment is associated with a time period in the time sequence that includes the first time.
  • In some embodiments, the one or more data sources may be associated with a data capturing device on a stationary object and/or a movable object.
  • In some embodiments, the data processing method may further comprise using different communication protocols to receive data from the one or more data sources.
  • In some embodiments, the data flow may contain audio data that are captured using a microphone, video data that are captured using an image sensor, and/or textual data that are captured using a physical or virtual keyboard.
  • In some embodiments, the time sequence may be defined based on a clock time and/or a system defined time.
  • In some embodiments, the data processing method may further comprise storing the received data into a buffer in a memory, and removing a portion of the received data stored in the buffer periodically when new data arrives and the buffer is full.
  • In some embodiments, a size of the buffer may be either preconfigured or dynamically configured based on one or more policies.
  • In some embodiments, the data processing method may further comprise creating a copy in the memory for the first data segment, before exporting the first data segment.
  • In some embodiments, the data processing method may further comprise exporting the first data segment by saving the first data segment into a storage medium that is physically connected to the memory, and/or sending the first data segment to a remote server via a network.
  • In some embodiments, the data processing method may further comprise exporting a plurality of data segments in the data flow, wherein each said data segment is associated with a different time period.
  • In some embodiments, the control signal may be triggered by a button being pushed, a gesture of a being, a movement of an object, and/or a change of a state, and generated by a user terminal, a motion detector, a state monitor, and/or a data analyzer.
  • In some embodiments, the data processing method may further comprise transmitting a reduced set of the received data flow to a user terminal for preview and/or initiating recording.
  • In some embodiments, the received data flow may contain data in multiple data types, while the reduced set of the received data flow may contain data only in one data type.
  • In some embodiments, the data processing method may further comprise configuring the data source to start transmitting the data flow.
  • In some embodiments, the systems and features described herein thus provide a data processing apparatus comprising one or more microprocessors and a data processor running on the one or more microprocessors, wherein the data processor operates to receive data in a data flow from one or more data sources, wherein the data flow is configured based on a time sequence; receive a control signal, which is associated with a first timestamp, wherein the first timestamp indicates a first time; determine a first data segment by applying the first timestamp on the data flow, wherein the first data segment is associated with a time period in the time sequence that includes the first time.
  • In some embodiments, the one or more data sources may be associated with a data capturing device on a stationary object and/or a movable object.
  • In some embodiments, different communication protocols may be used to receive data from the one or more data sources.
  • In some embodiments, the data flow may contain audio data that are captured using a microphone, video data that are captured using an image sensor, and/or textual data that are captured using a physical or virtual keyboard.
  • In some embodiments, the time sequence may be defined based on a clock time and/or a system defined time.
  • In some embodiments, the data processor may operate to store the received data into a buffer in a memory, and remove a portion of the data stored in the buffer periodically when new data arrives and the buffer is full.
  • In some embodiments, the size of the buffer may be either preconfigured or dynamically configured based on one or more policies.
  • In some embodiments, the data processor may operate to create a copy in the memory for the first data segment, before exporting the first data segment.
  • In some embodiments, the data processor may operate to export the first data segment by saving the first data segment into a storage medium that is physically connected to the memory, and/or send the first data segment to a remote server via a network.
  • In some embodiments, the data processor may operate to export a plurality of data segments in the data flow, wherein each said data segment is associated with a different time period.
  • In some embodiments, the control signal may be triggered by a button being pushed, a gesture of a being, a movement of an object, and/or a change of a state, and generated by a user terminal, a motion detector, a state monitor, and/or a data analyzer.
  • In some embodiments, the data processor may operate to transmit a reduced set of the received data flow to a user terminal for preview and/or initiate recording.
  • In some embodiments, the received data flow may contain data in multiple data types, while the reduced set of the received data flow may contain data only in one data type.
  • In some embodiments, the data processor may operate to configure the one or more data sources to start transmitting the data flow.
  • In some embodiments, the systems and features described herein also provide a non-transitory computer-readable medium with instructions stored thereon, that when executed by a processor, perform the steps comprising: receiving data in a data flow from one or more data sources, wherein the data flow is configured based on a time sequence; receiving a control signal, which is associated with a first timestamp, wherein the first timestamp indicates a first time; determining, via a processor, a first data segment by applying the first timestamp on the data flow, wherein the first data segment is associated with a time period in the time sequence that includes the first time.
  • In some embodiments, the systems and features described herein also provide a data processing system comprising: a data capturing device on a stationary object and/or a movable object; and a data processor, running on one or more microprocessors, wherein the data processor operates to receive data in a data flow from one or more data sources, wherein the data flow is configured based on a time sequence; receive a control signal, which is associated with a first timestamp, wherein the first timestamp indicates a first time; determine a first data segment by applying the first timestamp on the data flow, wherein the first data segment is associated with a time period in the time sequence that includes the first time.
  • In some embodiments, the methods and features described herein also provide a data processing method comprising: storing, in a memory, data received from one or more data sources, wherein the received data is associated with a time sequence; removing a portion of the data stored in the memory after receiving a synchronization signal; and forwarding the removed data to a storage, when a switch is turned on, for a time period in the time sequence.
  • In some embodiments, the one or more data sources may be associated with one or more data capturing devices on a stationary object and/or a movable object.
  • In some embodiments, the received data may contain video data captured using an image sensor, audio data captured using a microphone, and/or textual data captured using a keyboard.
  • In some embodiments, the data processing method may further comprise using different communication protocols to receive data from the one or more data sources.
  • In some embodiments, the memory may include one of a first-in-first-out (FIFO) protocol memory, a ring buffer, or a list.
  • In some embodiments, the data processing method may further comprise using one or more encoders to encode the received data before storing the encoded data in the memory, and using a multiplexer to combine audio data and video data into an audio/video (A/V) data stream.
  • In some embodiments, the data processing method may further comprise using one or more encoders to encode the received data before saving the encoded data in the storage, and using a multiplexer to combine audio data and video data into an audio/video (A/V) data stream.
  • In some embodiments, the data processing method may further comprise performing one or more image processing operations on the received data before the encoding step.
  • In some embodiments, the storage may be based on a storage medium that connects to the memory and a remote server that connects via a network.
  • In some embodiments, the data processing method may further comprise turning on the switch when receiving a control signal, wherein the control signal is triggered by a button being pushed, a gesture of a being, a movement of an object, and/or a change of a state, and generated by a user terminal, a system detector, a state monitor, and/or a data analyzer.
  • In some embodiments, the received data may contain one or more image frames, wherein the time sequence is associated with an image frame count.
  • In some embodiments, the data processing method may further comprise providing a buffer in the memory, wherein the buffer is used to store a number of image frames with variable sizes.
  • In some embodiments, the data processing method may further comprise creating a data segment, which includes a past portion and a current portion.
  • In some embodiments, the data processing method may further comprise directing the received data to a user terminal for live view.
  • In some embodiments, the systems and features described herein also provide a data processing apparatus comprising one or more microprocessors; a data processor, running on the one or more microprocessors, wherein the data processor operates to store, in a memory, data received from one or more data sources in a time sequence; remove a portion of the data stored in the memory after receiving a synchronization signal; and forward the removed data to a storage, when a switch is turned on, for a time period in the time sequence.
  • In some embodiments, the one or more data sources may be associated with one or more data capturing devices on a stationary object and/or a movable object.
  • In some embodiments, the received data may contain video data captured using an image sensor, audio data captured using a microphone, and/or textual data captured using a keyboard.
  • In some embodiments, different communication protocols may be used to receive data from the one or more data sources.
  • In some embodiments, the memory may be based on a first-in-first-out (FIFO) protocol, or the memory is a ring buffer or a list.
  • In some embodiments, one or more encoders may be used to encode the received data before storing the encoded data in the memory.
  • In some embodiments, one or more encoders may be used to encode the received data before saving the encoded data in the storage.
  • In some embodiments, a multiplexer may be used to combine audio data and video data into an audio/video (A/V) data stream.
  • In some embodiments, the storage may be based on a storage medium that connects to the memory and a remote server that connects via a network.
  • In some embodiments, the data processor may operate to turn on the switch when receiving a control signal, wherein the control signal is triggered by a button being pushed, a gesture of a being, a movement of an object, and/or a change of a state, and generated by a user terminal, a system detector, a state monitor, and/or a data analyzer.
  • In some embodiments, the received data may contain one or more image frames, wherein the time sequence is associated with an image frame count.
  • In some embodiments, the data processing apparatus may further comprise a buffer in the memory, wherein the buffer is used to store a number of image frames with variable sizes.
  • In some embodiments, the data processor may operate to create a data segment, which includes a past portion and a current portion.
  • In some embodiments, the data processor may operate to direct the received data to a user terminal for live view.
  • In some embodiments, the systems and features described herein also provide a non-transitory computer-readable medium with instructions stored thereon, that when executed by a processor, perform the steps comprising: storing, in a memory, data received from one or more data sources, wherein the received data is associated with a time sequence; removing a portion of the data stored in the memory after receiving a synchronization signal; and forwarding the removed data to a storage, when a switch is turned on, for a time period in the time sequence.
  • In some embodiments, the systems and features described herein also provide a data processing system comprising: a data capturing device on a stationary object and/or a movable object; and a data processor, running on one or more microprocessors, wherein the data processor operates to store, in a memory, data received from one or more data sources in a time sequence; remove a portion of the data stored in the memory after receiving a synchronization signal; and forward the removed data to a storage, when a switch is turned on, for a time period in the time sequence.
  • Many features of the present invention can be performed in, using, or with the assistance of hardware, software, firmware, or combinations thereof. Consequently, features of the present invention may be implemented using a processing system (e.g., including one or more processors). Exemplary processors can include, without limitation, one or more general purpose microprocessors (for example, single or multi-core processors), application-specific integrated circuits, application-specific instruction-set processors, graphics processing units, physics processing units, digital signal processing units, coprocessors, network processing units, audio processing units, encryption processing units, and the like.
  • Features of the present invention can be implemented in, using, or with the assistance of a computer program product which is a storage medium (media) or computer readable medium (media) having instructions stored thereon/in which can be used to program a processing system to perform any of the features presented herein. The storage medium can include, but is not limited to, any type of disk including floppy disks, optical discs, DVD, CD-ROMs, microdrive, and magneto-optical disks, ROMs, RAMs, EPROMs, EEPROMs, DRAMs, VRAMs, flash memory devices, magnetic or optical cards, nanosystems (including molecular memory ICs), or any type of media or device suitable for storing instructions and/or data.
  • Stored on any one of the machine readable medium (media), features of the present invention can be incorporated in software and/or firmware for controlling the hardware of a processing system, and for enabling a processing system to interact with other mechanisms utilizing the results of the present invention. Such software or firmware may include, but is not limited to, application code, device drivers, operating systems and execution environments/containers.
  • Features of the invention may also be implemented in hardware using, for example, hardware components such as application specific integrated circuits (ASICs) and field-programmable gate array (FPGA) devices. Implementation of the hardware state machine so as to perform the functions described herein will be apparent to persons skilled in the relevant art.
  • Additionally, the present invention may be conveniently implemented using one or more conventional general purpose or specialized digital computers, computing devices, machines, or microprocessors, including one or more processors, memory, and/or computer readable storage media programmed according to the teachings of the present disclosure. Appropriate software coding can readily be prepared by skilled programmers based on the teachings of the present disclosure, as will be apparent to those skilled in the software art.
  • While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the invention.
  • The present invention has been described above with the aid of functional building blocks illustrating the performance of specified functions and relationships thereof. The boundaries of these functional building blocks have often been arbitrarily defined herein for the convenience of the description. Alternate boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Any such alternate boundaries are thus within the scope and spirit of the invention.
  • The foregoing description of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. The breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments. Many modifications and variations will be apparent to the practitioner skilled in the art. The modifications and variations include any relevant combination of the disclosed features. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, thereby enabling others skilled in the art to understand the invention for various embodiments and with various modifications that are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalence.

Claims (21)

1-30. (canceled)
31. A method for backtracking data recording, comprising:
receiving a data flow associated with a time sequence from one or more data sources;
storing the data flow in a first storage medium;
receiving, at a time point, a control signal to transfer data from the first storage medium to a second storage medium;
determining a time period in the time sequence according to the time point;
selecting a subset of data in the data flow; and
transferring the subset of data from the first storage medium to the second storage medium;
wherein the time period includes a past portion and a current portion, and the past portion and the current portion are adjacent in the time sequence relative to the time point.
32. The method of claim 31, wherein the past portion is determined based on at least one of a characteristic of the data flow or a characteristic of the first storage medium and the current portion is a difference between the time period and the past portion.
33. The method of claim 31, wherein the past portion equals a maximum image frame number associated with the first storage medium divided by an image speed.
34. The method of claim 31, wherein the control signal is triggered by a button being pushed, a gesture of a user, a movement of an object, or a change of a state.
35. The method of claim 31, wherein the data flow comprises at least one of:
video data captured using an image sensor,
audio data captured using a microphone, or
textual data captured using a keyboard.
36. The method of claim 31, wherein the first storage medium is configured to store a maximum number of image frames, and wherein the time period is based on a maximum number of image frames associated with the first storage medium.
37. The method of claim 31, wherein the received data flow contains one or more image frames, and the time sequence is associated with an image frame count.
38. The method of claim 31, wherein the first storage medium is configured to store the data flow in a first-in, first-out buffer.
39. The method of claim 31, wherein the subset of data includes:
at least one image frame recorded before receipt of the control signal, and
at least one image frame recorded after receipt of the control signal.
40. The method of claim 31, wherein the one or more data sources are configured to capture information for at least one of an unmanned aircraft, an unmanned vehicle, a handheld device, or a robot.
41. A system for backtracking data recording, comprising:
at least one non-transitory computer-readable medium configured to store instructions; and
at least one processor configured to execute the instructions to perform operations comprising:
receiving a data flow associated with a time sequence from one or more data sources;
storing the data flow in a first storage medium;
receiving, at a time point, a control signal to transfer data from the first storage medium to a second storage medium;
determining a time period in the time sequence according to the time point;
selecting a subset of data in the data flow; and
transferring the subset of data from the first storage medium to the second storage medium;
wherein the time period includes a past portion and a current portion, and the past portion and the current portion are adjacent in the time sequence relative to the time point.
42. The system of claim 41, wherein the past portion is determined based on at least one of a characteristic of the data flow or a characteristic of the first storage medium, and the current portion is a difference between the time period and the past portion.
43. The system of claim 41, wherein the past portion equals a maximum image frame number associated with the first storage medium divided by an image speed.
44. The system of claim 41, wherein the control signal is triggered by a button being pushed, a gesture of a user, a movement of an object, or a change of a state.
45. The system of claim 41, wherein the data flow comprises at least one of:
video data captured using an image sensor,
audio data captured using a microphone, or
textual data captured using a keyboard.
46. The system of claim 41, wherein the first storage medium is configured to store a maximum number of image frames, and wherein the time period is based on the maximum number of image frames associated with the first storage medium.
47. The system of claim 41, wherein the received data flow contains one or more image frames, and the time sequence is associated with an image frame count.
48. The system of claim 41, wherein the first storage medium is configured to store the data flow in a first-in, first-out buffer.
49. The system of claim 41, wherein the subset of data includes:
at least one image frame recorded before receipt of the control signal, and
at least one image frame recorded after receipt of the control signal.
50. The system of claim 41, wherein the one or more data sources are configured to capture information for at least one of an unmanned aircraft, an unmanned vehicle, a handheld device, or a robot.
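
Note (for illustration only; not part of the claims): the recording flow recited in claims 31-50 can be sketched in Python as follows. The class and method names, the buffer capacity, the frame rate, the length of the current portion, and the use of an in-memory list as the second storage medium are all hypothetical choices made for this example, not features required by the claims.

    from collections import deque
    from dataclasses import dataclass
    from typing import Deque, List, Optional

    @dataclass
    class Frame:
        index: int        # position in the time sequence (image frame count)
        timestamp: float  # capture time, in seconds
        payload: bytes    # encoded image data

    class BacktrackingRecorder:
        # Hypothetical sketch only: a first-in, first-out buffer stands in for the
        # first storage medium, and a plain list stands in for the second storage
        # medium; all identifiers and parameters are illustrative.

        def __init__(self, max_frames: int, frame_rate: float, current_seconds: float):
            self.max_frames = max_frames            # capacity of the first storage medium
            self.frame_rate = frame_rate            # image speed (frames per second)
            self.current_seconds = current_seconds  # length of the "current portion"
            self.buffer: Deque[Frame] = deque(maxlen=max_frames)  # FIFO first storage medium
            self.transferred: List[Frame] = []      # stand-in for the second storage medium
            self.record_until: Optional[float] = None  # end of the current portion, if active

        def on_frame(self, frame: Frame) -> None:
            # Store the incoming data flow in the first storage medium.
            self.buffer.append(frame)
            # While a control signal is active, also transfer the current portion
            # (frames recorded after receipt of the control signal).
            if self.record_until is not None and frame.timestamp <= self.record_until:
                self.transferred.append(frame)

        def on_control_signal(self, time_point: float) -> None:
            # Past portion, following the formula of claims 33 and 43: the maximum
            # image frame number of the first storage medium divided by the image speed.
            past_seconds = self.max_frames / self.frame_rate
            start = time_point - past_seconds
            # Transfer the subset recorded before receipt of the control signal.
            self.transferred.extend(
                f for f in self.buffer if start <= f.timestamp <= time_point
            )
            # The current portion is adjacent to the past portion in the time sequence.
            self.record_until = time_point + self.current_seconds

For instance, with a hypothetical first storage medium holding at most 300 frames of a 30 frame-per-second data flow, the past portion under claims 33 and 43 would be 300 / 30 = 10 seconds, so a control signal received at t = 25 s would select data recorded from roughly t = 15 s up to the time point, plus the chosen current portion after it, for transfer to the second storage medium.
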
US16/939,653 2014-12-14 2020-07-27 System and method for supporting selective backtracking data recording Abandoned US20200358979A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/939,653 US20200358979A1 (en) 2014-12-14 2020-07-27 System and method for supporting selective backtracking data recording

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
PCT/CN2014/093785 WO2016095071A1 (en) 2014-12-14 2014-12-14 Video processing method, video processing device and playing device
PCT/CN2014/093786 WO2016095072A1 (en) 2014-12-14 2014-12-14 Video processing method, video processing device and display device
PCT/CN2015/076015 WO2016095367A1 (en) 2014-12-14 2015-04-07 System and method for supporting selective backtracking data recording
US15/349,958 US9973728B2 (en) 2014-12-14 2016-11-11 System and method for supporting selective backtracking data recording
US15/942,862 US10284808B2 (en) 2014-12-14 2018-04-02 System and method for supporting selective backtracking data recording
US16/270,122 US10771734B2 (en) 2014-12-14 2019-02-07 System and method for supporting selective backtracking data recording
US16/939,653 US20200358979A1 (en) 2014-12-14 2020-07-27 System and method for supporting selective backtracking data recording

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/270,122 Continuation US10771734B2 (en) 2014-12-14 2019-02-07 System and method for supporting selective backtracking data recording

Publications (1)

Publication Number Publication Date
US20200358979A1 true US20200358979A1 (en) 2020-11-12

Family

ID=56125736

Family Applications (6)

Application Number Title Priority Date Filing Date
US15/349,958 Active US9973728B2 (en) 2014-12-14 2016-11-11 System and method for supporting selective backtracking data recording
US15/598,170 Expired - Fee Related US10567700B2 (en) 2014-12-14 2017-05-17 Methods and systems of video processing
US15/942,862 Active US10284808B2 (en) 2014-12-14 2018-04-02 System and method for supporting selective backtracking data recording
US16/270,122 Active US10771734B2 (en) 2014-12-14 2019-02-07 System and method for supporting selective backtracking data recording
US16/789,651 Active US11095847B2 (en) 2014-12-14 2020-02-13 Methods and systems of video processing
US16/939,653 Abandoned US20200358979A1 (en) 2014-12-14 2020-07-27 System and method for supporting selective backtracking data recording

Country Status (6)

Country Link
US (6) US9973728B2 (en)
EP (3) EP3167604B1 (en)
CN (4) CN107005624B (en)
DK (1) DK3123644T3 (en)
ES (1) ES2877224T3 (en)
WO (2) WO2016095361A1 (en)

Families Citing this family (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013023063A1 (en) 2011-08-09 2013-02-14 Path 36 Llc Digital media editing
CN107005624B (en) 2014-12-14 2021-10-01 深圳市大疆创新科技有限公司 Method, system, terminal, device, processor and storage medium for processing video
SG11201707024XA (en) * 2015-03-02 2017-09-28 Israel Aerospace Ind Ltd Remote detecting and tracking of objects
US10503264B1 (en) * 2015-06-16 2019-12-10 Snap Inc. Radial gesture navigation
US10109319B2 (en) 2016-01-08 2018-10-23 Gopro, Inc. Digital media editing
US9812175B2 (en) 2016-02-04 2017-11-07 Gopro, Inc. Systems and methods for annotating a video
JP2017151894A (en) * 2016-02-26 2017-08-31 ソニーモバイルコミュニケーションズ株式会社 Information processing device, information processing method, and program
US10530731B1 (en) 2016-03-28 2020-01-07 Snap Inc. Systems and methods for chat with audio and video elements
KR102581438B1 (en) * 2017-01-12 2023-09-21 삼성전자주식회사 Wireless display subsystem and system-on-chip
WO2018221068A1 (en) * 2017-05-30 2018-12-06 ソニー株式会社 Information processing device, information processing method and information processing program
US10939158B2 (en) 2017-06-23 2021-03-02 Samsung Electronics Co., Ltd. Electronic apparatus, display apparatus and control method thereof
CN107633228A (en) * 2017-09-20 2018-01-26 北京奇虎科技有限公司 Video data handling procedure and device, computing device
WO2019100204A1 (en) * 2017-11-21 2019-05-31 深圳市大疆创新科技有限公司 Video processing method, device, aerial vehicle, system, and storage medium
CN109076263B (en) * 2017-12-29 2021-06-22 深圳市大疆创新科技有限公司 Video data processing method, device, system and storage medium
CN110058887B (en) * 2018-01-16 2022-02-18 腾讯科技(深圳)有限公司 Video processing method, video processing device, computer-readable storage medium and computer equipment
CN110622517A (en) * 2018-01-19 2019-12-27 深圳市大疆创新科技有限公司 Video processing method and device
CN110121105B (en) * 2018-02-06 2022-04-29 阿里巴巴(中国)有限公司 Clip video generation method and device
CN111194433A (en) * 2018-04-04 2020-05-22 深圳市大疆创新科技有限公司 Method and system for composition and image capture
CN108564543A (en) * 2018-04-11 2018-09-21 长春理工大学 A kind of underwater picture color compensation method based on electromagnetic theory
CN108810657B (en) * 2018-06-15 2020-11-06 网宿科技股份有限公司 Method and system for setting video cover
CN108881928A (en) * 2018-06-29 2018-11-23 百度在线网络技术(北京)有限公司 Method and apparatus for release information, the method and apparatus for handling information
CN108924647A (en) * 2018-07-27 2018-11-30 深圳众思科技有限公司 Video editing method, video editing apparatus, terminal
CN110771150B (en) * 2018-09-29 2022-04-05 深圳市大疆创新科技有限公司 Video processing method, video processing device, shooting system and computer readable storage medium
CN109275028B (en) * 2018-09-30 2021-02-26 北京微播视界科技有限公司 Video acquisition method, device, terminal and medium
CN109348276B (en) * 2018-11-08 2019-12-17 北京微播视界科技有限公司 video picture adjusting method and device, computer equipment and storage medium
WO2020131037A1 (en) * 2018-12-18 2020-06-25 Rovi Guides, Inc. Systems and methods for automated tracking on a handheld device using a remote camera
CN109714531B (en) * 2018-12-26 2021-06-01 深圳市道通智能航空技术股份有限公司 Image processing method and device and unmanned aerial vehicle
CN109768845B (en) * 2018-12-28 2021-03-09 北京诺亦腾科技有限公司 Data processing method, device and storage medium
US10992960B2 (en) 2019-02-06 2021-04-27 Jared Michael Cohn Accelerated video exportation to multiple destinations
CN109819338B (en) * 2019-02-22 2021-09-14 影石创新科技股份有限公司 Automatic video editing method and device and portable terminal
CN110139149B (en) * 2019-06-21 2020-11-24 上海摩象网络科技有限公司 Video optimization method and device, and electronic equipment
CN110234029B (en) * 2019-07-31 2021-12-17 上海商汤临港智能科技有限公司 Playing processing method, device, equipment and storage medium of multi-sensor data
WO2021046324A1 (en) * 2019-09-06 2021-03-11 Google Llc Event based recording
CN110677734B (en) * 2019-09-30 2023-03-10 北京达佳互联信息技术有限公司 Video synthesis method and device, electronic equipment and storage medium
CN112800805A (en) * 2019-10-28 2021-05-14 上海哔哩哔哩科技有限公司 Video editing method, system, computer device and computer storage medium
CN111050066A (en) * 2019-11-01 2020-04-21 深圳市道通智能航空技术有限公司 Zooming method and device, aircraft, flight system and storage medium
CN111311477A (en) * 2020-02-28 2020-06-19 深圳看到科技有限公司 Image editing method and device and corresponding storage medium
CN111510644B (en) * 2020-04-24 2022-06-07 Oppo广东移动通信有限公司 Video processing method and device, mobile terminal and storage medium
CN112997506A (en) * 2020-05-28 2021-06-18 深圳市大疆创新科技有限公司 Video file editing method, device, system and computer readable storage medium
CN111669516A (en) * 2020-06-19 2020-09-15 江苏镭博智能科技有限公司 Method for realizing transverse splicing of eight paths of video data and real-time synchronization of IMU (inertial measurement Unit) data based on FPGA (field programmable Gate array)
CN111757013B (en) * 2020-07-23 2022-04-29 北京字节跳动网络技术有限公司 Video processing method, device, equipment and storage medium
CN111736790B (en) * 2020-07-31 2020-12-18 开立生物医疗科技(武汉)有限公司 Multi-screen display method, device and system and host equipment
CN114125551B (en) * 2020-08-31 2023-11-17 抖音视界有限公司 Video generation method, device, electronic equipment and computer readable medium
US11297260B1 (en) * 2020-11-20 2022-04-05 Donald Siu Techniques for capturing video in landscape mode by a handheld device
US11955144B2 (en) * 2020-12-29 2024-04-09 Snap Inc. Video creation and editing and associated user interface
TWI819275B (en) * 2021-02-17 2023-10-21 瑞昱半導體股份有限公司 Circuit and method for controlling audio adapter
CN113468319B (en) * 2021-07-21 2022-01-14 深圳市亿莱顿科技有限公司 Internet-based multi-application-scene conference interaction system and method

Family Cites Families (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0702832B1 (en) * 1993-06-10 1998-03-04 Lightworks Editing Systems Ltd Video editing systems
US20030093790A1 (en) * 2000-03-28 2003-05-15 Logan James D. Audio and video program recording, editing and playback systems using metadata
JPH11176137A (en) 1997-12-15 1999-07-02 Matsushita Electric Ind Co Ltd Optical disk medium and its recording method and device
KR100723282B1 (en) * 1999-05-07 2007-05-30 비티에스 홀딩 인터내셔널 비.브이. Gathering and editing information with a camera
US6870547B1 (en) * 1999-12-16 2005-03-22 Eastman Kodak Company Method and apparatus for rendering a low-resolution thumbnail image suitable for a low resolution display having a reference back to an original digital negative and an edit list of operations
US7068668B2 (en) * 2000-01-07 2006-06-27 Feuer Donald S Method and apparatus for interfacing a public switched telephone network and an internet protocol network for multi-media communication
JP3663362B2 (en) 2001-03-30 2005-06-22 インターナショナル・ビジネス・マシーンズ・コーポレーション Index generator
US8965175B2 (en) * 2001-04-09 2015-02-24 Monitoring Technology Corporation Data recording and playback system and method
US20030018647A1 (en) * 2001-06-29 2003-01-23 Jan Bialkowski System and method for data compression using a hybrid coding scheme
US7110026B2 (en) * 2001-07-03 2006-09-19 Logitech Europe S.A. Image tagging for post processing
JP2003256432A (en) 2002-03-06 2003-09-12 Telecommunication Advancement Organization Of Japan Image material information description method, remote retrieval system, remote retrieval method, edit device, remote retrieval terminal, remote edit system, remote edit method, edit device, remote edit terminal, and image material information storage device, and method
US7447369B2 (en) * 2003-03-07 2008-11-04 Ricoh Co., Ltd. Communication of compressed digital images
AU2004227949B9 (en) 2003-04-03 2010-07-22 Commvault Systems, Inc. System and method for dynamically performing storage operations in a computer network
JP4189653B2 (en) 2003-04-14 2008-12-03 ソニー株式会社 Image recording / reproducing method and image recording / reproducing apparatus
US20070019072A1 (en) * 2003-07-28 2007-01-25 The Boeing Company Apparatus for processing digital images
US20050033758A1 (en) * 2003-08-08 2005-02-10 Baxter Brent A. Media indexer
JP3923932B2 (en) 2003-09-26 2007-06-06 株式会社東芝 Video summarization apparatus, video summarization method and program
JP4396567B2 (en) 2005-04-15 2010-01-13 ソニー株式会社 Material recording apparatus and material recording method
US7769819B2 (en) 2005-04-20 2010-08-03 Videoegg, Inc. Video editing with timeline representations
JP2008537439A (en) 2005-04-22 2008-09-11 トムソン ライセンシング Method, apparatus, and system for creating editing operation list of media content recorded in advance
CN101395918B (en) * 2006-01-13 2012-02-29 雅虎公司 Method and system for creating and applying dynamic media specification creator and applicator
EP1929407A4 (en) * 2006-01-13 2009-09-23 Yahoo Inc Method and system for online remixing of digital multimedia
WO2007120694A1 (en) * 2006-04-10 2007-10-25 Yahoo! Inc. User interface for editing media assets
JP2008136466A (en) 2006-11-29 2008-06-19 Youko Furuhata Easily weeding tool
US8020100B2 (en) 2006-12-22 2011-09-13 Apple Inc. Fast creation of video segments
JP4960121B2 (en) 2007-03-12 2012-06-27 パナソニック株式会社 Content shooting device
WO2008136466A1 (en) 2007-05-01 2008-11-13 Dep Co., Ltd. Dynamic image editing device
KR101341504B1 (en) * 2007-07-12 2013-12-16 엘지전자 주식회사 Portable terminal and method for creating multi-media contents in the portable terminal
JP4424389B2 (en) 2007-08-24 2010-03-03 ソニー株式会社 Movie creation device, movie creation method, and program
US20090259944A1 (en) 2008-04-10 2009-10-15 Industrial Technology Research Institute Methods and systems for generating a media program
CN101312535A (en) * 2008-07-07 2008-11-26 新科电子集团有限公司 Terminal apparatus of CMMB system and working method thereof
US8237856B2 (en) * 2008-11-07 2012-08-07 Looxcie, Inc. Timeshifting video recording camera
CN101753941A (en) 2008-12-19 2010-06-23 康佳集团股份有限公司 Method for realizing markup information in imaging device and imaging device
EP2416581A4 (en) * 2009-03-30 2014-03-12 Panasonic Corp Recording medium, reproducing device, and integrated circuit
JP5444878B2 (en) * 2009-06-25 2014-03-19 横河電機株式会社 Waveform analyzer
CN101740082A (en) 2009-11-30 2010-06-16 孟智平 Method and system for clipping video based on browser
US20110206351A1 (en) * 2010-02-25 2011-08-25 Tal Givoli Video processing system and a method for editing a video asset
CN102215429B (en) * 2010-04-01 2013-04-17 安凯(广州)微电子技术有限公司 Recording method for mobile TV
PL211198B1 (en) * 2010-04-21 2012-04-30 Univ Przyrodniczy We Wrocławiu Geocomposite element, preferably for supporting the plant vegetation
JP2012049840A (en) 2010-08-27 2012-03-08 Nippon Telegr & Teleph Corp <Ntt> Video editing device, video editing method, video editing program, and computer readable storage medium
DK2617186T3 (en) 2010-09-13 2022-02-07 Contour Ip Holding Llc Portable digital video camera for controlling and viewing distant images
CN102685430A (en) * 2011-03-18 2012-09-19 新奥特(北京)视频技术有限公司 Method for achieving video real-time output through 1394 connection line
US8754984B2 (en) * 2011-05-02 2014-06-17 Futurewei Technologies, Inc. System and method for video caption re-overlaying for video adaptation and retargeting
CN103108190A (en) * 2011-11-15 2013-05-15 上海宝康电子控制工程有限公司 System and method of achieving image synchronization between analog camera and high definition cameras
CN102542466A (en) * 2011-12-02 2012-07-04 万军 Video tracing method and device
CN103313122B (en) 2012-03-09 2018-02-27 联想(北京)有限公司 A kind of data processing method and electronic equipment
TW201347521A (en) * 2012-05-11 2013-11-16 Hon Hai Prec Ind Co Ltd System and method for adjusting timestamps
JP5988798B2 (en) 2012-09-18 2016-09-07 キヤノン株式会社 Image display apparatus, control method therefor, program, and storage medium
CN103856806B (en) * 2012-11-28 2018-05-01 腾讯科技(北京)有限公司 Video stream switching method, apparatus and system
US20140153900A1 (en) * 2012-12-05 2014-06-05 Samsung Electronics Co., Ltd. Video processing apparatus and method
CN103002330B (en) * 2012-12-31 2014-07-09 合一网络技术(北京)有限公司 Method for editing multiple videos shot at same time and place through network, client side, server and system
KR101978216B1 (en) 2013-01-04 2019-05-14 엘지전자 주식회사 Mobile terminal and method for controlling thereof
CN103096184B (en) 2013-01-18 2016-04-13 深圳市同洲电子股份有限公司 A kind of video editing method and device
US9141866B2 (en) * 2013-01-30 2015-09-22 International Business Machines Corporation Summarizing salient events in unmanned aerial videos
JP6178099B2 (en) * 2013-04-05 2017-08-09 ソニー株式会社 Intermediate unit and camera system
CN103412746B (en) 2013-07-23 2017-06-06 华为技术有限公司 Media content sharing method and terminal device and content sharing system, content
JP6277646B2 (en) * 2013-09-25 2018-02-14 富士通株式会社 Receiving device, receiving method, program
CN103490842B (en) * 2013-09-26 2016-09-28 深圳市大疆创新科技有限公司 Data transmission system and method
CN103716712A (en) 2013-12-31 2014-04-09 上海艾麒信息科技有限公司 Video processing method based on mobile terminal
CN103761985B (en) 2014-01-24 2016-04-06 北京华科飞扬科技股份公司 A kind of hyperchannel video and audio is online performs in a radio or TV programme editing system
CN104052935B (en) 2014-06-18 2017-10-20 广东欧珀移动通信有限公司 A kind of video editing method and device
US9371133B2 (en) * 2014-11-07 2016-06-21 Paccar Inc Drone systems for pre-trip inspection and assisted backing
US9570106B2 (en) * 2014-12-02 2017-02-14 Sony Corporation Sensor configuration switching for adaptation of video capturing frame rate
CN107005624B (en) 2014-12-14 2021-10-01 深圳市大疆创新科技有限公司 Method, system, terminal, device, processor and storage medium for processing video
US9818451B1 (en) * 2015-12-21 2017-11-14 Amazon Technologies, Inc. Frame selection of video data

Also Published As

Publication number Publication date
EP3167604A1 (en) 2017-05-17
US11095847B2 (en) 2021-08-17
US20200186749A1 (en) 2020-06-11
US10284808B2 (en) 2019-05-07
EP3123644A4 (en) 2017-05-03
US10771734B2 (en) 2020-09-08
CN107005624B (en) 2021-10-01
US9973728B2 (en) 2018-05-15
EP3123644A1 (en) 2017-02-01
US20170256288A1 (en) 2017-09-07
EP3846454A1 (en) 2021-07-07
CN107223316A (en) 2017-09-29
DK3123644T3 (en) 2021-05-31
EP3167604B1 (en) 2021-06-30
WO2016095367A1 (en) 2016-06-23
US20190174090A1 (en) 2019-06-06
US10567700B2 (en) 2020-02-18
EP3123644B1 (en) 2021-04-07
CN107223316B (en) 2020-04-17
CN111432152A (en) 2020-07-17
ES2877224T3 (en) 2021-11-16
US20180227539A1 (en) 2018-08-09
CN113766161B (en) 2023-06-20
WO2016095361A1 (en) 2016-06-23
EP3167604A4 (en) 2017-07-26
CN113766161A (en) 2021-12-07
CN107005624A (en) 2017-08-01
US20170064247A1 (en) 2017-03-02

Similar Documents

Publication Publication Date Title
US20200358979A1 (en) System and method for supporting selective backtracking data recording
US11197057B2 (en) Storage management of data streamed from a video source device
US20140298179A1 (en) Method and device for playback of presentation file
CN109155840B (en) Moving image dividing device and monitoring method
KR20190116978A (en) Display-based video analytics
WO2023165608A1 (en) Frame dropping method and apparatus, and server and medium
CN108696505B (en) Video distribution apparatus, video reception apparatus, video distribution method, and recording medium
US20140068438A1 (en) Favorites bar for broadcast video production systems
US10902884B2 (en) Methods and apparatus for ordered serial synchronization of multimedia streams upon sensor changes
JP2014116805A (en) Imaging device, information processing device, control method therefor, and video processing system
EP3599767A1 (en) Video distribution apparatus, distribution method, and program
KR102144512B1 (en) Storage apparatus of multi-channel video data
US20150022676A1 (en) Information processing apparatus and control method thereof
EP3496414B1 (en) Video image distribution apparatus, control method, and recording medium
KR101212947B1 (en) Apparatus for transmitting data
CN114827542A (en) Method, system, equipment and medium for capturing images of multiple paths of video code streams
JP5499207B2 (en) Data transmission system
JP2007324882A (en) Video recording/reproducing apparatus
JP2014165693A (en) Recording device

Legal Events

Date Code Title Description
AS Assignment

Owner name: SZ DJI TECHNOLOGY CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SCHWARTZ, SHELDON;WANG, TAO;WANG, MINGYU;AND OTHERS;SIGNING DATES FROM 20160901 TO 20161012;REEL/FRAME:053423/0919

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION