US20210227102A1 - Systems and methods for synchronizing frame timing between physical layer frame and video frame - Google Patents
- Publication number
- US20210227102A1 (U.S. application Ser. No. 17/207,593)
- Authority
- US
- United States
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/04—Synchronising
- H04N5/08—Separation of synchronising signals from picture signals
- H04N5/10—Separation of line synchronising signal from frame synchronising signal or vice versa
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/218—Source of audio or video content, e.g. local disk arrays
- H04N21/2187—Live feed
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/61—Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/70—Media network packetisation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/75—Media network packet handling
- H04L65/762—Media network packet handling at the source
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/27—Server based end-user applications
- H04N21/274—Storing end-user multimedia data in response to end-user request, e.g. network recorder
- H04N21/2743—Video hosting of uploaded data from client
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/414—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
- H04N21/41422—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance located in transportation means, e.g. personal vehicle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/845—Structuring of content, e.g. decomposing content into time segments
- H04N21/8455—Structuring of content, e.g. decomposing content into time segments involving pointers to the content, e.g. pointers to the I-frames of the video stream
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/845—Structuring of content, e.g. decomposing content into time segments
- H04N21/8456—Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/04—Synchronising
- H04N5/06—Generation of synchronising signals
- H04N5/067—Arrangements or circuits at the transmitter end
- H04N5/073—Arrangements or circuits at the transmitter end for mutually locking plural sources of synchronising signals, e.g. studios or relay stations
- H04N5/0733—Arrangements or circuits at the transmitter end for mutually locking plural sources of synchronising signals, e.g. studios or relay stations for distributing synchronisation pulses to different TV cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/24—Systems for the transmission of television signals using pulse code modulation
- H04N7/52—Systems for transmission of a pulse code modulated video signal with one or more other pulse code modulated signals, e.g. an audio signal or a synchronizing signal
- H04N7/54—Systems for transmission of a pulse code modulated video signal with one or more other pulse code modulated signals, e.g. an audio signal or a synchronizing signal the signals being synchronous
- H04N7/56—Synchronising systems therefor
Definitions
- This application relates to systems and methods for transmission synchronization, more particularly, to systems and methods for synchronizing the frame timing between a physical layer frame and a video frame.
- Unmanned movable platforms such as unmanned aerial vehicles (UAVs) have been widely used in various fields such as aerial photography, surveillance, scientific research, geological survey, and remote sensing.
- the maneuvering of a UAV may be controlled by a user via a ground terminal.
- the UAV may record video data during flight and transmit the video data to the ground terminal.
- the ground terminal may display the video data synchronously with the recording.
- a method for synchronizing video transmission with the physical layer may be implemented on a computing device including at least one processor and a storage.
- the method may include determining a first time point corresponding to a frame header of a video frame, determining a second time point corresponding to a frame header of a physical layer frame based at least in part on the first time point, and starting transmitting the video frame at the second time point.
- the method may further include generating the physical layer frame at the second time point.
- the determining a second time point corresponding to a frame header of a physical layer frame based at least in part on the first time point may include determining the second time point based at least in part on the first time point, and synchronizing a third time point corresponding to a frame header of a physical layer frame with the second time point, the physical layer frame corresponding to the video frame.
- the method may further include compressing the video frame before transmitting the video frame at the second time point.
- the method may further include segmenting the video frame into a plurality of sub-frames, and compressing data associated with each of the plurality of sub-frames.
- the determining the second time point may include determining a time period for compressing a sub-frame of the plurality of sub-frames, and determining the second time point based on the first time point and the time period for compressing the sub-frame.
- the determining the second time point based on the first time point may include determining a time period for compressing at least a portion of the video frame, and determining the second time point based on the first time point and the time period.
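The two determination variants above amount to offsetting the first time point by a compression period. A minimal sketch, assuming hypothetical names (`determine_second_time_point`, `compression_period`) and not the patented implementation itself:

```python
def determine_second_time_point(first_time_point: float,
                                compression_period: float) -> float:
    """Offset the video frame header time (first time point) by the time
    needed to compress at least a portion of the video frame (e.g., one
    sub-frame), so that compressed data is ready exactly when the physical
    layer frame header occurs at the second time point."""
    return first_time_point + compression_period

# e.g., a frame header at t = 0.0 s and a 2 ms sub-frame compression
# period place the second time point at t = 0.002 s
t2 = determine_second_time_point(0.0, 0.002)
```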
- the frame header of the video frame may correspond to a frame synchronization pulse signal.
- the video frame may be extracted from a real-time video stream transmitted by a video recording device.
- the video frame may be extracted from a real-time video received at a data interface communicatively connected to a video recording device.
- the method may further include obtaining a frame rate of the video frame, and configuring a frame rate of the physical layer frame based on the frame rate of the video frame.
- the frame rate of the physical layer frame may be an integer multiple of the frame rate of the video frame.
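The integer-multiple relationship can be illustrated as follows; the function name and the default multiple are assumptions for the sketch, not values taken from the patent:

```python
def configure_phy_frame_rate(video_frame_rate: int, multiple: int = 2) -> int:
    """Configure the physical layer frame rate as an integer multiple of
    the video frame rate, so that every video frame header can coincide
    with a physical layer frame header."""
    if multiple < 1:
        raise ValueError("multiple must be a positive integer")
    return video_frame_rate * multiple
```

For example, a 30 fps video stream would pair with a 60 Hz physical layer frame rate when the multiple is 2.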
- a system for synchronizing video transmission with physical layer may include a memory that stores one or more computer-executable instructions, and one or more processors configured to communicate with the memory.
- the one or more processors may be directed to determine a first time point corresponding to a frame header of a video frame, determine a second time point corresponding to a frame header of a physical layer frame based at least in part on the first time point, and start transmitting the video frame at the second time point.
- a non-transitory computer readable medium may include executable instructions.
- when the executable instructions are executed by at least one processor, they may cause the at least one processor to effectuate a method.
- the method may include determining a first time point corresponding to a frame header of a video frame, determining a second time point corresponding to a frame header of a physical layer frame based at least in part on the first time point, and starting transmitting the video frame at the second time point.
- an unmanned aerial vehicle may include at least one video recording device, at least one processor and at least one UAV transceiver.
- the at least one video recording device may be configured to record video data including a plurality of video frames.
- the at least one processor may be configured to determine a first time point corresponding to a frame header of a video frame of the plurality of video frames, and determine a second time point corresponding to a frame header of a physical layer frame based at least in part on the first time point.
- the at least one UAV transceiver may be configured to start transmitting the video frame at the second time point.
- FIG. 1 illustrates a schematic diagram of an exemplary unmanned aerial vehicle (UAV) system according to some embodiments of the present disclosure
- FIG. 2 illustrates a block diagram of an exemplary unmanned aerial vehicle (UAV) according to some embodiments of the present disclosure
- FIG. 3 illustrates a flowchart of an exemplary process for video transmission in a UAV system according to some embodiments of the present disclosure
- FIG. 4 illustrates a block diagram of an exemplary ground terminal in a UAV system according to some embodiments of the present disclosure
- FIG. 5 illustrates a block diagram of an exemplary processor in a UAV system according to some embodiments of the present disclosure
- FIG. 6 illustrates a flowchart of an exemplary process for transmitting a video frame in a UAV system according to some embodiments of the present disclosure
- FIG. 7 illustrates a flowchart of an exemplary process for configuring a frame rate of physical layer frames in a UAV system according to some embodiments of the present disclosure
- FIG. 8A and FIG. 8B illustrate two schematic diagrams of a video transmission in a UAV system according to some embodiments of the present disclosure.
- FIG. 9 illustrates a schematic diagram of an exemplary open systems interconnection (OSI) model.
- the terms "system," "unit," "module," and/or "engine" used herein are one method to distinguish different components, elements, parts, sections, or assemblies of different levels in ascending order. However, the terms may be replaced by other expressions if they achieve the same purpose.
- the present disclosure provides systems and methods for video transmission synchronization in a UAV system.
- the present disclosure adjusts a frame timing of a physical layer transmission according to frame timing of a video stream transmission.
- the video stream may be received directly from a video recording device carried on a UAV or at a data interface communicatively connected to a video recording device carried on the UAV.
- the transmission delay of the video stream from a UAV to a ground terminal due to the wait time associated with each video frame may be reduced.
- the present disclosure configures the frame rate of the physical layer frame to be an integer multiple of the frame rate of the video frame. Therefore, the number of adjustments of the frame timing during the video stream transmission between a UAV and a ground terminal may be reduced.
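The claim that an integer multiple reduces timing adjustments can be checked numerically. The sketch below (exact-rational check; names are assumed) verifies that every video frame header time lands on a physical layer header time only when the physical layer rate is an integer multiple of the video rate:

```python
from fractions import Fraction

def headers_aligned(video_fps: int, phy_fps: int, num_frames: int = 100) -> bool:
    """True if each video frame header at time i/video_fps coincides with
    some physical layer frame header at time j/phy_fps.  This holds when
    phy_fps is an integer multiple of video_fps, so after one initial
    alignment no further frame-timing adjustments are needed."""
    return all((Fraction(i, video_fps) * phy_fps).denominator == 1
               for i in range(num_frames))
```

With a 30 fps stream, a 60 Hz physical layer rate keeps every header aligned, while a 45 Hz rate leaves every other video header between physical layer headers, forcing repeated re-alignment.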
- FIG. 1 illustrates a schematic diagram of an exemplary unmanned aerial vehicle (UAV) system 100 according to some embodiments of the present disclosure.
- the UAV system 100 may include a UAV 102 , a ground terminal 104 , a network 106 , a server 108 , and a storage 110 .
- the UAV 102 may be configured to collect data and transmit the collected data to the ground terminal 104 and/or the server 108 during flight.
- the data may include the flight status of the UAV 102 , the battery usage of the UAV 102 , information associated with the surrounding environment, etc.
- the data may further include text data, video data, audio data, etc.
- the video data may include videos, images, graphs, animations, audios, etc.
- the UAV 102 may transmit the data to the ground terminal 104 during flight to synchronously display the content of the data on the ground terminal 104 .
- the UAV 102 may be operated completely autonomously (e.g., by a computing system such as an onboard controller), semi-autonomously, or manually (e.g., by a user operating a control application implemented on a mobile device).
- the UAV 102 may be operated by a user via the ground terminal 104 .
- the UAV 102 may receive commands from an entity (e.g., human user or autonomous control system) and respond to such commands by performing one or more actions.
- the UAV 102 may be controlled to take off from the ground, move in the air, move to target location or to a sequence of target locations, hover in the air, land on the ground, etc.
- the UAV 102 may be controlled to move at a specified velocity and/or acceleration or along a specified route in the air.
- the commands may be used to control one or more UAV 102 components described in FIG. 2 (e.g., video recording device 206 , sensor 210 , flight controller 208 , etc.). For example, some commands may be used to control the position, orientation, and/or operation of the video recording device 206 .
- the ground terminal 104 may be configured to transmit, receive, output, display, and/or process information.
- the ground terminal 104 may receive information from the UAV 102 , the network 106 , the server 108 , the storage 110 , etc.
- the ground terminal 104 may transmit a command generated by a user to control the UAV 102 .
- the command may include information to control the velocity, acceleration, altitude, and/or orientation of the UAV 102 .
- the ground terminal 104 may display images or play videos taken by the UAV 102 to a user.
- the ground terminal 104 may process information received from the server 108 to update the application installed on the ground terminal 104 .
- the ground terminal 104 may include a desktop computer, a mobile device, a laptop computer, a tablet computer, or the like, or any combination thereof.
- the mobile device may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof.
- the smart home device may include a smart lighting device, a smart television, a smart video camera, an interphone, or the like, or any combination thereof.
- the wearable device may include a smart bracelet, a smart footgear, a smart glass, a smart watch, a smart helmet, smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof.
- the smart mobile device may include a smartphone, a gaming device, a navigation device, a point of sale (POS) device, or the like, or any combination thereof.
- the network 106 may be configured to facilitate exchange of information.
- one or more components in the UAV system 100 (e.g., the UAV 102 , the ground terminal 104 , the server 108 , and the storage 110 ) may exchange information via the network 106 .
- the ground terminal 104 may receive videos and/or images from the UAV 102 via the network 106 .
- the network 106 may be any type of a wired or wireless network, or a combination thereof.
- the network 106 may include a cable network, a wire line network, an optical fiber network, a telecommunication network, an intranet, an Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public telephone switched network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, or the like, or any combination thereof.
- the network 106 may include wired or wireless network access points such as base stations and/or internet exchange points (not shown), through which one or more components of the UAV system 100 may be connected to the network 106 to exchange information.
- the base stations and/or internet exchange points may include a Wi-Fi station.
- the UAV 102 and/or the ground terminal 104 may access the network 106 through a contention-based random access manner or a non-contention-based random access manner.
- the server 108 may be configured to process data.
- the data may be received from the UAV 102 , the ground terminal 104 , the network 106 , the storage 110 , etc.
- the server 108 may archive the flight log information from the UAV 102 in the storage 110 .
- the server 108 may back up information from the ground terminal 104 in the storage 110 .
- the server 108 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction-set computer (RISC), a microprocessor, or the like, or any combination thereof.
- the server 108 may be integrated in the ground terminal 104 .
- the storage 110 may be configured to acquire and/or store information.
- the information may be received from components of the UAV system 100 (e.g., the UAV 102 , the ground terminal 104 , or the server 108 , etc.).
- the storage 110 may acquire information from the ground terminal 104 .
- the information acquired and/or stored in the storage 110 may include programs, software, algorithms, functions, files, parameters, data, texts, numbers, images, or the like, or any combination thereof.
- the storage 110 may store images collected by the UAV 102 .
- the storage 110 may store parameters (e.g., latitude, longitude, altitude of the UAV 102 ) from the ground terminal 104 .
- the storage 110 may include a mass storage, a removable storage, a volatile read-and-write memory, a read-only memory (ROM), or the like, or any combination thereof.
- the storage 110 may be implemented on a cloud platform including a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
- the UAV system 100 is merely provided for the purpose of illustration, and is not intended to limit the scope of the present disclosure.
- the UAV 102 may be any type of remote device that records and transmits video data, including but not limited to a monitoring device, a wireless sensing network device, a smart home device, an onboard video device, etc.
- these variations and modifications may not depart from the scope of the present disclosure.
- FIG. 2 illustrates a block diagram of an exemplary unmanned aerial vehicle (UAV) 102 according to some embodiments of the present disclosure.
- the UAV 102 may include a UAV transceiver 202 , a processor 204 , a video recording device 206 , a flight controller 208 , a sensor 210 , an inertial measurement unit (IMU) 212 , and a storage medium 214 .
- the UAV transceiver 202 may transmit and/or receive data.
- the data may include text, videos, images, audios, animations, graphs, or the like, or any combination thereof.
- the UAV 102 may communicate with the ground terminal 104 via the UAV transceiver 202 .
- the UAV transceiver 202 may transmit a video processed by the processor 204 to the ground terminal 104 , for example, the compressed video.
- the UAV transceiver 202 may receive commands from the ground terminal 104 to maneuver the motion of the UAV 102 .
- the UAV transceiver 202 may be any type of transceiver.
- the UAV transceiver 202 may be a radio frequency (RF) transceiver that is capable of transmitting or receiving data through a wireless network. More particularly, the wireless network may operate in various frequency bands, such as 433 MHz, 900 MHz, 2.4 GHz, 5 GHz, 5.8 GHz, etc.
- the UAV transceiver 202 may include a transmitter and a receiver. The transmitter and the receiver may each implement part or all of the functions of the UAV transceiver 202 .
- the processor 204 may process data.
- the data may be received from other components of the UAV 102 (e.g., the UAV transceiver 202 , the video recording device 206 , or the storage medium 214 , etc.).
- the processor 204 may process data received by the UAV transceiver 202 .
- the processor 204 may process data to be transmitted to the ground terminal 104 via the UAV transceiver 202 .
- the processor 204 may receive video data from the video recording device 206 .
- the processor 204 may compress the video data, adjust the video data and transmit the adjusted video data to the ground terminal 104 via the UAV transceiver 202 .
- the adjustment may include synchronizing the transmission of the video data with the physical layer.
- the processor 204 may receive data from the sensor 210 , the flight controller 208 , and the IMU 212 to evaluate the status of the UAV 102 and determine a course of action. For example, the processor 204 may constantly and/or periodically communicate with the IMU 212 , which may measure the UAV 102 's velocity and attitude data, and adaptively adjust the position of the UAV 102 . In some embodiments, the processor 204 may include one or more processors (e.g., single-core processors or multi-core processors).
- the processor 204 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction-set computer (RISC), a microprocessor, or the like, or any combination thereof.
- the video recording device 206 may capture video data.
- the video data may include images, videos, audios, graphs, animations, etc.
- the video recording device 206 may be a camera, a vidicon, a video recorder, a digital camera, an infrared camera, or an ultraviolet camera, etc.
- the video recording device 206 may send the captured video data to the processor 204 for processing.
- the video recording device 206 may send the captured video data to the processor 204 .
- the processor 204 may compress the video data and cause transmission of the compressed video data to the ground terminal 104 .
- the ground terminal 104 may receive and decompress the video data.
- the UAV 102 may include a holder/pan-tilt device (not shown in FIG. 2 ).
- the processor 204 may control the operation of the holder/pan-tilt device to adjust the position of the video recording device 206 .
- the flight controller 208 may control propulsion of the UAV 102 to control the pitch angle, roll angle, and/or yaw angle of the UAV 102 .
- the flight controller 208 may alter the speed, orientation and/or position of the UAV 102 .
- the processor 204 may interpret the data and send corresponding instructions to the flight controller 208 .
- the flight controller 208 may alter the speed and/or position of the UAV 102 based on the instructions.
- the sensor 210 may collect relevant data.
- the relevant data may include information relating to the UAV status, the surrounding environment, or the objects within the environment.
- the sensor 210 may include a location sensor (e.g., a global positioning satellite (GPS) sensor, a mobile device transmitter enabling location triangulation), a vision sensor (e.g., an imaging device capable of detecting visible, infrared, or ultraviolet light, such as a camera), a proximity or range sensor (e.g., an ultrasonic sensor, LIDAR (Light Detection and Ranging), a time-of-flight or depth camera), an inertial sensor (e.g., an accelerometer, a gyroscope, an inertial measurement unit (IMU)), an altitude sensor, an attitude sensor (e.g., a compass, an IMU), a pressure sensor (e.g., a barometer), an audio sensor (e.g., a microphone), a field sensor, or the like, or any combination thereof.
- the IMU 212 may measure an angular velocity (e.g., attitude change) and a linear acceleration (e.g., velocity change) of the UAV 102 .
- the IMU 212 may include one or more gyroscopes to measure attitude change (e.g., absolute or relative pitch angle, roll angle, and/or yaw angle) of the UAV, and may include one or more accelerometers to measure linear velocity change (e.g., acceleration along x, y, and/or z directions) of the UAV 102 .
- the IMU 212 may be integrated in the sensor 210 .
- the storage medium 214 may store data.
- the data may be obtained from the UAV transceiver 202 , the processor 204 , the video recording device 206 , the flight controller 208 , the sensor 210 , the IMU 212 , and/or any other devices.
- the data may include image data, video data, metadata associated with the image data and the video data, instruction data, etc.
- the storage medium 214 may include a hard disk drive, a solid-state drive, a removable storage drive (e.g., a flash memory disk drive, an optical disk drive, etc.), a digital video recorder, or the like, or any combination thereof.
- the UAV 102 is merely provided for the purposes of illustration, and is not intended to limit the scope of the present disclosure.
- some other components may be implemented in the UAV 102 , e.g. a battery may be implemented in the UAV 102 as a power supply.
- the UAV 102 may include an electronic speed controller (ESC) for controlling or adjusting the rotation speed of motors mounted therein.
- FIG. 3 illustrates a flowchart of an exemplary process 300 for video transmission in a UAV system according to some embodiments of the present disclosure.
- the exemplary process 300 may be implemented by one or more processors of the UAV 102 .
- in step 302 , video data may be recorded.
- step 302 may be implemented by the video recording device 206 (illustrated in FIG. 2 ).
- the video data may include videos, audios, images, graphs, animations, or the like, or any combination thereof.
- the video data may include a plurality of video frames.
- step 302 may be performed in response to a request received from the processor 204 or the ground terminal 104 .
- a user of the ground terminal 104 may send a request for recording video data via a user interface (e.g., the user interface 408 illustrated in FIG. 4 ).
- the video recording device 206 may be activated to record the video data.
- commands to record the video data may be pre-programmed by the user via the ground terminal 104 and stored in the storage medium 214 of the UAV 102 .
- the recording of the video data may be controlled by the processor 204 via executing the pre-programmed commands stored in the storage medium 214 .
- in step 304 , a video frame may be extracted from the video data.
- step 304 may be implemented by the processor 204 .
- the video frame may refer to a frame of the video data.
- the video frame may correspond to a length of time depending on the compressing algorithms.
- the video frame may be a still image.
- the video frame may include a frame header.
- the frame header may indicate the start of the video frame transmission.
- the frame header may include information of the video frame, such as synchronization information, address information, error control information, coding information, etc.
- the synchronization information may include a start time point and/or an end time point of the video frame.
- the frame header may correspond to a frame synchronization pulse signal.
- the frame synchronization pulse signal may indicate the start time point of the video frame.
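One way the first time point might be derived from a sampled frame synchronization pulse is simple threshold detection; this is an illustrative assumption, since the patent does not specify how the pulse is detected:

```python
def first_time_point_from_pulse(samples, threshold, sample_period):
    """Return the time of the first rising edge of the frame
    synchronization pulse signal in a sampled window; that edge marks
    the start time point of the video frame (the first time point)."""
    prev = samples[0]
    for i in range(1, len(samples)):
        if prev < threshold <= samples[i]:
            return i * sample_period
        prev = samples[i]
    return None  # no pulse found in this window
```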
- the video frame may be segmented into a plurality of sub-frames. Data associated with each of the plurality of sub-frames may be compressed and/or packed into a data package for transmission. Each of the plurality of sub-frames may correspond to a portion of the video frame. The lengths of the segmented sub-frames may be the same or different.
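- The segmentation, compression, and packing described above can be sketched in code. The following Python fragment is a hypothetical illustration only — the sub-frame length, the header layout, and the use of zlib are assumptions for the sketch, not part of the disclosure.

```python
import zlib

def segment_frame(frame_bytes, sub_frame_len):
    # Split a video frame into sub-frames; the last one may be shorter,
    # so the lengths of the segmented sub-frames can differ.
    return [frame_bytes[i:i + sub_frame_len]
            for i in range(0, len(frame_bytes), sub_frame_len)]

def pack_sub_frame(sub_frame, index):
    # Compress a sub-frame and prepend a minimal header (index + payload length).
    payload = zlib.compress(sub_frame)
    header = index.to_bytes(2, "big") + len(payload).to_bytes(4, "big")
    return header + payload

frame = bytes(10_000)                    # dummy 10 kB video frame
sub_frames = segment_frame(frame, 4096)  # sub-frames of 4096, 4096, 1808 bytes
packages = [pack_sub_frame(s, i) for i, s in enumerate(sub_frames)]
```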
- transmission of a physical layer frame may be synchronized with transmission of the video frame over the physical layer.
- step 306 may be implemented by the processor 204 .
- the physical layer defines the means of transmitting raw bits over a physical link connecting network nodes.
- the bit stream may be converted to a physical signal to be transmitted over a hardware transmission medium.
- the physical layer comprises a physical signaling sublayer that interfaces with the data link layer's media access control (MAC) sublayer and performs signaling control of transmission and reception.
- the physical layer frame refers to a frame generated by the physical layer of the OSI architecture that performs signaling control of transmission and reception via the physical layer.
- the transmission of a data package (e.g., a compressed and/or packed video frame or sub-frame thereof) through the physical layer is controlled by a physical layer frame.
- transmission of the first bit of the physical layer frame may signal the allowability of transmitting the first bit of the data package (e.g., compressed and/or packed video frame or sub-frame thereof). That is, the timing of transmitting the first bit of the data package (also referred to as frame timing of the video frame) needs to be synchronous to the timing of transmitting the first bit of the physical layer frame (also referred to as frame timing of the physical layer frame).
- if the frame timing of the video frame is asynchronous to the frame timing of the physical layer frame, the video frame may not be transmitted immediately.
- the physical layer frame for transmission control may be generated on a random basis.
- the physical layer frames may be generated in accordance with a schedule of a timer of the physical layer.
- the frame timing of the physical layer frame and the frame timing of the video frame may be asynchronous.
- the start of the physical layer frame transmission (also referred to as the frame header of physical layer frame) and the start of the video frame transmission (also referred to as a time point that a compressed and/or packed sub-frame can be transmitted) through the physical layer may be asynchronous.
- the expected transmission starting time may fall in the middle of transmission of a physical layer frame.
- the compressed sub-frame, i.e., the data package, may have to wait for a period of time (also referred to as a wait time) before being transmitted, thus causing a transmission delay.
- the frame header of the physical layer frame may be adjusted.
- the time point corresponding to the frame header of the physical layer frame may be adjusted to be synchronized with the time point at which the compressed and/or packed sub-frames are expected to transmit.
- Detailed process of the synchronization may be found elsewhere in the present disclosure (e.g., in the description of FIG. 6 , FIG. 8A , FIG. 8B , etc.).
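- The wait time discussed above can be made concrete with a small calculation. In the sketch below the physical layer frame period, the time at which the data package becomes ready, and the header offset are hypothetical example values; shifting the frame header so that it aligns with the ready time reduces the wait to zero.

```python
def wait_time(ready_time, frame_period, first_header_time=0.0):
    # Time the data package must wait until the next physical layer frame header.
    elapsed = (ready_time - first_header_time) % frame_period
    return 0.0 if elapsed == 0.0 else frame_period - elapsed

period = 10.0   # physical layer frame period, in ms (example value)
ready = 23.0    # time the compressed sub-frame is ready, in ms (example value)

delay = wait_time(ready, period)                           # 7.0 ms wasted waiting
aligned = wait_time(ready, period, first_header_time=3.0)  # 0.0 ms after the shift
```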
- the video frame may be transmitted.
- step 308 may be implemented by the processor 204 .
- the video frame may be sent to one or more other components of the UAV 102 , such as the UAV transceiver 202 , the storage medium 214 , etc.
- the video frame may be sent to the UAV transceiver 202 for processing.
- the processing may include amplification, analog-to-digital conversion, digital-to-analog conversion, etc.
- the UAV transceiver 202 may send the processed data to the ground terminal 104 .
- the video frame may be stored in the storage medium 214 .
- the video frame may be transmitted in a physical layer.
- the physical layer frame may be synchronized with the video frame so that the time point when the video frame (or a compressed and/or packed sub-frame thereof) can be transmitted is synchronized with the frame header of the physical layer frame.
- the video frame (or the compressed and/or packed sub-frames thereof) may be transmitted with the physical layer frame in the physical layer.
- the steps as shown in FIG. 3 are for illustrative purposes, and are not intended to limit the protection scope of the present disclosure.
- the process 300 may be accomplished with one or more additional steps not described, and/or without one or more of the steps discussed above. Additionally, the order in which the steps of the process 300 are performed in FIG. 3 is not intended to be limiting. For example, one or more other optional steps may be added between any two steps in the process illustrated in FIG. 3 . Examples of such steps may include storing or caching the video data, or the like.
- FIG. 4 illustrates a block diagram of an exemplary ground terminal 104 in a UAV system according to some embodiments of the present disclosure.
- the ground terminal 104 may include a ground terminal transceiver 402 , a processor 404 , a display 406 , a user interface 408 , and a memory 410 .
- the ground terminal transceiver 402 may transmit and/or receive data.
- the data may include text, videos, images, audios, animations, graphs, or the like, or any combination thereof.
- the ground terminal 104 may communicate with the UAV 102 via the ground terminal transceiver 402 .
- the ground terminal transceiver 402 may transmit an instruction (e.g., an instruction to record video data) from the processor 404 to the UAV transceiver 202 .
- the UAV transceiver 202 may send the instruction to the video recording device to start recording video data.
- the ground terminal transceiver 402 may be any type of transceiver.
- the ground terminal transceiver 402 may be a radio frequency (RF) transceiver that is capable of transmitting or receiving data over a wireless network. More particularly, the wireless network may operate in various frequency bands, such as 433 MHz, 900 MHz, 2.4 GHz, 5 GHz, 5.8 GHz, etc.
- the ground terminal transceiver 402 may include a transmitter and a receiver. The transmitter and the receiver may each implement part or all functions of the ground terminal transceiver 402 .
- the processor 404 may process data.
- the data may be received from the ground terminal transceiver 402 , the memory 410 , etc.
- the data may include information relating to the state of the UAV 102 (e.g., velocity, acceleration, altitude, etc.), image data, video data, a user instruction (e.g., an instruction to increase altitude of the UAV 102 ), etc.
- the processing of the data may include storing, classifying, selecting, transforming, calculating, estimating, encoding, decoding, or the like, or any combination thereof.
- the processor 404 may decompress a compressed video frame received from the UAV 102 .
- the ground terminal 104 may receive an application updating package from the server 108 via the ground terminal transceiver 402 .
- the processor 404 may process the application updating package to update the related mobile application installed on the ground terminal 104 .
- the processor 404 may process data from the memory 410 to view the historical flight logs.
- the processor 404 may include one or more microprocessors, field programmable gate arrays (FPGAs), central processing units (CPUs), digital signal processors (DSPs), etc.
- the display 406 may display information.
- the information may include text, audios, videos, images, or the like, or any combination thereof.
- the display 406 may include a liquid crystal display (LCD), a light emitting diode (LED)-based display, a flat panel display or curved screen, a television device, a cathode ray tube (CRT), or the like, or a combination thereof.
- the display 406 may include a touchscreen.
- the information displayed on the display 406 may relate to the status of the UAV 102 , such as altitude, velocity, etc.
- images and/or videos captured by the video recording device 206 may be displayed on the display 406 .
- the user interface 408 may receive user interactions with the ground terminal 104 and generate one or more instructions to operate one or more components of the ground terminal 104 or other components in the UAV system 100 .
- the one or more instructions may include an instruction to operate the ground terminal transceiver 402 to communicate with the UAV transceiver 202 , an instruction to operate the processor 404 to process data received from the ground terminal transceiver 402 , an instruction to operate the display 406 to display images and/or videos received, an instruction to operate the memory 410 to store data, an instruction to operate the UAV 102 to capture video data, or the like, or any combination thereof.
- the user interface 408 may include one or more input devices, such as a touchscreen, a keyboard, a mouse, a trackball, a joystick, a stylus, an audio recognition device or application, etc.
- a keyboard may be integrated in the ground terminal 104 .
- An instruction may be generated in response to a user pressing a plurality of keys on the keyboard in a certain sequence.
- the instruction may direct the UAV 102 to adjust flight attitudes.
- the instruction may direct the video recording device 206 to take a photo or to record a video.
- the instruction may direct the display 406 to display the photo or the video.
- the instruction may direct the memory 410 to store data. For example, after a video is obtained, the memory 410 may receive an instruction to store the video.
- the memory 410 may store data.
- the data may be obtained from the ground terminal transceiver 402 , the processor 404 , the display 406 , the user interface 408 , and/or any other components in the UAV system 100 .
- the data may include image data, video data, metadata associated with the image data and the video data, instruction data, etc.
- the memory 410 may include a hard disk drive, a solid-state drive, a removable storage drive (e.g., a flash memory disk drive, an optical disk drive, etc.), a digital video recorder, or the like, or a combination thereof.
- ground terminal 104 is merely provided for the purposes of illustration, and is not intended to limit the scope of the present disclosure.
- one or more components of the ground terminal 104 may include an independent storage block (not shown) respectively.
- those variations and modifications may not depart from the scope of the present disclosure.
- FIG. 5 illustrates a block diagram of an exemplary processor 204 in a UAV system according to some embodiments of the present disclosure.
- the processor 204 may include a data acquisition module 502 , a data compressing and packing module 504 , a physical layer processing module 506 , and a storage module 508 .
- the modules in the processor 204 may be connected in a wired or wireless manner.
- the data acquisition module 502 may obtain data.
- the data may relate to the UAV 102 (e.g., velocity of the UAV 102 ), the surrounding environment (e.g., temperature, barometric pressure, etc.), or objects within the environment.
- the data may include image data, audio data, video data, or the like, or any combination thereof.
- the data may be obtained from the UAV transceiver 202 , the video recording device 206 , the storage medium 214 , or other components of the UAV system 100 .
- the video data may include a real-time video stream.
- the real-time video stream may be transmitted by the video recording device 206 , or received at a data interface communicatively connected to the video recording device 206 .
- the video data may include a plurality of video frames.
- Each of the plurality of video frames may have a frame header.
- the frame header of each of the plurality of video frames may correspond to a frame synchronization pulse signal.
- the frame synchronization pulse signal may indicate a start time point of transmitting the each of the plurality of video frames in the physical layer.
- the data compressing and packing module 504 may compress and/or pack data.
- the data may be received from the data acquisition module 502 , or the storage module 508 .
- the data compressing and packing module 504 may be configured to reduce the redundancy in the data.
- the redundancy may include temporal redundancy, spatial redundancy, statistical redundancy, perceptual redundancy, etc.
- the data may be compressed in various compression algorithms.
- the compression algorithms may include lossless data compression, and lossy data compression.
- the lossless data compression may include run-length encoding (RLE), Lempel-Ziv compression, Huffman coding, prediction by partial matching (PPM), bzip2 compression, or the like, or any combination thereof.
- the lossy data compression may include fractal compression, vector quantization, wavelet compression, linear predictive coding, or the like, or any combination thereof.
- the data may be video data.
- the video data may be compressed under a video coding standard.
- the video coding standard may include, but is not limited to, H.120, H.261, H.262, H.263, H.264, high efficiency video coding, MPEG-4, or the like, or any combination thereof.
- the data compressing and packing module 504 may also be configured to pack the compressed data.
- the compressed data may be packed in various formats. Exemplary packaging formats may include an AVI format, a DV-AVI format, a WMV format, an MP4 format, an RMVB format, a MOV format, an FLV format, or the like, or any combination thereof.
- the compressed data may be packed into a data package that complies with the physical layer protocols.
- a video frame may be segmented into a plurality of sub-frames of the same or different lengths.
- Each of the plurality of sub-frames may correspond to a portion of the video frame.
- the data compressing and packing module 504 may compress data associated with the each of the plurality of sub-frames. Alternatively or additionally, the compression and packing module 504 may further pack the compressed data.
- the physical layer processing module 506 may be configured to process data.
- the data may be obtained from the data acquisition module 502 or the data compressing and packing module 504 .
- the processing of the data may include extracting a frame header from a video frame, determining a time point and/or a time period, synchronizing two or more time points, or the like, or any combination thereof.
- a video frame may be acquired by the data acquisition module 502 from the video recording device 206 .
- a sub-frame of the video frame may then be compressed and/or packed by the data compressing and packing module 504 .
- the compressed and/or packed sub-frame may be transmitted in the physical layer.
- a physical layer frame may control the data transmission (including the transmission of the compressed and/or packed sub-frame) in the physical layer.
- a frame header of the physical layer frame which corresponds to a time point at which the data transmission starts may be determined.
- the start of the physical layer frame transmission (or referred to as the frame header of physical layer frame) and the start of the video frame transmission (or referred to as a time point that a compressed and/or packed sub-frame can be transmitted) may be asynchronous. Therefore, the sub-frame may have to wait for a period of time (also referred to as a wait time) before being transmitted and a delay may be caused. In order to reduce the wait time and delay, the frame header of the physical layer frame may be adjusted.
- the time point corresponding to the frame header of the physical layer frame may be adjusted to be synchronized with the time point at which the compressed and/or packed sub-frames can be transmitted.
- Detailed process of the synchronization may be found in, for example, FIG. 6 and the description thereof.
- the physical layer processing module 506 may include a time determination unit 510 , a time adjustment unit 520 , and a data transmitting and receiving unit 530 .
- the time determination unit 510 may be configured to determine a time point and/or a time period.
- the time point may correspond to a frame header of a video frame, or a frame header of a physical layer frame.
- the time point may also include a time point that the compression and/or packing of the video frame is started or completed, a time point that the compression and/or packing of the sub-frame of the video frame is started or completed, a time point that a camera (e.g., the video recording device 206 ) starts to record video data, etc.
- the time point may be determined based on a signal.
- the frame header of the video frame may correspond to a frame synchronization pulse signal.
- the time point corresponding to the frame header of the video frame may be determined based on the frame synchronization pulse signal.
- the frame header of the physical layer frame may correspond to a physical layer pulse signal.
- the time point corresponding to the frame header of the physical layer frame may be determined based on the physical layer pulse signal.
- the frame synchronization pulse signal may indicate a time point of receiving the video frame from the video recording device.
- the time period may include a time period for compressing and/or packing the video frame, a time period for compressing and/or packing the sub-frame of the video frame, a time period for transmitting the physical layer frame (and/or the compressed and packed sub-frames), or a time period that the processor 204 operates, etc.
- the time period of compressing and/or packing the video frame (or sub-frames thereof) may be determined based on the length of the video frame and the compression and/or packing algorithms used.
- the time determination unit 510 may be implemented by a timer.
- the timer may be a counter that provides an internal reading.
- the reading may increase every fixed time period (e.g., every microsecond, millisecond, second, etc.).
- the timer may generate a signal and the reading may be zeroed.
- the timer may generate a signal periodically without zeroing its reading. For example, the signal is generated when the reading is a multiple of 100.
- the timer may record its internal reading when a frame synchronization pulse signal is detected (e.g., at a time point corresponding to the frame header of the video frame).
- the timer may record the time period (or a number increase in its internal reading) of compressing and/or packing the sub-frame.
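- The timer behavior described above may be modeled as follows. This Python class is a hypothetical sketch of a counter whose reading rises on every tick and which records a snapshot of its reading when the frame synchronization pulse signal is detected; the class and method names are assumptions, not from the disclosure.

```python
class PhysicalLayerTimer:
    def __init__(self):
        self.reading = 0        # internal reading; rises every fixed time period
        self.snapshots = {}

    def tick(self, n=1):
        self.reading += n       # n elementary time periods elapse

    def snapshot(self, label):
        # Record the internal reading, e.g. when a frame sync pulse is detected.
        self.snapshots[label] = self.reading
        return self.reading

timer = PhysicalLayerTimer()
timer.tick(125)                               # time passes before the pulse
t_sync = timer.snapshot("frame_sync_pulse")   # reading at the pulse: 125
timer.tick(40)                                # compressing/packing the sub-frame
compress_ticks = timer.reading - t_sync       # 40 ticks spent on compression
```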
- the time adjustment unit 520 may be configured to adjust the time point and/or time period determined by the time determination unit 510 .
- the time adjustment unit 520 may adjust the time point corresponding to the frame header of the physical layer frame. More particularly, the time point may be adjusted such that the time point corresponding to the frame header of the physical layer frame may be synchronized with a time point when the compressed and/or packed sub-frames are ready to be transmitted.
- the compressed and/or packed sub-frame may be transmitted with the physical layer frame in the physical layer.
- the period of time of a present physical layer frame may be extended or shortened such that the frame header of the succeeding physical layer frame is synchronized with the sub-frame to be transmitted.
- internal reading of the timer may control the frame header of the physical layer frame.
- a physical layer frame is generated when the reading of the timer reaches a certain value.
- the frame header of the physical layer frame may be shifted such that the frame header of the succeeding physical layer frame is synchronized with the time point that the sub-frame is ready to be transmitted.
- Detailed description about the adjustment may be found in FIG. 6 , FIG. 8A and FIG. 8B .
- the data transmitting and receiving unit 530 may be configured to transmit and receive data.
- the data may include data received from the data compressing and packing module 504 (e.g., a compressed and/or packed video frame, a compressed and/or packed sub-frame), the storage module 508 , or other components of the UAV system 100 (e.g., the UAV transceiver 202 , the storage medium 214 , etc.), etc.
- the compressed and/or packed sub-frame received from the data compressing and packing module 504 may be transmitted by the data transmitting and receiving unit 530 to the UAV transceiver 202 for processing.
- the compressed and/or packed sub-frame may be transmitted in the physical layer.
- the processing may include amplification, analog-to-digital conversion, digital-to-analog conversion, etc.
- the UAV transceiver 202 may send the processed data to the ground terminal 104 .
- the data transmitting and receiving unit 530 may obtain data from the storage module 508 .
- the storage module 508 may be configured to store data.
- the data may be obtained from the data acquisition module 502 , the data compressing and packing module 504 , the physical layer processing module 506 , and/or other devices.
- the data may include programs, software, algorithms, functions, files, parameters, text, numbers, images, videos, audios, or the like, or any combination thereof.
- the storage module 508 may include a hard disk drive, a solid-state drive, a removable storage drive (e.g., a flash memory disk drive, an optical disk drive, etc.), a digital video recorder, or the like, or a combination thereof.
- the above description about the processor 204 is merely for the purposes of illustration, and is not intended to limit the scope of the present disclosure.
- the storage module 508 may be omitted from the processor 204 and/or incorporated into the storage medium 214 . However, those variations and modifications may not depart from the scope of the present disclosure.
- FIG. 6 illustrates a flowchart of an exemplary process 600 for transmitting a video frame in a UAV system according to some embodiments of the present disclosure.
- the process 600 may be implemented by the processor 204 .
- in step 602, video data may be recorded.
- step 602 may be implemented by the video recording device 206 .
- Step 602 may be similar to Step 302 of FIG. 3 , and is not repeated here.
- a video frame may be extracted from the video data.
- step 604 may be implemented by the data acquisition module 502 .
- the video frame may refer to a frame of the video data.
- the video frame may correspond to a length of time.
- the video frame may be a still image.
- the video frame may be compressed.
- step 606 may be implemented by the data compressing and packing module 504 .
- the video frame may be segmented into a plurality of sub-frames. Data associated with each of the plurality of sub-frames may be compressed. Each of the plurality of sub-frames may correspond to a portion of the video frame.
- the video frame may be segmented into a plurality of sub-frames of the same or different lengths.
- Related description about the compression may be found in the description of the data compressing and packing module 504 .
- a time period for compressing the video frame may be determined.
- step 608 may be implemented by the physical layer processing module 506 .
- the time period for compressing the video frame may be determined based on the length of the video frame and the compressing algorithms used.
- the time period for compressing the video frame may refer to the sum of time periods of compressing all the sub-frames of the video frame.
- the time period of compressing the video frame in the present application may also refer to a time period of compressing a sub-frame of the video frame.
- the compressed video frame (or sub-frames thereof) may be further packed in step 606 .
- the time period determined in step 608 may include the time period for both compressing and packing the video frame (or the sub-frames thereof).
- the time period for compressing the video frame may include other processing time of the video frame before transmission, for example, the time for adding redundancy check bits to the packed video frame.
- a frame header may be extracted from the video frame.
- step 610 may be implemented by the data acquisition module 502 .
- the frame header may include control information of the video frame.
- the control information may include synchronization information, address information, error control information, coding information, etc.
- the synchronization information may include a start time point and/or an end time point of the video frame.
- a first time point corresponding to the frame header of the video frame may be determined.
- step 612 may be implemented by the physical layer processing module 506 .
- the first time point may refer to the start time point of the video frame.
- the frame header of the video frame may correspond to a frame synchronization pulse signal. The time point that the frame synchronization pulse signal is received at the physical layer may further be determined as the first time point.
- a second time point may be determined based on the first time point obtained in step 612 and the time period for compressing (and/or packing) the video frame obtained in step 608 .
- step 614 may be implemented by the physical layer processing module 506 .
- the second time point may refer to a time point when the compression and/or packing of the video frame is completed.
- the second time point may refer to a time point that the compression and/or packing of the sub-frame is completed.
- Detailed description about the determination of the second time point may be found in FIG. 8A and FIG. 8B .
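- The determination of the second time point reduces to one addition: the first time point (the frame synchronization pulse received at the physical layer) plus the time period for compressing and/or packing. The numeric values below are hypothetical example values.

```python
first_time_point = 12.500   # frame sync pulse received at the physical layer (ms)
compress_period = 3.125     # time to compress and pack the sub-frame (ms)

# Second time point: when the compressed/packed sub-frame is ready to transmit.
second_time_point = first_time_point + compress_period   # 15.625 ms
```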
- a third time point corresponding to a frame header of a physical layer frame may be synchronized with the second time point.
- step 616 may be implemented by the physical layer processing module 506 .
- the third time point corresponding to the frame header of the physical layer frame may be synchronized with the second time point at which the compression and/or packing of the sub-frame is completed.
- the physical layer frame may control a data transmission and the frame header of the physical layer frame may indicate the start of the data transmission.
- the frame header of the physical layer frame may indicate the start of transmission of the compressed and/or packed sub-frame through the physical layer.
- the transmission time of a physical layer frame may be a fixed value, as the lengths of the physical layer frames may be configured to be the same.
- the transmission time of a present physical layer frame may be extended or shortened such that the frame header of the succeeding physical layer frame and the second time point (i.e., the time point at which the compression and/or packing of the sub-frame is completed) are synchronized.
- the transmission time of one or more of the succeeding physical layer frames from the present physical layer frame may be adjusted in a similar manner as transmission time of the present physical layer frame.
- only the transmission time of the present physical layer frame may be adjusted; by maintaining a certain relationship between the frame rate of the video frames and that of the physical layer frames, the transmission time of the succeeding physical layer frames may not need to be adjusted.
- the transmission time of the present physical layer frame may be adjusted each time a video frame is received for compression and transmission.
- the frame synchronization pulse signal indicating the time point of receiving the video frame from the video recording device may be transmitted to the physical layer.
- a snapshot may be taken on the timer of the physical layer to record the occurrence time point of the frame synchronization pulse signal at the physical layer.
- the physical layer processing module 506 may calculate a start time point of transmitting the video frame based on the occurrence time point of the frame synchronization pulse signal (i.e., the time point when the video frame is received from the video recording device for transmission) and the time for compressing and/or packing the video frame.
- the physical layer processing module 506 may further compare the calculated start time point of transmitting the video frame with a scheduled time point of generating a physical layer frame and determine whether the calculated start time point of transmitting the video frame and the next scheduled time point of generating a physical layer frame are synchronous. When it is determined that the calculated start time point of transmitting the video frame and the next scheduled time point of generating a physical layer frame are asynchronous, the physical layer processing module 506 may adjust the value of the physical layer timer such that the next scheduled time point of generating a physical layer frame is adjusted to be the same as the calculated start time point of transmitting the video frame. Detailed description about the synchronization may be disclosed in FIG. 8A and FIG. 8B .
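- The timer adjustment described in the preceding passage can be sketched as below, with all quantities in timer ticks. The function name and the specific numbers are assumptions for illustration; the point is that shifting the timer reading makes the next scheduled frame generation coincide with the calculated start time point of transmitting the video frame.

```python
def adjust_timer(timer_value, frame_period, pulse_time, compress_time):
    # Calculated start time point of transmitting the video frame.
    start_time = pulse_time + compress_time
    # Next scheduled time point of generating a physical layer frame.
    next_scheduled = ((timer_value // frame_period) + 1) * frame_period
    # Shift the timer reading so the next generation lands on start_time instead.
    return timer_value + (next_scheduled - start_time)

# Frame period of 100 ticks, sync pulse at tick 130, 45 ticks to compress:
# transmission should start at tick 175, but the next frame is scheduled at 200.
new_value = adjust_timer(timer_value=130, frame_period=100,
                         pulse_time=130, compress_time=45)
# new_value == 155: the timer now reaches 200 exactly 45 ticks later,
# i.e. at real tick 175, when the packed sub-frame is ready to transmit.
```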
- the compressed video frame may be transmitted at the third time point.
- step 618 may be implemented by the physical layer processing module 506 .
- the compressed and/or packed sub-frame may be transmitted at the third time point with the corresponding physical layer frame in the physical layer.
- the compressed and/or packed video frame may be transmitted to a remote device (e.g., the ground terminal 104 , the server 108 , the storage 110 , etc.) via the UAV transceiver 202 .
- in step 620, the process 600 may determine whether the transmission of the video frame (or the sub-frames thereof) is completed. If the transmission of the video frame is not completed, the process 600 may proceed back to step 616. For example, a plurality of third time points corresponding to the plurality of the sub-frames may be obtained respectively. Each of the plurality of sub-frames may be transmitted at the corresponding third time point in sequence. The plurality of sub-frames may be transmitted by repeating step 616 to step 618. If the transmission of the video frame is completed, the process 600 may proceed to step 622. In step 622, the processor 204 may wait for the next video data. When the next video data is received or recorded, it may be processed by the processor 204 by repeating steps 604 - 620.
- the steps as shown in FIG. 6 are for illustrative purposes, and are not intended to limit the protection scope of the present disclosure. In some embodiments, the process may be accomplished with one or more additional steps not described, and/or without one or more of the steps discussed above. Additionally, the order in which the steps of the process 600 are performed in FIG. 6 is not intended to be limiting. For example, step 606 and step 610 may be performed simultaneously or successively. As another example, step 614 and step 616 may be merged into one step. As another example, step 616 may be separated into two steps: a third time point determination and a third time point adjustment.
- the video data may include a plurality of video frames. Each of the video frames may be processed by repeating steps 604 - 620 .
- FIG. 7 illustrates a flowchart of an exemplary process 700 for configuring a frame rate of physical layer frames in a UAV system according to some embodiments of the present disclosure.
- the exemplary process 700 may be implemented by the processor 204 .
- a plurality of video frames may be received.
- the video frames may be extracted from the video data recorded in methods similar to those disclosed in step 602 and/or step 302 .
- a frame rate of the plurality of video frames may be obtained.
- Each of the video frames may be a still image.
- the frame rate of the plurality of video frames may refer to the number of still images obtained per second.
- the frame rate of twenty video frames may be obtained by dividing twenty by the total time (in seconds) of the twenty frames.
- a frame rate of a plurality of physical layer frames may be configured based on the frame rate of the plurality of video frames.
- the frame rate of the plurality of physical layer frames may refer to the number of physical layer frames per unit of time (e.g., second or millisecond).
- the frame header of the present physical layer frame may be synchronized with the present video frame to reduce wait time and delay.
- the frame rate of the physical layer frames may be configured so that the subsequent physical layer frames may be synchronized with the subsequent video frames, respectively. More particularly, the frame rate of physical layer frames may be adjusted such that each frame header of the physical layer frames is synchronized or almost synchronized with the time point when the compression of the video frame is completed.
- the frame rate of the plurality of physical layer frames may be an integer multiple of the frame rate of the plurality of video frames. Merely by way of example, the integer may be 2, 4, 6, 8, 10, 20, 25, 30, etc.
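The integer-multiple configuration can be sketched as below; the function name and default multiple are illustrative assumptions, not part of the disclosure.

```python
def configure_phy_frame_rate(video_fps, integer_multiple=20):
    """Configure the physical layer frame rate as an integer multiple
    (e.g., 2, 4, 6, 8, 10, 20, 25, 30) of the video frame rate."""
    return video_fps * integer_multiple

# A 30 fps video with a multiple of 20 yields 600 physical layer frames
# per second, i.e., one physical layer frame every 1/600 s.
print(configure_phy_frame_rate(30, integer_multiple=20))  # 600
```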
- the steps as shown in FIG. 7 are for illustrative purposes, and are not intended to limit the protection scope of the present disclosure. In some embodiments, the process may be accomplished with one or more additional steps not described, and/or without one or more of the steps discussed above. Additionally, the order in which the steps of the process are performed as illustrated in FIG. 7 is not intended to be limiting.
- FIG. 8A and FIG. 8B illustrate two schematic diagrams of a video transmission in a UAV system according to some embodiments of the present disclosure.
- FIG. 8A and FIG. 8B illustrate a video transmission from the UAV 102 to the ground terminal 104 .
- the video transmission may include capturing video data via the UAV 102 , receiving video frames at the UAV 102 , compressing the video frames at the UAV 102 , processing the video frames at the UAV 102 , transmitting the video frames at the UAV 102 according to the timings of the physical layer frames, transmitting the video frames from the UAV 102 to the ground terminal 104 wirelessly, receiving and decompressing the video frames at the ground terminal 104 , etc.
- a plurality of video frames may be extracted from a real-time video.
- the plurality of video frames may include a video frame 802 - 1 , a video frame 802 - 2 , etc.
- the real-time video may be received at a data interface (e.g., a universal serial bus (USB) interface, an IEEE1394 interface, or an RS-232 interface, etc.) communicatively connected to a camera (e.g., the video recording device 206 ).
- a frame timing of the camera may include a plurality of signals that indicate the time points at which the camera starts recording video frames.
- the plurality of signals may include a signal 800 - 1 , a signal 800 - 2 , a signal 800 - 3 , etc.
- the video frame 802 - 1 and the video frame 802 - 2 may be segmented into sub-frames and compressed into a plurality of compression packages, including a compression package 804 - 1 , a compression package 804 - 2 . . . a compression package 804 - 7 , etc.
- the segmentation and compression method may be similar to the description of data compressing and packing module 504 .
- the plurality of compression packages may be transmitted through physical layer under the control of a plurality of physical layer frames including a physical layer frame 806 - 1 , a physical layer frame 806 - 2 . . . a physical layer frame 806 - 8 , etc.
- the plurality of compression packages corresponding to the video frames may be received in the ground terminal 104 and may be decompressed as a plurality of decompressed video frames, including a decompressed video frame 808 - 1 , a decompressed video frame 808 - 2 . . . a decompressed video frame 808 - 5 , etc.
- a time point T 1 may correspond to the frame header of the video frame 802 - 1 .
- a time point T 4 may correspond to the time point when the compression package 804 - 1 is received and decompressed (the decompressed video frame that corresponds to the compression package 804 - 1 is denoted by 808 - 1 ).
- a time period t 3 may correspond to the time period for compressing and/or packing the first sub-frame of the video frame 802 - 1 .
- a time point T 2 may be determined based on the time point T 1 and the time period t 3 . The time point T 2 may indicate that the compression package 804 - 1 is compressed and ready to be transmitted.
- a wait time period t 2 may be defined as the time length that the compression package 804 - 1 needs to wait before it is transmitted.
- a time delay may be defined as a time period from the start time of the video frame to the time point when the video frame is available for streaming and/or playback at the ground terminal 104 .
- as shown in FIG. 8A , a time delay t 1 may refer to a time period from the time point T 1 to the time point T 4 , which may be unnecessarily extended due to the wait time period t 2 .
- the method disclosed in present disclosure may be implemented to eliminate or reduce the wait time, hence to reduce the time delay.
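The delay budget can be made concrete with illustrative numbers; all values below are assumptions chosen for the arithmetic, not timings taken from the figures.

```python
# Illustrative timings in milliseconds (assumed values, not disclosed ones).
t3_compress   = 5    # compress/pack the first sub-frame (T1 -> T2)
t2_wait       = 8    # wait for the next physical layer frame header (FIG. 8A only)
transfer_time = 12   # over-the-air transmission plus decompression

t1_delay = t3_compress + t2_wait + transfer_time  # FIG. 8A: delay includes the wait
t4_delay = t3_compress + transfer_time            # FIG. 8B: wait time eliminated

print(t1_delay, t4_delay)  # 25 17
```

Whatever the actual values, the saving equals the eliminated wait time period t 2 .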
- FIG. 8B illustrates a schematic diagram of a video transmission similar to FIG. 8A but with adjusted physical layer frame timings.
- a time point T 5 may correspond to the frame header of the video frame 802 - 1 .
- a time point T 8 may correspond to the time point when the compression package 804 - 1 is received and decompressed (the decompressed video frame that corresponds to the compression package 804 - 1 is denoted by 808 - 1 ).
- a time period t 5 may correspond to the time period for compressing and/or packing the first sub-frame of the video frame 802 - 1 .
- a time point T 7 may be determined based on the time point T 5 and the time period t 5 .
- the time point T 7 may indicate that the compression package 804 - 1 is compressed and ready to be transmitted.
- a time point T 6 may correspond to the frame header of the physical layer frame 806 - 1 .
- the physical layer frame 806 - 1 may be adjusted. For example, the length of physical layer frame 806 - 1 may be extended.
- an internal reading of a timer that corresponds to the frame header of the physical layer frame 806 - 1 may be increased. In particular, the reading of the timer may be adjusted such that the time point corresponding to the frame header of the physical layer frame 806 - 2 is synchronized with the time point T 7 .
- the transmission time of the physical layer frame 806 - 1 may be extended or shortened so that the time point corresponding to the frame header of the physical layer frame 806 - 2 is synchronized with the time point T 7 .
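The adjustment of the current physical layer frame can be sketched as a single subtraction; the function name and the millisecond units are illustrative assumptions.

```python
def phy_frame_length_adjustment(t6_header_ms, nominal_length_ms, t7_ready_ms):
    """How much the current physical layer frame must be lengthened
    (positive) or shortened (negative) so that the next frame header,
    at t6 + adjusted length, coincides with the time point T7 at which
    the first compression package is ready."""
    return t7_ready_ms - (t6_header_ms + nominal_length_ms)

# Header at 100 ms, nominal 10 ms frame, package ready at T7 = 117 ms:
# extend the current frame by 7 ms so the next header lands on T7.
print(phy_frame_length_adjustment(100, 10, 117))  # 7
```

A negative result corresponds to the shortening case mentioned above.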
- a time delay t 4 may be defined as a time period from the time point T 5 to the time point T 8 .
- the compression package 804 - 1 can be transmitted immediately when the compression is completed, i.e., without the wait time period t 2 as shown in FIG. 8A .
- only the transmission time of the physical layer frame 806 - 1 is adjusted; the transmission times of the physical layer frame 806 - 2 and the subsequent physical layer frames are not adjusted until a subsequent video frame is received.
- each of the plurality of video frames may be transmitted with the timing adjustment of a number of physical layer frames.
- the number of physical layer frames of which the timings need to be adjusted may be related to a frame rate of the plurality of physical layer frames and a frame rate of the plurality of video frames.
- the frame rate of the plurality of physical layer frames may be configured to be an integer multiple (e.g., twenty) of the frame rate of the plurality of video frames so that the number of physical layer frames of which the timings need to be adjusted may be greatly decreased.
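The reason an integer multiple reduces the number of adjustments can be sketched with a divisibility check (an illustrative simplification, not a disclosed formula): when the physical layer rate is an exact multiple of the video rate, every video frame period contains a whole number of physical layer frames, so frame headers re-align on their own once synchronized.

```python
def headers_stay_aligned(video_fps, phy_fps):
    """True if the physical layer frame rate is an integer multiple of
    the video frame rate, so that headers coincide periodically without
    further per-frame timing adjustment."""
    return phy_fps % video_fps == 0

print(headers_stay_aligned(30, 600))  # True  (600 = 20 x 30)
print(headers_stay_aligned(30, 590))  # False (misalignment accumulates)
```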
- FIG. 9 illustrates a schematic diagram of an exemplary OSI model.
- the OSI model 900 characterizes and standardizes the communication functions of a telecommunication or computing system.
- the OSI model 900 defines a networking framework to implement protocols in seven layers. From the bottom up, the seven layers may include a physical layer 902 , a data link layer 904 , a network layer 906 , a transport layer 908 , a session layer 910 , a presentation layer 912 , and an application layer 914 .
- the physical layer 902 defines electrical and physical specifications of a data connection.
- the physical layer 902 defines the relationship between a device and a physical transmission medium (e.g., a copper or fiber optical cable, a radio frequency, etc.). The relationship may include layout of pins, voltages, line impedance, cable specifications, signal timing and similar characteristics of connected devices, frequency (5 GHz or 2.4 GHz etc.) of wireless devices, etc.
- the physical layer 902 may convey a bit stream (e.g., an electrical impulse, light or radio signals, etc.) through the network at the electrical and mechanical level.
- the physical layer 902 may provide hardware means of sending and receiving data on a carrier, including defining cables, cards and physical aspects.
- the physical layer processing module 506 and/or the user interface 408 may operate in the physical layer 902 .
- the data link layer 904 may furnish transmission protocol knowledge, management, handling errors in the physical layer 902 , flow control, frame synchronization, etc.
- data packets may be encoded and decoded into bits.
- the data link layer 904 may be divided into two sub-layers: the Media Access Control (MAC) layer and the Logical Link Control (LLC) layer.
- the MAC sub-layer may control how a computer on the network gains access to data and permission to transmit the data.
- the LLC sub-layer may control frame synchronization, flow control and error checking.
- the processor 204 and/or the processor 404 of the UAV system 100 may operate in the data link layer 904 .
- the network layer 906 may provide switching and routing technologies, and create logical paths for transmitting data from node to node.
- the network 106 , the UAV transceiver 202 , and/or the ground terminal transceiver 402 may operate in the network layer 906 .
- accessing of the UAV 102 to the network 106 may be operated in the network layer 906 .
- the access may be a competition-based random access or a non-competition-based random access.
- the transport layer 908 may provide transparent transfer of data between end systems, or hosts.
- the transport layer 908 may be responsible for end-to-end error recovery and flow control.
- the transport layer 908 may ensure complete data transfer.
- the network 106 , the UAV transceiver 202 , and/or the ground terminal transceiver 402 may operate in the transport layer 908 .
- protocols that operate in the transport layer 908 may include TCP, UDP, SPX, etc.
- the session layer 910 may control connections between devices.
- the session layer 910 may establish, manage and terminate the connections between devices.
- the processor 204 and/or the processor 404 may operate in the session layer 910 .
- the presentation layer 912 may provide independence from differences in data representation (e.g., encryption) by translating from application to network format, and vice versa.
- the presentation layer 912 may work to transform data into the form that the application layer 914 can accept.
- the data compressing and packing module 504 may operate in the presentation layer 912 .
- the application layer 914 may provide application services, such as file transfers, e-mail, or other network software services, etc.
- the application layer 914 may perform functions including identifying communication partners, determining resource availability, etc.
- protocols that operate in the application layer 914 may include FTP, HTTP, DNS, etc.
- OSI model 900 is provided merely for the purposes of illustration, and is not intended to limit the scope of the present disclosure.
- a device or module of the UAV system 100 may work in multiple layers simultaneously or subsequently.
- those variations and modifications may not depart from the scope of the present disclosure.
- aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in an implementation combining software and hardware, all of which may generally be referred to herein as a “block,” “module,” “engine,” “unit,” “component,” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
Abstract
A video processing method includes determining a first time point corresponding to a frame header of a video frame, determining a time period for processing at least a portion of the video frame, determining a second time point based on the first time point and the time period, adjusting a third time point corresponding to a frame header of a physical layer frame to synchronize the third time point with the second time point, and starting to transmit the video frame at the second time point.
Description
- This application is a continuation of U.S. application Ser. No. 16/704,662, filed on Dec. 5, 2019, which is a continuation of International Application No. PCT/CN2017/099156, filed Aug. 25, 2017, the entire contents of both of which are incorporated herein by reference.
- This application relates to systems and methods for transmission synchronization, more particularly, to systems and methods for synchronizing the frame timing between a physical layer frame and a video frame.
- Unmanned movable platforms (UMPs) such as unmanned aerial vehicles (UAVs) have been widely used in various fields such as aerial photography, surveillance, scientific research, geological survey, and remote sensing. The maneuvering of a UAV may be controlled by a user via a ground terminal. The UAV may record video data during flight and transmit the video data to the ground terminal. The ground terminal may display the video data synchronously with the recording.
- In order to display the video data at the ground terminal without causing lags, it is important for the UAV to reduce time delay of transmitting the video data from the UAV to the ground terminal.
- According to an aspect of the present disclosure, a method for synchronizing video transmission with physical layer is provided. The method may be implemented on a computing device including at least one processor and a storage. The method may include determining a first time point corresponding to a frame header of a video frame, determining a second time point corresponding to a frame header of a physical layer frame based at least in part on the first time point, and starting transmitting the video frame at the second time point.
- In some embodiments, the method may further include generating the physical layer frame at the second time point.
- In some embodiments, the determining a second time point corresponding to a frame header of a physical layer frame based at least in part on the first time point may include determining the second time point based at least in part on the first time point, and synchronizing a third time point corresponding to a frame header of a physical layer frame with the second time point, the physical layer frame corresponding to the video frame.
- In some embodiments, the method may further include compressing the video frame before transmitting the video frame at the second time point.
- In some embodiments, the method may further include segmenting the video frame into a plurality of sub-frames, and compressing data associated with each of the plurality of sub-frames.
- In some embodiments, the determining the second time point may include determining a time period for compressing a sub-frame of the plurality of sub-frames, and determining the second time point based on the first time point and the time period for compressing the sub-frame.
- In some embodiments, the determining the second time point based on the first time point may include determining a time period for compressing at least a portion of the video frame, and determining the second time point based on the first time point and the time period.
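The claimed relation between the first time point, the compression time period, and the second time point reduces to a sum; the following is a hedged sketch with a hypothetical function name and assumed millisecond units.

```python
def second_time_point(first_time_point_ms, compression_period_ms):
    """The second time point (transmission start) equals the frame
    header time point plus the time period needed to compress at least
    a portion (e.g., the first sub-frame) of the video frame."""
    return first_time_point_ms + compression_period_ms

print(second_time_point(1000, 7))  # 1007
```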
- In some embodiments, the frame header of the video frame may correspond to a frame synchronization pulse signal.
- In some embodiments, the video frame may be extracted from a real-time video stream transmitted by a video recording device.
- In some embodiments, the video frame may be extracted from a real-time video received at a data interface communicatively connected to a video recording device.
- In some embodiments, the method may further include obtaining a frame rate of the video frame, and configuring a frame rate of the physical layer frame based on the frame rate of the video frame.
- In some embodiments, the frame rate of the physical layer frame may be an integer multiple of the frame rate of the video frame.
- According to another aspect of the present disclosure, a system for synchronizing video transmission with physical layer is provided. The system may include a memory that stores one or more computer-executable instructions, and one or more processors configured to communicate with the memory. When executing the one or more computer-executable instructions, the one or more processors may be directed to determine a first time point corresponding to a frame header of a video frame, determine a second time point corresponding to a frame header of a physical layer frame based at least in part on the first time point, and start transmitting the video frame at the second time point.
- According to another aspect of the present disclosure, a non-transitory computer readable medium is provided. The non-transitory computer readable medium may include executable instructions. When the executable instructions are executed by at least one processor, the executable instructions may cause the at least one processor to effectuate a method. The method may include determining a first time point corresponding to a frame header of a video frame, determining a second time point corresponding to a frame header of a physical layer frame based at least in part on the first time point, and starting transmitting the video frame at the second time point.
- According to another aspect of the present disclosure, an unmanned aerial vehicle (UAV) is provided. The UAV may include at least one video recording device, at least one processor and at least one UAV transceiver. The at least one video recording device may be configured to record video data including a plurality of video frames. The at least one processor may be configured to determine a first time point corresponding to a frame header of a video frame of the plurality of video frames, and determine a second time point corresponding to a frame header of a physical layer frame based at least in part on the first time point. The at least one UAV transceiver may be configured to start transmitting the video frame at the second time point.
- Additional features will be set forth in part in the description which follows, and in part will become apparent to those skilled in the art upon examination of the following and the accompanying drawings or may be learned by production or operation of the examples. The features of the present disclosure may be realized and attained by practice or use of various aspects of the methodologies, instrumentalities and combinations set forth in the detailed examples discussed below.
- The present disclosure is further described in terms of exemplary embodiments. These exemplary embodiments are described in detail with reference to the drawings. These embodiments are non-limiting exemplary embodiments, in which like reference numerals represent similar structures throughout the several views of the drawings, and wherein:
- FIG. 1 illustrates a schematic diagram of an exemplary unmanned aerial vehicle (UAV) system according to some embodiments of the present disclosure;
- FIG. 2 illustrates a block diagram of an exemplary unmanned aerial vehicle (UAV) according to some embodiments of the present disclosure;
- FIG. 3 illustrates a flowchart of an exemplary process for video transmission in a UAV system according to some embodiments of the present disclosure;
- FIG. 4 illustrates a block diagram of an exemplary ground terminal in a UAV system according to some embodiments of the present disclosure;
- FIG. 5 illustrates a block diagram of an exemplary processor in a UAV system according to some embodiments of the present disclosure;
- FIG. 6 illustrates a flowchart of an exemplary process for transmitting a video frame in a UAV system according to some embodiments of the present disclosure;
- FIG. 7 illustrates a flowchart of an exemplary process for configuring a frame rate of physical layer frames in a UAV system according to some embodiments of the present disclosure;
- FIG. 8A and FIG. 8B illustrate two schematic diagrams of a video transmission in a UAV system according to some embodiments of the present disclosure; and
- FIG. 9 illustrates a schematic diagram of an exemplary open systems interconnection (OSI) model.
- In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant disclosure. However, it should be apparent to those skilled in the art that the present disclosure may be practiced without such details. In other instances, well known methods, procedures, systems, components, and/or circuitry have been described at a relatively high level, without detail, in order to avoid unnecessarily obscuring aspects of the present disclosure. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present disclosure. Thus, the present disclosure is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the claims.
- It will be understood that the terms “system,” “unit,” “module,” and/or “engine” used herein are one way to distinguish different components, elements, parts, sections, or assemblies of different levels in ascending order. However, the terms may be replaced by other expressions if they achieve the same purpose.
- It will be understood that when a unit, module or engine is referred to as being “on,” “connected to” or “coupled to” another unit, module, or engine, it may be directly on, connected or coupled to, or communicate with the other unit, module, or engine, or an intervening unit, module, or engine may be present, unless the context clearly indicates otherwise. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
- The terminology used herein is for the purposes of describing particular examples and embodiments only, and is not intended to be limiting. As used herein, the singular forms “a,” “an,” and “the” may be intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “include,” and/or “comprise,” when used in this disclosure, specify the presence of integers, devices, behaviors, stated features, steps, elements, operations, and/or components, but do not exclude the presence or addition of one or more other integers, devices, behaviors, features, steps, elements, operations, components, and/or groups thereof.
- The present disclosure provides systems and methods for video transmission synchronization in a UAV system. The present disclosure adjusts a frame timing of a physical layer transmission according to the frame timing of a video stream transmission. The video stream may be received directly from a video recording device carried on a UAV or at a data interface communicatively connected to a video recording device carried on the UAV. By synchronizing the frame timings between a physical layer frame and a video frame, the transmission delay of the video stream from a UAV to a ground terminal due to the wait time associated with each video frame may be reduced. Further, the present disclosure configures the frame rate of the physical layer frame to be an integer multiple of the frame rate of the video frame. Therefore, the number of adjustments of the frame timing during the video stream transmission between a UAV and a ground terminal may be reduced.
-
FIG. 1 illustrates a schematic diagram of an exemplary unmanned aerial vehicle (UAV)system 100 according to some embodiments of the present disclosure. TheUAV system 100 may include aUAV 102, aground terminal 104, anetwork 106, aserver 108, and astorage 110. - The
UAV 102 may be configured to collect data and transmit the collected data to theground terminal 104 and/or theserver 108 during flight. The data may include the flight status of theUAV 102, the battery usage of theUAV 102, information associated with the surrounding environment, etc. The data may further include text data, video data, audio data, etc. The video data may include videos, images, graphs, animations, audios, etc. TheUAV 102 may transmit the data to theground terminal 104 during flight to synchronously display the content of the data on theground terminal 104. - The
UAV 102 may be operated completely autonomously (e.g., by a computing system such as an onboard controller), semi-autonomously, or manually (e.g., by a user operating a control application implemented on a mobile device). In some embodiments, theUAV 102 may be operated by a user via theground terminal 104. TheUAV 102 may receive commands from an entity (e.g., human user or autonomous control system) and respond to such commands by performing one or more actions. For example, theUAV 102 may be controlled to take off from the ground, move in the air, move to target location or to a sequence of target locations, hover in the air, land on the ground, etc. As another example, theUAV 102 may be controlled to move at a specified velocity and/or acceleration or along a specified route in the air. Further, the commands may be used to control one ormore UAV 102 components described inFIG. 2 (e.g.,video recording device 206,sensor 210,flight controller 208, etc.). For example, some commands may be used to control the position, orientation, and/or operation of thevideo recording device 206. - The
ground terminal 104 may be configured to transmit, receive, output, display, and/or process information. For example, theground terminal 104 may receive information from theUAV 102, thenetwork 106, theserver 108, thestorage 110, etc. As another example, theground terminal 104 may transmit a command generated by a user to control theUAV 102. The command may include information to control the velocity, acceleration, altitude, and/or orientation of theUAV 102. As still another example, theground terminal 104 may display images or play videos taken by theUAV 102 to a user. As still another example, theground terminal 104 may process information received from theserver 108 to update the application installed on theground terminal 104. - In some embodiments, the
ground terminal 104 may include a desktop computer, a mobile device, a laptop computer, a tablet computer, or the like, or any combination thereof. In some embodiments, the mobile device may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof. In some embodiments, the smart home device may include a smart lighting device, a smart television, a smart video camera, an interphone, or the like, or any combination thereof. In some embodiments, the wearable device may include a smart bracelet, a smart footgear, a smart glass, a smart watch, a smart helmet, smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof. In some embodiments, the smart mobile device may include a smartphone, a gaming device, a navigation device, a point of sale (POS) device, or the like, or any combination thereof. - The
network 106 may be configured to facilitate exchange of information. In some embodiments, one or more components in the UAV system 100 (e.g., theUAV 102, theground terminal 104, theserver 108 and the storage 110) may send information to other components in theUAV system 100 via thenetwork 106. For example, theground terminal 104 may receive videos and/or images from theUAV 102 via thenetwork 106. In some embodiments, thenetwork 106 may be any type of a wired or wireless network, or a combination thereof. Merely by way of example, thenetwork 106 may include a cable network, a wire line network, an optical fiber network, a telecommunication network, an intranet, an Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a wide area network (WAN), a public telephone switched network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, or the like, or any combination thereof. In some embodiments, thenetwork 106 may include wired or wireless network access points such as base stations and/or internet exchange points (not shown), through which one or more components of theUAV system 100 may be connected to thenetwork 106 to exchange information. In some embodiments, the base stations and/or internet exchange points may be a Wi-Fi station. In some embodiments, the UAV and/or theground terminal 104 may access to thenetwork 106 through a competition-based random access manner or a non-competition-based random access manner. - The
server 108 may be configured to process data. The data may be received from theUAV 102, theground terminal 104, thenetwork 106, thestorage 110, etc. For example, theserver 108 may archive the flight log information from theUAV 102 in thestorage 110. As another example, theserver 108 may back up information from theground terminal 104 in thestorage 110. Theserver 108 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction-set computer (RISC), a microprocessor, or the like, or any combination thereof. In some embodiments, theserver 108 may be integrated in theground terminal 104. - The
storage 110 may be configured to acquire and/or store information. The information may be received from components of the UAV system 100 (e.g., theUAV 102, theground terminal 104, or theserver 108, etc.). For example, thestorage 110 may acquire information from theground terminal 104. In some embodiments, the information acquired and/or stored in thestorage 110 may include programs, software, algorithms, functions, files, parameters, data, texts, numbers, images, or the like, or any combination thereof. For example, thestorage 110 may store images collected by theUAV 102. As another example, thestorage 110 may store parameters (e.g., latitude, longitude, altitude of the UAV 102) from theground terminal 104. In some embodiments, thestorage 110 may include a mass storage, a removable storage, a volatile read-and-write memory, a read-only memory (ROM), or the like, or any combination thereof. In some embodiments, thestorage 110 may be implemented on a cloud platform including a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof. - It should be noted that the above description of the
UAV system 100 is merely provided for the purpose of illustration, and is not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations or modifications may be made under the teachings of the present disclosure. For example, the UAV 102 may be any type of remote device that records and transmits video data, including but not limited to a monitoring device, a wireless sensing network device, a smart home device, an onboard video device, etc. However, these variations and modifications may not depart from the scope of the present disclosure. -
FIG. 2 illustrates a block diagram of an exemplary unmanned aerial vehicle (UAV) 102 according to some embodiments of the present disclosure. The UAV 102 may include a UAV transceiver 202, a processor 204, a video recording device 206, a flight controller 208, a sensor 210, an inertial measurement unit (IMU) 212, and a storage medium 214. - The
UAV transceiver 202 may transmit and/or receive data. The data may include text, videos, images, audios, animations, graphs, or the like, or any combination thereof. In some embodiments, the UAV 102 may communicate with the ground terminal 104 via the UAV transceiver 202. For example, the UAV transceiver 202 may transmit a video processed by the processor 204 (e.g., a compressed video) to the ground terminal 104. As another example, the UAV transceiver 202 may receive commands from the ground terminal 104 to maneuver the motion of the UAV 102. The UAV transceiver 202 may be any type of transceiver. For example, the UAV transceiver 202 may be a radio frequency (RF) transceiver that is capable of transmitting or receiving data through a wireless network. More particularly, the wireless network may operate in various frequency bands, such as 433 MHz, 900 MHz, 2.4 GHz, 5 GHz, 5.8 GHz, etc. In some embodiments, the UAV transceiver 202 may include a transmitter and a receiver. The transmitter and the receiver may each implement part or all of the functions of the UAV transceiver 202. - The
processor 204 may process data. The data may be received from other components of the UAV 102 (e.g., the UAV transceiver 202, the video recording device 206, or the storage medium 214, etc.). For example, the processor 204 may process data received by the UAV transceiver 202. As another example, the processor 204 may process data to be transmitted to the ground terminal 104 via the UAV transceiver 202. As still another example, the processor 204 may receive video data from the video recording device 206. The processor 204 may compress the video data, adjust the video data, and transmit the adjusted video data to the ground terminal 104 via the UAV transceiver 202. The adjustment may include synchronizing the transmission of the video data with the physical layer. The processor 204 may receive data from the sensor 210, the flight controller 208, and the IMU 212 to evaluate the status of the UAV 102 and determine a course of action. For example, the processor 204 may constantly and/or periodically communicate with the IMU 212, which may measure the UAV 102's velocity and attitude data, and adaptively adjust the position of the UAV 102. In some embodiments, the processor 204 may include one or more processors (e.g., single-core processors or multi-core processors). In some embodiments, the processor 204 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction-set computer (RISC), a microprocessor, or the like, or any combination thereof. - The
video recording device 206 may capture video data. The video data may include images, videos, audios, graphs, animations, etc. The video recording device 206 may be a camera, a vidicon, a video recorder, a digital camera, an infrared camera, or an ultraviolet camera, etc. The video recording device 206 may send the captured video data to the processor 204 for processing. For example, the processor 204 may compress the video data and cause transmission of the compressed video data to the ground terminal 104. The ground terminal 104 may receive and decompress the video data. In some embodiments, the UAV 102 may include a holder/pan-tilt device (not shown in FIG. 2) for mounting and/or stabilizing the video recording device 206, for example, a gimbal with at least one axis. The processor 204 may control the operation of the holder/pan-tilt device to adjust the position of the video recording device 206. - The
flight controller 208 may control propulsion of the UAV 102 to control the pitch angle, roll angle, and/or yaw angle of the UAV 102. The flight controller 208 may alter the speed, orientation, and/or position of the UAV 102. For example, upon receiving data from the ground terminal 104 including a user-defined route plan, the processor 204 may interpret the data and send corresponding instructions to the flight controller 208. The flight controller 208 may alter the speed and/or position of the UAV 102 based on the instructions. - The
sensor 210 may collect relevant data. The relevant data may include information relating to the UAV status, the surrounding environment, or the objects within the environment. The sensor 210 may include a location sensor (e.g., a global positioning satellite (GPS) sensor, a mobile device transmitter enabling location triangulation), a vision sensor (e.g., an imaging device capable of detecting visible, infrared, or ultraviolet light, such as a camera), a proximity or range sensor (e.g., an ultrasonic sensor, LIDAR (Light Detection and Ranging), a time-of-flight or depth camera), an inertial sensor (e.g., an accelerometer, a gyroscope, an inertial measurement unit (IMU)), an altitude sensor, an attitude sensor (e.g., a compass, an IMU), a pressure sensor (e.g., a barometer), an audio sensor (e.g., a microphone), a field sensor (e.g., a magnetometer, an electromagnetic sensor), or the like, or any combination thereof. Data collected by the sensor 210 may be sent to the processor 204. - The
IMU 212 may measure an angular velocity (e.g., attitude change) and a linear acceleration (e.g., velocity change) of the UAV 102. For example, the IMU 212 may include one or more gyroscopes to measure attitude change (e.g., absolute or relative pitch angle, roll angle, and/or yaw angle) of the UAV, and may include one or more accelerometers to measure linear velocity change (e.g., acceleration along x, y, and/or z directions) of the UAV 102. In some embodiments, the IMU 212 may be integrated in the sensor 210. - The
storage medium 214 may store data. The data may be obtained from the UAV transceiver 202, the processor 204, the video recording device 206, the flight controller 208, the sensor 210, the IMU 212, and/or any other devices. The data may include image data, video data, metadata associated with the image data and the video data, instruction data, etc. The storage medium 214 may include a hard disk drive, a solid-state drive, a removable storage drive (e.g., a flash memory disk drive, an optical disk drive, etc.), a digital video recorder, or the like, or any combination thereof. - It should be noted that the above description of the
UAV 102 is merely provided for the purposes of illustration, and is not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations or modifications may be made under the teachings of the present disclosure. For example, some other components may be implemented in the UAV 102, e.g., a battery may be implemented in the UAV 102 as a power supply. As another example, the UAV 102 may include an electronic speed controller (ESC) for controlling or adjusting the rotation speed of motors mounted therein. However, those variations and modifications may not depart from the scope of the present disclosure. -
FIG. 3 illustrates a flowchart of an exemplary process 300 for video transmission in a UAV system according to some embodiments of the present disclosure. In some embodiments, the exemplary process 300 may be implemented by one or more processors of the UAV 102. - In
step 302, video data may be recorded. In some embodiments, step 302 may be implemented by the video recording device 206 (illustrated in FIG. 2). The video data may include videos, audios, images, graphs, animations, or the like, or any combination thereof. In some embodiments, the video data may include a plurality of video frames. In some embodiments, step 302 may be performed in response to a request received from the processor 204 or the ground terminal 104. For example, a user of the ground terminal 104 may send a request for recording video data via a user interface (e.g., the user interface 408 illustrated in FIG. 4). Upon receiving the request, the video recording device 206 may be activated to record the video data. As another example, commands to record the video data may be pre-programmed by the user via the ground terminal 104 and stored in the storage medium 214 of the UAV 102. The recording of the video data may be controlled by the processor 204 via executing the pre-programmed commands stored in the storage medium 214. - In
step 304, a video frame may be extracted from the video data. In some embodiments, step 304 may be implemented by the processor 204. As used herein, the video frame may refer to a frame of the video data. The video frame may correspond to a length of time depending on the compression algorithms. In some embodiments, the video frame may be a still image. The video frame may include a frame header. The frame header may indicate the start of the video frame transmission. The frame header may include information of the video frame, such as synchronization information, address information, error control information, coding information, etc. The synchronization information may include a start time point and/or an end time point of the video frame. In some embodiments, the frame header may correspond to a frame synchronization pulse signal. The frame synchronization pulse signal may indicate the start time point of the video frame. In some embodiments, the video frame may be segmented into a plurality of sub-frames. Data associated with each of the plurality of sub-frames may be compressed and/or packed into a data package for transmission. Each of the plurality of sub-frames may correspond to a portion of the video frame. The lengths of the segmented sub-frames may be the same or different. - In
step 306, transmission of a physical layer frame may be synchronized with transmission of the video frame over the physical layer. In some embodiments, step 306 may be implemented by the processor 204. In an open system interconnection (OSI) architecture, the physical layer defines the means of transmitting raw bits over a physical link connecting network nodes. The bit stream may be converted to a physical signal to be transmitted over a hardware transmission medium. The physical layer comprises a physical signaling sublayer that interfaces with the data link layer's media access control (MAC) sublayer and performs signaling control of transmission and reception. The physical layer frame according to the present application refers to a frame generated by the physical layer of the OSI architecture that performs signaling control of transmission and reception via the physical layer. The transmission of a data package (e.g., a compressed and/or packed video frame or sub-frame thereof) through the physical layer is controlled by a physical layer frame. In particular, transmission of the first bit of the physical layer frame may signal that transmission of the first bit of the data package (e.g., a compressed and/or packed video frame or sub-frame thereof) is allowed. That is, the timing of transmitting the first bit of the data package (also referred to as the frame timing of the video frame) needs to be synchronized with the timing of transmitting the first bit of the physical layer frame (also referred to as the frame timing of the physical layer frame). When the frame timing of the video frame is asynchronous to the frame timing of the physical layer frame, the video frame may not be transmitted immediately. When it is determined that the frame timing of the video frame is synchronous to the frame timing of one of the physical layer frames succeeding the current physical layer frame, transmission of the video frame through the physical layer may be allowed.
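The relationship between steps 304 and 306 can be sketched in a minimal timing model. All names, sizes, and time units below are hypothetical illustrations rather than part of the disclosure: a video frame is segmented into sub-frames, and a packed sub-frame that becomes ready partway through a physical layer frame must wait until the next frame header before it can be transmitted.

```python
def segment_frame(frame: bytes, sub_frame_size: int) -> list:
    """Split a video frame into sub-frames; the last one may be shorter."""
    return [frame[i:i + sub_frame_size]
            for i in range(0, len(frame), sub_frame_size)]

def wait_time(ready_time: int, header_time: int, frame_period: int) -> int:
    """Wait until the next physical layer frame header, assuming headers
    recur every `frame_period` time units starting at `header_time`.
    Returns 0 when the sub-frame is ready exactly on a frame header."""
    elapsed = (ready_time - header_time) % frame_period
    return (frame_period - elapsed) % frame_period

# A 10000-byte frame split into 4096-byte sub-frames: 4096 + 4096 + 1808.
sub_frames = segment_frame(b"\x00" * 10000, 4096)

# A sub-frame ready 3 units into a 10-unit physical layer frame waits 7
# units; one ready exactly on a frame header waits 0.
delays = [wait_time(t, header_time=0, frame_period=10) for t in (13, 20)]
```

Adjusting the frame header of the physical layer frame, as described above, amounts to driving this wait toward zero for the sub-frame being transmitted.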
As the UAV 102 may access the network 106 in a non-competition-based random access scheme, the physical layer frame for transmission control may be generated on a random basis. In some embodiments, the physical layer frames may be generated in accordance with a scheduling of a timer of the physical layer. The frame timing of the physical layer frame and the frame timing of the video frame may be asynchronous. As a result, the start of the physical layer frame transmission (also referred to as the frame header of the physical layer frame) and the start of the video frame transmission (also referred to as a time point at which a compressed and/or packed sub-frame can be transmitted) through the physical layer may be asynchronous. For example, when a sub-frame of the video frame is compressed and/or packed into a data package and ready to be transmitted in the physical layer, the expected transmission starting time may fall in the middle of transmission of a physical layer frame. The compressed sub-frame, i.e., the data package, may have to wait for a period of time (also referred to as a wait time) before being transmitted, thus causing a transmission delay. As the physical layer frame for transmission control may be generated on a random basis or in accordance with the scheduling of the timer of the physical layer, the wait time before a data package is allowed to be transmitted is unpredictable. In order to reduce the wait time before transmission of the data package, the frame header of the physical layer frame may be adjusted. For example, the time point corresponding to the frame header of the physical layer frame may be adjusted to be synchronized with the time point at which the compressed and/or packed sub-frames are expected to be transmitted. A detailed process of the synchronization may be found elsewhere in the present disclosure (e.g., in the description of FIG. 6, FIG. 8A, FIG. 8B, etc.). - In
step 308, the video frame may be transmitted. In some embodiments, step 308 may be implemented by the processor 204. The video frame may be sent to one or more other components of the UAV 102, such as the UAV transceiver 202, the storage medium 214, etc. For example, the video frame may be sent to the UAV transceiver 202 for processing. The processing may include amplification, analog-to-digital conversion, digital-to-analog conversion, etc. The UAV transceiver 202 may send the processed data to the ground terminal 104. As another example, the video frame may be stored in the storage medium 214. In some embodiments, the video frame may be transmitted in a physical layer. As described in step 306, the physical layer frame may be synchronized with the video frame so that the time point when the video frame (or a compressed and/or packed sub-frame thereof) can be transmitted is synchronized with the frame header of the physical layer frame. The video frame (or the compressed and/or packed sub-frames thereof) may be transmitted with the physical layer frame in the physical layer. - It should be noted that the steps as shown in
FIG. 3 are for illustrative purposes, and are not intended to limit the protection scope of the present disclosure. In some embodiments, the process 300 may be accomplished with one or more additional steps not described, and/or without one or more of the steps discussed above. Additionally, the order in which the steps of the process 300 are performed in FIG. 3 is not intended to be limiting. For example, one or more other optional steps may be added between any two steps in the process illustrated in FIG. 3. Examples of such steps may include storing or caching the video data, or the like. -
FIG. 4 illustrates a block diagram of an exemplary ground terminal 104 in a UAV system according to some embodiments of the present disclosure. The ground terminal 104 may include a ground terminal transceiver 402, a processor 404, a display 406, a user interface 408, and a memory 410. - The
ground terminal transceiver 402 may transmit and/or receive data. The data may include text, videos, images, audios, animations, graphs, or the like, or any combination thereof. In some embodiments, the ground terminal 104 may communicate with the UAV 102 via the ground terminal transceiver 402. For example, the ground terminal transceiver 402 may transmit an instruction (e.g., an instruction to record video data) from the processor 404 to the UAV transceiver 202. Upon receiving the instruction from the ground terminal 104, the UAV transceiver 202 may send the instruction to the video recording device to start recording video data. The ground terminal transceiver 402 may be any type of transceiver. For example, the ground terminal transceiver 402 may be a radio frequency (RF) transceiver that is capable of transmitting or receiving data over a wireless network. More particularly, the wireless network may operate in various frequency bands, such as 433 MHz, 900 MHz, 2.4 GHz, 5 GHz, 5.8 GHz, etc. In some embodiments, the ground terminal transceiver 402 may include a transmitter and a receiver. The transmitter and the receiver may each implement part or all of the functions of the ground terminal transceiver 402. - The
processor 404 may process data. The data may be received from the ground terminal transceiver 402, the memory 410, etc. The data may include information relating to the state of the UAV 102 (e.g., velocity, acceleration, altitude, etc.), image data, video data, a user instruction (e.g., an instruction to increase the altitude of the UAV 102), etc. In some embodiments, the processing of the data may include storing, classifying, selecting, transforming, calculating, estimating, encoding, decoding, or the like, or any combination thereof. For example, the processor 404 may decompress a compressed video frame received from the UAV 102. As another example, the ground terminal 104 may receive an application updating package from the server 108 via the ground terminal transceiver 402. The processor 404 may process the application updating package to update the related mobile application installed on the ground terminal 104. As still another example, the processor 404 may process data from the memory 410 to view the historical flight logs. In some embodiments, the processor 404 may include one or more microprocessors, field programmable gate arrays (FPGAs), central processing units (CPUs), digital signal processors (DSPs), etc. - The
display 406 may display information. The information may include text, audios, videos, images, or the like, or any combination thereof. The display 406 may include a liquid crystal display (LCD), a light emitting diode (LED)-based display, a flat panel display or curved screen, a television device, a cathode ray tube (CRT), or the like, or a combination thereof. In some embodiments, the display 406 may include a touchscreen. In some embodiments, the information displayed on the display 406 may relate to the status of the UAV 102, such as altitude, velocity, etc. In some embodiments, images and/or videos captured by the video recording device 206 may be displayed on the display 406. - The user interface 408 may receive user interactions with the
ground terminal 104 and generate one or more instructions to operate one or more components of the ground terminal 104 or other components in the UAV system 100. The one or more instructions may include an instruction to operate the ground terminal transceiver 402 to communicate with the UAV transceiver 202, an instruction to operate the processor 404 to process data received from the ground terminal transceiver 402, an instruction to operate the display 406 to display images and/or videos received, an instruction to operate the memory 410 to store data, an instruction to operate the UAV 102 to capture video data, or the like, or any combination thereof. In some embodiments, the user interface 408 may include one or more input devices, such as a touchscreen, a keyboard, a mouse, a trackball, a joystick, a stylus, an audio recognition device or application, etc. For example, a keyboard may be integrated in the ground terminal 104. An instruction may be generated in response to pressing down a plurality of keys on the keyboard in a certain sequence by a user. In some embodiments, the instruction may direct the UAV 102 to adjust flight attitudes. In some embodiments, the instruction may direct the video recording device 206 to take a photo or to record a video. In some embodiments, the instruction may direct the display 406 to display the photo or the video. In some embodiments, the instruction may direct the memory 410 to store data. For example, after a video is obtained, the memory 410 may receive an instruction to store the video. - The
memory 410 may store data. The data may be obtained from the ground terminal transceiver 402, the processor 404, the display 406, the user interface 408, and/or any other components in the UAV system 100. The data may include image data, video data, metadata associated with the image data and the video data, instruction data, etc. In some embodiments, the memory 410 may include a hard disk drive, a solid-state drive, a removable storage drive (e.g., a flash memory disk drive, an optical disk drive, etc.), a digital video recorder, or the like, or a combination thereof. - It should be noted that the above description of the
ground terminal 104 is merely provided for the purposes of illustration, and is not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations or modifications may be made under the teachings of the present disclosure. For example, one or more components of the ground terminal 104 may each include an independent storage block (not shown). However, those variations and modifications may not depart from the scope of the present disclosure. -
FIG. 5 illustrates a block diagram of an exemplary processor 204 in a UAV system according to some embodiments of the present disclosure. The processor 204 may include a data acquisition module 502, a data compressing and packing module 504, a physical layer processing module 506, and a storage module 508. The modules in the processor 204 may be connected in a wired or wireless manner. - The
data acquisition module 502 may obtain data. The data may relate to the UAV 102 (e.g., velocity of the UAV 102), the surrounding environment (e.g., temperature, barometric pressure, etc.), or objects within the environment. The data may include image data, audio data, video data, or the like, or any combination thereof. The data may be obtained from the UAV transceiver 202, the video recording device 206, the storage medium 214, or other components of the UAV system 100. In some embodiments, the video data may include a real-time video stream. The real-time video stream may be transmitted by the video recording device 206, or received at a data interface communicatively connected to the video recording device 206. The video data may include a plurality of video frames. Each of the plurality of video frames may have a frame header. The frame header of each of the plurality of video frames may correspond to a frame synchronization pulse signal. The frame synchronization pulse signal may indicate a start time point of transmitting each of the plurality of video frames in the physical layer. - The data compressing and
packing module 504 may compress and/or pack data. The data may be received from the data acquisition module 502, or the storage module 508. The data compressing and packing module 504 may be configured to reduce the redundancy in the data. The redundancy may include temporal redundancy, spatial redundancy, statistical redundancy, perceptual redundancy, etc. The data may be compressed using various compression algorithms. The compression algorithms may include lossless data compression and lossy data compression. The lossless data compression may include run-length encoding (RLE), Lempel-Ziv compression, Huffman coding, prediction by partial matching (PPM), bzip2 compression, or the like, or any combination thereof. The lossy data compression may include fractal compression, vector quantization, wavelet compression, linear predictive coding, or the like, or any combination thereof. In some embodiments, the data may be video data. The video data may be compressed under a video coding standard. The video coding standard may include, but is not limited to, H.120, H.261, H.262, H.263, H.264, high efficiency video coding, MPEG-4, or the like, or any combination thereof. - In some embodiments, the data compressing and
packing module 504 may also be configured to pack the compressed data. For example, the compressed data may be packed in various formats. Exemplary packaging formats may include an AVI format, a DV-AVI format, a WMV format, an MP4 format, an RMVB format, a MOV format, an FLV format, or the like, or any combination thereof. In some embodiments, the compressed data may be packed into a data package that complies with the physical layer protocols. - As described elsewhere in the present disclosure, a video frame may be segmented into a plurality of sub-frames of the same or different lengths. Each of the plurality of sub-frames may correspond to a portion of the video frame. The data compressing and
packing module 504 may compress data associated with each of the plurality of sub-frames. Alternatively or additionally, the data compressing and packing module 504 may further pack the compressed data. - The physical
layer processing module 506 may be configured to process data. The data may be obtained from the data acquisition module 502 or the data compressing and packing module 504. In some embodiments, the processing of the data may include extracting a frame header from a video frame, determining a time point and/or a time period, synchronizing two or more time points, or the like, or any combination thereof. For example, a video frame may be acquired by the data acquisition module 502 from the video recording device 206. A sub-frame of the video frame may then be compressed and/or packed by the data compressing and packing module 504. The compressed and/or packed sub-frame may be transmitted in the physical layer. A physical layer frame may control the data transmission (including the transmission of the compressed and/or packed sub-frame) in the physical layer. A frame header of the physical layer frame, which corresponds to a time point at which the data transmission starts, may be determined. However, the start of the physical layer frame transmission (also referred to as the frame header of the physical layer frame) and the start of the video frame transmission (also referred to as a time point at which a compressed and/or packed sub-frame can be transmitted) may be asynchronous. Therefore, the sub-frame may have to wait for a period of time (also referred to as a wait time) before being transmitted, and a delay may be caused. In order to reduce the wait time and delay, the frame header of the physical layer frame may be adjusted. For example, the time point corresponding to the frame header of the physical layer frame may be adjusted to be synchronized with the time point at which the compressed and/or packed sub-frames can be transmitted. A detailed process of the synchronization may be found in, for example, FIG. 6 and the description thereof. - The physical
layer processing module 506 may include a time determination unit 510, a time adjustment unit 520, and a data transmitting and receiving unit 530. - The
time determination unit 510 may be configured to determine a time point and/or a time period. The time point may correspond to a frame header of a video frame, or a frame header of a physical layer frame. The time point may also include a time point at which the compression and/or packing of the video frame is started or completed, a time point at which the compression and/or packing of the sub-frame of the video frame is started or completed, a time point at which a camera (e.g., the video recording device 206) starts to record video data, etc. In some embodiments, the time point may be determined based on a signal. For example, the frame header of the video frame may correspond to a frame synchronization pulse signal. The time point corresponding to the frame header of the video frame may be determined based on the frame synchronization pulse signal. As another example, the frame header of the physical layer frame may correspond to a physical layer pulse signal. The time point corresponding to the frame header of the physical layer frame may be determined based on the physical layer pulse signal. Merely by way of example, the frame synchronization pulse signal may indicate a time point of receiving the video frame from the video recording device. - The time period may include a time period for compressing and/or packing the video frame, a time period for compressing and/or packing the sub-frame of the video frame, a time period for transmitting the physical layer frame (and/or the compressed and packed sub-frames), or a time period during which the
processor 204 operates, etc. In some embodiments, the time period of compressing and/or packing the video frame (or sub-frames thereof) may be determined based on the length of the video frame and the compression and/or packing algorithms used. In some embodiments, the time determination unit 510 may be implemented by a timer. The timer may be a counter that provides an internal reading. The reading may rise every fixed time period (e.g., a microsecond, a millisecond, a second, etc.). In some embodiments, when the reading reaches a preset number, the timer may generate a signal and the reading may be zeroed. In some other embodiments, the timer may generate a signal periodically without zeroing its reading. For example, a signal may be generated whenever the reading is a multiple of 100. In some embodiments, the timer may record its internal reading when a frame synchronization pulse signal is detected (e.g., at a time point corresponding to the frame header of the video frame). In some embodiments, the timer may record the time period (or a number increase in its internal reading) of compressing and/or packing the sub-frame. - The
time adjustment unit 520 may be configured to adjust the time point and/or time period determined by the time determination unit 510. For example, the time adjustment unit 520 may adjust the time point corresponding to the frame header of the physical layer frame. More particularly, the time point may be adjusted such that the time point corresponding to the frame header of the physical layer frame may be synchronized with a time point when the compressed and/or packed sub-frames are ready to be transmitted. The compressed and/or packed sub-frame may be transmitted with the physical layer frame in the physical layer. In some embodiments, the period of time of a present physical layer frame may be extended or shortened such that the frame header of the succeeding physical layer frame is synchronized with the sub-frame to be transmitted. In some other embodiments, the internal reading of the timer may control the frame header of the physical layer frame. For example, a physical layer frame may be generated when the reading of the timer reaches a certain value. By changing the readings of the timer, the frame header of the physical layer frame may be shifted such that the frame header of the succeeding physical layer frame is synchronized with the time point at which the sub-frame is ready to be transmitted. A detailed description of the adjustment may be found in FIG. 6, FIG. 8A, and FIG. 8B. - The data transmitting and receiving
unit 530 may be configured to transmit and receive data. The data may include data received from the data compressing and packing module 504 (e.g., a compressed and/or packed video frame, a compressed and/or packed sub-frame), the storage module 508, or other components of the UAV system 100 (e.g., the UAV transceiver 202, the storage medium 214, etc.). For example, the compressed and/or packed sub-frame received from the data compressing and packing module 504 may be transmitted by the data transmitting and receiving unit 530 to the UAV transceiver 202 for processing. The compressed and/or packed sub-frame may be transmitted in the physical layer. The processing may include amplification, analog-to-digital conversion, digital-to-analog conversion, etc. The UAV transceiver 202 may send the processed data to the ground terminal 104. As another example, the data transmitting and receiving unit 530 may obtain data from the storage module 508. - The
storage module 508 may be configured to store data. The data may be obtained from the data acquisition module 502, the data compressing and packing module 504, the physical layer processing module 506, and/or other devices. The data may include programs, software, algorithms, functions, files, parameters, text, numbers, images, videos, audios, or the like, or any combination thereof. In some embodiments, the storage module 508 may include a hard disk drive, a solid-state drive, a removable storage drive (e.g., a flash memory disk drive, an optical disk drive, etc.), a digital video recorder, or the like, or a combination thereof. - It should be noted that the above description about the
processor 204 is merely for the purposes of illustration, and is not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teaching of the present disclosure. In some embodiments, the storage module 508 may be omitted from the processor 204 and/or incorporated into the storage medium 214. However, those variations and modifications may not depart from the scope of the present disclosure. -
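The counter-style timer described above for the time determination unit 510 can be sketched as follows. This is a minimal illustrative model, not the disclosed implementation; the class and method names are assumptions.

```python
class FrameTimer:
    """Illustrative sketch of the counter-style timer described above for
    the time determination unit 510 (class and method names are assumed)."""

    def __init__(self, period=100):
        self.reading = 0      # internal reading, incremented once per tick
        self.period = period  # a signal fires whenever reading is a multiple of this
        self.snapshots = []   # readings recorded on frame synchronization pulses

    def tick(self):
        """Advance the reading by one fixed time unit (e.g., one microsecond).
        Return True when the periodic signal would be generated, i.e., when
        the reading is a multiple of the preset number."""
        self.reading += 1
        return self.reading % self.period == 0

    def on_frame_sync_pulse(self):
        """Record the internal reading when a frame synchronization pulse is
        detected (e.g., at the frame header of a video frame)."""
        self.snapshots.append(self.reading)
        return self.reading
```

For example, after 250 ticks with a period of 100, the periodic signal has fired twice (at readings 100 and 200), and a frame synchronization pulse arriving at that moment is snapshotted at reading 250.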
FIG. 6 illustrates a flowchart of an exemplary process 600 for transmitting a video frame in a UAV system according to some embodiments of the present disclosure. In some embodiments, the process 600 may be implemented by the processor 204. - In
step 602, video data may be recorded. In some embodiments, step 602 may be implemented by the video recording device 206. Step 602 may be similar to step 302 of FIG. 3, and is not repeated here. - In
step 604, a video frame may be extracted from the video data. In some embodiments, step 604 may be implemented by the data acquisition module 502. As used herein, the video frame may refer to a frame of the video data. The video frame may correspond to a length of time. In some embodiments, the video frame may be a still image. - In
step 606, the video frame may be compressed. In some embodiments, step 606 may be implemented by the data compressing and packing module 504. The video frame may be segmented into a plurality of sub-frames. Data associated with each of the plurality of sub-frames may be compressed. Each of the plurality of sub-frames may correspond to a portion of the video frame. The video frame may be segmented into a plurality of sub-frames of the same or different lengths. Related description of the compression may be found in the description of the data compressing and packing module 504. - In
step 608, a time period for compressing the video frame may be determined. In some embodiments, step 608 may be implemented by the physical layer processing module 506. The time period for compressing the video frame may be determined based on the length of the video frame and the compression algorithms used. In some embodiments, the time period for compressing the video frame may refer to the sum of the time periods for compressing all the sub-frames of the video frame. In some embodiments, the time period for compressing the video frame in the present application may also refer to a time period for compressing a sub-frame of the video frame. - In some embodiments, the compressed video frame (or sub-frames thereof) may be further packed in
step 606. In this case, the time period determined in step 608 may include the time period for both compressing and packing the video frame (or the sub-frames thereof). In some embodiments, the time period for compressing the video frame may include other processing time of the video frame before transmission, for example, the time for adding redundancy check bits to the packed video frame. - In
step 610, a frame header may be extracted from the video frame. In some embodiments, step 610 may be implemented by the data acquisition module 502. The frame header may include control information of the video frame. The control information may include synchronization information, address information, error control information, coding information, etc. The synchronization information may include a start time point and/or an end time point of the video frame. - In
step 612, a first time point corresponding to the frame header of the video frame may be determined. In some embodiments, step 612 may be implemented by the physical layer processing module 506. As used herein, the first time point may refer to the start time point of the video frame. In some embodiments, the frame header of the video frame may correspond to a frame synchronization pulse signal. The time point at which the frame synchronization pulse signal is received at the physical layer may further be determined as the first time point. - In step 614, a second time point may be determined based on the first time point obtained in
step 612 and the time period for compressing (and/or packing) the video frame obtained in step 608. In some embodiments, step 614 may be implemented by the physical layer processing module 506. As used herein, the second time point may refer to a time point when the compression and/or packing of the video frame is completed. For example, the second time point may refer to a time point at which the compression and/or packing of the sub-frame is completed. Detailed description of the determination of the second time point may be found in FIG. 8A and FIG. 8B. - In
step 616, a third time point corresponding to a frame header of a physical layer frame may be synchronized with the second time point. In some embodiments, step 616 may be implemented by the physical layer processing module 506. For example, the third time point corresponding to the frame header of the physical layer frame may be synchronized with the second time point at which the compression and/or packing of the sub-frame is completed. The physical layer frame may control a data transmission, and the frame header of the physical layer frame may indicate the start of the data transmission. For example, the frame header of the physical layer frame may indicate the start of transmission of the compressed and/or packed sub-frame through the physical layer. The transmission time of a physical layer frame may be a fixed value, as the lengths of the physical layer frames may be configured to be the same. In some embodiments, the transmission time of a present physical layer frame may be extended or shortened such that the frame header of the succeeding physical layer frame and the second time point (i.e., the time point at which the compression and/or packing of the sub-frame is completed) are synchronized. In some embodiments, the transmission times of one or more of the physical layer frames succeeding the present physical layer frame may be adjusted in a similar manner as the transmission time of the present physical layer frame. In some embodiments, only the transmission time of the present physical layer frame is adjusted, and by maintaining a certain relationship between the frame rates of the video frames and the physical layer frames, the transmission times of the succeeding physical layer frames may not need to be adjusted. Alternatively, the transmission time of the present physical layer frame may be adjusted each time a video frame is received for compression and transmission.
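The adjustment in step 616 can be sketched as follows. This is a minimal model, assuming the timer readings and the compression period are expressed in the same tick units; the function name and parameter names are illustrative.

```python
def timing_adjustment(pulse_reading, compress_ticks, next_header_reading):
    """Sketch of step 616: compute how much to extend (positive result) or
    shorten (negative result) the present physical layer frame so that the
    succeeding frame header coincides with the second time point, i.e., the
    moment the compressed and/or packed sub-frame is ready to be transmitted.

    pulse_reading:       timer snapshot taken at the frame sync pulse
                         (first time point)
    compress_ticks:      time period for compressing and/or packing
    next_header_reading: scheduled timer reading of the succeeding physical
                         layer frame header (third time point, unadjusted)
    """
    ready_reading = pulse_reading + compress_ticks  # second time point
    return ready_reading - next_header_reading      # 0 means already synchronous
```

For instance, if the sync pulse is snapshotted at reading 1000, compression takes 35 ticks, and the next frame header is scheduled at reading 1020, the present frame would be extended by 15 ticks so that the succeeding header lands exactly on reading 1035.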
- In some embodiments, the frame synchronization pulse signal indicating the time point of receiving the video frame from the video recording device may be transmitted to the physical layer. A snapshot may be taken on the timer of the physical layer to record the occurrence time point of the frame synchronization pulse signal at the physical layer. The physical
layer processing module 506 may calculate a start time point of transmitting the video frame based on the occurrence time point of the frame synchronization pulse signal (i.e., the time point when the video frame is received from the video recording device for transmission) and the time for compressing and/or packing the video frame. The physical layer processing module 506 may further compare the calculated start time point of transmitting the video frame with a scheduled time point of generating a physical layer frame and determine whether the calculated start time point of transmitting the video frame and the next scheduled time point of generating a physical layer frame are synchronous. When it is determined that the calculated start time point of transmitting the video frame and the next scheduled time point of generating a physical layer frame are asynchronous, the physical layer processing module 506 may adjust the value of the physical layer timer such that the next scheduled time point of generating a physical layer frame is adjusted to be the same as the calculated start time point of transmitting the video frame. Detailed description of the synchronization may be found in FIG. 8A and FIG. 8B. - In
step 618, the compressed video frame may be transmitted at the third time point. In some embodiments, step 618 may be implemented by the physical layer processing module 506. For example, the compressed and/or packed sub-frame may be transmitted at the third time point with the corresponding physical layer frame in the physical layer. In some embodiments, the compressed and/or packed video frame may be transmitted to a remote device (e.g., the ground terminal 104, the server 108, the storage 110, etc.) via the UAV transceiver 202. - In
step 620, the process 600 may determine whether the transmission of the video frame (or the sub-frames thereof) is completed. If the transmission of the video frame is not completed, the process 600 may proceed back to step 616. For example, a plurality of third time points corresponding to the plurality of sub-frames may be obtained respectively. Each of the plurality of sub-frames may be transmitted at the corresponding third time point in sequence. The plurality of sub-frames may be transmitted by repeating step 616 to step 618. If the transmission of the video frame is completed, the process 600 may proceed to step 622. In step 622, the processor 204 may wait for the next video data. When the next video data is received or recorded, it may be processed by the processor 204 by repeating steps 604-620. - It should be noted that the steps as shown in
FIG. 6 are for illustrative purposes, and are not intended to limit the protection scope of the present disclosure. In some embodiments, the process may be accomplished with one or more additional steps not described, and/or without one or more of the steps discussed above. Additionally, the order in which the steps of the process 600 are performed in FIG. 6 is not intended to be limiting. For example, step 606 and step 610 may be performed simultaneously or successively. As another example, step 614 and step 616 may be merged into one step. As another example, step 616 may be separated into two steps: a third time point determination and a third time point adjustment. In some embodiments, the video data may include a plurality of video frames. Each of the video frames may be processed by repeating steps 604-620. -
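The time-period determination of step 608 can be illustrated with a toy timing model. The linear cost per unit of sub-frame length is an assumption standing in for whatever timing model the actual compression algorithm follows, and the function name is invented.

```python
def compression_periods(sub_frame_lengths, ticks_per_unit):
    """Toy model of step 608: the time period for compressing a video frame
    is the sum of the periods for compressing its sub-frames, each estimated
    here from the sub-frame length (the linear cost model is an assumption).

    Returns (total_period, per_sub_frame_periods).
    """
    per_sub_frame = [length * ticks_per_unit for length in sub_frame_lengths]
    return sum(per_sub_frame), per_sub_frame
```

A frame split into sub-frames of 100, 100, and 50 units at an assumed cost of 2 ticks per unit gives per-sub-frame periods of 200, 200, and 100 ticks, and a whole-frame period of 500 ticks; either granularity may serve as the "time period" in steps 608-614.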
FIG. 7 illustrates a flowchart of an exemplary process 700 for configuring a frame rate of physical layer frames in a UAV system according to some embodiments of the present disclosure. In some embodiments, the exemplary process 700 may be implemented by the processor 204. - In
step 702, a plurality of video frames may be received. The video frames may be extracted from video data recorded using methods similar to those disclosed in step 602 and/or step 302. - In
step 704, a frame rate of the plurality of video frames may be obtained. Each of the video frames may be a still image. The frame rate of the plurality of video frames may refer to the number of still images obtained per second. For example, the frame rate of twenty video frames may be obtained by dividing twenty by the total time (in seconds) of the twenty frames. - In
step 706, a frame rate of a plurality of physical layer frames may be configured based on the frame rate of the plurality of video frames. The frame rate of the plurality of physical layer frames may refer to the number of physical layer frames per unit of time (e.g., second or millisecond). As described elsewhere in the present disclosure, the frame header of the present physical layer frame may be synchronized with the present video frame to reduce wait time and delay. In step 706, the frame rate of the physical layer frames may be configured so that the subsequent physical layer frames may be synchronized with the subsequent video frames, respectively. More particularly, the frame rate of the physical layer frames may be adjusted such that each frame header of the physical layer frames is synchronized or almost synchronized with the time point when the compression of the video frame is completed. In some embodiments, the frame rate of the plurality of physical layer frames may be an integer multiple of the frame rate of the plurality of video frames. Merely by way of example, the integer may be 2, 4, 6, 8, 10, 20, 25, 30, etc. - It should be noted that the steps as shown in
FIG. 7 are for illustrative purposes, and are not intended to limit the protection scope of the present disclosure. In some embodiments, the process may be accomplished with one or more additional steps not described, and/or without one or more of the steps discussed above. Additionally, the order in which the steps of the process as illustrated in FIG. 7 are performed is not intended to be limiting. -
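Steps 704 and 706 can be sketched as follows, assuming the video frame rate is obtained from a count of frames over a measured interval; the default multiple of 20 is one of the example values given above, and the function name is illustrative.

```python
def configure_phy_frame_rate(num_video_frames, total_seconds, multiple=20):
    """Sketch of steps 704 and 706: obtain the video frame rate from a count
    of frames over a measured interval, then configure the physical layer
    frame rate as an integer multiple of it."""
    video_fps = num_video_frames / total_seconds  # e.g., 20 frames over 1 s -> 20 fps
    phy_fps = multiple * video_fps                # integer multiple of the video rate
    return video_fps, phy_fps
```

With twenty frames observed over one second and a multiple of 20, the physical layer would run at 400 frames per second, so a physical layer frame header recurs at each video frame's compression-completion time without per-frame re-adjustment.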
FIG. 8A and FIG. 8B illustrate two schematic diagrams of a video transmission in a UAV system according to some embodiments of the present disclosure. In some embodiments, FIG. 8A and FIG. 8B illustrate a video transmission from the UAV 102 to the ground terminal 104. - In some embodiments, as illustrated in
FIG. 8A and FIG. 8B, the video transmission may include capturing video data via the UAV 102, receiving video frames at the UAV 102, compressing the video frames at the UAV 102, processing the video frames at the UAV 102, transmitting the video frames at the UAV 102 according to the timings of the physical layer frames, transmitting the video frames from the UAV 102 to the ground terminal 104 wirelessly, receiving and decompressing the video frames at the ground terminal 104, etc. A plurality of video frames may be extracted from a real-time video. The plurality of video frames may include a video frame 802-1, a video frame 802-2, etc. In some embodiments, the real-time video may be received at a data interface (e.g., a universal serial bus (USB) interface, an IEEE 1394 interface, or an RS-232 interface, etc.) communicatively connected to a camera (e.g., the video recording device 206). A frame timing of the camera may include a plurality of signals that indicate the start time points at which the camera records videos. The plurality of signals may include a signal 800-1, a signal 800-2, a signal 800-3, etc. The video frame 802-1 and the video frame 802-2 may be segmented into sub-frames and compressed into a plurality of compression packages, including a compression package 804-1, a compression package 804-2 . . . a compression package 804-7, etc. The segmentation and compression method may be similar to that described for the data compressing and packing module 504. The plurality of compression packages may be transmitted through the physical layer under the control of a plurality of physical layer frames, including a physical layer frame 806-1, a physical layer frame 806-2 . . . a physical layer frame 806-8, etc. The plurality of compression packages corresponding to the video frames may be received at the ground terminal 104 and may be decompressed into a plurality of decompressed video frames, including a decompressed video frame 808-1, a decompressed video frame 808-2 . . . 
a decompressed video frame 808-5, etc. - As illustrated in
FIG. 8A, a time point T1 may correspond to the frame header of the video frame 802-1. A time point T4 may correspond to the time point when the compression package 804-1 is received and decompressed (the decompressed video frame that corresponds to the compression package 804-1 is denoted by 808-1). A time period t3 may correspond to the time period for compressing and/or packing the first sub-frame of the video frame 802-1. A time point T2 may be determined based on the time point T1 and the time period t3. The time point T2 may indicate that the compression package 804-1 is compressed and ready to be transmitted. However, the time point T2 is not synchronized with the time point corresponding to the frame header of the physical layer frame 806-2. Hence, the compression package 804-1 cannot be transmitted until a time point corresponding to the frame header of the physical layer frame 806-3 (shown as the time point T3). A wait time period t2 may be defined as the time length that the compression package 804-1 needs to wait before it is transmitted. A time delay may be defined as a time period from the start time of the video frame to the time point when the video frame is available for streaming and/or playback at the ground terminal 104. As shown in FIG. 8A, a time delay t1 may refer to a time period from the time point T1 to the time point T4, which may be unnecessarily extended due to the wait time period t2. The method disclosed in the present disclosure may be implemented to eliminate or reduce the wait time, and hence to reduce the time delay. -
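The wait time t2 and its effect on the delay can be illustrated numerically. The tick values below are invented for illustration, and the model deliberately ignores the transmission and decompression times that FIG. 8A includes in t1.

```python
def wait_before_transmission(frame_header_time, compress_time, phy_headers):
    """Illustrative model of FIG. 8A: the compressed package must wait from
    the moment compression completes (T2) until the next physical layer frame
    header at or after that moment (T3). Returns (wait, transmit_start).
    All time values are in arbitrary ticks."""
    ready = frame_header_time + compress_time          # T2 = T1 + t3
    start = min(h for h in phy_headers if h >= ready)  # next usable frame header
    return start - ready, start                        # (t2, T3)

# Unadjusted timing (as in FIG. 8A): headers every 10 ticks, compression takes 13.
wait, start = wait_before_transmission(0, 13, [0, 10, 20, 30])
# wait of 7 ticks; transmission cannot start until tick 20

# Adjusted timing (as in FIG. 8B): the present frame is extended so that a
# header falls exactly on tick 13.
wait_adj, start_adj = wait_before_transmission(0, 13, [0, 13, 23])
# no wait; transmission starts at tick 13
```

In this toy example the timing adjustment removes the 7-tick wait entirely, which is the mechanism by which t4 in FIG. 8B is shorter than t1 in FIG. 8A.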
FIG. 8B illustrates a schematic diagram of a video transmission similar to FIG. 8A but with adjusted physical layer frame timings. As illustrated in FIG. 8B, a time point T5 may correspond to the frame header of the video frame 802-1. A time point T8 may correspond to the time point when the compression package 804-1 is received and decompressed (the decompressed video frame that corresponds to the compression package 804-1 is denoted by 808-1). A time period t5 may correspond to the time period for compressing and/or packing the first sub-frame of the video frame 802-1. A time point T7 may be determined based on the time point T5 and the time period t5. The time point T7 may indicate that the compression package 804-1 is compressed and ready to be transmitted. A time point T6 may correspond to the frame header of the physical layer frame 806-1. To synchronize the time point corresponding to the frame header of the physical layer frame 806-2 with the time point T7, the physical layer frame 806-1 may be adjusted. For example, the length of the physical layer frame 806-1 may be extended. As another example, an internal reading of a timer that corresponds to the frame header of the physical layer frame 806-1 may be increased. In particular, the reading of the timer may be adjusted such that the time point corresponding to the frame header of the physical layer frame 806-2 is synchronized with the time point T7. The transmission time of the physical layer frame 806-1 may be extended or shortened so that the time point corresponding to the frame header of the physical layer frame 806-2 is synchronized with the time point T7. A time delay t4 may be defined as a time period from the time point T5 to the time point T8. With the timing adjustment of the physical layer frame 806-1, the compression package 804-1 can be transmitted immediately when the compression is completed, i.e., without the wait time period t2 shown in FIG. 8A.
In some embodiments, only the transmission time of the physical layer frame 806-1 is adjusted and the transmission time of the physical layer frame 806-2 and the subsequent physical layer frames are not adjusted until a subsequent video frame is received. - In some embodiments, as illustrated in
FIG. 8B, each of the plurality of video frames may be transmitted with the timing adjustment of a number of physical layer frames. The number of physical layer frames whose timings need to be adjusted may depend on the frame rate of the plurality of physical layer frames and the frame rate of the plurality of video frames. In some embodiments, the frame rate of the plurality of physical layer frames may be configured to be an integer multiple (e.g., twenty) of the frame rate of the plurality of video frames so that the number of physical layer frames whose timings need to be adjusted may be greatly decreased. -
FIG. 9 illustrates a schematic diagram of an exemplary OSI model. The OSI model 900 characterizes and standardizes the communication functions of a telecommunication or computing system. The OSI model 900 defines a networking framework to implement protocols in seven layers. From the bottom up, the seven layers may include a physical layer 902, a data link layer 904, a network layer 906, a transport layer 908, a session layer 910, a presentation layer 912, and an application layer 914. - The
physical layer 902 defines electrical and physical specifications of a data connection. The physical layer 902 defines the relationship between a device and a physical transmission medium (e.g., a copper or fiber optical cable, a radio frequency, etc.). The relationship may include the layout of pins, voltages, line impedance, cable specifications, signal timing and similar characteristics of connected devices, the frequency (e.g., 5 GHz or 2.4 GHz) of wireless devices, etc. The physical layer 902 may convey a bit stream (e.g., an electrical impulse, light or radio signals, etc.) through the network at the electrical and mechanical level. The physical layer 902 may provide hardware means of sending and receiving data on a carrier, including defining cables, cards and physical aspects. In some embodiments, the physical layer processing module 506 and/or the user interface 408 may operate in the physical layer 902. - The
data link layer 904 may furnish transmission protocol knowledge and management, handle errors arising in the physical layer 902, and provide flow control, frame synchronization, etc. In the data link layer 904, data packets may be encoded and decoded into bits. In some embodiments, the data link layer 904 may be divided into two sublayers: the Media Access Control (MAC) layer and the Logical Link Control (LLC) layer. The MAC sublayer may control how a computer on the network gains access to data and permission to transmit the data. The LLC layer may control frame synchronization, flow control and error checking. In some embodiments, the processor 204 and/or the processor 404 of the UAV system 100 may operate in the data link layer 904. - The
network layer 906 may provide switching and routing technologies, and create logical paths for transmitting data from node to node. In some embodiments, the network 106, the UAV transceiver 202, and/or the ground terminal transceiver 402 may operate in the network layer 906. In some embodiments, the access of the UAV 102 to the network 106 may be performed in the network layer 906. The access may be a contention-based random access or a non-contention-based random access. - The
transport layer 908 may provide transparent transfer of data between end systems, or hosts. The transport layer 908 may be responsible for end-to-end error recovery and flow control. The transport layer 908 may ensure complete data transfer. In some embodiments, the network 106, the UAV transceiver 202, and/or the ground terminal transceiver 402 may operate in the transport layer 908. In some embodiments, protocols that operate in the transport layer 908 may include TCP, UDP, SPX, etc. - The
session layer 910 may control connections between devices. The session layer 910 may establish, manage and terminate the connections between devices. In some embodiments, the processor 204 and/or the processor 404 may operate in the session layer 910. - The
presentation layer 912 may provide independence from differences in data representation (e.g., encryption) by translating from application to network format, and vice versa. The presentation layer 912 may work to transform data into the form that the application layer 914 can accept. In some embodiments, the data compressing and packing module 504 may operate in the presentation layer 912. - The
application layer 914 may provide application services, such as file transfers, e-mail, or other network software services, etc. The application layer 914 may perform functions including identifying communication partners, determining resource availability, etc. In some embodiments, protocols that operate in the application layer 914 may include FTP, HTTP, DNS, etc. - It should be noted that the
OSI model 900 is provided merely for the purposes of illustration, and is not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations or modifications may be made under the teachings of the present disclosure. For example, a device or module of the UAV system 100 may work in multiple layers simultaneously or sequentially. However, those variations and modifications may not depart from the scope of the present disclosure. - Having thus described the basic concepts, it may be rather apparent to those skilled in the art after reading this detailed disclosure that the foregoing detailed disclosure is intended to be presented by way of example only and is not limiting. Various alterations, improvements, and modifications may occur to and are intended for those skilled in the art, though not expressly stated herein. These alterations, improvements, and modifications are intended to be suggested by this disclosure, and are within the spirit and scope of the exemplary embodiments of this disclosure.
- Moreover, certain terminology has been used to describe embodiments of the present disclosure. For example, the terms “one embodiment,” “an embodiment,” and/or “some embodiments” mean that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Therefore, it is emphasized and should be appreciated that two or more references to “an embodiment” or “one embodiment” or “an alternative embodiment” in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined as suitable in one or more embodiments of the present disclosure.
Further, it will be appreciated by one skilled in the art that aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in a combination of software and hardware implementations, all of which may generally be referred to herein as a “block,” “module,” “engine,” “unit,” “component,” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
Claims (20)
1. A video processing method comprising:
determining a first time point corresponding to a frame header of a video frame;
determining a time period for processing at least a portion of the video frame;
determining a second time point based on the first time point and the time period;
adjusting a third time point corresponding to a frame header of a physical layer frame to synchronize the third time point with the second time point; and
starting to transmit the video frame at the second time point.
2. The method of claim 1 , wherein processing the at least a portion of the video frame includes:
compressing data associated with the video frame.
3. The method of claim 1 , further comprising:
segmenting the video frame into a plurality of sub-frames;
wherein processing the at least a portion of the video frame includes compressing a sub-frame of the plurality of sub-frames.
4. The method of claim 3 , wherein:
determining the time period includes determining a period for compressing the sub-frame according to a length of the sub-frame and a compression algorithm used for compressing the sub-frame; and
the time period includes the period for compressing the sub-frame.
5. The method of claim 4 , wherein:
determining the time period further includes determining a period for adding redundancy check bits to the compressed sub-frame; and
the time period further includes the period for adding the redundancy check bits.
6. The method of claim 1 , further comprising:
generating the physical layer frame at the second time point.
7. The method of claim 1 , wherein the frame header of the video frame corresponds to a frame synchronization pulse signal.
8. The method of claim 1 , wherein the video frame is extracted from a real-time video stream transmitted by a video recording device.
9. The method of claim 1 , wherein the video frame is extracted from a real-time video received at a data interface communicatively connected to a video recording device.
10. The method of claim 1 , further comprising:
obtaining a frame rate of the video frame; and
configuring a frame rate of the physical layer frame based on the frame rate of the video frame.
11. The method of claim 10 , wherein the frame rate of the physical layer frame is an integer multiple of the frame rate of the video frame.
12. The method of claim 1 , wherein adjusting the third time point corresponding to the frame header of the physical layer frame includes:
adjusting a length of the physical layer frame; or
adjusting an internal reading of a timer corresponding to the frame header of the physical layer frame.
13. A system comprising:
a memory storing one or more computer-executable instructions; and
one or more processors configured to communicate with the memory and execute the one or more computer-executable instructions to:
determine a first time point corresponding to a frame header of a video frame;
determine a time period for processing at least a portion of the video frame;
determine a second time point based on the first time point and the time period;
adjust a third time point corresponding to a frame header of a physical layer frame to synchronize the third time point with the second time point; and
start to transmit the video frame at the second time point.
14. The system of claim 13 , wherein the one or more processors are further configured to execute the one or more computer-executable instructions to:
compress data associated with the video frame.
15. The system of claim 13 , wherein the one or more processors are further configured to execute the one or more computer-executable instructions to:
segment the video frame into a plurality of sub-frames; and
compress a sub-frame of the plurality of sub-frames.
16. The system of claim 15 , wherein:
the one or more processors are further configured to execute the one or more computer-executable instructions to determine a period for compressing the sub-frame according to a length of the sub-frame and a compression algorithm used for compressing the sub-frame; and
the time period includes the period for compressing the sub-frame.
17. The system of claim 16, wherein:
the one or more processors are further configured to execute the one or more computer-executable instructions to determine a period for adding redundancy check bits to the compressed sub-frame; and
the time period further includes the period for adding the redundancy check bits.
18. The system of claim 13, wherein the one or more processors are further configured to execute the one or more computer-executable instructions to:
generate the physical layer frame at the second time point.
19. The system of claim 13, wherein the one or more processors are further configured to execute the one or more computer-executable instructions to adjust the third time point by:
adjusting a length of the physical layer frame; or
adjusting an internal reading of a timer corresponding to the frame header of the physical layer frame.
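Claims 15 through 17 describe deriving the processing time period from the sub-frame: a compression period determined by the sub-frame length and compression algorithm, plus a period for adding redundancy check bits. A simple throughput-based model of that derivation, with all rates and names assumed for illustration only:

```python
def segment_frame(frame: bytes, sub_frame_len: int) -> list:
    """Segment the video frame into sub-frames (claim 15); length assumed."""
    return [frame[i:i + sub_frame_len]
            for i in range(0, len(frame), sub_frame_len)]

def processing_period_us(sub_frame_len_bytes: int,
                         compress_rate_bytes_per_us: float,
                         crc_rate_bytes_per_us: float) -> float:
    """Assumed model: the time period is the compression period plus the
    period for adding redundancy check bits, both derived from the
    sub-frame length (the throughput rates are hypothetical)."""
    compression_us = sub_frame_len_bytes / compress_rate_bytes_per_us
    redundancy_us = sub_frame_len_bytes / crc_rate_bytes_per_us
    return compression_us + redundancy_us
```

Under this model, a 1000-byte sub-frame compressed at 10 bytes/µs and check-coded at 100 bytes/µs yields a 110 µs processing period.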
20. A non-transitory computer readable medium storing executable instructions that, when executed by at least one processor, cause the at least one processor to:
determine a first time point corresponding to a frame header of a video frame;
determine a time period for processing at least a portion of the video frame;
determine a second time point based on the first time point and the time period;
adjust a third time point corresponding to a frame header of a physical layer frame to synchronize the third time point with the second time point; and
start to transmit the video frame at the second time point.
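Claims 10 and 11 tie the physical layer frame rate to an integer multiple of the video frame rate. A hedged sketch of that configuration, where the default multiple of 2 is an arbitrary illustrative choice:

```python
def configure_phy_frame_rate(video_fps: int, multiple: int = 2) -> int:
    """Claim 11: the physical layer frame rate is an integer multiple of
    the video frame rate. The default multiple of 2 is an assumed example."""
    if multiple < 1:
        raise ValueError("multiple must be a positive integer")
    return video_fps * multiple

def phy_frame_duration_us(phy_fps: int) -> float:
    """Duration of one physical layer frame in microseconds."""
    return 1_000_000 / phy_fps
```

With a 30 fps video stream and a multiple of 2, the physical layer runs at 60 frames per second, so a physical layer frame header can be aligned with every video frame header.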
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/207,593 US20210227102A1 (en) | 2017-08-25 | 2021-03-19 | Systems and methods for synchronizing frame timing between physical layer frame and video frame |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/099156 WO2019037121A1 (en) | 2017-08-25 | 2017-08-25 | Systems and methods for synchronizing frame timing between physical layer frame and video frame |
US16/704,662 US11012591B2 (en) | 2017-08-25 | 2019-12-05 | Systems and methods for synchronizing frame timing between physical layer frame and video frame |
US17/207,593 US20210227102A1 (en) | 2017-08-25 | 2021-03-19 | Systems and methods for synchronizing frame timing between physical layer frame and video frame |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/704,662 Continuation US11012591B2 (en) | 2017-08-25 | 2019-12-05 | Systems and methods for synchronizing frame timing between physical layer frame and video frame |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210227102A1 true US20210227102A1 (en) | 2021-07-22 |
Family
ID=65439756
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/704,662 Active US11012591B2 (en) | 2017-08-25 | 2019-12-05 | Systems and methods for synchronizing frame timing between physical layer frame and video frame |
US17/207,593 Abandoned US20210227102A1 (en) | 2017-08-25 | 2021-03-19 | Systems and methods for synchronizing frame timing between physical layer frame and video frame |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/704,662 Active US11012591B2 (en) | 2017-08-25 | 2019-12-05 | Systems and methods for synchronizing frame timing between physical layer frame and video frame |
Country Status (4)
Country | Link |
---|---|
US (2) | US11012591B2 (en) |
EP (1) | EP3643074A4 (en) |
CN (1) | CN110537372B (en) |
WO (1) | WO2019037121A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11075817B1 (en) * | 2020-05-08 | 2021-07-27 | International Business Machines Corporation | Context aware network capacity augmentation using a flying device |
WO2022000497A1 (en) * | 2020-07-03 | 2022-01-06 | 深圳市大疆创新科技有限公司 | Display control method, device and system |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6157674A (en) * | 1996-03-21 | 2000-12-05 | Sony Corporation | Audio and video data transmitting apparatus, system, and method thereof |
CA2566126A1 (en) * | 2004-05-13 | 2005-12-01 | Qualcomm Incorporated | Synchronization of audio and video data in a wireless communication system |
US7979784B2 (en) * | 2006-03-29 | 2011-07-12 | Samsung Electronics Co., Ltd. | Method and system for enhancing transmission reliability of video information over wireless channels |
EP2362653A1 (en) * | 2010-02-26 | 2011-08-31 | Panasonic Corporation | Transport stream packet header compression |
US8700796B2 (en) * | 2010-09-22 | 2014-04-15 | Qualcomm Incorporated | MAC data service enhancements |
US8917608B2 (en) * | 2012-01-31 | 2014-12-23 | Qualcomm Incorporated | Low latency WiFi display using intelligent aggregation |
US9007384B2 (en) * | 2012-12-18 | 2015-04-14 | Apple Inc. | Display panel self-refresh entry and exit |
CN103916668A (en) * | 2013-01-04 | 2014-07-09 | 云联(北京)信息技术有限公司 | Image processing method and system |
JP6168462B2 (en) * | 2014-08-21 | 2017-07-26 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | Communication method and system for unmanned aerial vehicles |
MX2018002978A (en) * | 2015-09-17 | 2018-05-02 | Sony Corp | Transmission device, receiving device, and data processing method. |
CN106717000A (en) * | 2016-12-12 | 2017-05-24 | 深圳市大疆创新科技有限公司 | An image signal processing method and device |
2017
- 2017-08-25 CN CN201780089542.2A patent/CN110537372B/en active Active
- 2017-08-25 EP EP17922728.5A patent/EP3643074A4/en not_active Withdrawn
- 2017-08-25 WO PCT/CN2017/099156 patent/WO2019037121A1/en unknown

2019
- 2019-12-05 US US16/704,662 patent/US11012591B2/en active Active

2021
- 2021-03-19 US US17/207,593 patent/US20210227102A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
CN110537372B (en) | 2022-06-24 |
EP3643074A1 (en) | 2020-04-29 |
EP3643074A4 (en) | 2020-04-29 |
WO2019037121A1 (en) | 2019-02-28 |
US20200112653A1 (en) | 2020-04-09 |
CN110537372A (en) | 2019-12-03 |
US11012591B2 (en) | 2021-05-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11373338B2 (en) | Image padding in video-based point-cloud compression CODEC | |
US11348283B2 (en) | Point cloud compression via color smoothing of point cloud prior to texture video generation | |
US11019280B2 (en) | Systems and methods for video processing and display | |
US20200036944A1 (en) | Method and system for video transmission | |
US20210227102A1 (en) | Systems and methods for synchronizing frame timing between physical layer frame and video frame | |
US20200244970A1 (en) | Video processing method, device, aerial vehicle, system, and storage medium | |
JP2017208802A (en) | Method for encoding and decoding video of drone and related device | |
US10944991B2 (en) | Prediction for matched patch index coding | |
US11750844B2 (en) | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device | |
US20210112111A1 (en) | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device | |
US20240040149A1 (en) | Three-dimensional data storage method, three-dimensional data acquisition method, three-dimensional data storage device, and three-dimensional data acquisition device | |
JP6290949B2 (en) | Photo cluster detection and compression | |
CN112785700A (en) | Virtual object display method, global map updating method and device | |
US20240070924A1 (en) | Compression of temporal data by using geometry-based point cloud compression | |
WO2022252234A1 (en) | 3d map encoding apparatus and method | |
EP4202611A1 (en) | Rendering a virtual object in spatial alignment with a pose of an electronic device | |
US20230206575A1 (en) | Rendering a virtual object in spatial alignment with a pose of an electronic device | |
US20240119639A1 (en) | Apparatus and method for decoding 3d map, and encoded bitstream of 3d map | |
WO2022252238A1 (en) | 3d map compression method and apparatus, and 3d map decompression method and apparatus | |
Ubik et al. | On acquisition of high-quality video from unmanned vehicles | |
CN115442338A (en) | Compression and decompression method and device for 3D map | |
CN114326764A (en) | Rtmp transmission-based smart forestry unmanned aerial vehicle fixed-point live broadcast method and unmanned aerial vehicle system | |
KR20210107409A (en) | Method and apparatus for transmitting video content using edge computing service | |
CN117529912A (en) | Encoding and decoding method and device for 3D map |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: SZ DJI TECHNOLOGY CO., LTD., CHINA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WANG, XIAODONG;REEL/FRAME:055660/0176; Effective date: 20191122 |
STCB | Information on status: application discontinuation | Free format text: EXPRESSLY ABANDONED -- DURING EXAMINATION |