CN112119630A - Data sending and processing method, movable platform, display device, glasses and system

Info

Publication number
CN112119630A
Authority
CN
China
Prior art keywords
screen display
movable platform
information
image information
display information
Prior art date
Legal status
Pending
Application number
CN201980031843.9A
Other languages
Chinese (zh)
Inventor
赵财华
刘俞聪
向春
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN112119630A

Classifications

    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183: CCTV systems for receiving images from a single remote source
    • H04N7/185: CCTV systems for receiving images from a single remote source, from a mobile camera, e.g. for remote control
    • G02B27/01: Head-up displays
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • H04N5/04: Synchronising (details of television systems)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The specification discloses a data sending and processing method, a movable platform, a display device, glasses, and a system. The method includes: obtaining image information shot by a camera device carried by the movable platform (S110); generating on-screen display information time-synchronized with the image information based on state data of the movable platform (S120); and transmitting the image information and the on-screen display information to a user side through an image transmission channel (S130).

Description

Data sending and processing method, movable platform, display device, glasses and system
Technical Field
The present disclosure relates to the field of data communication technologies, and in particular, to a data sending method, a data processing method, a movable platform, a display device, glasses, and a system.
Background
When a user operates a movable platform such as an unmanned aerial vehicle, some states of the movable platform sometimes need to be checked in real time, or recorded for subsequent analysis. For example, an FPV racing drone (a traversing machine) is a small unmanned aerial vehicle for high-speed racing whose top speed can reach 280 km/h, so displaying and recording certain state information during flight is particularly important; in some competitions, the state information also needs to be recorded as a basis for the referees' rulings. In the prior art, the flight control board of an unmanned aerial vehicle is generally provided with a hardware module for processing the flight state, such as an On-Screen Display (OSD) module.
Disclosure of Invention
Based on this, the present specification provides a data sending and processing method, a movable platform, a display device, glasses, and a system, aiming to solve the technical problem that additional hardware is required in the existing manner of transmitting and processing on-screen display information.
In a first aspect, the present specification provides a data transmission method for a movable platform, the method comprising:
acquiring image information shot by the camera device carried by the movable platform;
generating on-screen display information that is time-synchronized with the image information based on the state data of the movable platform;
and sending the image information and the on-screen display information to a user side through an image transmission channel.
In a second aspect, the present specification provides a data processing method, for a user side, where the data processing method includes:
receiving, through an image transmission channel, image information from the movable platform, and on-screen display information time-synchronized with the image information;
displaying the image information, and synchronously displaying the on-screen display information when displaying the image information.
In a third aspect, the present specification provides a data processing method, including:
the method comprises the steps that a movable platform obtains image information shot by a camera device carried by the movable platform, and sends the image information to a user side through an image transmission channel;
the movable platform generates on-screen display information time-synchronized with the image information based on the state data of the movable platform, and sends the on-screen display information to the user side through the image transmission channel;
and the user side displays the image information and synchronously displays the on-screen display information when displaying the image information.
In a fourth aspect, the present specification provides a movable platform comprising a memory and a processor;
the memory is used for storing a computer program;
the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
acquiring image information shot by the camera device carried by the movable platform;
generating on-screen display information that is time-synchronized with the image information based on the state data of the movable platform;
and sending the image information and the on-screen display information to a user side through an image transmission channel.
In a fifth aspect, the present specification provides a display device comprising a display component, a memory, and a processor;
the display component is used for displaying;
the memory is used for storing a computer program;
the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
receiving, through an image transmission channel, image information from the movable platform, and on-screen display information time-synchronized with the image information;
displaying the image information, and synchronously displaying the on-screen display information when displaying the image information.
In a sixth aspect, the present specification provides eyewear comprising a display assembly, a memory, and a processor;
the display component is used for displaying;
the memory is used for storing a computer program;
the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
receiving, through an image transmission channel, image information from the movable platform, and on-screen display information time-synchronized with the image information;
displaying the image information, and synchronously displaying the on-screen display information when displaying the image information.
In a seventh aspect, the present specification provides a movable platform system comprising a movable platform and a display device;
the movable platform is used for acquiring image information shot by the camera device carried by the movable platform and sending the image information to a user side through an image transmission channel;
the movable platform is used for generating on-screen display information which is time-synchronized with the image information based on the state data of the movable platform, and sending the on-screen display information to the user side through the image transmission channel;
the user side is used for displaying the image information and synchronously displaying the on-screen display information when the image information is displayed.
In an eighth aspect, the present specification provides a computer readable storage medium having stored thereon a computer program which can be processed by a processor to implement the method described above.
The embodiments of the specification provide a data sending and processing method, a movable platform, a display device, glasses, and a system. The movable platform obtains image information shot by a camera device carried by the movable platform and generates on-screen display information time-synchronized with the image information based on the state data of the movable platform; the movable platform then sends the image information and the on-screen display information to the user side through an image transmission channel, so that the user side displays the image information and synchronously displays the on-screen display information when displaying the image information. Because the image information and the state data of the movable platform are transmitted separately, the increase in delay, power consumption, and heating caused by having an on-screen display module process the images is avoided, and no additional hardware needs to be added.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure as claimed.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description show only some embodiments of the present disclosure; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic diagram of an on-screen display module application;
fig. 2 is a flowchart illustrating a data transmission method according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of data transmission between a movable platform and a user side;
fig. 4 is a flowchart illustrating an embodiment of the data transmission method in fig. 2;
FIG. 5 is a schematic illustration of a display interface displaying image information and on-screen display information;
fig. 6 is a schematic flow chart of a data processing method provided in an embodiment of the present specification;
FIG. 7 is a schematic flow chart diagram illustrating another embodiment of the data processing method of FIG. 6;
FIG. 8 is a flow chart illustrating a data processing method according to another embodiment of the present disclosure;
FIG. 9 is a schematic flow chart diagram illustrating another embodiment of the data processing method of FIG. 8;
FIG. 10 is a schematic flow chart diagram illustrating a further embodiment of the data processing method of FIG. 8;
FIG. 11 is a schematic block diagram of a movable platform provided by an embodiment of the present description;
fig. 12 is a schematic block diagram of a display device provided in an embodiment of the present specification;
FIG. 13 is a schematic block diagram of eyewear provided in an embodiment of the present description;
fig. 14 is a schematic view of one of the eyeglasses of fig. 13.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are some, but not all, of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present specification without any creative effort belong to the protection scope of the present specification.
The flow diagrams depicted in the figures are merely illustrative and do not necessarily include all of the elements and operations/steps, nor do they necessarily have to be performed in the order depicted. For example, some operations/steps may be decomposed, combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
As shown in FIG. 1, an on-screen display (OSD) module 11 is connected to a flight control module 12 and an imaging device 13 mounted on the unmanned aerial vehicle 10. This hardware module superimposes the flight state onto the images shot by the unmanned aerial vehicle and then transmits the result to the display device for display and recording, for example in a live-view manner. However, this superimposition approach requires the OSD data and the images to be superimposed and associated, which needs extra hardware, increases power consumption, and introduces a large processing delay that may increase the overall delay of the unmanned aerial vehicle. The increased power consumption shortens the endurance time of the unmanned aerial vehicle, and the image processing tends to cause significant local heating; in cases where the unmanned aerial vehicle is controlled by relying on the transmitted image data, the image transmission delay can lead to harmful results, and the unmanned aerial vehicle may even crash.
In order to solve the above technical problems, the present specification provides a data transmission method, a data processing method, a movable platform, a display device, glasses, and a system.
Some embodiments of the present description will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Referring to FIG. 2, FIG. 2 is a flowchart illustrating a data transmission method according to an embodiment of the present disclosure. The data sending method can be applied to a movable platform and is used for processes such as data transmission between the movable platform and a user side. The movable platform includes an unmanned aerial vehicle, a cloud trolley, or the like; the user side may be a mobile phone, a tablet computer, a notebook computer, a desktop computer, a personal digital assistant, a wearable device, a remote controller, or the like, for example FPV (First Person View) glasses. Further, the drone may be a rotary-wing drone, such as a quad-rotor, hexa-rotor, or eight-rotor drone, or a fixed-wing drone, such as a cross-plane.
Further, data is transmitted between the movable platform and the user side through wireless channels. For example, as shown in FIG. 3, the wireless channel from the movable platform to the user side, referred to as the image transmission channel, is used for transmitting image information captured by the camera device mounted on the movable platform and may also be used for transmitting state data of the movable platform, and the like.
Illustratively, as shown in FIG. 3, the wireless channel from the user side to the movable platform, called the uplink channel, is used for transmitting remote control data; for example, when the movable platform is an unmanned aerial vehicle, the uplink channel is used for transmitting flight control instructions and control instructions such as photographing, video recording, and return flight.
As shown in FIG. 2, the data transmission method for a movable platform of this embodiment includes steps S110 to S130.
And S110, acquiring image information shot by the camera device carried by the movable platform.
The movable platform is, for example, a traversing machine, a small unmanned aerial vehicle for high-speed racing whose top speed can reach 280 km/h.
For example, the movable platform may be provided with a camera device, or the user may add one; the camera device may include a camera and may further include a gimbal for improving the quality of the captured image information.
In some embodiments, if the movable platform receives an image transmission instruction sent by a user through an uplink channel, the image information captured by the camera mounted on the movable platform is acquired.
Illustratively, the user side is FPV glasses in communication connection with the movable platform, and a user may control the user side to send an image transmission instruction to the movable platform by clicking a button or a touch pad on the FPV glasses or by inputting a voice instruction to the FPV glasses or the like in an interactive manner, so as to notify the movable platform to start to capture an image and send the captured image to the user side, which is convenient for the user to view and/or the user side to store image information.
And S120, generating on-screen display information time-synchronized with the image information based on the state data of the movable platform.
In particular, the user terminal includes a display assembly, which may include any type of display screen, including a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), a Thin Film Transistor (TFT) screen, plasma, etc.
Specifically, the on-screen display information is information to be sent to the user side and displayed by the user side.
In some embodiments, if the movable platform receives an image transmission instruction sent by the user side through an uplink channel, the movable platform starts generating on-screen display information time-synchronized with the image information based on the state data of the movable platform.
Illustratively, when a user starts flying by using a traversing machine, triggering the user side to send an image transmission instruction to the movable platform; and after receiving the image transmission instruction sent by the user side, the movable platform starts to acquire the state data of the movable platform, and generates on-screen display information which is time-synchronous with the image information according to the state data so as to send the on-screen display information to the user side for displaying by the user side.
In some embodiments, as shown in fig. 4, step S120 generates on-screen display information time-synchronized with the image information based on the state data of the movable platform, including step S121 and step S122.
And S121, acquiring the state data of the movable platform, and determining the display time of the state data according to the time for shooting the image information.
For example, the state data of the movable platform may represent the condition of the movable platform itself and/or the remote control environment of the movable platform. From the state data the user can determine, for example, the working state of the traversing machine, so as to adjust the operation of the movable platform in time under some conditions, such as stopping the flight, commanding a return flight, or switching the communication frequency.
Illustratively, the step S120 of generating on-screen display information time-synchronized with the image information based on the state data of the movable platform includes: and generating on-screen display information time-synchronized with the image information based on the first parameter information of the movable platform and/or second parameter information of data transmission between the movable platform and the user side.
Specifically, first parameter information of a movable platform and/or second parameter information of data transmission between the movable platform and a user side are/is acquired as state data of the movable platform.
Specifically, the first parameter information includes at least one of the unlocking time flightTime of the movable platform, the voltage uavBat, and the number of battery cells uavBatCells, where the unlocking time represents, for example, how long the traversing machine has been unlocked (armed). The user can learn the health condition, remaining battery, endurance time, and the like of the movable platform from the first parameter information.
Specifically, the second parameter information includes at least one of the signal strength signal, channel information ch, transmission delay, and transmission code rate of data transmission between the movable platform and the user side; for example, the signal strength represents the signal strength of data transmission on the image transmission channel, the channel information represents the channel number used for image transmission, and the transmission delay and transmission code rate represent the delay and code rate of data transmission on the image transmission channel. The user can learn the communication quality between the movable platform and the user side from the second parameter information.
Illustratively, the step S120 of generating on-screen display information time-synchronized with the image information based on the state data of the movable platform includes: and generating on-screen display information time-synchronized with the image information based on the state data of the movable platform and the third parameter information of the user side.
Specifically, the third parameter information includes at least one of the voltage glsBat of the user side and the number of battery cells glsBatCells; for example, glsBat and glsBatCells represent the voltage and number of battery cells of the FPV glasses communicatively connected to the movable platform.
Specifically, the movable platform obtains the third parameter information from the user side in communication connection with it; for example, the user side periodically sends the third parameter information to the movable platform.
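As a purely illustrative sketch (not part of the patent), the parameter fields named above can be grouped into a single state-data record; the grouping, the Python types, and the delay_ms/bitrate_mbps field names are assumptions:

```python
# Illustrative grouping of the state data fields named in this description.
from dataclasses import dataclass

@dataclass
class StateData:
    # First parameter information: the movable platform itself
    flightTime: int      # unlocking (armed) time of the movable platform
    uavBat: float        # platform battery voltage, in volts
    uavBatCells: int     # number of platform battery cells
    # Second parameter information: the image transmission link
    signal: int          # signal strength on the image transmission channel
    ch: int              # channel number used for image transmission
    delay_ms: int        # transmission delay, in milliseconds
    bitrate_mbps: float  # transmission code rate, in Mbps
    # Third parameter information: the user side (e.g. FPV glasses)
    glsBat: float        # user-side battery voltage, in volts
    glsBatCells: int     # number of user-side battery cells
```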
In some embodiments, the movable platform determines the display time of the status data based on the time at which the image information was captured.
For example, the movable platform acquires the state data once each time the camera device captures image information, so the display time corresponding to the state data can be determined from the capture time of the image information, thereby enabling the generation of on-screen display information that is time-synchronized with the image information.
For example, if the image information is captured at 00:00:00,116 to 00:00:00,283, the display time corresponding to the state data is 00:00:00,116 to 00:00:00,283.
And S122, generating the on-screen display information according to the state data and the display time.
Illustratively, the state data acquired at one time, together with its corresponding display time, forms one piece of on-screen display information.
For example, the on-screen display information is shown in table 1, where table 1 includes three pieces of on-screen display information, which are on-screen display information corresponding to the first three pieces of image information.
Table 1: On-screen display information (reproduced as an image in the original publication)
In some embodiments, step S120 generates on-screen display information time-synchronized with the image information based on the state data of the movable platform, including: and generating on-screen display information in a preset format according to the state data and the display time.
Illustratively, on-screen display information in SRT format, binary DAT format, text format, SSA format, or SUB format is generated based on the status data and the display time.
The SRT-format on-screen display information is a text file; generating it is standard and simple. It consists of a time code and subtitle text: the display time can be used as the time code, and the state data of the movable platform and the third parameter information of the user side can be used as the subtitle text. In this embodiment, the display time of the on-screen display information, i.e. the OSD information, is determined according to the time axis of the image information; for SRT-format on-screen display information, for example, "00:00:00,050 --> 00:00:00,116" indicates the start time and duration for which the on-screen display information corresponding to the image information near time zero is displayed. SSA, short for Sub Station Alpha, is a subtitle format created by CS Low (also known as Kotus) to implement functions more complex than conventional subtitle formats such as SRT; it can define the font, font color, font size, and the like. SUB is a graphic-format subtitle, generally composed of an idx file and a sub file: the idx file is an index file containing the time codes at which subtitles appear, subtitle display attributes, and the like, while the sub file stores the subtitle content itself, such as the state data of the movable platform and the third parameter information of the user side.
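A minimal sketch of generating one SRT-style entry of the kind shown later in this description; the helper names are illustrative, and the subtitle wording follows the example quoted below rather than a format mandated by the patent:

```python
# Sketch: build one SRT-format on-screen display entry from state data and its display time.
def format_srt_time(seconds: float) -> str:
    ms = int(round(seconds * 1000))
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def make_osd_entry(index: int, start_s: float, end_s: float, state) -> str:
    # Time code line followed by the state data as the "subtitle" text.
    time_code = f"{format_srt_time(start_s)} --> {format_srt_time(end_s)}"
    subtitle = (f"signal {state.signal} ch {state.ch} flightTime {state.flightTime} "
                f"uavBat {state.uavBat}V glsBat {state.glsBat}V "
                f"uavBatCells {state.uavBatCells} glsBatCells {state.glsBatCells} "
                f"delay {state.delay_ms}ms bitrate {state.bitrate_mbps}Mbps")
    return f"{index}\n{time_code}\n{subtitle}\n"

# Example: state data acquired while the frames for 00:00:00,116 to 00:00:00,283 were shot
# entry = make_osd_entry(2, 0.116, 0.283, state)
```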
And S130, sending the image information and the on-screen display information to a user side through an image transmission channel.
Specifically, after the movable platform acquires the image information shot by the camera device it carries, it sends the acquired image information to the user side in real time through the image transmission channel; and after the movable platform generates the on-screen display information time-synchronized with the image information based on its state data, it sends the generated on-screen display information to the user side in real time through the image transmission channel, so that the user side can receive and display the image information and the on-screen display information in time.
In some embodiments, the on-screen display information includes a display time, and the display time is used for restricting the time and/or the display duration of the on-screen display information displayed at the user terminal.
In an alternative embodiment, the on-screen display information, i.e. the OSD information, is transmitted to the user side over the image transmission channel in the form of data packets alongside the image information.
In an alternative embodiment, the OSD information is encoded into the image information for transmission.
In an alternative embodiment, the user terminal may be a remote controller, a mobile terminal or flight glasses with display and control functions.
In some embodiments, as shown in FIG. 4, step S130 of sending the image information and the on-screen display information to the user side through an image transmission channel includes step S131.
S131, sending the on-screen display information to the user side through the image transmission channel, so that the user side displays the on-screen display information according to the display time.
For example, as shown in FIG. 5, the user side displays the image information on a display interface and synchronously displays the on-screen display information in a preset area of the display interface when displaying the image information. In this way, while viewing the image information on the display interface, the user can also pay attention to the on-screen display information, judge the working state of the traversing machine, and adjust the operation of the movable platform in time when necessary, for example by stopping the flight, commanding a return flight, or switching the communication frequency.
Illustratively, the user terminal displays the on-screen display information synchronously when displaying the image information according to the display time of the on-screen display information.
For example, when the display time of a certain piece of on-screen display information is 00:00:00,116 to 00:00:00,283, the user side displays that on-screen display information during the period from 00:00:00,116 to 00:00:00,283 after it starts playing the image information; new on-screen display information is displayed in the following period, for example the on-screen display information whose display time is 00:00:00,283 to 00:00:00,450.
Illustratively, the on-screen display information is sent to the user side through the image transmission channel, so that the user side displays the state data according to the display time.
Specifically, the user side synchronously displays the state data in the on-screen display information when displaying the image information, according to the display time of the on-screen display information. As shown in FIG. 5, the user side displays the state data from the on-screen display information in the lower right corner of the display interface, so that the user can conveniently learn the working state of the movable platform.
In some embodiments, the sending, by the movable platform, of the on-screen display information to the user side through the image transmission channel so that the user side displays the state data according to the display time includes: sending the on-screen display information in a preset format to the user side through the image transmission channel, so that the user side receives the on-screen display information, parses it to obtain the state data and the display time, and displays the state data according to the display time.
Specifically, the user side receives on-screen display information in SRT format, binary DAT format, text format, SSA format, or SUB format from the movable platform, and then parses state data and the display time from the on-screen display information, for example, the parsed display time is 00:00:00,116 to 00:00:00,283, where the state data corresponding to the display time includes: signal 4 ch 4 flightTime 1 uavBat 16.4V glsBat 11.7V uavBatCells 4 glsBatCells 3 delay 28ms bitrate 20.9 Mbps; the status data is then displayed according to the display time, as shown in fig. 5.
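A minimal parsing sketch for one SRT-format on-screen display entry of the kind quoted above; the function names and the lack of error handling are simplifying assumptions:

```python
# Sketch: parse one SRT-format entry into (start, end) display time and the state-data text.
def parse_srt_time(text: str) -> float:
    hms, ms = text.strip().split(",")
    h, m, s = (int(x) for x in hms.split(":"))
    return h * 3600 + m * 60 + s + int(ms) / 1000.0

def parse_osd_entry(entry: str):
    lines = [ln for ln in entry.strip().splitlines() if ln.strip()]
    if lines and lines[0].strip().isdigit():      # skip an optional leading index line
        lines = lines[1:]
    start_text, end_text = lines[0].split("-->")
    start_s, end_s = parse_srt_time(start_text), parse_srt_time(end_text)
    state_text = " ".join(lines[1:])              # e.g. "signal 4 ch 4 flightTime 1 ..."
    return (start_s, end_s), state_text
```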
In some embodiments, the image information includes a time axis. For example, when the camera device mounted on the movable platform captures an image, time-axis information is added to the captured image.
For example, step S131 of sending the on-screen display information to the user side through the image transmission channel so that the user side displays the on-screen display information according to the display time includes: sending the on-screen display information to the user side through the image transmission channel, so that the user side determines, from the time axis of the image information currently being displayed, the corresponding display time and on-screen display information, and displays the on-screen display information.
Specifically, after receiving the image information and the on-screen display information sent by the movable platform, the user side sequentially displays the image information according to the shooting sequence of the image information, namely the sequence of the time axis. When certain image information is displayed, the user side determines display time and on-screen display information corresponding to the time axis according to the time axis of the image information, and then displays the on-screen display information, so that synchronous playing of the image information and the on-screen display information of the user side is realized.
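The synchronization step described above can be sketched as follows, assuming the user side keeps the parsed entries as (start, end, state_text) tuples sorted by start time; this is an illustration rather than the patented implementation:

```python
# Sketch: pick the on-screen display entry whose display interval covers the frame being shown.
from bisect import bisect_right

def osd_for_timeline(entries, t_s: float):
    """entries: list of (start_s, end_s, state_text) sorted by start_s; t_s: time-axis position."""
    starts = [e[0] for e in entries]
    i = bisect_right(starts, t_s) - 1             # last entry starting at or before t_s
    if i >= 0 and entries[i][0] <= t_s < entries[i][1]:
        return entries[i][2]
    return None                                   # no on-screen display information for this frame
```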
In some embodiments, the user side stores the image information from the movable platform to generate a video file. For example, the user side generates a video file from the received image information; specifically, the video file is obtained by storing the pieces of image information in the order in which they were received, so that the video file can be played back later, for example to facilitate a referee's judgment, or so that the operator can review the operation according to the OSD information.
Illustratively, as shown in FIG. 4, the step S130 of sending the image information and the on-screen display information to the user side through the image transmission channel includes a step S132.
S132, sending the image information and the on-screen display information to the user side through the image transmission channel, so that the user side generates, based on the time axis of the image information, an on-screen display information file synchronized with the image information from the on-screen display information.
Illustratively, the user terminal generates an on-screen display information file according to a time axis of the image information. For example, according to the corresponding relationship between the time axis of each image information in the video file and the display time of the on-screen display information, the received pieces of on-screen display information are sorted and stored, and the on-screen display information file synchronized with the video file is obtained.
Illustratively, the user side may also store the pieces of on-screen display information according to their display times to obtain an on-screen display information file corresponding to the video file. In this way, when the video file is replayed, the user side can still load the on-screen display information file and synchronously display the on-screen display information while the image information is displayed, so as to restore the state of the movable platform and/or the user side at the time the image information was shot.
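One way to store such a file, sketched here under the assumption that the entries are written out as an .srt file named after the video file; the naming convention is not specified by the patent:

```python
# Sketch: save received on-screen display entries as an .srt file next to the recorded video.
def _srt_time(seconds: float) -> str:
    ms = int(round(seconds * 1000))
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def save_osd_file(entries, video_path: str) -> str:
    """entries: iterable of (start_s, end_s, state_text)."""
    entries = sorted(entries, key=lambda e: e[0])           # order by display time
    osd_path = video_path.rsplit(".", 1)[0] + ".srt"        # e.g. flight.mp4 -> flight.srt
    with open(osd_path, "w", encoding="utf-8") as f:
        for idx, (start_s, end_s, state_text) in enumerate(entries, start=1):
            f.write(f"{idx}\n{_srt_time(start_s)} --> {_srt_time(end_s)}\n{state_text}\n\n")
    return osd_path
```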
In some embodiments, the data transmission method further includes: acquiring, from the user side, an operation event performed by the user on the user side, and generating buried point (event-tracking) data according to the operation event and the on-screen display information and/or image information associated with the operation event; and sending the buried point data to a terminal device when the movable platform is connected to the terminal device.
Specifically, the user operates the user terminal, so that the user terminal generates a control instruction for the movable platform, and the user terminal sends the control instruction to the movable platform through the uplink channel. And the movable platform realizes corresponding control operation according to the control instruction, and also sends image information and on-screen display information to the user terminal in real time.
Specifically, a user operates a user side, so that the user side sends a control instruction to the movable platform; meanwhile, the control instruction triggers a buried point event of the movable platform, and the movable platform can generate the buried point data.
Specifically, the movable platform determines the on-screen display information and/or image information associated with the operation event according to the generation time or receiving time of the control instruction, the time axis of the image information, and the display time of the on-screen display information, and generates buried point data from the operation event and the associated on-screen display information and/or image information. For example, because the amount of buried point data is large, it is format-converted, encapsulated, and compressed to save storage space; at the same time, a timestamp is recorded for each piece of buried point data, for example the generation time of the control instruction is taken as the timestamp of the corresponding buried point data, and the movable platform then generates a buried point file from the processed buried point data.
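A hedged sketch of one way to build such timestamped, compressed buried point records; the gzip-compressed JSON-lines encoding is an assumption chosen for illustration and is not specified by the patent:

```python
# Sketch: append one timestamped buried point record to a compressed local file.
import gzip
import json
import time

def record_buried_point(path: str, event: str, osd_text: str, frame_time_s: float) -> None:
    record = {
        "timestamp": time.time(),       # e.g. generation time of the control instruction
        "event": event,                 # the operation event that triggered the buried point
        "osd": osd_text,                # associated on-screen display information
        "frame_time": frame_time_s,     # position on the time axis of the associated image information
    }
    line = (json.dumps(record, separators=(",", ":")) + "\n").encode("utf-8")
    with gzip.open(path, "ab") as f:    # compress on write to save storage space
        f.write(line)
```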
In some embodiments, the data transmission method further includes: if an upload-success notification sent by the terminal device is received, deleting the buried point data corresponding to the notification.
For example, when the movable platform detects that it is connected to a terminal device such as a mobile phone or a computer, it is triggered to retrieve the stored buried point file, such as a JSON file. Because the buried point file has been encapsulated and compressed, it must be decompressed according to the agreed rules to restore the real buried point data, including the trigger time of each event; the movable platform then uploads the buried point data to the connected terminal device.
In this way, the buried point function can be realized locally on a movable platform such as an unmanned aerial vehicle. When a mobile phone or a computer is connected, for example to upgrade the movable platform, the buried point data can be uploaded to the terminal device without the user noticing, and the networked terminal device can then upload the buried point data to a server. Buried point statistics can therefore be collected even when the unmanned aerial vehicle is not connected to a mobile device.
According to the data transmission method for the movable platform described above, the image information shot by the camera device carried by the movable platform is obtained, and on-screen display information time-synchronized with the image information is generated based on the state data of the movable platform; the image information and the on-screen display information are then sent to the user side through the image transmission channel, so that the user side displays the image information and synchronously displays the on-screen display information when displaying the image information. Because the image information and the state data of the movable platform are transmitted separately, the increase in delay, power consumption, and heating caused by having an on-screen display module process the images is avoided, and no additional hardware needs to be added.
Referring to FIG. 6, FIG. 6 is a schematic flowchart illustrating a data processing method according to an embodiment of the present disclosure. The data processing method can be applied to a user side and is used for processing data received from the movable platform, among other processes; the user side may be a mobile phone, a tablet computer, a notebook computer, a desktop computer, a personal digital assistant, a wearable device, a remote controller, or the like, for example FPV glasses.
As shown in fig. 6, the data processing method for the user side of the present embodiment includes steps S210 to S230.
And S210, receiving image information from the movable platform through the image transmission channel.
The movable platform is, for example, a traversing machine, a small unmanned aerial vehicle for high-speed racing whose top speed can reach 280 km/h.
And the user side receives the image information shot by the camera device carried by the movable platform through the image transmission channel between the user side and the movable platform.
In some embodiments, a user terminal sends an image transmission instruction to the movable platform, so that the movable platform acquires image information captured by an imaging device carried by the movable platform and sends the image information to the user terminal.
Illustratively, a user side sends an image transmission instruction to a movable platform according to user operation, and the movable platform starts to acquire image information shot by a camera carried by the movable platform after receiving the image transmission instruction sent by the user side and sends the image information to the user side.
Specifically, the user side is an FPV glasses in communication connection with the movable platform, and a user can control the user side to send an image transmission instruction to the movable platform by clicking a button or a touch pad on the FPV glasses or by inputting a voice instruction and other interaction modes to the FPV glasses, so as to inform the movable platform to start shooting an image and send the shot image to the user side, so that the user side can conveniently view the image and/or the user side can conveniently store image information.
And S220, receiving on-screen display information which comes from the movable platform through the image transmission channel and is time-synchronized with the image information.
In some embodiments, the on-screen display information is generated by the movable platform based on state data and a display time of the movable platform.
Illustratively, the movable platform acquires state data of the movable platform, and determines the display time of the state data according to the time for shooting the image information; and then generating the on-screen display information according to the state data and the display time.
In some embodiments, the user side sends an image transmission instruction to the movable platform according to a user operation, so that the movable platform generates on-screen display information time-synchronized with the image information based on the state data of the movable platform, and sends the on-screen display information to the user side.
Illustratively, when a user starts to use the traversing machine by using the FPV glasses, the user end is triggered to send an image transmission instruction to the movable platform; and after receiving the image transmission instruction sent by the user side, the movable platform starts to acquire the state data of the movable platform, and generates on-screen display information which is time-synchronous with the image information according to the state data so as to send the on-screen display information to the user side for displaying by the user side.
For example, the state data of the movable platform may represent the condition of the movable platform itself and/or the remote control environment of the movable platform. From the state data the user can determine, for example, the working state of the traversing machine, so as to adjust the operation of the movable platform in time under some conditions, such as stopping the flight, commanding a return flight, or switching the communication frequency.
Illustratively, the on-screen display information includes: at least one of first parameter information of the movable platform and second parameter information of data transmission between the movable platform and a user side.
Specifically, a user can know the health condition, the electric quantity, the endurance time and the like of the movable platform according to the first parameter information of the movable platform; the user can know the communication quality between the movable platform and the user terminal according to the second parameter information.
Illustratively, the on-screen display information may further include third parameter information of the user terminal. The movable platform may generate on-screen display information time-synchronized with the image information based on the state data of the movable platform and/or the third parameter information of the user terminal, and the display time.
Specifically, the movable platform obtains the third parameter information from the user side in communication connection with it; for example, the user side periodically sends the third parameter information to the movable platform, so that the movable platform can generate on-screen display information that includes the third parameter information of the user side.
Specifically, the first parameter information includes at least one of the unlocking time flightTime of the movable platform, the voltage uavBat, and the number of battery cells uavBatCells; the second parameter information includes at least one of the signal strength signal, channel information ch, transmission delay, and transmission code rate of data transmission between the movable platform and the user side; the third parameter information includes at least one of the voltage glsBat of the user side and the number of battery cells glsBatCells.
In some embodiments, the on-screen display information is generated by the movable platform according to the state data and the display time of the movable platform, and is in a preset format.
Illustratively, the on-screen display information is in SRT format, binary DAT format, text format, SSA format, or SUB format.
Illustratively, the on-screen display information is in SRT format, for example, the on-screen display information acquired at a certain time is: 00:00:00,116- - >00:00:00,283; signal 4 ch 4 flightTime 1 uavBat 16.4V glsBat 11.7V uavBatCells 4 glsBatCells 3 delay 28ms bitrate 20.9 Mbps.
And S230, displaying the image information, and synchronously displaying the on-screen display information when the image information is displayed.
For example, as shown in FIG. 5, the user side displays the image information on a display interface and synchronously displays the on-screen display information in a preset area of the display interface when displaying the image information. In this way, while viewing the image information on the display interface, the user can also pay attention to the on-screen display information, judge the working state of the traversing machine, and adjust the operation of the movable platform in time when necessary, for example by stopping the flight, commanding a return flight, or switching the communication frequency.
In some embodiments, the on-screen display information includes a display time.
Illustratively, the display time is determined by the movable platform based on the time at which the image information was captured.
Specifically, the movable platform acquires the state data once each time the camera device captures image information, so the display time corresponding to the state data can be determined from the capture time of the image information, thereby enabling the generation of on-screen display information that is time-synchronized with the image information.
Illustratively, the step S230 of synchronously displaying the on-screen display information when the user terminal displays the image information includes: and synchronously displaying the on-screen display information when the image information is displayed according to the display time of the on-screen display information.
For example, when the display time of a certain piece of on-screen display information is 00:00:00,116 to 00:00:00,283, the user side displays that on-screen display information during the period from 00:00:00,116 to 00:00:00,283 after it starts playing the image information; new on-screen display information is displayed in the following period, for example the on-screen display information whose display time is 00:00:00,283 to 00:00:00,450.
Specifically, the synchronously displaying the on-screen display information when displaying the image information according to the display time of the on-screen display information includes: and synchronously displaying the state data in the on-screen display information when the image information is displayed according to the display time of the on-screen display information.
As shown in FIG. 5, the user side displays the state data from the on-screen display information in the lower right corner of the display interface, so that the user can conveniently learn the working state.
In some embodiments, the movable platform generates on-screen display information in a preset format based on the state data and display time of the movable platform. The user side synchronously displays the state data in the on-screen display information when displaying the image information according to the display time of the on-screen display information, and the method comprises the following steps: analyzing the on-screen display information in the preset format to obtain the state data and the display time; and synchronously displaying the state data when the image information is displayed according to the display time.
Specifically, the user side receives on-screen display information in SRT format, binary DAT format, text format, SSA format, or SUB format from the movable platform, and then parses state data and the display time from the on-screen display information, for example, the parsed display time is 00:00:00,116 to 00:00:00,283, where the state data corresponding to the display time includes: signal 4 ch 4 flightTime 1 uavBat 16.4V glsBat 11.7V uavBatCells 4 glsBatCells 3 delay 28ms bitrate 20.9 Mbps; the status data is then displayed according to the display time, as shown in fig. 5.
In some embodiments, the image information received from the movable platform through the image transmission channel includes a time axis. For example, when the camera device mounted on the movable platform captures an image, time-axis information is added to the captured image.
Illustratively, the synchronously displaying the on-screen display information while displaying the image information according to the display time of the on-screen display information includes: and determining the display time and the on-screen display information corresponding to the time axis according to the time axis of the image information being displayed, and displaying the on-screen display information.
Specifically, after receiving the image information and the on-screen display information sent by the movable platform, the user side sequentially displays the image information according to the shooting sequence of the image information, namely the sequence of the time axis. When certain image information is displayed, the user side determines display time and on-screen display information corresponding to the time axis according to the time axis of the image information, and then displays the on-screen display information, so that synchronous playing of the image information and the on-screen display information of the user side is realized.
In some embodiments, as shown in fig. 7, the data processing method further includes step S240.
And S240, storing the image information from the movable platform to generate a video file.
Illustratively, the user side generates a video file from the image information received from the movable platform. Specifically, the user side stores the pieces of image information in the order in which they were received to obtain the video file, or sorts and stores them according to the time axis of the image information to obtain the video file, so that the video file can be played back later, for example to facilitate the referee's judgment.
Illustratively, as shown in fig. 7, the data processing method further includes a step S250.
And S250, storing the on-screen display information time-synchronized with the image information to generate an on-screen display information file of the video file.
Illustratively, the user side stores the pieces of on-screen display information according to their display times to obtain an on-screen display information file corresponding to the video file. On the user side, such as the glasses, the video file is generated from the image information while the on-screen display information is recorded into the on-screen display information file. In this way, when the video file is replayed, the user side can still load the on-screen display information file and synchronously display the on-screen display information while the image information is displayed, so as to restore the state of the movable platform and/or the user side at the time the image information was shot.
Illustratively, the user terminal generates an on-screen display information file according to a time axis of the image information. For example, according to the corresponding relationship between the time axis of each image information in the video file and the display time of the on-screen display information, the received pieces of on-screen display information are sorted and stored, and the on-screen display information file synchronized with the video file is obtained.
Illustratively, as shown in fig. 7, the data processing method further includes a step S260.
S260, if a playing instruction input by a user is received, displaying the image information in the video file, and synchronously displaying the on-screen display information in the on-screen display information file when the image information is displayed.
Specifically, if the user needs to play back the video file in the user side, the user side can receive the playing instruction by operating the user side. And after receiving the playing instruction, the user side can display the image information in the video file and synchronously display the on-screen display information in the on-screen display information file when displaying the image information so as to restore the state of the movable platform and/or the state of the user side when shooting the image information.
In some embodiments, the user terminal, such as FPV glasses, can perform data buried point statistics, storage, and upload locally.
The buried point (event-tracking) function is generally used to count and collect information such as users' usage habits and operating behavior, and to upload the collected data to a data server over the network for analysis, so as to understand user needs and improve and optimize product functions. Conventionally, the buried point function is applied to devices with a network module, such as mobile phones and computers, where real-time transmission is possible; for devices without a network module, such as a consumer unmanned aerial vehicle, the buried point function is not realized locally on the aircraft; instead, a connected mobile device such as a mobile phone acts as an agent responsible for collecting, counting, and transmitting the buried point data.
However, at present some user-side devices, such as FPV glasses for traversing machines that have no network module, cannot implement buried point statistics, storage, and uploading, so information about the habits and behavior of users of the glasses cannot be collected.
User-side electronic devices without a network module do not need to stay connected to a device that has one, such as a mobile phone or a computer. A typical example is the FPV glasses of a traversing machine: such a device generally has its own user interaction, such as a UI (user interface) and buttons or a touch panel, and does not need the involvement of a mobile device such as a mobile phone at all. If information such as users' usage habits is to be collected and counted, the data must be collected, counted, and stored locally on the device.
In this embodiment, when the user side is not connected to a mobile device such as a mobile phone or a PC, the user side encapsulates and compresses the buried point data into a buried point file in software, with each piece of buried point data carrying time information, and stores the buried point file locally on the user side, such as in the FPV glasses. When the user side is connected to a terminal device such as a mobile phone or a PC for system upgrading, debugging, or other operations, the user side can trigger parsing of the buried point file and upload the parsed buried point data to the terminal device. Because each piece of buried point data contains time information, this transmission mode is no different from real-time transmission, and the buried point data can still be classified and counted on the data server based on a time axis.
For a user-side electronic device without a network module, there is no need to stay connected for long periods to a device with a network module, such as a mobile phone or a computer; the user generally connects the user side to such a terminal device only when the device needs to be upgraded. At that time, the buried point data stored locally on the user side can be uploaded to the server through the mobile phone, computer, or other agent, without the user's awareness and without affecting normal use.
Illustratively, the data processing method for the user side further includes: acquiring an operation event of the user operating the user side, and generating buried point data according to the operation event and the on-screen display information and/or image information associated with the operation event.
Specifically, the user operates the user terminal, so that the user terminal generates a control instruction for the movable platform, and the user terminal sends the control instruction to the movable platform through the uplink channel. And the movable platform realizes corresponding control operation according to the control instruction, and also sends image information and on-screen display information to the user terminal in real time.
Specifically, the user operates the user side, so that the user side generates a control instruction to be sent to the movable platform; at the same time, a buried point event is triggered to generate buried point data.
Specifically, the user side determines the on-screen display information and/or the image information associated with the operation event according to the generation time of the control instruction, the time axis of the image information and the display time of the on-screen display information; and generating buried point data according to the operation event and on-screen display information and/or image information associated with the operation event.
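The following Python sketch illustrates one way such time-based association could be performed; the data structures and the interval-matching rule are assumptions for illustration rather than the claimed implementation.

def find_associated_osd(event_time_ms, osd_entries):
    # osd_entries: list of (start_ms, end_ms, state_text) tuples.
    # Return the state text whose display interval covers the moment
    # at which the control instruction (operation event) was generated.
    for start_ms, end_ms, state_text in osd_entries:
        if start_ms <= event_time_ms <= end_ms:
            return state_text
    return None

def make_buried_point(event_name, event_time_ms, osd_entries):
    # Assemble one piece of buried point data from the operation event
    # and the on-screen display information associated with it by time.
    return {
        "event": event_name,
        "timestamp_ms": event_time_ms,  # each entry keeps its time information
        "osd": find_associated_osd(event_time_ms, osd_entries),
    }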
Illustratively, since the amount of buried point data is large, format conversion and encapsulation/compression processing are performed to save space. The timestamp of each piece of buried point data is also recorded, for example by taking the generation time of the control instruction as the timestamp of the corresponding buried point data; the user side then generates a buried point file from the processed buried point data.
Illustratively, the buried point file may be a binary DAT file, an INI file, a text TXT file, or the like; the buried point file can be saved to a non-volatile storage medium whose contents survive power failure, such as an SD card or NAND FLASH.
Illustratively, the user side generates a JSON file from the processed buried point data and stores the JSON file in an internal eMMC (embedded MultiMediaCard) module, so as to ensure that the data is not lost on power-down.
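A minimal Python sketch of this packaging step is given below; the use of gzip compression, the dictionary fields, and the file name are illustrative assumptions, standing in for whatever encapsulation and compression scheme an implementation actually uses.

import gzip
import json
import time

def save_buried_point_file(buried_points, path="buried_points.json.gz"):
    # Serialize the timestamped buried point entries to JSON and compress
    # them before writing, so more entries fit in the limited local storage.
    payload = json.dumps(buried_points).encode("utf-8")
    with gzip.open(path, "wb") as f:
        f.write(payload)

# Usage: append an entry carrying its own timestamp, then persist the file.
points = [{"event": "start_recording", "timestamp_ms": int(time.time() * 1000)}]
save_buried_point_file(points)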
In some embodiments, the data processing method for the user side further includes: and when the user side is connected to the terminal equipment, sending the buried point data to the terminal equipment.
For example, when the user side detects that it is connected to a terminal device such as a mobile phone or a computer, it may trigger acquisition of the stored buried point file, such as the JSON file. Because the buried point file is encapsulated and compressed, it must be decompressed according to the agreed rules to restore the real buried point data, including the trigger time of each event; the user side then uploads the buried point data to the connected terminal device.
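The decompression-and-upload flow could look like the following Python sketch, under the same assumptions as the packaging sketch above; send_to_terminal is a hypothetical placeholder for the link between the user side and the terminal device.

import gzip
import json

def load_buried_point_file(path="buried_points.json.gz"):
    # Decompress the stored buried point file and restore the original
    # timestamped buried point entries.
    with gzip.open(path, "rb") as f:
        return json.loads(f.read().decode("utf-8"))

def upload_on_connection(send_to_terminal, path="buried_points.json.gz"):
    # Called when a connection to a terminal device is detected: restore the
    # buried point data and hand each entry to the terminal device to forward.
    for point in load_buried_point_file(path):
        send_to_terminal(point)  # hypothetical transport hook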
Although user-side electronic devices without a network module, such as FPV glasses and pass-through (racing drone) goggles, have no network connection of their own, they still need to be connected to a terminal device such as a mobile phone or a PC for system upgrading and debugging, which provides the opportunity to upload the buried point data stored locally on the user side.
Because upgrading the device is not a high-frequency operation, the buried point data stored locally on the user side needs to be highly compressed so that more data can be kept in limited space; the data generally needs to be retained for more than half a year.
Illustratively, when a user side is connected to a terminal device capable of being networked, the buried point data is sent to the terminal device, so that the terminal device uploads the buried point data to a server.
After the user side uploads the buried point data to the connected terminal device, the terminal device can further upload the buried point data to the server, so that the server can classify and count the buried point data to obtain the user's usage habits, real requirements, and so on, which facilitates product iteration, optimization, and improvement.
In some embodiments, the data processing method for the user side further includes: and if the uploading success notification sent by the terminal equipment is received, deleting the buried point data and/or the buried point file corresponding to the uploading success notification.
Specifically, after the terminal device in communication connection with the user side receives complete buried point data, a successful uploading notification is fed back to the user side; and the user side can delete the locally stored buried point data and/or the buried point file after the uploading success notification so as to release more space for recording new buried point data.
In the data processing method for the user side provided by the above embodiment, the image information and the on-screen display information time-synchronized with the image information are received from the movable platform through the image transmission channel, the image information is displayed, and the on-screen display information is synchronously displayed while the image information is displayed. Because the image information and the state data of the movable platform are transmitted separately, the increases in delay, power consumption, and heat that would be caused by involving an on-screen display module in image processing are avoided, and no additional hardware needs to be added.
Referring to fig. 8, fig. 8 is a schematic flowchart of a data processing method according to another embodiment of the present application. The data processing method can be applied to the movable platform and the user side, and is used for data transmission between the movable platform and the user side and for the processing, at the user side, of data received from the movable platform. The movable platform includes an unmanned aerial vehicle, a cloud trolley, or the like; the user side can be a mobile phone, a tablet computer, a notebook computer, a desktop computer, a personal digital assistant, a wearable device, a remote controller, or the like, for example FPV (First Person View) glasses. Further, the drone may be a rotary-wing drone, such as a quad-rotor, hexa-rotor, or octo-rotor drone, or a fixed-wing drone.
As shown in fig. 8, the data processing method of the present embodiment includes steps S310 to S330.
And S310, the movable platform acquires the image information shot by the camera device carried by the movable platform, and sends the image information to a user side through an image transmission channel.
For example, if the movable platform receives an image transmission instruction sent by the user side, image information shot by a camera device carried by the movable platform is acquired.
For example, a user may control the user terminal to send an image transmission instruction to the movable platform by clicking a button or a touch pad on the FPV glasses or by inputting a voice instruction to the FPV glasses or other interaction means, so as to notify the movable platform to start to capture an image and send the captured image to the user terminal, which is convenient for the user to view and/or convenient for the user terminal to store image information.
Specifically, after the movable platform acquires the image information shot by the camera device carried by the movable platform, the acquired image information is sent to the user terminal through the image transmission channel in real time.
And S320, generating on-screen display information time-synchronized with the image information by the movable platform based on the state data of the movable platform, and sending the on-screen display information to the user side through a picture transmission channel.
For example, if the movable platform receives an image transmission instruction sent by the user side, on-screen display information time-synchronized with the image information is generated based on the state data of the movable platform.
Specifically, after the movable platform generates the on-screen display information time-synchronized with the image information based on its state data, it sends the generated on-screen display information to the user side in real time through the image transmission channel, so that the user side can receive and display the image information and the on-screen display information in time.
Specifically, when a user starts to fly by using a traversing machine, a user side is triggered to send an image transmission instruction to the movable platform; and after receiving the image transmission instruction sent by the user side, the movable platform starts to acquire the state data of the movable platform, and generates on-screen display information which is time-synchronous with the image information according to the state data so as to send the on-screen display information to the user side for displaying by the user side.
In some embodiments, the movable platform generates on-screen display information that is time-synchronized with the image information based on state data of the movable platform, including: the movable platform acquires state data of the movable platform, and display time of the state data is determined according to the time for shooting the image information; and the movable platform generates the on-screen display information according to the state data and the display time.
For example, the state data of the movable platform may represent the condition of the movable platform itself and/or its remote control environment. From the state data the user can determine, for example, the working state of the traversing machine, so as to adjust the operation of the movable platform in time under certain conditions, such as commanding it to stop flying, commanding it to return, or switching the communication frequency.
Illustratively, the movable platform generating on-screen display information time-synchronized with the image information based on the state data of the movable platform includes: and the movable platform generates on-screen display information time-synchronized with the image information based on the first parameter information of the movable platform and/or the second parameter information of data transmission between the movable platform and a user side.
Illustratively, the movable platform generating on-screen display information time-synchronized with the image information based on the state data of the movable platform includes: and the movable platform generates on-screen display information time-synchronized with the image information based on the state data of the movable platform and the third parameter information of the user side.
Specifically, the first parameter information includes at least one of unlocking time, voltage, and number of battery cells of the movable platform; the second parameter information comprises at least one of signal strength, channel information, transmission delay and transmission code rate of data transmission between the movable platform and the user side; the third parameter information includes at least one of a voltage of the user terminal and a number of battery cells.
For example, the movable platform acquires state data once after the image information is shot by the image shooting device, and then the display time corresponding to the state data can be determined according to the shooting time of the image information; thereby enabling generation of on-screen display information that is time-synchronized with the image information.
For example, if the image information is captured at 00:00:00,116 to 00:00:00,283, the display time corresponding to the state data is 00:00:00,116 to 00:00:00,283.
Illustratively, one piece of on-screen display information is generated from the state data acquired at a given time and the display time corresponding to that state data.
In some embodiments, the mobile platform generates the on-screen display information based on the status data and the display time, including: and the movable platform generates on-screen display information in a preset format according to the state data and the display time.
Illustratively, the generating, by the movable platform, on-screen display information in a preset format according to the state data and the display time includes: and the movable platform generates on-screen display information in an SRT format, a binary DAT format, a text format, an SSA format or an SUB format according to the state data and the display time.
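As an illustrative sketch in Python, one SRT-style entry could be generated from a state data sample and its display interval as follows; the field names and formatting are assumptions based on the SRT example given later in this description.

def format_srt_time(ms):
    # Convert a millisecond offset into the SRT time format HH:MM:SS,mmm.
    hours, rem = divmod(ms, 3_600_000)
    minutes, rem = divmod(rem, 60_000)
    seconds, millis = divmod(rem, 1_000)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d},{millis:03d}"

def make_osd_entry(index, start_ms, end_ms, state):
    # Build one SRT-style on-screen display entry from a state data sample
    # and the display time determined from the image capture time.
    state_text = " ".join(f"{key} {value}" for key, value in state.items())
    return (f"{index}\n"
            f"{format_srt_time(start_ms)} --> {format_srt_time(end_ms)}\n"
            f"{state_text}\n")

# Usage: an entry covering 00:00:00,116 to 00:00:00,283.
print(make_osd_entry(1, 116, 283, {"signal": 4, "ch": 4, "delay": "28ms"}))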
S330, the user side displays the image information and synchronously displays the on-screen display information when the image information is displayed.
Illustratively, the user side displays the image information on a display interface and synchronously displays the on-screen display information in a preset area of the display interface while the image information is displayed, as shown in fig. 5. In this way, while viewing the image information on the display interface, the user can also monitor the on-screen display information, judge the working state of the traversing machine, and adjust the operation of the movable platform in time under certain conditions, such as commanding it to stop flying, commanding it to return, or switching the communication frequency.
In some embodiments, the on-screen display information includes a display time. Illustratively, the display time is determined by the movable platform based on the time at which the image information was captured.
Illustratively, the step S330 of synchronously displaying the on-screen display information when the user terminal displays the image information includes: and the user side synchronously displays the on-screen display information when displaying the image information according to the display time of the on-screen display information.
For example, when the display time of a certain piece of on-screen display information is 00:00:00,116 to 00:00:00,283, the user side displays that on-screen display information during the period from 00:00:00,116 to 00:00:00,283 after starting to play the image information; new on-screen display information is displayed in the following period, for example the on-screen display information whose display time is 00:00:00,283 to 00:00:00,450.
Illustratively, the user side synchronously displaying the on-screen display information while displaying the image information according to the display time of the on-screen display information includes: the user side synchronously displays the state data in the on-screen display information while displaying the image information, according to the display time of the on-screen display information.
As shown in fig. 5, the user side displays the state data in the on-screen display information at the lower right corner of the display interface, so that the user can conveniently learn the working state.
In some embodiments, the movable platform generates on-screen display information in a preset format according to the state data and the display time. The user side synchronously displays the state data in the on-screen display information when displaying the image information according to the display time of the on-screen display information, and the method comprises the following steps: the user side analyzes the on-screen display information in the preset format to obtain the state data and the display time; and the user side synchronously displays the state data when displaying the image information according to the display time.
Specifically, the user side receives on-screen display information in SRT format, binary DAT format, text format, SSA format, or SUB format from the movable platform, and then parses state data and the display time from the on-screen display information, for example, the parsed display time is 00:00:00,116 to 00:00:00,283, where the state data corresponding to the display time includes: signal 4 ch 4 flightTime 1 uavBat 16.4V glsBat 11.7V uavBatCells 4 glsBatCells 3 delay 28ms bitrate 20.9 Mbps; the status data is then displayed according to the display time, as shown in fig. 5.
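For illustration only, a minimal Python parser for such an SRT-style entry might look like the sketch below; the alternating key/value token layout is an assumption based on the example above.

import re

TIME_LINE = re.compile(r"(\d{2}:\d{2}:\d{2},\d{3}) --> (\d{2}:\d{2}:\d{2},\d{3})")

def parse_osd_entry(entry_text):
    # Parse one SRT-style entry into its display interval and a dictionary
    # of state data fields; tokens are assumed to alternate key / value.
    lines = [line for line in entry_text.strip().splitlines() if line.strip()]
    start, end = TIME_LINE.search(lines[1]).groups()  # lines[0] is the entry index
    tokens = lines[2].split()
    state = dict(zip(tokens[0::2], tokens[1::2]))
    return (start, end), state

entry = "1\n00:00:00,116 --> 00:00:00,283\nsignal 4 ch 4 uavBat 16.4V delay 28ms\n"
print(parse_osd_entry(entry))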
In some embodiments, the image information includes a time axis. For example, when an image pickup apparatus mounted on a movable platform picks up an image, information on the time axis is added to the picked-up image.
The user side synchronously displays the on-screen display information when displaying the image information according to the display time of the on-screen display information, and the method comprises the following steps: and the user side determines display time and on-screen display information corresponding to the time axis according to the time axis of the image information being displayed, and displays the on-screen display information.
Specifically, after receiving the image information and the on-screen display information sent by the movable platform, the user side displays the image information in the order in which it was captured, that is, in time axis order. When a given piece of image information is displayed, the user side determines the display time and the on-screen display information corresponding to its position on the time axis and then displays that on-screen display information, thereby achieving synchronous playback of the image information and the on-screen display information on the user side.
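A hedged Python sketch of that lookup step follows; it assumes the parsed entries from the sketch above, with display times converted to millisecond offsets on the video time axis.

import bisect

def build_osd_index(parsed_entries):
    # parsed_entries: list of ((start_ms, end_ms), state) sorted by start_ms.
    # Precompute the list of start times used for fast lookup during playback.
    return [start for (start, _end), _state in parsed_entries]

def osd_for_position(position_ms, starts, parsed_entries):
    # Return the state data whose display interval covers the current playback
    # position on the video time axis, or None if no entry covers it.
    i = bisect.bisect_right(starts, position_ms) - 1
    if i < 0:
        return None
    (start, end), state = parsed_entries[i]
    return state if start <= position_ms <= end else None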
In some embodiments, as shown in fig. 9, the data processing method further includes step S340.
S340, the user side stores the image information sent by the movable platform to generate a video file.
Illustratively, as shown in fig. 9, the data processing method further includes a step S350.
And S350, the user side stores the on-screen display information time-synchronized with the image information to generate an on-screen display information file of the video file.
Illustratively, as shown in fig. 9, the data processing method further includes a step S360.
S360, if the user side receives a playing instruction input by a user, displaying the image information in the video file, and synchronously displaying the on-screen display information in the on-screen display information file when displaying the image information.
Specifically, if the user needs to play back the video file on the user side, the user can input a playing instruction by operating the user side. After receiving the playing instruction, the user side displays the image information in the video file and synchronously displays the on-screen display information in the on-screen display information file while the image information is displayed, so as to restore the state of the movable platform and/or the state of the user side at the time the image information was captured. By parsing the recorded on-screen display information file, the on-screen display information of the video scene is updated in real time to the display interface of the user side, such as the glasses, while the video file is played.
In some embodiments, the data processing method further comprises: and the user side acquires an operation event of operating the user side by a user, and generates buried point data according to the operation event and on-screen display information and/or image information associated with the operation event.
Specifically, the user operates the user terminal, so that the user terminal generates a control instruction for the movable platform, and the user terminal sends the control instruction to the movable platform through the uplink channel. And the movable platform realizes corresponding control operation according to the control instruction, and also sends image information and on-screen display information to the user terminal in real time.
Specifically, the user operates the user side, so that the user side generates a control instruction for the movable platform; at the same time, a buried point event is triggered to generate buried point data.
Specifically, the user side determines the on-screen display information and/or the image information associated with the operation event according to the generation time of the control instruction, the time axis of the image information and the display time of the on-screen display information; and generating buried point data according to the operation event and on-screen display information and/or image information associated with the operation event.
In some embodiments, the data processing method further comprises: and when the user side is connected to the terminal equipment, the buried point data is sent to the terminal equipment.
For example, when the user side detects that it is connected to a terminal device such as a mobile phone or a computer, it may trigger acquisition of the stored buried point file, such as the JSON file. Because the buried point file is encapsulated and compressed, it must be decompressed according to the agreed rules to restore the real buried point data, including the trigger time of each event; the user side then uploads the buried point data to the connected terminal device.
Illustratively, when a user side is connected to a terminal device capable of being networked, the buried point data is sent to the terminal device, so that the terminal device uploads the buried point data to a server.
After the user side uploads the buried point data to the connected terminal device, the terminal device can further upload the buried point data to the server, so that the server can classify and count the buried point data to obtain the user's usage habits, real requirements, and so on, which facilitates product iteration, optimization, and improvement.
In some embodiments, the data processing method further comprises: and if the user side receives the uploading success notification sent by the terminal equipment, deleting the data of the buried point corresponding to the uploading success notification.
Specifically, after the terminal device in communication connection with the user side receives complete buried point data, a successful uploading notification is fed back to the user side; and the user side can delete the locally stored buried point data and/or the buried point file after the uploading success notification so as to release more space for recording new buried point data.
In some embodiments, the data processing method further comprises: the movable platform acquires, from the user side, an operation event of the user operating the user side, and generates buried point data according to the operation event and the on-screen display information and/or image information associated with the operation event; and when the movable platform is connected to a terminal device, the movable platform sends the buried point data to the terminal device.
Specifically, the user operates the user terminal, so that the user terminal generates a control instruction for the movable platform, and the user terminal sends the control instruction to the movable platform through the uplink channel. And the movable platform realizes corresponding control operation according to the control instruction, and also sends image information and on-screen display information to the user terminal in real time.
Specifically, a user operates a user side, so that the user side sends a control instruction to the movable platform; meanwhile, the control instruction triggers a buried point event of the movable platform, and the movable platform can generate the buried point data.
Specifically, the movable platform determines the on-screen display information and/or image information associated with the operation event according to the generation time or reception time of the control instruction, the time axis of the image information, and the display time of the on-screen display information, and generates buried point data according to the operation event and the associated on-screen display information and/or image information. Illustratively, since the amount of buried point data is large, format conversion and encapsulation/compression processing are performed to save space. The timestamp of each piece of buried point data is also recorded, for example by taking the generation time of the control instruction as the timestamp of the corresponding buried point data; the movable platform then generates a buried point file from the processed buried point data.
In some embodiments, the data processing method further includes: if the movable platform receives an uploading success notification sent by the terminal device, deleting the buried point data corresponding to the uploading success notification.
For example, when the movable platform detects that it is connected to a terminal device such as a mobile phone or a computer, it triggers acquisition of the stored buried point file, such as the JSON file. Because the buried point file is encapsulated and compressed, it must be decompressed according to the agreed rules to restore the real buried point data, including the trigger time of each event; the movable platform then uploads the buried point data to the connected terminal device.
In this way, the buried point function can be implemented locally on a movable platform such as an unmanned aerial vehicle. When a mobile phone or computer needs to be connected, for example to upgrade the movable platform, the buried point data can be uploaded to the terminal device such as the mobile phone or computer without the user's awareness, and the networked terminal device can then upload the buried point data to a server. Buried point statistics can therefore be performed even when the unmanned aerial vehicle is not connected to a mobile device.
In the data processing method provided by this embodiment, the movable platform acquires the image information captured by the camera device it carries and generates, based on its state data, on-screen display information time-synchronized with the image information; the movable platform then sends the image information and the on-screen display information to the user side through the image transmission channel, so that the user side displays the image information and synchronously displays the on-screen display information while displaying the image information. Because the image information and the state data of the movable platform are transmitted separately, the increases in delay, power consumption, and heat that would be caused by involving an on-screen display module in image processing are avoided, and no additional hardware needs to be added.
Fig. 10 is a flowchart illustrating an embodiment of a data processing method.
As shown in fig. 10, the data processing method includes steps S410 to S450.
And S410, the user side sends an image transmission instruction to the movable platform according to the operation of the user.
Illustratively, when the glasses start recording, a signal is sent to notify the software module responsible for on-screen display information recording to start working.
And S420, the movable platform receives an image transmission instruction, acquires image information shot by the camera device carried by the movable platform according to the image transmission instruction, and generates on-screen display information time-synchronized with the image information based on the state data of the movable platform.
Illustratively, the software module for on-screen display information recording may obtain information from the state data providing module of the traversing machine to generate the on-screen display information in a defined format.
And S430, the movable platform sends the image information and the on-screen display information to a user side through a picture transmission channel.
Illustratively, the glasses receive the image information and the on-screen display information, and will store the image information from the movable platform to generate a video file, and generate an on-screen display information file synchronized with the image information according to the on-screen display information.
And S440, the user side analyzes the on-screen display information to obtain display time and state data.
Illustratively, when the glasses start playing the video file, a signal is sent to trigger the glasses to parse the previously stored on-screen display information file.
S450, when the user side displays the image information, determining display time and state data corresponding to the time axis according to the time axis of the displayed image information, and displaying the state data.
Illustratively, the display interface of the glasses updates the display by fetching the corresponding on-screen display information according to the time axis of the currently played image information.
Referring to fig. 11 in conjunction with the above embodiments, fig. 11 is a schematic block diagram of a movable platform 600 provided in an embodiment of the present disclosure. The movable platform 600 includes a processor 601 and a memory 602, the processor 601 and the memory 602 being connected by a bus 603, such as an I2C (Inter-integrated Circuit) bus.
Specifically, the Processor 601 may be a Micro-controller Unit (MCU), a Central Processing Unit (CPU), a Digital Signal Processor (DSP), or the like.
Specifically, the Memory 602 may be a Flash chip, a Read-Only Memory (ROM), a magnetic disk, an optical disk, a USB flash drive, or a removable hard disk.
The processor 601 is configured to run a computer program stored in the memory 602, and when executing the computer program, implement the foregoing data transmission method for the movable platform.
Illustratively, the processor 601 is configured to run a computer program stored in the memory 602 and to implement the following steps when executing the computer program:
acquiring image information shot by the camera device carried by the movable platform;
generating on-screen display information that is time-synchronized with the image information based on the state data of the movable platform;
and sending the image information and the on-screen display information to a user side through an image transmission channel.
Illustratively, the movable platform comprises a drone or a cloud trolley, such as a traversing machine.
Referring to fig. 12, fig. 12 is a schematic block diagram of a display device 700 according to an embodiment of the present disclosure. The display device 700 comprises a processor 701 and a memory 702, the processor 701 and the memory 702 being connected by a bus 703, the bus 703 being for example an I2C (Inter-integrated Circuit) bus.
Display device 700 also includes a display component 704 for displaying.
Specifically, the Processor 701 may be a Micro-controller Unit (MCU), a Central Processing Unit (CPU), a Digital Signal Processor (DSP), or the like.
Specifically, the Memory 702 may be a Flash chip, a Read-Only Memory (ROM), a magnetic disk, an optical disk, a USB flash drive, or a removable hard disk.
The processor 701 is configured to run a computer program stored in the memory 702, and when executing the computer program, implement the foregoing data processing method for the user side.
Illustratively, the processor 701 is configured to run a computer program stored in the memory 702 and to implement the following steps when executing the computer program:
receiving, through an image transmission channel, image information from the movable platform; and
receiving on-screen display information time-synchronized with the image information;
displaying the image information, and synchronously displaying the on-screen display information when displaying the image information.
Referring to fig. 13, fig. 13 is a schematic block diagram of glasses 800 according to an embodiment of the present disclosure. The glasses 800 comprise a processor 801 and a memory 802, the processor 801 and the memory 802 being connected by a bus 803, such as an I2C (Inter-integrated Circuit) bus.
The eyewear 800 also includes a display component 804 for displaying.
Specifically, the Processor 801 may be a Micro-controller Unit (MCU), a Central Processing Unit (CPU), a Digital Signal Processor (DSP), or the like.
Specifically, the Memory 802 may be a Flash chip, a Read-Only Memory (ROM), a magnetic disk, an optical disk, a USB flash drive, or a removable hard disk.
The processor 801 is configured to run a computer program stored in the memory 802, and when executing the computer program, implement the foregoing data processing method for the user side.
Illustratively, the processor 801 is configured to run a computer program stored in the memory 802, and when executing the computer program, to implement the following steps:
receiving, through an image transmission channel, image information from the movable platform; and
receiving on-screen display information time-synchronized with the image information;
displaying the image information, and synchronously displaying the on-screen display information when displaying the image information.
Illustratively, fig. 14 is a schematic view of a pair of eyeglasses, specifically FPV eyeglasses.
The embodiment of the specification further provides a movable platform system which comprises a movable platform and a user side.
For example, as shown in fig. 3, the user terminal is embodied as glasses, such as FPV glasses.
Specifically, the movable platform is used for acquiring image information shot by the camera device carried by the movable platform and sending the image information to a user side through an image transmission channel;
the movable platform is used for generating on-screen display information which is time-synchronous with the image information based on the state data of the movable platform, and sending the on-screen display information to the user side through a picture transmission channel;
the user side is used for displaying the image information and synchronously displaying the on-screen display information when the image information is displayed.
In an embodiment of the present specification, a computer-readable storage medium is further provided, where a computer program is stored in the computer-readable storage medium, where the computer program includes program instructions, and the processor executes the program instructions to implement the steps of the data transmission method for a movable platform provided in the foregoing embodiment.
The computer-readable storage medium may be an internal storage unit of the movable platform described in any of the foregoing embodiments, for example a hard disk or memory of the movable platform. The computer-readable storage medium may also be an external storage device of the movable platform, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, or a Flash memory Card (Flash Card) provided on the movable platform.
In an embodiment of the present specification, a computer-readable storage medium is further provided, where a computer program is stored in the computer-readable storage medium, where the computer program includes program instructions, and the processor executes the program instructions to implement the steps of the data processing method for the user terminal provided in the foregoing embodiment.
The computer-readable storage medium may be an internal storage unit of the display device, such as glasses, according to any of the foregoing embodiments, for example, a hard disk or a memory of the display device. The computer readable storage medium may also be an external storage device of the display device, such as a plug-in hard disk provided on the display device, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like.
In the movable platform, display device, glasses, movable platform system, and computer-readable storage medium provided in the above embodiments of this specification, the movable platform acquires image information captured by the camera device it carries and generates, based on its state data, on-screen display information time-synchronized with the image information; the movable platform then sends the image information and the on-screen display information to the user side through the image transmission channel, so that the user side displays the image information and synchronously displays the on-screen display information while displaying the image information. Because the image information and the state data of the movable platform are transmitted separately, the increases in delay, power consumption, and heat that would be caused by involving an on-screen display module in image processing are avoided, and no additional hardware needs to be added.
It is to be understood that the terminology used in the description herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the description.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
The above description is only for the specific embodiments of the present disclosure, but the scope of the present disclosure is not limited thereto, and any person skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope of the present disclosure, and these modifications or substitutions should be covered within the scope of the present disclosure. Therefore, the protection scope of the present specification shall be subject to the protection scope of the claims.

Claims (59)

1. A data transmission method for a movable platform, the method comprising:
acquiring image information shot by the camera device carried by the movable platform;
generating on-screen display information that is time-synchronized with the image information based on the state data of the movable platform;
and sending the image information and the on-screen display information to a user side through an image transmission channel.
2. The method of claim 1, wherein the on-screen display information comprises:
a display time;
the sending the on-screen display information to the user side through the graph transmission channel comprises:
and sending the on-screen display information to a user side through a picture transmission channel so that the user side displays the on-screen display information according to the display time.
3. The method according to claim 2, wherein the generating on-screen display information time-synchronized with the image information based on the state data of the movable platform comprises:
acquiring state data of the movable platform, and determining the display time of the state data according to the time for shooting the image information;
and generating the on-screen display information according to the state data and the display time.
4. The data transmission method according to claim 3, wherein the transmitting the on-screen display information to the user side through a graph transmission channel so that the user side displays the on-screen display information according to the display time includes:
and sending the on-screen display information to a user side through a picture transmission channel so that the user side displays the state data according to the display time.
5. The method according to claim 3, wherein the generating the on-screen display information according to the status data and the display time comprises:
and generating on-screen display information in a preset format according to the state data and the display time.
6. The data transmission method according to claim 5, wherein the transmitting the on-screen display information to a user side through a graph transmission channel so that the user side displays the status data according to the display time includes:
and sending the on-screen display information in a preset format to a user side through a picture transmission channel so that the user side receives the on-screen display information, analyzes the on-screen display information to obtain the state data and the display time, and displays the state data according to the display time.
7. The data transmission method according to claim 5 or 6, wherein the generating of the on-screen display information in a preset format according to the state data and the display time comprises:
and generating screen display information in an SRT format, a binary DAT format, a text format, an SSA format or an SUB format according to the state data and the display time.
8. The data transmission method according to any one of claims 2 to 6, wherein the image information includes:
a time axis;
the sending the on-screen display information to a user side through a graph transmission channel so that the user side displays the on-screen display information according to the display time comprises the following steps:
and sending the on-screen display information to a user terminal through a picture transmission channel, so that the user terminal determines display time and the on-screen display information corresponding to the time axis according to the time axis of the image information being displayed, and the user terminal displays the on-screen display information.
9. The method according to claim 8, wherein the sending the image information and the on-screen display information to the user side through a graph transmission channel comprises:
and sending the image information and the on-screen display information to a user side through a picture transmission channel so that the user side generates an on-screen display information file synchronous with the image information according to the on-screen display information based on the time axis of the image information.
10. The data transmission method according to any one of claims 1 to 6, wherein the generating on-screen display information time-synchronized with the image information based on the state data of the movable platform includes:
and generating on-screen display information time-synchronized with the image information based on the first parameter information of the movable platform and/or second parameter information of data transmission between the movable platform and the user side.
11. The method according to claim 10, wherein the generating on-screen display information time-synchronized with the image information based on the state data of the movable platform comprises:
and generating on-screen display information time-synchronized with the image information based on the state data of the movable platform and the third parameter information of the user side.
12. The data transmission method according to claim 11, wherein the first parameter information includes at least one of an unlocking time, a voltage, and a number of cells of the movable platform; the second parameter information comprises at least one of signal strength, channel information, transmission delay and transmission code rate of data transmission between the movable platform and the user side; the third parameter information includes at least one of a voltage of the user terminal and a number of battery cells.
13. The data transmission method according to any one of claims 1 to 6, wherein the generating on-screen display information time-synchronized with the image information based on the state data of the movable platform includes:
and if an image transmission instruction sent by the user side is received, generating on-screen display information time-synchronized with the image information based on the state data of the movable platform.
14. The data transmission method according to claim 13, wherein the acquiring of the image information captured by the movable platform-mounted imaging device includes:
and if an image transmission instruction sent by the user side is received, acquiring image information shot by the camera device carried by the movable platform.
15. The data transmission method according to any one of claims 1 to 6, further comprising:
acquiring an operation event of a user to the user side from the user side, and generating buried point data according to the operation event and on-screen display information and/or image information associated with the operation event;
and when connected to terminal equipment, sending the buried point data to the terminal equipment.
16. The data transmission method according to claim 15, further comprising:
and if the uploading success notification sent by the terminal equipment is received, deleting the data of the buried point corresponding to the uploading success notification.
17. A data processing method for a user side, comprising:
receiving, through an image transmission channel, image information from the movable platform; and
receiving on-screen display information time-synchronized with the image information;
displaying the image information, and synchronously displaying the on-screen display information when displaying the image information.
18. The data processing method of claim 17, wherein the on-screen display information comprises:
a display time;
the displaying the on-screen display information synchronously while displaying the image information includes:
and synchronously displaying the on-screen display information when the image information is displayed according to the display time of the on-screen display information.
19. The data processing method of claim 18, wherein the on-screen display information is generated by the movable platform from state data and display time of the movable platform;
the synchronously displaying the on-screen display information when displaying the image information according to the display time of the on-screen display information comprises:
and synchronously displaying the state data in the on-screen display information when the image information is displayed according to the display time of the on-screen display information.
20. The data processing method of claim 19, wherein the on-screen display information is generated by the movable platform according to the state data and the display time of the movable platform, and is in a preset format;
the synchronous display of the state data in the on-screen display information when the image information is displayed according to the display time of the on-screen display information comprises:
analyzing the on-screen display information in the preset format to obtain the state data and the display time;
and synchronously displaying the state data when the image information is displayed according to the display time.
21. The data processing method of claim 20, wherein receiving on-screen display information from the movable platform comprises:
on-screen display information is received from the movable platform in SRT format, binary DAT format, text format, SSA format, or SUB format.
22. A data processing method according to any of claims 18-21, wherein the image information comprises:
a time axis;
the synchronously displaying the on-screen display information when displaying the image information according to the display time of the on-screen display information comprises:
and determining the display time and the on-screen display information corresponding to the time axis according to the time axis of the image information being displayed, and displaying the on-screen display information.
23. The data processing method of any of claims 17-21, wherein the on-screen display information comprises:
at least one of first parameter information of the movable platform, second parameter information of data transmission between the movable platform and a user side, and third parameter information of the user side.
24. The data processing method of claim 23, wherein the first parameter information includes at least one of an unlock time, a voltage, and a number of cells of the movable platform; the second parameter information comprises at least one of signal strength, channel information, transmission delay and transmission code rate of data transmission between the movable platform and the user side; the third parameter information includes at least one of a voltage of the user terminal and a number of battery cells.
25. The data processing method according to any one of claims 17 to 21, wherein said displaying the image information and synchronously displaying the on-screen display information while displaying the image information comprises:
displaying the image information on a display interface, and synchronously displaying the on-screen display information in a preset area of the display interface when the image information is displayed.
26. The data processing method according to any one of claims 17 to 21, further comprising:
and sending an image transmission instruction to the movable platform so that the movable platform generates on-screen display information time-synchronized with the image information based on the state data of the movable platform, and the movable platform sends the on-screen display information to the user side.
27. The data processing method of claim 26, further comprising:
and sending an image transmission instruction to the movable platform so that the movable platform acquires image information shot by a camera carried by the movable platform and sends the image information to the user terminal.
28. The data processing method according to any one of claims 17 to 21, further comprising:
storing the image information from the movable platform to generate a video file.
29. The data processing method of claim 28, further comprising:
storing the on-screen display information time-synchronized with the image information to generate an on-screen display information file of the video file.
30. The data processing method of claim 29, further comprising:
and if a playing instruction input by a user is received, displaying the image information in the video file, and synchronously displaying the on-screen display information in the on-screen display information file when the image information is displayed.
31. The data processing method according to any one of claims 17 to 21, further comprising:
acquiring an operation event of a user operating the user side, and generating buried point data according to the operation event and on-screen display information and/or image information associated with the operation event;
and when connected to terminal equipment, sending the buried point data to the terminal equipment.
32. The data processing method of claim 31, further comprising:
and if the uploading success notification sent by the terminal equipment is received, deleting the data of the buried point corresponding to the uploading success notification.
33. A data processing method, comprising:
the method comprises the steps that a movable platform obtains image information shot by a camera device carried by the movable platform, and the image information is sent to a user side through a picture transmission channel;
the movable platform generates on-screen display information time-synchronized with the image information based on the state data of the movable platform, and sends the on-screen display information to the user side through a picture transmission channel;
and the user side displays the image information and synchronously displays the on-screen display information when displaying the image information.
34. The data processing method of claim 33, wherein the on-screen display information comprises:
a display time;
the user side synchronously displays the on-screen display information when displaying the image information, and the method comprises the following steps:
and the user side synchronously displays the on-screen display information when displaying the image information according to the display time of the on-screen display information.
35. The data processing method of claim 34, wherein the movable platform generates on-screen display information that is time-synchronized with the image information based on the state data of the movable platform, comprising:
the movable platform acquires state data of the movable platform, and display time of the state data is determined according to the time for shooting the image information;
and the movable platform generates the on-screen display information according to the state data and the display time.
36. The data processing method of claim 35, wherein the user side synchronously displays the on-screen display information when displaying the image information according to the display time of the on-screen display information, and the method comprises:
and the user side synchronously displays the state data in the display information on the screen when displaying the image information according to the display time of the display information on the screen.
37. The data processing method of claim 35, wherein the generating the on-screen display information by the movable platform based on the status data and the display time comprises:
and the movable platform generates on-screen display information in a preset format according to the state data and the display time.
38. The data processing method of claim 37, wherein the user side synchronously displays the status data in the on-screen display information when displaying the image information according to the display time of the on-screen display information, and the method comprises:
the user side analyzes the on-screen display information in the preset format to obtain the state data and the display time;
and the user side synchronously displays the state data when displaying the image information according to the display time.
39. The data processing method of claim 37 or 38, wherein the generating of the on-screen display information in a preset format by the movable platform according to the state data and the display time comprises:
and the movable platform generates on-screen display information in an SRT format, a binary DAT format, a text format, an SSA format or an SUB format according to the state data and the display time.
40. The data processing method of any one of claims 34 to 38, wherein the image information comprises:
a time axis;
and wherein the user side synchronously displaying the on-screen display information according to the display time of the on-screen display information when displaying the image information comprises:
the user side determines, according to the time axis of the image information being displayed, the display time and the on-screen display information corresponding to the time axis, and displays the on-screen display information.
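An illustrative lookup for claim 40: given the current position on the time axis of the image information being displayed, the user side selects the on-screen display entry whose display time most recently started. The sorted in-memory list is an assumption; any time-indexed store would do.

    import bisect

    osd_entries = [  # assumed sorted by display_time (seconds on the time axis)
        {"display_time": 0.0, "state": {"alt": "100m"}},
        {"display_time": 1.0, "state": {"alt": "110m"}},
        {"display_time": 2.0, "state": {"alt": "120m"}},
    ]
    times = [entry["display_time"] for entry in osd_entries]

    def osd_for_position(position_s):
        # Pick the entry whose display time started at or before the current
        # playback position on the time axis.
        i = bisect.bisect_right(times, position_s) - 1
        return osd_entries[i] if i >= 0 else None

    print(osd_for_position(1.6))  # {'display_time': 1.0, 'state': {'alt': '110m'}}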
41. The data processing method of any one of claims 33 to 38, wherein the movable platform generating on-screen display information time-synchronized with the image information based on the state data of the movable platform comprises:
the movable platform generates the on-screen display information time-synchronized with the image information based on first parameter information of the movable platform and/or second parameter information of data transmission between the movable platform and the user side.
42. The data processing method of claim 41, wherein the movable platform generating on-screen display information time-synchronized with the image information based on the state data of the movable platform comprises:
the movable platform generates the on-screen display information time-synchronized with the image information based on the state data of the movable platform and third parameter information of the user side.
43. The data processing method of claim 42, wherein the first parameter information comprises at least one of an unlocking time, a voltage, and a number of battery cells of the movable platform; the second parameter information comprises at least one of a signal strength, channel information, a transmission delay, and a transmission bit rate of data transmission between the movable platform and the user side; and the third parameter information comprises at least one of a voltage and a number of battery cells of the user side.
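One possible (not claimed) way to group the parameter information enumerated in claim 43, with field names and units chosen for illustration:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class FirstParameterInfo:        # about the movable platform itself
        unlock_time_s: Optional[float] = None
        voltage_v: Optional[float] = None
        battery_cells: Optional[int] = None

    @dataclass
    class SecondParameterInfo:       # about the link between platform and user side
        signal_strength_dbm: Optional[float] = None
        channel: Optional[int] = None
        transmission_delay_ms: Optional[float] = None
        bitrate_kbps: Optional[float] = None

    @dataclass
    class ThirdParameterInfo:        # about the user side
        voltage_v: Optional[float] = None
        battery_cells: Optional[int] = None

    link = SecondParameterInfo(signal_strength_dbm=-62.0, transmission_delay_ms=38.0)
    print(link)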
44. The data processing method of any one of claims 33 to 38, wherein the user side displaying the image information and synchronously displaying the on-screen display information when displaying the image information comprises:
the user side displays the image information on a display interface, and synchronously displays the on-screen display information in a preset area of the display interface when displaying the image information.
45. The data processing method of any one of claims 33 to 38, wherein the movable platform generating on-screen display information time-synchronized with the image information based on the state data of the movable platform comprises:
if the movable platform receives an image transmission instruction sent by the user side, the movable platform generates the on-screen display information time-synchronized with the image information based on the state data of the movable platform.
46. The data processing method of claim 45, wherein the movable platform acquiring the image information captured by the camera device carried by the movable platform comprises:
if the movable platform receives the image transmission instruction sent by the user side, the movable platform acquires the image information captured by the camera device carried by the movable platform.
47. The data processing method of any one of claims 33 to 38, further comprising:
the user side stores the image information sent by the movable platform to generate a video file.
48. The data processing method of claim 47, further comprising:
the user side stores the on-screen display information time-synchronized with the image information to generate an on-screen display information file of the video file.
49. The data processing method of claim 48, further comprising:
if the user side receives a playback instruction input by a user, the user side displays the image information in the video file, and synchronously displays the on-screen display information in the on-screen display information file when displaying the image information.
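A sketch of claims 47 to 49: the user side writes the received image information to a video file and the time-synchronized on-screen display information to a companion file, then reads both back for synchronized playback. The plain-text stand-in for the video file, the .srt extension, and the file names are assumptions made for this sketch.

    import tempfile
    from pathlib import Path

    def save_recording(folder, frames, osd_cues, basename="flight_0001"):
        # The plain-text "video" file stands in for the stored video file; the .srt
        # file is the on-screen display information file generated for it.
        (folder / f"{basename}.video.txt").write_text("\n".join(frames))
        (folder / f"{basename}.srt").write_text("\n\n".join(osd_cues))

    def play_recording(folder, basename="flight_0001"):
        frames = (folder / f"{basename}.video.txt").read_text().splitlines()
        cues = (folder / f"{basename}.srt").read_text().split("\n\n")
        for frame, cue in zip(frames, cues):  # synchronized playback of frame + overlay
            print(f"show {frame} | overlay: {cue!r}")

    with tempfile.TemporaryDirectory() as tmp:
        folder = Path(tmp)
        save_recording(folder, ["frame-0", "frame-1"],
                       ["1\n00:00:00,000 --> 00:00:01,000\nalt:100m",
                        "2\n00:00:01,000 --> 00:00:02,000\nalt:110m"])
        play_recording(folder)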
50. The data processing method of any one of claims 33 to 38, further comprising:
the user side acquires an operation event of a user operating the user side, and generates buried point data according to the operation event and the on-screen display information and/or the image information associated with the operation event;
and when the user side is connected to the terminal equipment, the user side sends the buried point data to the terminal equipment.
51. The data processing method of claim 50, further comprising:
if the user side receives an uploading success notification sent by the terminal equipment, the user side deletes the buried point data corresponding to the uploading success notification.
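An illustrative reading of claims 50 and 51, treating the buried point data as event-tracking records held by the user side until the terminal equipment acknowledges the upload; the in-memory list and the upload callback are assumptions, not part of the claims.

    buried_points = []  # event-tracking records held until the terminal acknowledges them

    def record_event(event_name, osd=None, image_ref=None):
        # Associate the operation event with the on-screen display and/or image
        # information that was on screen when it happened.
        buried_points.append({"event": event_name, "osd": osd, "image": image_ref})

    def on_terminal_connected(upload):
        # upload() stands in for sending the data to the terminal equipment; it
        # returns the records the terminal acknowledged as successfully uploaded.
        acknowledged = upload(list(buried_points))
        for record in acknowledged:          # delete on the uploading success notification
            buried_points.remove(record)

    record_event("enter_photo_mode", osd={"alt": "120m"}, image_ref="frame-42")
    on_terminal_connected(upload=lambda records: records)  # pretend everything succeeded
    print(buried_points)  # [] -- acknowledged records have been deleted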
52. The data processing method of any one of claims 33 to 38, further comprising:
the movable platform acquires, from the user side, an operation event of a user operating the user side, and generates buried point data according to the operation event and the on-screen display information and/or the image information associated with the operation event;
and when the movable platform is connected to the terminal equipment, the movable platform sends the buried point data to the terminal equipment.
53. The data processing method of claim 52, further comprising:
if the movable platform receives an uploading success notification sent by the terminal equipment, the movable platform deletes the buried point data corresponding to the uploading success notification.
54. A movable platform comprising a memory and a processor;
the memory is used for storing a computer program;
the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
acquiring image information captured by a camera device carried by the movable platform;
generating on-screen display information that is time-synchronized with the image information based on the state data of the movable platform;
and sending the image information and the on-screen display information to a user side through an image transmission channel.
55. The movable platform of claim 54, wherein the movable platform comprises a drone or a gimbal.
56. A display device comprising a display component, a memory, and a processor;
the display component is used for displaying;
the memory is used for storing a computer program;
the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
receiving, through an image transmission channel, image information from a movable platform and on-screen display information time-synchronized with the image information;
displaying the image information, and synchronously displaying the on-screen display information when displaying the image information.
57. Glasses, comprising a display component, a memory, and a processor;
the display component is used for displaying;
the memory is used for storing a computer program;
the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
receiving, through an image transmission channel, image information from a movable platform and on-screen display information time-synchronized with the image information;
displaying the image information, and synchronously displaying the on-screen display information when displaying the image information.
58. A movable platform system, characterized by comprising a movable platform and a user side;
the movable platform is used for acquiring image information captured by a camera device carried by the movable platform, and sending the image information to the user side through an image transmission channel;
the movable platform is further used for generating, based on state data of the movable platform, on-screen display information time-synchronized with the image information, and sending the on-screen display information to the user side through the image transmission channel;
the user side is used for displaying the image information and synchronously displaying the on-screen display information when the image information is displayed.
59. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to implement:
the data transmission method according to any one of claims 1 to 16; and/or
the data processing method according to any one of claims 17 to 32.
CN201980031843.9A 2019-07-24 2019-07-24 Data sending and processing method, movable platform, display device, glasses and system Pending CN112119630A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/097485 WO2021012212A1 (en) 2019-07-24 2019-07-24 Data sending and processing methods, movable platform, display device, glasses and system

Publications (1)

Publication Number Publication Date
CN112119630A (en) 2020-12-22

Family

ID=73799210

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980031843.9A Pending CN112119630A (en) 2019-07-24 2019-07-24 Data sending and processing method, movable platform, display device, glasses and system

Country Status (2)

Country Link
CN (1) CN112119630A (en)
WO (1) WO2021012212A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105278362A (en) * 2015-01-16 2016-01-27 深圳一电科技有限公司 Unmanned reconnaissance system control method, device and system
CN105516604A (en) * 2016-01-20 2016-04-20 陈昊 Aerial video sharing method and system
CN106094863A * 2015-04-23 2016-11-09 鹦鹉无人机股份有限公司 System for piloting a drone in immersion
CN106292719A * 2016-09-21 2017-01-04 深圳智航无人机有限公司 Ground station fusion system and ground station video data fusion method
CN108319289A * 2017-01-16 2018-07-24 翔升(上海)电子技术有限公司 Head-mounted display device, unmanned aerial vehicle, flight system and unmanned aerial vehicle (UAV) control method
WO2019049095A1 (en) * 2017-09-11 2019-03-14 Bonasi Catellani Andrea Electronic device for receiving data in real time from another electronic device
CN109729293A * 2019-01-21 2019-05-07 深圳市敢为软件技术有限公司 Display method, device and storage medium for video-related information
CN109753330A * 2018-12-26 2019-05-14 深圳市麦谷科技有限公司 Navigation device interface display method and system, computer device and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108205618A * 2016-12-20 2018-06-26 亿航智能设备(广州)有限公司 VR glasses, and connection method, device and system for controlling VR glasses and an unmanned aerial vehicle
CN108156418A * 2017-12-14 2018-06-12 沈阳无距科技有限公司 Method, device and system for obtaining real-time images through an unmanned aerial vehicle

Also Published As

Publication number Publication date
WO2021012212A1 (en) 2021-01-28

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20201222