CN113496559A - Unmanned equipment data acquisition method, device and system, unmanned equipment and storage medium


Info

Publication number: CN113496559A
Application number: CN202110731486.6A
Authority: CN (China)
Legal status: Pending
Prior art keywords: unmanned aerial vehicle, unmanned, environment information, accident
Other languages: Chinese (zh)
Inventors: 吴泽远, 黄蓉
Current and Original Assignee: Guangzhou Xaircraft Technology Co Ltd
Application filed by Guangzhou Xaircraft Technology Co Ltd

Classifications

    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 - Registering or indicating the working of vehicles
    • G07C5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841 - Registering performance data
    • G07C5/008 - Registering or indicating the working of vehicles communicating information to a remotely located station

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiment of the application discloses an unmanned equipment data acquisition method, apparatus and system, an unmanned equipment, and a storage medium. According to the technical scheme provided by the embodiment of the application, surrounding environment information and motion state parameters of the unmanned equipment are acquired in real time, wherein the surrounding environment information comprises video information captured by a camera device of the unmanned equipment; whether the unmanned equipment meets an accident occurrence condition is determined according to the motion state parameters; and when the unmanned equipment is determined to meet the accident occurrence condition, the surrounding environment information is saved for retrieval and analysis. These technical means solve the prior-art problems that unmanned equipment data acquisition is inefficient and lacks effective acquisition means: accident-related data can be determined effectively, which facilitates subsequent analysis.

Description

Unmanned equipment data acquisition method, device and system, unmanned equipment and storage medium
Technical Field
The embodiment of the application relates to the technical field of unmanned equipment, in particular to an unmanned equipment data acquisition method, device and system, unmanned equipment and a storage medium.
Background
In some technical fields requiring high-altitude operation, manual work is generally replaced by unmanned equipment; for example, unmanned aerial vehicles are used for high-altitude inspection, high-altitude cargo transportation, surveying and mapping data acquisition, and the like. During high-altitude operation, the unmanned equipment is affected by various environmental factors or its own working conditions, and accidents may occur, such as a crash accident caused by a fall from altitude.
At present, drone crash accidents are mostly addressed with various protective means that reduce the damage to the drone, while effective means of acquiring accident data are lacking. This is unfavorable for analyzing the accident cause and needs improvement.
Disclosure of Invention
The embodiment of the application provides an unmanned equipment data acquisition method, apparatus and system, an unmanned equipment, and a storage medium, which solve the prior-art problems that unmanned equipment data acquisition is inefficient and lacks effective acquisition means; accident-related data can be determined effectively, facilitating subsequent analysis.
In a first aspect, an embodiment of the present application provides an unmanned device data acquisition method, including:
acquiring, in real time, surrounding environment information and motion state parameters of the unmanned equipment, wherein the surrounding environment information comprises video information captured by a camera device of the unmanned equipment;
determining, according to the motion state parameters, whether the unmanned equipment meets an accident occurrence condition;
and when the unmanned equipment is determined to meet the accident occurrence condition, saving the surrounding environment information for retrieval and analysis.
Further, the motion state parameters include a motor rotation parameter and an acceleration parameter of the unmanned aerial vehicle, and determining whether the unmanned aerial vehicle meets an accident occurrence condition according to the motion state parameters includes:
determining whether the motor rotation parameter meets a first preset condition and whether the acceleration parameter meets a second preset condition;
and when the motor rotation parameter meets a first preset condition and/or the acceleration parameter meets a second preset condition, determining that the unmanned equipment meets an accident occurrence condition.
Further, the determining whether the motor rotation parameter meets a first preset condition and whether the acceleration parameter meets a second preset condition includes:
calculating a rotation difference value between a motor rotation parameter obtained at the current moment and a motor rotation parameter obtained at the previous moment, and calculating an acceleration difference value between an acceleration parameter obtained at the current moment and an acceleration parameter obtained at the previous moment;
judging whether the rotation difference and the acceleration difference exceed their corresponding threshold ranges;
when the rotation difference value is larger than a first threshold value, determining that the motor rotation parameter meets a first preset condition;
and when the acceleration difference is larger than a second threshold value, determining that the acceleration parameter meets a second preset condition.
Further, the video information includes first-person-view image data of the unmanned aerial vehicle, the first-person-view image data being captured by a camera device arranged at the head of the unmanned aerial vehicle.
Further, the acquiring, in real time, the surrounding environment information of the unmanned device includes:
and acquiring a video clip with a preset time length recorded by the camera device in real time, wherein the camera device records the video clip when the unmanned equipment is in an operating state.
Further, the storing the surrounding environment information includes:
storing the video clips acquired in real time into a local memory;
and uploading the video clips in the local storage to a remote controller end, other unmanned equipment and/or a cloud server through a cellular network and/or a local numerical chain.
Further, the uploading the surrounding environment information to a remote controller terminal, other unmanned equipment terminals and/or a cloud server includes:
and acquiring at least one newly stored video clip from the local memory and uploading the video clip.
Further, the storing the video segment acquired in real time to a local storage includes:
if the local memory has no free space, deleting the video clip that has been stored the longest in the local memory, and then storing the currently acquired video clip in the local memory.
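The evict-the-oldest policy above amounts to a ring buffer. The sketch below is illustrative only: `MAX_CLIPS` stands in for the local memory capacity, and clips are represented by their names rather than real video files.

```python
from collections import deque

MAX_CLIPS = 4  # hypothetical capacity of the local memory

def store_clip(local_store: deque, clip: str) -> None:
    """Store a newly recorded clip; when no free space remains, first
    delete the clip with the longest storage time (the oldest one)."""
    if len(local_store) >= MAX_CLIPS:
        local_store.popleft()  # evict the longest-stored clip
    local_store.append(clip)
```

In practice the same effect could be obtained with `deque(maxlen=MAX_CLIPS)`, which drops the oldest element automatically on append.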
Further, the uploading the surrounding environment information to a remote controller terminal, other unmanned equipment terminals and/or a cloud server includes:
and merging and uploading all the video clips in the local storage.
Further, after the storing the surrounding environment information, the method further includes:
and identifying the video clip according to a preset obstacle identification model, and determining the accident occurrence reason of the unmanned equipment.
Further, before the determining whether the motion state parameter satisfies an accident occurrence condition, the method further includes:
determining whether the current state of the unmanned equipment is a working state;
and if the current state is the operation state, determining whether the motion state parameters meet accident occurrence conditions.
In a second aspect, an embodiment of the present application provides an unmanned equipment data acquisition apparatus, including:
the data acquisition module is configured to acquire surrounding environment information and motion state parameters of the unmanned equipment in real time, wherein the surrounding environment information comprises video information shot by the unmanned equipment camera device;
a state judgment module configured to determine whether the unmanned equipment meets an accident occurrence condition according to the motion state parameter;
a data saving module configured to save the ambient environment information for analysis and acquisition when it is determined that the unmanned device satisfies the accident occurrence condition.
In a third aspect, an embodiment of the present application provides an unmanned aerial vehicle data acquisition system, including a peripheral information acquisition module, a flight control module, and a service processing module:
the peripheral information acquisition module is used for acquiring the current peripheral environment information of the unmanned equipment;
the flight control module is used for acquiring a motion state parameter of the unmanned equipment, determining whether the unmanned equipment meets an accident occurrence condition according to the motion state parameter, and notifying the service processing module when the unmanned equipment meets the accident occurrence condition;
and the service processing module is used for acquiring the surrounding environment information in real time and storing the surrounding environment information after receiving the notification sent by the flight control module.
Further, the unmanned equipment data acquisition system further includes a communication module, wherein:
and the communication module is used for uploading the surrounding environment information stored by the service processing module.
In a fourth aspect, embodiments of the present application provide an unmanned device, including:
one or more processors; a storage device storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the unmanned device data acquisition method of the first aspect.
In a fifth aspect, embodiments of the present application provide a storage medium containing computer-executable instructions for performing the method of unmanned device data acquisition according to the first aspect when executed by a computer processor.
Surrounding environment information and motion state parameters of the unmanned equipment are acquired in real time, the surrounding environment information comprising video information captured by a camera device of the unmanned equipment; whether the unmanned equipment meets an accident occurrence condition is determined according to the motion state parameters; and when the unmanned equipment is determined to meet the accident occurrence condition, the surrounding environment information is saved for retrieval and analysis. With these technical means, the motion state parameters of the unmanned equipment are acquired in real time to monitor in real time whether an accident has occurred, and when an accident is detected, the acquired surrounding environment information is saved. The surrounding environment information may record what caused the accident; saving it allows accident-related data to be retrieved from it, and the accident cause can then be analyzed effectively from that data.
Drawings
Fig. 1 is a flowchart of a data acquisition method for an unmanned aerial vehicle according to an embodiment of the present application;
FIG. 2 is a diagram of an unmanned aerial vehicle data acquisition system according to an embodiment of the present application;
fig. 3 is a flowchart of another data acquisition method for an unmanned aerial vehicle according to an embodiment of the present application;
fig. 4 is a flowchart of another data acquisition method for an unmanned aerial vehicle according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an unmanned aerial vehicle data acquisition apparatus according to a second embodiment of the present application;
fig. 6 is a schematic structural diagram of an unmanned aerial vehicle according to a third embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, specific embodiments of the present application will be described in detail with reference to the accompanying drawings. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be further noted that, for the convenience of description, only some but not all of the relevant portions of the present application are shown in the drawings. Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently or simultaneously. In addition, the order of the operations may be re-arranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
The unmanned equipment data acquisition method of the present application acquires, in real time, surrounding environment information and motion state parameters of the unmanned equipment, the surrounding environment information comprising video information captured by a camera device of the unmanned equipment; determines, according to the motion state parameters, whether the unmanned equipment meets an accident occurrence condition; and, when the unmanned equipment is determined to meet the accident occurrence condition, saves the surrounding environment information for retrieval and analysis. At present, when unmanned equipment suffers a crash accident, various protective means are mostly adopted to reduce the damage to the vehicle, but effective accident data acquisition means are lacking and accident data acquisition efficiency is low. The present unmanned equipment data acquisition method is therefore proposed to solve the problems that existing unmanned equipment data acquisition is inefficient and lacks effective acquisition means.
The first embodiment is as follows:
fig. 1 is a flowchart of an unmanned device data acquisition method according to an embodiment of the present disclosure, where the unmanned device data acquisition method provided in this embodiment may be executed by an unmanned device, the unmanned device may be implemented in a software and/or hardware manner, and the unmanned device may be formed by two or more physical entities or may be formed by one physical entity.
The following description will be given taking the unmanned aerial vehicle as an example of a main body for executing the unmanned aerial vehicle data acquisition method.
Referring to fig. 1, the method for acquiring data of the unmanned aerial vehicle specifically includes:
and S110, acquiring surrounding environment information and motion state parameters of the unmanned equipment in real time.
The unmanned equipment comprises unmanned aerial vehicles, unmanned vehicles and unmanned ships, which can move on their own along preset routes. The surrounding environment information comprises the temperature, humidity, wind speed, illumination, video information and the like around the unmanned equipment; the video information is captured by a camera device and records the obstacles around the unmanned equipment. The motion state parameters comprise data that characterize the motion of the unmanned equipment, such as its speed, acceleration and motor rotation speed.
This embodiment takes an unmanned aerial vehicle as the example of unmanned equipment. When the unmanned aerial vehicle operates at high altitude along a preset flight route, a collision with an aerial obstacle may cause a motor fault; the motor can then no longer provide power for flight, and lacking sufficient power to continue flying or land normally, the unmanned aerial vehicle falls directly from altitude to the ground, resulting in a crash accident. When an existing unmanned aerial vehicle suffers such a crash-landing accident, various protective means are mostly adopted to reduce the damage to the vehicle, but accident data acquisition remains inefficient. In contrast, according to the unmanned aerial vehicle data acquisition method provided by the embodiment of the application, while the unmanned aerial vehicle executes a route task and flies in the air, its surrounding environment information is acquired in real time, recording the environment in which the vehicle flies. If the unmanned aerial vehicle crashes because of an external factor such as an obstacle, the surrounding environment information corresponding to the time of the accident can be analyzed to determine the cause of the crash.
Fig. 2 is an exemplary unmanned aerial vehicle data acquisition system provided by an embodiment of the present application. Referring to Fig. 2, the data acquisition system includes a peripheral information acquisition module, a flight control module, a communication module and a service processing module, with the service processing module connected to the peripheral information acquisition module, the flight control module and the communication module respectively. The peripheral information acquisition module is used for acquiring the current surrounding environment information of the unmanned equipment and includes a camera device that records video information around the unmanned equipment. In one embodiment, the video information includes first-person-view image data of the unmanned aerial vehicle; for example, a camera device is arranged at the head of the unmanned aerial vehicle, and the first-person-view image data is captured by this camera device. Illustratively, if the unmanned aerial vehicle hits an obstacle head-on during high-altitude operation, breaks down under the impact and falls from altitude in a crash accident, the camera device at the head records the whole process of hitting the obstacle and generates first-person-view image data; analyzing that image data can establish that the cause of the crash was a collision with a high-altitude obstacle.
Furthermore, while the unmanned aerial vehicle executes a route task and flies in the air, its motion state parameters are acquired in real time, and whether an accident has occurred is monitored in real time according to these parameters. Referring to Fig. 2, the data acquisition system includes a flight control module, which acquires the motion state parameters of the unmanned aerial vehicle in real time and, according to the currently acquired parameters, controls the rotation state of the motors and the flight attitude so that the vehicle flies according to the preset route task. The unmanned aerial vehicle controls the motors to rotate the propellers, which provide flight power; if the vehicle breaks down after being struck by an obstacle and falls from altitude in a crash accident, the flight control module records the motion state parameters throughout the fall, so whether a crash accident has occurred can be judged from the motion state parameters.
And S120, determining whether the unmanned equipment meets accident occurrence conditions or not according to the motion state parameters.
The accident occurrence condition characterizes the motion state parameters, or the changes in those parameters, that occur when the unmanned aerial vehicle has an accident; that is, the values or changes the parameters take at the moment of an accident serve as the accident occurrence condition. Comparing the current motion state parameters, or their changes, against the preset accident occurrence condition therefore determines whether the unmanned aerial vehicle has suffered a crash accident. An accident judgment is made only while the unmanned aerial vehicle is flying on a route task, and this step includes:
S1201, determining whether the current state of the unmanned equipment is an operating state.
Illustratively, the unmanned aerial vehicle has an operating state and a non-operating state. The operating state is the flight or hovering state while the vehicle executes a route task; the non-operating state is standby, or debugging and inspection before operation, i.e., any time a route task is not being executed. Understandably, a fall-and-crash accident can only occur while the vehicle is executing a route task, so whether a crash accident has occurred is judged only during route-task execution.
Further, whether the unmanned aerial vehicle is executing a route task in the air can be determined from its motion state parameters or from the control instructions with which the flight control module controls its flight.
And S1202, if the current state is the operation state, determining whether the motion state parameters meet accident occurrence conditions.
For example, when it is determined from the motion state parameters or the control instructions that the unmanned aerial vehicle is executing a route task, whether a crash accident has occurred is judged according to the motion state parameters.
In one embodiment, the motion state parameters acquired in real time by the flight control module include a motor rotation parameter and an acceleration parameter of the unmanned aerial vehicle; the motor rotation parameter can be measured with a rotation speed sensor, and the acceleration parameter with an inertial measurement unit. Because these parameters reflect the flight status of the vehicle, whether a crash accident has occurred can be determined from the motor rotation parameter and/or the acceleration parameter. Accordingly, judging whether the unmanned aerial vehicle has had an accident includes steps S1203 to S1204:
S1203, determining whether the motor rotation parameter meets a first preset condition and whether the acceleration parameter meets a second preset condition.
The first preset condition refers to the motor rotation parameter of an unmanned aerial vehicle in an abnormal state. Illustratively, in a normal flight state the motor rotation parameter lies within a preset rated working range, and when it exceeds that range the vehicle is determined to be in an abnormal state. Similarly, the second preset condition refers to the acceleration parameter of an unmanned aerial vehicle in an abnormal state: in a normal flight state the acceleration parameter lies within a preset rated working range, and when it exceeds that range the vehicle is determined to be in an abnormal state.
S1204, when the motor rotation parameter meets a first preset condition and/or the acceleration parameter meets a second preset condition, determining that the unmanned equipment meets an accident occurrence condition.
Illustratively, the abnormal states include a crash-to-ground state and an abnormal flight state. Which abnormal state the unmanned aerial vehicle is in can be determined directly from the motor rotation parameter or the acceleration parameter, or jointly from both. The motor rotation parameter includes not only the rotation value at a given moment but also its change over a period of time, and likewise the acceleration parameter includes both an instantaneous value and its change over a period of time.
In one embodiment, whether the unmanned aerial vehicle is in the crash-to-ground state is determined from the change in motor rotation and the change in acceleration between successive moments. Illustratively, when the vehicle falls from altitude to the ground because of an unexpected fault, the speed of one motor or of all motors suddenly drops to zero or by a large margin; thus, if the motor speeds acquired by the flight control module at adjacent moments differ greatly, the vehicle is falling or has fallen. At the moment of landing after an abnormal fall from altitude, the acceleration of the vehicle also changes greatly; thus, if both the acceleration and the motor speed acquired at adjacent moments change by a large margin, the vehicle has just suffered a crash-landing accident. To this end, jointly determining from the motor rotation parameter and the acceleration parameter whether a crash accident has occurred includes steps S12041 to S12044:
S12041, calculating a rotation difference between the motor rotation parameter obtained at the current moment and the motor rotation parameter obtained at the previous moment, and calculating an acceleration difference between the acceleration parameter obtained at the current moment and the acceleration parameter obtained at the previous moment.
Illustratively, the flight control module subtracts the motor speed acquired at the previous moment from the currently acquired motor speed to obtain the change in motor speed at the current moment, and likewise subtracts the previously acquired acceleration from the currently acquired acceleration to obtain the change in acceleration at the current moment.
S12042, judging whether the rotation difference and the acceleration difference exceed their corresponding threshold ranges.
Illustratively, the flight control module compares the change in motor speed and the change in acceleration with the corresponding thresholds; if both exceed their threshold ranges, it is determined that the unmanned aerial vehicle has just suffered a crash accident. If at least one of them does not exceed its threshold range, the motor speed and acceleration at the next moment are acquired and judged in turn, so that crash accidents are monitored in real time. The threshold ranges corresponding to the rotation difference and the acceleration difference characterize the rated working ranges of those differences in a normal flight state; exceeding them indicates that the vehicle is in an abnormal state. The threshold ranges may be set according to experimental data.
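The sample-by-sample monitoring described above (judge each adjacent pair of samples, and keep acquiring when the condition is not met) might look like the sketch below. The names and thresholds are invented, and the joint AND condition of this embodiment is used:

```python
def monitor(samples, rpm_threshold, accel_threshold):
    """Scan consecutive (motor_speed, acceleration) samples; return the
    index at which both change values exceed their thresholds (the joint
    crash determination), or None if no accident is detected."""
    prev = None
    for i, (rpm, accel) in enumerate(samples):
        if prev is not None:
            rot_diff = abs(rpm - prev[0])
            acc_diff = abs(accel - prev[1])
            if rot_diff > rpm_threshold and acc_diff > accel_threshold:
                return i  # crash-landing detected at this sample
        prev = (rpm, accel)
    return None
```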
S12043, when the rotation difference is greater than a first threshold, determine that the motor rotation parameter meets the first preset condition.
The first threshold delimits the threshold range corresponding to the rotation difference. When the rotation difference is greater than the first threshold, the rotation difference of the unmanned aerial vehicle exceeds its rated working range, which indicates that the first preset condition is met.
S12044, when the acceleration difference is greater than a second threshold, determine that the acceleration parameter meets the second preset condition.
When the acceleration difference is greater than the second threshold, the acceleration difference of the unmanned aerial vehicle exceeds its rated working range, which indicates that the second preset condition is met.
Further, when the motor rotation parameter meets the first preset condition and the acceleration parameter meets the second preset condition, it is determined that the unmanned aerial vehicle has suffered a crash accident.
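The two-difference check of steps S12041 to S12044 can be sketched as follows. This is a minimal illustration, not the patented implementation: the function and parameter names are invented for clarity, and the default threshold values are placeholders, since the source only says the thresholds are set from experimental data.

```python
def is_crash(prev_rpm, curr_rpm, prev_acc, curr_acc,
             rpm_threshold=2000.0, acc_threshold=30.0):
    """Return True when both change values leave the rated working range.

    Threshold defaults are illustrative placeholders; per the source they
    would be chosen from experimental data for the specific airframe.
    """
    rotation_diff = abs(curr_rpm - prev_rpm)        # S12041: speed change value
    acceleration_diff = abs(curr_acc - prev_acc)    # S12041: acceleration change value
    first_condition = rotation_diff > rpm_threshold       # S12043
    second_condition = acceleration_diff > acc_threshold  # S12044
    return first_condition and second_condition
```

In use, the flight control loop would call this once per sampling moment with the latest and previous readings, and continue polling whenever it returns False.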
S130, when the motion state parameters meet the accident occurrence condition, the surrounding environment information is saved for subsequent retrieval and analysis.
Illustratively, determining that the motion state parameters meet the accident occurrence condition is equivalent to determining that the unmanned aerial vehicle has suffered an accident. Once the accident is determined, the video information shot by the camera device is sent to the local storage device, so that it can later be obtained by manual copying. After a crash accident, the unmanned aerial vehicle has already dropped to the ground, and surrounding environment information collected at that point is invalid: it does not help to analyze the cause of the accident. Therefore, after the crash accident is determined, the camera device is controlled to stop video shooting, which improves the efficiency of accident data acquisition. Consistent with the above, a crash accident is only judged while the unmanned aerial vehicle is in the operation state; correspondingly, the camera device on the unmanned aerial vehicle only enables its recording function in the operation state, since a fall or crash cannot occur in the non-operation state and recording is then unnecessary.
In one embodiment, since the unmanned aerial vehicle may at any time be knocked out of the air by external interference while flying a route task, the recording function of the camera device on the nose of the unmanned aerial vehicle is started as soon as the route task begins, that is, as soon as the aircraft takes off. If the camera device records continuously, it accumulates a long stretch of first-person-view image data, from which the video clip covering the accident must later be extracted before the cause of the accident can be analyzed. Locating that clip within the full recording is a complex operation that reduces data analysis efficiency. In this embodiment, the camera device instead records video clips of a preset time length in real time, so that the clip covering the accident can be analyzed directly afterwards. Illustratively, the camera device on the nose records first-person-view video in segments: one clip every 5 seconds, so each clip is 5 seconds long. Merging and splicing all clips collected by the camera device in acquisition order reconstructs the first-person-view image data of the whole flight. It should be noted that the camera device records clips while the unmanned aerial vehicle is in the operation state, and is controlled to stop recording when a crash accident occurs, so the last recorded clip may be shorter than 5 seconds.
If the camera device receives no stop notification, it begins recording the next clip immediately after finishing the current one, so that the first-person-view image data of the flight is recorded continuously.
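The segmented recording and time-ordered splicing described above can be sketched as below. The `Clip` container and `merge_clips` helper are illustrative names invented for this sketch; the source only specifies the 5-second clip length and the merge-by-acquisition-order behavior.

```python
from dataclasses import dataclass, field

CLIP_SECONDS = 5  # per the embodiment, each clip is 5 seconds long


@dataclass
class Clip:
    start_time: float            # acquisition timestamp in seconds
    frames: list = field(default_factory=list)  # stand-in for encoded video


def merge_clips(clips):
    """Splice clips in acquisition-time order to rebuild the full
    first-person-view recording of the flight."""
    merged = []
    for clip in sorted(clips, key=lambda c: c.start_time):
        merged.extend(clip.frames)
    return merged
```

Because the camera stops on the accident notification, the last clip may hold fewer frames than a full 5-second segment; the merge is unaffected.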
Further, to ensure that the clip covering the accident is available for subsequent analysis, the clips collected by the camera device need to be saved. The clip saving step includes S1301-S1302:
S1301, the video clips acquired in real time are saved to a local memory.
Illustratively, each clip is saved to the local memory as soon as the camera device finishes recording it, so that clips are not lost if the camera device is damaged when the unmanned aerial vehicle suffers a crash accident.
S1302, the video clips in the local memory are uploaded to the remote controller end, other unmanned devices and/or a cloud server through a cellular network and/or a local data link.
For example, to improve data processing efficiency, when a crash accident is determined, the clip covering the accident can be uploaded directly to the remote controller end, other unmanned aerial vehicles and/or the cloud server, so that those device ends can analyze the cause of the accident from the clip. In this embodiment, the clip is preferably uploaded to the cloud server over a cellular network. When the cellular network is poor, the clip is transmitted to the remote controller end and/or other unmanned aerial vehicles through the local data link, and these then upload it to the cloud server. The two transmission paths together ensure effective delivery of the data.
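The preferred-path selection (cellular first, local data link as fallback) can be sketched as follows. All three callables are hypothetical hooks standing in for the communication module's real transports, which the source does not specify.

```python
def upload_clip(clip, cellular_ok, send_cellular, send_datalink):
    """Prefer the cellular link straight to the cloud server; when the
    cellular network is poor, relay through the local data link to the
    remote controller end and/or other unmanned aerial vehicles.

    cellular_ok, send_cellular and send_datalink are illustrative hooks.
    """
    if cellular_ok():
        return send_cellular(clip)   # direct upload to the cloud server
    return send_datalink(clip)       # relayed upload via the data link
```

A design note: keeping the link-quality probe (`cellular_ok`) separate from the transports lets the same dispatch logic serve both the crash-clip upload and the later bulk upload of remaining data.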
In this embodiment, the local memory stores first-person-view image data of the whole process from takeoff to the crash accident. When the unmanned aerial vehicle collides with an obstacle ahead and falls from high altitude, the fall from the collision to hitting the ground takes no more than about ten seconds. Only the clips that recorded this fall are useful for analyzing the cause of the accident; and since recording stops when the aircraft hits the ground, and each finished clip is saved to the local memory before the camera device begins a new one, the clips recording the fall are the most recently stored clips in the local memory. Accordingly, this embodiment obtains at least one newest stored clip from the local memory and uploads it. The number of newest clips to obtain depends on the clip length: with the 5-second clips of this embodiment, the roughly ten-second fall can be covered by obtaining the four newest clips in the local memory; with 10-second clips, two or three newest clips would suffice.
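A hedged sketch of sizing this fetch is given below. The `clips_needed` helper uses one clip of padding because the fall window seldom aligns with clip boundaries and the final clip may be truncated; the embodiment's own figure of four 5-second clips additionally reflects that its local memory holds at most four clips, so treat the helper's result as a lower bound, not the patent's exact rule. Both function names are invented for illustration.

```python
import math


def clips_needed(window_seconds, clip_seconds, extra=1):
    """Smallest number of newest clips covering the crash window, plus
    `extra` clips of padding for boundary misalignment and a possibly
    truncated final clip. Illustrative sizing only."""
    return math.ceil(window_seconds / clip_seconds) + extra


def newest_clips(local_storage, n):
    """local_storage is kept oldest-first; return the n newest clips."""
    return local_storage[-n:]
```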
Further, besides the clip of the fall being used to analyze the cause of the crash accident, the surrounding environment information collected during flight can also be used for subsequent geographic environment analysis, to better plan flight routes for the unmanned aerial vehicle. Therefore, after the crash accident is determined, the clip recording the fall is uploaded first. After it has been uploaded, whether the remaining clips in the local memory are uploaded is decided according to the remaining battery power of the unmanned aerial vehicle and the priority of the data. In this embodiment, once the crash accident is determined, the priority of the clip recording the fall is raised so that it is uploaded to the cloud server first. Other important data, such as the position coordinates and motion state parameters of the unmanned aerial vehicle, are then uploaded in order of priority; after these are uploaded, if sufficient battery power remains, the remaining clips in the local memory may also be uploaded.
In one embodiment, if no crash accident occurs from takeoff to normal landing, the camera device records a long stretch of first-person-view image data from start to stop, occupying a large amount of memory. Because no crash accident occurred during the flight, this image data is invalid: it cannot be used for accident analysis, and the local memory space holding it is entirely wasted, which reduces the efficiency of accident data acquisition. Moreover, the hardware resources of the unmanned aerial vehicle are limited. For unmanned aerial vehicles executing high-altitude tasks such as inspection, cargo transport or surveying and mapping data acquisition, most hardware resources are occupied by the main task, leaving only a small part for auxiliary tasks such as collecting surrounding environment information. The local memory for image data is therefore limited, storing at most 20 seconds of image data; if the camera device records more than 20 seconds of first-person-view image data from start to stop, the data cannot be saved completely in the local memory. Accordingly, on the basis of recording clips of a preset time length in segments, this embodiment replaces the clip with the longest storage time with the newly recorded clip, so that the clips in the local memory are updated in real time.
Illustratively, a newly recorded clip is stored in the local memory; if the local memory has no free space, the clip with the longest storage time is deleted and the newly recorded clip is stored. The local memory can store at most 4 clips: if 4 clips are already stored, the one with the longest storage time is deleted and the clip just recorded by the camera device is stored. It can be understood that when 4 clips are stored, the recording times of the oldest clip and the newly recorded clip differ by 20 to 25 seconds. When the unmanned aerial vehicle collides with an obstacle ahead and falls from high altitude, the fall from the collision to hitting the ground takes no more than about ten seconds. Therefore, when the camera device records a new clip, the clip with the longest storage time in the local memory was recorded in the flight environment more than 20 seconds earlier; since the camera device recorded normally throughout those 20 seconds, that is, it received no accident notification telling it to stop, the oldest clip records first-person-view environment information from normal flight. Such data does not help analyze the cause of the accident and is invalid, so deleting it does not affect the normal acquisition of accident data.
Replacing the image data with the earliest recording time by the newly recorded clip keeps the clips in the local memory updated in real time, so that the local memory holds the clips from the 15- to 20-second period before the moment the crash accident occurred. These clips record the whole process from the start of the accidental fall to the crash, and analyzing them can determine the cause of the crash accident. It should be noted that after the crash accident is determined, the camera device is controlled to stop recording, and any clip it has not finished recording is still saved to the local memory. Stopping the recording once the crash accident is determined serves two purposes: first, it avoids collecting invalid data after the aircraft has hit the ground; second, it prevents continued recording from causing the clips that record the accidental fall to be mistakenly overwritten and deleted from the local memory.
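The replace-oldest storage policy above amounts to a bounded ring buffer of four clips. A minimal sketch follows; the class and method names are illustrative, and `collections.deque` with `maxlen` is used here simply because it evicts the oldest element automatically, matching the described behavior.

```python
from collections import deque

MAX_CLIPS = 4  # the local memory holds at most four 5-second clips


class ClipBuffer:
    """Fixed-capacity local store: when full, the clip with the longest
    storage time (the oldest) is replaced by the newly recorded one."""

    def __init__(self, capacity=MAX_CLIPS):
        self._clips = deque(maxlen=capacity)

    def save(self, clip):
        # deque with maxlen silently drops the oldest element when full,
        # which is exactly the replace-oldest policy of this embodiment
        self._clips.append(clip)

    def contents(self):
        """Oldest-first list of the currently stored clips."""
        return list(self._clips)
```

With 5-second clips, the buffer always holds roughly the last 15 to 20 seconds of flight, which covers the at-most-ten-second fall.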
In this embodiment, when the crash accident occurs, the local memory holds four clips recording the first-person-view flight environment of the 15 to 20 seconds before the accident. The four clips are merged into a single piece of image data 15 to 20 seconds long and uploaded to the cloud server, so that background staff can obtain the image data from the cloud and perform accident analysis on it directly.
In one embodiment, the remote controller end, other unmanned devices and/or the cloud server are provided with a pre-trained obstacle recognition model, and analyzing the first-person-view clip with this model can determine whether the accident was caused by striking an obstacle. Alternatively, the unmanned aerial vehicle itself is provided with the pre-trained obstacle recognition model: when the crash accident is determined, the first-person-view clips in the local memory are input into the obstacle recognition model, which recognizes them and determines the cause of the accident. After the cause of the accident is determined, it is uploaded through the cellular network and/or the local data link.
In an embodiment, referring to fig. 2, the data acquisition system of the unmanned aerial vehicle further includes a service processing module configured to acquire the surrounding environment information in real time and to save it after receiving the notification sent by the flight control module. When the flight control module determines that the motion state parameters meet the accident occurrence condition, it notifies the service processing module that the unmanned equipment has had an accident. The service processing module then notifies the peripheral information acquisition module to stop collecting; on receiving this notification, the peripheral information acquisition module stops collecting surrounding environment information, so as to avoid collecting invalid data after the aircraft hits the ground, and the service processing module acquires and saves the surrounding environment information. Further, the data acquisition system of the unmanned aerial vehicle also includes a communication module, generally a cellular network module and/or a local data link module. After receiving the accident notification sent by the service processing module, the communication module uploads the surrounding environment information saved by the service processing module, so that background staff can obtain it directly from the cloud and perform accident analysis on it. Even if the communication module cannot transmit data after the crash accident, the surrounding environment information saved in the memory of the service processing module can still be obtained by manual copying.
For example, fig. 3 is a flowchart of another data acquisition method for an unmanned aerial vehicle according to an embodiment of the present application. Referring to fig. 3, after the route task starts, the peripheral information acquisition module begins collecting surrounding environment information, and the service processing module saves it. The flight control module acquires the motor rotation parameter and the acceleration parameter at the current moment to judge whether the unmanned aerial vehicle has had an accident; if not, it acquires the parameters at the next moment and repeats the judgment in a loop. When the accident is determined, the flight control module sets its abnormal flag bit from zero to one. The service processing module reads the state of this flag bit in real time and determines that the unmanned aerial vehicle is in the accident state when the flag bit reads one. The peripheral information acquisition module obtains the state of the unmanned aerial vehicle from the service processing module and decides accordingly whether to stop collecting surrounding environment information: if the obtained state is the accident state, it stops collecting; otherwise it continues obtaining the state from the service processing module and repeats the decision in a loop.
After collection of the surrounding environment information has stopped, the service processing module obtains the surrounding environment information, saves it in its internal memory, and notifies the communication module to upload the surrounding environment information in the local memory so that it is stored in the cloud. Staff can then manually copy the surrounding environment information from the service processing module, or obtain it from the cloud, and analyze the accident accordingly.
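The zero-to-one flag handshake between the flight control module and the service processing module can be sketched as below. The class and function names, and the state strings, are illustrative stand-ins for the modules' real interfaces, which the source does not specify.

```python
class FlightControl:
    """Minimal stand-in for the flight control module's abnormal flag."""

    def __init__(self):
        self.abnormal_flag = 0  # 0 = normal flight, set to 1 on a crash

    def report_crash(self):
        self.abnormal_flag = 1


def poll_device_state(flight_control):
    """Service-processing-module side: read the flag in real time and map
    it to a device state (state names are illustrative)."""
    return "accident" if flight_control.abnormal_flag == 1 else "normal"
```

The peripheral information acquisition module would call `poll_device_state` in its loop and stop collecting as soon as it sees the accident state.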
On the other hand, fig. 4 is a flowchart of yet another data acquisition method for an unmanned aerial vehicle according to an embodiment of the present application. Referring to fig. 4, after the route task starts, the peripheral information acquisition module begins collecting surrounding environment information, and the service processing module saves it. The flight control module judges in a loop, from the motor rotation parameter and the acceleration parameter acquired at each moment, whether the unmanned aerial vehicle has had an accident, and sets its abnormal flag bit from zero to one when the accident is determined. The service processing module reads the flag bit in real time and determines the accident state when the flag bit reads one. The peripheral information acquisition module obtains the state of the unmanned aerial vehicle from the service processing module and stops collecting surrounding environment information when the accident state is obtained, otherwise continuing to poll the state in a loop.
After collection of the surrounding environment information has stopped, the communication module uploads all the surrounding environment information saved by the peripheral information acquisition module to the cloud, and whether the upload succeeds is judged. If the upload succeeds, the cloud stores the surrounding environment information; if it fails, the service processing module saves it instead. Staff can then manually copy the surrounding environment information from the service processing module, or obtain it from the cloud, and analyze the accident accordingly.
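The upload-or-fall-back decision of this variant can be sketched as follows. The hook names and the returned labels are invented for illustration; `OSError` stands in for whatever transport failure the damaged link might raise after a crash, which the source does not specify.

```python
def upload_or_store(info, upload, store_locally):
    """Try the cloud upload first; on failure, keep the data in the
    service processing module's memory for later manual copying.

    upload(info) should return True on success; store_locally(info) is
    the fallback sink. Both are illustrative hooks.
    """
    try:
        if upload(info):
            return "cloud"       # cloud end now stores the data
    except OSError:
        pass                     # link already down after the crash
    store_locally(info)
    return "local"               # recover later by manual copying
```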
On the basis of the above embodiments, the accident data further includes the motion state parameters: the motion state parameters of the unmanned aerial vehicle before the crash accident are saved, so that the cause of the accident can be analyzed from both the pre-crash motion state parameters and the surrounding environment information.
In conclusion, the surrounding environment information and motion state parameters of the unmanned equipment are obtained in real time, the surrounding environment information including video information shot by the camera device of the unmanned equipment; whether the unmanned equipment meets the accident occurrence condition is determined according to the motion state parameters; and when the condition is met, the surrounding environment information is saved for retrieval and analysis. By these technical means, the motion state parameters obtained in real time allow accidents of the unmanned equipment to be monitored in real time, and the surrounding environment information obtained at the time of an accident is saved. Because the surrounding environment information may record what caused the accident, saving it allows accident-related data to be retrieved from it, so that the cause of the accident can be analyzed effectively.
Example two:
On the basis of the foregoing embodiments, fig. 5 is a schematic structural diagram of an unmanned aerial vehicle data acquisition apparatus provided in the second embodiment of the present application. Referring to fig. 5, the data acquisition apparatus for unmanned equipment provided in this embodiment specifically includes: a data acquisition module 21, a state judgment module 22 and a data saving module 23.
The data acquisition module 21 is configured to acquire surrounding environment information and motion state parameters of the unmanned equipment in real time, wherein the surrounding environment information comprises video information shot by an unmanned equipment camera;
a state judgment module 22 configured to determine whether the unmanned equipment satisfies an accident occurrence condition according to the motion state parameter;
and the data saving module 23 is configured to save the surrounding environment information for analysis and acquisition when the unmanned equipment is determined to meet the accident occurrence condition.
On the basis of the above embodiment, the motion state parameters include a motor rotation parameter and an acceleration parameter of the unmanned aerial vehicle, and the state judgment module includes: a condition judging unit configured to determine whether the motor rotation parameter satisfies a first preset condition and whether the acceleration parameter satisfies a second preset condition; the accident determination unit is configured to determine that the unmanned equipment meets the accident occurrence condition when the motor rotation parameter meets a first preset condition and/or the acceleration parameter meets a second preset condition.
On the basis of the above embodiment, the condition judgment unit includes: the calculating subunit is configured to calculate a rotation difference value between the motor rotation parameter acquired at the current moment and the motor rotation parameter acquired at the previous moment, and calculate an acceleration difference value between the acceleration parameter acquired at the current moment and the acceleration parameter acquired at the previous moment; a judging subunit configured to judge whether the rotation difference value and the acceleration difference value exceed the corresponding threshold ranges; a first condition determining subunit configured to determine that the motor rotation parameter satisfies a first preset condition when the rotation difference is greater than a first threshold; a second condition determining subunit configured to determine that the acceleration parameter satisfies a second preset condition when the acceleration difference is greater than a second threshold.
On the basis of the above embodiment, the video information includes image data of a first person main view angle of the unmanned aerial vehicle, and the image data of the first person main view angle is acquired by a camera device arranged on the head of the unmanned aerial vehicle.
On the basis of the above embodiment, the data acquisition module includes: the segmented recording unit is configured to acquire a video segment with a preset time length recorded by the camera in real time, wherein the camera records the video segment when the unmanned equipment is in an operation state.
On the basis of the above embodiment, the data saving module includes: the video saving unit, configured to save the video clips acquired in real time into the local memory; and the video uploading unit, configured to upload the video clips in the local memory to the remote controller end, other unmanned devices and/or the cloud server through a cellular network and/or a local data link.
On the basis of the above embodiment, the video uploading unit includes: a first upload subunit configured to retrieve at least one newly stored video clip from the local storage and upload the video clip.
On the basis of the above-described embodiment, the video saving unit includes: and the replacement storage subunit is configured to delete the video segment with the longest storage time in the local storage and store the currently acquired video segment into the local storage if the local storage has no free space.
On the basis of the above embodiment, the video uploading unit includes: and the second uploading subunit is configured to combine and upload all the video clips in the local storage.
On the basis of the above embodiment, the unmanned aerial vehicle data acquisition apparatus includes: and the accident analysis module is configured to identify the video segments through a preset obstacle identification model and determine the accident occurrence reason of the unmanned equipment.
On the basis of the above embodiment, the state determination module further includes: an operating condition determination unit configured to determine whether a current state of the unmanned device is a job state; and the accident judgment unit is configured to determine whether the motion state parameters meet accident occurrence conditions if the current state is the working state.
This embodiment likewise obtains the surrounding environment information and motion state parameters of the unmanned equipment in real time, the surrounding environment information including video information shot by the camera device of the unmanned equipment; determines whether the unmanned equipment meets the accident occurrence condition according to the motion state parameters; and, when the condition is met, saves the surrounding environment information for retrieval and analysis. By these technical means, accidents of the unmanned equipment are monitored in real time through the motion state parameters, and the surrounding environment information obtained at the time of an accident is saved. Because the surrounding environment information may record what caused the accident, saving it allows accident-related data to be retrieved from it, so that the cause of the accident can be analyzed effectively.
The data acquisition device for the unmanned aerial vehicle provided by the second embodiment of the present application can be used for executing the data acquisition method for the unmanned aerial vehicle provided by the first embodiment of the present application, and has corresponding functions and beneficial effects.
Example three:
Fig. 6 is a schematic structural diagram of an unmanned aerial vehicle according to a third embodiment of the present application. Referring to fig. 6, the unmanned aerial vehicle includes: a processor 31, a storage device 32, a communication device 33, a peripheral information acquisition device 34, an inertial measurement unit 35, and a rotational speed sensor 36. The number of processors 31 in the unmanned aerial vehicle may be one or more, and the number of storage devices 32 may be one or more. The processor 31, the storage device 32, the communication device 33, the peripheral information acquisition device 34, the inertial measurement unit 35, and the rotational speed sensor 36 may be connected by a bus or by other means.
The storage device 32 is a computer-readable storage medium that can be used to store software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the unmanned device data acquisition method according to any embodiment of the present application (for example, the data acquisition module 21, the state judgment module 22, and the data saving module 23 in the unmanned device data acquisition apparatus). The storage device 32 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required by at least one function, and the data storage area may store data created according to use of the device, and the like. Further, the storage device 32 may include high-speed random access memory, and may also include non-volatile memory such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, the storage device 32 may further include memory located remotely from the processor 31 and connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
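The bounded local-storage behavior described in this application (save video clips of a preset length; when the local memory has no free space, delete the clip that has been stored longest; upload the most recently stored clips, or merge and upload all of them) can be sketched as follows. The class name, method names, and a capacity measured in whole clips are hypothetical simplifications for illustration:

```python
from collections import deque

class ClipStore:
    """Minimal sketch of bounded local clip storage: when there is no free
    space, the clip stored longest ago is deleted to make room for the
    newly acquired clip (a ring buffer)."""

    def __init__(self, capacity=4):
        # deque with maxlen discards the oldest item automatically on append.
        self._clips = deque(maxlen=capacity)

    def save(self, clip):
        """Store the currently acquired clip, evicting the oldest if full."""
        self._clips.append(clip)

    def latest(self, n=1):
        """The n most recently stored clips, oldest first."""
        return list(self._clips)[-n:]

    def all_merged(self):
        """All buffered clips in storage order, e.g. for a merged upload."""
        return list(self._clips)
```

A `deque` with `maxlen` gives the oldest-first eviction for free; a real implementation would instead track files on flash storage and free space in bytes.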
The communication device 33 is used to upload the surrounding environment information.
The peripheral information acquisition device 34 is used to collect the surrounding environment information of the unmanned device.
The inertial measurement unit 35 is used to detect the acceleration parameters of the unmanned device.
The rotational speed sensor 36 is used to detect the motor rotation parameters of the unmanned device.
The processor 31 executes various functional applications and data processing of the device by running software programs, instructions and modules stored in the storage device 32, that is, implements the above-described unmanned device data acquisition method.
The unmanned device provided in this embodiment can be used to execute the unmanned device data acquisition method provided in the first embodiment, and has the corresponding functions and beneficial effects.
Example four:
Embodiments of the present application also provide a storage medium containing computer-executable instructions which, when executed by a computer processor, perform a method for unmanned device data acquisition, the method comprising: acquiring surrounding environment information and motion state parameters of the unmanned device in real time, wherein the surrounding environment information comprises video information captured by a camera device of the unmanned device; determining, according to the motion state parameters, whether the unmanned device satisfies an accident occurrence condition; and when the unmanned device is determined to satisfy the accident occurrence condition, saving the surrounding environment information so that it can later be retrieved for analysis.
Storage medium - any of various types of memory devices or storage devices. The term "storage medium" is intended to include: installation media such as CD-ROM, floppy-disk, or tape devices; computer system memory or random access memory such as DRAM, DDR RAM, SRAM, EDO RAM, Rambus RAM, etc.; non-volatile memory such as flash memory, magnetic media (e.g., a hard disk), or optical storage; registers or other similar types of memory elements; and so on. The storage medium may also include other types of memory or combinations thereof. In addition, the storage medium may be located in the first computer system in which the program is executed, or in a different, second computer system connected to the first computer system through a network (such as the internet). The second computer system may provide program instructions to the first computer for execution. The term "storage medium" may also include two or more storage media residing in different locations, for example in different computer systems connected by a network. The storage medium may store program instructions (e.g., embodied as a computer program) executable by one or more processors.
Of course, the computer-executable instructions contained in the storage medium provided in the embodiments of the present application are not limited to the data acquisition method described above; they may also perform related operations in the unmanned device data acquisition method provided in any embodiment of the present application.
The data acquisition apparatus, the storage medium, and the unmanned device provided in the above embodiments may execute the unmanned device data acquisition method provided in any embodiment of the present application; for technical details not described in detail above, reference may be made to that method.
The foregoing is illustrative of the preferred embodiments of the invention and of the technical principles employed. The present application is not limited to the particular embodiments described herein; various obvious changes, rearrangements, and substitutions may be made by those skilled in the art without departing from the scope of the invention. Therefore, although the present application has been described in some detail with reference to the above embodiments, it is not limited to them and may include other equivalent embodiments without departing from its spirit; the scope of the present application is determined by the scope of the appended claims.

Claims (16)

1. An unmanned aerial vehicle data acquisition method, comprising:
acquiring surrounding environment information and motion state parameters of the unmanned equipment in real time, wherein the surrounding environment information comprises video information shot by the unmanned equipment camera device;
determining whether the unmanned equipment meets accident occurrence conditions or not according to the motion state parameters;
and when the unmanned equipment is determined to meet the accident occurrence condition, storing the surrounding environment information for analysis and acquisition.
2. The method for acquiring the data of the unmanned aerial vehicle as claimed in claim 1, wherein the motion state parameters comprise a motor rotation parameter and an acceleration parameter of the unmanned aerial vehicle, and the determining whether the unmanned aerial vehicle meets an accident occurrence condition according to the motion state parameters comprises:
determining whether the motor rotation parameter meets a first preset condition and whether the acceleration parameter meets a second preset condition;
and when the motor rotation parameter meets a first preset condition and/or the acceleration parameter meets a second preset condition, determining that the unmanned equipment meets an accident occurrence condition.
3. The method according to claim 2, wherein the determining whether the motor rotation parameter satisfies a first preset condition and whether the acceleration parameter satisfies a second preset condition comprises:
calculating a rotation difference value between a motor rotation parameter obtained at the current moment and a motor rotation parameter obtained at the previous moment, and calculating an acceleration difference value between an acceleration parameter obtained at the current moment and an acceleration parameter obtained at the previous moment;
judging whether the rotation difference value and the acceleration difference value exceed their corresponding threshold ranges;
when the rotation difference value is larger than a first threshold value, determining that the motor rotation parameter meets a first preset condition;
and when the acceleration difference is larger than a second threshold value, determining that the acceleration parameter meets a second preset condition.
4. The method according to claim 1, wherein the video information includes image data of a first main perspective of the unmanned aerial vehicle, and the image data of the first main perspective is acquired by a camera device disposed on a head of the unmanned aerial vehicle.
5. The unmanned aerial vehicle data acquisition method according to any one of claims 1 to 4, wherein the acquiring, in real time, ambient environment information of the unmanned aerial vehicle comprises:
and acquiring a video clip with a preset time length recorded by the camera device in real time, wherein the camera device records the video clip when the unmanned equipment is in an operating state.
6. The unmanned aerial vehicle data acquisition method of claim 5, wherein the saving the ambient environment information comprises:
storing the video clips acquired in real time into a local memory;
and uploading the video clips in the local memory to a remote controller, another unmanned device, and/or a cloud server through a cellular network and/or a local data link.
7. The method according to claim 6, wherein uploading the ambient environment information to a remote controller, other unmanned devices, and/or a cloud server comprises:
and acquiring at least one newly stored video clip from the local memory and uploading the video clip.
8. The method according to claim 6, wherein the saving the video segments acquired in real time to a local memory comprises:
if the local memory has no free space, deleting the video clip with the longest storage time in the local memory, and storing the currently acquired video clip into the local memory.
9. The method for acquiring the data of the unmanned aerial vehicle as claimed in claim 8, wherein uploading the surrounding environment information to a remote controller, other unmanned aerial vehicle and/or a cloud server comprises:
and merging and uploading all the video clips in the local storage.
10. The unmanned aerial vehicle data acquisition method of any one of claims 1-4, further comprising, after the saving the ambient environment information:
and identifying the video clip through a preset obstacle identification model, and determining the accident occurrence reason of the unmanned equipment.
11. The unmanned aerial vehicle data acquisition method of any one of claims 1-4, further comprising, prior to the determining whether the unmanned aerial vehicle satisfies an incident occurrence condition:
determining whether the current state of the unmanned equipment is a working state;
and if the current state is the operation state, determining whether the motion state parameters meet accident occurrence conditions.
12. An unmanned aerial device data acquisition apparatus, comprising:
the data acquisition module is configured to acquire surrounding environment information and motion state parameters of the unmanned equipment in real time, wherein the surrounding environment information comprises video information shot by the unmanned equipment camera device;
a state judgment module configured to determine whether the unmanned equipment meets an accident occurrence condition according to the motion state parameter;
a data saving module configured to save the ambient environment information for analysis and acquisition when it is determined that the unmanned device satisfies the accident occurrence condition.
13. An unmanned device data acquisition system, comprising a peripheral information acquisition module, a flight control module, and a service processing module, wherein:
the peripheral information acquisition module is used for acquiring the current peripheral environment information of the unmanned equipment;
the flight control module is used for acquiring a motion state parameter of the unmanned equipment, determining whether the unmanned equipment meets an accident occurrence condition according to the motion state parameter, and notifying the service processing module when the unmanned equipment meets the accident occurrence condition;
and the service processing module is used for acquiring the surrounding environment information in real time and storing the surrounding environment information after receiving the notification sent by the flight control module.
14. The unmanned-device data acquisition system of claim 13, further comprising a communication module, wherein:
and the communication module is used for uploading the surrounding environment information stored by the service processing module.
15. An unmanned device, comprising: one or more processors; storage means storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method of unmanned device data acquisition as claimed in any of claims 1-11.
16. A storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform the drone data acquisition method of any one of claims 1-11.
CN202110731486.6A 2021-06-29 2021-06-29 Unmanned equipment data acquisition method, device and system, unmanned equipment and storage medium Pending CN113496559A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110731486.6A CN113496559A (en) 2021-06-29 2021-06-29 Unmanned equipment data acquisition method, device and system, unmanned equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113496559A true CN113496559A (en) 2021-10-12

Family

ID=77998164

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110731486.6A Pending CN113496559A (en) 2021-06-29 2021-06-29 Unmanned equipment data acquisition method, device and system, unmanned equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113496559A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202093582U (en) * 2011-05-23 2011-12-28 五邑大学 Digital traveling vehicle image recorder for defining liabilities of traffic accidents
CN102956045A (en) * 2011-08-19 2013-03-06 徐菲 Event trigger based vehicle monitoring, recording and prompting device and method thereof
CN103440689A (en) * 2013-06-24 2013-12-11 开平市中铝实业有限公司 Full-direction driving digital image recorder
CN104507799A (en) * 2014-04-28 2015-04-08 深圳市大疆创新科技有限公司 Protection control method of air vehicle, device and air vehicle
CN108120476A (en) * 2017-12-15 2018-06-05 中国电子产品可靠性与环境试验研究所 Unmanned plane actual time safety prior-warning device
CN110612252A (en) * 2018-01-05 2019-12-24 深圳市大疆创新科技有限公司 Unmanned aerial vehicle fault detection method and device and movable platform
CN110979645A (en) * 2019-12-18 2020-04-10 新疆联海创智信息科技有限公司 Post-fault emergency control device and method for unmanned aerial vehicle
CN112581652A (en) * 2020-11-25 2021-03-30 宝能(广州)汽车研究院有限公司 Monitoring control method, monitoring control system and vehicle
CN112937888A (en) * 2019-12-10 2021-06-11 广州极飞科技股份有限公司 Method and device for determining fault reason of unmanned equipment
CN112947509A (en) * 2019-12-10 2021-06-11 广州极飞科技股份有限公司 Method and device for determining fault reason of unmanned equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG Wei et al., "Automotive Electrical Construction and Maintenance", Beihang University Press, page 161 *

Similar Documents

Publication Publication Date Title
US11745876B2 (en) Method for adaptive mission execution on an unmanned aerial vehicle
CN110069071B (en) Unmanned aerial vehicle navigation method and device, storage medium and electronic equipment
US10745127B2 (en) Systems and methods for execution of recovery actions on an unmanned aerial vehicle
US10218893B2 (en) Image capturing system for shape measurement of structure, method of capturing image of structure for shape measurement of structure, on-board control device, remote control device, program, and storage medium
TWI585006B (en) Unmanned aerial vehicle flying method and unmaned aerial vehicle flying system
WO2018072133A1 (en) Method for controlling mobile device, control system and mobile device
US20180141656A1 (en) Method and system for monitoring operational status of drone
WO2016172251A1 (en) Systems and methods for execution of recovery actions on an unmanned aerial vehicle
WO2021237618A1 (en) Capture assistance method, ground command platform, unmanned aerial vehicle, system, and storage medium
JP6726224B2 (en) Management device and flight device management method
CN113496559A (en) Unmanned equipment data acquisition method, device and system, unmanned equipment and storage medium
KR102192686B1 (en) Drone controlling system for checking of facility, and method for the same
KR20210016678A (en) Apparatus and method for controlling a unmanned aerial vehicle
CN115793682A (en) Bridge intelligent inspection method and inspection system based on unmanned aerial vehicle
CN112837497B (en) Alarm triggering system based on target position analysis
CN114217631A (en) Intelligent patrol system and method based on unmanned aerial vehicle and storage medium
CN110322701B (en) Traffic violation judgment method based on cloud platform
CN114167884A (en) Unmanned aerial vehicle control method and device, computer equipment and storage medium
WO2020107465A1 (en) Control method, unmanned aerial vehicle and computer-readable storage medium
CN113917935B (en) Unmanned aerial vehicle nest control method and unmanned aerial vehicle nest
CN116736747B (en) Unmanned aerial vehicle emergency treatment method and device
KR102587311B1 (en) Electronic apparatus and method for searching distress, unmanned aerial vehicle, computer program
CN112666961B (en) Unmanned aerial vehicle crash detection method, system, device, unmanned aerial vehicle and storage medium
US20240152162A1 (en) Control method and device of unmanned aerial vehicle system, unmanned aerial vehicle system and storage medium
JP7137034B2 (en) Management device, flight management method, program and photography system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20211012