CN111487993A - Information acquisition method and device, storage medium, automatic pilot and unmanned aerial vehicle - Google Patents

Information acquisition method and device, storage medium, automatic pilot and unmanned aerial vehicle Download PDF

Info

Publication number
CN111487993A
CN111487993A
Authority
CN
China
Prior art keywords
target object
position information
information
unmanned aerial vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010339942.8A
Other languages
Chinese (zh)
Inventor
李泽伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Yifei Zhilian Technology Co ltd
Original Assignee
Chongqing Yifei Zhilian Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Yifei Zhilian Technology Co ltd filed Critical Chongqing Yifei Zhilian Technology Co ltd
Priority to CN202010339942.8A priority Critical patent/CN111487993A/en
Publication of CN111487993A publication Critical patent/CN111487993A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application provides an information acquisition method, an information acquisition device, a storage medium, an autopilot and an unmanned aerial vehicle, and relates to the technical field of information. A tracking visual angle at which an image acquisition device tracks a target object, and a current relative distance corresponding to the target object, are acquired; second position information of the target object is then obtained according to the tracking visual angle, the current relative distance and first position information of the unmanned aerial vehicle; and target moving speed information of the target object is obtained based on the second position information and historical position information corresponding to the target object. Compared with the prior art, the unmanned aerial vehicle can thus acquire information such as the position and moving speed of the target object independently while tracking it, without relying on other devices, which improves the autonomous capability of the unmanned aerial vehicle.

Description

Information acquisition method and device, storage medium, automatic pilot and unmanned aerial vehicle
Technical Field
The application relates to the technical field of information, in particular to an information acquisition method, an information acquisition device, a storage medium, an automatic pilot and an unmanned aerial vehicle.
Background
In some possible scenarios, a drone may carry, for example, a pod or a camera as an image capture device to enable tracking of a target object.
However, when tracking a target object, the drone cannot independently acquire information such as the position or moving speed of the target object, and can only obtain such information by means of other devices.
Disclosure of Invention
The application aims to provide an information acquisition method and device, a storage medium, an automatic pilot and an unmanned aerial vehicle, so that the unmanned aerial vehicle can independently acquire information such as the position and the moving speed of a target object, and the autonomous ability of the unmanned aerial vehicle is improved.
In order to achieve the purpose, the technical scheme adopted by the application is as follows:
in a first aspect, the present application provides an information acquisition method, which is applied to an autopilot in an unmanned aerial vehicle, wherein the unmanned aerial vehicle is also provided with an image acquisition device; the method comprises the following steps:
acquiring a tracking visual angle when the image acquisition equipment tracks a target object and a current relative distance corresponding to the target object;
obtaining second position information of the target object according to the tracking visual angle, the current relative distance and the first position information of the unmanned aerial vehicle;
and obtaining target moving speed information of the target object based on the second position information and historical position information corresponding to the target object.
In a second aspect, the present application provides an information acquisition device, which is applied to an autopilot in an unmanned aerial vehicle, wherein the unmanned aerial vehicle is also provided with an image acquisition device; the device comprises:
the execution module is used for acquiring a tracking visual angle when the image acquisition equipment tracks a target object and a current relative distance corresponding to the target object;
the processing module is used for obtaining second position information of the target object according to the tracking visual angle, the current relative distance and the first position information of the unmanned aerial vehicle;
the processing module is further configured to obtain target moving speed information of the target object based on the second position information and historical position information corresponding to the target object.
In a third aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the information acquisition method described above.
In a fourth aspect, the present application provides an autopilot that includes a memory for storing one or more programs; a processor; the one or more programs, when executed by the processor, implement the information acquisition method described above.
In a fifth aspect, the present application provides an unmanned aerial vehicle, which is equipped with the above-mentioned autopilot.
According to the information acquisition method, the information acquisition device, the storage medium, the autopilot and the unmanned aerial vehicle provided by the application, the tracking visual angle at which the image acquisition device tracks the target object, and the current relative distance corresponding to the target object, are acquired; second position information of the target object is then obtained according to the tracking visual angle, the current relative distance and the first position information of the unmanned aerial vehicle; and target moving speed information of the target object is obtained based on the second position information and the historical position information corresponding to the target object. Compared with the prior art, the unmanned aerial vehicle can thus, when tracking the target object, acquire information such as the position and moving speed of the target object independently, without relying on other devices, improving the autonomous capability of the unmanned aerial vehicle.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly explain the technical solutions of the present application, the drawings needed for the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also derive other related drawings from these drawings without inventive effort.
Fig. 1 shows a schematic application scenario diagram of an information acquisition method provided in the present application;
FIG. 2 is a block schematic diagram of an autopilot provided herein;
FIG. 3 is a schematic flow chart diagram of an information acquisition method provided by the present application;
FIG. 4 shows a schematic flow diagram of sub-steps of step 201 of FIG. 3;
FIG. 5 shows a schematic flow diagram of sub-steps of step 203 in FIG. 3;
FIG. 6 shows a schematic flow diagram of sub-steps of step 205 of FIG. 3;
fig. 7 shows a schematic block diagram of an information acquisition apparatus provided in the present application.
In the figure: 100-autopilot; 101-a memory; 102-a processor; 103-a communication interface; 300-an information acquisition device; 301-an execution module; 302-processing module.
Detailed Description
To make the purpose, technical solutions and advantages of the present application clearer, the technical solutions in the present application will be clearly and completely described below with reference to the accompanying drawings in some embodiments of the present application, and it is obvious that the described embodiments are some, but not all embodiments of the present application. The components of the present application, as generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments obtained by a person of ordinary skill in the art based on a part of the embodiments in the present application without any creative effort belong to the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a schematic application scenario diagram illustrating the information acquisition method provided by the present application; in some application scenarios, equipment such as a pod or a camera can be mounted on the unmanned aerial vehicle as a load, so that image and video information of a target object on the ground can be acquired by using the load from a sky viewing angle, and tasks such as area security or target tracking can be performed.
In an application scenario as shown in fig. 1, a positioning device such as a GPS (Global Positioning System) or RTK (Real-Time Kinematic) device may be configured on the target object, so as to obtain information such as the moving speed and position of the target object.
The above-mentioned manner of obtaining information such as moving speed, position, etc. of the target object is generally only applicable to the target object which is configured with the positioning device in advance, such as the transfer device in the freight logistics scene; however, in a scenario such as that shown in fig. 1, a drone often tracks a target object without a positioning device, such as an intrusion target or the like tracked when performing a regional security task, so that the drone cannot independently acquire information such as the above-mentioned moving speed, position, and the like of the target object.
In addition, when a pod is used as the load with which the unmanned aerial vehicle tracks the target object, the pod adjusts its own attitude in accordance with the flight attitude of the unmanned aerial vehicle so that the target object stays at the center of the pod picture; the pod therefore has no other reference object from which it could calculate information such as the moving speed and position of the target object.
Therefore, based on the above drawbacks, the present application provides a possible implementation manner as follows: acquiring a tracking visual angle when the image acquisition equipment tracks the target object and a current relative distance corresponding to the target object; therefore, second position information of the target object is obtained according to the tracking visual angle, the current relative distance and the first position information of the unmanned aerial vehicle; obtaining target moving speed information of the target object based on the second position information and historical position information corresponding to the target object; so that the unmanned aerial vehicle can independently acquire information such as the position and the moving speed of the target object, and the autonomous ability of the unmanned aerial vehicle is improved.
Referring to fig. 2, fig. 2 shows a schematic block diagram of an autopilot 100 provided herein, where the autopilot 100 may include a memory 101, a processor 102 and a communication interface 103, and the memory 101, the processor 102 and the communication interface 103 are electrically connected to each other directly or indirectly to implement data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines.
The memory 101 may be used to store software programs and modules, such as program instructions/modules corresponding to the information acquisition apparatus provided in the present application, and the processor 102 executes the software programs and modules stored in the memory 101 to execute various functional applications and data processing, thereby executing the steps of the information acquisition method provided in the present application. The communication interface 103 may be used for communicating signaling or data with other node devices.
The Memory 101 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like.
The processor 102 may be an integrated circuit chip having signal processing capabilities. The processor 102 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
It will be appreciated that the configuration shown in fig. 2 is merely illustrative and that the autopilot 100 may include more or fewer components than shown in fig. 2 or may have a different configuration than shown in fig. 2. The components shown in fig. 2 may be implemented in hardware, software, or a combination thereof.
Based on the autopilot 100 of the above example, the present application also provides an unmanned aerial vehicle (not shown) equipped with the autopilot 100.
The following describes an exemplary information acquisition method provided by the present application, taking the autopilot 100 shown in fig. 2 as an exemplary execution subject, where the autopilot 100 is mounted on an unmanned aerial vehicle, and the unmanned aerial vehicle is also mounted with an image capturing device as a load, and the image capturing device may be, for example, the pod or the camera described above.
Referring to fig. 3, fig. 3 shows a schematic flow chart of an information obtaining method provided in the present application, and as a possible implementation manner, the information obtaining method may include the following steps:
step 201, acquiring a tracking view angle when an image acquisition device tracks a target object, and a current relative distance corresponding to the target object;
step 203, obtaining second position information of the target object according to the tracking visual angle, the current relative distance and the first position information of the unmanned aerial vehicle;
step 205, obtaining target moving speed information of the target object based on the second position information and the historical position information corresponding to the target object.
In an embodiment, when the autopilot acquires the information of the target object, a tracking view angle when the image capturing device tracks the target object and a current relative distance corresponding to the target object may be acquired first.
Taking a pod as the image acquisition device as an example, the tracking visual angle may be the pitch angle of the pod when the pod tracks the target object and keeps it at the midpoint of the pod picture; in addition, as a possible implementation, the tracking viewing angle may be measured by a device such as an angle sensor provided on the image capture device and sent to the autopilot.
Moreover, it should be noted that the current relative distance corresponding to the target object may be a measured relative distance between the image acquisition device and the target object, or a measured relative distance between the unmanned aerial vehicle and the target object; for example, the current relative distance corresponding to the target object may be measured by a laser range finder or an infrared range finding module provided on the image acquisition device or the unmanned aerial vehicle.
Of course, it is understood that in some other possible scenarios of the present application, the current relative distance may also be obtained by comprehensive weighting; for example, the current relative distance may be obtained by taking the relative distance between the image capturing device and the target object as a first relative distance, taking the relative distance between the drone and the target object as a second relative distance, and performing weighted summation on the first relative distance and the second relative distance.
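As a minimal illustration of the weighted summation just described, such a fusion might look like the following sketch; the function name and the equal default weights are assumptions for illustration, not part of the application:

    # Hedged sketch: fuse the two measured distances by weighted summation.
    # The 0.5 / 0.5 default weights are an assumption; in practice they would
    # be tuned to the relative accuracy of the two rangefinders.
    def fuse_relative_distance(first_dist: float, second_dist: float,
                               w_first: float = 0.5, w_second: float = 0.5) -> float:
        # Weighted sum of the device-to-target and drone-to-target distances.
        return w_first * first_dist + w_second * second_dist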
Then, the autopilot calculates, according to the acquired tracking visual angle and the current relative distance corresponding to the target object, in combination with the first position information of the unmanned aerial vehicle, to obtain the second position information of the target object; as a possible implementation, the first position information may be measured by devices such as a GPS or an Inertial Measurement Unit (IMU) provided on the drone, for example, to obtain information such as the measured speed, acceleration, heading, angular velocity, and coordinates of the drone in the terrestrial coordinate system.
Next, the autopilot may obtain target movement speed information of the target object based on the second position information obtained as described above and the historical position information corresponding to the target object.
It should be noted that the historical position information may be the position information of the target object at a historical time node. As a possible implementation, the historical position information corresponding to the target object may be stored locally in the autopilot, for example in the form of an array. The autopilot may execute the information acquisition method provided by the present application at a set period T, so that each time second position information is obtained, it is stored to update the historical position information corresponding to the target object. Alternatively, the autopilot may not run at a set period T, but instead, every time second position information is obtained, pack the obtained second position information together with the current time and store the pair in the array, thereby updating the historical position information. As a further alternative, an intermediate variable may be set to record the historical position information: each time the latest second position information is obtained and the target moving speed information of the target object is computed in step 205, the autopilot packs the latest second position information with the current time node and overwrites the intermediate variable, thereby storing the latest record. A sketch of this update scheme is given below.
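The following sketch illustrates the timestamped-array variant described above; the class layout, the names and the bounded history length are assumptions for illustration, not the application's own code:

    import time
    from collections import deque

    class PositionHistory:
        # Illustrative store of (timestamp, position) records for the target object.
        def __init__(self, maxlen: int = 100):
            # A bounded deque stands in for the "array" mentioned above;
            # the maximum length of 100 records is an assumption.
            self._records = deque(maxlen=maxlen)

        def update(self, second_position):
            # Pack the latest second position info with the current time and store it.
            self._records.append((time.time(), second_position))

        def latest(self):
            # Return the most recent (timestamp, position) record, or None if empty.
            return self._records[-1] if self._records else None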
Based on this design, the tracking visual angle at which the image acquisition device tracks the target object, and the current relative distance corresponding to the target object, are acquired; second position information of the target object is then obtained according to the tracking visual angle, the current relative distance and the first position information of the unmanned aerial vehicle; and target moving speed information of the target object is obtained based on the second position information and the historical position information corresponding to the target object. Compared with the prior art, the unmanned aerial vehicle can thus, when tracking the target object, acquire information such as the position and moving speed of the target object independently, without relying on other devices, improving the autonomous capability of the unmanned aerial vehicle.
It should be noted that, in the foregoing implementation, the tracking viewing angle of the image capture device is measured by devices such as an angle sensor disposed on the image capture device, which adds extra equipment overhead.
However, in a flight scenario such as that of a drone, inertial elements such as an IMU or a Hall sensor generally need to be arranged on the drone and on the image acquisition device anyway, for auxiliary navigation and the like.
Therefore, in some other possible implementations of the present application, in order to save the equipment overhead of obtaining the tracking viewing angle, the inertial elements such as an IMU or a Hall sensor already arranged on the image acquisition device may be reused, in combination with a conversion algorithm, to obtain the tracking viewing angle.
For example, referring to fig. 4 on the basis of fig. 3, fig. 4 shows a schematic flow chart of sub-steps of step 201 in fig. 3, as a possible implementation manner, step 201 may include the following sub-steps when acquiring a tracking perspective:
step 201-1, acquiring the relative attitude angle of the image acquisition device when tracking the target object;
step 201-2, converting the relative attitude angle to obtain the tracking viewing angle of the image acquisition device in the terrestrial coordinate system.
In an embodiment, the IMU or an inertial element such as a Hall sensor provided on the image capture device as in the above example may be used to measure the relative attitude angle of the image capture device when tracking the target object, which represents the attitude angle of the image capture device relative to the drone.
Likewise, inertial elements such as an IMU can also be provided on the drone, so that the inertial coordinate system of the image acquisition device can be converted into the inertial coordinate system of the drone.
For example, the autopilot may receive the parameter information transmitted by the IMUs disposed on the image acquisition device and on the drone respectively, so as to calculate the rotation matrix converting the inertial coordinate system of the image acquisition device to the inertial coordinate system of the drone, and the rotation matrix converting the inertial coordinate system of the drone to the terrestrial coordinate system; then, by multiplying these two rotation matrices, the rotation matrix converting the inertial coordinate system of the image acquisition device to the terrestrial coordinate system can be obtained.
Illustratively, the operational formula may satisfy the following:
Cei = Cep * Cpi
where Cei denotes a rotation matrix to which the inertial coordinate system of the image capturing device is converted to the terrestrial coordinate system, Cep denotes a rotation matrix to which the inertial coordinate system of the drone is converted to the terrestrial coordinate system, and Cpi denotes a rotation matrix to which the inertial coordinate system of the image capturing device is converted to the inertial coordinate system of the drone.
Then, when the autopilot executes step 201, the obtained relative attitude angle of the image capturing device may be converted into the terrestrial coordinate system by using the rotation matrix Cei obtained by the above calculation, so as to obtain the tracking viewing angle of the image capturing device in the terrestrial coordinate system.
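As a minimal sketch of this frame conversion, assuming each rotation matrix is available as a 3x3 numpy array built from the respective IMU readings (the function names and the direction-vector formulation are assumptions for illustration):

    import numpy as np

    def camera_to_earth_rotation(C_ep: np.ndarray, C_pi: np.ndarray) -> np.ndarray:
        # Cei = Cep * Cpi: compose the drone-to-earth rotation with the
        # camera-to-drone rotation.
        return C_ep @ C_pi

    def direction_in_earth_frame(C_ei: np.ndarray, v_camera: np.ndarray) -> np.ndarray:
        # Rotate a tracking direction expressed in the camera frame into the
        # terrestrial coordinate system; the tracking viewing angle can then
        # be read off this earth-frame direction vector.
        return C_ei @ v_camera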
In this way, the relative attitude angle of the image acquisition device when tracking the target object is measured with inertial elements that are already provided, and then converted into the terrestrial coordinate system to obtain the tracking viewing angle of the image acquisition device; since no additional measurement device needs to be arranged, the equipment overhead of acquiring the tracking viewing angle of the image acquisition device is saved.
In addition, referring to fig. 5 on the basis of fig. 3, fig. 5 shows a schematic flow chart of the sub-steps of step 203 in fig. 3, and as a possible implementation, step 203 may include the following sub-steps:
step 203-1, obtaining relative position information between the target object and the unmanned aerial vehicle based on the tracking visual angle and the current relative distance;
and step 203-2, superimposing the relative position information on the first position information of the unmanned aerial vehicle to obtain second position information of the target object.
In an embodiment, when performing step 203, the autopilot may first calculate and obtain relative position information between the target object and the drone in the terrestrial coordinate system based on the tracking viewing angle and the current relative distance.
In this way, the autopilot may combine the first position information of the drone and superimpose the above-mentioned relative position information on the first position information, thereby calculating and obtaining the second position information of the target object.
For example, as a possible implementation manner, the formula for calculating the second position information by the autopilot may satisfy the following:
Latti_target=Latti_plane+dist*cos(pitch_pod)*cos(yaw_plane)/Lat2Meter
Longi_target=Longi_plane+dist*cos(pitch_pod)*sin(yaw_plane)/Lon2Meter
Height_target=Height_plane-dist*sin(pitch_pod)
where Latti_target, Longi_target and Height_target respectively represent the latitude, longitude and altitude of the target object in the second position information; Latti_plane, Longi_plane and Height_plane respectively represent the latitude, longitude and altitude of the unmanned aerial vehicle in the first position information; dist represents the current relative distance; yaw_plane represents the heading of the unmanned aerial vehicle in the first position information; pitch_pod represents the tracking viewing angle; and Lat2Meter and Lon2Meter respectively represent the unit conversion values of latitude and longitude.
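A minimal sketch implementing the three formulas above, with angles assumed to be in radians and Lat2Meter / Lon2Meter taken from the conversion values discussed below (the function name is an assumption):

    import math

    def target_position(latti_plane, longi_plane, height_plane,
                        dist, pitch_pod, yaw_plane, lat2meter, lon2meter):
        # Superimpose the relative position on the drone's first position info.
        # Angles are assumed to be in radians; lat2meter / lon2meter convert
        # meters of ground distance into degrees of latitude / longitude.
        latti_target = latti_plane + dist * math.cos(pitch_pod) * math.cos(yaw_plane) / lat2meter
        longi_target = longi_plane + dist * math.cos(pitch_pod) * math.sin(yaw_plane) / lon2meter
        height_target = height_plane - dist * math.sin(pitch_pod)
        return latti_target, longi_target, height_target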
Of course, it can be understood that in the foregoing implementation, when the autopilot executes step 203, the calculated relative position information is superimposed directly on the first position information of the unmanned aerial vehicle; in some other possible implementations of the present application, calculation error may also be taken into account, with corresponding distance error information configured for each interval of relative distance, so that when step 203-2 is executed, the distance error information corresponding to the current relative distance can also be superimposed, improving the accuracy of the second position information.
In addition, it should be noted that the unit conversion values Lat2Meter and Lon2Meter can be set as the conversions between degrees of latitude/longitude and meters, respectively, and their specific values can depend on the latitude of the drone at the current time.
For example, in some possible scenarios the flight area of the unmanned aerial vehicle is small and the latitude varies little, so the influence of latitude variation on the values of Lat2Meter and Lon2Meter can be ignored; with the semi-major axis of the reference ellipsoid used for the radii of curvature of the meridian and the prime vertical, Lat2Meter and Lon2Meter can be taken according to the following formulas:
Lat2Meter=111699.749-1132.978*cos(Latti_plane*Pi/180)
Lon2Meter=111321.543*cos(Latti_plane*Pi/180)
where Lat2Meter and Lon2Meter respectively indicate the unit conversion values of latitude and longitude, Latti_plane indicates the latitude of the unmanned aerial vehicle, and Pi indicates the circular constant π.
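Under the small-flight-area assumption above, these conversion values might be computed as in the following sketch (the degree-based input convention follows the Pi/180 factor in the formulas; the function name is an assumption):

    import math

    def lat_lon_conversion(latti_plane_deg: float):
        # latti_plane_deg is the drone's latitude in degrees; the constants
        # are taken directly from the formulas above.
        lat2meter = 111699.749 - 1132.978 * math.cos(math.radians(latti_plane_deg))
        lon2meter = 111321.543 * math.cos(math.radians(latti_plane_deg))
        return lat2meter, lon2meter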
And, referring to fig. 6 on the basis of fig. 3, fig. 6 shows a schematic flow chart of sub-steps of step 205 in fig. 3, as a possible implementation, step 205 may include the following sub-steps:
step 205-1, performing difference processing on the second position information and the historical position information to obtain initial moving speed information of the target object;
and step 205-2, filtering the initial moving speed information to obtain target moving speed information.
In an embodiment, after the autopilot obtains the second position information of the target object through step 203, the second position information and the historical position information may be processed by position differencing; that is, the initial moving speed information of the target object is obtained by dividing the position difference of the target object between the current time and the historical time by the corresponding time difference.
However, the differencing introduces considerable noise; therefore, after obtaining the initial moving speed information, the autopilot can filter it with a low-pass filter or a Kalman filter to obtain the target moving speed information of the target object. Thus, through the implementation provided by the present application, the autopilot can autonomously acquire information such as the position and moving speed of the target object without relying on other devices, improving the autonomous capability of the unmanned aerial vehicle.
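A minimal sketch of this difference-then-filter step, using a first-order exponential low-pass filter to stand in for the "low-pass filter or Kalman filter" mentioned above (the smoothing factor alpha and the tuple-based state handling are assumptions):

    def estimate_velocity(pos_now, t_now, pos_prev, t_prev,
                          v_filtered_prev, alpha=0.2):
        # pos_now / pos_prev are position tuples, t_now / t_prev their
        # timestamps, v_filtered_prev the previous filtered velocity tuple.
        dt = t_now - t_prev
        v_raw = tuple((p1 - p0) / dt for p1, p0 in zip(pos_now, pos_prev))
        # First-order low-pass filter: y[k] = alpha * x[k] + (1 - alpha) * y[k-1]
        return tuple(alpha * vr + (1 - alpha) * vf
                     for vr, vf in zip(v_raw, v_filtered_prev))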
In addition, referring to fig. 7 based on the same inventive concept as the above-mentioned information acquisition method provided in the present application, fig. 7 shows a schematic block diagram of an information acquisition apparatus 300 provided in the present application, where the information acquisition apparatus 300 can be applied to, for example, an autopilot shown in fig. 2, and the information acquisition apparatus 300 can include an execution module 301 and a processing module 302. Wherein:
the execution module 301 is configured to acquire a tracking view angle when the image acquisition device tracks the target object, and a current relative distance corresponding to the target object;
the processing module 302 is configured to obtain second position information of the target object according to the tracking view angle, the current relative distance, and the first position information of the unmanned aerial vehicle;
the processing module 302 is further configured to obtain target moving speed information of the target object based on the second position information and historical position information corresponding to the target object.
Optionally, as a possible implementation manner, when obtaining the second position information of the target object according to the tracking perspective, the current relative distance, and the first position information of the drone, the processing module 302 is specifically configured to:
obtaining relative position information between the target object and the unmanned aerial vehicle based on the tracking visual angle and the current relative distance;
and superposing the relative position information on the first position information of the unmanned aerial vehicle to obtain second position information of the target object.
Optionally, as a possible implementation manner, the formula for calculating the second position information satisfies the following:
Latti_target=Latti_plane+dist*cos(pitch_pod)*cos(yaw_plane)/Lat2Meter
Longi_target=Longi_plane+dist*cos(pitch_pod)*sin(yaw_plane)/Lon2Meter
Height_target=Height_plane-dist*sin(pitch_pod)
where Latti_target, Longi_target and Height_target respectively represent the latitude, longitude and altitude of the target object in the second position information; Latti_plane, Longi_plane and Height_plane respectively represent the latitude, longitude and altitude of the unmanned aerial vehicle in the first position information; dist represents the current relative distance; yaw_plane represents the heading of the unmanned aerial vehicle in the first position information; pitch_pod represents the tracking viewing angle; and Lat2Meter and Lon2Meter respectively represent the unit conversion values of latitude and longitude.
Optionally, as a possible implementation manner, when obtaining the target moving speed information of the target object based on the second position information and the historical position information corresponding to the target object, the processing module 302 is specifically configured to:
carrying out difference processing on the second position information and the historical position information to obtain initial moving speed information of the target object;
and filtering the initial moving speed information to obtain target moving speed information.
Optionally, as a possible implementation manner, when acquiring a tracking view angle when the image capturing device tracks the target object, the executing module 301 is specifically configured to:
acquiring the relative attitude angle of the image acquisition device when tracking the target object; the relative attitude angle represents the attitude angle of the image acquisition device relative to the unmanned aerial vehicle;
and converting the relative attitude angle to obtain a tracking visual angle of the image acquisition equipment under the terrestrial coordinate system.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to some embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in some embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to perform all or part of the steps of the method according to some embodiments of the present application. And the aforementioned storage medium includes: u disk, removable hard disk, read only memory, random access memory, magnetic or optical disk, etc. for storing program codes.
The above description is only a few examples of the present application and is not intended to limit the present application, and those skilled in the art will appreciate that various modifications and variations can be made in the present application. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.

Claims (10)

1. An information acquisition method is characterized in that the method is applied to an automatic pilot in an unmanned aerial vehicle, and the unmanned aerial vehicle is also provided with an image acquisition device; the method comprises the following steps:
acquiring a tracking visual angle when the image acquisition equipment tracks a target object and a current relative distance corresponding to the target object;
obtaining second position information of the target object according to the tracking visual angle, the current relative distance and the first position information of the unmanned aerial vehicle;
and obtaining target moving speed information of the target object based on the second position information and historical position information corresponding to the target object.
2. The method of claim 1, wherein the step of obtaining second position information of the target object according to the tracking visual angle, the current relative distance and the first position information of the unmanned aerial vehicle comprises:
obtaining relative position information between the target object and the unmanned aerial vehicle based on the tracking visual angle and the current relative distance;
superimposing the relative position information on the first position information of the unmanned aerial vehicle to obtain the second position information of the target object.
3. The method of claim 2, wherein the formula for calculating the second location information satisfies the following:
Latti_target=Latti_plane+dist*cos(pitch_pod)*cos(yaw_plane)/Lat2Meter
Longi_target=Longi_plane+dist*cos(pitch_pod)*sin(yaw_plane)/Lon2Meter
Height_target=Height_plane-dist*sin(pitch_pod)
wherein Latti_target, Longi_target and Height_target respectively represent the latitude, longitude and altitude of the target object in the second position information; Latti_plane, Longi_plane and Height_plane respectively represent the latitude, longitude and altitude of the unmanned aerial vehicle in the first position information; dist represents the current relative distance; yaw_plane represents the heading of the unmanned aerial vehicle in the first position information; pitch_pod represents the tracking visual angle; and Lat2Meter and Lon2Meter respectively represent the unit conversion values of latitude and longitude.
4. The method of claim 1, wherein obtaining the target moving speed information of the target object based on the second position information and historical position information corresponding to the target object comprises:
performing difference processing on the second position information and the historical position information to obtain initial moving speed information of the target object;
and filtering the initial moving speed information to obtain the target moving speed information.
5. The method of claim 1, wherein the step of acquiring the tracking visual angle when the image acquisition device tracks the target object comprises:
acquiring the relative attitude angle of the image acquisition device when tracking the target object; wherein the relative attitude angle represents the attitude angle of the image acquisition device relative to the unmanned aerial vehicle;
and converting the relative attitude angle to obtain the tracking visual angle of the image acquisition equipment in a terrestrial coordinate system.
6. An information acquisition device is characterized in that the information acquisition device is applied to an automatic pilot in an unmanned aerial vehicle, and the unmanned aerial vehicle is also hung with an image acquisition device; the device comprises:
the execution module is used for acquiring a tracking visual angle when the image acquisition equipment tracks a target object and a current relative distance corresponding to the target object;
the processing module is used for obtaining second position information of the target object according to the tracking visual angle, the current relative distance and the first position information of the unmanned aerial vehicle;
the processing module is further configured to obtain target moving speed information of the target object based on the second position information and historical position information corresponding to the target object.
7. The apparatus of claim 6, wherein the processing module, when obtaining the second position information of the target object according to the tracking visual angle, the current relative distance and the first position information of the unmanned aerial vehicle, is specifically configured to:
obtain relative position information between the target object and the unmanned aerial vehicle based on the tracking visual angle and the current relative distance;
superimpose the relative position information on the first position information of the unmanned aerial vehicle to obtain the second position information of the target object.
8. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-5.
9. An autopilot, comprising:
a memory for storing one or more programs;
a processor;
the one or more programs, when executed by the processor, implement the method of any of claims 1-5.
10. An unmanned aerial vehicle, characterized in that the unmanned aerial vehicle is equipped with an autopilot as claimed in claim 9.
CN202010339942.8A 2020-04-26 2020-04-26 Information acquisition method and device, storage medium, automatic pilot and unmanned aerial vehicle Pending CN111487993A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010339942.8A CN111487993A (en) 2020-04-26 2020-04-26 Information acquisition method and device, storage medium, automatic pilot and unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010339942.8A CN111487993A (en) 2020-04-26 2020-04-26 Information acquisition method and device, storage medium, automatic pilot and unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
CN111487993A (en) 2020-08-04

Family

ID=71795620

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010339942.8A Pending CN111487993A (en) 2020-04-26 2020-04-26 Information acquisition method and device, storage medium, automatic pilot and unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN111487993A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112306096A (en) * 2020-11-04 2021-02-02 苏州臻迪智能科技有限公司 Unmanned aerial vehicle automatic following method, system, equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106682572A (en) * 2016-10-12 2017-05-17 纳恩博(北京)科技有限公司 Target tracking method, target tracking system and first electronic device
US20170300759A1 (en) * 2016-03-03 2017-10-19 Brigham Young University Automated multiple target detection and tracking system
CN107992068A (en) * 2017-11-29 2018-05-04 天津聚飞创新科技有限公司 Method for tracking target, device and aircraft
CN108680143A (en) * 2018-04-27 2018-10-19 南京拓威航空科技有限公司 Object localization method, device based on long-distance ranging and unmanned plane
CN109596118A (en) * 2018-11-22 2019-04-09 亮风台(上海)信息科技有限公司 It is a kind of for obtaining the method and apparatus of the spatial positional information of target object
CN110570463A (en) * 2019-09-11 2019-12-13 深圳市道通智能航空技术有限公司 target state estimation method and device and unmanned aerial vehicle

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170300759A1 (en) * 2016-03-03 2017-10-19 Brigham Young University Automated multiple target detection and tracking system
CN106682572A (en) * 2016-10-12 2017-05-17 纳恩博(北京)科技有限公司 Target tracking method, target tracking system and first electronic device
CN107992068A (en) * 2017-11-29 2018-05-04 天津聚飞创新科技有限公司 Method for tracking target, device and aircraft
CN108680143A (en) * 2018-04-27 2018-10-19 南京拓威航空科技有限公司 Object localization method, device based on long-distance ranging and unmanned plane
CN109596118A (en) * 2018-11-22 2019-04-09 亮风台(上海)信息科技有限公司 It is a kind of for obtaining the method and apparatus of the spatial positional information of target object
CN110570463A (en) * 2019-09-11 2019-12-13 深圳市道通智能航空技术有限公司 target state estimation method and device and unmanned aerial vehicle

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112306096A (en) * 2020-11-04 2021-02-02 苏州臻迪智能科技有限公司 Unmanned aerial vehicle automatic following method, system, equipment and storage medium

Similar Documents

Publication Publication Date Title
KR102463176B1 (en) Device and method to estimate position
Garratt et al. Integration of GPS/INS/vision sensors to navigate unmanned aerial vehicles
KR20200044420A (en) Method and device to estimate position
Atia et al. A low-cost lane-determination system using GNSS/IMU fusion and HMM-based multistage map matching
US10488203B2 (en) Coherence map navigational system for autonomous vehicle
EP3109589A1 (en) A unit and method for improving positioning accuracy
US20150234055A1 (en) Aerial and close-range photogrammetry
US10627232B2 (en) Method and system for aerial image processing
AU2017202519A1 (en) On-board backup and anti-spoofing GPS system
TWI556198B (en) Positioning and directing data analysis system and method thereof
CN112631265B (en) Flight control method and device, storage medium, automatic pilot and unmanned aerial vehicle
US20170184404A1 (en) Smoothed navigation solution using filtered resets
US20200025571A1 (en) Navigation system
Mercado et al. Gps/ins/optic flow data fusion for position and velocity estimation
Andert et al. Optical-aided aircraft navigation using decoupled visual SLAM with range sensor augmentation
CN113296532A (en) Flight control method and device of manned aircraft and manned aircraft
US9885569B2 (en) Passive altimeter
CN111487993A (en) Information acquisition method and device, storage medium, automatic pilot and unmanned aerial vehicle
CN113654528B (en) Method and system for estimating target coordinates through unmanned aerial vehicle position and cradle head angle
Praschl et al. Enabling outdoor MR capabilities for head mounted displays: a case study
Whitacre et al. Flight results from tracking ground targets using seascan uavs with gimballing cameras
CN114964245B (en) Unmanned aerial vehicle vision reconnaissance positioning method
Leishman et al. Utilization of UAV Autopilots in Vision-based Alternative Navigation
Malysheva Integrated aircraft navigation system
Quan et al. State Estimation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination