CN111457923A - Path planning method, device and storage medium


Info

Publication number: CN111457923A
Authority: CN (China)
Prior art keywords: aircraft, target, waypoint, starting, obstacle
Legal status: Granted
Application number: CN201910059757.0A
Other languages: Chinese (zh)
Other versions: CN111457923B
Inventor: 李梅
Current Assignee: Beijing Jingdong Qianshi Technology Co Ltd
Original Assignees: Beijing Jingdong Century Trading Co Ltd; Beijing Jingdong Shangke Information Technology Co Ltd
Filed by: Beijing Jingdong Century Trading Co Ltd; Beijing Jingdong Shangke Information Technology Co Ltd
Priority application: CN201910059757.0A
Publication of CN111457923A; application granted; publication of CN111457923B
Current legal status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations

Abstract

The application provides a path planning method, a device and a storage medium. The method includes: generating a plurality of waypoint positions through which an aircraft passes from a starting position to a target position, according to the starting point pose of the aircraft at the starting position, the target point pose at the target position and the position of an obstacle; updating the plurality of waypoint positions based on ground image information acquired by the aircraft at the starting position, the target position and each waypoint position and on preset constraint conditions; and finally determining, based on a target reference speed of the aircraft and the updated waypoint positions, a target planned path along which the aircraft moves from the starting position to the target position. With this technical solution, the target planned path can be determined from only the starting point pose, the target point pose, the obstacle position and the preset constraint conditions; the method requires little computation, is simple to operate and has a wide application range.

Description

Path planning method, device and storage medium
Technical Field
The application relates to the technical field of unmanned aerial vehicles, in particular to a path planning method, a path planning device and a storage medium.
Background
A quad-rotor aircraft, also called a quad-rotor helicopter, is a small, light rotorcraft with four propellers arranged in a cross layout; it has a novel configuration and a compact structure. A quad-rotor aircraft performs multiple flight procedures, among which autonomous landing is one of the more complex. During autonomous landing, its position and attitude must be controlled precisely.
In the prior art, the position and attitude of a quad-rotor aircraft during autonomous landing can be controlled by visual servo control. Visual servo control mainly uses image information collected by a camera as feedback information for the servo controller; it requires no contact measurement of the environment and realizes closed-loop control of the unmanned aerial vehicle in terms of kinematics, dynamics and so on.
However, the control accuracy of existing visual servo control depends on the accuracy of the extracted features or of the pose estimation, and it suffers from a large computational load and poor stability.
Disclosure of Invention
The application provides a path planning method, a path planning device and a storage medium, intended to solve the problems of large computational load and poor stability of existing visual servo control.
A path planning method provided in a first aspect of the present application includes:
generating a plurality of waypoint positions through which the aircraft moves from the starting position to the target position according to the starting point pose of the aircraft at the starting position, the target point pose at the target position and the position of the obstacle;
updating the plurality of waypoint positions based on the ground image information acquired by the aircraft at the starting position, the target position and each waypoint position, and on preset constraint conditions;
determining a target planned path for the aircraft to move from the starting location to the target location based on the target reference speed of the aircraft and the updated plurality of waypoint locations.
In a possible design of the first aspect, before the updating of the plurality of waypoint positions based on the ground image information acquired by the aircraft at the starting position, the target position and each waypoint position and on the preset constraint conditions, the method further includes:
acquiring the starting projection coordinates, the target projection coordinates and the intermediate projection coordinates of a visual reference object on the image plane when the aircraft is at the starting position, the target position and each waypoint position respectively.
In the above possible design of the first aspect, the acquiring of the starting projection coordinates, the target projection coordinates and the intermediate projection coordinates of the visual reference object on the image plane when the aircraft is at the starting position, the target position and each waypoint position respectively includes:
taking the visual reference object arranged on the ground as a target, and respectively acquiring ground image information comprising the visual reference object at the starting position, the target position and each waypoint position;
and determining the initial projection coordinate, the target projection coordinate and each intermediate projection coordinate of the visual reference object based on the feature point coordinate and the projection coordinate conversion parameter of the visual reference object in each piece of ground image information.
In another possible design of the first aspect, the method further includes:
and controlling the error between the starting projection coordinates and the target projection coordinates, and the error between each intermediate projection coordinate and the target projection coordinates, to satisfy an exponential convergence condition.
In yet another possible design of the first aspect, the preset constraints include: a view constraint and an anti-collision constraint;
updating the plurality of waypoint positions based on the ground image information acquired by the aircraft at the starting position, the target position and each waypoint position, and on preset constraint conditions, including:
judging whether the aircraft meets the visual field constraint condition and the anti-collision constraint condition at each waypoint position simultaneously according to the ground image information acquired at each waypoint position;
for waypoint positions that do not satisfy the visual field constraint condition and/or the anti-collision constraint condition, updating those waypoint positions so that the aircraft satisfies the visual field constraint condition and the anti-collision constraint condition simultaneously at all waypoint positions.
In the foregoing possible design of the first aspect, the determining, according to the ground image information acquired at each waypoint position, whether the aircraft satisfies the view constraint condition and the collision avoidance constraint condition at each waypoint position at the same time includes:
for the ground image information acquired at each waypoint position, judging whether each feature point of a visual reference object arranged on the ground is positioned in the visual field range of the aircraft and whether the Euclidean distance between a waypoint coordinate corresponding to the waypoint position and a center point coordinate of the obstacle is greater than the minimum circumscribed circle radius of the obstacle;
and if any feature point of the visual reference object is not located within the visual field range of the aircraft, and/or the Euclidean distance between the waypoint coordinate and the center point coordinate of the obstacle is less than or equal to the minimum circumscribed circle radius of the obstacle, updating the waypoint position so that every feature point of the visual reference object is located within the visual field range of the aircraft and the Euclidean distance between the waypoint coordinate and the center point coordinate of the obstacle is greater than the minimum circumscribed circle radius of the obstacle.
In yet another possible design of the first aspect, the generating, based on the starting point pose of the aircraft at the starting position, the target point pose at the target position, and the position of the obstacle, a plurality of waypoint positions through which the aircraft moves from the starting position to the target position includes:
and determining, based on a rapidly-exploring random tree star (RRT*) algorithm, the starting point pose, the target point pose and the principle that the aircraft has no collision risk with the obstacle, a plurality of waypoint positions through which the aircraft moves from the starting position to the target position.
A second aspect of the present application provides a path planning apparatus, including: the device comprises a generating module, an updating module and a determining module;
the generating module is used for generating a plurality of waypoint positions through which the aircraft moves from the starting position to the target position according to the starting point pose of the aircraft at the starting position, the target point pose at the target position and the position of the obstacle;
the updating module is used for updating the plurality of waypoint positions based on the ground image information acquired by the aircraft at the starting position, the target position and each waypoint position, and on preset constraint conditions;
the determining module is configured to determine a target planned path for the aircraft to move from the starting position to the target position based on the target reference speed of the aircraft and the updated plurality of waypoint positions.
In one possible design of the second aspect, the apparatus further includes: an acquisition module;
the obtaining module is configured to obtain the starting projection coordinates, the target projection coordinates and the intermediate projection coordinates of the visual reference object on the image plane when the aircraft is at the starting position, the target position and each waypoint position respectively, before the updating module updates the plurality of waypoint positions based on the ground image information acquired by the aircraft at those positions and on the preset constraint conditions.
In the above possible design of the second aspect, the obtaining module is configured to obtain ground image information including the visual reference object at the starting position, the target position and each waypoint position respectively by targeting the visual reference object disposed on the ground, and determine the starting projection coordinate, the target projection coordinate and each intermediate projection coordinate of the visual reference object based on the feature point coordinate and the projection coordinate conversion parameter of the visual reference object in each ground image information.
In another possible design of the second aspect, the apparatus further includes: a control module;
the control module is used for controlling the error between the starting projection coordinates and the target projection coordinates, and the error between each intermediate projection coordinate and the target projection coordinates, to satisfy an exponential convergence condition.
In yet another possible design of the second aspect, the preset constraint condition includes: a view constraint and an anti-collision constraint;
the update module includes: a judgment unit and an update unit;
the judging unit is used for judging whether the aircraft meets the visual field constraint condition and the anti-collision constraint condition at each waypoint position simultaneously according to the ground image information acquired at each waypoint position;
the updating unit is used for updating the waypoint positions which do not meet the visual field constraint condition and/or the anti-collision constraint condition so that the aircraft meets the visual field constraint condition and the anti-collision constraint condition at all waypoint positions simultaneously.
In the above possible design of the second aspect, the determining unit is specifically configured to determine, for the ground image information acquired at each waypoint position, whether each feature point of a visual reference object arranged on the ground is located within a visual field range of the aircraft and whether an euclidean distance between a waypoint coordinate corresponding to the waypoint position and a center point coordinate of the obstacle is greater than a minimum circumscribed circle radius of the obstacle;
the updating unit is specifically configured to update the waypoint position when the feature points of the visual reference object are not located within the visual field range of the aircraft and/or when the euclidean distance between the waypoint coordinates and the center point coordinates of the obstacle is less than or equal to the minimum circumscribed circle radius of the obstacle, so that each feature point of the visual reference object is located within the visual field range of the aircraft and the euclidean distance between the waypoint coordinates and the center point coordinates of the obstacle is greater than the minimum circumscribed circle radius of the obstacle.
In yet another possible design of the second aspect, the generating module is specifically configured to determine, based on a rapidly-exploring random tree star (RRT*) algorithm, the starting point pose, the target point pose and the principle that there is no collision risk between the aircraft and the obstacle, a plurality of waypoint positions through which the aircraft moves from the starting position to the target position.
A third aspect of the present application provides a path planning apparatus, comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the processor executes the program to implement the method according to the first aspect and various possible designs of the first aspect.
A fourth aspect of the present application provides a storage medium having stored therein instructions that, when executed on a computer, cause the computer to perform the method as set forth in the first aspect and various possible designs of the first aspect.
A fifth aspect of the present application provides a chip for executing instructions, the chip being configured to perform the method according to the first aspect and various possible designs of the first aspect.
With the path planning method, device and storage medium provided by the application, a plurality of waypoint positions through which the aircraft moves from the starting position to the target position are first generated according to the starting point pose of the aircraft at the starting position, the target point pose at the target position and the position of the obstacle; the plurality of waypoint positions are then updated based on the ground image information acquired by the aircraft at the starting position, the target position and each waypoint position and on preset constraint conditions; finally, the target planned path along which the aircraft moves from the starting position to the target position is determined based on the target reference speed of the aircraft and the updated waypoint positions. With this technical solution, the target planned path can be determined from only the starting point pose, the target point pose, the obstacle position and the preset constraint conditions; the method requires little computation, is simple to operate and has a wide application range.
Drawings
FIG. 1 is an imaging schematic of a visual servo control system of a quad-rotor aircraft;
FIG. 2A is a schematic diagram of closed loop control of the PBVS control system;
FIG. 2B is a schematic diagram of the closed loop control of the IBVS control system;
fig. 3 is a schematic flowchart of a first embodiment of a path planning method according to an embodiment of the present application;
fig. 4 is a schematic flow chart of a second embodiment of a path planning method provided in the embodiment of the present application;
fig. 5A to 5E are schematic diagrams of node expansion performed by the RRT* algorithm;
fig. 6 is a schematic flow chart of a third embodiment of a path planning method provided in the embodiment of the present application;
fig. 7 is a schematic flowchart of a fourth embodiment of a path planning method according to an embodiment of the present application;
FIGS. 8A and 8B are schematic views of the field of view of a quad-rotor aircraft;
FIG. 9A is a schematic diagram of the position relationship between the aircraft and the visual reference object at the starting position and the target position in the three-dimensional space;
FIG. 9B is a schematic projection diagram of feature points of the visual reference object in the planar image when the aircraft is at the starting position and the target position;
FIG. 9C is a schematic diagram of the path of the aircraft in three-dimensional space;
FIG. 9D is a schematic diagram of the paths of the feature points of the visual reference object in the plane image;
fig. 10 is a schematic structural diagram of a first embodiment of a path planning apparatus according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a second embodiment of a path planning apparatus according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of a third embodiment of a path planning apparatus according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of a fourth embodiment of a path planning apparatus according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiments of the application are described by applying the path planning method to an unmanned aerial vehicle. With the rapid development of automatic control technology, the unmanned aerial vehicle has quickly become a novel tool. Among the multiple flight procedures an aircraft performs, autonomous landing is a complex one. Because the position and attitude of the aircraft must be controlled precisely during autonomous landing, the problem of visual servo control of the aircraft during autonomous landing needs to be studied.
The following figures take a quad-rotor aircraft as an example. FIG. 1 is an imaging schematic of a visual servo control system of a quad-rotor aircraft. As shown in fig. 1, the visual servo control system uses image information acquired by an image pickup device (e.g., a camera) arranged at the lower end of the aircraft as feedback information and requires no contact measurement of environmental parameters, thereby realizing closed-loop control of the unmanned aerial vehicle in terms of kinematics, dynamics and so on.
Alternatively, according to the form of the error signal constructed from the feedback information input into the system, visual servo control systems may be classified into position-based visual servoing (PBVS) control systems, image-based visual servoing (IBVS) control systems and hybrid visual servoing control systems.
Illustratively, FIG. 2A is a closed-loop control schematic of a PBVS control system. As shown in fig. 2A, for the PBVS control system the external input is the target pose and the output is the actual pose. In this system, the actual pose output by the aircraft undergoes image acquisition, feature extraction and pose estimation to obtain an estimated pose corresponding to the actual pose; the error signal between the estimated pose and the externally input target pose is fed to the visual servo controller, which adjusts its internal parameters according to the error to generate a new actual pose, output by the aircraft.
Optionally, the error signal in the PBVS control system is defined in three-dimensional Cartesian coordinate space. The Cartesian coordinate system is a general term for rectangular and oblique coordinate systems. The control accuracy of the PBVS control system depends on the accuracy of the pose estimation, which in turn depends on the calibration accuracy of the camera equipment and the unmanned aerial vehicle; hence the computational load is large.
FIG. 2B is a closed-loop control schematic of an IBVS control system. As shown in fig. 2B, for the IBVS control system the external input is the target image feature and the output is the actual pose. In this system, the actual pose output by the aircraft undergoes image acquisition and feature extraction to obtain the actual image feature corresponding to the actual pose; the error signal between the actual image feature and the externally input target image feature is fed to the visual servo controller, which adjusts its internal parameters according to the error to generate a new actual pose, output by the aircraft.
Optionally, the error signal in the IBVS control system is defined in two-dimensional image-plane space. Because the visual servo controller in an IBVS control system is relatively complicated and lacks adaptability, an additional sensor is needed to obtain depth information in order to extract accurate features, and an excessive displacement may cause the unmanned aerial vehicle to oscillate with poor stability.
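As a rough illustration only (not taken from the patent text), the closed loop of fig. 2B can be sketched as follows. The three callables are hypothetical placeholders for the blocks of the diagram, and the velocity law used is the classical IBVS command based on the pseudo-inverse of the interaction matrix, which the description derives later as formula (8):

```python
import numpy as np

def ibvs_loop(get_features, get_interaction_matrix, send_velocity,
              s_star, lam=0.5, tol=1e-3, max_iter=1000):
    """Skeleton of an IBVS closed loop (illustrative sketch).

    get_features           : returns the current image features s (2N vector)
    get_interaction_matrix : returns the 2N x 6 interaction matrix at s
    send_velocity          : velocity interface of the aircraft
    s_star                 : externally supplied target image features
    """
    for _ in range(max_iter):
        s = get_features()                  # image acquisition + feature extraction
        e = s - s_star                      # error signal fed to the servo controller
        if np.linalg.norm(e) < tol:
            return True                     # image features reached the target
        L = get_interaction_matrix(s)
        v_c = -lam * np.linalg.pinv(L) @ e  # classical IBVS velocity command
        send_velocity(v_c)                  # aircraft outputs a new actual pose
    return False
```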
Illustratively, the error signal in the hybrid visual servoing control system includes both error information in a three-dimensional cartesian coordinate system and error information in a two-dimensional image plane space.
In summary, the visual servo control system in the prior art has the problems of large calculation amount and poor stability.
In view of the above problems, the embodiments of the application provide a path planning method, a device and a storage medium. A plurality of waypoint positions through which the aircraft moves from the starting position to the target position are first generated according to the starting point pose of the aircraft at the starting position, the target point pose at the target position and the position of the obstacle; the waypoint positions are then updated based on the ground image information acquired by the aircraft at the starting position, the target position and each waypoint position and on preset constraint conditions; finally, the target planned path along which the aircraft moves from the starting position to the target position is determined based on the target reference speed of the aircraft and the updated waypoint positions. With this technical solution, the target planned path can be determined from only the starting point pose, the target point pose, the obstacle position and the preset constraint conditions; the method requires little computation, is simple to operate and has a wide application range.
The technical solution of the application will be described in detail below with specific embodiments. It should be noted that the following specific embodiments mainly take a quad-rotor aircraft as an example of the aircraft; the embodiments of the application do not limit the type of the aircraft. The following specific embodiments may be combined with each other, and identical or similar concepts or processes may not be repeated in some embodiments.
Fig. 3 is a schematic flowchart of a first embodiment of a path planning method according to an embodiment of the present application. The embodiment takes an aircraft, namely a visual servo controller on the aircraft, as an execution subject. As shown in fig. 3, the path planning method may include the following steps:
step 31: and generating a plurality of waypoint positions through which the aircraft moves from the starting position to the target position according to the starting point pose of the aircraft at the starting position, the target point pose at the target position and the position of the obstacle.
For example, in this embodiment, the visual servo controller first determines the external input information of the aircraft. Optionally, the external input information may include: the starting point pose of the aircraft (comprising the starting position and an orientation angle), the target point pose (comprising the target position and an orientation angle), and the position of the obstacle (comprising the coordinates of the center point of the obstacle and the radius of its circumscribed circle).
Optionally, the embodiments of the application all assume that the visual reference object is laid flat on the ground within the field of view of the aircraft. For example, the visual reference object may be an H-shaped marker or a checkerboard marker. The embodiments of the application do not limit the form or size of the visual marker, which may be determined according to the actual situation.
After the visual servo controller of the aircraft acquires the externally input starting point pose, target point pose and obstacle position, it generates a series of waypoints from the starting position to the target position based on a preset algorithm, such as the rapidly-exploring random tree (RRT) algorithm or the rapidly-exploring random tree star (RRT*) algorithm, and thereby determines a plurality of waypoint positions through which the aircraft moves from the starting position to the target position.
For a specific implementation principle of this step, reference may be made to the following description in the embodiment shown in fig. 4, and details are not described here.
Step 32: and updating the plurality of waypoint positions based on the ground image information acquired by the aircraft at the initial position, the target position and each waypoint position and a preset constraint condition.
For example, in this embodiment, to ensure the accuracy of the path planning, ground image information including the visual reference object may be collected at the starting position, the target position and each waypoint position by a camera (also called a vision sensor) fixed below the aircraft. The collected ground image information is used as feedback information for the path planning, and each waypoint position is updated so that the aircraft satisfies the preset constraint conditions at every waypoint position.
Alternatively, in this embodiment, the camera device may be a camera below the quad-rotor aircraft.
The specific implementation principle of this step can refer to the description in the following embodiments, which are not described herein again.
Step 33: a target planned path for the aircraft to move from the starting location to the target location is determined based on a target reference speed and a plurality of waypoint locations of the aircraft.
In this embodiment, to make the actual flying speed of the aircraft approach the target reference speed, a speed tracking controller can be designed, by analyzing the dynamic model of the aircraft (e.g., a quad-rotor), to track the target reference speed in real time. The speed tracking controller can be a proportional-integral-derivative (PID) speed tracking controller, which is simple and easy to control; it can track speed changes, reduce the error and complete the closed-loop control of the visual servo control system.
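A minimal sketch of such a PID speed tracking controller is shown below; the gains, sampling period and scalar per-axis form are illustrative assumptions, not values from the patent:

```python
class PIDSpeedTracker:
    """Discrete PID loop driving the actual speed toward the target
    reference speed (one axis; illustrative gains)."""

    def __init__(self, kp=1.2, ki=0.1, kd=0.05, dt=0.02):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, v_ref, v_actual):
        error = v_ref - v_actual                         # speed tracking error
        self.integral += error * self.dt                 # accumulate integral term
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # command that the flight controller turns into thrust/attitude
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

In practice one such loop per velocity component would run at the control rate, with gains tuned against the quad-rotor's dynamic model.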
With the path planning method provided by this embodiment of the application, a plurality of waypoint positions through which the aircraft moves from the starting position to the target position are generated according to the starting point pose of the aircraft at the starting position, the target point pose at the target position and the position of the obstacle; the waypoint positions are then updated based on the ground image information acquired by the aircraft at the starting position, the target position and each waypoint position and on preset constraint conditions; finally, the target planned path along which the aircraft moves from the starting position to the target position is determined based on the target reference speed of the aircraft and the updated waypoint positions. With this technical solution, the target planned path can be determined from only the starting point pose, the target point pose, the obstacle position and the preset constraint conditions; the method requires little computation, is simple to operate and has a wide application range.
Exemplarily, on the basis of the above embodiments, fig. 4 is a schematic flow chart of a second embodiment of the path planning method provided in the embodiment of the present application. As shown in fig. 4, in this embodiment, the step 31 can be implemented by:
step 41: and determining a plurality of waypoint positions through which the aircraft moves from the starting position to the target position based on a rapid search trestar RRT algorithm, the starting point pose, the target point pose and a principle that the aircraft and the obstacle have no collision risk.
The RRT* algorithm is similar to the RRT algorithm: it searches the space mainly by generating sampling points in unknown regions. Because no preprocessing is needed during the process, the workload is effectively reduced, so a path for the aircraft between the starting point pose and the target point pose can be planned rapidly.
Fig. 5A to 5E are schematic diagrams illustrating node expansion performed by the RRT* algorithm. As shown in fig. 5A to 5E, in this embodiment, assume the starting point is q0 and the target point is qD (not shown). Based on the RRT* algorithm, a certain number of random points qt are generated around the starting point q0, and from these random points a node q1 is selected, namely the one closest to the target point qD among all the random points. Following the same principle, waypoints q2 and q3 can be generated, and node q4 is generated on the basis of q3. The algorithm terminates when the last node generated is the target point qD.
It is worth noting that the RRT* algorithm in the embodiments of the application is an optimized form of the RRT algorithm: compared with RRT, RRT* adds a rewiring (Rewire) process, i.e., the RRT* algorithm determines whether the path formed by the generated waypoints is the optimal path and, if not, updates the generated waypoint positions.
Specifically, referring to fig. 5A to 5E, in this embodiment the specific procedure for determining, based on the RRT* algorithm, a plurality of waypoint positions through which the aircraft moves from the starting position to the target position is as follows:
as shown in fig. 5A, at the starting time, the fast spanning tree diagram corresponding to the RRT algorithm only includes the starting point q0. First, at a starting point q0The random sampling around generates a certain number of random points qtThen calculate each random point qtAnd target point qDEuclidean distance (| | q)t-qD| q), the calculated results are compared to determine the value (min | | q) with the minimum resultt-qDI |), i.e. at all random points qtTo determine the distance target point qDNearest node qr
Next, as shown in FIG. 5B, in the fast expanding tree view, a starting point q is used0To node qrIs an extension direction, and is at a starting point q according to a preset node step length0To node qrDefine a node q therebetween1. Then, a start point q is detected0Node q1And the position of the obstacle, determining a starting point q0And node q1Whether the path between the nodes collides with the barrier or not, if not, the node q is connected1Adding the tree into the fast expansion tree graph, if yes,node q is discarded1And a re-search is performed.
Correspondingly, the rapid expansion tree diagram shown in fig. 5C can be obtained by performing expansion according to the above operation steps.
Optionally, in this embodiment, a starting point q exists in the fast expansion tree diagram0Node q1Node q2Node q3Node qnIn the case of (2), a node q is determined among the generated random nodesr(shown by the dashed circle), node q is computedrDetermining the distance between the node q and all the nodes nearby3And node qrThe distance between the nodes is the shortest, and q is the preset node step length3To node qrIs the extension direction, determines the node qw
Alternatively, as shown in FIG. 5C, node qwThe adjacent node set V comprises nodes shown by large dotted circles in the graph, traversal retrieval is carried out on all nodes in the large dotted circles, and a distance node q is determinedwThe nearest node is node q2Therefore, removing node q from the fast spanning tree graph may be performed3And node qwConnecting line between them, and for node q2And node qwThe wiring is performed and the result is shown in fig. 5D.
In this embodiment, after node q2 is removed from the set V of nodes near qw (the remaining set is denoted V \ {q2}), the total path length through node qw is computed: for each node qn in V \ {q2}, S(qw) + ||qn − qw|| is calculated. If S(qw) + ||qn − qw|| is less than S(qn), then S(qn) is updated to S(qw) + ||qn − qw||; as shown in fig. 5E, the connecting line between node q3 and node qn is removed and node qw is connected to node qn (node q3 was previously the parent of node qn); the result after processing is shown in fig. 5E. If S(qw) + ||qn − qw|| is greater than S(qn), no further action is required.
That is, during the node calculations, if the computed result is smaller than the current total path length to the target node, the optimal path sequence needs to be updated.
Optionally, when the number of iterations of the RRT* algorithm reaches a set threshold, the obtained total path length and the optimal path sequence are returned.
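The expansion and rewiring procedure just described can be condensed into the following sketch. The planar sampling region, step length, neighbour radius and helper names are simplifying assumptions, the obstacle test is reduced to the circumscribed-circle check of formula (3), and edge collisions are checked only at the new node:

```python
import numpy as np

def rrt_star(q0, qd, obstacles, step=0.5, radius=1.0, n_iter=2000, goal_tol=0.5):
    """RRT* sketch in the plane: returns a waypoint list from q0 toward qd.

    obstacles: list of (center, r_a) pairs, i.e. obstacle center point and
    minimum circumscribed circle radius as used in formula (3).
    """
    qd = np.asarray(qd, float)
    nodes = [np.asarray(q0, float)]
    parent = {0: None}
    cost = {0: 0.0}                            # S(q): path length from q0

    def free(p):                               # circumscribed-circle test
        return all(np.linalg.norm(p - np.asarray(c)) > r for c, r in obstacles)

    for _ in range(n_iter):
        qt = np.random.uniform(-10.0, 10.0, size=2)        # random point q_t
        near = min(range(len(nodes)),
                   key=lambda i: np.linalg.norm(nodes[i] - qt))
        d = qt - nodes[near]
        qw = nodes[near] + step * d / (np.linalg.norm(d) + 1e-9)
        if not free(qw):
            continue                                       # discard and re-search
        # choose, within the neighbour set V, the parent minimising S(q_w)
        V = [i for i in range(len(nodes))
             if np.linalg.norm(nodes[i] - qw) < radius]
        best = min(V or [near],
                   key=lambda i: cost[i] + np.linalg.norm(nodes[i] - qw))
        new = len(nodes)
        nodes.append(qw)
        parent[new] = best
        cost[new] = cost[best] + np.linalg.norm(nodes[best] - qw)
        # rewire: reroute a neighbour q_n through q_w if that lowers S(q_n)
        for i in V:
            via = cost[new] + np.linalg.norm(nodes[i] - qw)
            if via < cost[i]:
                parent[i], cost[i] = new, via
        if np.linalg.norm(qw - qd) < goal_tol:             # reached target q_D
            path, i = [], new
            while i is not None:
                path.append(nodes[i])
                i = parent[i]
            return path[::-1]
    return None
```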
Optionally, in this embodiment, as shown in fig. 4, before the step 32, the method may further include the following steps:
step 42: and acquiring the initial projection coordinate, the target projection coordinate and the middle projection coordinate of the visual reference object at each waypoint position on the image plane when the aircraft is at the initial position, the target position and each waypoint position respectively.
In this embodiment, the autonomous landing task of a quad-rotor aircraft based on visual servo control is mainly studied; therefore, the embodiments of the application consider the aircraft only in a four-degree-of-freedom configuration space, i.e., the configuration space contains the three position coordinates (x, y, z) and the yaw angle ψ about the center of mass, while the pitch and roll angles (θ, φ) about the center of mass are zero.
Optionally, with the visual reference object laid on the ground, the quad-rotor aircraft can photograph the ground at the starting position, the target position and each waypoint position respectively, and from the image information of the visual reference object in each ground image, the starting projection coordinates s, the target projection coordinates s* and the intermediate projection coordinates at each waypoint position can be calculated.
With the path planning method provided by this embodiment of the application, a plurality of waypoint positions through which the aircraft passes from the starting position to the target position are determined based on the RRT* algorithm, the starting point pose and target point pose of the aircraft, and the principle that the aircraft has no collision risk with the obstacle; the starting projection coordinates, the target projection coordinates and the intermediate projection coordinates of the visual reference object on the image plane are then obtained when the aircraft is at the starting position, the target position and each waypoint position respectively. With this technical scheme, an optimal target path can be determined based on the RRT* algorithm with a small amount of computation.
Exemplarily, on the basis of the above embodiments, fig. 6 is a schematic flow chart of a third embodiment of the path planning method provided in the embodiment of the present application. As shown in fig. 6, in the present embodiment, the step 42 can be implemented by:
step 61: and taking a visual reference object arranged on the ground as a target, and respectively acquiring ground image information comprising the visual reference object at the starting position, the target position and each waypoint position.
In this embodiment, with the visual reference object laid flat on the ground as the target, a ground image is captured by an image pickup device (also called a vision sensor, e.g., a camera) arranged below the aircraft, and ground image information including the visual reference object is acquired.
For example, when the visual reference object is in the world coordinate system (i.e., a three-dimensional coordinate system), the position coordinates of the feature points on the visual reference object are assumed to be
M_j = [X_j, Y_j, Z_j]^T, j = 1, 2, ..., N
where M_j is the coordinate of the j-th feature point on the visual reference object, N denotes the number of feature points on the visual reference object, and [X_j, Y_j, Z_j]^T denotes the transpose of [X_j, Y_j, Z_j].
Optionally, when the aircraft photographs the visual reference object at the starting position, the target position and each waypoint position, each captured ground image may include the visual reference object; by analyzing the position and shape of the visual reference object in each ground image, the ground image information at each position can be acquired.
Step 62: and determining the initial projection coordinate, the target projection coordinate and each intermediate projection coordinate of the visual reference object based on the feature point coordinate and the projection coordinate conversion parameter of the visual reference object in each ground image information.
For example, assume that the projection coordinates of the visual reference object on the image plane are
s_i^j = [u_i^j, v_i^j]^T, j = 1, 2, ..., N
where i denotes the position of the aircraft (i may be the starting position, any waypoint position or the target position), j denotes the j-th feature point on the visual reference object, u_i^j denotes the horizontal coordinate of the j-th feature point of the visual reference object in the plane image when the aircraft is at position i, and v_i^j denotes the vertical coordinate of the j-th feature point of the visual reference object in the plane image when the aircraft is at position i.
For example, in this embodiment, based on the perspective projection principle, the perspective camera model used by the imaging device of the aircraft and its internal parameters, i.e., the projection coordinate conversion parameters, are determined first. Using the projection coordinate conversion parameters and the above feature point coordinates of the visual reference object, the starting projection coordinates, the target projection coordinates and each intermediate projection coordinate of the visual reference object on the image plane, when the aircraft is at the starting position, the target position and each waypoint position, can be determined.
For example, the position coordinates of each feature point on the visual reference object can be converted to its projection coordinates on the image plane by the following formula (1):
s_i^j = K · T_i · M_j  (1)
where s_i^j = [u_i^j, v_i^j]^T is the projection coordinate, M_j is the feature point coordinate, K denotes the internal parameter matrix of the image pickup apparatus, and T_i denotes the transformation parameter when the aircraft is at position i; T_i is determined by the rotation matrix between the camera coordinate system and the world coordinate system.
Alternatively, for the image pickup apparatus, the internal parameter K is a known parameter, and once the position of the aircraft is determined, the transformation parameter T_i is also a known parameter.
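A numeric sketch of formula (1) under a standard pinhole model follows; the intrinsic matrix, pose and point values are made up for illustration only:

```python
import numpy as np

def project_feature(K, R, t, M_j):
    """Project world point M_j into pixel coordinates via formula (1):
    K is the camera intrinsic matrix, (R, t) the world-to-camera
    transformation T_i at the current aircraft position."""
    M_cam = R @ np.asarray(M_j, float) + t   # world -> camera coordinates
    s_hom = K @ M_cam                        # homogeneous image coordinates
    return s_hom[:2] / s_hom[2]              # perspective division -> (u, v)

# Illustrative values: downward-looking camera 5 m above a ground point.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
R = np.eye(3)                    # camera axes aligned with the world frame
t = np.array([0.0, 0.0, 5.0])    # feature lies 5 m along the optical (z) axis
print(project_feature(K, R, t, [0.2, -0.1, 0.0]))   # -> pixel (352.0, 224.0)
```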
With the path planning method provided by this embodiment of the application, the visual reference object arranged on the ground is taken as the target, ground image information including the visual reference object is acquired at the starting position, the target position and each waypoint position respectively, and the starting projection coordinates, the target projection coordinates and each intermediate projection coordinate of the visual reference object are determined based on the feature point coordinates and the projection coordinate conversion parameters of the visual reference object in each piece of ground image information. With this technical scheme, the starting, target and intermediate projection coordinates of the visual reference object can be obtained from the position of the aircraft and the coordinates of the feature points of the visual reference object, laying a foundation for the path planning of the aircraft.
Exemplarily, on the basis of the above embodiments, fig. 7 is a schematic flow chart of a fourth embodiment of the path planning method provided in the embodiment of the present application. In this embodiment, the preset constraint condition may include: a view constraint and a collision avoidance constraint.
Optionally, as shown in fig. 7, in this embodiment, the step 32 may be implemented by:
step 71: and judging whether the aircraft meets the visual field constraint condition and the anti-collision constraint condition at each waypoint position simultaneously according to the ground image information acquired at each waypoint position.
Optionally, in this embodiment, for the ground image information acquired at each waypoint position, it is determined whether each feature point of the visual reference object lies within the field of view of the aircraft and whether the Euclidean distance between the waypoint coordinate and the center point coordinate of the obstacle is greater than the minimum circumscribed circle radius of the obstacle.
Illustratively, fig. 8A and 8B are schematic views of the field of view of a quad-rotor aircraft. In order to ensure that the aircraft satisfies the visual field constraint condition at each waypoint position, as shown in fig. 8A and 8B, the field-of-view (FOV) constraint of the camera is checked first, i.e., whether the projection coordinates s_i^j = [u_i^j, v_i^j]^T of the visual reference object on the image plane all lie within the field of view of the camera. The constraint can be expressed by the following inequalities:
u_min ≤ u_i^j ≤ u_max, v_min ≤ v_i^j ≤ v_max, j = 1, 2, ..., N  (2)
where [u_min, u_max] and [v_min, v_max] denote the boundary values of the FOV constraint, and N denotes the number of feature points on the visual reference object. For example, in this embodiment, as shown in fig. 8A, N = 4.
However, the above field-of-view constraint can only ensure that each feature point on the visual reference object lies in front of the camera; during the movement of the camera, i.e., of the aircraft, a feature point may also be occluded by an obstacle. This can be addressed by requiring that the line of sight between the image pickup device and each feature point is not blocked by the obstacle, i.e., that the distance from the edge of the obstacle to the optical axis of the camera is greater than 0, as shown in fig. 8B.
Further, the collision avoidance constraint requires that the planned path of the camera not enter the obstacle area. In this embodiment, by introducing control points along the path, the camera can be forced to remain within the safe region, at a sufficiently large distance from the obstacle. For example, the control points may be the waypoints determined above, so that the aircraft is sufficiently far from the center of the obstacle when it is at each waypoint position.
In this embodiment, assume the waypoint coordinate of the i-th waypoint is t_i. The positional relationship between the waypoint and the obstacle may then be controlled to satisfy the following formula (3):
||t_i − O_o|| > r_a  (3)
where t_i = (x_i, y_i, z_i)^T is the coordinate of the aircraft at position i, O_o = (x_o, y_o, z_o)^T denotes the coordinate of the center point of the obstacle, and r_a denotes the minimum circumscribed circle radius of the obstacle. That is, to prevent a collision between the aircraft and the obstacle, the Euclidean distance between the coordinate of the aircraft at the waypoint position and the coordinate of the center point of the obstacle must be greater than the minimum circumscribed circle radius of the obstacle.
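Putting the FOV inequalities of formula (2) and the distance test of formula (3) together, a per-waypoint feasibility check can be sketched as follows; the image bounds and array shapes are illustrative assumptions:

```python
import numpy as np

def waypoint_feasible(projections, waypoint, obstacles,
                      u_bounds=(0, 640), v_bounds=(0, 480)):
    """Check the two preset constraints at one waypoint.

    projections : (N, 2) array of feature-point projections (u, v)
    waypoint    : (x, y, z) waypoint coordinate t_i
    obstacles   : list of (center, r_a), i.e. obstacle center point and
                  minimum circumscribed circle radius
    """
    u, v = projections[:, 0], projections[:, 1]
    fov_ok = (u_bounds[0] <= u).all() and (u <= u_bounds[1]).all() \
         and (v_bounds[0] <= v).all() and (v <= v_bounds[1]).all()
    t_i = np.asarray(waypoint, float)
    collision_ok = all(np.linalg.norm(t_i - np.asarray(c)) > r   # formula (3)
                       for c, r in obstacles)
    return fov_ok and collision_ok   # both constraints must hold simultaneously
```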
Step 72: for waypoint positions that do not satisfy the above visual field constraint condition and/or anti-collision constraint condition, the waypoint positions are updated so that the aircraft satisfies the visual field constraint condition and the anti-collision constraint condition simultaneously at all waypoint positions.
In this embodiment, when the aircraft meets both of the above conditions at a waypoint position, that waypoint position does not need to be updated. Otherwise, if any feature point of the visual reference object is not located within the visual field range of the aircraft and/or the Euclidean distance between the waypoint coordinate and the center point coordinate of the obstacle is less than or equal to the minimum circumscribed circle radius of the obstacle, the waypoint position is updated so that all feature points of the visual reference object are located within the visual field range of the aircraft and the Euclidean distance between the waypoint coordinate and the center point coordinate of the obstacle is greater than the minimum circumscribed circle radius of the obstacle.
For example, if a feature point of the visual reference object is not within the visual field range of the aircraft, the aircraft does not satisfy the visual field constraint condition; if, at a certain waypoint position, the Euclidean distance between the corresponding waypoint coordinate and the center point coordinate of the obstacle is less than or equal to the minimum circumscribed circle radius of the obstacle, the aircraft does not satisfy the collision avoidance constraint condition. When the aircraft fails to meet either of the above conditions at a certain waypoint position, the determined waypoint position needs to be updated, the principle of the update being that the aircraft must meet both conditions at that waypoint position simultaneously.
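A sketch of this update rule is given below, reusing the hypothetical waypoint_feasible test from the previous sketch; the resampling strategy is supplied by the caller and is an assumption, not the patent's method:

```python
def update_waypoints(waypoints, projections_at, obstacles, resample, max_tries=100):
    """Replace any waypoint violating a constraint (illustrative sketch).

    projections_at(w) : feature-point projections seen from waypoint w
    resample(w)       : draws a nearby candidate waypoint
    """
    updated = []
    for w in waypoints:
        tries = 0
        while not waypoint_feasible(projections_at(w), w, obstacles):
            w = resample(w)                 # propose a new nearby waypoint
            tries += 1
            if tries >= max_tries:
                raise RuntimeError("no feasible waypoint found")
        updated.append(w)
    return updated
```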
With the path planning method provided by this embodiment of the application, whether the aircraft simultaneously satisfies the visual field constraint condition and the anti-collision constraint condition at each waypoint position is judged according to the ground image information acquired at that waypoint position, and any waypoint position that does not satisfy the visual field constraint condition and/or the anti-collision constraint condition is updated, so that the aircraft satisfies both constraint conditions at all waypoint positions. In this technical scheme, controlling the aircraft to satisfy the visual field constraint condition and the anti-collision constraint condition simultaneously lays a foundation for ensuring the safe landing of the aircraft.
For example, on the basis of any of the above embodiments, in order for the aircraft to land at the target position, the technical solution of this embodiment requires that the error between the starting projection coordinates corresponding to the feature points of the visual reference object when the aircraft is at the starting position and the target projection coordinates, and the error between each intermediate projection coordinate and the target projection coordinates, satisfy an exponential convergence condition. Therefore, in this embodiment, the path planning method may further include the following step:
controlling the error between the starting projection coordinates and the target projection coordinates, and the error between each intermediate projection coordinate and the target projection coordinates, to satisfy an exponential convergence condition.
Optionally, in this embodiment, the control objective of the visual servo controller is to minimize the error between the current projection coordinates s and the target projection coordinates s*, i.e., to satisfy the following formula (4); the current projection coordinates s may be the starting projection coordinates or any of the intermediate projection coordinates:
e = s − s*  (4)
Optionally, when the visual reference object in this embodiment is a rectangular marker lying flat on the ground, it contributes four feature points on the image plane. The feature vector corresponding to the current projection coordinates may therefore be defined as s = [x_1, y_1, x_2, y_2, x_3, y_3, x_4, y_4]^T, containing the feature coordinates of the four feature points of the rectangular landmark at the current camera pose. Likewise, the target projection coordinates may be defined as s* = [x*_1, y*_1, x*_2, y*_2, x*_3, y*_3, x*_4, y*_4]^T, containing the feature coordinates of the four feature points of the rectangular landmark at the target camera pose.
Alternatively, to ensure that the error e converges exponentially, the error vector e must satisfy the differential equation shown in the following formula (5):
ė = −λ·e  (5)
where ė denotes the derivative of the error vector e with respect to time, taken component-wise over the feature coordinates.
In this embodiment, the safe landing of the aircraft can be ensured by controlling the error between the starting projection coordinates and the target projection coordinates, and the error between each intermediate projection coordinate and the target projection coordinates, to satisfy the exponential convergence condition.
Further, in this embodiment, it follows from the relevant literature that the target flying speed of the aircraft (i.e., the camera velocity) and the projection coordinates of the feature points of the visual reference object are related by the following formula (6):
ṡ = L · V_c  (6)
where L = L(s, Z_c). Since the visual reference object includes four feature points, L satisfies L ∈ R^(8×6); it consists of the stacked interaction matrices of the four feature points.
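For reference, the interaction matrix of a single point feature with normalized image coordinates (x, y) at depth Z_c takes the standard form from the visual-servoing literature (it is not reproduced in the original text); stacking four such 2×6 blocks yields the L ∈ R^(8×6) above:

```latex
L(x, y, Z_c) =
\begin{pmatrix}
-\frac{1}{Z_c} & 0 & \frac{x}{Z_c} & xy & -(1+x^{2}) & y \\
0 & -\frac{1}{Z_c} & \frac{y}{Z_c} & 1+y^{2} & -xy & -x
\end{pmatrix}
```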
From formula (5) and formula (6), formula (7) can be derived, and from it formula (8) for the target reference velocity follows:
L · V_c = −λ·e  (7)
V_c = [T_x, T_y, T_z, ω_x, ω_y, ω_z]^T = −λ·L⁺·e  (8)
where L⁺ ∈ R^(6×8) is the pseudo-inverse matrix of L, and [T_x, T_y, T_z, ω_x, ω_y, ω_z] means that the aircraft at position (x, y, z) has translational velocity (T_x, T_y, T_z) and angular velocity components (ω_x, ω_y, ω_z).
Thus, in this embodiment, based on the target reference speed and the determined positions of the waypoints, a target planned path of the aircraft from the starting position to the target position may be obtained.
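A minimal numeric sketch of formula (8) using the Moore-Penrose pseudo-inverse follows; the interaction matrix and feature values below are randomly generated placeholders, not data from the patent:

```python
import numpy as np

def target_reference_velocity(L, s, s_star, lam=0.5):
    """Formula (8): V_c = -λ·L⁺·e, with e = s - s* from formula (4)."""
    e = s - s_star
    return -lam * np.linalg.pinv(L) @ e   # (Tx, Ty, Tz, ωx, ωy, ωz)

# Four feature points -> 8-dimensional feature vectors and an 8x6 L.
rng = np.random.default_rng(0)
L = rng.normal(size=(8, 6))
s = rng.normal(size=8)
s_star = rng.normal(size=8)
print(target_reference_velocity(L, s, s_star))
```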
For example, fig. 9A is a schematic diagram of the positional relationship between the aircraft and the visual reference object in three-dimensional space at the starting position and the target position. FIG. 9B is a schematic diagram of the projections of the feature points of the visual reference object in the plane image when the aircraft is at the starting position and the target position. FIG. 9C is a schematic diagram of the path of the aircraft in three-dimensional space. FIG. 9D is a schematic diagram of the paths of the feature points of the visual reference object in the plane image.
Referring to fig. 9A and 9B, as the aircraft moves from the starting position to the target position, the shape of the visual reference object photographed by the camera remains relatively regular. Referring to fig. 9C and 9D, based on the determined waypoint positions the aircraft can move from the starting position to the target position while avoiding the obstacle; fig. 9D shows the path trajectories corresponding to the four feature points of the visual reference object.
In summary, with the path planning method provided by this embodiment, on the premise that the path planning of the quad-rotor aircraft based on the RRT* algorithm satisfies the visual field constraint and the obstacle avoidance constraint, the quad-rotor aircraft can land autonomously without relying on a GPS signal. In particular for the large-displacement phenomenon during landing, the algorithm runs faster and more efficiently and increases the anti-interference performance of the system.
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Fig. 10 is a schematic structural diagram of a first embodiment of a path planning apparatus according to an embodiment of the present application. The path planning device can be integrated in an aircraft or can be realized by the aircraft. As shown in fig. 10, the apparatus may include: a generation module 101, an update module 102 and a determination module 103.
The generating module 101 is configured to generate a plurality of waypoint positions where the aircraft moves from the starting position to the target position according to a starting point pose of the aircraft at the starting position, a target point pose at the target position, and a position of the obstacle;
the updating module 102 is configured to update the plurality of waypoint positions based on ground image information acquired by the aircraft at the starting position, the target position and each waypoint position, and on preset constraint conditions;
the determining module 103 is configured to determine a target planned path for the aircraft to move from the starting location to the target location based on the target reference speed of the aircraft and the updated plurality of waypoint locations.
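As a structural sketch only (a hypothetical Python skeleton whose method names mirror the modules in fig. 10; the bodies are placeholders, not the patented logic):

    class PathPlanner:
        """Skeleton mirroring fig. 10: generation, update, determination."""

        def generate(self, start_pose, target_pose, obstacle_position):
            """Generation module 101: waypoints from start to target."""
            raise NotImplementedError

        def update(self, waypoints, ground_images, constraints):
            """Update module 102: re-place waypoints that violate constraints."""
            raise NotImplementedError

        def determine(self, reference_velocity, waypoints):
            """Determination module 103: assemble the target planned path."""
            raise NotImplementedError

        def plan(self, start_pose, target_pose, obstacle_position,
                 ground_images, constraints, reference_velocity):
            # Pipeline order follows the description: generate -> update -> determine.
            waypoints = self.generate(start_pose, target_pose, obstacle_position)
            waypoints = self.update(waypoints, ground_images, constraints)
            return self.determine(reference_velocity, waypoints)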
For example, on the basis of the above embodiments, fig. 11 is a schematic structural diagram of a second embodiment of the path planning apparatus provided in the embodiment of the present application. As shown in fig. 11, the apparatus may further include: an acquisition module 111.
The obtaining module 111 is configured to, before the updating module 102 updates the plurality of waypoint positions, obtain a starting projection coordinate, a target projection coordinate and an intermediate projection coordinate of the visual reference object on the image plane when the aircraft is at the starting position, the target position and each waypoint position, respectively.
Optionally, the obtaining module 111 is specifically configured to acquire, taking the visual reference object arranged on the ground as a target, ground image information including the visual reference object at the starting position, the target position and each waypoint position, and to determine the starting projection coordinate, the target projection coordinate and each intermediate projection coordinate of the visual reference object based on the feature point coordinates of the visual reference object in each piece of ground image information and the projection coordinate conversion parameters.
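A minimal pinhole-projection sketch of that determination step (the intrinsics fx, fy, cx, cy stand in for the patent's "projection coordinate conversion parameter"; the values below are illustrative only):

    import numpy as np

    def project_feature_points(points_cam, fx, fy, cx, cy):
        """Project (N, 3) feature points given in the camera frame onto
        the image plane with a pinhole model; returns (N, 2) coordinates."""
        X, Y, Z = points_cam[:, 0], points_cam[:, 1], points_cam[:, 2]
        return np.stack([fx * X / Z + cx, fy * Y / Z + cy], axis=1)

    # e.g. the four corners of a 1 m square marker seen 3 m below the camera
    corners = np.array([[-0.5, -0.5, 3.0], [0.5, -0.5, 3.0],
                        [0.5, 0.5, 3.0], [-0.5, 0.5, 3.0]])
    uv = project_feature_points(corners, fx=400.0, fy=400.0, cx=320.0, cy=240.0)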
Illustratively, as shown in fig. 11, the apparatus further includes: a control module 112.
The control module 112 is configured to, after the obtaining module 111 obtains the projection coordinates, control the errors between the starting projection coordinate and the target projection coordinate, and between each intermediate projection coordinate and the target projection coordinate, to satisfy the exponential convergence condition.
For example, on the basis of the above embodiments, fig. 12 is a schematic structural diagram of a third embodiment of the path planning apparatus provided in the embodiments of the present application. As shown in fig. 12, in this embodiment, the preset constraint conditions include a view constraint condition and a collision avoidance constraint condition.
Optionally, the update module 102 includes: a judging unit 121 and an updating unit 122.
The determining unit 121 is configured to determine, according to the ground image information acquired at each waypoint position, whether the aircraft satisfies the view constraint condition and the collision avoidance constraint condition at each waypoint position at the same time;
the updating unit 122 is configured to, for waypoint positions that do not satisfy the view constraint condition and/or the collision avoidance constraint condition, update the waypoint positions so that the aircraft simultaneously satisfies the view constraint condition and the collision avoidance constraint condition at all waypoint positions.
In a possible implementation manner of this embodiment, the determining unit 121 is specifically configured to determine, for the ground image information acquired at each waypoint position, whether each feature point of a visual reference object arranged on the ground is located within a visual field range of the aircraft and whether an euclidean distance between a waypoint coordinate corresponding to the waypoint position and a center point coordinate of the obstacle is greater than a minimum circumscribed circle radius of the obstacle;
the updating unit 122 is specifically configured to update the waypoint position when the feature points of the visual reference object are not located in the visual field range of the aircraft and/or when the euclidean distance between the waypoint coordinates and the center point coordinates of the obstacle is less than or equal to the minimum circumscribed circle radius of the obstacle, so that each feature point of the visual reference object is located in the visual field range of the aircraft and the euclidean distance between the waypoint coordinates and the center point coordinates of the obstacle is greater than the minimum circumscribed circle radius of the obstacle.
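A compact sketch of that per-waypoint check (a hypothetical helper; the image bounds approximate the aircraft's field of view, and the 2-D obstacle model follows the minimum-circumscribed-circle description above):

    import numpy as np

    def waypoint_satisfies_constraints(features_px, waypoint_xy,
                                       obstacle_center_xy, obstacle_radius,
                                       img_w=640, img_h=480):
        """View constraint: all projected feature points lie inside the image.
        Collision avoidance constraint: the Euclidean distance from the
        waypoint to the obstacle centre exceeds the obstacle's minimum
        circumscribed circle radius."""
        f = np.asarray(features_px)
        in_view = bool(np.all((f[:, 0] >= 0) & (f[:, 0] < img_w) &
                              (f[:, 1] >= 0) & (f[:, 1] < img_h)))
        clear = float(np.linalg.norm(np.asarray(waypoint_xy) -
                                     np.asarray(obstacle_center_xy))) > obstacle_radius
        return in_view and clear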
For example, in any of the embodiments of the present application, the generating module 101 is specifically configured to determine, based on a rapidly-exploring random tree (RRT*) algorithm, the starting point pose, the target point pose and the principle that the aircraft has no collision risk with the obstacle, the plurality of waypoint positions through which the aircraft passes when moving from the starting position to the target position.
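A bare-bones planar RRT sketch of that generation step (plain RRT for brevity: the RRT* rewiring, the pose handling and the constraint checks of the actual method are omitted, and the single circular obstacle is an assumption):

    import numpy as np

    def rrt_waypoints(start, goal, obstacle_c, obstacle_r, bounds,
                      step=0.5, iters=2000, goal_tol=0.5, seed=0):
        """Grow a tree from start towards random samples, rejecting nodes
        inside the obstacle's circumscribed circle; return the waypoint
        sequence once a node lands within goal_tol of the goal."""
        rng = np.random.default_rng(seed)
        nodes, parent = [np.asarray(start, float)], {0: None}
        for _ in range(iters):
            sample = rng.uniform(bounds[0], bounds[1], size=2)
            near = min(range(len(nodes)),
                       key=lambda i: np.linalg.norm(nodes[i] - sample))
            d = sample - nodes[near]
            new = nodes[near] + step * d / (np.linalg.norm(d) + 1e-9)
            if np.linalg.norm(new - obstacle_c) <= obstacle_r:
                continue                       # collision risk: reject node
            parent[len(nodes)] = near
            nodes.append(new)
            if np.linalg.norm(new - goal) < goal_tol:
                path, i = [], len(nodes) - 1   # backtrack to the start
                while i is not None:
                    path.append(nodes[i])
                    i = parent[i]
                return path[::-1]
        return None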
The apparatus provided in the embodiment of the present application may be used to execute the method in the embodiments shown in fig. 3 to fig. 7, and the implementation principle and the technical effect are similar, which are not described herein again.
It should be noted that the above division of the modules of the apparatus is only a logical division; in actual implementation, the modules may be wholly or partially integrated into one physical entity, or may be physically separate. These modules may all be implemented as software invoked by a processing element, or entirely in hardware, or some as software invoked by a processing element and some in hardware. For example, the determining module may be a separately provided processing element, or may be integrated in a chip of the apparatus, or may be stored in a memory of the apparatus in the form of program code that a processing element of the apparatus calls to execute the function of the module. The other modules are implemented similarly. In addition, all or some of the modules may be integrated together or implemented independently. The processing element described herein may be an integrated circuit with signal-processing capability. In implementation, each step of the above method or each of the above modules may be completed by an integrated logic circuit of hardware in the processor element or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above method, such as one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field-programmable gate arrays (FPGAs). For another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor that can call program code. For another example, these modules may be integrated together and implemented in the form of a system-on-chip (SoC).
The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) manner.
Fig. 13 is a schematic structural diagram of a fourth embodiment of a path planning apparatus according to an embodiment of the present application. As shown in fig. 13, the apparatus may include: the system comprises a processor 131, a memory 132, a communication interface 133 and a system bus 134, wherein the memory 132 and the communication interface 133 are connected with the processor 131 through the system bus 134 and complete mutual communication, the memory 132 is used for storing computer execution instructions, the communication interface 133 is used for communicating with other devices, and the processor 131 implements the scheme in the embodiments shown in fig. 3 to 7 when executing the computer program.
The system bus mentioned in fig. 13 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The system bus may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus. The communication interface is used for realizing communication between the database access device and other equipment (such as a client, a read-write library and a read-only library). The memory may comprise Random Access Memory (RAM) and may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components.
Optionally, an embodiment of the present application further provides a storage medium having instructions stored therein which, when run on a computer, cause the computer to execute the method of the embodiments shown in fig. 3 to 7.
Optionally, an embodiment of the present application further provides a chip for executing the instruction, where the chip is configured to execute the method in the embodiment shown in fig. 3 to 7.
An embodiment of the present application further provides a program product, the program product including a computer program stored in a storage medium; at least one processor can read the computer program from the storage medium, and the method of the embodiments shown in fig. 3 to 7 is implemented when the at least one processor executes the computer program.
In the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship; in the formula, the character "/" indicates that the preceding and following related objects are in a relationship of "division". "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
It is to be understood that the various numerical references referred to in the embodiments of the present application are merely for descriptive convenience and are not intended to limit the scope of the embodiments of the present application.
It should be understood that, in the embodiment of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiment of the present application.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A method of path planning, comprising:
generating a plurality of waypoint positions through which the aircraft moves from the starting position to the target position according to the starting point pose of the aircraft at the starting position, the target point pose at the target position and the position of the obstacle;
updating the plurality of waypoint positions based on ground image information acquired by the aircraft at the starting position, the target position and each waypoint position and on preset constraint conditions;
determining a target planned path for the aircraft to move from the starting location to the target location based on the target reference speed of the aircraft and the updated plurality of waypoint locations.
2. The method of claim 1, wherein before the updating the plurality of waypoint positions based on the ground image information acquired by the aircraft at the starting position, the target position and each waypoint position and the preset constraint conditions, the method further comprises:
acquiring a starting projection coordinate, a target projection coordinate and an intermediate projection coordinate of a visual reference object on the image plane when the aircraft is at the starting position, the target position and each waypoint position, respectively.
3. The method of claim 2, wherein the acquiring a starting projection coordinate, a target projection coordinate and an intermediate projection coordinate of the visual reference object on the image plane when the aircraft is at the starting position, the target position and each waypoint position respectively comprises:
taking the visual reference object arranged on the ground as a target, and respectively acquiring ground image information comprising the visual reference object at the starting position, the target position and each waypoint position;
determining the starting projection coordinate, the target projection coordinate and each intermediate projection coordinate of the visual reference object based on the feature point coordinates of the visual reference object in each piece of ground image information and the projection coordinate conversion parameters.
4. A method according to claim 2 or 3, characterized in that the method further comprises:
controlling the errors between the starting projection coordinate and the target projection coordinate, and between each intermediate projection coordinate and the target projection coordinate, to satisfy an exponential convergence condition.
5. The method according to any one of claims 1-3, wherein the preset constraint conditions comprise a view constraint condition and a collision avoidance constraint condition;
the updating the plurality of waypoint positions based on the ground image information acquired by the aircraft at the starting position, the target position and each waypoint position and the preset constraint conditions comprises:
judging, according to the ground image information acquired at each waypoint position, whether the aircraft simultaneously satisfies the view constraint condition and the collision avoidance constraint condition at each waypoint position;
for waypoint positions that do not satisfy the view constraint condition and/or the collision avoidance constraint condition, updating the waypoint positions so that the aircraft simultaneously satisfies the view constraint condition and the collision avoidance constraint condition at all waypoint positions.
6. The method of claim 5, wherein the judging, according to the ground image information acquired at each waypoint position, whether the aircraft simultaneously satisfies the view constraint condition and the collision avoidance constraint condition at each waypoint position comprises:
for the ground image information acquired at each waypoint position, judging whether each feature point of a visual reference object arranged on the ground is positioned in the visual field range of the aircraft and whether the Euclidean distance between a waypoint coordinate corresponding to the waypoint position and a center point coordinate of the obstacle is greater than the minimum circumscribed circle radius of the obstacle;
if a feature point of the visual reference object is not located within the visual field range of the aircraft and/or the Euclidean distance between the waypoint coordinate and the center point coordinate of the obstacle is less than or equal to the minimum circumscribed circle radius of the obstacle, updating the waypoint position so that each feature point of the visual reference object is located within the visual field range of the aircraft and the Euclidean distance between the waypoint coordinate and the center point coordinate of the obstacle is greater than the minimum circumscribed circle radius of the obstacle.
7. The method of claim 1, wherein the generating a plurality of waypoint positions through which the aircraft moves from the starting position to the target position according to the starting point pose of the aircraft at the starting position, the target point pose at the target position and the position of the obstacle comprises:
determining, based on a rapidly-exploring random tree (RRT*) algorithm, the starting point pose, the target point pose and the principle that the aircraft has no collision risk with the obstacle, a plurality of waypoint positions through which the aircraft passes when moving from the starting position to the target position.
8. A path planning apparatus, comprising: the device comprises a generating module, an updating module and a determining module;
the generating module is used for generating a plurality of waypoint positions through which the aircraft moves from the starting position to the target position according to the starting point pose of the aircraft at the starting position, the target point pose at the target position and the position of the obstacle;
the updating module is used for updating the plurality of waypoint positions based on ground image information acquired by the aircraft at the starting position, the target position and each waypoint position and on preset constraint conditions;
the determining module is configured to determine a target planned path for the aircraft to move from the starting position to the target position based on the target reference speed of the aircraft and the updated plurality of waypoint positions.
9. A path planner comprising a processor, a memory and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method according to any of the preceding claims 1-7 when executing the program.
10. A storage medium having stored therein instructions which, when run on a computer, cause the computer to perform the method of any one of claims 1-7.
CN201910059757.0A 2019-01-22 2019-01-22 Path planning method, device and storage medium Active CN111457923B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910059757.0A CN111457923B (en) 2019-01-22 2019-01-22 Path planning method, device and storage medium

Publications (2)

Publication Number Publication Date
CN111457923A true CN111457923A (en) 2020-07-28
CN111457923B CN111457923B (en) 2022-08-12

Family

ID=71682297

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910059757.0A Active CN111457923B (en) 2019-01-22 2019-01-22 Path planning method, device and storage medium

Country Status (1)

Country Link
CN (1) CN111457923B (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080243383A1 (en) * 2006-12-12 2008-10-02 Ching-Fang Lin Integrated collision avoidance enhanced GN&C system for air vehicle
CN102087530A (en) * 2010-12-07 2011-06-08 东南大学 Vision navigation method of mobile robot based on hand-drawing map and path
US20180114449A1 (en) * 2015-01-19 2018-04-26 The Aerospace Corporation Autonomous nap-of-the-earth (anoe) flight path planning for manned and unmanned rotorcraft
CN104932515A (en) * 2015-04-24 2015-09-23 深圳市大疆创新科技有限公司 Automatic cruising method and cruising device
US20180136650A1 (en) * 2015-06-29 2018-05-17 Yuneec Technology Co., Limited Aircraft and obstacle avoidance method and system thereof
US20180204469A1 (en) * 2017-01-13 2018-07-19 Unmanned Innovation, Inc. Unmanned aerial vehicle visual point cloud navigation
CN106873630A (en) * 2017-04-20 2017-06-20 广州极飞科技有限公司 A kind of flight control method and device, perform equipment
CN107169468A (en) * 2017-05-31 2017-09-15 北京京东尚科信息技术有限公司 Method for controlling a vehicle and device
CN107300919A (en) * 2017-06-22 2017-10-27 中国科学院深圳先进技术研究院 A kind of robot and its traveling control method
CN107515606A (en) * 2017-07-20 2017-12-26 北京格灵深瞳信息技术有限公司 Robot implementation method, control method and robot, electronic equipment
CN109029417A (en) * 2018-05-21 2018-12-18 南京航空航天大学 Unmanned plane SLAM method based on mixing visual odometry and multiple dimensioned map
CN108827306A (en) * 2018-05-31 2018-11-16 北京林业大学 A kind of unmanned plane SLAM navigation methods and systems based on Multi-sensor Fusion
CN108776492A (en) * 2018-06-27 2018-11-09 电子科技大学 A kind of four-axle aircraft automatic obstacle avoiding and air navigation aid based on binocular camera

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
YANG LYU et al.: "Vision-based UAV collision avoidance with 2D dynamic safety envelope", IEEE Aerospace and Electronic Systems Magazine, vol. 31, no. 7, 5 August 2016 (2016-08-05), pages 16-26, XP011618715, DOI: 10.1109/MAES.2016.150155 *
LYU Qiang et al.: "Research progress in visual-servoing-based autonomous flight control of small quadrotor UAVs", Science & Technology Review, vol. 34, no. 24, 28 December 2016 (2016-12-28), pages 68-73 *
XIONG Chao et al.: "UAV obstacle-avoidance path planning based on an artificial potential field improved by collision cones", Computer Engineering, vol. 44, no. 9, 14 March 2018 (2018-03-14), pages 314-320 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112162559A (en) * 2020-09-30 2021-01-01 杭州海康机器人技术有限公司 Method, device and storage medium for multi-robot mixing
CN112162559B (en) * 2020-09-30 2021-10-15 杭州海康机器人技术有限公司 Method, device and storage medium for multi-robot mixing
CN112580293A (en) * 2020-12-18 2021-03-30 全芯智造技术有限公司 Method, apparatus and computer-readable storage medium for generating circuit layout
CN113253760A (en) * 2021-06-08 2021-08-13 北京远度互联科技有限公司 Path planning method and device, movable carrier and storage medium
CN113253760B (en) * 2021-06-08 2021-11-09 北京远度互联科技有限公司 Path planning method and device, movable carrier and storage medium
CN114511044A (en) * 2022-04-18 2022-05-17 新石器慧通(北京)科技有限公司 Unmanned vehicle passing control method and device
CN114511044B (en) * 2022-04-18 2022-06-28 新石器慧通(北京)科技有限公司 Unmanned vehicle passing control method and device

Also Published As

Publication number Publication date
CN111457923B (en) 2022-08-12

Similar Documents

Publication Publication Date Title
CN111457923B (en) Path planning method, device and storage medium
Mohta et al. Fast, autonomous flight in GPS‐denied and cluttered environments
Padhy et al. Deep neural network for autonomous uav navigation in indoor corridor environments
Jung et al. A direct visual servoing‐based framework for the 2016 IROS Autonomous Drone Racing Challenge
CN110362098B (en) Unmanned aerial vehicle visual servo control method and device and unmanned aerial vehicle
Schuster et al. Multi-robot 6D graph SLAM connecting decoupled local reference filters
Winkvist et al. Towards an autonomous indoor aerial inspection vehicle
Qi et al. Autonomous landing solution of low-cost quadrotor on a moving platform
Santos et al. Indoor low-cost localization system for controlling aerial robots
Farmani et al. An optimal sensor management technique for unmanned aerial vehicles tracking multiple mobile ground targets
Yu et al. A vision-based collision avoidance technique for micro air vehicles using local-level frame mapping and path planning
Al-Kaff et al. A vision-based navigation system for Unmanned Aerial Vehicles (UAVs)
Irfan et al. Vision-based guidance and navigation for autonomous mav in indoor environment
Nhair et al. Vision-based obstacle avoidance for small drone using monocular camera
Mao et al. Obstacle recognition and avoidance for UAVs under resource-constrained environments
Vutetakis et al. An autonomous loop-closure approach for simultaneous exploration and coverage of unknown infrastructure using mavs
Dubey et al. Droan-disparity-space representation for obstacle avoidance: Enabling wire mapping & avoidance
Masehian et al. Path planning of nonholonomic flying robots using a new virtual obstacle method
Zhang et al. Indoor navigation for quadrotor using rgb-d camera
Bender et al. Map-based drone homing using shortcuts
Biswas et al. Goal-aware Navigation of Quadrotor UAV for Infrastructure Inspection
Petersen et al. Target tracking and following from a multirotor UAV
CN114815899A (en) Unmanned aerial vehicle three-dimensional space path planning method based on 3D laser radar sensor
Boucheloukh et al. UAV navigation based on adaptive fuzzy backstepping controller using visual odometry
Park et al. Horizontal-vertical guidance of quadrotor for obstacle shape mapping

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210226

Address after: Room a1905, 19 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Beijing Jingdong Qianshi Technology Co.,Ltd.

Address before: 101, 1st floor, building 2, yard 20, Suzhou street, Haidian District, Beijing 100080

Applicant before: Beijing Jingbangda Trading Co.,Ltd.

Effective date of registration: 20210226

Address after: 101, 1st floor, building 2, yard 20, Suzhou street, Haidian District, Beijing 100080

Applicant after: Beijing Jingbangda Trading Co.,Ltd.

Address before: 100195 8th floor, 76 Zhichun Road, Haidian District, Beijing

Applicant before: BEIJING JINGDONG SHANGKE INFORMATION TECHNOLOGY Co.,Ltd.

Applicant before: BEIJING JINGDONG CENTURY TRADING Co.,Ltd.

GR01 Patent grant