Disclosure of Invention
The application provides a path planning method, a path planning device and a storage medium, which are used to solve the problems of heavy computation and poor stability in existing visual servo control.
A path planning method provided in a first aspect of the present application includes:
generating a plurality of waypoint positions through which the aircraft moves from the starting position to the target position according to the starting point pose of the aircraft at the starting position, the target point pose at the target position and the position of the obstacle;
updating the plurality of waypoint positions based on preset constraint conditions and ground image information acquired by the aircraft at the starting position, the target position and each waypoint position;
determining a target planned path for the aircraft to move from the starting location to the target location based on the target reference speed of the aircraft and the updated plurality of waypoint locations.
In a possible design of the first aspect, before the updating of the plurality of waypoint positions based on the preset constraint conditions and the ground image information acquired by the aircraft at the starting position, the target position and each waypoint position, the method further includes:
acquiring the starting projection coordinate, the target projection coordinate and the intermediate projection coordinate at each waypoint position of the visual reference object on the image plane when the aircraft is at the starting position, the target position and each waypoint position respectively.
In the above possible design of the first aspect, the acquiring a start projection coordinate, a target projection coordinate, and each waypoint projection coordinate of a visual reference object on an image plane at the start position, the target position, and each waypoint position of the aircraft respectively includes:
taking the visual reference object arranged on the ground as a target, and respectively acquiring ground image information comprising the visual reference object at the starting position, the target position and each waypoint position;
and determining the initial projection coordinate, the target projection coordinate and each intermediate projection coordinate of the visual reference object based on the feature point coordinate and the projection coordinate conversion parameter of the visual reference object in each piece of ground image information.
In another possible design of the first aspect, the method further includes:
controlling the error between the starting projection coordinate and the target projection coordinate, and the error between each intermediate projection coordinate and the target projection coordinate, to satisfy an exponential convergence condition.
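One way to read the exponential convergence condition above is that the image-plane error e(t) = s(t) − s* is driven so that its magnitude decays as e(t) = e(0)·exp(−λt) for some rate λ > 0. A minimal illustration (the function name and the `rate` gain are assumptions for illustration, not parameters named in the application):

```python
import math

def converged_error(e0, rate, t):
    """Image-plane error magnitude under an exponential convergence condition
    e(t) = e(0) * exp(-rate * t); 'rate' is an assumed convergence gain."""
    return e0 * math.exp(-rate * t)
```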
In yet another possible design of the first aspect, the preset constraints include: a view constraint and an anti-collision constraint;
updating the plurality of waypoint positions based on the ground image information and preset constraint conditions acquired by the aircraft at the starting position, the target position and each waypoint position, comprising:
judging whether the aircraft meets the visual field constraint condition and the anti-collision constraint condition at each waypoint position simultaneously according to the ground image information acquired at each waypoint position;
for waypoint positions that do not satisfy the visual field constraint condition and/or the anti-collision constraint condition, updating those waypoint positions so that the aircraft simultaneously satisfies the visual field constraint condition and the anti-collision constraint condition at all waypoint positions.
In the foregoing possible design of the first aspect, the determining, according to the ground image information acquired at each waypoint position, whether the aircraft satisfies the view constraint condition and the collision avoidance constraint condition at each waypoint position at the same time includes:
for the ground image information acquired at each waypoint position, judging whether each feature point of a visual reference object arranged on the ground is positioned in the visual field range of the aircraft and whether the Euclidean distance between a waypoint coordinate corresponding to the waypoint position and a center point coordinate of the obstacle is greater than the minimum circumscribed circle radius of the obstacle;
and if any feature point of the visual reference object is not located within the visual field range of the aircraft and/or the Euclidean distance between the waypoint coordinate and the center point coordinate of the obstacle is less than or equal to the minimum circumscribed circle radius of the obstacle, updating the waypoint position so that each feature point of the visual reference object is located within the visual field range of the aircraft and the Euclidean distance between the waypoint coordinate and the center point coordinate of the obstacle is greater than the minimum circumscribed circle radius of the obstacle.
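The two checks described above can be sketched as follows. This is a hedged illustration (function and parameter names are assumptions): the field-of-view constraint is modeled as every feature point projecting inside the image bounds, and the anti-collision constraint as the waypoint lying outside the obstacle's minimum circumscribed circle.

```python
import math

def satisfies_constraints(feature_pixels, image_size, waypoint,
                          obstacle_center, obstacle_radius):
    """Check the field-of-view and anti-collision constraints for one waypoint.

    feature_pixels:  (u, v) image coordinates of each feature point of the reference
    image_size:      (width, height) of the camera image in pixels
    waypoint:        (x, y) ground-plane coordinates of the waypoint
    obstacle_center: (x, y) center point of the obstacle
    obstacle_radius: minimum circumscribed circle radius of the obstacle
    """
    w, h = image_size
    # Field-of-view constraint: every feature point must project inside the image.
    in_view = all(0 <= u < w and 0 <= v < h for u, v in feature_pixels)
    # Anti-collision constraint: Euclidean distance from the waypoint to the
    # obstacle center must exceed the minimum circumscribed circle radius.
    dist = math.hypot(waypoint[0] - obstacle_center[0],
                      waypoint[1] - obstacle_center[1])
    return in_view and dist > obstacle_radius
```

A waypoint failing either check would then be moved, as described above, until both checks pass.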
In yet another possible design of the first aspect, the generating, based on the starting point pose of the aircraft at the starting position, the target point pose at the target position, and the position of the obstacle, a plurality of waypoint positions through which the aircraft moves from the starting position to the target position includes:
determining a plurality of waypoint positions through which the aircraft moves from the starting position to the target position based on a rapid expansion tree star (RRT*) algorithm, the starting point pose, the target point pose and the principle that the aircraft has no collision risk with the obstacle.
A second aspect of the present application provides a path planning apparatus, including: the device comprises a generating module, an updating module and a determining module;
the generating module is used for generating a plurality of waypoint positions through which the aircraft moves from the starting position to the target position according to the starting point pose of the aircraft at the starting position, the target point pose at the target position and the position of the obstacle;
the updating module is used for updating the plurality of waypoint positions based on preset constraint conditions and ground image information acquired by the aircraft at the starting position, the target position and each waypoint position;
the determining module is configured to determine a target planned path for the aircraft to move from the starting position to the target position based on the target reference speed of the aircraft and the updated plurality of waypoint positions.
In one possible design of the second aspect, the apparatus further includes: an acquisition module;
the obtaining module is configured to obtain the starting projection coordinate, the target projection coordinate, and the intermediate projection coordinate at each waypoint position of the visual reference object on the image plane when the aircraft is at the starting position, the target position, and each waypoint position respectively, before the updating module updates the plurality of waypoint positions based on the preset constraint conditions and the ground image information acquired by the aircraft at the starting position, the target position, and each waypoint position.
In the above possible design of the second aspect, the obtaining module is configured to obtain ground image information including the visual reference object at the starting position, the target position and each waypoint position respectively by targeting the visual reference object disposed on the ground, and determine the starting projection coordinate, the target projection coordinate and each intermediate projection coordinate of the visual reference object based on the feature point coordinate and the projection coordinate conversion parameter of the visual reference object in each ground image information.
In another possible design of the second aspect, the apparatus further includes: a control module;
the control module is used for controlling the error between the starting projection coordinate and the target projection coordinate, and the error between each intermediate projection coordinate and the target projection coordinate, to satisfy an exponential convergence condition.
In yet another possible design of the second aspect, the preset constraint condition includes: a view constraint and an anti-collision constraint;
the update module includes: a judgment unit and an update unit;
the judging unit is used for judging whether the aircraft meets the visual field constraint condition and the anti-collision constraint condition at each waypoint position simultaneously according to the ground image information acquired at each waypoint position;
the updating unit is used for updating the waypoint positions which do not meet the visual field constraint condition and/or the anti-collision constraint condition so that the aircraft meets the visual field constraint condition and the anti-collision constraint condition at all waypoint positions simultaneously.
In the above possible design of the second aspect, the judging unit is specifically configured to determine, for the ground image information acquired at each waypoint position, whether each feature point of a visual reference object arranged on the ground is located within the visual field range of the aircraft and whether the Euclidean distance between the waypoint coordinate corresponding to the waypoint position and the center point coordinate of the obstacle is greater than the minimum circumscribed circle radius of the obstacle;
the updating unit is specifically configured to update the waypoint position when the feature points of the visual reference object are not located within the visual field range of the aircraft and/or when the euclidean distance between the waypoint coordinates and the center point coordinates of the obstacle is less than or equal to the minimum circumscribed circle radius of the obstacle, so that each feature point of the visual reference object is located within the visual field range of the aircraft and the euclidean distance between the waypoint coordinates and the center point coordinates of the obstacle is greater than the minimum circumscribed circle radius of the obstacle.
In yet another possible design of the second aspect, the generating module is specifically configured to determine a plurality of waypoint positions through which the aircraft moves from the starting position to the target position based on the rapid expansion tree star (RRT*) algorithm, the starting point pose, the target point pose, and the principle that there is no risk of collision between the aircraft and the obstacle.
A third aspect of the present application provides a path planning apparatus, comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the processor executes the program to implement the method according to the first aspect and various possible designs of the first aspect.
A fourth aspect of the present application provides a storage medium having stored therein instructions that, when executed on a computer, cause the computer to perform the method as set forth in the first aspect and various possible designs of the first aspect.
A fifth aspect of the present application provides a chip for executing instructions, the chip being configured to perform the method according to the first aspect and various possible designs of the first aspect.
According to the path planning method, the path planning device and the storage medium provided by the application, a plurality of waypoint positions through which the aircraft moves from the starting position to the target position are first generated according to the starting point pose of the aircraft at the starting position, the target point pose at the target position and the position of the obstacle; then the plurality of waypoint positions are updated based on the preset constraint conditions and the ground image information acquired by the aircraft at the starting position, the target position and each waypoint position; and finally the target planned path along which the aircraft moves from the starting position to the target position is determined based on the target reference speed of the aircraft and the updated plurality of waypoint positions. According to the technical scheme, the target planned path can be determined only according to the starting point pose, the target point pose, the obstacle position and the preset constraint conditions; the method involves little computation, is simple to operate and has a wide application range.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiments of the application are explained by applying the path planning method to an unmanned aerial vehicle. With the rapid development of automatic control technology, unmanned aerial vehicles have quickly become a novel tool. Among the multiple flight procedures that an aircraft performs, autonomous landing is a complex one. During the autonomous landing of an aircraft, its position and attitude need to be accurately controlled, so the problem of visual servo control of the aircraft during autonomous landing needs to be studied.
The following figures take a quad-rotor aircraft as an illustrative example. FIG. 1 is an imaging schematic of a visual servo control system for a quad-rotor aircraft. As shown in fig. 1, the visual servo control system uses image information acquired by an image capture device (e.g., a camera) arranged at the lower end of the aircraft as feedback information and does not need to perform contact measurement of environmental parameters, thereby realizing closed-loop control of the unmanned aerial vehicle in terms of kinematics, dynamics, and the like.
Optionally, depending on how the error quantity is constructed from the feedback information input into the visual servo control system, visual servo control systems may be classified into position-based visual servoing (PBVS) control systems, image-based visual servoing (IBVS) control systems, and hybrid visual servoing control systems.
Illustratively, FIG. 2A is a closed-loop control schematic of a PBVS control system. As shown in fig. 2A, for the PBVS control system the external input information is the target pose and the output information is the actual pose. In this system, the actual pose output by the aircraft undergoes image acquisition, feature extraction and pose estimation to obtain an estimated pose corresponding to the actual pose. The error signal between the estimated pose and the externally input target pose is input to the visual servo controller, which adjusts its internal parameters according to the error to regenerate an actual pose, which the aircraft then outputs.
Optionally, the error signal in the PBVS control system is defined in a three-dimensional cartesian coordinate space. The cartesian coordinate system is a general term for a rectangular coordinate system and an oblique coordinate system. The control accuracy of the PBVS control system depends on the accuracy of pose estimation, and the pose estimation accuracy depends on the calibration accuracy of the camera equipment and the unmanned aerial vehicle, so that the problem of large calculation amount exists.
FIG. 2B is a schematic diagram of the closed-loop control of the IBVS control system. As shown in fig. 2B, for the IBVS control system the external input information is a target image feature and the output information is the actual pose. In this system, the actual pose output by the aircraft undergoes image acquisition and feature extraction to obtain an actual image feature corresponding to the actual pose. The error signal between the actual image feature and the externally input target image feature is input to the visual servo controller, which adjusts its internal parameters according to the error to regenerate an actual pose, which the aircraft then outputs.
Optionally, the error signal in the IBVS control system is defined in a two-dimensional image plane space. The visual servo controller in the IBVS control system is relatively complicated and lacks adaptability: to extract accurate features, an additional sensor is needed to obtain depth information, and excessive displacement can cause the unmanned aerial vehicle to vibrate, giving poor stability.
Illustratively, the error signal in the hybrid visual servoing control system includes both error information in a three-dimensional cartesian coordinate system and error information in a two-dimensional image plane space.
In summary, the visual servo control systems in the prior art have the problems of heavy computation and poor stability.
In view of the above problems, embodiments of the present application provide a path planning method, an apparatus, and a storage medium, where a plurality of waypoint positions through which an aircraft moves from a starting position to a target position are first generated according to the starting point pose of the aircraft at the starting position, the target point pose at the target position, and the position of the obstacle; then the plurality of waypoint positions are updated based on the preset constraint conditions and the ground image information acquired by the aircraft at the starting position, the target position, and each waypoint position; and finally a target planned path along which the aircraft moves from the starting position to the target position is determined based on the target reference speed of the aircraft and the updated plurality of waypoint positions. According to this technical scheme, the target planned path can be determined only according to the starting point pose, the target point pose, the obstacle position, and the preset constraint conditions; the method involves little computation, is simple to operate, and has a wide application range.
The technical solution of the present application will be described in detail below with reference to specific examples. It should be noted that, the following specific embodiments mainly illustrate the aircraft as a quad-rotor aircraft, the embodiments of the present application do not limit the types of the aircraft, and the following specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 3 is a schematic flowchart of a first embodiment of a path planning method according to an embodiment of the present application. The embodiment takes an aircraft, namely a visual servo controller on the aircraft, as an execution subject. As shown in fig. 3, the path planning method may include the following steps:
step 31: generating a plurality of waypoint positions through which the aircraft moves from the starting position to the target position according to the starting point pose of the aircraft at the starting position, the target point pose at the target position and the position of the obstacle.
For example, in this embodiment, the visual servo controller first determines external input information of the aircraft, and optionally, the external input information may include: the starting point pose (comprising a starting position and an orientation angle) of the aircraft, the target point pose (comprising a target position and an orientation angle), and the position of the obstacle (comprising the coordinates of the center point of the obstacle and the radius of a circumscribed circle of the obstacle).
Optionally, the embodiments of the present application are all premised on tiling the visual reference on the ground within the field of view of the aircraft. For example, the visual reference may be an H-shaped marker or a checkerboard marker. The embodiment of the application does not limit the expression form and size of the visual marker, and the visual marker can be determined according to actual conditions.
After the visual servo controller of the aircraft acquires the externally input starting point pose, target point pose and position of the obstacle, a series of waypoints from the starting position to the target position is generated based on a preset algorithm, such as the rapid expansion tree (RRT) algorithm or the rapid expansion tree star (RRT*) algorithm, and a plurality of waypoint positions through which the aircraft moves from the starting position to the target position are thereby determined.
For a specific implementation principle of this step, reference may be made to the following description in the embodiment shown in fig. 4, and details are not described here.
Step 32: and updating the plurality of waypoint positions based on the ground image information acquired by the aircraft at the initial position, the target position and each waypoint position and a preset constraint condition.
For example, in this embodiment, in order to ensure the accuracy of the path planning, ground image information including a visual reference object may be collected at the start position, the target position, and each waypoint position by a camera (or referred to as a visual sensor) fixed below the aircraft, the collected ground image information is used as feedback information of the path planning, and the position of each waypoint is updated, so that the aircraft meets the preset constraint condition at each waypoint position.
Alternatively, in this embodiment, the camera device may be a camera below the quad-rotor aircraft.
The specific implementation principle of this step can refer to the description in the following embodiments, which are not described herein again.
Step 33: a target planned path for the aircraft to move from the starting location to the target location is determined based on a target reference speed and a plurality of waypoint locations of the aircraft.
In this embodiment, in order to make the actual flying speed of the aircraft approach the target reference speed, the speed tracking controller may be designed to realize real-time tracking of the target reference speed by analyzing a dynamic model of the aircraft (e.g., a quad-rotor). The speed tracking controller can be a proportional-integral-derivative (PID) speed tracking controller, which is simple and easy to control, and can track the change of speed, reduce the error and complete the closed-loop control of the visual servo control system.
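A hedged sketch of such a discrete PID speed-tracking loop follows; the class name, gains and sampling period are illustrative assumptions, not values taken from the application.

```python
class PIDSpeedTracker:
    """Discrete PID controller tracking a target reference speed (illustrative)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0      # accumulated error (integral term)
        self.prev_error = 0.0    # previous error (derivative term)

    def update(self, target_speed, actual_speed):
        """One control period: return a control output driving actual toward target."""
        error = target_speed - actual_speed
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

In a real visual servo loop, `update` would be called once per control period with the measured speed, and its output mapped to the quad-rotor's thrust or attitude commands, closing the loop described above.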
According to the path planning method provided by the embodiment of the application, a plurality of waypoint positions through which the aircraft moves from the starting position to the target position are first generated according to the starting point pose of the aircraft at the starting position, the target point pose at the target position and the position of the obstacle; then the plurality of waypoint positions are updated based on the preset constraint conditions and the ground image information acquired by the aircraft at the starting position, the target position and each waypoint position; and finally the target planned path along which the aircraft moves from the starting position to the target position is determined based on the target reference speed of the aircraft and the updated plurality of waypoint positions. According to the technical scheme, the target planned path can be determined only according to the starting point pose, the target point pose, the obstacle position and the preset constraint conditions; the method involves little computation, is simple to operate and has a wide application range.
Exemplarily, on the basis of the above embodiments, fig. 4 is a schematic flow chart of a second embodiment of the path planning method provided in the embodiment of the present application. As shown in fig. 4, in this embodiment, the step 31 can be implemented by:
step 41: determining a plurality of waypoint positions through which the aircraft moves from the starting position to the target position based on the rapid expansion tree star (RRT*) algorithm, the starting point pose, the target point pose and the principle that the aircraft and the obstacle have no collision risk.
The RRT* algorithm is similar to the RRT algorithm: it searches the space mainly by generating sampling points in unexplored regions, and because no preprocessing is needed, the workload is effectively reduced, so a path between the starting point pose and the target point pose can be planned rapidly.
Fig. 5A to 5E are schematic diagrams illustrating node expansion performed by the RRT* algorithm. As shown in fig. 5A to 5E, in the present embodiment, it is assumed that the starting point is q0 and the target point is qD (not shown). Based on the RRT* algorithm, a certain number of random points qt are first generated around the starting point q0, and a node q1 is selected from the generated random points; node q1 is the one among all the random points that is closest to the target point qD. Based on the same principle, waypoints q2 and q3 can be generated, and node q4 is then generated on the basis of q3. This continues until the last generated node is the target point qD, at which point the algorithm terminates.
It is worth noting that the RRT* algorithm in the embodiment of the present application is an optimized form of the RRT algorithm: compared with RRT, RRT* adds a rewiring (Rewire) process, that is, RRT* determines whether the path formed by the generated waypoints is optimal and, if not, updates the generated waypoint positions.
Specifically, referring to fig. 5A to 5E, in the present embodiment, the specific procedure for determining a plurality of waypoint positions through which the aircraft moves from the starting position to the target position based on the RRT* algorithm is as follows:
As shown in fig. 5A, at the starting time, the rapidly expanding tree corresponding to the RRT* algorithm contains only the starting point q0. First, a certain number of random points qt are generated by random sampling around the starting point q0. Then the Euclidean distance ||qt - qD|| between each random point qt and the target point qD is calculated, and the results are compared to find the minimum value min ||qt - qD||; that is, among all random points qt, the node qr closest to the target point qD is determined.
Next, as shown in fig. 5B, in the rapidly expanding tree, the direction from the starting point q0 to the node qr is taken as the extension direction, and a node q1 is defined between q0 and qr according to a preset node step length. Then, the positions of the starting point q0, the node q1 and the obstacle are examined to determine whether the path between q0 and q1 collides with the obstacle. If not, node q1 is added to the rapidly expanding tree; if so, node q1 is discarded and the search is performed again.
Correspondingly, the rapid expansion tree diagram shown in fig. 5C can be obtained by performing expansion according to the above operation steps.
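The expansion steps of figs. 5A to 5C can be sketched as follows. This is a simplified 2-D illustration under assumed names (`extend_tree`, fixed sampling bounds, circular obstacles), not the application's implementation:

```python
import math
import random

def extend_tree(nodes, target, step, obstacles):
    """One RRT*-style expansion step: sample random points, keep the one nearest
    the target, steer from its nearest tree node by a fixed step length, and
    collision-check the new edge against circular obstacles.

    nodes:     list of (x, y) tree nodes (initially [q0])
    target:    (x, y) target point qD
    step:      preset node step length
    obstacles: list of ((cx, cy), radius) minimum circumscribed circles
    Returns the new node, or None if the edge is discarded.
    """
    # Randomly sample candidate points qt and keep the one nearest the target qD.
    samples = [(random.uniform(-10, 10), random.uniform(-10, 10)) for _ in range(20)]
    qr = min(samples, key=lambda q: math.dist(q, target))
    # Nearest existing tree node to qr.
    qn = min(nodes, key=lambda q: math.dist(q, qr))
    d = math.dist(qn, qr)
    if d == 0:
        return None
    # Steer from qn toward qr by the preset step length.
    new = (qn[0] + step * (qr[0] - qn[0]) / d, qn[1] + step * (qr[1] - qn[1]) / d)
    # Discard the node if the straight edge qn -> new collides with an obstacle.
    for center, radius in obstacles:
        if _segment_distance(qn, new, center) <= radius:
            return None
    nodes.append(new)
    return new

def _segment_distance(a, b, p):
    """Distance from point p to the line segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))
```

Repeatedly calling `extend_tree` grows the tree toward the target while discarding edges that would collide with an obstacle, mirroring figs. 5A to 5C.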
Optionally, in this embodiment, when the rapidly expanding tree contains the starting point q0 and nodes q1, q2, q3, ..., qn, a node qr (shown by the dashed circle) is determined among the generated random nodes. The distances between qr and all nearby nodes are computed, and it is determined that the distance between node q3 and node qr is the shortest; then, taking the direction from q3 to qr as the extension direction and using the preset node step length, the node qw is determined.
Optionally, as shown in fig. 5C, the set V of nodes adjacent to node qw comprises the nodes shown by the large dashed circle in the figure. All nodes within the large dashed circle are traversed, and the node nearest to qw is determined to be node q2. Therefore, the connecting line between node q3 and node qw is removed from the rapidly expanding tree, and node q2 is connected to node qw; the result is shown in fig. 5D.
In this embodiment, after node q2 is removed from the set V of nodes near node qw, the total path length passing through node qw is calculated for each node qn in the remaining set (i.e., the set obtained by removing node q2 from V): S(qw) + ||qn - qw|| is computed. If S(qw) + ||qn - qw|| is less than S(qn), then S(qn) is updated with S(qw) + ||qn - qw||; then, as shown in fig. 5E, the connecting line between node q3 and node qn is removed and node qw is connected to node qn, so that node qw replaces node q3 as the parent of node qn. The result after this processing is shown in fig. 5E. If S(qw) + ||qn - qw|| is greater than S(qn), no further action is required at this point.
That is, in the process of node calculation, if the calculated result is smaller than the total path length to the target node, the update operation needs to be performed on the optimal path sequence.
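The rewiring rule just described can be sketched as follows, a minimal sketch under assumed names, with a `cost` dictionary playing the role of the total path length S(·) and a `parent` dictionary recording the tree edges:

```python
def rewire(cost, parent, q_w, neighbors, dist):
    """RRT* rewiring: reparent a neighbor q_n to the new node q_w whenever
    routing through q_w shortens its total path length S(q_n).

    cost:      dict node -> total path length S(node) from the start
    parent:    dict node -> parent node in the tree
    q_w:       newly added node
    neighbors: nodes near q_w (the set V, excluding q_w's own parent)
    dist:      function returning the Euclidean distance between two nodes
    """
    for q_n in neighbors:
        new_cost = cost[q_w] + dist(q_w, q_n)
        if new_cost < cost[q_n]:
            # Remove the old edge parent[q_n] -> q_n and connect q_w -> q_n.
            parent[q_n] = q_w
            cost[q_n] = new_cost
```

When `new_cost` is not smaller, nothing changes, matching the "no further action" branch above.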
Optionally, when the number of iterations of the RRT* algorithm reaches a set threshold, the obtained total path length and the optimal path sequence may be returned.
Optionally, in this embodiment, as shown in fig. 4, before the step 32, the method may further include the following steps:
step 42: acquiring the starting projection coordinate, the target projection coordinate and the intermediate projection coordinate at each waypoint position of the visual reference object on the image plane when the aircraft is at the starting position, the target position and each waypoint position respectively.
In this embodiment, the autonomous landing task of a quad-rotor aircraft based on visual servo control is mainly studied. Therefore, the embodiments of the present application consider the aircraft only in a four-degree-of-freedom configuration space, i.e., the configuration space contains three position coordinates (x, y, z) and a yaw angle ψ about the centroid, while the pitch and roll angles (θ, φ) about the centroid are zero.
Optionally, with the visual reference object laid on the ground, ground images can be captured by the quad-rotor aircraft at the starting position, the target position and each waypoint position respectively, and the starting projection coordinate s, the target projection coordinate s* and the intermediate projection coordinate at each waypoint position of the visual reference object on the image plane can be calculated from the ground image information of the visual reference object in each ground image.
The path planning method provided by the embodiment of the application determines a plurality of waypoint positions through which the aircraft passes from the starting position to the target position based on the RRT* algorithm, the starting point pose and the target point pose of the aircraft, and the principle that the aircraft has no collision risk with the obstacle, and obtains the starting projection coordinate, the target projection coordinate and the intermediate projection coordinates of the visual reference object on the image plane when the aircraft is at the starting position, the target position and each waypoint position respectively. According to the technical scheme, the optimal target path can be determined based on the RRT* algorithm with little computation.
Exemplarily, on the basis of the above embodiments, fig. 6 is a schematic flow chart of a third embodiment of the path planning method provided in the embodiment of the present application. As shown in fig. 6, in the present embodiment, the step 42 can be implemented by:
step 61: and taking a visual reference object arranged on the ground as a target, and respectively acquiring ground image information comprising the visual reference object at the starting position, the target position and each waypoint position.
In the present embodiment, a ground image is captured by an image capturing device (also referred to as a vision sensor, for example, a camera) provided below the aircraft with reference to a visual reference object laid flat on the ground, and ground image information including the visual reference object is acquired.
For example, when the visual reference object is in a world coordinate system (i.e., a three-dimensional coordinate system), the position coordinates of each feature point on the visual reference object are assumed to be
M_j = [X_j, Y_j, Z_j]^T, j = 1, 2, …, N
wherein M_j is the coordinate of the jth feature point on the visual reference object, N represents the number of feature points on the visual reference object, and [X_j, Y_j, Z_j]^T denotes the transpose of the coordinate [X_j, Y_j, Z_j].
Optionally, when the aircraft photographs the visual reference object at the starting position, the target position and each waypoint position, each captured ground image may include the visual reference object, and by analyzing the position and the shape of the visual reference object in each ground image, the ground image information at each position may be acquired.
Step 62: and determining the initial projection coordinate, the target projection coordinate and each intermediate projection coordinate of the visual reference object based on the feature point coordinate and the projection coordinate conversion parameter of the visual reference object in each ground image information.
For example, assume that the projection coordinates of the visual reference object on the image plane are
s_ij = [u_ij, v_ij]^T, j = 1, 2, …, N
wherein i represents the position of the aircraft, which may be any one of the starting position, each waypoint position and the target position, j represents the jth feature point on the visual reference object, u_ij represents the transverse coordinate of the jth feature point of the visual reference object in the plane image when the aircraft is at position i, and v_ij represents the longitudinal coordinate of the jth feature point of the visual reference object in the plane image when the aircraft is at position i.
For example, in the present embodiment, based on the perspective projection principle, the perspective camera model and the internal parameters used by the imaging device of the aircraft, i.e., the projection coordinate conversion parameters, are first determined, and using the projection coordinate conversion parameters and the above feature point coordinates of the visual reference object, the starting projection coordinates, the target projection coordinates, and each intermediate projection coordinate of the visual reference object in the image plane when the aircraft is at the starting position, the target position, and each waypoint position can be determined.
For example, the position coordinates of each feature point on the visual reference object and the projection coordinates on the image plane can be converted by the following formula (1):
s_ij = K·T_i·M_j        (1)
wherein s_ij is the projection coordinate of the jth feature point on the image plane when the aircraft is at position i, M_j is the feature point coordinate, K denotes the internal parameter matrix of the image pickup apparatus, and T_i denotes the transformation parameter when the aircraft is at position i; T_i is determined by the rotation and translation between the camera coordinate system and the world coordinate system.
Optionally, for a given camera device, the internal parameter K is a known parameter, and once the position of the aircraft is determined, the transformation parameter T_i is also a known parameter.
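As a hedged illustration of formula (1), the following Python sketch projects world-frame feature points into the image plane. The intrinsic matrix K and the transformation T_i below are made-up example values (an 800 px focal length and a camera looking straight at a reference object 2 m away), not calibrated parameters:

```python
import numpy as np

def project_feature_points(K, T_i, world_points):
    """Formula (1): project each world-frame feature point M_j into the
    image plane as s = K * T_i * M_j, with a final perspective division."""
    projections = []
    for M in world_points:
        M_h = np.append(M, 1.0)   # homogeneous world coordinates
        uvw = K @ (T_i @ M_h)     # extrinsics T_i (3x4), then intrinsics K (3x3)
        projections.append(uvw[:2] / uvw[2])  # perspective division
    return np.array(projections)

# Hypothetical intrinsics: focal length 800 px, principal point (320, 240).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
# Hypothetical extrinsics: camera axes aligned with the world axes, with the
# reference object plane 2 m in front of the camera along its optical axis.
T_i = np.array([[1.0, 0.0, 0.0, 0.0],
                [0.0, 1.0, 0.0, 0.0],
                [0.0, 0.0, 1.0, 2.0]])
# Four corners of a 0.2 m square landmark lying in the world plane Z = 0.
corners = [np.array([-0.1, -0.1, 0.0]), np.array([0.1, -0.1, 0.0]),
           np.array([0.1, 0.1, 0.0]), np.array([-0.1, 0.1, 0.0])]
pixels = project_feature_points(K, T_i, corners)
```

With these assumed values, the landmark center projects to the principal point (320, 240) and the corner (0.1, 0.1, 0) projects to (360, 280), which matches evaluating formula (1) by hand.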
The path planning method provided by the embodiment of the application takes a visual reference object arranged on the ground as a target, ground image information comprising the visual reference object is respectively obtained at an initial position, a target position and each waypoint position, and the initial projection coordinate, the target projection coordinate and each middle projection coordinate of the visual reference object are determined based on the characteristic point coordinate and the projection coordinate conversion parameter of the visual reference object in each ground image information. According to the technical scheme, the initial projection coordinate, the target projection coordinate and each intermediate projection coordinate of the visual reference object can be obtained according to the position of the aircraft and the coordinates of each characteristic point in the visual reference object, and a foundation is laid for path planning of the aircraft.
Exemplarily, on the basis of the above embodiments, fig. 7 is a schematic flow chart of a fourth embodiment of the path planning method provided in the embodiment of the present application. In this embodiment, the preset constraint condition may include: a view constraint and a collision avoidance constraint.
Optionally, as shown in fig. 7, in this embodiment, the step 32 may be implemented by:
step 71: and judging whether the aircraft meets the visual field constraint condition and the anti-collision constraint condition at each waypoint position simultaneously according to the ground image information acquired at each waypoint position.
Optionally, in this embodiment, for the ground image information acquired at each waypoint position, it is determined whether each feature point of the visual reference object is located within the visual field range of the aircraft and whether the Euclidean distance between the waypoint coordinate and the center point coordinate of the obstacle is greater than the minimum circumscribed circle radius of the obstacle.
Illustratively, fig. 8A and 8B are views of the field of view of a quad-rotor aircraft. In order to ensure that the aircraft satisfies the visual field constraint condition at each waypoint position, as shown in fig. 8A and 8B, the field of view (FOV) constraint of the camera is checked first, i.e., whether the projection coordinates s_ij of the visual reference object on the image plane all lie within the visual field range of the camera. The specific constraint can be expressed by the following inequality (2):
u_min ≤ u_ij ≤ u_max,  v_min ≤ v_ij ≤ v_max,  j = 1, 2, …, N        (2)
wherein [u_min, u_max] and [v_min, v_max] represent the boundary values of the FOV constraint, and N represents the number of feature points on the visual reference object. For example, in the present embodiment, as shown in fig. 8A, N = 4.
However, the above visual field constraint can only ensure that each feature point on the visual reference object is located in front of the camera. During the movement of the camera, i.e., the movement of the aircraft, a feature point may also be blocked by an obstacle. This problem can be solved by requiring that the line between the optical center of the image pickup device and each feature point is not blocked by the obstacle, i.e., that the distance from the edge of the obstacle to this line of sight is greater than 0, as shown in fig. 8B.
Further, the collision avoidance constraint requires that the planned path of the camera not enter the obstacle area. In this embodiment, a number of control points may be introduced along the path to force the camera to remain within the safe region, at a sufficiently large distance from the obstacle. For example, the control points may be the waypoints determined above, such that the aircraft is sufficiently far from the center position of the obstacle at each waypoint position.
In this embodiment, assume that the waypoint coordinate of the ith waypoint is t_i. In this case, the positional relationship between the waypoint and the obstacle may be controlled to satisfy the following formula (3):
||t_i − O_o|| > r_a        (3)
wherein t_i = (x_i, y_i, z_i)^T is the coordinate of the aircraft at position i, O_o = (x_o, y_o, z_o)^T represents the coordinate of the center point of the obstacle, and r_a represents the minimum circumscribed circle radius of the obstacle. That is, in order to prevent a collision between the aircraft and the obstacle, the Euclidean distance between the coordinate of the aircraft at the waypoint position and the coordinate of the center point of the obstacle needs to be greater than the minimum circumscribed circle radius of the obstacle.
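The collision-avoidance test of formula (3) can be sketched directly; the waypoint, obstacle center and radius in the usage example are made-up values:

```python
import math

def satisfies_collision_avoidance(t_i, O_o, r_a):
    """Formula (3): the Euclidean distance ||t_i - O_o|| between the waypoint
    and the obstacle centre must exceed the minimum circumscribed radius r_a."""
    return math.dist(t_i, O_o) > r_a
```

A waypoint well away from the obstacle passes, while one inside the circumscribed sphere fails.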
Step 72: for waypoint positions that do not satisfy the above visual field constraint and/or collision avoidance constraint, the waypoint positions are updated so that the aircraft satisfies the visual field constraint and the collision avoidance constraint simultaneously at all waypoint positions.
In this embodiment, when the aircraft meets both of the above conditions at each waypoint position, the waypoint position does not need to be updated. Otherwise, if a feature point of the visual reference object is not located within the visual field range of the aircraft and/or the Euclidean distance between the waypoint coordinate and the center point coordinate of the obstacle is less than or equal to the minimum circumscribed circle radius of the obstacle, the waypoint position is updated so that all the feature points of the visual reference object are located within the visual field range of the aircraft and the Euclidean distance between the waypoint coordinate and the center point coordinate of the obstacle is greater than the minimum circumscribed circle radius of the obstacle.
For example, when the feature point of the visual reference object is not within the visual field range of the aircraft, it indicates that the aircraft does not satisfy the visual field constraint condition, and if the aircraft is at a certain waypoint position, the euclidean distance between the waypoint coordinate corresponding to the waypoint position and the center point coordinate of the obstacle is less than or equal to the minimum circumscribed circle radius of the obstacle, which indicates that the aircraft does not satisfy the collision avoidance constraint condition. When the aircraft does not meet any of the above conditions at a certain waypoint location, the determined waypoint location needs to be updated. The updated principle is that the aircraft needs to meet the above two conditions at the waypoint position simultaneously.
According to the path planning method provided by the embodiment of the application, whether the aircraft meets the visual field constraint condition and the anti-collision constraint condition at each waypoint position or not is judged according to the ground image information acquired at each waypoint position, and the waypoint position is updated for the waypoint position which does not meet the visual field constraint condition and/or the anti-collision constraint condition, so that the aircraft meets the visual field constraint condition and the anti-collision constraint condition at all waypoint positions. In the technical scheme, the aircraft is controlled to simultaneously meet the visual field constraint condition and the anti-collision constraint condition, and a foundation is laid for ensuring the safe landing of the aircraft.
For example, on the basis of any one of the above embodiments, in order to enable the aircraft to land at the target position, the technical solution of this embodiment requires that the error between the starting projection coordinate corresponding to each feature point of the visual reference object and the target projection coordinate, and the error between each intermediate projection coordinate and the target projection coordinate, satisfy an exponential convergence condition. Therefore, in this embodiment, the path planning method may further include the following step:
and controlling the error between the starting projection coordinate and the target projection coordinate, and the error between each intermediate projection coordinate and the target projection coordinate, to satisfy an exponential convergence condition.
Optionally, in this embodiment, the control target of the visual servo controller is to minimize the error between the current projection coordinate s and the target projection coordinate s*, that is, to satisfy the following formula (4), where the current projection coordinate s may be any one of the starting projection coordinate and the intermediate projection coordinates:
e = s − s*        (4)
Optionally, when the visual reference object in this embodiment is a rectangular landmark lying on the ground, the visual reference object includes four feature points on the image plane. Therefore, the feature vector corresponding to the current projection coordinate may be defined as s = [x_1, y_1, x_2, y_2, x_3, y_3, x_4, y_4]^T, which contains the feature vectors of the four feature points of the rectangular landmark at the current camera pose. Likewise, the target projection coordinate may be defined as s* = [x*_1, y*_1, x*_2, y*_2, x*_3, y*_3, x*_4, y*_4]^T, which contains the feature vectors of the four feature points of the rectangular landmark at the target camera pose.
Alternatively, to ensure that the error e converges exponentially, the error vector e must satisfy the differential equation shown in the following formula (5):
ė = −λ·e        (5)
wherein ė represents the time derivative of the error vector e and λ is a positive convergence gain.
In this embodiment, the safe landing of the aircraft can be ensured by controlling the error between the starting projection coordinate and the target projection coordinate, and the error between each intermediate projection coordinate and the target projection coordinate, to satisfy the exponential convergence condition.
Further, in the present embodiment, it follows from the existing visual servoing literature that the target flight speed of the aircraft (i.e., the camera speed) and the projection coordinates of the feature points of the visual reference object can be related by the following formula (6):
ṡ = L·V_c        (6)
wherein L = L(s, Z_c) is the interaction matrix; since the visual reference object includes four feature points, L satisfies L ∈ R^(8×6), and it consists of a stack of the 2×6 interaction matrices of the four feature points.
From equation (5) and equation (6), equation (7) can be derived, and accordingly, equation (8) with respect to the target reference velocity is derived:
L·V_c = −λ·e        (7)
V_c = [T_x, T_y, T_z, ω_x, ω_y, ω_z]^T = −λ·L^+·e        (8)
wherein L^+ ∈ R^(6×8) is the pseudo-inverse matrix of L, (T_x, T_y, T_z) represents the translational velocity of the aircraft along the x, y and z axes, and (ω_x, ω_y, ω_z) represents the angular velocity of the aircraft about the x, y and z axes.
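As a hedged sketch of formulas (6) to (8), the following Python code stacks the classical 2×6 point-feature image Jacobians into the 8×6 interaction matrix L and computes the velocity command V_c = −λ·L⁺·e. The normalized feature positions and the depth Z are made-up values; in the usage example the current square is a uniformly shrunk copy of the desired one, so the command should be a pure forward translation T_z:

```python
import numpy as np

def interaction_matrix(points, Z):
    """Stack the classical 2x6 image Jacobian of each normalized image point
    (x, y) at depth Z, giving the 8x6 matrix L of formula (6) for 4 points."""
    rows = []
    for x, y in points:
        rows.append([-1.0/Z, 0.0, x/Z, x*y, -(1.0 + x*x), y])
        rows.append([0.0, -1.0/Z, y/Z, 1.0 + y*y, -x*y, -x])
    return np.array(rows)

def ibvs_velocity(L, e, lam=0.5):
    """Formula (8): camera velocity command V_c = -lambda * pinv(L) @ e."""
    return -lam * np.linalg.pinv(L) @ e

# Current and desired normalized positions of the four feature points
# (made-up values: the desired square is 20% larger than the current one).
current = [(-0.1, -0.1), (0.1, -0.1), (0.1, 0.1), (-0.1, 0.1)]
desired = [(-0.12, -0.12), (0.12, -0.12), (0.12, 0.12), (-0.12, 0.12)]
e = np.array(current).ravel() - np.array(desired).ravel()   # e = s - s*
V_c = ibvs_velocity(interaction_matrix(current, Z=2.0), e)
```

Because the error here is purely radial, the resulting command moves the camera forward along its optical axis (positive T_z) while the other five velocity components stay at zero, which is the expected behavior of the control law.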
Thus, in this embodiment, based on the target reference speed and the determined positions of the waypoints, a target planned path of the aircraft from the starting position to the target position may be obtained.
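As an illustrative sketch only (the application does not prescribe this interpolation), a time-parameterized path can be generated by moving at the target reference speed along the straight segments between consecutive updated waypoints; the constant-speed assumption is a simplification:

```python
import math

def plan_path(waypoints, speed, dt=0.1):
    """Sample a path at intervals of speed * dt along the straight segments
    between consecutive waypoints; constant speed is an assumed simplification."""
    path = [waypoints[0]]
    for wp in waypoints[1:]:
        # Advance toward the next waypoint in steps of length speed * dt.
        while math.dist(path[-1], wp) > speed * dt:
            p = path[-1]
            d = math.dist(p, wp)
            path.append(tuple(p[k] + speed * dt * (wp[k] - p[k]) / d
                              for k in range(3)))
        path.append(wp)
    return path
```

The sampled path starts at the first waypoint, ends exactly at the last one, and no step exceeds the distance covered at the reference speed in one time step.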
For example, fig. 9A is a schematic diagram of the position relationship between the aircraft and the visual reference object in the three-dimensional space at the starting position and the target position. FIG. 9B is a schematic projection diagram of the characteristic points of the visual reference object when the aircraft is at the starting position and the target position in the planar image. FIG. 9C is a schematic diagram of the path of each feature point of the visual reference object when the aircraft is in three-dimensional space. FIG. 9D is a schematic diagram of the path of each feature point of the visual reference object when the aircraft is in the plane image.
Referring to fig. 9A and 9B, when the aircraft moves from the starting position to the target position, the shapes of the visual reference objects photographed by the cameras are relatively regular. Referring to fig. 9C and 9D, the aircraft can move from the start position to the target position while avoiding the obstacle based on the determined waypoint positions, and fig. 9D shows path trajectories corresponding to the four feature points of the visual reference object, respectively.
In summary, according to the path planning method provided by this embodiment, on the premise that the path planning of the quad-rotor aircraft based on the RRT* algorithm satisfies the visual field constraint and the obstacle avoidance constraint, the quad-rotor aircraft can autonomously land without depending on a GPS signal. In particular for the large-displacement phenomenon in the landing process, the algorithm has a faster operation speed and higher efficiency, and increases the anti-interference performance of the system.
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Fig. 10 is a schematic structural diagram of a first embodiment of a path planning apparatus according to an embodiment of the present application. The path planning device can be integrated in an aircraft or can be realized by the aircraft. As shown in fig. 10, the apparatus may include: a generation module 101, an update module 102 and a determination module 103.
The generating module 101 is configured to generate a plurality of waypoint positions where the aircraft moves from the starting position to the target position according to a starting point pose of the aircraft at the starting position, a target point pose at the target position, and a position of the obstacle;
the updating module 102 is configured to update the plurality of waypoint positions based on the ground image information and preset constraint conditions acquired by the aircraft at the starting position, the target position, and each waypoint position;
the determining module 103 is configured to determine a target planned path for the aircraft to move from the starting location to the target location based on the target reference speed of the aircraft and the updated plurality of waypoint locations.
For example, on the basis of the above embodiments, fig. 11 is a schematic structural diagram of a second embodiment of the path planning apparatus provided in the embodiment of the present application. As shown in fig. 11, the apparatus may further include: an acquisition module 111.
The obtaining module 111 is configured to obtain a starting projection coordinate, a target projection coordinate, and an intermediate projection coordinate of the visual reference object on the image plane at the starting position, the target position, and each waypoint position of the aircraft respectively before the updating module 102 updates the plurality of waypoint positions based on the ground image information and preset constraint conditions obtained by the aircraft at the starting position, the target position, and each waypoint position.
Optionally, the obtaining module 111 is configured to obtain ground image information including the visual reference object at the starting position, the target position and each waypoint position by targeting the visual reference object disposed on the ground, and determine the starting projection coordinate, the target projection coordinate and each intermediate projection coordinate of the visual reference object based on the feature point coordinate and the projection coordinate conversion parameter of the visual reference object in each ground image information.
Illustratively, as shown in fig. 11, the apparatus further includes: a control module 112.
The control module 112 is configured to, after the obtaining module 111 obtains the projection coordinates, control the error between the starting projection coordinate and the target projection coordinate, and the error between each intermediate projection coordinate and the target projection coordinate, to satisfy an exponential convergence condition.
For example, on the basis of the above embodiments, fig. 12 is a schematic structural diagram of a third embodiment of the path planning apparatus provided in the embodiment of the present application. As shown in fig. 12, in the present embodiment, the preset constraint conditions include: a view constraint and a collision avoidance constraint.
Optionally, the update module 102 includes: a judging unit 121 and an updating unit 122.
The determining unit 121 is configured to determine, according to the ground image information acquired at each waypoint position, whether the aircraft satisfies the view constraint condition and the collision avoidance constraint condition at each waypoint position at the same time;
the updating unit 122 is configured to update the waypoint positions for waypoint positions that do not satisfy the sight field constraint and/or the collision avoidance constraint so that the aircraft satisfies the sight field constraint and the collision avoidance constraint at all waypoint positions simultaneously.
In a possible implementation manner of this embodiment, the determining unit 121 is specifically configured to determine, for the ground image information acquired at each waypoint position, whether each feature point of a visual reference object arranged on the ground is located within a visual field range of the aircraft and whether an euclidean distance between a waypoint coordinate corresponding to the waypoint position and a center point coordinate of the obstacle is greater than a minimum circumscribed circle radius of the obstacle;
the updating unit 122 is specifically configured to update the waypoint position when the feature points of the visual reference object are not located in the visual field range of the aircraft and/or when the euclidean distance between the waypoint coordinates and the center point coordinates of the obstacle is less than or equal to the minimum circumscribed circle radius of the obstacle, so that each feature point of the visual reference object is located in the visual field range of the aircraft and the euclidean distance between the waypoint coordinates and the center point coordinates of the obstacle is greater than the minimum circumscribed circle radius of the obstacle.
For example, in any of the embodiments of the present application, the generating module 101 is specifically configured to determine a plurality of waypoint positions where the aircraft moves from the starting position to the target position based on a rapidly-exploring random tree (RRT) algorithm, the starting point pose, the target point pose, and the principle that there is no collision risk between the aircraft and the obstacle.
The apparatus provided in the embodiment of the present application may be used to execute the method in the embodiments shown in fig. 3 to fig. 7, and the implementation principle and the technical effect are similar, which are not described herein again.
It should be noted that the division of the modules of the above apparatus is only a logical division, and the actual implementation may be wholly or partially integrated into one physical entity, or may be physically separated. And these modules can be realized in the form of software called by processing element; or may be implemented entirely in hardware; and part of the modules can be realized in the form of calling software by the processing element, and part of the modules can be realized in the form of hardware. For example, the determining module may be a processing element separately set up, or may be implemented by being integrated in a chip of the apparatus, or may be stored in a memory of the apparatus in the form of program code, and the function of the determining module is called and executed by a processing element of the apparatus. Other modules are implemented similarly. In addition, all or part of the modules can be integrated together or can be independently realized. The processing element described herein may be an integrated circuit having signal processing capabilities. In implementation, each step of the above method or each module above may be implemented by an integrated logic circuit of hardware in a processor element or an instruction in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as one or more application specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field programmable gate arrays (FPGAs), among others. For another example, when some of the above modules are implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a central processing unit (CPU) or another processor that can call program code. As another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, from one website, computer, server, or data center to another website, computer, server, or data center via a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) manner.
Fig. 13 is a schematic structural diagram of a fourth embodiment of a path planning apparatus according to an embodiment of the present application. As shown in fig. 13, the apparatus may include: the system comprises a processor 131, a memory 132, a communication interface 133 and a system bus 134, wherein the memory 132 and the communication interface 133 are connected with the processor 131 through the system bus 134 and complete mutual communication, the memory 132 is used for storing computer execution instructions, the communication interface 133 is used for communicating with other devices, and the processor 131 implements the scheme in the embodiments shown in fig. 3 to 7 when executing the computer program.
The system bus mentioned in fig. 13 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The system bus may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus. The communication interface is used for realizing communication between the database access device and other equipment (such as a client, a read-write library and a read-only library). The memory may comprise Random Access Memory (RAM) and may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
The processor may be a general-purpose processor, including a central processing unit CPU, a Network Processor (NP), and the like; but also a digital signal processor DSP, an application specific integrated circuit ASIC, a field programmable gate array FPGA or other programmable logic device, discrete gate or transistor logic, discrete hardware components.
Optionally, an embodiment of the present application further provides a storage medium, where instructions are stored in the storage medium, and when the storage medium is run on a computer, the storage medium causes the computer to execute the method according to the embodiment shown in fig. 3 to 7.
Optionally, an embodiment of the present application further provides a chip for executing the instruction, where the chip is configured to execute the method in the embodiment shown in fig. 3 to 7.
The embodiment of the present application further provides a program product, where the program product includes a computer program, where the computer program is stored in a storage medium, and the computer program can be read from the storage medium by at least one processor, and when the computer program is executed by the at least one processor, the method of the embodiment shown in fig. 3 to 7 can be implemented.
In the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship; in the formula, the character "/" indicates that the preceding and following related objects are in a relationship of "division". "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
It is to be understood that the various numerical references referred to in the embodiments of the present application are merely for descriptive convenience and are not intended to limit the scope of the embodiments of the present application.
It should be understood that, in the embodiment of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiment of the present application.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.