CN113761647A - Simulation method and system of unmanned cluster system - Google Patents

Simulation method and system of unmanned cluster system

Info

Publication number
CN113761647A
CN113761647A (application CN202110883152.0A)
Authority
CN
China
Prior art keywords
vehicle system
unmanned
unmanned vehicle
simulation
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110883152.0A
Other languages
Chinese (zh)
Other versions
CN113761647B (en)
Inventor
罗晓亮
冯运铎
梁秀兵
马燕琳
王浩旭
燕琦
王晓晶
李陈
尹建程
查长流
胡振峰
刘华鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National Defense Technology Innovation Institute PLA Academy of Military Science
China North Computer Application Technology Research Institute
Original Assignee
National Defense Technology Innovation Institute PLA Academy of Military Science
China North Computer Application Technology Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National Defense Technology Innovation Institute PLA Academy of Military Science and China North Computer Application Technology Research Institute
Priority to CN202110883152.0A
Publication of CN113761647A
Application granted
Publication of CN113761647B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/10Geometric CAD
    • G06F30/15Vehicle, aircraft or watercraft design
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/20Design optimisation, verification or simulation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G06Q10/047Optimisation of routes or paths, e.g. travelling salesman problem
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05Geographic models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Economics (AREA)
  • General Engineering & Computer Science (AREA)
  • Marketing (AREA)
  • Computational Mathematics (AREA)
  • Tourism & Hospitality (AREA)
  • Quality & Reliability (AREA)
  • Remote Sensing (AREA)
  • Operations Research (AREA)
  • Computer Graphics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Business, Economics & Management (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Game Theory and Decision Science (AREA)
  • Development Economics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Navigation (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a simulation method of an unmanned cluster system, wherein the unmanned cluster system comprises an unmanned vehicle system and an unmanned aerial vehicle system, and the method comprises the following steps: constructing physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system; constructing a three-dimensional simulation scene dynamic map based on the sensing data of the unmanned vehicle system; and carrying out target tracking on the physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system based on the three-dimensional simulation scene dynamic map. The invention also discloses a simulation system of the unmanned cluster system. The invention has the beneficial effects that: the simulation method can realize the functional simulation of target identification, self-positioning, mapping, tracking, obstacle avoidance and the like of the unmanned cluster system, so as to realize the collaborative simulation of the unmanned cluster system.

Description

Simulation method and system of unmanned cluster system
Technical Field
The invention relates to the technical field of unmanned systems, in particular to a simulation method and system of an unmanned cluster system.
Background
At present, in the technical field of unmanned systems, three-dimensional collaborative simulation of unmanned cluster systems generally cannot be realized, so that simulation results cannot be applied to actual unmanned cluster systems to improve their collaborative combat capability.
Disclosure of Invention
In order to solve the above problems, an object of the present invention is to provide a simulation method and system for an unmanned cluster system, which can implement functional simulations of target identification, self-positioning, mapping, tracking, obstacle avoidance, etc. of the unmanned cluster system, so as to implement collaborative simulation of the unmanned cluster system.
The invention provides a simulation method of an unmanned cluster system, wherein the unmanned cluster system comprises an unmanned vehicle system and an unmanned aerial vehicle system, and the method comprises the following steps:
constructing physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system;
constructing a three-dimensional simulation scene dynamic map based on the sensing data of the unmanned vehicle system;
and carrying out target tracking on the physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system based on the three-dimensional simulation scene dynamic map.
As a further improvement of the present invention, constructing physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system comprises:
constructing a physical simulation model of the unmanned aerial vehicle system based on a V-REP tool;
and constructing a physical simulation model of the unmanned vehicle system based on a Gazebo tool.
As a further improvement of the present invention, a three-dimensional simulation scene dynamic map is constructed based on the sensing data of the unmanned vehicle system, including:
constructing a semi-dense map based on the sensing data of the unmanned vehicle system;
and constructing an environment dense map by fusing the semi-dense map based on the three-dimensional laser radar point cloud data of the unmanned vehicle system.
As a further improvement of the present invention, a semi-dense map is constructed based on the sensing data of the unmanned vehicle system, comprising:
constructing a semi-dense map based on binocular vision sensing data of the unmanned vehicle system;
fusing gyroscope and inertial navigation sensing data of the unmanned vehicle system to the semi-dense map, and performing inter-frame motion estimation of the unmanned vehicle system;
and based on a key frame obtained by inter-frame motion estimation of the unmanned vehicle system, correcting the semi-dense map in real time through loop detection.
As a further improvement of the present invention, when the unmanned vehicle system performs inter-frame motion estimation, the method comprises:

estimating the incremental motion $T_{k,k-1}$ of the unmanned vehicle system that minimizes the photometric error $\delta I(T, u)$:

$$T_{k,k-1} = \arg\min_{T} \frac{1}{2} \sum_{u} \left\| \delta I(T, u) \right\|^2$$

where $T_{k,k-1}$ is the pose between the previous frame $I_{k-1}$ and the current frame $I_k$ of the images captured by the camera of the unmanned vehicle system, $I$ denotes the image, $\delta I$ is the photometric residual, i.e. the intensity difference of the same three-dimensional point $\rho_u$ observed in both the previous frame $I_{k-1}$ and the current frame $I_k$, and $u$ is the center position of the image block.
As a further development of the invention, estimating the incremental motion $T_{k,k-1}$ of the unmanned vehicle system that minimizes the photometric error $\delta I(T, u)$ comprises:

S1, determining the pose change $T_{k,k-1}$ between the previous frame $I_{k-1}$ and the current frame $I_k$;

S2, calculating the pixel intensity difference of the three-dimensional point $\rho_u$ by reprojection into the previous frame $I_{k-1}$:

$$\delta I(T, u) = I_k\!\left(\pi\!\left(T_{CB}\, T_{k,k-1}\, T_{BC}\, \rho_u\right)\right) - I_{k-1}(u)$$

where $T_{BC}$ is the transformation matrix between the body coordinate system B and the camera coordinate system C of the unmanned vehicle system, $\pi(\cdot)$ is the camera projection, $\rho_u$ is the three-dimensional point whose projection in the previous frame $I_{k-1}$ is $u$, and $u$ is the center position of the image block in the previous frame $I_{k-1}$;

S3, performing image feature alignment in the current frame $I_k$: calculating the pixel intensity difference between the image block $P$ in $I_k$ centered on the projected feature position $u'$ and the reference image block in the frame $I_r$ in which the same feature was first observed;

S4, correcting the projected feature position $u'$ to obtain the optimized feature position $u'^*$ in the current frame $I_k$;

for point features:

$$u'^* = u' + \delta u^*, \qquad \delta u^* = \arg\min_{\delta u} \frac{1}{2} \sum_{\Delta u \in P} \left\| I_k(u' + \delta u + \Delta u) - A \cdot I_r(u + \Delta u) \right\|^2$$

where $\Delta u$ is the iteration variable of the sum calculated over the image block $P$, $\delta u$ is the reprojection error, $T_{CB}$ is the transformation matrix between the camera coordinate system C and the body coordinate system B of the unmanned vehicle system, $T_{kr}$ is the pose between the current frame $I_k$ and the frame $I_r$, and $A$ is a constant;

for line segment features:

$$u'^* = u' + \delta u^* \cdot n, \qquad \delta u^* = \arg\min_{\delta u} \frac{1}{2} \sum_{\Delta u \in P} \left\| I_k(u' + \delta u \cdot n + \Delta u) - A \cdot I_r(u + \Delta u) \right\|^2$$

where $\Delta u$ is the iteration variable of the sum calculated over the image block $P$, $\delta u$ is the reprojection error, $A$ is a constant, and $n$ is the normal direction of the line segment feature;

S5, optimizing the camera poses and landmark positions $\chi = \{T_{kW}, \rho_i\}$ by minimizing the sum of squares of the reprojection errors:

$$\chi^* = \arg\min_{\chi} \frac{1}{2} \sum_{k \in K} \left( \sum_{i \in \mathcal{P}_k} \left\| u_{ik} - \pi\!\left(T_{kW}\, \rho_i\right) \right\|^2 + \sum_{j \in \mathcal{L}_k} \left\| u_{jk} - \pi\!\left(T_{kW}\, \rho_j\right) \right\|^2 \right)$$

where $K$ is the set of all key frames in the semi-dense map, $\mathcal{P}_k$ is the set of all landmarks corresponding to point features in the current frame $I_k$, $\mathcal{L}_k$ is the set of all landmarks corresponding to line segment features in $I_k$, and $T_{kW}$ is the transformation matrix between the current frame $I_k$ and the world frame $W$.
As a further improvement of the present invention, the loop detection comprises:
extracting key point features and key line segment features in the key frames, constructing a bag-of-words model for the key point features and the line segment features, and storing the bag-of-words model in a data dictionary;
when the camera of the unmanned vehicle system moves, calculating a similarity score value for the current key frame, and searching for an image most similar to the current key frame in the data dictionary;
and performing real-time correction on the semi-dense map based on the most similar image.
As a further improvement of the present invention, calculating the similarity score of the current key frame and searching the data dictionary for the image most similar to the current key frame when the camera of the unmanned vehicle system moves comprises:

determining the number of key points $n_k$ and the number of key line segments $n_l$ of the current key frame;

calculating the dispersion value $d_k$ of the key points from the key point coordinates $(x, y)$, and calculating the dispersion value $d_l$ of the key line segments from the midpoint coordinates of the key line segments;

calculating the similarity score $A_t$ of the obtained image frame based on the weight $a_k$ of the key point feature $S_k$ and the weight $a_l$ of the key line segment feature $S_l$:

$$A_t = a_k \left( \frac{n_k}{n_k + n_l} + \frac{d_k}{d_k + d_l} \right) S_k + a_l \left( \frac{n_l}{n_k + n_l} + \frac{d_l}{d_k + d_l} \right) S_l$$
As a further development of the invention, calculating the dispersion value $d_k$ of the key points from the key point coordinates $(x, y)$ comprises:

calculating the sum of the variances of the x coordinates and the y coordinates;

taking the square root of the sum of the variances as the dispersion value $d_k$ of the key points, i.e. $d_k = \sqrt{\operatorname{Var}(x) + \operatorname{Var}(y)}$.
As a further improvement of the invention, the method for constructing the environment dense map by fusing the semi-dense map based on the three-dimensional laser radar point cloud data of the unmanned vehicle system comprises the following steps:
based on the three-dimensional laser point cloud data of the unmanned vehicle system, fusing a gyroscope and inertial navigation sensing data of the unmanned vehicle system to obtain a point cloud key frame;
obtaining the pose of the unmanned vehicle system through inter-frame motion estimation of the unmanned vehicle system, and translating and rotating the point cloud key frame according to the pose of the unmanned vehicle system to construct a three-dimensional point cloud model;
and fusing the semi-dense map and the three-dimensional point cloud model to construct an environment dense map.
As a further improvement of the present invention, the obtaining of the pose of the unmanned vehicle system through inter-frame motion estimation of the unmanned vehicle system, and the translating and rotating of the point cloud key frame according to the pose of the unmanned vehicle system to construct a three-dimensional point cloud model includes:
establishing a three-dimensional grid map based on the size of the unmanned vehicle system by taking the coordinates of the starting point of the unmanned vehicle system as the origin, the northward direction as the positive direction of an X axis, the eastward direction as the positive direction of a Y axis and the upward direction as the positive direction of a Z axis;
translating and rotating the obtained point cloud of the obstacle according to the current pose of the unmanned vehicle system, filling the translated and rotated point cloud into corresponding grids, and recording the number S of the point cloud in each grid;
updating each grid in a local range centered on the unmanned vehicle system, the updated point count of each grid being $S^{+} = \alpha S^{-} + S$, where $S^{-}$ is the point count of the grid before the update, $S^{+}$ is the point count of the grid after the update, $\alpha$ is a forgetting coefficient between 0 and 1, and $S$ is the current point count ($S = 0$ for grids containing no obstacle);

thresholding the updated point count $S^{+}$: if $S^{+}$ reaches the preset value, the grid is determined to be an obstacle; otherwise, the grid is determined to be passable space.
As a further improvement of the present invention, the target tracking of the physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system based on the three-dimensional simulation scene dynamic map includes:
establishing a 3D route map according to the size of the unmanned aerial vehicle system in the three-dimensional simulation scene dynamic map;
and searching for an optimal path through the A* algorithm based on the 3D roadmap.
As a further improvement of the present invention, the building a 3D route map according to the size of the unmanned aerial vehicle system in the three-dimensional simulation scene dynamic map includes:
randomly selecting a sampling point in the three-dimensional simulation scene dynamic map, retaining the sampling point if it is not located inside an obstacle, and repeating this step to establish a sampling point diagram from the retained sampling points;

connecting the sampling points based on the sampling point diagram, canceling any connecting line that collides with an obstacle, and repeating this step to complete the connection of all sampling points;

and connecting at least one adjacent point around each sampling point, retaining the corresponding connecting line if the sampling point and the adjacent point can be connected, and repeating this step to complete the construction of the 3D route map.
As a further improvement of the present invention, after a certain period of time has elapsed or each time the unmanned aerial vehicle system has moved a certain distance, the 3D roadmap is updated according to the new obstacle environment: new sampling points are added to the 3D roadmap, or sampling points or connecting lines that collide with obstacles are deleted.
As a further improvement of the present invention, searching for an optimal path through the A* algorithm based on the 3D roadmap includes:

calculating, through the A* algorithm, for each sampling point in the 3D route map, the estimated cost of the path from the given starting point of the unmanned aerial vehicle system through that sampling point to the end point;

sorting the estimated values, removing the sampling point with the minimum estimated value, and adding the points surrounding the removed sampling point as expansion points;

and cycling in turn until the end point is found.
The invention also provides a simulation system of the unmanned cluster system, wherein the unmanned cluster system comprises an unmanned vehicle system and an unmanned aerial vehicle system, and the system comprises:
the simulation model establishing module is used for establishing physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system;
the simulation map building module is used for building a three-dimensional simulation scene dynamic map based on the sensing data of the unmanned vehicle system;
and the simulation module is used for performing target tracking on the physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system based on the three-dimensional simulation scene dynamic map.
As a further improvement of the invention, the simulation model building module is configured to:
constructing a physical simulation model of the unmanned aerial vehicle system based on a V-REP tool;
and constructing a physical simulation model of the unmanned vehicle system based on a Gazebo tool.
As a further refinement of the present invention, the simulation map building module is configured to:
constructing a semi-dense map based on the sensing data of the unmanned vehicle system;
and constructing an environment dense map by fusing the semi-dense map based on the three-dimensional laser radar point cloud data of the unmanned vehicle system.
As a further refinement of the present invention, the simulation map building module is configured to:
constructing a semi-dense map based on binocular vision sensing data of the unmanned vehicle system;
fusing gyroscope and inertial navigation sensing data of the unmanned vehicle system to the semi-dense map, and performing inter-frame motion estimation of the unmanned vehicle system;
and based on a key frame obtained by inter-frame motion estimation of the unmanned vehicle system, correcting the semi-dense map in real time through loop detection.
As a further refinement of the present invention, the simulation map building module is configured to:
estimating the incremental motion $T_{k,k-1}$ of the unmanned vehicle system that minimizes the photometric error $\delta I(T, u)$:

$$T_{k,k-1} = \arg\min_{T} \frac{1}{2} \sum_{u} \left\| \delta I(T, u) \right\|^2$$

where $T_{k,k-1}$ is the pose between the previous frame $I_{k-1}$ and the current frame $I_k$ of the images captured by the camera of the unmanned vehicle system, $I$ denotes the image, $\delta I$ is the photometric residual, i.e. the intensity difference of the same three-dimensional point $\rho_u$ observed in both the previous frame $I_{k-1}$ and the current frame $I_k$, and $u$ is the center position of the image block.
As a further refinement of the present invention, the simulation map building module is configured to:
determining the pose change $T_{k,k-1}$ between the previous frame $I_{k-1}$ and the current frame $I_k$;

calculating the pixel intensity difference of the three-dimensional point $\rho_u$ by reprojection into the previous frame $I_{k-1}$:

$$\delta I(T, u) = I_k\!\left(\pi\!\left(T_{CB}\, T_{k,k-1}\, T_{BC}\, \rho_u\right)\right) - I_{k-1}(u)$$

where $T_{BC}$ is the transformation matrix between the body coordinate system B and the camera coordinate system C of the unmanned vehicle system, $\pi(\cdot)$ is the camera projection, $\rho_u$ is the three-dimensional point whose projection in the previous frame $I_{k-1}$ is $u$, and $u$ is the center position of the image block in the previous frame $I_{k-1}$;

performing image feature alignment in the current frame $I_k$: calculating the pixel intensity difference between the image block $P$ in $I_k$ centered on the projected feature position $u'$ and the reference image block in the frame $I_r$ in which the same feature was first observed;

correcting the projected feature position $u'$ to obtain the optimized feature position $u'^*$ in the current frame $I_k$;

for point features:

$$u'^* = u' + \delta u^*, \qquad \delta u^* = \arg\min_{\delta u} \frac{1}{2} \sum_{\Delta u \in P} \left\| I_k(u' + \delta u + \Delta u) - A \cdot I_r(u + \Delta u) \right\|^2$$

where $\Delta u$ is the iteration variable of the sum calculated over the image block $P$, $\delta u$ is the reprojection error, $T_{CB}$ is the transformation matrix between the camera coordinate system C and the body coordinate system B of the unmanned vehicle system, $T_{kr}$ is the pose between the current frame $I_k$ and the frame $I_r$, and $A$ is a constant;

for line segment features:

$$u'^* = u' + \delta u^* \cdot n, \qquad \delta u^* = \arg\min_{\delta u} \frac{1}{2} \sum_{\Delta u \in P} \left\| I_k(u' + \delta u \cdot n + \Delta u) - A \cdot I_r(u + \Delta u) \right\|^2$$

where $\Delta u$ is the iteration variable of the sum calculated over the image block $P$, $\delta u$ is the reprojection error, $A$ is a constant, and $n$ is the normal direction of the line segment feature;

optimizing the camera poses and landmark positions $\chi = \{T_{kW}, \rho_i\}$ by minimizing the sum of squares of the reprojection errors:

$$\chi^* = \arg\min_{\chi} \frac{1}{2} \sum_{k \in K} \left( \sum_{i \in \mathcal{P}_k} \left\| u_{ik} - \pi\!\left(T_{kW}\, \rho_i\right) \right\|^2 + \sum_{j \in \mathcal{L}_k} \left\| u_{jk} - \pi\!\left(T_{kW}\, \rho_j\right) \right\|^2 \right)$$

where $K$ is the set of all key frames in the semi-dense map, $\mathcal{P}_k$ is the set of all landmarks corresponding to point features in the current frame $I_k$, $\mathcal{L}_k$ is the set of all landmarks corresponding to line segment features in $I_k$, and $T_{kW}$ is the transformation matrix between the current frame $I_k$ and the world frame $W$.
As a further refinement of the present invention, the simulation map building module is configured to:
extracting key point features and key line segment features in the key frames, constructing a bag-of-words model for the key point features and the line segment features, and storing the bag-of-words model in a data dictionary;
when the camera of the unmanned vehicle system moves, calculating a similarity score value for the current key frame, and searching for an image most similar to the current key frame in the data dictionary;
and performing real-time correction on the semi-dense map based on the most similar image.
As a further refinement of the present invention, the simulation map building module is configured to:
determining the number of key points $n_k$ and the number of key line segments $n_l$ of the current key frame;

calculating the dispersion value $d_k$ of the key points from the key point coordinates $(x, y)$, and calculating the dispersion value $d_l$ of the key line segments from the midpoint coordinates of the key line segments;

calculating the similarity score $A_t$ of the obtained image frame based on the weight $a_k$ of the key point feature $S_k$ and the weight $a_l$ of the key line segment feature $S_l$:

$$A_t = a_k \left( \frac{n_k}{n_k + n_l} + \frac{d_k}{d_k + d_l} \right) S_k + a_l \left( \frac{n_l}{n_k + n_l} + \frac{d_l}{d_k + d_l} \right) S_l$$
As a further refinement of the present invention, the simulation map building module is configured to:
calculating the sum of the variances of the x coordinates and the y coordinates;

taking the square root of the sum of the variances as the dispersion value $d_k$ of the key points.
As a further refinement of the present invention, the simulation map building module is configured to:
based on the three-dimensional laser point cloud data of the unmanned vehicle system, fusing a gyroscope and inertial navigation sensing data of the unmanned vehicle system to obtain a point cloud key frame;
obtaining the pose of the unmanned vehicle system through inter-frame motion estimation of the unmanned vehicle system, and translating and rotating the point cloud key frame according to the pose of the unmanned vehicle system to construct a three-dimensional point cloud model;
and fusing the semi-dense map and the three-dimensional point cloud model to construct an environment dense map.
As a further refinement of the present invention, the simulation map building module is configured to:
establishing a three-dimensional grid map based on the size of the unmanned vehicle system by taking the coordinates of the starting point of the unmanned vehicle system as the origin, the northward direction as the positive direction of an X axis, the eastward direction as the positive direction of a Y axis and the upward direction as the positive direction of a Z axis;
translating and rotating the obtained point cloud of the obstacle according to the current pose of the unmanned vehicle system, filling the translated and rotated point cloud into corresponding grids, and recording the number S of the point cloud in each grid;
updating each grid in a local range centered on the unmanned vehicle system, the updated point count of each grid being $S^{+} = \alpha S^{-} + S$, where $S^{-}$ is the point count of the grid before the update, $S^{+}$ is the point count of the grid after the update, $\alpha$ is a forgetting coefficient between 0 and 1, and $S$ is the current point count ($S = 0$ for grids containing no obstacle);

thresholding the updated point count $S^{+}$: if $S^{+}$ reaches the preset value, the grid is determined to be an obstacle; otherwise, the grid is determined to be passable space.
As a further refinement of the invention, the simulation module is configured to:
establishing a 3D route map according to the size of the unmanned aerial vehicle system in the three-dimensional simulation scene dynamic map;
and searching for an optimal path through the A* algorithm based on the 3D roadmap.
As a further refinement of the invention, the simulation module is configured to:
randomly selecting a sampling point in the three-dimensional simulation scene dynamic map, retaining the sampling point if it is not located inside an obstacle, and repeating this step to establish a sampling point diagram from the retained sampling points;

connecting the sampling points based on the sampling point diagram, canceling any connecting line that collides with an obstacle, and repeating this step to complete the connection of all sampling points;

and connecting at least one adjacent point around each sampling point, retaining the corresponding connecting line if the sampling point and the adjacent point can be connected, and repeating this step to complete the construction of the 3D route map.
As a further refinement of the invention, the simulation module is configured to: after a certain period of time has elapsed or each time the unmanned aerial vehicle system has moved a certain distance, update the 3D route map according to the new obstacle environment, adding new sampling points to the 3D route map or deleting sampling points or connecting lines that collide with obstacles.
As a further refinement of the invention, the simulation module is configured to:
calculating, through the A* algorithm, for each sampling point in the 3D route map, the estimated cost of the path from the given starting point of the unmanned aerial vehicle system through that sampling point to the end point;

sorting the estimated values, removing the sampling point with the minimum estimated value, and adding the points surrounding the removed sampling point as expansion points;

and cycling in turn until the end point is found.
The invention also provides an electronic device comprising a memory and a processor, wherein the memory is configured to store one or more computer instructions, wherein the one or more computer instructions are executed by the processor to implement the method.
The invention also provides a computer-readable storage medium, on which a computer program is stored, characterized in that the computer program is executed by a processor to implement the method.
The invention has the beneficial effects that:
an unmanned cluster simulation system is constructed based on ROS, and functional simulation such as target identification, self-positioning and mapping, tracking and obstacle avoidance can be realized, so that collaborative simulation of the unmanned cluster is realized. In addition, the simulation system of the invention can also be conveniently transplanted to various hardware platforms, and has good expansibility and compatibility.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
Fig. 1 is a schematic flowchart of a simulation method of an unmanned cluster system according to an exemplary embodiment of the present invention;
FIG. 2 is a schematic flow chart of building a dynamic map of a three-dimensional simulation scene according to an exemplary embodiment of the present invention;
FIG. 3 is a diagram illustrating pose transformation between a previous frame and a current frame according to an exemplary embodiment of the present invention;
fig. 4 is a schematic diagram of a 3D roadmap constructed according to an exemplary embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that, if directional indications (such as up, down, left, right, front and back) are involved in the embodiments of the present invention, the directional indications are only used to explain the relative positional relationship, movement and the like between the components in a specific posture (as shown in the drawings); if the specific posture changes, the directional indication changes accordingly.
In addition, in the description of the present invention, the terms used are for illustrative purposes only and are not intended to limit the scope of the present invention. The terms "comprises" and/or "comprising" are used to specify the presence of stated elements, steps, operations, and/or components, but do not preclude the presence or addition of one or more other elements, steps, operations, and/or components. The terms "first," "second," and the like may be used to describe various elements, not necessarily order, and not necessarily limit the elements. In addition, in the description of the present invention, "a plurality" means two or more unless otherwise specified. These terms are only used to distinguish one element from another. These and/or other aspects will become apparent to those of ordinary skill in the art in view of the following drawings, and the description of the embodiments of the present invention will be more readily understood by those of ordinary skill in the art. The drawings are only for purposes of illustrating the described embodiments of the invention. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated in the present application may be employed without departing from the principles described in the present application.
In the simulation method of the unmanned cluster system according to an embodiment of the invention, the unmanned cluster system comprises an unmanned vehicle system and an unmanned aerial vehicle system; as shown in fig. 1, the method comprises the following steps:
constructing physical simulation models of an unmanned aerial vehicle system and an unmanned vehicle system;
constructing a three-dimensional simulation scene dynamic map based on the sensing data of the unmanned vehicle system;
and carrying out target tracking on the physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system based on the three-dimensional simulation scene dynamic map.
In an alternative embodiment, constructing a physical simulation model of the drone system and the drone vehicle system includes:
constructing a physical simulation model of the unmanned aerial vehicle system based on the V-REP tool;
and constructing a physical simulation model of the unmanned vehicle system based on the Gazebo tool.
V-REP (Virtual Robot Experimentation Platform) supports the Bullet, ODE and Vortex (for fluid simulation) physics engines and integrates a large number of common models, so that modeling is simple; the integrated physics engine can compute motion, rotation and collision according to the physical properties of objects. V-REP supports several control modes, including remote control, asynchronous control and synchronous control, and its programming model supports embedded scripts, add-ons, plug-ins, remote APIs, ROS nodes and the like. Gazebo is the default simulator of ROS and supports the Bullet and ODE engines. The model format in Gazebo is the XML-based SDF (Simulation Description Format), which can simulate many physical characteristics of robots and environments. The physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system can display the relevant data computed by all unmanned systems in real time through the RVIZ data visualization tool.
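By way of illustration only (the patent itself gives no code), the following minimal Python sketch shows how a V-REP physical simulation model could be stepped externally through the legacy remote API; the port 19997, the scene object name 'Quadricopter' and the availability of the sim.py bindings on the path are assumptions. A Gazebo model would instead typically be driven through ROS topics and services.

```python
import sim  # V-REP/CoppeliaSim legacy remote API Python bindings (assumed available)

# Connect to a V-REP instance whose remote API server listens on port 19997.
client = sim.simxStart('127.0.0.1', 19997, True, True, 5000, 5)
if client == -1:
    raise RuntimeError('could not connect to the V-REP remote API server')

# Synchronous mode: each trigger advances the physics engine by exactly one step.
sim.simxSynchronous(client, True)
sim.simxStartSimulation(client, sim.simx_opmode_blocking)

# 'Quadricopter' is an assumed name for the UAV physical simulation model.
err, uav = sim.simxGetObjectHandle(client, 'Quadricopter', sim.simx_opmode_blocking)

for _ in range(100):                       # advance 100 simulation steps
    sim.simxSynchronousTrigger(client)
    _, pos = sim.simxGetObjectPosition(client, uav, -1, sim.simx_opmode_blocking)
    print(pos)                             # world-frame position of the UAV model

sim.simxStopSimulation(client, sim.simx_opmode_blocking)
sim.simxFinish(client)
```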
In an alternative embodiment, a three-dimensional simulation scene dynamic map is constructed based on the sensing data of the unmanned vehicle system, as shown in fig. 2, and the method includes:
constructing a semi-dense map based on the sensing data of the unmanned vehicle system;
and (3) three-dimensional laser radar point cloud data based on the unmanned vehicle system, and constructing an environment dense map by fusing the semi-dense map.
The invention realizes positioning and environment modeling through the unmanned vehicle system, and the constructed three-dimensional scene dynamic map provides prior information for the whole unmanned system so as to realize cooperative control and target tracking. First, inertial navigation and IMU data are fused during pose estimation, which improves the visual self-positioning accuracy of the unmanned system and allows a semi-dense map to be generated quickly; the localization and mapping frequency at this stage is about 30 Hz. Then, on the basis of the semi-dense map, an environment dense map is constructed by combining multi-frame fused three-dimensional lidar data with the motion estimate; the mapping frequency at this stage is about 1 Hz, the aim being to locally generate a high-precision environment map.
In an alternative embodiment, the semi-dense map is constructed based on the sensing data of the unmanned vehicle system, and the semi-dense map comprises the following steps:
constructing a semi-dense map based on binocular vision sensing data of the unmanned vehicle system;
fusing gyroscope and inertial navigation sensing data of the unmanned vehicle system to a semi-dense map, and performing inter-frame motion estimation of the unmanned vehicle system;
and performing real-time correction on the semi-dense map through loop detection based on a key frame obtained by inter-frame motion estimation of the unmanned vehicle system.
In an alternative embodiment, the inter-frame motion estimation of the unmanned vehicle system, as shown in fig. 3, comprises:

estimating the incremental motion $T_{k,k-1}$ of the unmanned vehicle system that minimizes the photometric error $\delta I(T, u)$:

$$T_{k,k-1} = \arg\min_{T} \frac{1}{2} \sum_{u} \left\| \delta I(T, u) \right\|^2$$

where $T_{k,k-1}$ is the pose between the previous frame $I_{k-1}$ and the current frame $I_k$ of the images captured by the camera of the unmanned vehicle system, $I$ denotes the image, $\delta I$ is the photometric residual, i.e. the intensity difference of the same three-dimensional point $\rho_u$ observed in both the previous frame $I_{k-1}$ and the current frame $I_k$, and $u$ is the center position of the image block.
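For illustration only, the sketch below performs this kind of photometric minimization in a deliberately reduced setting: the incremental motion is restricted to a one-parameter horizontal image shift and the images are synthetic, so the Gauss-Newton update is a one-dimensional analogue of the full SE(3) optimization above, not the patent's implementation.

```python
import numpy as np

def photometric_shift(prev, curr, iters=20):
    """Estimate a horizontal shift t minimizing sum_u ||curr(u + t) - prev(u)||^2,
    a 1-D stand-in for the incremental motion T_{k,k-1}."""
    w = prev.shape[1]
    xs = np.arange(1, w - 1, dtype=float)
    t = 0.0
    for _ in range(iters):
        x = np.clip(xs + t, 0.0, w - 1.001)
        x0 = np.floor(x).astype(int)
        a = x - x0
        warped = (1 - a) * curr[:, x0] + a * curr[:, x0 + 1]  # curr sampled at u + t
        r = (warped - prev[:, 1:-1]).ravel()                  # photometric residuals
        g = (curr[:, x0 + 1] - curr[:, x0]).ravel()           # d(warped)/dt (Jacobian)
        t -= g.dot(r) / max(g.dot(g), 1e-12)                  # one Gauss-Newton step
    return t

base = np.sin(np.linspace(0.0, 4.0 * np.pi, 64))
prev = np.tile(base, (8, 1))
curr = np.roll(prev, 3, axis=1)          # "current frame": shifted by 3 pixels
print(photometric_shift(prev, curr))     # converges to roughly 3.0
```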
In an alternative embodiment, estimating the incremental motion $T_{k,k-1}$ of the unmanned vehicle system that minimizes the photometric error $\delta I(T, u)$ comprises:

S1, determining the pose change $T_{k,k-1}$ between the previous frame $I_{k-1}$ and the current frame $I_k$;

S2, calculating the pixel intensity difference of the three-dimensional point $\rho_u$ by reprojection into the previous frame $I_{k-1}$:

$$\delta I(T, u) = I_k\!\left(\pi\!\left(T_{CB}\, T_{k,k-1}\, T_{BC}\, \rho_u\right)\right) - I_{k-1}(u)$$

where $T_{BC}$ is the transformation matrix between the body coordinate system B and the camera coordinate system C of the unmanned vehicle system, $\pi(\cdot)$ is the camera projection, $\rho_u$ is the three-dimensional point whose projection in the previous frame $I_{k-1}$ is $u$, and $u$ is the center position of the image block in the previous frame $I_{k-1}$;

S3, performing image feature alignment in the current frame $I_k$: calculating the pixel intensity difference between the image block $P$ in $I_k$ centered on the projected feature position $u'$ and the reference image block in the frame $I_r$ in which the same feature was first observed;

S4, correcting the projected feature position $u'$ to obtain the optimized feature position $u'^*$ in the current frame $I_k$;

for point features:

$$u'^* = u' + \delta u^*, \qquad \delta u^* = \arg\min_{\delta u} \frac{1}{2} \sum_{\Delta u \in P} \left\| I_k(u' + \delta u + \Delta u) - A \cdot I_r(u + \Delta u) \right\|^2$$

where $\Delta u$ is the iteration variable of the sum calculated over the image block $P$, $\delta u$ is the reprojection error, $T_{CB}$ is the transformation matrix between the camera coordinate system C and the body coordinate system B of the unmanned vehicle system, $T_{kr}$ is the pose between the current frame $I_k$ and the frame $I_r$, and $A$ is a constant;

for line segment features:

$$u'^* = u' + \delta u^* \cdot n, \qquad \delta u^* = \arg\min_{\delta u} \frac{1}{2} \sum_{\Delta u \in P} \left\| I_k(u' + \delta u \cdot n + \Delta u) - A \cdot I_r(u + \Delta u) \right\|^2$$

where $\Delta u$ is the iteration variable of the sum calculated over the image block $P$, $\delta u$ is the reprojection error, $A$ is a constant, and $n$ is the normal direction of the line segment feature;

S5, optimizing the camera poses and landmark positions $\chi = \{T_{kW}, \rho_i\}$ by minimizing the sum of squares of the reprojection errors:

$$\chi^* = \arg\min_{\chi} \frac{1}{2} \sum_{k \in K} \left( \sum_{i \in \mathcal{P}_k} \left\| u_{ik} - \pi\!\left(T_{kW}\, \rho_i\right) \right\|^2 + \sum_{j \in \mathcal{L}_k} \left\| u_{jk} - \pi\!\left(T_{kW}\, \rho_j\right) \right\|^2 \right)$$

where $K$ is the set of all key frames in the semi-dense map, $\mathcal{P}_k$ is the set of all landmarks corresponding to point features in the current frame $I_k$, $\mathcal{L}_k$ is the set of all landmarks corresponding to line segment features in $I_k$, and $T_{kW}$ is the transformation matrix between the current frame $I_k$ and the world frame $W$.
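As an illustration of the feature-alignment correction in step S4, the sketch below refines a projected feature position by Gauss-Newton over a one-dimensional patch; the synthetic blob images, the patch half-width and the fixed constant A = 1 are assumptions made for the example.

```python
import numpy as np

def refine_feature(I_k, I_r, u_prime, u_ref, half=3, iters=10, A=1.0):
    """Refine the projected feature position u' so that the patch in the current
    image I_k best matches the reference patch in I_r (u'* = u' + du*)."""
    du = 0.0
    offs = np.arange(-half, half + 1)           # iteration variable over the block P
    ref = A * I_r[u_ref + offs]                 # reference image block
    for _ in range(iters):
        x = u_prime + du + offs
        x0 = np.floor(x).astype(int)
        a = x - x0
        patch = (1 - a) * I_k[x0] + a * I_k[x0 + 1]   # interpolated current block
        g = I_k[x0 + 1] - I_k[x0]               # d(patch)/d(du)
        r = patch - ref                          # pixel intensity differences
        du -= g.dot(r) / max(g.dot(g), 1e-12)   # Gauss-Newton update of du
    return u_prime + du                          # corrected position u'*

x = np.arange(256, dtype=float)
I_r = np.exp(-0.5 * ((x - 100.0) / 4.0) ** 2)   # a blob feature at x = 100
I_k = np.exp(-0.5 * ((x - 121.6) / 4.0) ** 2)   # same feature moved to x = 121.6
print(refine_feature(I_k, I_r, u_prime=120, u_ref=100))  # approximately 121.6
```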
In an alternative embodiment, the loop back detection comprises:
extracting key point features and key line segment features from the key frames, constructing a bag-of-words model for the key point and key line segment features, and storing the bag-of-words model in a data dictionary;
when the camera of the unmanned vehicle system moves, calculating a similarity score value for the current key frame, and searching an image most similar to the current key frame in a data dictionary;
and performing real-time correction on the semi-dense map based on the most similar image.
In an alternative embodiment, calculating the similarity score of the current key frame and searching the data dictionary for the image most similar to the current key frame when the camera of the unmanned vehicle system moves comprises:

determining the number of key points $n_k$ and the number of key line segments $n_l$ of the current key frame;

calculating the dispersion value $d_k$ of the key points from the key point coordinates $(x, y)$, and calculating the dispersion value $d_l$ of the key line segments from the midpoint coordinates of the key line segments;

calculating the similarity score $A_t$ of the obtained image frame based on the weight $a_k$ of the key point feature $S_k$ and the weight $a_l$ of the key line segment feature $S_l$:

$$A_t = a_k \left( \frac{n_k}{n_k + n_l} + \frac{d_k}{d_k + d_l} \right) S_k + a_l \left( \frac{n_l}{n_k + n_l} + \frac{d_l}{d_k + d_l} \right) S_l$$
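A direct transcription of the scoring formula into code, assuming the per-type feature scores S_k, S_l and the weights a_k, a_l are supplied by the surrounding pipeline:

```python
def similarity_score(n_k, n_l, d_k, d_l, S_k, S_l, a_k=0.5, a_l=0.5):
    """A_t combines point and line-segment evidence, weighting each feature type
    by its share of the feature count and of the spatial dispersion."""
    count = n_k + n_l
    disp = d_k + d_l
    return (a_k * (n_k / count + d_k / disp) * S_k
            + a_l * (n_l / count + d_l / disp) * S_l)

# e.g. 120 keypoints vs 30 line segments, dispersions 45.0 and 15.0:
print(similarity_score(120, 30, 45.0, 15.0, S_k=0.8, S_l=0.6))  # 0.755
```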
Here, weights $a_k = a_l = 0.5$ mean that point features and line features are treated as equally important; the ratio can be adjusted appropriately for the specific environment.
In an alternative embodiment, calculating the dispersion value $d_k$ of the key points from the key point coordinates $(x, y)$ comprises:

calculating the sum of the variances of the x coordinates and the y coordinates;

taking the square root of the sum of the variances as the dispersion value $d_k$ of the key points, i.e. $d_k = \sqrt{\operatorname{Var}(x) + \operatorname{Var}(y)}$.
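A minimal sketch of this dispersion measure, assuming the keypoint coordinates arrive as plain arrays:

```python
import numpy as np

def dispersion(xs, ys):
    """d_k = sqrt(Var(x) + Var(y)): how spread out the keypoints are."""
    return float(np.sqrt(np.var(xs) + np.var(ys)))

print(dispersion([10, 40, 90, 200], [5, 60, 30, 120]))
```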
In an optional embodiment, constructing the environment dense map by fusing the semi-dense map with the three-dimensional lidar point cloud data of the unmanned vehicle system comprises the following steps:

fusing the three-dimensional laser point cloud data of the unmanned vehicle system with the gyroscope and inertial navigation sensing data of the unmanned vehicle system to obtain point cloud key frames;
obtaining the pose of the unmanned vehicle system through inter-frame motion estimation of the unmanned vehicle system, and translating and rotating the point cloud key frame according to the pose of the unmanned vehicle system to construct a three-dimensional point cloud model;
and fusing the semi-dense map and the three-dimensional point cloud model to construct an environment dense map.
When the three-dimensional map is constructed, the point cloud data are translated and rotated according to the pose change of the unmanned vehicle system, and a unified three-dimensional grid map is finally formed through discretization. In an optional implementation manner, the pose of the unmanned vehicle system is obtained through inter-frame motion estimation of the unmanned vehicle system, and the point cloud key frame is translated and rotated according to the pose of the unmanned vehicle system to construct a three-dimensional point cloud model, which includes:
establishing a three-dimensional grid map based on the size of the unmanned vehicle system by taking the starting point coordinate of the unmanned vehicle system as an origin, the northward direction as the positive direction of an X axis, the eastward direction as the positive direction of a Y axis and the upward direction as the positive direction of a Z axis;
translating and rotating the obtained point cloud of the obstacle according to the current pose of the unmanned vehicle system, filling the translated and rotated point cloud into corresponding grids, and recording the number S of the point cloud in each grid;
updating each grid in a local range centered on the unmanned vehicle system, the updated point count of each grid being $S^{+} = \alpha S^{-} + S$, where $S^{-}$ is the point count of the grid before the update, $S^{+}$ is the point count of the grid after the update, $\alpha$ is a forgetting coefficient between 0 and 1, and $S$ is the current point count ($S = 0$ for grids containing no obstacle);

thresholding the updated point count $S^{+}$: if $S^{+}$ reaches the preset value, the grid is determined to be an obstacle; otherwise, the grid is determined to be passable space.
For a searched specific target, the invention can automatically plan a tracking path for each unmanned system according to the motion state of the tracked target, so that the system reaches the target position. During tracking, real-time obstacle avoidance is required in order to cope with dynamic scenes.
In an optional embodiment, the target tracking of the physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system based on the three-dimensional simulation scene dynamic map includes:
in a three-dimensional simulation scene dynamic map, establishing a 3D route map according to the size of an unmanned aerial vehicle system;
and searching an optimal path through an A-algorithm based on the 3D roadmap.
In an optional embodiment, in the three-dimensional simulation scene dynamic map, a 3D route map is created according to the size of the drone system, and the method includes:
randomly selecting a sampling point in the three-dimensional simulation scene dynamic map, retaining the sampling point if it is not located inside an obstacle, and repeating this step until a sampling point diagram is established from the retained sampling points;

connecting the sampling points based on the sampling point diagram, canceling any connecting line that collides with an obstacle, and repeating this step to complete the connection of all sampling points;

and connecting at least one adjacent point around each sampling point, retaining the corresponding connecting line if the sampling point and the adjacent point can be connected, and repeating this step to complete the construction of the 3D route map.
The 3D route map of the invention is a probabilistic roadmap (PRM): it samples the planning space randomly and completes the description of the planning space with a small number of sampling points and paths. The complexity of the method is independent of the complexity of the environment and of the dimension of the planning space, and depends mainly on the complexity of the path search; its advantages are that it does not fall into local minima, is suitable for high-dimensional spaces, and has a small computational load. A constructed 3D roadmap is shown by way of example in fig. 4, and a sketch of the construction follows below.
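A compact sketch of the probabilistic roadmap construction described above, using straight-line collision checks against spherical obstacles; the sampling bounds, the obstacle model and the neighbour count are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
OBSTACLES = [(np.array([5.0, 5.0, 2.0]), 1.5)]   # (centre, radius) spheres (assumed)

def in_obstacle(p):
    return any(np.linalg.norm(p - c) <= r for c, r in OBSTACLES)

def segment_free(a, b, steps=20):
    # Sample along the connecting line; cancel it if any sample hits an obstacle.
    return all(not in_obstacle(a + t * (b - a))
               for t in np.linspace(0.0, 1.0, steps))

def build_prm(n_samples=200, k_neighbours=5, bounds=10.0):
    # Keep only samples that do NOT lie inside an obstacle.
    nodes = [p for p in rng.uniform(0.0, bounds, (n_samples, 3))
             if not in_obstacle(p)]
    edges = {i: [] for i in range(len(nodes))}
    for i, p in enumerate(nodes):
        # Try to connect each node to its nearest neighbours.
        d = [np.linalg.norm(p - q) for q in nodes]
        for j in np.argsort(d)[1:k_neighbours + 1]:
            if segment_free(p, nodes[j]):
                edges[i].append(int(j))
    return nodes, edges

nodes, edges = build_prm()
print(len(nodes), sum(len(v) for v in edges.values()))  # roadmap size
```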
In an optional embodiment, after a certain period of time has elapsed or each time the unmanned aerial vehicle system has moved a certain distance, the 3D roadmap is updated according to the new obstacle environment: new sampling points are added to the 3D roadmap, and sampling points or connecting lines that collide with obstacles are deleted.
In an alternative embodiment, the searching for the optimal path through the a-algorithm based on the 3D roadmap includes:
calculating, through the A* algorithm, for each sampling point in the 3D route map, the estimated cost of the path from the given starting point of the unmanned aerial vehicle system through that sampling point to the end point;

sorting the estimated values, removing the sampling point with the minimum estimated value, and adding the points surrounding the removed sampling point as expansion points;
and sequentially circulating until the end point is searched.
The A* algorithm is among the most effective methods for finding the shortest path in a static road network. Its basic idea is as follows: adopt a heuristic search, choose a suitable heuristic function, and compute and evaluate the cost of each expansion point, so that the point with the best cost is selected and expanded until the target point is found.
The A* algorithm requires an evaluation function for scoring the surrounding nodes, generally of the form

$$f(n) = g(n) + h(n)$$

where $f(n)$ is the evaluation function, the best estimate of the cost from the starting point through node $n$ to the end point; $g(n)$ is the actual cost from the initial node to node $n$; and $h(n)$ is the estimated cost of reaching the end point from node $n$.
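A sketch of the search itself, reusing the hypothetical nodes/edges roadmap from the PRM sketch above and taking the Euclidean distance to the goal as the heuristic h(n):

```python
import heapq
import numpy as np

def astar(nodes, edges, start, goal):
    """A* over the roadmap: always expand the open node with minimal f = g + h."""
    h = lambda i: np.linalg.norm(nodes[i] - nodes[goal])   # estimated cost h(n)
    g = {start: 0.0}                                       # actual cost g(n)
    came = {}
    open_heap = [(h(start), start)]
    closed = set()
    while open_heap:
        _, n = heapq.heappop(open_heap)    # node with the minimum estimate f(n)
        if n == goal:
            path = [n]
            while n in came:
                n = came[n]
                path.append(n)
            return path[::-1]
        if n in closed:
            continue
        closed.add(n)
        for m in edges[n]:                  # add surrounding points as expansions
            cand = g[n] + np.linalg.norm(nodes[n] - nodes[m])
            if cand < g.get(m, float('inf')):
                g[m] = cand
                came[m] = n
                heapq.heappush(open_heap, (cand + h(m), m))
    return None                             # no path to the end point

path = astar(nodes, edges, start=0, goal=len(nodes) - 1)
print(path)
```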
The invention searches for the shortest path based on the A* algorithm so that the unmanned aerial vehicle system flies along the shortest path. During flight control, the unmanned aerial vehicle system must also be subjected to real-time obstacle-avoidance control. The invention adopts the artificial potential field method for real-time obstacle avoidance of the unmanned aerial vehicle system: the target position exerts an attractive force $F_{att}$ on the unmanned aerial vehicle system, the obstacles exert a repulsive force $F_{rep}$ on it, and under the resultant of the two forces the unmanned aerial vehicle system avoids the obstacles and reaches the target point. The artificial potential field method has the advantages of a small computational load, easy understanding and a clear mathematical expression.
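A minimal sketch of the attractive/repulsive resultant, using a standard quadratic attraction and an inverse-distance repulsion active inside an influence radius; the gain values and the influence radius are assumptions, not values from the patent.

```python
import numpy as np

K_ATT, K_REP, D0 = 1.0, 0.8, 2.0   # gains and repulsion influence radius (assumed)

def potential_field_step(pos, goal, obstacles):
    """Resultant of the attraction F_att toward the goal and the repulsion F_rep
    from every obstacle closer than D0."""
    f = K_ATT * (goal - pos)                      # F_att: pulls toward the target
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 0.0 < d < D0:                          # repulsion only near obstacles
            f += K_REP * (1.0 / d - 1.0 / D0) / d ** 2 * (diff / d)   # F_rep
    return f

pos = np.array([0.0, 0.0, 1.0])
goal = np.array([10.0, 0.0, 1.0])
obstacles = [np.array([5.0, 0.3, 1.0])]
print(potential_field_step(pos, goal, obstacles))  # net force steering the UAV
```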
In the simulation system of the unmanned cluster system according to an embodiment of the invention, the unmanned cluster system comprises an unmanned vehicle system and an unmanned aerial vehicle system, and the system comprises:
the simulation model establishing module is used for establishing physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system;
the simulation map building module is used for building a three-dimensional simulation scene dynamic map based on the sensing data of the unmanned vehicle system;
and the simulation module is used for tracking the targets of the physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system based on the three-dimensional simulation scene dynamic map.
In an alternative embodiment, the simulation modeling module is further configured to:
constructing a physical simulation model of the unmanned aerial vehicle system based on the V-REP tool;
and constructing a physical simulation model of the unmanned vehicle system based on the Gazebo tool.
In an alternative embodiment, the simulation mapping module is further configured to:
constructing a semi-dense map based on the sensing data of the unmanned vehicle system;
and (3) three-dimensional laser radar point cloud data based on the unmanned vehicle system, and constructing an environment dense map by fusing the semi-dense map.
In an alternative embodiment, the simulation mapping module is further configured to:
constructing a semi-dense map based on binocular vision sensing data of the unmanned vehicle system;
fusing gyroscope and inertial navigation sensing data of the unmanned vehicle system to a semi-dense map, and performing inter-frame motion estimation of the unmanned vehicle system;
and performing real-time correction on the semi-dense map through loop detection based on a key frame obtained by inter-frame motion estimation of the unmanned vehicle system.
In an alternative embodiment, the simulation mapping module is further configured to:
estimating the incremental motion $T_{k,k-1}$ of the unmanned vehicle system that minimizes the photometric error $\delta I(T, u)$:

$$T_{k,k-1} = \arg\min_{T} \frac{1}{2} \sum_{u} \left\| \delta I(T, u) \right\|^2$$

where $T_{k,k-1}$ is the pose between the previous frame $I_{k-1}$ and the current frame $I_k$ of the images captured by the camera of the unmanned vehicle system, $I$ denotes the image, $\delta I$ is the photometric residual, i.e. the intensity difference of the same three-dimensional point $\rho_u$ observed in both the previous frame $I_{k-1}$ and the current frame $I_k$, and $u$ is the center position of the image block.
In an alternative embodiment, the simulation mapping module is further configured to:
determining the pose change T_{k,k-1} between the previous frame I_{k-1} and the current frame I_k;

in the previous frame I_{k-1}, calculating the pixel intensity difference of the three-dimensional point ρ_u by reprojection;

wherein,

$$\delta I\left(T_{k,k-1}, u\right) = I_{k}\!\left(\pi\left(T_{BC}\, T_{k,k-1}\, T_{BC}^{-1}\, \pi^{-1}(u)\right)\right) - I_{k-1}(u)$$

in the formula, T_{BC} is the transformation matrix between the body coordinate system B and the camera coordinate system C of the unmanned vehicle system, π(·) denotes the camera projection, and u is the center position of the image block in the previous frame I_{k-1};

in the current frame I_k, performing image feature alignment: calculating the pixel intensity difference between the image block P centered on the projected feature position u′ in the current frame I_k and the reference image block in the frame I_r in which the same feature was first observed;

correcting the projected feature position u′ to obtain the corrected projected feature position u′* in the current frame I_k;

for point features:

u′* = u′ + δu*;

wherein,

$$\delta u^{*} = \arg\min_{\delta u} \frac{1}{2} \sum_{\Delta u \in P} \left\| I_k\left(u' + \Delta u + \delta u\right) - A \cdot I_r\left(u + \Delta u\right) \right\|^{2}$$

$$u' = \pi\left(T_{CB}\, T_{kr}\, T_{BC}\, \pi^{-1}(u)\right)$$

in the formula, Δu is the iteration variable of the sum computed over the image block P, δu is the reprojection error, T_{CB} is the transformation matrix between the camera coordinate system C and the body coordinate system B of the unmanned vehicle system, T_{kr} is the pose between the current frame I_k and the frame I_r, and A is a constant;

for line segment features:

u′* = u′ + δu*·n;

wherein,

$$\delta u^{*} = \arg\min_{\delta u} \frac{1}{2} \sum_{\Delta u \in P} \left\| I_k\left(u' + \Delta u + \delta u \cdot n\right) - A \cdot I_r\left(u + \Delta u\right) \right\|^{2}$$

in the formula, Δu is the iteration variable of the sum computed over the image block P, δu is the reprojection error, A is a constant, and n is the normal direction of the line segment feature;

optimizing the camera pose and the landmark positions by minimizing the sum of squared reprojection errors:

$$\left\{T_{kW}\right\} = \arg\min_{T_{kW}} \frac{1}{2} \sum_{k \in K} \left( \sum_{i \in \mathcal{P}_k} \left\| u_i - \pi\left(T_{kW}\, \rho_i\right) \right\|^{2} + \sum_{j \in \mathcal{L}_k} \left\| u_j - \pi\left(T_{kW}\, \ell_j\right) \right\|^{2} \right)$$

wherein K is the set of all key frames in the semi-dense map, 𝒫_k is the set of all landmarks corresponding to point features in the current frame I_k, 𝓛_k is the set of all landmarks corresponding to line segment features in the current frame I_k, and T_{kW} is the transformation matrix between the current frame I_k and the key frame world coordinate system W.
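As a rough illustration of the final refinement step, the sketch below minimizes squared reprojection error for point landmarks only, using scipy's least-squares solver; the pose parameterization, the camera intrinsics, and the omission of line-segment landmarks and multi-key-frame terms are simplifying assumptions of the example.

```python
# Minimal sketch of refining one camera pose by minimizing the sum of
# squared reprojection errors over point landmarks. Pose is parameterized
# as (rx, ry, rz, tx, ty, tz) with a Rodrigues rotation vector; intrinsics
# are illustrative placeholders, not values from this disclosure.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(pose, points, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    # transform landmarks into the camera frame, then apply pinhole model
    R = Rotation.from_rotvec(pose[:3]).as_matrix()
    pc = points @ R.T + pose[3:]
    return np.column_stack((fx * pc[:, 0] / pc[:, 2] + cx,
                            fy * pc[:, 1] / pc[:, 2] + cy))

def refine_pose(pose0, landmarks, observations):
    # residuals: reprojected landmark pixels minus observed pixels
    fun = lambda p: (project(p, landmarks) - observations).ravel()
    return least_squares(fun, pose0).x
```

For example, `refine_pose(np.zeros(6), landmarks, observations)` returns the pose that minimizes the summed squared residuals for the given point correspondences.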
In an alternative embodiment, the simulation mapping module is further configured to:
extracting key point features and key line segment features in the key frames, constructing a bag-of-words model for the key point features and key line segment features, and storing the bag-of-words model in a data dictionary;
when the camera of the unmanned vehicle system moves, calculating a similarity score value for the current key frame, and searching the data dictionary for the image most similar to the current key frame;
and performing real-time correction on the semi-dense map based on the most similar image.
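A minimal sketch of this loop-detection lookup is given below: key frames are summarized as normalized bag-of-words histograms, stored in a dictionary, and queried by cosine similarity. Descriptor extraction and vocabulary training are assumed to happen elsewhere, and the class and method names are illustrative, not part of this disclosure.

```python
# Rough sketch of the loop-detection lookup: each key frame is summarized
# as a bag-of-words histogram over quantized point/line descriptors, stored
# in a dictionary, and queried by cosine similarity.
import numpy as np

class BowDictionary:
    def __init__(self):
        self.frames = {}              # frame id -> normalized histogram

    def add(self, frame_id, histogram):
        h = np.asarray(histogram, dtype=float)
        self.frames[frame_id] = h / (np.linalg.norm(h) or 1.0)

    def most_similar(self, histogram, exclude=None):
        # return the stored frame whose histogram best matches the query
        q = np.asarray(histogram, dtype=float)
        q = q / (np.linalg.norm(q) or 1.0)
        best_id, best_score = None, -1.0
        for fid, h in self.frames.items():
            if fid == exclude:
                continue
            score = float(q @ h)      # cosine similarity of unit vectors
            if score > best_score:
                best_id, best_score = fid, score
        return best_id, best_score
```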
In an alternative embodiment, the simulation mapping module is further configured to:
determining the number of key points n_k and the number of key line segments n_l of the current key frame;
calculating the dispersion value d_k of the key points according to the key point coordinates (x, y), and calculating the dispersion value d_l of the key line segments according to the midpoint coordinates of the key line segments;
calculating the similarity score value A_t of the obtained image frame based on the key point feature S_k with weight a_k and the key line segment feature S_l with weight a_l;
wherein A_t = a_k·(n_k/(n_k + n_l) + d_k/(d_k + d_l))·S_k + a_l·(n_l/(n_k + n_l) + d_l/(d_k + d_l))·S_l.
In an alternative embodiment, the simulation mapping module is further configured to:
calculating the sum of the variances of the x coordinates and the y coordinates;
calculating the square root of this sum of variances as the dispersion value d_k of the key points.
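The dispersion and similarity-score formulas above translate directly into code; in the following sketch the key-point coordinates, line-segment midpoints, feature scores S_k and S_l, and weights a_k and a_l are all assumed inputs for the example.

```python
# Direct transcription of the dispersion and similarity-score formulas.
# Inputs (coordinates, midpoints, S_k, S_l, a_k, a_l) are hypothetical.
import numpy as np

def dispersion(coords):
    # square root of the sum of the x- and y-coordinate variances
    xy = np.asarray(coords, dtype=float)
    return float(np.sqrt(xy[:, 0].var() + xy[:, 1].var()))

def similarity_score(points, line_midpoints, S_k, S_l, a_k=0.5, a_l=0.5):
    n_k, n_l = len(points), len(line_midpoints)
    d_k, d_l = dispersion(points), dispersion(line_midpoints)
    # A_t combines counts, dispersions, and feature scores for both kinds
    return (a_k * (n_k / (n_k + n_l) + d_k / (d_k + d_l)) * S_k
            + a_l * (n_l / (n_k + n_l) + d_l / (d_k + d_l)) * S_l)
```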
In an alternative embodiment, the simulation mapping module is further configured to:
fusing the three-dimensional laser point cloud data of the unmanned vehicle system with the gyroscope and inertial navigation sensing data of the unmanned vehicle system to obtain a point cloud key frame;
obtaining the pose of the unmanned vehicle system through inter-frame motion estimation of the unmanned vehicle system, and translating and rotating the point cloud key frame according to the pose of the unmanned vehicle system to construct a three-dimensional point cloud model;
and fusing the semi-dense map and the three-dimensional point cloud model to construct an environment dense map.
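For illustration, the sketch below applies the estimated pose to each point cloud key frame and merges the results into one model; representing the pose as a 3×3 rotation matrix R and a translation vector t is an assumption of the example.

```python
# Sketch of the key-frame fusion step: each point cloud key frame is
# rotated and translated by the estimated vehicle pose before being merged
# into the global model. Pose representation (R, t) is an assumption.
import numpy as np

def transform_cloud(points, R, t):
    # apply x' = R x + t to every point of an (N, 3) array
    return points @ np.asarray(R).T + np.asarray(t)

def fuse_keyframes(keyframes):
    # keyframes: iterable of (points, R, t); returns one merged cloud
    return np.vstack([transform_cloud(p, R, t) for p, R, t in keyframes])
```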
In an alternative embodiment, the simulation mapping module is further configured to:
establishing a three-dimensional grid map based on the size of the unmanned vehicle system by taking the starting point coordinate of the unmanned vehicle system as an origin, the northward direction as the positive direction of an X axis, the eastward direction as the positive direction of a Y axis and the upward direction as the positive direction of a Z axis;
translating and rotating the obtained obstacle point cloud according to the current pose of the unmanned vehicle system, filling the translated and rotated point cloud into the corresponding grids, and recording the number S of point cloud points in each grid;
updating each grid in a local range centered on the unmanned vehicle system, and determining the updated number of point cloud points in each grid as S⁺ = αS⁻ + S; wherein S⁻ is the number of point cloud points in the grid before the update, S⁺ is the number after the update, α is a forgetting coefficient between 0 and 1, S is the current number of point cloud points, and S = 0 for a grid without obstacles;
comparing the updated number S⁺ against a threshold: if S⁺ meets the preset value, the grid is determined to be an obstacle; otherwise, the grid is determined to be passable space.
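The grid update rule S⁺ = αS⁻ + S can be sketched as follows; the forgetting coefficient, the threshold, and the array-based grid representation are illustrative choices rather than values fixed by this disclosure.

```python
# Sketch of the grid update S+ = alpha * S- + S with a forgetting
# coefficient alpha in (0, 1). The arrays passed in are assumed to cover
# the local window around the vehicle; alpha and threshold are examples.
import numpy as np

def update_grid(counts, new_counts, alpha=0.7, threshold=5.0):
    # counts: previous per-cell point counts S-; new_counts: current S
    updated = alpha * counts + new_counts      # S+ = alpha * S- + S
    occupied = updated >= threshold            # obstacle vs. passable cell
    return updated, occupied
```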
In an alternative embodiment, the simulation module is further configured to:
in a three-dimensional simulation scene dynamic map, establishing a 3D route map according to the size of an unmanned aerial vehicle system;
and searching for an optimal path through the A* algorithm based on the 3D route map.
In an alternative embodiment, the simulation module is further configured to:
randomly selecting sampling points in the three-dimensional simulation scene dynamic map; if a sampling point does not fall inside an obstacle, the sampling point is retained; this is repeated until a sampling point diagram is established from the retained sampling points;
connecting the sampling points based on the sampling point diagram; if a connecting line collides with an obstacle, the connecting line is discarded; this is repeated until the connection of all sampling points is completed;
and connecting at least one adjacent point around each sampling point; if the sampling point and the adjacent point can be connected, the corresponding connecting line is retained; this is repeated to complete the construction of the 3D route map.
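A rough sketch of this roadmap construction in the probabilistic-roadmap style is shown below; the collision query `in_collision`, the number of samples, and the neighbor count k are placeholders for quantities this disclosure leaves unspecified.

```python
# Rough PRM-style roadmap construction following the steps above: sample
# collision-free points, then connect each point to nearby neighbors when
# the straight segment between them stays collision-free.
import numpy as np

def build_roadmap(n_samples, bounds, in_collision, k=6):
    rng = np.random.default_rng(0)
    pts = []
    while len(pts) < n_samples:
        p = rng.uniform(bounds[0], bounds[1])      # random 3D sample
        if not in_collision(p):
            pts.append(p)                          # keep free-space samples
    pts = np.array(pts)
    edges = set()
    for i, p in enumerate(pts):
        dists = np.linalg.norm(pts - p, axis=1)
        for j in np.argsort(dists)[1:k + 1]:       # k nearest neighbors
            seg = np.linspace(p, pts[j], 20)       # discretized segment
            if not any(in_collision(q) for q in seg):
                edges.add((min(i, int(j)), max(i, int(j))))
    return pts, edges
```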
In an alternative embodiment, the simulation module is further configured to: update the 3D route map according to the new obstacle environment after the unmanned aerial vehicle system has moved for a period of time or over a certain distance, adding new sampling points to the 3D route map and deleting sampling points or connecting lines that collide with obstacles.
In an alternative embodiment, the simulation module is further configured to:
calculating, by the A* algorithm, the estimated value of reaching the end point from the given starting point of the unmanned aerial vehicle system through each sampling point in the 3D route map;
sorting the estimated values, removing the sampling point with the minimum estimated value from the candidate list, and adding the surrounding points of the removed sampling point as expansion points;
and repeating in sequence until the end point is searched.
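The search loop described above corresponds to a standard A* expansion; the following sketch assumes the roadmap is given as an adjacency dictionary and that a heuristic function (for example, Euclidean distance to the end point) is supplied by the caller.

```python
# Minimal A* over the roadmap, mirroring the expansion loop above: pop the
# node with the smallest estimate f = g + h, expand its neighbors, and stop
# at the goal. Graph: adjacency dict {node: [(neighbor, cost), ...]}.
import heapq
import itertools

def a_star(graph, start, goal, heuristic):
    tie = itertools.count()                        # heap tie-breaker
    open_set = [(heuristic(start), 0.0, next(tie), start, None)]
    came_from, closed = {}, set()
    while open_set:
        f, g, _, node, parent = heapq.heappop(open_set)  # smallest f = g + h
        if node in closed:
            continue
        closed.add(node)
        came_from[node] = parent
        if node == goal:                           # walk parents back
            path = [node]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return path[::-1]
        for nb, cost in graph.get(node, ()):
            if nb not in closed:
                heapq.heappush(open_set,
                               (g + cost + heuristic(nb), g + cost,
                                next(tie), nb, node))
    return None                                    # goal unreachable
```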
The disclosure also relates to an electronic device comprising a server, a terminal and the like. The electronic device includes: at least one processor; a memory communicatively coupled to the at least one processor; and a communication component communicatively coupled to the storage medium, the communication component receiving and transmitting data under control of the processor; the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to implement the simulation method in the above embodiments.
In an alternative embodiment, the memory is used as a non-volatile computer-readable storage medium for storing non-volatile software programs, non-volatile computer-executable programs, and modules. The processor executes various functional applications of the device and data processing, i.e., implements the emulation method, by running non-volatile software programs, instructions, and modules stored in the memory.
The memory may include a storage program area and a storage data area, wherein the storage program area may store an operating system and an application program required for at least one function, and the storage data area may store a list of options, etc. Further, the memory may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory optionally includes memory located remotely from the processor, and such remote memory may be connected to the external device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
One or more modules are stored in the memory and, when executed by the one or more processors, perform the methods of any of the method embodiments described above.
The product can execute the simulation method provided by the embodiments of the present application and has the corresponding functional modules and beneficial effects; for technical details not described in detail in this embodiment, reference may be made to the simulation method provided by the embodiments of the present application.
The present disclosure also relates to a computer-readable storage medium for storing a computer-readable program for causing a computer to perform some or all of the above-described embodiments of the simulation method.
That is, as can be understood by those skilled in the art, all or part of the steps in the simulation method of the above embodiments may be implemented by a program instructing related hardware, where the program is stored in a storage medium and includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Furthermore, those of ordinary skill in the art will appreciate that although some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
It will be understood by those skilled in the art that while the present invention has been described with reference to exemplary embodiments, various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from the essential scope thereof. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.

Claims (10)

1. A simulation method of an unmanned cluster system, wherein the unmanned cluster system comprises an unmanned vehicle system and an unmanned aerial vehicle system, the method comprising:
constructing physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system;
constructing a three-dimensional simulation scene dynamic map based on the sensing data of the unmanned vehicle system;
and carrying out target tracking on the physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system based on the three-dimensional simulation scene dynamic map.
2. The method of claim 1, wherein constructing a three-dimensional simulated scene dynamic map based on sensory data of the unmanned vehicle system comprises:
constructing a semi-dense map based on the sensing data of the unmanned vehicle system;
and constructing an environment dense map by fusing the semi-dense map based on the three-dimensional laser radar point cloud data of the unmanned vehicle system.
3. The method of claim 2, wherein constructing a semi-dense map based on sensory data of the unmanned vehicle system comprises:
constructing a semi-dense map based on binocular vision sensing data of the unmanned vehicle system;
fusing gyroscope and inertial navigation sensing data of the unmanned vehicle system to the semi-dense map, and performing inter-frame motion estimation of the unmanned vehicle system;
and based on a key frame obtained by inter-frame motion estimation of the unmanned vehicle system, correcting the semi-dense map in real time through loop detection.
4. The method of claim 3, wherein performing the inter-frame motion estimation of the unmanned vehicle system comprises:
estimating the incremental motion T_{k,k-1} of the unmanned vehicle system so as to minimize the photometric error δI:

$$T_{k,k-1} = \arg\min_{T_{k,k-1}} \frac{1}{2} \sum_{u} \left\| \delta I\left(T_{k,k-1}, u\right) \right\|^{2}$$

in the formula, T_{k,k-1} is the pose between the previous frame I_{k-1} and the current frame I_k of the images captured by the camera of the unmanned vehicle system, I represents an image, δI is the photometric residual, i.e. the intensity difference of the same three-dimensional point ρ_u observed in the previous frame I_{k-1} and the current frame I_k, and u is the center position of the image block.
5. The method of claim 2, wherein the loop detection comprises:
extracting key point features and key line segment features in the key frames, constructing a bag-of-words model for the key point features and the line segment features, and storing the bag-of-words model in a data dictionary;
when the camera of the unmanned vehicle system moves, calculating a similarity score value for the current key frame, and searching for an image most similar to the current key frame in the data dictionary;
and performing real-time correction on the semi-dense map based on the most similar image.
6. The method of claim 3, wherein constructing an environment dense map based on the three-dimensional lidar point cloud data of the unmanned vehicle system and fusing the semi-dense map comprises:
based on the three-dimensional laser point cloud data of the unmanned vehicle system, fusing a gyroscope and inertial navigation sensing data of the unmanned vehicle system to obtain a point cloud key frame;
obtaining the pose of the unmanned vehicle system through inter-frame motion estimation of the unmanned vehicle system, and translating and rotating the point cloud key frame according to the pose of the unmanned vehicle system to construct a three-dimensional point cloud model;
and fusing the semi-dense map and the three-dimensional point cloud model to construct an environment dense map.
7. The method of claim 1, wherein the target tracking of the physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system based on the three-dimensional simulation scene dynamic map comprises:
establishing a 3D route map according to the size of the unmanned aerial vehicle system in the three-dimensional simulation scene dynamic map;
and searching for an optimal path through the A* algorithm based on the 3D route map.
8. A simulation system of an unmanned cluster system, the unmanned cluster system comprising an unmanned vehicle system and an unmanned aerial vehicle system, the system comprising:
the simulation model establishing module is used for establishing physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system;
the simulation map building module is used for building a three-dimensional simulation scene dynamic map based on the sensing data of the unmanned vehicle system;
and the simulation module is used for performing target tracking on the physical simulation models of the unmanned aerial vehicle system and the unmanned vehicle system based on the three-dimensional simulation scene dynamic map.
9. An electronic device comprising a memory and a processor, wherein the memory is configured to store one or more computer instructions, wherein the one or more computer instructions are executed by the processor to implement the method of any one of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored, the computer program being executable by a processor for implementing the method according to any one of claims 1-7.
CN202110883152.0A 2021-08-02 2021-08-02 Simulation method and system of unmanned cluster system Active CN113761647B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110883152.0A CN113761647B (en) 2021-08-02 2021-08-02 Simulation method and system of unmanned cluster system

Publications (2)

Publication Number Publication Date
CN113761647A (en) 2021-12-07
CN113761647B CN113761647B (en) 2023-06-30

Family

ID=78788360

Country Status (1)

Country Link
CN (1) CN113761647B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102789171A (en) * 2012-09-05 2012-11-21 北京理工大学 Method and system for semi-physical simulation test of visual unmanned aerial vehicle flight control
CN110675418A (en) * 2019-09-26 2020-01-10 深圳市唯特视科技有限公司 Target track optimization method based on DS evidence theory
CN111694287A (en) * 2020-05-14 2020-09-22 北京百度网讯科技有限公司 Obstacle simulation method and device in unmanned simulation scene

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210394787A1 (en) * 2020-06-17 2021-12-23 Shenzhen Guo Dong Intelligent Drive Technologies Co., Ltd. Simulation test method for autonomous driving vehicle, computer equipment and medium
CN114964269A (en) * 2022-08-01 2022-08-30 成都航空职业技术学院 Unmanned aerial vehicle path planning method
CN114964269B (en) * 2022-08-01 2022-11-08 成都航空职业技术学院 Unmanned aerial vehicle path planning method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant