CN110221623A - Air-ground cooperative operation system and positioning method therefor - Google Patents
- Publication number: CN110221623A (application CN201910522367.2A)
- Authority: CN (China)
- Prior art keywords: pose, unmanned ground vehicle, unmanned aerial vehicle, relative pose
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Abstract
The invention discloses an air-ground cooperative operation system, comprising: an unmanned ground vehicle, which builds a three-dimensional map of the target scene while moving on the ground and locates its own pose in the map coordinate system of that three-dimensional map in real time; an unmanned aerial vehicle, which carries out work on an operation target; a data collector, which obtains a first relative pose between the unmanned aerial vehicle and the unmanned ground vehicle and a second relative pose between the operation target and the unmanned ground vehicle; and a data processor, connected to the unmanned ground vehicle, the unmanned aerial vehicle and the data collector, which obtains a third relative pose between the unmanned aerial vehicle and the operation target from the pose of the unmanned ground vehicle in the map coordinate system, the first relative pose and the second relative pose, and sends the third relative pose to the unmanned aerial vehicle so that it adjusts its pose into a preset operation range accordingly. In this system the unmanned ground vehicle assists the unmanned aerial vehicle in positioning itself accurately in the target scene. The invention also discloses a positioning method for the above air-ground cooperative operation system.
Description
Technical field
The present invention relates to the technical field of multi-robot systems, and more specifically to an air-ground cooperative operation system. The invention further relates to a positioning method for such a system.
Background art
With the development of unmanned-aerial-vehicle technology, using unmanned aerial vehicles (UAVs) in place of human operators to carry out work in complex three-dimensional environments is of great significance for reducing economic cost, lowering labor intensity and reducing the safety risks that operators face when executing tasks, and has become a research hotspot for those skilled in the art.
UAVs in the prior art are mainly used to execute non-contact tasks such as reconnaissance and inspection along a particular route, so the positioning accuracy required of the UAV is not high.
However, when executing complex aerial work tasks, especially contact tasks, the working device carried on the UAV is subjected to reaction forces during contact operation, the effective working range of the device is limited, and the on-board camera that identifies the operation target is constrained in focal length and field of view. The UAV therefore has to adjust its pose continuously to remain within the range in which it can successfully execute the work task, so accurate positioning of the UAV within the scene is especially important.
At present, a UAV working in the air is positioned mainly by GPS, an inertial navigation system (INS), or real-time kinematic (RTK) carrier-phase differential positioning. However, GPS positioning is susceptible to electromagnetic interference, the update frequency of GPS data is generally 1 Hz to 2 Hz, and its accuracy is generally at the meter level, so relying on GPS alone cannot satisfy the demands of UAV aerial work; INS positioning accumulates drift error as time increases; and RTK positioning is costly, requires a mobile base station wherever fixed base stations provide no coverage, and still depends on the support of GPS data.
Therefore, how to obtain the exact position of the UAV within the scene is a problem that those skilled in the art urgently need to solve.
Summary of the invention
In view of this, the object of the present invention is to provide an air-ground cooperative operation system and a positioning method therefor, in which an unmanned ground vehicle assists the unmanned aerial vehicle in positioning itself accurately in the target scene, realizing cooperative work between the two. Even when the aerial vehicle's own field of view has blind areas and the operation target cannot be observed, accurate positioning can still be achieved, satisfying the aerial vehicle's need to execute work tasks on the operation target.
To achieve the above object, the invention provides the following technical solution:
An air-ground cooperative operation system, comprising:
an unmanned ground vehicle, which builds a three-dimensional map of the target scene while moving on the ground and locates its own pose in the map coordinate system of the three-dimensional map in real time;
an unmanned aerial vehicle, for carrying out work on an operation target;
a data collector, for obtaining a first relative pose between the unmanned aerial vehicle and the unmanned ground vehicle and a second relative pose between the operation target and the unmanned ground vehicle;
a data processor, connected to the unmanned ground vehicle, the unmanned aerial vehicle and the data collector, for obtaining a third relative pose between the unmanned aerial vehicle and the operation target from the pose of the unmanned ground vehicle in the map coordinate system, the first relative pose and the second relative pose, and for sending the third relative pose to the unmanned aerial vehicle so that it adjusts its pose into a preset operation range according to the third relative pose.
Preferably, the data collector comprises:
a first relative pose acquisition module, for obtaining the first relative pose; and
a second relative pose acquisition module, for obtaining the second relative pose.
Preferably, the first relative pose acquisition module comprises a visual feature marker and a positioning sensor, the positioning sensor being used to detect the visual feature marker so as to obtain the first relative pose, and the positioning sensor being connected to the data processor;
the visual feature marker is arranged on one of the unmanned ground vehicle and the unmanned aerial vehicle, and the positioning sensor is arranged on the other of the two.
Preferably, the second relative pose acquisition module comprises:
an operation sensor mounted on the unmanned aerial vehicle, for obtaining an image of the operation target when the target is within its field of view; and
an image processor, connected to the operation sensor, for obtaining from the image of the operation target a fourth relative pose of the operation target in the coordinate system of the operation sensor, the image processor being connected to the data processor;
the data processor is further used to obtain the second relative pose from the first relative pose and the fourth relative pose.
Preferably, the unmanned ground vehicle is equipped with a laser radar sensor for obtaining basic environment information of the target scene and an odometer for measuring the mileage of the vehicle; the laser radar sensor and the odometer are each connected to the on-board master controller of the vehicle, so that the on-board master controller builds the three-dimensional map of the target scene from the basic environment information and the mileage, and obtains the pose of the vehicle in the map coordinate system.
Preferably, the unmanned aerial vehicle is equipped with a mapping sensor for obtaining supplementary environment information in the blind area of the laser radar sensor; the mapping sensor is connected to the on-board master controller, so that the on-board master controller builds the three-dimensional map of the target scene from the basic environment information, the mileage and the supplementary environment information.
Preferably, the positioning sensor and the mapping sensor are the same visual sensor.
A positioning method for an air-ground cooperative operation system, applied to any of the air-ground cooperative operation systems above, comprising:
building a three-dimensional map of the target scene with the unmanned ground vehicle and obtaining the vehicle's own pose in the map coordinate system of the three-dimensional map;
obtaining the first relative pose between the unmanned ground vehicle and the unmanned aerial vehicle with the data collector, and obtaining the second relative pose between the operation target and the unmanned ground vehicle with the data collector;
obtaining, with the data processor, the third relative pose between the unmanned aerial vehicle and the operation target from the pose of the unmanned ground vehicle in the map coordinate system, the first relative pose and the second relative pose, and sending the third relative pose to the unmanned aerial vehicle so that it adjusts its pose into the preset operation range according to the third relative pose.
Preferably, building the three-dimensional map of the target scene with the unmanned ground vehicle and obtaining the vehicle's own pose in the map coordinate system of the three-dimensional map comprises:
measuring the mileage information of the vehicle with its odometer;
obtaining the basic environment information of the target scene with the vehicle's laser radar sensor;
building, with the vehicle's on-board master controller, the three-dimensional map of the target scene from the mileage information and the basic environment information using the Gmapping algorithm, and obtaining the pose of the vehicle in the map coordinate system.
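The odometry measurement in the steps above can be illustrated with a minimal dead-reckoning sketch. This is not the patent's implementation: it assumes a differential-drive vehicle and simply integrates forward speed and yaw rate into a planar pose, the kind of mileage information that a Gmapping-style SLAM back end then corrects against the laser scans.

```python
import math

def integrate_odometry(pose, v, omega, dt):
    """Propagate a planar pose (x, y, theta) by one odometry step.

    v     -- forward speed from the wheel odometer [m/s]
    omega -- yaw rate [rad/s]
    dt    -- time step [s]
    """
    x, y, theta = pose
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return (x, y, theta)

# Drive straight at 1 m/s for 2 s, then turn in place by 90 degrees over 1 s.
pose = (0.0, 0.0, 0.0)
for _ in range(20):
    pose = integrate_odometry(pose, 1.0, 0.0, 0.1)
for _ in range(10):
    pose = integrate_odometry(pose, 0.0, math.pi / 2, 0.1)
```

In a real vehicle the speeds would come from wheel-encoder ticks, and the drift that accumulates in this integration is exactly what the laser scan matching in the SLAM step corrects.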
Preferably, building the three-dimensional map of the target scene with the unmanned ground vehicle and obtaining the vehicle's own pose in the map coordinate system of the three-dimensional map comprises:
measuring the mileage information of the vehicle with its odometer;
obtaining the basic environment information of the target scene with the vehicle's laser radar sensor;
obtaining the supplementary environment information in the blind area of the laser radar sensor with the mapping sensor of the unmanned aerial vehicle;
building, with the vehicle's on-board master controller, the three-dimensional map of the target scene from the mileage information, the basic environment information and the supplementary environment information using the Gmapping algorithm, and obtaining the pose of the vehicle in the map coordinate system.
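The role of the supplementary environment information in this scheme can be pictured as filling in cells of the map that the vehicle's lidar cannot see. A toy occupancy-grid sketch (illustrative only, not the Gmapping internals): cells observed by the lidar are written first, and cells still unknown, i.e. in the lidar's blind area, are then supplemented from the UAV's mapping sensor without overwriting lidar data.

```python
UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def fuse(grid, lidar_cells, uav_cells):
    """Write lidar observations, then supplement still-unknown cells
    (the lidar blind area) from the UAV's mapping sensor."""
    for cell, value in lidar_cells.items():
        grid[cell] = value
    for cell, value in uav_cells.items():
        if grid.get(cell, UNKNOWN) == UNKNOWN:
            grid[cell] = value
    return grid

grid = {}
lidar = {(0, 0): FREE, (0, 1): OCCUPIED}
uav = {(0, 1): FREE, (2, 2): OCCUPIED}   # (0, 1) already known; (2, 2) is in the blind area
grid = fuse(grid, lidar, uav)
```

The priority rule here, lidar first and the aerial sensor only where the lidar is blind, is one simple choice; a production fusion would weight the two sources probabilistically.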
Preferably, obtaining the first relative pose between the unmanned ground vehicle and the unmanned aerial vehicle with the data collector comprises:
detecting, with the positioning sensor carried on one of the unmanned ground vehicle and the unmanned aerial vehicle, the visual feature marker arranged on the other of the two, the visual feature marker being an AprilTag marker;
processing, with the AprilTag recognition algorithm, the image containing the visual feature marker obtained by the positioning sensor, and obtaining the pose of the visual feature marker in the coordinate system of the positioning sensor;
obtaining, with the data processor, the first relative pose from the pose of the positioning sensor in the UAV coordinate system of the unmanned aerial vehicle, the pose of the visual feature marker in the vehicle coordinate system of the unmanned ground vehicle, and the pose of the visual feature marker in the coordinate system of the positioning sensor;
or, obtaining, with the data processor, the first relative pose from the pose of the positioning sensor in the vehicle coordinate system of the unmanned ground vehicle, the pose of the visual feature marker in the UAV coordinate system of the unmanned aerial vehicle, and the pose of the visual feature marker in the coordinate system of the positioning sensor.
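This frame chain can be sketched with homogeneous transforms (an illustration, not the patent's code), taking the variant in which the marker is on the ground vehicle and the camera on the UAV. Writing T_a_b for the pose of frame b expressed in frame a: extrinsic calibration gives T_uav_cam, the marker mounting gives T_ugv_tag, and the AprilTag detection gives T_cam_tag; the first relative pose is then T_uav_ugv = T_uav_cam * T_cam_tag * inv(T_ugv_tag). All numeric values below are assumed example mountings.

```python
import numpy as np

def transform(yaw, x, y, z):
    """4x4 homogeneous transform: yaw rotation plus a translation."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = [x, y, z]
    return T

# Assumed example values: camera 10 cm below the UAV body origin,
# marker 20 cm above the UGV body origin, tag detected 1.5 m below the camera.
T_uav_cam = transform(0.0, 0.0, 0.0, -0.10)   # camera pose in the UAV frame
T_ugv_tag = transform(0.0, 0.0, 0.0, 0.20)    # marker pose in the UGV frame
T_cam_tag = transform(0.0, 0.0, 0.0, -1.50)   # AprilTag detection result

# First relative pose: the UGV frame expressed in the UAV frame.
T_uav_ugv = T_uav_cam @ T_cam_tag @ np.linalg.inv(T_ugv_tag)
```

With these numbers the UGV body origin comes out 1.8 m below the UAV, the sum of the three vertical offsets, which is a quick sanity check on the chain.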
Preferably, obtaining the second relative pose between the operation target and the unmanned ground vehicle with the data collector comprises:
obtaining the image of the operation target with the operation sensor carried on the unmanned aerial vehicle;
obtaining, with the image processor that has a data connection to the operation sensor, the fourth relative pose of the operation target in the coordinate system of the operation sensor from the image of the operation target;
obtaining, with the data processor, the second relative pose from the fourth relative pose and the first relative pose through a spatial coordinate transformation.
Preferably, obtaining, with the data processor, the third relative pose between the unmanned aerial vehicle and the operation target from the pose of the unmanned ground vehicle in the map coordinate system, the first relative pose and the second relative pose comprises:
obtaining, with the data processor, the pose of the unmanned aerial vehicle in the map coordinate system from the pose of the unmanned ground vehicle in the map coordinate system and the first relative pose;
obtaining, with the data processor, the pose of the operation target in the map coordinate system from the pose of the unmanned ground vehicle in the map coordinate system and the second relative pose;
obtaining, with the data processor, the third relative pose from the pose of the unmanned aerial vehicle in the map coordinate system and the pose of the operation target in the map coordinate system.
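The three steps above are compositions of rigid-body transforms. A minimal sketch with planar poses for brevity (illustrative frame names and values, not the patent's implementation): given the vehicle's map pose T_map_ugv, the first relative pose T_ugv_uav and the second relative pose T_ugv_target, the third relative pose is T_uav_target = inv(T_map_ugv * T_ugv_uav) * (T_map_ugv * T_ugv_target).

```python
import numpy as np

def se2(x, y, theta):
    """3x3 homogeneous transform for a planar pose."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

# Assumed example poses in the map coordinate system built by the UGV.
T_map_ugv = se2(2.0, 1.0, np.pi / 2)      # UGV pose from SLAM
T_ugv_uav = se2(0.5, 0.0, 0.0)            # first relative pose
T_ugv_target = se2(3.0, -1.0, 0.0)        # second relative pose

# Step 1: UAV pose in the map coordinate system.
T_map_uav = T_map_ugv @ T_ugv_uav
# Step 2: operation target pose in the map coordinate system.
T_map_target = T_map_ugv @ T_ugv_target
# Step 3: third relative pose, the target expressed in the UAV frame.
T_uav_target = np.linalg.inv(T_map_uav) @ T_map_target
```

Note that the vehicle's map pose cancels algebraically in the final product; it is still needed in practice because the UAV and the target are observed at different times, at which the vehicle may be at different map poses.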
The air-ground cooperative operation system provided by the invention builds a three-dimensional map of the target scene with the unmanned ground vehicle and locates the vehicle's own pose in the map coordinate system in real time; at the same time, the data collector obtains the first relative pose between the vehicle and the unmanned aerial vehicle, so the ground vehicle serves as a positioning reference for the aerial vehicle and assists it in positioning itself accurately in the target scene. The data collector also obtains the second relative pose between the operation target and the ground vehicle; the data processor then performs spatial coordinate transformations on the vehicle's pose in the map coordinate system, the first relative pose and the second relative pose to obtain the third relative pose between the unmanned aerial vehicle and the operation target, and sends it to the aerial vehicle, which adjusts its pose into the preset operation range accordingly.
It can thus be seen that the air-ground cooperative operation system provided by the invention breaks through the application limitations of unmanned ground vehicles in the prior art and solves the problem of aerial and ground mobile platforms "fighting separately" with "ineffective collaboration", truly realizing cooperative work between the unmanned aerial vehicle and the unmanned ground vehicle. Even when the aerial vehicle's own field of view has blind areas and the operation target cannot be observed, accurate positioning of the aerial vehicle can still be achieved, satisfying the need to execute work tasks on the operation target; the system is particularly suitable for complex aerial contact work.
The positioning method for the air-ground cooperative operation system provided by the invention likewise achieves accurate positioning of the unmanned aerial vehicle.
Detailed description of the invention
In order to explain the embodiments of the invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only embodiments of the invention; for those of ordinary skill in the art, other drawings can be obtained from the provided drawings without creative effort.
Fig. 1 is a structural block diagram of the air-ground cooperative operation system provided by a specific embodiment of the invention;
Fig. 2 is an operation schematic diagram of the air-ground cooperative operation system provided by a specific embodiment of the invention;
Fig. 3 is a schematic diagram of Fig. 2 from another visual angle;
Fig. 4 is a structural schematic diagram of the unmanned aerial vehicle in Fig. 2;
Fig. 5 is a structural schematic diagram of the unmanned ground vehicle in Fig. 2;
Fig. 6 is a schematic diagram of the motion trajectory along which the unmanned aerial vehicle in Fig. 2 follows the unmanned ground vehicle to adjust its pose;
Fig. 7 is a flow chart of the positioning method of the air-ground cooperative operation system provided by a specific embodiment of the invention;
Fig. 8 is a flow chart of a preferred scheme in Fig. 7 for building the three-dimensional map of the target scene with the unmanned ground vehicle and obtaining the vehicle's own pose in the map coordinate system of the three-dimensional map;
Fig. 9 is a flow chart of another preferred scheme in Fig. 7 for building the three-dimensional map of the target scene with the unmanned ground vehicle and obtaining the vehicle's own pose in the map coordinate system of the three-dimensional map;
Fig. 10 is a flow chart of a preferred scheme in Fig. 7 for obtaining the first relative pose between the unmanned ground vehicle and the unmanned aerial vehicle with the data collector;
Fig. 11 is a flow chart of a preferred scheme in Fig. 7 for obtaining the second relative pose between the operation target and the unmanned ground vehicle with the data collector;
Fig. 12 is a flow chart of obtaining the third relative pose between the unmanned aerial vehicle and the operation target with the data processor in Fig. 7.
The reference numerals in Fig. 1 to Fig. 6 are as follows:
1: unmanned ground vehicle; 11: visual feature marker; 12: laser radar sensor; 13: motion trajectory of the unmanned ground vehicle; 2: unmanned aerial vehicle; 21: depth camera; 22: mechanical arm; 23: motion trajectory of the unmanned aerial vehicle; 3: data collector; 4: data processor; 5: operation target.
Specific embodiment
The technical solutions in the embodiments of the invention will be described clearly and completely below in combination with the drawings in the embodiments. Obviously, the described embodiments are only a part of the embodiments of the invention, not all of them. Based on the embodiments of the invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the invention.
The core of the invention is to provide an air-ground cooperative operation system and a positioning method therefor, in which an unmanned ground vehicle assists the unmanned aerial vehicle in positioning itself accurately in the target scene, realizing cooperative work between the two. Even when the aerial vehicle's own field of view has blind areas and the operation target cannot be observed, accurate positioning of the aerial vehicle can still be achieved, satisfying the aerial vehicle's need to execute work tasks on the operation target.
Please refer to Fig. 1 to Fig. 6. Fig. 1 is a structural block diagram of the air-ground cooperative operation system provided by a specific embodiment of the invention; Fig. 2 is an operation schematic diagram of the system; Fig. 3 is a schematic diagram of Fig. 2 from another visual angle; Fig. 4 is a structural schematic diagram of the unmanned aerial vehicle in Fig. 2; Fig. 5 is a structural schematic diagram of the unmanned ground vehicle in Fig. 2; Fig. 6 is a schematic diagram of the motion trajectory along which the unmanned aerial vehicle in Fig. 2 follows the unmanned ground vehicle to adjust its pose.
The present invention provides an air-ground cooperative operation system including an unmanned aerial vehicle 2 and an unmanned ground vehicle 1 that work cooperatively. The unmanned aerial vehicle 2 is preferably equipped with a mechanical arm 22 for carrying out work on an operation target 5. The unmanned ground vehicle 1 builds a three-dimensional map of the target scene while moving within it; the coordinate system corresponding to the three-dimensional map is called the map coordinate system, and from the three-dimensional map it builds, the unmanned ground vehicle 1 can locate its own pose in the map coordinate system in real time.
It should be noted that the main structures of the unmanned aerial vehicle 2 and the unmanned ground vehicle 1 are the same as those of conventional aerial and ground vehicles in the prior art, and the specific way in which the unmanned ground vehicle 1 builds the three-dimensional map of the target scene can be found in the prior art; neither is repeated here.
Preferably, the unmanned aerial vehicle 2 is a ducted aircraft, for example a twin-duct work aircraft.
Preferably, the unmanned aerial vehicle 2 is a tethered aircraft; that is, the unmanned aerial vehicle 2 is connected to the unmanned ground vehicle 1 by a tether cable, so that the aerial vehicle follows the ground vehicle as it runs.
Further preferably, the unmanned ground vehicle 1 is equipped with a power supply unit for powering the unmanned aerial vehicle 2, the aerial vehicle is equipped with a power supply module, and the power supply unit and the power supply module are connected by the tether cable. That is, the tether cable connects the power supply unit to the power supply module, so that the power supply unit on the ground vehicle can charge the aerial vehicle; this scheme gives full play to the ground vehicle's advantages of large load capacity and long endurance.
Of course, the unmanned ground vehicle 1 can also serve as a carrying and parking platform for the unmanned aerial vehicle 2: before the aerial vehicle enters the target scene, it is carried and transported there by the ground vehicle, saving the aerial vehicle's cruise time. On this basis, a charging unit for charging the aerial vehicle can also be arranged on the ground vehicle, so that the aerial vehicle is charged while parked on the ground vehicle, ensuring that charging is available to it.
The air-ground cooperative operation system further includes a data collector 3 and a data processor 4. The data collector 3 obtains the first relative pose between the unmanned ground vehicle 1 and the unmanned aerial vehicle 2 and the second relative pose between the operation target 5 and the unmanned ground vehicle 1; it is connected to the data processor 4 and sends the first and second relative poses to it. Meanwhile, the unmanned ground vehicle 1 is connected to the data processor 4 and sends its pose in the map coordinate system to it. The data processor 4 processes the vehicle's pose in the map coordinate system, the first relative pose and the second relative pose to obtain the third relative pose between the unmanned aerial vehicle 2 and the operation target 5; it is connected to the unmanned aerial vehicle 2 and sends the third relative pose obtained after processing to it, so that the aerial vehicle adjusts its pose into the preset operation range according to the third relative pose.
It can be understood that the third relative pose between the unmanned aerial vehicle 2 and the operation target 5 obtained after processing is preferably sent to the airborne master controller of the aerial vehicle, which controls the flight control system of the aerial vehicle to adjust its pose.
It should be noted that the preset operation range refers to the operating area within which the unmanned aerial vehicle 2, relative to the operation target 5, can normally complete the aerial work task.
In the present invention, "the data collector 3 is connected to the data processor 4, the unmanned ground vehicle 1 is connected to the data processor 4, and the data processor 4 is connected to the unmanned aerial vehicle 2" means that the data collector 3, the unmanned ground vehicle 1 and the unmanned aerial vehicle 2 can each establish a communication connection with the data processor 4, so as to realize the transmission of information.
It should be noted that the data processor 4 in the present invention may be an independent data processing module separate from the unmanned aerial vehicle 2 and the unmanned ground vehicle 1, or, in an alternative scheme, the data processor 4 is arranged on the unmanned ground vehicle 1 or the unmanned aerial vehicle 2, with a communication connection established between the two vehicles.
It can be understood that from the pose of the unmanned ground vehicle 1 in the map coordinate system and the first relative pose between the aerial vehicle and the ground vehicle, the data processor 4 can obtain, through a spatial coordinate transformation, the pose of the unmanned aerial vehicle 2 in the map coordinate system; that is, positioning of the aerial vehicle in the target scene is realized. Since the operation target 5 has a determined position in the target scene, the data processor 4 can also obtain the pose of the operation target 5 in the map coordinate system from the second relative pose and the vehicle's pose in the map coordinate system. Therefore, from the pose of the unmanned aerial vehicle 2 in the map coordinate system and the pose of the operation target 5 in the map coordinate system, the data processor 4 obtains, through a spatial coordinate transformation, the third relative pose between the unmanned aerial vehicle 2 and the operation target 5.
That is, the present invention uses the unmanned ground vehicle 1 to assist the unmanned aerial vehicle 2 in positioning itself accurately in the target scene, realizes cooperative work between the two, and makes full use of the ground vehicle's maneuverability: the ground vehicle serves as a positioning reference for the aerial vehicle, and the aerial vehicle follows the ground vehicle's motion trajectory to adjust its pose.
For example, as shown in Fig. 6, the unmanned ground vehicle 1 locates its own pose in the map coordinate system in real time from the three-dimensional map it has built of the target scene and navigates autonomously, forming the ground vehicle motion trajectory 13. As the ground vehicle runs, its pose in the map coordinate system changes in real time, so the first relative pose between the unmanned aerial vehicle 2 and the ground vehicle also changes in real time; the operation target 5, however, has a determined position in the target scene and therefore a determined pose coordinate in the map coordinate system. The aerial vehicle thus follows the ground vehicle's motion trajectory and continuously adjusts its own pose, forming the aerial vehicle motion trajectory 23, so that the third relative pose between the unmanned aerial vehicle 2 and the operation target 5 remains within the preset operation range.
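The "remains within the preset operation range" condition can be sketched as a simple tolerance check on the third relative pose (the working point and thresholds below are illustrative assumptions, not values from the patent): the aerial vehicle commands a correction whenever the target's position in its own frame leaves a preset box around the working point of the mechanical arm.

```python
def within_operation_range(rel_pos, center, tol):
    """Check whether the target position (translation part of the third
    relative pose, in the UAV frame) lies within +/- tol of the working point."""
    return all(abs(p - c) <= t for p, c, t in zip(rel_pos, center, tol))

# Assumed working point: target 0.6 m ahead of and 0.2 m below the UAV,
# with a tolerance of 5 cm on each axis.
WORK_POINT = (0.6, 0.0, -0.2)
TOL = (0.05, 0.05, 0.05)

def pose_correction(rel_pos):
    """Return the translation the UAV should command, or None if in range."""
    if within_operation_range(rel_pos, WORK_POINT, TOL):
        return None
    return tuple(p - c for p, c in zip(rel_pos, WORK_POINT))
```

A real controller would feed this error into the flight control loop rather than command it directly, but the in-range test is the same.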
Considering the convenience with which the data collector 3 obtains the first and second relative poses, on the basis of the above embodiment the data collector 3 includes a first relative pose acquisition module and a second relative pose acquisition module: the former obtains the first relative pose between the unmanned aerial vehicle 2 and the unmanned ground vehicle 1, and the latter obtains the second relative pose between the operation target 5 and the unmanned ground vehicle 1.
Regarding the specific realization by which the first relative pose acquisition module obtains the first relative pose between the unmanned aerial vehicle 2 and the unmanned ground vehicle 1, on the basis of the above embodiment the module includes a visual feature marker 11 and a positioning sensor. The positioning sensor detects the visual feature marker 11 to obtain the first relative pose, and is connected to the data processor 4, to which it sends the first relative pose. The visual feature marker 11 is arranged on one of the unmanned ground vehicle 1 and the unmanned aerial vehicle 2, and the positioning sensor is arranged on the other of the two.
Preferably, visual signature mark 11 is arranged on unmanned vehicle 1 and the alignment sensor is arranged on unmanned plane 2. It can be understood that, once visual signature mark 11 is fixed on unmanned vehicle 1, its pose in the unmanned vehicle coordinate system of unmanned vehicle 1 is determined; preferably, visual signature mark 11 carries its pose information in the unmanned vehicle coordinate system. Similarly, once the alignment sensor is fixed, its pose in the unmanned plane coordinate system of unmanned plane 2 is determined. When the alignment sensor detects visual signature mark 11, it obtains the pose of visual signature mark 11 in the alignment sensor coordinate system and sends this pose to data processor 4. By spatial coordinate transformation, data processor 4 then obtains the relative pose between the unmanned vehicle coordinate system and the unmanned plane coordinate system, that is, the first relative pose of unmanned vehicle 1 and unmanned plane 2.
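The coordinate chain just described can be sketched with homogeneous transforms. The sketch below is illustrative only and is not part of the disclosed system; the frame names and numeric poses are hypothetical, and rotations are left as identity to keep the example short.

```python
import numpy as np

def make_T(t):
    """4x4 homogeneous transform with identity rotation and translation t
    (rotation omitted to keep the illustration short)."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

def inv_T(T):
    """Invert a rigid-body transform: R -> R^T, t -> -R^T t."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def first_relative_pose(T_uav_sensor, T_sensor_tag, T_vehicle_tag):
    """Pose of the unmanned vehicle frame expressed in the unmanned plane
    frame: chain the sensor mounting pose, the detected tag pose, and the
    tag's known pose on the vehicle."""
    return T_uav_sensor @ T_sensor_tag @ inv_T(T_vehicle_tag)

# Hypothetical numbers: sensor 0.1 m below the UAV origin, tag detected
# 2 m below the sensor, tag mounted 0.5 m above the vehicle origin.
T_uav_vehicle = first_relative_pose(make_T([0.0, 0.0, -0.1]),
                                    make_T([0.0, 0.0, 2.0]),
                                    make_T([0.0, 0.0, 0.5]))
print(T_uav_vehicle[:3, 3])  # vehicle origin seen from the UAV: [0. 0. 1.4]
```

A full implementation would carry rotation matrices from the detector rather than identity blocks, but the composition order is the same.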
The present invention places no limitation on the specific structure of visual signature mark 11. For example, visual signature mark 11 can be an AprilTag mark. An AprilTag mark carries little data, has obvious features, is easy to identify, and is little affected by illumination, viewing angle, partial occlusion and similar factors; at the same time, an AprilTag mark imposes low requirements on distance and resolution, and is therefore suitable for the precise positioning of unmanned plane 2.
Preferably, visual signature mark 11 is a two-dimensional code mark.
The alignment sensor is preferably a depth camera.
It should be noted that the present invention places no limitation on the specific method by which the alignment sensor recognizes visual signature mark 11; those skilled in the art can refer to the prior art.
Preferably, the depth camera uses an AprilTag recognition algorithm to process the image containing the AprilTag mark that it obtains, thereby obtaining the pose of the AprilTag mark in the coordinate system of the depth camera.
It should be noted that the AprilTag recognition algorithm is a mature algorithm in the prior art; its calculation steps and calculation principle are found in the prior art and are not repeated herein. The main improvement of the invention is the application of the AprilTag recognition algorithm to the recognition of the AprilTag mark by the depth camera.
In view of the specific implementation by which the second relative pose obtaining module obtains the second relative pose of operative goals 5 and unmanned vehicle 1, on the basis of the above embodiment, the second relative pose obtaining module includes an operation sensor and an image processor. The operation sensor is set on unmanned plane 2 and obtains an image of operative goals 5 when operative goals 5 is within its field of view. The image processor is connected with the operation sensor and obtains, according to the image of operative goals 5, the fourth relative pose of operative goals 5 in the coordinate system of the operation sensor; the image processor is connected with data processor 4 and sends the fourth relative pose to data processor 4. Data processor 4 is also used to obtain the second relative pose according to the first relative pose and the fourth relative pose.
That is, the present embodiment detects operative goals 5 by means of the operation sensor. When, with the flight of unmanned plane 2, operative goals 5 enters the field of view of the operation sensor, the operation sensor obtains an image of operative goals 5, and under the processing of the image processor the fourth relative pose of operative goals 5 in the coordinate system of the operation sensor is obtained. It can be understood that the operation sensor and unmanned plane 2 have a determined relative position, so the pose of operative goals 5 in the unmanned plane coordinate system can be obtained by coordinate transformation.
Therefore, data processor 4 can obtain, by spatial coordinate transformation, the pose of operative goals 5 in the map coordinates system from the pose of unmanned vehicle 1 in the map coordinates system, the first relative pose of unmanned plane 2 and unmanned vehicle 1, and the pose of operative goals 5 in the coordinate system of the operation sensor.
It can be understood that, in the present embodiment, when operative goals 5 is within the field of view of the operation sensor, unmanned plane 2 can detect operative goals 5 through the operation sensor it carries, and adjust its own pose according to the information detected by the operation sensor. At the same time, the operation sensor sends, through the image processor, the pose of operative goals 5 in the coordinate system of the operation sensor to data processor 4; through the processing of data processor 4, the pose of operative goals 5 in the map coordinates system can be determined. When operative goals 5 is in the blind area of the operation sensor, the operation sensor cannot detect operative goals 5 and unmanned plane 2 cannot position itself by the operation sensor it carries; unmanned plane 2 is then accurately positioned with the assistance of unmanned vehicle 1.
Of course, a second operation sensor for obtaining the pose of operative goals 5 can also be set on unmanned vehicle 1. In this case the second operation sensor and unmanned vehicle 1 have a determined relative position, and the pose of operative goals 5 obtained by the second operation sensor is a pose relative to the coordinate system of the second operation sensor. From the relative pose between the coordinate system of the second operation sensor and the unmanned vehicle coordinate system, together with the pose of unmanned vehicle 1 in the map coordinates system, the pose of operative goals 5 in the map coordinates system can then be obtained. It should be noted that, in this scheme, the image processor is desirably integrated into the second operation sensor.
Of course, another scheme is also possible: unmanned vehicle 1 is equipped with a third operation sensor for obtaining information on a target object of reference in the target scene, wherein the target object of reference has a determined spatial relative pose with operative goals 5. The third operation sensor can obtain the pose of the target object of reference in the coordinate system of the third operation sensor, and the third operation sensor and unmanned vehicle 1 have a determined relative position. Therefore, from the pose of unmanned vehicle 1 in the map coordinates system, the relative pose between the coordinate system of the third operation sensor and the unmanned vehicle coordinate system, the pose of the target object of reference in the coordinate system of the third operation sensor, and the relative pose between operative goals 5 and the target object of reference, the pose of operative goals 5 in the map coordinates system can be obtained by spatial coordinate transformation. It should be noted that, in this scheme, the image processor is desirably integrated into the third operation sensor.
In view of the specific implementation by which unmanned vehicle 1 constructs the three-dimensional map of the target scene, as a preferred embodiment, unmanned vehicle 1 is equipped with laser radar sensor 12 for obtaining the basic environment information of the target scene and an odometer for measuring the mileage of unmanned vehicle 1. Laser radar sensor 12 and the odometer are respectively connected with the vehicle-mounted master controller of unmanned vehicle 1, so that the vehicle-mounted master controller constructs the three-dimensional map of the target scene according to the basic environment information and the mileage, and obtains the pose of unmanned vehicle 1 in the map coordinates system.
Preferably, the vehicle-mounted master controller uses the Gmapping algorithm on the basic environment information detected by laser radar sensor 12 and the mileage of unmanned vehicle 1 measured by the odometer to construct the three-dimensional map and to position in real time the pose of unmanned vehicle 1 in the map coordinates system.
It should be noted that the Gmapping algorithm is a mature algorithm in the prior art; its calculation steps and calculation principle are found in the prior art and are not repeated herein. The main improvement of the invention is the application of the Gmapping algorithm to the construction of the three-dimensional map of the target scene by unmanned vehicle 1 and to its self-positioning.
In view of the fact that, when an obstacle causes occlusion, laser radar sensor 12 cannot detect the environmental information in the portion shielded by the obstacle, that is, the field of view of laser radar sensor 12 is limited, in order to obtain more complete environmental information of the target scene, on the basis of the above embodiment, unmanned plane 2 is equipped with a map-building sensor for obtaining the supplement ambient information in the blind area of laser radar sensor 12. The map-building sensor is connected with the vehicle-mounted master controller, so that the vehicle-mounted master controller constructs the three-dimensional map of the target scene according to the basic environment information, the mileage of unmanned vehicle 1 and the supplement ambient information.
Carried on unmanned plane 2, the map-building sensor can look down on the environmental information of the target scene from the air. It has a field-of-view advantage and is able to detect environmental information that laser radar sensor 12 cannot obtain. The map-building sensor can therefore assist unmanned vehicle 1 in constructing the three-dimensional map of the target scene, which is suitable for cases where the target scene is complicated or the field of view of laser radar sensor 12 of unmanned vehicle 1 is limited.
Preferably, the map-building sensor is a depth camera. It can be understood that, after the depth camera obtains the supplement ambient information in the blind area of laser radar sensor 12, it can generate three-dimensional point cloud data, and what laser radar sensor 12 obtains is also point cloud data. Therefore, after the three-dimensional point cloud data generated by the depth camera is sent to the vehicle-mounted master controller, the vehicle-mounted master controller can perform data fusion on the point cloud data generated by the depth camera and the point cloud data obtained by laser radar sensor 12. Then, according to the fused data and the mileage of unmanned vehicle 1, the vehicle-mounted master controller can construct the three-dimensional map and position in real time the pose of unmanned vehicle 1 in the map coordinates system using the Gmapping algorithm.
It should be noted that the map-building sensor can establish a communication connection with the vehicle-mounted master controller directly; alternatively, the map-building sensor can be connected with the airborne master controller of unmanned plane 2, the airborne master controller having established a communication connection with the vehicle-mounted master controller, so that the signal detected by the map-building sensor is sent to the vehicle-mounted master controller through the airborne master controller. The present invention places no limitation in this regard.
It should be noted that the alignment sensor, the map-building sensor and the operation sensor in the present invention can each be an independent sensor carried on unmanned plane 2, or can be the same sensor, so as to lighten the load of unmanned plane 2.
Preferably, the operation sensor is an individual visual sensor, and the alignment sensor and the map-building sensor are the same visual sensor. That is to say, unmanned plane 2 is preferably equipped with two visual sensors: one for obtaining the pose of operative goals 5, and another that both detects visual signature mark 11 and takes into account environment sensing in the target scene, obtaining the supplement ambient information in the blind area of laser radar sensor 12.
As shown in figure 4, unmanned plane 2 is equipped with vision camera 21, and vision camera 21 is mainly used for obtaining the pose of operative goals 5.
In order to further increase the positioning accuracy of unmanned plane 2, preferably, unmanned plane 2 carries a barometer, an inertial measurement unit and the like, so that data fusion is performed on the detection data of the barometer, the inertial measurement unit and the like to obtain a more accurate position of unmanned plane 2. The specific data fusion method is found in the prior art and is not repeated herein.
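The patent leaves the barometer/IMU fusion to the prior art; one common lightweight choice is a complementary filter on altitude. The sketch below is a generic textbook scheme, not the disclosed method; the gain `alpha` and the sample period are made up.

```python
def complementary_altitude(baro_alts, accel_z, dt=0.01, alpha=0.98):
    """Integrate vertical acceleration for a smooth altitude estimate,
    then pull it toward the noisy but drift-free barometric reading."""
    alt, vel = baro_alts[0], 0.0
    estimates = []
    for z_baro, a in zip(baro_alts, accel_z):
        vel += a * dt                                     # integrate acceleration
        predicted = alt + vel * dt                        # propagate altitude
        alt = alpha * predicted + (1.0 - alpha) * z_baro  # barometer correction
        estimates.append(alt)
    return estimates

# Hovering at 10 m with zero vertical acceleration: the estimate stays at 10 m.
est = complementary_altitude([10.0] * 50, [0.0] * 50)
print(round(est[-1], 3))  # 10.0
```

A production system would more likely use an extended Kalman filter over the full state, but the blending idea is the same.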
In conclusion air-ground coordination operating system provided by the invention, constructs dimensionally target scene by unmanned vehicle 1
Figure, and pose of the positioning unmanned vehicle 1 certainly under map coordinates system in real time, meanwhile, utilize data collector 3 to obtain unmanned vehicle 1
With the first relative pose of unmanned plane 2, positioning references by unmanned vehicle 1 as unmanned plane 2 on ground are assisted by unmanned vehicle 1
Unmanned plane 2 is helped to be accurately positioned in target scene.Meanwhile and since data collector 3 can obtain operative goals 5 and unmanned vehicle 1
The second relative pose, data processor 4 can by spatial coordinate transformation, to pose of the unmanned vehicle 1 under map coordinates system,
The first relative pose and operative goals 5 of unmanned plane 2 and unmanned vehicle 1 and the second relative pose of unmanned vehicle 1 carry out at data
Reason, obtains the third relative pose of unmanned plane 2 Yu operative goals 5, and third relative pose is sent to unmanned plane 2, to make
Unmanned plane 2 adjusts in pose to default job area according to third relative pose.
It can thus be seen that the air-ground coordination operating system provided by the invention breaks through the application limitation of unmanned vehicle 1 in the prior art, solves the problems of aerial mobile platforms and land mobile platforms "fighting separately" and "collaborating ineffectively", and truly realizes the collaborative work of unmanned plane 2 and unmanned vehicle 1. Even when the field of view of unmanned plane 2 has a blind area and operative goals 5 cannot be observed, the accurate positioning of unmanned plane 2 can still be realized, meeting the needs of executing job tasks on operative goals 5; the system is especially suitable for aerial complicated contact operation.
Please refer to Fig. 7 to Figure 12. Fig. 7 is the flow chart of the localization method of the air-ground coordination operating system provided by the specific embodiment of the invention; Fig. 8 is the flow chart of a preferred embodiment of constructing the three-dimensional map of the target scene using the unmanned vehicle and obtaining the pose of the unmanned vehicle in the map coordinates system of the three-dimensional map; Fig. 9 is the flow chart of another preferred embodiment of constructing the three-dimensional map of the target scene using the unmanned vehicle and obtaining the pose of the unmanned vehicle in the map coordinates system of the three-dimensional map; Figure 10 is the flow chart of a preferred embodiment of obtaining the first relative pose using the data collector; Figure 11 is the flow chart of a preferred embodiment of obtaining the second relative pose using the data collector; Figure 12 is the flow chart of obtaining the third relative pose of the unmanned plane and the operative goals using the data processor.
In addition to the above air-ground coordination operating system, the present invention also provides a localization method of an air-ground coordination operating system. The localization method is applied to the air-ground coordination operating system disclosed in any one of the above embodiments, and includes step S1 to step S3:
Step S1: constructing the three-dimensional map of the target scene using unmanned vehicle 1, and obtaining the pose of unmanned vehicle 1 in the map coordinates system of the three-dimensional map.
As a preferred embodiment, step S1 includes step S11 to step S13:
Step S11: measuring the mileage information of unmanned vehicle 1 using the odometer of unmanned vehicle 1.
Step S12: obtaining the basic environment information of the target scene using laser radar sensor 12 of unmanned vehicle 1.
Step S13: constructing, using the vehicle-mounted master controller of unmanned vehicle 1 and according to the mileage information and the basic environment information, the three-dimensional map of the target scene with the Gmapping algorithm, and obtaining the pose of unmanned vehicle 1 in the map coordinates system.
That is, unmanned vehicle 1 in the air-ground coordination operating system corresponding to the present embodiment is equipped with an odometer and laser radar sensor 12. In the present embodiment, the vehicle-mounted master controller processes the information detected by both the odometer and laser radar sensor 12 using the Gmapping algorithm, so as to construct the three-dimensional map of the target scene and realize the real-time positioning of unmanned vehicle 1 in the target scene.
In view of the limited field of view of laser radar sensor 12, in another preferred embodiment, step S1 includes step S14 to step S17:
Step S14: measuring the mileage information of unmanned vehicle 1 using the odometer of unmanned vehicle 1.
Step S15: obtaining the basic environment information of the target scene using laser radar sensor 12 of unmanned vehicle 1.
Step S16: obtaining the supplement ambient information in the blind area of laser radar sensor 12 using the map-building sensor carried on unmanned plane 2.
Step S17: constructing, using the vehicle-mounted master controller of unmanned vehicle 1 and according to the mileage information, the basic environment information and the supplement ambient information, the three-dimensional map of the target scene with the Gmapping algorithm, and obtaining the pose of unmanned vehicle 1 in the map coordinates system.
That is, unmanned vehicle 1 in the air-ground coordination operating system corresponding to the present embodiment is equipped with an odometer and laser radar sensor 12, and unmanned plane 2 in the air-ground coordination operating system is equipped with a map-building sensor. On the basis of the previous embodiment, the added map-building sensor obtains more complete environmental information of the target scene; in other words, the map-building sensor carried on unmanned plane 2 assists unmanned vehicle 1 in building the map of the target scene.
Step S2: obtaining the first relative pose of unmanned vehicle 1 and unmanned plane 2 using data collector 3, and obtaining the second relative pose of operative goals 5 and unmanned vehicle 1 using data collector 3.
The present embodiment places no limitation on the concrete manner of obtaining the first relative pose of unmanned vehicle 1 and unmanned plane 2. As a preferred embodiment, step S2 includes step S21 to step S23.
Step S21: detecting, using the alignment sensor carried on one of unmanned vehicle 1 and unmanned plane 2, the visual signature mark 11 arranged on the other of unmanned vehicle 1 and unmanned plane 2; wherein visual signature mark 11 is an AprilTag mark.
That is, the present embodiment arranges visual signature mark 11 on one of unmanned vehicle 1 and unmanned plane 2, arranges the alignment sensor on the other of unmanned vehicle 1 and unmanned plane 2, and uses the alignment sensor to detect visual signature mark 11, so as to obtain the first relative pose of unmanned plane 2 and unmanned vehicle 1.
Step S22: processing, using the alignment sensor and the AprilTag recognition algorithm, the image containing visual signature mark 11 obtained by the alignment sensor, and obtaining the pose of visual signature mark 11 in the coordinate system of the alignment sensor.
That is, the alignment sensor processes the image containing visual signature mark 11 that it obtains. The AprilTag recognition algorithm is a mature visual recognition algorithm in the prior art. For example, when the alignment sensor is a depth camera, the depth camera can obtain an image information stream. When the image obtained by the depth camera is processed, the RGB image is first converted into a grayscale image; the grayscale image is then binarized with an adaptive threshold; the binary image is then segmented according to whether adjacent pixel values are identical, dividing the picture into black and white chunks; the pixel gradients at the junctions of the black and white chunks are clustered to obtain image edges; within each edge cluster, the pixels are fitted by the least squares method to obtain approximate line segments; a depth-first recursive search of depth 4 then detects quadrilaterals; after candidate quadrilaterals are obtained, the coding scheme is used to determine whether a detected quadrilateral is a component part of an AprilTag mark; finally, the pose of the AprilTag mark in the coordinate system of the depth camera is calculated using the pinhole camera model.
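Most of the pipeline above is standard AprilTag machinery; only its final pinhole-model step is sketched here, for the simplified case of a tag facing the camera squarely. The focal length, principal point and tag size below are hypothetical, and a real implementation recovers the full 6-DOF pose from the four decoded corners rather than from apparent size alone.

```python
import numpy as np

def tag_range(side_px, tag_side_m, fx):
    """Pinhole model: range to a fronto-parallel tag from its apparent
    side length in pixels (z = fx * S / s)."""
    return fx * tag_side_m / side_px

def backproject(u, v, z, fx, fy, cx, cy):
    """Back-project the tag centre pixel to a 3-D point at range z."""
    return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

# Hypothetical camera: fx = fy = 600 px, principal point (320, 240);
# a 0.16 m tag appears 96 px wide, centred on the principal point.
z = tag_range(96.0, 0.16, 600.0)
center = backproject(320.0, 240.0, z, 600.0, 600.0, 320.0, 240.0)
print(z, center)  # 1.0 [0. 0. 1.]
```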
Step S23: obtaining, using data processor 4, the first relative pose of unmanned vehicle 1 and unmanned plane 2 according to the pose of the alignment sensor in the unmanned plane coordinate system of unmanned plane 2, the pose of visual signature mark 11 in the unmanned vehicle coordinate system of unmanned vehicle 1, and the pose of visual signature mark 11 in the coordinate system of the alignment sensor;
Alternatively, obtaining, using the data processor, the first relative pose according to the pose of the alignment sensor in the unmanned vehicle coordinate system of the unmanned vehicle, the pose of the visual signature mark in the unmanned plane coordinate system of the unmanned plane, and the pose of the visual signature mark in the coordinate system of the alignment sensor.
Preferably, visual signature mark 11 is arranged on unmanned vehicle 1 and the alignment sensor is arranged on unmanned plane 2. It can be understood that, according to the pose of the AprilTag mark in the coordinate system of the alignment sensor and the pose of the alignment sensor in the unmanned plane coordinate system of unmanned plane 2, the pose of the AprilTag mark in the unmanned plane coordinate system can be obtained by spatial coordinate transformation; and since the AprilTag mark is arranged on unmanned vehicle 1, according to the pose of the AprilTag mark in the unmanned vehicle coordinate system and the pose of the AprilTag mark in the unmanned plane coordinate system, the first relative pose of unmanned vehicle 1 and unmanned plane 2 can be obtained by spatial coordinate transformation.
In addition, the present embodiment places no specific limitation on the concrete manner of obtaining the second relative pose of operative goals 5 and unmanned vehicle 1. As a preferred embodiment, step S2 includes step S24 to step S26:
Step S24: obtaining the image of operative goals 5 using the operation sensor of unmanned plane 2.
That is, the present embodiment arranges the operation sensor on unmanned plane 2, so that the image of operative goals 5 is obtained when operative goals 5 enters the field of view of the operation sensor.
Step S25: obtaining, using the image processor that has established a data connection with the operation sensor, the fourth relative pose of operative goals 5 in the coordinate system of the operation sensor according to the image of operative goals 5.
It can be understood that those skilled in the art can use the image processor to obtain, according to the image of operative goals 5, the fourth relative pose of operative goals 5 in the coordinate system of the operation sensor.
Step S26: obtaining, using data processor 4, the second relative pose by spatial coordinate transformation according to the fourth relative pose and the first relative pose.
It can be understood that the operation sensor is set on unmanned plane 2, so the coordinate system of the operation sensor and the unmanned plane coordinate system have a determined relative pose. Therefore, according to the pose of the operation sensor in the unmanned plane coordinate system and the pose of operative goals 5 in the coordinate system of the operation sensor, the pose of operative goals 5 in the unmanned plane coordinate system can be obtained by coordinate system transformation; further, according to the first relative pose of unmanned plane 2 and unmanned vehicle 1 and the pose of unmanned vehicle 1 in the map coordinates system, the pose of operative goals 5 in the map coordinates system can be obtained by the transformation of the coordinate systems.
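The composition in step S26 can be written as one chain of homogeneous transforms: the second relative pose is the first relative pose composed with the sensor mounting offset and the fourth relative pose. The sketch below uses hypothetical frames and numbers, with identity rotations for brevity.

```python
import numpy as np

def make_T(t):
    """4x4 homogeneous transform with identity rotation and translation t."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

# T_vehicle_uav   : first relative pose (UAV expressed in the vehicle frame)
# T_uav_sensor    : fixed mounting of the operation sensor on the UAV
# T_sensor_target : fourth relative pose (target in the sensor frame)
T_vehicle_uav   = make_T([0.0, 0.0, 2.0])
T_uav_sensor    = make_T([0.1, 0.0, -0.05])
T_sensor_target = make_T([0.0, 0.5, 1.0])

# Second relative pose: the target expressed in the vehicle frame.
T_vehicle_target = T_vehicle_uav @ T_uav_sensor @ T_sensor_target
print(T_vehicle_target[:3, 3])  # [0.1  0.5  2.95]
```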
Step S3: obtaining, using data processor 4, the third relative pose of unmanned plane 2 and operative goals 5 according to the pose of unmanned vehicle 1 in the map coordinates system, the first relative pose and the second relative pose, and sending the third relative pose to unmanned plane 2, so that unmanned plane 2 adjusts its pose into the default job area according to the third relative pose.
Specifically, step S3 includes step S31 to step S33:
Step S31: obtaining, using data processor 4, the pose of unmanned plane 2 in the map coordinates system according to the pose of unmanned vehicle 1 in the map coordinates system and the first relative pose.
Step S32: obtaining, using data processor 4, the pose of operative goals 5 in the map coordinates system according to the pose of unmanned vehicle 1 in the map coordinates system and the second relative pose.
Step S33: obtaining, using data processor 4, the third relative pose according to the pose of unmanned plane 2 in the map coordinates system and the pose of operative goals 5 in the map coordinates system.
It can be understood that, according to the pose of unmanned vehicle 1 in the map coordinates system and the first relative pose of unmanned plane 2 and unmanned vehicle 1, the pose of unmanned plane 2 in the map coordinates system can be obtained by the transformation of the space coordinate systems; similarly, according to the pose of unmanned vehicle 1 in the map coordinates system and the second relative pose of operative goals 5 and unmanned vehicle 1, the pose of operative goals 5 in the map coordinates system can be obtained by the transformation of the space coordinate systems. Then, according to the pose of unmanned plane 2 in the map coordinates system and the pose of operative goals 5 in the map coordinates system, the third relative pose of unmanned plane 2 and operative goals 5 can be obtained, which facilitates unmanned plane 2 in adjusting its pose according to the third relative pose until unmanned plane 2 is located in the default job area, thereby guaranteeing that unmanned plane 2 carries out accurate operation on operative goals 5.
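Steps S31 to S33 amount to lifting both poses into the map frame and then taking a relative transform. A minimal sketch with hypothetical translations and identity rotations:

```python
import numpy as np

def make_T(t):
    """Homogeneous transform with identity rotation and translation t."""
    T = np.eye(4)
    T[:3, 3] = t
    return T

def inv_T(T):
    """Invert a rigid-body transform: R -> R^T, t -> -R^T t."""
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

T_map_vehicle    = make_T([5.0, 3.0, 0.0])  # vehicle pose in the map frame
T_vehicle_uav    = make_T([0.0, 0.0, 2.0])  # first relative pose
T_vehicle_target = make_T([1.0, 0.0, 3.0])  # second relative pose

T_map_uav    = T_map_vehicle @ T_vehicle_uav      # step S31
T_map_target = T_map_vehicle @ T_vehicle_target   # step S32
T_uav_target = inv_T(T_map_uav) @ T_map_target    # step S33: third relative pose
print(T_uav_target[:3, 3])  # [1. 0. 1.]
```

The UAV's position controller would then drive the translation of `T_uav_target` toward whatever offset the job requires.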
It should also be noted that, in this specification, relational terms such as first and second are used merely to distinguish one entity or operation from another entity or operation, and do not necessarily require or imply any actual relationship or order between these entities or operations.
Each embodiment in this specification is described in a progressive manner; each embodiment highlights its differences from the other embodiments, and the same or similar parts of the embodiments may be referred to one another.
The air-ground coordination operating system provided by the present invention and its localization method have been described in detail above. Specific examples are applied herein to illustrate the principle and implementation of the invention; the explanation of the above embodiments is only intended to help in understanding the method of the invention and its core concept. It should be pointed out that, for those skilled in the art, several improvements and modifications can be made to the present invention without departing from the principle of the invention, and these improvements and modifications also fall within the protection scope of the claims of the invention.
Claims (13)
1. An air-ground coordination operating system, characterized by comprising:
an unmanned vehicle (1), for constructing a three-dimensional map of a target scene by moving on the ground, and for positioning in real time its own pose in a map coordinates system of the three-dimensional map;
an unmanned plane (2), for carrying out operation on operative goals (5);
a data collector (3), for obtaining a first relative pose of the unmanned plane (2) and the unmanned vehicle (1) and a second relative pose of the operative goals (5) and the unmanned vehicle (1);
a data processor (4), connected with the unmanned vehicle (1), the unmanned plane (2) and the data collector (3), for obtaining a third relative pose of the unmanned plane (2) and the operative goals (5) according to the pose of the unmanned vehicle (1) in the map coordinates system, the first relative pose and the second relative pose, and for sending the third relative pose to the unmanned plane (2), so that the unmanned plane (2) adjusts its pose into a default job area according to the third relative pose.
2. The air-ground coordination operating system according to claim 1, characterized in that the data collector (3) comprises:
a first relative pose obtaining module, for obtaining the first relative pose;
a second relative pose obtaining module, for obtaining the second relative pose.
3. The air-ground coordination operating system according to claim 2, characterized in that the first relative pose obtaining module comprises a visual signature mark (11) and an alignment sensor, the alignment sensor being used to detect the visual signature mark (11) to obtain the first relative pose, the alignment sensor being connected with the data processor (4);
the visual signature mark (11) is arranged on one of the unmanned vehicle (1) and the unmanned plane (2); the alignment sensor is arranged on the other of the unmanned vehicle (1) and the unmanned plane (2).
4. The air-ground coordination operating system according to claim 3, characterized in that the second relative pose obtaining module comprises:
an operation sensor set on the unmanned plane (2), for obtaining an image of the operative goals (5) when the operative goals (5) is within its field of view;
an image processor, connected with the operation sensor, for obtaining, according to the image of the operative goals (5), a fourth relative pose of the operative goals (5) in the coordinate system of the operation sensor, the image processor being connected with the data processor (4);
the data processor (4) is also used to obtain the second relative pose according to the first relative pose and the fourth relative pose.
5. The air-ground coordination operating system according to claim 3 or 4, characterized in that the unmanned vehicle (1) is equipped with a laser radar sensor (12) for obtaining basic environment information of the target scene and an odometer for measuring the mileage of the unmanned vehicle (1), the laser radar sensor (12) and the odometer being respectively connected with the vehicle-mounted master controller of the unmanned vehicle (1), so that the vehicle-mounted master controller constructs the three-dimensional map of the target scene according to the basic environment information and the mileage, and obtains the pose of the unmanned vehicle (1) in the map coordinates system.
6. The air-ground coordination operating system according to claim 5, characterized in that the unmanned plane (2) is equipped with a map-building sensor for obtaining supplement ambient information in the blind area of the laser radar sensor (12), the map-building sensor being connected with the vehicle-mounted master controller, so that the vehicle-mounted master controller constructs the three-dimensional map of the target scene according to the basic environment information, the mileage and the supplement ambient information.
7. The air-ground coordination operating system according to claim 6, wherein the positioning sensor and the map-building sensor are the same visual sensor.
8. A localization method of an air-ground coordination operating system, applied to the air-ground coordination operating system according to any one of claims 1-7, comprising:
constructing a three-dimensional map of a target scene using the unmanned vehicle (1), and obtaining the pose of the unmanned vehicle (1) under the map coordinate system of the three-dimensional map;
obtaining a first relative pose between the unmanned vehicle (1) and the unmanned plane (2) using the data collector (3), and obtaining a second relative pose between the operation target (5) and the unmanned vehicle (1) using the data collector (3);
obtaining, using the data processor (4), a third relative pose between the unmanned plane (2) and the operation target (5) according to the pose of the unmanned vehicle (1) under the map coordinate system, the first relative pose and the second relative pose, and sending the third relative pose to the unmanned plane (2), so that the unmanned plane (2) adjusts its pose into a preset operation range according to the third relative pose.
9. The localization method of the air-ground coordination operating system according to claim 8, wherein constructing the three-dimensional map of the target scene using the unmanned vehicle (1) and obtaining the pose of the unmanned vehicle (1) under the map coordinate system of the three-dimensional map comprises:
measuring the mileage information of the unmanned vehicle (1) using the odometer of the unmanned vehicle (1);
obtaining the basic environment information of the target scene using the laser radar sensor (12) of the unmanned vehicle (1);
constructing, using the vehicle-mounted master controller of the unmanned vehicle (1), the three-dimensional map of the target scene with the Gmapping algorithm according to the mileage information and the basic environment information, and obtaining the pose of the unmanned vehicle (1) under the map coordinate system.
10. The localization method of the air-ground coordination operating system according to claim 8, wherein constructing the three-dimensional map of the target scene using the unmanned vehicle (1) and obtaining the pose of the unmanned vehicle (1) under the map coordinate system of the three-dimensional map comprises:
measuring the mileage information of the unmanned vehicle (1) using the odometer of the unmanned vehicle (1);
obtaining the basic environment information of the target scene using the laser radar sensor (12) of the unmanned vehicle (1);
obtaining the supplement environment information covering the blind area of the laser radar sensor (12) using the map-building sensor of the unmanned plane (2);
constructing, using the vehicle-mounted master controller of the unmanned vehicle (1), the three-dimensional map of the target scene with the Gmapping algorithm according to the mileage information, the basic environment information and the supplement environment information, and obtaining the pose of the unmanned vehicle (1) under the map coordinate system.
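Illustrative sketch (not part of the claims): claims 9 and 10 leave the mapping step at the level of "the Gmapping algorithm". Gmapping proper is a Rao-Blackwellized particle-filter SLAM; the sketch below shows only the occupancy-grid update that laser-based mappers of this kind perform, fusing one range scan taken at a known odometry pose into a log-odds grid. Every name, grid size and log-odds constant here is an assumed example.

```python
import math

GRID = 20          # 20 x 20 cells (illustrative)
RES = 0.25         # metres per cell
L_OCC, L_FREE = 0.85, -0.4   # log-odds increments for hit / traversed cells

def bresenham_cells(x0, y0, x1, y1):
    """Integer cells on the ray from (x0, y0) up to, but excluding, (x1, y1)."""
    cells = []
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx, sy = (1 if x1 > x0 else -1), (1 if y1 > y0 else -1)
    err = dx - dy
    x, y = x0, y0
    while (x, y) != (x1, y1):
        cells.append((x, y))
        e2 = 2 * err
        if e2 > -dy: err -= dy; x += sx
        if e2 < dx: err += dx; y += sy
    return cells

def integrate_scan(logodds, pose, ranges, angle_step):
    """Fuse one lidar scan taken at pose = (x, y, yaw) into the log-odds grid."""
    px, py = int(pose[0] / RES), int(pose[1] / RES)
    for i, r in enumerate(ranges):
        a = pose[2] + i * angle_step
        hx = int((pose[0] + r * math.cos(a)) / RES)
        hy = int((pose[1] + r * math.sin(a)) / RES)
        for (cx, cy) in bresenham_cells(px, py, hx, hy):
            if 0 <= cx < GRID and 0 <= cy < GRID:
                logodds[cy][cx] += L_FREE      # cells the beam crossed: more likely free
        if 0 <= hx < GRID and 0 <= hy < GRID:
            logodds[hy][hx] += L_OCC           # beam endpoint: more likely occupied

grid = [[0.0] * GRID for _ in range(GRID)]
# Three beams from a vehicle standing at (2.5 m, 2.5 m), heading 0, ranging 2 m.
integrate_scan(grid, (2.5, 2.5, 0.0), [2.0, 2.0, 2.0], math.pi / 8)
```

In a full mapper each particle carries its own pose hypothesis and map, and scans are matched against the map before integration; the supplement environment information of claim 10 would simply be additional scans fused into the same grid.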
11. The localization method of the air-ground coordination operating system according to any one of claims 8-10, wherein obtaining the first relative pose between the unmanned vehicle (1) and the unmanned plane (2) using the data collector (3) comprises:
detecting, using the positioning sensor carried on one of the unmanned vehicle (1) and the unmanned plane (2), the visual signature mark (11) arranged on the other of the unmanned vehicle (1) and the unmanned plane (2), wherein the visual signature mark (11) is an AprilTag mark;
processing, with the AprilTag recognition algorithm, the image containing the visual signature mark (11) obtained by the positioning sensor, to obtain the pose of the visual signature mark (11) under the coordinate system of the positioning sensor;
obtaining, using the data processor (4), the first relative pose according to the pose of the positioning sensor under the unmanned plane coordinate system of the unmanned plane (2), the pose of the visual signature mark (11) under the unmanned vehicle coordinate system of the unmanned vehicle (1), and the pose of the visual signature mark (11) under the coordinate system of the positioning sensor;
or, obtaining, using the data processor (4), the first relative pose according to the pose of the positioning sensor under the unmanned vehicle coordinate system of the unmanned vehicle (1), the pose of the visual signature mark (11) under the unmanned plane coordinate system of the unmanned plane (2), and the pose of the visual signature mark (11) under the coordinate system of the positioning sensor.
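Illustrative sketch (not part of the claims): the composition in claim 11 amounts to chaining homogeneous transforms — the calibrated sensor-in-UAV pose, the tag pose measured online by the AprilTag recognizer, and the calibrated tag-in-vehicle pose together give the UAV-to-vehicle relative pose. A minimal plain-Python sketch with 4x4 matrices; every extrinsic value below is an assumed example, not taken from the patent.

```python
import math

def rot_z(theta):
    """4x4 homogeneous transform: rotation about z by theta, zero translation."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

def translate(x, y, z):
    """4x4 homogeneous transform: pure translation."""
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert(t):
    """Invert a rigid transform: rotation part transposed, translation -R^T t."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]
    p = [-sum(r[i][k] * t[k][3] for k in range(3)) for i in range(3)]
    return [[r[0][0], r[0][1], r[0][2], p[0]],
            [r[1][0], r[1][1], r[1][2], p[1]],
            [r[2][0], r[2][1], r[2][2], p[2]],
            [0, 0, 0, 1]]

# Known extrinsics (assumed calibrated offline):
T_uav_sensor = translate(0.0, 0.0, -0.1)   # positioning sensor in the UAV frame
T_vehicle_tag = translate(0.3, 0.0, 0.5)   # visual mark in the vehicle frame
# Measured online by the AprilTag recognizer (tag in the sensor frame):
T_sensor_tag = matmul(translate(0.0, 0.5, 2.0), rot_z(math.pi / 2))

# First relative pose (vehicle expressed in the UAV frame):
#   T_uav_vehicle = T_uav_sensor . T_sensor_tag . inv(T_vehicle_tag)
T_uav_vehicle = matmul(matmul(T_uav_sensor, T_sensor_tag), invert(T_vehicle_tag))
```

The mirrored branch of the claim (sensor on the vehicle, tag on the UAV) is the same chain with the roles of the two extrinsics swapped and a final inversion.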
12. The localization method of the air-ground coordination operating system according to any one of claims 8-10, wherein obtaining the second relative pose between the operation target (5) and the unmanned vehicle (1) using the data collector (3) comprises:
obtaining the image of the operation target (5) using the operation sensor carried on the unmanned plane (2);
obtaining, using the image processor having a data connection with the operation sensor, the fourth relative pose of the operation target (5) under the coordinate system of the operation sensor according to the image of the operation target (5);
obtaining, using the data processor (4), the second relative pose from the fourth relative pose and the first relative pose through a spatial coordinate system transformation.
13. The localization method of the air-ground coordination operating system according to any one of claims 8-10, wherein obtaining, using the data processor (4), the third relative pose between the unmanned plane (2) and the operation target (5) according to the pose of the unmanned vehicle (1) under the map coordinate system, the first relative pose and the second relative pose comprises:
obtaining, using the data processor (4), the pose of the unmanned plane (2) under the map coordinate system according to the pose of the unmanned vehicle (1) under the map coordinate system and the first relative pose;
obtaining, using the data processor (4), the pose of the operation target (5) under the map coordinate system according to the pose of the unmanned vehicle (1) under the map coordinate system and the second relative pose;
obtaining, using the data processor (4), the third relative pose according to the pose of the unmanned plane (2) under the map coordinate system and the pose of the operation target (5) under the map coordinate system.
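Illustrative sketch (not part of the claims): claims 12 and 13 describe a pure frame-chaining computation — lift the first and second relative poses into the map frame via the vehicle pose, then difference them to get the third relative pose. A planar (x, y, yaw) sketch of that chain; the patent's poses are full 6-DOF, and every numeric value here is an assumed example.

```python
import math

def compose(a, b):
    """Given pose a of frame B in frame A and pose b of frame C in B,
    return the pose of C in A (SE(2) composition)."""
    ax, ay, ayaw = a
    bx, by, byaw = b
    c, s = math.cos(ayaw), math.sin(ayaw)
    return (ax + c * bx - s * by, ay + s * bx + c * by, ayaw + byaw)

def inverse(p):
    """Invert an (x, y, yaw) pose: rotation negated, translation -R^T t."""
    x, y, yaw = p
    c, s = math.cos(yaw), math.sin(yaw)
    return (-(c * x + s * y), s * x - c * y, -yaw)

pose_map_vehicle = (2.0, 1.0, math.pi / 2)  # vehicle pose from SLAM (claims 9/10)
first_relative = (0.5, 0.0, 0.0)            # UAV in the vehicle frame (claim 11)
second_relative = (1.0, -0.5, 0.0)          # target in the vehicle frame (claim 12)

# Claim 13, step 1: UAV pose in the map frame.
pose_map_uav = compose(pose_map_vehicle, first_relative)
# Claim 13, step 2: target pose in the map frame.
pose_map_target = compose(pose_map_vehicle, second_relative)
# Claim 13, step 3: third relative pose, i.e. the target as seen from the UAV.
third_relative = compose(inverse(pose_map_uav), pose_map_target)
```

Algebraically the detour through the map frame cancels (the result equals `compose(inverse(first_relative), second_relative)`); the map frame matters in practice because the vehicle pose and the two relative poses are refreshed by different sensors at different rates.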
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910522367.2A CN110221623A (en) | 2019-06-17 | 2019-06-17 | A kind of air-ground coordination operating system and its localization method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110221623A true CN110221623A (en) | 2019-09-10 |
Family
ID=67817373
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910522367.2A Pending CN110221623A (en) | 2019-06-17 | 2019-06-17 | A kind of air-ground coordination operating system and its localization method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110221623A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017015898A1 (en) * | 2015-07-29 | 2017-02-02 | Abb 瑞士股份有限公司 | Control system for robotic unstacking equipment and method for controlling robotic unstacking |
WO2017177533A1 (en) * | 2016-04-12 | 2017-10-19 | 深圳市龙云创新航空科技有限公司 | Method and system for controlling laser radar based micro unmanned aerial vehicle |
WO2018010164A1 (en) * | 2016-07-15 | 2018-01-18 | 深圳飞豹航天航空科技有限公司 | Obstacle-avoidance detection method, moving apparatus, and unmanned aerial vehicle |
CN108573619A (en) * | 2018-04-25 | 2018-09-25 | 河南聚合科技有限公司 | A kind of unmanned plane fortune pipe cloud platform of air-ground coordination operation |
CN109240331A (en) * | 2018-09-30 | 2019-01-18 | 北京航空航天大学 | A kind of unmanned plane-unmanned vehicle cluster models time-varying formation control method and system |
CN210377164U (en) * | 2019-06-17 | 2020-04-21 | 酷黑科技(北京)有限公司 | Air-ground cooperative operation system |
Cited By (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110989505A (en) * | 2019-10-28 | 2020-04-10 | 中国人民解放军96782部队 | Unmanned command and dispatch system based on ground equipment machine vision |
CN110827579B (en) * | 2019-11-14 | 2020-09-08 | 北京京航计算通讯研究所 | Control method for optimizing berthing of military harbor naval vessel |
CN110827579A (en) * | 2019-11-14 | 2020-02-21 | 北京京航计算通讯研究所 | Control method for optimizing berthing of military harbor naval vessel |
CN110888456A (en) * | 2019-12-05 | 2020-03-17 | 中国北方车辆研究所 | Autonomous cooperative reconnaissance control method for unmanned aerial vehicle and unmanned vehicle |
CN110888456B (en) * | 2019-12-05 | 2023-06-30 | 中国北方车辆研究所 | Unmanned aerial vehicle and unmanned aerial vehicle autonomous collaborative reconnaissance control method |
CN112991440B (en) * | 2019-12-12 | 2024-04-12 | 纳恩博(北京)科技有限公司 | Positioning method and device for vehicle, storage medium and electronic device |
CN112991440A (en) * | 2019-12-12 | 2021-06-18 | 纳恩博(北京)科技有限公司 | Vehicle positioning method and device, storage medium and electronic device |
CN111506078B (en) * | 2020-05-13 | 2021-06-11 | 北京洛必德科技有限公司 | Robot navigation method and system |
CN111506078A (en) * | 2020-05-13 | 2020-08-07 | 北京洛必德科技有限公司 | Robot navigation method and system |
CN111744122A (en) * | 2020-05-18 | 2020-10-09 | 浙江西贝虎特种车辆股份有限公司 | Multi-end cooperative forest fire isolation belt building system |
CN111765886A (en) * | 2020-05-18 | 2020-10-13 | 浙江西贝虎特种车辆股份有限公司 | Multi-terminal collaborative forest crown lower landform mapping system and method |
CN112000130A (en) * | 2020-09-07 | 2020-11-27 | 哈尔滨工业大学 | Unmanned aerial vehicle's multimachine cooperation high accuracy is built and is drawn positioning system |
CN112000130B (en) * | 2020-09-07 | 2023-04-25 | 哈尔滨工业大学 | Multi-machine collaborative high-precision map building positioning system of unmanned aerial vehicle |
CN112365622A (en) * | 2020-10-28 | 2021-02-12 | 深圳市朗驰欣创科技股份有限公司 | Inspection system, method, terminal and storage medium |
CN112965514A (en) * | 2021-01-29 | 2021-06-15 | 北京农业智能装备技术研究中心 | Air-ground cooperative pesticide application method and system |
CN112965514B (en) * | 2021-01-29 | 2022-07-01 | 北京农业智能装备技术研究中心 | Air-ground cooperative pesticide application method and system |
CN112889480A (en) * | 2021-03-03 | 2021-06-04 | 南京农业大学 | Unmanned intelligent fruit picking system |
CN113238576A (en) * | 2021-05-07 | 2021-08-10 | 北京中科遥数信息技术有限公司 | Positioning method for unmanned aerial vehicle and related device |
CN113120237A (en) * | 2021-05-31 | 2021-07-16 | 北京水云星晗科技有限公司 | Unmanned investigation method |
CN113865579B (en) * | 2021-08-06 | 2024-08-09 | 湖南大学 | Unmanned aerial vehicle pose parameter measurement system and method |
CN113865579A (en) * | 2021-08-06 | 2021-12-31 | 湖南大学 | Unmanned aerial vehicle pose parameter measuring system and method |
CN114167891A (en) * | 2021-11-29 | 2022-03-11 | 湖南汽车工程职业学院 | Ground data acquisition and processing system based on unmanned aerial vehicle |
CN114167891B (en) * | 2021-11-29 | 2022-08-16 | 湖南汽车工程职业学院 | Ground data acquisition and processing system based on unmanned aerial vehicle |
WO2023104207A1 (en) * | 2021-12-10 | 2023-06-15 | 深圳先进技术研究院 | Collaborative three-dimensional mapping method and system |
WO2023109589A1 (en) * | 2021-12-13 | 2023-06-22 | 深圳先进技术研究院 | Smart car-unmanned aerial vehicle cooperative sensing system and method |
CN114489112A (en) * | 2021-12-13 | 2022-05-13 | 深圳先进技术研究院 | Cooperative sensing system and method for intelligent vehicle-unmanned aerial vehicle |
CN114152254A (en) * | 2021-12-14 | 2022-03-08 | 广州极飞科技股份有限公司 | Positioning method, device and system, operation method, system and electronic device |
CN116088064A (en) * | 2023-01-18 | 2023-05-09 | 汕头大学 | Method and system for detecting solenopsis invicta nest based on unmanned aerial vehicle group |
CN116088064B (en) * | 2023-01-18 | 2023-10-13 | 汕头大学 | Method and system for detecting solenopsis invicta nest based on unmanned aerial vehicle group |
CN115793093A (en) * | 2023-02-02 | 2023-03-14 | 水利部交通运输部国家能源局南京水利科学研究院 | Empty ground integrated equipment for diagnosing hidden danger of dam |
CN116858229A (en) * | 2023-07-12 | 2023-10-10 | 广东喜讯智能科技有限公司 | Bridge defect positioning method |
CN117858105A (en) * | 2024-03-07 | 2024-04-09 | 中国电子科技集团公司第十研究所 | Multi-unmanned aerial vehicle cooperation set dividing and deploying method in complex electromagnetic environment |
CN117858105B (en) * | 2024-03-07 | 2024-05-24 | 中国电子科技集团公司第十研究所 | Multi-unmanned aerial vehicle cooperation set dividing and deploying method in complex electromagnetic environment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110221623A (en) | A kind of air-ground coordination operating system and its localization method | |
US10715963B2 (en) | Navigation method and device | |
CN104217439B (en) | Indoor visual positioning system and method | |
KR101622028B1 (en) | Apparatus and Method for controlling Vehicle using Vehicle Communication | |
CN106796112B (en) | Control apparatus, control method, and computer program for detecting vehicle | |
CN106595630B (en) | It is a kind of that drawing system and method are built based on laser navigation Intelligent Mobile Robot | |
CN105652305B (en) | The three-dimensional localization method for determining posture and system of a kind of dynamic environment lower railway detection platform | |
CN104215239B (en) | Guidance method using vision-based autonomous unmanned plane landing guidance device | |
CN109737981B (en) | Unmanned vehicle target searching device and method based on multiple sensors | |
CN108571971A (en) | A kind of AGV vision positioning systems and method | |
CN105335733A (en) | Autonomous landing visual positioning method and system for unmanned aerial vehicle | |
CN102538793B (en) | Double-base-line non-cooperative target binocular measurement system | |
CN110119698A (en) | For determining the method, apparatus, equipment and storage medium of Obj State | |
CN106525044B (en) | The personnel positioning navigation system and its method of large-scale naval vessels based on Ship Structure Graphing | |
KR20200001471A (en) | Apparatus and method for detecting lane information and computer recordable medium storing computer program thereof | |
CN106370160A (en) | Robot indoor positioning system and method | |
EP3765820B1 (en) | Positioning method and positioning apparatus | |
CN109632333A (en) | Automatic driving vehicle performance test methods, device, equipment and readable storage medium storing program for executing | |
CN110879617A (en) | Infrared-guided unmanned aerial vehicle landing method and device | |
CN102998689B (en) | Region decision method based on virtual-sensor | |
CN108627864A (en) | Localization method and system, pilotless automobile system based on automobile key | |
CN210377164U (en) | Air-ground cooperative operation system | |
CN113311821A (en) | Drawing and positioning system and method for multi-pendulous pipeline flaw detection mobile robot | |
CN105301621A (en) | Vehicle positioning device and intelligent driving exam system | |
KR20030026497A (en) | Self-localization apparatus and method of mobile robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||