CN106931963A - Environmental data shared platform, unmanned vehicle, localization method and alignment system - Google Patents

Environmental data shared platform, unmanned vehicle, localization method and alignment system Download PDF

Info

Publication number
CN106931963A
CN106931963A (Application No. CN201710241152.4A)
Authority
CN
China
Prior art keywords
environmental data
unmanned vehicle
data
shared
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710241152.4A
Other languages
Chinese (zh)
Inventor
Inventor not disclosed (不公告发明人)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
High Domain (beijing) Intelligent Technology Research Institute Co Ltd
Original Assignee
High Domain (beijing) Intelligent Technology Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by High Domain (beijing) Intelligent Technology Research Institute Co Ltd filed Critical High Domain (beijing) Intelligent Technology Research Institute Co Ltd
Priority to CN201710241152.4A priority Critical patent/CN106931963A/en
Publication of CN106931963A publication Critical patent/CN106931963A/en
Pending legal-status Critical Current

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

Disclosed are a positioning method for an unmanned aerial vehicle (UAV) using shared environment data, a positioning system, an environmental data sharing platform and a UAV. The positioning method comprises the following steps: an environmental data sharing platform (1) collects and shares environment data; a UAV (2) performs visual positioning based on the environment data. The positioning system comprises the environmental data sharing platform (1), which collects and shares environment data, and the UAV (2). The environmental data sharing platform (1) comprises a storage device (3) holding a database of environment data and a communication port (4) for accessing the database; the communication port (4) is connected to the UAV (2) via wireless communication. The UAV (2) comprises a communication device (5) for receiving environment data, a processor (6) for processing the environment data, and a camera and/or lidar (7) for visual positioning; the UAV (2) performs visual positioning based on the environment data using the camera and/or lidar (7).

Description

Environmental data sharing platform, unmanned aerial vehicle, positioning method and positioning system
Technical field
The invention belongs to the field of unmanned aerial vehicles (UAVs), and relates more particularly to a UAV positioning method using shared environment data in environments without a GPS signal, such as indoors, as well as a corresponding positioning system, environmental data sharing platform and UAV.
Background technology
At present, outdoor positioning of UAVs is based primarily on GPS (GPS is used here as shorthand for NAVSTAR; comparable systems operated by Russia and China also exist worldwide, but since their specific principles are not directly related to the present technical solution they are not discussed further) or on differential GPS, and the outdoor positioning problem has largely been solved. Because the usability of GPS signals keeps improving, a UAV flying outdoors under good conditions can already achieve position recognition at roughly centimetre level using GPS signals. In terms of handling experience, a hovering multi-rotor UAV outdoors, using signals from multiple GPS satellites, can hold a specific point as if nailed in place.
In fact, for rotary-wing UAVs, accurate and reliable hovering is the prerequisite for any more complex flight-path manoeuvre. Precisely because the positioning problem of rotary-wing UAVs in outdoor environments has been largely solved, and at relatively low cost, the application fields of rotary-wing UAVs, and of multi-rotor UAVs in particular, have expanded widely: aerial photography, logistics, surveying, power-line inspection, plant protection and many other fields now feature active use of UAVs.
By contrast, where no GPS signal is available for guidance, stable hovering becomes a major problem for a UAV. In a typical indoor environment in particular, a multi-rotor UAV cannot hover precisely, which severely challenges the feasibility of indoor flight.
In an indoor environment, where the vehicle cannot confirm its own position from GPS signals, the low-cost solution is to estimate position with an IMU. However, because of integration error, a UAV positioned in this way drifts severely; that is, even when the user gives no movement command, the UAV may keep moving in an arbitrary direction, and not slowly. Indoor environments are also usually more complex than outdoor ones and contain more obstacles, so this drift forces the user to keep correcting the UAV's position and leaves no time for the aircraft to carry out its actual flight task.
Another solution for precise positioning in an indoor environment is a motion-capture system: in a pre-arranged environment, infrared high-speed cameras capture passive markers, enabling high-speed position recognition of the target object. A well-known system of this kind is the Vicon motion-capture system. The significant problem with this approach is its uncontrollable cost. Achieving precise indoor positioning with motion capture requires not only installing expensive high-speed camera equipment in advance, tailored to the conditions of the indoor environment, but also fitting the UAV with markers to be captured. A well-performing installation can easily cost millions, which means that for consumer products this scheme has no market value.
Currently, a hot research topic is solving the above positioning problem with image-recognition technology. Image recognition here usually means using one or more cameras to capture the environment around the UAV in real time and judging the UAV's current position from that surrounding environment.
Two typical image-recognition approaches are as follows:
The first positions the vehicle from features of the photographed scene. For example, landmarks with a clearly recognizable marking effect either already exist on the ground or are laid out in advance, and the UAV carries a downward-facing camera; by continuously photographing the ground and reading these landmarks, the UAV judges its own position. The premise of this scheme is prior knowledge of the surrounding environment, so that the data captured in real time can be compared against the landmarks learned in advance to determine the vehicle's position.
Patent document CN103149939A discloses a vision-based UAV dynamic-target tracking and positioning method comprising the following steps. First step: video processing performs detection and image tracking of a dynamic target. Dynamic-target detection: in two consecutive frames, a feature point set is extracted from the first frame and these feature points are tracked in the second frame to obtain the corresponding feature point set; feature points belonging to the dynamic target are then removed from the corresponding set. The motion vectors of the remaining valid feature-point pairs are used to estimate the global motion and obtain a transformation matrix, so that the background motion is compensated and dynamic-target detection under a dynamic background is converted into detection under a static background. After the background of the two frames is compensated by frame differencing, a binary image is produced with a threshold selected automatically by the minimum between-class variance method, the binary image is filtered, and the dynamic target is then described by its minimum enclosing rectangle, completing the detection of the dynamic target. (2) In subsequent frames the detected dynamic target is tracked continuously by an algorithm fusing mean-shift tracking with Kalman prior estimation: Kalman filtering first predicts the target's position in the current frame, that position is passed to the mean-shift tracking algorithm, which searches for the target near that position and judges whether the search result is valid; if valid, it is passed back to the Kalman filter as an observation for the next prediction. Second step: gimbal servo control adjusts the pitch and yaw of the gimbal in real time to keep the target at the centre of the image. Third step: the correspondence between the target in the image and the target in the real environment is established, the distance between the camera and the dynamic target is measured, and precise positioning of the dynamic target is completed. Fourth step: the flight control computer autonomously tracks the ground dynamic target in flight. That patent can autonomously detect and track a moving target, automatically deflecting the optical axis so that the dynamic target always appears at the centre of the imaging plane, and, given the UAV's altitude information, measures the distance between the UAV and the dynamic target in real time from the established model. However, because it lacks data support for an unknown indoor environment, it solves the indoor positioning problem poorly; even when visual positioning is used, the effect remains poor.
The second approach uses SLAM (simultaneous localization and mapping): the aircraft starts from an unknown position in an unknown environment, localizes itself during motion from position estimates and the map, and builds an incremental map on the basis of that self-localization, thereby achieving autonomous positioning and navigation. This approach is more widely applicable, but its requirements on image-processing and computing capability are higher. Although it can start collecting environment data from scratch without any map data, this also means that before the aircraft can work normally it must first complete the map-building work for the current environment.
Patent document CN104062977A discloses a fully autonomous flight control method for a quad-rotor UAV based on visual SLAM, comprising the following steps: image information is collected with a camera mounted on the bottom of the quad-rotor UAV, and attitude angles and acceleration are obtained from an integrated inertial navigation unit; with these two inputs, an improved visual SLAM algorithm computes the three-dimensional position and attitude of the quad-rotor UAV; an extended Kalman filter fuses the visual position information with the three-axis acceleration provided by the inertial navigation unit to obtain accurate position information; the horizontal position and velocity obtained by the above algorithm and the altitude obtained from a barometer are used as feedback to design a PID controller, achieving fully autonomous flight control of the quad-rotor UAV. That patent document builds a quad-rotor visual control experimental platform with an embedded control architecture in which an on-board embedded computer runs the proposed algorithm. However, the UAV structure is complex, the requirements on image-processing and computing capability are high, and the positioning accuracy and timeliness obtained cannot meet the flight-control needs of a UAV.
Therefore, as described in the background above, when a conventional UAV uses visual positioning to solve the indoor positioning problem, some effect can be obtained, but because data support for the unknown indoor environment is lacking, the indoor positioning problem is solved poorly; even with visual positioning, the result remains unsatisfactory.
Summary of the invention
The present application collects the indoor environment data of various venues in advance through a platform and shares it with the UAVs that need it. In this way, when a UAV starts a mission in an unfamiliar indoor environment, it can download the environment data corresponding to that environment in advance, improving the accuracy and reliability of its indoor positioning during flight. Meanwhile, each UAV flying in a particular indoor environment can also save the camera data it collects in real time and send it to the sharing platform, immediately or later, continuously improving the data corresponding to each indoor environment on the sharing platform.
The object of the present invention is achieved by the following technical solutions.
According to one aspect of the present invention, a UAV positioning method using shared environment data comprises the following steps:
In a first step, an environmental data sharing platform collects and shares environment data.
In a second step, the UAV determines its current location and, based on the current location, connects to the environmental data sharing platform to obtain the corresponding environment data.
In a third step, the UAV performs visual positioning based on the environment data. In the present invention, environment data means spatial information about the UAV's flight environment, including but not limited to two-dimensional and/or three-dimensional maps with data describing, for example, terrain, obstacles and passages.
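To make the three steps concrete, the sketch below shows one way the on-board side of this loop could look. It is a minimal illustration only, not part of the disclosure: the platform URL, the query endpoint and the `drone` object with its `coarse_location`, `visual_localize` and `upload_captured_frames` methods are hypothetical placeholders.

```python
import requests  # assumed HTTP client for reaching the sharing platform

PLATFORM_URL = "https://example-environment-platform/api"  # placeholder address

def localization_cycle(drone):
    # Step 1 is the platform's job: it has already collected and shared environment data.
    # Step 2: the UAV determines a coarse current location (GPS fix, user-chosen venue, ...)
    lat, lon = drone.coarse_location()
    resp = requests.get(f"{PLATFORM_URL}/environments", params={"lat": lat, "lon": lon})
    env_data = resp.json()  # 2-D or 3-D map data for the matching venue, if any
    # Step 3: visual positioning against the downloaded environment data
    pose = drone.visual_localize(env_data)
    # Optionally push freshly captured data back so the shared data keeps improving.
    drone.upload_captured_frames(PLATFORM_URL)
    return pose
```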
In the described UAV positioning method using shared environment data, in the third step the UAV uploads the environment data it collects in flight to the environmental data sharing platform.
In the described UAV positioning method using shared environment data, in the first step the environmental data sharing platform collects environment information by lidar and/or camera and performs simultaneous localization and mapping (SLAM) to form environment data. Further, the lidar includes a single-line or multi-line lidar, and the camera includes a monocular depth camera, a binocular depth camera, a monocular structured-light depth camera, a binocular structured-light depth camera or a ToF (time-of-flight) depth camera; the simultaneous localization and mapping (SLAM) is modelled via a PTAM, MonoSLAM, ORB-SLAM, RGBD-SLAM, RTAB-SLAM or LSD-SLAM algorithm module to form the environment data.
In the described UAV positioning method using shared environment data, in the first step the environmental data sharing platform collects ground texture information by camera to serve as environment data.
In the described UAV positioning method using shared environment data, in the first step the environment data includes the name of the venue where the data was collected, the direction of collection, the acquisition mode of the environment data, the type of environment data and/or the processing format of the environment data, wherein the type of environment data includes ground texture maps and three-dimensional maps built by SLAM.
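The metadata fields listed above could be organized, for example, as a record like the following. The field names are illustrative assumptions, not names defined by the patent.

```python
from dataclasses import dataclass

@dataclass
class EnvironmentRecord:
    venue_name: str         # place where the environment data was collected
    capture_direction: str  # e.g. "downward", "forward"
    acquisition_mode: str   # e.g. "monocular camera", "binocular camera", "lidar"
    data_type: str          # "ground_texture_map" or "slam_3d_map"
    data_format: str        # processing format, e.g. "image mosaic", "point cloud"
    payload_uri: str        # where the actual map payload is stored
```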
In the described UAV positioning method using shared environment data, in the first step the environmental data sharing platform is a cloud server, and the UAV connects to the environmental data sharing platform via wireless communication for data interaction.
In the described UAV positioning method using shared environment data, in the second step the UAV is provided with a binocular camera and performs binocular visual positioning based on the environment data.
In the described UAV positioning method using shared environment data, in the second step the UAV searches for and downloads environment data based on the venue name, the direction of collection, the acquisition mode of the environment data, the type of environment data, the content of the environment data and/or the processing format of the environment data; the environment data is a two-dimensional map or a three-dimensional map.
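A search over such records by any combination of the listed fields might look like the sketch below; it is only an illustration, since the platform's real query interface is not specified in the document.

```python
def find_environment_data(records, **criteria):
    """Return the shared records whose metadata matches every given criterion."""
    return [r for r in records
            if all(getattr(r, key) == value for key, value in criteria.items())]

# Example: a UAV with only a downward monocular camera asks for a ground-texture map.
# matches = find_environment_data(all_records,
#                                 venue_name="Some Mall",
#                                 capture_direction="downward",
#                                 data_type="ground_texture_map")
```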
In the described UAV positioning method using shared environment data, in the third step the environment data collected by the UAV is uploaded to the environmental data sharing platform in real time or with a delay.
According to another aspect of the present invention, a positioning system implementing the described UAV positioning method using shared environment data comprises an environmental data sharing platform that collects and shares environment data and a UAV. The environmental data sharing platform comprises a storage device holding a database of environment data and a communication port for accessing the database; the communication port connects to the UAV via wireless communication. The UAV comprises a communication device for receiving environment data, a processor for processing the environment data, and a camera and/or lidar for visual positioning; the UAV performs visual positioning based on the environment data using the camera and/or lidar.
In the described positioning system, the UAV uploads the environment data it collects in flight to the environmental data sharing platform via the communication device.
In the described positioning system, the environmental data sharing platform is provided with a lidar and/or camera for collecting environment information and a SLAM module for simultaneous localization and mapping to form environment data. Further, the lidar includes a single-line or multi-line lidar, and the camera includes a monocular depth camera, a binocular depth camera, a monocular structured-light depth camera, a binocular structured-light depth camera or a ToF (time-of-flight) depth camera; the SLAM module includes a PTAM, MonoSLAM, ORB-SLAM, RGBD-SLAM, RTAB-SLAM or LSD-SLAM algorithm module.
According to another aspect of the present invention, an environmental data sharing platform for the described positioning system is a cloud server comprising a processor, a hard disk holding the database of environment data, memory, a bus and a communication port for data interaction with UAVs.
According to another aspect of the present invention, a UAV implementing the described positioning method comprises a communication device for receiving environment data, a processor for processing the environment data, and a camera and/or lidar for visual positioning. The camera and/or lidar provides the current location to the processor; based on the current location the processor connects to the positioning system via the communication device and obtains the corresponding environment data, and the UAV performs visual positioning based on the environment data using the camera and/or lidar.
According to another aspect of the present invention, a UAV for the described positioning system comprises a communication device for receiving environment data and a camera and/or lidar for visual positioning; the camera and/or lidar collecting environment data connects to the positioning system via the communication device to upload the environment data. The scheme proposed by the present invention is simple in structure. By downloading the corresponding environment data in advance, the accuracy and reliability of indoor positioning flight are improved, and while each UAV flies in a particular indoor environment it can also save the camera data it collects in real time and send it to the sharing platform, immediately or later, continuously improving the indoor environment data on the sharing platform.
Brief description of the drawings
Fig. 1 is a flow chart of the steps of a UAV positioning method using shared environment data according to an embodiment of the present invention.
Fig. 2 is a schematic structural diagram of a UAV positioning system using shared environment data according to an embodiment of the present invention.
Fig. 3 is a schematic structural diagram of a UAV positioning system using shared environment data according to another embodiment of the present invention.
The present invention is further explained below with reference to the drawings and embodiments.
Specific embodiment
The detailed description below is merely exemplary and is not intended to limit the application or its uses. Furthermore, there is no intention to be bound by any explicit or implied theory presented in the above technical field, background, brief summary or the following detailed description. As used herein, the term "module" or "unit" refers to any hardware, software, firmware, electronic control component, processing logic and/or processor device, individually or in any combination, including without limitation: an application-specific integrated circuit (ASIC), an electronic circuit, a processor (shared, dedicated or grouped) executing one or more software or firmware programs, memory, combinational logic circuitry, and/or other suitable components that provide the described functionality. In addition, unless explicitly stated to the contrary, the word "comprising" and its variants should be understood as implicitly including the stated components but not excluding any other components.
An unmanned aerial vehicle, abbreviated "UAV", is an unmanned aircraft operated by radio remote control and by its own on-board program control device. Defined from a technical standpoint, UAVs can be divided into unmanned helicopters, unmanned fixed-wing aircraft, unmanned multi-rotor aircraft, unmanned airships, unmanned paragliders and the like.
The preferred UAV in the embodiments of the present invention is a multi-rotor UAV (also called a multi-rotor aircraft), which may have four rotors, six rotors or more than six rotors. Preferably, the fuselage is made of carbon-fibre material, which, while meeting the requirements of high strength and rigidity, significantly reduces the weight of the fuselage, thereby reducing the power demand of the multi-rotor UAV and improving its manoeuvrability. Of course, in other embodiments of the invention the fuselage may also be made of plastic or any other usable material. The fuselage is provided with a plurality of rotor arms arranged symmetrically about a plane of symmetry of the fuselage; each arm is provided at its end remote from the fuselage with a propeller assembly comprising a motor mounted on the arm and a blade connected to the output shaft of the motor, the rotation axes of all blades lying on the same cylindrical surface.
Of course, this account of the multi-rotor UAV is only a brief explanation; it actually includes many other components, and there are many other types of UAV that could equally be used to achieve the purpose of the present invention, which are not described further here.
However, judging from the demands of the consumer market and the trend towards convenient operation of flight photography, the flight photographing equipment of the present technical solution refers mainly to small and micro multi-rotor UAVs, which are small in volume, low in cost, stable in flight and cheap to shoot with. The aircraft used in the present invention is typically represented by a four-axis multi-rotor aircraft. Such aircraft have begun to be widely used in aerial photography, aerial work, logistics and other fields.
Fig. 1 is a flow chart of the steps of a UAV positioning method using shared environment data according to an embodiment of the present invention. The UAV positioning method using shared environment data comprises the following steps:
In a first step S1, the environmental data sharing platform 1 collects and shares environment data.
In a second step S2, the UAV 2 determines its current location and, based on the current location, connects to the environmental data sharing platform 1 to obtain the corresponding environment data.
In a third step S3, the UAV 2 performs visual positioning based on the environment data. In the present invention, environment data means spatial information about the UAV's flight environment, including but not limited to two-dimensional or three-dimensional maps with data describing, for example, terrain, obstacles and passages.
Preferably, in the UAV positioning method using shared environment data of the present invention, in the third step S3 the UAV 2 uploads the environment data it collects in flight to the environmental data sharing platform 1.
The environmental data sharing platform 1 of the present invention consists of at least one computer and stores the environment data of different indoor environments. A UAV 2 registered with the environmental data sharing platform 1 can download the environment data it needs from the platform to improve the stability of its indoor flight. In addition, these UAVs 2 can also upload the environment data they collect while flying in specific indoor environments to the environmental data sharing platform 1, continuously extending the platform's service range and service depth.
Preferably, in the UAV positioning method using shared environment data of the present invention, in the first step S1 the environmental data sharing platform 1 collects environment information by lidar and/or camera and performs simultaneous localization and mapping (SLAM) to form environment data, wherein the lidar includes a single-line or multi-line lidar, and the camera includes a monocular depth camera, a binocular depth camera, a monocular structured-light depth camera, a binocular structured-light depth camera or a ToF (time-of-flight) depth camera; the simultaneous localization and mapping (SLAM) is modelled via a PTAM, MonoSLAM, ORB-SLAM, RGBD-SLAM, RTAB-SLAM or LSD-SLAM algorithm module to form the environment data.
In one embodiment, the environmental data sharing platform 1 first collects environment data in a SLAM manner. SLAM is implemented mainly with lidar (single-line or multi-line) or with cameras (monocular, binocular, monocular structured light, binocular structured light, ToF). Because the environment data obtained by this modelling is, in the present application, mainly used as a reference for obstacle avoidance during indoor flight, the requirements on modelling precision and algorithms are not especially high, and the more mature visual obstacle-avoidance means currently applied on UAVs are also mainly based on binocular-vision cameras, for example a pair of binocular-vision depth cameras mounted at the front and at the bottom of the UAV. Since the specific algorithms and the choice of data format are not the focus of the present application, they are not elaborated here; relatively well-known algorithms such as PTAM, MonoSLAM, ORB-SLAM, RGBD-SLAM, RTAB-SLAM and LSD-SLAM can all be used appropriately in the present scheme, and the algorithms themselves are not described in further detail.
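As a small illustration of the binocular depth sensing that this paragraph relies on, the sketch below computes a depth map from a rectified stereo pair with OpenCV block matching; it is an assumed sketch under stated assumptions (calibrated, rectified inputs), not the camera firmware actually used, and the obstacle-avoidance logic and SLAM back end are not shown.

```python
import cv2
import numpy as np

def stereo_depth(left_gray, right_gray, focal_px, baseline_m):
    """Depth map from a calibrated, rectified binocular pair (illustrative)."""
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan        # mask invalid matches
    return focal_px * baseline_m / disparity  # Z = f * B / d for each pixel
```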
Preferably, in the UAV positioning method using shared environment data of the present invention, in the first step S1 the environmental data sharing platform 1 collects ground texture information by camera to serve as environment data.
Preferably, in the UAV positioning method using shared environment data of the present invention, in the first step S1 the environment data includes the name of the venue where the data was collected, the direction of collection, the acquisition mode of the environment data, the type of environment data and/or the processing format of the environment data, wherein the type of environment data includes ground texture maps and three-dimensional maps built by SLAM.
Preferably, in the UAV positioning method using shared environment data of the present invention, in the first step S1 the environmental data sharing platform 1 is a cloud server, and the UAV 2 connects to the environmental data sharing platform via wireless communication for data interaction. The environmental data sharing platform 1 is a cloud server or a server cluster of many servers, similar in composition to a general computer architecture: processor, hard disk, memory, system bus and so on. The environmental data sharing platform 1 can provide a simple, efficient, safe and reliable computing service whose processing capacity scales elastically. The processor of the environmental data sharing platform 1 may also include a CPU, RAM, an operating system and application software, and is responsible for multi-task scheduling, including wireless communication functions, memory reading and writing, data processing and so on. The hard disk or memory of the environmental data sharing platform 1 may include fast-read/write SSD disks and pluggable SD-card readers, and is mainly used for storing the environment data for UAVs.
Preferably, in the UAV positioning method using shared environment data of the present invention, in the second step S2 the UAV 2 is provided with a binocular camera and performs binocular visual positioning based on the environment data.
Preferably, in the UAV positioning method using shared environment data of the present invention, in the second step S2 the UAV 2 searches for and downloads environment data based on the venue name, the direction of collection, the acquisition mode of the environment data, the type of environment data, the content of the environment data and/or the processing format of the environment data; the environment data is a two-dimensional map or a three-dimensional map.
Preferably, in the UAV positioning method using shared environment data of the present invention, in the third step S3 the environment data collected by the UAV 2 is uploaded to the environmental data sharing platform 1 in real time or with a delay.
To further illustrate the UAV positioning method using shared environment data of the present invention, examples are given below.
Embodiment 1
For ease of explanation, a relatively simple embodiment is described first.
Although the environmental data sharing platform 1 was illustrated above by the example of the SLAM approach, SLAM places rather high demands on the data computation and processing capability of the UAV 2 and is therefore better suited to higher-performance UAVs. For some more portable, point-and-shoot selfie-type UAVs, an indoor environment-data sharing model with a lower data volume can be used instead, for example a single camera mounted on the selfie UAV facing a fixed, specific direction to collect the corresponding data and build the environment data for the indoor environment. For example, if that specific direction is straight down, the UAV can in fact obtain ground-texture environment data for the whole indoor environment. For certain indoor venues the ground texture is quite distinctive; as long as a UAV can obtain the ground texture of that venue from the sharing platform, it is as if it had obtained complete map data, and it can fly reliably in that venue.
Specifically, suppose floor tiles have been laid in a shopping mall and the plurality of tiles forms a unique, non-repeating texture. The environmental data sharing platform described in this application obtains a complete ground-texture image by shooting vertically downward from a height of about 2 metres, and the corresponding flight-safe regions are also marked on this full image; these data constitute the environment data of that specific venue, i.e. the mall.
A selfie-type UAV A, once it has downloaded the above data, can use the imagery from its downward-looking camera to quickly determine, from the full ground-texture image, where in the mall it currently is, and by comparing the scale of the patterns it can also help judge the UAV's flight altitude. Once the current position and altitude of the UAV are determined, the flight-safe region markings associated with the full ground-texture image also allow the UAV to reliably avoid obstacles in the mall. All of the above is easily achieved simply by obtaining the corresponding environment data from the sharing platform, and with high reliability.
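A bare-bones version of the ground-texture lookup described in this embodiment can be sketched with template matching; it assumes the downward frame has already been brought to the scale and orientation of the shared texture image, which in practice the altitude and heading estimates would provide. It is an illustration, not the matching method the patent mandates.

```python
import cv2

def locate_on_texture_map(texture_map_gray, downward_frame_gray):
    """Find where the downward camera frame best matches inside the shared floor-texture image."""
    result = cv2.matchTemplate(texture_map_gray, downward_frame_gray, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)          # best match and its location
    h, w = downward_frame_gray.shape[:2]
    center = (top_left[0] + w // 2, top_left[1] + h // 2)  # pixel position in the map
    return center, score  # comparing apparent pattern size with the map also hints at altitude
```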
Embodiment 2
For the more complex SLAM case, the following description takes as an example a visual-recognition positioning scheme in which a pair of binocular-vision depth cameras is provided at the front and at the bottom of the UAV.
Because depth is obtained from a pair of binocular-vision cameras, such a UAV can in fact already achieve visual obstacle avoidance in the forward and downward camera directions, allowing it to avoid obstacles in those directions. But because those data are processed immediately, only to judge on the spot whether an obstacle is present, with no accumulation of historical data, the results of the data acquisition cannot improve efficiency through repeated accumulation. In the present scheme, the video data continuously acquired by the depth cameras is stored continuously and, via any of the above SLAM algorithms, the data captured by the forward and downward cameras is converted into a corresponding format of environment data capable of reflecting the physical model of the surroundings, so that simple three-dimensional model coordinates corresponding to the actual indoor environment are obtained.
When the data requirements are fairly simple, the data obtained by the forward camera and by the downward camera can each be processed separately with the SLAM algorithms, yielding separate environment data to be used by other aircraft with the same forward and downward camera arrangement. For another UAV that has only a forward camera, even though its camera configuration differs from the aircraft that collected the data, as long as it downloads from the data sharing platform only the environment data corresponding to the forward camera, it can make full use of that environment data and improve its understanding of the actual environment without having to learn it afresh.
When the data-processing capability is higher, a more complete three-dimensional physical model of a given indoor environment can be built from a camera suite made up of several camera groups; in that case, what is formed is effectively a three-dimensional map of that specific indoor environment. Another UAV that supports the corresponding SLAM algorithm, as long as it downloads that three-dimensional map data, can fly by localizing against the map, further improving flight reliability.
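One generic way a UAV could localize against such a downloaded three-dimensional map is point-cloud alignment. The minimal ICP sketch below (NumPy/SciPy, point-to-point, no outlier handling) is only a stand-in for whichever SLAM relocalization scheme the platform and aircraft actually agree on; it assumes a rough initial pose is available.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_localize(scan, map_points, init_pose=np.eye(4), iterations=20):
    """Align a local 3-D scan (N x 3) to the shared map (M x 3); return a 4x4 pose."""
    pose = init_pose.copy()
    tree = cKDTree(map_points)
    for _ in range(iterations):
        pts = (pose[:3, :3] @ scan.T).T + pose[:3, 3]   # scan in map frame under current guess
        _, idx = tree.query(pts)
        matched = map_points[idx]                       # nearest map point per scan point
        mu_s, mu_m = pts.mean(axis=0), matched.mean(axis=0)
        H = (pts - mu_s).T @ (matched - mu_m)
        U, _, Vt = np.linalg.svd(H)                     # Kabsch closed-form rotation
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                        # guard against a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_m - R @ mu_s
        delta = np.eye(4)
        delta[:3, :3], delta[:3, 3] = R, t
        pose = delta @ pose
    return pose
```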
As described above, when the data sharing platform is built, a corresponding record needs to be set up for each different indoor environment. Besides the environment data itself, the record also needs to state the name of the indoor environment and, typically, the map style of the indoor environment, such as the texture map described above, data collected by binocular-vision cameras facing a specific direction, or a three-dimensional map model built with a SLAM algorithm of a certain form.
Embodiment 3
If a given aircraft had to collect the corresponding environment data only during the user's own flights, and the positioning performance of indoor flight were improved only by that aircraft updating its own environment data, the scheme would be commercially impractical: the effect would be poor and the cost paid would be high.
The present application emphasizes accumulating, in advance and at scale, the environment data of different venues through the individual usage behaviour of a massive number of different users.
At the very beginning, the platform operator may need to take UAVs to some commonly used indoor flying venues and actively collect environment data, but more importantly users are encouraged, by various means, to fly in different indoor venues, and an attempt is made to collect the historical environment data produced by these users, which might otherwise be wasted or ignored.
As environment data keeps accumulating for different venues, its accuracy becomes higher and higher, and as the frequency of use grows, the environment data matches the actual environment ever more closely and quickly.
Moreover, the present application is not limited to indoor environments: it applies equally to environments where the GPS signal is poor in practice and reliable positioning is hard to obtain.
A typical venue is an urban environment crowded with skyscrapers, for example near a particular tall building. Because of interference from adjacent buildings and metal objects, and because cities often contain a large amount of electromagnetic interference, the GPS signal is unreliable there. Based on the present scheme, as long as the UAV can obtain the environment data for that venue, it can reliably complete its own positioning and navigation tasks by visual positioning without relying on conventional GPS positioning.
Embodiment 4
This part further describes the division and classification of the names of environment data on the data sharing platform.
Some specific venues can be defined by their names, for example a certain indoor park or indoor plaza; they may of course also be outdoor environments, such as Hong Kong Disneyland.
In that case, when the UAV is used, the user simply specifies the corresponding venue name so that the environment data of that venue is matched and downloaded, giving the UAV reference data for the actual flight.
Of course, data matching for a specific venue can also be automatic.
For venues where GPS coordinates can be obtained, the UAV, once started, determines its current position by GPS and then queries the sharing platform for environment data corresponding to those coordinates; if such data exists, it requests the shared environment data.
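Sketched as code, that start-up check might read as follows; the `platform` client and its methods are assumed names, since the document does not define an API.

```python
def fetch_environment_for_gps_fix(platform, lat, lon, radius_m=200):
    """On power-up: is there shared environment data near the GPS fix? If so, request it."""
    candidates = platform.query_by_coordinates(lat, lon, radius_m)
    if not candidates:
        return None                              # no shared data; fall back to on-board sensing
    nearest = min(candidates, key=lambda c: c["distance_m"])
    return platform.download(nearest["record_id"])
```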
Furthermore, since the UAV also continuously collects the corresponding environment data while in use, taking the first embodiment of this application as an example, when a UAV continuously scanning the ground texture finds that the texture matches the environment data of a certain venue on the data platform, it can also automatically download the environment data corresponding to that venue to improve its flight stability.
Of course, in the above example, because some local textures or partial three-dimensional maps of certain venues may be identical, the above operation of automatically matching the environment data of a specific venue may also issue a confirmation to the user and proceed only after the user has confirmed.
Embodiment 5
The sharing platform can be set up in the cloud, with each UAV connecting to it by wireless communication or other communication means to complete the data sharing.
Of course, the platform's data can also be packaged periodically onto a memory unit on the UAV body, so that the UAV body itself can use the data. This is particularly suitable when the data volume is small. For example, if a user clearly intends to take a UAV to Hong Kong for a trip, then after entering the Hong Kong region the UAV, when powered on, determines from GPS that it is in Hong Kong and then, using the network when idle, downloads all known environment data for Hong Kong to the UAV body. When the user flies the aircraft at any specific venue in Hong Kong, the aircraft can match environment data from the data held on the body. When the user leaves Hong Kong with the UAV and heads for the next destination, the UAV re-downloads the data package for the new place and may optionally delete or retain the Hong Kong data package.
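The region-package behaviour in this embodiment could be handled by a small cache policy such as the one below; the package and storage interfaces are invented purely for illustration.

```python
def refresh_region_cache(platform, storage, current_region, previous_region=None):
    """Pre-load every known environment package for the region just entered."""
    if previous_region and previous_region != current_region:
        storage.delete_package(previous_region)          # optional: the user may keep it instead
    if not storage.has_package(current_region):
        package = platform.download_region_package(current_region)
        storage.save_package(current_region, package)    # venues in this region now match offline
```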
Embodiment 6
Considering that the amount of data collected in flight by a binocular-vision UAV is large, and that for some aerial-photography UAVs the communication-channel resources occupied while transmitting aerial images are already considerable, the upload of the collected environment data can also be deferred until the channel is idle, for example when the UAV's communication channel is idle and not occupied by aerial-photography data, or when the UAV has not taken off but is powered on. Of course, when the communication-channel resources are sufficient, the corresponding environment data can also be uploaded immediately.
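A deferred-upload rule of the kind described here can be expressed very simply; the link-status checks below are hypothetical, standing in for whatever the flight stack actually exposes.

```python
from collections import deque

pending_uploads = deque()

def share_captured_data(link, captured_batch):
    """Queue captured environment data; send it only when the channel is free of video traffic."""
    pending_uploads.append(captured_batch)
    if link.busy_with_image_transmission():
        return                        # defer: aerial-photography traffic keeps priority
    while pending_uploads:
        link.send_to_platform(pending_uploads.popleft())
```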
Fig. 2 is a schematic structural diagram of a UAV positioning system according to an embodiment of the present invention. A positioning system implementing the described UAV positioning method using shared environment data comprises an environmental data sharing platform 1 that collects and shares environment data and a UAV 2. The environmental data sharing platform 1 comprises a storage device 3 holding a database of environment data and a communication port 4 for accessing the database; the communication port 4 connects to the UAV 2 via wireless communication. The UAV 2 comprises a communication device 5 for receiving environment data, a processor 6 for processing the environment data, and a camera and/or lidar 7 for visual positioning; the UAV 2 performs visual positioning based on the environment data using the camera and/or lidar 7 and uploads the environment data it collects in flight to the environmental data sharing platform 1 via the communication device 5.
In one embodiment, the processor 6 includes a general-purpose processor, a digital signal processor, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), analogue circuits, digital circuits, combinations thereof, or other processors known now or developed later. The storage device 3 includes one or more read-only memories (ROM), random-access memories (RAM), flash memories, electrically erasable programmable read-only memories (EEPROM) or other types of memory.
In one embodiment, the UAV includes a communication device 5 for receiving environment data, a processor 6 for processing the environment data, a camera and/or lidar 7 for visual positioning, an I/O interface, functional devices, a power unit and other components. The elements are electrically connected with one another, directly or indirectly, to realize the transmission or interaction of data. After receiving an execution instruction, the processor 6 executes the program contained in the software function modules; the method performed by the UAV disclosed in any embodiment of the present invention can be applied in, or realized by, the processor.
The memory is used to store the environment data of the UAV. The memory may be internal storage of the UAV or removable storage, and may be, but is not limited to, random-access memory (RAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM) and so on. The memory can be used to store programs.
The processor may be an integrated-circuit chip with signal-processing capability. The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP) and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, capable of implementing or executing the methods, steps and logic diagrams disclosed in the embodiments of the present invention. The general-purpose processor may be a microprocessor or any conventional processor.
In one embodiment, wireless access is established by the communication device 5, which includes at least one of a wireless LAN communication device, a mobile communication network device, a stratospheric communication network device and a satellite network communication device, each with a different priority; the communication device 5 establishes the wireless communication link between the UAV 2 and the environmental data sharing platform 1. The mobile communication network device is mainly composed of 2G/3G/4G wireless communication chipsets. The wireless LAN communication device may be a Bluetooth, ZigBee or Wi-Fi module and can establish short-range communication over the 2.4 GHz band; indoors, or outdoors when moving slowly, the connection between the UAV 2 and the environmental data sharing platform 1 is preferably established through this device. Stratospheric communication networks typically use helium airships or balloons as platforms carrying relay stations, at a platform altitude of 17 km to 22 km above the ground; when the UAV flies over a large area in the field, the connection between the UAV 2 and the environmental data sharing platform 1 is preferably established through the stratospheric communication network device. The satellite network communication device establishes the connection between the UAV 2 and the environmental data sharing platform 1 over a satellite communication channel and is normally used as emergency communication when no other wireless communication network is available.
In one embodiment, the wireless transmission network is selected according to wireless network cost or access speed. The present application designs the following priority scheme: Wi-Fi network, priority 0; 4G wireless network, priority 1; 3G wireless network, priority 2; stratospheric communication network, priority 3; satellite communication network, priority 4. Priority levels 0 to 4 rank the selectable wireless networks from high to low. If several wireless signals are present at the same time and their signal strength is adequate, the UAV first selects the Wi-Fi network as the radio access network; when the Wi-Fi signal strength is inadequate, the UAV next selects the 4G network as the radio access network, and so on.
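Expressed directly, the priority table and selection rule above amount to something like this sketch (function and key names are illustrative).

```python
# Lower number = higher priority, matching the scheme above.
NETWORK_PRIORITY = {"wifi": 0, "4g": 1, "3g": 2, "stratosphere": 3, "satellite": 4}

def select_access_network(signal_ok):
    """signal_ok: dict mapping network name -> whether its signal strength is currently usable."""
    usable = [name for name, ok in signal_ok.items() if ok and name in NETWORK_PRIORITY]
    return min(usable, key=NETWORK_PRIORITY.get) if usable else None
```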
Power unit includes electron speed regulator, motor, rotor etc., wherein, the electron speed regulator is electrically connected with motor, Rotor is arranged on motor, the control signal that the electron speed regulator receiving processor sends, controlled motor rotation, so as to drive rotation The rotation of the wing.
Fig. 3 is the structural representation of the alignment system of unmanned vehicle according to an embodiment of the invention, and one kind is implemented The alignment system of the unmanned vehicle localization method of described shared environment data includes collecting the environment of simultaneously shared environment data Data sharing platform 1 and unmanned vehicle 2, database of the environmental data shared platform 1 including the environmental data that is stored with Storage device 3 and access the COM1 4 of the database, the COM1 4 via described in wireless communication connection nobody fly Row device 2, the unmanned vehicle 2 includes receiving the communication equipment 5 of environmental data, the processor 6 of processing environment data and is used for The camera and/or laser radar 7 of vision positioning.Further, the unmanned vehicle 2 utilizes institute based on the environmental data Camera and/or laser thunder 7 are stated up to carrying out vision positioning, the environmental data that the unmanned vehicle 2 will be gathered in-flight via The communication equipment 5 uploads the environmental data shared platform 1, and the environmental data shared platform 1 is provided with for gathering environment The laser radar and/or camera of information and for concurrently building figure with positioning SLAM to form the SLAM modules 8 of environmental data, its In, the laser radar includes single line laser radar or multi-line laser radar, and the camera includes monocular depth camera, double Mesh depth camera, monocular structure light depth camera, binocular structure light depth camera or ToFtime of flight depth Camera;The SLAM modules 8 include PTAM algoritic modules, MonoSLAM algoritic modules, ORB-SLAM algoritic modules, RGBD- SLAM algoritic modules, RTAB-SLAM algoritic modules or LSD-SLAM algoritic modules.
In one embodiment, the monocular structured-light depth camera has three core components: a laser projector, a diffractive optical element (DOE) and an infrared camera.
In one embodiment, the ToF (time-of-flight) depth camera emits modulated near-infrared light through its sensor; the light is reflected when it meets an object, and the sensor converts the time difference or phase difference between emission and reflection into the distance of the photographed object, thereby producing depth information.
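As an illustration only, the two conversions mentioned above can be written as d = c * Δt / 2 for a direct round-trip time measurement and d = c * Δφ / (4π f) for a phase-shift measurement with modulation frequency f. The small Python sketch below assumes a 20 MHz modulation frequency and hypothetical function names; it is not part of this disclosure.

import math

C = 299_792_458.0  # speed of light in m/s

def depth_from_time(delta_t_s: float) -> float:
    # Round-trip travel time to distance: d = c * dt / 2.
    return C * delta_t_s / 2.0

def depth_from_phase(delta_phi_rad: float, f_mod_hz: float = 20e6) -> float:
    # Phase shift of the modulated light to distance: d = c * dphi / (4 * pi * f).
    return C * delta_phi_rad / (4.0 * math.pi * f_mod_hz)

# Example: at 20 MHz modulation, a phase shift of about 2.68 rad corresponds to roughly 3.2 m.
print(round(depth_from_phase(2.68), 2))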
In one embodiment, the ORB-SLAM algorithm module can run in real time and is applicable to various settings, indoor or outdoor, large scenes or small scenes. The system is highly robust: it handles images with vigorous motion well and allows fairly generous loop closing, relocalization, and even fully automatic map initialization.
In one embodiment, the RGBD-SLAM algorithm module extracts feature points from every frame of the acquired depth image stream, performs feature point matching between adjacent frames, removes gross outliers with RANSAC, and then computes from the matches a pose (position and attitude). At the same time, the attitude information provided by an IMU can be fused by filtering, using filtering theory (EKF, UKF, PF) or optimization theory (TORO or G2O tree/graph optimization), finally yielding the optimal pose estimate.
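The following Python sketch, assuming OpenCV and NumPy, illustrates only the adjacent-frame matching and RANSAC pose step described above; the intrinsic matrix K and the depth scale are placeholder values, and the IMU filtering and TORO/G2O optimization stages are omitted.

import cv2
import numpy as np

K = np.array([[525.0, 0.0, 319.5],
              [0.0, 525.0, 239.5],
              [0.0, 0.0, 1.0]])      # example pinhole intrinsics (placeholder)
DEPTH_SCALE = 1.0 / 1000.0           # assumes depth stored in millimetres

def relative_pose(gray_prev, depth_prev, gray_curr):
    # Extract ORB features in the previous and current frames.
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(gray_prev, None)
    kp2, des2 = orb.detectAndCompute(gray_curr, None)
    if des1 is None or des2 is None:
        return None
    # Match features between the two adjacent frames.
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    pts3d, pts2d = [], []
    for m in matches:
        u, v = kp1[m.queryIdx].pt
        z = float(depth_prev[int(v), int(u)]) * DEPTH_SCALE
        if z <= 0.0:                 # skip pixels without valid depth
            continue
        # Back-project the previous-frame keypoint into 3D camera coordinates.
        x = (u - K[0, 2]) * z / K[0, 0]
        y = (v - K[1, 2]) * z / K[1, 1]
        pts3d.append([x, y, z])
        pts2d.append(kp2[m.trainIdx].pt)
    if len(pts3d) < 6:
        return None
    # RANSAC inside solvePnPRansac removes the remaining gross outliers.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        np.float32(pts3d), np.float32(pts2d), K, None)
    return (rvec, tvec) if ok else None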
In the present invention, an environmental data sharing platform for the described positioning system is a cloud server; the server includes a processor, a hard disk storing the database of environmental data, memory, a bus and a communication port for data interaction with the unmanned aerial vehicle.
In the present invention, an unmanned aerial vehicle implementing the described localization method includes a communication device 5 for receiving environmental data, a processor 6 for processing environmental data, and a camera and/or laser radar 7 for visual positioning. The camera and/or laser radar 7 sends the current location to the processor 6; based on the current location, the processor 6 connects to the positioning system via the communication device 5 to obtain the corresponding environmental data; and the unmanned aerial vehicle performs visual positioning using the camera and/or laser radar 7 based on the environmental data.
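A minimal sketch of this vehicle-side flow is given below for illustration only: the HTTP endpoint, the query parameters and the localize_with_map callback are hypothetical placeholders, not an interface defined by this disclosure.

import requests

PLATFORM_URL = "http://example-platform.local/api/environment"  # hypothetical endpoint

def fetch_environment_data(lat: float, lon: float) -> dict:
    # Ask the sharing platform for the environmental data covering the current location.
    resp = requests.get(PLATFORM_URL, params={"lat": lat, "lon": lon}, timeout=5)
    resp.raise_for_status()
    return resp.json()  # e.g. a ground-texture map tile or a SLAM-built 3D map tile

def localization_step(coarse_position, sensor_frames, localize_with_map):
    # Fetch shared data for the coarse position, then run local visual positioning
    # (camera and/or laser radar) against that shared map.
    lat, lon = coarse_position
    env_data = fetch_environment_data(lat, lon)
    return localize_with_map(sensor_frames, env_data)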
In the present invention, an unmanned aerial vehicle for the described positioning system includes a communication device 5 for receiving environmental data and a camera and/or laser radar 7 for visual positioning; the camera and/or laser radar 7 that collects the environmental data connects to the positioning system via the communication device 5 to upload the environmental data.
Although embodiments of the present invention have been described above with reference to the accompanying drawings, the invention is not limited to the specific embodiments and fields of application described; the specific embodiments are merely illustrative and instructive, not restrictive. Under the guidance of this specification, and without departing from the scope protected by the claims of the present invention, those of ordinary skill in the art may devise many other forms, all of which fall within the scope of protection of the present invention.

Claims (10)

1. An unmanned aerial vehicle localization method with shared environmental data, comprising the following steps:
in a first step (S1), an environmental data sharing platform (1) collects and shares environmental data;
in a second step (S2), an unmanned aerial vehicle (2) determines its current location and, based on the current location, connects to the environmental data sharing platform (1) to obtain corresponding environmental data; in a third step (S3), the unmanned aerial vehicle (2) performs visual positioning based on the environmental data.
2. The unmanned aerial vehicle localization method with shared environmental data according to claim 1, characterized in that:
in the first step (S1), the environmental data sharing platform (1) collects environmental information by laser radar and/or camera and performs simultaneous localization and mapping (SLAM) to form the environmental data.
3. The unmanned aerial vehicle localization method with shared environmental data according to claim 1, characterized in that:
in the first step (S1), the environmental data sharing platform (1) collects ground texture information by camera as the environmental data.
4. The unmanned aerial vehicle localization method with shared environmental data according to claim 1, characterized in that:
in the first step (S1), the environmental data includes the name of the place of environmental data collection, the orientation of environmental data collection, the collection mode of the environmental data, the type of the environmental data and/or the processing format of the environmental data, wherein the type of the environmental data includes a ground texture map and/or a three-dimensional map built by SLAM.
5. The unmanned aerial vehicle localization method with shared environmental data according to claim 1, characterized in that: in a fourth step (S4), the unmanned aerial vehicle (2) uploads the environmental data collected in flight to the environmental data sharing platform (1).
6. A positioning system implementing the unmanned aerial vehicle localization method with shared environmental data according to any one of claims 1-5, the positioning system comprising an environmental data sharing platform (1) that collects and shares environmental data and an unmanned aerial vehicle (2), characterized in that: the environmental data sharing platform (1) includes a storage device (3) storing the database of the environmental data and a communication port (4) for accessing the database; the communication port (4) is connected to the unmanned aerial vehicle (2) via wireless communication; the unmanned aerial vehicle (2) includes a communication device (5) for receiving environmental data, a processor (6) for processing environmental data, and a camera and/or laser radar (7) for visual positioning; and the unmanned aerial vehicle (2) performs visual positioning using the camera and/or laser radar (7) based on the environmental data.
7. The positioning system according to claim 6, characterized in that the environmental data sharing platform (1) is provided with a laser radar and/or camera for collecting environmental information and a SLAM module (8) for simultaneous localization and mapping (SLAM) to form the environmental data.
8. An environmental data sharing platform for the positioning system according to claim 6 or 7, characterized in that: the environmental data sharing platform is a cloud server, and the server includes a processor, a hard disk storing the database of the environmental data, memory, a bus and a communication port for data interaction with the unmanned aerial vehicle.
9. An unmanned aerial vehicle implementing the localization method according to any one of claims 1-5, the unmanned aerial vehicle including a communication device (5) for receiving environmental data, a processor (6) for processing environmental data, and a camera and/or laser radar (7) for visual positioning, characterized in that: the camera and/or laser radar (7) sends the current location to the processor (6); based on the current location, the processor (6) connects to the positioning system via the communication device (5) to obtain corresponding environmental data; and the unmanned aerial vehicle performs visual positioning using the camera and/or laser radar (7) based on the environmental data.
10. An unmanned aerial vehicle for the positioning system according to claim 6 or 7, the unmanned aerial vehicle including a communication device (5) for receiving environmental data and a camera and/or laser radar (7) for visual positioning, characterized in that: the camera and/or laser radar (7) that collects the environmental data connects to the positioning system via the communication device (5) to upload the environmental data.
CN201710241152.4A 2017-04-13 2017-04-13 Environmental data shared platform, unmanned vehicle, localization method and alignment system Pending CN106931963A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710241152.4A CN106931963A (en) 2017-04-13 2017-04-13 Environmental data shared platform, unmanned vehicle, localization method and alignment system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710241152.4A CN106931963A (en) 2017-04-13 2017-04-13 Environmental data shared platform, unmanned vehicle, localization method and alignment system

Publications (1)

Publication Number Publication Date
CN106931963A true CN106931963A (en) 2017-07-07

Family

ID=59436871

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710241152.4A Pending CN106931963A (en) 2017-04-13 2017-04-13 Environmental data shared platform, unmanned vehicle, localization method and alignment system

Country Status (1)

Country Link
CN (1) CN106931963A (en)

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107391641A (en) * 2017-07-11 2017-11-24 北京航空航天大学 A kind of anti-interference data management system of civilian Unmanned Aerial Vehicle Data Link
CN107289948A (en) * 2017-07-24 2017-10-24 成都通甲优博科技有限责任公司 A kind of UAV Navigation System and method based on Multi-sensor Fusion
CN108195359B (en) * 2017-12-12 2020-02-07 北京建筑大学 Method and system for acquiring spatial data
CN108195359A (en) * 2017-12-12 2018-06-22 北京建筑大学 The acquisition method and system of spatial data
WO2019126957A1 (en) * 2017-12-25 2019-07-04 深圳前海达闼云端智能科技有限公司 Terminal-cloud combined positioning method and apparatus, electronic device and computer program product
CN108427438A (en) * 2018-04-11 2018-08-21 北京木业邦科技有限公司 Flight environment of vehicle detection method, device, electronic equipment and storage medium
CN108549088A (en) * 2018-04-27 2018-09-18 科沃斯商用机器人有限公司 Localization method, equipment, system based on robot and storage medium
CN108549088B (en) * 2018-04-27 2020-10-02 科沃斯商用机器人有限公司 Positioning method, device and system based on robot and storage medium
US11644339B2 (en) 2018-06-20 2023-05-09 Huawei Technologies Co., Ltd. Database construction method, positioning method, and related device
WO2019242392A1 (en) * 2018-06-20 2019-12-26 华为技术有限公司 Database construction method, positioning method and relevant device therefor
CN110688500A (en) * 2018-06-20 2020-01-14 华为技术有限公司 Database construction method, positioning method and related equipment thereof
CN110688500B (en) * 2018-06-20 2021-09-14 华为技术有限公司 Database construction method, positioning method and related equipment thereof
CN109002037A (en) * 2018-06-27 2018-12-14 中国人民解放军国防科技大学 Multi-robot collaborative path following method based on deep learning
CN109002037B (en) * 2018-06-27 2021-03-23 中国人民解放军国防科技大学 Multi-robot collaborative path following method based on deep learning
CN111174788B (en) * 2018-11-13 2023-05-02 北京京东乾石科技有限公司 Indoor two-dimensional mapping method and device
CN111174788A (en) * 2018-11-13 2020-05-19 北京京东尚科信息技术有限公司 Indoor two-dimensional map building method and device
CN109559277B (en) * 2018-11-28 2023-02-28 中国人民解放军国防科技大学 Multi-unmanned aerial vehicle cooperative map construction method oriented to data sharing
CN109559277A (en) * 2018-11-28 2019-04-02 中国人民解放军国防科技大学 Multi-unmanned aerial vehicle cooperative map construction method oriented to data sharing
US11235823B2 (en) 2018-11-29 2022-02-01 Saudi Arabian Oil Company Automation methods for UAV perching on pipes
WO2020168667A1 (en) * 2019-02-18 2020-08-27 广州小鹏汽车科技有限公司 High-precision localization method and system based on shared slam map
CN110275181A (en) * 2019-07-08 2019-09-24 武汉中海庭数据技术有限公司 A kind of vehicle-mounted mobile measuring system and its data processing method
CN110446164A (en) * 2019-07-23 2019-11-12 深圳前海达闼云端智能科技有限公司 Mobile terminal positioning method and device, mobile terminal and server
CN110749323A (en) * 2019-10-22 2020-02-04 广州极飞科技有限公司 Method and device for determining operation route
CN110749323B (en) * 2019-10-22 2022-03-18 广州极飞科技股份有限公司 Method and device for determining operation route
WO2021163881A1 (en) * 2020-02-18 2021-08-26 深圳市大疆创新科技有限公司 Data processing method and system, device, and readable storage medium
WO2022000846A1 (en) * 2020-06-30 2022-01-06 深圳市大疆创新科技有限公司 Radar assembly and mobile platform having same
CN112762934A (en) * 2020-12-14 2021-05-07 浙江理工大学 Lower limb movement direction prediction device and method
CN112762934B (en) * 2020-12-14 2023-12-22 浙江理工大学 Lower limb movement direction prediction device and method
CN112666973A (en) * 2020-12-15 2021-04-16 四川长虹电器股份有限公司 TOF-based unmanned aerial vehicle and method for maintaining and changing formation of cluster of unmanned aerial vehicle in flight
CN112666973B (en) * 2020-12-15 2022-04-29 四川长虹电器股份有限公司 Method for keeping and changing formation of unmanned aerial vehicle cluster in flight based on TOF
CN113216565A (en) * 2021-04-26 2021-08-06 广东工业大学 Intelligent plastering furring brick device
CN115421506A (en) * 2022-03-28 2022-12-02 北京理工大学 Model prediction control-based unmanned aerial vehicle periodic trajectory tracking and obstacle avoidance method
CN115421506B (en) * 2022-03-28 2023-08-11 北京理工大学 Unmanned aerial vehicle periodic track tracking and obstacle avoidance method based on model predictive control
CN116203993A (en) * 2023-05-04 2023-06-02 南开大学 Multi-sensor-based power parafoil drop point control method and system

Similar Documents

Publication Publication Date Title
CN106931963A (en) Environmental data shared platform, unmanned vehicle, localization method and alignment system
CN206804018U (en) Environmental data server, unmanned vehicle and alignment system
CN108351649B (en) Method and apparatus for controlling a movable object
CN110062919B (en) Drop-off location planning for delivery vehicles
US20210065400A1 (en) Selective processing of sensor data
US11866198B2 (en) Long-duration, fully autonomous operation of rotorcraft unmanned aerial systems including energy replenishment
CN110494360B (en) System and method for providing autonomous photography and photography
CN108351653B (en) System and method for UAV flight control
KR102010568B1 (en) System having a plurality of Unmanned Aerial Vehicles and Real world 3 dimensional Space Search method using Swarm Intelligence
EP3347789B1 (en) Systems and methods for detecting and tracking movable objects
Kong et al. Autonomous landing of an UAV with a ground-based actuated infrared stereo vision system
AU2021204188A1 (en) A backup navigation system for unmanned aerial vehicles
CN110058602A (en) Multi-rotor unmanned aerial vehicle autonomic positioning method based on deep vision
CN111527463A (en) Method and system for multi-target tracking
CN107637064A (en) Method and apparatus for image procossing
CN107850902A (en) Camera configuration in loose impediment
US10313575B1 (en) Drone-based inspection of terrestrial assets and corresponding methods, systems, and apparatuses
Cui et al. Search and rescue using multiple drones in post-disaster situation
JP6481228B1 (en) Determination device, control device, imaging system, flying object, determination method, and program
KR102467855B1 (en) A method for setting an autonomous navigation map, a method for an unmanned aerial vehicle to fly autonomously based on an autonomous navigation map, and a system for implementing the same
CN215813349U (en) Unmanned aerial vehicle formation target real-time tracking and modeling system
JP6459012B1 (en) Control device, imaging device, flying object, control method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20170707