CN108427438A - Flight environment detection method and apparatus, electronic device, and storage medium - Google Patents
Flight environment detection method and apparatus, electronic device, and storage medium
- Publication number
- CN108427438A CN108427438A CN201810322215.3A CN201810322215A CN108427438A CN 108427438 A CN108427438 A CN 108427438A CN 201810322215 A CN201810322215 A CN 201810322215A CN 108427438 A CN108427438 A CN 108427438A
- Authority
- CN
- China
- Prior art keywords
- vehicle
- flight environment
- information
- obstacle
- flight
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/12—Target-seeking control
Abstract
The embodiments of the present disclosure disclose a flight environment detection method and apparatus, an electronic device, and a storage medium. The method includes: obtaining obstacle sensing data detected by an aircraft in a flight environment; and generating map information of the flight environment according to the obstacle sensing data, and identifying obstacle information in the flight environment. With the embodiments of the present disclosure, map information of a flight environment with a high density of obstacles can be established, and the obstacles in it can be surveyed, providing reliable data for the personnel concerned, further improving survey accuracy, and reducing labor costs.
Description
Technical field
This disclosure relates to the field of intelligent recognition technology, and in particular to a flight environment detection method and apparatus, an electronic device, and a storage medium.
Background
In the forestry industry, forest management for economic purposes is an important job. Especially for artificially cultivated commercial tree stands, fine-grained management of the trees directly determines their future yield and quality, and in turn their economic benefit.
Forest surveys of planted or natural forests are an important task. Their purpose is to carry out systematic data sampling and processing according to the goals of forest management, and to predict information about the forest's resources. In the forest survey process, remote sensing technology is widely used; for example, remote sensing satellites can measure and estimate the overall coverage and growth condition of a forest from a distance.
In recent years, unmanned aerial vehicle (UAV) technology has been widely applied. Beyond consumer photography and videography, using UAVs to replace traditional manual work has become an emerging direction in industry. Using UAVs for forest surveys is an efficient approach: compared with satellites, UAVs are easier to deploy, measure at closer range, and achieve higher accuracy. Various methods and practices of surveying forests with UAVs have therefore gradually appeared in the industry.
Summary of the invention
The embodiments of the present disclosure provide a flight environment detection method and apparatus, an electronic device, and a storage medium.
In a first aspect, an embodiment of the present disclosure provides a flight environment detection method, including:
obtaining obstacle sensing data detected by an aircraft in a flight environment; and
generating map information of the flight environment according to the obstacle sensing data, and identifying obstacle information in the flight environment.
Optionally, the obstacle sensing data includes first image information of obstacles in the flight environment obtained by an image sensor.
Optionally, the obstacle sensing data includes distance information of obstacles in the flight environment obtained by a distance sensor, and second image information of obstacles in the flight environment obtained by an image sensor.
Optionally, the obstacle sensing data includes first image information obtained by a binocular camera arranged on the aircraft; and
generating the map information of the flight environment according to the obstacle sensing data includes:
generating the map information according to the first image information obtained by the binocular camera and a SLAM algorithm.
Optionally, identifying the obstacle information in the flight environment includes:
identifying the obstacle information in the flight environment according to the first image information obtained by the binocular camera.
Optionally, the obstacle sensing data includes lidar point cloud data obtained by a lidar arranged on the aircraft; and
generating the map information of the flight environment according to the obstacle sensing data includes:
generating the map information according to the lidar point cloud data obtained by the lidar and a SLAM algorithm.
Optionally, identifying the obstacle information in the flight environment includes:
identifying the obstacle information in the flight environment according to second image information obtained by an image sensor arranged on the aircraft and the lidar point cloud data.
Optionally, identifying the obstacle information in the flight environment includes:
recognizing image information of obstacles in the flight environment according to a machine learning model trained on sample data, to obtain the obstacle information in the flight environment.
Optionally, the method further includes:
planning a survey flight path according to the obstacle information, the survey flight path being used for surveying obstacles in the flight environment.
Optionally, the flight environment is an environment in which satellite positioning accuracy is lower than a predetermined threshold.
Optionally, the flight environment is an environment planted with trees, with flight taking place between the trees; and the obstacle information includes information about the trees.
Optionally, the method further includes:
generating a survey report of the flight environment according to the map information of the flight environment and the obstacle information in the flight environment, the survey report including the information about the trees and the spatial position information of the trees in the flight environment.
Optionally, the information about the trees includes one or more of species, tree age, diameter at breast height (DBH), ground diameter, tree height, and crown width.
In a second aspect, an embodiment of the present disclosure provides a flight environment detection apparatus, including:
an acquisition module configured to obtain obstacle sensing data detected by an aircraft in a flight environment; and
a first generation module configured to generate map information of the flight environment according to the obstacle sensing data, and to identify obstacle information in the flight environment.
Optionally, the obstacle sensing data includes first image information of obstacles in the flight environment obtained by an image sensor.
Optionally, the obstacle sensing data includes distance information of obstacles in the flight environment obtained by a distance sensor, and second image information of obstacles in the flight environment obtained by an image sensor.
Optionally, the obstacle sensing data includes first image information obtained by a binocular camera arranged on the aircraft; and the first generation module includes:
a first generation submodule configured to generate the map information according to the first image information obtained by the binocular camera and a SLAM algorithm.
Optionally, the first generation module includes:
a first identification submodule configured to identify the obstacle information in the flight environment according to the first image information obtained by the binocular camera.
Optionally, the obstacle sensing data includes lidar point cloud data obtained by a lidar arranged on the aircraft; and the first generation module includes:
a second generation submodule configured to generate the map information according to the lidar point cloud data obtained by the lidar and a SLAM algorithm.
Optionally, the first generation module includes:
a second identification submodule configured to identify the obstacle information in the flight environment according to second image information obtained by an image sensor arranged on the aircraft and the lidar point cloud data.
Optionally, the first generation module includes:
a third identification submodule configured to recognize image information of obstacles in the flight environment according to a machine learning model trained on sample data, to obtain the obstacle information in the flight environment.
Optionally, the apparatus further includes:
a planning module configured to plan a survey flight path according to the obstacle information, the survey flight path being used for surveying obstacles in the flight environment.
Optionally, the flight environment is an environment in which satellite positioning accuracy is lower than a predetermined threshold.
Optionally, the flight environment is an environment planted with trees, with flight taking place between the trees; and the obstacle information includes information about the trees.
Optionally, the apparatus further includes:
a second generation module configured to generate a survey report of the flight environment according to the map information of the flight environment and the obstacle information in the flight environment, the survey report including the information about the trees and the spatial position information of the trees in the flight environment.
Optionally, the information about the trees includes one or more of species, tree age, DBH, ground diameter, tree height, and crown width.
The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the functions described above.
In a possible design, the structure of the flight environment detection apparatus includes a memory and a processor, the memory being used for storing one or more computer instructions that support the flight environment detection apparatus in executing the flight environment detection method of the first aspect, and the processor being configured to execute the computer instructions stored in the memory. The flight environment detection apparatus may further include a communication interface for communication between the flight environment detection apparatus and other devices or a communication network.
In a third aspect, an embodiment of the present disclosure provides an electronic device including a memory and a processor, the memory being used for storing one or more computer instructions, wherein the one or more computer instructions are executed by the processor to implement the method steps described in the first aspect.
In a fourth aspect, an embodiment of the present disclosure provides a computer-readable storage medium for storing computer instructions used by the flight environment detection apparatus, including the computer instructions involved in executing the flight environment detection method of the first aspect.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects:
The embodiments of the present disclosure establish map information of a flight environment from the obstacle sensing data detected in that environment, and identify obstacle information in the flight environment based on the obstacle sensing data, such as the category, external shape, size, and other desired attribute information of the obstacles. With the embodiments of the present disclosure, map information of a flight environment with a high density of obstacles can be established, and the obstacles in it can be surveyed, providing reliable survey data for the personnel concerned, further improving survey accuracy, and reducing labor costs.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Description of the drawings
Other features, objects, and advantages of the present disclosure will become apparent from the following detailed description of non-limiting embodiments, taken in conjunction with the accompanying drawings. In the drawings:
Fig. 1 shows a flowchart of a flight environment detection method according to an embodiment of the present disclosure;
Fig. 2 shows a schematic diagram of a UAV surveying under the forest canopy;
Fig. 3 shows a schematic diagram of map information built according to an embodiment of the present disclosure;
Figs. 4A-4C show structural block diagrams of flight environment detection using a binocular camera according to an embodiment of the present disclosure;
Figs. 5A-5C show schematic diagrams of planning a survey path according to an embodiment of the present disclosure;
Fig. 6 shows a structural block diagram of a flight environment detection apparatus according to an embodiment of the present disclosure;
Fig. 7 shows a schematic structural diagram of an electronic device suitable for implementing the flight environment detection method according to an embodiment of the present disclosure.
Detailed description
Hereinafter, exemplary embodiments of the present disclosure will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily implement them. In addition, for the sake of clarity, parts unrelated to describing the exemplary embodiments are omitted from the drawings.
In the present disclosure, it should be understood that terms such as "comprising" or "having" are intended to indicate the presence of the features, numbers, steps, actions, components, parts, or combinations thereof disclosed in this specification, and are not intended to exclude the possibility that one or more other features, numbers, steps, actions, components, parts, or combinations thereof are present or added.
It should also be noted that, in the absence of conflict, the embodiments of the present disclosure and the features in the embodiments may be combined with each other. The present disclosure is described in detail below with reference to the drawings and in conjunction with the embodiments.
Existing UAV forest survey methods mostly fly above the tree crowns, which means remote sensing measurements are taken from above the forest. However, this approach cannot provide accurate forest survey information: the telemetry of a UAV flying above the crowns is blocked by the canopy, so accurate forest survey information, such as details like the diameter at breast height (DBH) of the trees, cannot be obtained. Fine-grained forest surveys, such as operational-level forest inventories, need to obtain as many forest details as possible, which makes a method that can survey and record tree details at closer range extremely necessary.
Fig. 1 shows a flowchart of a flight environment detection method according to an embodiment of the present disclosure. As shown in Fig. 1, the flight environment detection method includes the following steps S101-S102:
In step S101, obstacle sensing data detected by an aircraft in a flight environment is obtained.
In step S102, map information of the flight environment is generated according to the obstacle sensing data, and obstacle information in the flight environment is identified.
In this embodiment, the aircraft may be, for example, a UAV, and obstacle sensing data in the flight environment can be obtained by arranging corresponding sensors on the UAV. The flight environment may be a survey area that needs to be surveyed, such as a forest. As the aircraft flies in the flight environment, it can continuously obtain obstacle sensing data and send the acquired sensing data to a remote server over a communication network, where the remote server establishes map information of the flight environment according to the obstacle sensing data and then plans a flight path according to the map; alternatively, a processing device arranged on the aircraft establishes the map information according to the obstacle sensing data and plans the flight path. At the same time, obstacle information in the flight environment can be identified according to the obstacle sensing data. The obstacle information may be attribute information of the obstacles in the flight environment, such as the type, external shape, size, and relative position of an obstacle, and other information related to it, which can be set according to actual survey needs and is not limited here. For example, most of the obstacles in a forest are trees, so the obstacle information may include one or more of the species, tree age, DBH, ground diameter, tree height, crown width, and so on.
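As an illustration only (the patent does not specify any data format), a per-tree obstacle record of the kind described above might be held in a structure such as the following; all field names are assumptions:

```python
from dataclasses import dataclass

@dataclass
class TreeObstacle:
    """One surveyed tree; field names are illustrative, not from the patent."""
    tree_id: int
    species: str
    position: tuple          # (x, y, z) in the map frame, metres
    dbh_m: float = 0.0       # diameter at breast height
    height_m: float = 0.0
    crown_width_m: float = 0.0

tree = TreeObstacle(tree_id=1, species="pine", position=(2.0, 3.0, 0.0),
                    dbh_m=0.32, height_m=14.5)
print(tree.species, tree.dbh_m)  # pine 0.32
```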
In one embodiment, the UAV can provide lift through a power plant, for example in a multi-rotor configuration, and complete autonomous flight based on a flight control system. The flight control system can convert a specified path into control signals for the power unit, driving the power plant so that the UAV flies from point A to point B. Under normal circumstances, a UAV can use satellite positioning, such as GPS, to complete positioning and navigation. However, in certain special environments such as forest surveying, the satellite signal can be blocked by the tree crowns, and the reflection of the signal by the trees introduces multipath effects, preventing the satellite positioning system from completing high-accuracy positioning under the canopy. In addition, although satellite signals can provide positioning and navigation for a UAV, they cannot achieve the required goal of flying the UAV under the crowns, because satellite signals cannot provide the UAV with the positions and information of the obstacles in the survey area. To allow a UAV to fly in regions with a high obstacle density, such as under a forest canopy, the UAV can use other positioning and navigation methods, for example providing obstacle avoidance and navigation based on SLAM (Simultaneous Localization and Mapping). A SLAM algorithm can autonomously complete positioning and map construction in an unknown area based on obstacle sensing data.
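A full SLAM implementation is beyond the scope of this description, but the two ingredients it combines, a motion update for the aircraft pose and the placement of sensed obstacles into the world frame, can be sketched as follows (all names and numbers are illustrative, not the patent's algorithm):

```python
import math

def integrate_pose(pose, v, yaw_rate, dt):
    """Dead-reckoning motion update: advance (x, y, heading) one time step."""
    x, y, th = pose
    th_new = th + yaw_rate * dt
    return (x + v * math.cos(th_new) * dt,
            y + v * math.sin(th_new) * dt,
            th_new)

def observe_landmark(pose, rng, bearing):
    """Place a range/bearing obstacle observation into the world frame."""
    x, y, th = pose
    return (x + rng * math.cos(th + bearing),
            y + rng * math.sin(th + bearing))

# Fly straight east for two 1 s steps at 1 m/s, then map an obstacle 3 m ahead.
pose = (0.0, 0.0, 0.0)
for _ in range(2):
    pose = integrate_pose(pose, v=1.0, yaw_rate=0.0, dt=1.0)
landmark = observe_landmark(pose, rng=3.0, bearing=0.0)
print(landmark)  # (5.0, 0.0): 2 m travelled plus 3 m ahead
```

A real SLAM system additionally corrects the pose from the landmark re-observations (the "localization" half); this sketch shows only the mapping direction.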
In one embodiment, the map information established according to the obstacle sensing data can be a spatial map. The spatial map is a three-dimensional map built continuously from the obstacle sensing data during the aircraft's uninterrupted flight, and may include information about whether obstacles exist at each relative position in the flight environment.
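A minimal sketch of such a spatial map, here as a sparse voxel grid recording which cells contain obstacles (the 0.5 m resolution and the interface are assumptions, not part of the patent):

```python
class OccupancyGrid:
    """Minimal sparse 3-D occupancy map: world coordinates -> occupied voxels."""
    def __init__(self, resolution=0.5):
        self.resolution = resolution
        self.occupied = set()

    def _key(self, x, y, z):
        r = self.resolution
        return (int(x // r), int(y // r), int(z // r))

    def mark(self, x, y, z):
        """Record an obstacle detection at a world position."""
        self.occupied.add(self._key(x, y, z))

    def is_free(self, x, y, z):
        return self._key(x, y, z) not in self.occupied

grid = OccupancyGrid(resolution=0.5)
grid.mark(2.3, 1.1, 4.0)            # obstacle detected here
print(grid.is_free(2.4, 1.2, 4.1))  # False: falls in the same 0.5 m voxel
print(grid.is_free(0.0, 0.0, 0.0))  # True
```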
In one embodiment, the obstacle sensing data can be obtained using a distance sensor, such as a lidar, a millimeter-wave radar, or a similar ranging sensor; the obstacle sensing data can also be obtained using an image-type sensor, such as a monocular camera, a binocular camera, or another image sensor. In one embodiment, the aircraft can complete positioning and the construction of a 3D spatial map through the acquired obstacle data and a SLAM algorithm. While the aircraft is in flight, the sensors acquire the distance of obstacles: for example, a lidar can receive the laser signal reflected by an obstacle, and a binocular camera can capture image information of an obstacle and range it by binocular stereo. Therefore, although a SLAM algorithm cannot identify the attribute information of the obstacles in the flight environment, it can recognize obstacle positions and mark them on the map. Based on the map information constructed by SLAM, the aircraft's navigation module can complete path planning, where path planning is an algorithm for finding a flyable path between multiple obstacles.
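Path planning between obstacles can be illustrated with a standard A* search on a small grid; this is a generic sketch under invented coordinates, not the patent's specific planner:

```python
from heapq import heappush, heappop

def astar(start, goal, obstacles, size):
    """A* on a size x size 2-D grid; 4-connected moves, unit cost."""
    def h(p):  # Manhattan-distance heuristic (admissible here)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    frontier = [(h(start), 0, start, [start])]
    seen = set()
    while frontier:
        _, g, cur, path = heappop(frontier)
        if cur == goal:
            return path
        if cur in seen:
            continue
        seen.add(cur)
        x, y = cur
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < size and 0 <= ny < size and (nx, ny) not in obstacles:
                heappush(frontier, (g + 1 + h((nx, ny)), g + 1,
                                    (nx, ny), path + [(nx, ny)]))
    return None

# A "tree trunk" blocks the straight line from (0, 0) to (4, 0).
path = astar((0, 0), (4, 0), obstacles={(2, 0)}, size=5)
print(len(path))  # 7: the detour around the blocked cell adds two steps
```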
In an optional implementation of this embodiment, the obstacle sensing data includes first image information of obstacles in the flight environment obtained by an image sensor.
In this optional implementation, the obstacle sensing data can be obtained through an image sensor and may include first image information of obstacles in the flight environment. For example, the aircraft uses a binocular camera to obtain image information of the flight environment, then determines the relative positional relationships between the obstacles in the environment from the image information, and builds the spatial map based on a SLAM algorithm. At the same time, obstacle information, such as the attribute information of the obstacles, can be determined by recognizing the image information.
In an optional implementation of this embodiment, the obstacle sensing data includes distance information of obstacles in the flight environment obtained by a distance sensor, and second image information of obstacles in the flight environment obtained by an image sensor.
In this optional implementation, the distance information of the obstacles in the flight environment can be obtained by a distance sensor, and a three-dimensional spatial map is then built from the distance information. The distance sensors employed in this optional implementation may include lidar, millimeter-wave radar, and similar ranging sensors, while the image sensor is mainly used to obtain the image information of the obstacles so that obstacle information can be identified in subsequent processing.
In an optional implementation of this embodiment, the obstacle sensing data includes first image information obtained by a binocular camera arranged on the aircraft; and the step in step S102 of generating the map information of the flight environment according to the obstacle sensing data further includes the following step:
generating the map information according to the first image information obtained by the binocular camera and a SLAM algorithm.
In this optional implementation, a binocular camera is arranged on the aircraft, and first image information of the flight environment is obtained through the binocular camera. The first image information obtained by the binocular camera may include two sets of image information acquired simultaneously by the two cameras for the same position; based on the camera parameters of the binocular camera and these two sets of image information, the distance information of the obstacles in the captured images can be determined, and the map information is then generated from the distance information and a SLAM algorithm. In other embodiments, the first image information can also be obtained by arranging a monocular camera or another image sensor; for a monocular camera, a reference object can be chosen first, and the relative distance information between the first image information and the reference object, together with a SLAM algorithm, can also generate the map information.
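The distance recovered from the two views follows the standard pinhole stereo relation Z = f·B/d, where f is the focal length in pixels, B the baseline between the cameras, and d the disparity; the numbers below are invented illustrative values, not camera parameters from the patent:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from binocular disparity: Z = f * B / d (pinhole stereo model)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 700 px focal length, 12 cm baseline.
z = stereo_depth(focal_px=700.0, baseline_m=0.12, disparity_px=21.0)
print(z)  # 4.0: a trunk shifted 21 px between the two views is 4 m away
```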
In an optional implementation of this embodiment, the step in step S102 of identifying the obstacle information in the flight environment further includes the following step:
identifying the obstacle information in the flight environment according to the first image information obtained by the binocular camera.
In this optional implementation, the first image information obtained by the binocular camera includes image information of the obstacles in the flight environment, so the obstacle information can be identified based on that image information, for example by inputting the first image information into a machine learning model trained in advance and having the model identify the obstacle information in the first image information. In this optional implementation, the first image information acquired by the binocular camera can serve both as the distance information of the obstacles in the obstacle sensing data for generating the map information, and as the image information in the obstacle sensing data for identifying the obstacle information. In this way, both map generation and obstacle identification can be achieved with an image sensor alone, with relatively low hardware cost and a relatively simple processing procedure.
In an optional implementation of this embodiment, the obstacle sensing data includes lidar point cloud data obtained by a lidar arranged on the aircraft; and the step in step S102 of generating the map information of the flight environment according to the obstacle sensing data further includes the following step:
generating the map information according to the lidar point cloud data obtained by the lidar and a SLAM algorithm.
In this optional implementation, a lidar is used as the distance sensor to obtain the distance information of the obstacles in the flight environment. After the lidar emits a laser pulse, if an obstacle exists in the flight environment, the obstacle reflects the laser back, and the lidar determines the distance of the obstacle from the time difference between the received reflection and the emitted pulse. To generate the map information accurately, it can be generated from the lidar's point cloud data and a SLAM algorithm.
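The time-of-flight range computation just described, and the conversion of one return into a point of the cloud, can be sketched as follows (the beam-angle convention is an assumption of this sketch):

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def lidar_range(round_trip_s):
    """Time-of-flight range: the pulse travels out and back, so halve it."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

def polar_to_point(r, azimuth_rad, elevation_rad):
    """Convert one lidar return (range plus beam angles) to an x, y, z point."""
    xy = r * math.cos(elevation_rad)
    return (xy * math.cos(azimuth_rad), xy * math.sin(azimuth_rad),
            r * math.sin(elevation_rad))

r = lidar_range(round_trip_s=2.0e-7)     # 200 ns round trip
print(round(r, 3))                        # 29.979 m
print(polar_to_point(10.0, 0.0, 0.0))     # (10.0, 0.0, 0.0): dead ahead
```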
In an optional implementation of this embodiment, the step in step S102 of identifying the obstacle information in the flight environment further includes the following step:
identifying the obstacle information in the flight environment according to second image information obtained by an image sensor arranged on the aircraft and the lidar point cloud data.
In this optional implementation, the lidar point cloud data alone can only determine the position, size, external shape, and the like of an obstacle, but cannot determine the specific obstacle information; that is, the attribute information of the obstacle cannot be determined from the lidar point cloud data. This optional implementation therefore requires arranging both a ranging sensor such as a lidar and an image sensor. Since the lidar point cloud data occupies relatively little capacity when building the map information, the required storage space is small and processing is fast, so the performance requirements on the storage and processing devices are relatively low.
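Fusing the two sensors typically means relating lidar points to image pixels. A standard pinhole projection sketch is shown below; the intrinsic parameters are invented values, and the extrinsic lidar-to-camera transform is assumed to have been applied already:

```python
def project_to_image(point_cam, focal_px, cx, cy):
    """Project a lidar point (already in the camera frame) onto the image plane."""
    x, y, z = point_cam
    if z <= 0:
        return None  # behind the camera, not visible
    u = focal_px * x / z + cx
    v = focal_px * y / z + cy
    return (u, v)

# A point 8 m ahead and 1 m to the right lands right of image centre (640, 360).
px = project_to_image((1.0, 0.0, 8.0), focal_px=700.0, cx=640.0, cy=360.0)
print(px)  # (727.5, 360.0)
```

The pixel found this way can then be looked up in the recognition result for the second image information, attaching the identified attributes to the lidar point.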
In an optional implementation of this embodiment, the step in step S102 of identifying the obstacle information in the flight environment further includes the following step:
recognizing the image information of the obstacles in the flight environment according to a machine learning model trained on sample data, to obtain the obstacle information in the flight environment.
In this optional implementation, the obstacle information in the image information is identified by a machine learning model. The machine learning model includes, but is not limited to, one or a combination of a neural network, a convolutional neural network, a deep neural network, a support vector machine, K-means, K-nearest neighbors, a decision tree, a random forest, and a Bayesian network.
For example, the image sensor obtains image information around the aircraft and inputs it into a machine learning model trained in advance, which can identify the obstacle information in the image. For a forest survey area, after the machine learning model recognizes a tree in the image information, a number for that tree is established in a database. Further, the machine learning model can differentiate the tree by category through a classifier, for example its species, form, and tree age, and can identify dimensional information such as the tree's DBH, ground diameter, tree height, and crown width. For instance, a classifier based on a convolutional neural network can recognize information such as the species, form, and age of the trees in the image information, and based on the distance information and the image information, the convolutional neural network can also identify information such as the DBH, ground diameter, tree height, and crown width of the trees.
The machine learning model can be trained in advance on training samples. A training sample may include sample data and the label of the sample: the sample data may include image information of obstacles, and the label includes the obstacle information.
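As a stand-in for the trained model (the patent does not fix a particular one), a toy nearest-centroid classifier over hand-picked tree features shows the same train-on-labeled-samples, then classify flow; the species names and feature values are entirely invented:

```python
import math

def train_centroids(samples):
    """samples: {label: [feature vectors]} -> per-class mean feature vector."""
    centroids = {}
    for label, vecs in samples.items():
        n = len(vecs)
        centroids[label] = tuple(sum(v[i] for v in vecs) / n
                                 for i in range(len(vecs[0])))
    return centroids

def classify(centroids, vec):
    """Assign the label of the nearest class centroid (Euclidean distance)."""
    return min(centroids, key=lambda lbl: math.dist(centroids[lbl], vec))

# Toy features: (trunk diameter m, crown width m) for two invented species.
training = {"pine":  [(0.30, 3.0), (0.35, 3.4)],
            "birch": [(0.15, 2.0), (0.18, 2.2)]}
model = train_centroids(training)
print(classify(model, (0.32, 3.1)))  # pine
```

A real deployment would instead feed image patches to a convolutional network, but the sample-data/label structure of the training set is the same.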
In an optional implementation of this embodiment, the method further includes the following step:
planning a survey flight path according to the obstacle information, the survey flight path being used for surveying obstacles in the flight environment.
In this optional implementation, while building the map information the aircraft continuously acquires obstacle sensing data during flight, gradually builds the map information from that data, and plans its flight path accordingly, so as to construct a complete map of the survey area. After the map has been built and the obstacle information in the survey area has been identified, the survey flight path can be re-planned according to the obstacle information. For example, in a forested area, trees of one or several selected types may be surveyed, so the positions of the tree types to be surveyed are selected for surveying according to the identified obstacle information. As another example, after the aircraft identifies a tree specimen in a forest survey region, a survey flight path can be planned according to the position information of that specimen, so that the aircraft can survey the specimen more thoroughly and obtain an accurate survey result.
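The re-planning step described above — filtering the identified obstacles for the tree type to be surveyed and ordering their positions into a route — can be sketched as follows. The field names and the greedy nearest-neighbour ordering are assumptions for illustration, not the patent's specified planner.

```python
# Hypothetical sketch: plan a survey flight path from identified obstacle
# information by selecting trees of the target species and visiting them
# in greedy nearest-neighbour order from the current position.
import math

def plan_survey_path(obstacles, target_species, start=(0.0, 0.0)):
    """obstacles: list of dicts with 'species' and 'pos' (x, y) keys."""
    targets = [o["pos"] for o in obstacles if o["species"] == target_species]
    path, here = [], start
    while targets:
        nxt = min(targets, key=lambda p: math.dist(here, p))  # closest target
        targets.remove(nxt)
        path.append(nxt)
        here = nxt
    return path

obstacles = [
    {"species": "pine",  "pos": (5.0, 0.0)},
    {"species": "birch", "pos": (1.0, 1.0)},
    {"species": "pine",  "pos": (2.0, 0.0)},
]
print(plan_survey_path(obstacles, "pine"))  # -> [(2.0, 0.0), (5.0, 0.0)]
```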
In an optional implementation of this embodiment, the flight environment is an environment in which satellite positioning precision is below a predetermined threshold.
The flight environment detection method provided in the embodiments of the present disclosure is applicable to any flight environment, including environments in which satellite positioning precision is below a predetermined threshold. One example is a flight environment planted with trees, with flight taking place among the trees; in such an environment the obstacle information includes, but is not limited to, information about the trees, which in turn includes, but is not limited to, one or more of type, tree age, DBH, ground diameter, tree height, and crown width. In the field of forest surveying, for instance, satellite signals are blocked by the canopy, and reflections of the signals from the trees introduce multipath effects, so that a satellite positioning system cannot provide high-precision positioning below the canopy. Moreover, although satellite signals can provide positioning and navigation for an aircraft, they cannot by themselves enable the aircraft to fly below the canopy, because they provide the aircraft with no information about the positions of the obstacles in the survey area. The flight environment detection method proposed in the embodiments of the present disclosure is therefore suitable for environments in which satellite positioning precision is below a predetermined threshold.
In an optional implementation of this embodiment, for a flight environment planted with trees in which flight takes place among the trees, the method further includes the following step:
generating a survey report of the flight environment according to the map information of the flight environment and the obstacle information in the flight environment, the survey report including the information of the trees in the flight environment and the spatial position information of the trees.
In this optional implementation, for a flight environment planted with trees in which flight takes place among the trees, such as a forest, a survey report of the flight environment can be generated. After the aircraft has obtained the map information of the survey area and the information of the trees, the tree information can be merged with the map information to obtain a survey report. The survey report can mark the positions of the trees within the survey area together with the information of the corresponding trees. It therefore provides accurate spatial information for the survey area along with the tree survey information, completely reproducing the survey results for that area. Because survey results obtained in this way carry accurate tree information and spatial information, they can be used for a variety of subsequent operations: for example, tending can be planned according to the type, condition, and position of the trees; the species, condition, and DBH of the trees can be used to accurately assess the economic value of the forest; and, once established, the survey results can serve as a base map for subsequent survey flights, with the aircraft avoiding obstacles according to the map and navigating to target trees to survey them again.
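The merge described above — joining per-tree survey attributes with their map coordinates — can be sketched as follows. The dictionary layout and field names are assumptions for illustration.

```python
# Hypothetical sketch: fuse tree survey information with map coordinates
# to produce the survey report described in the text. Trees are joined by
# an assumed tree_id key shared between the map and the survey results.
def build_survey_report(map_info, tree_surveys):
    """map_info: {tree_id: (x, y, z)}; tree_surveys: {tree_id: attributes}."""
    report = []
    for tree_id, attrs in tree_surveys.items():
        if tree_id in map_info:              # keep only trees with a position
            entry = dict(attrs)
            entry["position"] = map_info[tree_id]
            report.append(entry)
    return report

map_info = {1: (10.0, 4.0, 0.0), 2: (12.5, 7.0, 0.0)}
surveys = {1: {"species": "pine", "dbh_cm": 23.0},
           2: {"species": "birch", "dbh_cm": 18.5}}
report = build_survey_report(map_info, surveys)
print(report[0]["position"])  # -> (10.0, 4.0, 0.0)
```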
An illustrative example is given below of the detailed process of generating map information and identifying obstacle information using obstacle sensing data obtained by a binocular camera.
In this embodiment, an unmanned aerial vehicle (UAV) builds a three-dimensional map using SLAM based on a binocular camera, and completes the forest survey with the same camera. As shown in Fig. 2, the UAV flies through a survey area at an altitude below the canopy. The UAV carries a binocular camera for perceiving environment information; based on the SLAM output it builds an environment map and performs automatic obstacle avoidance and path planning. As shown in Fig. 3, the UAV constructs, by means of SLAM, a three-dimensional spatial map containing the obstacles and their distances, and performs obstacle avoidance and path planning according to this map. Further, while flying along the planned path, the UAV uses the binocular camera to acquire image data of the surrounding environment. The image data is fed into a recognition module based on a convolutional neural network, which identifies the tree information contained in the images. As shown in Fig. 4A, the binocular camera acquires image information of the environment and inputs it to the recognition module. The recognition module uses a convolutional neural network to detect the tree specimens contained in the images and derives survey information for each specimen from the image information; here the survey information includes attributes such as the type, DBH, ground diameter, and crown width of the trees, as shown in Fig. 4B and Fig. 4C. Further, the binocular camera inputs the image information to SLAM to obtain the current position coordinates of the UAV. Combining these coordinates with binocular ranging, a ranging module can compute the position of the tree specimen identified by the recognition module relative to the UAV and, together with the UAV coordinates provided by SLAM, resolve the coordinates of the specimen. Finally, a survey report generation module receives the map information from SLAM, combines it with the tree survey results output by the recognition module, and merges the two by coordinate to obtain the final survey report. The merged survey report contains the survey information of the trees and the three-dimensional obstacle boundaries they form; that is, the final survey report records both the specific tree distribution of the survey area and the obstacle distribution the trees create. In one embodiment, after the recognition module identifies a tree specimen, the identification information is input to path planning, which plans a path allowing the UAV's binocular camera to capture more information about that specimen, so as to acquire more image information and thus a more complete tree survey result. As shown in Figs. 5A-5C, the UAV is given an initially planned route according to the map output by SLAM. During flight, the recognition module identifies a tree specimen (Fig. 5A). To survey the specimen accurately, the recognition module inputs its data to the path planning module (Fig. 5C). The path planning module re-plans the path, and the new planned path is used to survey the specimen (Fig. 5B). Once the UAV has finished surveying the specimen, the path planning module plans a path again so that the UAV explores more unknown regions. Through this dynamic path planning method, the UAV can autonomously carry out a fine-grained survey of the trees in the survey area and obtain a complete survey report.
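The ranging step above — binocular ranging combined with the UAV pose from SLAM to resolve a tree's map coordinate — can be sketched with the standard pinhole stereo relation. The camera parameters below are assumed values for illustration, not the embodiment's calibration.

```python
# Sketch of binocular ranging: stereo disparity gives the distance to the
# identified tree (Z = f * B / d for a rectified pinhole stereo pair), and
# the range is projected from the UAV position reported by SLAM to resolve
# the tree's coordinate. Focal length, baseline, and poses are assumed.
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Pinhole stereo depth: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def tree_world_position(drone_pos, bearing_unit, depth_m):
    """Project the measured range along the viewing direction from the UAV."""
    return tuple(p + u * depth_m for p, u in zip(drone_pos, bearing_unit))

depth = stereo_depth(focal_px=700.0, baseline_m=0.12, disparity_px=21.0)
print(round(depth, 2))                                    # -> 4.0
print(tree_world_position((10.0, 5.0, 2.0), (1.0, 0.0, 0.0), depth))
```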
Another illustrative example is given below of the detailed process of generating map information and identifying obstacle information using obstacle sensing data obtained jointly by a distance sensor and an image sensor.
In this embodiment, the UAV performs three-dimensional map construction using a lidar and SLAM based on the lidar point cloud, while completing the identification and survey of tree specimens using an image sensor. The laser emitter and the image sensor are calibrated so that the image sensor data can be aligned with the lidar point cloud data. After the recognition module identifies a tree specimen, the laser point cloud data is extracted according to the position of the specimen in the image, and the recognition module can then compute the specific coordinates of the tree. Further, the lidar measurements of the tree specimen can be used to generate its survey data, such as DBH, ground diameter, and tree height. In this embodiment the UAV's map construction is generated from the lidar point cloud data, so additional data fusion is required to combine it with the image sensor data and thereby complete the forest survey task.
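The extraction step above — selecting the lidar points that correspond to a tree detected in the image, once the two sensors are calibrated to a common frame — can be sketched as follows. The projected-point representation and bounding-box detection are assumptions for illustration.

```python
# Hypothetical sketch: given lidar points already projected into the image
# plane via the lidar/camera calibration, keep the 3-D points that fall
# inside the image-space bounding box of a detected tree, then derive a
# coarse tree height from the vertical extent of those points.
def points_in_bbox(projected, bbox):
    """projected: list of ((u, v), (x, y, z)); bbox: (u0, v0, u1, v1)."""
    u0, v0, u1, v1 = bbox
    return [p3 for (u, v), p3 in projected
            if u0 <= u <= u1 and v0 <= v <= v1]

def tree_height(points):
    """Vertical extent of the selected points (z is assumed to be 'up')."""
    zs = [z for _, _, z in points]
    return max(zs) - min(zs)

projected = [((100, 50), (4.0, 1.0, 0.2)),
             ((105, 40), (4.1, 1.1, 6.0)),
             ((300, 60), (9.0, 3.0, 0.5))]   # last point lies outside the box
pts = points_in_bbox(projected, (90, 30, 120, 60))
print(round(tree_height(pts), 1))  # -> 5.8
```

A real pipeline would also cluster the points to reject ground returns and neighbouring trees before measuring.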
The following is an apparatus embodiment of the present disclosure, which can be used to carry out the method embodiments of the present disclosure.
Fig. 6 shows a structural block diagram of a flight environment detection apparatus according to an embodiment of the present disclosure. The apparatus can be implemented as part or all of an electronic device by software, by hardware, or by a combination of both. As shown in Fig. 6, the flight environment detection apparatus includes an acquisition module 601 and a first generation module 602:
the acquisition module 601 is configured to acquire obstacle sensing data detected by an aircraft in a flight environment;
the first generation module 602 is configured to generate map information of the flight environment according to the obstacle sensing data, and to identify obstacle information in the flight environment.
In an optional implementation of this embodiment, the obstacle sensing data includes first image information of obstacles in the flight environment obtained by an image sensor.
In an optional implementation of this embodiment, the obstacle sensing data includes distance information of obstacles in the flight environment obtained by a distance sensor and second image information of obstacles in the flight environment obtained by an image sensor.
In an optional implementation of this embodiment, the obstacle sensing data includes first image information obtained by a binocular camera arranged on the aircraft; the first generation module includes:
a first generation submodule configured to generate the map information according to the first image information obtained by the binocular camera and a SLAM algorithm.
In an optional implementation of this embodiment, the first generation module includes:
a first identification submodule configured to identify the obstacle information in the flight environment according to the first image information obtained by the binocular camera.
In an optional implementation of this embodiment, the obstacle sensing data includes lidar point cloud data obtained by a lidar arranged on the aircraft; the first generation module includes:
a second generation submodule configured to generate the map information according to the lidar point cloud data obtained by the lidar and a SLAM algorithm.
In an optional implementation of this embodiment, the first generation module includes:
a second identification submodule configured to identify the obstacle information in the flight environment according to second image information obtained by an image sensor arranged on the aircraft and the lidar point cloud data.
In an optional implementation of this embodiment, the first generation module includes:
a third identification submodule configured to identify the image information of obstacles in the flight environment according to a machine learning model trained on sample data, so as to obtain the obstacle information in the flight environment.
In an optional implementation of this embodiment, the apparatus further includes:
a planning module configured to plan a survey flight path according to the obstacle information, the survey flight path being used to survey the obstacles in the flight environment.
In an optional implementation of this embodiment, the flight environment is an environment in which satellite positioning precision is below a predetermined threshold.
In an optional implementation of this embodiment, the flight environment is a flight environment planted with trees, with flight taking place among the trees; the obstacle information includes information about the trees.
In an optional implementation of this embodiment, the apparatus further includes:
a second generation module configured to generate a survey report of the flight environment according to the map information of the flight environment and the obstacle information in the flight environment, the survey report including the information of the trees and the spatial position information of the trees in the flight environment.
In an optional implementation of this embodiment, the information of the trees includes one or more of type, tree age, DBH, ground diameter, tree height, and crown width.
The above flight environment detection apparatus corresponds to the flight environment detection method presented in the embodiment of Fig. 1 and its related embodiments; for details, refer to the description of the flight environment detection method above, which is not repeated here.
Fig. 7 is a schematic structural diagram of an electronic device suitable for implementing the flight environment detection method according to an embodiment of the present disclosure.
As shown in Fig. 7, the electronic device 700 includes a central processing unit (CPU) 701, which can execute the various processes of the embodiment shown in Fig. 1 according to a program stored in a read-only memory (ROM) 702 or a program loaded from a storage section 708 into a random access memory (RAM) 703. The RAM 703 also stores the various programs and data required for the operation of the electronic device 700. The CPU 701, the ROM 702, and the RAM 703 are connected to one another by a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
The following components are connected to the I/O interface 705: an input section 706 including a keyboard, a mouse, and the like; an output section 707 including, for example, a cathode ray tube (CRT) or liquid crystal display (LCD) and a loudspeaker; a storage section 708 including a hard disk and the like; and a communication section 709 including a network interface card such as a LAN card or a modem. The communication section 709 performs communication processing over a network such as the Internet. A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711, such as a magnetic disk, an optical disc, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 710 as needed, so that a computer program read from it can be installed into the storage section 708 as required.
In particular, according to an embodiment of the present disclosure, the method described above with reference to Fig. 1 may be implemented as a computer software program. For example, an embodiment of the present disclosure includes a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program containing program code for performing the method of Fig. 1. In such an embodiment, the computer program can be downloaded and installed from a network via the communication section 709 and/or installed from the removable medium 711.
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions, and operations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each box in a flowchart or block diagram may represent a module, a program segment, or a portion of code, which contains one or more executable instructions for implementing the specified logical function. It should also be noted that in some alternative implementations, the functions marked in the boxes may occur in an order different from that shown in the drawings. For example, two boxes shown in succession may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should likewise be noted that each box in a block diagram and/or flowchart, and combinations of boxes in a block diagram and/or flowchart, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units or modules described in the embodiments of the present disclosure may be implemented by software or by hardware. The described units or modules may also be arranged in a processor, and in some cases the names of these units or modules do not constitute a limitation on the units or modules themselves.
In another aspect, the present disclosure further provides a computer-readable storage medium, which may be the computer-readable storage medium included in the apparatus of the above embodiments, or a computer-readable storage medium that exists separately and is not assembled into a device. The computer-readable storage medium stores one or more programs, which are used by one or more processors to perform the methods described in the present disclosure.
The above description is only a preferred embodiment of the present disclosure and an explanation of the technical principles applied. Those skilled in the art should understand that the scope of the invention involved in the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features, and should also cover, without departing from the inventive concept, other technical solutions formed by any combination of the above technical features or their equivalents, for example technical solutions formed by substituting the above features with technical features of similar function disclosed in (but not limited to) the present disclosure.
Claims (10)
1. A flight environment detection method, characterized by comprising:
acquiring obstacle sensing data detected by an aircraft in a flight environment;
generating map information of the flight environment according to the obstacle sensing data, and identifying obstacle information in the flight environment.
2. The flight environment detection method according to claim 1, characterized in that the obstacle sensing data comprises first image information of obstacles in the flight environment obtained by an image sensor.
3. The flight environment detection method according to claim 1, characterized in that the obstacle sensing data comprises distance information of obstacles in the flight environment obtained by a distance sensor and second image information of obstacles in the flight environment obtained by an image sensor.
4. The flight environment detection method according to claim 2, characterized in that the obstacle sensing data comprises the first image information obtained by a binocular camera arranged on the aircraft;
generating the map information of the flight environment according to the obstacle sensing data comprises:
generating the map information according to the first image information obtained by the binocular camera and a SLAM algorithm.
5. The flight environment detection method according to claim 4, characterized in that identifying the obstacle information in the flight environment comprises:
identifying the obstacle information in the flight environment according to the first image information obtained by the binocular camera.
6. The flight environment detection method according to claim 3, characterized in that the obstacle sensing data comprises lidar point cloud data obtained by a lidar arranged on the aircraft;
generating the map information of the flight environment according to the obstacle sensing data comprises:
generating the map information according to the lidar point cloud data obtained by the lidar and a SLAM algorithm.
7. The flight environment detection method according to claim 6, characterized in that identifying the obstacle information in the flight environment comprises:
identifying the obstacle information in the flight environment according to the second image information obtained by an image sensor arranged on the aircraft and the lidar point cloud data.
8. A flight environment detection apparatus, characterized by comprising:
an acquisition module configured to acquire obstacle sensing data detected by an aircraft in a flight environment;
a first generation module configured to generate map information of the flight environment according to the obstacle sensing data, and to identify obstacle information in the flight environment.
9. An electronic device, characterized by comprising a memory and a processor, wherein the memory is configured to store one or more computer instructions, and the one or more computer instructions are executed by the processor to implement the method steps of any one of claims 1-7.
10. A computer-readable storage medium on which computer instructions are stored, characterized in that the computer instructions, when executed by a processor, implement the method steps of any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810322215.3A CN108427438A (en) | 2018-04-11 | 2018-04-11 | Flight environment of vehicle detection method, device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108427438A true CN108427438A (en) | 2018-08-21 |
Family
ID=63160939
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810322215.3A Pending CN108427438A (en) | 2018-04-11 | 2018-04-11 | Flight environment of vehicle detection method, device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108427438A (en) |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109459023A (en) * | 2018-09-18 | 2019-03-12 | 武汉三体机器人有限公司 | A kind of ancillary terrestrial robot navigation method and device based on unmanned plane vision SLAM |
CN109459777A (en) * | 2018-11-21 | 2019-03-12 | 北京木业邦科技有限公司 | A kind of robot, robot localization method and its storage medium |
CN109614889A (en) * | 2018-11-23 | 2019-04-12 | 华为技术有限公司 | Method for checking object, relevant device and computer storage medium |
CN109631860A (en) * | 2018-12-27 | 2019-04-16 | 南京理工大学 | Reservoir house refuse monitoring method and system based on unmanned plane |
CN109799817A (en) * | 2019-01-15 | 2019-05-24 | 智慧航海(青岛)科技有限公司 | A kind of unmanned boat global path planning method based on reflective character |
CN110008389A (en) * | 2019-02-13 | 2019-07-12 | 浩亚信息科技有限公司 | A kind of real-time measuring method of visual flight barrier, electronic equipment, storage medium |
CN110490114A (en) * | 2019-08-13 | 2019-11-22 | 西北工业大学 | Target detection barrier-avoiding method in a kind of unmanned plane real-time empty based on depth random forest and laser radar |
CN111044052A (en) * | 2019-12-31 | 2020-04-21 | 西安交通大学 | Unmanned aerial vehicle self-adaptive navigation system and method based on intelligent sensing |
WO2020103108A1 (en) * | 2018-11-22 | 2020-05-28 | 深圳市大疆创新科技有限公司 | Semantic generation method and device, drone and storage medium |
WO2020103109A1 (en) * | 2018-11-22 | 2020-05-28 | 深圳市大疆创新科技有限公司 | Map generation method and device, drone and storage medium |
CN112394744A (en) * | 2020-11-16 | 2021-02-23 | 广东电网有限责任公司肇庆供电局 | Integrated unmanned aerial vehicle system |
CN112416018A (en) * | 2020-11-24 | 2021-02-26 | 广东技术师范大学 | Unmanned aerial vehicle obstacle avoidance method and device based on multi-signal acquisition and path planning model |
WO2021056516A1 (en) * | 2019-09-29 | 2021-04-01 | 深圳市大疆创新科技有限公司 | Method and device for target detection, and movable platform |
CN112639881A (en) * | 2020-01-21 | 2021-04-09 | 深圳市大疆创新科技有限公司 | Distance measuring method, movable platform, device and storage medium |
CN112639655A (en) * | 2020-01-21 | 2021-04-09 | 深圳市大疆创新科技有限公司 | Control method and device for return flight of unmanned aerial vehicle, movable platform and storage medium |
CN114303112A (en) * | 2019-08-30 | 2022-04-08 | Nec平台株式会社 | Distribution device, aircraft, flight system, methods thereof, and non-transitory computer-readable medium |
CN116994153A (en) * | 2023-08-22 | 2023-11-03 | 中国科学院空天信息创新研究院 | Remote sensing satellite authenticity inspection system |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009008655A (en) * | 2007-02-06 | 2009-01-15 | Honeywell Internatl Inc | Method and system for three-dimensional obstacle mapping for navigation of autonomous vehicle |
CN104634322A (en) * | 2015-02-06 | 2015-05-20 | 北京林业大学 | Forest fixed sample area unmanned plane oblique photograph technical method |
CN105353768A (en) * | 2015-12-08 | 2016-02-24 | 清华大学 | Unmanned plane locus planning method based on random sampling in narrow space |
CN105469405A (en) * | 2015-11-26 | 2016-04-06 | 清华大学 | Visual ranging-based simultaneous localization and map construction method |
CN105492985A (en) * | 2014-09-05 | 2016-04-13 | 深圳市大疆创新科技有限公司 | Multi-sensor environment map building |
CN105517666A (en) * | 2014-09-05 | 2016-04-20 | 深圳市大疆创新科技有限公司 | Context-based flight mode selection |
CN105892489A (en) * | 2016-05-24 | 2016-08-24 | 国网山东省电力公司电力科学研究院 | Multi-sensor fusion-based autonomous obstacle avoidance unmanned aerial vehicle system and control method |
CN106123862A (en) * | 2016-06-03 | 2016-11-16 | 北京林业大学 | Flight unmanned plane understory species observation procedure in one elite stand |
CN106384382A (en) * | 2016-09-05 | 2017-02-08 | 山东省科学院海洋仪器仪表研究所 | Three-dimensional reconstruction system and method based on binocular stereoscopic vision |
CN106774410A (en) * | 2016-12-30 | 2017-05-31 | 易瓦特科技股份公司 | Unmanned plane automatic detecting method and apparatus |
CN106802668A (en) * | 2017-02-16 | 2017-06-06 | 上海交通大学 | Based on the no-manned plane three-dimensional collision avoidance method and system that binocular is merged with ultrasonic wave |
CN106931963A (en) * | 2017-04-13 | 2017-07-07 | 高域(北京)智能科技研究院有限公司 | Environmental data shared platform, unmanned vehicle, localization method and alignment system |
CN107272734A (en) * | 2017-06-13 | 2017-10-20 | 深圳市易成自动驾驶技术有限公司 | Unmanned plane during flying task executing method, unmanned plane and computer-readable recording medium |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109459023B (en) * | 2018-09-18 | 2021-07-16 | 武汉三体机器人有限公司 | Unmanned aerial vehicle vision SLAM-based auxiliary ground robot navigation method and device |
CN109459023A (en) * | 2018-09-18 | 2019-03-12 | 武汉三体机器人有限公司 | A kind of ancillary terrestrial robot navigation method and device based on unmanned plane vision SLAM |
CN109459777A (en) * | 2018-11-21 | 2019-03-12 | 北京木业邦科技有限公司 | A kind of robot, robot localization method and its storage medium |
WO2020103108A1 (en) * | 2018-11-22 | 2020-05-28 | 深圳市大疆创新科技有限公司 | Semantic generation method and device, drone and storage medium |
WO2020103109A1 (en) * | 2018-11-22 | 2020-05-28 | 深圳市大疆创新科技有限公司 | Map generation method and device, drone and storage medium |
CN109614889A (en) * | 2018-11-23 | 2019-04-12 | 华为技术有限公司 | Method for checking object, relevant device and computer storage medium |
WO2020103427A1 (en) * | 2018-11-23 | 2020-05-28 | 华为技术有限公司 | Object detection method, related device and computer storage medium |
CN109614889B (en) * | 2018-11-23 | 2020-09-18 | 华为技术有限公司 | Object detection method, related device and computer storage medium |
CN109631860A (en) * | 2018-12-27 | 2019-04-16 | 南京理工大学 | UAV-based reservoir household waste monitoring method and system |
CN109799817A (en) * | 2019-01-15 | 2019-05-24 | 智慧航海(青岛)科技有限公司 | Global path planning method for unmanned surface vessel based on reflective markers |
CN110008389A (en) * | 2019-02-13 | 2019-07-12 | 浩亚信息科技有限公司 | Real-time measurement method for visual flight obstacles, electronic device and storage medium |
CN110490114A (en) * | 2019-08-13 | 2019-11-22 | 西北工业大学 | Real-time aerial target detection and obstacle avoidance method for unmanned aerial vehicles based on deep random forest and lidar |
CN114303112A (en) * | 2019-08-30 | 2022-04-08 | Nec平台株式会社 | Distribution device, aircraft, flight system, methods thereof, and non-transitory computer-readable medium |
WO2021056516A1 (en) * | 2019-09-29 | 2021-04-01 | 深圳市大疆创新科技有限公司 | Method and device for target detection, and movable platform |
CN111044052A (en) * | 2019-12-31 | 2020-04-21 | 西安交通大学 | Unmanned aerial vehicle self-adaptive navigation system and method based on intelligent sensing |
CN111044052B (en) * | 2019-12-31 | 2021-07-06 | 西安交通大学 | Unmanned aerial vehicle self-adaptive navigation system and method based on intelligent sensing |
CN112639881A (en) * | 2020-01-21 | 2021-04-09 | 深圳市大疆创新科技有限公司 | Distance measuring method, movable platform, device and storage medium |
CN112639655A (en) * | 2020-01-21 | 2021-04-09 | 深圳市大疆创新科技有限公司 | Control method and device for return flight of unmanned aerial vehicle, movable platform and storage medium |
WO2021146973A1 (en) * | 2020-01-21 | 2021-07-29 | 深圳市大疆创新科技有限公司 | Unmanned aerial vehicle return-to-home control method, device, movable platform and storage medium |
CN112394744A (en) * | 2020-11-16 | 2021-02-23 | 广东电网有限责任公司肇庆供电局 | Integrated unmanned aerial vehicle system |
CN112416018B (en) * | 2020-11-24 | 2021-07-09 | 广东技术师范大学 | Unmanned aerial vehicle obstacle avoidance method and device based on multi-signal acquisition and path planning model |
CN112416018A (en) * | 2020-11-24 | 2021-02-26 | 广东技术师范大学 | Unmanned aerial vehicle obstacle avoidance method and device based on multi-signal acquisition and path planning model |
US11353893B1 (en) | 2020-11-24 | 2022-06-07 | Guangdong Polytechnic Normal University | Obstacle avoiding method and apparatus for unmanned aerial vehicle based on multi-signal acquisition and route planning model |
CN116994153A (en) * | 2023-08-22 | 2023-11-03 | 中国科学院空天信息创新研究院 | Remote sensing satellite authenticity inspection system |
CN116994153B (en) * | 2023-08-22 | 2024-02-06 | 中国科学院空天信息创新研究院 | Remote sensing satellite authenticity inspection system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108427438A (en) | Flight environment detection method and device, electronic equipment and storage medium | |
US20200217666A1 (en) | Aligning measured signal data with slam localization data and uses thereof | |
KR102187654B1 (en) | Low altitude drone and Crop Cultivating Information Acquisition System including the same | |
EP3707466A1 (en) | Method of computer vision based localisation and navigation and system for performing the same | |
CN109341702B (en) | Route planning method, device and equipment for an operation area, and storage medium | |
CN109923589A (en) | Building and updating elevation maps | |
JP2018512687A (en) | Environmental scanning and unmanned aircraft tracking | |
CN106162144A (en) | Visual image processing device and system for night vision, and intelligent machine | |
US11182043B2 (en) | Interactive virtual interface | |
CN111339876B (en) | Method and device for identifying types of areas in scene | |
US10921825B2 (en) | System and method for perceptive navigation of automated vehicles | |
CN110235027A (en) | Multi-object tracking based on LIDAR point clouds | |
CN110291480A (en) | Unmanned aerial vehicle testing method, device and storage medium | |
CA3152226A1 (en) | Object tracking in local and global maps systems and methods | |
Magree et al. | Monocular visual mapping for obstacle avoidance on UAVs | |
CN114077249B (en) | Operation method, operation equipment, device and storage medium | |
Kamat et al. | A survey on autonomous navigation techniques | |
Abdalla et al. | Geospatial data integration | |
KR102467855B1 (en) | A method for setting an autonomous navigation map, a method for an unmanned aerial vehicle to fly autonomously based on an autonomous navigation map, and a system for implementing the same | |
KR102289752B1 (en) | A drone for performring route flight in gps blocked area and methed therefor | |
Śmigielski et al. | Visual simulator for MavLink-protocol-based UAV, applied for search and analyze task | |
Zhao et al. | Research on vision navigation and position system of agricultural unmanned aerial vehicle | |
Liang et al. | Forest in situ observations through a fully automated under-canopy unmanned aerial vehicle | |
CN109767387A (en) | UAV-based forest image acquisition method and device | |
Aguilar | Results of the drone survey for Ngaitupoto Ki Motokaraka Trust |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2018-08-21 |