CN103926933A - Indoor simultaneous locating and environment modeling method for unmanned aerial vehicle - Google Patents

Indoor simultaneous locating and environment modeling method for unmanned aerial vehicle

Info

Publication number
CN103926933A
CN103926933A
Authority
CN
China
Prior art keywords
indoor
data
rgb
moment
aerial vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410127664.4A
Other languages
Chinese (zh)
Inventor
丁嵘
陈震
王顺利
朱骋
朱润凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beihang University
Original Assignee
Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN201410127664.4A priority Critical patent/CN103926933A/en
Publication of CN103926933A publication Critical patent/CN103926933A/en
Pending legal-status Critical Current

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an indoor simultaneous localization and environment modeling method for an unmanned aerial vehicle. With the method, the flight track of the unmanned aerial vehicle can be localized and an indoor three-dimensional environment model can be drawn quickly. An operation platform and operation method for the unmanned aerial vehicle are constructed; an RGB-D sensor carried by the vehicle collects data; visual odometry is adopted to estimate the vehicle's indoor flight track and position; and an extended Kalman filter algorithm is adopted to obtain a more accurate flight track and position. Furthermore, the data-transmission functions of the unmanned aerial vehicle and a ground communication device feed the information collected by the onboard sensor back to ground operating personnel in real time, and a computer three-dimensional data display method is used to compute and process the collected data and to carry out indoor three-dimensional environment modeling and display.

Description

A method for simultaneous indoor localization and environment modeling of an unmanned aerial vehicle
Technical field
The invention belongs to the fields of artificial intelligence and computer vision, and in particular relates to indoor positioning and three-dimensional environment modeling for robots.
Background technology
With the development of intelligent robotics, it has gradually become possible to use robots in place of people to explore and model an environment. The key problems to be solved are indoor positioning and environment modeling for the robot.
Most robot platforms currently studied and tested by researchers are ground robots, for example intelligent vehicles, humanoid robots and other functional robots. These studies have achieved notable results covering many fields such as artificial intelligence and image recognition. For some specific environments, however, ground robots alone can hardly accomplish environment-exploration tasks. The environment a robot operates in is generally divided into outdoor and indoor. Compared with open outdoor environments, exploration by an intelligent agent in an indoor environment is more challenging. First, the environment is more cluttered and confined, which makes obstacle avoidance harder. Second, indoor environments generally lack the support of external navigation systems such as GPS, so the robot cannot directly obtain its own position; this problem is commonly solved with a simultaneous localization and mapping (SLAM) algorithm.
SLAM (simultaneous localization and mapping), also referred to as CML (concurrent mapping and localization), was proposed as early as 1988. Because of its important theoretical and practical value, it is regarded by many scholars as the key to realizing a truly fully autonomous mobile robot. The simultaneous localization and mapping problem can be described as follows: a mobile robot incrementally builds a map of its surroundings while moving through an unknown environment with uncertain self-position, and simultaneously uses this map to estimate its own position and attitude, thereby localizing and navigating autonomously. When an unmanned aerial vehicle flies indoors, it first uses external sensors to estimate changes in relative position. Because the sensors themselves have errors, this measurement error accumulates gradually, particularly in large-scale environments. Once the aircraft passes the same place two or more times, inconsistent coordinates may be computed. To address this, a filter can first be chosen to fuse the various available data, attenuating the influence of noise and yielding a more accurate estimate of the relative position change. In addition, the SLAM algorithm periodically matches current environment features against historical environment features to detect whether the flight path contains a loop. If it does, the earlier flight path is corrected according to the loop information. Through such regular optimization, the aircraft can draw a more accurate and globally consistent map even in a large-scale environment.
For the external RGB-D (color plus depth) sensor, PrimeSense or a similar three-dimensional perception and scanning device is used. It contains an ordinary camera and an infrared sensing device. Its distance-perception principle is a light-coding technique: the space to be measured is "numbered" by illuminating it with coded light. This light source, called laser speckle, consists of the random diffraction spots formed when laser light strikes a rough object or passes through frosted glass. These speckles are highly random and their pattern changes with distance; that is, the speckle patterns at any two places in space are different. Once such structured light is projected into the space, the whole space is effectively marked. When an object is placed in this space, its position can be determined simply by observing the speckle pattern on the object.
For some indoor environments, for example a house collapsed after an earthquake, a caved-in underground mine, or even a tall building, completing indoor search and modeling is impossible for a ground robot without the ability to climb stairs autonomously. Facing such demands, researchers have in recent years proposed using unmanned aerial vehicles as platforms for detection missions in complex indoor environments. Realizing autonomous indoor positioning and three-dimensional environment modeling on an unmanned aerial vehicle has therefore become an urgent problem. However, compared with a ground robot, implementing complete environment detection and modeling on an unmanned aerial vehicle faces many challenges, mainly the following:
1) Limited payload: as an aircraft, it must continuously generate lift just to maintain its own balance, so its payload is limited and it cannot carry the kinds of heavy sensors a ground robot can. This inevitably constrains the acquisition of external environment information.
2) Limited computing power: the limited payload means the unmanned aerial vehicle can only carry a lightweight embedded computer system, while the SLAM algorithm and online search strategy for an aerial vehicle have high time complexity and must also run in real time. Since the onboard computing power is limited, the related algorithms must be specially optimized or redesigned.
3) Fast motion: an unmanned aerial vehicle moves quickly, so any delay may be amplified and cause larger errors; every module of the system must therefore guarantee a degree of real-time performance.
4) Frequent vibration: unlike a ground robot, an unmanned aerial vehicle cannot stay still; it jitters slightly in the air, which greatly complicates planning.
5) Three-dimensional space: an unmanned aerial vehicle operates in three dimensions, whereas earlier online search strategies for ground robots often deal with a two-dimensional space. Three dimensions raise the complexity of the agent's state space; moreover, when the altitude changes, obstacles may suddenly appear or disappear. This requires good robustness and real-time performance.
6) Velocity estimation: a quadrotor is an underactuated system, so accurate velocity estimation is needed for better control.
7) Relative position estimation: realizing an indoor search strategy for an unmanned aerial vehicle requires accurate indoor positioning. The odometry of a ground robot can directly obtain a relative position estimate for localization, but the inertial measurement unit on an unmanned aerial vehicle is lightweight and its motion-estimation error is large, so it cannot be used directly. Position must therefore be estimated indirectly through external sensors.
As can be seen from the above, indoor exploration and modeling on unmanned aerial vehicles still faces great challenges and research prospects, and is of great scientific significance.
Summary of the invention
To overcome the above defects, the present invention proposes a method for simultaneous indoor localization and environment modeling of an unmanned aerial vehicle, with which the indoor flight of the vehicle can be localized and an indoor three-dimensional model can be drawn quickly.
To achieve the above object, a method for simultaneous indoor localization and environment modeling of an unmanned aerial vehicle is proposed, characterized by:
Step 1): build the unmanned aerial vehicle hardware: an onboard embedded main CPU, a flight control board, an RGB-D sensor, power-supply equipment and radio transmission equipment;
Step 2): control the unmanned aerial vehicle to fly indoors, and collect data with the RGB-D sensor;
Step 3): suppose the aircraft is at some moment t of its indoor flight; visual odometry is adopted to estimate how the aircraft has moved at moment t compared with moment t-1. The estimate specifically comprises two kinds of change, translation and rotation, and is obtained as follows: first, the corresponding feature pairs of two consecutive frames are obtained with feature detection and feature matching techniques that satisfy the real-time requirement, and some restrictions, such as length and angle, are used to remove abnormal feature matches; then the change of the flight position is computed based on the least-error principle;
Preferably, the visual odometry algorithm can be completed using only the depth information obtained from the infrared sensor integrated in the RGB-D sensor, realizing localization and modeling in dark, unlit environments;
Preferably, the above method adopts a computer multithreading mechanism to guarantee the independent operation of each module and the real-time computation, generation and display of the model;
Step 4): because the motion model of the unmanned aerial vehicle is a nonlinear system, in the data-fusion stage we need to smooth the gap between the flight position estimated by the visual odometry and the actual position. In a nonlinear system the commonly adopted Kalman filter has certain defects, so the extended Kalman filter algorithm is adopted to denoise and smooth the flight-path curve;
Step 5): after the visual odometry and the filtering algorithm are completed onboard, a preliminary position estimate can be obtained. The ground station communicates with the aircraft to obtain the RGB-D data of the indoor environment and the estimated position, for path planning and indoor environment modeling;
Step 6): normally, the error of the position estimate computed by the onboard algorithm accumulates over time. When the environment is very large, this accumulation can disorder the whole environment map. To address this, a loop-closure detection technique is adopted to optimize the flight path. Specifically: while the aircraft flies in the indoor space, whenever the flight path overlaps the path of some earlier moment, a closed loop is formed; the system then reruns the iterative algorithm to correct the looping flight path, thereby reducing the position-estimation error;
Step 7): after the optimization of steps 3, 4, 5 and 6, a more accurate flight path and RGB-D data are obtained. To achieve real-time display, the OpenGL computer-graphics interface is adopted to process the RGB-D data and realize real-time display of the three-dimensional map.
The advantage of the invention is that with this technique an unmanned aerial vehicle can be used for indoor environment modeling and visual presentation of its path. An unmanned aerial vehicle has many advantages such as strong maneuverability, good controllability and insensitivity to ground conditions, and a vehicle carrying an RGB-D sensor and using the techniques of the present invention can play an even more important role in indoor environment monitoring and mapping. As a brand-new technique, its applicable fields are therefore very wide.
First, an unmanned aerial vehicle with RGB-D-based 3D vision and modeling can play a very important role in underground mine detection, fire reconnaissance, and post-disaster emergency handling after earthquakes and other natural disasters. Within a relatively short time, a global three-dimensional model of the site can be computed, built, displayed and analyzed, greatly improving the ability of underground workers, firefighters and rescue personnel to learn the latest state of the site and saving time for further measures.
The most basic function of such a vehicle is the scanning and mapping of complex indoor environments. Using this capability, indoor environments can be conveniently captured and modeled, so that the generated model can be better used for subsequent indoor positioning and navigation work, making the vision of the "smart city" possible.
This technique can also serve autonomous indoor flight and obstacle avoidance. Because RGB-D-based three-dimensional vision modeling can efficiently generate a fairly accurate indoor environment model in a short time, the model information can be used quickly, allowing the vehicle to perceive the direction and distance of obstacles ahead. Combined with the flight data of its onboard inertial unit, this makes it easier for the vehicle to avoid obstacles encountered in flight, making truly autonomous indoor flight possible.
The technique can also be applied in many public places. For example, in museums, stadiums and other public venues, three-dimensional vision modeling can guide first-time visitors; further, it makes position finding and navigation possible for disabled people such as the blind. Through the generated model and the display of the flight route, users are told their concrete position on the map and can choose where to go next.
Brief description of the drawings
Fig. 1 is the system hardware structure diagram;
Fig. 2 shows the processing flow of the method for indoor positioning and modeling of the aircraft;
Fig. 3 is the flowchart of the visual odometry.
Embodiment
The technical scheme of the present invention is described in further detail below with reference to the accompanying drawings.
First, the system hardware involved in the invention is assembled as a whole according to Fig. 1. When building the platform, a minimum-deadweight principle should be followed to ensure that the unmanned aerial vehicle can attain a relatively stable and agile flight attitude in the indoor space, for the subsequent data acquisition and estimation work.
Next, the unmanned aerial vehicle software system is built according to the system software framework shown in Fig. 2. The onboard RGB-D sensor collects external data, and visual odometry is adopted to compute and filter the route and displacement of the indoor flight, yielding relatively accurate flight data. Further, the data-transmission facilities of the vehicle and the ground receiving equipment feed the information collected by the onboard sensor back to the ground operators in real time, and computer three-dimensional data display methods are used to compute, process and display a model of the collected data.
Concrete implementation step is as follows:
Step 1): build the unmanned aerial vehicle with attention to both weight and stability, and consider the maneuverability and controllability needed for indoor flight. Because the output voltage of the battery that powers the flight fluctuates, a regulated power supply sufficient for the flight time is separately provided for the onboard computing system to improve stability.
Step 2): install the RGB-D sensor on the unmanned aerial vehicle and adjust the vehicle's balance. Through the data communication and transmission module between the aircraft and the ground, control the aircraft to fly indoors. Meanwhile, the onboard computing system controls the RGB-D sensor to acquire 320 × 240 RGB-D data at 30 fps, and the generated data are stored in files in the format of RGB information, three-dimensional space coordinates and depth information, providing raw data for the subsequent flight-path computation, localization and indoor environment modeling.
Step 3): the RGB information stored for two consecutive frames is extracted independently and analyzed for features. First a floating-point operation is applied to convert the RGB image to grayscale: for each RGB value compute R*0.3+G*0.59+B*0.11 and take the result as the gray value, generating the grayscale image corresponding to the RGB image. The grayscale image is then deblurred with Wiener filtering, a fairly basic image-deblurring method: the difference between the desired output and the actual output of the grayscale image is taken as the error, and the mean of its square, the mean squared error, is minimized; the smaller the mean squared error, the better the noise filtering. This solves the problem that the aircraft's indoor motion blurs the image so that valid features cannot be detected. Next, a feature candidate point is taken at some random position in the image and compared with some surrounding pixels to decide whether it is a feature point. If enough surrounding points differ greatly from the candidate pixel, the candidate is considered a feature point; this method of selecting feature points is known as FAST feature extraction. The change of the flight position is then computed based on the least-error principle.
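The grayscale conversion above can be sketched as follows; this is a minimal NumPy version, and the array shapes are assumptions for illustration, not part of the original method:

```python
import numpy as np

def rgb_to_gray(rgb):
    """Grayscale with the weights given above: R*0.3 + G*0.59 + B*0.11."""
    rgb = rgb.astype(np.float64)
    return rgb[..., 0] * 0.3 + rgb[..., 1] * 0.59 + rgb[..., 2] * 0.11

# A pure-red image: every pixel maps to 0.3 * 255 = 76.5
img = np.zeros((2, 2, 3), dtype=np.uint8)
img[..., 0] = 255
print(rgb_to_gray(img)[0, 0])  # 76.5
```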
Finally, feature points collected without depth information, due to the RGB-D sensor hardware itself, must be excluded to determine the successfully matched feature points.
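The patent does not spell out the "least-error principle" used to compute the pose change; one standard realization is the SVD-based least-squares rigid alignment (Kabsch method) of the matched 3-D feature pairs, sketched here under that assumption:

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rotation R and translation t with R @ P[:, i] + t ~ Q[:, i],
    for matched 3xN point sets (SVD / Kabsch method)."""
    cp, cq = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    H = (P - cp) @ (Q - cq).T                            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])   # guard against reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Sanity check: a pure translation is recovered exactly (R = identity)
np.random.seed(0)
P = np.random.rand(3, 10)
R, t = rigid_transform(P, P + np.array([[1.0], [2.0], [3.0]]))
print(np.round(t.ravel(), 6))  # [1. 2. 3.]
```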
Preferably, other deblurring algorithms can be chosen to refine the above Wiener filtering and achieve better deblurring.
Preferably, FAST feature extraction can be performed on the depth data obtained by the RGB-D sensor, and the above process used to compute the relative motion change, adapting to dark indoor environments.
Preferably, after FAST feature extraction, the SAD image-matching algorithm can be used for coarse feature matching. Its basic procedure is as follows:
1. Construct a small window in the image, similar to a convolution kernel;
2. Cover the left image with the window and select all pixels in the area it covers;
3. Likewise cover the right image with the window and select the pixels of the covered area;
4. Subtract the right covered area from the left covered area and take the sum of the absolute differences of all pixels, also called the SAD value;
5. Move the window over the right image and repeat actions 3 and 4 within the specified search range; when the search leaves the specified range, end this action and proceed to the next;
6. Find the window with the minimum SAD value within the range; this is the best-matching pixel block for the left image.
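The six steps above can be sketched as a one-dimensional block search along a row; the window and search sizes below are hypothetical parameters, and `left`/`right` are assumed to be grayscale NumPy images of equal size:

```python
import numpy as np

def sad_match(left, right, y, x, win=3, search=8):
    """Find the column in `right` whose win x win block best matches
    the block at (y, x) in `left`, by minimum sum of absolute differences."""
    h = win // 2
    ref = left[y - h:y + h + 1, x - h:x + h + 1].astype(np.int64)
    best_x, best_sad = x, None
    for dx in range(-search, search + 1):            # step 5: slide the window
        cx = x + dx
        if cx - h < 0 or cx + h + 1 > right.shape[1]:
            continue                                 # outside the search range
        cand = right[y - h:y + h + 1, cx - h:cx + h + 1].astype(np.int64)
        sad = np.abs(ref - cand).sum()               # step 4: SAD value
        if best_sad is None or sad < best_sad:
            best_sad, best_x = sad, cx               # step 6: keep the minimum
    return best_x, best_sad

# Usage: the right image is the left image shifted two columns leftward,
# so the best match for left (5, 5) lies at column 3 of the right image.
left = np.arange(10)[:, None] * 10 + np.arange(10)[None, :]
right = np.roll(left, -2, axis=1)
print(sad_match(left, right, 5, 5))  # (3, 0)
```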
Preferably, a normality check can be performed on the finally selected feature points to exclude possible abnormal feature points and improve the accuracy of motion estimation. Whether a feature point is valid is judged by comparing the length, angle or other restrictive conditions of the region around the selected feature points, such as grayscale change; reasonable thresholds are set as the standard for judging feature points, and the normally matched feature points are thus filtered out.
Step 4): because the motion model of the unmanned aerial vehicle is a nonlinear system, in the data-fusion stage we need to smooth the gap between the flight position estimated by the visual odometry and the actual position. In a nonlinear system the commonly adopted Kalman filter has certain defects, so the extended Kalman filter (EKF) algorithm is adopted to denoise and smooth the flight-path curve. The basic idea of the EKF is to linearize the nonlinear system and then perform Kalman filtering, obtaining the required, fairly accurate flight path.
The EKF algorithm obtains an estimate of the state vector X_k through updates of the observation Y_k; they satisfy the following relations:
X(k+1) = f(X_k) + W_k
Y(k) = h(X_k) + V_k
where X(k+1) is estimated from the value of X_k at the previous moment; f(X_k) and h(X_k) are the nonlinear process model and observation model; Y_k is the obtainable observation vector; and W_k and V_k are the process and measurement errors, assumed mutually independent, white Gaussian noises with the distributions p(W) ~ N(0, Q) and p(V) ~ N(0, R).
X_{k|k-1} = F · X_{k-1|k-1}
P_{k|k-1} = F · P_{k-1|k-1} · F^T + W · Q_k · W^T
K_k = P_{k|k-1} · H_k^T · [R_k + H_k · P_{k|k-1} · H_k^T]^(-1)
X_{k|k} = X_{k|k-1} + K_k · [Y_k - h(X_{k|k-1})]
P_{k|k} = [I - K_k · H_k] · P_{k|k-1}
In the first formula, F is the state transition matrix and X_{k-1|k-1} is the state vector at moment k-1, yielding the estimate X_{k|k-1} for moment k.
The second formula is the covariance prediction: Q_k is the process noise, P_{k-1|k-1} is the covariance matrix at moment k-1, and P_{k|k-1} is the predicted covariance matrix at moment k.
The third formula is the Kalman gain, which controls the convergence speed: K_k is the Kalman gain, R_k is the measurement noise, and H_k is the Jacobian matrix of the partial derivatives of the function h with respect to X at moment k.
The fourth formula gives the desired optimal estimate X at moment k.
The fifth formula updates the covariance using the linearization of h(X_k), where I is the identity matrix.
After the EKF algorithm runs, a more accurate position from the visual odometry is obtained, and the precise displacement is sent back to the ground for display.
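One iteration of the five formulas above can be sketched as follows. This is a minimal sketch, assuming `f` and `h` are callables for the process and observation models and `F`, `H` their Jacobians evaluated at the current estimate; the prediction uses the nonlinear model f, with F appearing in the covariance prediction, as is standard for an EKF:

```python
import numpy as np

def ekf_step(x, P, y, f, h, F, H, Q, R):
    """One EKF iteration mirroring the five formulas above."""
    x_pred = f(x)                                # X_{k|k-1}: state prediction
    P_pred = F @ P @ F.T + Q                     # P_{k|k-1}: covariance prediction
    S = R + H @ P_pred @ H.T
    K = P_pred @ H.T @ np.linalg.inv(S)          # K_k: Kalman gain
    x_new = x_pred + K @ (y - h(x_pred))         # X_{k|k}: optimal estimate
    P_new = (np.eye(len(x)) - K @ H) @ P_pred    # P_{k|k}: covariance update
    return x_new, P_new

# Usage on a trivial linear model: the estimate moves toward the observation.
I2 = np.eye(2)
x1, P1 = ekf_step(np.zeros(2), I2, np.ones(2),
                  f=lambda v: v, h=lambda v: v,
                  F=I2, H=I2, Q=0.01 * I2, R=I2)
```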
Step 5): after completing the visual odometry and filtering algorithm onboard in the previous steps, a preliminary estimate of the aircraft's own flight position has been obtained. The communication between the unmanned aerial vehicle and the ground station is then used to transfer the RGB-D data and the estimated position for path planning or other functions. Considering that wireless network transmission is not very stable and bandwidth is limited, the UDP network protocol is chosen in the transmission scheme to transmit the RGB-D data; individual frames may be lost, which allows a higher transfer speed.
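A minimal loopback sketch of the UDP transfer described above; the port number and payload are hypothetical, and a full 320 × 240 RGB-D frame exceeds the roughly 64 KB UDP datagram limit, so a real sender would split each frame into several packets:

```python
import socket

PORT = 9999  # hypothetical port for RGB-D frame packets

# Ground-station side: bind a datagram socket and wait for data.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", PORT))
rx.settimeout(5.0)

# Onboard side: fire-and-forget datagrams; a lost frame is never
# retransmitted, which keeps the latency low.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
chunk = b"\x01" * 1024  # stand-in for one packet of an RGB-D frame
tx.sendto(chunk, ("127.0.0.1", PORT))

data, addr = rx.recvfrom(65535)
print(len(data))  # 1024
```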
Step 6): historical RGB frames are sampled, and each newly obtained frame is matched against them. If the current frame is found to coincide with some historical frame, the unmanned aerial vehicle is judged to have flown over this point at some earlier moment, forming a closed ring. When the indoor environment is large, the position estimates of steps 3 and 4 will exhibit a certain error, and its accumulation can disorder the whole environment map. To address this, the loop-closure detection technique is adopted to optimize the three-dimensional map. Specifically: while the aircraft flies in the indoor space, whenever the flight path overlaps the path of some earlier moment, a closed loop is formed; the system then reruns the iterative algorithm to correct the looping flight path, thereby reducing the position-estimation error.
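How a new frame is "matched" against the sampled history frames is not detailed in the patent; the sketch below only illustrates the idea, with a deliberately simple descriptor (a normalized grayscale histogram) and a hypothetical L1-distance threshold standing in for a real appearance-matching scheme:

```python
import numpy as np

def frame_descriptor(gray, bins=16):
    """Compact appearance descriptor: normalized grayscale histogram."""
    hist, _ = np.histogram(gray, bins=bins, range=(0, 256))
    return hist / max(hist.sum(), 1)

def detect_loop(history, gray, thresh=0.05):
    """Return the index of the first stored keyframe whose descriptor is
    within `thresh` (L1 distance) of the current frame, else -1."""
    d = frame_descriptor(gray)
    for i, hd in enumerate(history):
        if np.abs(hd - d).sum() < thresh:
            return i
    return -1

# Usage: revisiting the first keyframe's scene closes a loop at index 0.
frame_a = np.zeros((8, 8))
frame_b = np.full((8, 8), 200)
history = [frame_descriptor(frame_a), frame_descriptor(frame_b)]
print(detect_loop(history, frame_a))  # 0
```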
Step 7): after the optimization of steps 3, 4, 5 and 6, a more accurate flight path and RGB-D data are obtained. To achieve real-time display, the OpenGL computer-graphics interface is adopted to process the RGB-D data and realize real-time display of the three-dimensional map.
Finally, it should be noted that the above embodiments merely illustrate, and do not limit, the technical scheme of the present invention. Although the invention has been described in detail with reference to the embodiments, those of ordinary skill in the art should understand that modifications or equivalent substitutions of the technical scheme that do not depart from its spirit and scope shall all be covered by the scope of the claims of the present invention.

Claims (7)

1. A method for simultaneous indoor localization and environment modeling of an unmanned aerial vehicle, characterized by:
Step 1): building the unmanned aerial vehicle hardware: an onboard embedded main CPU, a flight control board, an RGB-D sensor, power-supply equipment and radio transmission equipment;
Step 2): controlling the unmanned aerial vehicle to fly indoors and collecting data with the RGB-D sensor;
Step 3): supposing the aircraft is at some moment t of its indoor flight, adopting visual odometry to estimate how the aircraft has moved at moment t compared with moment t-1; the change specifically comprises translation and rotation and is obtained as follows:
first, the corresponding feature points of two consecutive frames are obtained with feature detection and feature matching techniques, and length and angle restriction conditions are used to remove some abnormal feature matches; then the change of the flight position is computed based on the least-error principle;
Step 4): in the data-fusion stage, smoothing the gap between the flight position estimated by the visual odometry and the actual position; adopting the extended Kalman filter algorithm to denoise and smooth the flight-path curve;
Step 5): after the visual odometry and the extended Kalman filtering are completed onboard, obtaining a preliminary position estimate; the ground station communicates with the aircraft to obtain the RGB-D data of the indoor environment and the estimated position, for path planning and indoor environment modeling;
Step 6): adopting a loop-closure detection technique to optimize the flight path, specifically comprising: while the aircraft flies in the indoor space, whenever the flight path overlaps the path of some earlier moment, a closed loop is formed; the looping flight path is then corrected by rerunning the iterative algorithm, thereby reducing the position-estimation error;
Step 7): after steps 3, 4, 5 and 6, obtaining a more accurate flight path and RGB-D data; to achieve real-time display, the OpenGL computer-graphics interface is adopted to process the RGB-D data and realize real-time display of the three-dimensional map.
2. The method according to claim 1, characterized in that the RGB-D sensor mounted on the unmanned aerial vehicle is a sensor device capable of collecting accurate indoor image information and depth information.
3. The method according to claim 1, characterized by further comprising: the indoor environment information collected by the RGB-D sensor on the unmanned aerial vehicle is displayed, with its color and depth information, in real time by the ground computing system through the OpenGL computer-graphics interface, and the data are used for three-dimensional indoor environment modeling.
4. The method according to claim 1, characterized in that in said step 2 the visual odometry algorithm is completed using only the depth image, realizing localization and modeling in dark, unlit environments.
5. The method according to claim 1, characterized in that in said step 4 the extended Kalman filter (EKF) algorithm applied is specifically as follows:
the estimate of the state vector X_k is obtained through updates of the observation Y_k, satisfying the following relations:
X(k+1) = f(X_k) + W_k
Y(k) = h(X_k) + V_k
where X(k+1) is estimated from the value of X_k at the previous moment; f(X_k) and h(X_k) are the nonlinear process model and observation model; Y_k is the observation vector obtained by computation; and W_k and V_k are the process and measurement errors, assumed mutually independent, white Gaussian noises with the distributions p(W) ~ N(0, Q) and p(V) ~ N(0, R);
X_{k|k-1} = F · X_{k-1|k-1}
P_{k|k-1} = F · P_{k-1|k-1} · F^T + W · Q_k · W^T
K_k = P_{k|k-1} · H_k^T · [R_k + H_k · P_{k|k-1} · H_k^T]^(-1)
X_{k|k} = X_{k|k-1} + K_k · [Y_k - h(X_{k|k-1})]
P_{k|k} = [I - K_k · H_k] · P_{k|k-1}
in the first formula, F is the state transition matrix and X_{k-1|k-1} is the state vector at moment k-1, yielding the estimate X_{k|k-1} for moment k;
the second formula is the covariance prediction: Q_k is the process noise, P_{k-1|k-1} is the covariance matrix at moment k-1, and P_{k|k-1} is the predicted covariance matrix at moment k;
the third formula is the Kalman gain, which controls the convergence speed: K_k is the Kalman gain, R_k is the measurement noise, and H_k is the Jacobian matrix of the partial derivatives of the function h with respect to X at moment k;
the fourth formula gives the desired optimal estimate X at moment k;
the fifth formula updates the covariance using the linearization of h(X_k), where I is the identity matrix;
after the EKF algorithm runs, the more accurate position of the visual odometry is obtained, and the precise displacement is sent back to the ground for display.
6. The method according to claim 1, characterized in that, in step 5, UDP is chosen for transmitting the RGB-D data over the network, tolerating frame loss in exchange for fast transfer; meanwhile, TCP is chosen for small-volume data with very high accuracy requirements.
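A minimal sketch of this transport split, assuming our own hypothetical framing (a 4-byte frame id before each RGB-D payload, three doubles for a pose); it is not taken from the patent:

```python
import socket
import struct

# UDP: bulky RGB-D frames, where an occasional lost datagram is acceptable
def send_frame_udp(sock, addr, frame_id, payload):
    # prepend a frame id so the receiver can detect dropped frames
    sock.sendto(struct.pack("!I", frame_id) + payload, addr)

# TCP: small, accuracy-critical data (e.g. the refined pose estimate)
def send_pose_tcp(sock, x, y, z):
    sock.sendall(struct.pack("!3d", x, y, z))
```

The design choice mirrors the claim: UDP avoids retransmission stalls on the high-bandwidth sensor stream, while TCP's reliability suits the small pose messages that must arrive intact.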
7. The method according to claim 1 or 2, characterized in that a computer multithreading mechanism is adopted to ensure the independent operation of each module and the real-time computation, generation, and display of the model.
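The module decoupling in claim 7 can be sketched as a toy producer/consumer pipeline; the module split (capture, compute, display) and the stand-in processing are our illustrative assumptions, not the patent's code:

```python
import queue
import threading

def run_pipeline(frames):
    """Toy pipeline: one thread per module, connected by queues so
    each module runs independently of the others."""
    raw_q, model_q = queue.Queue(), queue.Queue()

    def capture():                       # sensor-acquisition module
        for frame in frames:
            raw_q.put(frame)
        raw_q.put(None)                  # sentinel: no more data

    def compute():                       # modeling/computation module
        while (frame := raw_q.get()) is not None:
            model_q.put(frame * 2)       # stand-in for real processing
        model_q.put(None)

    threads = [threading.Thread(target=capture),
               threading.Thread(target=compute)]
    for t in threads:
        t.start()
    results = []                         # display module runs in main thread
    while (item := model_q.get()) is not None:
        results.append(item)
    for t in threads:
        t.join()
    return results
```

Because each stage blocks only on its own queue, a slow display cannot stall sensor capture, which is the point of the multithreaded design.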
CN201410127664.4A 2014-03-29 2014-03-29 Indoor simultaneous locating and environment modeling method for unmanned aerial vehicle Pending CN103926933A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410127664.4A CN103926933A (en) 2014-03-29 2014-03-29 Indoor simultaneous locating and environment modeling method for unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410127664.4A CN103926933A (en) 2014-03-29 2014-03-29 Indoor simultaneous locating and environment modeling method for unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
CN103926933A true CN103926933A (en) 2014-07-16

Family

ID=51145191

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410127664.4A Pending CN103926933A (en) 2014-03-29 2014-03-29 Indoor simultaneous locating and environment modeling method for unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN103926933A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5636312A (en) * 1992-12-17 1997-06-03 Pioneer Electronic Corporation Video image mixing apparatus
CN102508439A (en) * 2011-11-18 2012-06-20 天津大学 HLA (High Level Architecture)-based multi-unmmaned aerial vehicle distributed simulation method
CN102692236A (en) * 2012-05-16 2012-09-26 浙江大学 Visual milemeter method based on RGB-D camera
CN103093047A (en) * 2013-01-12 2013-05-08 天津大学 Typical aircraft visual simulation system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
SHI Binbin et al., "3D Imaging Vision Guidance System", Electronic Design Engineering *
GUO Fang, "Research on Localization of Quadrotor UAVs in Complex Environments", China Master's Theses Full-text Database *
LUO Yunxiang, "Application of Nonlinear Filtering in Mobile Robot SLAM", China Master's Theses Full-text Database *

Cited By (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104154910A (en) * 2014-07-22 2014-11-19 清华大学 Indoor micro unmanned aerial vehicle location method
CN111918214B (en) * 2014-08-15 2022-11-11 星盟国际有限公司 Device for determining a distance based on a transmission signal
US11582577B2 (en) 2014-08-15 2023-02-14 Star Ally International Limited System and method of time of flight detection
CN111918214A (en) * 2014-08-15 2020-11-10 化文生 Device for determining a distance based on a transmission signal
CN104679002A (en) * 2015-01-28 2015-06-03 北京航空航天大学 Mobile robot system polluted by noises and coordination and control method of mobile robot system
CN104679002B (en) * 2015-01-28 2017-06-06 北京航空航天大学 By the mobile-robot system and its control method for coordinating of noise pollution
CN104714555A (en) * 2015-03-23 2015-06-17 深圳北航新兴产业技术研究院 Three-dimensional independent exploration method based on edge
CN104714555B (en) * 2015-03-23 2017-05-10 深圳北航新兴产业技术研究院 Three-dimensional independent exploration method based on edge
TWI573104B (en) * 2015-03-25 2017-03-01 宇瞻科技股份有限公司 Indoor monitoring system and method thereof
CN104700575A (en) * 2015-03-27 2015-06-10 尚乐 Safe water rescue system and method
CN106233219B (en) * 2015-03-31 2020-03-17 深圳市大疆创新科技有限公司 Mobile platform operating system and method
CN106233219A (en) * 2015-03-31 2016-12-14 深圳市大疆创新科技有限公司 Mobile platform operating system and method
CN111273690A (en) * 2015-03-31 2020-06-12 深圳市大疆创新科技有限公司 Mobile platform operating system and method
CN104964683A (en) * 2015-06-04 2015-10-07 上海物景智能科技有限公司 Closed loop correction method for indoor environment map creation
CN104964683B (en) * 2015-06-04 2018-06-01 上海物景智能科技有限公司 A kind of closed-loop corrected method of indoor environment map building
CN107735794A (en) * 2015-08-06 2018-02-23 埃森哲环球服务有限公司 Use the condition detection of image procossing
CN107735794B (en) * 2015-08-06 2021-06-04 埃森哲环球服务有限公司 Condition detection using image processing
CN105352508A (en) * 2015-10-22 2016-02-24 深圳创想未来机器人有限公司 Method and device of robot positioning and navigation
CN105809687B (en) * 2016-03-08 2019-09-27 清华大学 A kind of monocular vision ranging method based on point information in edge in image
CN105809687A (en) * 2016-03-08 2016-07-27 清华大学 Monocular vision ranging method based on edge point information in image
CN105892474A (en) * 2016-03-31 2016-08-24 深圳奥比中光科技有限公司 Unmanned plane and control method of unmanned plane
CN105786016A (en) * 2016-03-31 2016-07-20 深圳奥比中光科技有限公司 Unmanned plane and RGBD image processing method
CN105847684A (en) * 2016-03-31 2016-08-10 深圳奥比中光科技有限公司 Unmanned aerial vehicle
CN105912980A (en) * 2016-03-31 2016-08-31 深圳奥比中光科技有限公司 Unmanned plane and unmanned plane system
CN105786016B (en) * 2016-03-31 2019-11-05 深圳奥比中光科技有限公司 The processing method of unmanned plane and RGBD image
CN105930766A (en) * 2016-03-31 2016-09-07 深圳奥比中光科技有限公司 Unmanned plane
CN105912980B (en) * 2016-03-31 2019-08-30 深圳奥比中光科技有限公司 Unmanned plane and UAV system
WO2018023556A1 (en) * 2016-08-04 2018-02-08 SZ DJI Technology Co., Ltd. Methods and systems for obstacle identification and avoidance
CN106371463B (en) * 2016-08-31 2019-04-02 重庆邮电大学 More gyroplane earth stations position infrared beacon system
CN106371463A (en) * 2016-08-31 2017-02-01 重庆邮电大学 Multi-rotorcraft ground station positioning infrared beacon system
CN108256543A (en) * 2016-12-29 2018-07-06 纳恩博(北京)科技有限公司 A kind of localization method and electronic equipment
CN106846485A (en) * 2016-12-30 2017-06-13 Tcl集团股份有限公司 A kind of indoor three-dimensional modeling method and device
CN107085422A (en) * 2017-01-04 2017-08-22 北京航空航天大学 A kind of tele-control system of the multi-functional Hexapod Robot based on Xtion equipment
CN107478271A (en) * 2017-08-10 2017-12-15 哈尔滨工业大学 It is a kind of to take care of the data acquisition device that facility fits the evaluation of old property in the daytime for the elderly
CN110545373B (en) * 2018-05-28 2021-12-28 中兴通讯股份有限公司 Spatial environment sensing method and device
CN110545373A (en) * 2018-05-28 2019-12-06 中兴通讯股份有限公司 Spatial environment sensing method and device
CN108303099B (en) * 2018-06-14 2018-09-28 江苏中科院智能科学技术应用研究院 Autonomous navigation method in unmanned plane room based on 3D vision SLAM
CN108303099A (en) * 2018-06-14 2018-07-20 江苏中科院智能科学技术应用研究院 Autonomous navigation method in unmanned plane room based on 3D vision SLAM
CN108445900A (en) * 2018-06-20 2018-08-24 江苏大成航空科技有限公司 A kind of unmanned plane vision positioning replacement differential technique
CN109254591A (en) * 2018-09-17 2019-01-22 北京理工大学 The dynamic route planning method of formula sparse A* and Kalman filtering are repaired based on Anytime
CN109254591B (en) * 2018-09-17 2021-02-12 北京理工大学 Dynamic track planning method based on Anytime restoration type sparse A and Kalman filtering
CN109483409A (en) * 2018-11-21 2019-03-19 无锡荣恩科技有限公司 The paint removal method that aviation components fill spray automatically
CN110045750A (en) * 2019-05-13 2019-07-23 南京邮电大学 A kind of indoor scene building system and its implementation based on quadrotor drone
CN110456822A (en) * 2019-08-23 2019-11-15 西安爱生技术集团公司 A kind of small and medium size unmanned aerial vehicles double redundancy independently measures flight control system
CN110887487A (en) * 2019-11-14 2020-03-17 天津大学 Indoor synchronous positioning and mapping method
CN110887487B (en) * 2019-11-14 2023-04-18 天津大学 Indoor synchronous positioning and mapping method
CN113448343B (en) * 2020-03-26 2024-01-19 精工爱普生株式会社 Method, system and readable medium for setting a target flight path of an aircraft
CN113448343A (en) * 2020-03-26 2021-09-28 精工爱普生株式会社 Method, system and program for setting a target flight path of an aircraft
CN112230211A (en) * 2020-10-15 2021-01-15 长城汽车股份有限公司 Vehicle positioning method and device, storage medium and vehicle
CN112356031A (en) * 2020-11-11 2021-02-12 福州大学 On-line planning method based on Kernel sampling strategy under uncertain environment
CN113720331A (en) * 2020-12-25 2021-11-30 北京理工大学 Multi-camera integrated unmanned aerial vehicle in-building navigation positioning method
CN113720331B (en) * 2020-12-25 2023-12-19 北京理工大学 Multi-camera fused unmanned aerial vehicle in-building navigation positioning method
CN112505065A (en) * 2020-12-28 2021-03-16 上海工程技术大学 Method for detecting surface defects of large part by indoor unmanned aerial vehicle
CN112907644B (en) * 2021-02-03 2023-02-03 中国人民解放军战略支援部队信息工程大学 Machine map-oriented visual positioning method
CN112907644A (en) * 2021-02-03 2021-06-04 中国人民解放军战略支援部队信息工程大学 Machine map-oriented visual positioning method
CN114650089A (en) * 2022-03-15 2022-06-21 广东汇天航空航天科技有限公司 Aircraft positioning and tracking processing method and device and positioning and tracking system
CN114650089B (en) * 2022-03-15 2023-09-22 广东汇天航空航天科技有限公司 Aircraft positioning and tracking processing method, device and positioning and tracking system
CN117348500A (en) * 2023-12-04 2024-01-05 济南华科电气设备有限公司 Automatic control method and system for fully-mechanized coal mining face
CN117348500B (en) * 2023-12-04 2024-02-02 济南华科电气设备有限公司 Automatic control method and system for fully-mechanized coal mining face

Similar Documents

Publication Publication Date Title
CN103926933A (en) Indoor simultaneous locating and environment modeling method for unmanned aerial vehicle
CN110262546B (en) Tunnel intelligent unmanned aerial vehicle inspection method
EP3940421A1 (en) Positioning method and device based on multi-sensor fusion
CN106840148B (en) Wearable positioning and path guiding method based on binocular camera under outdoor working environment
Achtelik et al. Stereo vision and laser odometry for autonomous helicopters in GPS-denied indoor environments
CN109885080B (en) Autonomous control system and autonomous control method
CN106168805A (en) The method of robot autonomous walking based on cloud computing
CN106097304B (en) A kind of unmanned plane real-time online ground drawing generating method
CN108827306A (en) A kind of unmanned plane SLAM navigation methods and systems based on Multi-sensor Fusion
CN107167139A (en) A kind of Intelligent Mobile Robot vision positioning air navigation aid and system
CN104236548A (en) Indoor autonomous navigation method for micro unmanned aerial vehicle
CN108536145A (en) A kind of robot system intelligently followed using machine vision and operation method
CN102190081B (en) Vision-based fixed point robust control method for airship
CN106017463A (en) Aircraft positioning method based on positioning and sensing device
CN113189977B (en) Intelligent navigation path planning system and method for robot
CN106898249B (en) A kind of map constructing method for earthquake-stricken area communication failure region
CN113703462B (en) Unknown space autonomous exploration system based on quadruped robot
EP4088884A1 (en) Method of acquiring sensor data on a construction site, construction robot system, computer program product, and training method
CN106227239A (en) Four rotor flying robot target lock-ons based on machine vision follow the tracks of system
CN111220999A (en) Restricted space detection system and method based on instant positioning and mapping technology
Karam et al. Integrating a low-cost mems imu into a laser-based slam for indoor mobile mapping
CN114923477A (en) Multi-dimensional space-ground collaborative map building system and method based on vision and laser SLAM technology
Si et al. A novel positioning method of anti-punching drilling robot based on the fusion of multi-IMUs and visual image
CN116352722A (en) Multi-sensor fused mine inspection rescue robot and control method thereof
Wang et al. Micro aerial vehicle navigation with visual-inertial integration aided by structured light

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20140716