CN108225302A - Petrochemical plant inspection robot positioning system and method - Google Patents

Petrochemical plant inspection robot positioning system and method

Info

Publication number
CN108225302A
Authority
CN
China
Prior art keywords: observation, observing, module, available, information
Prior art date: 2017-12-27
Legal status: Granted
Application number
CN201711439909.7A
Other languages
Chinese (zh)
Other versions
CN108225302B (en)
Inventor
朱华
李猛钢
李雨潭
由韶泽
Current Assignee
Xuzhou Sengon Intelligent Technology Co ltd
China University of Mining and Technology CUMT
Original Assignee
Xuzhou Sengon Intelligent Technology Co ltd
China University of Mining and Technology CUMT
Priority date: 2017-12-27
Filing date: 2017-12-27
Publication date: 2018-06-29
Application filed by Xuzhou Sengon Intelligent Technology Co Ltd and China University of Mining and Technology (CUMT)
Priority to CN201711439909.7A
Publication of CN108225302A
Application granted
Publication of CN108225302B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/10 ... by using measurements of speed or acceleration
    • G01C21/12 ... executed aboard the object being navigated; Dead reckoning
    • G01C21/16 ... by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 ... combined with non-inertial navigation instruments
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 ... the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS, GLONASS or GALILEO
    • G01S19/42 Determining position
    • G01S19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/46 ... the supplementary measurement being of a radio-wave signal type
    • G01S19/47 ... the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial
    • G01S19/48 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/49 ... whereby the further system is an inertial position system, e.g. loosely-coupled

Abstract

The invention discloses a positioning system and method for a petrochemical plant inspection robot. The system comprises a GPS observation unit, a UWB observation unit, a visual observation unit, a laser observation unit, an odometry observation unit, a combination decision module, an inertial state estimation module, an available-observation/SINS combination module and a positioning information fusion module. The five observation units each acquire their own observation data and send them to the combination decision module. The combination decision module, together with the environment map model stored in the environment model prior information module, performs positioning on a multilayer map and judges which observation types are available. The available observation data are combined with the corresponding motion equations established by the inertial state estimation module in the available-observation/SINS combination module, and the combined results are sent to the positioning information fusion module for the final integrated pose estimation. The invention solves the problem of seamless positioning of an inspection robot operating in the various complex regions of a petrochemical plant, and guarantees the continuity of the positioning process and the stability of the positioning result.

Description

Petrochemical plant inspection robot positioning system and method
Technical field
The invention belongs to the field of specialized robot positioning, and more particularly relates to a petrochemical plant inspection robot positioning system and method.
Background art
The work areas of petrochemical enterprises, oil depots and warehousing and transport companies are large, their environments are complex, and safety hazards are not easy to investigate. At present the instruments and meters in workplaces such as tank farms, pump areas and substations can only be inspected manually on foot. The inspection content required in a petrochemical plant is varied and generally includes pipeline corridors, storage tank bottoms, pump stations, metering rooms, flow workshops, explosion-proof passageways and other workplaces, where all kinds of valves, instruments and equipment must be checked for safety. These items to be inspected may be indoors or outdoors, in structured or unstructured areas, by day or at night, under good or poor lighting, in rain, wind or snow, and possibly in adverse conditions such as dust and steam; the enormous workload and the danger involved burden the inspection task. Using an autonomous inspection robot instead of manual inspection can greatly reduce the workers' workload and their exposure to risk.
Inspection robots have already been applied and promoted in well-structured environments such as substations and office areas, but their application in the petrochemical industry has progressed slowly, because compared with substations and indoor environments the petrochemical plant environment is far less structured and the inspection contents, location types and surroundings are more complex. In addition, inspection must be carried out by day and at night and in bad weather such as rain and snow. Because of the complexity of the inspection task, the robot must have a strong positioning capability: it should adapt to all kinds of complex sites and environmental conditions while strictly guaranteeing positioning accuracy, so that it does not collide with hazardous equipment or other objects during autonomous navigation and motion. When the robot moves into regions where the scene or the environmental features change greatly, the positioning must not oscillate or fail. High-precision, highly reliable positioning in all kinds of complex environments therefore plays a vital role in the autonomy and safety of a petrochemical plant inspection robot.
At present few inspection robots are applied in the petrochemical industry. The localization methods commonly used by inspection robots in other fields are the following:
1. Magnetic strip guidance with RFID-assisted positioning. This is the common positioning scheme of current factory AGVs and substation inspection robots: the robot is localized by laying tracks and RFID tags in advance. Its main drawbacks are that the magnetic strips must be installed beforehand, construction and later maintenance costs are high, and the inspection path is fixed and inflexible. It is also suited only to flat road surfaces and adapts poorly to regions with complex terrain. A petrochemical plant covers a large area, later modification is difficult and the terrain is complex, so magnetic strip guidance is costly and cannot provide accurate positioning over complex terrain.
2. GNSS/INS integrated positioning. This method is widely used in vehicle systems, outdoor mobile robots and UAVs, combining high-precision GPS with an IMU. Its main drawbacks are the high equipment cost and the multipath effect of GPS: it adapts poorly to inspection sites crowded with equipment and buildings and cannot be used indoors or in other occluded environments. A petrochemical plant has many indoor inspection environments such as flow workshops, metering rooms and substations, and storage tanks and pump-area equipment stand close together, so the accuracy of GPS positioning alone is hard to guarantee.
3. Dead-reckoning positioning. This method is mainly used in fields such as indoor mobile robots: a motion model is built from an odometer or an inertial measurement unit and the pose is estimated by dead reckoning. Its main drawbacks are that the accumulated error grows as the range of motion expands, so the positioning capability is gradually lost, and that it adapts poorly to rough ground. Petrochemical plants contain rough and sloping road surfaces, so dead reckoning alone accumulates a large error.
4. Two-dimensional laser SLAM positioning. This method is mainly used on indoor mobile robots and substation inspection robots, using laser SLAM for simultaneous localization and mapping. Its main drawbacks are weak adaptability to the environment, inability to localize accurately in feature-sparse places, and poor adaptability to rough ground. A petrochemical plant has feature-poor places such as long indoor corridors and outdoor firewalls that two-dimensional laser SLAM cannot handle, and the laser also adapts poorly to work in rain.
5. Visual SLAM positioning. This method is mainly used in applications such as household sweeping robots. Its main drawbacks are sensitivity to illumination, inability to work at night, and easy positioning failure in situations such as fast robot rotation. A petrochemical plant inspection robot must work at night, which is difficult to achieve with visual SLAM.
In addition, some research has used multi-sensor fusion to localize robots, but it still targets integrated positioning for specific occasions and is usually applied where the sensor signals are stable and state transmission is good; integrated positioning and stable localization under signal-loss conditions remain unsatisfactory.
In short, for the positioning problem of petrochemical plant inspection robots there is at present no inspection robot positioning system that can simultaneously achieve accurate and stable positioning indoors and outdoors, on flat and rough ground, by day and at night, in feature-sparse and feature-rich areas, and in rain and snow. A further important point is that equipment used in a petrochemical plant must also meet explosion-proof safety requirements, which the positioning devices currently in use can hardly satisfy for direct application in such environments.
Summary of the invention
To solve the technical problems raised by the above background art, the present invention aims to provide a petrochemical plant inspection robot positioning system and method that solve the problem of seamless positioning of an inspection robot operating in the various complex regions of a petrochemical plant and guarantee the continuity of the positioning process and the stability of the positioning result.
To achieve the above technical purpose, the technical solution of the present invention is as follows.
A petrochemical plant inspection robot positioning system comprises a GPS observation unit, a UWB observation unit, a visual observation unit, a laser observation unit, an odometry observation unit, a combination decision module, an inertial state estimation module, an available-observation/SINS combination module and a positioning information fusion module. The GPS, UWB, visual, laser and odometry observation units each acquire their own observation data and send them at specific frequencies to the combination decision module. The combination decision module receives the environment map model stored in the environment model prior information module, performs positioning on the multilayer map with reference to the observation data, judges which observation types are available, and sends the available observation data to the available-observation/SINS combination module. The available-observation/SINS combination module combines the available observation data with the corresponding motion equations established by the inertial state estimation module and sends the combined results to the positioning information fusion module, which performs the final integrated pose estimation.
The localization method based on the above petrochemical plant inspection robot positioning system comprises the following steps:
(1) When the robot is powered on or manually moved, initialize the observations z_i(0) of the five observation units and the state variable spaces x_i(0) of the inertial state estimation module. The five observation units each acquire observation data z_i(k_i) and send them at specific frequencies to the combination decision module; the subscript i = 1, 2, 3, 4, 5 denotes the five observation units.
(2) The combination decision module combines the observation information of each observation unit with the environment map model stored in the environment model prior information module, makes a preliminary pose estimate and determines the available observation types i; it then sends the available observation data z_i(k_i) at frequency S_i, together with the corresponding observation matrices C_i(k_i), to the available-observation/SINS combination module.
(3) The inertial state estimation module constructs the motion state variable x_i(k) corresponding to each available observation z_i(k_i) and sends x_i(k) at the common frequency S_0 to the available-observation/SINS combination module.
(4) The available-observation/SINS combination module combines each motion state variable x_i(k) with the corresponding observation matrix C_i(k_i); using the virtual observation method it interpolates and extrapolates each C_i(k_i) onto the common frequency S_0 to obtain C_i(k), and builds the local filters.
(5) The positioning information fusion module integrates the estimates of the local filters with an adaptive federated filtering fusion algorithm to obtain the final combined pose estimate.
Further, the environment model prior information module is obtained by aligning original maps into a map model in which three layers are superimposed: a GIS electronic map, a visual feature map and a laser point cloud map.
Further, in step (2) the method for determining the available observation types is as follows:
The combination decision module receives the GPS satellite positioning information acquired by the GPS observation unit and performs coordinate positioning in the GIS electronic map provided by the environment model prior information module. If the GPS observation unit localizes normally in the GIS electronic map continuously for a time t_1, i.e. the number of observed satellites is no less than 4 and the position dilution of precision is less than 4, the observation data acquired by the GPS observation unit are considered an available observation type.
The combination decision module receives the UWB wireless positioning information acquired by the UWB observation unit and performs coordinate positioning in the indoor environments of the GIS electronic map provided by the environment model prior information module. If the UWB observation unit localizes normally in the GIS electronic map continuously for a time t_2, i.e. the RSSI received signal strength PL(d) > R_min, where R_min is a set signal strength threshold, the observation data acquired by the UWB observation unit are considered an available observation type.
The combination decision module receives the image feature information acquired by the visual observation unit and performs feature-matching positioning in the visual feature map provided by the environment model prior information module. If the number of matched feature keypoint pairs detected by the visual observation unit, after wrong matches have been rejected, exceeds a set threshold N_3, the observation data acquired by the visual observation unit are considered an available observation type.
The combination decision module receives the environment point cloud information acquired by the laser observation unit and performs point cloud map registration positioning in the laser point cloud map provided by the environment model prior information module. If, during laser point cloud matching, the number of successfully matched feature points after noise and wrong matches have been removed exceeds a set threshold N_4, the observation data acquired by the laser observation unit are considered an available observation type.
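These availability tests reduce to simple threshold checks. The following Python sketch only illustrates them under stated assumptions: the continuity windows t_1 and t_2 are assumed to be enforced by passing in histories that already span those windows, and all function and parameter names are illustrative rather than part of the disclosure.

```python
def gps_available(fix_history):
    """fix_history: (n_satellites, PDOP) samples spanning the last t_1 seconds."""
    return bool(fix_history) and all(n >= 4 and pdop < 4.0 for n, pdop in fix_history)

def uwb_available(rssi_history, r_min):
    """rssi_history: RSSI values PL(d) spanning the last t_2 seconds."""
    return bool(rssi_history) and all(pl > r_min for pl in rssi_history)

def vision_available(n_inlier_keypoints, n3):
    """n_inlier_keypoints: matched keypoints left after wrong matches are rejected."""
    return n_inlier_keypoints > n3

def laser_available(n_matched_points, n4):
    """n_matched_points: point pairs kept after noise/mismatch removal in registration."""
    return n_matched_points > n4

# Example: GPS fixes sampled at 1 Hz over a 5 s continuity window
print(gps_available([(7, 1.8), (8, 1.6), (6, 2.1), (7, 1.9), (9, 1.5)]))
```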
Further, the inertial state estimation module feeds the measured z-axis acceleration, perpendicular to the robot platform, into the combination decision module, which judges whether the observation data of the odometry observation unit are an available observation type:
The combination decision module receives the encoder-feedback mileage information acquired by the odometry observation unit and localizes by dead reckoning in the GIS electronic map provided by the environment model prior information module. If the average rate of change of the z-axis acceleration output by the inertial state estimation module over the time ΔT is below a set threshold, the observation data of the odometry observation unit are considered an available observation type.
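The odometry test can be sketched the same way; evenly spaced accelerometer samples over ΔT and the variable names are assumptions for illustration only.

```python
import numpy as np

def odometry_available(z_accel, dt, threshold):
    """z_accel: z-axis accelerometer samples spanning the window Delta-T, spaced dt apart.
    Available when the mean absolute rate of change of the z-axis acceleration
    stays below the set threshold (i.e. the platform is not bouncing over rough ground)."""
    rate = np.abs(np.diff(np.asarray(z_accel))) / dt
    return float(rate.mean()) < threshold

print(odometry_available([9.81, 9.80, 9.82, 9.79, 9.81], dt=0.01, threshold=5.0))
```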
Further, in step (4): if the observation data acquired by the GPS observation unit are an available observation type, the corresponding local filter constructs the observation equation z_1 from the position difference and velocity difference between the GPS and SINS solutions; if the observation data acquired by the UWB observation unit are an available observation type, the corresponding local filter constructs the observation equation z_2 from the position difference between the UWB and SINS solutions; if the observation data acquired by the visual observation unit are an available observation type, the corresponding local filter constructs the observation equation z_3 from the position difference and attitude difference between the visual and SINS solutions; if the observation data acquired by the laser observation unit are an available observation type, the corresponding local filter constructs the observation equation z_4 from the position difference and attitude difference between the laser and SINS solutions; if the observation data of the odometry observation unit are an available observation type, the corresponding local filter constructs the observation equation z_5 from the position difference and attitude difference between the dead-reckoning and SINS solutions.
Further, step (5) is as follows:
(51) According to the available observation types judged by the combination decision module, determine the combined information distribution coefficients β_i and set the initial values of the local filters of the available observation types.
(52) Assign the global state estimate x_f, the covariance matrix P_f and the common noise matrix Q_f to each local filter according to the information-sharing principle.
(53) After time synchronization, each available observation equation z_i, together with the motion state variable x_i output by the inertial state estimation module, performs local pose estimation to obtain the local filtering result (x_i, P_i), where x_i is the local state estimate, P_i the corresponding covariance matrix, and i ∈ [1,5].
(54) Transform each local filtering positioning result from its local coordinate system into the robot centroid coordinate system; combining the available local pose estimates and the master filter feedback, perform combined estimation with the master filter to obtain the global optimal estimate.
(55) Reset the state estimates and covariance matrices of the local filters and the master filter, and carry out the pose estimation of the next iteration cycle.
Further, in step (51) the combined information distribution coefficients β_i satisfy the information conservation principle
β_1 + β_2 + β_3 + β_4 + β_5 + β_m = 1
where β_m is the information-sharing coefficient of the master filter, set according to the characteristics of the site.
For observation types judged unavailable by the combination decision module, the corresponding β_i = 0; the β_i of the available observation types are allocated equally according to the information conservation equation.
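A minimal sketch of this allocation rule, assuming β_m is chosen by the operator for the site and the remaining share is split equally among the channels judged available (the function and dictionary names are illustrative):

```python
def allocate_info_coefficients(available, beta_m=0.2):
    """available: dict mapping channel name -> bool from the decision module.
    Returns beta_i for every channel plus beta_m, so that
    sum(beta_i) + beta_m = 1 and beta_i = 0 for unavailable channels."""
    usable = [name for name, ok in available.items() if ok]
    share = (1.0 - beta_m) / len(usable) if usable else 0.0
    betas = {name: (share if ok else 0.0) for name, ok in available.items()}
    return betas, beta_m

betas, beta_m = allocate_info_coefficients(
    {"gps": True, "uwb": False, "vision": True, "laser": True, "odom": True})
print(betas, beta_m)  # four available channels share 0.8 equally; UWB gets 0
```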
Further, in step (55) the transformation between local coordinate systems and the robot centroid coordinate system is determined from the coordinate transformation relations between the installation positions of the observation units and the robot centroid coordinate system.
The beneficial effects brought by the above technical solution are as follows.
By combining multiple positioning subsystems, the present invention enables the inspection robot to operate accurately and stably in all kinds of complex situations in a petrochemical plant: indoors and outdoors, on flat and rough ground, by day and at night, in feature-sparse and feature-rich areas, and in rain and snow, guaranteeing seamless positioning over the whole scene and all working conditions. By first integrating the data of each observation sensor with the inertial state estimation module, observation equations are obtained at the output frequency of the inertial state estimation module, which guarantees the synchronization of sensor observations acquired at different frequencies. The adaptive federated filtering algorithm automatically judges the available positioning observation types in each environment and combines the sensors appropriate to the current scene, achieving a positioning error lower than that of any single method used alone and guaranteeing the accuracy of the seamless positioning.
Description of the drawings
Fig. 1 is the block diagram of the system of the present invention;
Fig. 2 is the flow chart of the method of the present invention;
Fig. 3 is a schematic diagram of the inspection-task-oriented adaptive federated filtering fusion algorithm;
Fig. 4 is the flow chart of the steps of the adaptive federated filtering algorithm.
Explanation of the main reference numerals:
1. GPS observation unit; 2. UWB observation unit; 3. visual observation unit; 4. laser observation unit; 5. odometry observation unit; 6. environment model prior information module; 7. combination decision module; 8. inertial state estimation module; 9. available-observation/SINS combination module; 10. positioning information fusion module.
Specific embodiment
The technical solution of the present invention is described in detail below with reference to the accompanying drawings.
As shown in Fig. 1, the petrochemical plant inspection robot positioning system designed by the present invention comprises a GPS observation unit 1, a UWB observation unit 2, a visual observation unit 3, a laser observation unit 4, an odometry observation unit 5, an environment model prior information module 6, a combination decision module 7, an inertial state estimation module 8, an available-observation/SINS combination module 9 and a positioning information fusion module 10.
The combination decision module 7 receives the environment map model stored in the environment model prior information module 6, performs positioning on the multilayer map with reference to the observation data, judges which observation types are available, and sends the available observation data to the available-observation/SINS combination module 9. The available-observation/SINS combination module 9 combines the available observation data with the corresponding motion equations established by the inertial state estimation module 8 and sends the combined results to the positioning information fusion module 10, which performs the final integrated pose estimation.
As shown in Fig. 2, the localization method based on the above positioning system proceeds as follows.
Step S21: when the robot is powered on or manually moved, initialize the observations z_i(0) of observation units 1-5 and the state variable spaces x_i(0) of the inertial state estimation module 8. Each observation unit 1-5 acquires its observation information z_i(k_i) and sends it at a specific frequency to the combination decision module 7.
When the robot operates outdoors in unobstructed environments where GPS signals can be received continuously, the GPS observation unit 1 automatically acquires the GPS signal and sends it to the combination decision module 7; in this embodiment the transmission frequency is 1 Hz.
When the robot operates in indoor environments where a UWB wireless sensor network has been deployed in advance and UWB signals can be received continuously, the UWB observation unit 2 automatically acquires the UWB signal and sends it to the combination decision module 7; in this embodiment the transmission frequency is set to 1 Hz.
During robot operation the visual observation unit 3 automatically acquires image information and sends it to the combination decision module 7; in this embodiment the transmission frequency is 20 Hz.
During robot operation the laser observation unit 4 automatically acquires laser point cloud information and sends it to the combination decision module 7; in this embodiment the acquisition frequency is set to 10 Hz.
During robot operation the odometry observation unit 5 automatically acquires mileage information and sends it to the combination decision module 7; in this embodiment the acquisition frequency is set to 20 Hz.
The environment model prior information module 6 is obtained by aligning original maps into a map model in which three layers are superimposed: a GIS electronic map, a visual feature map and a laser point cloud map. In this embodiment the GIS electronic map is built as a three-dimensional spatial database of the plant area by secondary development of ARCGIS software; on the basis of the remote-sensing aerial-survey GIS electronic map, the indoor environments covered by UWB anchor-node signals are surveyed manually and added to the GIS electronic map. The sparse feature point map is obtained by sparse environmental modeling with visual ORB-SLAM and added onto the GIS electronic map. The laser point cloud map is obtained with the laser odometry LOAM (Lidar Odometry and Mapping) method and superimposed onto the GIS electronic map. All types of sensor data acquired during mapping are synchronized and unified into the world coordinate system by coordinate transformation, thereby obtaining an environment model in which the three map layers are superimposed and aligned.
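Superimposing the layers amounts to expressing each layer in the common world frame by the rigid transform obtained from its registration to the GIS map. A minimal sketch, assuming each registration result is available as a 4x4 homogeneous matrix (the matrix and point values below are placeholders, not data from the disclosure):

```python
import numpy as np

def to_world(points_local, T_world_local):
    """points_local: (N, 3) points of one map layer in its own frame.
    T_world_local: 4x4 homogeneous transform from that frame to the world/GIS frame."""
    pts_h = np.hstack([points_local, np.ones((points_local.shape[0], 1))])
    return (T_world_local @ pts_h.T).T[:, :3]

# Example with an identity transform standing in for the real registration result
layer = np.array([[1.0, 2.0, 0.0], [3.0, 4.0, 0.5]])
print(to_world(layer, np.eye(4)))
```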
Step S22: the combination decision module 7 combines the observation information of each observation unit with the environment model prior information to make a first pose estimate and determines the available observation types i (i denotes the serial number of an available observation unit); it then sends each available observation z_i(k_i) at frequency S_i to the available-observation/SINS combination module 9.
The combination decision module 7 determines the currently available sensor types according to the sensor data acquired by each positioning observation unit and the multilayer map information of the environment model prior information module 6. The specific method is as follows.
The combination decision module 7 receives the GPS satellite positioning information acquired by the GPS observation unit 1, performs coordinate positioning in the GIS electronic map provided by the environment model prior information module 6, and judges whether the combination scheme includes the GPS observation information. If the GPS observation unit localizes normally in the GIS electronic map continuously for a time t_1 (the number of observed satellites is no less than 4 and the position dilution of precision is less than 4), the combination scheme of the current motion position is considered to include the GPS observation information. The combination decision module then sends, at frequency S_1, the observation z_1(k_1), the corresponding observation matrix C_1(k_1) and the observation noise v_1(k_1).
The combination decision module 7 receives the UWB wireless positioning information acquired by the UWB observation unit 2, performs coordinate positioning in the indoor environments of the GIS electronic map provided by the environment model prior information module 6, and judges whether the combination scheme includes the UWB observation information. If the UWB observation unit localizes normally in the GIS electronic map continuously for a time t_2 (RSSI received signal strength PL(d) > R_min, where R_min is a set signal strength threshold), the combination scheme of the current motion position is considered to include the UWB observation information. The combination decision module then sends, at frequency S_2, the observation z_2(k_2), the corresponding observation matrix C_2(k_2) and the observation noise v_2(k_2).
The combination decision module 7 receives the image feature information acquired by the visual observation unit 3, performs feature-matching positioning in the feature map provided by the environment model prior information module 6, and judges whether the combination scheme includes the visual observation information. If the number of matched feature keypoints detected by the visual observation unit 3, after wrong matches have been rejected, exceeds the specified threshold N_3, the combination scheme of the current motion position is considered to include the visual observation information. The combination decision module then sends, at frequency S_3, the observation z_3(k_3), the corresponding observation matrix C_3(k_3) and the observation noise v_3(k_3). As a preferred implementation, feature matching uses ORB feature points, the random sample consensus algorithm (RANSAC) rejects wrong registration pairs, and the robot pose is estimated with the localization mode of ORB-SLAM, as sketched below.
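ORB-SLAM itself is a complete system; the following OpenCV sketch only illustrates the ingredients named here: ORB keypoints, brute-force matching, RANSAC rejection of wrong pairs, and relative-pose recovery. The camera intrinsics K and the image file names are assumptions, and the inlier count is the quantity compared against N_3.

```python
import cv2
import numpy as np

K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])   # assumed intrinsics

img1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)          # assumed input images
img2 = cv2.imread("frame2.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# RANSAC rejects wrong registration pairs while estimating the essential matrix
E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                  prob=0.999, threshold=1.0)
n_inliers = int(inliers.sum())                 # compared against the threshold N_3
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
print(n_inliers, R, t)
```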
The combination decision module 7 receives the environment point cloud information acquired by the laser observation unit 4, performs point cloud map registration positioning in the point cloud map provided by the environment model prior information module 6, and judges whether the combination scheme includes the laser observation information. If, during laser point cloud matching, the number of successfully matched feature point pairs after noise and wrong matches have been removed exceeds the specified threshold N_4, the combination scheme of the current motion position is considered to include the laser observation information. The combination decision module then sends, at frequency S_4, the observation z_4(k_4), the corresponding observation matrix C_4(k_4) and the observation noise v_4(k_4). In this embodiment, the matching process uses the normal vectors of the neighbourhood of a query point as features, point cloud map registration is carried out with a kd-tree nearest-neighbour search, and the robot pose is estimated with the iterative closest point (ICP) method; a sketch follows.
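A minimal NumPy/SciPy sketch of one point-to-point ICP iteration as described (kd-tree nearest-neighbour association, distance-based rejection of wrong matches, closed-form alignment); a real implementation would iterate to convergence and use the neighbourhood normal-vector features mentioned above. The point clouds below are synthetic placeholders.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_step(src, dst, reject_dist=0.5):
    """One iteration of point-to-point ICP: returns R, t and the inlier count."""
    tree = cKDTree(dst)
    d, idx = tree.query(src)                    # nearest-neighbour association
    keep = d < reject_dist                      # reject noise / wrong matches
    p, q = src[keep], dst[idx[keep]]
    p_c, q_c = p - p.mean(0), q - q.mean(0)     # centre both point sets
    U, _, Vt = np.linalg.svd(p_c.T @ q_c)       # closed-form rotation (Kabsch)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                    # fix a possible reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = q.mean(0) - R @ p.mean(0)
    return R, t, int(keep.sum())                # inlier count compared against N_4

src = np.random.rand(200, 3)
dst = src + np.array([0.1, 0.0, 0.05])          # translated copy as the "map"
print(icp_step(src, dst))
```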
The combination decision module 7 receives the encoder-feedback mileage information acquired by the odometry observation unit 5, localizes by dead reckoning in the GIS electronic map provided by the environment model prior information module 6, and judges whether the combination scheme includes the odometry observation information. If the average rate of change of the z-axis acceleration output by the inertial state estimation module 8 over the time ΔT is below the set threshold, the combination scheme of the current motion position is considered to include the odometry observation information. The combination decision module then sends, at frequency S_5, the observation z_5(k_5), the corresponding observation matrix C_5(k_5) and the observation noise v_5(k_5).
Step S23: the inertial state estimation module builds, for each observation z_i(k_i), the corresponding state space variable x_i(k), state transition matrix A_i(k) and system noise term w_i(k), and sends each motion prediction x_i(k) at the common frequency S_0 to the available-observation/SINS combination module 9.
The inertial state estimation module 8 constructs the robot motion prediction state information x_i(k) for each different observation and inputs it to the available-observation/SINS combination module 9. At the same time it measures the z-axis acceleration perpendicular to the robot platform and inputs it to the combination decision module 7, which judges whether the combination scheme includes the odometry observation information. In this embodiment the inertial state estimation transmission frequency is S_0 = 100 Hz.
Step S24: the available-observation/SINS combination module 9 compares each SINS motion prediction x_i(k) sent by the inertial state estimation module with the observation matrices C_i(k_i) sent by the available observation units at their own frequencies, uses the virtual observation method to obtain observation results C_i(k) unified to the common rate S_0, and builds the local filters.
The available-observation/SINS combination module 9 combines the observation data C_i(k_i) of each available observation unit with the x_i(k) of the inertial state estimation module 8 to build a local filter. After discretization the system is
x_i(k+1) = A_i(k) x_i(k) + Γ_i(k) w_i(k)
z_i(k_i) = C_i(k_i) x_i(k_i) + v_i(k_i)
where x_i(k_i) = x(n_i(k_i - 1) + 1); n_i is the ratio of the SINS sample rate to the frequency of each observation unit; x_i(k) is the position and attitude estimated by the i-th estimation method; w_i(k) is the process noise, Γ_i(k) the noise input matrix, and Q_i(k) the corresponding process noise variance matrix; z_i(k_i) is the observation and C_i(k_i) the observation matrix; the observation error v_i(k_i) is zero-mean Gaussian white noise with variance R_i(k_i).
Using the virtual observation method, the observation frequency of each observation unit is unified to the SINS frequency by interpolation and extrapolation, giving the position estimation equations under each observation:
x_i(k+1) = A_i(k) x_i(k) + Γ_i(k) w_i(k)
z_i(k) = C_i(k) x_i(k) + v_i(k)
where the state of the i-th filter consists of x_c, the common state estimated by every sub-filter under the various observation constructions, i.e. the SINS position, velocity and attitude errors together with the gyroscope and accelerometer bias state space, and x_si, the state specific to the i-th filter. x_c is obtained as the optimal global estimate by the two-stage suboptimal filtering of the federated filter, while x_si is the primary filtering output estimate produced within the sub-filter.
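A minimal sketch of the virtual-observation resampling: real measurements at the sensor's own rate S_i are interpolated, and extrapolated beyond the last sample, onto the 100 Hz SINS time grid. Piecewise-linear interpolation and linear extrapolation are assumptions, since the exact rule is not reproduced in this text.

```python
import numpy as np

def virtual_observations(t_obs, z_obs, t_sins):
    """Resample a low-rate observation sequence onto the SINS time grid.
    t_obs, z_obs : timestamps and values of the real measurements (frequency S_i)
    t_sins       : SINS output timestamps (frequency S_0 = 100 Hz)"""
    z_virtual = np.interp(t_sins, t_obs, z_obs)           # interpolate inside the span
    after = t_sins > t_obs[-1]                             # extrapolate past the last sample
    if after.any() and len(t_obs) >= 2:
        slope = (z_obs[-1] - z_obs[-2]) / (t_obs[-1] - t_obs[-2])
        z_virtual[after] = z_obs[-1] + slope * (t_sins[after] - t_obs[-1])
    return z_virtual

t_obs = np.array([0.0, 1.0, 2.0])         # e.g. a GPS position component at 1 Hz
z_obs = np.array([10.0, 10.5, 11.1])
t_sins = np.arange(0.0, 2.2, 0.01)        # 100 Hz SINS grid
print(virtual_observations(t_obs, z_obs, t_sins)[-3:])
```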
The state equation of the local filter system can be expressed as
x_c(k+1) = A(k) x_c(k) + Γ(k) w(k)
with the common state
x_c = [φ^T  δv^T  δp^T  ε^T  ∇^T]^T
comprising the platform attitude error φ of the SINS system, the velocity error δv, the position error δp, the gyroscope constant zero drift ε and the accelerometer constant bias ∇; the system noise w(k) comprises the gyroscope white noise and the accelerometer white noise.
Local filter 1 constructs the observation equation z_1 from the position difference and velocity difference between the GPS and SINS solutions:
z_1(k) = C_1(k) x_1(k) + v_1(k),  x_1(k) = x_c(k)
where the components of the observation are the velocity difference δv_GPS and the position difference δp_GPS between the SINS and GPS solutions.
Local filter 2 constructs the observation equation z_2 from the position difference between the UWB and SINS solutions:
z_2(k) = C_2(k) x_2(k) + v_2(k),  x_2(k) = x_c(k)
where the observation is the position difference δp_UWB between the SINS and UWB solutions.
Local filter 3 constructs the observation equation z_3 from the position difference and pose difference between the visual positioning and SINS solutions:
z_3(k) = C_3(k) x_3(k) + v_3(k)
where the filter-specific state includes the visual measurement scale factor, and the observation stacks the attitude difference and position difference between the SINS solution and the camera pose estimate.
Local filter 4 constructs the observation equation z_4 from the position difference and attitude difference between the laser positioning and SINS solutions:
z_4(k) = C_4(k) x_4(k) + v_4(k)
where the observation stacks the attitude difference and position difference between the SINS solution and the lidar pose estimate.
Local filter 5 constructs the observation equation z_5 from the attitude difference and position difference between dead reckoning and SINS:
z_5(k) = C_5(k) x_5(k) + v_5(k)
where the observation stacks the attitude difference and position difference between the SINS solution and the dead-reckoning estimate, and the filter-specific state includes the odometer scale coefficient error δK.
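As a sketch of how the five observation vectors are assembled from differences between the SINS solution and each aiding solution (array layouts and names are illustrative, not the patent's exact matrices):

```python
import numpy as np

def z_gps(sins_pos, sins_vel, gps_pos, gps_vel):
    """Local filter 1: position and velocity differences between SINS and GPS."""
    return np.concatenate([sins_pos - gps_pos, sins_vel - gps_vel])

def z_uwb(sins_pos, uwb_pos):
    """Local filter 2: position difference between SINS and UWB."""
    return sins_pos - uwb_pos

def z_vision(sins_pos, sins_att, cam_pos, cam_att):
    """Local filter 3: position and attitude differences between SINS and the camera estimate."""
    return np.concatenate([sins_pos - cam_pos, sins_att - cam_att])

def z_laser(sins_pos, sins_att, lidar_pos, lidar_att):
    """Local filter 4: position and attitude differences between SINS and the lidar estimate."""
    return np.concatenate([sins_pos - lidar_pos, sins_att - lidar_att])

def z_odom(sins_pos, sins_att, dr_pos, dr_att):
    """Local filter 5: position and attitude differences between SINS and dead reckoning."""
    return np.concatenate([sins_pos - dr_pos, sins_att - dr_att])

# Example: GPS local-filter observation from 3-D positions and velocities
print(z_gps(np.zeros(3), np.zeros(3), np.array([0.2, -0.1, 0.0]), np.array([0.01, 0.0, 0.0])))
```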
Step S25: the positioning information fusion module 10 uses an inspection-task-oriented adaptive federated filtering fusion algorithm to integrate the local filter estimates (x_i, P_i) of each observation construction and obtain the final combined pose estimate (x_f, P_f).
Fig. 3 is a schematic diagram of the inspection-task-oriented adaptive federated filtering fusion algorithm. The observation information z_1 to z_5 sent by the observation units 1-5 is combined with the state estimation information x_1 to x_5 sent by the inertial state estimation module 8 and input to the corresponding local filters. Local filter i combines the motion equation x_i, the observation equation z_i and the information fed back by the master filter to carry out the local filtering position estimate. The local filtering results are then federated-filtered by the master filter to obtain the final pose estimation output (x_f, P_f).
Fig. 4 is the flow chart of the steps of the adaptive federated filtering algorithm. The detailed process is as follows.
Step S41: according to the available observation types judged by the combination decision module 7, determine the combined information distribution coefficients β_i and set the initial values of each available observation local filter.
The combined information distribution coefficients β_i satisfy the information conservation principle
β_1 + β_2 + β_3 + β_4 + β_5 + β_m = 1
where β_m is the information-sharing coefficient of the master filter, specified manually according to the characteristics of the site (0.1 < β_m < 0.3). For observations judged unavailable by the combination decision module 7, the corresponding β_i = 0; the β_i of the remaining available observations are distributed equally according to the information conservation equation.
The initial covariances P and noise variances R of each available local filter and of the master filter are set from the corresponding global values according to the information-sharing coefficients.
Step S42: the global state estimate x_f, covariance matrix P_f and common noise matrix Q_f are distributed to each local filter according to the information-sharing principle:
x_i = x_f,  P_i = P_f / β_i,  Q_i = Q_f / β_i
Step S43: after time synchronization, each available observation z_i, together with the SINS motion state x_i, carries out local pose state estimation to obtain the available local filtering results (x_i, P_i).
Each local filter and the master filter carry out state prediction and update respectively.
Prediction:
x_i(k|k-1) = A_i(k-1) x_i(k-1|k-1)
P_i(k|k-1) = A_i(k-1) P_i(k-1|k-1) A_i^T(k-1) + Γ_i(k-1) Q_i(k-1) Γ_i^T(k-1)
Update:
K_i(k) = P_i(k|k-1) C_i^T(k) [C_i(k) P_i(k|k-1) C_i^T(k) + R_i(k)]^(-1)
x_i(k|k) = x_i(k|k-1) + K_i(k) [z_i(k) - C_i(k) x_i(k|k-1)]
P_i(k|k) = [I_n - K_i(k) C_i(k)] P_i(k|k-1)
where I_n is the identity matrix.
The master filter performs only the time update:
P_m(k+1) = P_m(k|k-1)
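The prediction and update steps of each local filter are the standard Kalman recursions; a minimal NumPy sketch follows, with all matrices taken as placeholders supplied by the caller.

```python
import numpy as np

def kf_predict(x, P, A, Gamma, Q):
    """Time update of one local filter: returns x(k|k-1), P(k|k-1)."""
    x_pred = A @ x
    P_pred = A @ P @ A.T + Gamma @ Q @ Gamma.T
    return x_pred, P_pred

def kf_update(x_pred, P_pred, z, C, R):
    """Measurement update with observation z = C x + v, v ~ N(0, R)."""
    S = C @ P_pred @ C.T + R                          # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)               # Kalman gain K_i(k)
    x = x_pred + K @ (z - C @ x_pred)
    P = (np.eye(P_pred.shape[0]) - K @ C) @ P_pred    # [I_n - K C] P(k|k-1)
    return x, P
```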
Step S44: each local filtering positioning result is transformed from its local coordinate system into the robot centroid coordinate system; combining the available local pose estimates and the master filter feedback, the master filter performs combined estimation to obtain the global optimal estimate (x_f, P_f).
The transformation from a local coordinate system to the robot centroid coordinate system is determined from the coordinate transformation relation between each sensor's installation position and the robot centroid coordinate system.
The master filter carries out the adaptive federated information fusion:
P_f = (Σ_i P_i^(-1))^(-1),  x_f = P_f Σ_i P_i^(-1) x_i
where the sum runs over the available local filters and the master filter.
Step S45: the fused positioning result of the master filter is transformed back into the local coordinate system of each sensor; using these coordinate transformations, the position estimates and covariance matrices of each local filter and of the master filter are reset, and the pose estimation of the next iteration cycle is carried out.
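Steps S42, S44 and S45 correspond to the information-sharing, fusion and reset stages of a federated filter. The sketch below uses the usual information-form expressions; since the exact formulas are not reproduced in full above, it should be read as an assumption-level illustration rather than the patented implementation.

```python
import numpy as np

def federated_fuse(estimates):
    """S44: fuse local estimates [(x_i, P_i), ...] (optionally including the
    master filter's own estimate) into the global estimate x_f, P_f."""
    info = sum(np.linalg.inv(P) for _, P in estimates)
    P_f = np.linalg.inv(info)
    x_f = P_f @ sum(np.linalg.inv(P) @ x for x, P in estimates)
    return x_f, P_f

def federated_reset(x_f, P_f, Q_f, betas):
    """S42/S45: redistribute the fused estimate back to the local filters
    according to the information-sharing coefficients beta_i; unavailable
    channels (beta_i = 0) are left untouched."""
    out = []
    for b in betas:
        out.append((x_f.copy(), P_f / b, Q_f / b) if b > 0.0 else None)
    return out
```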
The embodiment merely illustrates the technical idea of the present invention and cannot limit the scope of protection of the present invention; any change made on the basis of the technical solution in accordance with the technical idea proposed by the present invention falls within the scope of protection of the present invention.

Claims (9)

1. A petrochemical plant inspection robot positioning system, characterized by comprising a GPS observation unit (1), a UWB observation unit (2), a visual observation unit (3), a laser observation unit (4), an odometry observation unit (5), a combination decision module (7), an inertial state estimation module (8), an available-observation/SINS combination module (9) and a positioning information fusion module (10); the GPS observation unit (1), UWB observation unit (2), visual observation unit (3), laser observation unit (4) and odometry observation unit (5) each acquire their own observation data and send them at specific frequencies to the combination decision module (7); the combination decision module (7) receives the environment map model stored in the environment model prior information module (6), performs positioning based on the multilayer map with reference to the observation data, judges the available observation types, and sends the available observation data to the available-observation/SINS combination module (9); the available-observation/SINS combination module (9) combines the available observation data with the corresponding motion equations established by the inertial state estimation module (8) and sends the combined results to the positioning information fusion module (10), which carries out the final integrated pose estimation.
2. A localization method based on the petrochemical plant inspection robot positioning system according to claim 1, characterized by comprising the following steps:
(1) when the robot is powered on or manually moved, initializing the observations z_i(0) of the five observation units (1-5) and the state variable spaces x_i(0) of the inertial state estimation module (8); the five observation units (1-5) each acquiring observation data z_i(k_i) and sending them at specific frequencies to the combination decision module (7), the subscript i = 1, 2, 3, 4, 5 denoting the five observation units;
(2) the combination decision module (7) combining the observation information of each observation unit with the environment map model stored in the environment model prior information module (6), making a preliminary pose estimate, determining the available observation types i, and then sending the available observation data z_i(k_i) at frequency S_i, together with the corresponding observation matrices C_i(k_i), to the available-observation/SINS combination module (9);
(3) the inertial state estimation module (8) constructing the motion state variable x_i(k) corresponding to each available observation z_i(k_i) and sending x_i(k) at the common frequency S_0 to the available-observation/SINS combination module (9);
(4) the available-observation/SINS combination module (9) combining each motion state variable x_i(k) with the corresponding observation matrix C_i(k_i), unifying each C_i(k_i) to the frequency S_0 by interpolation and extrapolation with the virtual observation method to obtain C_i(k), and building the local filters;
(5) the positioning information fusion module (10) integrating the estimates of the local filters with an adaptive federated filtering fusion algorithm to obtain the final combined pose estimate.
3. The localization method according to claim 2, characterized in that the environment model prior information module (6) is obtained by aligning original maps into a map model in which three layers are superimposed: a GIS electronic map, a visual feature map and a laser point cloud map.
4. The localization method according to claim 3, characterized in that in step (2) the method for determining the available observation types is as follows:
the combination decision module (7) receives the GPS satellite positioning information acquired by the GPS observation unit (1) and performs coordinate positioning in the GIS electronic map provided by the environment model prior information module (6); if the GPS observation unit localizes normally in the GIS electronic map continuously for a time t_1, i.e. the number of observed satellites is no less than 4 and the position dilution of precision is less than 4, the observation data acquired by the GPS observation unit (1) are considered an available observation type;
the combination decision module (7) receives the UWB wireless positioning information acquired by the UWB observation unit (2) and performs coordinate positioning in the indoor environments of the GIS electronic map provided by the environment model prior information module (6); if the UWB observation unit localizes normally in the GIS electronic map continuously for a time t_2, i.e. the RSSI received signal strength PL(d) > R_min, where R_min is a set signal strength threshold, the observation data acquired by the UWB observation unit (2) are considered an available observation type;
the combination decision module (7) receives the image feature information acquired by the visual observation unit (3) and performs feature-matching positioning in the visual feature map provided by the environment model prior information module (6); if the number of matched feature keypoint pairs detected by the visual observation unit (3), after wrong matches have been rejected, exceeds a set threshold N_3, the observation data acquired by the visual observation unit (3) are considered an available observation type;
the combination decision module (7) receives the environment point cloud information acquired by the laser observation unit (4) and performs point cloud map registration positioning in the laser point cloud map provided by the environment model prior information module (6); if, during laser point cloud matching, the number of successfully matched feature points after noise and wrong matches have been removed exceeds a set threshold N_4, the observation data acquired by the laser observation unit (4) are considered an available observation type.
5. The localization method according to claim 3, characterized in that the inertial state estimation module (8) inputs the measured z-axis acceleration, perpendicular to the robot platform, into the combination decision module (7), which judges whether the observation data of the odometry observation unit (5) are an available observation type:
the combination decision module (7) receives the encoder-feedback mileage information acquired by the odometry observation unit (5) and localizes by dead reckoning in the GIS electronic map provided by the environment model prior information module (6); if the average rate of change of the z-axis acceleration output by the inertial state estimation module (8) over the time ΔT is below a set threshold, the observation data of the odometry observation unit (5) are considered an available observation type.
6. The localization method according to claim 2, characterized in that in step (4): if the observation data acquired by the GPS observation unit (1) are an available observation type, the corresponding local filter constructs the observation equation z_1 from the position difference and velocity difference between the GPS and SINS solutions; if the observation data acquired by the UWB observation unit (2) are an available observation type, the corresponding local filter constructs the observation equation z_2 from the position difference between the UWB and SINS solutions; if the observation data acquired by the visual observation unit (3) are an available observation type, the corresponding local filter constructs the observation equation z_3 from the position difference and attitude difference between the visual and SINS solutions; if the observation data acquired by the laser observation unit (4) are an available observation type, the corresponding local filter constructs the observation equation z_4 from the position difference and attitude difference between the laser and SINS solutions; if the observation data of the odometry observation unit (5) are an available observation type, the corresponding local filter constructs the observation equation z_5 from the position difference and attitude difference between the dead-reckoning and SINS solutions.
7. The localization method according to claim 6, characterized in that step (5) is as follows:
(51) according to the available observation types judged by the combination decision module (7), determining the combined information distribution coefficients β_i and setting the initial values of the local filters of the available observation types;
(52) assigning the global state estimate x_f, covariance matrix P_f and common noise matrix Q_f to each local filter according to the information-sharing principle;
(53) after time synchronization, each available observation equation z_i, together with the motion state variable x_i output by the inertial state estimation module (8), performing local pose estimation to obtain the local filtering results (x_i, P_i), where x_i is the local state estimate, P_i the corresponding covariance matrix, and i ∈ [1,5];
(54) transforming each local filtering positioning result from its local coordinate system into the robot centroid coordinate system, and, combining the available local pose estimates and the master filter feedback, performing combined estimation with the master filter to obtain the global optimal estimate;
(55) resetting the state estimates and covariance matrices of the local filters and the master filter, and carrying out the pose estimation of the next iteration cycle.
8. The localization method according to claim 7, characterized in that in step (51) the combined information distribution coefficients β_i satisfy the information conservation principle
β_1 + β_2 + β_3 + β_4 + β_5 + β_m = 1
where β_m is the information-sharing coefficient of the master filter, set according to the characteristics of the site; for observation types judged unavailable by the combination decision module (7), the corresponding β_i = 0, and the β_i of the available observation types are allocated equally according to the information conservation equation.
9. The localization method according to claim 7, characterized in that in step (55) the transformation between local coordinate systems and the robot centroid coordinate system is determined from the coordinate transformation relations between the installation positions of the observation units (1-5) and the robot centroid coordinate system.
CN201711439909.7A, filed 2017-12-27: Petrochemical plant inspection robot positioning system and method. Active. Granted as CN108225302B (en).

Priority Applications (1)

Application Number: CN201711439909.7A (granted as CN108225302B); Priority Date: 2017-12-27; Filing Date: 2017-12-27; Title: Petrochemical plant inspection robot positioning system and method


Publications (2)

CN108225302A (published 2018-06-29)
CN108225302B (published 2020-03-17)

Family

ID=62648958

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711439909.7A Active CN108225302B (en) 2017-12-27 2017-12-27 Petrochemical plant inspection robot positioning system and method

Country Status (1)

Country Link
CN (1) CN108225302B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120290146A1 (en) * 2010-07-15 2012-11-15 Dedes George C GPS/IMU/Video/Radar absolute/relative positioning communication/computation sensor platform for automotive safety applications
CN102538781A (en) * 2011-12-14 2012-07-04 浙江大学 Machine vision and inertial navigation fusion-based mobile robot motion attitude estimation method
CN105628026A (en) * 2016-03-04 2016-06-01 深圳大学 Positioning and posture determining method and system of mobile object
CN106227220A (en) * 2016-09-28 2016-12-14 关健生 Independent navigation crusing robot based on Distributed Architecture
CN106325277A (en) * 2016-09-28 2017-01-11 关健生 Distributed framework based autonomous navigation and inspection robot control method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
禤家裕: "Research and Application of Multi-Sensor Data Fusion Technology in UAV Navigation Systems", China Master's Theses Full-text Database, Engineering Science and Technology II *

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109084760A (en) * 2018-07-11 2018-12-25 北京壹氢科技有限公司 Navigation system between a kind of building
CN108803620A (en) * 2018-07-25 2018-11-13 梁步阁 A kind of UWB positioning systems for robot
CN109100768A (en) * 2018-08-01 2018-12-28 南京科远自动化集团股份有限公司 A kind of synthesized positioning method and positioning label
CN109100768B (en) * 2018-08-01 2020-08-14 南京科远智慧科技集团股份有限公司 Comprehensive positioning method and positioning label
CN109059942A (en) * 2018-08-22 2018-12-21 中国矿业大学 A kind of high-precision underground navigation map building system and construction method
CN109141442A (en) * 2018-09-07 2019-01-04 高子庆 Navigation method based on UWB positioning and image feature matching and mobile terminal
CN109507677A (en) * 2018-11-05 2019-03-22 浙江工业大学 A kind of SLAM method of combination GPS and radar odometer
CN109323696B (en) * 2018-11-07 2022-07-08 航天信息股份有限公司 Indoor positioning navigation system and method for unmanned forklift
CN109323696A (en) * 2018-11-07 2019-02-12 航天信息股份有限公司 A kind of unmanned fork lift indoor positioning navigation system and air navigation aid
CN109612466A (en) * 2018-11-30 2019-04-12 北斗天地股份有限公司山东分公司 A kind of automobile-used multi-sensor combined navigation method and system in underground
CN109764874A (en) * 2018-12-20 2019-05-17 中国航空工业集团公司洛阳电光设备研究所 A kind of substation inspection vehicle autonomous navigation method
CN109917404A (en) * 2019-02-01 2019-06-21 中山大学 A kind of indoor positioning environmental characteristic point extracting method
CN109917404B (en) * 2019-02-01 2023-02-03 中山大学 Indoor positioning environment feature point extraction method
CN109807911A (en) * 2019-03-14 2019-05-28 湖南超能机器人技术有限公司 Based on GNSS, UWB, IMU, laser radar, code-disc the multi-environment joint positioning method of outdoor patrol robot
CN109807911B (en) * 2019-03-14 2021-10-22 湖南超能机器人技术有限公司 Outdoor patrol robot multi-environment combined positioning method based on GNSS, UWB, IMU, laser radar and code disc
CN110048509A (en) * 2019-04-22 2019-07-23 国网陕西省电力公司宝鸡供电公司 A kind of substation inspection system and method based on MR intelligent safety helmet
CN110362100A (en) * 2019-07-03 2019-10-22 武汉科技大学 A kind of autonomous method for inspecting of large-scale storage tank based on quadrotor drone
CN110362100B (en) * 2019-07-03 2022-02-01 武汉科技大学 Large storage tank autonomous inspection method based on quad-rotor unmanned aerial vehicle
CN112230256B (en) * 2019-07-15 2024-04-09 苏州宝时得电动工具有限公司 Autonomous robot, positioning calibration method and device thereof, and storage medium
CN112230256A (en) * 2019-07-15 2021-01-15 苏州宝时得电动工具有限公司 Autonomous robot, positioning calibration method and device thereof, and storage medium
CN110446159A (en) * 2019-08-12 2019-11-12 上海工程技术大学 A kind of system and method for interior unmanned plane accurate positioning and independent navigation
CN110514225A (en) * 2019-08-29 2019-11-29 中国矿业大学 The calibrating external parameters and precise positioning method of Multi-sensor Fusion under a kind of mine
CN110542418A (en) * 2019-09-05 2019-12-06 重庆大学 Indoor pipeline positioning method integrating two-dimensional code and inertial sensor
CN110940420A (en) * 2019-11-14 2020-03-31 国网山西省电力公司大同供电公司 Temperature abnormal point positioning system for miniature inspection robot with narrow and small tube cavities
CN111208526A (en) * 2020-01-17 2020-05-29 西北工业大学 Multi-unmanned aerial vehicle cooperative positioning method based on laser radar and positioning vector matching
CN111239723A (en) * 2020-02-25 2020-06-05 南京航空航天大学 Satellite radar urban canyon vehicle mutual positioning method based on factor graph
CN113375661A (en) * 2020-02-25 2021-09-10 郑州宇通客车股份有限公司 Positioning navigation method and system of unmanned system
CN111239723B (en) * 2020-02-25 2022-02-01 南京航空航天大学 Satellite radar urban canyon vehicle mutual positioning method based on factor graph
CN111735451B (en) * 2020-04-16 2022-06-07 中国北方车辆研究所 Point cloud matching high-precision positioning method based on multi-source prior information
CN111735451A (en) * 2020-04-16 2020-10-02 中国北方车辆研究所 Point cloud matching high-precision positioning method based on multi-source prior information
CN111426316A (en) * 2020-06-15 2020-07-17 北京云迹科技有限公司 Robot positioning method and device, robot and readable storage medium
CN111486840A (en) * 2020-06-28 2020-08-04 北京云迹科技有限公司 Robot positioning method and device, robot and readable storage medium
CN112214019A (en) * 2020-09-21 2021-01-12 国网浙江省电力有限公司 Non-blind area intelligent feedback control system, method and terminal for unmanned inspection equipment
CN112180405B (en) * 2020-11-27 2021-02-23 北京建筑大学 GPS-based method for positioning and determining postures of sporters in mountain snowfield
CN112180405A (en) * 2020-11-27 2021-01-05 北京建筑大学 GPS-based method for positioning and determining postures of sporters in mountain snowfield
CN113255600A (en) * 2021-06-29 2021-08-13 上海影创信息科技有限公司 Point cloud map updating optimization method, system, medium and equipment based on client
CN113255600B (en) * 2021-06-29 2021-10-01 上海影创信息科技有限公司 Point cloud map updating optimization method, system, medium and equipment based on client

Also Published As

Publication number Publication date
CN108225302B (en) 2020-03-17

Similar Documents

Publication Publication Date Title
CN108225302A (en) A kind of petrochemical factory's crusing robot alignment system and method
CN108181636B (en) Environment modeling and map building device and method for petrochemical plant inspection robot
CN104729506B (en) A kind of unmanned plane Camera calibration method of visual information auxiliary
Borenstein et al. Mobile robot positioning: Sensors and techniques
Grocholsky et al. Cooperative air and ground surveillance
CN113124856B (en) Visual inertia tight coupling odometer based on UWB (ultra wide band) online anchor point and metering method
Shalaby et al. Relative position estimation in multi-agent systems using attitude-coupled range measurements
Yun et al. IMU/Vision/Lidar integrated navigation system in GNSS denied environments
CN110686671B (en) Indoor 3D real-time positioning method and device based on multi-sensor information fusion
Meguro et al. Development of an autonomous mobile surveillance system using a network-based RTK-GPS
CN112925000B (en) Vehicle positioning method in tunnel environment based on visible light communication and inertial navigation
Bailey et al. Localisation in large-scale environments
Al-Darraji et al. A technical framework for selection of autonomous uav navigation technologies and sensors
Klein et al. LiDAR and INS fusion in periods of GPS outages for mobile laser scanning mapping systems
Ehrlich et al. Pedestrian localisation inside buildings based on multi-sensor smartphones
Suzuki et al. Development of a SIFT based monocular EKF-SLAM algorithm for a small unmanned aerial vehicle
Chiang et al. Semantic proximity update of GNSS/INS/VINS for Seamless Vehicular Navigation using Smartphone sensors
Ong et al. Six dof decentralised slam
Alliez et al. Indoor localization and mapping: Towards tracking resilience through a multi-slam approach
Li et al. A seamless indoor and outdoor low-cost integrated navigation system based on LIDAR/GPS/INS
Zali et al. Localization of an indoor mobile robot using decentralized data fusion
Fernandes et al. A low-cost localization system based on Artificial Landmarks
Meguro et al. Creating spatial temporal database by autonomous mobile surveillance system (a study of mobile robot surveillance system using spatial temporal GIS part 1)
Azizi et al. 3D inertial algorithm of SLAM for using on UAV
Zahran et al. New UAV velocity estimation using array of hall effect sensors for indoor navigation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant