CN107562054A - Autonomous navigation robot based on vision, RFID, IMU and odometer - Google Patents
- Publication number
- CN107562054A (application number CN201710772582.9A)
- Authority
- CN
- China
- Prior art keywords
- module
- robot
- imu
- odometer
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Navigation (AREA)
Abstract
The invention discloses an autonomous navigation robot based on vision, RFID, an IMU and an odometer, for use on roads on which RFID tags are affixed, comprising: an IMU and an odometer; a data acquisition module, which reads the data of the IMU, the odometer and the external RFID tags; a positioning module, which obtains the RFID tag data from the data acquisition module; a data fusion module, which fuses the RFID, odometer and IMU data to output accurate robot position information; a camera, which obtains current road information; a navigation module, which has a line-tracking mode and a local inertial navigation mode; an obstacle avoidance module, which detects surrounding obstacle information and feeds it back to the control module; a communication system, for communicating with the external cloud scheduling system; and a control module, which is electrically connected to the data acquisition module, positioning module, data fusion module, camera, navigation module, obstacle avoidance module and communication system, and coordinates the operation of the modules.
Description
Technical field
The present invention relates to the field of robots, and in particular to an autonomous navigation robot based on vision, RFID, an IMU and an odometer.
Background technology
Existing autonomous navigation robots are mostly positioned with GPS/BeiDou positioning, LBS (base station) positioning, iBeacon, WiFi positioning, track guidance, laser navigation or visual navigation. Each of these positioning technologies has its own problems:
(1) GPS and BeiDou positioning: autonomous navigation requires accurate positioning, but residential communities contain many high-rise buildings; GPS and BeiDou signals are easily blocked by buildings and affected by weather, so accurate positioning is impossible.
(2) LBS (base station) positioning: affected by base-station coverage density and by high-rise buildings. Where a community has many buildings, the signal environment is poor, signal drift occurs easily, and positioning accuracy is low.
(3) iBeacon positioning: using Bluetooth Low Energy, iBeacon base stations can automatically form a signal network, but outdoor Bluetooth positioning requires Bluetooth base stations, which are difficult to deploy over a wide outdoor area and costly to maintain.
(4) WiFi positioning: triangulation is performed with a differential algorithm on the wireless signal strengths between the robot's WiFi module and three wireless access points. Accurate positioning requires adding more WiFi base stations, and base-station construction and maintenance costs are high. UWB (ultra-wideband) positioning mostly uses TDOA and AOA algorithms to analyse the tag position relative to the sensors; its multipath resolution is strong and its accuracy reaches centimetre level, but UWB is difficult to extend over a large community area and its positioning cost is high. Traditional RFID positioning locates a tag from the position of the reading device; it cannot position in real time, its accuracy is low, it has no communication capability and poor anti-interference capability, it requires multiple RFID readers, and it needs maintenance, so its cost is high.
(5) Track guidance: magnetic nails must be laid along the planned route; construction is difficult, cost is high, routes are hard to change, and the result is unsightly.
(6) Laser navigation: the accuracy and non-divergence of a laser beam are used to locate the robot precisely and guide it. Reflector plates must be deployed at intervals along the travelled section; a rotating laser carried by the robot emits a beam, obtains the bearings of a series of reflector plates over one scanning revolution, and computes the coordinates of the laser rotation centre, thereby obtaining the robot's position. It is easily disturbed by pedestrians and vehicles, the reflector plates are difficult to install and unsightly, and the cost is high.
(7) Visual navigation: computation is heavy and the algorithms are complex; the recognition, filtering and transformation steps easily cause instability.
The prior art therefore has problems and needs further improvement.
Summary of the invention
In view of the above problems of inaccurate autonomous positioning, difficult construction, high maintenance cost and slow computation, the present invention provides an autonomous navigation robot based on vision, RFID, an IMU and an odometer. Here the IMU is an inertial measurement unit used to measure the robot's three-axis attitude and acceleration.
To achieve the above object, the concrete technical scheme of the invention is as follows: an autonomous navigation robot based on vision, RFID, an IMU and an odometer, for use on roads on which RFID tags are affixed, comprising:
an IMU and an odometer, wherein the IMU measures the robot's three-axis attitude and acceleration;
a data acquisition module, for reading the data of the IMU, the odometer and the external RFID tags;
a positioning module, for obtaining the RFID tag data from the data acquisition module and deriving coarse position information;
a data fusion module, which fuses the RFID tag, odometer and IMU data to output accurate robot position information;
a camera, for obtaining current road information;
a navigation module, which has a line-tracking mode and a local inertial navigation mode and switches between the two modes according to the road information fed back by the camera;
an obstacle avoidance module, for detecting surrounding obstacle information and feeding it back to the control module;
a communication system, for communicating with the external cloud scheduling system and receiving its instructions;
a control module, which is electrically connected to the data acquisition module, positioning module, data fusion module, camera, navigation module, obstacle avoidance module and communication system, and coordinates the operation of the modules.
Preferably, the data fusion module fuses the RFID, odometer and IMU data with a Kalman filter to output accurate robot position information.
Preferably, the line-tracking mode is used where the road information shows a lane line: the camera identifies the lane line, and the robot's offset is corrected with the visual feedback so that the robot stays within the lane line while travelling.
Preferably, the local inertial navigation mode is used where the road information shows no lane line: in lane-line-free regions, the external cloud scheduling system plans the robot's path from the point where it last left a lane line to the point where it next reaches one, and the robot obtains its position in the map from the RFID tag information and the IMU/odometer information it reads, completing local navigation in the lane-line-free region.
Preferably, the obstacle avoidance module comprises several ultrasonic sensors that detect surrounding obstacle information in real time.
With the above technical scheme, the invention has the following beneficial effects:
(1) lane-line information and RFID information are used to correct odometer and IMU positioning errors, making robot positioning more accurate and travel more reliable;
(2) combining the lane line and the RFID data with the odometer/IMU data accurately computes the robot's position in a known map, solving inaccurate positioning among building clusters;
(3) positioning needs only the fused RFID and odometer/IMU data, and line tracking needs only lane-line recognition, so the computation is far smaller than in traditional positioning schemes;
(4) the ultrasonic obstacle avoidance module prevents collisions while the robot travels;
(5) the robot uploads its travel status, road information and executed actions to the cloud system in real time; the cloud system can monitor the robot in real time, and the robot can learn from the data stored in the cloud, so its driving becomes ever smarter;
(6) the autonomous navigation robot based on vision, RFID, odometer/IMU and ultrasonic sensors is suitable for patrolling and express delivery in residential communities.
Brief description of the drawings
Fig. 1 is a schematic block diagram of the present invention;
Fig. 2 is the control flow chart of the present invention;
Fig. 3 is a navigation path planning diagram in one embodiment of the invention.
Detailed description of the embodiments
The invention is described in detail below with reference to the drawings and specific embodiments.
Referring to Fig. 1, an autonomous navigation robot based on vision, RFID, an IMU and an odometer, for use on roads on which RFID tags 103 are affixed, comprises:
an IMU 101 and an odometer 102, wherein the IMU 101 measures the robot's three-axis attitude and acceleration;
a data acquisition module 104, for reading the data of the IMU 101, the odometer 102 and the external RFID tags 103;
a positioning module 105, for obtaining the RFID tag 103 data from the data acquisition module 104 and deriving coarse position information;
a data fusion module 106, which fuses the RFID tag 103, odometer 102 and IMU 101 data to output accurate robot position information;
a camera 107, for obtaining current road information;
a navigation module 108, which has a line-tracking mode 108a and a local inertial navigation mode 108b and switches between the two modes according to the road information fed back by the camera;
an obstacle avoidance module 109, for detecting surrounding obstacle information and feeding it back to the control module;
a communication system 110, for communicating with the external cloud scheduling system 111 and receiving its instructions;
a control module 112, which is electrically connected to the data acquisition module 104, positioning module 105, data fusion module 106, camera 107, navigation module 108, obstacle avoidance module 109 and communication system 110, and coordinates the operation of the modules.
The data fusion module 106 fuses the RFID tag 103, odometer 102 and IMU 101 data with a Kalman filter to output accurate robot position information.
The line-tracking mode 108a is used where the road information shows a lane line: the camera 107 identifies the lane line, and the robot's offset is corrected with the visual feedback so that the robot stays within the lane line while travelling.
The local inertial navigation mode 108b is used where the road information shows no lane line: in lane-line-free regions, the external cloud scheduling system 111 plans the robot's path from the point where it last left a lane line to the point where it next reaches one, and the robot obtains its position in the map from the RFID tag 103 information and the IMU 101/odometer 102 information it reads, completing local navigation in the lane-line-free region.
The obstacle avoidance module 109 comprises several ultrasonic sensors that detect surrounding obstacle information in real time.
Referring to Figs. 1 to 3, the operating principle of the invention is as follows:
The robot first reads the RFID tag 103 information and the odometer 102 and IMU 101 information through the data acquisition module 104. The data fusion module 106 then combines the RFID tag 103 information with the odometer 102/IMU 101 information to obtain the robot's current position. After the robot receives a delivery or patrol task, the cloud scheduling system 111 plans a path from the current position to the target position according to the task. If the robot is currently on a lane line, the line-tracking mode 108a is invoked; if its travel region has no lane line, the local inertial navigation mode 108b is invoked. The robot travels under these two modes, and if it encounters an obstacle, the obstacle avoidance mode is enabled, until the robot reaches the target point. Specifically, the invention realises the above functions with the following steps:
S201, data acquisition
The robot's data acquisition module 104 reads the RFID tag 103 information and the odometer 102/IMU 101 data, and transfers these data to the data fusion module 106.
The data fusion module 106 fuses the RFID tag 103 and odometer 102/IMU 101 data with a Kalman filter to obtain accurate robot position information.
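The Kalman fusion in S201 can be sketched as a scalar filter in which the odometer/IMU displacement drives the prediction step and an RFID tag fix drives the correction step. The class name and the noise values below are illustrative assumptions; the patent does not give numeric parameters or the filter dimension.

```python
class PositionKalman1D:
    """Minimal 1-D Kalman filter sketch: the odometer/IMU supplies the
    predicted displacement, an RFID tag fix (the tag's known map position)
    supplies the coarse measurement. All noise values are assumptions."""

    def __init__(self, x0, p0=1.0, q=0.01, r=0.25):
        self.x = x0   # position estimate along the road (m)
        self.p = p0   # estimate variance
        self.q = q    # process noise added per step (odometry drift)
        self.r = r    # variance of an RFID tag fix (coarse measurement)

    def predict(self, odom_dx):
        self.x += odom_dx   # dead-reckoned displacement
        self.p += self.q    # uncertainty grows while dead reckoning
        return self.x

    def update(self, rfid_pos):
        k = self.p / (self.p + self.r)      # Kalman gain
        self.x += k * (rfid_pos - self.x)   # pull estimate toward the tag fix
        self.p *= (1.0 - k)                 # uncertainty shrinks after the fix
        return self.x
```

The correction weight adapts automatically: the longer the robot dead-reckons between tags, the larger `p` grows and the more strongly the next RFID fix corrects the estimate.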
S202, positioning
The approximate position of the robot in the map is obtained from the RFID tag 103 information, and its precise position in the map is then obtained from the odometer 102/IMU 101 data.
In regions with a lane line, the robot travels along the lane line; the odometer 102 and IMU 101 data give the exact distance travelled along the lane line, from which the robot's exact position in the map is determined. In regions without a lane line, the robot performs local inertial navigation with the odometer 102 and IMU 101, combining the RFID tag 103 and odometer 102/IMU 101 data to obtain its exact position in the map.
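The two-stage positioning of S202 (a coarse fix from the last RFID tag seen, refined by odometer distance and IMU heading) might be sketched as below. The function names, the planar pose model and the tag map are assumptions for illustration only.

```python
import math

def dead_reckon(pose, odom_dist, imu_heading):
    """One dead-reckoning step: the odometer supplies the distance travelled
    since the last step, the IMU supplies the absolute heading (rad)."""
    x, y = pose
    return (x + odom_dist * math.cos(imu_heading),
            y + odom_dist * math.sin(imu_heading))

def localize(tag_map, tag_id, increments):
    """Coarse fix from the last RFID tag seen, refined by dead reckoning.
    `tag_map` maps tag ids to known (x, y) map positions; `increments` is a
    list of (odom_dist, imu_heading) pairs recorded since the tag was read."""
    pose = tag_map[tag_id]            # coarse position: the tag's map location
    for dist, heading in increments:  # refine with odometer/IMU increments
        pose = dead_reckon(pose, dist, heading)
    return pose
```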
S203, cloud scheduling
The smart-community cloud fuses community living big data and performs unified scheduling of the robots. On receiving a robot delivery or patrol task, the cloud scheduling system 111 comprehensively analyses the positions of all robots, selects the robot closest to the task target point, and performs path planning for the selected robot, generating an optimal path from the robot's current position to the target point.
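The nearest-robot selection rule of S203 can be sketched in a few lines. The Euclidean distance metric and the data shapes are assumptions, since the patent does not specify how "closest" is computed.

```python
import math

def dispatch(robots, target):
    """Pick the robot closest to the task target point.
    `robots` maps a robot id to its (x, y) map position; `target` is the
    (x, y) task point. Straight-line distance is an assumed metric."""
    return min(robots, key=lambda rid: math.dist(robots[rid], target))
```

In practice the cloud would rank by planned path length rather than straight-line distance, but the selection structure is the same.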
S204, is the robot on a lane line? (navigation)
Whether the robot's current travel region has a lane line is judged from the camera 107 and the robot's position in the map. If there is a lane line, the robot selects the line-tracking mode 108a; if there is no lane line, the robot selects the local inertial navigation mode 108b.
In step S204a, local inertial navigation mode 108b: in regions without a lane line, the cloud scheduling system plans the robot's path from the point where it last left a lane line to the point where it next reaches one, and the robot obtains its position in the map from the RFID information and the IMU/odometer information it reads, completing local navigation in the lane-line-free region.
In step S204b, line-tracking mode 108a: the monocular camera 107 identifies the lane line, and the robot's offset is corrected with the visual feedback so that the robot stays within the lane line while travelling.
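The visual offset correction of the line-tracking mode can be sketched as a simple proportional steering law on the lane-line detection output. The two-term error model and the gain values are illustrative assumptions; the patent does not specify a control law.

```python
def steer_correction(lateral_offset_m, heading_error_rad,
                     k_offset=1.5, k_heading=0.8):
    """Proportional line-tracking sketch: the camera's lane-line detection
    yields the robot's lateral offset from the line centre (m) and its
    heading error (rad); negative feedback steers it back onto the line.
    Gains are assumed values for illustration."""
    return -(k_offset * lateral_offset_m + k_heading * heading_error_rad)
```

A robot drifting right of the line (positive offset) receives a negative (leftward) steering command, and vice versa, so the offset is driven back toward zero.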
S205, obstacle avoidance
While the robot travels, the ultrasonic sensors (ten in total: three each at the front and rear, two on each side) detect obstacles around the robot; the obstacle information is added to the map in real time and an obstacle-avoiding path is planned, until the robot leaves the obstacle region.
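A minimal sketch of the ultrasonic obstacle check in S205, assuming named sensor sectors and a 0.5 m trigger threshold (neither the threshold nor the sector names are specified in the patent):

```python
def obstacle_sectors(readings_m, threshold_m=0.5):
    """`readings_m` maps a sensor name (e.g. 'front_center') to its range
    reading in metres; any reading under the threshold marks that sector
    as blocked. Threshold and names are assumed for illustration."""
    return {name for name, dist in readings_m.items() if dist < threshold_m}

def avoid(readings_m):
    """Toy reaction: replan if any front sector is blocked, else continue.
    The real system would instead insert the obstacle into the map and run
    the path planner, as the step above describes."""
    blocked = obstacle_sectors(readings_m)
    if any(name.startswith("front") for name in blocked):
        return "replan"
    return "continue"
```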
S206, has the robot reached the end point?
If the end point is reached, the robot's navigation run ends; if not, return to step S201 and repeat the above steps.
The foregoing is only a preferred embodiment of the present invention, but the scope of protection of the present invention is not limited thereto. Any equivalent substitution or change made by a person skilled in the art, within the technical scope disclosed by the present invention and according to the technical scheme of the present invention and its inventive concept, shall be covered by the scope of protection of the present invention.
Claims (5)
1. An autonomous navigation robot based on vision, RFID, an IMU and an odometer, for use on roads on which RFID tags are affixed, characterised by comprising:
an IMU and an odometer, wherein the IMU measures the robot's three-axis attitude and acceleration;
a data acquisition module, for reading the data of the IMU, the odometer and the external RFID tags;
a positioning module, for obtaining the RFID tag data from the data acquisition module and deriving coarse position information;
a data fusion module, which fuses the RFID tag, odometer and IMU data to output accurate robot position information;
a camera, for obtaining current road information;
a navigation module, which has a line-tracking mode and a local inertial navigation mode and switches between the two modes according to the road information fed back by the camera;
an obstacle avoidance module, for detecting surrounding obstacle information and feeding it back to the control module;
a communication system, for communicating with the external cloud scheduling system and receiving its instructions;
a control module, which is electrically connected to the data acquisition module, positioning module, data fusion module, camera, navigation module, obstacle avoidance module and communication system, and coordinates the operation of the modules.
2. The autonomous navigation robot according to claim 1, characterised in that the data fusion module fuses the RFID, odometer and IMU data with a Kalman filter to output accurate robot position information.
3. The autonomous navigation robot according to claim 1 or 2, characterised in that the line-tracking mode is used where the road information shows a lane line: the camera identifies the lane line, and the robot's offset is corrected with the visual feedback so that the robot stays within the lane line while travelling.
4. The autonomous navigation robot according to claim 1 or 2, characterised in that the local inertial navigation mode is used where the road information shows no lane line: in lane-line-free regions, the external cloud scheduling system plans the robot's path from the point where it last left a lane line to the point where it next reaches one, and the robot obtains its position in the map from the RFID tag information and the IMU/odometer information it reads, completing local navigation in the lane-line-free region.
5. The autonomous navigation robot according to claim 1 or 2, characterised in that the obstacle avoidance module comprises several ultrasonic sensors that detect surrounding obstacle information in real time.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710772582.9A CN107562054A (en) | 2017-08-31 | 2017-08-31 | Autonomous navigation robot based on vision, RFID, IMU and odometer |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107562054A true CN107562054A (en) | 2018-01-09 |
Family
ID=60977719
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108364159A (en) * | 2018-04-09 | 2018-08-03 | 郑州檀乐科技有限公司 | A kind of unmanned plane logistics face label device and method |
CN108388266A (en) * | 2018-04-09 | 2018-08-10 | 郑州檀乐科技有限公司 | A kind of UAV system for logistics delivery |
CN108520377A (en) * | 2018-04-09 | 2018-09-11 | 郑州琼佩电子技术有限公司 | A kind of unmanned plane logistics face label method |
CN108777071A (en) * | 2018-07-04 | 2018-11-09 | 深圳智达机械技术有限公司 | A kind of highway patrol robot |
CN109443350A (en) * | 2018-12-27 | 2019-03-08 | 西安中科光电精密工程有限公司 | Bluetooth/photoelectricity/INS combined navigation device neural network based and method |
CN110647089A (en) * | 2019-10-28 | 2020-01-03 | 天津中德应用技术大学 | Intelligent warehouse logistics robot control system and control method |
CN111753938A (en) * | 2020-06-23 | 2020-10-09 | 联想(北京)有限公司 | Position acquisition method and device and electronic equipment |
CN112462762A (en) * | 2020-11-16 | 2021-03-09 | 浙江大学 | Robot outdoor autonomous moving system and method based on roadside two-dimensional code unit |
CN112611381A (en) * | 2020-10-29 | 2021-04-06 | 武汉哈船导航技术有限公司 | Artificial intelligence inertial navigation system |
CN113077014A (en) * | 2021-04-29 | 2021-07-06 | 上海德衡数据科技有限公司 | Cloud edge terminal information fusion method, system, device and medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104537829A (en) * | 2014-12-09 | 2015-04-22 | 北京工业大学 | Intelligent car and positioning method used for intelligent transportation physical simulation platform |
CN105628026A (en) * | 2016-03-04 | 2016-06-01 | 深圳大学 | Positioning and posture determining method and system of mobile object |
CN105759820A (en) * | 2016-04-08 | 2016-07-13 | 济宁中科先进技术研究院有限公司 | Road autonomous cleaning control system and method based on laser and vision |
CN205940567U (en) * | 2016-08-11 | 2017-02-08 | 北京华航航宇科技有限公司 | On -vehicle combination navigational positioning system |
CN106696961A (en) * | 2016-12-09 | 2017-05-24 | 重庆长安汽车股份有限公司 | Control system and method for automatically driving onto and off ramp of freeway |
CN106708040A (en) * | 2016-12-09 | 2017-05-24 | 重庆长安汽车股份有限公司 | Sensor module of automatic driving system, automatic driving system and automatic driving method |
CN106740841A (en) * | 2017-02-14 | 2017-05-31 | 驭势科技(北京)有限公司 | Method for detecting lane lines, device and mobile unit based on dynamic control |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20180109 |