CN105352508A - Method and device of robot positioning and navigation - Google Patents
Method and device of robot positioning and navigation
- Publication number
- CN105352508A CN105352508A CN201510690749.8A CN201510690749A CN105352508A CN 105352508 A CN105352508 A CN 105352508A CN 201510690749 A CN201510690749 A CN 201510690749A CN 105352508 A CN105352508 A CN 105352508A
- Authority
- CN
- China
- Prior art keywords
- robot
- task
- route
- image
- collisionless
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
Abstract
The invention relates to a method and a device for robot positioning and navigation. The method comprises the following steps: recording the robot's indoor movement positions and the corresponding images with a depth camera; analyzing the movement positions and images returned by the depth camera to pair image feature points with physical feature points and thereby determine the robot's position, orientation and environment; building a complete indoor map navigation system from the feature-point pairings and the determined position, orientation and environment; and, upon receiving a task-execution instruction from a user, planning an optimal collision-free route along which the robot completes the task, or sending the map and positioning information to a robot without sensing capability so that it executes the task along the optimal collision-free route. The method improves the precision of robot positioning and navigation, greatly reduces hardware construction cost, and allows the robot to control other robots or smart-home devices.
Description
Technical field
The present invention relates to a positioning and navigation method and system, and in particular to a robot positioning and navigation method and system.
Background art
Domestic robots currently on the market, such as sweeping robots, rely mainly on wireless positioning technologies for indoor positioning and navigation, for example WiFi (Wireless Fidelity), infrared and Bluetooth. Each of these wireless positioning technologies has shortcomings, as follows.
WiFi: the wireless local area network (WLAN) is an information-access platform that can carry out complex, large-scale positioning, monitoring and tracking tasks across a wide range of applications, and the self-positioning of network nodes is the basis and prerequisite for most of those applications. WiFi positioning accuracy is roughly in the range of 1 to 20 meters, generally better than cellular-network triangulation, and it is currently applied to small-scale indoor positioning at relatively low cost. However, whether indoors or outdoors, a WiFi transceiver covers only a radius of about 90 meters, is easily disturbed by other signals, which degrades its accuracy, and the locator's energy consumption is also high.
Infrared (IR) indoor positioning works by transmitting modulated infrared rays that are received by optical sensors installed indoors. Although infrared offers relatively high indoor positioning accuracy, light cannot pass through obstacles, so infrared propagates only along the line of sight. These two major drawbacks, line-of-sight propagation and short transmission distance, make its indoor positioning performance poor. In addition, infrared is easily disturbed by fluorescent lamps and ambient light, which limits accurate positioning, so it is suitable only for short-range propagation.
Bluetooth positions by measuring signal strength. It is a short-range, low-power radio transmission technology: by installing suitable Bluetooth LAN access points indoors, configuring the network as a multi-user basic network, and ensuring that the Bluetooth access point always remains the master device of the piconet, the user's position information can be obtained. Its drawbacks are that Bluetooth devices and equipment are relatively expensive, and in complex spatial environments the stability of a Bluetooth system is somewhat poor and strongly affected by noise interference.
In short, existing robot positioning and navigation technologies suffer either from low positioning accuracy or from high hardware construction cost.
Summary of the invention
In view of this, embodiments of the present invention provide a robot positioning and navigation method and device to improve positioning and navigation precision.
An embodiment of the invention provides a robot positioning and navigation method comprising the following steps:
recording, with a depth camera, the robot's indoor movement positions and the corresponding images;
analyzing the robot movement positions and images returned by the depth camera to pair image feature points with physical feature points and thereby determine the robot's position, orientation and environment;
obtaining a complete indoor map navigation system from the pairings of image feature points and physical feature points and the determined robot position, orientation and environment;
receiving a task-execution instruction sent by a user and planning an optimal collision-free route;
moving the robot along the optimal collision-free route to execute the task; or
sending the obtained indoor map navigation system and positioning information to a robot without sensing capability, so that the robot without sensing capability moves along the planned route to execute the task.
In addition, an embodiment of the present invention further provides a robot positioning and navigation device comprising:
a recording module, for recording, with a depth camera, the robot's indoor movement positions and the corresponding images;
an analysis module, for analyzing the robot movement positions and images returned by the depth camera to pair image feature points with physical feature points and thereby determine the robot's position, orientation and environment;
a navigation module, for obtaining a complete indoor map navigation system from the feature-point pairings and the determined robot position, orientation and environment;
an execution module, for planning an optimal collision-free route upon receiving a task-execution instruction sent by a user, and either moving the robot along that route to execute the task or issuing the above map navigation system and the optimal collision-free route to a robot without sensing capability so that it executes the task along the route.
The robot positioning and navigation method and system of the present invention record the robot's indoor movement positions and the corresponding images with a depth camera, analyze the returned positions and images to pair image feature points with physical feature points and determine the robot's position, orientation and environment, obtain a complete indoor map navigation system from those pairings and the determined position, orientation and environment, and finally, upon receiving a task-execution instruction from a user, plan an optimal collision-free route along which the robot, or another robot without sensing capability, moves to complete the task. This improves the precision of robot positioning and navigation, greatly reduces hardware construction cost, and lets this robot control other robots or smart-home devices.
Brief description of the drawings
The above and other objects, features and advantages of the present invention will become clearer from the more specific description of its preferred embodiments shown in the accompanying drawings.
Fig. 1 is a schematic diagram of the robot provided by an embodiment of the present invention.
Fig. 2 is a flowchart of the positioning and navigation method provided by Embodiment 1 of the present invention.
Fig. 3 is a detailed sub-flowchart of a step in Fig. 2.
Fig. 4 is a schematic diagram of the positioning and navigation device provided by Embodiment 3 of the present invention.
Detailed description of the embodiments
To make the above objects, features and advantages of the present invention more apparent, specific embodiments of the invention are described in detail below with reference to the accompanying drawings. Many specific details are set forth in the following description to facilitate a full understanding of the invention. However, the invention can be implemented in many ways other than those described here, and those skilled in the art can make similar improvements without departing from its spirit; the invention is therefore not limited by the specific implementations disclosed below.
Fig. 1 is a structural schematic diagram of the robot of an embodiment of the present invention.
The robot is an integrated system combining environment perception, dynamic decision-making and planning, and behavior control and execution. It draws together research achievements from sensor technology, information processing, electronic engineering, computer engineering, automatic control engineering and artificial intelligence, represents the highest achievements of mechatronics, and is one of the most active fields of current scientific and technological development. As robot performance has steadily improved, the range of applications of mobile robots has expanded greatly: they are widely used not only in industry, agriculture, medical care and the service industries, but also in harmful and dangerous situations in urban safety, national defense and space exploration. The robot of the present invention forms its chassis-movement mechanical module from chassis driving wheels 5, chassis drive 6 and chassis universal wheel 4: a controller drives the chassis driving wheels 5 and chassis drive 6 to rotate, while the chassis universal wheel 4 supports the whole chassis and assists its movement. The head 1 is driven by a head-rotation mechanism that gives it two rotational degrees of freedom; it carries a depth camera 2 on top and a dedicated control system that positions and navigates the robot to complete the task instructions sent by the user. To support various tasks, the mobile robot can also carry other sensors, such as a temperature sensor.
The mobile robot of the present invention can be used for indoor positioning and navigation: it records its indoor movement positions and the images captured along the way, analyzes those positions and images with computer-vision algorithms to build an indoor map navigation system, and plans an optimal collision-free route when it receives a task-execution instruction from the user.
Fig. 2 is a flowchart of the robot positioning and navigation method of an embodiment of the present invention.
This embodiment applies to the case where a robot builds an indoor map navigation system from recorded indoor movement positions and images and plans an optimal collision-free route when it receives a task-execution instruction from the user; the method can be performed by the robot. The positioning and navigation method specifically comprises the following steps.
Step S1: the mobile robot records, with a depth camera, its indoor movement positions and the corresponding images.
In practice, a user may want the mobile robot to complete various tasks, each requiring route planning, which in turn requires knowing the application environment before the task is performed. When the mobile robot is placed in a new application environment, it moves around the new indoor environment while the depth camera carried on its head rotates with the head to capture every angle without blind spots. The robot's indoor movement positions and the images captured along the way are recorded to obtain a rough preliminary route. The depth camera transmits the movement positions and captured images back to the robot for subsequent analysis. The images include both image information and range information.
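The recording of step S1 can be pictured as a simple pose-and-frame log. The sketch below is purely illustrative (the class, field names and structure are not specified in the patent); it shows how each depth-camera frame might be stored together with the robot's pose, so that the rough preliminary route falls out as the ordered list of recorded positions:

```python
# Hypothetical illustration of step S1: logging the robot's pose alongside
# a reference to the depth-camera frame captured at that pose.
from dataclasses import dataclass, field


@dataclass
class TrajectoryLog:
    entries: list = field(default_factory=list)

    def record(self, x, y, heading_deg, frame_id):
        """Append one (pose, depth-frame reference) sample."""
        self.entries.append({"x": x, "y": y,
                             "heading": heading_deg, "frame": frame_id})

    def rough_route(self):
        """The coarse preliminary route: the ordered recorded positions."""
        return [(e["x"], e["y"]) for e in self.entries]
```

Later analysis (steps S2-S3) would refine this coarse log into the full map.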
Step S2: analyze the robot movement positions and recorded images returned by the depth camera to pair image feature points with physical feature points and thereby determine the robot's position, orientation and environment.
Specifically, after the robot obtains the movement positions and recorded images returned by the depth camera carried on its head, it collects information such as the specific address and direction of each movement position, as well as information such as marks, signs and landmarks in the images, and analyzes the collected information with a computer-vision measurement algorithm to complete the pairing of image feature points and physical feature points.
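The patent does not detail how the feature-point pairing is computed; one common realisation is nearest-neighbour descriptor matching with a ratio test, sketched below under the assumption that feature descriptors are plain numeric vectors (`match_features` and its parameters are hypothetical names, not the patent's):

```python
# Illustrative sketch only (not the patent's actual algorithm): pairing image
# feature points with physical feature points by nearest-neighbour descriptor
# matching plus a ratio test to reject ambiguous matches.
import math


def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def match_features(image_desc, physical_desc, ratio=0.8):
    """Return (image_idx, physical_idx) pairs passing the ratio test."""
    pairs = []
    for i, d in enumerate(image_desc):
        # rank candidate physical features by descriptor distance
        ranked = sorted(range(len(physical_desc)),
                        key=lambda j: euclidean(d, physical_desc[j]))
        best, second = ranked[0], ranked[1]
        # accept only if the best match is clearly better than the runner-up
        if euclidean(d, physical_desc[best]) < ratio * euclidean(d, physical_desc[second]):
            pairs.append((i, best))
    return pairs
```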
The information obtained by vision sensors falls into two classes. The first class is the grayscale image obtained by an ordinary camera, which cannot provide direct three-dimensional information. The second class is the range image obtained by 3D vision sensing. In this embodiment, the robot adopts the second class, obtaining 3D vision sensing through the depth camera. In an image obtained this way, the value of each pixel is not brightness but distance. Such images are independent of illumination, the three-dimensional contour of an object matches its surface shape, and a computer can more easily recognize objects from such images using the three-dimensional information.
The three-dimensional information is acquired as follows:
As the robot moves indoors and changes position, it photographs the measured object from different angles, and the precise distance between the two viewpoints can be obtained from the movement along the guide rail. The detected feature points are matched, the disparity is computed after matching, and the disparity is used to obtain the spatial position of the corresponding object point in the camera coordinate system. Sub-pixel techniques are used when detecting and matching edge points, so measurement precision is no longer limited to the precision of one pixel.
In this way, the robot measures the three-dimensional coordinates of each recognized object and in turn determines the specific location of each of its transfer points.
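The disparity-to-position computation described above can be written down directly. The sketch below assumes a standard pinhole model with illustrative intrinsics (the focal length `f`, baseline `baseline` and principal point `cx`, `cy` are made-up values, not taken from the patent):

```python
# Hedged sketch of classic stereo triangulation: recover a 3D point in the
# camera frame from a matched feature's disparity. Intrinsics are illustrative.
def point_from_disparity(u, v, disparity, f=525.0, baseline=0.075,
                         cx=319.5, cy=239.5):
    """Depth Z = f * B / d; then back-project pixel (u, v) to (X, Y, Z)."""
    if disparity <= 0:
        raise ValueError("disparity must be positive")
    # a sub-pixel disparity estimate directly yields sub-pixel depth precision
    z = f * baseline / disparity
    x = (u - cx) * z / f
    y = (v - cy) * z / f
    return (x, y, z)
```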
Step S3: build a complete indoor map navigation system.
After the computer-vision measurement algorithm has paired the image feature points with the physical feature points, the specific location, orientation and surrounding environment of each of the robot's transfer points are determined, and the complete indoor map navigation system is constructed. The indoor map navigation system can also be calibrated and amended repeatedly, because the indoor environment may change substantially under special circumstances; the user can then adjust the map as needed, or steps S1-S3 can be repeated to obtain new robot movement positions and images and plan a more user-friendly and more reasonable indoor map navigation system.
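One minimal way to picture the map construction and re-calibration of step S3 is a 2D occupancy grid that a fresh survey simply overwrites; the dictionary-based grid below is a hypothetical illustration, not the patent's map representation:

```python
# Hypothetical sketch: rasterise measured (x, y) world points into a sparse
# 2D occupancy grid. Re-running the survey calls update_grid again, so stale
# cells are overwritten by the newest measurements.
def update_grid(grid, points, cell=0.5):
    """Mark each (x, y) world point as occupied ('#') in a dict-based grid."""
    for x, y in points:
        grid[(int(x // cell), int(y // cell))] = "#"
    return grid
```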
Step S4: when a task-execution instruction is received, plan an optimal collision-free route along which the mobile robot moves to execute the task.
After the indoor map navigation system is built, the user can send task-execution instructions to the robot as needed. Upon receiving the user's instruction, the robot plans an optimal collision-free route according to the demand and finally executes the task along the planned route. The robot moves by driving the chassis driving-wheel module, coordinating with the head and other parts to complete the task.
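The patent does not pin the "optimal collision-free route" of step S4 to a particular planner; a breadth-first search over free grid cells, as sketched below, is one standard way to obtain a shortest obstacle-avoiding path (the string-based grid encoding is an assumption for illustration):

```python
# Sketch of collision-free route planning on a grid map: BFS over
# 4-connected free cells returns a shortest obstacle-avoiding path.
from collections import deque


def plan_route(grid, start, goal):
    """grid: list of strings, '#' = obstacle. Returns a list of (row, col)
    cells from start to goal, or None if no collision-free route exists."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:
            path = []
            while cur is not None:       # walk back through predecessors
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] != '#' and (nr, nc) not in prev:
                prev[(nr, nc)] = cur
                queue.append((nr, nc))
    return None
```

BFS is chosen here for brevity; a weighted map would call for Dijkstra or A* instead.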
Step S5: send the map navigation system and positioning information to a robot without sensing capability, so that it moves along the optimal collision-free route to execute the task.
Specifically, after the indoor map navigation system is built, the user can send task-execution instructions to the robot as needed. Upon receiving the user's instruction, the robot plans an optimal collision-free route according to the demand, and can then appropriately dispatch a robot without sensing capability to execute the task along the planned collision-free route.
The robot positioning and navigation method of this embodiment records the robot's indoor movement positions and corresponding images with a depth camera, analyzes those positions and images, finally determines the robot's position, orientation and environment, and builds the indoor map navigation system; the robot can then plan a collision-free route whenever it receives an execution instruction. The depth camera directly captures indoor shots and returns the images, and a computer-vision measurement algorithm provides precise positioning, which greatly improves the robot's positioning and navigation capability.
Fig. 3 is a sub-flowchart of step S5 in Fig. 2.
When a task-execution instruction is received, planning an optimal collision-free route along which the mobile robot moves to execute the task preferably comprises:
S40: the user sends a task-execution instruction.
When a certain instruction needs to be performed, the user sends the robot an instruction for the task to be completed. The user can send a voice instruction, such as "ten o'clock cleaning", to the robot, whose microphone receives the voice message. In addition, the user can train the robot to associate some simple gesture movements with certain actions; after training, when the user makes a gesture facing the robot, it quickly recognizes the task-execution instruction being sent. The user can also input a text task-execution instruction directly to the robot through an input interface. Besides these ways, the user can send instructions to the robot in other manners.
S41: the mobile robot encodes the instruction.
After the robot receives the task-execution instruction sent by the user, for example a voice instruction, a gesture or a text message, it recognizes the instruction and encodes it so that the robot can execute the task according to the encoded instruction.
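Step S41's encoding could, for example, map a recognised command to a structured task record; the command vocabulary and field names below are invented for illustration and are not part of the patent:

```python
# Hypothetical sketch of step S41: encode a recognised user command (voice,
# gesture or text, already transcribed to a string) into a task record.
def encode_instruction(text):
    """Map a transcribed command like "ten o'clock cleaning" to a task code."""
    tasks = {"cleaning": "TASK_CLEAN", "patrol": "TASK_PATROL"}  # toy vocabulary
    words = text.lower().split()
    task = next((tasks[w] for w in words if w in tasks), "TASK_UNKNOWN")
    # crude schedule extraction: keep any "<hour> o'clock" phrase verbatim
    when = None
    for i, w in enumerate(words):
        if w == "o'clock" and i > 0:
            when = f"{words[i - 1]} o'clock"
    return {"task": task, "when": when}
```

A real system would replace the toy vocabulary with the outputs of its speech or gesture recogniser.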
S42: plan an optimal collision-free route according to the encoding and make the mobile robot move along the route to complete the task.
The robot plans an optimal collision-free route from the encoding and the indoor map navigation system, and moves along this optimal collision-free route to complete the task.
Fig. 4 is a schematic diagram of a robot positioning and navigation device provided by the present invention, applied in mobile robots of various uses and functions. The robot positioning and navigation device of this embodiment comprises: a recording module 300, an analysis module 301, a navigation module 302 and an execution module 303.
The recording module 300 records, with a depth camera, the robot's indoor movement positions and the corresponding images. When the mobile robot is placed in a new application environment, it moves around the new indoor environment while the depth camera carried on its head rotates with the head to capture every angle without blind spots. The robot's indoor movement positions and the images captured along the way are recorded to obtain a rough preliminary route. The depth camera transmits the movement positions and captured images back to the robot for subsequent analysis.
The analysis module 301 analyzes the robot movement positions and recorded images returned by the depth camera to pair image feature points with physical feature points and thereby determine the robot's position, orientation and environment.
Specifically, after the robot obtains the movement positions and recorded images returned by the depth camera carried on its head, it collects information such as the specific address and direction of each movement position, as well as information such as marks, signs and landmarks in the images, and analyzes the collected information with a computer-vision measurement algorithm to complete the pairing of image feature points and physical feature points and obtain a complete indoor map navigation system.
The navigation module 302, when a task-execution instruction sent by the user is received, plans an optimal collision-free route from the task-execution instruction and the complete indoor map navigation system to enable the robot's safe indoor motion.
The execution module 303 instructs the robot, after the navigation module 302 has planned an optimal collision-free route, to drive the chassis driving-wheel module 5 and chassis drive module 6 to rotate, while the chassis universal wheel 4 supports the whole chassis and assists its movement. The head 1 is driven by a head-rotation mechanism that gives it two rotational degrees of freedom; it carries a depth camera 2 on top and a dedicated control system that positions and navigates the robot and moves it along the planned optimal collision-free route.
In addition, this robot can also distribute work to a robot without sensing capability so that the latter executes the task along the planned optimal collision-free route.
The robot positioning and navigation method and device of the present invention can perceive the working environment and objects in time through the depth camera and adjust their responses according to the situation, improving the robot's adaptability, intelligence, and navigation and positioning precision. At the same time, because only a depth camera is needed to work with the robot, without much other hardware, hardware construction cost is greatly reduced, and this robot can also control other robots or smart-home devices.
The above robot positioning and navigation device can perform the robot positioning and navigation method provided by Embodiment 1 of the present invention, and possesses the functional modules and beneficial effects corresponding to that method.
The above are only preferred embodiments of the present invention and the technical principles applied; although the description is relatively specific and detailed, it must not be construed as limiting the scope of the claims. It should be pointed out that those of ordinary skill in the art can make various modifications and improvements without departing from the inventive concept, and these all fall within the protection scope of the invention. Therefore, the protection scope of this patent shall be determined by the appended claims.
Claims (6)
1. A robot positioning and navigation method, characterized in that the method comprises the following steps:
recording, with a depth camera, the robot's indoor movement positions and the corresponding images;
analyzing the robot movement positions and images returned by the depth camera to pair image feature points with physical feature points and thereby determine the robot's position, orientation and environment;
obtaining a complete indoor map navigation system from the pairings of image feature points and physical feature points and the determined robot position, orientation and environment;
receiving a task-execution instruction sent by a user and planning an optimal collision-free route;
moving the robot along the optimal collision-free route to execute the task; or
sending the obtained indoor map navigation system and positioning information to a robot without sensing capability, so that the robot without sensing capability moves along the planned route to execute the task.
2. The robot positioning and navigation method of claim 1, characterized in that receiving a task-execution instruction sent by a user and planning an optimal collision-free route specifically comprises:
receiving the task-execution instruction sent by the user;
encoding the task-execution instruction; and
planning the optimal collision-free route according to the encoding.
3. The robot positioning and navigation method of claim 1, characterized in that the task-execution instruction sent by the user is sent as any one of: a voice instruction, a gesture or a text message.
4. A robot positioning and navigation device, characterized in that the device comprises:
a recording module, for recording, with a depth camera, the robot's indoor movement positions and the corresponding images;
an analysis module, for analyzing the robot movement positions and images returned by the depth camera to pair image feature points with physical feature points and thereby determine the robot's position, orientation and environment;
a navigation module, for obtaining a complete indoor map navigation system from the feature-point pairings and the determined robot position, orientation and environment;
an execution module, for planning an optimal collision-free route upon receiving a task-execution instruction sent by a user, and either moving the robot along that route to execute the task or issuing the above map navigation system and the optimal collision-free route to a robot without sensing capability so that it executes the task along the route.
5. The robot positioning and navigation device of claim 4, characterized in that the execution module is specifically configured to:
receive the task-execution instruction sent by the user;
encode the task-execution instruction; and
plan an optimal collision-free route according to the encoding and make the mobile robot move along the route to complete the task;
or plan an optimal collision-free route according to the encoding and, depending on the specific task requirements, control a robot without sensing capability to move along the route to complete the task.
6. The robot positioning and navigation device of claim 4, characterized in that the task-execution instruction sent by the user is sent as any one of: a voice instruction, a gesture or a text message.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510690749.8A CN105352508A (en) | 2015-10-22 | 2015-10-22 | Method and device of robot positioning and navigation |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510690749.8A CN105352508A (en) | 2015-10-22 | 2015-10-22 | Method and device of robot positioning and navigation |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105352508A true CN105352508A (en) | 2016-02-24 |
Family
ID=55328518
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510690749.8A Pending CN105352508A (en) | 2015-10-22 | 2015-10-22 | Method and device of robot positioning and navigation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105352508A (en) |
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105654648A (en) * | 2016-03-28 | 2016-06-08 | 浙江吉利控股集团有限公司 | Anti-theft monitoring device and system and method |
CN106020233A (en) * | 2016-07-08 | 2016-10-12 | 聂浩然 | Unmanned aerial vehicle (UAV) adopted plant protection system, unmanned aerial vehicle (UAV) for plant protection and its control method |
CN106056833A (en) * | 2016-06-23 | 2016-10-26 | 乐视控股(北京)有限公司 | Safety monitoring method, device, system and monitoring system |
CN106125925A (en) * | 2016-06-20 | 2016-11-16 | 华南理工大学 | Method is arrested based on gesture and voice-operated intelligence |
CN106441238A (en) * | 2016-06-01 | 2017-02-22 | 昆山塔米机器人有限公司 | Positioning device and positioning navigation algorithm of robot based on infrared visual technology |
CN106643692A (en) * | 2016-09-28 | 2017-05-10 | 深圳乐行天下科技有限公司 | Robot navigation and positioning method, system and robot |
CN107170011A (en) * | 2017-04-24 | 2017-09-15 | 杭州司兰木科技有限公司 | A kind of robot vision tracking and system |
CN107305381A (en) * | 2016-04-21 | 2017-10-31 | 上海慧流云计算科技有限公司 | A kind of self-navigation robot and automatic navigation method |
CN108319265A (en) * | 2017-12-21 | 2018-07-24 | 山西迪迈沃科光电工业有限公司 | The control system and method for a kind of ground running robot for electric power computer room inspection |
CN108318050A (en) * | 2017-12-14 | 2018-07-24 | 富华科精密工业(深圳)有限公司 | Central controller and the system and method for utilizing the central controller mobile navigation |
WO2018148878A1 (en) * | 2017-02-15 | 2018-08-23 | 深圳市前海中康汇融信息技术有限公司 | Smart robot capable of adaptively adjusting visual field, and control method therefor |
CN108459598A (en) * | 2017-08-24 | 2018-08-28 | 炬大科技有限公司 | A kind of mobile electronic device and method for handling the task of mission area |
CN108460801A (en) * | 2017-06-12 | 2018-08-28 | 炬大科技有限公司 | A kind of system and method for reaching indoor task object location determination by image recognition mode |
CN108459597A (en) * | 2017-07-26 | 2018-08-28 | 炬大科技有限公司 | A kind of mobile electronic device and method for handling the task of mission area |
CN108459596A (en) * | 2017-06-30 | 2018-08-28 | 炬大科技有限公司 | A kind of method in mobile electronic device and the mobile electronic device |
CN108710376A (en) * | 2018-06-15 | 2018-10-26 | 哈尔滨工业大学 | The mobile chassis of SLAM and avoidance based on Multi-sensor Fusion |
CN108748184A (en) * | 2018-06-13 | 2018-11-06 | 四川长虹电器股份有限公司 | A kind of robot patrol method and robot device based on area map mark |
WO2018228254A1 (en) * | 2017-06-12 | 2018-12-20 | 炬大科技有限公司 | Mobile electronic device and method for use in mobile electronic device |
WO2019095681A1 (en) * | 2017-11-16 | 2019-05-23 | 珊口(上海)智能科技有限公司 | Positioning method and system, and suitable robot |
CN110235079A (en) * | 2017-01-27 | 2019-09-13 | 威欧.艾姆伊有限公司 | Tourelle and the method for relocating equipment autonomously using integrated form tourelle |
CN110426030A (en) * | 2019-08-14 | 2019-11-08 | 金同磊 | Indoor navigation method and system |
TWI702377B (en) * | 2018-05-11 | 2020-08-21 | 新加坡商賽思托機器人有限公司 | A system, a method, a storage medium and a server for managing a plurality of vehicles |
CN112215443A (en) * | 2020-12-03 | 2021-01-12 | 炬星科技(深圳)有限公司 | Robot rapid routing customization method and device |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101920498A (en) * | 2009-06-16 | 2010-12-22 | 泰怡凯电器(苏州)有限公司 | Device for realizing simultaneous positioning and map building of indoor service robot and robot |
CN102609942A (en) * | 2011-01-31 | 2012-07-25 | 微软公司 | Mobile camera localization using depth maps |
CN102866706A (en) * | 2012-09-13 | 2013-01-09 | 深圳市银星智能科技股份有限公司 | Cleaning robot adopting smart phone navigation and navigation cleaning method thereof |
CN103092203A (en) * | 2013-01-15 | 2013-05-08 | 深圳市紫光杰思谷科技有限公司 | Control method of relative motion between primary robot and secondary robot |
EP2657644A1 (en) * | 2010-12-20 | 2013-10-30 | Nec Corporation | Positioning apparatus and positioning method |
CN103926933A (en) * | 2014-03-29 | 2014-07-16 | 北京航空航天大学 | Indoor simultaneous locating and environment modeling method for unmanned aerial vehicle |
CN104236548A (en) * | 2014-09-12 | 2014-12-24 | 清华大学 | Indoor autonomous navigation method for micro unmanned aerial vehicle |
CN104851094A (en) * | 2015-05-14 | 2015-08-19 | 西安电子科技大学 | Improved method of RGB-D-based SLAM algorithm |
2015
- 2015-10-22: CN CN201510690749.8A patent/CN105352508A/en, status: active (Pending)
Cited By (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10937288B2 (en) | 2016-03-28 | 2021-03-02 | Zhejiang Geely Holding Group Co., Ltd. | Theft prevention monitoring device and system and method |
CN105654648B (en) * | 2016-03-28 | 2018-07-06 | 浙江吉利控股集团有限公司 | Anti-theft monitoring device, system, and method |
JP2019510288A (en) * | 2016-03-28 | 2019-04-11 | Zhejiang Geely Holding Group Co., Ltd. | Anti-theft monitoring device, system, and method |
CN105654648A (en) * | 2016-03-28 | 2016-06-08 | 浙江吉利控股集团有限公司 | Anti-theft monitoring device, system, and method |
CN107305381A (en) * | 2016-04-21 | 2017-10-31 | 上海慧流云计算科技有限公司 | Self-navigating robot and automatic navigation method |
CN106441238A (en) * | 2016-06-01 | 2017-02-22 | 昆山塔米机器人有限公司 | Robot positioning device and positioning/navigation algorithm based on infrared vision technology |
CN106125925A (en) * | 2016-06-20 | 2016-11-16 | 华南理工大学 | Intelligent grasping method based on gesture and voice control |
CN106125925B (en) * | 2016-06-20 | 2019-05-14 | 华南理工大学 | Intelligent grasping method based on gesture and voice control |
CN106056833A (en) * | 2016-06-23 | 2016-10-26 | 乐视控股(北京)有限公司 | Safety monitoring method, device, system and monitoring system |
CN106020233A (en) * | 2016-07-08 | 2016-10-12 | 聂浩然 | Plant protection system using an unmanned aerial vehicle (UAV), UAV for plant protection, and control method therefor |
CN106020233B (en) * | 2016-07-08 | 2023-11-28 | 聂浩然 | Unmanned aerial vehicle plant protection operation system, unmanned aerial vehicle for plant protection operation and control method |
CN106643692A (en) * | 2016-09-28 | 2017-05-10 | 深圳乐行天下科技有限公司 | Robot navigation and positioning method, system and robot |
CN110235079A (en) * | 2017-01-27 | 2019-09-13 | 威欧.艾姆伊有限公司 | Scrolling device and method for autonomous repositioning of an apparatus using an integrated scrolling device |
CN110235079B (en) * | 2017-01-27 | 2022-09-27 | 威欧.艾姆伊有限公司 | Scrolling device and method for autonomous repositioning of an apparatus using an integrated scrolling device |
WO2018148878A1 (en) * | 2017-02-15 | 2018-08-23 | 深圳市前海中康汇融信息技术有限公司 | Smart robot capable of adaptively adjusting visual field, and control method therefor |
CN107170011A (en) * | 2017-04-24 | 2017-09-15 | 杭州司兰木科技有限公司 | Robot vision tracking method and system |
CN107170011B (en) * | 2017-04-24 | 2019-12-17 | 杭州艾芯智能科技有限公司 | robot vision tracking method and system |
WO2018228254A1 (en) * | 2017-06-12 | 2018-12-20 | 炬大科技有限公司 | Mobile electronic device and method for use in mobile electronic device |
CN108460801A (en) * | 2017-06-12 | 2018-08-28 | 炬大科技有限公司 | System and method for determining an indoor task target location by means of image recognition |
CN108459596A (en) * | 2017-06-30 | 2018-08-28 | 炬大科技有限公司 | Mobile electronic device and method in the mobile electronic device |
CN108459597B (en) * | 2017-07-26 | 2024-02-23 | 炬大科技有限公司 | Mobile electronic device and method for processing tasks in task area |
CN108459597A (en) * | 2017-07-26 | 2018-08-28 | 炬大科技有限公司 | Mobile electronic device and method for processing tasks in a task area |
CN108459598B (en) * | 2017-08-24 | 2024-02-20 | 炬大科技有限公司 | Mobile electronic device and method for processing tasks in task area |
CN108459598A (en) * | 2017-08-24 | 2018-08-28 | 炬大科技有限公司 | Mobile electronic device and method for processing tasks in a task area |
WO2019095681A1 (en) * | 2017-11-16 | 2019-05-23 | 珊口(上海)智能科技有限公司 | Positioning method and system, and suitable robot |
US11099577B2 (en) | 2017-11-16 | 2021-08-24 | Ankobot (Shanghai) Smart Technologies Co., Ltd. | Localization method and system, and robot using the same |
CN108318050A (en) * | 2017-12-14 | 2018-07-24 | 富华科精密工业(深圳)有限公司 | Central controller, and system and method for mobile navigation using the central controller |
CN108318050B (en) * | 2017-12-14 | 2019-08-23 | 富华科精密工业(深圳)有限公司 | Central controller, and system and method for mobile navigation using the central controller |
CN108319265A (en) * | 2017-12-21 | 2018-07-24 | 山西迪迈沃科光电工业有限公司 | Control system and method for a ground-traveling robot for electric power machine room inspection |
TWI702377B (en) * | 2018-05-11 | 2020-08-21 | 新加坡商賽思托機器人有限公司 | A system, a method, a storage medium and a server for managing a plurality of vehicles |
CN108748184B (en) * | 2018-06-13 | 2020-04-28 | 四川长虹电器股份有限公司 | Robot patrol method based on regional map identification and robot equipment |
CN108748184A (en) * | 2018-06-13 | 2018-11-06 | 四川长虹电器股份有限公司 | Robot patrol method based on area map identification, and robot device |
CN108710376A (en) * | 2018-06-15 | 2018-10-26 | 哈尔滨工业大学 | Mobile chassis with SLAM and obstacle avoidance based on multi-sensor fusion |
CN110426030A (en) * | 2019-08-14 | 2019-11-08 | 金同磊 | Indoor navigation method and system |
CN112215443A (en) * | 2020-12-03 | 2021-01-12 | 炬星科技(深圳)有限公司 | Robot rapid routing customization method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105352508A (en) | Method and device of robot positioning and navigation | |
Li et al. | Top 10 technologies for indoor positioning on construction sites | |
CN105547305B (en) | Pose calculation method based on wireless positioning and laser map matching | |
CN107094319B (en) | High-precision indoor and outdoor fusion positioning system and method | |
Xie et al. | LIPS: A light intensity-based positioning system for indoor environments | |
US11253991B1 (en) | Optimization of observer robot locations | |
JP4584213B2 (en) | Mobile robot positioning system and method using camera and sign | |
US20180003498A1 (en) | Visual positioning system and method based on high reflective infrared identification | |
Yu et al. | An autonomous restaurant service robot with high positioning accuracy | |
CN109946649B (en) | Low-cost two-dimensional UWB positioning method for long, narrow indoor environments | |
CN103926925A (en) | Improved VFH algorithm-based positioning and obstacle avoidance method and robot | |
Lee et al. | QR-code based Localization for Indoor Mobile Robot with validation using a 3D optical tracking instrument | |
CN104703118A (en) | Indoor robot system for locating a mobile terminal based on Bluetooth technology | |
AU2019438843A1 (en) | Recharging Control Method of Desktop Robot | |
CN103760517A (en) | Method and device for achieving high-precision tracking and positioning through underground scanning satellites | |
CN104457755B (en) | Location acquiring method | |
WO2019001237A1 (en) | Mobile electronic device, and method in mobile electronic device | |
CN103389486A (en) | Control method and electronic device | |
US20060227998A1 (en) | Method for using networked programmable fiducials for motion tracking | |
CN105979478A (en) | Positioning method and device | |
Rátosi et al. | Real-time localization and tracking using visible light communication | |
CN106843280A (en) | Intelligent robot tracking system | |
Shi et al. | Indoor localization scheme using magnetic map for smartphones | |
CN108414980A (en) | Indoor positioning device based on a dotted infrared laser | |
Zeng et al. | Study on inspection robot for substation based on ultra-wide-band wireless localization system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 2016-02-24