CN113932808B - Visual and gyroscope fusion correction algorithm applicable to visual navigation floor sweeping robot - Google Patents
- Publication number: CN113932808B (application CN202111287227.5A)
- Authority
- CN
- China
- Prior art keywords
- executing
- angle
- gyroscope
- camera
- sweeping
- Prior art date
- Legal status: Active (as listed by Google; an assumption, not a legal conclusion)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/183—Compensation of inertial measurements, e.g. for temperature effects
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G01C25/005—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices
Abstract
The invention discloses a vision and gyroscope fusion correction algorithm for a vision-navigation sweeping robot, comprising the following steps. Step 1: start the sweeping robot and begin the sweeping work. Step 2: initialize the robot's vision camera, determine the robot's pose direction, initialize the gyroscope to this direction, and have the robot perform boustrophedon (zigzag) sweeping along it. Step 3: send a straight-line extraction request, then execute step 4. Step 4: wait for the camera to perform the extraction and compute the angle; if an angle is obtained, or the number of line-extraction attempts exceeds two, execute step 6, otherwise execute step 5. Step 5: walk forward randomly by 20 cm, then return to step 3. Step 6: if an angle was extracted, fuse it with the gyroscope data and then execute step 7; otherwise go directly to step 7.
Description
Technical Field
The invention relates to the technical field of sweeping robots, and in particular to a vision and gyroscope fusion correction algorithm for a vision-navigation sweeping robot.
Background
The floor sweeping robot, also called an automatic sweeper, intelligent vacuum, or robot vacuum cleaner, is a type of intelligent household appliance that uses a degree of artificial intelligence to clean floors automatically within a room. It generally works by combined brushing and vacuuming, first drawing floor debris into its onboard dust box to complete the floor-cleaning function. Broadly, robots that perform sweeping, vacuuming, and floor-scrubbing work are collectively referred to as floor cleaning robots.
With the continuous development of science and technology, more and more households have begun to use sweeping robots. During operation, the accumulated error of the gyroscope grows as the sweeping time lengthens and collisions multiply; if this error is not corrected in time, the robot walks further and further away from its original direction, reducing sweeping efficiency and coverage.
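To make the drift concrete, the sketch below (an illustration added here, not part of the patent) integrates a small constant gyroscope bias over a ten-minute sweep and shows the heading error growing linearly with time:

```python
import math

def integrate_heading(rate_readings, bias, dt):
    """Integrate gyro rate readings (rad/s) with a constant bias error.

    Returns the accumulated heading and the heading error caused by the
    bias. Illustrative only: a real gyro also has random-walk noise and
    collision-induced disturbances.
    """
    heading = 0.0
    for rate in rate_readings:
        heading += (rate + bias) * dt   # the bias is integrated along with the signal
    error = bias * dt * len(rate_readings)  # error grows linearly with elapsed time
    return heading, error

# A robot holding a straight course (true rate = 0) for 10 minutes at
# 100 Hz with a modest 0.01 deg/s bias drifts 6 degrees off course.
readings = [0.0] * (10 * 60 * 100)
heading, error = integrate_heading(readings, bias=math.radians(0.01), dt=0.01)
```

This is why the patent triggers a vision-based correction once the accumulated error (or the elapsed sweeping time) crosses a threshold.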
Disclosure of Invention
The invention aims to provide an algorithm suitable for fusion correction of vision and gyroscopes of a vision navigation sweeping robot so as to solve the problems in the background art.
In order to achieve the above purpose, the present invention provides the following technical solutions: an algorithm suitable for vision and gyroscope fusion correction of a vision navigation sweeping robot comprises the following steps:
step 1: start the sweeping robot and begin the sweeping work;
step 2: initialize the robot's vision camera, determine the robot's pose direction, initialize the gyroscope to this direction, and have the robot perform boustrophedon (zigzag) sweeping along it;
step 3: send a straight-line extraction request, then execute step 4;
step 4: wait for the camera to perform the extraction and compute the angle; if an angle is obtained, or the number of line-extraction attempts exceeds two, execute step 6, otherwise execute step 5;
step 5: walk forward randomly by 20 cm, then return to step 3;
step 6: if an angle was extracted, fuse it with the gyroscope data and then execute step 7; otherwise go directly to step 7;
step 7: continue sweeping; if the gyroscope output needs correction, or the sweeping time reaches the designated duration, execute step 8, otherwise repeat step 7;
step 8: stop the machine, send correction information to the camera, and execute step 9;
step 9: wait for the camera to output an angle; if an angle is obtained, use it to correct the gyroscope, then execute step 7; otherwise execute step 7 directly.
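The retry logic of steps 3-6 can be sketched as follows; the `StubCamera`/`StubGyro` classes and the replace-style fusion rule are hypothetical stand-ins, since the patent does not specify the camera interface or the fusion formula:

```python
class StubCamera:
    """Hypothetical camera stand-in: line extraction fails until the
    robot has walked forward once, then reports a fixed wall angle."""
    def __init__(self, failures=1, angle_deg=3.0):
        self.failures = failures
        self.angle_deg = angle_deg
        self.walked_cm = 0

    def request_line_angle(self):
        if self.failures > 0:
            self.failures -= 1
            return None          # extraction failed this attempt
        return self.angle_deg

    def walk_forward(self, cm):
        self.walked_cm += cm


class StubGyro:
    """Hypothetical gyroscope stand-in holding a heading in degrees."""
    def __init__(self):
        self.heading_deg = 0.0

    def fuse(self, camera_angle_deg):
        # The patent does not specify the fusion rule; here the camera
        # angle simply replaces the drifted gyroscope heading.
        self.heading_deg = camera_angle_deg


def initialize_heading(camera, gyro, max_attempts=2, step_cm=20):
    """Steps 3-6: request line extraction; on failure walk forward 20 cm
    and retry, giving up after `max_attempts` extraction attempts."""
    for _ in range(max_attempts):
        angle = camera.request_line_angle()   # steps 3-4
        if angle is not None:
            gyro.fuse(angle)                  # step 6: data fusion
            return angle
        camera.walk_forward(step_cm)          # step 5: move and retry
    return None                               # keep the gyroscope heading


cam, gyro = StubCamera(), StubGyro()
result = initialize_heading(cam, gyro)
```

With one failed extraction, the robot walks 20 cm once, succeeds on the second attempt, and the gyroscope heading is corrected to the camera's angle.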
Preferably, the first camera and gyroscope data fusion algorithm in step 6 includes the following steps:
step 1: the sweeping robot starts the global sweeping work;
step 2: send a straight-line extraction request, then execute step 3;
step 3: wait for the camera to perform the extraction and compute the angle; if the number of line-extraction attempts exceeds two without an angle result, execute step 7, otherwise execute step 4; if an angle is extracted, execute step 5;
step 4: walk forward randomly by 20 cm, then return to step 2;
step 5: compute the angle extracted by the camera; if the angle has been computed more than twice without a result, execute step 7, otherwise execute step 4; if the angle is obtained, execute step 6;
step 6: fuse the angle extracted by the camera with the gyroscope data, then execute step 7;
step 7: continue the sweeping work.
Preferably, the gyroscope and camera angle fusion correction algorithm during the sweeping process in step 7 comprises the following steps:
step 1: during the sweeping work, execute step 2 if the designated sweeping time has not been reached, otherwise execute step 3;
step 2: the gyroscope or the camera issues a request indicating whether correction is needed; if correction is needed, execute step 3, otherwise execute step 1;
step 3: stop the machine, send the gyroscope angle and a correction request to the camera, then execute step 4;
step 4: the camera extracts a straight line and computes an angle; if an angle is obtained, execute step 5, otherwise execute step 1;
step 5: fuse the angle computed by the camera with the gyroscope data, then execute step 1.
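One stop-and-correct cycle from the steps above might be sketched as below; the weighted blend and the 0.7 camera weight are assumptions, since the patent only states that the two angles are fused:

```python
def correction_cycle(gyro_heading_deg, camera_angle_deg, camera_weight=0.7):
    """One stop-and-correct cycle (steps 3-5): if the camera recovers an
    angle, blend it with the gyroscope heading; otherwise keep the
    gyroscope value and resume sweeping. The 0.7 camera weight is an
    assumed tuning value, not taken from the patent."""
    if camera_angle_deg is None:
        return gyro_heading_deg  # step 4 failed: fall back to step 1 unchanged
    return (camera_weight * camera_angle_deg
            + (1.0 - camera_weight) * gyro_heading_deg)

# A gyroscope heading that has drifted to 10 degrees is pulled back
# toward the camera's 0-degree wall reference.
corrected = correction_cycle(10.0, 0.0)
```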
Compared with the prior art, the invention has the following beneficial effects: the vision and gyroscope fusion correction algorithm lets the sweeping robot correct, in a timely manner, the accumulated errors caused by prolonged sweeping time and repeated collisions, avoiding route deviation and ensuring sweeping efficiency and coverage.
Drawings
FIG. 1 is a schematic diagram of the structure of the present invention;
FIG. 2 is a flow chart of a first visual camera and gyroscope pose data fusion algorithm in the invention;
FIG. 3 is a flow chart of a gyroscope and camera angle fusion correction algorithm in the cleaning process.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1-3, the present invention provides a technical solution: an algorithm suitable for vision and gyroscope fusion correction of a vision navigation sweeping robot comprises the following steps:
step 1: start the sweeping robot and begin the sweeping work;
step 2: initialize the robot's vision camera, determine the robot's pose direction, initialize the gyroscope to this direction, and have the robot perform boustrophedon (zigzag) sweeping along it;
step 3: send a straight-line extraction request, then execute step 4;
step 4: wait for the camera to perform the extraction and compute the angle; if an angle is obtained, or the number of line-extraction attempts exceeds two, execute step 6, otherwise execute step 5;
step 5: walk forward randomly by 20 cm, then return to step 3;
step 6: if an angle was extracted, fuse it with the gyroscope data and then execute step 7; otherwise go directly to step 7;
step 7: continue sweeping; if the gyroscope output needs correction, or the sweeping time reaches the designated duration, execute step 8, otherwise repeat step 7;
step 8: stop the machine, send correction information to the camera, and execute step 9;
step 9: wait for the camera to output an angle; if an angle is obtained, use it to correct the gyroscope, then execute step 7; otherwise execute step 7 directly.
As shown in fig. 2, the first camera and gyroscope data fusion algorithm during the robot's sweeping work comprises the following steps:
step 1: the sweeping robot starts the global sweeping work;
step 2: send a straight-line extraction request, then execute step 3;
step 3: wait for the camera to perform the extraction and compute the angle; if the number of line-extraction attempts exceeds two without an angle result, execute step 7, otherwise execute step 4; if an angle is extracted, execute step 5;
step 4: walk forward randomly by 20 cm, then return to step 2;
step 5: compute the angle extracted by the camera; if the angle has been computed more than twice without a result, execute step 7, otherwise execute step 4; if the angle is obtained, execute step 6;
step 6: fuse the angle extracted by the camera with the gyroscope data, then execute step 7;
step 7: continue the sweeping work.
As shown in fig. 3, the gyroscope and camera angle fusion correction algorithm during the robot's sweeping process comprises the following steps:
step 1: the sweeping robot continues sweeping after executing the first data fusion algorithm; execute step 2 if the designated sweeping time has not been reached, otherwise execute step 3;
step 2: the gyroscope or the camera issues a request indicating whether correction is needed; if correction is needed, execute step 3, otherwise execute step 1;
step 3: stop the machine, send the gyroscope angle and a correction request to the camera, then execute step 4;
step 4: the camera extracts a straight line and computes an angle; if an angle is obtained, execute step 5, otherwise execute step 1;
step 5: fuse the angle computed by the camera with the gyroscope data, then execute step 1.
In summary, when the sweeper starts up, the vision navigation camera extracts feature points from the sweeping environment, fuses them with the pose data transmitted by the gyroscope to determine an initial pose direction, and transmits this initialization data to the gyroscope, which then walks along the initial direction. During boustrophedon sweeping, once the accumulated pose error of the gyroscope reaches a certain threshold, the gyroscope requests a new pose fusion: the vision camera re-extracts environmental feature points, fuses them with the gyroscope pose data, computes the pose direction, compares it with the first pose direction, and outputs a new pose direction to the gyroscope, which resumes walking along the new direction. This process repeats until the cleaning is finished.
Working principle: when the robot starts up, the vision navigation camera is initialized, feature points are extracted from the sweeping environment, the robot's pose direction is determined, and the initialization data is transmitted to the gyroscope, which takes it as its initial direction; the robot then performs boustrophedon (zigzag) sweeping along that direction. During sweeping, as the walking time lengthens and the number of collisions increases, the accumulated pose error of the gyroscope grows and the robot visibly deviates from its original direction. When the accumulated error reaches a certain threshold, or a set duration elapses, the gyroscope sends an initialization request; the vision camera again extracts environmental feature points, re-initializes the pose, and re-determines the robot's pose direction. This direction is compared with the first initialization direction and the gyroscope's current direction; if the error is smaller than a certain range, the new direction is output to the gyroscope, the pose is successfully re-initialized, and the robot resumes walking along the new initial direction. The process repeats until sweeping is complete, so accumulated errors are corrected in time, route deviation is avoided, and sweeping efficiency and coverage are ensured.
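The acceptance test in the working principle ("if the error is smaller than a certain range") could be sketched as follows; the 15-degree window is an assumed value, since the patent does not quantify the range:

```python
def accept_reinitialization(first_dir_deg, gyro_dir_deg, camera_dir_deg,
                            max_jump_deg=15.0):
    """Decide whether to adopt the camera's re-initialized direction.

    The new camera direction is compared with the first initialization
    direction and accepted only if the difference lies within a window;
    otherwise the dead-reckoned gyroscope direction is kept. The
    15-degree window is an assumed tuning value.
    """
    # wrap the difference into [-180, 180) before comparing
    diff = (camera_dir_deg - first_dir_deg + 180.0) % 360.0 - 180.0
    if abs(diff) <= max_jump_deg:
        return camera_dir_deg  # re-initialization succeeds
    return gyro_dir_deg        # reject the outlier; keep dead reckoning
```

Wrapping the difference keeps the comparison correct near the 0/360-degree boundary, e.g. a first direction of 350 degrees and a camera direction of 355 degrees differ by 5 degrees, not 345.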
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.
Claims (3)
1. A vision and gyroscope fusion correction algorithm for a vision-navigation sweeping robot, characterized by comprising the following steps:
step 1: start the sweeping robot and begin the sweeping work;
step 2: initialize the robot's vision camera, determine the robot's pose direction, initialize the gyroscope to this direction, and have the robot perform boustrophedon (zigzag) sweeping along it;
step 3: send a straight-line extraction request, then execute step 4;
step 4: wait for the camera to perform the extraction and compute the angle; if an angle is obtained, or the number of line-extraction attempts exceeds two, execute step 6, otherwise execute step 5;
step 5: walk forward randomly by 20 cm, then return to step 3;
step 6: if an angle was extracted, fuse it with the gyroscope data and then execute step 7; otherwise go directly to step 7;
step 7: continue sweeping; if the gyroscope output needs correction, or the sweeping time reaches the designated duration, execute step 8, otherwise repeat step 7;
step 8: stop the machine, send correction information to the camera, and execute step 9;
step 9: wait for the camera to output an angle; if an angle is obtained, use it to correct the gyroscope, then execute step 7; otherwise execute step 7 directly.
2. The vision and gyroscope fusion correction algorithm for a vision-navigation sweeping robot according to claim 1, wherein the first camera and gyroscope data fusion algorithm in step 6 comprises the following steps:
step 1: the sweeping robot starts the global sweeping work;
step 2: send a straight-line extraction request, then execute step 3;
step 3: wait for the camera to perform the extraction and compute the angle; if the number of line-extraction attempts exceeds two without an angle result, execute step 7, otherwise execute step 4; if an angle is extracted, execute step 5;
step 4: walk forward randomly by 20 cm, then return to step 2;
step 5: compute the angle extracted by the camera; if the angle has been computed more than twice without a result, execute step 7, otherwise execute step 4; if the angle is obtained, execute step 6;
step 6: fuse the angle extracted by the camera with the gyroscope data, then execute step 7;
step 7: continue the sweeping work.
3. The vision and gyroscope fusion correction algorithm for a vision-navigation sweeping robot according to claim 1, wherein the gyroscope and camera angle fusion correction algorithm during the sweeping process in step 7 comprises the following steps:
step 1: during the sweeping work, execute step 2 if the designated sweeping time has not been reached, otherwise execute step 3;
step 2: the gyroscope or the camera issues a request indicating whether correction is needed; if correction is needed, execute step 3, otherwise execute step 1;
step 3: stop the machine, send the gyroscope angle and a correction request to the camera, then execute step 4;
step 4: the camera extracts a straight line and computes an angle; if an angle is obtained, execute step 5, otherwise execute step 1;
step 5: fuse the angle computed by the camera with the gyroscope data, then execute step 1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111287227.5A CN113932808B (en) | 2021-11-02 | 2021-11-02 | Visual and gyroscope fusion correction algorithm applicable to visual navigation floor sweeping robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113932808A CN113932808A (en) | 2022-01-14 |
CN113932808B true CN113932808B (en) | 2024-04-02 |
Family
ID=79285380
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111287227.5A Active CN113932808B (en) | 2021-11-02 | 2021-11-02 | Visual and gyroscope fusion correction algorithm applicable to visual navigation floor sweeping robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113932808B (en) |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008033837A (en) * | 2006-07-31 | 2008-02-14 | Sanyo Electric Co Ltd | Inspection system and error correction program |
KR20150138889A (en) * | 2014-05-30 | 2015-12-11 | 동명대학교산학협력단 | Apparatus and method for estimating the location of autonomous robot based on three-dimensional depth information |
CN105411490A (en) * | 2015-10-26 | 2016-03-23 | 曾彦平 | Real-time positioning method of mobile robot and mobile robot |
JP2016173655A (en) * | 2015-03-16 | 2016-09-29 | 株式会社東芝 | Movable body control system and movable body control method |
CN106647755A (en) * | 2016-12-21 | 2017-05-10 | 上海芮魅智能科技有限公司 | Sweeping robot capable of intelligently building sweeping map in real time |
CN107443385A (en) * | 2017-09-26 | 2017-12-08 | 珠海市微半导体有限公司 | The detection method and chip and robot of the robot line navigation of view-based access control model |
CN108937742A (en) * | 2018-09-06 | 2018-12-07 | 苏州领贝智能科技有限公司 | A kind of the gyroscope angle modification method and sweeper of sweeper |
CN208289901U (en) * | 2018-05-31 | 2018-12-28 | 珠海市一微半导体有限公司 | A kind of positioning device and robot enhancing vision |
CN109141395A (en) * | 2018-07-10 | 2019-01-04 | 深圳市沃特沃德股份有限公司 | A kind of the sweeper localization method and device of view-based access control model winding calibration gyroscope |
CN110411444A (en) * | 2019-08-22 | 2019-11-05 | 深圳赛奥航空科技有限公司 | A kind of subsurface digging mobile device inertia navigation positioning system and localization method |
CN110672099A (en) * | 2019-09-09 | 2020-01-10 | 武汉元生创新科技有限公司 | Course correction method and system for indoor robot navigation |
CN111759241A (en) * | 2020-06-24 | 2020-10-13 | 湖南格兰博智能科技有限责任公司 | Sweeping path planning and navigation control method for sweeping robot |
CN112263188A (en) * | 2020-10-22 | 2021-01-26 | 湖南格兰博智能科技有限责任公司 | Correction method and device for moving direction of mobile robot |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108021884B (en) | Sweeping machine power-off continuous sweeping method and device based on visual repositioning and sweeping machine | |
CN110353579A (en) | A kind of clean robot automatic path planning method | |
CN111077495B (en) | Positioning recovery method based on three-dimensional laser | |
US11896175B2 (en) | Method and apparatus for updating working map of mobile robot, and storage medium | |
CN107443385B (en) | Detection method and chip for robot linear navigation based on vision and robot | |
CN111603100B (en) | Storage and reuse method and storage and reuse device for sweeping drawing of sweeper | |
CN112198876B (en) | Map full-coverage sweeping modular control method suitable for sweeping robot | |
CN113674351B (en) | Drawing construction method of robot and robot | |
CN110495817B (en) | Recharging and docking method and system for cleaning equipment with laser radar | |
CN111240308A (en) | Method and device for detecting repeated obstacle, electronic equipment and readable storage medium | |
CN113932808B (en) | Visual and gyroscope fusion correction algorithm applicable to visual navigation floor sweeping robot | |
WO2022156746A1 (en) | Cleaning control method and apparatus for robot, and robot | |
CN112033391A (en) | Robot repositioning method and device based on charging pile | |
CN110604515B (en) | Multi-machine cooperation system and cleaning equipment | |
CN114431771B (en) | Sweeping method of sweeping robot and related device | |
CN115205470A (en) | Continuous scanning repositioning method, device, equipment, storage medium and three-dimensional continuous scanning method | |
CN112013840B (en) | Sweeping robot and map construction method and device thereof | |
CN110967703A (en) | Indoor navigation method and indoor navigation device using laser radar and camera | |
WO2022095327A1 (en) | Cleaning robot control system and method for generating cleaning route | |
CN112683266A (en) | Robot and navigation method thereof | |
CN113432593B (en) | Centralized synchronous positioning and map construction method, device and system | |
CN116098536A (en) | Robot control method and device | |
CN108319270A (en) | A kind of automatic dust absorption machine people's optimum path planning method based on historical data analysis | |
CN107468160A (en) | A kind of wired intellective dust collector cleaning method | |
CN115202369B (en) | Path control method, device and equipment of dust collection robot and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||