CN112484718B - Edge navigation device and method based on environment map correction

Info

Publication number: CN112484718B
Authority: CN (China)
Application number: CN202011370587.7A
Other versions: CN112484718A (Chinese)
Inventor: 刘智 (Liu Zhi)
Assignee: Haizhiyun Suzhou Technology Co., Ltd.
Prior art keywords: robot, edge, environment, map, obstacle
Priority/filing date: 2020-11-30
Publication of CN112484718A: 2021-03-12
Grant of CN112484718B: 2023-07-28
Legal status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/005: Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/20: Instruments for performing navigational calculations

Abstract

One or more embodiments of the present disclosure provide an edge navigation device and method based on environment map correction. The device includes a robot, a controller, an environment perception module, and an environment perception map. The controller is in signal connection with the environment perception module and controls it to map the environment in which the robot is located, thereby obtaining the environment perception map. The controller is also in signal connection with the robot, controlling the robot's movement and processing the sensing device information, and the robot executes the control instructions of the controller.

Description

Edge navigation device and method based on environment map correction
Technical Field
One or more embodiments of the present disclosure relate to the field of mobile robot mapping and navigation, and in particular to an edge navigation device and method based on environment map correction.
Background
Edge navigation control refers to a mobile robot moving along the edge or contour of an object according to a control strategy while keeping a set distance from the object. While advancing, when the shape of the object's edge changes or an obstacle appears in the robot's path, the robot must autonomously identify and classify the environmental information so that it can keep walking along the edge. The type of edge around the mobile robot changes as the robot moves, and the robot's state must change accordingly. Edge navigation can be seen as a bottom-level behavior of mobile robot intelligence; combined with top-level behaviors such as path planning, it allows complex tasks to be accomplished intelligently.
The inventor has found that existing robots have low edge navigation accuracy and collide easily with obstacles. When performing edge navigation in an unknown environment, the robot's sensors cannot perceive the overall shape of an obstacle and therefore cannot plan a collision-avoidance path in advance, so moving along the obstacle's contour is a reasonable strategy. A suitable edge navigation control device is therefore needed.
Disclosure of Invention
The invention provides a mobile robot edge navigation device and method based on environment map correction, and aims to correct the position information of the robot against the established map while the robot performs edge navigation, using a sensor information fusion method.
Based on the above purpose, the present specification proposes a mobile robot edge navigation device based on environment map correction, which comprises a robot, a controller, an environment sensing module and an environment sensing map. The controller is in signal connection with the environment sensing module and controls it to map the environment in which the robot is located, thereby obtaining the environment sensing map; the controller is also in signal connection with the robot, controlling the robot's movement and processing the sensing equipment information; the robot executes the control instructions of the controller.
Optionally, the environment sensing module includes a laser radar and an RGBD camera. The laser radar is used to construct a map of the environment in which the robot is located, and the RGBD camera is used to detect whether an edge or obstacle is present around the robot and to measure the distance between the robot and the edge or obstacle.
Optionally, the environment sensing map is the map of the robot's environment constructed by the laser radar in the environment sensing module.
Optionally, the robot moves using differential wheels; RGBD cameras are placed equidistantly in the four directions around the robot (front, rear, left and right), and the laser radar is placed at the exact center of the robot.
Based on one or more of the embodiments described above, a mobile robot edge navigation method based on environment map correction is provided, the method comprising the steps of:
A. after the initial position of the robot is given, the edge nearest to the robot is found, and the robot moves to within a preset distance of the edge;
B. the controller controls the robot to realize edge navigation by processing the information acquired by the environment sensing module, and dynamically detects the distance between the obstacle and the robot;
C. the position error of the mobile robot during edge navigation is corrected through the environment perception map, so as to realize accurate positioning of the mobile robot.
Optionally, the specific operation of finding the edge nearest to the robot in step A is as follows:
A1, the initial position of the robot is given, and walking along the left edge or the right edge is selected;
A2, if the robot walks along the left edge, the environment sensing modules on the front side and the left side are started to sense the area in front of and to the left of the robot;
A3, if there is no edge or obstacle in front of or to the left of the robot, the robot rotates once in place; if the environment sensing module still finds no edge or obstacle around the robot, the robot walks forward for a preset distance;
A4, if the robot still detects no edge or obstacle while moving forward for the preset distance, the robot turns left by a preset angle and continues to move forward for the preset distance;
A5, step A4 is repeated until the robot finds the edge and comes within the preset distance of the edge.
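As an illustrative aid only, the following Python sketch outlines the search loop of steps A1 to A5. The robot interface (enable_sensors, edge_or_obstacle_detected, approach_edge, rotate, move_forward) and all numeric constants are assumed placeholders, not part of the disclosure.

```python
import math

# Illustrative tuning constants; the disclosure leaves the "preset"
# values unspecified.
PRESET_DISTANCE = 0.3            # target gap to the edge, in metres
PRESET_TRAVEL = 1.0              # forward travel per attempt, in metres
PRESET_TURN_ANGLE = math.pi / 6  # leftward turn between attempts

def find_nearest_edge(robot, follow_left=True):
    """Steps A1-A5: search for the nearest edge and approach it."""
    # A2: enable the front sensor and the sensor on the chosen side.
    robot.enable_sensors(["front", "left" if follow_left else "right"])
    while True:
        # A5 exit: an edge is visible, so move to the preset distance.
        if robot.edge_or_obstacle_detected():
            robot.approach_edge(stop_distance=PRESET_DISTANCE)
            return
        # A3: nothing ahead or to the side, so spin once in place.
        robot.rotate(2 * math.pi)
        if robot.edge_or_obstacle_detected():
            continue
        # A3/A4: still nothing, so walk forward a preset distance,
        # and if that also finds nothing, turn left and repeat.
        robot.move_forward(PRESET_TRAVEL)
        if not robot.edge_or_obstacle_detected():
            robot.rotate(PRESET_TURN_ANGLE)
```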
Optionally, the specific operation of edge navigation in step B is as follows:
B1, the areas to the left of and in front of the mobile robot are divided into a number of equal partitions; the environment sensing module judges whether an edge or obstacle exists in each partition, and the centroid of the edge or obstacle is extracted;
B2, the centroids of the edge or obstacle are averaged relative to the center of the robot to obtain the average slope from the robot center to the edge;
B3, the angle given by the average slope in step B2 is taken as part of the robot's turn; from the average slope, the angle through which the robot must rotate to be parallel to the edge is obtained, and the robot is rotated until it is parallel to the edge;
B4, when the relative distance between the robot and the obstacle is short, the travelling speed of the robot is reduced so that the obstacle detection range shrinks; otherwise, the advancing speed of the robot is increased accordingly, and after finding the obstacle closest to itself within the detection range, the robot calculates its avoidance direction.
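As a minimal sketch of the speed adaptation in step B4, the following Python function assumes a hypothetical distance reading d in metres; the thresholds and the linear speed law are illustrative choices, not taken from the disclosure.

```python
def adaptive_speed(d, d_near=0.5, d_far=2.0, v_min=0.1, v_max=0.6):
    """Step B4 sketch: travel slowly when the nearest obstacle is close,
    which shrinks the region swept per control cycle (the reduced
    detection range), and faster when the nearest obstacle is far."""
    if d <= d_near:
        return v_min
    if d >= d_far:
        return v_max
    # Linear ramp between the near and far regimes.
    return v_min + (v_max - v_min) * (d - d_near) / (d_far - d_near)
```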
Optionally, in step B3, obtaining from the average slope the angle through which the robot must rotate to be parallel to the edge, and rotating the robot until it is parallel to the edge, is specifically:
when an edge or obstacle exists in front, the robot rotates rightward by π/2 in addition to the angle given by the average slope, after which it is parallel to the edge body; when an edge or obstacle exists on the left side, the robot rotates rightward by the angle given by the average slope, after which it is parallel to the edge body.
According to one or more embodiments of the present disclosure, with the above device and method for mobile robot edge navigation based on environment map correction, edge navigation of the robot can be realized more stably according to the environment map, and the boundaries and changes of the environment map can be refined.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of an apparatus for mobile robot edge navigation based on environmental map modification in accordance with one or more embodiments of the present disclosure;
FIG. 2 is a block diagram of a mobile robot;
FIG. 3 is a diagram of different detection ranges of a robot;
FIG. 4 is a view of a robot detection range partition;
FIG. 5 is a diagram of the edge body line slope solution;
FIG. 6 is an edge detection process flow diagram;
FIG. 7 is an edge navigation control flow diagram;
FIG. 8 is a comparison of navigation before and after rear detection is added;
1. a laser radar; 2. a robot; 3. an RGBD camera; 4. a differential wheel.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention.
It is noted that unless otherwise defined, technical or scientific terms used in one or more embodiments of the present disclosure should be taken in a general sense as understood by one of ordinary skill in the art to which the present disclosure pertains. The use of the terms "first," "second," and the like in one or more embodiments of the present description does not denote any order, quantity, or importance, but rather the terms "first," "second," and the like are used to distinguish one element from another. The word "comprising" or "comprises", and the like, means that elements or items preceding the word are included in the element or item listed after the word and equivalents thereof, but does not exclude other elements or items. The terms "connected" or "connected," and the like, are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", etc. are used merely to indicate relative positional relationships, which may also be changed when the absolute position of the object to be described is changed.
One or more embodiments of the present disclosure provide a mobile robot edge navigation device based on environment map correction. As shown in fig. 1, the device includes a robot, a controller, an environment sensing module and an environment sensing map. The controller is in signal connection with the environment sensing module and controls it to map the environment in which the robot is located, thereby obtaining the environment sensing map; the controller is also in signal connection with the robot, controlling the robot's movement and processing the sensing equipment information; the robot executes the control instructions of the controller.
As shown in fig. 2, the robot 2 includes: the differential wheels 4, the laser radar 1, and high-precision RGBD cameras 3 arranged equidistantly at the front, rear, left and right.
As shown in fig. 3, a larger detection range is used before the edge body is reached, so that the edge body can be reached more quickly; a smaller detection range is used after the edge body is reached, which improves the navigation precision.
As shown in fig. 4, the detectable range of the robot is partitioned into front (Q_1, Q_2, Q_3), rear (H_1, H_2, H_3), left (Z_1, Z_2, Z_3), right (Y_1, Y_2, Y_3), front-left (ZQ), front-right (YQ), rear-left (ZH) and rear-right (YH) zones. (A partition may be divided into several blocks as the case requires; three blocks are taken here as an example.)
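The partition scheme of fig. 4 can be viewed as a lookup from a detected point to a named zone. The Python sketch below is one possible realisation, assuming robot-centred coordinates with x pointing forward and y to the left; the angular extents and the radial three-block split are illustrative assumptions, since the figure itself fixes only the zone names.

```python
import math

def classify_zone(x, y, r_max=1.5):
    """Map a detected point (robot frame: x forward, y to the left) to
    a fig. 4 zone name; returns None outside the detection range."""
    r = math.hypot(x, y)
    if r > r_max:
        return None
    angle = math.degrees(math.atan2(y, x))  # 0 degrees = straight ahead
    if angle == 180.0:
        angle = -180.0  # wrap so the sector table below is exhaustive
    sectors = [("Q", -30, 30), ("ZQ", 30, 60), ("Z", 60, 120),
               ("ZH", 120, 150), ("H", 150, 180), ("H", -180, -150),
               ("YH", -150, -120), ("Y", -120, -60), ("YQ", -60, -30)]
    for name, lo, hi in sectors:
        if lo <= angle < hi:
            if name in ("Q", "H", "Z", "Y"):
                # Front/rear/left/right are split into three radial
                # blocks, e.g. Q_1 nearest the robot, Q_3 farthest.
                block = min(3, int(r // (r_max / 3)) + 1)
                return f"{name}_{block}"
            return name  # corner zones ZQ, YQ, ZH, YH are single blocks
```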
As shown in fig. 5, assuming that the three front blocks (Q_1, Q_2, Q_3) each detect an edge body or obstacle (A, B, C), the slope of the edge body is obtained by extracting the average slope:

k_AB = (y_B - y_A) / (x_B - x_A)

k_BC = (y_C - y_B) / (x_C - x_B)

k_WALL = (k_AB + k_BC) / 2

θ_WALL = arctan(k_WALL)

where the coordinate positions of the obstacles A, B and C detected by the sensor are (x_A, y_A), (x_B, y_B) and (x_C, y_C);

k_AB is the slope between obstacles A and B, calculated from the obstacle coordinates detected by the sensor;

k_BC is the slope between obstacles B and C, calculated from the obstacle coordinates detected by the sensor;

k_WALL is the average of k_AB and k_BC, i.e. the slope relation between the robot and the wall;

θ_WALL is the angular relation between the robot and the wall, calculated from k_WALL.

Therefore, for an edge body in front, the robot only needs to rotate rightward (clockwise) through θ_WALL + π/2 to be parallel to the edge body. For the left edge body, the average slope k_WALL and average inclination angle θ_WALL of the left edge body can be calculated in the same way, and the robot only needs to rotate through the angle θ_WALL (counter-clockwise positive) to be parallel to the edge.
As shown in fig. 8, when rear detection is not added and only the left and front sensors are used, the robot can only go straight when it encounters a U-bend, and so does not refine the map of the U-bend; after rear detection is added, the robot turns into the U-bend because of the rear detection, thereby completing the environment map and realizing the edge navigation function.
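The rear-detection branch can be sketched as follows; the zone query and motion helpers are assumed placeholders, and the junction test paraphrases the wording of claim 1 rather than a disclosed implementation.

```python
def handle_rear_detection(robot):
    """Rear-detection branch of figs. 7 and 8: when an edge persists
    behind the robot, enlarge the detection range and look for the
    junction between segments of the edge (the mouth of a U-bend)."""
    if not robot.zone_occupied("H"):
        return  # nothing behind: normal front/left edge following
    robot.enlarge_detection_range()
    if robot.multi_segment_junction_found():
        robot.seek_wall()  # turn into the U-bend and follow its wall
    else:
        robot.keep_straight()  # single straight edge behind: carry on
```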
As shown in fig. 6, the robot first receives the data from the sensors and filters it; it then detects whether there is an edge in front. If so, it extracts the slope and then detects whether there is an edge on the left; if not, it directly detects whether there is an edge on the left, calculates that slope and fuses it with the front slope to control the robot's heading angle.
As shown in fig. 7, the process of detecting whether there is an edge or obstacle in front and on the left is the same as the edge slope extraction process; at the same time, the robot detects whether there is an edge or obstacle behind it, and if so, it enters the rear detection process.
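Putting the pieces together, one plausible per-cycle control loop corresponding to figs. 6 and 7 might look as follows, reusing the adaptive_speed and handle_rear_detection helpers sketched above; the remaining method names are assumed placeholders.

```python
def control_cycle(robot):
    """One iteration of the edge-following loop of figs. 6 and 7."""
    scan = robot.filtered_sensor_data()  # receive and filter sensor data
    slopes = []
    if scan.edge_in_front():             # fig. 6: front edge check
        slopes.append(scan.front_edge_slope())
    if scan.edge_on_left():              # fig. 6: left edge check
        slopes.append(scan.left_edge_slope())
    if slopes:
        # Fuse the available slopes to steer the robot parallel to the edge.
        robot.set_heading_correction(sum(slopes) / len(slopes))
    handle_rear_detection(robot)         # fig. 7: rear branch
    robot.set_speed(adaptive_speed(scan.nearest_obstacle_distance()))
```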
One or more embodiments of the present specification also provide a method of mobile robot edge navigation based on environmental map modification, including:
1. first, a map of the environment is generated by the laser radar carried on the robot;
2. the robot is placed at a position in the environment map. To begin edge navigation, the robot rotates once in place to check whether an edge body exists within its detectable range. If an edge body is nearby, the robot moves directly to within a certain distance of it and starts edge navigation; if not, the robot moves forward for a preset time and checks again whether an edge body is nearby, otherwise it rotates leftward by a set angle and continues forward until it reaches the vicinity of an edge body. Different detection ranges are used at the same time: a larger detection range before the edge body is reached, so that it can be reached more quickly, and a smaller detection range afterwards, which improves the navigation precision.
3. after reaching the vicinity of the edge body, a region around the robot is taken as its detection range and partition detection is performed. When an edge body or obstacle appears in only one block of the left detection range, the obstacle occupies a small area, so it can be treated as a regular edge whose slope is 0, and the mobile robot only needs to rotate 90 degrees to the right to avoid it. When an edge body or obstacle appears in several blocks of the left detection range, the average slope of the edge body must be extracted: specifically, the centroids of the edge or obstacle are averaged relative to the robot center to obtain the average slope from the robot center to the edge, the angle through which the robot must rotate is calculated from this average slope, and the robot is kept parallel to the edge body.
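The two-case rotation rule of step 3 condenses into a short sketch, reusing the wall_angle and turn_to_parallel helpers from the fig. 5 discussion; occupied_blocks, centroids and the motion interface are assumed placeholders, and the multi-block case assumes three detected centroids, as in fig. 5.

```python
import math

def align_with_left_edge(robot, occupied_blocks, centroids):
    """Step 3: one occupied block means the obstacle occupies a small
    area, so treat the edge as regular (slope 0) and turn 90 degrees
    to the right; several blocks mean the average slope is extracted."""
    if len(occupied_blocks) <= 1:
        robot.rotate(-math.pi / 2)  # hard right turn past the obstacle
    else:
        theta = wall_angle(*centroids[:3])  # three centroids, as in fig. 5
        robot.rotate(turn_to_parallel(theta, edge_in_front=False))
```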
4. the position of the robot is corrected through the generated map, and the data produced during edge navigation can in turn further refine the earlier map.
Those of ordinary skill in the art will appreciate that: the discussion of any of the embodiments above is merely exemplary and is not intended to suggest that the scope of the disclosure, including the claims, is limited to these examples; combinations of features of the above embodiments or in different embodiments are also possible within the spirit of the present disclosure, steps may be implemented in any order, and there are many other variations of the different aspects of one or more embodiments described above which are not provided in detail for the sake of brevity.
Additionally, well-known power/ground connections to Integrated Circuit (IC) chips and other components may or may not be shown within the provided figures, in order to simplify the illustration and discussion, and so as not to obscure one or more embodiments of the present description. Furthermore, the apparatus may be shown in block diagram form in order to avoid obscuring the one or more embodiments of the present description, and also in view of the fact that specifics with respect to implementation of such block diagram apparatus are highly dependent upon the platform within which the one or more embodiments of the present description are to be implemented (i.e., such specifics should be well within purview of one skilled in the art). Where specific details (e.g., circuits) are set forth in order to describe example embodiments of the disclosure, it should be apparent to one skilled in the art that one or more embodiments of the disclosure can be practiced without, or with variation of, these specific details. Accordingly, the description is to be regarded as illustrative in nature and not as restrictive.
While the present disclosure has been described in conjunction with specific embodiments thereof, many alternatives, modifications, and variations of those embodiments will be apparent to those skilled in the art in light of the foregoing description.
The present disclosure is intended to embrace all such alternatives, modifications and variances which fall within the broad scope of the appended claims. Any omissions, modifications, equivalents, improvements, and the like, which are within the spirit and principles of the one or more embodiments of the disclosure, are therefore intended to be included within the scope of the disclosure.

Claims (3)

1. An edge navigation method based on environment map correction is characterized by comprising the following steps:
A. after the initial position of the robot is given, the edge nearest to the robot is found, and the robot moves to within a preset distance of the edge;
B. the controller controls the robot to realize edge navigation through processing the information acquired by the environment sensing module, and dynamically detects the distance between the obstacle and the robot;
C. correcting the position error of the mobile robot in the edge navigation through the environment perception map so as to realize the accurate positioning of the mobile robot;
the specific operation of finding the edge nearest to the robot in step A is as follows:
A1, the initial position of the robot is given, and walking along the left edge or the right edge is selected;
A2, if the robot walks along the left edge, the environment sensing modules on the front side and the left side are started to sense the area in front of and to the left of the robot;
A3, if there is no edge or obstacle in front of or to the left of the robot, the robot rotates once in place; if the environment sensing module still finds no edge or obstacle around the robot, the robot walks forward for a preset distance;
A4, if the robot still detects no edge or obstacle while moving forward for the preset distance, the robot turns left by a preset angle and continues to move forward for the preset distance;
A5, step A4 is repeated until the robot finds the edge and comes within the preset distance of the edge;
the specific operation of the edge navigation in step B is as follows:
B1, the left, front, right and rear detection ranges of the mobile robot are each divided into three blocks; if there is an edge or obstacle behind the robot, the detection range is enlarged and the environment of the edge or obstacle is judged: if a junction between multiple segments of the edge or obstacle is detected, the robot seeks the wall and travels along it, and if no such junction is detected, the robot keeps straight; if there is no edge or obstacle behind the robot, the environment sensing module judges whether an edge or obstacle exists in each partitioned area, and the centroid of the edge or obstacle is extracted;
B2, the centroids of the edge or obstacle are averaged relative to the center of the robot to obtain the average slope from the robot center to the edge, the average slope being calculated as

k_AB = (y_B - y_A) / (x_B - x_A), k_BC = (y_C - y_B) / (x_C - x_B), k_WALL = (k_AB + k_BC) / 2,

where the coordinate positions of the obstacles A, B and C detected by the sensor are (x_A, y_A), (x_B, y_B) and (x_C, y_C);
B3, the angle given by the average slope in step B2 is taken as part of the robot's turn; from the average slope, the angle through which the robot must rotate to be parallel to the edge is obtained, and the robot is rotated until it is parallel to the edge;
B4, when the relative distance between the robot and the obstacle is short, the travelling speed of the robot is reduced so that the obstacle detection range shrinks; otherwise, the advancing speed of the robot is increased accordingly, and after finding the obstacle closest to itself within the detection range, the robot calculates its avoidance direction.
2. The method as claimed in claim 1, characterized in that in step B3, obtaining from the average slope the angle through which the robot must rotate to be parallel to the edge, and rotating the robot until it is parallel to the edge, is specifically:
when an edge or obstacle exists in front, the robot rotates rightward by π/2 in addition to the angle given by the average slope, after which it is parallel to the edge body; when an edge or obstacle exists on the left side, the robot rotates rightward by the angle given by the average slope, after which it is parallel to the edge body.
3. An edge navigation device based on environment map correction, characterized by being configured to execute the method of claim 1 or 2, the device comprising a robot, a controller, an environment sensing module and an environment sensing map, wherein the controller is in signal connection with the environment sensing module and controls it to map the environment in which the robot is located, thereby obtaining the environment sensing map; the controller is in signal connection with the robot, controls the movement of the robot and processes the sensing equipment information; and the robot executes the control instructions of the controller;
the environment sensing module comprises a laser radar and an RGBD camera, wherein the laser radar is used for constructing an environment map where the robot is located, and the RGBD camera is used for detecting whether edges or obstacles exist around the robot and detecting the distance between the robot and the edges or obstacles;
the environment perception map is the map of the environment in which the robot is located, constructed by the laser radar in the environment sensing module;
the robot moves by means of differential wheels, RGBD cameras are placed equidistantly in the front, rear, left and right directions of the robot, and the laser radar is placed at the center of the robot.