WO2014003517A1 - Mobile robot and method for controlling the online complete coverage path of a mobile robot - Google Patents

Mobile robot and method for controlling the online complete coverage path of a mobile robot

Info

Publication number
WO2014003517A1
WO2014003517A1 PCT/KR2013/005823 KR2013005823W WO2014003517A1 WO 2014003517 A1 WO2014003517 A1 WO 2014003517A1 KR 2013005823 W KR2013005823 W KR 2013005823W WO 2014003517 A1 WO2014003517 A1 WO 2014003517A1
Authority
WO
WIPO (PCT)
Prior art keywords
mobile robot
cell
obstacle
robot
new
Prior art date
Application number
PCT/KR2013/005823
Other languages
English (en)
Korean (ko)
Inventor
이순걸
이채혁
Original Assignee
인텔렉추얼디스커버리 주식회사
Priority date
Filing date
Publication date
Application filed by 인텔렉추얼디스커버리 주식회사
Publication of WO2014003517A1

Classifications

    • G PHYSICS — G05 CONTROLLING; REGULATING — G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES — G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots — G05D1/02 Control of position or course in two dimensions — G05D1/021 specially adapted to land vehicles
    • G05D1/0274 — using internal positioning means using mapping information stored in a memory device
    • G05D1/0219 — with means for defining a desired trajectory ensuring the processing of the whole working surface
    • G05D1/024 — using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G05D1/0255 — using acoustic signals, e.g. ultrasonic signals
    • G05D1/027 — using internal positioning means comprising inertial navigation means, e.g. azimuth detector

Definitions

  • The present invention relates to a mobile robot and a method for controlling the movement path of the mobile robot, and more particularly, to an efficient method for controlling the movement path of the mobile robot by measuring the travelled distance.
  • Robots have been developed for industrial use and as part of factory automation, or have been used to collect information on behalf of humans in extreme environments that humans cannot tolerate. Recently, as research on such robots has become more active, robots for use at home as well as robots for cutting-edge space development have been developed. A typical example of such a home robot is a cleaning robot.
  • Mobile robots such as cleaning robots used in everyday life are relatively little affected by the working environment, but they have the problem that their efficiency deteriorates sharply as the working environment becomes wider.
  • An object of the present invention is to provide a method for controlling a robot's movement path that reduces duplication of movement paths in a working environment that is not known in advance and minimizes the movement of the mobile robot, thereby increasing overall working-time efficiency.
  • According to the present invention, a method for controlling the movement path of a robot comprises: installing a mobile robot in a workspace that is not recognized in advance; recognizing an obstacle with the mobile robot; dividing the workspace into a plurality of cells bounded at least by extension lines of the outer surfaces of virtual rectangles corresponding to the obstacle; selecting any one of the plurality of cells and driving the mobile robot along a path that minimizes the number of rotations of the mobile robot while traveling linearly inside the selected cell; terminating the driving when the mobile robot has covered all areas of the selected cell and forming a new cell when a new workspace is recognized at the terminated point; and driving the new cell, repeating the formation and driving of new cells until there is no new space left uncovered by the mobile robot at the point where the driving ends.
  • A mobile robot according to the present invention comprises: a body part installed in the workspace; a moving part installed at one side of the body part to move the body part; a peripheral sensing unit capable of recognizing obstacles in the workspace; and a control unit that divides the workspace into a plurality of cells bounded by extension lines of the outer surfaces of the obstacles, selects any one of the plurality of cells, and instructs the moving part to travel along a path that minimizes the number of rotations of the body part while traveling linearly inside the selected cell.
  • The driving ends when the body part has covered all areas of the selected cell; when the peripheral sensing unit recognizes a new workspace at the point where the body part stopped, the control unit forms a new cell and instructs the moving part to drive the newly formed cell, and the formation and driving of new cells are repeated until there is no new space left uncovered by the body part at the point where its driving ends.
  • the present invention not only reduces the energy consumption of the robot, but also has the advantage of increasing the life of the robot.
  • FIG. 1 is a block diagram of a mobile robot according to an embodiment of the present invention.
  • FIG. 2 is a flow chart of a mobile robot control method according to an embodiment of the present invention.
  • FIG. 3A is a view showing the mobile robot recognizing an obstacle according to the mobile robot control method according to an embodiment of the present invention.
  • FIG. 3B is a view showing how the mobile robot according to an embodiment of the present invention recognizes an obstacle that is not rectangular.
  • FIG. 4 is a view showing the mobile robot setting the cells according to the mobile robot control method according to an embodiment of the present invention.
  • FIG. 5 is a view showing the mobile robot moving in the first cell according to the mobile robot control method according to an embodiment of the present invention.
  • FIG. 6 is a view showing the mobile robot moving in the second cell after finishing the movement in the first cell, according to the mobile robot control method according to an embodiment of the present invention.
  • FIG. 7 is a view showing the mobile robot moving in the third cell after finishing the movement in the second cell, according to the mobile robot control method according to an embodiment of the present invention.
  • FIG. 8 is a view showing the entire coverage path driven by the mobile robot according to the mobile robot control method according to an embodiment of the present invention.
  • FIG. 1 is a block diagram of a mobile robot according to an embodiment of the present invention, and FIG. 2 is a flow chart of a mobile robot control method according to an embodiment of the present invention.
  • A mobile robot 1 includes a body part (not shown) forming its outer shape, a peripheral sensing unit 3 capable of recognizing obstacles in a workspace, and a moving part 4 and a control unit 2 provided at one side of the body part.
  • There is no limitation on the shape of the body part, but it is preferably formed with a width sufficient to cover the workspace.
  • The peripheral sensing unit 3 serves to detect the surrounding environment and obstacles and may be a non-contact sensor; it is described in detail later.
  • the moving part 4 may be coupled to the body part to move the body part.
  • the moving unit 4 may be a wheel coupled to the lower surface of the body portion, or may be a chain coupled to the wheel.
  • The control unit 2 divides the workspace in which the mobile robot 1 performs cleaning or similar work into a plurality of cells bounded by extension lines of the outer surfaces of the obstacles.
  • The control unit 2 then selects any one of the cells.
  • The control unit 2 instructs the moving part 4 to travel along a path that minimizes the number of rotations of the body part while traveling linearly inside the selected cell.
  • the mobile robot R is first installed in a workspace that is not recognized in advance.
  • the mobile robot R recognizes the obstacle (S10).
  • the obstacle may be furniture or the like that can be disposed indoors.
  • the mobile robot R recognizes not only the obstacle but also the surrounding environment, and the surrounding environment may be an edge wall defining a workspace.
  • The mobile robot R then sets points that can characterize the obstacle and the surrounding environment.
  • A point may be a point or a line defining the shape of the obstacle, and the points form the boundaries of the cells when the mobile robot R divides the workspace into a plurality of cells.
  • For a rectangular obstacle, the points may be the vertices or corner portions of the rectangle.
  • When the obstacle is not rectangular, the minimum rectangle surrounding the obstacle is virtually formed and treated as a virtual obstacle (see FIG. 3B).
  • The mobile robot R divides the workspace into a plurality of cells bounded at least by extension lines of the outer surfaces of the obstacles (S30).
  • Any one of the plurality of cells is selected (S40), and the mobile robot travels along a path that minimizes the number of rotations of the mobile robot while traveling linearly inside the selected cell (S50).
  • When the mobile robot covers all areas of the selected cell, the driving ends, and if a new workspace is recognized at that end point, a new cell is formed; the mobile robot R then drives the new cell, repeating the formation and driving of new cells until there is no new space left uncovered by the mobile robot at the point where the driving ends.
  • Here, covering means that the mobile robot R passes over the inside of a cell.
  • Covering the inside of a cell means that, viewed from above the cell, the mobile robot R passes over every space inside the cell at least once.
  • By covering the entire inside of each cell, the mobile robot R eventually covers the whole workspace without overlap or omission.
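The overall procedure of FIG. 2 (S10 to S50, followed by repeated cell formation at each end point) can be viewed as a sense-decompose-cover loop. The following Python sketch is illustrative only: the Cell representation, the sense_new_cells and cover_cell callables, and the largest-area selection rule are assumptions made for this example and are not taken from the patent text.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Cell:
    """Axis-aligned rectangular cell: (x0, y0) lower-left, (x1, y1) upper-right."""
    x0: float
    y0: float
    x1: float
    y1: float

    @property
    def area(self) -> float:
        return (self.x1 - self.x0) * (self.y1 - self.y0)


def online_coverage(sense_new_cells, cover_cell, start_pose):
    """Hypothetical online full-coverage loop in the spirit of S10-S50.

    sense_new_cells(pose) -> list[Cell]: cells recognized at the current pose,
        bounded by workspace walls and extension lines of obstacle edges.
    cover_cell(cell, pose) -> end_pose: drives a back-and-forth path inside the
        cell and returns the pose where coverage of that cell terminated.
    """
    covered = []                        # cells already driven
    pose = start_pose
    candidates = sense_new_cells(pose)  # S10-S30: recognize obstacles, form cells
    while candidates:
        # S40: pick one cell; preferring the larger area is one possible rule
        cell = max(candidates, key=lambda c: c.area)
        pose = cover_cell(cell, pose)   # S50: straight runs, minimal rotations
        covered.append(cell)
        # At the end point, look for new, not-yet-covered space; stop when none remains.
        candidates = [c for c in sense_new_cells(pose) if c not in covered]
    return covered
```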
  • FIG. 3 is a view showing the mobile robot recognizing the obstacles according to the mobile robot control method according to an embodiment of the present invention, and FIG. 4 is a view showing the mobile robot setting the cells.
  • the mobile robot R according to the mobile robot control method according to the present embodiment recognizes the surrounding environment and obstacles when placed in a workspace that has not been recognized in advance.
  • the mobile robot (R) recognizes the environment and obstacles using a non-contact sensor.
  • A non-contact sensor obtains information about a measurement target without touching it. One approach determines the position and shape of the target by emitting light, electromagnetic waves, ultrasonic waves, or a laser toward the target and measuring the reflected waves; another identifies the position and shape of the target using a camera. Since non-contact sensors are well-known technology, a known sensor is adopted in the present embodiment and its detailed description is omitted.
  • The mobile robot R recognizes the shape of the surroundings of the workspace using the non-contact sensor. In detail, the mobile robot R recognizes whether the geometric shape of the workspace is rectangular or curved and detects whether there are protrusions at the edges of the workspace.
  • The mobile robot R may recognize that the workspace in which it is located has a rectangular shape including at least a first wall W1 and a second wall W2.
  • That is, the mobile robot R can recognize that smooth walls are installed around it.
  • The mobile robot R recognizes the edge shape of the workspace because, in the early stage of a task such as cleaning the workspace, it moves along the edge of the workspace in order to cover the entire interior.
  • the mobile robot R recognizes the edge shape of the workspace and then recognizes the obstacles 10 and 20 disposed inside the workspace.
  • As an example, the obstacles consist of a first obstacle 10 and a second obstacle 20, each having a rectangular shape.
  • the mobile robot R may also rotate in place to direct the non-contact sensor into the working space.
  • the mobile robot R sets a point that can characterize the shape of the obstacles 10 and 20.
  • the point is a part that becomes a boundary of the cell when the mobile robot R divides the workspace into a plurality of cells, and may be a vertex and an outer circumferential surface of the obstacles 10 and 20.
  • the mobile robot R may emit light or ultrasonic waves to the first obstacle 10 to recognize that the first obstacle 10 is located between L1 and L2.
  • From the first obstacle 10, the mobile robot R sets as points the vertex P1, the horizontal edge 11, and the vertical edge 12 that it can see; similarly, for the second obstacle 20, the vertices, the horizontal edge 21, and the vertical edge 22 seen by the mobile robot R are set as points.
  • Based on the first and second walls W1 and W2 recognized as edges of the workspace and on the portions of the obstacles 10 and 20 recognized as points, the mobile robot R starts dividing the workspace into a plurality of cells.
  • the mobile robot R constructs an environment map in real time by recognizing the current position, obstacles, and environment of the robot using a non-contact sensor.
  • The mobile robot R is located at the lower left corner. From this position, the mobile robot R divides the workspace into a plurality of virtual cells bounded by the already recognized edges W1 and W2 of the workspace and by extension lines of the edges 11, 12, 21 and 22 of the obstacles 10 and 20.
  • Specifically, the workspace is divided into a first virtual cell Cp1, defined by an extension line of the vertical edge 12 of the first obstacle 10, the second wall W2 and the border of the workspace, and a second virtual cell Cp2, defined by an extension line of the horizontal edge 21 of the second obstacle 20, the first wall W1 and the border of the workspace.
  • the first virtual cell Cp1 is a rectangle having a horizontal length ap1 and a vertical length bp1
  • the second virtual cell Cp2 is a rectangle having a horizontal length a1 and a vertical length b1.
  • the mobile robot R selects any one of the first virtual cell Cp1 and the second virtual cell Cp2 and starts to move inside the selected cell.
  • the mobile robot R may be set to select a cell having a larger area among the two virtual cells.
  • the criterion for selecting one of the two cells by the mobile robot R is not limited thereto, and there may be other criteria.
  • the mobile robot R selects the second virtual cell Cp2 having a larger area among the first virtual cell Cp1 and the second virtual cell Cp2.
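Two steps described above lend themselves to short examples: replacing a non-rectangular obstacle with its minimal bounding rectangle (the virtual obstacle of FIG. 3B), and choosing the larger of the candidate cells Cp1 and Cp2. The sketch below assumes axis-aligned rectangles; the helper names and numeric values are hypothetical and purely illustrative.

```python
from typing import Iterable, Tuple

Point = Tuple[float, float]


def bounding_rectangle(vertices: Iterable[Point]) -> Tuple[float, float, float, float]:
    """Minimal axis-aligned rectangle enclosing an obstacle outline (cf. FIG. 3B)."""
    xs, ys = zip(*vertices)
    return min(xs), min(ys), max(xs), max(ys)   # x0, y0, x1, y1


def cell_area(cell: Tuple[float, float, float, float]) -> float:
    x0, y0, x1, y1 = cell
    return (x1 - x0) * (y1 - y0)


def select_cell(cells):
    """One possible selection rule: prefer the cell with the larger area."""
    return max(cells, key=cell_area)


# Example with the dimensions named in the text (values are made up for illustration):
cp1 = (0.0, 0.0, 3.0, 2.0)   # horizontal length ap1 = 3, vertical length bp1 = 2
cp2 = (0.0, 0.0, 5.0, 4.0)   # horizontal length a1 = 5,  vertical length b1 = 4
assert select_cell([cp1, cp2]) == cp2   # the larger cell Cp2 is chosen
```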
  • FIG. 5 is a view showing the mobile robot moving in the first cell according to the mobile robot control method according to an embodiment of the present invention, FIG. 6 is a view showing the mobile robot moving in the second cell after finishing the movement in the first cell, FIG. 7 is a view showing the mobile robot moving in the third cell after finishing the movement in the second cell, and FIG. 8 is a view showing the entire coverage path driven by the mobile robot.
  • Inside a cell, the mobile robot R drives along a path that lengthens its straight-line runs and reduces its number of rotations, taking into account the shape and width of the cell and the size of the mobile robot.
  • In order to measure the distance travelled by the mobile robot R, the mobile robot R is provided with an odometer, that is, a driving recorder that integrates the travel distance.
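The odometer mentioned above integrates the travelled distance. A common way to realize this on a wheeled robot is dead reckoning from wheel increments; the differential-drive integrator below is a generic sketch and is not prescribed by the patent, and the wheel-base parameter and update interface are assumptions made for the example.

```python
import math


class Odometer:
    """Dead-reckoning pose and travelled-distance integrator (generic sketch)."""

    def __init__(self, wheel_base: float):
        self.wheel_base = wheel_base      # distance between the two drive wheels
        self.x = self.y = self.theta = 0.0
        self.distance = 0.0               # total path length driven

    def update(self, d_left: float, d_right: float) -> None:
        """Integrate one pair of left/right wheel displacement increments."""
        d_center = 0.5 * (d_left + d_right)
        d_theta = (d_right - d_left) / self.wheel_base
        self.x += d_center * math.cos(self.theta + 0.5 * d_theta)
        self.y += d_center * math.sin(self.theta + 0.5 * d_theta)
        self.theta = (self.theta + d_theta) % (2.0 * math.pi)
        self.distance += abs(d_center)
```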
  • the mobile robot R travels inside the first cell C1.
  • The first cell C1 is the space in which the mobile robot R travels first; the second virtual cell Cp2, selected as the first travel space, is redefined as the first cell C1.
  • the mobile robot R sets a start point S1 and an end point E1 within the first cell C1.
  • The start point S1 is the lower left corner where the mobile robot R was initially installed.
  • The end point E1 is determined by simulating a path in which the mobile robot R starts from the start point and reduces the number of rotations while driving linearly.
  • The mobile robot R starts from the lower left of the first cell C1 and travels to the right along the first wall W1, which forms the lower boundary of the first cell C1. When the third wall W3 is detected, it rotates 90 degrees to face vertically upward.
  • The mobile robot R then proceeds upward by its own width, rotates 90 degrees to the left, and travels to the left again.
  • When a wall is sensed while traveling to the left, the mobile robot R again rotates to face upward and proceeds upward by its own width.
  • After moving upward, the mobile robot R travels to the right again; when it encounters a wall, it rotates upward and moves by its own width. By repeating this pattern, it densely covers the inside of the first cell C1.
  • Stepping upward by the width of the mobile robot R at the left and right ends of the first cell C1 prevents the travelled areas of the mobile robot R from overlapping.
  • However, when driving in contact with the upper end of the first cell C1, the path may slightly overlap the previously travelled area.
  • The end point E1 of the first cell C1 is either the upper left or the upper right corner of the rectangular first cell C1.
  • In FIG. 5, the end point E1 in the first cell C1 is shown as the upper right corner.
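The back-and-forth motion described for the first cell C1 is a boustrophedon sweep with a lane spacing equal to the robot width, and sweeping parallel to the longer side of a rectangular cell yields fewer 90-degree rotations. The waypoint generator below is an illustrative sketch under those assumptions, not the patent's control code; the function name and parameters are hypothetical.

```python
import math


def boustrophedon_waypoints(width: float, height: float, robot_width: float):
    """Waypoints for a back-and-forth sweep of a width x height rectangular cell.

    Sweeps parallel to the longer side so the number of 90-degree turns
    (two per lane change) stays small; lanes are one robot width apart.
    """
    sweep_along_x = width >= height            # run the long, straight legs along x
    long_run = width if sweep_along_x else height
    lateral = height if sweep_along_x else width

    n_lanes = max(1, math.ceil(lateral / robot_width))
    waypoints = []
    for lane in range(n_lanes):
        # Keep the last lane inside the cell; it may overlap the previous lane,
        # as noted for the upper edge of C1 in the description above.
        offset = min(lane * robot_width + robot_width / 2, lateral - robot_width / 2)
        a, b = (0.0, long_run) if lane % 2 == 0 else (long_run, 0.0)
        if sweep_along_x:
            waypoints += [(a, offset), (b, offset)]
        else:
            waypoints += [(offset, a), (offset, b)]
    return waypoints


# Example: a 4 m x 2 m cell covered by a 0.3 m wide robot.
path = boustrophedon_waypoints(4.0, 2.0, 0.3)
turns = 2 * (len(path) // 2 - 1)    # two 90-degree rotations per lane change
```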
  • a virtual cell is created for a new space recognized at the end point.
  • When the end point E1 is at the upper right, a new space defined by the horizontal length a2 and the vertical length b2 above it is recognized, and this new space is defined as the virtual cell Cp3.
  • If the end point were instead at the upper left, the mobile robot R would likewise recognize a new space, defined by a horizontal length and a vertical length above it, as a virtual cell, referred to here as Cp4. In this embodiment, it is assumed that the end point E1 of the first cell C1 is at the upper right.
  • In general, at an end point the mobile robot R may recognize as a virtual cell a space defined by the boundary with the area it has already travelled, extension lines of the outer surfaces of the obstacles, and the border of the workspace.
  • Since the mobile robot R recognizes only one virtual cell, the third virtual cell Cp3, at the end point of the first cell C1, it selects the third virtual cell Cp3 as the second cell C2 and drives it.
  • The second cell C2 is a rectangle of horizontal length a2 and vertical length b2, bounded by an extension line of the left vertical edge 24 of the second obstacle 20, the upper end of the first cell C1, and the walls bordering the workspace.
  • the manner in which the mobile robot R travels in the second cell C2 is the same as in the first cell C1.
  • The end point E2 of the second cell C2 may be the upper left corner of the second cell C2, and the third cell C3, a new space recognized at the end point E2, may be determined as the next travel space.
  • The driving of the mobile robot R in the third cell C3 is the same as the driving in the first cell C1 and the second cell C2; by repeating the driving and the setting of virtual cells at the end points, the mobile robot eventually covers the whole workspace, as shown in FIG. 8.
  • According to the present invention, even when the mobile robot performs the same work, the work can be completed in a shorter time, which improves work efficiency.
  • the present invention not only reduces the energy consumption of the robot, but also has the advantage of increasing the life of the robot.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Optics & Photonics (AREA)
  • Electromagnetism (AREA)
  • Acoustics & Sound (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The present invention relates to a method for controlling the movement path of a robot. The method for controlling the movement path of a robot according to the present invention comprises the steps of: installing a mobile robot in a workspace that has not been recognized in advance; allowing the mobile robot to recognize an obstacle; dividing the workspace into at least a plurality of cells having, as boundaries, extension lines of an outer surface of the obstacle; selecting one cell from among the plurality of cells and allowing the mobile robot to travel linearly inside the selected cell along a path that minimizes the number of rotations of the mobile robot; stopping the travel of the mobile robot when the mobile robot has covered all areas of the selected cell and forming a new cell if a new workspace is recognized at the stopping point; and traveling in the new cell and repeating the formation of and travel in new cells until there are no new spaces that have not been covered by the mobile robot at the point at which the travel of the mobile robot was stopped. Thus, according to the present invention, the mobile robot can complete the same work in a short time, thereby achieving excellent work efficiency.
PCT/KR2013/005823 2012-06-29 2013-07-01 Mobile robot and method for controlling the online complete coverage path of a mobile robot WO2014003517A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020120071334A KR101372062B1 (ko) 2012-06-29 2012-06-29 이동로봇 및 이동로봇의 온라인 전역경로 커버 제어방법
KR10-2012-0071334 2012-06-29

Publications (1)

Publication Number Publication Date
WO2014003517A1 true WO2014003517A1 (fr) 2014-01-03

Family

ID=49783559

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2013/005823 WO2014003517A1 (fr) 2012-06-29 2013-07-01 Robot mobile et procédé de récupération et de commande de trajet en ligne entier de robot mobile

Country Status (2)

Country Link
KR (1) KR101372062B1 (fr)
WO (1) WO2014003517A1 (fr)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107843262A (zh) * 2017-10-30 2018-03-27 洛阳中科龙网创新科技有限公司 一种农用机械全覆盖运动路径规划的方法
CN108775902A (zh) * 2018-07-25 2018-11-09 齐鲁工业大学 基于障碍物虚拟膨胀的伴随机器人路径规划方法及系统
CN108981710B (zh) * 2018-08-07 2019-10-11 北京邮电大学 一种移动机器人的全覆盖路径规划方法
CN113124876B (zh) * 2021-04-20 2022-04-15 国家海洋技术中心 无人船在地形复杂海域遍历监测中路径优化方法及系统
CN114993308A (zh) * 2021-12-31 2022-09-02 丰疆智能(深圳)有限公司 导航路径的规划方法、装置及存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050113990A1 (en) * 1998-05-11 2005-05-26 Ehud Peless Area coverage with an autonomous robot
JP2004326692A (ja) * 2003-04-28 2004-11-18 Toshiba Tec Corp 自律走行ロボット
KR20090077547A (ko) * 2008-01-11 2009-07-15 삼성전자주식회사 이동 로봇의 경로 계획 방법 및 장치
KR20090104393A (ko) * 2008-03-31 2009-10-06 엘지전자 주식회사 로봇 청소기의 제어방법
KR20110085499A (ko) * 2010-01-20 2011-07-27 엘지전자 주식회사 로봇 청소기 및 이의 제어 방법

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016096304A1 (fr) * 2014-12-16 2016-06-23 Robert Bosch Gmbh Procédé de cartographie d'une surface à traiter pour véhicules robots autonomes
CN107003675A (zh) * 2014-12-16 2017-08-01 罗伯特·博世有限公司 用于自主机器人车辆测绘处理表面的方法
US10551844B2 (en) 2014-12-16 2020-02-04 Robert Bosch Gmbh Method for mapping a processing area for autonomous robot vehicles
US11844474B2 (en) 2015-04-24 2023-12-19 Avidbots Corp. Apparatus and methods for semi-autonomous cleaning of surfaces
AU2021203120B2 (en) * 2015-04-24 2023-06-22 Avidbots Corp. Apparatus and methods for semi-autonomous cleaning of surfaces
EP3564769A4 (fr) * 2016-12-29 2020-08-19 Zhuhai Amicro Semicoductor Co., Ltd. Procédé de planification d'itinéraire d'un robot intelligent
US20210397765A1 (en) * 2018-11-07 2021-12-23 Honda Motor Co., Ltd. Work area zone boundary demarcation apparatus of autonomously navigating work machine
CN110595478A (zh) * 2019-09-16 2019-12-20 北京华捷艾米科技有限公司 基于离线地图的机器人全覆盖路径规划方法、装置及设备
CN111329399A (zh) * 2020-04-09 2020-06-26 湖南格兰博智能科技有限责任公司 一种基于有限状态机的扫地机目标点导航方法
CN112817309B (zh) * 2020-12-30 2021-12-03 东南大学 一种几何折叠式机器人全覆盖路径及其生成方法
WO2022141737A1 (fr) * 2020-12-30 2022-07-07 东南大学 Trajet à couverture complète de robot du type à pliage géométrique et procédé de production associé
CN112817309A (zh) * 2020-12-30 2021-05-18 东南大学 一种几何折叠式机器人全覆盖路径及其生成方法
CN112644738B (zh) * 2021-01-19 2021-09-17 哈尔滨工业大学 一种行星着陆避障轨迹约束函数设计方法
CN112644738A (zh) * 2021-01-19 2021-04-13 哈尔滨工业大学 一种行星着陆避障轨迹约束函数设计方法

Also Published As

Publication number Publication date
KR101372062B1 (ko) 2014-03-07
KR20140003249A (ko) 2014-01-09

Similar Documents

Publication Publication Date Title
WO2014003517A1 (fr) Robot mobile et procédé de récupération et de commande de trajet en ligne entier de robot mobile
CN109144067B (zh) 一种智能清洁机器人及其路径规划方法
EP3505037B1 (fr) Robot de nettoyage et son procédé de commande
CA3033972C (fr) Systeme robotique et methode d'exploitation sur une piece de travail
KR100988736B1 (ko) 자율주행 이동로봇의 최단 경로 이동을 위한 홈 네트워크시스템 및 그 방법
CN109363585B (zh) 分区遍历方法、清扫方法及其扫地机器人
JP7262076B2 (ja) 移動ロボット、及び、制御方法
CN111596662B (zh) 一种沿全局工作区域一圈的判断方法、芯片及视觉机器人
JP4264009B2 (ja) 自走式掃除機
KR100877072B1 (ko) 이동 로봇을 위한 맵 생성 및 청소를 동시에 수행하는 방법및 장치
CN107340768A (zh) 一种智能机器人的路径规划方法
EP2870513B1 (fr) Robot mobile autonome et procédé pour son exploitation
KR101506738B1 (ko) 로봇청소기 및 그 구동방법
CN108422419A (zh) 一种智能机器人及其控制方法和系统
JP2002325708A (ja) ロボット掃除機とそのシステム及び制御方法
CN102968122A (zh) 一种用于移动平台在未知区域自建地图的覆盖方法
JP2007213236A (ja) 自律走行ロボットの経路計画方法及び自律走行ロボット
CN106155056A (zh) 自移动机器人行走方法与装置
JP7281707B2 (ja) 移動ロボット、及び、制御方法
KR20180070062A (ko) 이동체 및 이동체의 제어 방법
CN104729502A (zh) 基于蓝牙基站和激光传感器的机器人绘制地图和定位的方法及系统
US20160195875A1 (en) Autonomous mobile robot and method for operating the same
CN104848848A (zh) 基于无线基站和激光传感器的机器人绘制地图和定位的方法及系统
KR20160120841A (ko) 청소 로봇을 이용한 스마트 청소 시스템
CN105739499A (zh) 自主移动机器人避障系统的多路红外及超声波传感器分布结构

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 13810631; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 EP: PCT application non-entry in European phase (Ref document number: 13810631; Country of ref document: EP; Kind code of ref document: A1)