WO2009089369A1 - Point and go navigation system and method - Google Patents

Point and go navigation system and method

Info

Publication number
WO2009089369A1
WO2009089369A1 (PCT application PCT/US2009/030464, also referenced as US2009030464W)
Authority
WO
WIPO (PCT)
Prior art keywords
robotic vehicle
environment
sensor
target point
visual representation
Prior art date
Application number
PCT/US2009/030464
Other languages
English (en)
Inventor
Stephen C. Jacobsen
Marc Olivier
Original Assignee
Raytheon Sarcos, Llc
Priority date: 2008-01-08 (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Filing date: 2009-01-08
Publication date: 2009-07-16
Application filed by Raytheon Sarcos, Llc
Publication of WO2009089369A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0044 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0227 Control of position or course in two dimensions specially adapted to land vehicles using mechanical sensing means, e.g. for sensing treated area
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0251 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G05D1/0255 Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
    • G05D1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G05D1/0259 Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/027 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising inertial navigation means, e.g. azimuth detector
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/0278 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS

Definitions

  • The present invention relates to navigation control for robotic vehicles.
  • BACKGROUND OF THE INVENTION AND RELATED ART: Navigating a robotic vehicle remotely through an environment can be quite challenging due to a number of factors.
  • A complex robotic vehicle typically has a large number of operator controls available.
  • One control approach is the so-called replica master mode of operation.
  • The robot is slaved to a replica master, which provides the same joints as the robot.
  • The master communicates positions of its joints to the robot, which attempts to move to the same position.
  • While replica master mode is handy for some types of robot manipulation (e.g., telemanipulation), this type of control can be a tedious way to remotely navigate a semi-autonomous robot through varied terrain.
  • An alternate approach to robotic navigation is to preprogram the robot movements beforehand. While preprogramming is efficient for some types of robot manipulation (e.g., manufacturing), this type of control is not particularly suitable when preplanning is not possible, such as in exploratory operations.
  • The present invention includes a method and system for providing point and go navigation for remote control of a robotic vehicle.
  • A method of providing point and go navigation of a robotic vehicle within an environment is provided.
  • The method can include obtaining sensor information from the robotic vehicle and displaying a visual representation of the environment on an operator console.
  • The method can also include designating a target point within the visual representation of the environment based on operator input, and advancing the robotic vehicle toward a target location in the environment that corresponds to the target point. Movement of the robotic vehicle can be based on an automatic navigation response to the designated target point.
  • FIG. 1 is a schematic illustration of a system for providing point and go navigation of a robotic vehicle according to an exemplary embodiment of the present invention.
  • FIG. 2 is a block diagram of a remote operator console for point and go navigation in accordance with an embodiment of the present invention.
  • FIG. 3 is a flow chart of a method of providing point and go navigation for remote control of a robotic vehicle in accordance with another embodiment of the present invention.
  • FIG. 1 illustrates the system 10 as including a robotic vehicle 12 and a remote operator console 14.
  • The remote operator console is in bidirectional communication with the robotic vehicle via a communication link 16.
  • The communication link can be, for example, a wireless radio frequency link, a free space optical link, a free space ultrasonic link, a wired link, a fiber optic link, or a combination thereof.
  • The communication link may also include one or more relay nodes (not shown), for example to extend the range of communication between the remote operator console and the robotic vehicle.
  • A relay node may be mounted on a second robotic vehicle.
  • The robotic vehicle 12 includes at least one environmental sensor 18, such as a camera, a stereo camera, a sound sensor, an electromagnetic sensor, a chemical sensor, a radar, a lidar, a range finder, a scanning range finder, a sonar, a contact sensor, a sniff sensor, a Global Positioning System (GPS) receiver, an inertial measurement unit, an orientation sensor, or a combination thereof.
  • The sensor allows the robotic vehicle to obtain information regarding the environment in which it is operating.
  • The robotic vehicle transmits sensor information to the remote operator console 14 via the communication link 16.
  • The remote operator console 14 includes a display 20 that shows a visual representation of the environment based on sensor information received from the robotic vehicle 12.
  • The remote operator console includes an operator input function for defining a target point on the visual representation.
  • The operator input function can be, for example, a touch screen integrated with the display.
  • Alternatively, the operator input function can be buttons 23 or similar controls on the remote operator console. Navigation commands are transmitted from the remote operator console to the robotic vehicle via the communication link 16 to control movement of the robotic vehicle.
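  • The console-side flow just described can be made concrete with a minimal sketch (not part of the patent): a touch event yields a target point, which is packaged as a navigation command and sent over the communication link. The message format, class names, and UDP transport below are illustrative assumptions only.

```python
# Minimal sketch of a console-side "point and go" handler. All names
# (TargetPoint, CommLink, on_touch) and the JSON-over-UDP message format
# are invented for illustration; the patent defines no such interfaces.
import json
import socket
from dataclasses import dataclass

@dataclass
class TargetPoint:
    u: int          # horizontal pixel coordinate on the display
    v: int          # vertical pixel coordinate on the display
    frame_id: int   # which sensor frame the operator was looking at

class CommLink:
    """Toy wrapper for the bidirectional console<->vehicle link (16)."""
    def __init__(self, vehicle_addr=("192.168.0.10", 9000)):
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.vehicle_addr = vehicle_addr

    def send_json(self, message: dict) -> None:
        self.sock.sendto(json.dumps(message).encode("utf-8"), self.vehicle_addr)

def on_touch(link: CommLink, target: TargetPoint) -> None:
    # Called by the touch-screen handler; forwards the designated point.
    link.send_json({"type": "goto_point",
                    "u": target.u, "v": target.v,
                    "frame_id": target.frame_id})

# Example usage (commented out; requires a reachable vehicle address):
# on_touch(CommLink(), TargetPoint(420, 380, frame_id=7))
```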
  • An exemplary embodiment of the operator console 14 is shown in further detail in FIG. 2.
  • The operator console includes a receiver 22 for receiving sensor data from the robotic vehicle 12 (FIG. 1).
  • The sensor data is provided to the display 20.
  • The operator console also includes an operator input function 21 to allow designation of a target point.
  • The display can include a touch sensitive screen that allows the operator to designate a target point by touching the display.
  • Alternatively, the remote operator console can include a keyboard, cursor controls, or another operator input device.
  • For example, an electronic cursor may be moved on the display to designate a target point.
  • Alternatively, an operator may enter coordinates using a keyboard.
  • The designated target point on the display 20 corresponds to a target location within the environment in which the robotic vehicle 12 is operating.
  • For example, suppose the environment includes a yellow fire hydrant.
  • The yellow fire hydrant may be designated as the target location by touching the portion of the image corresponding to the yellow fire hydrant on the display.
  • The remote operator console 14 may then transmit navigation commands to the robotic vehicle 12 using the transmitter 24 to cause the robotic vehicle to advance toward the designated target location.
  • Navigation of the robotic vehicle 12 toward the designated target location can be performed in various ways.
  • For example, coordinates corresponding to the designated target location can be transmitted to the robotic vehicle.
  • The coordinates may be defined relative to the position on the display and translated into an actual location within the environment by the robotic vehicle.
  • Alternatively, the coordinates may be defined relative to reference points within the environment or another reference datum, with coordinate translation provided by the remote operator console.
  • Absolute coordinates may be defined, for example, by combining the displayed sensor data with external information such as maps, GPS position fixes, or the like.
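  • One way to make the translation from a designated display point to a location in the environment concrete is sketched below, assuming a pinhole camera model and a locally flat ground plane. The patent does not prescribe a particular projection method, and the camera intrinsics used here are invented for the example.

```python
# Sketch: cast a ray through the designated pixel and intersect it with a
# flat ground plane to get a target location relative to the vehicle.
import numpy as np

# Rotation taking camera axes (x right, y down, z forward) to world axes
# (x forward, y left, z up); assumes the camera looks straight ahead.
R_CAM_TO_WORLD = np.array([[0.0, 0.0, 1.0],
                           [-1.0, 0.0, 0.0],
                           [0.0, -1.0, 0.0]])

def pixel_to_ground(u, v, K, R=R_CAM_TO_WORLD, cam_height=0.5):
    """Return the (x, y) ground point hit by the ray through pixel (u, v), or None."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # ray in camera coordinates
    ray_world = R @ ray_cam                             # rotate into world coordinates
    if ray_world[2] >= 0.0:                             # ray does not point toward the ground
        return None
    t = cam_height / -ray_world[2]                      # scale so the ray drops cam_height
    hit = t * ray_world
    return float(hit[0]), float(hit[1])

# Example intrinsics for an 800x600 image; the designated pixel lies below the horizon.
K = np.array([[700.0, 0.0, 400.0],
              [0.0, 700.0, 300.0],
              [0.0, 0.0, 1.0]])
print(pixel_to_ground(420, 380, K))   # roughly 4.4 m ahead, slightly to the right
```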
  • The display may also be augmented with additional environmental information obtained independently of the robotic vehicle, such as an electronic map, environmental sensors, or sensor information from a second robotic vehicle operating within the environment.
  • For example, a second environmental sensor can be mounted on a second robotic vehicle.
  • The second environmental sensor may include, for example, a camera, a stereo camera, a sound sensor, an electromagnetic sensor, a chemical sensor, a radar, a lidar, a range finder, a scanning range finder, a sonar, a contact sensor, a sniff sensor, a Global Positioning System (GPS) receiver, or a combination thereof.
  • The target location may also be designated using an operator-defined displacement from the target point.
  • For example, the fire hydrant may be designated using the touch screen, with additional operator input provided to specify that the target location is 10 feet to the left of the fire hydrant.
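  • A small sketch of that operator-defined displacement follows, under the assumption that the landmark's position and the vehicle heading have already been estimated; the helper name and frame conventions are illustrative, not part of the disclosure.

```python
# Offset the target 10 feet to the vehicle's left of a designated landmark.
import math

FEET_TO_METERS = 0.3048

def offset_left(landmark_xy, vehicle_heading_rad, offset_feet):
    """Return a target point displaced to the vehicle's left of the landmark."""
    lx, ly = landmark_xy
    d = offset_feet * FEET_TO_METERS
    # Unit vector pointing 90 degrees to the left of the vehicle's heading.
    left = (math.cos(vehicle_heading_rad + math.pi / 2),
            math.sin(vehicle_heading_rad + math.pi / 2))
    return (lx + d * left[0], ly + d * left[1])

# Landmark 5 m ahead of a vehicle heading along +x; target is 10 ft to its left.
print(offset_left((5.0, 0.0), 0.0, 10.0))   # -> (5.0, 3.048)
```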
  • Navigation commands to the robotic vehicle may alternately include a series of movement commands.
  • For example, the remote operator console may include processing to determine a route toward the target point, and then transmit incremental movement commands to the robotic vehicle to navigate toward the target point.
  • One relatively straightforward automatic navigation approach is to turn the vehicle in a direction corresponding to the designated target and then move forward in that direction.
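  • That turn-then-go behavior can be sketched as a simple proportional heading controller; the gains, thresholds, and velocity-command interface below are assumptions for illustration, not part of the disclosure.

```python
# One control step of a "turn toward the target, then drive forward" behavior.
import math

def turn_and_go(vehicle_xy, vehicle_heading, target_xy,
                k_turn=1.5, forward_speed=0.5, heading_tol=0.1, stop_dist=0.3):
    """Return (linear_velocity, angular_velocity) for one control step."""
    dx = target_xy[0] - vehicle_xy[0]
    dy = target_xy[1] - vehicle_xy[1]
    distance = math.hypot(dx, dy)
    if distance < stop_dist:                       # close enough: stop
        return 0.0, 0.0
    bearing = math.atan2(dy, dx)
    error = math.atan2(math.sin(bearing - vehicle_heading),
                       math.cos(bearing - vehicle_heading))  # wrap to [-pi, pi]
    if abs(error) > heading_tol:                   # first turn in place
        return 0.0, k_turn * error
    return forward_speed, k_turn * error           # then drive forward

print(turn_and_go((0.0, 0.0), 0.0, (4.4, -0.1)))
```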
  • The display may include processing to automatically determine a planned route from the current position of the robotic vehicle toward the target location. Processing may include object identification using image-processing techniques so that obstacles within the environment can be identified and avoided.
  • The environmental sensor may include range finding, allowing for improved obstacle identification and navigation.
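  • As one illustration of such route planning (not prescribed by the patent), a planned route can be computed with A* search over an occupancy grid built from range-finder returns; the grid representation and heuristic below are assumptions for the sketch.

```python
# A* route planning on a small occupancy grid (0 = free, 1 = obstacle).
import heapq

def plan_route(grid, start, goal):
    """start/goal are (row, col) tuples; returns a list of cells or None."""
    rows, cols = len(grid), len(grid[0])
    def h(a):  # Manhattan-distance heuristic
        return abs(a[0] - goal[0]) + abs(a[1] - goal[1])
    open_set = [(h(start), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, node, parent = heapq.heappop(open_set)
        if node in came_from:
            continue
        came_from[node] = parent
        if node == goal:                       # reconstruct path back to start
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), node))
    return None                                # no route found

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
print(plan_route(grid, (0, 0), (2, 0)))        # route around the obstacle row
```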
  • Alternatively, operator input can identify a target path, for example by tracing a curve on the touch sensitive screen to define the target path.
  • The robotic vehicle can then be automatically navigated to follow a travel path in the environment correlated with the target path.
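  • A travel path correlated with a traced target path can be followed, for example, as a sequence of waypoints (obtained, say, by projecting points of the traced curve into the environment). The callbacks and tolerances in this sketch are assumed interfaces, not the patent's.

```python
# Drive through an ordered list of waypoints derived from the traced curve.
import math

def follow_path(get_pose, send_velocity, waypoints, reach_tol=0.3):
    """
    get_pose()          -> (x, y, heading) of the vehicle (assumed callback)
    send_velocity(v, w) -> issues a velocity command (assumed callback)
    waypoints           -> list of (x, y) points along the travel path
    """
    for wx, wy in waypoints:
        while True:
            x, y, heading = get_pose()
            if math.hypot(wx - x, wy - y) < reach_tol:
                break                                  # waypoint reached, go to next
            bearing = math.atan2(wy - y, wx - x)
            error = math.atan2(math.sin(bearing - heading),
                               math.cos(bearing - heading))
            send_velocity(0.4 if abs(error) < 0.2 else 0.0, 1.2 * error)
    send_velocity(0.0, 0.0)                            # stop at the end of the path
```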
  • The display can be updated as new sensor data is received.
  • The target point may be re-designated by the operator. For example, navigation may be controlled incrementally: designating a target point, navigating the robotic vehicle toward the target point, and re-designating the target point as the robotic vehicle moves.
  • Some robotic vehicles are capable of multiple movement modes.
  • Commonly owned co-pending U.S. Provisional Patent Application Serial No. 60/858,917, filed November 13, 2006, and entitled "Serpentine Robotic Crawler" describes one exemplary unmanned ground robotic vehicle having multiple movement modes that may be used in embodiments of the present invention, and is incorporated herein by reference.
  • The operator may also identify transition points on the target path where a movement mode or other behavior of the robotic vehicle is to change. When the robotic vehicle reaches the point in the environment corresponding to the transition point, the robotic vehicle may change from a first mode of operation to a second mode of operation.
  • For example, the robotic vehicle described above includes a stair climbing mode, a tank configuration for movement on a relatively flat surface, and an inside surface climbing mode.
  • A desired path may include movement across a floor area, up a set of stairs, and into a vent duct. Transition points may be designated at the beginning of the stairs, at the top of the stairs, and at the entry to the vent duct.
  • The robotic vehicle may then switch between appropriate movement modes as each different area is entered.
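  • A brief sketch of transition points paired with movement modes follows; the mode names echo the examples above, while the coordinates, data structure, and distance threshold are invented for illustration.

```python
# Switch the movement mode when the vehicle reaches a designated transition point.
import math

TRANSITIONS = [
    ((3.0, 0.0), "stair_climbing"),     # beginning of the stairs
    ((3.0, 4.0), "tank"),               # top of the stairs
    ((6.0, 4.0), "surface_climbing"),   # entry to the vent duct
]

def update_mode(vehicle_xy, current_mode, transitions=TRANSITIONS, reach_tol=0.5):
    """Return the movement mode to use, switching when a transition point is reached."""
    for (tx, ty), mode in transitions:
        if math.hypot(tx - vehicle_xy[0], ty - vehicle_xy[1]) < reach_tol:
            return mode
    return current_mode

print(update_mode((2.9, 0.1), "tank"))   # -> "stair_climbing"
```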
  • The robotic vehicle's mode of operation may also be adjusted automatically based on characteristics of the environment obtained from the sensor information. For example, image recognition software may recognize certain types of obstacles, and the movement mode may be changed automatically.
  • FIG. 3 illustrates a method of providing point and go navigation.
  • The method 30 includes obtaining 32 sensor information describing an environment in which the robotic vehicle is operating. For example, sensor information may be obtained using sensors mounted on the robotic vehicle as described above.
  • The method also includes communicating 34 the sensor information to a remotely located operator console. For example, a communication link as described above can be used.
  • The method also includes displaying 36 a visual representation of the environment on the operator console, the visual representation being based on the sensor information.
  • The method also includes designating 38 a target point within the visual representation based on operator input. For example, operator input may be via a touch screen or other input device as described above.
  • The method also includes advancing 40 the robotic vehicle toward a target location in the environment corresponding to the target point. Advancing the robotic vehicle can be based on an automatic navigation response to the designated target point, for example as described above.
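  • Tying the steps together, a minimal end-to-end sketch of method 30 (steps 32-40) might look like the following; every class and callback here is a stand-in, since the patent defines no APIs.

```python
# End-to-end sketch: sense (32), hand the data to the console (34-38 via a
# designate_target callback), and advance toward the designated location (40).
import math

class Vehicle:
    def __init__(self):
        self.x, self.y, self.heading = 0.0, 0.0, 0.0

    def sense(self):                                  # step 32: obtain sensor info
        return {"pose": (self.x, self.y, self.heading), "image": None}

    def advance_toward(self, target_xy, step=0.5):    # step 40: automatic navigation
        dx, dy = target_xy[0] - self.x, target_xy[1] - self.y
        dist = math.hypot(dx, dy)
        if dist > 1e-6:
            self.heading = math.atan2(dy, dx)         # turn toward the target
            move = min(step, dist)
            self.x += move * math.cos(self.heading)   # then move forward
            self.y += move * math.sin(self.heading)
        return dist <= step                           # True when the target is reached

def point_and_go(vehicle, designate_target, max_steps=100):
    for _ in range(max_steps):
        sensor_info = vehicle.sense()                 # step 32
        target_xy = designate_target(sensor_info)     # steps 34-38 (console side)
        if target_xy is None:
            break                                     # operator has not designated a point
        if vehicle.advance_toward(target_xy):         # step 40
            break

v = Vehicle()
point_and_go(v, lambda info: (4.0, 3.0))
print(round(v.x, 2), round(v.y, 2))                   # -> 4.0 3.0
```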
  • Point and go navigation in accordance with embodiments of the present invention can ease the difficulty of controlling a robotic vehicle.
  • An operator can point to a desired target point on a display screen, and the robotic vehicle will automatically go to the designated location.
  • Processing within either the remote operator console or the robotic vehicle can translate a designated target point or path into navigation commands for the robotic vehicle.
  • Additionally, transition points can be designated, and the robotic vehicle operation mode changed at the transition points.
  • Applications of such a navigation technique may include search and rescue, military operations, and industrial operations.
  • The term "preferably" is non-exclusive where it is intended to mean "preferably, but not limited to." Any steps recited in any method or process claims may be executed in any order and are not limited to the order presented in the claims. Means-plus-function or step-plus-function limitations will only be employed where, for a specific claim limitation, all of the following conditions are present: a) "means for" or "step for" is expressly recited in that limitation; b) a corresponding function is expressly recited in that limitation; and c) structure, material, or acts that support that function are described within the specification. Accordingly, the scope of the invention should be determined solely by the appended claims and their legal equivalents, rather than by the descriptions and examples given above.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to a remote operator console that provides point and go navigation of a robotic vehicle. The remote operator console provides a display for a visual representation of the environment in which the robotic vehicle is operating, based on sensor information received from the robotic vehicle. An operator can designate a target point on the display. The robotic vehicle is automatically directed toward a location in the environment corresponding to the designated target point.
PCT/US2009/030464 2008-01-08 2009-01-08 Point and go navigation system and method WO2009089369A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US1050108P 2008-01-08 2008-01-08
US61/010,501 2008-01-08

Publications (1)

Publication Number Publication Date
WO2009089369A1 true WO2009089369A1 (fr) 2009-07-16

Family

ID=40548976

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2009/030464 WO2009089369A1 (fr) 2008-01-08 2009-01-08 Point and go navigation system and method

Country Status (1)

Country Link
WO (1) WO2009089369A1 (fr)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012021192A3 (fr) * 2010-05-12 2012-08-23 Irobot Corporation Remote vehicle control system and method
US8774981B2 (en) 2009-09-14 2014-07-08 Israel Aerospace Industries Ltd. Infantry robotic porter system and methods useful in conjunction therewith
CN104049634A (zh) * 2014-07-02 2014-09-17 燕山大学 Fuzzy dynamic obstacle avoidance method for intelligent agents based on the Camshift algorithm
CN105116785A (zh) * 2015-06-26 2015-12-02 北京航空航天大学 Universal multi-platform remote robot control system
CN106444756A (zh) * 2016-09-22 2017-02-22 纳恩博(北京)科技有限公司 Information processing method and electronic device
JP2017054475A (ja) * 2015-09-11 2017-03-16 富士ゼロックス株式会社 Remote operation device, method, and program
WO2017063652A1 (fr) * 2015-10-12 2017-04-20 Intelligent Marking Aps Self-propelled robot unit comprising a navigation system and movement members for positioning the robot unit on a level base
CN109263641A (zh) * 2017-07-12 2019-01-25 罗伯特·博世有限公司 Method and device for locating and automatically operating a vehicle
WO2019047643A1 (fr) * 2017-09-05 2019-03-14 百度在线网络技术(北京)有限公司 Control method and device for an unmanned aerial vehicle
CN110570687A (zh) * 2018-06-06 2019-12-13 杭州海康机器人技术有限公司 AGV control method, apparatus, and storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010037163A1 (en) * 2000-05-01 2001-11-01 Irobot Corporation Method and system for remote control of mobile robot
EP1510896A1 (fr) * 2002-05-31 2005-03-02 Fujitsu Limited Remotely-controlled robot and method for automatically identifying the position of the robot
US20070156286A1 (en) * 2005-12-30 2007-07-05 Irobot Corporation Autonomous Mobile Robot

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8774981B2 (en) 2009-09-14 2014-07-08 Israel Aerospace Industries Ltd. Infantry robotic porter system and methods useful in conjunction therewith
US9658615B2 (en) 2010-05-12 2017-05-23 Irobot Defense Holdings, Inc. Remote vehicle control system and method
GB2494807A (en) * 2010-05-12 2013-03-20 Irobot Corp System and method to remotely control a vehicle
US8954194B2 (en) 2010-05-12 2015-02-10 Irobot Corporation Remote vehicle control system and method
GB2523484A (en) * 2010-05-12 2015-08-26 Irobot Corp Remote vehicle control system and method
GB2494807B (en) * 2010-05-12 2015-11-18 Irobot Corp System and method to remotely control a vehicle
WO2012021192A3 (fr) * 2010-05-12 2012-08-23 Irobot Corporation Remote vehicle control system and method
GB2523484B (en) * 2010-05-12 2016-04-13 Irobot Corp Remote vehicle control system and method
CN104049634A (zh) * 2014-07-02 2014-09-17 燕山大学 Fuzzy dynamic obstacle avoidance method for intelligent agents based on the Camshift algorithm
CN104049634B (zh) * 2014-07-02 2017-02-01 燕山大学 Fuzzy dynamic obstacle avoidance method for intelligent agents based on the Camshift algorithm
CN105116785B (zh) * 2015-06-26 2018-08-24 北京航空航天大学 Universal multi-platform remote robot control system
CN105116785A (zh) * 2015-06-26 2015-12-02 北京航空航天大学 Universal multi-platform remote robot control system
JP2017054475A (ja) * 2015-09-11 2017-03-16 富士ゼロックス株式会社 Remote operation device, method, and program
WO2017063652A1 (fr) * 2015-10-12 2017-04-20 Intelligent Marking Aps Self-propelled robot unit comprising a navigation system and movement members for positioning the robot unit on a level base
DK178897B1 (da) * 2015-10-12 2017-05-08 Intelligent Marking Aps Self-propelled robot unit comprising a navigation system and movement members for positioning the robot unit on a level surface
US11467601B2 (en) 2015-10-12 2022-10-11 Intelligent Marking Aps Self-propelled robot unit including a navigation system and movement items for positioning of the robot unit on an even base
CN106444756A (zh) * 2016-09-22 2017-02-22 纳恩博(北京)科技有限公司 Information processing method and electronic device
CN109263641A (zh) * 2017-07-12 2019-01-25 罗伯特·博世有限公司 Method and device for locating and automatically operating a vehicle
CN109263641B (zh) * 2017-07-12 2023-04-07 罗伯特·博世有限公司 Method and device for locating and automatically operating a vehicle
WO2019047643A1 (fr) * 2017-09-05 2019-03-14 百度在线网络技术(北京)有限公司 Control method and device for an unmanned aerial vehicle
CN110570687A (zh) * 2018-06-06 2019-12-13 杭州海康机器人技术有限公司 AGV control method, apparatus, and storage medium

Similar Documents

Publication Publication Date Title
US8392036B2 (en) Point and go navigation system and method
WO2009089369A1 (fr) Système de navigation pointer-et-aller, et procédé
US11294060B2 (en) System and method for lidar-based vehicular localization relating to autonomous navigation
AU2019210565B2 (en) Moving robot, method for controlling moving robot, and moving robot system
US10598505B2 (en) Travel route generation apparatus and method for generating travel route
US7539557B2 (en) Autonomous mobile robot
KR101202695B1 Autonomous mobile device
KR101703177B1 Vehicle position recognition apparatus and method
JP5647905B2 Control method for an unmanned mobile object
US20120072052A1 (en) Navigation Portals for a Remote Vehicle Control User Interface
JP5382770B2 Unmanned mobile object system
JP6982999B2 Route determination device and route determination method
US11351986B2 (en) In-vehicle processing apparatus
AU2015238339B2 (en) Navigation system
JP2009205652A Mobile object control system and mobile object control method
US20190258265A1 (en) Method for Operating A Partially Autonomous or Autonomous Motor Vehicle, and Motor Vehicle
JP6910023B2 Control method for an unmanned mobile object
KR101146942B1 Path generation apparatus for a mobile robot, mobile robot having the same, and path generation method for a mobile robot
JP2010152833A Unmanned mobile object system
US20220019224A1 (en) Mobile body, method of controlling mobile body, and program
KR102089705B1 Mission control method for a mission-performing unmanned aerial vehicle
JP5969903B2 Control method for an unmanned mobile object
JP6947563B2 Control device and control method for a mobile robot
KR101635585B1 Remote control device for an unmanned mobile object
JP6934760B2 Traveling device, traveling control system, traveling control method, and program

Legal Events

Date Code Title Description

121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 09700561; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 EP: PCT application non-entry in European phase (Ref document number: 09700561; Country of ref document: EP; Kind code of ref document: A1)