JP2006035381A - Control equipment for mobile robot - Google Patents

Control equipment for mobile robot

Info

Publication number
JP2006035381A
JP2006035381A
Authority
JP
Japan
Prior art keywords
robot
opponent
mobile robot
control equipment
control device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2004219576A
Other languages
Japanese (ja)
Other versions
JP4315872B2 (en)
Inventor
Masanori Takeda
政宣 武田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honda Motor Co Ltd
Priority to JP2004219576A
Publication of JP2006035381A
Application granted
Publication of JP4315872B2
Anticipated expiration
Status: Expired - Fee Related

Landscapes

  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a control device for a mobile robot that can move smoothly, without colliding with pedestrians, in an environment where an unspecified number of people are moving about.

SOLUTION: The control device for the robot (1) comprises environmental information acquisition means such as a video camera (3) and a microphone (4), current position detection means (15), and map management means (7) describing the locations of passageways and fixed objects within the robot's movement area, so that the robot can move autonomously while referring to the environmental information within that area. When the robot judges that keeping its current course is likely to cause a collision with an oncoming person, it issues a signal (direction indicator lamp 16) and changes course to the right, and then confirms that this collision-avoidance action has actually been conveyed to the oncoming person.

COPYRIGHT: (C)2006,JPO&NCIPI

Description

The present invention relates to a control device for a robot that moves autonomously while referring to information in its environment.

A cleaning robot that moves autonomously within a predetermined area is known (see Patent Document 1). While moving, this cleaning robot blinks a set of indicator lamps, varying the blinking pattern and lamp color to inform people nearby of its operating status. It also measures the distance to obstacles with an ultrasonic sensor or the like, and stops and changes direction when it comes within a predetermined distance of an obstacle.
Patent Document 1: JP 7-46710 A

However, this conventional control device is intended to let the cleaning robot cover every part of the building floor; the people around it must watch the indicator lamps, judge the robot's operating status, and step out of its path themselves. It is therefore unsuited to environments where one must assume that bystanders may overlook the lamps or fail to grasp their meaning, that is, to a robot operating where an unspecified number of people come and go.

To solve this problem and provide a control device for a mobile robot that can move smoothly, without colliding with moving people, even in an environment where an unspecified number of people are moving about, claim 1 of the present invention provides a control device for a robot (1) that moves autonomously while referring to environmental information within its movement area, comprising environmental information acquisition means (video camera 3 and microphone 4), current position detection means (15), and map management means (7) describing the arrangement of passageways and fixed objects within the movement area. When the robot judges, from the positional relationship between itself and an oncoming person recognized by the environmental information acquisition means, that keeping its current course is likely to cause a collision, it issues a signal (direction indicator lamp 16), changes course to a predetermined side, and confirms that this collision-avoidance action has been conveyed to the oncoming person. Claim 2 adds that, when the collision-avoidance action appears not to have been conveyed to the oncoming person, the robot stops and issues a further signal. Claim 3 adds a contact sensor (touch sensor 17), with the robot speaking an apology when it comes into contact with the oncoming person.

According to the present invention, even in an environment frequented by people unfamiliar with it, the robot checks the oncoming person's course and moves while changing its own course to avoid a collision, so it can travel smoothly without contacting anyone. In particular, when the oncoming person keeps heading toward the robot even after it has moved aside, the robot stops and signals to attract the person's attention and so prevent contact; and when contact is unavoidable, it apologizes so as not to cause offense.

Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a block diagram showing the schematic configuration of a robot to which the present invention is applied. The robot 1 is provided with a speaker 2, a video camera 3, and a microphone 4. The image signal from the video camera 3 is fed to an image processing unit 5, the audio signal from the microphone 4 is fed to a sound processing unit 6, and speech signals generated by suitable speech synthesis means are output from the speaker 2.

The video camera 3 has a monochrome or color image sensor, and its pan (horizontal) and tilt (vertical) motions are driven by motors. Its image output is digitized by a frame grabber, and moving objects are extracted from the difference between two frames taken consecutively or at an arbitrary interval. Distance to a target is obtained by stereoscopic viewing from the images of the left and right pair of video cameras 3. In addition, image processing derives a person's outline and a moving object's position from the optical flow in the image data, and identifies human faces.
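The frame-differencing and stereo-ranging steps described above can be sketched in a few lines. The following Python fragment is illustrative only: it is not part of the patent, and the nested-list frame format, function names, and threshold value are assumptions. It flags pixels that changed between two grayscale frames and converts a stereo disparity to range with the standard pinhole relation Z = f·B/d.

```python
def detect_motion(frame_a, frame_b, threshold=30):
    """Return changed-pixel coordinates between two grayscale frames.

    Frames are nested lists (rows of pixel intensities 0-255).
    A pixel counts as "moving" when its intensity changed by more
    than `threshold`. Also returns the centroid (row, col) of the
    moving region, or None when nothing moved.
    """
    moved = [(r, c)
             for r, row in enumerate(frame_a)
             for c, px in enumerate(row)
             if abs(px - frame_b[r][c]) > threshold]
    if not moved:
        return moved, None
    cr = sum(r for r, _ in moved) / len(moved)
    cc = sum(c for _, c in moved) / len(moved)
    return moved, (cr, cc)


def stereo_distance(disparity_px, focal_px, baseline_m):
    """Pinhole-stereo range estimate: Z = focal * baseline / disparity."""
    return focal_px * baseline_m / disparity_px
```

With a 0.1 m camera baseline and a 500-pixel focal length (both assumed values), a 50-pixel disparity would correspond to a target 1 m away.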

The sound processing unit 6 removes background noise and reverberation components to make the target sound easier to extract, and estimates from the onset of a sound whether it is a human voice or the impact sound of objects colliding. It also locates the sound source from the sound-pressure difference and the arrival-time difference between the pair of microphones 4.

The robot 1 also receives various information from a robot support server 10, which comprises: a map data management unit 7 storing map data that describes the passageways and fixed objects within the robot's range of action; a personal data management unit 8 storing personal data consisting of general information such as ID, name, sex, date of birth, and blood type, occupational information such as company name, department, job title, telephone number, e-mail address, and terminal information, and face data for face recognition; and a parameter data management unit 9 storing control parameters based on environmental data, such as the type of lighting, brightness, average noise level, reverberation characteristics, and floor hardness, for each region into which the map is divided.

In the robot 1, a task setting unit 11 sets the robot's travel route on the basis of the image, sound, map, environment, and personal information. The task setting unit 11 defines the robot's behavior pattern and issues the action commands to be carried out to a behavior control unit 12. The task setting unit 11 also receives the processed image signal from the image processing unit 5 and the audio signal from the sound processing unit 6 directly; referring to the map data from the map data management unit 7, it checks for obstacles around the robot 1 and instructs the behavior control unit 12 to change speed or route so as not to collide with obstacles or pedestrians that appear while the robot is moving.

Connected to the task setting unit 11 is a user terminal 13, a personal computer with input devices such as a keyboard and touch panel and a monitor such as a liquid crystal display. It serves as a user interface through which an operator can operate the robot 1 remotely, for example starting it, stopping it, or returning it to its origin, and is also used to monitor the robot's operating state, such as the video from the video camera 3, the sound from the microphone 4, and the current control-parameter settings. The user can also register new information and control parameters, or update existing ones, at will.

The behavior control unit 12 applies action command values, set on the basis of operation commands from the task setting unit 11, to the actuators provided in the limb joints 14 and so controls the robot's movements. The legs in particular receive stride and step-count (gait) instructions and control the leg joints accordingly.

The current position detection unit 15 detects the robot's turning angle and travel distance and estimates the latest self-position from these together with values from a position-correcting device using a gyrocompass, geomagnetic sensor, or gas-rate sensor, or from GPS or the like. This current-position data is fed back to the task setting unit 11 via the map data management unit 7, where it is compared with the target point and the course is corrected.
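The self-position estimate from turning angle and travel distance amounts to dead reckoning. The fragment below is a minimal sketch of that idea under simple assumptions (planar motion, turn applied before the straight segment); it is not the patent's algorithm, and the function name is illustrative.

```python
import math


def dead_reckon(x, y, heading_deg, turn_deg, distance):
    """Propagate a planar pose from one odometry reading.

    Applies the measured turn first, then advances `distance`
    along the new heading. Returns the updated (x, y, heading).
    """
    heading_deg = (heading_deg + turn_deg) % 360.0
    rad = math.radians(heading_deg)
    return (x + distance * math.cos(rad),
            y + distance * math.sin(rad),
            heading_deg)
```

In practice such an estimate drifts, which is why the text combines it with a correcting sensor (gyrocompass, geomagnetic sensor, gas-rate sensor, or GPS).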

The robot 1 is further provided with direction indicator lamps 16 at suitable positions such as the shoulders, so that when it changes course it can show the people around it which way it will turn. Touch sensors 17 on the sides of the shoulders and arms detect contact with a person while the robot is walking or stopped.

Next, the contact avoidance control with respect to pedestrians according to the present invention will be described.

When an oncoming person approaching from the front is recognized in the image from the video camera 3 (step 1), the person's course is predicted and compared with the robot's own (step 2). If the two courses do not coincide, it is judged that there is no risk of collision and the robot keeps its course (step 3). If a collision risk is judged to exist, the distance to the oncoming person is measured (step 4); if the person has not changed course by the time a predetermined distance is reached, the robot determines whether there is room to evade, for example on its right side (step 5). If there is room on the right, it announces by direction indicator or by voice that it will evade to the right, and changes course (step 6). Turning the robot's head so that its face points in the new direction conveys its intention to change course to the oncoming person all the more strongly. The robot then checks the oncoming person's movement (step 7); if the person is confirmed to be continuing straight ahead, the course change is judged to have removed the collision risk, and the robot continues forward (step 3). If, on the other hand, it is judged in step 5 that there is no room to evade on the right, or if step 7 shows that the oncoming person has moved to the same side as the robot, the robot stops on the spot and issues a signal (step 8).
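One pass of the decision flow in steps 1 to 8 can be sketched as a small rule table. The fragment below is only an illustration of that flow: the argument names stand in for observations the real robot would derive from vision, the 3 m approach limit is an assumed value, and the flattening of steps 6 to 8 into a single pass is a simplification.

```python
def avoidance_action(paths_conflict, distance_m, opponent_changed_course,
                     room_on_right, opponent_moved_same_side,
                     approach_limit_m=3.0):
    """One decision pass over the avoidance flow (steps 1-8).

    Returns 'keep_course', 'signal_and_turn_right', or
    'stop_and_signal'. Argument names are illustrative stand-ins
    for quantities derived from the camera images.
    """
    if not paths_conflict:
        return 'keep_course'                 # step 2 -> step 3
    if distance_m > approach_limit_m or opponent_changed_course:
        return 'keep_course'                 # the opponent yields first
    if not room_on_right:
        return 'stop_and_signal'             # step 5 -> step 8
    if opponent_moved_same_side:
        return 'stop_and_signal'             # step 7 -> step 8
    return 'signal_and_turn_right'           # step 6
```

A real controller would run this repeatedly as both parties move, re-observing the opponent after each action, which is what step 7's confirmation loop provides.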

While waiting in the stopped state, or while passing the person, the touch sensor 17 detects whether contact with the oncoming person has occurred (step 9); if contact is detected, the robot speaks an apology, for example "Excuse me" (step 10).

In this way the robot changes course to the right to avoid a collision with the oncoming person, and after passing it is best returned to its original course. As a rule, it is also advisable to plan the normal course so as to leave enough room for evasion between the robot and the right-hand wall. Because the robot takes the avoidance action on the assumption that people may not notice it, it can move smoothly without contacting anyone even in an environment where an unspecified number of people unfamiliar with robots come and go. Although the above embodiment takes a course change to the right as the rule, it goes without saying that the rule may be set as best suits the environment.

FIG. 1 is a block diagram showing the schematic configuration of the device of the present invention. FIG. 2 is a flowchart of the control according to the present invention.

Explanation of symbols

1 Robot
7 Map data management unit
11 Task setting unit
12 Behavior control unit
16 Direction indicator lamp
17 Touch sensor

Claims (3)

1. A control device for a mobile robot, the robot comprising environmental information acquisition means, current position detection means, and map management means describing information on the arrangement of passageways and fixed objects within a movement area, and moving autonomously while referring to environmental information within the movement area,
wherein, when it is judged from the positional relationship between the robot and an oncoming person recognized by the environmental information acquisition means that keeping the current course is likely to cause a collision with the oncoming person, the control device issues a signal, changes the course to a predetermined side, and confirms that this collision-avoidance action has been conveyed to the oncoming person.
2. The control device for a mobile robot according to claim 1, wherein, when the collision-avoidance action is predicted not to have been conveyed to the oncoming person, the robot stops and issues a further signal.
3. The control device for a mobile robot according to claim 1 or 2, comprising a contact sensor, wherein the robot speaks an apology upon contact with the oncoming person.
JP2004219576A 2004-07-28 2004-07-28 Mobile robot controller Expired - Fee Related JP4315872B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004219576A JP4315872B2 (en) 2004-07-28 2004-07-28 Mobile robot controller

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2004219576A JP4315872B2 (en) 2004-07-28 2004-07-28 Mobile robot controller

Publications (2)

Publication Number Publication Date
JP2006035381A true JP2006035381A (en) 2006-02-09
JP4315872B2 JP4315872B2 (en) 2009-08-19

Family

ID=35900911

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004219576A Expired - Fee Related JP4315872B2 (en) 2004-07-28 2004-07-28 Mobile robot controller

Country Status (1)

Country Link
JP (1) JP4315872B2 (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5725001A (en) * 1980-07-21 1982-02-09 Shinko Electric Co Ltd Safety device of moving body
JPS598014A (en) * 1982-07-06 1984-01-17 Mitsubishi Electric Corp Safety device of self-traveling dolly
JPS63150710A (en) * 1986-12-16 1988-06-23 Shinko Electric Co Ltd Method for evading collision in autonomous unmanned vehicle system
JPH07248824A (en) * 1994-03-11 1995-09-26 Matsushita Electric Ind Co Ltd Carrying device
JPH09185412A (en) * 1995-12-28 1997-07-15 Yaskawa Electric Corp Autonomous moving device
JP2001212780A (en) * 2000-01-31 2001-08-07 Sony Corp Behavior controller, behavior control method, and recording medium
JP2002000574A (en) * 2000-06-22 2002-01-08 Matsushita Electric Ind Co Ltd Robot for nursing care support and nursing care support system
JP2002328724A (en) * 2001-05-07 2002-11-15 Matsushita Electric Ind Co Ltd Self-propelled equipment
JP2003050559A (en) * 2001-08-07 2003-02-21 Honda Motor Co Ltd Autonomously movable robot
JP2004042151A (en) * 2002-07-09 2004-02-12 Advanced Telecommunication Research Institute International Communication robot

Cited By (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8204679B2 (en) 2007-04-06 2012-06-19 Honda Motor Co., Ltd. Mobile apparatus, control device and control program
US8442714B2 (en) 2007-04-12 2013-05-14 Panasonic Corporation Autonomous mobile device, and control device and program product for the autonomous mobile device
JP2010079852A (en) * 2008-09-29 2010-04-08 Honda Motor Co Ltd Mobile device
US8296005B2 (en) 2008-09-29 2012-10-23 Honda Motor Co., Ltd. Mobile apparatus
JP2010264585A (en) * 2009-05-15 2010-11-25 Honda Research Inst Europe Gmbh Autonomous robot incorporating planning in unestimatable dynamic complicated environment
JP2011110644A (en) * 2009-11-26 2011-06-09 Fujitsu Ltd Autonomous mobile robot
US9075416B2 (en) 2010-09-21 2015-07-07 Toyota Jidosha Kabushiki Kaisha Mobile body
JP2012130986A (en) * 2010-12-21 2012-07-12 Toyota Motor Corp Moving body
US11468983B2 (en) 2011-01-28 2022-10-11 Teladoc Health, Inc. Time-dependent navigation of telepresence robots
US10591921B2 (en) 2011-01-28 2020-03-17 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
US9785149B2 (en) 2011-01-28 2017-10-10 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
US9469030B2 (en) 2011-01-28 2016-10-18 Intouch Technologies Interfacing with a mobile telepresence robot
US9974612B2 (en) 2011-05-19 2018-05-22 Intouch Technologies, Inc. Enhanced diagnostics for a telepresence robot
WO2013171905A1 (en) * 2012-05-18 2013-11-21 株式会社日立製作所 Autonomous moving device, control method, and autonomous moving method
JPWO2013171905A1 (en) * 2012-05-18 2016-01-07 株式会社日立製作所 Autonomous mobile device, control device, and autonomous mobile method
US9588518B2 (en) 2012-05-18 2017-03-07 Hitachi, Ltd. Autonomous mobile apparatus, control device, and autonomous mobile method
US10061896B2 (en) 2012-05-22 2018-08-28 Intouch Technologies, Inc. Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US11515049B2 (en) 2012-05-22 2022-11-29 Teladoc Health, Inc. Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US11628571B2 (en) 2012-05-22 2023-04-18 Teladoc Health, Inc. Social behavior rules for a medical telepresence robot
US9776327B2 (en) 2012-05-22 2017-10-03 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
US9361021B2 (en) 2012-05-22 2016-06-07 Irobot Corporation Graphical user interfaces including touchpad driving interfaces for telemedicine devices
WO2013176762A1 (en) * 2012-05-22 2013-11-28 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
US10328576B2 (en) 2012-05-22 2019-06-25 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
US11453126B2 (en) 2012-05-22 2022-09-27 Teladoc Health, Inc. Clinical workflows utilizing autonomous and semi-autonomous telemedicine devices
US10892052B2 (en) 2012-05-22 2021-01-12 Intouch Technologies, Inc. Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US9174342B2 (en) 2012-05-22 2015-11-03 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
US10603792B2 (en) 2012-05-22 2020-03-31 Intouch Technologies, Inc. Clinical workflows utilizing autonomous and semiautonomous telemedicine devices
US10658083B2 (en) 2012-05-22 2020-05-19 Intouch Technologies, Inc. Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US10780582B2 (en) 2012-05-22 2020-09-22 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
US10924708B2 (en) 2012-11-26 2021-02-16 Teladoc Health, Inc. Enhanced video interaction for a user interface of a telepresence network
US10334205B2 (en) 2012-11-26 2019-06-25 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US9571789B2 (en) 2012-11-26 2017-02-14 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US11910128B2 (en) 2012-11-26 2024-02-20 Teladoc Health, Inc. Enhanced video interaction for a user interface of a telepresence network
US11135727B2 (en) 2016-03-28 2021-10-05 Groove X, Inc. Autonomously acting robot that performs a greeting action
JPWO2017169826A1 (en) * 2016-03-28 2018-04-12 Groove X株式会社 Autonomous robot that welcomes you
JP7036399B2 (en) 2017-11-08 2022-03-15 学校法人早稲田大学 Autonomous mobile robots, their control devices and motion control programs
JP2019084641A (en) * 2017-11-08 2019-06-06 学校法人早稲田大学 Autonomous mobile robot, and control device and operation control program of the same
WO2019244644A1 (en) * 2018-06-19 2019-12-26 ソニー株式会社 Mobile body control device, mobile body control method, and program
US11526172B2 (en) 2018-06-19 2022-12-13 Sony Corporation Mobile object control apparatus and mobile object control method
WO2024111893A1 (en) * 2022-11-23 2024-05-30 삼성전자주식회사 Robot traveling in space using map, and method for identifying location thereof
WO2024203849A1 (en) * 2023-03-31 2024-10-03 ソニーグループ株式会社 Manipulator and manipulator control method

Also Published As

Publication number Publication date
JP4315872B2 (en) 2009-08-19

Similar Documents

Publication Publication Date Title
JP4315872B2 (en) Mobile robot controller
US7840308B2 (en) Robot device control based on environment and position of a movable robot
JP5768273B2 (en) A robot that predicts a pedestrian's trajectory and determines its avoidance behavior
US7474945B2 (en) Route generating system for an autonomous mobile robot
JP5987842B2 (en) Communication pull-in system, communication pull-in method and communication pull-in program
JP2003340764A (en) Guide robot
JP2008152504A (en) Guidance robot device and guidance system
JP5277974B2 (en) Driving assistance device
JP2005290813A (en) Parking guidance robot
JP7392377B2 (en) Equipment, information processing methods, programs, information processing systems, and information processing system methods
JP2004299025A (en) Mobile robot control device, mobile robot control method and mobile robot control program
JP2008087140A (en) Speech recognition robot and control method of speech recognition robot
EP3450118A1 (en) Robot
US20210154827A1 (en) System and Method for Assisting a Visually Impaired Individual
JP2006167838A (en) Autonomous moving robot
JP5084756B2 (en) Autonomous mobile wheelchair
JP3768957B2 (en) Mobile robot path setting method
JP2006231447A (en) Confirmation method for indicating position or specific object and method and device for coordinate acquisition
JP4326437B2 (en) Robot control device
JP2022078741A (en) robot
JP5115886B2 (en) Road guidance robot
WO2022224311A1 (en) Route guide device, route guide method, and route guide program
JP2006185239A (en) Robot device, moving and following method of robot device, and program
JP7408991B2 (en) Moving object system, moving object, control program, and control method
JP4621535B2 (en) Movement control method for legged mobile robot

Legal Events

A621 Written request for application examination (effective date: 2006-12-01)
A977 Report on retrieval (effective date: 2008-09-22)
A131 Notification of reasons for refusal (effective date: 2008-09-30)
A521 Request for written amendment filed (effective date: 2008-11-19)
TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model) (effective date: 2009-05-12)
A61 First payment of annual fees during grant procedure (effective date: 2009-05-19)
R150 Certificate of patent or registration of utility model (ref document number: 4315872, country: JP)
FPAY Renewal fee payment (payment until: 2012-05-29, year of fee payment: 3)
FPAY Renewal fee payment (payment until: 2013-05-29, year of fee payment: 4)
FPAY Renewal fee payment (payment until: 2014-05-29, year of fee payment: 5)
LAPS Cancellation because of no payment of annual fees