JP4315872B2 - Mobile robot controller - Google Patents


Info

Publication number
JP4315872B2
Authority
JP
Japan
Prior art keywords
course
opponent
control device
robot
mobile robot
Legal status
Expired - Fee Related
Application number
JP2004219576A
Other languages
Japanese (ja)
Other versions
JP2006035381A (en)
Inventor
政宣 武田
Current Assignee
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Application filed by Honda Motor Co Ltd
Priority to JP2004219576A
Publication of JP2006035381A
Application granted
Publication of JP4315872B2
Anticipated expiration


Description

The present invention relates to a control device for a robot that moves autonomously while referring to information in its environment.

A cleaning robot that autonomously moves within a predetermined area is known (see Patent Document 1). While moving, this cleaning robot blinks a plurality of indicator lamps and varies their blinking pattern and lighting color to notify nearby people of its operating status. It also measures the distance to obstacles with an ultrasonic sensor or the like, and stops and changes direction when it comes within a predetermined distance of an obstacle.
[Patent Document 1] JP 7-46710 A

However, this conventional control device is intended to make the cleaning robot cover the entire floor of a building, and it is the surrounding people who must look at the indicator lamps, judge the robot's operating status, and move out of its path. It is therefore unsuitable for environments in which one must assume that people may overlook the lamps or not understand their meaning, that is, for a robot operating where an unspecified number of people come and go.

To solve these problems and to provide a control device for a mobile robot that can move smoothly without colliding with people even in an environment where an unspecified number of people are moving about, the control device for a mobile robot according to the present invention comprises environment information acquisition means (a video camera 3 and a microphone 4), current position detection means (15), and map management means (7) describing the layout of passages and fixed objects in the movement area, and controls a robot (1) that moves autonomously while referring to environment information in the movement area. When it is determined, from the positional relationship between the robot and an oncoming person recognized by the environment information acquisition means, that continuing on the current course is likely to result in a collision, the robot turns its head to face the direction it will move, as a signal of the course change, and changes its course to a predetermined side.
Preferably, the control device confirms, from the movement direction of the oncoming person recognized by the environment information acquisition means, whether the collision avoidance action by the course change has been conveyed to that person; if the person has changed course in the same direction as the robot, the robot stops moving, and otherwise it continues forward on the changed course.
Preferably, the control device further includes a contact sensor, detects contact with the oncoming person by means of the contact sensor, and, upon contact, speaks to express an apology.

According to the present invention, even in an environment where people unfamiliar with the robot come and go, the robot checks the course of an oncoming person and moves while changing course in a direction that avoids a collision, so it can move smoothly without contacting people. In particular, if the oncoming person still heads toward the robot even though the robot has moved aside, the robot stops and signals to draw the person's attention and thereby prevent contact; and if contact nevertheless occurs, it apologizes so as not to cause offense.

The present invention will now be described in detail with reference to the accompanying drawings.

FIG. 1 is a block diagram showing the schematic configuration of a robot to which the present invention is applied. The robot 1 is provided with a speaker 2, a video camera 3, and a microphone 4. The image signal from the video camera 3 is input to an image processing unit 5, the audio signal from the microphone 4 is input to a sound processing unit 6, and an audio signal generated by suitable synthesized-speech generation means is output from the speaker 2.

The video camera 3 has a monochrome or color image sensor, and its pan (horizontal) and tilt (vertical) motions are driven by motors. The image output of the video camera 3 is digitized by a frame grabber, and moving objects are extracted from the difference between two frames taken consecutively or at an arbitrary interval. Distance information to a target is also obtained by stereoscopic viewing from the images of the left and right pair of video cameras 3. Furthermore, image processing is performed to determine the contour of a person and the position of a moving body from the image information based on optical flow, and to identify a person's face.
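
As an illustration of the frame-differencing step described above, the following is a minimal Python sketch using OpenCV; it is not part of the patent disclosure, and the threshold and minimum blob area are assumed values.

import cv2

MIN_AREA = 500        # assumed minimum changed-region size (pixels) to count as a moving object
DIFF_THRESHOLD = 25   # assumed intensity-difference threshold

def moving_object_boxes(prev_frame, curr_frame):
    """Return bounding boxes of regions that changed between two frames."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(prev_gray, curr_gray)                  # inter-frame difference
    _, mask = cv2.threshold(diff, DIFF_THRESHOLD, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)               # close small gaps in the mask
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= MIN_AREA]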

The sound processing unit 6 removes background noise and reverberation components to make the target sound easier to extract, and estimates from the attack (rise) of a sound whether it is a human voice or an impact sound of objects colliding. It also performs processing to locate the sound source based on the sound-pressure difference and arrival-time difference between the pair of microphones 4.
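
As an illustration of the arrival-time-difference localization mentioned above, a minimal Python sketch is given below; it is not the patented implementation, and the sampling rate and microphone spacing are assumed values.

import numpy as np

SAMPLE_RATE = 16000      # Hz, assumed
MIC_SPACING = 0.20       # meters between the pair of microphones, assumed
SPEED_OF_SOUND = 343.0   # m/s

def source_bearing(left, right):
    """Estimate the bearing (radians, 0 = straight ahead) of a sound source
    from the inter-microphone delay found by cross-correlation."""
    left = np.asarray(left, dtype=float) - np.mean(left)
    right = np.asarray(right, dtype=float) - np.mean(right)
    corr = np.correlate(left, right, mode="full")
    lag = int(np.argmax(corr)) - (len(right) - 1)    # delay in samples
    delay = lag / SAMPLE_RATE                        # delay in seconds
    # Clamp to the physically possible range before taking the arcsine.
    sin_theta = np.clip(delay * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return float(np.arcsin(sin_theta))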

The robot 1 also receives various information from a robot support server 10 comprising: a map data management unit 7 storing map data that describes the layout of passages and fixed objects within the robot's range of action; a personal data management unit 8 storing personal data consisting of general information such as ID, name, gender, date of birth, and blood type, occupational information such as company name, department, job title, telephone number, e-mail address, and terminal information, and face data for face recognition; and a parameter data management unit 9 storing control parameters based on environmental data, such as the type of lighting, brightness, average noise level, reverberation characteristics, and floor hardness, for each region into which the map is suitably divided.
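
Purely as an illustration of the kinds of records the three management units hold, a compact Python sketch follows; the field names are assumptions for illustration and are not the patent's data format.

from dataclasses import dataclass, field

@dataclass
class PersonalRecord:
    """Illustrative record for the personal data management unit 8."""
    person_id: str
    name: str
    gender: str
    birth_date: str
    blood_type: str = ""
    company: str = ""
    title: str = ""
    face_template: bytes = b""   # face data used for face recognition

@dataclass
class AreaParameters:
    """Illustrative per-region control parameters for the parameter data management unit 9."""
    lighting_type: str
    brightness_lux: float
    noise_level_db: float
    reverb_time_s: float
    floor_hardness: float

@dataclass
class SupportServerData:
    """Illustrative bundle of data the robot 1 receives from the robot support server 10."""
    map_data: dict = field(default_factory=dict)                          # passages and fixed objects
    personal: dict[str, PersonalRecord] = field(default_factory=dict)     # keyed by person ID
    parameters: dict[str, AreaParameters] = field(default_factory=dict)   # keyed by map region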

In the robot 1, a task setting unit 11 sets the robot's travel route on the basis of the image, sound, map, environment, and personal information. The task setting unit 11 defines the behavior pattern of the robot 1 and issues behavior commands to be executed to a behavior control unit 12. The image processing signal from the image processing unit 5 and the audio signal from the sound processing unit 6 are also input directly to the task setting unit 11, which checks for obstacles around the robot 1 while referring to the map data from the map data management unit 7 and instructs the behavior control unit 12 to change the movement speed or route so as not to collide with obstacles or pedestrians that appear during movement.

A user terminal 13, a personal computer with an input device such as a keyboard or touch panel and a monitor such as a liquid crystal display, is connected to the task setting unit 11. It serves as a user interface through which an operator can remotely operate the robot 1, for example starting it, stopping it, or returning it to its home position, and it is also used to monitor the operating status of the robot itself, such as the video from the video camera 3, the sound from the microphone 4, and the settings of the control parameters. The user can also freely register and update various information and control parameters.

The behavior control unit 12 gives action command values, set on the basis of the operation commands from the task setting unit 11, to the actuators provided at the limb joints 14, thereby controlling the motion of the robot 1. In particular, the legs receive stride and step-count (gait) instructions and control the leg joints accordingly.

A current position detection unit 15 detects the turning angle and travel distance of the robot 1 and estimates the latest self-position from these values together with readings from a position correction device using a gyrocompass, geomagnetic sensor, or gas-rate sensor, or from GPS or the like. The current position data is fed back to the task setting unit 11 via the map data management unit 7, where it is compared with the target point and the course is corrected.
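
A minimal sketch of the dead-reckoning update described above is shown below; the state representation, the blending gain, and the function names are assumptions for illustration, not the patented algorithm.

import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float      # meters
    y: float      # meters
    theta: float  # heading in radians

def dead_reckon(pose, turn_angle, distance):
    """Propagate the pose estimate from the detected turn angle and travel distance."""
    theta = pose.theta + turn_angle
    return Pose(pose.x + distance * math.cos(theta),
                pose.y + distance * math.sin(theta),
                theta)

def correct_heading(pose, absolute_heading, gain=0.3):
    """Blend in an absolute heading, e.g. from a gyrocompass or geomagnetic sensor."""
    error = math.atan2(math.sin(absolute_heading - pose.theta),
                       math.cos(absolute_heading - pose.theta))
    return Pose(pose.x, pose.y, pose.theta + gain * error)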

The robot 1 is further provided with direction indicator lamps 16 at suitable locations such as the shoulders, so that it can inform surrounding people of the direction in which it will turn when changing course. Touch sensors 17 are provided on the sides of the shoulders and arms so that contact with a person can be detected while walking or while stopped.

Next, the control for avoiding contact with a pedestrian according to the present invention will be described.

When an oncoming person approaching from the front is recognized in the image information from the video camera 3 (step 1), that person's course is predicted and compared with the robot's own course (step 2). If the two courses do not coincide, it is judged that there is no risk of collision and the current course is maintained (step 3). If a risk of collision is judged to exist, the distance to the oncoming person is measured (step 4). If the person has not changed course by the time the distance falls to a predetermined value, the robot determines whether there is room to evade on, for example, its right side (step 5). If there is room on the right, the robot announces, by its turn indicator or by speech, that it will evade to the right, and changes course (step 6). At this time, by turning its head so that its face points in the direction it will move, the robot can convey even more clearly to the oncoming person that it intends to change course. The robot then observes the person's movement (step 7); if the person is confirmed to be continuing straight ahead, the robot judges that its course change has eliminated the risk of collision and continues forward (step 3). On the other hand, if it is determined in step 5 that there is no room to evade on the right, or if it is confirmed in step 7 that the oncoming person has changed course in the same direction as the robot, the robot stops where it is and gives a signal (step 8).

While waiting at a stop, or while passing the person as the robot moves forward, the touch sensor 17 detects whether contact with the person has occurred (step 9); if contact is detected, the robot utters, for example, "Excuse me," to express an apology (step 10).
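
The decision flow of steps 1 through 10 can be summarized in the following Python sketch; the robot methods (recognize_oncoming_person, courses_conflict, and so on) are hypothetical placeholders for the perception and actuation modules described above, not functions defined in the patent.

SAFE_DISTANCE = 1.5   # meters; assumed value for the "predetermined distance"

def plan_against_oncoming_person(robot):
    """Steps 1-8: decide whether to keep course, change course, or stop and signal."""
    person = robot.recognize_oncoming_person()                 # step 1 (hypothetical helper)
    if person is None or not robot.courses_conflict(person):   # step 2
        return robot.keep_course()                             # step 3
    while robot.distance_to(person) > SAFE_DISTANCE:           # step 4
        if not robot.courses_conflict(person):                 # the person changed course first
            return robot.keep_course()
    if robot.has_room_on_right():                              # step 5
        robot.signal_right()                                   # turn indicator or speech
        robot.face_toward("right")                             # turn the head toward the new course
        robot.change_course("right")                           # step 6
        if robot.person_keeps_straight(person):                # step 7
            return robot.keep_course()                         # continue on the changed course
    robot.stop_and_signal()                                    # step 8: stop and alert the person

def handle_contact(robot):
    """Steps 9-10: while stopped or while passing, apologize if contact is detected."""
    if robot.contact_detected():                               # touch sensor 17
        robot.say("Excuse me.")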

As described above, the robot changes course to the right to avoid a collision with an oncoming person, and after they have passed each other it preferably returns to its original course. As a rule, it is also advisable to set the normal course so that enough room is left between the robot and the right-hand wall to allow evasion. Because the robot takes the avoidance action on the assumption that the person has not noticed it, smooth movement that avoids contact with people is possible even in an environment where an unspecified number of people unfamiliar with the robot come and go. Although the above embodiment changes course to the right as a rule, it goes without saying that this may be set as appropriate to whatever rule best suits the environment.

FIG. 1 is a block diagram showing the schematic configuration of the apparatus of the present invention. FIG. 2 is a flowchart relating to the control of the present invention.

Explanation of symbols

1 Robot
7 Map data management unit
11 Task setting unit
12 Behavior control unit
16 Direction indicator lamp
17 Touch sensor

Claims (3)

1. A control device for a mobile robot comprising environment information acquisition means, current position detection means, and map management means describing information on the layout of passages and fixed objects within a movement area, the robot moving autonomously while referring to environment information in the movement area, wherein, when it is determined from the positional relationship between the robot and an oncoming person recognized by the environment information acquisition means that continuing on the current course is likely to result in a collision with the oncoming person, the robot, as a signal of the course change, turns its head to face the direction in which it will move and changes its course to a predetermined side.
2. The control device for a mobile robot according to claim 1, wherein whether the collision avoidance action by the course change has been conveyed to the oncoming person is confirmed from the movement direction of the oncoming person recognized by the environment information acquisition means; if the oncoming person has changed course in the same direction as the course change, the robot stops moving, and otherwise it continues forward on the changed course.
3. The control device for a mobile robot according to claim 1 or 2, further comprising a contact sensor, wherein the presence or absence of contact with the oncoming person is detected by the contact sensor, and upon contact with the oncoming person the robot speaks to express an apology.
JP2004219576A 2004-07-28 2004-07-28 Mobile robot controller Expired - Fee Related JP4315872B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004219576A JP4315872B2 (en) 2004-07-28 2004-07-28 Mobile robot controller

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2004219576A JP4315872B2 (en) 2004-07-28 2004-07-28 Mobile robot controller

Publications (2)

Publication Number Publication Date
JP2006035381A JP2006035381A (en) 2006-02-09
JP4315872B2 (en) 2009-08-19

Family

ID=35900911

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004219576A Expired - Fee Related JP4315872B2 (en) 2004-07-28 2004-07-28 Mobile robot controller

Country Status (1)

Country Link
JP (1) JP4315872B2 (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4975503B2 (en) 2007-04-06 2012-07-11 本田技研工業株式会社 Legged mobile robot
JP4576445B2 (en) 2007-04-12 2010-11-10 パナソニック株式会社 Autonomous mobile device and program for autonomous mobile device
JP5063549B2 (en) * 2008-09-29 2012-10-31 本田技研工業株式会社 Mobile device
EP2251157B1 (en) * 2009-05-15 2011-07-20 Honda Research Institute Europe GmbH Autonomous robots with planning in unpredictable, dynamic and complex environments
JP5392028B2 (en) * 2009-11-26 2014-01-22 富士通株式会社 Autonomous mobile robot
US9075416B2 (en) 2010-09-21 2015-07-07 Toyota Jidosha Kabushiki Kaisha Mobile body
JP5615160B2 (en) * 2010-12-21 2014-10-29 トヨタ自動車株式会社 Moving body
US8965579B2 (en) 2011-01-28 2015-02-24 Intouch Technologies Interfacing with a mobile telepresence robot
US9323250B2 (en) 2011-01-28 2016-04-26 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
US9098611B2 (en) 2012-11-26 2015-08-04 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US20140139616A1 (en) 2012-01-27 2014-05-22 Intouch Technologies, Inc. Enhanced Diagnostics for a Telepresence Robot
WO2013171905A1 (en) * 2012-05-18 2013-11-21 株式会社日立製作所 Autonomous moving device, control method, and autonomous moving method
US9361021B2 (en) 2012-05-22 2016-06-07 Irobot Corporation Graphical user interfaces including touchpad driving interfaces for telemedicine devices
WO2013176758A1 (en) 2012-05-22 2013-11-28 Intouch Technologies, Inc. Clinical workflows utilizing autonomous and semi-autonomous telemedicine devices
GB2563785B (en) * 2016-03-28 2021-03-03 Groove X Inc Autonomously acting robot that performs a greeting action
JP7036399B2 (en) * 2017-11-08 2022-03-15 学校法人早稲田大学 Autonomous mobile robots, their control devices and motion control programs
JP2021157203A (en) * 2018-06-19 2021-10-07 ソニーグループ株式会社 Mobile control device, mobile control method, and program

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5725001A (en) * 1980-07-21 1982-02-09 Shinko Electric Co Ltd Safety device of moving body
JPS598014A (en) * 1982-07-06 1984-01-17 Mitsubishi Electric Corp Safety device of self-traveling dolly
JP2661023B2 (en) * 1986-12-16 1997-10-08 神鋼電機株式会社 Collision Avoidance Method for Autonomous Unmanned Vehicle System
JP3069675B2 (en) * 1994-03-11 2000-07-24 松下電器産業株式会社 Transfer device
JPH09185412A (en) * 1995-12-28 1997-07-15 Yaskawa Electric Corp Autonomous moving device
JP2001212780A (en) * 2000-01-31 2001-08-07 Sony Corp Behavior controller, behavior control method, and recording medium
JP2002000574A (en) * 2000-06-22 2002-01-08 Matsushita Electric Ind Co Ltd Robot for nursing care support and nursing care support system
JP4432278B2 (en) * 2001-05-07 2010-03-17 パナソニック株式会社 Self-propelled equipment
JP4642287B2 (en) * 2001-08-07 2011-03-02 本田技研工業株式会社 Autonomous mobile robot
JP2004042151A (en) * 2002-07-09 2004-02-12 Advanced Telecommunication Research Institute International Communication robot

Also Published As

Publication number Publication date
JP2006035381A (en) 2006-02-09

Similar Documents

Publication Publication Date Title
JP4315872B2 (en) Mobile robot controller
US10362429B2 (en) Systems and methods for generating spatial sound information relevant to real-world environments
JP5768273B2 (en) A robot that predicts a pedestrian's trajectory and determines its avoidance behavior
US20060058920A1 (en) Control apparatus for movable robot
JP3906743B2 (en) Guide robot
US9662788B2 (en) Communication draw-in system, communication draw-in method, and communication draw-in program
US7474945B2 (en) Route generating system for an autonomous mobile robot
JP2008084135A (en) Movement control method, mobile robot and movement control program
JP2008152504A (en) Guidance robot device and guidance system
JP5277974B2 (en) Driving assistance device
JP2004299025A (en) Mobile robot control device, mobile robot control method and mobile robot control program
JP2005290813A (en) Parking guidance robot
JP2008087140A (en) Speech recognition robot and control method of speech recognition robot
JP7392377B2 (en) Equipment, information processing methods, programs, information processing systems, and information processing system methods
JP2009222969A (en) Speech recognition robot and control method for speech recognition robot
JP2009045692A (en) Communication robot and its operating method
JP5084756B2 (en) Autonomous mobile wheelchair
JP2006167838A (en) Autonomous moving robot
JP2006231447A (en) Confirmation method for indicating position or specific object and method and device for coordinate acquisition
JP3768957B2 (en) Mobile robot path setting method
JP6822571B2 (en) Terminal device, risk prediction method, program
JP2009251761A (en) Driving support system
JP2006263873A (en) Communication robot system and communication robot
JP4326437B2 (en) Robot control device
JP5115886B2 (en) Road guidance robot

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20061201

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20080922

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20080930

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20081119

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20090512

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20090519

R150 Certificate of patent or registration of utility model

Ref document number: 4315872

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120529

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130529

Year of fee payment: 4

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140529

Year of fee payment: 5

LAPS Cancellation because of no payment of annual fees