JPH09267276A - Carrying robot system - Google Patents
Info
- Publication number
- JPH09267276A (applications JP8103312A, JP10331296A)
- Authority
- JP
- Japan
- Prior art keywords
- robot
- transfer robot
- information
- measurement
- transfer
- Prior art date
- Legal status: Granted (assumed; not a legal conclusion)
Description
[0001]
[Technical Field of the Invention] The present invention relates to an autonomously controllable transfer robot system, and in particular to a robot system suited to carrying tasks such as the loading, unloading, and transport of meal trays, linens, and the like in indoor environments where people go about daily life, such as medical and welfare facilities.
[0002]
[Prior Art] In recent years, with the declining birthrate and the aging of society, demand for care and medical facilities has grown while staffing those facilities has become difficult. As caregivers themselves grow older, carrying tasks such as serving and clearing meals and transporting linens are becoming heavy labor for them, and automation of carrying work involving the loading and transport of meal trays, linens, and the like is therefore desired.
[0003]
[Problems to Be Solved by the Invention] Against this background, many of the transport robots developed to date travel along guides such as tape laid on the travel path, so their routes are fixed. Introducing such a robot requires guide installation work, and since bed positions in medical and welfare facilities change frequently, the installation must be redone each time, which is a problem. Moreover, when traveling inside a room where a care recipient lives, the room contains many everyday items; a transport robot that follows a fixed guided path causes distress to the care recipient for whom the room is a living space, and when passing wheelchair users and other patients in the facility it forces them to move aside, so such robots lack safety and compatibility with human society. For example, serving and clearing meal trays at the bedside requires fine positioning, and a guide-following transport robot lacks the necessary flexibility and has a limited working range. Even transport robots that do not use tape on the travel path require environmental modifications to the facility, such as installing landmarks at corners, and autonomous travel inside living rooms has so far been difficult because existing transport robots lack flexibility.
[0004]
Furthermore, practical transport robots with a manipulation function are rarely seen; because such robots only move objects, the loading and unloading of those objects must be done by workers (nurses, caregivers, and so on), which ultimately burdens the nursing and care staff and imposes time constraints on them. On the other hand, even robots that do have a manipulation function assume operation in factories, laboratories, and similar settings and pay little attention to safety, so they are difficult to use in environments where nurses, caregivers, and care recipients are present.
[0005]
In addition, because current transport robots are premised on factory use, they emphasize work efficiency; no robot designed to coexist with people in a facility has yet been practically realized that moves and presents information with consideration for the visual impression given to humans, or that can respond in a friendly way, for example to a human voice. Current transport robots also assume operation only by specific skilled personnel, offering just the minimum operability that allows smooth operation by those thoroughly familiar and experienced with the robot's characteristics and functions; people without such knowledge or experience cannot operate them easily. Furthermore, on fixed assembly lines in factories, processing proceeds one-way according to pre-programmed procedures. For these reasons, operation by caregivers and nurses within the medical and welfare facilities assumed by the present invention has been practically difficult in terms of operational flexibility and safety. It is therefore an object of the present invention to provide a transfer robot system that can autonomously and safely travel through corridors and rooms of medical and welfare facilities, transfer objects to and from the robot, and operate autonomously with a human interface suited to such facilities.
[0006]
[Means for Solving the Problems] To solve the above problems, the invention is characterized by comprising: a transfer robot having a storage section capable of holding objects to be carried; a visual sensor; measurement and recognition means based on the sensor information; an articulated multi-degree-of-freedom manipulator capable of autonomously loading and unloading objects to and from the storage section using that measurement information; measurement and recognition means for the travel environment using the visual sensor; robot motion path generation means based on the measurement and recognition results; a movement mechanism capable of autonomous travel from travel commands following the generated path and sensor information; and interface means for communicating with operators and others; together with a remote monitoring and operation unit capable of monitoring the transfer robot's work and operating it remotely.
[0007]
[Embodiments of the Invention] An example of a meal transport robot system is described below as an embodiment. FIG. 1 is a perspective view of a meal transport robot system in a medical/welfare facility, an embodiment of the transfer robot system according to the present invention. In the figure, reference numeral 101 denotes a meal transport robot loading or returning meals, and 102 a meal transport robot serving or clearing meals. The meal transport robots 101 and 102 exchange commands and images with a remote monitoring and operation unit 105 through command/image transmission devices 103 and 104 built into their bodies, and can be operated remotely according to that information. The carrying work of the meal transport robots 101 and 102 is verified in advance by a real-time work verification simulator 106 and is set to be executed only after safe operation has been confirmed.
[0008]
After a delivery wagon 107 loaded with meals from the kitchen is manually placed at a predetermined position, the remote monitoring and operation unit 105 transfers the necessary data, such as care recipient IDs and room numbers, to the meal transport robot 101 and issues a work start command; the robot then moves autonomously to the front of the delivery wagon 107 by means of its movement mechanism 108. The delivery wagon 107 holds a plurality of trays 109 loaded with meals, each carrying a meal recognition label 110. After moving, the meal transport robot 101 extends the articulated multi-degree-of-freedom manipulator 111, stowed inside its body, toward a meal tray 109. A visual sensor 112 mounted on the wrist of the manipulator 111 images the recognition label 110 on the target tray 109; the manipulation environment measurement and recognition device 113 recognizes the label and selects the meal of the patient to be served, after which the wrist gripping mechanism 114 grasps the target tray 109. The manipulator 111 withdraws the grasped tray 109 from the delivery wagon 107 and stores it in the tray storage section 115 inside the transport robot 101. After repeating this operation a predetermined number of times, the manipulator 111 is stowed in the robot body again and the movement mechanism 108 carries the robot to the bedside in the living room. During this travel, the route is replanned using self-position correction information and obstacle detection information derived from the extraction of characteristic features of the facility by the travel environment measurement and recognition device 118, which uses a visual sensor 116 mounted on top of the robot and a visual sensor 117 mounted at its rear. A travel control device 120 using obstacle sensors 119 arranged on the robot body recognizes and avoids obstacles on the travel path.
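The pick-up-and-serve cycle described above can be sketched as a simple state sequence. This is an illustrative sketch only: the state names, tray identifiers, and the six-tray capacity are assumptions for the example, not details taken from the patent.

```python
# Hypothetical sketch of the pickup/serve cycle described above; state
# names and the capacity default are illustrative, not the patent's design.

def delivery_cycle(wagon_trays, capacity=6):
    """Yield the robot's states while moving trays from wagon to bedsides."""
    while wagon_trays:
        yield "MOVE_TO_WAGON"
        onboard = []
        while wagon_trays and len(onboard) < capacity:
            yield "PICK_TRAY"          # wrist camera reads label, gripper grasps
            onboard.append(wagon_trays.pop(0))
        yield "STOW_MANIPULATOR"       # arm stowed before travel
        for tray in onboard:
            yield "MOVE_TO_BEDSIDE"    # replanning + obstacle avoidance en route
            yield f"SERVE:{tray}"      # place tray on the overbed table
    yield "DONE"

states = list(delivery_cycle(["suzuki", "tanaka"], capacity=6))
```

Running the generator with two trays produces one wagon visit, two picks, and two bedside deliveries before finishing.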
[0009]
The transport robot 102, having arrived at the bedside, holds a simple conversation with the care recipient or the nursing/care staff through a touch-panel display 121 on its top surface and a voice processing device 122. It then moves the manipulator 123 to the target tray 124 inside its body, grasps the tray with the wrist gripping mechanism 125, and withdraws it from the tray storage section 126. From the image information of a visual sensor 127 mounted on the wrist, the manipulation environment measurement and recognition device 128 measures and recognizes the position of the overbed table 129 at regular intervals, and based on this position information the manipulator 123 places the tray 124 on the overbed table 129. After placement, the wrist gripping mechanism 125 releases and the manipulator 123 is stowed in the body again. After repeating this operation for the designated patients in the room, the robot autonomously returns to the original delivery wagon 107 and repeats the loading and serving of new meal trays.
[0010]
When meals are to be cleared after eating, the remote monitoring and operation unit 105 sends a work start command to the meal transport robot 102, which moves autonomously to the bedside and holds a simple conversation with the care recipient or the nursing/care staff through the touch-panel display 121 and the voice processing device 122. Then, from the image information of the wrist-mounted visual sensor 127, the manipulation environment measurement and recognition device 128 measures and recognizes the position of the tray 124 on the overbed table 129 at regular intervals; based on this position information the manipulator 123 moves the wrist gripping mechanism 125 to the tray 124, and the gripping mechanism 125 grasps it. The manipulator 123 lifts the tray 124, inserts it into the tray storage section 115 in the body, releases the wrist gripping mechanism 125, and is stowed inside the robot body. After repeating this operation for the designated patients in the room, the robot moves autonomously to the delivery wagon 107 by means of the movement mechanism 108. During this travel as well, the route is replanned using self-position correction information and obstacle detection information derived from the facility feature extraction results of the travel environment measurement and recognition device 118, which uses the visual sensor 116 on top of the robot and the visual sensor 117 at its rear, and the travel control device 120 using the obstacle sensors 119 on the robot body recognizes and avoids obstacles on the travel path.
[0011]
Having moved autonomously to the front of the delivery wagon 107, the meal transport robot 101 extends the articulated multi-degree-of-freedom manipulator 111, stowed in its body during travel, toward a tray 109 in the body's tray storage section 115. When the wrist gripping mechanism 114 has grasped the tray 109, the manipulator 111 withdraws the grasped tray and moves it toward the delivery wagon 107. From the image information of the wrist-mounted visual sensor 112, the manipulation environment measurement and recognition device 113 measures and recognizes an empty position in the delivery wagon, and based on this position information the manipulator inserts the tray into that empty position. After repeating this operation until all trays 109 in the body's tray storage section 115 have been returned, the manipulator 111 is stowed in the robot body again, the movement mechanism 108 carries the robot to the next room, and the clearing and return of further meal trays is repeated.
[0012]
FIG. 2 is a detailed view of the meal transport robot. The movement mechanism 201 shown in the figure has four freely swiveling driven wheels 202 (there may be more than four) at the four corners of its base. At the center of the base are two drive wheels 203 and the motor 204 that drives them, and these can swivel about a vertical axis. By means of a motor 205 that swivels the drive wheels 203 and drive motor 204 about the vertical axis, the drive wheels 203 themselves can be steered to any angle, which allows the movement mechanism 201 to be driven in any direction in the horizontal plane. The motors 204 and 205 and the control device 206 of motor 204 are powered by a battery 207. The battery 207 also supplies all power for the meal transport robot, enabling autonomous movement. A bumper 208 is arranged around the periphery of the movement mechanism 201 as a safety mechanism; switches mounted on the bumper 208 are set to stop travel immediately upon contact with the external environment.
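The steerable drive unit described above, swiveled to any angle about a vertical axis, can realize an arbitrary planar velocity by pointing the unit along the desired direction and driving at the desired speed. A minimal kinematic sketch, under that assumption (function name and interface are illustrative, not from the patent):

```python
import math

# Minimal sketch, assuming a single steerable drive unit as in FIG. 2:
# swivel the unit to the heading of the desired velocity vector, then
# drive at its magnitude. Names and units are illustrative assumptions.

def steer_command(vx, vy):
    """Return (steering angle [rad], wheel speed) for a desired planar velocity."""
    speed = math.hypot(vx, vy)
    angle = math.atan2(vy, vx) if speed > 0 else 0.0
    return angle, speed

angle, speed = steer_command(0.0, 0.5)   # move sideways at 0.5 m/s
```

For a pure sideways request the unit is steered 90 degrees and driven at the requested speed; a real controller would also rate-limit the swivel motor 205.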
[0013]
The articulated multi-degree-of-freedom manipulator 209 has, for example, a prismatic degree of freedom 210 in the vertical plane at its base; rotational degrees of freedom 211, 212, and 213 in the horizontal plane, raised and lowered by it; and a gripping mechanism 214 arranged at its tip. The degrees of freedom 210 to 213 are each independently controlled by motors. The gripping mechanism 214 is a simple fork type in the figure, but a gripper type with an added gripping degree of freedom may also be used. The axis configuration of the manipulator 209 as a whole may likewise be any other configuration able to reach an arbitrary point in space, for example a rotational degree of freedom in the vertical plane at the base, three rotational degrees of freedom in the horizontal plane swiveled by it, a rotational degree of freedom in the vertical plane at the tip, and a gripping mechanism swiveled by that. As safety measures, it is possible, for example, to stop or modify motion by recognizing obstacles in the workspace from the information of the wrist-mounted visual sensor 215; to avoid excessive loads during a collision by means of friction-based mechanical torque limiters installed in each joint; or to control the force applied to the external environment during a collision using information from torque sensors in each joint or a force sensor at the tip.
[0014]
The tray storage section 216 is, for example, a shelf-like box that can hold six trays loaded with meals, each tray supported by rails 217 mounted at both ends of the box's inner surface. Below the tray storage section is a space 218 for stowing the manipulator; when the robot moves, the manipulator 209 is stowed in this space so as to avoid contact between the manipulator 209 and the external environment.
[0015]
FIG. 3 shows a control block diagram of the present invention. The travel control unit 301 consists of a path-following device 302 that, based on the transfer robot's reference path data, can send travel commands to the movement mechanism to follow that path; an obstacle detection unit 303 that integrates information from the obstacle sensors arranged on the robot body and can recognize obstacles on the travel path; and an obstacle avoidance unit 304 that can generate obstacle avoidance trajectories based on information from the obstacle detection unit.
[0016]
The travel environment measurement and recognition device 305 determines the robot's position within the environment map and detects obstacles on the path based on image information from the visual sensors on top of the robot. The two visual sensors 306 on top of the robot are mounted on a rotation mechanism 307 with at least one degree of freedom and can track targets ahead and to the left and right according to the position of the object being recognized and measured (field-of-view control device). When traveling backward (reversing, etc.), images are obtained from the visual sensor 308 attached to the rear of the robot. The images from each visual sensor are fed through an image switcher 309 into the travel environment measurement and recognition device 305. The feature extraction device 310 of the device 305 holds template images of several characteristic parts of the travel environment in advance, matches these templates against the images captured while traveling, and extracts the corresponding characteristic parts within the travel environment. The self-position measurement device 311 measures the distances to the extracted feature positions by stereo vision using the two visual sensors and determines the robot's own position from those results and the feature position information in the environment map. The obstacle detection device detects obstacles on the floor and uses stereo vision to generate obstacle positions and an obstacle map.
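The stereo distance measurement underlying the self-position and obstacle estimates can be illustrated with the standard rectified-pair relation, depth = focal length × baseline / disparity. The focal length and baseline below are assumed example values, not parameters from the patent:

```python
# Hedged sketch of stereo ranging for a rectified camera pair, as used by
# the self-position measurement device. Parameter values are illustrative.

def stereo_depth(x_left, x_right, focal_px=500.0, baseline_m=0.20):
    """Depth (m) of a matched feature from its pixel coordinates in each image."""
    disparity = x_left - x_right       # same feature, left vs. right image column
    if disparity <= 0:
        raise ValueError("feature must have positive disparity")
    return focal_px * baseline_m / disparity

depth = stereo_depth(320.0, 300.0)     # 20 px disparity
```

Note how accuracy degrades with range: a one-pixel matching error matters far more at small disparities, which is why close-range measurements are preferred for precise grasping.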
[0017]
The path planning device 313 holds environment map information for the facility, entered in advance, and uses that map to generate a reference path from the robot's position to the target room and to each bed within the room by means of the reference path planning unit 314. While the robot is traveling, the path replanning unit 315 compares the self-position information measured by the travel environment measurement and recognition device 305 with the reference path information and, if the robot's travel path has deviated from the reference path, replans a corrected path. It also replans an obstacle avoidance path based on the obstacle map generated by the travel environment measurement and recognition device 305 and corrects the reference path accordingly.
[0018]
The manipulation environment measurement and recognition device 316 recognizes objects and measures the distance to them using image information from the two visual sensors 317 at the manipulator's hand or from the visual sensors 306 and 308 on top of the robot. In the object recognition device 318 of a meal-carrying robot, for example, a corner portion of a meal tray is acquired in advance as a template image, and the meal tray can then be recognized from this template and the image information obtained from the visual sensor. Based on the recognition information from the object recognition device 318, the object measurement device 319 can measure the part of the meal tray that the manipulator is to grasp, for example by stereo vision using the two visual sensors 317 at the manipulator's hand. The measurement result is sent to the manipulator device 320, and the manipulator moves according to it. At a distance where measurement accuracy is sufficient, the meal tray can be grasped after a single measurement; at a distance where sufficient accuracy cannot be obtained, the object measurement device 319 repeats the measurement at fixed intervals while tracking the grasp position in the image during manipulator motion, continuously sends the measurement information to the manipulator, and moves the manipulator while correcting its target position (the visual feedback processing flow is shown in FIG. 4). As the object label identification device 321 within the manipulation environment measurement and recognition device 316, for example, the care recipient's name written on a meal nameplate is acquired as a template image; the meal nameplates on the trays in the delivery wagon are identified by template matching, and removal of the required meal tray is directed.
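The measure-while-moving loop described above is a form of visual servoing: each cycle re-measures the target and corrects the commanded goal. The sketch below is a one-dimensional illustration with an assumed proportional gain and an exact measurement function; none of these details come from the patent:

```python
# Sketch of the visual feedback loop: the grasp target is re-measured each
# cycle and the motion goal corrected. Gain, tolerance, and the 1-D model
# are illustrative assumptions.

def visual_servo(hand, target_measure, steps=50, gain=0.5, tol=0.01):
    """Move `hand` (1-D position) toward the re-measured target each cycle."""
    for _ in range(steps):
        target = target_measure(hand)      # fresh measurement every cycle
        error = target - hand
        if abs(error) < tol:
            break                          # close enough to grasp
        hand += gain * error               # corrected motion command
    return hand

# A stationary tray at 1.0 m; the measurement is exact here for simplicity.
final = visual_servo(0.0, lambda h: 1.0)
```

Because the target is re-measured every cycle, the same loop would also track a tray whose measured position drifts, which is the point of feeding measurements continuously to the manipulator.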
[0019] FIG. 5 shows the basic configuration of the meal-serving work simulation. The serving-work simulator performs, in real time, a simulation of the work procedure along the reference route generated by the route planning device. The real-time simulation section performs the robot dynamics calculation, interference checks between the robot and walls, beds, and the like, and simulation of the visual sensors, all in real time; it also handles display and model registration. The external interface and work procedure management section interfaces with the outside and manages the procedure during simulation, and the robot control section calculates the motor torques and the like needed for robot motion that follows the work procedure. The environment model input section enters the facility environment. Using these components, the serving work is simulated. First, the transfer robot is operated on the simulator along the reference route obtained by the same method as the reference route planning section of the navigation part. At this point it can be checked whether the transfer robot would touch the corridor walls or previously entered furniture such as benches while travelling, and whether the travel route is appropriate. For travel within a room, it can also be checked whether the serving order is appropriate. The manipulator motion for serving meals to, and clearing them from, each bed can likewise be verified by simulation and changed if a problem is found.
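The interference check between the robot and walls or beds is not spelled out in the patent; one simple way to sketch it, treating the robot footprint as a disc and obstacles as line segments (our modelling assumption, not the patent's), is to test sampled route positions against every segment:

```python
import math

def point_segment_distance(px, py, ax, ay, bx, by):
    """Shortest distance from point (px, py) to segment (a, b)."""
    abx, aby = bx - ax, by - ay
    apx, apy = px - ax, py - ay
    denom = abx * abx + aby * aby
    # Clamp the projection parameter to the segment's extent.
    t = 0.0 if denom == 0 else max(0.0, min(1.0, (apx * abx + apy * aby) / denom))
    cx, cy = ax + t * abx, ay + t * aby
    return math.hypot(px - cx, py - cy)

def collides(path, walls, robot_radius):
    """Check every sampled robot position along the reference route
    against every wall/bed segment of the facility model."""
    for (px, py) in path:
        for wall in walls:
            if point_segment_distance(px, py, *wall) < robot_radius:
                return True
    return False
```

A real simulator would also sweep the footprint between samples and model the manipulator links, but the disc-versus-segment test captures the basic route-validity check described above.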
[0020] FIG. 6 is a block diagram of the information presentation device. The information presentation device, denoted by reference numeral 401 in the figure, has a display device 402 with a touch panel attached to the surface of the robot. The information presentation processing device 403 has an internal interface device 404 that communicates with devices inside the robot, such as the manipulator and the moving mechanism; it drives the display according to instructions from the touch panel and the internal interface device, and controls the operation of the transfer robot.
[0021] FIG. 7 is an explanatory diagram showing the processing of the information presentation processing device 403. The operation control processing 501 controls the information presentation device and the transfer robot on the basis of operation instructions given by a person pressing the touch panel 402 and of instructions arriving from inside the robot through the internal interface device 404. Based on the results of this control, the image display processing 502 displays characters and images; when devices inside the robot, such as the manipulator or the moving mechanism, are to be started or stopped, control is performed through the internal interface device 404.
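As an illustration only (the patent defines no software interface), the operation control processing 501 can be pictured as a dispatch table that routes touch-panel events and internal-interface messages to display or robot actions; the `Display` and `Robot` stubs and all event names below are hypothetical:

```python
class Display:
    """Stand-in for the image display processing (502)."""
    def __init__(self):
        self.shown = []
    def show(self, msg):
        self.shown.append(msg)

class Robot:
    """Stand-in for devices reached via the internal interface (404)."""
    def __init__(self):
        self.moving = False
    def start(self, _arg):
        self.moving = True
    def stop(self, _arg):
        self.moving = False

def make_controller(display, robot):
    """Build a dispatcher in the spirit of the operation control
    processing (501): map incoming events to actions."""
    handlers = {
        "SHOW":  lambda arg: display.show(arg),
        "START": lambda arg: robot.start(arg),
        "STOP":  lambda arg: robot.stop(arg),
    }
    def dispatch(event, arg=None):
        handler = handlers.get(event)
        if handler is None:
            display.show("unknown command: %s" % event)
            return False
        handler(arg)
        return True
    return dispatch
```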
[0022] FIG. 8 is a schematic view of an information presentation device with a voice processing function. The information presentation device, denoted by reference numeral 401 in the figure, has a display device 402 with a touch panel, two voice input devices 405 (there may be more than two), and two voice output devices 406 (likewise two or more), and is configured so that it can output sounds and display characters and images in response to sound. Under the information presentation device 401, a rotation device 407 with a motor 408 is arranged so that the information presentation device 401 can be rotated freely about the vertical axis. The information presentation processing device 403 is connected to each of these devices; it further has an internal interface device 404 that communicates with devices inside the robot, such as the manipulator and the moving mechanism, and it controls the information presentation device 401 and the transfer robot.
[0023] FIG. 9 is an explanatory diagram showing the processing of the information presentation processing device 403, and gives the details of the information presentation part in FIG. 3. The information presentation processing device 403 has a voice recognition process 503, which extracts a character string from the signal coming in from the voice input devices 405, and a sound source direction detection process 505, which detects the direction from which a sound arrives. The operation control processing 501 controls the operation of the information presentation device and the transfer robot according to the results of the voice recognition process 503 and the sound source direction detection process 505, to operation instructions given by a person pressing the touch panel 402, and to instructions arriving from inside the robot through the internal interface device 404. Based on the results of this control, it performs the voice synthesis process 504 that generates speech, the rotation control process 506 that rotates the information presentation device 401, and the image display process 502 that displays characters and images; when devices inside the robot, such as the manipulator or the moving mechanism, are to be started or stopped, control is performed through the internal interface device 404.
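The sound source direction detection process 505 is not detailed in the patent; a common two-microphone approach is time-difference-of-arrival: estimate the inter-microphone delay by cross-correlation, then convert it to a bearing under a far-field assumption. A sketch (the sample rate, spacing, and function names are our assumptions):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, room temperature

def estimate_delay(left, right, sample_rate):
    """Estimate the inter-microphone delay (seconds) as the lag that
    maximises the cross-correlation of the two sampled signals."""
    n = len(left)
    best_lag, best_val = 0, float("-inf")
    for lag in range(-n + 1, n):
        val = sum(left[i] * right[i - lag]
                  for i in range(max(0, lag), min(n, n + lag)))
        if val > best_val:
            best_val, best_lag = val, lag
    return best_lag / sample_rate

def source_bearing(delay_s, mic_spacing_m):
    """Bearing of the source (radians, 0 = straight ahead) from the
    arrival-time difference: sin(theta) = c * dt / d (far field)."""
    s = SPEED_OF_SOUND * delay_s / mic_spacing_m
    s = max(-1.0, min(1.0, s))  # clamp against measurement noise
    return math.asin(s)
```

The resulting bearing is what the rotation control process 506 would use to turn the device 401 toward the speaker.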
[0024] The remote monitoring operation unit is divided between the inside of the robot and the center side 105 that manages the robot; by using a wireless LAN or the like as the command and image transmission devices 103 and 104, commands and images can be transferred in both directions. The remote monitoring operation unit inside the robot is connected to each mechanism by Ethernet, so that commands can be transferred bidirectionally, and it is equipped with an image compression board to reduce the load on the wireless transmission path. The remote monitoring operation unit on the center side consists of a display with a touch panel, dedicated operation buttons, a personal computer, and the like (see FIG. 10). An operating example of the remote monitoring operation unit is as follows. In normal operation, before a meal begins, the remote monitoring operator at the center transfers data such as each cared-for person's ID, room number, and meal type to the meal transfer robot, and sends the serving start command. While meals are being served and cleared, the center monitors the robot's self-position information sent from the environment measurement and recognition device for travel and, at the operator's request, displays the position on the touch panel display of the remote monitoring operation unit. In an emergency, such as a failure or a situation the robot cannot handle, the robot's self-position and obstacle information from the environment measurement and recognition device for travel, together with the image information from each visual sensor, is sent from the remote monitoring operation unit inside the robot, via the transmission devices, to the center-side remote monitoring operation unit and shown on its touch panel display. While watching the transmitted images of the robot's surroundings, the remote monitoring operator instructs the transfer robot using the menu buttons and robot operation buttons shown on the touch panel display; this information is sent via the transmission devices to the remote monitoring operation unit inside the robot, which sends control commands to each mechanism over the robot's internal Ethernet, so that the robot is controlled directly from the center.
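The patent fixes the transport (wireless LAN, internal Ethernet) but not a wire format; as a purely illustrative assumption, the bidirectional command and status traffic between the center and the robot could be framed as JSON messages like these (message schema, field names, and command names are ours):

```python
import json

def encode_command(command, **params):
    """Center -> robot: a named command with keyword parameters,
    e.g. the serving-start command with care-recipient data."""
    return json.dumps({"type": "command", "name": command, "params": params})

def encode_status(x, y, heading, alarms=()):
    """Robot -> center: self-position from the environment measurement
    and recognition device for travel, plus any pending alarms."""
    return json.dumps({"type": "status", "pose": [x, y, heading],
                       "alarms": list(alarms)})

def decode_message(raw):
    """Parse and minimally validate a message from either side."""
    msg = json.loads(raw)
    if msg.get("type") not in ("command", "status", "alarm"):
        raise ValueError("unknown message type: %r" % msg.get("type"))
    return msg
```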
[0025] Robots that share much the same overall configuration but perform different work include a linen collection and transfer robot and a medicine and medical-record transfer robot. Both can be realized with almost the same overall configuration by transferring and moving a linen storage box, or a medicine and medical-record storage box, in place of the meal trays. Linen, medicines, and medical records can then be carried autonomously between each room's bedside and the laundry room or the nurses' station. A robot that carries meals to a dining hall, rather than to individual rooms, can likewise be handled with almost the same configuration.
[0026] [Effects of the Invention] As described above, the present invention enables a robot to move through the corridors and rooms of a medical or welfare facility, and to transfer objects to and from the robot, safely and autonomously, and thus provides a transfer robot system capable of autonomous operation with a human interface suited to a medical facility.
FIG. 1 is a diagram showing an embodiment of the transfer robot system of the present invention.
FIG. 2 is a detailed view of the meal transfer robot of the present invention.
FIG. 3 is a control block diagram of the present invention.
FIG. 4 is a diagram illustrating visual feedback.
FIG. 5 is a diagram showing the basic configuration of the meal-serving work simulation.
FIG. 6 is a block diagram of the information presentation device.
FIG. 7 is an explanatory diagram showing the processing of the information presentation processing device 403.
FIG. 8 is a schematic view of an information presentation device with a voice processing function.
FIG. 9 is an explanatory diagram showing the processing of the information presentation processing device 403.
FIG. 10 is a diagram showing an example of the remote monitoring operation unit.
101 Meal transfer robot loading or returning meals
102 Meal transfer robot serving or clearing meals
103, 104 Command and image transmission devices
105 Remote monitoring operation unit
106 Real-time simulator for work verification
Continuation of front page (72) Inventor: Naoto Shimozono, 2-1 Kurosaki-Shiroishi, Yahatanishi-ku, Kitakyushu, Fukuoka, within Yaskawa Electric Corporation (72) Inventor: Shinji Kanda, 1015 Kamikodanaka, Nakahara-ku, Kawasaki, Kanagawa, within Fujitsu Limited (72) Inventor: Takashi Uchiyama, 1015 Kamikodanaka, Nakahara-ku, Kawasaki, Kanagawa, within Fujitsu Limited (72) Inventor: Masanobu Ueda, 1015 Kamikodanaka, Nakahara-ku, Kawasaki, Kanagawa, within Fujitsu Limited
Claims (11)

1. A transfer robot system comprising: a transfer robot having a storage section capable of holding objects to be carried, visual sensors, measurement and recognition means based on the sensor information, an articulated multi-degree-of-freedom manipulator that can use the measurement information to put objects into and take them out of the storage section in the transfer robot autonomously, means for measuring and recognizing the travel environment with the visual sensors, means for generating a robot motion path from the measurement and recognition results, a moving mechanism that can move autonomously from travel commands according to the generated path and the sensor information, and interface means for communicating with an operator and others; and a remote monitoring operation unit capable of monitoring the work of the transfer robot and operating it remotely.
2. The transfer robot system according to claim 1, comprising a hand portion capable of gripping or supporting the object to be carried, an articulated multi-degree-of-freedom mechanism that transfers the object between the outside environment and the storage section inside the transfer robot, and a safety mechanism that avoids contact in advance and causes no harm even if contact should occur.
3. The transfer robot system according to claim 1, having a moving mechanism section that carries the various component devices, the batteries that drive them, and the storage section for the objects to be carried, and that comprises four freely swiveling driven wheels arranged at the four corners of the base, two independently controllable drive wheels and drive motors arranged at the center of the base, a steering motor that swivels the drive wheels about a vertical axis, and safety devices for preventing collisions with people, equipment, facilities, and the like, the transfer robot moving autonomously by performing speed control and steering angle control based on travel commands.
4. The transfer robot system according to claim 1, having an environment measurement and recognition device for travel composed of a rotation mechanism with one or more degrees of freedom installed on the upper part of the robot, two visual sensors mounted on that rotation mechanism and a visual sensor installed at the rear of the robot, a feature extraction device that can extract characteristic parts of the facility from the visual sensor information, a self-position measurement device that can measure the robot's own position from those characteristic parts, and an obstacle detection device that can measure and recognize obstacles from the visual sensor information, wherein the robot's own position is corrected and the obstacle avoidance path is re-planned based on the measurement and recognition data and path data from this environment measurement and recognition device for travel, and the reference path data along which the transfer robot travels is generated automatically from the indoor environment map information.
5. The transfer robot system according to claim 4, further comprising a visual field control device that can point the rotation mechanism of the visual sensors toward a feature based on the measurement and recognition data of the environment measurement and recognition device for travel.
6. The transfer robot system according to claim 1, comprising: a path following device that can send, to the moving mechanism section, travel commands for following the transfer robot's reference path data; an obstacle detection part that integrates the information from obstacle sensors, consisting of ultrasonic sensors and optical sensors arranged on the robot body, and can recognize obstacles on the travel path; and an obstacle avoidance part that generates an obstacle avoidance trajectory based on the information from the obstacle detection part.
7. The transfer robot system according to claim 1, comprising an environment measurement and recognition device for manipulation composed of an object recognition device that can recognize an object using prior object information and the image information from one or more visual sensors arranged on the wrist portion of the manipulator or from the visual sensors mounted on the upper part of the robot according to claim 4, an object measurement device that can measure the distance to the object, and an object label identification device that can identify the label of an object based on the visual sensor information, the system having a visual feedback control function in which the environment measurement and recognition device for manipulation repeats measurement and recognition at fixed intervals and the manipulator is driven to the object based on that measurement information.
8. The transfer robot system according to claim 1, having a real-time simulator for work verification with which the transfer work based on a travel plan can be verified in advance by simulation and safe operation can be confirmed.
9. The transfer robot system according to claim 1, comprising: an internal interface device that can communicate bidirectionally with the devices inside the transfer robot, such as the manipulator and the moving mechanism; an image display operation device that carries a display with a touch panel on the surface of the robot, processes the information from the internal interface for display on the screen, and allows operators such as nurses, caregivers, and cared-for persons to instruct the transfer robot simply by selecting menu buttons; and an information processing device that can control the operation of the robot, whereby such operators can easily check the internal state of the robot and operate it easily.
10. The transfer robot system according to claim 9, comprising a voice input device that can receive acoustic signals, a voice output device that can generate acoustic signals, and a rotation device that can rotate the whole unit about a vertical axis, the system recognizing human speech, responding appropriately according to the recognition result, and being able to rotate the whole unit toward the sound source, so that simple conversations with nurses, caregivers, and cared-for persons are possible.
11. The transfer robot system according to claim, wherein each device in the transfer robot is given an alarm generation function for failures, inability to execute a command, and the like; this alarm information, the robot's self-position and obstacle information from the environment measurement and recognition device for travel, and the image information from each of the visual sensors are sent to the remote monitoring operation unit through the command and image transfer device built into the transfer robot; the failure location, the situation that cannot be handled, images of the robot's surroundings, and the like can be shown on the touch panel display of the remote monitoring operation unit; and the system has a remote monitoring operation unit with which a remote monitoring operator can instruct the transfer robot's operation using the touch panel display and the menu buttons and robot operation buttons shown on it, the instruction information being transmitted to the transfer robot through the command and image transmission devices so that the transfer robot can be controlled directly according to that information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP10331296A JP3601737B2 (en) | 1996-03-30 | 1996-03-30 | Transfer robot system |
Publications (2)
Publication Number | Publication Date |
---|---|
JPH09267276A true JPH09267276A (en) | 1997-10-14 |
JP3601737B2 JP3601737B2 (en) | 2004-12-15 |
Family
ID=14350698
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP10331296A Expired - Fee Related JP3601737B2 (en) | 1996-03-30 | 1996-03-30 | Transfer robot system |
Country Status (1)
Country | Link |
---|---|
JP (1) | JP3601737B2 (en) |
Cited By (79)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000176746A (en) * | 1998-12-15 | 2000-06-27 | Advantest Corp | Part handling device |
KR100352306B1 (en) * | 1998-05-15 | 2002-12-26 | 한국전기초자 주식회사 | Device for automatically packing cathode ray tube panel and control method thereof |
JP2003093932A (en) * | 2001-09-25 | 2003-04-02 | Toyota Motor Corp | Coating system |
JP2003524531A (en) * | 2000-02-21 | 2003-08-19 | ヴィッテンシュタイン アーゲー | Method for recognizing, defining and locating at least one object and / or space |
JP2004081567A (en) * | 2002-08-27 | 2004-03-18 | Secom Co Ltd | Meal support device |
WO2004106009A1 (en) * | 2003-06-02 | 2004-12-09 | Matsushita Electric Industrial Co., Ltd. | Article operating system and method, and article managing system and method |
WO2005015467A1 (en) * | 2003-08-07 | 2005-02-17 | Matsushita Electric Industrial Co., Ltd. | Life supporting system |
WO2005015466A1 (en) * | 2003-08-07 | 2005-02-17 | Matsushita Electric Industrial Co., Ltd. | Life assisting system and its control program |
JP2005125457A (en) * | 2003-10-24 | 2005-05-19 | Yaskawa Electric Corp | Mobile robot for work |
JP2005209090A (en) * | 2004-01-26 | 2005-08-04 | Matsushita Electric Works Ltd | Self-position recognition service cart |
JP2006508806A (en) * | 2002-07-25 | 2006-03-16 | インタッチ−ヘルス・インコーポレーテッド | Medical remote control robot system |
JP2007094743A (en) * | 2005-09-28 | 2007-04-12 | Zmp:Kk | Autonomous mobile robot and system therefor |
JP2007111854A (en) * | 2003-06-02 | 2007-05-10 | Matsushita Electric Ind Co Ltd | Article handling system and article handling server |
JPWO2006006624A1 (en) * | 2004-07-13 | 2008-05-01 | 松下電器産業株式会社 | Article holding system, robot, and robot control method |
JP2008112442A (en) * | 2006-10-06 | 2008-05-15 | Amada Co Ltd | Sheet metal working system |
JP2008229800A (en) * | 2007-03-22 | 2008-10-02 | Toshiba Corp | Arm-mounted mobile robot and its control method |
JP2011143497A (en) * | 2010-01-13 | 2011-07-28 | Ihi Corp | Device and method for tray transfer |
JP2012245575A (en) * | 2011-05-25 | 2012-12-13 | Toyota Motor East Japan Inc | Work support system |
JP2013082071A (en) * | 2013-02-12 | 2013-05-09 | Toyota Motor East Japan Inc | Work assist system |
JP2014006832A (en) * | 2012-06-27 | 2014-01-16 | Hitachi Ltd | Conveyance system |
US8836751B2 (en) | 2011-11-08 | 2014-09-16 | Intouch Technologies, Inc. | Tele-presence system with a user interface that displays different communication links |
US8849679B2 (en) | 2006-06-15 | 2014-09-30 | Intouch Technologies, Inc. | Remote controlled robot system that provides medical images |
US8849680B2 (en) | 2009-01-29 | 2014-09-30 | Intouch Technologies, Inc. | Documentation through a remote presence robot |
US8892260B2 (en) | 2007-03-20 | 2014-11-18 | Irobot Corporation | Mobile robot for telecommunication |
US8930019B2 (en) | 2010-12-30 | 2015-01-06 | Irobot Corporation | Mobile human interface robot |
US8935005B2 (en) | 2010-05-20 | 2015-01-13 | Irobot Corporation | Operating a mobile robot |
US8965579B2 (en) | 2011-01-28 | 2015-02-24 | Intouch Technologies | Interfacing with a mobile telepresence robot |
US8983174B2 (en) | 2004-07-13 | 2015-03-17 | Intouch Technologies, Inc. | Mobile robot with a head-based movement mapping scheme |
US8996165B2 (en) | 2008-10-21 | 2015-03-31 | Intouch Technologies, Inc. | Telepresence robot with a camera boom |
US9014848B2 (en) | 2010-05-20 | 2015-04-21 | Irobot Corporation | Mobile robot system |
JP2015092348A (en) * | 2010-12-30 | 2015-05-14 | アイロボット コーポレイション | Mobile human interface robot |
US9089972B2 (en) | 2010-03-04 | 2015-07-28 | Intouch Technologies, Inc. | Remote presence system including a cart that supports a robot face and an overhead camera |
US9193065B2 (en) | 2008-07-10 | 2015-11-24 | Intouch Technologies, Inc. | Docking system for a tele-presence robot |
US9251313B2 (en) | 2012-04-11 | 2016-02-02 | Intouch Technologies, Inc. | Systems and methods for visualizing and managing telepresence devices in healthcare networks |
US9264664B2 (en) | 2010-12-03 | 2016-02-16 | Intouch Technologies, Inc. | Systems and methods for dynamic bandwidth allocation |
US9323250B2 (en) | 2011-01-28 | 2016-04-26 | Intouch Technologies, Inc. | Time-dependent navigation of telepresence robots |
US9375843B2 (en) | 2003-12-09 | 2016-06-28 | Intouch Technologies, Inc. | Protocol for a remotely controlled videoconferencing robot |
US9429934B2 (en) | 2008-09-18 | 2016-08-30 | Intouch Technologies, Inc. | Mobile videoconferencing robot system with network adaptive driving |
US9498886B2 (en) | 2010-05-20 | 2016-11-22 | Irobot Corporation | Mobile human interface robot |
US9602765B2 (en) | 2009-08-26 | 2017-03-21 | Intouch Technologies, Inc. | Portable remote presence robot |
CN107024934A (en) * | 2017-04-21 | 2017-08-08 | 山东大学 | A kind of hospital service robot and method based on cloud platform |
US9776327B2 (en) | 2012-05-22 | 2017-10-03 | Intouch Technologies, Inc. | Social behavior rules for a medical telepresence robot |
CN107421755A (en) * | 2017-08-11 | 2017-12-01 | 中汽研(天津)汽车工程研究院有限公司 | A kind of automobile composite measurement platform |
US9842192B2 (en) | 2008-07-11 | 2017-12-12 | Intouch Technologies, Inc. | Tele-presence robot system with multi-cast features |
US9849593B2 (en) | 2002-07-25 | 2017-12-26 | Intouch Technologies, Inc. | Medical tele-robotic system with a master remote station with an arbitrator |
US9974612B2 (en) | 2011-05-19 | 2018-05-22 | Intouch Technologies, Inc. | Enhanced diagnostics for a telepresence robot |
US10061896B2 (en) | 2012-05-22 | 2018-08-28 | Intouch Technologies, Inc. | Graphical user interfaces including touchpad driving interfaces for telemedicine devices |
US10059000B2 (en) | 2008-11-25 | 2018-08-28 | Intouch Technologies, Inc. | Server connectivity control for a tele-presence robot |
CN109110249A (en) * | 2018-09-12 | 2019-01-01 | 苏州博众机器人有限公司 | A kind of dispensing machine people |
WO2019014030A1 (en) * | 2017-07-11 | 2019-01-17 | Zume, Inc. | Multi-modal distribution systems and methods using vending kiosks and autonomous delivery vehicles |
US10259119B2 (en) | 2005-09-30 | 2019-04-16 | Intouch Technologies, Inc. | Multi-camera mobile teleconferencing platform |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11214437B1 (en) | 2017-09-13 | 2022-01-04 | AI Incorporated | Autonomous mobile robotic device for the transportation of items |
US11086314B1 (en) | 2018-01-09 | 2021-08-10 | AI Incorporated | Autonomous signal boosting robotic device |
1996
- 1996-03-30: Application JP10331296A filed in Japan; granted as patent JP3601737B2 (status: not active, Expired - Fee Related)
Cited By (131)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100352306B1 (en) * | 1998-05-15 | 2002-12-26 | 한국전기초자 주식회사 | Device for automatically packing cathode ray tube panel and control method thereof |
JP2000176746A (en) * | 1998-12-15 | 2000-06-27 | Advantest Corp | Part handling device |
JP2003524531A (en) * | 2000-02-21 | 2003-08-19 | Wittenstein AG | Method for recognizing, defining and locating at least one object and/or space |
JP2003093932A (en) * | 2001-09-25 | 2003-04-02 | Toyota Motor Corp | Coating system |
JP2006508806A (en) * | 2002-07-25 | 2006-03-16 | インタッチ−ヘルス・インコーポレーテッド | Medical remote control robot system |
US9849593B2 (en) | 2002-07-25 | 2017-12-26 | Intouch Technologies, Inc. | Medical tele-robotic system with a master remote station with an arbitrator |
US8209051B2 (en) | 2002-07-25 | 2012-06-26 | Intouch Technologies, Inc. | Medical tele-robotic system |
US10315312B2 (en) | 2002-07-25 | 2019-06-11 | Intouch Technologies, Inc. | Medical tele-robotic system with a master remote station with an arbitrator |
JP2004081567A (en) * | 2002-08-27 | 2004-03-18 | Secom Co Ltd | Meal support device |
US7209803B2 (en) | 2003-02-17 | 2007-04-24 | Matsushita Electric Industrial Co., Ltd. | Article handling system and method and article management system and method |
US7187998B2 (en) | 2003-02-17 | 2007-03-06 | Matsushita Electric Industrial Co., Ltd. | Article handling system and method and article management system and method |
WO2004106009A1 (en) * | 2003-06-02 | 2004-12-09 | Matsushita Electric Industrial Co., Ltd. | Article operating system and method, and article managing system and method |
US7187999B2 (en) | 2003-06-02 | 2007-03-06 | Matsushita Electric Industrial Co., Ltd. | Article handling system and method and article management system and method |
US7191035B2 (en) | 2003-06-02 | 2007-03-13 | Matsushita Electric Industrial Co., Ltd. | Article handling system and method and article management system and method |
US7206668B2 (en) | 2003-06-02 | 2007-04-17 | Matsushita Electric Industrial Co., Ltd. | Article handling system and method and article management system and method |
JP2007111854A (en) * | 2003-06-02 | 2007-05-10 | Matsushita Electric Ind Co Ltd | Article handling system and article handling server |
WO2005015467A1 (en) * | 2003-08-07 | 2005-02-17 | Matsushita Electric Industrial Co., Ltd. | Life supporting system |
WO2005015466A1 (en) * | 2003-08-07 | 2005-02-17 | Matsushita Electric Industrial Co., Ltd. | Life assisting system and its control program |
JP2005125457A (en) * | 2003-10-24 | 2005-05-19 | Yaskawa Electric Corp | Mobile robot for work |
US9956690B2 (en) | 2003-12-09 | 2018-05-01 | Intouch Technologies, Inc. | Protocol for a remotely controlled videoconferencing robot |
US9375843B2 (en) | 2003-12-09 | 2016-06-28 | Intouch Technologies, Inc. | Protocol for a remotely controlled videoconferencing robot |
US10882190B2 (en) | 2003-12-09 | 2021-01-05 | Teladoc Health, Inc. | Protocol for a remotely controlled videoconferencing robot |
JP2005209090A (en) * | 2004-01-26 | 2005-08-04 | Matsushita Electric Works Ltd | Self-position recognition service cart |
US9766624B2 (en) | 2004-07-13 | 2017-09-19 | Intouch Technologies, Inc. | Mobile robot with a head-based movement mapping scheme |
US7706918B2 (en) | 2004-07-13 | 2010-04-27 | Panasonic Corporation | Article holding system, robot, and method of controlling robot |
JPWO2006006624A1 (en) * | 2004-07-13 | 2008-05-01 | Matsushita Electric Industrial Co., Ltd. | Article holding system, robot, and robot control method |
US10241507B2 (en) | 2004-07-13 | 2019-03-26 | Intouch Technologies, Inc. | Mobile robot with a head-based movement mapping scheme |
US8983174B2 (en) | 2004-07-13 | 2015-03-17 | Intouch Technologies, Inc. | Mobile robot with a head-based movement mapping scheme |
JP2007094743A (en) * | 2005-09-28 | 2007-04-12 | Zmp:Kk | Autonomous mobile robot and system therefor |
US10259119B2 (en) | 2005-09-30 | 2019-04-16 | Intouch Technologies, Inc. | Multi-camera mobile teleconferencing platform |
US8849679B2 (en) | 2006-06-15 | 2014-09-30 | Intouch Technologies, Inc. | Remote controlled robot system that provides medical images |
JP2008112442A (en) * | 2006-10-06 | 2008-05-15 | Amada Co Ltd | Sheet metal working system |
US9296109B2 (en) | 2007-03-20 | 2016-03-29 | Irobot Corporation | Mobile robot for telecommunication |
US8892260B2 (en) | 2007-03-20 | 2014-11-18 | Irobot Corporation | Mobile robot for telecommunication |
JP4550849B2 (en) * | 2007-03-22 | 2010-09-22 | Toshiba Corporation | Mobile robot with arm |
JP2008229800A (en) * | 2007-03-22 | 2008-10-02 | Toshiba Corp | Arm-mounted mobile robot and its control method |
US10682763B2 (en) | 2007-05-09 | 2020-06-16 | Intouch Technologies, Inc. | Robot system that operates through a network firewall |
US11787060B2 (en) | 2008-03-20 | 2023-10-17 | Teladoc Health, Inc. | Remote presence system mounted to operating room hardware |
US10875182B2 (en) | 2008-03-20 | 2020-12-29 | Teladoc Health, Inc. | Remote presence system mounted to operating room hardware |
US11472021B2 (en) | 2008-04-14 | 2022-10-18 | Teladoc Health, Inc. | Robotic based health care system |
US10471588B2 (en) | 2008-04-14 | 2019-11-12 | Intouch Technologies, Inc. | Robotic based health care system |
US10493631B2 (en) | 2008-07-10 | 2019-12-03 | Intouch Technologies, Inc. | Docking system for a tele-presence robot |
US9193065B2 (en) | 2008-07-10 | 2015-11-24 | Intouch Technologies, Inc. | Docking system for a tele-presence robot |
US10878960B2 (en) | 2008-07-11 | 2020-12-29 | Teladoc Health, Inc. | Tele-presence robot system with multi-cast features |
US9842192B2 (en) | 2008-07-11 | 2017-12-12 | Intouch Technologies, Inc. | Tele-presence robot system with multi-cast features |
US9429934B2 (en) | 2008-09-18 | 2016-08-30 | Intouch Technologies, Inc. | Mobile videoconferencing robot system with network adaptive driving |
US8996165B2 (en) | 2008-10-21 | 2015-03-31 | Intouch Technologies, Inc. | Telepresence robot with a camera boom |
US10059000B2 (en) | 2008-11-25 | 2018-08-28 | Intouch Technologies, Inc. | Server connectivity control for a tele-presence robot |
US10875183B2 (en) | 2008-11-25 | 2020-12-29 | Teladoc Health, Inc. | Server connectivity control for tele-presence robot |
US8849680B2 (en) | 2009-01-29 | 2014-09-30 | Intouch Technologies, Inc. | Documentation through a remote presence robot |
US10969766B2 (en) | 2009-04-17 | 2021-04-06 | Teladoc Health, Inc. | Tele-presence robot system with software modularity, projector and laser pointer |
US10404939B2 (en) | 2009-08-26 | 2019-09-03 | Intouch Technologies, Inc. | Portable remote presence robot |
US9602765B2 (en) | 2009-08-26 | 2017-03-21 | Intouch Technologies, Inc. | Portable remote presence robot |
US10911715B2 (en) | 2009-08-26 | 2021-02-02 | Teladoc Health, Inc. | Portable remote presence robot |
US11399153B2 (en) | 2009-08-26 | 2022-07-26 | Teladoc Health, Inc. | Portable telepresence apparatus |
JP2011143497A (en) * | 2010-01-13 | 2011-07-28 | Ihi Corp | Device and method for tray transfer |
US11154981B2 (en) | 2010-02-04 | 2021-10-26 | Teladoc Health, Inc. | Robot user interface for telepresence robot system |
US11798683B2 (en) | 2010-03-04 | 2023-10-24 | Teladoc Health, Inc. | Remote presence system including a cart that supports a robot face and an overhead camera |
US9089972B2 (en) | 2010-03-04 | 2015-07-28 | Intouch Technologies, Inc. | Remote presence system including a cart that supports a robot face and an overhead camera |
US9902069B2 (en) | 2010-05-20 | 2018-02-27 | Irobot Corporation | Mobile robot system |
US8935005B2 (en) | 2010-05-20 | 2015-01-13 | Irobot Corporation | Operating a mobile robot |
US9498886B2 (en) | 2010-05-20 | 2016-11-22 | Irobot Corporation | Mobile human interface robot |
US9014848B2 (en) | 2010-05-20 | 2015-04-21 | Irobot Corporation | Mobile robot system |
US10343283B2 (en) | 2010-05-24 | 2019-07-09 | Intouch Technologies, Inc. | Telepresence robot system that can be accessed by a cellular phone |
US11389962B2 (en) | 2010-05-24 | 2022-07-19 | Teladoc Health, Inc. | Telepresence robot system that can be accessed by a cellular phone |
US10808882B2 (en) | 2010-05-26 | 2020-10-20 | Intouch Technologies, Inc. | Tele-robotic system with a robot face placed on a chair |
US9264664B2 (en) | 2010-12-03 | 2016-02-16 | Intouch Technologies, Inc. | Systems and methods for dynamic bandwidth allocation |
US10218748B2 (en) | 2010-12-03 | 2019-02-26 | Intouch Technologies, Inc. | Systems and methods for dynamic bandwidth allocation |
JP2015092348A (en) * | 2010-12-30 | 2015-05-14 | iRobot Corporation | Mobile human interface robot |
US8930019B2 (en) | 2010-12-30 | 2015-01-06 | Irobot Corporation | Mobile human interface robot |
US9785149B2 (en) | 2011-01-28 | 2017-10-10 | Intouch Technologies, Inc. | Time-dependent navigation of telepresence robots |
US9469030B2 (en) | 2011-01-28 | 2016-10-18 | Intouch Technologies | Interfacing with a mobile telepresence robot |
US8965579B2 (en) | 2011-01-28 | 2015-02-24 | Intouch Technologies | Interfacing with a mobile telepresence robot |
US11468983B2 (en) | 2011-01-28 | 2022-10-11 | Teladoc Health, Inc. | Time-dependent navigation of telepresence robots |
US10591921B2 (en) | 2011-01-28 | 2020-03-17 | Intouch Technologies, Inc. | Time-dependent navigation of telepresence robots |
US10399223B2 (en) | 2011-01-28 | 2019-09-03 | Intouch Technologies, Inc. | Interfacing with a mobile telepresence robot |
US9323250B2 (en) | 2011-01-28 | 2016-04-26 | Intouch Technologies, Inc. | Time-dependent navigation of telepresence robots |
US11289192B2 (en) | 2011-01-28 | 2022-03-29 | Intouch Technologies, Inc. | Interfacing with a mobile telepresence robot |
US10769739B2 (en) | 2011-04-25 | 2020-09-08 | Intouch Technologies, Inc. | Systems and methods for management of information among medical providers and facilities |
US9974612B2 (en) | 2011-05-19 | 2018-05-22 | Intouch Technologies, Inc. | Enhanced diagnostics for a telepresence robot |
JP2012245575A (en) * | 2011-05-25 | 2012-12-13 | Toyota Motor East Japan Inc | Work support system |
US10331323B2 (en) | 2011-11-08 | 2019-06-25 | Intouch Technologies, Inc. | Tele-presence system with a user interface that displays different communication links |
US8836751B2 (en) | 2011-11-08 | 2014-09-16 | Intouch Technologies, Inc. | Tele-presence system with a user interface that displays different communication links |
US9715337B2 (en) | 2011-11-08 | 2017-07-25 | Intouch Technologies, Inc. | Tele-presence system with a user interface that displays different communication links |
US9251313B2 (en) | 2012-04-11 | 2016-02-02 | Intouch Technologies, Inc. | Systems and methods for visualizing and managing telepresence devices in healthcare networks |
US11205510B2 (en) | 2012-04-11 | 2021-12-21 | Teladoc Health, Inc. | Systems and methods for visualizing and managing telepresence devices in healthcare networks |
US10762170B2 (en) | 2012-04-11 | 2020-09-01 | Intouch Technologies, Inc. | Systems and methods for visualizing patient and telepresence device statistics in a healthcare network |
US9776327B2 (en) | 2012-05-22 | 2017-10-03 | Intouch Technologies, Inc. | Social behavior rules for a medical telepresence robot |
US10892052B2 (en) | 2012-05-22 | 2021-01-12 | Intouch Technologies, Inc. | Graphical user interfaces including touchpad driving interfaces for telemedicine devices |
US11628571B2 (en) | 2012-05-22 | 2023-04-18 | Teladoc Health, Inc. | Social behavior rules for a medical telepresence robot |
US10780582B2 (en) | 2012-05-22 | 2020-09-22 | Intouch Technologies, Inc. | Social behavior rules for a medical telepresence robot |
US10658083B2 (en) | 2012-05-22 | 2020-05-19 | Intouch Technologies, Inc. | Graphical user interfaces including touchpad driving interfaces for telemedicine devices |
US10603792B2 (en) | 2012-05-22 | 2020-03-31 | Intouch Technologies, Inc. | Clinical workflows utilizing autonomous and semiautonomous telemedicine devices |
US11453126B2 (en) | 2012-05-22 | 2022-09-27 | Teladoc Health, Inc. | Clinical workflows utilizing autonomous and semi-autonomous telemedicine devices |
US11515049B2 (en) | 2012-05-22 | 2022-11-29 | Teladoc Health, Inc. | Graphical user interfaces including touchpad driving interfaces for telemedicine devices |
US10328576B2 (en) | 2012-05-22 | 2019-06-25 | Intouch Technologies, Inc. | Social behavior rules for a medical telepresence robot |
US10061896B2 (en) | 2012-05-22 | 2018-08-28 | Intouch Technologies, Inc. | Graphical user interfaces including touchpad driving interfaces for telemedicine devices |
JP2014006832A (en) * | 2012-06-27 | 2014-01-16 | Hitachi Ltd | Conveyance system |
US10334205B2 (en) | 2012-11-26 | 2019-06-25 | Intouch Technologies, Inc. | Enhanced video interaction for a user interface of a telepresence network |
US10924708B2 (en) | 2012-11-26 | 2021-02-16 | Teladoc Health, Inc. | Enhanced video interaction for a user interface of a telepresence network |
US11910128B2 (en) | 2012-11-26 | 2024-02-20 | Teladoc Health, Inc. | Enhanced video interaction for a user interface of a telepresence network |
JP2013082071A (en) * | 2013-02-12 | 2013-05-09 | Toyota Motor East Japan Inc | Work assist system |
JP2019117207A (en) * | 2013-07-26 | 2019-07-18 | F. Hoffmann-La Roche Aktiengesellschaft | Method for handling sample tube and handling device |
CN107024934B (en) * | 2017-04-21 | 2023-06-02 | Shandong University | Hospital service robot and method based on cloud platform |
CN107024934A (en) * | 2017-04-21 | 2017-08-08 | Shandong University | Hospital service robot and method based on cloud platform |
US11862302B2 (en) | 2017-04-24 | 2024-01-02 | Teladoc Health, Inc. | Automated transcription and documentation of tele-health encounters |
WO2019014030A1 (en) * | 2017-07-11 | 2019-01-17 | Zume, Inc. | Multi-modal distribution systems and methods using vending kiosks and autonomous delivery vehicles |
US10654394B2 (en) | 2017-07-11 | 2020-05-19 | Zume, Inc. | Multi-modal distribution systems and methods using vending kiosks and autonomous delivery vehicles |
US10902371B2 (en) | 2017-07-14 | 2021-01-26 | Zume, Inc. | Vending-kiosk based systems and methods to vend and/or prepare items, for instance prepared foods |
US10885492B2 (en) | 2017-07-14 | 2021-01-05 | Zume, Inc. | Vending-kiosk based systems and methods to vend and/or prepare items, for instance prepared foods |
US11742094B2 (en) | 2017-07-25 | 2023-08-29 | Teladoc Health, Inc. | Modular telehealth cart with thermal imaging and touch screen user interface |
CN107421755A (en) * | 2017-08-11 | 2017-12-01 | CATARC (Tianjin) Automotive Engineering Research Institute Co., Ltd. | Automobile composite measurement platform |
US11636944B2 (en) | 2017-08-25 | 2023-04-25 | Teladoc Health, Inc. | Connectivity infrastructure for a telehealth platform |
US11839981B2 (en) | 2018-04-08 | 2023-12-12 | Airobot Co., Ltd. | Autonomous moving transfer robot |
CN110340863A (en) * | 2018-04-08 | 2019-10-18 | AIrobot Co., Ltd. | Autonomous transfer robot |
JP2021517076A (en) * | 2018-04-08 | 2021-07-15 | AIrobot Co., Ltd. | Autonomous mobile transfer robot |
CN110340863B (en) * | 2018-04-08 | 2023-02-17 | AIrobot Co., Ltd. | Autonomous mobile transfer robot |
US11389064B2 (en) | 2018-04-27 | 2022-07-19 | Teladoc Health, Inc. | Telehealth cart that supports a removable tablet with seamless audio/video switching |
CN112087984A (en) * | 2018-06-14 | 2020-12-15 | 国际商业机器公司 | Robot identification manager |
CN109110249A (en) * | 2018-09-12 | 2019-01-01 | Suzhou Bozhong Robot Co., Ltd. | Dispensing robot |
JP2021070090A (en) * | 2019-10-30 | 2021-05-06 | Toyota Motor Corporation | Robot |
JP2021086198A (en) * | 2019-11-25 | 2021-06-03 | Toyota Motor Corporation | Carrying system, carrying method and program |
KR102292795B1 (en) * | 2020-02-28 | 2021-08-23 | 한양대학교 에리카산학협력단 | Serving Robots And Control Method Thereof |
CN111906776A (en) * | 2020-06-15 | 2020-11-10 | Guangzhou Railway Polytechnic (Guangzhou Railway Mechanical School) | Control method and device for railway food delivery robot |
US11890707B2 (en) * | 2020-07-08 | 2024-02-06 | Huawei Technologies Co., Ltd. | Device assembling system, method, and apparatus |
CN113909829A (en) * | 2020-07-08 | 2022-01-11 | Huawei Technologies Co., Ltd. | System, method and apparatus for assembling equipment |
CN112207794A (en) * | 2020-10-26 | 2021-01-12 | Beijing Luobide Technology Co., Ltd. | Hotel meal delivery robot and working method thereof |
CN112207794B (en) * | 2020-10-26 | 2021-08-03 | Beijing Luobide Technology Co., Ltd. | Hotel meal delivery robot and working method thereof |
KR20220099649A (en) * | 2021-01-07 | 2022-07-14 | 우리로봇 주식회사 | Serving robot for serving food providing sterilization function |
EP4140661A1 (en) * | 2021-08-24 | 2023-03-01 | Siemens Healthcare GmbH | Autonomous mobile laboratory assistance robot for in vitro diagnostic laboratory |
CN115786054A (en) * | 2022-11-07 | 2023-03-14 | Shanxi Wanli Technology Co., Ltd. | System and method for taking fermented-grain material from ground jars using a suspended eight-axis robot |
Also Published As
Publication number | Publication date |
---|---|
JP3601737B2 (en) | 2004-12-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP3601737B2 (en) | Transfer robot system | |
Mišeikis et al. | Lio: a personal robot assistant for human-robot interaction and care applications | |
Graf et al. | Care-O-bot II—Development of a next generation robotic home assistant | |
Hans et al. | Robotic home assistant care-o-bot: Past-present-future | |
US10893988B2 (en) | Patient support systems and methods for docking, transporting, sterilizing, and storing patient support decks | |
JP5351221B2 (en) | Robotic transfer device and system | |
US6925357B2 (en) | Medical tele-robotic system | |
CA2625895C (en) | Robotic ordering and delivery apparatuses, systems and methods | |
US20040006422A1 (en) | Computer-controlled power wheelchair navigation system | |
US8499379B2 (en) | Robotic posture transfer assist devices and methods | |
WO2015143273A2 (en) | Mobile human-friendly assistive robot | |
WO2021227900A1 (en) | Robotic assistant | |
US9393692B1 (en) | Apparatus and method of assisting an unattended robot | |
US11813028B2 (en) | Active-detection self-propelled artificial intelligence surgical navigation cart | |
KR20150119734A (en) | Hospital Room Assistant Robot | |
US20220206506A1 (en) | Robot control system, robot control method, and program | |
Fiorini et al. | Health care robotics: A progress report | |
Thinh et al. | Telemedicine mobile robot-robots to assist in remote medical | |
Krishnamurthy et al. | HelpMate: A robotic courier for hospital use | |
Takanobu et al. | Remote interaction between human and humanoid robot | |
US20220208328A1 (en) | Transport system, transport method, and program | |
Hans et al. | Robotic home assistant Care-O-bot II | |
US11755009B2 (en) | Transport system, transport method, and program | |
CN111554389A (en) | Hospital service management system and hospital service robot control method | |
TWI692352B (en) | Self-propelled artificial intelligence surgical navigation cart with active detection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
2004-06-16 | A131 | Notification of reasons for refusal | JAPANESE INTERMEDIATE CODE: A131 |
2004-08-11 | A521 | Request for written amendment filed | JAPANESE INTERMEDIATE CODE: A523 |
| TRDD | Decision of grant or rejection written | |
2004-09-15 | A01 | Written decision to grant a patent or to grant a registration (utility model) | JAPANESE INTERMEDIATE CODE: A01 |
2004-09-16 | A61 | First payment of annual fees (during grant procedure) | JAPANESE INTERMEDIATE CODE: A61 |
| R150 | Certificate of patent or registration of utility model | JAPANESE INTERMEDIATE CODE: R150 |
| FPAY | Renewal fee payment (event date is renewal date of database) | PAYMENT UNTIL: 2008-10-01; Year of fee payment: 4 |
| FPAY | Renewal fee payment (event date is renewal date of database) | PAYMENT UNTIL: 2009-10-01; Year of fee payment: 5 |
| S111 | Request for change of ownership or part of ownership | JAPANESE INTERMEDIATE CODE: R313113 |
| FPAY | Renewal fee payment (event date is renewal date of database) | PAYMENT UNTIL: 2009-10-01; Year of fee payment: 5 |
| R350 | Written notification of registration of transfer | JAPANESE INTERMEDIATE CODE: R350 |
| FPAY | Renewal fee payment (event date is renewal date of database) | PAYMENT UNTIL: 2010-10-01; Year of fee payment: 6 |
| LAPS | Cancellation because of no payment of annual fees | |