JP2008125696A - Meal support system - Google Patents
Meal support system
- Publication number
- JP2008125696A (application JP2006312523A)
- Authority
- JP
- Japan
- Prior art keywords
- food
- meal support
- user
- meal
- dish
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- Accommodation For Nursing Or Treatment Tables (AREA)
Abstract
Description
The present invention relates to a meal support system for assisting persons with upper-limb disabilities, the elderly, and others during meals.
There are currently about 577,000 people with upper-limb disabilities in Japan, and as the population ages, the number of people with age-related upper-limb dysfunction is rising rapidly; this is expected to become a major social problem. People with upper-limb disabilities find it difficult to manage daily life on their own and need a caregiver's support. Meal assistance in particular is one of the most time-consuming care tasks and must be tailored closely to each person's physical condition, placing a heavy burden on caregivers. For the person receiving care, meal assistance is also a heavy psychological burden: it can hurt their self-esteem, and deference to the caregiver keeps them from eating at their own pace. Because meals are among the most important events that enrich life, a meal support robot that lets people with upper-limb disabilities eat unaided would contribute greatly to their quality of life (QOL). Meal support robots have therefore been developed both in Japan and abroad: robots operated through touch sensors that detect finger or arm movements, manipulators that assist the eating and drinking motions of people with upper-limb disabilities, robots that the user operates with a joystick, and meal support robot hands that apply voice recognition. All of these robots, however, use articulated arms with DC motors as actuators. With an articulated mechanism, vibration during operation is unavoidable, food drops easily in transit, and accurate delivery of food to the mouth is difficult. DC motors raise electromagnetic compatibility (EMC) problems: they are dangerous for pacemaker wearers and may interfere with home appliances. Moreover, touch sensors and joysticks cannot be operated by people who have completely lost upper-limb motor function, and even users with residual function need considerable training to operate such robots.
Patent Documents 1 to 3 are cited as prior art.
Patent Document 1 describes a meal support robot. It uses an articulated robot, a touch sensor for operation, and a stepping motor or similar device as the drive means.
Patent Document 2 describes a motor function assisting device in which drive control is performed by eye movement. Because eye movement is detected through changes in electric potential, electrodes must be placed around the user's eyes.
Patent Document 3, a patent by the present inventors, describes a system in which a user communicates intentions through eye movement.
An object of the present invention is to solve the above problems and provide a practical meal support system that can be operated even by a person who has completely lost upper-limb motor function.
To solve the above problems, the present invention has the following configuration.
A meal support system comprising: a meal support robot that provides meals to a user; one or more imaging means that monitor the state of the user and the state of the meal support robot; display means that presents the user with information including an operation menu and a cursor; and control means that controls the meal support robot based on information from the imaging means and displays information on the display means. The meal support robot comprises a food dish on which food is placed, food providing means that delivers food to the user, food pushing means that pushes food from the food dish onto the food providing means, and food dish moving means that moves the food dish in a direction orthogonal to the pushing direction of the food pushing means. The food providing means, food pushing means, and food dish moving means are driven by actuators, and each means is controlled in a Cartesian (orthogonal) coordinate scheme.
A CCD camera can be used as the imaging means; two cameras are preferred, but the number is not limited to two. A PC or the like can serve as the control means.
An ultrasonic motor, DC motor, pneumatic motor, pneumatic cylinder, or the like can be used as the actuator.
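One practical consequence of this Cartesian configuration is that trajectory generation reduces to one-dimensional interpolation per axis, with no inverse kinematics. A minimal Python sketch of the idea (axis names and distances are illustrative, not taken from the patent):

```python
import numpy as np

def axis_trajectory(start_mm: float, goal_mm: float, steps: int = 50) -> np.ndarray:
    """For an orthogonal axis, a point-to-point trajectory is just a 1-D
    interpolation between positions; an articulated arm would instead need
    inverse kinematics to solve for joint angles."""
    return np.linspace(start_mm, goal_mm, steps)

# Three independent axes: the dish moves laterally; the pusher and the
# spoon move toward the user, orthogonal to the dish.
dish_path  = axis_trajectory(0.0, -40.0)   # align a compartment with the pusher
push_path  = axis_trajectory(0.0,  60.0)   # push food onto the spoon
spoon_path = axis_trajectory(0.0, 250.0)   # carry the spoon to the mouth
```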
The invention further has the following embodiments.
The food dish has a plurality of compartments, each holding a different food; the walls between the compartments run parallel to the pushing direction of the food pushing means, and the side of each compartment from which food is pushed onto the food providing means is open.
The control means performs intelligent visual control based on state information of the meal support robot obtained from the imaging means.
The actuator is an ultrasonic motor.
The ultrasonic motor is controlled by intelligent variable IMC-PID control.
The control means has an eye-interface function operated through the state of the user's eyes as captured by the imaging means.
The eye-interface function uses a genetic algorithm to detect eyeball position and gaze direction.
The system further has voice input means for capturing the user's voice, and the control means has a voice-interface function operated through the input voice.
With this configuration, the invention provides a meal support system that even a person who has completely lost upper-limb motor function can operate unaided. Because the operating state of the meal support robot is detected from the images supplied by the imaging means and the robot is controlled from that information, accurate control is possible with a simple configuration. The Cartesian robot structure makes trajectory calculation simple, and because its motion is simpler than that of an articulated robot, the imaging means has fewer blind spots. Since the system derives much of its information from the imaging means, a Cartesian structure that minimizes camera blind spots is advantageous. Furthermore, the eye-interface function detects the user's eye state (blinks, gaze direction, and so on) from the camera images, so no special sensor needs to be attached to the user. Finally, using ultrasonic motors as the robot's actuators reduces vibration and noise; the motors are small, light, and have excellent EMC characteristics, improving safety for the user.
Embodiments of the present invention are described with reference to the drawings. FIG. 1 is a schematic view of the meal support system and FIG. 2 is a top view. The system comprises a meal support robot 2 that provides meals to a user 1; imaging means 3 (a CCD camera) that monitors the state of the user 1, the state of the meal support robot 2, and the state of the user's eyes; display means 4 that presents the user 1 with information including an operation menu and a cursor; and control means (not shown) that controls the meal support robot 2 based on information from the imaging means 3 and displays information on the display means 4. The meal support robot 2 comprises a food dish 5 on which food is placed, food providing means 6 (a spoon) that delivers food to the user 1, food pushing means 7 that pushes food from the food dish 5 onto the food providing means 6, and food dish moving means 8 that moves the food dish in a direction orthogonal to the pushing direction of the food pushing means 7. The food providing means 6, food pushing means 7, and food dish moving means 8 are driven by ultrasonic motors, and each is controlled in a Cartesian coordinate scheme.
FIG. 3 is a detailed view of the meal support robot 2. The food dish 5 has a plurality of compartments, each holding a different food; the walls between the compartments run parallel to the pushing direction of the food pushing means 7, and the side of each compartment facing the food providing means 6 is open. The food providing means 6 and the food pushing means 7 move back and forth along the direction toward the user, while the food dish 5 is moved left and right by the food dish moving means 8, orthogonal to the pusher's direction of travel. Because every motion lies along one of these orthogonal axes, Cartesian-coordinate control keeps the mechanism simple and reduces the blind spots of the imaging means 3; the simple structure also means fewer failures and easier maintenance. These motions are driven by ultrasonic motors 9. Compared with a conventional DC motor, an ultrasonic motor is friction-driven and therefore delivers high torque at low speed, runs silently, holds its position firmly when stopped, is small and light, and has excellent EMC characteristics. Exploiting these properties, using ultrasonic motors as the robot's actuators yields actuators that are low-vibration, quiet, EMC-clean, and safe around people. Any control method may be used for the ultrasonic motors 9, but intelligent variable IMC-PID control is preferred (for "intelligent variable IMC-PID control", see Japanese Patent Application No. 2005-312373 and its related applications).
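As background on the tuning method, the sketch below applies the standard IMC-PID rules to an assumed first-order-plus-dead-time motor model; the model parameters and the filter constant λ are hypothetical, and the adaptive ("intelligent variable") part of the controller in Japanese Patent Application No. 2005-312373 is not reproduced here.

```python
def imc_pid_gains(K: float, tau: float, theta: float, lam: float):
    """Standard IMC tuning for G(s) = K * exp(-theta*s) / (tau*s + 1);
    lam is the IMC filter time constant. Returns (Kp, Ti, Td) for an
    ideal-form PID controller u = Kp*(e + (1/Ti)*integral(e) + Td*de/dt)."""
    Kp = (tau + theta / 2) / (K * (lam + theta / 2))
    Ti = tau + theta / 2
    Td = (tau * theta) / (2 * tau + theta)
    return Kp, Ti, Td

# Hypothetical ultrasonic-motor model: static gain, time constant, dead time (s).
Kp, Ti, Td = imc_pid_gains(K=1.2, tau=0.05, theta=0.01, lam=0.02)
print(f"Kp = {Kp:.2f}, Ti = {Ti:.3f} s, Td = {Td:.4f} s")
```

A smaller λ gives a faster but more aggressive closed loop; an adaptive variant would adjust λ or the model parameters online as operating conditions change.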
FIGS. 4 and 5 show examples of the operation menu presented by the display means 4: FIG. 4 is a screen for commanding the meal support robot 2, and FIG. 5 is a screen for selecting food on the food dish 5. The screens can be switched as needed. A cursor 10 is shown on the screen and is controlled through the user's eye state (blinks, gaze direction, and so on) obtained by the imaging means 3 (the eye-interface function). Various algorithms can be used to detect the eye state (for example, Patent Document 3); a genetic algorithm is preferred (Japanese Patent Application No. 2006-274040). The user moves the cursor 10 with eye position and movement and "clicks" by blinking or staring. When a menu item is selected, the meal support robot 2 performs the corresponding action: with a menu such as FIG. 4 the robot performs the commanded motion, and with a menu such as FIG. 5 it serves the selected food. Instead of the cursor, a menu item can also be selected by gazing at it for a fixed time.
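To illustrate how a genetic algorithm can serve for eyeball-position detection (the actual algorithm of Japanese Patent Application No. 2006-274040 is not reproduced here), the toy sketch below evolves circle parameters (x, y, r) toward the dark pupil region of a grayscale image, using selection and Gaussian mutation only; crossover is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(img: np.ndarray, x: float, y: float, r: float) -> float:
    """Darker mean intensity inside the candidate circle -> higher fitness."""
    h, w = img.shape
    yy, xx = np.ogrid[:h, :w]
    mask = (xx - x) ** 2 + (yy - y) ** 2 <= r ** 2
    return 255.0 - img[mask].mean() if mask.any() else 0.0

def ga_find_pupil(img: np.ndarray, pop: int = 40, gens: int = 30) -> np.ndarray:
    h, w = img.shape
    # Each individual is a candidate pupil circle: (x, y, radius).
    P = np.column_stack([rng.uniform(0, w, pop),
                         rng.uniform(0, h, pop),
                         rng.uniform(3, 20, pop)])
    for _ in range(gens):
        scores = np.array([fitness(img, *ind) for ind in P])
        elite = P[np.argsort(scores)[-pop // 2:]]                 # selection
        kids = elite[rng.integers(0, len(elite), pop - len(elite))]
        kids = kids + rng.normal(0.0, 2.0, kids.shape)            # mutation
        P = np.vstack([elite, kids])
    scores = np.array([fitness(img, *ind) for ind in P])
    return P[scores.argmax()]                                     # best (x, y, r)

# Synthetic test: bright background with a dark "pupil" at (60, 40), radius 8.
img = np.full((120, 160), 200.0)
yy, xx = np.ogrid[:120, :160]
img[(xx - 60) ** 2 + (yy - 40) ** 2 <= 8 ** 2] = 30.0
print(ga_find_pupil(img))  # expected to converge near x=60, y=40, r=8
```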
FIG. 6 illustrates intelligent visual control. Controlling a robot generally requires position information for the arm and other parts, normally obtained from various sensors. In this system the robot must be controlled while the user's state is monitored, so the positions of the user's mouth, the food providing means 6, the food pushing means 7, and so on are all detected from the images of the imaging means 3. Detecting each position with the imaging means 3 simplifies the construction of the meal support robot 2, and because the user 1 is monitored during control, safety improves; control also remains accurate even when the user 1 moves. Image-based position detection normally demands heavy computation, but since the meal support robot 2 uses Cartesian-coordinate control, its motion is comparatively simple and the position calculations stay light. Blind spots in the camera image would make position detection difficult, but the Cartesian structure produces far fewer blind spots than an articulated arm.
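A minimal sketch of the image-based position detection, assuming OpenCV and colour markers on the tracked parts (the patent does not specify a detection method, so marker tracking stands in for it): because each mechanism moves along a single known axis, the pixel offset between two detected points maps directly to a one-axis correction.

```python
import cv2
import numpy as np

def marker_centroid(frame_bgr, lower_hsv, upper_hsv):
    """Return the (x, y) pixel centroid of a colour marker, or None if absent."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv,
                       np.array(lower_hsv, dtype=np.uint8),
                       np.array(upper_hsv, dtype=np.uint8))
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]

MM_PER_PIXEL = 0.8  # hypothetical camera calibration constant

def spoon_axis_correction(spoon_px, mouth_px) -> float:
    """Pixel error along the spoon's travel axis -> millimetre correction."""
    return (mouth_px[0] - spoon_px[0]) * MM_PER_PIXEL

# Synthetic frame: a green 10x10 marker standing in for the spoon tip.
frame = np.zeros((120, 160, 3), np.uint8)
frame[50:60, 30:40] = (0, 255, 0)
print(marker_centroid(frame, (40, 80, 80), (80, 255, 255)))  # ~ (34.5, 54.5)
```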
An example of the system's operation follows. Food is placed in each compartment of the food dish 5 in advance. The display means 4 shows the operation menu and the cursor 10, which is moved and clicked according to the user's eye state as captured by the imaging means 3; alternatively, a menu item can be selected by gazing at it for a fixed time. Watching the screen of the display means 4, the user 1 selects an action of the meal support robot 2 or a food on the food dish 5. When an action or food is selected, the robot performs the required motion, monitored by the imaging means 3 and governed by the intelligent visual control described above. For example, when a food is selected from the menu, the food dish 5 shifts sideways until that food is aligned with the food pushing means 7; the pusher advances toward the user 1, loading the food onto the food providing means 6; and the food providing means 6 then moves toward the user 1, who can take the food in the mouth.
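In code, the serving cycle just described is a short, fixed sequence of axis commands once a compartment has been chosen; in this sketch the compartment offsets are hypothetical and `move_axis` is a stub standing in for the motor commands and their camera-verified completion.

```python
import time

# Hypothetical lateral offsets (mm) aligning each compartment with the pusher.
COMPARTMENT_OFFSET_MM = {"rice": -40.0, "soup": 0.0, "vegetables": 40.0}

def serve(food: str, move_axis, mouth_distance_mm: float = 250.0) -> None:
    """One serving cycle: align the chosen compartment with the pusher,
    push the food onto the spoon, then carry the spoon toward the mouth.
    move_axis(name, target_mm) is assumed to command one axis and block
    until the position, verified from the camera image, has settled."""
    move_axis("dish", COMPARTMENT_OFFSET_MM[food])
    move_axis("pusher", 60.0)        # push food onto the spoon
    move_axis("pusher", 0.0)         # retract the pusher
    move_axis("spoon", mouth_distance_mm)
    time.sleep(5.0)                  # give the user time to take the food
    move_axis("spoon", 0.0)          # return for the next selection

# Example with a stub actuator:
serve("rice", lambda name, mm: print(f"{name} -> {mm:.0f} mm"))
```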
The embodiment above uses the eye-interface function for menu selection, but a voice-interface function can be used instead. FIG. 7 illustrates the voice interface. Voice input means 11 (a microphone) is attached to the user 1 so that menu items can be selected by voice: the control means recognizes the speech captured by the voice input means 11 and selects the menu action based on the recognized speech.
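Reduced to essentials, the voice interface maps a recognizer's text output onto menu actions. The sketch below uses simple keyword matching; the command vocabulary is invented for illustration, and the patent does not name a particular recognition engine.

```python
# Hypothetical mapping from spoken keywords to menu actions.
COMMANDS = {
    "rice": ("serve", "rice"),
    "soup": ("serve", "soup"),
    "stop": ("halt", None),
}

def dispatch(recognized_text: str):
    """Map recognized speech to a robot action; ignore unknown utterances."""
    text = recognized_text.lower()
    for keyword, action in COMMANDS.items():
        if keyword in text:
            return action
    return None

print(dispatch("Soup, please"))  # -> ('serve', 'soup')
```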
While one embodiment of the present invention has been described above, the invention is not limited to it; needless to say, various modifications are possible within the scope of the technical ideas set forth in the claims.
1 user (person with an upper-limb disability)
2 meal support robot
3 imaging means (CCD camera)
4 display means
5 food dish
6 food providing means (spoon)
7 food pushing means
8 food dish moving means
9 ultrasonic motor
10 cursor
11 voice input means (microphone)
Claims (8)
A meal support system comprising:
a meal support robot that provides meals to a user;
one or more imaging means that monitor the state of the user and the state of the meal support robot;
display means that presents the user with information including an operation menu and a cursor; and
control means that controls the meal support robot based on information from the imaging means and displays information on the display means,
wherein the meal support robot comprises a food dish on which food is placed, food providing means that delivers food to the user, food pushing means that pushes food from the food dish onto the food providing means, and food dish moving means that moves the food dish in a direction orthogonal to the pushing direction of the food pushing means, and
wherein the food providing means, the food pushing means, and the food dish moving means of the meal support robot are driven by actuators and each is controlled in a Cartesian coordinate scheme.
The meal support system according to any one of claims 1 to 5, further comprising voice input means for inputting the user's voice, wherein the control means has a voice-interface function operable by the voice input through the voice input means.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006312523A (granted as JP5023328B2) | 2006-11-20 | 2006-11-20 | Meal support system |
Publications (2)
Publication Number | Publication Date |
---|---|
JP2008125696A (en) | 2008-06-05 |
JP5023328B2 (en) | 2012-09-12 |
Family
ID=39552136
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2006312523A (JP5023328B2, Active) | Meal support system | 2006-11-20 | 2006-11-20 |
Country Status (1)
Country | Link |
---|---|
JP (1) | JP5023328B2 (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1033599A (en) * | 1996-07-18 | 1998-02-10 | Katsuhiro Nakamura | Automatic food lifting dish |
JP2000051292A (en) * | 1998-08-14 | 2000-02-22 | Hiroo Yoshiike | Eating assisting tool for person handicapped in function of single hand |
JP2000116159A (en) * | 1998-09-30 | 2000-04-21 | Kyocera Corp | Control method for ultrasonic motor |
JP2000348173A (en) * | 1999-06-04 | 2000-12-15 | Matsushita Electric Ind Co Ltd | Lip extraction method |
JP2002158982A (en) * | 2000-11-20 | 2002-05-31 | Canon Inc | Image processing method, processor and computer readable medium |
JP2004180817A (en) * | 2002-12-02 | 2004-07-02 | National Institute Of Advanced Industrial & Technology | Work supporting manipulator system using biological signal |
JP3673834B2 (en) * | 2003-08-18 | 2005-07-20 | 国立大学法人山口大学 | Gaze input communication method using eye movement |
JP2006000427A (en) * | 2004-06-18 | 2006-01-05 | Secom Co Ltd | Meal support apparatus |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101237245B1 (en) | 2010-04-07 | 2013-03-08 | 대한민국 | Eating Assistant Robot |
KR101380598B1 (en) * | 2012-04-25 | 2014-04-10 | 대한민국(국립재활원장) | Meal assistance robot using a conveyor |
KR101380596B1 (en) | 2012-04-25 | 2014-04-11 | 대한민국(국립재활원장) | Meal assistance robot using a camera |
CN103099513A (en) * | 2013-03-01 | 2013-05-15 | 上海海事大学 | Auxiliary device for self-help diet |
JP5628388B1 (en) * | 2013-08-12 | 2014-11-19 | 昌毅 明石 | Feeding aids |
JP2015036013A (en) * | 2013-08-12 | 2015-02-23 | 昌毅 明石 | Ingestion assisting apparatus |
CN103948285A (en) * | 2014-04-11 | 2014-07-30 | 陕西科技大学 | Double-wheeled upright meal delivery robot capable of storing maps |
CN103955214A (en) * | 2014-04-11 | 2014-07-30 | 陕西科技大学 | Double-wheel upright-walking intelligent meal delivery robot |
KR101758660B1 (en) | 2016-02-05 | 2017-07-17 | 경북대학교 산학협력단 | Meal assistance robot apparatus and meal assistance system using the same |
KR20220028211A (en) * | 2020-08-28 | 2022-03-08 | 한국생산기술연구원 | Method and Apparatus for Meal Assistance Using Interactive Interface |
KR102417457B1 (en) | 2020-08-28 | 2022-07-06 | 한국생산기술연구원 | Method and Apparatus for Meal Assistance Using Interactive Interface |
WO2022055443A1 (en) * | 2020-09-10 | 2022-03-17 | Yildiz Teknik Universitesi | Method for robotic feeder platform for disabled individuals |
KR102357507B1 (en) * | 2020-09-23 | 2022-02-07 | 한국생산기술연구원 | Meal assistance robot based on deep learning and meal assistance method using the same |
CN112757302A (en) * | 2021-01-06 | 2021-05-07 | 北京航空航天大学 | Control method of portable dining-assistant robot |
JP7526345B1 (en) | 2023-12-25 | 2024-07-31 | 株式会社イトーキ | Nursing care facilities |
Also Published As
Publication number | Publication date |
---|---|
JP5023328B2 (en) | 2012-09-12 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
2009-06-08 | A621 | Written request for application examination | JAPANESE INTERMEDIATE CODE: A621 |
2011-08-11 | A977 | Report on retrieval | JAPANESE INTERMEDIATE CODE: A971007 |
2011-08-23 | A131 | Notification of reasons for refusal | JAPANESE INTERMEDIATE CODE: A131 |
2011-09-30 | RD02 | Notification of acceptance of power of attorney | JAPANESE INTERMEDIATE CODE: A7422 |
2011-10-14 | A521 | Written amendment | JAPANESE INTERMEDIATE CODE: A523 |
 | TRDD | Decision of grant or rejection written | |
2012-05-22 | A01 | Written decision to grant a patent or to grant a registration (utility model) | JAPANESE INTERMEDIATE CODE: A01 |
 | R150 | Certificate of patent (=grant) or registration of utility model | JAPANESE INTERMEDIATE CODE: R150 |