TW201006635A - In situ robot which can be controlled remotely - Google Patents

In situ robot which can be controlled remotely

Info

Publication number
TW201006635A
TW201006635A
Authority
TW
Taiwan
Prior art keywords
robot
site
module
control device
human
Prior art date
Application number
TW97130067A
Other languages
Chinese (zh)
Inventor
Ye-Liang Xu
Chang-Hui Wu
zong-cheng Cai
Bo-Er Xu
Original Assignee
Univ Yuan Ze
Priority date
Filing date
Publication date
Application filed by Univ Yuan Ze filed Critical Univ Yuan Ze
Priority to TW97130067A priority Critical patent/TW201006635A/en
Publication of TW201006635A publication Critical patent/TW201006635A/en

Links

Landscapes

  • Manipulator (AREA)

Abstract

This invention relates to an in situ (presence) robot which can be controlled remotely, comprising a body, a moving device, a video camera, an anthropomorphic module that simulates human emotional expression, an environment sensing module, a physiological information module, and a core control device. A remote operator can operate a man-machine interface to transmit control commands and sound via the Internet to the local in situ robot, thereby controlling its movement, expressing the operator's emotions, and relaying the remote sound. Besides being remotely controlled, the core control device of the in situ robot gives the robot autonomous capability through the use of the environment detection module. Meanwhile, the in situ robot can wirelessly transmit the local user's image, sound, physiological information, and environment information to the remote man-machine interface, allowing in situ interaction and communication between the remote operator and the local user and providing distant healthcare functions.

Description

201006635 IX. Description of the Invention:

[Technical Field of the Invention]

The present invention relates to a robot, and more particularly to a presence robot that can be controlled from a remote site.

[Prior Art]

Neighboring Japan has become a super-aged society: on average, roughly every three young people support one elderly person, so geriatric medicine and nursing care have gradually become a heavy burden on society. Facing a shortage of caregivers, Japan turned to robots long ago. The robots Japan has developed for the senior market help the elderly walk, rehabilitate, and strengthen their muscles and bones; assist bedridden seniors with bathing and toileting; carry or feed seniors with limited mobility in place of care staff; and even entertain, as robotic pets, piano players, flute players, dance partners, and so on.

To give the elderly a sense of companionship, Republic of China Patent Publication No.
519826, "Personalized Smart Camera System," has been proposed; it mainly uses an auto-tracking PTZ camera together with controllable facial features (for example, movable eyebrows, eyelids, and a mouth) to carry out lively remote video interaction.

The above technique, however, lacks the mobility and body language peculiar to humans, so its anthropomorphic effect still needs strengthening. Moreover, when mobility is increased, autonomous reactions must be taken into account so that overall function is not lost through factors such as the external environment or improper operation; otherwise a robot meant to provide care and companionship could itself fail through an accident and lose its caregiving role.

[Summary of the Invention]

The main object of the present invention is to provide a presence robot controllable from a remote site, which combines virtual reality, human-machine interface, communication, and robot-motion technologies to achieve remote presence and to carry out remote care, nursing, and emotional interaction in a humanized way.

Based on this object, the presence robot of the present invention comprises a body, a moving device, a camera, an anthropomorphic module that simulates human emotional expression, an environment sensing module, a physiological information module, and a core control device. A remote operator can use a human-machine interface to send control commands and sound by wireless transmission (for example, over the Internet) to the near-end presence robot, thereby controlling the robot's movement, expressing the operator's emotions, and relaying the remote sound. Besides being remotely controlled, the core control device of the presence robot can cooperate with the environment detection module so that the presence robot has autonomous behavior.
At the same time, the presence robot can transmit the near-end user's image, sound, physiological information, and environmental information by wireless transmission (for example, over the Internet) to the remote human-machine interface, so that the remote operator and the near-end user can interact and communicate with a sense of presence; and because physiological information is obtained, the remote operator can provide long-distance care to the near-end user.

The advantages and spirit of the present invention can be further understood from the following detailed description and the accompanying drawings.

[Embodiments]

Please refer to Fig. 1, which is a schematic diagram of an embodiment of the remotely controllable presence robot of the present invention. As shown in Fig. 1, the presence robot mainly comprises a body 10, a moving device 11 disposed at the bottom of the body 10, a camera 12, an anthropomorphic module 14 that simulates human emotional expression, an environment sensing module 20, and a core control device 22. A sound pickup and playback device 18 can be integrated with the camera 12, so that besides capturing and transmitting images of its surroundings, the camera 12 can also relay far-end and near-end sound through the device 18.

Briefly, in this embodiment of the invention the remote operator 32 issues commands through a human-machine interface 30 provided by a server 26, and the control commands and sound are transmitted via the server 26 by wireless transmission, for example over the Internet, to the near-end presence robot, thereby controlling the robot's movement, expressing the operator's emotions, and carrying sound between the far end and the near end.
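The command path just described (human-machine interface → server → Internet → core control device) can be sketched as follows. The specification defines no wire protocol, so the JSON message layout and the action names used here are purely illustrative assumptions:

```python
import json

# Hypothetical command vocabulary: the description names the behaviors
# (movement, emotion expression, sound) but defines no message format.
VALID_ACTIONS = {"move", "emote", "speak"}

def encode_command(action, params):
    """Serialize an operator command for transmission over the network."""
    if action not in VALID_ACTIONS:
        raise ValueError(f"unknown action: {action}")
    return json.dumps({"action": action, "params": params}).encode("utf-8")

def decode_command(payload):
    """Parse a received command on the robot's core control device side."""
    msg = json.loads(payload.decode("utf-8"))
    if msg.get("action") not in VALID_ACTIONS:
        raise ValueError("malformed command")
    return msg["action"], msg["params"]

wire = encode_command("emote", {"gesture": "wave", "phrase": "How are you?"})
action, params = decode_command(wire)
```

Any equivalent serialization would serve; the point is only that commands and their parameters survive the round trip from interface to robot intact.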
That is, the operator 32 transmits commands through the human-machine interface 30, by wireless transmission (the Internet), to the core control device 22 of the presence robot. The core control device 22 provides functions such as transmitting and receiving command signals, data storage, and data computation; with it, the robot can drive the moving device 11, process the images captured by the camera 12 and the sound of the pickup and playback device 18, and control the anthropomorphic module 14. Alternatively, the operator 32 can transmit the operator's voice and ambient sound through the human-machine interface 30 to the pickup and playback device 18 on the presence robot. The remote operator 32 can thus move the presence robot by controlling its moving device 11, control the anthropomorphic module 14 to simulate human emotions, expressions, and actions, and converse and interact with the near-end user of the presence robot through the sound carried by the pickup and playback device 18.

Likewise, the image, sound, and environmental information of the presence robot's near-end user can be transmitted by wireless transmission (the Internet) to the remote human-machine interface 30, giving the remote operator 32 a face-to-face mode of communication and interaction with the near-end user, a remote mode of interaction that approximates a sense of presence, for humanized remote interpersonal contact.

To create a concrete feeling of companionship, the presence robot has an anthropomorphic form (for example, with a face and hands), and simulated human-emotion elements 16a, 16b are disposed on the body 10; these elements 16a, 16b can be various components that simulate human emotional expression.
By means of the simulated human-emotion elements 16a, 16b in cooperation with the aforementioned pickup and playback device 18, human emotional expressions, actions, and sounds are simulated. In this embodiment, the simulated human-emotion element 16a is a human-like eyebrow or an LED array, and the simulated human-emotion element 16b is a human-like limb or another appearance element. When the core control device 22 drives the simulated human-emotion elements 16a, 16b through the anthropomorphic module 14, controlling the elements 16a, 16b, that is, the eyebrows (or LED array) and the limbs (or appearance elements), simulates the human form and genuine emotional expression, so that emotions can be richly conveyed. For example, when the remote operator 32 wishes to greet the near-end user with a friendly voice and expression, the operator 32 can issue a "dance with joy" control command through the human-machine interface 30; the core control device 22 then drives the human-emotion element 16b (limb) to wave through the anthropomorphic module 14 and drives the moving device 11 to move. Sound at both the user's end and the remote operator's end can also be picked up and played through the device 18: for example, when the operator 32 speaks, the core control device 22 drives the pickup and playback device 18 to play the remote operator's voice, for example "How are you?", to the near-end user, and the device 18 can relay the near-end user's reply, for example "I am fine!", back to the remote operator 32.

To give the remote presence robot more agile mobility, the moving device 11 in this embodiment of the invention may be a vehicle with three-wheel differential control.
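The drive geometry is not detailed in the specification. As an illustrative sketch of differential control, assuming two independently driven wheels plus a passive third wheel (a common reading of "three-wheel differential control"), a body velocity command maps to wheel speeds as:

```python
def wheel_speeds(v, omega, track_width):
    """Convert a body velocity command (v: forward speed in m/s,
    omega: yaw rate in rad/s) into left/right driven-wheel speeds
    for a differential-drive base; the third wheel is assumed here
    to be a passive caster."""
    v_left = v - omega * track_width / 2.0
    v_right = v + omega * track_width / 2.0
    return v_left, v_right

# Driving straight gives equal wheel speeds; turning left makes the
# right wheel faster than the left.
straight = wheel_speeds(0.5, 0.0, 0.3)
turn = wheel_speeds(0.0, 1.0, 0.3)
```

Equal and opposite wheel speeds rotate the base in place, which is what lets such a vehicle maneuver agilely in domestic spaces.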
In addition, besides being remotely controlled, the core control device 22 of the presence robot can cooperate with an environment detection module 20 so that the presence robot has autonomous behavior; this capability is provided to prevent loss of overall function through factors such as the external environment or improper operation. To this end, the environment sensing module 20 can sense the state of the surrounding environment and react autonomously according to the detected state, so that external factors do not cause functional failure. For example, the environment sensing module 20 can be a temperature or distance sensor; according to the sensed environmental state, it drives the controlled moving device 11 to move away from hazards or obstacles, for example a fire source or a wall, as the corresponding autonomous reaction.

Referring to Fig. 2, in order to monitor the physiological state of the near-end user or subject, the present invention further provides a physiological information module 24 on the body 10; the physiological information module 24 detects the physiological information of the near-end user (the subject).

提供給在遠端的操作者32或監控人員。該生理資訊模級 24具生理訊號異常警示功能,可發送警示訊息予遠端的操 作者32或指以員,可確實有效地進行遠距照護的功能。、 藉由以上較佳具體實施例之詳述,係希望 二 楚描述本發明之特徵_神。但並非以上逃簡露的較: 具體實施例來對本發明之範疇加以限制, ❹發明申請範圍所作之均等變化與修飾等, 之專利涵蓋範圍内。 相反地’凡依本 音應仍屬本發明 201006635 【圖式簡單說明】 第1圖係為本發明可由遠端控制的臨場機器人之實施示 意圖。 第2圖為本發明可由遠端控制的臨場機器人之另一實施 示意圖。 【主要元件符號說明】 ^ 10身體 11移動裝置 12攝影機 14擬人模組 16a模擬人體情感元件(眉毛或LED陣列) 16b模擬人體情感元件(肢體或外觀元件) 18收放音裝置 φ 20環境感測模組 22核心控制裝置 24生理資訊模組 26伺服器 30人機介面 32操作者 11Provided to the operator 32 or the monitoring personnel at the remote end. The physiological information module has a physiological signal abnormality warning function, and can send a warning message to the remote operator 32 or the finger, and can effectively perform the function of remote care. With the above detailed description of the preferred embodiments, it is desirable to describe the features of the present invention. However, it is not intended to limit the scope of the invention, and the scope of the invention is limited to the scope of the invention. On the contrary, the local sound should still belong to the present invention. 201006635 [Simplified description of the drawings] Fig. 1 is a schematic illustration of the implementation of a remotely controlled field robot according to the present invention. Figure 2 is a schematic illustration of another embodiment of a field-controlled robot that can be remotely controlled in accordance with the present invention. [Main component symbol description] ^ 10 body 11 mobile device 12 camera 14 anthropomorphic module 16a simulates human emotion components (eyebrows or LED array) 16b simulates human emotion components (limb or appearance components) 18 receiving and playback device φ 20 environment sensing Module 22 core control device 24 physiological information module 26 server 30 human interface 32 operator 11

Claims (1)

201006635 X. Scope of the Patent Application:
1. A presence robot controllable from a remote site, comprising:
a body, on which the related modules and components are mounted, the body being connected to a moving device;
a camera, disposed on the body, for capturing images of the surroundings;
an anthropomorphic module, for simulating human emotions;
an environment sensing module, disposed on the body, for sensing the state of the surrounding environment and reacting autonomously according to the detected state; and
a core control device, which transmits information and receives control commands through wireless transmission, for controlling the presence robot.
2.
The presence robot of claim 1, wherein the wireless transmission of the core control device is via the Internet.
3. The presence robot of claim 1, wherein the core control device has functions of transmitting and receiving command signals, data storage, and data computation; the core control device drives the moving device, processes the images captured by the camera and the sound of the pickup and playback device, and controls the anthropomorphic module.
4. The presence robot of claim 1, wherein the moving device is a vehicle with three-wheel differential control.
5. The presence robot of claim 1, further comprising simulated human-emotion elements disposed on the body, which can be various components that simulate human emotional expression.
6. The presence robot of claim 4, wherein the simulated human-emotion element is a human-like eyebrow, an LED array, or a human-like limb.
7. The presence robot of claims 4 and 5, wherein the simulated human-emotion element is controlled by the anthropomorphic module.
8. The presence robot of claim 1, wherein the camera is integrated with a sound pickup and playback device, the pickup and playback device relaying far-end and near-end sound.
9. The presence robot of claim 1, wherein, when the core control device processes the images captured by the camera, the images are transmitted to the remote site for on-site monitoring by the remote operator.
10. The presence robot of claim 1, wherein the environment sensing module is a temperature sensor.
11. The presence robot of claim 1, wherein the environment sensing module is a distance sensor.
12.
The presence robot of claim 1, further comprising:
a physiological information module, for detecting the physiological information of the subject, the module having an abnormal-physiological-signal alert function capable of sending a warning message to designated personnel.
13. The presence robot of claim 12, wherein the physiological information includes heartbeat, body temperature, blood sugar, and blood pressure.
TW97130067A 2008-08-07 2008-08-07 In situ robot which can be controlled remotely TW201006635A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW97130067A TW201006635A (en) 2008-08-07 2008-08-07 In situ robot which can be controlled remotely

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW97130067A TW201006635A (en) 2008-08-07 2008-08-07 In situ robot which can be controlled remotely

Publications (1)

Publication Number Publication Date
TW201006635A true TW201006635A (en) 2010-02-16

Family

ID=44826755

Family Applications (1)

Application Number Title Priority Date Filing Date
TW97130067A TW201006635A (en) 2008-08-07 2008-08-07 In situ robot which can be controlled remotely

Country Status (1)

Country Link
TW (1) TW201006635A (en)


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI453555B (en) * 2012-07-11 2014-09-21 Univ Nan Kai Technology Auto service robot
TWI615773B (en) * 2012-11-30 2018-02-21 元智大學 Multimodal interpersonal communication robot system in combination with mobile device
TWI663495B (en) * 2017-01-18 2019-06-21 南韓商Lg電子股份有限公司 Mobile robot system and control method thereof
US10946520B2 (en) 2017-01-18 2021-03-16 Lg Electronics Inc. Mobile robot system and control method thereof
TWI691913B (en) * 2017-11-17 2020-04-21 日商三菱電機股份有限公司 3-dimensional space monitoring device, 3-dimensional space monitoring method, and 3-dimensional space monitoring program
TWI658377B (en) * 2018-02-08 2019-05-01 佳綸生技股份有限公司 Robot assisted interaction system and method thereof

Similar Documents

Publication Publication Date Title
US20190108770A1 (en) System and method of pervasive developmental disorder interventions
US8909370B2 (en) Interactive systems employing robotic companions
TWI658377B (en) Robot assisted interaction system and method thereof
Dickstein-Fischer et al. An affordable compact humanoid robot for autism spectrum disorder interventions in children
JP5040865B2 (en) Robot control system, remote management device, remote management method and program
US20230305530A1 (en) Information processing apparatus, information processing method and program
US11439346B2 (en) Robotic device for assisting individuals with a mental illness
WO2002076687A1 (en) Robot device and control method therefor, and storage medium
TW201006635A (en) In situ robot which can be controlled remotely
US20230266767A1 (en) Information processing apparatus, information processing method, and program
JP2024023193A (en) Information processing device, and information processing method
JP7375770B2 (en) Information processing device, information processing method, and program
JP2020089947A (en) Information processing device, information processing method, and program
CN116572260A (en) Emotion communication accompanying and nursing robot system based on artificial intelligence generated content
JP2004034273A (en) Robot and system for generating action program during utterance of robot
US20210197393A1 (en) Information processing device, information processing method, and program
US11938625B2 (en) Information processing apparatus, information processing method, and program
Shin A study on the tele-medicine robot system with face to face interaction
JP2022074094A (en) System and method for continuously sharing action states of creature
Saint-Aimé et al. Evaluation of Emi interaction with non-disabled children in nursery school using wizard of Oz technique
TWI503099B (en) Multimodal interpersonal communication system for home telehealth with telepresence robot
JP2004227276A (en) Human communication behavior recording system and method
Khan et al. Tongue-supported human-computer interaction systems: a review
WO2018033839A1 (en) Interactive modular robot
BR102019009150A2 (en) interactive robot device with remote controls and audio and video communication