WO2000043168A1 - Robot - Google Patents
- Publication number
- WO2000043168A1 (PCT/JP2000/000342)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- data
- robot device
- control
- storage means
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/04—Programme control other than numerical control, i.e. in sequence controllers or logic controllers
- G05B19/042—Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
- G05B19/0423—Input/output
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/23—Pc programming
- G05B2219/23417—Read program from pluggable memory card
Definitions
- the present invention relates to a robot device, and is suitably applied to, for example, an entertainment robot used in a home.
- This type of entertainment robot has a shape closely resembling a quadruped animal such as a cat kept at home, incorporates a visual sensor, a tactile sensor, and the like in addition to a microprocessor and signal processing circuits, and is designed to act autonomously based on a predetermined control program.
- the present invention has been made in view of the above points, and an object of the present invention is to propose a robot device that can further improve usability.
- a fixed storage means fixed at a predetermined position of the robot device and a detachable storage means provided detachably are provided, and a control means stores control data used by a control program in the fixed storage means or the detachable storage means according to the type of the data, or reads the control data from the fixed storage means or the detachable storage means according to the type.
- the fixed storage means or the removable storage means can be selectively used according to the type of the control data, and the removable storage means can be removed to easily add, delete, and rewrite data contents. As a result, the usability can be further improved.
- alternatively, a storage means provided at a predetermined position of the robot device and a control means for operating the control program are provided; the control means stores control data in the storage means according to its type, or reads it out from the storage means according to its type, and stores in the storage means a change in its internal state that occurs when a movable part is operated.
- FIG. 1 is a schematic perspective view showing the appearance of an entertainment robot according to the present invention.
- FIG. 2 is a block diagram showing a circuit configuration of the entertainment robot.
- FIG. 3 is a schematic diagram illustrating a configuration of a control program.
- FIG. 4 is a schematic diagram used to explain the relationship between the emotion module and other objects.
- FIG. 5 is a schematic diagram for explaining data (1) to be written to the external storage memory.
- FIG. 6 is a schematic diagram for explaining data (2) to be written to the external storage memory.
- FIG. 7 is a schematic diagram for explaining data (3) to be written to the external storage memory.
- FIG. 8 is a flowchart showing a data write processing procedure to the internal storage memory or the external storage memory.
- reference numeral 1 denotes an entertainment robot (hereinafter, simply referred to as a robot) as a robot device according to the present invention, which includes a head 2, a body 3, a tail 4, a main body frame 5, and a right front leg 6, a left front leg 7, a right rear leg 8, and a left rear leg 9 attached to the main body frame 5.
- the moving unit 10 supports the head 2 and the body 3 on the body frame 5, with the tail 4 attached so as to protrude from a part of the body 3.
- the head 2 has a touch sensor 11, an image display section 12 composed of a liquid crystal display, a camera section 13 composed of a CCD (Charge Coupled Device), a speaker 14 as audio output means, an infrared remote controller 15, and a microphone 16. The body 3 contains a controller 20 as control means composed of a CPU (Central Processing Unit), a main storage unit 22 composed of a RAM (Random Access Memory), and an internal storage memory 23 as fixed storage means composed of a non-volatile flash memory fixed in place; it also has a key input section 25 having a plurality of input keys on the back, and an external storage memory 24 as detachable storage means comprising a memory card detachably attached to the buttocks.
- the moving unit 10 has a battery 26 mounted on the abdomen of the main body frame 5, and the battery 26 is charged by a dedicated charging device (not shown).
- the overall operation of the robot 1 is controlled by a controller 20.
- the robot 1 performs predetermined image processing with the image processing unit 31 on an image signal captured by the camera unit 13, and sends the result to the controller 20 as image data.
- the image processing unit 31 also performs predetermined image processing on reproduction data that the controller 20 reads from the internal storage memory 23 or the external storage memory 24 via the main storage unit 22, and displays the result on the image display unit 12 as a reproduced image.
- the robot 1 performs predetermined audio processing on the audio signal collected by the microphone 16 by the audio processing unit 32, and sends this to the controller 20 as audio data.
- the audio processing unit 32 also performs predetermined audio processing on reproduction data that the controller 20 reads from the internal storage memory 23 or the external storage memory 24 via the main storage unit 22, and outputs the result, for example a "cry", from the speaker 14 as reproduced sound.
- the transmitting / receiving unit 33 of the robot 1 transmits a control signal to an external device (not shown) wirelessly by infrared light based on a command from the controller 20.
- the motion control section 34 incorporates a motor, a driver, and a position sensor for controlling the movement of each joint. The controller 20 sends commands to the motion control sections 34 built into the head 2, the right front leg 6, the left front leg 7, the right rear leg 8, and the left rear leg 9, and detects their positions when they move.
- the robot 1 detects, via the touch sensor 11, contact information such as whether a predetermined operation button has been pressed and whether it has been stroked or hit by the user, together with the contact time and the impact at the time of contact.
- the robot 1 also recognizes its posture during operation and its current position based on the acceleration and angular velocity detected by the acceleration sensor and angular velocity sensor 35.
- the robot 1 normally operates autonomously, but the user can send a predetermined command to the controller 20 by performing a predetermined key input through the key input unit 25, so that a desired operation can be executed.
- the external storage memory 24 stores a control program for controlling the movements and emotions of the entire robot 1 in a hierarchized manner. The control program consists of an embedded real-time OS (Operating System) on which it runs, and three layers above it: a system software layer, a middleware layer, and an application layer.
- the system software layer consists of device drivers that directly control each device such as the head 2, right front leg 6, left front leg 7, right rear leg 8, and left rear leg 9, and server objects that provide services to upper-layer objects. The middleware layer includes, for example, a recognition object that processes image signals, audio signals, contact information, and the like, a motion control object that controls motions of the robot 1 such as walking and posture, and a motion generation object that expresses emotion by moving the legs, the head 2, and the tail 4.
- the application layer consists of, for example, a learning object that performs learning, an emotion model object that handles emotions, an action generation object that determines behavior, and a scenario object that characterizes the entire robot 1.
- the emotion model object has an emotion module.
- the emotion module handles multiple emotion units, one for each of six emotion models held as data (joy, sadness, anger, fear, surprise, disgust). Each emotion unit consists of the current emotion level (hereinafter referred to as the emotion level), the minimum emotion level, the maximum emotion level, and a threshold that is used as a criterion for notifying the emotion.
- each of these emotion levels is first initialized with the value of an emotion parameter, and then changes over time according to external information supplied from, for example, the recognition object.
- the emotion units have the property of influencing each other, raising or lowering each other's emotion levels. For example, when the emotion level of the sadness emotion unit is high, the emotion level of the anger emotion unit becomes high, and when the emotion level of the joy emotion unit is high, the emotion levels of the anger and disgust emotion units become low.
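- the emotion unit structure described above can be summarized in a short sketch. The class name, field names, and interaction coefficients below are illustrative assumptions, not taken from the patent; only the presence of a current level bounded by minimum and maximum values, a notification threshold, and mutual influence between units comes from the text.

```python
from dataclasses import dataclass

@dataclass
class EmotionUnit:
    # Hypothetical representation of one emotion unit (field names are assumptions).
    name: str
    level: float = 0.0        # current emotion level
    min_level: float = 0.0    # minimum emotion level
    max_level: float = 100.0  # maximum emotion level
    threshold: float = 50.0   # criterion for notifying the emotion

    def add(self, delta: float) -> None:
        # Change the level while keeping it inside [min_level, max_level].
        self.level = max(self.min_level, min(self.max_level, self.level + delta))

# Illustrative mutual-influence table: when the source unit's level rises,
# the target unit's level is pushed in the given direction (weights assumed).
INFLUENCE = {
    ("sadness", "anger"): +0.5,   # high sadness raises anger
    ("joy", "anger"): -0.5,       # high joy lowers anger
    ("joy", "disgust"): -0.5,     # high joy lowers disgust
}

def apply_influence(units: dict[str, EmotionUnit], source: str, delta: float) -> None:
    units[source].add(delta)
    for (src, dst), weight in INFLUENCE.items():
        if src == source:
            units[dst].add(weight * delta)
```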
- the recognition object of the middleware layer handles various sensor information obtained from devices such as the head 2, right front leg 6, left front leg 7, right rear leg 8, and left rear leg 9, for example color information of an image from a color sensor, a sound signal collected by the microphone 16, and contact information from the touch sensor 11, and notifies the emotion module of the emotion model object of the recognition result of this input information.
- the emotion module determines the type of the input recognition result, and changes the emotion level of each emotion unit using the parameter of the type. Then, the emotion module selects the emotion unit having the highest emotion level among the emotion units that exceed the threshold.
- the selected emotion unit notifies the object requesting the output, for example, the action generation object, of the information.
- an object requesting output must register itself as an observer to the emotion module using an object-oriented observer pattern.
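- a minimal sketch of this selection and notification flow is shown below; the observer interface and method names are assumptions made for illustration, since the patent only states that the unit with the highest level among those exceeding their thresholds is selected and that output-requesting objects register themselves as observers.

```python
class EmotionObserver:
    # Hypothetical observer interface; e.g. the action generation object.
    def on_emotion(self, unit: "EmotionUnit") -> None:
        raise NotImplementedError

class EmotionModule:
    def __init__(self, units: dict[str, "EmotionUnit"]):
        self.units = units
        self.observers: list[EmotionObserver] = []

    def register(self, observer: EmotionObserver) -> None:
        # Output-requesting objects register themselves (observer pattern).
        self.observers.append(observer)

    def notify(self) -> None:
        # Pick the unit with the highest level among those above their threshold.
        over = [u for u in self.units.values() if u.level > u.threshold]
        if not over:
            return
        strongest = max(over, key=lambda u: u.level)
        for obs in self.observers:
            obs.on_emotion(strongest)
```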
- the action generation object controls the hardware via the motion generation object and the like. That is, the robot 1 is configured to express emotions by moving devices such as the head 2, the right front leg 6, the left front leg 7, the right rear leg 8, and the left rear leg 9.
  (3) Use of the internal storage memory and the external storage memory in the entertainment robot
- the robot 1 has an internal storage memory 23 as fixed storage means attached non-detachably to the inside of the body 3, and an external storage memory 24 as detachable storage means attached detachably to the buttocks of the body 3. The internal storage memory 23 and the external storage memory 24 are used properly by the controller 20 as control means in accordance with the type and use of the data.
- for example, the product name, version number, serial number, model information, and repair history of the robot 1 are information on the hardware unique to the robot 1 itself, and are stored in the internal storage memory 23.
- when a repair is requested, the manufacturer reads the information about the hardware from the internal storage memory 23 of the robot 1 to determine the manufacturing type, manufacturing date, and past repair status, and when the repair is completed, adds to the repair history in the same way. Because this information about the hardware is stored in the internal storage memory 23 attached to the inside of the body 3, the data cannot be changed by the user without permission.
- in addition, the software can cope with subtle individual differences such as motor noise, and can operate with different parameters depending on the model.
- the motion control section 34 moves the head 2, the right front leg 6, the left front leg 7, the right rear leg 8, the left rear leg 9, and other devices to a predetermined angular position, detects that position with the position sensor, and calibrates any angular deviation based on this reference angular position. Likewise, to recognize colors accurately, the robot calibrates erroneous color recognition based on the ranges of RGB values and UV values corresponding to the colors of the image signal captured by the camera. Such reference angular positions and UV value ranges, as well as configuration information of the acceleration sensor 35, are stored in the internal storage memory 23.
- the U value is expressed as α(R (Red) − Y (luminance signal)) and the V value as β(B (Blue) − Y (luminance signal)), where α and β are coefficients.
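- as a worked illustration of this color calibration, the sketch below computes U and V from RGB values and checks them against a stored range; the luminance weights, coefficient values, and range-check helper are assumptions, since only the forms U = α(R − Y) and V = β(B − Y) come from the text.

```python
# Assumed illustrative coefficients; the patent only defines
# U = alpha * (R - Y) and V = beta * (B - Y) with alpha, beta as coefficients.
ALPHA, BETA = 0.9, 0.5

def to_uv(r: float, g: float, b: float) -> tuple[float, float]:
    # Assumed luminance weighting for the Y (luminance) signal.
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return ALPHA * (r - y), BETA * (b - y)

def in_calibrated_range(u: float, v: float,
                        u_range: tuple[float, float],
                        v_range: tuple[float, float]) -> bool:
    # u_range / v_range would be the ranges kept in the internal storage memory 23.
    return u_range[0] <= u <= u_range[1] and v_range[0] <= v <= v_range[1]
```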
- in the robot 1, since the control program for performing basic operations is stored in the external storage memory 24 (FIG. 2), the robot 1 does not operate unless the external storage memory 24 is attached to the buttocks of the body 3, and even if the external storage memory 24 is attached, it cannot operate if an error occurs. In these cases, the robot 1 informs the user by a predetermined motion that the external storage memory 24 is not attached, or that an error has occurred and autonomous operation is not possible. An error operation execution program for notifying such errors by motion is stored in the internal storage memory 23 (FIG. 2) in advance.
- in such a case, the robot 1 does not perform its normal operation and instead performs the predetermined operation for notifying the error.
- since the error operation execution program is stored in the internal storage memory 23, the robot 1 can at least perform the error notification operation even when it cannot operate because the external storage memory 24 is not attached, so that the user can easily recognize the problem.
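- a simple sketch of this start-up check is shown below; the function and memory-access names are hypothetical, since the patent only specifies that the control program lives in the external storage memory 24 and the error operation execution program in the internal storage memory 23.

```python
def start_robot(external_memory, internal_memory) -> None:
    # Hypothetical boot sequence: names and interfaces are assumptions.
    try:
        if external_memory is None:
            raise RuntimeError("external storage memory 24 is not attached")
        control_program = external_memory.load("control_program")
        control_program.run()
    except Exception:
        # Fall back to the error operation execution program kept in the
        # fixed internal storage memory 23, so the user is notified by motion.
        error_program = internal_memory.load("error_operation_execution_program")
        error_program.run()
```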
- the robot 1 can perform operations such as "wagging the tail 4 when the head 2 is stroked" and "raising the head 2 and shaking the head when the head 2 is hit".
- the motion generation object has a plurality of parameters specifying operations that are executed autonomously in response to external input. Learning data for these parameters, which cause a predetermined operation to be executed more frequently through learning, and personality data for these parameters, which give each robot 1 different characteristics such as "a robot 1 that moves its head 2 frequently" or "a robot 1 that often wags its tail 4", are stored in the internal storage memory 23.
- since the robot 1 stores the learning data and the personality data in the internal storage memory 23, even if the external storage memory 24 storing the control program is replaced with another one, the learning data already obtained by learning and the personality data unique to the robot 1 itself remain unchanged, so there is no need to retrain the robot or worry that its personality will change.
- the robot 1 performs image processing with the image processing unit 31 (FIG. 2) on the surrounding image signals acquired at predetermined time intervals by the camera unit 13 (FIG. 1), and sends the result as an image data file to the external storage memory 24 via the main storage unit 22, where it is written.
- similarly, the robot 1 processes the surrounding audio signals collected at predetermined time intervals by the microphone 16 (FIG. 1) with the audio processing unit 32 (FIG. 2), and sends the result as an audio data file to the external storage memory 24 via the main storage unit 22, where it is written.
- as shown in FIG. 5, the user can remove the external storage memory 24 from the buttocks of the robot 1 and read out the image data file and the audio data file stored in the external storage memory 24 via a personal computer 40 (hereinafter simply referred to as the computer 40) for output to a monitor.
- while acting autonomously, the robot 1 changes the emotion level of an emotion unit when, for example, it sees a favorite color or is stroked by someone, and stores the change history in the external storage memory 24 as emotion level history data.
- the operation history when the robot 1 itself operates is stored in the external storage memory 24 as operation history data.
- the user removes the external storage memory 24 from the bottom of the robot 1 as shown in FIG. 5, and transfers the emotion level history data and the operation history data stored in the external storage memory 24 via the computer 40. It can be read out and displayed on a monitor.
- the user can check the history of the emotion level and the history of the operation of the robot 1 displayed on the monitor, and thus can grasp the change of the emotion and the change of the operation of the robot 1.
- the robot 1 stores these errors in the external storage memory 24 as a system log when an error occurs during power-on or during autonomous operation.
- the user removes the external storage memory 24 from the bottom of the robot 1 as shown in FIG. 5, reads the system log stored in the external storage memory 24 via the computer 40, and sends it to the monitor. It can be displayed.
- the robot 1 stores in advance, in the external storage memory 24, a template image in which the user's face is captured.
- the robot 1 performs template matching based on the captured still image and a template image stored in the external storage memory 24 in advance. Thereby, it can be determined whether or not the user is the master.
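- a sketch of such template matching using normalized cross-correlation via OpenCV is given below; the use of OpenCV and the acceptance threshold are assumptions, since the patent only says a captured still image is matched against a template image read from the external storage memory 24.

```python
import cv2
import numpy as np

def is_master(still_image: np.ndarray, template: np.ndarray,
              threshold: float = 0.8) -> bool:
    # Slide the stored template over the captured still image and take the
    # best normalized correlation score; 0.8 is an assumed acceptance threshold.
    result = cv2.matchTemplate(still_image, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, _ = cv2.minMaxLoc(result)
    return max_val >= threshold
```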
- since the robot 1 stores the template image as recognition image data in the external storage memory 24 in advance, the external storage memory 24 can be removed from the robot 1 as shown in FIG. 5, and the template image stored in the external storage memory 24 can be rewritten via the computer 40 to an image of another user's face.
- when the external storage memory 24 storing a new template image in which the face of another user is captured is re-attached to the robot 1, the robot 1 can perform template matching based on the new template image, so that the other user can be recognized as the new master.
- in this way, since the robot 1 stores the imaging target to be recognized in the external storage memory 24 in advance, if the user wants to change the imaging target to be recognized, it is only necessary to remove the external storage memory 24 and rewrite the data, or to replace the external storage memory 24 itself.
- the robot 1 stores, in the external storage memory 24 as color data, the ranges of RGB values and UV values of the colors to be recognized in the image signal captured by the camera unit 13.
- for example, a command of "chase after" is given to the control program when the color "red" is recognized, and a command of "run away" is given to the control program when the color "black" is recognized; in this way, the behavior pattern of the robot 1 can be set.
- since the robot 1 stores the color data as recognition color data in the external storage memory 24 in advance, the external storage memory 24 can be removed from the buttocks of the robot 1 as shown in FIG. 5, and the color data stored in the external storage memory 24 can be rewritten to other color data via the computer 40.
- as a result, the robot 1 takes an action pattern corresponding to the rewritten color, so that the user can make the robot recognize a desired color and set an action pattern corresponding to that color.
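- the color-to-behavior mapping could look like the sketch below; the dictionary layout, range values, and command strings (beyond "chase after" for red and "run away" for black, which come from the text) are illustrative assumptions.

```python
# Recognition color data as it might be laid out in the external storage
# memory 24: per-color RGB ranges plus the command handed to the control
# program (structure and numeric ranges are assumptions for illustration).
RECOGNITION_COLORS = {
    "red":   {"rgb": ((150, 255), (0, 80), (0, 80)), "command": "chase after"},
    "black": {"rgb": ((0, 50), (0, 50), (0, 50)),    "command": "run away"},
}

def command_for_pixel(r: int, g: int, b: int) -> str | None:
    for entry in RECOGNITION_COLORS.values():
        (r_lo, r_hi), (g_lo, g_hi), (b_lo, b_hi) = entry["rgb"]
        if r_lo <= r <= r_hi and g_lo <= g <= g_hi and b_lo <= b <= b_hi:
            return entry["command"]   # e.g. "chase after" when red is seen
    return None
```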
- the robot 1 stores the audio data corresponding to the audio output from the speaker 14 in the external storage memory 24.
- as the audio data, an actual animal cry may be digitally recorded and used as a sound file, or MIDI (Musical Instrument Digital Interface) data may be used.
- in this case, since the audio data is stored in the external storage memory 24, the external storage memory 24 can be removed from the buttocks of the robot 1 as shown in FIG. 5, and the audio data stored in the external storage memory 24 can be rewritten via the computer 40.
- the robot 1 performs operations such as "shaking the head" and "wagging the tail" based on operation data corresponding to predetermined operation patterns, and this operation data is stored in the external storage memory 24.
- since the operation data is stored in the external storage memory 24, the external storage memory 24 can be removed from the buttocks of the robot 1 as shown in FIG. 5, and the operation data stored in the external storage memory 24 can be rewritten with new operation data via the computer 40.
- as a result, the robot 1 can perform an operation according to the new operation data. In this way, because the operation data is stored in the external storage memory 24, the user can easily make the robot execute a desired operation.
- the robot 1 also stores, in the external storage memory 24, an operation program for operating the image processing unit 31. In this case, since the operation program is stored in the external storage memory 24, the external storage memory 24 can be removed from the buttocks of the robot 1 as shown in FIG. 5, and the operation program stored in the external storage memory 24 can be rewritten with a new operation program via the computer 40.
- DSP (Digital Signal Processor)
- as a result, the robot 1 can cause the image processing unit 31 to execute processing other than image processing according to the new operation program.
- in this way, the robot 1 can arbitrarily set the processing performed by the DSP by replacing the external storage memory 24 or rewriting the operation program in the external storage memory 24.
- the robot 1 also stores in the external storage memory 24 learning data for those parameters which, among the parameters defining its operations, cause a predetermined operation to be performed more frequently through learning.
- the learning data is, for example, data set so that when the robot 1 performs a certain operation and the user strokes it, the probability of performing that operation when the corresponding contact information is input thereafter is increased, so that the operation is performed more frequently.
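- one way to picture this learning data is as a table of action probabilities that is nudged upward when the matching contact information arrives; the update rule, key names, and numbers below are assumptions, since the patent only states that the probability of performing the operation is increased.

```python
# Hypothetical learning data: probability of executing each operation when
# a given contact input arrives (values and update rule are assumptions).
learning_data = {("stroke_head", "shake_head"): 0.2}

def reinforce(contact: str, operation: str, reward: float = 0.1) -> None:
    # Raise the probability of the operation for this contact, capped at 1.0.
    key = (contact, operation)
    learning_data[key] = min(1.0, learning_data.get(key, 0.0) + reward)

reinforce("stroke_head", "shake_head")   # the user stroked the robot after the action
```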
- since the learning data is stored in the external storage memory 24, the external storage memory 24 can be removed from the buttocks of the robot 1 as shown in FIG. 5 and the learning data rewritten to new learning data, so that the robot 1 can immediately execute actions according to the new learning data without having to learn anew.
- the robot 1 also stores in the external storage memory 24 personality data that gives each robot different behavior patterns and characteristics, in contrast to the data common to all robots.
- since the personality data is stored in the external storage memory 24, the external storage memory 24 can be removed from the buttocks of the robot 1 as shown in FIG. 5, and via the computer 40 the personality data stored in the external storage memory 24 can be rewritten to new personality data, copied as another set of personality data, or replaced with personality data transplanted from another robot.
- the robot 1 can immediately execute an action according to the new personality data.
- next, the data processing by which the operation result data obtained as a result of the autonomous operation of the robot 1, such as the learning data, image data file, audio data file, emotion level history data, operation history data, and system log, is stored in either the internal storage memory 23 or the external storage memory 24 will be described with reference to the data write processing procedure RT1 shown in FIG. 8.
- the controller 20 of the robot 1 enters from the start step of the data write processing procedure RT1 and moves to step SP1.
- in step SP1, the controller 20 reads out and starts the control program from the external storage memory 24 via the main storage unit 22 when the power of the robot 1 is turned on, and proceeds to the next step SP2.
- in step SP2, the controller 20 reads out the data used for performing a predetermined operation according to external input from the internal storage memory 23 or the external storage memory 24 via the main storage unit 22, and proceeds to step SP3.
- in step SP3, the controller 20 stores the operation result data obtained by performing the predetermined operation based on the control program and the data read out in the internal storage memory 23 or the external storage memory 24 according to the type of the data, and then ends the data write processing procedure RT1 in the next step SP4.
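- read as pseudocode, the procedure RT1 might look like the sketch below; the storage interfaces and the routing rule are assumptions, since the patent only states that the operation result data is written to the internal or external memory according to its type.

```python
INTERNAL_TYPES = {"learning_data", "personality_data"}        # assumed routing
EXTERNAL_TYPES = {"image_file", "audio_file", "emotion_level_history",
                  "operation_history", "system_log"}

def data_write_procedure_rt1(controller, internal_mem, external_mem) -> None:
    # SP1: load and start the control program from the external storage memory 24.
    program = external_mem.read("control_program")
    controller.start(program)
    # SP2: read the data needed for the operation triggered by external input.
    data = controller.read_operation_data(internal_mem, external_mem)
    # SP3: perform the operation and store the result according to its type.
    result_type, result = controller.perform_operation(data)
    target = internal_mem if result_type in INTERNAL_TYPES else external_mem
    target.write(result_type, result)
    # SP4: end of the data write processing procedure RT1.
```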
- as described above, the robot 1 is provided with the internal storage memory 23 attached and fixed inside the body 3 and the external storage memory 24 detachably attached to the buttocks of the body 3.
- the internal storage memory 23 and the external storage memory 24 are used selectively under the control of the controller 20.
- the robot 1 writes control data that should always be held and should not be rewritten or changed, such as hardware-related information, the error operation execution program, learning data that should not be changed, and personality data, to the internal storage memory 23 fixed inside the body 3.
- on the other hand, the robot 1 writes to the detachably attached external storage memory 24 control data that the user may freely change, such as environment information including the surrounding image data captured by the camera unit 13 and the audio data collected by the microphone 16, the emotion level history data indicating the history of changes in the emotion levels, the operation history data, the system log, the recognition image data and recognition color data used by the robot 1 itself for recognition, the audio data corresponding to the sound to be output, the operation data for executing predetermined operations, the operation program for causing the DSP to execute arbitrary processing, and learning data and personality data.
- thereby, the user can read out the various data written in the external storage memory 24 and check it on a monitor, can rewrite the data written in the external storage memory 24, and can replace the external storage memory 24 itself with a new external storage memory 24.
- furthermore, in the robot 1, the control program is stored in the detachable external storage memory 24.
- according to the above configuration, the robot 1 is provided with the internal storage memory 23 attached and fixed inside the body 3 and the external storage memory 24 detachably attached to the buttocks of the body 3, and by having the controller 20 use the internal storage memory 23 and the external storage memory 24 properly in accordance with the type and content of the control data, the usability can be further improved.
- the present invention is not limited to this, and an internal storage memory 23 composed of a hard disk and an external storage memory 24 composed of a Memory Stick may be used; in this case, the same effect as in the above-described embodiment can be obtained.
- in the above-described embodiment, the internal storage memory 23 is attached and fixed inside the body 3, and the external storage memory 24 is detachably inserted into the buttocks of the body 3. However, the present invention is not limited to this; the internal storage memory 23 may be attached and fixed inside the head 2, and the external storage memory 24 may be detachably attached to the back.
- in the above-described embodiment, the robot device of the present invention is applied to an entertainment robot; however, the present invention is not limited to this, and may be applied to various other types of robot devices that act autonomously.
- the present invention can be used for entertainment robots and other robot devices that act autonomously.
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP00900911A EP1120205A4 (en) | 1999-01-25 | 2000-01-25 | ROBOT |
US09/646,723 US6381515B1 (en) | 1999-01-25 | 2000-01-25 | Robot apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP11/15762 | 1999-01-25 | ||
JP01576299A JP4366617B2 (ja) | 1999-01-25 | 1999-01-25 | ロボット装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2000043168A1 true WO2000043168A1 (fr) | 2000-07-27 |
Family
ID=11897814
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2000/000342 WO2000043168A1 (fr) | 1999-01-25 | 2000-01-25 | Robot |
Country Status (5)
Country | Link |
---|---|
US (1) | US6381515B1 (ja) |
EP (1) | EP1120205A4 (ja) |
JP (1) | JP4366617B2 (ja) |
CN (1) | CN1301830C (ja) |
WO (1) | WO2000043168A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2002041254A1 (fr) * | 2000-11-17 | 2002-05-23 | Sony Corporation | Dispositif robot et procede d'identification faciale, et dispositif et procede d'identification d'images |
US6711467B2 (en) | 2000-10-05 | 2004-03-23 | Sony Corporation | Robot apparatus and its control method |
US7720775B2 (en) * | 2002-03-06 | 2010-05-18 | Sony Corporation | Learning equipment and learning method, and robot apparatus |
Families Citing this family (86)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1112984C (zh) * | 1999-05-10 | 2003-07-02 | 索尼公司 | 机器宠物装置的控制方法 |
US6663393B1 (en) * | 1999-07-10 | 2003-12-16 | Nabil N. Ghaly | Interactive play device and method |
JP2001191284A (ja) * | 1999-10-25 | 2001-07-17 | Sony Corp | ロボット装置及びロボット装置の学習方法 |
JP2002063505A (ja) * | 2000-08-16 | 2002-02-28 | Nippon Telegr & Teleph Corp <Ntt> | 情報配信方法、情報配信センタ装置、情報配信端末装置及びキャラクタ人形 |
CN100380324C (zh) * | 2000-08-28 | 2008-04-09 | 索尼公司 | 通信设备和通信方法、网络系统、以及机器人设备 |
JP2002163000A (ja) * | 2000-08-29 | 2002-06-07 | Matsushita Electric Ind Co Ltd | 配信システム |
KR20020067696A (ko) * | 2000-10-11 | 2002-08-23 | 소니 가부시끼 가이샤 | 로봇 장치와 정보 표시 시스템 및 정보 표시 방법 |
JP2002113675A (ja) | 2000-10-11 | 2002-04-16 | Sony Corp | ロボット制御システム並びにロボット制御用ソフトウェアの導入方法 |
TW546874B (en) * | 2000-10-11 | 2003-08-11 | Sony Corp | Robot apparatus and its control method with a function of preventing an image from being stolen |
WO2002030626A1 (fr) | 2000-10-11 | 2002-04-18 | Sony Corporation | Systeme de commande de robot et procede de commande de robot |
CN100411828C (zh) * | 2000-10-13 | 2008-08-20 | 索尼公司 | 机器人装置及其行为控制方法 |
JP2002127059A (ja) * | 2000-10-20 | 2002-05-08 | Sony Corp | 行動制御装置および方法、ペットロボットおよび制御方法、ロボット制御システム、並びに記録媒体 |
KR20020067921A (ko) * | 2000-10-23 | 2002-08-24 | 소니 가부시끼 가이샤 | 각식 로봇 및 각식 로봇의 행동 제어 방법, 및 기억 매체 |
JP2002166378A (ja) * | 2000-11-30 | 2002-06-11 | Sony Corp | ロボット装置 |
JP4759806B2 (ja) * | 2000-12-05 | 2011-08-31 | ソニー株式会社 | 診断装置 |
TWI236610B (en) * | 2000-12-06 | 2005-07-21 | Sony Corp | Robotic creature device |
KR100360722B1 (ko) * | 2000-12-18 | 2002-11-13 | 주식회사 이플래닛 | 컴퓨터와 연동되는 완구 시스템 |
JP4689107B2 (ja) * | 2001-08-22 | 2011-05-25 | 本田技研工業株式会社 | 自律行動ロボット |
JP3837479B2 (ja) * | 2001-09-17 | 2006-10-25 | 独立行政法人産業技術総合研究所 | 動作体の動作信号生成方法、その装置及び動作信号生成プログラム |
JP2005515903A (ja) * | 2001-11-28 | 2005-06-02 | エヴォリューション ロボティクス インコーポレイテッド | ロボット用センサおよびアクチュエータのハードウェア抽象化層内における抽象化および集合化 |
US20040162637A1 (en) | 2002-07-25 | 2004-08-19 | Yulun Wang | Medical tele-robotic system with a master remote station with an arbitrator |
US6925357B2 (en) | 2002-07-25 | 2005-08-02 | Intouch Health, Inc. | Medical tele-robotic system |
EP1598155B1 (en) * | 2003-02-14 | 2011-04-20 | Honda Giken Kogyo Kabushiki Kaisha | Abnormality detector of moving robot |
US7813836B2 (en) | 2003-12-09 | 2010-10-12 | Intouch Technologies, Inc. | Protocol for a remotely controlled videoconferencing robot |
WO2005069890A2 (en) * | 2004-01-15 | 2005-08-04 | Mega Robot, Inc. | System and method for reconfiguring an autonomous robot |
US20050204438A1 (en) | 2004-02-26 | 2005-09-15 | Yulun Wang | Graphical interface for a remote presence system |
JP2006015436A (ja) * | 2004-06-30 | 2006-01-19 | Honda Motor Co Ltd | 監視ロボット |
US8077963B2 (en) | 2004-07-13 | 2011-12-13 | Yulun Wang | Mobile robot with a head-based movement mapping scheme |
JP4086024B2 (ja) * | 2004-09-14 | 2008-05-14 | ソニー株式会社 | ロボット装置及びその行動制御方法 |
DE102004054867B4 (de) | 2004-11-12 | 2018-03-29 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Mit sensorischer Intelligenz ausgestatteter Roboter |
US7247783B2 (en) * | 2005-01-22 | 2007-07-24 | Richard Grossman | Cooperative musical instrument |
US8588979B2 (en) * | 2005-02-15 | 2013-11-19 | Sony Corporation | Enhancements to mechanical robot |
US7047108B1 (en) * | 2005-03-01 | 2006-05-16 | Sony Corporation | Enhancements to mechanical robot |
US8588969B2 (en) * | 2005-03-01 | 2013-11-19 | Sony Corporation | Enhancements to mechanical robot |
US9198728B2 (en) | 2005-09-30 | 2015-12-01 | Intouch Technologies, Inc. | Multi-camera mobile teleconferencing platform |
US8849679B2 (en) | 2006-06-15 | 2014-09-30 | Intouch Technologies, Inc. | Remote controlled robot system that provides medical images |
US9160783B2 (en) | 2007-05-09 | 2015-10-13 | Intouch Technologies, Inc. | Robot system that operates through a network firewall |
JP2009011362A (ja) * | 2007-06-29 | 2009-01-22 | Sony Computer Entertainment Inc | 情報処理システム、ロボット装置及びその制御方法 |
WO2009011054A1 (ja) * | 2007-07-18 | 2009-01-22 | Hirata Corporation | ロボットシステム |
US20090090305A1 (en) * | 2007-10-03 | 2009-04-09 | National University Of Singapore | System for humans and pets to interact remotely |
US10875182B2 (en) | 2008-03-20 | 2020-12-29 | Teladoc Health, Inc. | Remote presence system mounted to operating room hardware |
US8179418B2 (en) | 2008-04-14 | 2012-05-15 | Intouch Technologies, Inc. | Robotic based health care system |
US8170241B2 (en) | 2008-04-17 | 2012-05-01 | Intouch Technologies, Inc. | Mobile tele-presence system with a microphone system |
US9193065B2 (en) | 2008-07-10 | 2015-11-24 | Intouch Technologies, Inc. | Docking system for a tele-presence robot |
US9842192B2 (en) | 2008-07-11 | 2017-12-12 | Intouch Technologies, Inc. | Tele-presence robot system with multi-cast features |
US8340819B2 (en) | 2008-09-18 | 2012-12-25 | Intouch Technologies, Inc. | Mobile videoconferencing robot system with network adaptive driving |
US8996165B2 (en) | 2008-10-21 | 2015-03-31 | Intouch Technologies, Inc. | Telepresence robot with a camera boom |
US8463435B2 (en) | 2008-11-25 | 2013-06-11 | Intouch Technologies, Inc. | Server connectivity control for tele-presence robot |
US9138891B2 (en) * | 2008-11-25 | 2015-09-22 | Intouch Technologies, Inc. | Server connectivity control for tele-presence robot |
US8849680B2 (en) | 2009-01-29 | 2014-09-30 | Intouch Technologies, Inc. | Documentation through a remote presence robot |
US8897920B2 (en) | 2009-04-17 | 2014-11-25 | Intouch Technologies, Inc. | Tele-presence robot system with software modularity, projector and laser pointer |
US8384755B2 (en) | 2009-08-26 | 2013-02-26 | Intouch Technologies, Inc. | Portable remote presence robot |
US11399153B2 (en) * | 2009-08-26 | 2022-07-26 | Teladoc Health, Inc. | Portable telepresence apparatus |
US11154981B2 (en) * | 2010-02-04 | 2021-10-26 | Teladoc Health, Inc. | Robot user interface for telepresence robot system |
US8670017B2 (en) | 2010-03-04 | 2014-03-11 | Intouch Technologies, Inc. | Remote presence system including a cart that supports a robot face and an overhead camera |
US10343283B2 (en) | 2010-05-24 | 2019-07-09 | Intouch Technologies, Inc. | Telepresence robot system that can be accessed by a cellular phone |
US10808882B2 (en) | 2010-05-26 | 2020-10-20 | Intouch Technologies, Inc. | Tele-robotic system with a robot face placed on a chair |
CA2720886A1 (en) * | 2010-11-12 | 2012-05-12 | Crosswing Inc. | Customizable virtual presence system |
US9264664B2 (en) | 2010-12-03 | 2016-02-16 | Intouch Technologies, Inc. | Systems and methods for dynamic bandwidth allocation |
US8965579B2 (en) | 2011-01-28 | 2015-02-24 | Intouch Technologies | Interfacing with a mobile telepresence robot |
US9323250B2 (en) | 2011-01-28 | 2016-04-26 | Intouch Technologies, Inc. | Time-dependent navigation of telepresence robots |
CN102122448B (zh) * | 2011-02-28 | 2012-11-14 | 焦利民 | 一种家用教育机器人系统 |
CN102122205A (zh) * | 2011-02-28 | 2011-07-13 | 焦利民 | 一种用于家用教育机器人系统中的手持式识别系统 |
US10769739B2 (en) | 2011-04-25 | 2020-09-08 | Intouch Technologies, Inc. | Systems and methods for management of information among medical providers and facilities |
US9098611B2 (en) | 2012-11-26 | 2015-08-04 | Intouch Technologies, Inc. | Enhanced video interaction for a user interface of a telepresence network |
US20140139616A1 (en) | 2012-01-27 | 2014-05-22 | Intouch Technologies, Inc. | Enhanced Diagnostics for a Telepresence Robot |
US8836751B2 (en) | 2011-11-08 | 2014-09-16 | Intouch Technologies, Inc. | Tele-presence system with a user interface that displays different communication links |
US9251313B2 (en) | 2012-04-11 | 2016-02-02 | Intouch Technologies, Inc. | Systems and methods for visualizing and managing telepresence devices in healthcare networks |
US8902278B2 (en) | 2012-04-11 | 2014-12-02 | Intouch Technologies, Inc. | Systems and methods for visualizing and managing telepresence devices in healthcare networks |
US9361021B2 (en) | 2012-05-22 | 2016-06-07 | Irobot Corporation | Graphical user interfaces including touchpad driving interfaces for telemedicine devices |
EP2852881A4 (en) | 2012-05-22 | 2016-03-23 | Intouch Technologies Inc | GRAPHIC USER INTERFACES CONTAINING TOUCH PAD TOUCH INTERFACES FOR TELEMEDICINE DEVICES |
FR2991222B1 (fr) * | 2012-06-01 | 2015-02-27 | Aldebaran Robotics | Systeme et procede pour generer des comportements contextuels d'un robot mobile executes en temps reel |
US9873556B1 (en) | 2012-08-14 | 2018-01-23 | Kenney Manufacturing Company | Product package and a method for packaging a product |
JP5862611B2 (ja) | 2013-04-02 | 2016-02-16 | トヨタ自動車株式会社 | 作業変更装置、作業変更方法、及び作業変更プログラム |
US9592603B2 (en) * | 2014-12-01 | 2017-03-14 | Spin Master Ltd. | Reconfigurable robotic system |
CN104461016B (zh) * | 2014-12-23 | 2018-02-13 | 杭州云造科技有限公司 | 产品的机器性格表现方法及装置 |
JP6545472B2 (ja) | 2015-01-27 | 2019-07-17 | 蛇の目ミシン工業株式会社 | ロボット |
DE102016210846B4 (de) * | 2016-03-24 | 2021-12-30 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Roboterarm |
JP6933213B2 (ja) * | 2016-06-30 | 2021-09-08 | ソニーグループ株式会社 | 情報処理装置、情報処理方法、およびプログラム |
CN105936047B (zh) * | 2016-07-06 | 2018-05-04 | 厦门快商通科技股份有限公司 | 仿脑机器人控制与学习系统 |
CN106228233B (zh) * | 2016-07-13 | 2019-09-20 | 百度在线网络技术(北京)有限公司 | 一种用于无人驾驶车辆测试的智能体的构建方法和装置 |
US11862302B2 (en) | 2017-04-24 | 2024-01-02 | Teladoc Health, Inc. | Automated transcription and documentation of tele-health encounters |
US10483007B2 (en) | 2017-07-25 | 2019-11-19 | Intouch Technologies, Inc. | Modular telehealth cart with thermal imaging and touch screen user interface |
US11636944B2 (en) | 2017-08-25 | 2023-04-25 | Teladoc Health, Inc. | Connectivity infrastructure for a telehealth platform |
US10617299B2 (en) | 2018-04-27 | 2020-04-14 | Intouch Technologies, Inc. | Telehealth cart that supports a removable tablet with seamless audio/video switching |
JP7320323B1 (ja) | 2023-03-30 | 2023-08-03 | Innovation Farm株式会社 | ロボット販売装置及びロボット販売方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS61167997A (ja) * | 1985-01-21 | 1986-07-29 | カシオ計算機株式会社 | 会話ロボツト |
JPH0612401A (ja) * | 1992-06-26 | 1994-01-21 | Fuji Xerox Co Ltd | 感情模擬装置 |
JPH09114514A (ja) * | 1995-10-17 | 1997-05-02 | Sony Corp | ロボット制御方法およびその装置 |
JPH09153082A (ja) * | 1995-12-01 | 1997-06-10 | Nippon Steel Corp | モデル作製装置及びその記憶媒体 |
JPH10289006A (ja) * | 1997-04-11 | 1998-10-27 | Yamaha Motor Co Ltd | 疑似感情を用いた制御対象の制御方法 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3911613A (en) * | 1974-02-15 | 1975-10-14 | Marvin Glass & Associates | Articulated figure toy and accessories |
US5100362A (en) * | 1990-12-03 | 1992-03-31 | Fogarty A Edward | Propellable articulating animal toy |
US5172806A (en) * | 1991-11-08 | 1992-12-22 | S. R. Mickelberg Company, Inc. | Animated toy in package |
US5289916A (en) * | 1991-11-08 | 1994-03-01 | S. R. Mickelberg Company, Inc. | Animated toy in package |
US5606494A (en) * | 1993-11-25 | 1997-02-25 | Casio Computer Co., Ltd. | Switching apparatus |
EP0762498A3 (en) | 1995-08-28 | 1998-06-24 | International Business Machines Corporation | Fuse window with controlled fuse oxide thickness |
US5626505A (en) * | 1996-02-06 | 1997-05-06 | James Industries, Inc. | Spring-animated toy figure |
US5963712A (en) * | 1996-07-08 | 1999-10-05 | Sony Corporation | Selectively configurable robot apparatus |
JP3765356B2 (ja) * | 1997-12-22 | 2006-04-12 | ソニー株式会社 | ロボツト装置 |
-
1999
- 1999-01-25 JP JP01576299A patent/JP4366617B2/ja not_active Expired - Fee Related
-
2000
- 2000-01-25 EP EP00900911A patent/EP1120205A4/en not_active Withdrawn
- 2000-01-25 US US09/646,723 patent/US6381515B1/en not_active Expired - Fee Related
- 2000-01-25 WO PCT/JP2000/000342 patent/WO2000043168A1/ja active Application Filing
- 2000-01-25 CN CNB008000646A patent/CN1301830C/zh not_active Expired - Fee Related
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS61167997A (ja) * | 1985-01-21 | 1986-07-29 | カシオ計算機株式会社 | 会話ロボツト |
JPH0612401A (ja) * | 1992-06-26 | 1994-01-21 | Fuji Xerox Co Ltd | 感情模擬装置 |
JPH09114514A (ja) * | 1995-10-17 | 1997-05-02 | Sony Corp | ロボット制御方法およびその装置 |
JPH09153082A (ja) * | 1995-12-01 | 1997-06-10 | Nippon Steel Corp | モデル作製装置及びその記憶媒体 |
JPH10289006A (ja) * | 1997-04-11 | 1998-10-27 | Yamaha Motor Co Ltd | 疑似感情を用いた制御対象の制御方法 |
Non-Patent Citations (3)
Title |
---|
MASAHIRO FUJITA.: "Reconfigurable Physical Agents", PROCEEDINGS OF THE SECOND INTERNATIONAL CONFERECNE ON AUTONOMOUS AGENTS,, 9 May 1998 (1998-05-09), pages 54 - 61, XP002927687 * |
ROBOT ENTERTAINMENT., PROCEEDINGS OF THE 6TH SONY RESEARCH FORUM,, 27 November 1996 (1996-11-27), pages 234 - 239, XP002927686 * |
See also references of EP1120205A4 * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6711467B2 (en) | 2000-10-05 | 2004-03-23 | Sony Corporation | Robot apparatus and its control method |
WO2002041254A1 (fr) * | 2000-11-17 | 2002-05-23 | Sony Corporation | Dispositif robot et procede d'identification faciale, et dispositif et procede d'identification d'images |
US7200249B2 (en) | 2000-11-17 | 2007-04-03 | Sony Corporation | Robot device and face identifying method, and image identifying device and image identifying method |
US7317817B2 (en) | 2000-11-17 | 2008-01-08 | Sony Corporation | Robot apparatus, face identification method, image discriminating method and apparatus |
US7720775B2 (en) * | 2002-03-06 | 2010-05-18 | Sony Corporation | Learning equipment and learning method, and robot apparatus |
Also Published As
Publication number | Publication date |
---|---|
CN1301830C (zh) | 2007-02-28 |
EP1120205A1 (en) | 2001-08-01 |
JP4366617B2 (ja) | 2009-11-18 |
CN1293606A (zh) | 2001-05-02 |
US6381515B1 (en) | 2002-04-30 |
JP2000210886A (ja) | 2000-08-02 |
EP1120205A4 (en) | 2004-05-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2000043168A1 (fr) | Robot | |
US6470235B2 (en) | Authoring system and method, and storage medium used therewith | |
US6591165B2 (en) | Robot apparatus, body unit and coupling unit | |
CA2309671C (en) | Robot apparatus, method of controlling robot apparatus, method of display, and medium | |
JP7400923B2 (ja) | 情報処理装置および情報処理方法 | |
EP1508409A1 (en) | Robot device and robot control method | |
US20030187653A1 (en) | Action teaching apparatus and action teaching method for robot system, and storage medium | |
KR20020056949A (ko) | 오서링 시스템 및 오서링 방법과 기억 매체 | |
JP2003039363A (ja) | ロボット装置、ロボット装置の行動学習方法、ロボット装置の行動学習プログラム、及びプログラム記録媒体 | |
WO2000068879A1 (fr) | Dispositif robot, son procede de commande et support d'enregistrement | |
WO2001058649A1 (fr) | Systeme robotique, dispositif robotise et procede de controle d'un tel systeme et dispositif et procede de traitement de donnees | |
JP2001191275A (ja) | ロボット・システム、外装及びロボット装置 | |
JP2001310283A (ja) | ロボットシステム、ロボット装置及びその制御方法、並びに情報処理装置及び方法 | |
JP2000334163A (ja) | 通信機能付き電子機器 | |
JP2002086378A (ja) | 脚式ロボットに対する動作教示システム及び動作教示方法 | |
JP4779292B2 (ja) | ロボット制御装置および方法、記録媒体、並びにプログラム | |
JP2003340760A (ja) | ロボット装置およびロボット制御方法、記録媒体、並びにプログラム | |
JP2003205179A (ja) | ペット型ロボット | |
JP2001191274A (ja) | データ保持装置、ロボット装置、変更装置及び変更方法 | |
JP2002205289A (ja) | ロボット装置の動作制御方法、プログラム、記録媒体及びロボット装置 | |
JP2001157982A (ja) | ロボット装置及びその制御方法 | |
JP2001157980A (ja) | ロボット装置及びその制御方法 | |
JP2001154707A (ja) | ロボット装置及びその制御方法 | |
JP2001157981A (ja) | ロボット装置及びその制御方法 | |
JP2001105363A (ja) | ロボットにおける自律的行動表現システム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 00800064.6 Country of ref document: CN |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): CN US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): DE FR GB |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2000900911 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 09646723 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 2000900911 Country of ref document: EP |