JP2002355779A - Robot type interface device and control method for the same - Google Patents

Robot type interface device and control method for the same

Info

Publication number
JP2002355779A
Authority
JP
Japan
Prior art keywords
robot
interface device
type interface
means
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
JP2001166227A
Other languages
Japanese (ja)
Inventor
Atsuro Nakamura
Yasuhiro Ueda
Hironori Yamataka
Original Assignee
Sharp Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Corp
Priority to JP2001166227A
Publication of JP2002355779A
Status: Withdrawn

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F19/00: Digital computing or data processing equipment or methods, specially adapted for specific applications

Abstract

PROBLEM TO BE SOLVED: To provide a robot type interface device that permits a user to recognize presented information accurately. SOLUTION: A drive device is controlled to move the robot type interface device so that the face of a user captured by a video camera 113 is near the center of an image. An angle of a liquid crystal display device 111 relative to the robot type interface device is next changed. The liquid crystal display device 111 is thus controlled into a position continuously opposed to the face of the user, who can accurately recognize presented information.

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to an interface device for inputting and outputting information to and from a user, and more particularly to a robot-type interface device that performs lifelike actions and is used at home or the like, and to a control method thereof.

[0002]

2. Description of the Related Art In recent years, consumer robots such as dog-type robots have become widespread in homes for hobby use. As a technology related to such consumer robots, there is the invention disclosed in Japanese Patent Application Laid-Open No. 7-248823.

[0003] The personal robot apparatus disclosed in Japanese Patent Application Laid-Open No. 7-248823 has a memory for storing a program, a microcomputer for executing the program, a traveling block for traveling in accordance with the program, and one display device, or a combination of a plurality of display devices, for externally displaying the processing contents of images, voices, data, and signals input to the robot body and of information inside the robot body.

[0004] Because the display device displays the processing contents of signals and information inside the robot body, the user can easily know what the robot is looking at, what it is listening to, and what it is about to do next; this gives a sense of security to a user who feels anxious about a robot that moves freely.

[0005]

SUMMARY OF THE INVENTION In a robot-type interface device that exchanges information with a user, as in the invention disclosed in the above-mentioned Japanese Patent Application Laid-Open No. 7-248823, the recognition rate is important in both input and output. That is, it is important for the robot to correctly recognize the user's voice, surrounding objects, and the like, and to produce voice or display output that the user can correctly recognize. In particular, in a consumer robot, the accuracy of mutual information transmission between the user and the robot greatly affects the user's impression of the robot as a whole.

[0006] However, in the personal robot device disclosed in Japanese Patent Application Laid-Open No. 7-248823, the display device is fixed to the robot body, so its position can be adjusted only within a plane parallel to the ground, by traveling of the personal robot device. Since the angle of the display device relative to that plane cannot be adjusted, the user must move his or her face to a position where the display device can be seen in order to correctly recognize the displayed content. There is also the problem that the influence of disturbances such as external light and noise on the display device cannot be adjusted.

[0007] Further, the impression a user receives from interaction with a robot is considered to depend on various factors, among them the form of the robot. However, since the sensors and display devices mounted on conventional robots are fixed, they may hinder interaction when the robot moves or is touched, and the sensors and display devices may break down as a result of the interaction itself.

[0008] The present invention has been made to solve the above problems, and a first object of the present invention is to provide a robot-type interface device that allows a user to accurately recognize presented information, and a control method thereof.

[0009] A second object is to provide a robot-type interface device capable of accurately acquiring external information and performing accurate control.

[0010] A third object is to provide a robot-type interface device capable of reducing power consumption.

[0011] A fourth object is to provide a robot-type interface device whose components can be protected by changing their positions and which is easy for a user to use.

[0012]

According to one aspect of the present invention, a robot-type interface device for presenting information to a user in accordance with a surrounding state includes: detection means for detecting a state around the robot-type interface device; display means for presenting information to the user; first driving means for moving the robot-type interface device; second driving means for moving the display means; and control means for controlling the first driving means in accordance with the surrounding state detected by the detection means to move the robot-type interface device, and for controlling the second driving means to change at least one of the relative position and the relative angle of the display means with respect to the robot-type interface device, thereby presenting information to the user.

The control means controls the second driving means to change at least one of the relative position and the relative angle of the display means with respect to the robot-type interface device, so that the user can accurately recognize the presented information.

Preferably, the detection means includes image acquisition means for acquiring an image of the surroundings, the robot-type interface device further includes third driving means for moving the image acquisition means, and the control means controls the first driving means and the third driving means so that the user's face acquired by the image acquisition means is located near the center of the image, and then controls the second driving means to change at least one of the relative position and the relative angle of the display means with respect to the robot-type interface device.

Therefore, the position of the user's face can be accurately determined, and accurate control of the robot-type interface device can be performed.

[0016] More preferably, the control means starts presenting information by the display means when the image of the user's face acquired by the image acquisition means coincides with a previously stored face image.

Therefore, it is possible to prevent information from being presented to a user other than the user registered in advance.

Preferably, the detection means further includes distance measurement means for measuring the distance to a surrounding object, and the control means controls the first driving means to move the robot-type interface device so that the distance to the user measured by the distance measurement means becomes a predetermined value, then controls the first driving means and the third driving means so that the user's face acquired by the image acquisition means is near the center of the image, and then controls the second driving means to change at least one of the relative position and the relative angle of the display means.

Therefore, the robot-type interface device can accurately move to the position where the user is.

Preferably, the robot-type interface device further includes illuminance measurement means for measuring surrounding illuminance, and the control means changes the display mode of the display means in accordance with the surrounding illuminance measured by the illuminance measurement means.

Therefore, the display mode of the display means can be accurately changed, and power consumption can be reduced.

Preferably, the robot-type interface device further includes voice input means for inputting and recognizing the user's voice, and the control means starts the presentation of information by the display means when it determines that the recognition result by the voice input means is a first predetermined command.

Therefore, the robot-type interface device can accurately determine the start of information presentation.

[0024] More preferably, the control means terminates the presentation of the information by the display means if the recognition result by the voice input means is the second predetermined command.

Therefore, the robot type interface device can accurately determine the end of the information presentation.

More preferably, the robot-type interface device further includes illuminance measurement means for measuring surrounding illuminance, and the control means ends the presentation of information by the display means when the surrounding illuminance measured by the illuminance measurement means is equal to or less than a predetermined value.

Therefore, the robot-type interface device can accurately determine the end of the information presentation.

Preferably, the robot-type interface device further includes a cover provided so as to cover the display means, and the second drive means lifts and moves the display means after opening the cover.

Therefore, the display means is covered with the cover except when presenting information, so that the display means which is vulnerable to impact can be protected.

Preferably, the robot-type interface device further includes receiving means for receiving information from the outside, and the display means displays the information received by the receiving means.

Therefore, information presented by the display means can be supplied from the outside, and the convenience of the robot-type interface device can be improved.

According to another aspect of the present invention, a method of controlling a robot-type interface device including a display device for presenting information to a user in accordance with a surrounding state includes the steps of: detecting a state around the robot-type interface device; moving the robot-type interface device in accordance with the detected surrounding state; and changing at least one of a relative position and a relative angle of the display device with respect to the robot-type interface device to present information to the user.

Since at least one of the relative position and the relative angle of the display device with respect to the robot-type interface device is changed in accordance with the detected surrounding state and information is presented to the user, the user can accurately recognize the presented information.

[0034]

FIGS. 1 and 2 are a top view and a side view of a robot-type interface device according to an embodiment of the present invention, shown with covers 103T and 103B (described later) removed. This robot-type interface device includes: a motherboard 101 to which boards modularized by function are connected; a main switch 102; a power supply block 104; left and right drive wheels 105R and 105L; drive wheel driving devices 106R and 106L for driving the drive wheels 105R and 105L, respectively; four link mechanisms 107FR, 107FL, 107RR, and 107RL, each having one degree of freedom; link mechanism driving devices 108FR, 108FL, 108RR, and 108RL for driving these link mechanisms, respectively; a liquid crystal display device 111; a liquid crystal display device cover 109 covering the liquid crystal display device 111; an opening/closing drive device 110 for opening and closing the liquid crystal display device cover 109; a storage driving device 112 that lifts one end of the liquid crystal display device 111 to store it and to change its angle; a video camera 113; microphones 115R and 115L provided on the left and right of the front of the apparatus; an ultrasonic sensor 116E provided on the front of the apparatus; ultrasonic sensors 116FR, 116FL, 116RR, and 116RL provided at the front left and right and the rear left and right of the apparatus; a video camera driving device 114 for moving the video camera 113 and the ultrasonic sensor 116E simultaneously; an illuminance sensor 117 provided at the center of the upper surface of the power supply block 104; a video camera lens 118 provided on the front of the video camera 113; a speaker 119 provided at the front of the apparatus; a voice input block 120, a voice output block 121, an image processing block 122, an input/output block 123 for controlling acquisition of sensor information and driving of motors and the like, an external communication block 124, and a control block 125 for controlling the entire apparatus, all connected to the motherboard 101; and an external communication connector 126.

The robot-type interface device is built around the motherboard 101, and connectors on the motherboard 101 receive the boards modularized by function: the voice input block 120, the voice output block 121, the image processing block 122, the input/output block 123, the external communication block 124, and the control block 125. Details of these blocks 120-125 will be described later.

A main switch 102 is provided at the rear of the motherboard 101 and starts and stops the operation of the robot-type interface device. Above the main switch 102, an external communication connector 126 connected to the external communication block 124 is provided, through which the device can receive information from the outside or transmit information to the outside via a cable (not shown).

Below the motherboard 101, drive wheels 105R and 105L for moving the robot-type interface device and drive wheel driving devices 106R and 106L, including motors, for driving the drive wheels 105R and 105L are provided, and link mechanisms 107FR, 107FL, 107RR, and 107RL, together with link mechanism driving devices 108FR, 108FL, 108RR, and 108RL for driving them, are attached as drive legs.

A speaker 119 is provided at the front of the motherboard 101 and outputs voice and the like generated by the voice output block 121. On the upper part of the motherboard 101, a power supply block 104 composed of a rechargeable battery such as a lithium-ion battery is provided and supplies the power consumed by the robot-type interface device.

On the upper part of the motherboard 101, a liquid crystal display device cover 109, an opening/closing drive device 110, a liquid crystal display device 111, and a storage driving device 112 are provided. Driving the opening/closing drive device 110 opens the liquid crystal display device cover 109, and driving the storage driving device 112 lifts one end of the liquid crystal display device 111 and changes the angle of the liquid crystal display device 111, as described later.

In front of the liquid crystal display device 111, a video camera 113, an ultrasonic sensor 116E, and a two-degree-of-freedom video camera driving device 114 for moving them simultaneously are provided. The ultrasonic sensor 116E is oriented in the same direction as the imaging direction of the video camera 113, and both are driven together by the video camera driving device 114. The image processing block 122 recognizes people and objects around the robot-type interface device by analyzing images captured by the video camera 113.

Microphones 115R and 115L are mounted symmetrically on both sides of the video camera 113 and the ultrasonic sensor 116E. The voice input block 120 performs voice recognition by analyzing voice input via the microphones 115R and 115L. A speaker 119 is also provided at the front of the robot-type interface device, and the voice output block 121 outputs synthesized voice and the like through it.

The robot-type interface device is provided with ultrasonic sensors 116FR, 116FL, 116RR, and 116RL for measuring distances in four directions, making it possible to measure the distances to people and objects around the robot-type interface device.
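
For orientation, ultrasonic ranging of this kind is normally a time-of-flight measurement: the sensor emits a pulse and times the echo. The sketch below illustrates that arithmetic; the function name and the echo-timing interface are illustrative assumptions, since the patent does not specify how the sensors 116E and 116FR-116RL report their measurements.

```python
# Minimal sketch of ultrasonic time-of-flight ranging, as sensors like
# 116E and 116FR-116RL typically work. The echo-timing interface is a
# hypothetical stand-in; the patent does not define it.

SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C

def distance_from_echo(round_trip_s: float) -> float:
    """Convert a measured echo round-trip time into a one-way distance."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# Example: an echo returning after 5.8 ms corresponds to roughly 1 m.
print(distance_from_echo(0.0058))  # ~0.995 m
```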

FIG. 3 is a top view of the robot-type interface device according to the present embodiment with the covers 103T and 103B attached. The cover 103T is fixed to the liquid crystal display device cover 109, and the cover 103B is fixed to the motherboard 101. The illuminance sensor 117 on the upper surface of the power supply block 104 is mounted so as to be located at the center of the cover 103T, and can measure the surrounding illuminance through the cover 103T.

FIG. 4 is a side view of the robot-type interface device according to the present embodiment with the cover 103T opened. The control block 125 controls the opening/closing drive device 110 via the input/output block 123 to open the liquid crystal display device cover 109, together with the cover 103T fixed to it, in the direction of arrow 131, and then controls the storage driving device 112 to lift the liquid crystal display device 111 in the direction of arrow 132, changing the angle of the liquid crystal display device 111.
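
The paragraph above fixes an ordering: the cover must open (arrow 131) before the display is lifted (arrow 132). A minimal sketch of that interlock follows; the device handles, method names, and timing are hypothetical stand-ins for the opening/closing drive device 110 and the storage driving device 112.

```python
# Sketch of the Fig. 4 actuation sequence: open cover 109/103T first, then
# lift display 111. The drive-device interfaces here are assumptions.
import time

def present_display(cover_drive, storage_drive, angle_deg=30.0):
    """Open the cover (arrow 131), then lift the display (arrow 132)."""
    cover_drive.open()
    while not cover_drive.is_fully_open():   # interlock: never lift early
        time.sleep(0.05)
    storage_drive.lift_to(angle_deg)

def stow_display(cover_drive, storage_drive):
    """Reverse order when stowing: lower the display, then close the cover."""
    storage_drive.lift_to(0.0)
    cover_drive.close()
```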

At this time, the video camera 113 and the ultrasonic sensor 116E provided at the front of the liquid crystal display device cover 109 are lifted at the same time, so that each faces forward and obliquely upward with respect to the robot-type interface device. Similarly, the microphones 115R and 115L provided symmetrically on both sides of the video camera 113 and the ultrasonic sensor 116E also face forward and obliquely upward with respect to the robot-type interface device.

As methods by which the robot-type interface device accurately presents information to the user, there are a first method in which the liquid crystal display device cover 109 is opened and the liquid crystal display device 111 is presented to the user, after which the user moves in front of the liquid crystal display device 111, and a second method in which the robot-type interface device moves in front of the user by means of the drive wheels 105 or the drive legs 107 and then opens the liquid crystal display device cover 109 to present the liquid crystal display device 111 to the user.

In the first method, the voice input block 120 recognizes the user's voice input through the microphones 115, and the control block 125 analyzes the user's command based on the recognition result. When it determines that the voice command is a specific command, the control block 125 controls the opening/closing drive device 110 to open the liquid crystal display device cover 109, and then controls the storage driving device 112 to lift the liquid crystal display device 111 and set it to a predetermined angle. Thereafter, the user moves so as to face the front of the presented liquid crystal display device 111.

In the second method, the control block 125 drives the drive wheels 105 or the drive legs 107 to move in the direction in which an object exists, in accordance with the output signal from the ultrasonic sensor 116E. When the control block 125 detects, based on the output signal from the ultrasonic sensor 116E, that the distance to the object in front of the robot-type interface device has reached a predetermined value, the image processing block 122 extracts a skin-color region from the image acquired by the video camera 113 and compares it with the user's face data stored in advance in a memory (not shown) to determine whether the object is the user.
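
As a rough illustration of this verification step, the sketch below extracts the largest skin-color blob from a camera frame and compares it against stored face data with a color-histogram correlation. OpenCV is assumed; the HSV bounds, the histogram comparison, and the 0.7 acceptance threshold are illustrative choices, not values taken from the patent.

```python
# Hedged sketch of skin-color extraction plus a crude appearance match.
import cv2
import numpy as np

SKIN_LO = np.array([0, 40, 60], dtype=np.uint8)    # hypothetical HSV bounds
SKIN_HI = np.array([25, 180, 255], dtype=np.uint8)

def extract_skin_region(frame_bgr):
    """Return the bounding-box crop of the largest skin-colored blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LO, SKIN_HI)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return frame_bgr[y:y + h, x:x + w]

def matches_registered_user(candidate_bgr, registered_bgr, threshold=0.7):
    """Compare color histograms; real systems would use a face recognizer."""
    def hist(img):
        h = cv2.calcHist([img], [0, 1, 2], None, [8, 8, 8], [0, 256] * 3)
        return cv2.normalize(h, h).flatten()
    score = cv2.compareHist(hist(candidate_bgr), hist(registered_bgr),
                            cv2.HISTCMP_CORREL)
    return score >= threshold
```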

If it is determined that the person is the user stored in the memory, the control block 125 controls the opening/closing drive device 110 to open the liquid crystal display device cover 109, then controls the storage driving device 112 to lift the liquid crystal display device 111 and change its angle so that the liquid crystal display device 111 faces the user. In this way, based on the output signal from the ultrasonic sensor 116E and the image acquired by the video camera 113, control can be performed so that the liquid crystal display device 111 is oriented toward the front of the user, and the user's expression, the distance to the user, the environment behind the user, and the like can be imaged.

FIG. 5 is a diagram showing how the robot-type interface device according to the present embodiment presents information to the user. The control block 125 presents information to the user by displaying, on the liquid crystal display device 111, information about the robot body stored in advance in an internal memory (not shown), information received from the outside via the external communication block 124, and the like.

FIG. 5A shows the case where it is determined, based on the output signal from the ultrasonic sensor 116E and the image acquired by the video camera 113, that light from a light source reflected by the liquid crystal display device 111 enters the user's eyes. In this case, since a disturbance such as reflection occurs, the control block 125 controls the storage driving device 112 to change the angle of the liquid crystal display device 111 so that the light from the light source reflected by the liquid crystal display device 111 does not enter the user's eyes.

FIG. 5B shows the state in which the light from the light source reflected by the liquid crystal display device 111 does not enter the user's eyes and no disturbance such as reflection occurs. In this way, information can be displayed on the liquid crystal display device 111 and presented to the user while reducing the burden on the user.
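
The glare condition of FIG. 5 can be checked with elementary mirror geometry: reflect the incoming light ray about the display's surface normal and test whether the reflected ray points toward the user's eyes. The 2-D sketch below works an example; the geometry and the 10-degree tolerance are assumptions for illustration only.

```python
# Illustrative 2-D glare check: does the specular reflection off the
# display head toward the user's eyes?
import math

def reflect(incident, normal):
    """Mirror-reflect a 2-D unit vector about a unit surface normal."""
    d = incident[0] * normal[0] + incident[1] * normal[1]
    return (incident[0] - 2 * d * normal[0], incident[1] - 2 * d * normal[1])

def glare_hits_eyes(light_dir, display_normal, to_eyes, tol_deg=10.0):
    r = reflect(light_dir, display_normal)
    cos_a = r[0] * to_eyes[0] + r[1] * to_eyes[1]
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a)))) <= tol_deg

# Light falling straight down onto a display tilted 45 degrees reflects
# horizontally; if the user's eyes lie in that direction, the angle must
# be changed, as in Fig. 5A.
n = (math.cos(math.radians(45)), math.sin(math.radians(45)))
print(glare_hits_eyes((0.0, -1.0), n, (1.0, 0.0)))  # True -> tilt again
```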

FIG. 6 shows an example of the appearance of the robot-type interface device according to the present embodiment with the liquid crystal display device cover 109 opened and the liquid crystal display device 111 facing the direction of the user's face. As shown in FIG. 6, in this state the video camera 113 and the ultrasonic sensor 116E also face the direction of the user's face.

When the user moves his or her face to take a comfortable posture, the control block 125 changes the angle of the liquid crystal display device 111 or drives the drive wheels 105 and the drive legs 107 in accordance with the position of the user's face in the image acquired by the video camera 113, so that the liquid crystal display device 111 is always positioned facing the user's face.
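
A plausible reading of this behavior is a simple proportional servo: convert the pixel offset of the detected face from the image center into small pan (wheels or legs) and tilt (display angle) corrections. The gains, dead band, and image size below are hypothetical.

```python
# Sketch of the face-centering behavior described above.
IMG_W, IMG_H = 320, 240
K_PAN, K_TILT = 0.002, 0.002   # rad per pixel, illustrative gains
DEAD_BAND_PX = 15              # "near the center" tolerance

def centering_commands(face_cx: int, face_cy: int):
    """Return (pan_rad, tilt_rad) corrections, or (0, 0) if centered."""
    dx = face_cx - IMG_W // 2
    dy = face_cy - IMG_H // 2
    if abs(dx) <= DEAD_BAND_PX and abs(dy) <= DEAD_BAND_PX:
        return 0.0, 0.0
    return -K_PAN * dx, -K_TILT * dy

# e.g. a face detected at (220, 90) asks for a small left pan and upward tilt.
print(centering_commands(220, 90))  # (-0.12, 0.06)
```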

In the above description, the liquid crystal display device 111 is presented to the user to provide internal information. When the liquid crystal display device 111 is not required, it can be stored with the liquid crystal display device cover 109 closed. By storing unused components inside in this way, the appearance of the robot-type interface device becomes a shape that the user can easily approach, and the impact-sensitive liquid crystal display device 111 can be protected and failures avoided.

FIG. 7 is a block diagram showing a schematic configuration of the electric components of the robot-type interface device according to the embodiment of the present invention. The electric components of the robot-type interface device include: the drive wheel driving devices 106 for driving the drive wheels 105; the link mechanism driving devices 108 for driving the link mechanisms 107; the liquid crystal display device 111; the opening/closing drive device 110 for opening and closing the liquid crystal display device cover 109 and the cover 103T fixed to it; the storage driving device 112 for lifting one end of the liquid crystal display device 111; the video camera 113; the video camera driving device 114 for moving the video camera 113 and the ultrasonic sensor 116E simultaneously; the microphones 115; the ultrasonic sensors 116; the illuminance sensor 117; the speaker 119; the voice input block 120 for inputting and analyzing voice via the microphones 115; the voice output block 121 for outputting voice and the like via the speaker 119; the image processing block 122 for analyzing images captured by the video camera 113; the input/output block 123 for controlling acquisition of sensor information and driving of the motors; the external communication block 124; the control block 125 for controlling the entire apparatus; and the external communication connector 126.

FIG. 8 is a flowchart illustrating the processing procedure of the robot-type interface device according to the present embodiment. In the initial state, the liquid crystal display device 111 is housed inside the robot-type interface device and covered by the liquid crystal display device cover 109. In this state, when the user inputs voice through the microphones 115, the voice input block 120 performs recognition processing on the input voice and outputs the recognition result to the control block 125. The control block 125 determines whether or not the recognition result is an information presentation command from the user (S1).

If the control block 125 determines that the command is an information presentation command (S1, Yes), it controls the drive wheels 105 and the drive legs 107 to move the robot-type interface device and search its surroundings, orienting the robot-type interface device toward the user and controlling the distance to the user to a predetermined value (S2).

Next, the image processing block 122 determines whether or not the user's face exists in the image captured by the video camera 113 (S3). If the user's face does not exist in the image captured by the video camera 113 (S3, No), the process returns to step S1 and the subsequent processing is repeated. If the user's face exists in the image captured by the video camera 113 (S3, Yes), the control block 125 determines whether or not the liquid crystal display device 111 is stored in the robot-type interface device (S4). If the liquid crystal display device 111 has already been presented to the user (S4, No), the process proceeds to step S6. If the liquid crystal display device 111 is stored in the robot-type interface device (S4, Yes), the control block 125 controls the opening/closing drive device 110 and the storage driving device 112 via the input/output block 123 to present the liquid crystal display device 111 to the user (S5).

In step S6, the image processing block 122 determines whether or not the user's face exists near the center of the image captured by the video camera 113 (S6). If the user's face does not exist near the center of the image (S6, No), it is determined whether the user's face exists anywhere in the image (S7). If the user's face does not exist in the image (S7, No), preparations are made to end the information presentation: the opening/closing drive device 110 and the storage driving device 112 are controlled to store the liquid crystal display device 111 in the robot-type interface device, and the process ends (S12).

If the user's face exists in the image (S7, Yes), the drive wheel driving devices 106 and the link mechanism driving devices 108 are controlled to drive the drive wheels 105 and the drive legs 107, moving the device so that the user's face comes to the center of the image (S8), and the process proceeds to step S9.

If it is determined in step S6 that the user's face exists near the center of the image (S6, Yes), the control block 125 reads out the information stored in the memory (not shown) and presents it to the user by displaying it on the liquid crystal display device 111 (S9).

Next, it is determined whether or not an instruction to end the information presentation has been given by voice (S10). When the voice input block 120 determines that the voice input via the microphones 115 is an end instruction (S10, Yes), preparations are made to end the information presentation: the opening/closing drive device 110 and the storage driving device 112 are controlled to store the liquid crystal display device 111 in the robot-type interface device, and the process ends (S12).

When it is determined that the voice input via the microphones 115 is not an end instruction (S10, No), it is determined from the illuminance sensor 117 whether or not an instruction to end the information presentation has been given (S11). When the illuminance sensor 117 is covered and the illuminance decreases, it is determined that an instruction to end the information presentation has been given (S11, Yes): preparations are made to end the information presentation, the opening/closing drive device 110 and the storage driving device 112 are controlled to store the liquid crystal display device 111 in the robot-type interface device, and the process ends (S12). If the illuminance measured by the illuminance sensor 117 has not decreased (S11, No), the process returns to step S6 and the subsequent processing is repeated.
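
Gathering steps S1 to S12 together, the flowchart can be read as the loop sketched below. Every helper method (recognize_command, approach_user, and so on) is a hypothetical stand-in for the blocks 120 to 125 described above, not an interface defined in the patent.

```python
# Compact sketch of the Fig. 8 procedure (steps S1-S12).
def run_presentation_cycle(robot):
    # S1: wait for a voice "information presentation" command.
    if robot.recognize_command() != "present":
        return
    robot.approach_user()                    # S2: face the user, set distance
    if not robot.face_in_view():             # S3: No -> back to S1 (caller loops)
        return
    if robot.display_stowed():               # S4
        robot.open_cover_and_lift_display()  # S5
    while True:
        if not robot.face_near_center():     # S6
            if not robot.face_in_view():     # S7: user gone -> S12
                break
            robot.drive_to_recenter_face()   # S8
        robot.show_information()             # S9
        if robot.voice_end_command():        # S10: spoken end command -> S12
            break
        if robot.illuminance_low():          # S11: sensor covered -> S12
            break
    robot.stow_display()                     # S12: close up and finish
```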

If a liquid crystal display device capable of switching between a reflective mode and a backlit mode is used as the liquid crystal display device 111, the power of the power supply block 104 can be saved. That is, the amount of light in front of the liquid crystal display device 111 is measured from the image captured by the video camera 113 or the like, and power consumption can be reduced by switching the liquid crystal display device 111 to the reflective mode when the amount of light is large and to the backlit mode when the amount of light is small.
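
A minimal sketch of this power-saving rule follows: estimate the light in front of the display, for example from the mean brightness of the camera image, and select the reflective mode when it is high and the backlit mode when it is low. The threshold value is an illustrative assumption.

```python
# Sketch of ambient-light-driven display-mode selection.
REFLECTIVE_THRESHOLD = 90  # mean 8-bit brightness; hypothetical value

def choose_display_mode(mean_brightness: float) -> str:
    """Reflective mode needs no backlight power when ambient light suffices."""
    return "reflective" if mean_brightness >= REFLECTIVE_THRESHOLD else "backlit"

# e.g. a dim room averaging 40/255 selects the backlight.
print(choose_display_mode(40))  # "backlit"
```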

Even when the liquid crystal display device 111 is of a backlit type that cannot be switched, or when a display device other than a liquid crystal display is used, information such as the user's facial expression, predetermined identification information, and the external light and noise around the robot-type interface device can be used to change the amount of light of the display device and the volume output from the speaker 119, so that information can be presented with reduced power consumption and without stressing the user. Furthermore, if an EL (ElectroLuminescence) display is used as the display device, display can be performed with a smaller amount of light in dark places, so power consumption can be reduced.

Although the liquid crystal display device 111 is stored and presented to the user by opening and closing the liquid crystal display device cover 109 and lifting the liquid crystal display device 111, the liquid crystal display device 111 may instead be stored using a parallel-feed mechanism. In this case the angle of the liquid crystal display device 111 cannot be adjusted, but the number of movable parts is reduced to one, so control and design become easier.

Although only the liquid crystal display device 111 is housed inside the robot-type interface device here, output devices such as the speaker 119 and sensors such as the microphones 115 may also be housed inside the robot-type interface device. This makes it possible to design the appearance of a functional robot-type interface device, to protect those components from impacts and the like, and to reduce causes of failure.

Although the interaction between the user and the robot-type interface device is performed here using only non-contact sensors (the ultrasonic sensors), if the illuminance sensor 117 provided at the center of the cover 103T is used together with a touch sensor and control is performed by detecting the direction and location in which the user touches the touch sensor, instructions can be given to the robot-type interface device and information can be acquired even in a dark place.

As described above, according to the robot-type interface device of the present embodiment, the angle of the liquid crystal display device 111 is changed, or the drive wheels 105 and the drive legs 107 are driven, in accordance with the position of the user's face in the image acquired by the video camera 113, so that the liquid crystal display device 111 is always positioned facing the user's face; the user can therefore accurately recognize the information presented by the robot-type interface device.

Further, since the orientations of the video camera 113 and the ultrasonic sensor 116E are controlled to acquire information around the robot-type interface device, external information can be acquired accurately, and the robot-type interface device can perform accurate control without the user giving it detailed instructions.

Further, since information is presented to the user by opening the liquid crystal display device cover 109 and then lifting one end of the liquid crystal display device 111, the liquid crystal display device 111 can be housed inside the robot-type interface device when not in use, giving the device a shape that is familiar to the user, and the impact-sensitive liquid crystal display device 111 can be protected.

The embodiments disclosed this time are to be considered in all respects as illustrative and not restrictive. The scope of the present invention is defined by the terms of the claims, rather than the description above, and is intended to include any modifications within the scope and meaning equivalent to the terms of the claims.

[0074]

According to one aspect of the present invention, the control means controls the second driving means to change at least one of the relative position and the relative angle of the display means with respect to the robot-type interface device and presents information to the user, so that the user can accurately recognize the presented information.

Further, since the control means controls the first driving means and the third driving means so that the user's face acquired by the image acquisition means is located near the center of the image, and then controls the second driving means to change at least one of the relative position and the relative angle of the display means with respect to the robot-type interface device, the position of the user's face can be accurately determined and the robot-type interface device can be accurately controlled.

Further, when the control means starts the presentation of information by the display means when the image of the user's face acquired by the image acquisition means matches the face image stored in advance, it becomes possible to prevent information from being presented to anyone other than the registered user.

Further, since the control means controls the first driving means so that the distance to the user measured by the distance measurement means becomes a predetermined value, the robot-type interface device can accurately move to the position where the user is.

Further, since the control means changes the display mode of the display means in accordance with the surrounding illuminance measured by the illuminance measurement means, the display mode of the display means can be changed accurately, and power consumption can be reduced.

Further, since the control means starts the presentation of information by the display means when it determines that the recognition result by the voice input means is the first predetermined command, the robot-type interface device can accurately determine the start of information presentation.

Further, since the control means ends the presentation of information by the display means when the recognition result by the voice input means is the second predetermined command, the robot-type interface device can accurately determine the end of information presentation.

Further, when the control means ends the presentation of information by the display means when the surrounding illuminance measured by the illuminance measurement means is equal to or less than a predetermined value, the robot-type interface device can accurately determine the end of information presentation.

Further, since the second driving means lifts and moves the display means after opening the cover, the display means is covered by the cover except when presenting information, so the impact-sensitive display means can be protected.

Further, since the display means displays the information received by the receiving means, the information presented by the display means can be supplied from the outside, and the convenience of the robot type interface device can be improved.

[Brief description of the drawings]

FIG. 1 is a top view of a robot-type interface device according to an embodiment of the present invention.

FIG. 2 is a side view of the robot-type interface device according to the embodiment of the present invention.

FIG. 3 is a top view of the robot interface device according to the present embodiment with covers 103T and 103B attached.

FIG. 4 is a side view of the robot interface device according to the present embodiment with a cover 103T opened.

FIG. 5 is a diagram showing how the robot-type interface device according to the embodiment of the present invention presents information to a user.

FIG. 6 shows an external appearance example of the robot-type interface device according to the present embodiment in a state where the liquid crystal display device cover 109 is opened and the liquid crystal display device 111 faces the direction of the user's face.

FIG. 7 is a block diagram illustrating a schematic configuration of electric components of the robot-type interface device according to the embodiment of the present invention.

FIG. 8 is a flowchart illustrating a processing procedure of the robot-type interface device according to the embodiment of the present invention.

[Explanation of symbols]

101 motherboard, 102 main switch, 103T, 103B cover, 104 power supply block, 105R, 105L drive wheel, 106R, 106L drive wheel driving device, 107FR, 107FL, 107RR, 107RL link mechanism, 108FR, 108FL, 108RR, 108RL link mechanism driving device, 109 liquid crystal display device cover, 110 opening/closing drive device, 111 liquid crystal display device, 112 storage driving device, 113 video camera, 114 video camera driving device, 115R, 115L microphone, 116E, 116FR, 116FL, 116RR, 116RL ultrasonic sensor, 117 illuminance sensor, 118 video camera lens, 119 speaker, 120 voice input block, 121 voice output block, 122 image processing block, 123 input/output block, 124 external communication block, 125 control block, 126 external communication connector.

(72) Inventor: Yasuhiro Ueda, 22-22 Nagaikecho, Abeno-ku, Osaka City, Osaka. F-terms (reference): 2C150 AA14 CA01 CA02 CA04 DA06 DA24 DA25 DA26 DA27 DA28 DF03 DF04 DF06 DF33 DG01 DG02 DG12 DG13 DG42 DK02 ED42 ED52 EF07 EF16 EF17 EF23 EF29 EF33 EF36; 3C007 AS34 CS08 JU03 KS00 KS11 KS36 KS39 KT01 KV18 KX02 LT06 LT11 LV12 WA02 WA16 WB16 WB17 WB19 WC06

Claims (11)

[Claims]
1. A robot-type interface device for presenting information to a user in accordance with a surrounding state, comprising: detection means for detecting a state around the robot-type interface device; display means for presenting information to the user; first driving means for moving the robot-type interface device; second driving means for moving the display means; and control means for controlling the first driving means in accordance with the surrounding state detected by the detection means to move the robot-type interface device, and for controlling the second driving means to change at least one of a relative position and a relative angle of the display means with respect to the robot-type interface device, thereby presenting information to the user.
2. The robot-type interface device according to claim 1, wherein the detection means includes image acquisition means for acquiring an image of the surroundings, the robot-type interface device further comprises third driving means for moving the image acquisition means, and the control means controls the first driving means and the third driving means so that the user's face acquired by the image acquisition means is near the center of the image, and then controls the second driving means to change at least one of the relative position and the relative angle of the display means with respect to the robot-type interface device.
3. The robot-type interface device according to claim 2, wherein the control means starts the presentation of information by the display means when the image of the user's face acquired by the image acquisition means matches a face image stored in advance.
4. The robot-type interface device according to claim 2, wherein the detection means further includes distance measurement means for measuring a distance to a surrounding object, and the control means controls the first driving means to move the robot-type interface device so that the distance to the user measured by the distance measurement means becomes a predetermined value, then controls the first driving means and the third driving means so that the user's face acquired by the image acquisition means is located near the center of the image, and then controls the second driving means to change at least one of the relative position and the relative angle of the display means.
5. The robot-type interface device according to any one of claims 1 to 4, further comprising illuminance measurement means for measuring surrounding illuminance, wherein the control means changes the display mode of the display means in accordance with the surrounding illuminance measured by the illuminance measurement means.
6. The robot-type interface device according to any one of claims 1 to 4, further comprising voice input means for inputting and recognizing a user's voice, wherein the control means starts the presentation of information by the display means when it determines that the recognition result by the voice input means is a first predetermined command.
7. The robot-type interface device according to claim 6, wherein the control means ends the presentation of information by the display means when the recognition result by the voice input means is a second predetermined command.
8. The robot-type interface device according to claim 6 or 7, further comprising illuminance measurement means for measuring surrounding illuminance, wherein the control means ends the presentation of information by the display means when the surrounding illuminance measured by the illuminance measurement means is equal to or less than a predetermined value.
9. The robot-type interface device according to any one of claims 1 to 8, further comprising a cover provided so as to cover the display means, wherein the second driving means lifts and moves the display means after opening the cover.
10. The robot-type interface device according to any one of claims 1 to 9, further comprising receiving means for receiving information from outside, wherein the display means displays the information received by the receiving means.
11. A method of controlling a robot-type interface device including a display device for presenting information to a user in accordance with a surrounding state, comprising the steps of: detecting a state around the robot-type interface device; moving the robot-type interface device in accordance with the detected surrounding state; and changing at least one of a relative position and a relative angle of the display device with respect to the robot-type interface device to present information to the user.
JP2001166227A 2001-06-01 2001-06-01 Robot type interface device and control method for the same Withdrawn JP2002355779A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2001166227A JP2002355779A (en) 2001-06-01 2001-06-01 Robot type interface device and control method for the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2001166227A JP2002355779A (en) 2001-06-01 2001-06-01 Robot type interface device and control method for the same

Publications (1)

Publication Number Publication Date
JP2002355779A (en) 2002-12-10

Family

ID=19008790

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2001166227A Withdrawn JP2002355779A (en) 2001-06-01 2001-06-01 Robot type interface device and control method for the same

Country Status (1)

Country Link
JP (1) JP2002355779A (en)


Cited By (79)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10315312B2 (en) 2002-07-25 2019-06-11 Intouch Technologies, Inc. Medical tele-robotic system with a master remote station with an arbitrator
US9849593B2 (en) 2002-07-25 2017-12-26 Intouch Technologies, Inc. Medical tele-robotic system with a master remote station with an arbitrator
USRE45870E1 (en) 2002-07-25 2016-01-26 Intouch Technologies, Inc. Apparatus and method for patient rounding with a remote controlled robot
EP1594660A4 (en) * 2003-01-15 2009-03-11 Intouch Technologies Inc Five degrees of freedom mobile robot
JP2005161414A (en) * 2003-11-28 2005-06-23 Kawada Kogyo Kk Electric equipment apparatus cooling structure of anthropomorphic robot
JP2005161417A (en) * 2003-11-28 2005-06-23 Kawada Kogyo Kk Electric equipment apparatus mounting structure of anthropomorphic robot
US9956690B2 (en) 2003-12-09 2018-05-01 Intouch Technologies, Inc. Protocol for a remotely controlled videoconferencing robot
US9375843B2 (en) 2003-12-09 2016-06-28 Intouch Technologies, Inc. Protocol for a remotely controlled videoconferencing robot
US8077963B2 (en) 2004-07-13 2011-12-13 Yulun Wang Mobile robot with a head-based movement mapping scheme
US8401275B2 (en) 2004-07-13 2013-03-19 Intouch Technologies, Inc. Mobile robot with a head-based movement mapping scheme
US8983174B2 (en) 2004-07-13 2015-03-17 Intouch Technologies, Inc. Mobile robot with a head-based movement mapping scheme
US10241507B2 (en) 2004-07-13 2019-03-26 Intouch Technologies, Inc. Mobile robot with a head-based movement mapping scheme
US9766624B2 (en) 2004-07-13 2017-09-19 Intouch Technologies, Inc. Mobile robot with a head-based movement mapping scheme
KR100629119B1 (en) 2005-04-20 2006-09-27 주식회사 유진로봇 Device for opening and closing lcd folder of robot and method for control the same
EP2281668A1 (en) * 2005-09-30 2011-02-09 iRobot Corporation Companion robot for personal interaction
US10259119B2 (en) 2005-09-30 2019-04-16 Intouch Technologies, Inc. Multi-camera mobile teleconferencing platform
US7957837B2 (en) 2005-09-30 2011-06-07 Irobot Corporation Companion robot for personal interaction
US8583282B2 (en) 2005-09-30 2013-11-12 Irobot Corporation Companion robot for personal interaction
US10661433B2 (en) 2005-09-30 2020-05-26 Irobot Corporation Companion robot for personal interaction
US9878445B2 (en) 2005-09-30 2018-01-30 Irobot Corporation Displaying images from a robot
US8935006B2 (en) 2005-09-30 2015-01-13 Irobot Corporation Companion robot for personal interaction
US9198728B2 (en) 2005-09-30 2015-12-01 Intouch Technologies, Inc. Multi-camera mobile teleconferencing platform
US9796078B2 (en) 2005-09-30 2017-10-24 Irobot Corporation Companion robot for personal interaction
US9446510B2 (en) 2005-09-30 2016-09-20 Irobot Corporation Companion robot for personal interaction
US9452525B2 (en) 2005-09-30 2016-09-27 Irobot Corporation Companion robot for personal interaction
US8195333B2 (en) 2005-09-30 2012-06-05 Irobot Corporation Companion robot for personal interaction
US8849679B2 (en) 2006-06-15 2014-09-30 Intouch Technologies, Inc. Remote controlled robot system that provides medical images
US9160783B2 (en) 2007-05-09 2015-10-13 Intouch Technologies, Inc. Robot system that operates through a network firewall
US10682763B2 (en) 2007-05-09 2020-06-16 Intouch Technologies, Inc. Robot system that operates through a network firewall
US9757861B2 (en) 2007-12-18 2017-09-12 Samsung Electronics Co., Ltd. User interface device of remote control system for robot device and method using the same
US9002520B2 (en) 2007-12-18 2015-04-07 Samsung Electronics Co., Ltd. User interface device of remote control system for robot device and method using the same
KR101408657B1 (en) 2007-12-18 2014-06-17 삼성전자주식회사 Apparatus and method for user interface of remote control system for Robot
US10471588B2 (en) 2008-04-14 2019-11-12 Intouch Technologies, Inc. Robotic based health care system
US8861750B2 (en) 2008-04-17 2014-10-14 Intouch Technologies, Inc. Mobile tele-presence system with a microphone system
US10493631B2 (en) 2008-07-10 2019-12-03 Intouch Technologies, Inc. Docking system for a tele-presence robot
US9193065B2 (en) 2008-07-10 2015-11-24 Intouch Technologies, Inc. Docking system for a tele-presence robot
US9842192B2 (en) 2008-07-11 2017-12-12 Intouch Technologies, Inc. Tele-presence robot system with multi-cast features
KR100991193B1 (en) * 2008-07-31 2010-11-05 주식회사 유진로봇 System and method of robot for controling gradient
US9429934B2 (en) 2008-09-18 2016-08-30 Intouch Technologies, Inc. Mobile videoconferencing robot system with network adaptive driving
US8340819B2 (en) 2008-09-18 2012-12-25 Intouch Technologies, Inc. Mobile videoconferencing robot system with network adaptive driving
US8996165B2 (en) 2008-10-21 2015-03-31 Intouch Technologies, Inc. Telepresence robot with a camera boom
US9138891B2 (en) 2008-11-25 2015-09-22 Intouch Technologies, Inc. Server connectivity control for tele-presence robot
US8463435B2 (en) 2008-11-25 2013-06-11 Intouch Technologies, Inc. Server connectivity control for tele-presence robot
US10059000B2 (en) 2008-11-25 2018-08-28 Intouch Technologies, Inc. Server connectivity control for a tele-presence robot
US8849680B2 (en) 2009-01-29 2014-09-30 Intouch Technologies, Inc. Documentation through a remote presence robot
US9602765B2 (en) 2009-08-26 2017-03-21 Intouch Technologies, Inc. Portable remote presence robot
US10404939B2 (en) 2009-08-26 2019-09-03 Intouch Technologies, Inc. Portable remote presence robot
US9089972B2 (en) 2010-03-04 2015-07-28 Intouch Technologies, Inc. Remote presence system including a cart that supports a robot face and an overhead camera
US8670017B2 (en) 2010-03-04 2014-03-11 Intouch Technologies, Inc. Remote presence system including a cart that supports a robot face and an overhead camera
US9902069B2 (en) 2010-05-20 2018-02-27 Irobot Corporation Mobile robot system
US10343283B2 (en) 2010-05-24 2019-07-09 Intouch Technologies, Inc. Telepresence robot system that can be accessed by a cellular phone
US10218748B2 (en) 2010-12-03 2019-02-26 Intouch Technologies, Inc. Systems and methods for dynamic bandwidth allocation
US10399223B2 (en) 2011-01-28 2019-09-03 Intouch Technologies, Inc. Interfacing with a mobile telepresence robot
US9323250B2 (en) 2011-01-28 2016-04-26 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
US10591921B2 (en) 2011-01-28 2020-03-17 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
US9785149B2 (en) 2011-01-28 2017-10-10 Intouch Technologies, Inc. Time-dependent navigation of telepresence robots
US9469030B2 (en) 2011-01-28 2016-10-18 Intouch Technologies Interfacing with a mobile telepresence robot
US9974612B2 (en) 2011-05-19 2018-05-22 Intouch Technologies, Inc. Enhanced diagnostics for a telepresence robot
US10331323B2 (en) 2011-11-08 2019-06-25 Intouch Technologies, Inc. Tele-presence system with a user interface that displays different communication links
US9715337B2 (en) 2011-11-08 2017-07-25 Intouch Technologies, Inc. Tele-presence system with a user interface that displays different communication links
US8902278B2 (en) 2012-04-11 2014-12-02 Intouch Technologies, Inc. Systems and methods for visualizing and managing telepresence devices in healthcare networks
US10328576B2 (en) 2012-05-22 2019-06-25 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
US9174342B2 (en) 2012-05-22 2015-11-03 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
US9361021B2 (en) 2012-05-22 2016-06-07 Irobot Corporation Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US10061896B2 (en) 2012-05-22 2018-08-28 Intouch Technologies, Inc. Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US10603792B2 (en) 2012-05-22 2020-03-31 Intouch Technologies, Inc. Clinical workflows utilizing autonomous and semiautonomous telemedicine devices
US10658083B2 (en) 2012-05-22 2020-05-19 Intouch Technologies, Inc. Graphical user interfaces including touchpad driving interfaces for telemedicine devices
US9776327B2 (en) 2012-05-22 2017-10-03 Intouch Technologies, Inc. Social behavior rules for a medical telepresence robot
JP2014050472A (en) * 2012-09-05 2014-03-20 Toshiba Tec Corp Robot control device and program
US10334205B2 (en) 2012-11-26 2019-06-25 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
US9098611B2 (en) 2012-11-26 2015-08-04 Intouch Technologies, Inc. Enhanced video interaction for a user interface of a telepresence network
CN104723344A (en) * 2015-03-17 2015-06-24 江门市东方智慧物联网科技有限公司 Smart home service robot system
KR20180074141A (en) * 2016-12-23 2018-07-03 엘지전자 주식회사 Moving robot
EP3338962A1 (en) * 2016-12-23 2018-06-27 LG Electronics Inc. Guide robot
KR102070210B1 (en) * 2016-12-23 2020-01-28 엘지전자 주식회사 Moving robot
EP3563979A1 (en) * 2016-12-23 2019-11-06 LG Electronics Inc. Guide robot
EP3563981A1 (en) * 2016-12-23 2019-11-06 LG Electronics Inc. Guide robot
KR102088456B1 (en) 2018-02-27 2020-03-12 주식회사 알아이파워 Display Mounting Apparatus And The Method Thereof
KR20190102792A (en) * 2018-02-27 2019-09-04 주식회사 알아이파워 Display Mounting Apparatus And The Method Thereof


Legal Events

Date Code Title Description
A300 Withdrawal of application because of no request for examination

Free format text: JAPANESE INTERMEDIATE CODE: A300

Effective date: 20080805