CN112912152B - Communication robot - Google Patents

Communication robot

Info

Publication number
CN112912152B
CN112912152B (application CN201980068901.5A)
Authority
CN
China
Prior art keywords
person
robot
unit
distance
guidance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201980068901.5A
Other languages
Chinese (zh)
Other versions
CN112912152A (en)
Inventor
山本晃弘
中村亮介
网野梓
上田泰士
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Publication of CN112912152A publication Critical patent/CN112912152A/en
Application granted granted Critical
Publication of CN112912152B publication Critical patent/CN112912152B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H11/00 Self-movable toy figures
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
  • Toys (AREA)

Abstract

The communication robot of the present invention includes: a drive unit that moves at least one of a component attached to the housing and the housing itself; a voice acquisition unit that acquires the voices of people around the housing; a voice generation unit that generates voice; a person detection unit that detects the presence of a person; and a calculation unit that calculates the position or distance of the person detected by the person detection unit. The operation of the component or the housing by the drive unit is then controlled under preset execution conditions based on the position or distance of the person calculated by the calculation unit. With this control, guidance and the like can be performed appropriately even when communication such as dialogue is not possible.

Description

Communication robot
Technical Field
The present invention relates to a communication robot.
Background
In recent years, various communication robots that communicate with users have been developed. For example, patent document 1 describes a technique for a robot control device applicable to a communication robot.
That is, in patent document 1, the robot control device determines an action to be performed toward a person (user) and causes the communication robot to execute it. When a reaction to the action from the person is detected, the robot control device determines, based on the reaction, the possibility of the person speaking to the robot, and controls the action pattern of the communication robot based on the determination result.
Documents of the prior art
Patent document
Patent document 1: WO2016/132729 publication
Disclosure of Invention
Technical problem to be solved by the invention
However, in the conventionally proposed techniques, the action is performed only when a person is detected approaching a position near the robot where a conversation is possible, and no consideration is given to behavior when no person is present near the robot.
In particular, in the case of a stationary communication robot that cannot move from its installation position, the area it can serve is limited to the periphery of that position. Therefore, it may be difficult to provide a stationary communication robot with functions such as reception or guidance.
In addition, even in the case of a mobile communication robot that can travel, the same problem as with a stationary communication robot occurs because of limits on the range in which the robot can travel.
The purpose of the present invention is to provide a communication robot that can appropriately perform guidance and the like even when interactive communication such as dialogue is not possible.
Technical scheme for solving technical problems
In order to solve the above problem, for example, the structure described in the claims is adopted.
The present application includes a plurality of means for solving the above technical problem. A communication robot according to the present invention includes: a drive unit that moves at least one of a component mounted on a housing and the housing itself; a voice acquisition unit that acquires the voice of a person around the housing; and a voice generation unit that generates voice.
The communication robot of the present invention further includes: a person detection unit that detects the presence of a person based on a detection signal of a detection sensor mounted on the housing; a calculation unit that calculates the position or distance of the person detected by the person detection unit; and an operation control unit that controls the operation of the component or the housing by the drive unit under preset execution conditions based on the position or distance of the person calculated by the calculation unit.
According to the present invention, people around the communication robot can be guided by the movement of the housing or components of the communication robot.
Problems, structures, and effects other than those described above will be more apparent from the following description of the embodiments.
Drawings
Fig. 1 is a front view showing an example of the appearance of a communication robot according to a first embodiment of the present invention.
Fig. 2 is a block diagram showing an example of the internal configuration of the communication robot according to the first embodiment of the present invention.
Fig. 3 is a block diagram showing a configuration example of a drive unit of the communication robot according to the first embodiment of the present invention.
Fig. 4 is a plan view showing an example of the appearance of the communication robot according to the first embodiment of the present invention.
Fig. 5 is a side view showing an example of the appearance of the communication robot according to the first embodiment of the present invention.
Fig. 6 is a front view showing an example of a state in which the communication robot according to the first embodiment of the present invention opens its arms.
Fig. 7 is a front view showing an example of the face of the communication robot according to the first embodiment of the present invention.
Fig. 8 is an explanatory diagram showing examples of changing the expression of the face shown in fig. 7.
Fig. 9 is a flowchart showing an operation control example according to the first embodiment of the present invention.
Fig. 10 is a characteristic diagram showing an example of control of the amplitude and the cycle in the first embodiment of the present invention.
Fig. 11 is a block diagram showing an example of the internal configuration of a communication robot according to a second embodiment of the present invention.
Fig. 12 is a flowchart showing an operation control example according to the second embodiment of the present invention.
Fig. 13 is a block diagram showing an example of the internal configuration of a communication robot according to a third embodiment of the present invention.
Fig. 14 is an explanatory diagram showing an example of group determination in the third embodiment of the present invention.
Fig. 15 is a flowchart (part 1) showing an operation control example according to the third embodiment of the present invention.
Fig. 16 is a flowchart (part 2) showing an operation control example according to the third embodiment of the present invention.
Fig. 17 is an explanatory diagram showing an example of the guidance range of the communication robot.
Detailed Description
<1. First embodiment>
Hereinafter, a first embodiment of the present invention will be described in detail with reference to fig. 1 to 10.
[1-1. Structure of the communication robot]
Fig. 1 shows an example of the external shape of a communication robot 1 according to the first embodiment. The communication robot 1 of the present embodiment is installed, for example, at a reception site near the entrance of a building, where the direction from which people arrive and the area in which they are present are fixed. The communication robot 1 is a stationary robot that cannot move autonomously. However, the stationary type is merely an example, and the communication robot 1 may be provided with wheels or legs for walking.
The communication robot 1 includes a head 2, a trunk upper portion 3, a trunk lower portion 4, and a waist portion 5 as a housing constituting a robot main body.
A face 2a and a dummy right eye 8R and a dummy left eye 8L representing eyes on the surface of the face 2a are attached to the head 2. Further, a right arm portion 6R and a left arm portion 6L are mounted on the trunk upper portion 3.
The dummy right eye 8R and left eye 8L, the trunk upper portion 3, the trunk lower portion 4, the right arm portion 6R, and the left arm portion 6L are each configured as movable parts. Each of these movable parts is driven by a drive unit 21 installed inside the communication robot 1. Examples of moving the movable parts will be described later with reference to figs. 4 to 8.
Display units 7R, 7L, 7ER, and 7EL are disposed on various parts of the communication robot 1. That is, the display units 7R and 7L are disposed on the right arm portion 6R and the left arm portion 6L, and the display units 7ER and 7EL are disposed on the head 2. The display units 7R, 7L, 7ER, and 7EL are formed of light emitting diodes (LEDs), and the portions where they are arranged emit light of predetermined colors. The emission colors, emission luminances, and blinking cycles of the display units 7R, 7L, 7ER, and 7EL are set by instructions from an operation plan execution unit 75, described later with fig. 2. In addition to the display units 7R, 7L, 7ER, and 7EL, a display unit capable of displaying character messages, images, and the like may also be provided.
The detection sensor 10 is disposed on the head 2 of the communication robot 1. The detection sensor 10 is a sensor for detecting people around the communication robot 1. As the detection sensor 10, a stereo camera, a depth sensor, an RGB camera, an infrared distance sensor, a range sensor, or the like can be used. The detection sensor 10 may also be composed of a plurality of such cameras or sensors.
The voice acquisition unit 9, which is composed of a microphone array including a plurality of microphones and the like, is attached to the trunk lower portion 4 of the communication robot 1 and acquires the speech of people around the communication robot 1. The voice generation unit 11, which is composed of a speaker, a buzzer, and the like, is attached to the head 2 of the communication robot 1 and outputs voice, alarm sounds, and the like. The communication robot 1 can communicate with surrounding people through the voice acquisition processing in the voice acquisition unit 9 and the voice generation processing in the voice generation unit 11.
Fig. 2 shows an example of the internal configuration of the motion control system of the communication robot 1.
The communication robot 1 includes an arithmetic processing unit 70 as its motion control system. The arithmetic processing unit 70 is composed of a computer device, a storage device, and the like, and controls the operation of the communication robot 1 by executing installed programs (software). The internal configuration of the arithmetic processing unit 70 shown in fig. 2 is the configuration as seen from the software functions executed by the arithmetic processing unit 70.
The output signal of the detection sensor 10 and the voice signal acquired by the voice acquisition unit 9 are supplied to the arithmetic processing unit 70. Further, the display units 7R, 7L, 7ER, and 7EL display the results of the calculation by the calculation processing unit 70. Further, based on the calculation result, the voice output in the voice generation unit 11 and the driving of each movable unit by the driving unit 21 are performed.
The arithmetic processing unit 70 includes a human detection unit 71, a position calculation unit 72, a speech/language processing unit 73, an operation control unit 74, an operation plan execution unit 75, and a language response database unit 76.
The person detection section 71 detects the face of a person based on the output signal of the detection sensor 10.
The position calculating unit 72 calculates the distance to the person whose face is detected by the person detecting unit 71 or the position where the person is present. Specifically, in the position calculation section 72, when the face of a person is detected, the position of the face of the person is acquired.
Here, as will be described later with fig. 10, DF is the distance between the person's face and the communication robot 1, and ΦF is the deviation angle between the front direction of the communication robot 1 and the direction in which the person's face is located.
Further, the position calculation unit 72 has a memory function that stores the distance DF and the deviation angle ΦF of the detected face, and can calculate the distance change ΔDF and the deviation-angle change ΔΦF from the differences from the previously recorded values.
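As a concrete illustration, the following is a minimal sketch, in Python, of such a tracking memory; the class and variable names are illustrative assumptions and do not appear in the patent.

class FaceTracker:
    # Illustrative sketch of the position calculation unit's memory function:
    # it stores the last distance DF and deviation angle PhiF and derives the
    # changes dDF and dPhiF as differences from the previously recorded values.
    def __init__(self):
        self.prev_df = None
        self.prev_phi_f = None

    def update(self, df, phi_f):
        # On the first observation there is no previous value, so the
        # changes are reported as zero.
        d_df = 0.0 if self.prev_df is None else df - self.prev_df
        d_phi_f = 0.0 if self.prev_phi_f is None else phi_f - self.prev_phi_f
        self.prev_df, self.prev_phi_f = df, phi_f
        return d_df, d_phi_f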
The speech/language processing unit 73 determines whether language is present based on the voice signal acquired by the voice acquisition unit 9, and when language is determined to be present, records the language information as the content of the person's utterance. The speech/language processing unit 73 also determines the type of language (Japanese, English, etc.) contained in the voice signal acquired by the voice acquisition unit 9, and records information on the determined language type.
The operation control unit 74 selects an operation of the communication robot 1 based on the determination of whether a person is detected and on the position of the person's face and its change, and controls the execution conditions of the operation. The operations performed under the control of the operation control unit 74 include a response action for communicating with people around the communication robot 1 and a guidance action for guiding people outside the range in which communication is possible into the area in which communication is possible.
The range in which communication is possible is, for example, a range of about several meters from the communication robot 1. It is also the range within which the voice acquisition unit 9 can pick up a person's speech.
The language response database unit 76 registers response actions, such as speech actions, movable-part actions, and display actions, for the content of acquired language (conversation content); a response action corresponding to the language acquired by the speech/language processing unit 73 is read from the language response database unit 76.
The operation plan execution unit 75 calculates various commands regarding the operation based on the operation selected by the operation control unit 74 and its execution conditions. The operation selected by the operation control unit 74 and the various commands calculated by the operation plan execution unit 75 are then output to the display units 7R, 7L, 7ER, and 7EL, the voice generation unit 11, and the drive unit 21.
Fig. 3 shows the structure of the drive unit 21. The drive unit 21 includes a movable-unit motor control unit 21a, a movable-unit motor drive unit 21b, a movable-unit motor 21c, and a movable-unit position sensor 21d. A drive unit 21 as shown in fig. 3 is prepared for each movable part.
The movable-unit motor control unit 21a calculates a control signal based on an operation command input to the drive unit 21; the movable-unit motor drive unit 21b converts the control signal into a corresponding voltage and applies it to the movable-unit motor 21c. The operation angle of the movable part is detected by the movable-unit position sensor 21d, and the detection result is fed back to the movable-unit motor control unit 21a, thereby controlling the operation of the movable-unit motor 21c.
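By way of illustration, one iteration of this feedback loop could look like the following sketch; the proportional control law, gain, and voltage limit are assumptions for illustration, since the patent does not specify the control computation.

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def control_step(commanded_angle, sensed_angle, kp=2.0, v_max=24.0):
    # Error between the operation command and the angle reported by the
    # movable-unit position sensor 21d (the feedback path).
    error = commanded_angle - sensed_angle
    # Control signal computed by the motor control unit 21a; a simple
    # proportional law is assumed here.
    control_signal = kp * error
    # The motor drive unit 21b converts the control signal into a voltage
    # applied to the movable-unit motor 21c.
    return clamp(control_signal, -v_max, v_max)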
[1-2. Operation examples of the communication robot]
Next, examples of the actions performed by the communication robot 1 will be described with reference to figs. 4 to 8. The three axes (x-axis, y-axis, and z-axis) used in the following description are defined as shown at the lower right of fig. 1. That is, the left-right direction of the communication robot 1 in the horizontal plane is the x-axis, the front-back direction in the horizontal plane is the y-axis, and the up-down (vertical) direction is the z-axis.
Fig. 4 shows the movable state of the trunk lower portion 4 of the communication robot 1.
As shown in fig. 4, the trunk lower portion 4 of the communication robot 1 is rotatable about the z-axis with respect to the waist portion 5. In the example shown in fig. 4, the communication robot 1 is shown with the trunk lower portion 4 rotated to the right with respect to the waist portion 5 by a rotation angle θA (shown as the angle 52), about the movable center point 51.
Fig. 5 shows the movable state of the trunk upper portion 3 of the communication robot 1.
As shown in fig. 5, the trunk upper portion 3 is movable about the x-axis relative to the trunk lower portion 4, and the trunk upper portion 3 can perform an action that looks like a bow when viewed from the front of the communication robot 1. In the example illustrated in fig. 5, the trunk upper portion 3 is tilted relative to the trunk lower portion 4 by a bow angle θB (shown as the angular movement 54), about the movable center point 53.
Fig. 6 shows the movable state of the right arm portion 6R and the left arm portion 6L of the communication robot 1.
As shown in fig. 6, the right arm portion 6R and the left arm portion 6L are movable about the rotation shafts 55R and 55L, respectively, with respect to the trunk upper portion 3, and can perform an arm-raising action. By rotating about the rotation shafts 55R and 55L, the right arm portion 6R and the left arm portion 6L move to the positions of the right arm portion 6RA and the left arm portion 6LA. In the state shown in fig. 6, the arms are raised by the operation angles 56R and 56L relative to the positions of the right arm portion 6R and the left arm portion 6L shown by the phantom lines.
Figs. 7 and 8 show the movable state of the right eye 8R and the left eye 8L on the face 2a of the communication robot 1.
As shown in fig. 7, the right eye 8R and the left eye 8L are movable along the face 2a on the head 2, so that the eyes can be tilted. In the state shown in fig. 7, the right eye 8R and the left eye 8L are tilted by the eye inclination angles 58R (θDR) and 58L (θDL) about the movable center points 57R and 57L, respectively.
Further, as shown in fig. 8, the inclinations of the eyes can be made left-right symmetric. That is, by setting θDR equal to θDL in fig. 7, the expression of the communication robot can be changed through the inclination of the eyes. In the example shown in fig. 8(a), the outer ends of the right eye 8R and the left eye 8L are raised, and in the example shown in fig. 8(b), the outer ends of the right eye 8R and the left eye 8L are lowered.
As another example, as shown in fig. 1, the right eye 8R and the left eye 8L can each be put in a vertically elongated state.
The face 2a can thus be given various expressions by the movements of the right eye 8R and the left eye 8L, as shown in fig. 7, fig. 8, or fig. 1.
In the following description, the operation angle in each operation direction shown in figs. 4 to 7 is taken as positive (plus), and the angle in the opposite direction as negative (minus).
[1-3. Operation control example]
Fig. 9 is a flowchart showing an example of the control process executed by the operation control unit 74.
First, the operation control unit 74 acquires the language information detected by the speech/language processing unit 73 (step S11). The operation control unit 74 then determines whether language information is present in the acquired information (step S12). If language information is present (yes in step S12), the processing of the flowchart in fig. 9 ends, and the process shifts to communication processing such as guidance and dialogue based on the detected language information. Although the details of the communication processing are omitted here, it is processing that, for example, answers an inquiry from a person or guides an approaching person to a destination.
When it is determined in step S12 that there is no language information (no in step S12), the operation controller 74 refers to the result of detection of a person acquired by the person detector 71 (step S13). Then, the operation control unit 74 determines whether or not the face of the person is present based on the detection result of the person detection unit 71 (step S14).
In the determination of step S14, whether a person's face is present within a certain distance (for example, within about several meters) of the communication robot 1 is determined based on the information on the position or distance of the person calculated by the position calculation unit 72. The determination of being within the predetermined distance corresponds to determining whether the person is within the range in which communication is possible.
When it is determined in step S14 that no person's face is present (no in step S14), the operation control unit 74 selects a preset search action that changes the detection direction of the detection sensor 10 (step S15). Here, assuming that the communication robot 1 is installed at a reception desk or the like near an entrance where the direction from which people come is fixed, the operation control unit 74 performs the bowing action (see fig. 5) and changes the bow angle θB (step S16).
That is, the bow angle θB shown in fig. 5 is changed to (θB + ΔθB1). ΔθB1 is set in advance to a positive value, so the angle first changes toward the forward lean. When the bow angle θB exceeds the action-angle limit for the forward lean, the sign is reversed and the bow angle changes toward the backward lean. When the action-angle limit for the backward lean is exceeded, the sign is reversed again and the bow angle changes toward the forward lean. This operation is repeated each time step S16 is executed.
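A sketch of this sweep is shown below; the limit values and step size are illustrative parameters, and the helper name is an assumption.

def update_bow_angle(theta_b, delta_theta_b1, limit_forward, limit_backward):
    # Advance the bow angle by the preset increment (step S16).
    theta_b += delta_theta_b1
    # On passing either action-angle limit, reverse the sign of the
    # increment and step back inside the movable range, so the housing
    # alternates between forward and backward lean.
    if theta_b > limit_forward or theta_b < limit_backward:
        delta_theta_b1 = -delta_theta_b1
        theta_b += 2 * delta_theta_b1
    return theta_b, delta_theta_b1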
When it is determined in step S14 that a person's face is present (yes in step S14), the operation control unit 74 selects a preset action for guiding the person (step S17). To guide the person, the operation control unit 74 refers to the distance DF between the communication robot 1 and the person's face and its change ΔDF, which are acquired and stored by the position calculation unit 72 (step S18). The operation control unit 74 then determines whether the distance DF is within the distance D1 at which a response to the person is possible (step S19).
When it is determined in step S19 that the distance DF is not within the response-able distance D1 (no in step S19), the operation control unit 74 changes the magnitude and cycle of the action based on the distance DF to the person's face and its change ΔDF, and executes the guidance action (step S20). When the distance DF is large and the change ΔDF is positive and large, the action is changed so that its magnitude becomes larger and its cycle faster; this corresponds to a person who is far away and moving farther away.
Conversely, when the distance DF is small and the change ΔDF is negative, that is, when a person is nearby and moving closer, the action is changed so that its magnitude becomes smaller and its cycle slower.
When it is determined in step S19 that the distance DF is within the response-able distance D1 (yes in step S19), the operation control unit 74 sets the magnitude and cycle of the action for the response action (step S21). At this time, the operation control unit 74 reduces the magnitude of the action and slows its cycle to the extent that the voice and language determination by the speech/language processing unit 73 is not obstructed during the response.
Fig. 10 shows an example of the processing that changes the magnitude and cycle of the guidance action in step S20 of the flowchart of fig. 9. The left side of fig. 10 shows the relationship between the raising action angle θCR of the right arm portion 6R (e.g., the angle 56R shown in fig. 6) and the distance DF to the person's face; the vertical axis shows the raising action angle and the horizontal axis the distance to the face.
The right side of fig. 10 shows the relationship between the cycle of the raising action of the right arm portion 6R and the change ΔDF in the distance to the person's face; the vertical axis shows the cycle of the action and the horizontal axis the change in distance.
For example, the raising action angle θCR of the right arm portion 6R is given by
θCR = θCR0 × sin(2πt/TCR0).
The raising action angle θCR is the action angle during raising, exemplified by the operation angle 56R of the right arm portion 6R shown in fig. 6. θCR0 is the amplitude of the variation of the raising angle θCR, and TCR0 is the cycle with which the raising of the right arm portion 6R repeats.
The amplitude θ CR0 and the period TCR0 change as shown in fig. 10.
That is, when the distance DF between the communication robot 1 and the person is equal to or less than the first distance D1 shown in fig. 10, the amplitude θCR0 is 0. As the distance DF increases from the first distance D1 toward the second distance D2, the amplitude θCR0 gradually increases, reaching the movable limit value θCR2 at the second distance D2; at distances equal to or greater than D2, the amplitude is held at the movable limit value θCR2.
As for the cycle TCR0, it takes the maximum cycle TCR2 when the distance change ΔDF is at the specific negative value ΔD3, and the minimum cycle (movable limit value) TCR1 when ΔDF is at the specific positive value ΔD4.
When the change in distance lies between ΔD3 and ΔD4, the cycle varies gradually between the maximum cycle TCR2 and the minimum cycle TCR1, as shown in fig. 10; when the distance change is ΔD4 or more, the minimum cycle TCR1 is used. The amplitude and cycle of the left arm portion 6L, the trunk upper portion 3, the right eye 8R, and the left eye 8L can each be changed in the same way based on the position of the person's face.
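The piecewise mapping of fig. 10 can be summarized by the following sketch; the thresholds D1, D2, ΔD3, and ΔD4 and the limit values θCR2, TCR1, and TCR2 are passed in as parameters, since their concrete values are installation-dependent.

import math

def lift_amplitude(df, d1, d2, theta_cr2):
    # Left plot of fig. 10: zero amplitude up to D1, linear ramp between
    # D1 and D2, saturation at the movable limit theta_CR2 beyond D2.
    if df <= d1:
        return 0.0
    if df >= d2:
        return theta_cr2
    return theta_cr2 * (df - d1) / (d2 - d1)

def lift_cycle(d_df, d3, d4, t_cr1, t_cr2):
    # Right plot of fig. 10: maximum cycle T_CR2 at the negative change
    # dD3, minimum cycle T_CR1 at the positive change dD4, and a linear
    # transition in between.
    if d_df <= d3:
        return t_cr2
    if d_df >= d4:
        return t_cr1
    return t_cr2 + (t_cr1 - t_cr2) * (d_df - d3) / (d4 - d3)

def lift_angle(t, theta_cr0, t_cr0):
    # theta_CR = theta_CR0 * sin(2*pi*t / T_CR0), as given in the text.
    return theta_cr0 * math.sin(2.0 * math.pi * t / t_cr0)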
For the display units 7R, 7L, 7ER, and 7EL, when the distance DF is large and its change ΔDF is positive and large, the luminance is made brighter and the blinking cycle faster. Conversely, when the distance DF is small and the change ΔDF is negative, the luminance is made darker and the blinking cycle slower.
Likewise, for the voice generation unit 11, when the distance DF is large and the distance change ΔDF is positive and large, the buzzer sound is made louder and the buzzer cycle faster. Conversely, when the distance DF is small and the distance change ΔDF is negative, the buzzer sound is made quieter and the buzzer cycle slower.
When the inclination angles of the right eye 8R and the left eye 8L are set to be left-right symmetric (θDR = θDL), an expression can be formed by the right eye 8R and the left eye 8L.
For example, with DF0 defined such that D1 < DF0, let θDR = K1 × (DF − DF0) + K2 × ΔDF. Then, when the distance DF is large and the distance change ΔDF is positive and large, the face takes on an angry expression as shown in fig. 8(a); when the distance DF is small and the distance change ΔDF is negative, it takes on a mild expression as shown in fig. 8(b). At the distance DF0, the eyes do not rotate and the face is expressionless, as shown in fig. 1.
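In code form, the eye-tilt rule reads as follows; the gains K1 and K2 and the reference distance DF0 are tuning parameters left unspecified in the text, and the function name is an assumption.

def eye_tilt(df, d_df, df0, k1, k2):
    # theta_DR = K1 * (DF - DF0) + K2 * dDF; theta_DL is set equal so the
    # expression stays left-right symmetric. A positive result raises the
    # outer ends of the eyes (fig. 8(a)), a negative result lowers them
    # (fig. 8(b)), and zero leaves the neutral face of fig. 1.
    theta_dr = k1 * (df - df0) + k2 * d_df
    return theta_dr, theta_dr  # (theta_DR, theta_DL)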
Further, a speech action prompting guidance may be performed during the processing of step S20, and a speech action prompting a dialogue during the processing of step S21.
As described above, according to the communication robot 1 of the present embodiment, a person outside the range in which communication such as conversation is possible can be actively detected and guided into the range in which communication is possible. In particular, when guiding a detected person, changing the amplitude and cycle with which the arms and other parts are moved according to the detected distance makes it possible to show a reaction during guidance that keeps the person interested, and to guide the person smoothly into the area that the communication robot 1 can serve. Moreover, in the communication robot 1 of the present embodiment, performing a search action such as the bowing action before the guidance action makes it possible to search effectively for people around the robot.
In addition, during the guidance action, at least one of the magnitude (amplitude) and the cycle of the action is changed according to the change in the distance or position of the detected person, so that guidance into the range where communication takes place can be performed appropriately.
In the communication robot 1 of the present embodiment, guidance can be performed even more effectively by changing the brightness, blinking cycle, and the like of the display units 7R, 7L, 7ER, and 7EL in conjunction with the guidance action. However, interlocking the display state with the other guidance actions is merely one example; the display state need not be changed when the guidance action is performed.
In the communication robot 1 of the present embodiment, appropriate guidance is also supported by moving the right eye 8R and the left eye 8L to give the face an expression during the guidance action. Producing an expression with the eyes is one example; an expression may also be produced by the action of other parts of the robot (components or the housing). Further, the processing that produces an expression during guidance may be omitted, or the expression may be changed only when the guided person approaches within a certain distance of the robot.
<2. Second embodiment>
Next, a second embodiment of the present invention will be described in detail with reference to fig. 11 and 12. In fig. 11 and 12, the same reference numerals are given to portions corresponding to those of fig. 1 to 10 described in the first embodiment, and redundant description is omitted, and in the following description, the differences from the first embodiment will be mainly described.
The communication robot 1 according to the second embodiment has the same outer shape and the same movable parts as the communication robot 1 described in the first embodiment; its operation control system differs from that of the first embodiment.
The communication robot 1 according to the second embodiment is installed in an open environment, such as a station or an airport, where the direction from which people come and the area where they are present are uncertain, and it receives and guides people there.
[2-1. Overall system structure]
Fig. 11 shows an example of the internal configuration of the motion control system of the communication robot 1 according to the second embodiment.
The arithmetic processing unit 70 of the motion control system shown in fig. 11 includes a guidable direction data storage unit 77 in addition to the configuration of the arithmetic processing unit 70 described in the first embodiment.
As noted above, in an open environment such as a station or an airport, there are factors that hinder communication with people, such as the loudspeakers used for announcements. To communicate appropriately and accurately, it is preferable to communicate with a person while avoiding the directions in which such factors lie, and to guide people away from the places where the communication-inhibiting factors are located.
For this reason, the arithmetic processing unit 70 of the present embodiment is provided with the guidable direction data storage unit 77. To avoid the directions around the installation position of the communication robot 1 in which communication-inhibiting factors exist, the guidable direction data storage unit 77 holds data on the directions in which people can be guided (guidable direction data). The guidable direction data is generated, for example, by initial setting when the communication robot 1 is installed. Further, when a situation in which communication is difficult arises during actual operation after installation, the guidable direction data can be updated at any time by removing the position of the person at the time of that situation from the guidable directions.
The operation control unit 74 reads the guidable direction data stored in the guidable direction data storage unit 77.
The guidable direction data is expressed, for example, by the rotation angle θA of the trunk lower portion 4 shown in fig. 4, and the guidable region is set by the guidable angular range θA0 ≦ θA ≦ θA1.
The lower limit θA0 and the upper limit θA1 of this guidable angular range are values set according to the actual installation conditions.
Having read the guidable angular range data stored in the guidable direction data storage unit 77, the operation control unit 74, when a person is detected, performs a guiding action to bring the person into the guidable angular range for communication. In this guiding action, the magnitude and cycle of the rotating action and the bowing action are varied according to how far the person being guided is from the guidable angular range.
The other configuration of the arithmetic processing unit 70 is the same as the arithmetic processing unit 70 shown in fig. 2 described in the first embodiment.
[2-2. Operation control example]
Fig. 12 is a flowchart showing an example of the control processing performed by the operation control unit 74. In the flowchart of fig. 12, the same processing or determination as in the flowchart of fig. 9 described in the first embodiment is assigned the same step number, and redundant description is omitted.
In the operation control shown in fig. 12, the operation control unit 74 determines in step S19 whether the distance DF is within the response-able distance D1; if so (yes in step S19), the process proceeds to step S24, described later. If the distance DF is not within the response-able distance D1 (no in step S19), the process proceeds to step S20, where the operation control unit 74 changes the magnitude and cycle of the action based on the distance DF to the person's face and its change ΔDF, and executes the guidance action. The actions in step S20 are those of the right arm portion 6R, the left arm portion 6L, the right eye 8R, and the left eye 8L.
Thereafter, the process proceeds to step S23, and the operation control unit 74 changes the magnitude and cycle of the rotational action of the trunk lower portion 4 to perform the guiding action. In step S23, the guiding action may also be performed by changing the magnitude and cycle of the bowing action (see fig. 5) of the trunk upper portion 3.
When it is determined in step S19 that the distance DF is within the response-able distance D1, the operation control unit 74 determines in step S24 whether the position of the person is within the range of the guidable direction data. When the position of the person is within that range (yes in step S24), the process proceeds to step S21, and the operation control unit 74 sets the magnitude and cycle of the action for the response action.
When it is determined in step S24 that the position is outside the range of the guidable direction data (no in step S24), the process proceeds to step S23, where the operation control unit 74 changes the magnitude and cycle of the rotational action of the trunk lower portion 4 and performs a guiding action toward the guidable range. In this guiding action, the operation control unit 74 also performs the bowing action (see fig. 5) of the trunk upper portion 3, likewise changing the magnitude and cycle of that action.
The other processing and determination of the flowchart of fig. 12 are the same as those of the flowchart shown in fig. 9 described in the first embodiment.
As shown in fig. 12, when it is determined in step S24 that the position of the person is outside the range of the guidable direction data, the operation control unit 74 performs, through the rotating action of the trunk lower portion 4 in step S23, a guidance action that avoids the directions in which communication-inhibiting factors exist. Since the communication robot 1 of the present embodiment is installed in an open environment such as a station or an airport, the direction from which people come is uncertain, and a person may be present in any rotational direction. The operation control unit 74 therefore causes the communication robot 1 to perform the rotating action of the trunk lower portion 4 and the bowing action, guiding the person into a range where communication can be performed appropriately.
Specifically, in step S23, the rotation angle is changed to θA + ΔθA2 and the bow angle to θB + ΔθB2, where ΔθA2 and ΔθB2 are the amounts of change of the rotation angle and the bow angle.
In this case, the change of the rotation angle θA and the change of the bow angle θB can be performed simultaneously, or they can be performed alternately.
Here, a specific example of the determination in step S24 is described.
First, the position calculation unit 72 calculates the person direction θH (= θA + ΦF) from the deviation angle ΦF shown in fig. 17 and the rotation angle θA of the trunk lower portion 4. The deviation angle ΦF is the angle between the front direction of the communication robot 1 and the direction in which the face of the person 80 lies. Here, as shown in fig. 17, the communication robot 1 and the person 80 are separated by the distance DF.
Then, in step S24, the operation control unit 74 determines whether the person direction θH satisfies the guidable-range condition θA0 ≦ θH ≦ θA1, and when it does, the process proceeds to step S21.
On the other hand, when the condition θA0 ≦ θH ≦ θA1 is not satisfied, the process proceeds to step S23, and the person direction θH is compared with the upper limit θA1 and the lower limit θA0. If θH > θA1, the target guidance direction θHREF is set to θA1; if θH < θA0, the target guidance direction θHREF is set to θA0. In other words, the operation control unit 74 selects the nearer limit value.
The operation control unit 74 then sets the span between the person direction θH and the target guidance direction θHREF as the range of the turning action. That is, the larger the separation from the person in the circumferential direction, the larger the magnitude of the turning action, and the smaller the separation, the smaller the magnitude. The cycle of the turning action is made faster when the change ΔθH is positive and large, and slower when it is negative and large.
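The determination and clamping described above can be sketched as follows; returning None is an illustrative convention meaning the person is already inside the guidable range, and the function name is an assumption.

def guidance_target(theta_a, phi_f, theta_a0, theta_a1):
    # Person direction theta_H = theta_A + PhiF.
    theta_h = theta_a + phi_f
    # Step S24: inside the guidable angular range, respond in place (S21).
    if theta_a0 <= theta_h <= theta_a1:
        return None
    # Step S23: otherwise the nearer limit becomes the target guidance
    # direction theta_HREF.
    return theta_a1 if theta_h > theta_a1 else theta_a0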
Through this processing, people around the communication robot 1 can be appropriately guided from areas where communication is difficult into the serviceable area, so the communication robot 1 can communicate with them smoothly.
In the above example, the directions in which noise or the like is present are treated as the directions in which communication-inhibiting factors exist. Alternatively, for example, when a person is captured by the camera serving as the detection sensor 10, a range in which external light or other illumination puts the person in backlight, so that the face cannot be detected properly, may also be treated as a direction in which a communication-inhibiting factor exists. That is, besides directions in which voice cannot be collected properly, directions in which the camera cannot capture images properly may be set as directions with communication-inhibiting factors.
Further, a plurality of directions or ranges in which communication-inhibiting factors exist may be set in advance, and the direction or range that applies in the current state may be selected from among them according to conditions such as the time of day.
<3. Third embodiment>
Next, a third embodiment of the present invention will be described in detail with reference to fig. 13 to 16. In fig. 13 to 16, the same reference numerals are given to portions corresponding to fig. 1 to 10 described in the first embodiment and fig. 11 to 12 described in the second embodiment, and redundant description is omitted, and in the following description, the differences from the first embodiment will be mainly described.
The communication robot 1 according to the third embodiment has the same outer shape and the same movable parts as the communication robot 1 described in the first embodiment; its operation control system differs from that of the first embodiment.
[3-1. Overall system structure]
Fig. 13 shows an example of the internal configuration of the motion control system of the communication robot 1 according to the third embodiment.
The arithmetic processing unit 70 of the operation control system shown in fig. 13 includes a group determination unit 78 in addition to the configuration of the arithmetic processing unit 70 described in the first embodiment. For example, when the communication robot 1 is installed in a crowded environment such as a station or an airport, the person detection unit 71 may detect the faces of a plurality of people at the same time.
When the faces of a plurality of people are detected by the person detection unit 71, the group determination unit 78 determines whether those people belong to the same group based on the changes in the positions of their faces calculated by the position calculation unit 72, and the like. When the group determination unit 78 determines that they are people of the same group, the operation control unit 74 performs processing to guide the plurality of people of that group collectively.
The other configuration of the arithmetic processing unit 70 is the same as the arithmetic processing unit 70 shown in fig. 2 described in the first embodiment.
[3-2. Group determination example]
Fig. 14 shows an example of the determination processing in the group determination section 78.
The example shown in fig. 14 is a case where two persons 81 and 82 exist around the communication robot 1.
Here, let DFA be the distance between the communication robot 1 and the person 81, DFB the distance between the communication robot 1 and the person 82, DFdiffAB the distance between the person 81 and the person 82, and ΔDFA, ΔDFB, and ΔDFdiffAB their respective changes; the group determination unit 78 acquires these distances and distance changes. The group determination unit 78 then determines whether the person 81 and the person 82 are in the same group based on these distances and changes.
Specifically, a threshold D10 for judging the difference between the distances DFA and DFB, a threshold D11 for judging the distance DFdiffAB between the two persons, and a threshold ΔD10 for judging the difference between the distance changes are set. When the following conditions 1, 2, and 3 are satisfied continuously for a certain time T10, the group determination unit 78 determines that the person 81 and the person 82 are the same group.
Condition 1: |DFA − DFB| < D10
Condition 2: DFdiffAB < D11
Condition 3: |ΔDFA − ΔDFB| < ΔD10
Conversely, when any of conditions 1, 2, and 3 fails to hold continuously for the certain time T10, the group determination unit 78 does not regard the person 81 and the person 82 as the same group.
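A sketch of this test over a window of observations covering the time T10 is given below; the sample layout and function name are assumed conventions.

def same_group(samples, d10, d11, delta_d10):
    # Each sample is (DFA, DFB, DFdiffAB, dDFA, dDFB). The two persons are
    # judged to be one group only if conditions 1-3 hold for every sample
    # across the duration T10.
    return all(
        abs(dfa - dfb) < d10                 # condition 1
        and df_diff_ab < d11                 # condition 2
        and abs(d_dfa - d_dfb) < delta_d10   # condition 3
        for dfa, dfb, df_diff_ab, d_dfa, d_dfb in samples
    )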
[3-3. Operation control example]
Figs. 15 and 16 are flowcharts showing examples of the control processing performed by the operation control unit 74. The position marked [A1] in fig. 15 connects to the position marked [A1] in fig. 16, and the position marked [B1] in fig. 16 connects to the position marked [B1] in fig. 15.
In the flowcharts of fig. 15 and 16, the same processing or determination as in the flowchart of fig. 9 described in the first embodiment is assigned the same step number, and redundant description is omitted.
In the operation control shown in fig. 15, after acquiring the position (distance) information of the face of the person in step S18, the operation control unit 74 determines whether there are a plurality of acquired faces (step S26). Here, when there are a plurality of faces acquired (yes in step S26), the operation control unit 74 proceeds to step S31 in the flowchart of fig. 16.
In step S31, the operation control unit 74 selects as the guidance target the person C closest to the communication robot 1, that is, the person with the smallest distance DF among the detected persons. Then, based on the processing result of the group determination unit 78, the operation control unit 74 determines whether the guidance target person C and the other people around person C are in the same group (step S32).
When the guidance target person C and the surrounding people are in the same group (yes in step S32), the operation control unit 74 calculates the distance DFCG of the group (step S33). Here, the distance DFC of person C, the nearest person in the group, is used as the group distance DFCG.
Then, based on the group distance DFCG, the operation control unit 74 sets the direction range of the guidance target so as to include all the people in the group (step S34).
Here, for example, when the direction angles of the people at the two ends of the group are θCGmin (minimum) and θCGmax (maximum), the operation control unit 74 sets the magnitude of the rotational action of the trunk lower portion 4 between the angle θCGmin and the angle θCGmax.
After setting the direction range of the guidance target, the operation control unit 74 proceeds to the determination of step S19 in fig. 15.
Further, when the faces acquired in step S26 are not plural (no in step S26), and when it is determined in step S32 that the people are not a group (no in step S32), the operation control unit 74 proceeds to the determination of step S19 in fig. 15.
Thus, according to the communication robot 1 of the present embodiment, when the faces of a plurality of people are detected simultaneously, it is determined whether those people form a group, and if so, the group as a whole is guided. Therefore, even when the communication robot 1 is installed in a crowded environment such as a station or an airport, people can be actively detected and smoothly guided into the serviceable area.
<4. Modifications>
The present invention is not limited to the above embodiments, and various modifications are also included.
For example, the processing described in the second embodiment for guiding people away from ranges where communication-inhibiting factors exist may be combined with the processing described in the third embodiment for guiding an entire group.
In the first embodiment, a fixed range of about a few meters is set in advance as the position or distance at which communication is possible, and guidance is performed toward that range. Alternatively, the position or distance at which communication is possible may be set variably according to the conditions of the place where the robot is installed. For example, when the robot is installed in a station, an airport, or the like, the communication-enabled distance may be set to about 3 m when the surrounding noise is low, and to about 1 m when the surrounding noise is high.
Further, the communication-inhibiting factors described in the second embodiment, namely a range in which it is difficult to collect voice and a range in which it is difficult to capture images with the camera because of backlight or the like, are only examples; other inhibiting factors may also be considered. For example, where many people pass by, the range that obstructs their passage (the range forming a passageway) may be set as a range with a communication-inhibiting factor, and people may be guided to a position away from the passageway before communicating.
In the processing for guiding an entire group described in the third embodiment, the person closest to the communication robot 1 is set as the guidance target. Alternatively, for example, a virtual center position of the positions of the people constituting the group may be determined, and that virtual center position may be guided into the range where communication is possible.
Alternatively, when a group is detected, the robot may address the individuals in the group in turn and guide each of them into the range where communication is possible.
In each of the above embodiments, the bow angle that tilts the trunk upper portion 3 and the head 2 is changed periodically as the search action. Alternatively, another action, such as the rotational action of the trunk lower portion 4, may be performed as the search action, or a combination of the bowing and rotating actions may be used.
In each of the above embodiments, the left and right arms and the eyes are moved during the guidance action. The movement of the arms and eyes is also only an example; when performing the guidance action or the search action, the housing itself, which constitutes the trunk, face, and so on of the robot, may be moved, as may attached parts (components) other than the arms and eyes.
Changing both the amplitude (magnitude) and the cycle with which the arms, trunk, or the like are moved during the guidance or search action is likewise only an example; at least one of the two may be changed.
In each of the above embodiments, the position calculating unit 72 calculates the distance from the robot to the person. In contrast, the position calculating unit 72 may calculate the position of the person in the space in which the robot is arranged, and calculate the distance from the difference between the calculated position of the person and the installation position of the robot.
In each of the above embodiments, the present invention is applied to a stationary robot, but it may also be applied to a robot capable of moving on its own. However, even for a self-propelled robot, it is preferable to stop at a specific place and perform the processing of the present invention by detecting the surrounding people at the stopped position.
The above embodiments are described in detail to facilitate understanding of the present invention, and the invention is not limited to configurations having all of the described structures. In the configuration and functional block diagrams of figs. 2, 3, 11, 13, and the like, only the control lines and information lines necessary for explanation are shown; not all the control and information lines of a product are necessarily shown. In practice, substantially all the structures may be considered to be interconnected. In the flowcharts shown in figs. 9, 12, 15, and 16, the execution order of some processing steps may be changed, or some steps may be executed simultaneously, as long as the results of the embodiments are not affected.
Description of the reference symbols
1 communication robot, 2 head, 2a face, 3 trunk upper portion, 4 trunk lower portion, 5 waist portion, 6R right arm portion, 6L left arm portion, 7R, 7L, 7ER, 7EL display unit, 8R right eye, 8L left eye, 9 voice acquisition unit, 10 detection sensor, 11 voice generation unit, 21 drive unit, 21a movable-unit motor control unit, 21b movable-unit motor drive unit, 21c movable-unit motor, 21d movable-unit position sensor, 70 arithmetic processing unit, 71 person detection unit, 72 position calculation unit, 73 speech/language processing unit, 74 operation control unit, 75 operation plan execution unit, 76 language response database unit, 77 guidable direction data storage unit, 78 group determination unit.

Claims (7)

1. A communication robot, comprising:
a drive unit that moves at least one of a component mounted on a housing and the housing itself;
a voice acquisition unit that acquires the voice of a person around the housing;
a voice generating unit for generating a voice;
a human detection section that detects the presence of a human based on a detection signal of a detection sensor mounted on the housing;
a calculation unit that calculates a position or a distance of the person detected by the person detection unit; and
an operation control unit that controls the operation of the component or the housing by the drive unit under a preset execution condition based on the position or the distance of the person calculated by the calculation unit,
the operation of the drive unit under the control of the operation control unit includes a search operation and a guidance operation,
the search operation moves the housing by means of the drive unit so as to change the direction detected by the detection sensor,
the guidance operation changes the execution condition according to the position or distance calculated by the calculation unit and guides a person by moving at least one of the component and the housing,
the communication robot further comprises a guidable direction data holding unit that holds information on directions in which a person can be guided, and
when the guidance operation is performed, the operation control unit changes the execution condition based on the position or distance of the person calculated by the calculation unit with respect to the guidable directions represented by the guidable direction data.
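As a reading aid, the search and guidance operations of claim 1 might be realized along the lines of the following minimal sketch; the robot and person interfaces (rotate_housing, point_arm, bearing_deg, distance_m), the stored directions, and all numeric values are hypothetical assumptions rather than anything specified by the claim.

    # Minimal sketch of the claim-1 control flow, under assumed interfaces.
    GUIDABLE_DIRECTIONS_DEG = [0.0, 90.0]  # held by the guidable direction data holding unit

    def control_step(robot, person):
        if person is None:
            # Search operation: move the housing so that the direction
            # covered by the detection sensor changes.
            robot.rotate_housing(degrees=15.0)
            return
        # Guidance operation: choose the guidable direction closest to the
        # person's bearing, and scale the gesture (the execution condition)
        # with the person's calculated distance.
        direction = min(GUIDABLE_DIRECTIONS_DEG,
                        key=lambda d: abs(d - person.bearing_deg))
        gesture_scale = 1.0 if person.distance_m > 2.0 else 0.5
        robot.point_arm(direction_deg=direction, scale=gesture_scale)
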
2. The communication robot of claim 1,
the operation control unit performs the guidance operation to guide the person to a position or distance at which communication is possible, when the position or distance of the person calculated by the calculation unit is not a position or distance at which communication via the voice acquisition unit and the voice generation unit is possible.
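A minimal sketch of the claim-2 condition, assuming a hypothetical 3 m communicable range and an assumed start_guidance interface (neither value nor name comes from the patent):

    COMMUNICABLE_RANGE_M = 3.0  # assumed range of reliable voice pickup and output

    def maybe_guide_closer(robot, person):
        # If the person is outside the assumed communicable range, start the
        # guidance operation to draw them to a communicable position.
        if person.distance_m > COMMUNICABLE_RANGE_M:
            robot.start_guidance(target_distance_m=COMMUNICABLE_RANGE_M)
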
3. The communication robot of claim 1,
the change in the execution condition is a change in at least one of the period and the amplitude of a periodic movement of the housing, made in accordance with a change in the position or distance calculated by the calculation unit.
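One possible reading of claim 3, sketched with assumed mappings from distance to period and amplitude; the specific formulas and limits are illustrative choices only:

    import math

    # Hedged sketch of claim 3: the execution condition is the (period,
    # amplitude) pair of a periodic housing motion, both varied with the
    # calculated distance to the person.
    def swing_angle(distance_m, t_s):
        period_s = max(1.0, 0.5 * distance_m)         # longer period when far
        amplitude_deg = min(30.0, 10.0 * distance_m)  # larger swing when far
        return amplitude_deg * math.sin(2.0 * math.pi * t_s / period_s)
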
4. The communication robot of claim 1,
the execution condition executed by the operation control unit when the guidance operation is performed includes outputting a guidance voice from the voice generation unit.
5. The communication robot of claim 1,
further comprising a display unit for displaying information,
the execution condition executed by the operation control unit when the guidance operation is performed includes a display process for guidance on the display unit.
6. The communication robot of claim 1,
the execution condition includes a condition that an expression of the communication robot is set by a movement of the component.
7. The communication robot of claim 1,
further comprising a group determination unit that determines whether a plurality of persons belong to the same group based on changes in the positions or distances of the plurality of persons calculated by the calculation unit,
when the group determination unit determines during the guidance operation that the plurality of persons belong to the same group, the operation control unit changes the execution condition based on the position or distance of the group calculated by the calculation unit.
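The group determination of claim 7 could rest on a proximity heuristic such as the following sketch; the pairwise formulation, threshold, and observation window are illustrative assumptions, not the patent's method:

    import math

    def same_group(track_a, track_b, threshold_m=1.5):
        # track_a, track_b: (x, y) positions of two persons sampled over the
        # same time window. Treat them as one group if they stay within an
        # assumed threshold of each other for the whole window.
        return all(
            math.hypot(ax - bx, ay - by) < threshold_m
            for (ax, ay), (bx, by) in zip(track_a, track_b)
        )
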
CN201980068901.5A 2018-11-13 2019-10-11 AC robot Active CN112912152B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2018-212853 2018-11-13
JP2018212853A JP7253900B2 (en) 2018-11-13 2018-11-13 communication robot
PCT/JP2019/040217 WO2020100488A1 (en) 2018-11-13 2019-10-11 Communication robot

Publications (2)

Publication Number Publication Date
CN112912152A (en) 2021-06-04
CN112912152B (en) 2022-07-22

Family

ID=70730717

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980068901.5A Active CN112912152B (en) 2018-11-13 2019-10-11 AC robot

Country Status (3)

Country Link
JP (1) JP7253900B2 (en)
CN (1) CN112912152B (en)
WO (1) WO2020100488A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021175108A (en) 2020-04-27 2021-11-01 日本電気株式会社 Pcf device, af device, nef device, and method thereof

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013099800A (en) * 2011-11-07 2013-05-23 Fujitsu Ltd Robot, method for controlling robot, and control program
JP2013237124A (en) * 2012-05-15 2013-11-28 Fujitsu Ltd Terminal device, method for providing information, and program
JP2013244566A (en) * 2012-05-28 2013-12-09 Fujitsu Ltd Robot, and method for controlling the same
CN103612252A (en) * 2013-12-03 2014-03-05 北京科技大学 Intelligent remote social adjuvant therapy robot for autism children
JP2017177228A (en) * 2016-03-28 2017-10-05 株式会社国際電気通信基礎技術研究所 Service provision robot system
JP2018086689A (en) * 2016-11-28 2018-06-07 株式会社G−グロボット Communication robot
JP2018149625A (en) * 2017-03-13 2018-09-27 大日本印刷株式会社 Communication robot, program, and system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4528295B2 (en) * 2006-12-18 2010-08-18 株式会社日立製作所 GUIDANCE ROBOT DEVICE AND GUIDANCE SYSTEM
WO2013176758A1 (en) * 2012-05-22 2013-11-28 Intouch Technologies, Inc. Clinical workflows utilizing autonomous and semi-autonomous telemedicine devices
JP6905812B2 (en) * 2016-06-14 2021-07-21 グローリー株式会社 Store reception system

Also Published As

Publication number Publication date
WO2020100488A1 (en) 2020-05-22
JP2020078448A (en) 2020-05-28
JP7253900B2 (en) 2023-04-07
CN112912152A (en) 2021-06-04

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant