WO2018012219A1 - Autonomous behavior robot - Google Patents
Autonomous behavior robot
- Publication number
- WO2018012219A1 (PCT/JP2017/022674)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- user
- wheel
- storage space
- moving mechanism
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H11/00—Self-movable toy figures
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H3/00—Dolls
- A63H3/28—Arrangements of sound-producing means in dolls; Means in dolls for producing sounds
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H11/00—Self-movable toy figures
- A63H11/10—Figure toys with single- or multiple-axle undercarriages, by which the figures perform a realistic running motion when the toy is moving over the floor
- A63H11/12—Wheeled toys with figures performing a wriggling motion when moving
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/087—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices for sensing other physical parameters, e.g. electrical or chemical properties
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
- B25J5/007—Manipulators mounted on wheels or on carriages mounted on wheels
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0003—Home robots, i.e. small robots for domestic use
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/163—Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H2200/00—Computerized interactive toys, e.g. dolls
Definitions
- the present invention relates to a robot that autonomously selects an action according to an internal state or an external environment.
- the present invention has been completed on the basis of the above problem recognition, and its main object is to provide a structure and a control technique capable of reducing a sense of distance from the robot.
- An autonomous behavior robot includes a body, a moving mechanism having a surface that contacts the ground during movement, and a drive mechanism that retracts the moving mechanism into a storage space provided in the body when a storage condition is satisfied.
- An autonomous behavior type robot includes a body, a lifting determination unit that determines that the robot has been lifted, and a drive mechanism that changes the posture of the body when the robot is determined to have been lifted and a predetermined driving condition is satisfied.
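As a minimal sketch of the claimed control rule above (class, method, and condition names are illustrative assumptions; the patent does not specify them):

```python
# Hypothetical sketch: when a storage condition is satisfied (here, being
# lifted or an explicit sit request), the drive mechanism retracts the
# moving mechanism into storage space S. All names are illustrative.

class WheelController:
    def __init__(self):
        self.wheels_stored = False  # wheels deployed, touching the floor

    def update(self, lifted: bool, sit_requested: bool) -> str:
        if lifted or sit_requested:      # assumed storage condition
            self.wheels_stored = True
            return "retract"             # body descends onto its seating surface
        self.wheels_stored = False
        return "deploy"                  # wheels advance from the storage space

ctl = WheelController()
assert ctl.update(lifted=True, sit_requested=False) == "retract"
assert ctl.update(lifted=False, sit_requested=False) == "deploy"
```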
- FIG. 1 is a diagram illustrating an appearance of the robot 100 according to the first embodiment.
- FIG. 1A is a front view
- FIG. 1B is a side view.
- the robot 100 according to the present embodiment is an autonomous behavior type robot that determines its actions and gestures based on the external environment and its internal state.
- the external environment is recognized by various sensors such as a camera and a thermo sensor.
- the internal state is quantified as various parameters expressing the emotion of the robot 100. These will be described later.
- the robot 100 is assumed to act indoors.
- for example, its action range is the owner's house.
- a person related to the robot 100 is referred to as a “user”, and a user who is a member of the home to which the robot 100 belongs is referred to as an “owner”.
- the body 104 of the robot 100 has a rounded shape as a whole and includes an outer skin formed of a soft and elastic material such as urethane, rubber, resin, or fiber.
- the robot 100 may be dressed. By making the body 104 round, soft, and comfortable to touch, the robot 100 provides the user with a sense of security and a comfortable touch.
- the robot 100 has a total weight of 15 kg or less, preferably 10 kg or less, and more preferably 5 kg or less.
- the average weight of a 13-month-old baby is just over 9 kilograms for boys and just under 9 kilograms for girls. Therefore, if the total weight of the robot 100 is 10 kilograms or less, the user can hold the robot 100 with roughly the same effort as holding a baby who cannot yet walk.
- the average weight of a baby under 2 months old is less than 5 kilograms for both boys and girls. Therefore, if the total weight of the robot 100 is 5 kilograms or less, the user can hold the robot 100 with about the same effort as holding a very young infant.
- the height of the robot 100 is 1.2 meters or less, preferably 0.7 meters or less. Being able to hold the robot 100 is an important design concept of this embodiment.
- the robot 100 travels on three wheels: as shown, a pair of front wheels 102 (left wheel 102a and right wheel 102b) and one rear wheel 103.
- the front wheel 102 is a driving wheel
- the rear wheel 103 is a driven wheel.
- the front wheel 102 does not have a steering mechanism, but the rotation speed and the rotation direction can be individually controlled.
- the rear wheel 103 is a so-called omni wheel, and is rotatable to move the robot 100 forward, backward, left and right.
- by making the rotation speed of the right wheel 102b higher than that of the left wheel 102a, the robot 100 can turn left or rotate counterclockwise.
- by making the rotation speed of the left wheel 102a higher, the robot 100 can turn right or rotate clockwise.
- the front wheel 102 and the rear wheel 103 can be completely accommodated in the body 104 by a drive mechanism (rotation mechanism, link mechanism) described later.
- the robot 100 is incapable of moving when the wheels are completely stored in the body 104. That is, the body 104 descends and sits on the floor as the wheels are retracted. In this seated state, a flat seating surface 108 (grounding bottom surface) formed on the bottom of the body 104 abuts against the floor surface F.
- the robot 100 has a pair of hands 106.
- the hand 106 does not have a function of gripping an object, and is slightly displaced up and down and left and right in accordance with expansion / contraction deformation of the body portion described later.
- the two hands 106 may be individually controlled, and simple operations such as raising, shaking, and vibrating may be possible.
- Eye 110 has a built-in camera.
- the eye 110 can also display an image using a liquid crystal element or an organic EL element.
- the robot 100 is equipped with various sensors such as a sound collecting microphone and an ultrasonic sensor in addition to a camera built in the eye 110. It also has a built-in speaker and can emit simple sounds.
- a horn 112 is attached to the head of the robot 100. Since the robot 100 is lightweight as described above, the user can lift the robot 100 by holding the horn 112.
- FIG. 2 is a cross-sectional view schematically showing the structure of the robot 100.
- FIG. 3 is a side view showing the structure of the robot 100 with the frame as the center.
- FIG. 2 corresponds to a cross section taken along line AA in FIG.
- the body 104 of the robot 100 includes a base frame 308, a main body frame 310, a pair of wheel covers 312, and an outer skin 314.
- the base frame 308 is made of metal and constitutes an axis of the body 104 and supports an internal mechanism.
- the base frame 308 is configured by vertically connecting an upper plate 332 and a lower plate 334 by a plurality of side plates 336. Sufficient intervals are provided between the plurality of side plates 336 to allow ventilation.
- a battery 118, a control device 342, various actuators, and the like are accommodated inside the base frame 308.
- a stepped hole 360 is provided, in which an exhaust valve 362 is disposed. The upper, small-diameter portion of the stepped hole 360 forms the exhaust port 364, and a valve body 366 made of a rubber sheet is disposed in the lower, large-diameter portion.
- a valve seat 368 is formed on the boundary surface between the small diameter portion and the large diameter portion.
- One side of the valve body 366 in the radial direction is bonded to the large diameter portion to be a fixed end, and the opposite side in the radial direction is a free end.
- the exhaust valve 362 is opened and closed when the valve body 366 is attached to and detached from the valve seat 368.
- the exhaust valve 362 is a check valve that opens only when the air in the main body frame 310 is discharged to the outside.
- the main body frame 310 is made of a resin material and includes a head frame 316 and a body frame 318.
- the head frame 316 has a hollow hemispherical shape and forms the head skeleton of the robot 100.
- the torso frame 318 has a stepped cylinder shape and forms a torso skeleton of the robot 100.
- the body frame 318 is fixed integrally with the base frame 308.
- the head frame 316 is assembled to the upper end portion of the trunk frame 318 so as to be relatively displaceable.
- the head frame 316 is provided with three axes of a yaw axis 320, a pitch axis 322, and a roll axis 324, and an actuator 326 for rotationally driving each axis.
- Actuator 326 includes a plurality of servo motors for individually driving each axis.
- the yaw shaft 320 is driven for the head-shaking motion,
- the pitch shaft 322 is driven for the nodding motion,
- and the roll shaft 324 is driven for the head-tilting motion.
- a plate 325 that supports the yaw shaft 320 is fixed to the top of the head frame 316.
- the plate 325 is formed with a plurality of vent holes 327 for ensuring ventilation between the upper and lower sides.
- a stepped hole 350 is provided at the center of the upper end portion of the head frame 316, and an intake valve 352 is provided. That is, the upper small diameter portion of the stepped hole 350 forms the intake port 354, and the valve body 356 made of a rubber sheet is disposed in the lower large diameter portion.
- a valve seat 358 is formed on the boundary surface between the small diameter portion and the large diameter portion.
- the intake valve 352 is opened and closed when the valve body 356 is attached to and detached from the valve seat 358.
- the intake valve 352 is a check valve that opens only when outside air is introduced into the main body frame 310.
- a metal base plate 328 is provided so as to support the head frame 316 and its internal mechanism from below.
- the base plate 328 is connected to the plate 325 via a cross link mechanism 329 (pantograph mechanism), and is connected to the upper plate 332 (base frame 308) via a joint 330.
- the body frame 318 houses the base frame 308, the wheel drive mechanism 370, and the expansion / contraction drive mechanism 372.
- the wheel drive mechanism 370 includes a front wheel drive mechanism 374 and a rear wheel drive mechanism 376.
- the body frame 318 has a smooth curved surface at the upper half 380 so that the outline of the body 104 is rounded.
- the upper half 380 is formed so as to gradually become smaller toward the upper part corresponding to the neck.
- the lower half 382 of the body frame 318 has a small width so as to form a storage space S for the front wheel 102 between the lower half 382 and the wheel cover 312.
- the boundary between the upper half 380 and the lower half 382 has a step shape.
- the left and right side walls constituting the lower half 382 are parallel to each other, and are penetrated by and support the rotation shaft 378 (described later) of the front wheel drive mechanism 374.
- the upper half 380 is formed with a slit-shaped opening 384 that is open from the side toward the front. Air can be introduced into the inner surface of the outer skin 314 through the opening 384.
- a lower plate 334 is provided to close the lower end opening of the lower half 382.
- the base frame 308 is fixed to and supported by the lower end portion of the body frame 318.
- the pair of wheel covers 312 are provided so as to cover the lower half 382 of the body frame 318 from the left and right.
- the wheel cover 312 is made of resin and assembled so as to form a smooth outer surface (curved surface) continuous with the upper half 380 of the body frame 318.
- the upper end of the wheel cover 312 is connected along the lower end of the upper half 380. Accordingly, a storage space S that is opened downward is formed between the side wall of the lower half 382 and the wheel cover 312.
- the outer skin 314 is made of urethane rubber and is mounted so as to cover the main body frame 310 and the wheel cover 312 from the outside.
- in this embodiment urethane rubber is adopted, but in a modified example other elastic materials may be used.
- the outer skin 314 functions as an “expanded body”.
- the hand 106 is integrally formed with the outer skin 314.
- An opening 390 is provided at the upper end of the outer skin 314 at a position corresponding to the air inlet 354. Thereby, the introduction of outside air via the intake valve 352 is enabled.
- the outer skin 314 is generally in close contact with the outer surfaces of the main body frame 310 and the wheel cover 312, but is provided with a seal structure for ensuring airtightness in the main body frame 310. That is, the adhesive layer 514 is provided over the entire circumference between the upper portion of the trunk frame 318 and the outer skin 314. Also, an adhesive layer 516 is provided over the entire circumference between the wheel cover 312 and the outer skin 314. With such a configuration, the sealing performance of the expansion / contraction portion of the outer skin 314 is ensured.
- a communication path 355 that connects the intake port 354 and the exhaust port 364 is formed inside the main body frame 310.
- the communication path 355 is a sealed space.
- Heat generating components such as the battery 118, the control device 342, and the actuator are disposed in the communication path 355. Moreover, it is preferable that these heat generating components are arranged so as not to obstruct the flow of air flowing through the communication path 355 as much as possible.
- the base plate 328 has a plurality of ventilation holes 331 formed therein. A plurality of air holes 333 are also formed in the upper plate 332.
- the front wheel drive mechanism 374 includes a wheel drive mechanism for rotating the front wheel 102 and a storage operation mechanism for moving the front wheel 102 forward and backward from the storage space S. That is, the front wheel drive mechanism 374 includes a rotation shaft 378 and an actuator 379.
- the front wheel 102 has a direct drive motor (hereinafter referred to as “DD motor”) 396 at the center thereof.
- DD motor 396 has an outer rotor structure, a stator is fixed to axle 398, and a rotor is coaxially fixed to wheel 397 of front wheel 102.
- the axle 398 is integrated with the rotating shaft 378 via the arm 400.
- a bearing 402 is embedded in the lower side wall of the body frame 318; the rotation shaft 378 passes through it and is pivotably supported.
- the bearing 402 is provided with a seal structure (bearing seal) for hermetically sealing the inside and outside of the body frame 318.
- the rear wheel drive mechanism 376 includes a rotation shaft 404 and an actuator 406. Two arms 408 extend from the rotation shaft 404, and an axle 410 is integrally provided at the tip thereof.
- the rear wheel 103 is rotatably supported on the axle 410.
- a bearing (not shown) is likewise embedded in the lower side wall of the trunk frame 318; the rotation shaft 404 passes through it and is pivotably supported.
- the bearing is also provided with a shaft seal structure.
- the expansion / contraction drive mechanism 372 includes a shape memory alloy wire 610 embedded in the outer skin 314 and a drive circuit 620 (energization circuit) thereof.
- the shape memory alloy wire 610 is a thin wire that contracts and hardens when heated, and relaxes and extends when allowed to cool. Lead wires drawn from both ends of the shape memory alloy wire 610 are connected to the drive circuit 620. When the switch of the drive circuit 620 is turned on, the shape memory alloy wire 610 is energized.
- the shape memory alloy wire 610 is molded or knitted at a height position corresponding to the opening 384 in the outer skin 314. Lead wires are drawn from both ends of the shape memory alloy wire 610 to the inside of the body frame 318.
- One shape memory alloy wire 610 may be provided on each side of the outer skin 314, or a plurality of shape memory alloy wires 610 may be provided in parallel.
- the shape memory alloy wire 610 is in a relaxed and elongated state along the outer skin 314 in a curved shape.
- when the drive circuit 620 is turned on, the shape memory alloy wire 610 contracts and hardens linearly (see FIG. 5B).
- FIG. 4 is a diagram schematically showing the wheel storing operation.
- FIG. 4A is a side view
- FIG. 4B is a front view.
- the dotted line in the figure indicates a state in which the wheel can advance from the storage space S and can travel
- the solid line in the figure indicates a state in which the wheel is stored in the storage space S.
- to store the wheels, the actuators 379 and 406 are driven in one direction.
- the arm 400 rotates about the rotation shaft 378 and the front wheel 102 rises from the floor surface F.
- the arm 408 rotates about the rotation shaft 404, and the rear wheel 103 rises from the floor surface F (see the one-dot chain line arrow).
- the body 104 descends and the seating surface 108 contacts the floor surface F (see solid arrow).
- the state where the robot 100 is sitting is realized.
- FIG. 5 is a diagram schematically showing the expansion / contraction operation.
- FIG. 5A shows an expanded state
- FIG. 5B shows a reduced state.
- when the switch of the drive circuit 620 is switched from on to off, the shape memory alloy wire 610 relaxes and extends as shown in FIG. 5A (see solid arrow).
- the outer skin 314 expands to its original state, and the internal pressure of the main body frame 310 becomes negative.
- the intake valve 352 is opened, and the outside air is introduced into the body 104 (see a two-dot chain line arrow).
- the exhaust valve 362 is kept closed. In appearance, the body of the robot 100 swells and the hand 106 is slightly pushed up.
- when the switch of the drive circuit 620 is turned on, the shape memory alloy wire 610 contracts and hardens linearly, and the outer skin 314 is pressed inward and contracts. As a result, the internal pressure of the main body frame 310 rises and the exhaust valve 362 opens, so the air inside the body 104 is discharged to the outside (see the two-dot chain line arrow). At this time, the intake valve 352 is kept closed. In appearance, the torso of the robot 100 shrinks back to its original state and the hand 106 is lowered. By repeating these operations, the robot 100 can appear to be breathing like a living thing.
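The intake/exhaust cycle described above can be sketched as a toy state sequence (the two-state model and all names are illustrative simplifications; real timing would depend on the wire's heating and cooling):

```python
import itertools

# Toy model of the breathing cycle: energizing the shape-memory-alloy
# wire contracts the skin, raising internal pressure so the exhaust
# valve opens (exhale); de-energizing lets the skin re-expand, lowering
# pressure so the intake valve opens (inhale).

def breathing_cycle(steps):
    """Yield (wire_energized, open_valve) for each half-cycle."""
    for energized in itertools.islice(itertools.cycle([False, True]), steps):
        # Wire off -> skin expands  -> pressure drops -> intake valve opens.
        # Wire on  -> skin contracts -> pressure rises -> exhaust valve opens.
        yield energized, "exhaust" if energized else "intake"

cycle = list(breathing_cycle(4))  # alternates inhale and exhale
```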
- the intake valve 352, the exhaust valve 362, and the expansion / contraction drive mechanism 372 function as an “intake / exhaust mechanism”.
- FIG. 6 is a configuration diagram of the robot system 300.
- the robot system 300 includes a robot 100, a server 200, and a plurality of external sensors 114.
- a plurality of external sensors 114 (external sensors 114a, 114b,..., 114n) are installed in advance in the house.
- the external sensor 114 may be fixed to the wall surface of the house or may be placed on the floor.
- the position coordinates of the external sensor 114 are registered. The position coordinates are defined as x, y coordinates in the house assumed as the action range of the robot 100.
- the server 200 is installed in the home.
- the server 200 and the robot 100 in this embodiment correspond one-to-one.
- the server 200 determines the basic behavior of the robot 100.
- the external sensor 114 is for reinforcing the sensory organ of the robot 100
- the server 200 is for reinforcing the brain of the robot 100.
- External sensor 114 periodically transmits a radio signal (hereinafter referred to as “robot search signal”) including the ID of external sensor 114 (hereinafter referred to as “beacon ID”).
- upon receiving a robot search signal, the robot 100 returns a radio signal including its own ID (hereinafter referred to as a "robot response signal"). The server 200 measures the time from when the external sensor 114 transmits the robot search signal until it receives the robot response signal, thereby measuring the distance from the external sensor 114 to the robot 100. By measuring the distances between the plurality of external sensors 114 and the robot 100, the position coordinates of the robot 100 are specified. Of course, the robot 100 may instead periodically transmit its own position coordinates to the server 200.
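This distance-based positioning amounts to trilateration. A minimal sketch with three sensors in a plane (sensor coordinates and the robot position are made-up values):

```python
import math

# Locating the robot from its distances to fixed external sensors with
# known (x, y) coordinates. With three sensors, subtracting the first
# circle equation from the other two gives two linear equations:
#   2(x0-xi)x + 2(y0-yi)y = di^2 - d0^2 - xi^2 + x0^2 - yi^2 + y0^2

def locate(sensors, dists):
    """2-D trilateration from three sensors via Cramer's rule."""
    (x0, y0), (x1, y1), (x2, y2) = sensors
    d0, d1, d2 = dists
    a1, b1 = 2 * (x0 - x1), 2 * (y0 - y1)
    c1 = d1**2 - d0**2 - x1**2 + x0**2 - y1**2 + y0**2
    a2, b2 = 2 * (x0 - x2), 2 * (y0 - y2)
    c2 = d2**2 - d0**2 - x2**2 + x0**2 - y2**2 + y0**2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

sensors = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]   # illustrative positions
true_pos = (1.0, 1.0)
dists = [math.dist(true_pos, s) for s in sensors]
x, y = locate(sensors, dists)  # recovers the robot's coordinates
```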
- FIG. 7 is a conceptual diagram of the emotion map 116.
- the emotion map 116 is a data table stored in the server 200.
- the robot 100 selects an action according to the emotion map 116.
- the emotion map 116 indicates the magnitude of the robot 100's attraction to or aversion from each location.
- the x-axis and y-axis of the emotion map 116 indicate two-dimensional space coordinates.
- the z-axis indicates the strength of attraction or aversion. When the z value is positive, the place is well liked; when the z value is negative, the place is disliked.
- the coordinate P1 is a point with a high favorable feeling (hereinafter referred to as a "favor point") in the indoor space that the server 200 manages as the action range of the robot 100.
- the favor point may be a “safe place” such as the shade of a sofa or under a table, a place where people can easily gather, such as a living room, or a lively place. Further, it may be a place that has been gently stroked or touched in the past.
- the definition of what kind of place the robot 100 prefers is arbitrary, but in general, it is desirable to set a place favored by small children, small animals such as dogs and cats, as a favorable point.
- the coordinate P2 is a point where the bad feelings are high (hereinafter referred to as “disgusting point”).
- disgusting points may include places with loud noises such as near a television, places that easily get wet such as a bath or washroom, closed or dark spaces, and places associated with unpleasant memories of being treated roughly by a user.
- since the definition of what places the robot 100 dislikes is arbitrary, it is generally desirable to set places feared by small children or by small animals such as dogs and cats as disgusting points.
- the coordinate Q indicates the current position of the robot 100.
- the server 200 may determine how far, and in which direction, the robot 100 is from each external sensor 114.
- alternatively, the current position may be specified by calculating the travel distance of the robot 100 from the rotation count of the wheels (front wheels 102), or from images obtained by the camera.
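Estimating the current position from wheel rotation counts can be sketched as simple dead reckoning (the wheel radius and track width are made-up values; the patent does not give them):

```python
import math

# Illustrative dead reckoning from the rotation counts of the two front
# wheels. WHEEL_RADIUS and TRACK_WIDTH are assumed values.

WHEEL_RADIUS = 0.05   # metres (assumed)
TRACK_WIDTH = 0.20    # distance between left and right wheels (assumed)

def odometry_step(x, y, heading, rot_left, rot_right):
    """Update the pose from one interval's wheel rotations (in turns).
    Exact for straight lines; an approximation while turning."""
    dl = rot_left * 2 * math.pi * WHEEL_RADIUS   # left wheel travel
    dr = rot_right * 2 * math.pi * WHEEL_RADIUS  # right wheel travel
    d = (dl + dr) / 2                            # forward travel of body centre
    heading += (dr - dl) / TRACK_WIDTH           # change of heading
    return x + d * math.cos(heading), y + d * math.sin(heading), heading

# Equal rotations -> straight-line motion along the current heading.
x, y, h = odometry_step(0.0, 0.0, 0.0, 1.0, 1.0)
```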
- the emotion map 116 is given, the robot 100 moves in a direction attracted to the favorable point (coordinate P1) and away from the disgusting point (coordinate P2).
- the emotion map 116 changes dynamically.
- the z value (favorable feeling) at the coordinate P1 decreases with time.
- in this way, the robot 100 can emulate the biological behavior of reaching the favor point (coordinate P1), being "satisfied" there, and eventually "getting bored" with the place.
- the bad feeling at the coordinate P2 is also alleviated with time.
- meanwhile, new favor points and disgusting points appear, so the robot 100 makes new action selections.
- the robot 100 takes an "interest" in new favor points and continually selects new actions.
- the emotion map 116 represents the undulation of emotion as the internal state of the robot 100.
- the robot 100 aims at the favor point, avoids the dislike point, stays at the favor point for a while, and eventually takes the next action. By such control, the action selection of the robot 100 can be made human and biological.
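One way to sketch the "getting bored" dynamic described above (the decay model, rate, and map values are illustrative assumptions, not the patent's method): the z value of the point the robot is staying at decays each step until another point becomes the better target.

```python
# Toy model: staying at a point wears down its z value (favorable
# feeling) until a different point on the emotion map wins.

def stay_and_retarget(emotion_map, current, decay=0.9):
    """Decay the z value of the occupied point, then pick the best point."""
    emotion_map[current] *= decay
    return max(emotion_map, key=emotion_map.get)

m = {"P1": 10.0, "P3": 8.0}   # illustrative z values for two points
target = "P1"
visits = 0
while target == "P1":          # the robot lingers at P1 while it stays best
    target = stay_and_retarget(m, "P1")
    visits += 1
# After a few steps P1's z value drops below P3's and the robot moves on.
```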
- a map that affects the behavior of the robot 100 is not limited to the emotion map 116 of the type shown in FIG.
- various behavior maps can be defined, such as maps expressing curiosity, a feeling of avoiding fear, a desire for reassurance, a desire for quiet or dimness, and physical comfort such as coolness or warmth.
- the destination point of the robot 100 may be determined by weighted averaging the z values of the plurality of behavior maps.
- the robot 100 may also have parameters indicating the magnitude of various emotions, separately from the behavior maps. For example, when the loneliness emotion parameter is high, the weighting coefficient of a behavior map that evaluates reassuring places may be set large, and the parameter value may be lowered once the robot reaches the target point. Similarly, when a parameter indicating boredom is high, the weighting coefficient of a behavior map that evaluates places satisfying curiosity may be set large.
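A minimal sketch of this weighted-average destination choice (the maps, candidate points, and weights are all illustrative):

```python
# Each candidate point gets a weighted average of z values across several
# behavior maps; an emotion parameter (e.g. loneliness) can boost the
# weight of one map. Map names and values are invented for illustration.

def pick_destination(maps, weights):
    """Return the point with the highest weighted-average z value."""
    points = next(iter(maps.values())).keys()
    total = sum(weights.values())
    def score(p):
        return sum(weights[name] * maps[name][p] for name in maps) / total
    return max(points, key=score)

maps = {
    "curiosity": {"sofa": 1.0, "kitchen": 4.0},
    "peace":     {"sofa": 5.0, "kitchen": 1.0},
}
# A "lonely" robot weights the peace map heavily and heads for the sofa.
dest = pick_destination(maps, {"curiosity": 1.0, "peace": 3.0})
```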
- FIG. 8 is a hardware configuration diagram of the robot 100.
- the robot 100 includes an internal sensor 128, a communication device 126, a storage device 124, a processor 122, a drive mechanism 120, and a battery 118. Each unit is connected to each other by a power line 130 and a signal line 132.
- the battery 118 supplies power to each unit via the power line 130. Each unit transmits and receives control signals via a signal line 132.
- the battery 118 is a secondary battery such as a lithium ion secondary battery, and is a power source of the robot 100.
- the internal sensor 128 is a collection of various sensors built into the robot 100: specifically, a camera, a sound-collecting microphone, an infrared sensor, a thermo sensor, a touch sensor, an acceleration sensor, an odor sensor, and the like.
- the odor sensor is a known sensor that applies the principle that the electrical resistance changes due to the adsorption of molecules that cause odors.
- the odor sensor classifies various odors into a plurality of categories (hereinafter referred to as “odor category”).
- the communication device 126 is a communication module that performs wireless communication with various external devices such as the server 200, the external sensors 114, and mobile devices owned by users.
- the storage device 124 includes a nonvolatile memory and a volatile memory, and stores a computer program and various setting information.
- the processor 122 is a computer program execution means.
- the drive mechanism 120 is an actuator that controls the internal mechanism. In addition to this, displays and speakers are also installed.
- the processor 122 selects an action of the robot 100 while communicating with the server 200 and the external sensor 114 via the communication device 126.
- Various external information obtained by the internal sensor 128 also affects action selection.
- the drive mechanism 120 includes the wheel drive mechanism 370 and the expansion / contraction drive mechanism 372 described above.
- the drive mechanism 120 mainly controls the wheel (front wheel 102), the head (head frame 316), and the trunk (energization of the shape memory alloy wire 610 for expanding and contracting the expansion / contraction body).
- the drive mechanism 120 changes the movement direction and movement speed of the robot 100 by changing the rotation speed and rotation direction of the two front wheels 102.
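Since the front wheels have no steering mechanism, steering comes entirely from the speed difference between the two wheels. A standard differential-drive sketch (the track width is a made-up value):

```python
# Mapping a desired body velocity v (m/s) and turn rate omega (rad/s,
# counterclockwise positive) to left/right wheel surface speeds.
# TRACK_WIDTH is an assumed value, not from the patent.

TRACK_WIDTH = 0.20  # metres between the left and right wheels (assumed)

def wheel_speeds(v, omega):
    """Return (left, right) wheel surface speeds for a differential drive."""
    left = v - omega * TRACK_WIDTH / 2
    right = v + omega * TRACK_WIDTH / 2
    return left, right

assert wheel_speeds(0.5, 0.0) == (0.5, 0.5)  # equal speeds: straight ahead
l, r = wheel_speeds(0.0, 1.0)                # opposite speeds: spin in place
assert l < 0 < r                             # counterclockwise rotation
```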
- the drive mechanism 120 can also raise and lower the wheels (the front wheel 102 and the rear wheel 103). When the wheel rises, the wheel is completely stored in the body 104, and the robot 100 comes into contact with the floor surface at the seating surface 108 and enters the seating state.
- FIG. 9 is a functional block diagram of the robot system 300.
- the robot system 300 includes the robot 100, the server 200, and the plurality of external sensors 114.
- Each component of the robot 100 and the server 200 is implemented by hardware, including arithmetic units such as CPUs (Central Processing Units) and various coprocessors, storage devices such as memory and storage, and wired or wireless communication lines connecting them, and by software stored in the storage devices that supplies processing instructions to the arithmetic units.
- the computer program may be constituted by a device driver, an operating system, various application programs located in an upper layer thereof, and a library that provides a common function to these programs.
- Each block described below is not a hardware unit configuration but a functional unit block.
- Some of the functions of the robot 100 may be realized by the server 200, and some or all of the functions of the server 200 may be realized by the robot 100.
- Server 200 includes a communication unit 204, a data processing unit 202, and a data storage unit 206.
- the communication unit 204 is in charge of communication processing with the external sensor 114 and the robot 100.
- the data storage unit 206 stores various data.
- the data processing unit 202 executes various processes based on the data acquired by the communication unit 204 and the data stored in the data storage unit 206.
- the data processing unit 202 also functions as an interface between the communication unit 204 and the data storage unit 206.
- the data storage unit 206 includes an operation pattern storage unit 232, a map storage unit 216, a personal data storage unit 218, and a history data storage unit 238.
- the motion pattern storage unit 232 associates the IDs of motion patterns (hereinafter referred to as "motion IDs") representing the various gestures of the robot 100 with their selection conditions.
- the map storage unit 216 stores a plurality of behavior maps.
- the personal data storage unit 218 stores information on users, particularly owners. Specifically, various parameters such as intimacy with the user and physical characteristics / behavioral characteristics of the user are stored. Other attribute information such as age and gender may be stored.
- the history data storage unit 238 stores history information of actions (movements) such as movement of the robot 100 and gestures. This history information includes information transmitted from the robot 100 in addition to information detected and managed on the server 200 side. This history information is updated or deleted periodically.
- the robot 100 identifies the user based on the physical characteristics and behavioral characteristics of the user.
- the robot 100 always captures the periphery with a built-in camera. Then, the physical characteristics and behavioral characteristics of the person appearing in the image are extracted.
- the physical characteristics may be visual characteristics associated with the body, such as height, clothes often worn, presence or absence of glasses, skin color, hair color, and ear size, and may also include other features such as average body temperature, smell, and voice quality.
- behavioral features are features associated with behavior such as a location that the user likes, activeness of movement, and the presence or absence of smoking.
- for example, behavioral characteristics are extracted such as that the owner identified as the father is often away from home and, when at home, often stays on the sofa without moving, whereas the mother is often in the kitchen and has a wide range of activity.
- the robot 100 clusters users who frequently appear as “owners” based on physical characteristics and behavioral characteristics obtained from a large amount of image information and other sensing information.
- the method of identifying a user by a user ID is simple and reliable, but it assumes that the user carries a device that can provide the user ID.
- a method of identifying a user from physical or behavioral characteristics has the advantage that even a user who does not carry a portable device can be identified, although the image recognition processing burden is large. Only one of the two methods may be employed, or the two methods may be used in a complementary manner to specify the user.
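The complementary use of the two identification methods can be sketched as follows. This is a minimal illustration only; the function and class names, the feature choices, and the threshold are assumptions, not from this disclosure:

```python
# Hypothetical sketch: prefer a device-provided user ID when available,
# fall back to nearest-neighbor matching against stored feature profiles.
from dataclasses import dataclass

@dataclass
class FeatureProfile:
    user_id: str
    height_cm: float
    voice_pitch_hz: float

def feature_distance(obs, profile):
    """Crude normalized distance between an observation and a stored profile."""
    return (abs(obs[0] - profile.height_cm) / 50.0
            + abs(obs[1] - profile.voice_pitch_hz) / 100.0)

def identify_user(device_id, observation, profiles, threshold=1.0):
    """Device ID wins when present; otherwise match by physical features."""
    if device_id is not None:
        return device_id
    best = min(profiles, key=lambda p: feature_distance(observation, p))
    return best.user_id if feature_distance(observation, best) < threshold else None

profiles = [FeatureProfile("father", 175, 110), FeatureProfile("mother", 160, 220)]
print(identify_user(None, (162, 210), profiles))      # matched by features
print(identify_user("father", (162, 210), profiles))  # device ID takes priority
```

The fallback path corresponds to the feature-based identification described above, which works even for users without a portable device at the cost of heavier recognition processing.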
- users are clustered based on physical characteristics and behavioral characteristics, and the users are identified by deep learning (multilayer neural network).
- the robot 100 has an internal parameter called intimacy for each user.
- when the robot 100 recognizes a behavior that favors itself, such as being picked up or being spoken to, its familiarity with that user increases.
- the data processing unit 202 includes a position management unit 208, a map management unit 210, a recognition unit 212, an operation determination unit 222, and a closeness management unit 220.
- the position management unit 208 specifies the position coordinates of the robot 100 by the method described with reference to FIG.
- the position management unit 208 may track the position coordinates of the user in real time.
- the map management unit 210 changes the parameters of each coordinate by the method described in relation to FIG.
- the map management unit 210 may select one of a plurality of action maps, or may perform a weighted average of z values of a plurality of action maps.
- for example, when the z values at coordinates R1 and R2 are 4 and 3 on one action map but -1 and 3 on another, the weighted average at R2 exceeds that at R1, so the robot 100 moves toward coordinate R2 rather than coordinate R1.
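The weighted-average selection over multiple action maps can be sketched as below. The map contents and equal weights mirror the R1/R2 example above; everything else is illustrative:

```python
# Sketch: combine z values of several action maps by weighted average,
# then move toward the coordinate with the highest combined z value.
def combined_z(maps, weights, coord):
    """Weighted average of z values across action maps at one coordinate."""
    return sum(w * m[coord] for m, w in zip(maps, weights)) / sum(weights)

# Two maps over coordinates R1 and R2, as in the example in the text:
map_a = {"R1": 4, "R2": 3}   # e.g. a "curiosity" map
map_b = {"R1": -1, "R2": 3}  # e.g. a second behavior map
weights = [1, 1]

target = max(["R1", "R2"], key=lambda c: combined_z([map_a, map_b], weights, c))
print(target)  # R2: (4 - 1)/2 = 1.5 at R1 versus (3 + 3)/2 = 3.0 at R2
```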
- the recognition unit 212 recognizes the external environment.
- the recognition of the external environment includes various recognitions such as recognition of weather and season based on temperature and humidity, and recognition of shade (safe area) based on light quantity and temperature.
- the recognition unit 212 further includes a person recognition unit 214 and a reception recognition unit 228.
- the person recognizing unit 214 recognizes a person from an image captured by the built-in camera of the robot 100 and extracts the person's physical and behavioral characteristics. Based on the physical feature information and behavioral feature information registered in the personal data storage unit 218, it judges which person, such as the father, mother, or eldest son, the imaged user, that is, the user the robot 100 is looking at, corresponds to.
- the person recognition unit 214 includes a facial expression recognition unit 230.
- the facial expression recognition unit 230 estimates the user's emotion by recognizing the user's facial expression as an image.
- the person recognizing unit 214 also performs feature extraction for a cat or dog other than a person, for example, a pet.
- the reception recognition unit 228 recognizes various reception actions performed on the robot 100 and classifies them as pleasant / unpleasant actions.
- the reception recognition unit 228 also classifies the owner's response to the behavior of the robot 100 as an affirmative or negative reaction.
- pleasant / unpleasant behavior is determined according to whether the user's response action would be pleasant or unpleasant to a living creature. For example, being held is a pleasant act for the robot 100, and being kicked is an unpleasant act for the robot 100.
- an affirmative / negative reaction is discriminated according to whether the user's response action indicates a pleasant or unpleasant feeling of the user. For example, being held is an affirmative reaction indicating the user's pleasant feeling, and being kicked is a negative reaction indicating the user's unpleasant feeling.
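The two classifications above can be sketched as a single lookup: each typical response action maps both to pleasant/unpleasant (from the robot's standpoint) and to affirmative/negative (as an indicator of the user's feeling). The action names and table contents are illustrative assumptions:

```python
# Sketch: classify a response action along both axes described in the text.
RESPONSE_TABLE = {
    # action:     (pleasant_for_robot, affirmative_from_user)
    "hugged":     (True,  True),
    "petted":     (True,  True),
    "kicked":     (False, False),
    "yelled_at":  (False, False),
}

def classify(action):
    """Return both classifications for a recognized response action."""
    pleasant, affirmative = RESPONSE_TABLE[action]
    return {"pleasant": pleasant, "affirmative": affirmative}

print(classify("hugged"))  # {'pleasant': True, 'affirmative': True}
print(classify("kicked"))  # {'pleasant': False, 'affirmative': False}
```

As the text notes later, the two axes usually agree for typical actions but serve different purposes: pleasant/unpleasant feeds intimacy, while affirmative/negative feeds behavior selection.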
- the operation determination unit 222 of the server 200 determines the operation (movement and gesture) of the robot 100 in cooperation with the operation determination unit 150 of the robot 100.
- the operation determination unit 222 includes a movement determination unit 234 and a behavior determination unit 236.
- the movement determination unit 234 creates a movement target point of the robot 100 and a movement route therefor based on the action map selection by the map management unit 210.
- the movement determination unit 234 may create a plurality of movement routes and select one of the movement routes.
- the behavior determination unit 236 selects a gesture of the robot 100 from a plurality of operation patterns in the operation pattern storage unit 232.
- the familiarity management unit 220 manages the familiarity for each user. As described above, the familiarity is registered in the personal data storage unit 218 as a part of personal data. Details of the intimacy management will be described later.
- the robot 100 includes a communication unit 142, a data processing unit 136, a data storage unit 148, a drive mechanism 120, and an internal sensor 128.
- the communication unit 142 corresponds to the communication device 126 (see FIG. 8), and is responsible for communication processing with the external sensor 114 and the server 200.
- the data storage unit 148 stores various data.
- the data storage unit 148 corresponds to the storage device 124 (see FIG. 8).
- the data processing unit 136 performs various processes based on the data acquired by the communication unit 142 and the data stored in the data storage unit 148.
- the data processing unit 136 corresponds to the processor 122 and a computer program executed by the processor 122.
- the data processing unit 136 also functions as an interface for the communication unit 142, the internal sensor 128, the drive mechanism 120, and the data storage unit 148.
- the internal sensor 128 includes a temperature detection unit 152.
- the temperature detection unit 152 measures the user's body temperature and the ambient temperature.
- the temperature detection unit 152 includes a non-contact temperature sensor such as a radiation thermometer or a thermography, and a contact temperature sensor such as a thermistor, a resistance temperature detector, a thermocouple, or an IC temperature sensor.
- the data storage unit 148 includes an operation pattern storage unit 160 and a history data storage unit 164.
- the operation pattern storage unit 160 defines various operations of the robot 100.
- an operation ID and an operation selection condition are associated with each other. For example, the selection probability of the action pattern A when an unpleasant action is detected is recorded in association with the action ID.
- the behavior determination unit 140 selects an operation pattern based on such a selection condition.
- an operation ID and a control method of various actuators for realizing the operation are defined.
- various gestures are defined, such as sitting by storing the wheels, lifting the hand 106, making the robot 100 spin by rotating the two front wheels 102 in reverse or by rotating only one front wheel 102, trembling by rotating the front wheels 102 while they are stored, and stopping once and looking back when moving away from the user.
- the operation timing, operation time, operation direction, and the like of each actuator (drive mechanism 120) are defined in time series for each operation pattern.
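One way such time-series actuator definitions could be laid out is sketched below. The concrete fields, actuator names, and values are illustrative assumptions, not the patent's actual data format:

```python
# Sketch: each operation ID maps to a time-ordered series of actuator
# commands (start time, duration, actuator, command).
OPERATION_PATTERNS = {
    "spin_in_place": [
        (0.0, 0.2, "wheel_drive", "store_wheels"),
        (0.2, 1.0, "left_front_wheel", "forward"),
        (0.2, 1.0, "right_front_wheel", "reverse"),
    ],
}

def run_pattern(op_id):
    """Replay a pattern's commands in time order (printing instead of driving)."""
    for start, duration, actuator, command in sorted(OPERATION_PATTERNS[op_id]):
        print(f"t={start:.1f}s +{duration:.1f}s {actuator}: {command}")

run_pattern("spin_in_place")
```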
- the history data storage unit 164 sequentially stores history information of actions (actions) such as movement of the robot 100 and gestures. This history information is transmitted to the server 200, for example, at the timing of termination processing when the power is turned off.
- the history data storage unit 164 is a volatile memory and may be erased when the power is turned off.
- the data processing unit 136 includes a recognition unit 156 and an operation determination unit 150.
- the operation determination unit 150 determines the operation of the robot 100 in cooperation with the operation determination unit 222 of the server 200.
- the operation determination unit 150 includes a movement determination unit 138 and a behavior determination unit 140.
- the operation determination unit 150 also functions as a “control unit” that controls the drive mechanism 120.
- the driving mechanism 120 includes a movement driving unit 144 and an action driving unit 146.
- the movement determining unit 138 determines the moving direction of the robot 100 together with the movement determining unit 234 of the server 200.
- the movement based on the behavior map may be determined by the server 200, and the movement determining unit 138 may determine an immediate movement such as avoiding an obstacle.
- the movement drive unit 144 drives the wheels in accordance with instructions from the movement determination unit 138, thereby causing the robot 100 to move toward the movement target point.
- the action map determines the outline of the moving direction of the robot 100, but the robot 100 can also take actions corresponding to the intimacy.
- the operation ID selected by the action determination unit 236 of the server 200 is transmitted to the robot 100, and the action determination unit 140 instructs the action driving unit 146 to execute an operation pattern corresponding to the operation ID.
- the server 200 may determine some complicated operation patterns and other operation patterns may be determined by the robot 100.
- the basic operation pattern may be determined in the server 200 and the additional operation pattern may be determined in the robot 100.
- How the operation pattern determination process is shared between the server 200 and the robot 100 may be designed according to the specifications of the robot system 300.
- the behavior determination unit 140 can execute a gesture of raising both hands 106 as a gesture asking to be cuddled when a close user is nearby, or a gesture of moving the hands 106 up and down. In addition, when tired of being cuddled, it can express reluctance to be held by rotating in reverse with the front wheels 102 stored.
- the action driving unit 146 causes the robot 100 to express various gestures by driving each mechanism in accordance with instructions from the action determining unit 140. For example, when an intimate operation instruction is received from the behavior determination unit 140 while a highly intimate user is nearby, the behavior driving unit 146 drives the wheel driving mechanism 370 to store the wheels and makes the robot 100 sit on the floor surface. It also lifts the hands 106 by driving the expansion / contraction drive mechanism 372, expressing a gesture asking to be held.
- the recognition unit 156 interprets external information obtained from the internal sensor 128.
- the recognition unit 156 can perform visual recognition (visual part), odor recognition (olfactory part), sound recognition (auditory part), and tactile recognition (tactile part).
- the recognizing unit 156 periodically images the outside world with the built-in camera (internal sensor 128) and detects users, that is, moving objects such as people and pets. The extracted features are transmitted to the server 200, and the person recognition unit 214 of the server 200 extracts the physical features of the moving object. The robot also detects the user's smell and voice; smells and sounds (voices) are classified into a plurality of types by known methods. The temperature detection unit 152 can also detect temperature when touched.
- the recognition unit 156 also functions as a “temperature determination unit” that determines the temperature detected by the temperature detection unit 152.
- when a strong impact is applied to the robot 100, the recognizing unit 156 recognizes it with the built-in acceleration sensor, and the reception recognition unit 228 of the server 200 recognizes that a "violent act" has been performed by a nearby user. When the user grabs the horn 112 and lifts the robot 100, this may also be recognized as a violent act.
- when a user facing the robot 100 speaks in a specific volume range and frequency band, the reception recognition unit 228 of the server 200 may recognize that a "speaking action" has been performed on the robot 100. When a temperature around body temperature is detected, it is recognized that a "contact act" has been performed by the user, and when an upward acceleration is detected while contact is recognized, it is recognized that a "cuddle" has been performed.
- physical contact when the user lifts the body 104 may be sensed, or the hug may be recognized from a decrease in the load applied to the wheels.
- the recognizing unit 156 thus also functions as a "lifting determination unit" that determines that the robot has been picked up by the user.
- the reception recognition unit 228 of the server 200 recognizes various types of user responses to the robot 100.
- Some typical response actions among these various response actions are associated with pleasure or discomfort, affirmation or denial.
- most response actions that constitute pleasant acts are also positive reactions, and most response actions that constitute unpleasant acts are negative reactions.
- Pleasant / unpleasant behavior is related to intimacy, and positive / negative reactions affect the behavior selection of the robot 100.
- a series of recognition processes including detection, analysis, and determination may be performed only by the recognition unit 212 of the server 200, only by the recognition unit 156 of the robot 100, or by both sharing the processing.
- the closeness management unit 220 of the server 200 changes the closeness to the user according to the response action recognized by the recognition unit 156. In principle, the intimacy for a user who has performed a pleasant act is increased, and the intimacy for a user who has performed an unpleasant action is decreased.
- the recognition unit 212 of the server 200 determines pleasure or discomfort from the response, and the map management unit 210 may change the z value of the point where the pleasant or unpleasant act was performed in the action map expressing "attachment to a place".
- when pleasant acts are frequently received in the living room, the map management unit 210 may set the living room as a favored point. In this case, a positive feedback effect is realized: the robot 100 comes to like the living room, receives more favors there, and likes it even more.
- the person recognition unit 214 of the server 200 detects a moving object from various data obtained from the external sensor 114 or the internal sensor 128, and extracts its features (physical features and behavioral features). Based on these features, cluster analysis is performed on a plurality of moving objects. As moving objects, not only humans but also pets such as dogs and cats may be analyzed.
- the robot 100 periodically takes images, and the person recognition unit 214 recognizes the moving object from these images and extracts the feature of the moving object.
- physical and behavioral characteristics are also extracted from the odor sensor, the built-in sound-collecting microphone, the temperature sensor, and the like. For example, when a moving object appears in an image, various features are extracted, such as having a beard, being active early in the morning, wearing red clothes, smelling of perfume, having a loud voice, wearing glasses, wearing a skirt, having white hair, being tall, being fat, being tanned, or being on the couch.
- the intimacy for the user changes depending on what action is taken from the moving object (user).
- the closeness management unit 220 increases or decreases the closeness for each clustered user.
- the intimacy changes mainly by (1) detection (visual recognition), (2) physical contact, and (3) voice call.
- for example, when an infant is detected, the infant is "visually recognized" by the robot 100. More specifically, visual recognition is determined when deep learning judges, from the feature information obtained from the captured image and other feature information obtained from the odor sensor and the like at the time of shooting, that the features of the detected moving object match the infant's cluster (profile). When visual recognition is determined, the closeness management unit 220 increases the closeness of the infant. A user who is detected more frequently tends to acquire higher familiarity. This control method mimics the biological tendency to feel familiar with people one meets often.
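The frequency-driven closeness increase described above can be sketched as follows; the increment size is an illustrative assumption:

```python
# Sketch: each visual-recognition event for a matched user cluster raises
# that user's closeness, so frequently seen users accumulate more of it.
from collections import defaultdict

closeness = defaultdict(int)

def on_visual_recognition(user_id, amount=1):
    """Called when detected features match a known user cluster."""
    closeness[user_id] += amount

for _ in range(5):
    on_visual_recognition("infant")   # seen often -> closeness grows
on_visual_recognition("visitor")      # seen once

print(closeness["infant"], closeness["visitor"])  # 5 1
```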
- the recognizing unit 156 of the robot 100 may recognize the face image of a user directly facing it, recognize the gaze direction from the face image, and, when the gaze direction is directed toward the robot for a predetermined time or more, recognize that it is being watched.
- Physical contact: when the robot 100 detects a touch from a user, for example when it is hugged by the mother, the closeness management unit 220 increases the closeness of the mother.
- the robot 100 may detect a touch on itself by covering the outer shell with a piezoelectric fabric. A touch may also be detected by sensing the user's body temperature with the temperature sensor. When the robot 100 detects a hug, the closeness may be greatly increased on the assumption that strong affection for the robot 100 has been shown.
- conversely, when a violent act is detected, the closeness management unit 220 decreases the closeness. For example, when the robot is handled roughly by the infant, the closeness management unit 220 greatly reduces the familiarity with the infant.
- such a control method mimics the biological tendency to feel close to those who touch gently, and to dislike violent people.
- Voice call: the robot 100 also changes the familiarity when it detects a voice directed at itself. For example, when its own name or a term of endearment is detected within a predetermined volume range, the closeness is increased. Typical term patterns such as "cute", "fun", and "come here" may be registered in advance as terms of endearment, and whether an utterance is such a term may be determined by voice recognition. On the other hand, when spoken to at a loud volume exceeding the normal range, the closeness may be lowered; being scolded loudly or startled reduces closeness. Moreover, when an aversive term is used, the closeness may be reduced. Typical term patterns such as "hey!", "don't come here", "go away", and "stupid" may be registered in advance as aversive terms, and whether an utterance is such a term may be determined by voice recognition.
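The voice-call handling above can be sketched as a small decision function. The term lists, volume thresholds, and step sizes are illustrative assumptions, not values from this disclosure:

```python
# Sketch: closeness change from a detected utterance directed at the robot.
ENDEARMENT_TERMS = {"cute", "fun", "come here"}
AVERSION_TERMS = {"hey!", "don't come here", "go away", "stupid"}

def closeness_delta(term, volume_db, normal_range=(30, 70)):
    """Positive for endearment at normal volume, negative for loud or aversive speech."""
    low, high = normal_range
    if volume_db > high:          # yelled at or startled -> decrease
        return -2
    if term in AVERSION_TERMS:    # aversive term -> decrease
        return -1
    if term in ENDEARMENT_TERMS and low <= volume_db <= high:
        return +1                 # endearment within normal volume -> increase
    return 0

print(closeness_delta("cute", 50))    # +1
print(closeness_delta("stupid", 50))  # -1
print(closeness_delta("cute", 90))    # -2 (too loud)
```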
- the name of the robot 100 may be registered in advance by the user.
- the robot 100 may recognize a term that is frequently applied to it, among the various terms it hears, as its own name. In this case, terms that tend to appear frequently in general, such as "hey" and "come here", may be excluded from name candidates.
- the robot 100 thus sets a high intimacy for people it often meets, people who often touch it, and people who often speak to it. Conversely, intimacy is low for people it rarely sees, people who rarely touch it, violent people, and people who scold it loudly.
- the robot 100 changes the familiarity for each user based on various external information detected by sensors (visual sense, tactile sense, auditory sense).
- the intimacy management unit 220 decreases the intimacy with time.
- the closeness management unit 220 may decrease the closeness of all users by 1 every 10 minutes. In other words, the user cannot maintain an intimate relationship with the robot 100 unless he / she continues to pet the robot 100.
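The time decay above (all closeness values dropping by 1 every 10 minutes) can be sketched as follows; the floor of 0 is an assumption for illustration:

```python
# Sketch: closeness fades over time unless the user keeps interacting.
DECAY_INTERVAL_S = 600  # 10 minutes, per the example in the text
DECAY_AMOUNT = 1

def apply_decay(closeness, elapsed_s):
    """Return closeness values after the elapsed number of decay intervals."""
    ticks = int(elapsed_s // DECAY_INTERVAL_S)
    return {u: max(0, c - DECAY_AMOUNT * ticks) for u, c in closeness.items()}

print(apply_decay({"father": 5, "mother": 12}, elapsed_s=1800))  # 3 ticks
# -> {'father': 2, 'mother': 9}
```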
- the actual robot 100 autonomously performs complex action selection according to the action map.
- the robot 100 behaves while being influenced by a plurality of action maps based on various parameters such as loneliness, boredom, and curiosity. When the influence of the action maps is excluded, or in an internal state where their influence is small, the robot 100 in principle tries to approach people with high intimacy and to move away from people with low intimacy.
- the behavior of the robot 100 is categorized as follows according to intimacy.
- Cluster with very high intimacy: the robot 100 strongly expresses affection by approaching the user (hereinafter referred to as "proximity behavior") and by performing an affection gesture defined in advance as a gesture of favoring a person.
- Cluster with relatively high intimacy The robot 100 performs only the proximity action.
- Cluster with relatively low familiarity The robot 100 does not perform any special action.
- Cluster with a particularly low degree of intimacy The robot 100 performs a leaving action.
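The four intimacy bands above can be sketched as a threshold function. The numeric thresholds are illustrative assumptions; the text only defines the four behavior categories:

```python
# Sketch: select behavior from the intimacy value, per the four clusters.
def select_behavior(intimacy):
    if intimacy >= 80:            # very high: approach + affection gesture
        return ["approach", "affection_gesture"]
    if intimacy >= 50:            # relatively high: proximity behavior only
        return ["approach"]
    if intimacy >= 20:            # relatively low: no special action
        return []
    return ["move_away"]          # particularly low: leaving action

print(select_behavior(90))  # ['approach', 'affection_gesture']
print(select_behavior(10))  # ['move_away']
```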
- the robot 100 approaches a user when a user with a high degree of closeness is found, and conversely moves away from the user when a user with a low closeness is found.
- in this way, so-called "shyness toward strangers" can be expressed.
- the robot 100 may move away from the visitor toward the family (user B with a high intimacy).
- breathing may be made to appear rougher by increasing the expansion / contraction speed of the expansion / contraction drive mechanism 372, or the stored left and right front wheels 102 (left wheel 102a, right wheel 102b) may be driven: the left and right front wheels 102 may be rotated simultaneously in opposite directions about the axle, or rotated alternately.
- when it is detected that user B is hugging the robot, a sense of relief can be expressed by slowing the breathing expression, that is, by reducing the expansion / contraction speed of the expansion / contraction drive mechanism 372.
- user B can feel that the robot 100 is anxious because of its shyness and is relying on him or her.
- in user B, the joy of being chosen and relied upon, and the feeling of attachment that accompanies it, are evoked.
- the closeness of the robot 100 to the user A gradually increases, and the robot 100 does not perform a shy behavior (withdrawal behavior) to the user A.
- the user A can also be attached to the robot 100 by feeling that the robot 100 has become familiar with him.
- the above action selection is not always executed. For example, when the internal parameter indicating the robot 100's curiosity is high, an action map for finding a place that satisfies curiosity is emphasized, so the robot 100 may not select an action influenced by intimacy. When the external sensor 114 installed at the entrance detects the user's return home, the action of welcoming the user may be executed with the highest priority.
- when the robot 100 determines that the wheel storage condition is satisfied, the wheel drive mechanism 370 is driven, and the wheels are raised into the storage space S into a non-grounded state. As the wheels are stored, the body 104 comes into contact with the floor, expressing a gesture in which the robot 100 sits down and waits for a hug. By moving the hands 106 as described above, a gesture asking to be held can also be expressed.
- the emotion of the robot 100 can also be expressed by rotationally driving the wheels while they are stored. For example, when the user about to hold it has low intimacy, the robot 100 determines that the drive condition is satisfied and executes control such as rotating the left and right wheels in opposite directions or alternately switching their rotation directions. As a result, a gesture in which the robot 100 twists the body 104 and resists being held can be expressed.
- a general robot is equipped with a device such as a fan for forcibly circulating air.
- the operating sound of a fan makes the user aware of a "machine" and is preferably suppressed as much as possible. Therefore, the robot 100 of the present embodiment drives the expansion / contraction drive mechanism 372 to take outside air into the body 104 and expel internal air, so as to minimize the mechanical noise generated in cooling heat-generating components such as the CPU.
- since the expansion / contraction of the robot 100 looks like the breathing of a living thing, it makes the robot 100 feel closer to a living creature.
- such a breathing expression changes according to the internal state of the robot 100 and the surrounding environment. For example, after frequent movement or other relatively intense motion, shortness of breath after exercise can be expressed by increasing the pace of the breathing motion based on the history information. When a high ambient temperature is detected due to the season or the air-conditioning state, the robot 100 appearing to feel hot can be expressed by likewise increasing the pace of the breathing motion. Such control enhances the sense of life of the robot 100.
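The state-dependent breathing pace above can be sketched as below. The baseline rate, thresholds, and scaling factors are illustrative assumptions:

```python
# Sketch: breathing pace rises after intense recent motion (from history
# information) or in hot surroundings, per the description in the text.
def breathing_rate(recent_moves_per_min, ambient_temp_c, base_rate=12.0):
    """Breaths per minute for the expansion/contraction drive."""
    rate = base_rate
    if recent_moves_per_min > 10:   # recent intense motion: short of breath
        rate *= 1.5
    if ambient_temp_c > 30:         # hot surroundings: breathe harder
        rate *= 1.3
    return round(rate, 1)

print(breathing_rate(2, 22))    # 12.0 (calm)
print(breathing_rate(15, 32))   # 23.4 (short of breath and hot)
```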
- FIG. 10 is a flowchart illustrating wheel drive control of the robot 100.
- the processing in this figure is repeatedly executed at a predetermined control cycle.
- the internal sensor 128 periodically measures the internal temperature and room temperature (ambient temperature) of the robot 100.
- when the recognition unit 156 detects that a user is nearby (Y in S10) and the wheels are not in the storage state (N in S12), the behavior determination unit 140 issues a wheel storage instruction to the behavior driving unit 146 (S14).
- the wheels (the front wheel 102 and the rear wheel 103) are housed in the body 104, and the robot 100 is seated on the floor. If the wheel is already stored (Y in S12), the process in S14 is skipped.
- the behavior determination unit 140 issues an intimate operation instruction to the behavior driving unit 146 (S18).
- the drive circuit 620 of the expansion / contraction drive mechanism 372 is switched on and off, and a gesture of moving the hands 106 up and down and asking to be held is expressed.
- when the recognition unit 156 recognizes that the user has lifted the robot within a predetermined time (Y in S20), the behavior determination unit 140 selects the breathing motion (S22), and the behavior driving unit 146 executes breathing motion control A (S24).
- breathing motion control A expresses a breathing motion conveying comfort, for example by gently driving the expansion / contraction drive mechanism 372.
- when lifting by the user is not detected (N in S20), the processes of S22 and S24 are skipped.
- when it is recognized that the robot has been picked up (Y in S26), the behavior determination unit 140 issues a wheel-specific rotation instruction to the behavior driving unit 146 (S28). As a result, the left and right front wheels 102 are driven while stored, expressing a gesture of resisting the hug. If being picked up is not recognized (N in S26), the processes from S28 onward are skipped.
- in the wheel storage state (Y in S30), the behavior determination unit 140 issues a wheel advance instruction to the behavior driving unit 146 (S32). As a result, the wheels advance from the storage space S and the robot becomes movable. If the wheels are not stored (N in S30), the process of S32 is skipped.
- the behavior determination unit 140 selects the breathing motion (S36) and instructs the behavior driving unit 146 to perform breathing motion control B (S38).
- breathing motion control B is set based on the internal temperature of the robot 100 and the ambient temperature. For example, when the cooling load toward an appropriate temperature is determined to be large from the internal temperature and the ambient temperature, cooling is promoted by driving the expansion / contraction drive mechanism 372 at a relatively high speed. When the cooling load is determined to be small, the expansion / contraction drive mechanism 372 is driven at a moderate speed. Such control improves cooling efficiency and saves power.
- the change rate of the internal temperature may be defined as a temperature increase rate per predetermined period, for example, 5 seconds. This is because when the rate of temperature increase is large, it is expected that cooling will be required immediately even if the internal temperature t is low at this time.
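Breathing motion control B as described above can be sketched as follows: the drive speed is chosen from the internal temperature, the ambient temperature, and the rate of internal temperature rise over a short window (5 seconds in the example). The thresholds, the load formula, and the speed levels are illustrative assumptions:

```python
# Sketch: pick an expansion/contraction drive speed from the thermal state.
def cooling_drive_speed(internal_c, ambient_c, rise_c_per_5s, target_c=40.0):
    """Return a drive-speed level for breathing motion control B."""
    # Rough cooling load relative to the appropriate internal temperature:
    load = (internal_c - target_c) + 0.5 * (ambient_c - 25.0)
    if rise_c_per_5s > 1.0:   # heating quickly: cool pre-emptively even if cool now
        return "high"
    if load > 10:
        return "high"         # large cooling load: drive relatively fast
    if load > 0:
        return "normal"       # modest load: moderate speed
    return "slow"             # little load: save power

print(cooling_drive_speed(55, 30, 0.2))  # high (large load)
print(cooling_drive_speed(38, 24, 1.5))  # high (fast temperature rise)
print(cooling_drive_speed(35, 22, 0.1))  # slow
```

The pre-emptive branch corresponds to the remark above that a large temperature-rise rate means cooling will soon be needed even while the internal temperature is still low.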
- the robot 100 and the robot system 300 including the robot 100 have been described above based on the embodiment.
- the wheel is stored, and the gesture of sitting and waiting for a cuddle is expressed.
- the presence or absence of wheel driving while stored, and the driving mode, are changed according to the user's familiarity to express the emotion of the robot 100.
- since wheel storage realizes a structure that is unlikely to soil the user when the robot is held, it encourages physical contact between the user and the robot 100. That is, the user's sense of physical distance to the robot 100 can be reduced.
- the outer skin 314 of the body 104 functions as an expansion / contraction body, and intake and exhaust of the body 104 are performed by opening and closing the intake valve 352 and the exhaust valve 362 in accordance with the expansion and contraction of the outer skin 314.
- the heat generating component in the robot 100 can be cooled to an appropriate temperature, and failure or deterioration due to heat can be prevented.
- breathing motion can be expressed by the expansion and contraction of the body 104, and the robot 100 can be given a sense of life.
- at high temperatures, the computer may suffer problems such as stoppage of the processor 122 and destruction of data in the memory (storage device 124).
- when a cooling device such as a fan is used, noise increases as its operating level rises.
- a sense of life can be given by operating the expansion / contraction mechanism to take in and expel air, showing the breathing operation of the robot 100.
- the robot 100 has a feature of autonomous behavior. When the robot 100 moves to the cool spot C by itself, cooling without excessively relying on the cooling device is possible. Such a control method also contributes to power saving of the robot 100.
- FIG. 11 is a schematic diagram illustrating the configuration and operation of the robot 500 according to the second embodiment.
- FIG. 11A is a side view
- FIG. 11B is a front view.
- the dotted line in the figure indicates a state in which the wheel can advance from the storage space S and can travel
- the solid line in the figure indicates a state in which the wheel is stored in the storage space S.
- the robot 500 includes a pair of wheels 502 (left wheel 502a and right wheel 502b) for traveling on two wheels, and a pair of drive mechanisms 504 for driving the wheels.
- the left wheel 502a and the right wheel 502b are drive wheels that can be individually controlled.
- each wheel 502 includes a DD motor 396 and a wheel body 397.
- An axle 398 of the wheel 502 is provided integrally with a rotation shaft 506 extending in the vertical direction of the body 104.
- the rotation shaft 506 functions as a steering shaft for the wheels 502.
- the drive mechanism 504 includes a steering mechanism 508 for steering the wheels 502 and an elevating mechanism 510 for raising and lowering the wheels 502.
- the steering mechanism 508 is an actuator that rotates the rotation shaft 506 about its axis, and includes a motor and a reduction gear (not shown).
- the elevating mechanism 510 is an actuator that raises or lowers the rotating shaft 506 toward the inside of the storage space S, and includes a rack and pinion mechanism (not shown).
- the robot 500 is configured on the so-called inverted-pendulum principle and can run stably on two wheels.
- various traveling states such as forward, reverse, right turn, left turn, right rotation, and left rotation can be realized.
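The traveling states listed above all follow from differential control of the two drive wheels. A minimal sketch of that mapping, assuming a simple kinematic model (the function and the speed values below are illustrative, not from the specification):

```python
def wheel_speeds(forward, turn):
    """Map a commanded forward speed and turn rate to (left, right)
    wheel speeds for a differential two-wheel drive.

    forward > 0 moves ahead; turn > 0 turns right, so the left wheel
    speeds up and the right wheel slows by the same amount.
    """
    left = forward + turn
    right = forward - turn
    return left, right

# The traveling states named in the text:
assert wheel_speeds(1.0, 0.0) == (1.0, 1.0)     # forward
assert wheel_speeds(-1.0, 0.0) == (-1.0, -1.0)  # reverse
assert wheel_speeds(1.0, 0.5) == (1.5, 0.5)     # right turn
assert wheel_speeds(0.0, 1.0) == (1.0, -1.0)    # right rotation in place
```

Inverted-pendulum balancing itself would add a feedback term on the body pitch angle on top of these commands.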
- when the wheels are stored, the lifting mechanism 510 is driven in one direction so that the pair of wheels 502 rise from the floor surface F (see the dashed arrow). As a result, the body 104 descends and the seating surface 108 contacts the floor surface F (see the solid arrow), realizing a state in which the robot 500 is sitting. By driving the lifting mechanism 510 in the opposite direction, each wheel advances from the storage space S and the robot 500 rises.
- the drive mechanism 504 is provided on the outer wall of the lower half 382 of the main body frame 310 (see FIG. 2), and the power and signal lines are guided into the main body frame 310 through a seal member, ensuring the sealing performance of the communicating path 355.
- FIG. 12 is a diagram illustrating the configuration and operation of the robot 600 according to the third embodiment.
- FIG. 12A shows the state in which the wheels (front wheel 102, rear wheel 103) have advanced from the storage space S and can travel, and
- FIG. 12B shows the state in which the wheels are stored in the storage space S.
- the robot 600 differs from the first embodiment in that the storage space S is closed after the wheels are stored.
- Each of the pair of wheel covers 612 has an upper end portion rotatably connected to the body frame 618. That is, the rotation shaft 621 of the wheel cover 612 is pivotally supported by the body frame 618.
- An actuator 622 for rotationally driving the rotation shaft 621 is provided inside the body frame 618.
- the wheel cover 612 has a smooth shape that curves from the proximal end having the rotation shaft 621 toward the distal end.
- the outer skin 314 is in close contact with the outer surface of the wheel cover 612.
- a fitting portion 619 having a concave cross section is provided on each of the left and right side walls of the lower portion of the body frame 618.
- the fitting portion 619 is a fitting groove formed so as to face the wheel cover 612.
- a front end portion of the wheel cover 612 is detachably fitted to the fitting portion 619.
- a sensor 630 capable of detecting that the storage space S is closed by the wheel cover 612 after the wheels are stored is provided in the vicinity of the fitting portion 619.
- the sensor 630 is a reflective photosensor and irradiates light toward the fitting portion 619 side (downward). When the wheel cover 612 is fitted into the fitting portion 619, reflected light is detected by the sensor 630. With the detection of the reflected light, it is determined that the storage space S is closed.
- when the pair of wheel covers 612 are in their most open state, parts of the wheels protrude outside the storage space S. At this time, since the wheel cover 612 is not positioned on the optical axis (see the two-dot chain line), the sensor 630 is off.
- when the actuator 622 is driven in one direction after the wheels are stored, as shown in FIG. 12B, the pair of wheel covers 612 move so as to reduce the distance between them, and their tips fit into the fitting portions 619.
- the storage space S is closed by this fitting.
- the sensor 630 is then turned on, and complete storage of the wheels is detected.
- the wheels are driven in the stored state only on the condition that the sensor 630 has detected that they are completely stored.
- when the tips of the pair of wheel covers 612 are released from the fitting portions 619, the storage space S is opened and the wheels can advance.
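The sensor-gated condition above, that the stored wheels may spin only after the cover is confirmed closed, can be sketched as follows (the class and method names are assumptions for illustration; the reflective photosensor 630 is modeled as a boolean):

```python
class StoredWheelController:
    """Allow stored-state wheel drive only after sensor 630 confirms
    that the wheel covers 612 have closed the storage space S
    (illustrative sketch; not the specification's implementation)."""

    def __init__(self):
        self.space_closed = False

    def on_sensor(self, reflected_light):
        # Reflected light means a cover tip is seated in the fitting
        # portion 619, i.e. the storage space S is closed.
        self.space_closed = reflected_light

    def request_stored_drive(self):
        # The wheels may be driven in the stored state only when the
        # storage space is confirmed closed.
        return self.space_closed


ctrl = StoredWheelController()
assert ctrl.request_stored_drive() is False  # cover open: drive refused
ctrl.on_sensor(True)                         # cover fitted, sensor 630 on
assert ctrl.request_stored_drive() is True   # stored-state drive allowed
```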
- by completely storing the wheels with the storage space S closed by the wheel covers 612, the wheels can be reliably prevented from soiling the user's clothes. Further, by fitting the tips of the wheel covers 612 to the frame (body frame 618), interference between the user's body and the wheels can be reliably prevented. Further, by moving the pair of wheel covers 612 inward together with the storage of the wheels, the body 104 as a whole deforms into a more rounded state, softening the appearance of the robot 600. Owing to these effects, the user is naturally inclined to hold the robot 600 with peace of mind, reducing the physical and mental distance to the robot 600.
- a reflective photosensor is exemplified as a sensor for detecting complete storage of the wheel, but a transmissive photosensor may be used.
- a light-emitting element may be provided on one side of the fitting portion 619 and a light-receiving element on the other side, and it may be determined that the wheel cover 612 is fitted into the fitting portion 619, that is, that the storage space S is closed, when light from the light-emitting element to the light-receiving element is blocked.
- alternatively, the wheel cover 612 may be detected by a magnetic sensor or another sensor to determine that the storage space S has been closed, or closure of the wheel cover 612 may be determined by a mechanical switch.
- the sensor 630 may be configured to detect that the wheel cover 612 is closed by an arbitrary method such as non-contact or contact.
- although the robot system 300 described above comprises one robot 100, one server 200, and a plurality of external sensors 114, some functions of the robot 100 may be realized by the server 200, and some or all functions of the server 200 may be assigned to the robot 100.
- One server 200 may control a plurality of robots 100, or a plurality of servers 200 may cooperate to control one or more robots 100.
- a third device other than the robot 100 and the server 200 may take on part of the functions.
- the set of the functions of the robot 100 described in FIG. 9 and the functions of the server 200 can be grasped collectively as one "robot". How the functions needed to realize the present invention are allocated to one or more pieces of hardware may be decided in view of the processing capability of each piece of hardware, the specifications required of the robot system 300, and the like.
- a "robot" in the narrow sense refers to the robot 100 excluding the server 200, while a "robot" in the broad sense refers to the robot system 300. Many of the functions of the server 200 may be integrated into the robot 100 in the future.
- in the embodiment, a configuration in which the pair of hands 106 is displaced in accordance with the expansion/contraction deformation of the trunk was shown.
- the two hands 106 may be individually controlled, and simple operations such as raising, shaking, and vibrating may be possible.
- a wire may be embedded in the hand 106.
- the hand 106 can be lifted by the driving mechanism 120 pulling the hand 106 through the wire.
- a gesture of waving the hand 106 is also possible by vibrating it, and more complex gestures can be expressed by using a larger number of wires.
- the robot 100 travels on three wheels, with the front wheels as drive wheels and the rear wheel as a driven wheel.
- both the front wheels and the rear wheel may be drive wheels, or only one of them may be. It is preferable that all the wheels can be stored in the body 104 by the drive mechanism.
- in the embodiment, the intake port 354 is provided at the uppermost part of the body 104 and the exhaust port 364 at the lowermost part; however, it suffices that the intake port 354 is disposed relatively above the exhaust port 364, and these arrangements can be set as appropriate.
- the behavior determination unit 140 may determine the activity amount (motion amount) of the robot 100 based on the history information stored in the history data storage unit 164, and may drive the expansion/contraction drive mechanism faster, or increase its drive frequency, as the recent activity amount increases, promoting cooling. A state of high activity is considered to correspond to a state of high internal temperature in the robot 100, so the cooling efficiency increases as a result. In appearance, breathing becomes heavier after vigorous movement, further enhancing the robot 100's sense of life.
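One way to realize this activity-dependent breathing is to scale the drive frequency of the expansion/contraction mechanism with the recent activity amount. A sketch under assumed numbers (the function, gain, and rates are illustrative, not from the specification):

```python
def breathing_rate(base_rate_hz, recent_activity, gain=0.5, max_rate_hz=2.0):
    """Return the drive frequency of the expansion/contraction mechanism.

    recent_activity is a non-negative score derived from the history
    data (e.g. distance travelled in the last few minutes). A higher
    score drives the mechanism faster, which both increases cooling
    airflow and mimics heavy breathing after vigorous movement.
    """
    rate = base_rate_hz + gain * recent_activity
    return min(rate, max_rate_hz)

assert breathing_rate(0.2, 0.0) == 0.2    # at rest: slow breathing
assert breathing_rate(0.2, 1.0) == 0.7    # active: faster breathing
assert breathing_rate(0.2, 10.0) == 2.0   # capped at the maximum rate
```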
- the valve body may be configured to be able to approach or move away from the intake port or the exhaust port.
- a valve structure including a valve body that is detachably opposed to the valve seat and a biasing member such as a spring that biases the valve body in the valve closing direction may be employed.
- the intake valve and the exhaust valve may be an electrically driven valve such as an electromagnetic valve driven by a solenoid or an electric valve driven by a motor.
- an air pump may be installed in the body to expand and contract the expansion / contraction body and supply / discharge air.
- in the embodiment, the intake valve is disposed relatively above the exhaust valve, but conversely it may be disposed relatively below; however, it is preferable to place the intake valve at a position away from the floor surface.
- in the embodiment, the outer skin 314 is an elastic body; in a modification it need not be elastic, and may instead be a flexible expansion/contraction body.
- a first drive mechanism that can push the expansion / contraction body from the inside and a second drive mechanism that can push and shrink the expansion / contraction body from the outside may be provided.
- the “grounding portion” may be a leg portion and may be configured to be able to walk.
- the drive mechanism drives the grounding part forward and backward from the storage space provided in the body to the outside.
- in the embodiment, a structure in which the wheels are fully accommodated in the storage space of the body was illustrated.
- the drive mechanism may be configured to retract the grounding portion into the storage space in a non-grounded state when the storage condition is satisfied while stopped. In that case, when the grounding portion is retracted into the storage space, it is preferable that more than half of it be stored. This prevents or suppresses soiling when a user picks up the robot.
- the legs may also be driven while stored in the storage space. In that case, the legs may be driven back and forth as in walking, or a separate rotation shaft may be provided so that the legs can rotate only in the stored state. The presence and manner of driving during leg storage may then be varied depending on whether the embracing user is one with high intimacy.
- in the embodiment, two conditions, namely that the robot is stopped and that a user is nearby, are set as the storage conditions for the wheels (grounding unit).
- different conditions may be set, or another condition may be added; for example, it may be added as a condition that the nearby user is one with high intimacy.
- as the driving condition in the stored state of the wheels (grounding portion), being held by a user with low intimacy was exemplified.
- different conditions may be set, or another condition may be added.
- the intimacy may be divided into levels, and it may be made a condition that the user's intimacy is lower than a predetermined level.
- the wheels in the stored state may be controlled as follows. For example, a gesture of resisting being held may be expressed by suddenly stopping the rotating wheels; conversely, a twist of the body born of the robot's joy may be expressed by stopping them slowly. Moreover, by controlling whether and in which direction the left and right wheels rotate, the robot may appeal for a change in how it is being held.
- the wheels may be controlled so as to generate an inertial force that twists the body toward a direction in which the robot can see the user's face. That is, the robot may be configured with a function of prompting the user to move it in the direction it wants to move (prompting a change in the holding state).
- when held for a long time by a user with high intimacy, the robot may express a gesture indicating it has grown weary of the hold. That is, a time measuring unit may be provided that measures the duration of holding after the lifting determination is made, and when the holding duration exceeds a predetermined reference time, a gesture of wearying of the hold may be expressed by driving the moving mechanism (grounding unit/wheels).
- the reference time may be set to be different according to the familiarity of the user.
- the familiarity may be set in a plurality of stages, and a plurality of reference times may be set so as to correspond to each stage.
- the reference time may also be varied according to the season or the ambient temperature. For example, by making the reference time relatively short in summer (when the outside air temperature is high), a gesture of disliking being held when it is hot can be expressed.
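A reference time that depends on both the intimacy stage and the ambient temperature might look like this (all numbers, and the function itself, are illustrative assumptions; the specification only states that the reference time varies by intimacy and season/temperature):

```python
def reference_time_sec(intimacy_stage, ambient_temp_c):
    """Holding duration after which the weary-of-being-held gesture is
    expressed. Higher intimacy stages tolerate longer holds; hot
    weather halves the tolerance, expressing dislike of being held
    when it is hot."""
    base = {1: 30, 2: 60, 3: 120}[intimacy_stage]  # seconds per stage
    if ambient_temp_c >= 28:   # summer / high outside air temperature
        base *= 0.5
    return base

assert reference_time_sec(3, 20) == 120   # intimate user, mild weather
assert reference_time_sec(3, 30) == 60.0  # same user, hot day: halved
assert reference_time_sec(1, 20) == 30    # low intimacy: short tolerance
```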
- the processing load on the CPU may be reduced in order to suppress an increase in internal temperature when the robot is held.
- the operation clock supplied to the CPU may be relatively slow.
- only the minimum necessary processing determined in advance may be executed by the CPU.
- a drop determination unit that determines the fall of the robot may be provided.
- the fall determination unit may include an acceleration sensor, for example, and may determine fall based on the detected value.
- when it is determined that the robot has fallen from the held state, the wheels may be advanced to absorb the impact of collision with the floor. A tire made of an elastic body may be attached to each wheel.
- the robot may include a posture detection unit.
- the posture detection unit may detect the posture by determining the direction of gravity based on the detection result of the acceleration sensor and detecting the amount of deviation between the vertical direction and each axis of the robot.
- the posture of the robot may also be detected based on surrounding images captured by the camera. In either way, the posture detection unit determines the current posture of the robot.
- control is performed such that the moving mechanisms (left and right wheels, etc.) housed in the body are rotated in opposite directions, the rotation directions are switched alternately, or the wheels are suddenly stopped after rotating.
- the center of gravity of the falling robot may be adjusted by feedback control of the rotation of the wheel according to the posture of the robot, and the posture may be balanced so that the moving mechanism is positioned below the gravitational direction.
- that is, control is performed to bring the deviation between the target posture, in which the moving mechanism is positioned below in the direction of gravity within the body, and the detected current posture close to zero. In this way, damage to the body from a fall can be prevented or mitigated.
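The feedback law described here, driving the posture deviation toward zero, can be sketched as a simple proportional controller (the gain and function names are assumptions for illustration):

```python
def posture_correction(target_angle, current_angle, kp=2.0):
    """Proportional control of wheel rotation during a fall.

    target_angle: body angle (radians) that puts the moving mechanism
    at the bottom in the direction of gravity.
    current_angle: angle estimated by the posture detection unit from
    the accelerometer's gravity vector.
    Returns a wheel torque command whose sign rotates the body so that
    the deviation shrinks toward zero.
    """
    error = target_angle - current_angle
    return kp * error

# A deviation of -0.3 rad commands a positive corrective torque ...
assert posture_correction(0.0, -0.3) == 0.6
# ... and no deviation commands no torque.
assert posture_correction(0.0, 0.0) == 0.0
```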
- the grounding surface of the moving mechanism is set as the approach point, and the posture is balanced so that it contacts the floor surface first, since the moving mechanism is designed to be sturdy.
- alternatively, a structurally strong part, or a part formed to absorb shock (a specific part), may be set as the approach point during a fall, and the posture may be balanced so that the approach point contacts the floor first.
- in the above, the moving mechanism is rotationally driven to balance the posture of the robot.
- the posture may be balanced including linear driving (translational driving) of the moving mechanism. For example, when a plurality of parts (left and right wheels, left and right legs, etc.) constituting the moving mechanism are relatively displaceable in the linear direction, the posture may be balanced including linear driving of each part.
- the lifting determination unit may determine the stability of the holding state. This stability may be determined based on any one or a combination of a positional relationship with the user when the robot is being held, a contact site, a posture, and the like.
- the driving mode (the magnitude (amplitude) or speed of driving) may be changed according to the stability of the holding state. For example, when it is determined that the robot is in a relatively stable holding state, such as being supported on the user's knees with both hands, it may be driven relatively vigorously; when it is determined to be in a relatively unstable holding state, such as being supported by the user with one hand, it may be driven relatively gently or driving may be prohibited.
- the stability of the holding state may be divided into a plurality of levels, and the driving mode may be varied depending on the level.
- in the embodiment, a gesture of resisting being held is expressed by driving the moving mechanism, but the gesture may instead be expressed by driving a mechanism other than the moving mechanism, for example by shaking the head sideways or waving a hand vigorously.
- storage of the moving mechanism may be prohibited when the robot is suddenly lifted while moving. At that time, it is preferable to urgently stop the drive system including the moving mechanism.
- the driving of the moving mechanism may be locked when it is determined that the robot is picked up.
- the driving of the moving mechanism may be locked when non-grounding is detected, such as when the grounding surface of the moving mechanism (grounding unit / wheel) moves away from the floor surface, regardless of factors such as lifting.
- Such a configuration may be applied regardless of whether the robot has a storage space for the moving mechanism.
- Such an autonomous behavior type robot can be defined as follows.
- the robot includes a body, a moving mechanism having a grounding surface during movement, a drive mechanism that drives the moving mechanism, and an operation determination unit that determines the grounding state (presence or absence of grounding) of the grounding surface.
- the drive mechanism locks the operation of the moving mechanism when it is determined that the grounding surface is not grounded.
- the body or the moving mechanism may be provided with a sensor (a touch sensor, a proximity sensor, or the like) for detecting a grounding state (grounding presence / absence) of the grounding surface. According to such a configuration, it is possible to solve technical problems such as maintaining a sense of stability when the user lifts the autonomous behavior type robot and ensuring safety at that time.
- the intake valve and the exhaust valve may be configured by an electrically driven valve such as an electromagnetic valve or an electric valve.
- a fan may be disposed in the communication path 355 in the trunk frame 318.
- a differential pressure can be generated inside and outside the body 104, and the intake valve 352 and the exhaust valve 362 can be opened simultaneously. Thereby, the outside air can be circulated in the body 104.
- the behavior determination unit 140 can adjust the operation level by adjusting the rotational speed of the fan according to the internal temperature of the robot 100.
- the above-described embodiments and modifications can also be defined as an autonomous behavior type robot having the following configuration.
- the robot includes a hollow body having a head and a torso; an intake port for taking outside air into the body; an exhaust port for exhausting inside air from the body; a communicating path that connects the intake port and the exhaust port and forms a sealed space when both are closed; an expansion/contraction body that is disposed in the torso and forms at least part of the communicating path; an intake/exhaust mechanism that opens the intake port with the exhaust port closed and opens the exhaust port with the intake port closed; a moving mechanism (grounding portion) having a grounding surface during movement; and a second drive mechanism that advances and retracts the moving mechanism from a storage space provided in the body. The second drive mechanism retracts the moving mechanism into the storage space in a non-grounded state when the storage condition is satisfied while stopped.
- alternatively, the robot includes a hollow body having a head and a torso; an intake port for taking outside air into the body; an exhaust port for exhausting inside air from the body; a communicating path that connects the intake port and the exhaust port; an expansion/contraction body that is disposed in the torso and forms at least part of the communicating path; a first drive mechanism that expands and contracts the expansion/contraction body; an intake/exhaust mechanism that increases the opening of the intake port in response to expansion of the expansion/contraction body and increases the opening of the exhaust port in response to its contraction; a moving mechanism having a grounding surface during movement; and a second drive mechanism that advances and retracts the moving mechanism from a storage space provided in the body. The second drive mechanism retracts the moving mechanism into the storage space in a non-grounded state when the storage condition is satisfied while stopped.
- when the internal temperature rises while the robot is being held and exceeds a predetermined criterion value, or when it is determined that the criterion will be exceeded after a predetermined time, the robot may behave so as to be bothersome to hold. For the latter, whether the criterion will be reached may be determined (estimated) based on, for example, the current internal temperature and the temperature-rise gradient. For example, control such as rotating the left and right wheels housed in the body in opposite directions and alternately switching the rotation directions is performed. That is, by behaving in a way that makes it difficult to hold, the robot prompts the user to lower it to the floor.
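The will-exceed-the-criterion estimate from the current temperature and rise gradient can be made, for example, by linear extrapolation. A sketch in which the horizon and threshold values are illustrative assumptions:

```python
def will_exceed(current_temp, gradient_per_min, threshold, horizon_min):
    """Estimate whether the internal temperature will exceed the
    criterion value within the given horizon, assuming the measured
    temperature-rise gradient (degrees per minute) persists."""
    projected = current_temp + gradient_per_min * horizon_min
    return projected > threshold

# 38 deg rising at 0.5 deg/min crosses a 42 deg criterion within 10 min.
assert will_exceed(38.0, 0.5, 42.0, 10) is True
# A falling temperature never triggers the bothersome-to-hold behavior.
assert will_exceed(38.0, -0.1, 42.0, 10) is False
```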
- when the user lowers the robot to the floor surface, the robot may be controlled to move away from the user and cool its internal heat-generating components. If the robot is provided with a cooling fan, the fan may be driven for cooling. By moving away first, the user is kept from hearing the rotating sound of the cooling fan and being reminded of the machine, while thermal runaway and failure of heat-generating components due to heat are prevented.
Abstract
Description
FIG. 1 shows the appearance of the robot 100 according to the first embodiment. FIG. 1(a) is a front view and FIG. 1(b) is a side view.
The robot 100 in this embodiment is an autonomous behavior type robot that determines actions and gestures based on the external environment and its internal state. The external environment is recognized by various sensors such as a camera and a thermo sensor. The internal state is quantified as various parameters expressing the emotions of the robot 100. These are described later.
As shown in FIG. 2, the body 104 of the robot 100 includes a base frame 308, a main body frame 310, a pair of wheel covers 312, and an outer skin 314. The base frame 308 is made of metal, constitutes the axial core of the body 104, and supports the internal mechanisms. The base frame 308 is configured by connecting an upper plate 332 and a lower plate 334 vertically via a plurality of side plates 336. Sufficient spacing is provided between the side plates 336 to allow ventilation. The battery 118, the control device 342, various actuators, and the like are housed inside the base frame 308.
FIG. 4 schematically shows the wheel storage operation. FIG. 4(a) is a side view and FIG. 4(b) is a front view. The dotted lines in the figure indicate the state in which the wheels have advanced from the storage space S and the robot can travel, and the solid lines indicate the state in which the wheels are stored in the storage space S.
When the switch of the drive circuit 620 is turned from on to off, as shown in FIG. 5(a), the shape-memory alloy wire 610 relaxes and extends (see the solid arrow). The outer skin 314 thereby swells back to its original state, and the internal pressure of the main body frame 310 becomes negative. As a result, the intake valve 352 opens and outside air is introduced into the body 104 (see the two-dot chain arrow). At this time, the exhaust valve 362 remains closed. In appearance, the torso of the robot 100 swells and the hands 106 are pushed up slightly.
The robot system 300 includes the robot 100, the server 200, and a plurality of external sensors 114. A plurality of external sensors 114 (external sensors 114a, 114b, ..., 114n) are installed in the house in advance. The external sensors 114 may be fixed to the walls of the house or placed on the floor. The position coordinates of the external sensors 114 are registered in the server 200, defined as x, y coordinates in the house assumed as the action range of the robot 100.
The emotion map 116 is a data table stored in the server 200. The robot 100 selects actions according to the emotion map 116, which indicates the magnitude of the robot 100's positive or negative feelings toward locations. The x-axis and y-axis of the emotion map 116 indicate two-dimensional spatial coordinates, and the z-axis indicates the magnitude of the feeling. A positive z-value indicates a high liking for the location, and a negative z-value indicates aversion to it.
The robot 100 includes the internal sensor 128, a communicator 126, a storage device 124, a processor 122, the drive mechanism 120, and the battery 118. The units are connected to one another by a power line 130 and a signal line 132. The battery 118 supplies power to each unit via the power line 130, and the units exchange control signals via the signal line 132. The battery 118 is a secondary battery such as a lithium-ion battery and is the power source of the robot 100.
As described above, the robot system 300 includes the robot 100, the server 200, and the external sensors 114. Each component of the robot 100 and the server 200 is realized by hardware, including computing units such as a CPU (Central Processing Unit) and various coprocessors, storage devices such as memory and storage, and the wired or wireless communication lines connecting them, and by software that is stored in the storage devices and supplies processing instructions to the computing units. The computer programs may be composed of device drivers, an operating system, various application programs positioned in their upper layer, and libraries that provide common functions to these programs. Each block described below indicates a functional block rather than a hardware unit. Some functions of the robot 100 may be realized by the server 200, and some or all functions of the server 200 may be realized by the robot 100.
The server 200 includes a communication unit 204, a data processing unit 202, and a data storage unit 206. The communication unit 204 handles communication with the external sensors 114 and the robot 100. The data storage unit 206 stores various data. The data processing unit 202 executes various processes based on data acquired by the communication unit 204 and data stored in the data storage unit 206, and also functions as an interface between the communication unit 204 and the data storage unit 206.
The robot 100 includes a communication unit 142, a data processing unit 136, a data storage unit 148, the drive mechanism 120, and the internal sensor 128. The communication unit 142 corresponds to the communicator 126 (see FIG. 8) and handles communication with the external sensors 114 and the server 200. The data storage unit 148 stores various data and corresponds to the storage device 124 (see FIG. 8). The data processing unit 136 executes various processes based on data acquired by the communication unit 142 and data stored in the data storage unit 148. The data processing unit 136 corresponds to the processor 122 and the computer programs executed by the processor 122, and also functions as an interface among the communication unit 142, the internal sensor 128, the drive mechanism 120, and the data storage unit 148.
When an infant is detected in an image captured by the robot 100, the infant is "visually recognized" by the robot 100. More specifically, a visual-recognition determination is made when the features of the detected moving object are determined, by deep learning based on feature information obtained from the captured image and other feature information obtained from an odor sensor or the like at the time of capture, to match an infant's cluster (profile). When this determination is made, the intimacy management unit 220 raises the infant's intimacy. The more frequently a user is detected, the more readily that user's intimacy rises. Such a control method emulates the biological behavior of readily feeling familiarity toward people one meets often.
When the robot 100 visually recognizes a user and detects a touch (physical contact) from the user, it is determined that the user has shown interest in the robot 100, and the intimacy rises. For example, when touched by the mother, the intimacy management unit 220 raises the mother's intimacy. The robot 100 may detect touches on itself by covering its outer shell with a piezoelectric fabric, or may detect a touch by sensing the user's body temperature with a temperature sensor. When the robot 100 detects being hugged, this may be regarded as a strong display of affection toward the robot 100, and the intimacy may be raised greatly.
The robot 100 also changes intimacy when it detects a voice directed at itself. For example, when it detects its own name or a term of endearment within a predetermined volume range, the intimacy rises. Typical term patterns such as "cute", "funny", and "come here" may be registered in advance as terms of endearment, and voice recognition may determine whether an utterance is one of them. On the other hand, when the robot is spoken to at a volume exceeding the normal range, the intimacy may be lowered; it decreases when the robot is scolded loudly or startled. The intimacy may also be lowered when a term of dislike is directed at the robot. Typical patterns such as "hey", "don't come", "go away", and "stupid" may be registered in advance as terms of dislike, and voice recognition may determine whether an utterance is one of them.
- (1) Cluster with very high intimacy
- The robot 100 strongly expresses affection by approaching the user (hereinafter, "approach behavior") and performing an affection gesture defined in advance as a gesture showing goodwill toward a person.
- (2) Cluster with relatively high intimacy
- The robot 100 performs only the approach behavior.
- (3) Cluster with relatively low intimacy
- The robot 100 takes no particular action.
- (4) Cluster with especially low intimacy
- The robot 100 performs withdrawal behavior.
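The four intimacy clusters above map directly onto a behavior selection rule. A sketch in which the score range and thresholds are illustrative assumptions (the specification only names the tiers):

```python
def select_behavior(intimacy):
    """Map an intimacy score (assumed 0-100) to the four behavior
    tiers described for the robot 100."""
    if intimacy >= 75:
        return "approach + affection gesture"  # (1) very high
    if intimacy >= 50:
        return "approach only"                 # (2) relatively high
    if intimacy >= 25:
        return "no particular action"          # (3) relatively low
    return "withdrawal"                        # (4) especially low

assert select_behavior(90) == "approach + affection gesture"
assert select_behavior(60) == "approach only"
assert select_behavior(30) == "no particular action"
assert select_behavior(10) == "withdrawal"
```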
A typical robot may be lifted for transport, but it is not expected to be lovingly picked up and held by a user, because designers hold the fixed notion that however biologically a robot behaves, it is after all a machine. In the robot 100 that fixed notion is removed, and the design assumes being held by the user. The wheel storage function described above is one example.
A typical robot is equipped with a device, such as a fan, that forcibly circulates air. However, in the robot 100, which emulates biological behavioral characteristics, the operating sound of a fan makes the user sense a "machine" and should be suppressed as much as possible. Therefore, the robot 100 of this embodiment drives the expansion/contraction drive mechanism 372 to take outside air into the body 104 and discharge the internal air, reducing as much as possible the mechanical noise generated in cooling heat-generating components such as the CPU. As described above, the expansion/contraction motion of the robot 100 looks in appearance like the breathing of a living creature, making the robot 100 feel closer to a living being.
The processing in this figure is repeatedly executed at a predetermined control cycle. The internal sensor 128 periodically measures the internal temperature of the robot 100 and the room temperature (ambient temperature). When the recognition unit 156 detects that a user is nearby (Y in S10) and the wheels are not stored (N in S12), the behavior determination unit 140 issues a wheel storage instruction to the behavior drive unit 146 (S14). The wheels (front wheels 102 and rear wheel 103) are thereby stored in the body 104, and the robot 100 sits on the floor. If the wheels are already stored (Y in S12), the processing of S14 is skipped.
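The control cycle described here (S10 to S14) reduces to a small state check per cycle. A sketch with hypothetical names (the specification defines the flow, not this code):

```python
def control_step(user_nearby, wheels_stored):
    """One control cycle of the wheel-storage flow (S10 to S14).

    Returns the instruction issued this cycle: the behavior
    determination unit orders wheel storage only when a user is
    detected nearby (S10: Y) and the wheels are not yet stored
    (S12: N); otherwise no instruction is needed.
    """
    if user_nearby and not wheels_stored:
        return "store_wheels"  # S14: wheels retract, robot sits on the floor
    return None

assert control_step(user_nearby=True, wheels_stored=False) == "store_wheels"
assert control_step(user_nearby=True, wheels_stored=True) is None   # S12: Y, skip S14
assert control_step(user_nearby=False, wheels_stored=False) is None # S10: N
```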
[Second Embodiment]
FIG. 11 is a schematic diagram showing the configuration and operation of the robot 500 according to the second embodiment. FIG. 11(a) is a side view and FIG. 11(b) is a front view. The dotted lines in the figure indicate the state in which the wheels have advanced from the storage space S and the robot can travel, and the solid lines indicate the state in which the wheels are stored in the storage space S.
FIG. 12 shows the configuration and operation of the robot 600 according to the third embodiment. FIG. 12(a) shows the state in which the wheels (front wheel 102, rear wheel 103) have advanced from the storage space S and the robot can travel, and FIG. 12(b) shows the state in which the wheels are stored in the storage space S.
Claims (13)
- 1. An autonomously acting robot comprising: a body; a moving mechanism having a grounding surface during movement; and a drive mechanism that, when a storage condition is satisfied, retracts the moving mechanism into a storage space provided in the body.
- 2. The autonomously acting robot according to claim 1, wherein the moving mechanism is a wheel.
- 3. The autonomously acting robot according to claim 1 or 2, wherein the drive mechanism drives the moving mechanism within the storage space when a predetermined driving condition is satisfied while the moving mechanism is stored in the storage space.
- 4. The autonomously acting robot according to claim 3, further comprising a lifting determination unit that determines that the robot has been picked up, wherein the drive mechanism drives the moving mechanism when it is determined that the robot has been picked up and the driving condition is satisfied.
- 5. The autonomously acting robot according to any one of claims 1 to 4, wherein the body has a grounding bottom surface that contacts the ground when the moving mechanism is stored in the storage space in a non-grounded state.
- 6. The autonomously acting robot according to any one of claims 1 to 5, further comprising a recognition unit that detects that a user is nearby, wherein the storage condition includes detection that a user is nearby.
- 7. The autonomously acting robot according to any one of claims 1 to 6, further comprising a fall determination unit that determines a fall of the robot, wherein the drive mechanism advances the moving mechanism from the storage space to the outside when a fall is determined while the moving mechanism is stored in the storage space.
- 8. The autonomously acting robot according to any one of claims 1 to 7, further comprising: a fall determination unit that determines a fall of the robot; and a posture detection unit that detects the posture of the robot during a fall, wherein a specific part of the robot is set as an approach point, and the drive mechanism drives the moving mechanism so as to balance the posture, based on the posture detected when a fall is determined, such that the approach point contacts the ground first.
- 9. The autonomously acting robot according to claim 4, further comprising a determination unit that determines a user's intimacy, wherein, when it is determined that the robot has been picked up with the moving mechanism stored in the storage space, the drive mechanism varies whether and how the moving mechanism is driven in the stored state according to the user's intimacy.
- 10. The autonomously acting robot according to claim 4, further comprising: a cover that opens and closes the storage space with the moving mechanism stored; and a sensor that detects opening and closing of the storage space, wherein the driving condition includes the storage space being closed.
- 11. An autonomously acting robot comprising: a body; a lifting determination unit that determines that the robot has been picked up; and a drive mechanism that drives so as to change the posture of the body when it is determined that the robot has been picked up and a predetermined driving condition is satisfied.
- 12. The autonomously acting robot according to claim 11, further comprising a time measuring unit that measures the duration of the picked-up state, wherein the driving condition is satisfied when the duration exceeds a predetermined reference time.
- 13. The autonomously acting robot according to claim 11, further comprising a temperature sensor that detects the internal temperature of the body, wherein the driving condition is satisfied when the internal temperature of the body exceeds a predetermined criterion value or when it is determined that the criterion will be exceeded.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1820152.5A GB2565959B (en) | 2016-07-11 | 2017-06-20 | Autonomously acting robot |
JP2017564923A JP6436548B2 (ja) | 2016-07-11 | 2017-06-20 | 自律行動型ロボット |
CN201780042199.6A CN109414623B (zh) | 2016-07-11 | 2017-06-20 | 行为自主型机器人 |
DE112017003480.9T DE112017003480B4 (de) | 2016-07-11 | 2017-06-20 | Autonom handelnder Roboter |
US16/233,097 US11213763B2 (en) | 2016-07-11 | 2018-12-27 | Autonomously acting robot |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016-136647 | 2016-07-11 | ||
JP2016136647 | 2016-07-11 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/233,097 Continuation US11213763B2 (en) | 2016-07-11 | 2018-12-27 | Autonomously acting robot |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018012219A1 true WO2018012219A1 (ja) | 2018-01-18 |
Family
ID=60953090
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/022674 WO2018012219A1 (ja) | 2016-07-11 | 2017-06-20 | 自律行動型ロボット |
Country Status (6)
Country | Link |
---|---|
US (1) | US11213763B2 (ja) |
JP (2) | JP6436548B2 (ja) |
CN (1) | CN109414623B (ja) |
DE (1) | DE112017003480B4 (ja) |
GB (1) | GB2565959B (ja) |
WO (1) | WO2018012219A1 (ja) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019163312A1 (ja) * | 2018-02-26 | 2019-08-29 | Sony Corporation | Information processing device, information processing method, and program |
WO2020105625A1 (ja) * | 2018-11-20 | 2020-05-28 | Groove X, Inc. | Wheel structure and robot |
JP2020099526A (ja) * | 2018-12-21 | 2020-07-02 | Sega Toys Co., Ltd. | Figure toy and method for controlling a figure toy |
WO2020158641A1 (ja) * | 2019-01-31 | 2020-08-06 | Sony Corporation | Robot control device, robot control method, and program |
WO2021039191A1 (ja) * | 2019-08-27 | 2021-03-04 | Sony Corporation | Information processing device, control method therefor, and program |
CN114918979A (zh) * | 2022-06-30 | 2022-08-19 | Shanghai Keenon Robotics Co., Ltd. | Floating tray and robot |
WO2023037609A1 (ja) * | 2021-09-10 | 2023-03-16 | Sony Group Corporation | Autonomous mobile body, information processing method, and program |
WO2024071166A1 (ja) * | 2022-09-29 | 2024-04-04 | Nitto Denko Corporation | Robot |
JP7501536B2 (ja) | 2019-08-27 | 2024-06-18 | Sony Group Corporation | Information processing device, control method therefor, and program |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10788235B2 (en) * | 2014-11-07 | 2020-09-29 | Sony Corporation | Control system, control method, and storage medium |
JP6565853B2 (ja) * | 2016-09-29 | 2019-08-28 | Toyota Motor Corporation | Communication device |
KR102616403B1 (ko) * | 2016-12-27 | 2023-12-21 | Samsung Electronics Co., Ltd. | Electronic device and message delivery method thereof |
WO2019035913A1 (en) * | 2017-08-15 | 2019-02-21 | Reconrobotics, Inc. | MAGNETIC LATCH FOR ROBOT THAT CAN BE LAUNCHED |
CN109831717B (zh) * | 2017-11-23 | 2020-12-15 | UBTECH Robotics Corp | Noise reduction processing method, system, and terminal device |
KR102148032B1 (ko) * | 2018-06-25 | 2020-08-26 | LG Electronics Inc. | Robot |
KR102148031B1 (ko) | 2018-06-25 | 2020-10-14 | LG Electronics Inc. | Robot |
TWI704471B (zh) * | 2018-09-27 | 2020-09-11 | Compal Electronics, Inc. | Interactive electronic device and interaction method thereof |
WO2020235704A1 (ko) * | 2019-05-21 | 2020-11-26 | LG Electronics Inc. | Action robot |
CN112659114B (zh) * | 2019-10-16 | 2022-11-22 | Shenzhen University | Single-channel automatic inflation/deflation system and air-pressure stabilization control method |
WO2021131959A1 (ja) * | 2019-12-27 | 2021-07-01 | Sony Group Corporation | Information processing device, information processing method, and information processing program |
CN111571558B (zh) * | 2020-05-26 | 2021-01-15 | Shanghai Luoke Industrial Group Co., Ltd. | Interactive educational robot |
CN111591091A (zh) * | 2020-05-27 | 2020-08-28 | Zhongshuan Machinery Technology Co., Ltd. (Quanzhou Taiwan Businessmen Investment Zone) | Walking robot structure device based on a double-layer tire arrangement |
TWI767612B (zh) * | 2021-03-16 | 2022-06-11 | Kun Shan University | Deep-learning balance control method for a two-wheeled machine |
CN113425560A (zh) * | 2021-06-16 | 2021-09-24 | University of Shanghai for Science and Technology | Fan-folding miniature obstacle-crossing walking aid |
WO2024025192A1 (ko) * | 2022-07-27 | 2024-02-01 | Samsung Electronics Co., Ltd. | Small travel-capable robot and control method therefor |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001138273A (ja) * | 1999-11-08 | 2001-05-22 | Sony Corp | Legged mobile robot and control method thereof |
JP2003311028A (ja) * | 2002-04-26 | 2003-11-05 | Matsushita Electric Ind Co Ltd | Pet robot device |
JP2004306251A (ja) * | 2003-03-23 | 2004-11-04 | Sony Corp | Robot device and control method thereof |
JP2005144612A (ja) * | 2003-11-17 | 2005-06-09 | Sony Corp | Robot system, remote operation device, robot device, and control method thereof |
JP2011000681A (ja) * | 2009-06-19 | 2011-01-06 | Advanced Telecommunication Research Institute International | Communication robot |
US20150336276A1 (en) * | 2012-12-28 | 2015-11-26 | Future Robot Co., Ltd. | Personal robot |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3693292A (en) * | 1971-04-07 | 1972-09-26 | Leva Anthony J Di | Swimming doll |
JP3455999B2 (ja) * | 1993-12-20 | 2003-10-14 | Denso Corporation | Traveling carriage |
GB2326353B (en) * | 1997-06-20 | 2001-02-28 | Wong T K Ass Ltd | Toy |
US6374157B1 (en) | 1998-11-30 | 2002-04-16 | Sony Corporation | Robot device and control method thereof |
JP2000323219A (ja) * | 1999-05-10 | 2000-11-24 | Sony Corp | Connection device and robot system |
US6458011B1 (en) * | 1999-05-10 | 2002-10-01 | Sony Corporation | Robot device |
JP2004034169A (ja) * | 2002-06-28 | 2004-02-05 | Sony Corp | Legged mobile robot device and movement control method for a legged mobile robot device |
EP1607191A1 (en) * | 2003-03-23 | 2005-12-21 | Sony Corporation | Robot device and method of controlling the same |
US7752544B2 (en) * | 2003-11-17 | 2010-07-06 | International Business Machines Corporation | Method, system, and apparatus for remote interactions |
JP4595436B2 (ja) * | 2004-03-25 | 2010-12-08 | NEC Corporation | Robot, control method therefor, and control program |
KR100657530B1 (ko) * | 2005-03-31 | 2006-12-14 | LG Electronics Inc. | Lift detection device for an autonomously traveling robot |
ATE524784T1 (de) * | 2005-09-30 | 2011-09-15 | Irobot Corp | Companion robot for personal interaction |
US8554370B2 (en) * | 2009-05-15 | 2013-10-08 | Honda Motor Co., Ltd | Machine learning approach for predicting humanoid robot fall |
FR2964055B1 (fr) * | 2010-08-27 | 2012-08-17 | Aldebaran Robotics S A | Humanoid robot with fall-management capabilities and method for managing said falls |
CN201940040U (zh) * | 2010-09-27 | 2011-08-24 | Shenzhen Jiesigu Technology Co., Ltd. | Household robot |
US8596147B2 (en) * | 2010-11-30 | 2013-12-03 | Hallmark Cards, Incorporated | Non-rigid sensor for detecting deformation |
US8880221B2 (en) * | 2011-03-21 | 2014-11-04 | Honda Motor Co., Ltd. | Damage reduction control for humanoid robot fall |
US9789603B2 (en) * | 2011-04-29 | 2017-10-17 | Sarcos Lc | Teleoperated robotic system |
WO2014007728A1 (en) * | 2012-07-05 | 2014-01-09 | Husqvarna Ab | Displacement sensor for a robotic vehicle detecting a lift event and a collision event |
US10555498B2 (en) * | 2012-09-19 | 2020-02-11 | Botsitter, Llc | Method and system for remote monitoring, care and maintenance of animals |
CN103010326B (zh) * | 2012-12-19 | 2014-02-26 | Beijing Information Science and Technology University | Electromagnetic eight-direction independently retractable wheel mechanism for a snake-like robot |
US9956687B2 (en) * | 2013-03-04 | 2018-05-01 | Microsoft Technology Licensing, Llc | Adapting robot behavior based upon human-robot interaction |
DE102014110875A1 (de) * | 2014-07-10 | 2016-01-28 | Vorwerk & Co. Interholding Gmbh | Traveling unit, in particular an automatically movable floor-cleaning device |
US9501059B2 (en) * | 2014-09-12 | 2016-11-22 | Qualcomm Incorporated | Pocket robot |
US9530058B2 (en) * | 2014-12-11 | 2016-12-27 | Toyota Motor Engineering & Manufacturing North America, Inc. | Visual-assist robots |
US9567021B2 (en) * | 2015-06-11 | 2017-02-14 | Christopher Mailey | Dynamically stable stair climbing home robot |
2017
- 2017-06-20 WO PCT/JP2017/022674 patent/WO2018012219A1/ja active Application Filing
- 2017-06-20 DE DE112017003480.9T patent/DE112017003480B4/de active Active
- 2017-06-20 JP JP2017564923A patent/JP6436548B2/ja active Active
- 2017-06-20 GB GB1820152.5A patent/GB2565959B/en active Active
- 2017-06-20 CN CN201780042199.6A patent/CN109414623B/zh active Active
2018
- 2018-11-09 JP JP2018211099A patent/JP2019063543A/ja active Pending
- 2018-12-27 US US16/233,097 patent/US11213763B2/en active Active
Non-Patent Citations (1)
Title |
---|
YUICHIRO KUROSE: "Development of the Leg-Wheel Robot RL-W1 Equipped with a Hydraulic Power Unit" [Yuatsu Power Unit Tosaigata Kyakusharin Robot RL-W1 no Kaihatsu], Proceedings (DVD-ROM) of the 34th Annual Conference of the Robotics Society of Japan, 7 September 2016 (2016-09-07), pages 1423 - 1424 *
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11899456B2 (en) | 2018-02-26 | 2024-02-13 | Sony Corporation | Information processing device, information processing method, and program |
CN111867696A (zh) * | 2018-02-26 | 2020-10-30 | Sony Corporation | Information processing device, information processing method, and program |
WO2019163312A1 (ja) * | 2018-02-26 | 2019-08-29 | Sony Corporation | Information processing device, information processing method, and program |
EP3760290A4 (en) * | 2018-02-26 | 2021-04-28 | Sony Corporation | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING PROCESS AND PROGRAM |
JPWO2019163312A1 (ja) * | 2018-02-26 | 2021-05-20 | Sony Group Corporation | Information processing device, information processing method, and program |
US11409289B2 (en) | 2018-02-26 | 2022-08-09 | Sony Corporation | Information processing device, information processing method, and program |
JP7312517B2 (ja) | 2018-02-26 | 2023-07-21 | Sony Group Corporation | Information processing device, information processing method, and program |
CN111867696B (zh) * | 2018-02-26 | 2022-08-26 | Sony Corporation | Information processing device, information processing method, and program |
US20220326707A1 (en) * | 2018-02-26 | 2022-10-13 | Sony Corporation | Information processing device, information processing method, and program |
WO2020105625A1 (ja) * | 2018-11-20 | 2020-05-28 | Groove X, Inc. | Wheel structure and robot |
JP7237565B2 (ja) | 2018-12-21 | 2023-03-13 | Sega Toys Co., Ltd. | Figure toy and method for controlling a figure toy |
JP2020099526A (ja) * | 2018-12-21 | 2020-07-02 | Sega Toys Co., Ltd. | Figure toy and method for controlling a figure toy |
JP7415956B2 (ja) | 2019-01-31 | 2024-01-17 | Sony Group Corporation | Robot control device, robot control method, and program |
WO2020158641A1 (ja) * | 2019-01-31 | 2020-08-06 | Sony Corporation | Robot control device, robot control method, and program |
WO2021039191A1 (ja) * | 2019-08-27 | 2021-03-04 | Sony Corporation | Information processing device, control method therefor, and program |
US12001226B2 (en) | 2019-08-27 | 2024-06-04 | Sony Group Corporation | Information processing apparatus, control method, and program |
JP7501536B2 (ja) | 2019-08-27 | 2024-06-18 | Sony Group Corporation | Information processing device, control method therefor, and program |
WO2023037609A1 (ja) * | 2021-09-10 | 2023-03-16 | Sony Group Corporation | Autonomous mobile body, information processing method, and program |
CN114918979A (zh) * | 2022-06-30 | 2022-08-19 | Shanghai Keenon Robotics Co., Ltd. | Floating tray and robot |
CN114918979B (zh) * | 2022-06-30 | 2024-04-26 | Shanghai Keenon Robotics Co., Ltd. | Floating tray and robot |
WO2024071166A1 (ja) * | 2022-09-29 | 2024-04-04 | Nitto Denko Corporation | Robot |
Also Published As
Publication number | Publication date |
---|---|
DE112017003480B4 (de) | 2021-01-21 |
GB201820152D0 (en) | 2019-01-23 |
JP6436548B2 (ja) | 2018-12-12 |
DE112017003480T5 (de) | 2019-04-04 |
GB2565959B (en) | 2021-10-13 |
CN109414623B (zh) | 2021-03-19 |
JP2019063543A (ja) | 2019-04-25 |
GB2565959A (en) | 2019-02-27 |
CN109414623A (zh) | 2019-03-01 |
JPWO2018012219A1 (ja) | 2018-07-19 |
US20190126157A1 (en) | 2019-05-02 |
US11213763B2 (en) | 2022-01-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6436548B2 (ja) | Autonomously acting robot | |
JP6467674B2 (ja) | Autonomously acting robot that understands physical contact | |
JP6409206B2 (ja) | Autonomously acting robot that performs welcoming behavior | |
JP6475872B2 (ja) | Autonomously acting robot whose activity level is controlled | |
JP6402320B2 (ja) | Autonomously acting robot that is shy of strangers | |
JP6436549B2 (ja) | Autonomously acting robot | |
JP6472113B2 (ja) | Autonomously acting robot that maintains a natural sense of distance, and program | |
JP6884401B2 (ja) | Autonomously acting robot that wears clothes | |
JP6409209B2 (ja) | Autonomously acting robot that seeks coolness | |
JP2019149181A (ja) | Autonomously acting robot that changes its pupils | |
JP6557840B2 (ja) | Robot, server, and behavior control program | |
JPWO2018084170A1 (ja) | Autonomously acting robot that identifies people | |
JP2018192559A (ja) | Autonomously acting robot that detects touches on a curved body
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2017564923 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17827349 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 201820152 Country of ref document: GB Kind code of ref document: A Free format text: PCT FILING DATE = 20170620 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17827349 Country of ref document: EP Kind code of ref document: A1 |