WO2018047802A1 - Autonomously acting robot that maintains a natural sense of distance
- Publication number
- WO2018047802A1 (PCT/JP2017/031890)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- distance
- user
- movement
- control unit
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
- B25J5/007—Manipulators mounted on wheels or on carriages mounted on wheels
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/1653—Programme controls characterised by the control loop parameters identification, estimation, stiffness, accuracy, error analysis
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
Definitions
- the present invention relates to a robot that autonomously selects an action according to an internal state or an external environment.
- the present invention is an invention completed based on the above problem recognition, and its main object is to provide a control technology for giving a sense of life to a robot.
- One embodiment of the present invention is an autonomous behavior robot.
- the robot includes an imaging unit for imaging the surroundings, and a movement control unit for controlling the distance to the object according to the size of the imaged object.
- Another aspect of the present invention is an action control program of an autonomous action robot.
- This program causes a computer to realize a function of acquiring a captured image of the surroundings of the robot, a function of identifying a predetermined target in the captured image, a function of calculating the positional relationship the robot should take with the target according to the size of the imaged target, and a function of controlling the movement of the robot so as to realize the calculated positional relationship.
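- As a rough illustration of these four functions, the following is a minimal Python sketch. The pinhole-camera estimate, the depth input, the focal length, the scale factor, and the proportional gain are illustrative assumptions and not taken from the publication; the point is only that the target's imaged size drives the distance the robot keeps.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    pixel_height: int      # apparent height of the target in the captured image
    depth_m: float         # measured distance to the target (e.g. from a depth sensor)

FOCAL_LENGTH_PX = 600.0    # camera focal length in pixels (assumption)

def estimate_target_height(obs: Observation) -> float:
    """Recover the target's physical height from its apparent size (pinhole model)."""
    return obs.depth_m * obs.pixel_height / FOCAL_LENGTH_PX

def desired_distance(target_height_m: float, scale: float = 1.2) -> float:
    """Take a larger standoff distance for a larger (taller) target."""
    return scale * target_height_m

def movement_command(obs: Observation) -> float:
    """Proportional controller: positive = move toward the target, negative = back off."""
    error = obs.depth_m - desired_distance(estimate_target_height(obs))
    k_p = 0.5
    return max(-0.3, min(0.3, k_p * error))   # clip to a safe speed

if __name__ == "__main__":
    adult = Observation(pixel_height=480, depth_m=1.5)   # taller target
    child = Observation(pixel_height=480, depth_m=0.9)   # shorter target, allowed closer
    print(movement_command(adult), movement_command(child))
```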
- FIG. 2 is a cross-sectional view schematically illustrating the structure of the robot.
- FIG. 3 is a side view showing the structure of the robot centering on its frame.
- FIG. 4 is a block diagram of the robot system.
- FIG. 5 is a conceptual diagram of an emotion map.
- FIG. 6 is a hardware configuration diagram of the robot.
- FIG. 7 is a functional block diagram of the robot system.
- FIG. 8 is a schematic diagram showing the method of controlling the distance to a user.
- The remaining drawings show an example of how the look-up angle is set, another example of the setting method, the setting table referred to when determining the look-up angle, and a flowchart illustrating the operation control of the robot.
- FIG. 1 is a view showing the appearance of a robot 100 according to the embodiment.
- Fig.1 (a) is a front view
- FIG.1 (b) is a side view.
- The robot 100 in the present embodiment is an autonomous action robot that determines its actions and gestures based on the external environment and its internal state.
- the external environment is recognized by various sensors such as a camera and a thermo sensor.
- the internal state is quantified as various parameters representing the emotion of the robot 100. These will be described later.
- The robot 100 is premised on indoor action, and its action range is, for example, the inside of the owner's house.
- In the following, a human being who relates to the robot 100 is referred to as a "user", and a user who is a member of the home to which the robot 100 belongs is referred to as an "owner".
- the body 104 of the robot 100 has an overall rounded shape, and includes an outer shell formed of a soft and elastic material such as urethane, rubber, resin, or fiber.
- The robot 100 may be dressed. Because the body 104 is rounded, soft, and pleasant to the touch, the robot 100 gives the user a sense of security and a comfortable tactile experience.
- the robot 100 has a total weight of 15 kilograms or less, preferably 10 kilograms or less, and more preferably 5 kilograms or less.
- the average weight of a 13-month-old baby is just over 9 kilograms for boys and less than 9 kilograms for girls. Therefore, if the total weight of the robot 100 is 10 kilograms or less, the user can hold the robot 100 with almost the same effort as holding a baby that can not walk alone.
- the average weight of babies less than 2 months old is less than 5 kilograms for both men and women. Therefore, if the total weight of the robot 100 is 5 kg or less, the user can hold the robot 100 with the same effort as holding an infant.
- The various attributes such as appropriate weight, roundness, softness, and pleasant touch realize the effect that the user can hold the robot 100 easily and also feels inclined to hold it.
- it is desirable that the height of the robot 100 is 1.2 meters or less, preferably 0.7 meters or less.
- being able to hold it is an important concept.
- The robot 100 includes three wheels for traveling: as shown, a pair of front wheels 102 (left wheel 102a, right wheel 102b) and one rear wheel 103.
- the front wheel 102 is a driving wheel
- the rear wheel 103 is a driven wheel.
- the front wheel 102 does not have a steering mechanism, but its rotational speed and rotational direction can be individually controlled.
- the rear wheel 103 is a so-called omni wheel, and is rotatable in order to move the robot 100 back and forth and left and right.
- By making the rotation speed of the right wheel 102b larger than that of the left wheel 102a, the robot 100 can turn left or rotate counterclockwise.
- By making the rotation speed of the left wheel 102a larger than that of the right wheel 102b, the robot 100 can turn right or rotate clockwise.
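- The turning behavior described above is the standard differential-drive relation. The following minimal Python sketch illustrates it; the wheel radius, track width, and wheel speeds are illustrative assumptions, not values from the publication.

```python
WHEEL_RADIUS = 0.05   # m (assumption)
TRACK_WIDTH = 0.20    # distance between left and right wheels, m (assumption)

def body_motion(omega_left: float, omega_right: float):
    """Convert wheel angular velocities (rad/s) into forward speed (m/s)
    and yaw rate (rad/s, positive = counterclockwise)."""
    v_left = WHEEL_RADIUS * omega_left
    v_right = WHEEL_RADIUS * omega_right
    forward = (v_left + v_right) / 2.0
    yaw_rate = (v_right - v_left) / TRACK_WIDTH
    return forward, yaw_rate

# Right wheel faster -> positive yaw rate (turn left / counterclockwise);
# left wheel faster -> negative yaw rate (turn right / clockwise).
print(body_motion(omega_left=4.0, omega_right=6.0))
print(body_motion(omega_left=6.0, omega_right=4.0))
```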
- the front wheel 102 and the rear wheel 103 can be completely housed in the body 104 by a drive mechanism (a rotation mechanism, a link mechanism) described later. Even when traveling, most of the wheels are hidden by the body 104, but when the wheels are completely housed in the body 104, the robot 100 can not move. That is, the body 104 descends and seats on the floor surface as the wheels are stored. In this sitting state, the flat seating surface 108 (grounding bottom surface) formed on the bottom of the body 104 abuts on the floor surface F.
- the robot 100 has two hands 106.
- the hand 106 does not have the function of gripping an object.
- the hand 106 can perform simple operations such as raising, shaking and vibrating.
- the two hands 106 are also individually controllable.
- Two eyes 110 are provided on the front of the head (face) of the robot 100.
- the eye 110 incorporates a high resolution camera 402.
- the eye 110 can also display an image with a liquid crystal element or an organic EL element.
- the robot 100 has a built-in speaker and can emit a simple voice.
- A horn 112 is attached to the top of the head of the robot 100. As described above, since the robot 100 is lightweight, the user can lift the robot 100 by grasping the horn 112.
- the omnidirectional camera 400 (first camera) is built in the horn 112.
- the omnidirectional camera 400 can shoot all directions in the vertical and horizontal directions (360 degrees: particularly, substantially the entire area above the robot 100) at one time by the fisheye lens (see FIG. 8).
- the high resolution camera 402 (second camera) built in the eye 110 can capture only the front direction of the robot 100.
- the omnidirectional camera 400 has a wide shooting range but lower resolution than the high resolution camera 402.
- The robot 100 incorporates various sensors such as a temperature sensor (thermo sensor) for imaging the ambient temperature distribution, a microphone array having a plurality of microphones, a shape measurement sensor (depth sensor) capable of measuring the shape of a measurement object, and an ultrasonic sensor.
- FIG. 2 is a cross-sectional view schematically showing the structure of the robot 100.
- FIG. 3 is a side view showing the structure of the robot 100 centering on a frame.
- FIG. 2 corresponds to a cross section taken along line AA of FIG.
- the body 104 of the robot 100 includes a base frame 308, a body frame 310, a pair of wheel covers 312 and an outer shell 314.
- the base frame 308 is made of metal and constitutes an axial center of the body 104 and supports an internal mechanism.
- the base frame 308 is configured by connecting an upper plate 332 and a lower plate 334 by a plurality of side plates 336 up and down.
- the plurality of side plates 336 is sufficiently spaced to allow air flow.
- Inside the base frame 308, a battery 118, a control circuit 342, various actuators, and the like are accommodated.
- the body frame 310 is made of a resin material and includes a head frame 316 and a body frame 318.
- the head frame 316 has a hollow hemispherical shape and forms a head skeleton of the robot 100.
- the body frame 318 has a stepped cylindrical shape and forms the body frame of the robot 100.
- the body frame 318 is integrally fixed to the base frame 308.
- the head frame 316 is assembled to the upper end portion of the trunk frame 318 so as to be capable of relative displacement.
- the head frame 316 is provided with three axes of a yaw axis 321, a pitch axis 322, and a roll axis 323, and actuators 324, 325 for rotationally driving each axis.
- the actuator 324 includes a servomotor for driving the yaw axis 321.
- the actuator 325 includes a plurality of servomotors for driving the pitch axis 322 and the roll axis 323, respectively.
- The yaw axis 321 is driven for the head-shaking (left-right swinging) operation, the pitch axis 322 for the nodding (up-and-down) operation, and the roll axis 323 for the head-tilting operation.
- a plate 326 supported by the yaw axis 321 is fixed to the top of the head frame 316.
- the plate 326 is formed with a plurality of vents 327 for ensuring ventilation between the top and bottom.
- a metal base plate 328 is provided to support the head frame 316 and its internal mechanism from below.
- the base plate 328 is connected to the upper plate 332 (base frame 308) through the joint 330.
- a support base 335 is provided on the base plate 328, and the actuators 324 and 325 and the cross link mechanism 329 (pantograph mechanism) are supported.
- The cross link mechanism 329 connects the actuators 324 and 325 vertically and can change the distance between them.
- the roll shaft 323 of the actuator 325 is connected to the support 335 via a gear mechanism (not shown).
- the pitch axis 322 of the actuator 325 is connected to the lower end of the cross link mechanism 329.
- an actuator 324 is fixed to the upper end portion of the cross link mechanism 329.
- the yaw axis 321 of the actuator 324 is coupled to the plate 326.
- the actuator 325 is provided with a rotary drive mechanism (not shown) for driving the cross link mechanism 329 to extend and retract.
- By rotating the roll axis 323, the actuator 325 and the head frame 316 can be rotated integrally (rolling), and an operation of tilting the neck can be realized.
- By rotating the pitch axis 322, the cross link mechanism 329 and the head frame 316 can be rotated integrally (pitching), and a nodding operation and the like can be realized.
- By rotating the yaw axis 321, the plate 326 and the head frame 316 can be rotated integrally (yawing), and a head-swinging motion can be realized.
- By expanding and contracting the cross link mechanism 329, an expansion and contraction operation of the neck can be realized.
- Torso frame 318 houses base frame 308 and wheel drive mechanism 370.
- the wheel drive mechanism 370 includes a front wheel drive mechanism 374 and a rear wheel drive mechanism 376.
- The upper half 380 of the body frame 318 has a smooth curved surface so that the outline of the body 104 is rounded.
- the upper half 380 is formed to have a gradually smaller width toward the upper portion corresponding to the neck.
- the lower half 382 of the body frame 318 is narrowed to form a storage space S of the front wheel 102 with the wheel cover 312.
- the boundary between the upper half 380 and the lower half 382 has a stepped shape.
- the left and right side walls constituting the lower half portion 382 are parallel to each other, and penetrate and support a pivot shaft 378 described later of the front wheel drive mechanism 374.
- a lower plate 334 is provided to close the lower end opening of the lower half 382.
- the base frame 308 is fixed to and supported by the lower end of the body frame 318.
- the pair of wheel covers 312 is provided to cover the lower half 382 of the body frame 318 from the left and right.
- the wheel cover 312 is made of resin and assembled so as to form a smooth outer surface (curved surface) continuous with the upper half 380 of the body frame 318.
- the upper end of the wheel cover 312 is connected along the lower end of the upper half 380.
- a storage space S opened downward is formed between the side wall of the lower half 382 and the wheel cover 312.
- The outer skin 314 is made of urethane rubber and covers the body frame 310 and the wheel covers 312 from the outside.
- The hands 106 are integrally molded with the outer skin 314.
- An opening 390 for introducing external air is provided at the upper end of the outer skin 314.
- the front wheel drive mechanism 374 includes a rotational drive mechanism for rotating the front wheel 102 and a storage operation mechanism for advancing and retracting the front wheel 102 from the storage space S. That is, the front wheel drive mechanism 374 includes a pivot 378 and an actuator 379.
- the front wheel 102 has a direct drive motor (hereinafter referred to as "DD motor") 396 at its center.
- the DD motor 396 has an outer rotor structure, and the stator is fixed to the axle 398, and the rotor is coaxially fixed to the wheel 397 of the front wheel 102.
- the axle 398 is integrated with the pivot 378 via an arm 350.
- A bearing 352, through which the pivot shaft 378 passes, is embedded so as to rotatably support the pivot shaft 378.
- the bearing 352 is provided with a seal structure (bearing seal) for sealing the inside and the outside of the body frame 318 in an airtight manner.
- Rear wheel drive mechanism 376 includes a pivot shaft 354 and an actuator 356. Two arms 358 extend from the pivot shaft 354 and an axle 360 is integrally provided at the tip thereof.
- the rear wheel 103 is rotatably supported by the axle 360.
- A bearing (not shown), through which the pivot shaft 354 passes, rotatably supports the pivot shaft 354.
- the bearing is also provided with a shaft seal structure.
- When the actuators 379 and 356 are each driven in one direction, the arm 350 pivots about the pivot shaft 378 and the front wheels 102 rise from the floor surface F, while the arm 358 pivots about the pivot shaft 354 and the rear wheel 103 rises from the floor surface F.
- As a result, the body 104 descends, the seating surface 108 contacts the floor surface F, and the seated state of the robot 100 is realized.
- the drive mechanism for driving the hand 106 includes a wire 134 embedded in the outer skin 314 and a drive circuit 340 (energization circuit) thereof.
- The wire 134 is formed of a shape memory alloy wire in the present embodiment; it shrinks and hardens when heated, and relaxes and elongates when it cools. Leads drawn from both ends of the wire 134 are connected to the drive circuit 340. When the switch of the drive circuit 340 is turned on, the wire 134 (shape memory alloy wire) is energized.
- the wire 134 is molded or braided to extend from the skin 314 to the hand 106. Leads are drawn from both ends of the wire 134 to the inside of the body frame 318. One wire 134 may be provided on each of the left and right of the outer covering 314, or a plurality of wires 134 may be provided in parallel.
- the hand 106 can be raised by energizing the wire 134, and the hand 106 can be lowered by interrupting the energization.
- the robot 100 can adjust the angle of the line of sight (see a dotted arrow) by controlling the rotation angle of the pitch axis 322.
- the direction of the imaginary straight line passing through the pitch axis 322 and the eye 110 is taken as the direction of the line of sight.
- the optical axis of the high resolution camera 402 coincides with the line of sight.
- the line connecting the omnidirectional camera 400 and the pitch axis 322 and the line of sight are set to be at a right angle.
- the movable range (rotational range) of the head frame 316 centered on the pitch axis 322 can be made large.
- The movable range is set to 90 degrees in total: 45 degrees upward and 45 degrees downward from the state where the line of sight is horizontal. That is, the limit value of the angle at which the line of sight of the robot 100 is raised (the look-up angle) is 45 degrees, and the limit value of the angle at which the line of sight is lowered (the look-down angle) is 45 degrees.
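- A minimal sketch of limiting a requested line-of-sight angle to this ±45 degree range could look as follows; the function and constant names are assumptions for illustration only.

```python
LOOK_UP_LIMIT_DEG = 45.0     # upper limit of the look-up angle
LOOK_DOWN_LIMIT_DEG = 45.0   # lower limit of the look-down angle

def clamp_gaze_angle(requested_deg: float) -> float:
    """Clamp a requested line-of-sight pitch angle (positive = looking up)
    to the head frame's movable range."""
    return max(-LOOK_DOWN_LIMIT_DEG, min(LOOK_UP_LIMIT_DEG, requested_deg))

print(clamp_gaze_angle(60.0))   # -> 45.0 (look-up limit)
print(clamp_gaze_angle(-50.0))  # -> -45.0 (look-down limit)
```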
- FIG. 4 is a block diagram of the robot system 300.
- the robot system 300 includes a robot 100, a server 200 and a plurality of external sensors 114.
- a plurality of external sensors 114 (external sensors 114a, 114b, ..., 114n) are installed in advance in the house.
- the external sensor 114 may be fixed to the wall of the house or may be mounted on the floor.
- The position coordinates of each external sensor 114 are registered in the server 200. The position coordinates are defined as x, y coordinates in the house assumed as the action range of the robot 100.
- the server 200 is installed in the home.
- the server 200 and the robot 100 in the present embodiment correspond one to one.
- the server 200 determines the basic behavior of the robot 100 based on the information obtained from the sensors contained in the robot 100 and the plurality of external sensors 114.
- the external sensor 114 is for reinforcing the senses of the robot 100, and the server 200 is for reinforcing the brain of the robot 100.
- The external sensor 114 periodically transmits a wireless signal (hereinafter referred to as a "robot search signal") including the ID of the external sensor 114 (hereinafter referred to as the "beacon ID").
- On receiving the robot search signal, the robot 100 sends back a wireless signal (hereinafter referred to as a "robot reply signal") including the beacon ID.
- the server 200 measures the time from when the external sensor 114 transmits the robot search signal to when the robot reply signal is received, and measures the distance from the external sensor 114 to the robot 100. By measuring the distances between the plurality of external sensors 114 and the robot 100, the position coordinates of the robot 100 are specified. Of course, the robot 100 may periodically transmit its position coordinates to the server 200.
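- The following minimal Python sketch illustrates this positioning idea: convert each round-trip time into a distance, then estimate the robot's coordinates from several such distances by least squares. The sensor coordinates, the propagation-speed constant, and the neglect of processing delay are illustrative assumptions, not details from the publication.

```python
import numpy as np

SIGNAL_SPEED = 3.0e8   # m/s radio propagation; reply processing delay ignored (assumption)

def distance_from_round_trip(rtt_seconds: float) -> float:
    """Half the round-trip flight time times the propagation speed."""
    return SIGNAL_SPEED * rtt_seconds / 2.0

def estimate_position(sensors: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """Estimate (x, y) from sensor positions (N x 2) and distances (N,) by
    linearizing each circle equation against the first sensor."""
    x0, y0 = sensors[0]
    d0 = distances[0]
    A = 2.0 * (sensors[1:] - sensors[0])
    b = (d0**2 - distances[1:]**2
         + np.sum(sensors[1:]**2, axis=1) - (x0**2 + y0**2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

sensors = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 4.0]])   # external sensors 114
true_position = np.array([2.0, 1.0])
round_trips = 2.0 * np.linalg.norm(sensors - true_position, axis=1) / SIGNAL_SPEED
measured = np.array([distance_from_round_trip(t) for t in round_trips])
print(estimate_position(sensors, measured))                # ~ [2.0, 1.0]
```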
- FIG. 5 is a conceptual view of the emotion map 116.
- the emotion map 116 is a data table stored in the server 200.
- the robot 100 selects an action according to the emotion map 116.
- The emotion map 116 indicates the magnitude of the robot 100's feelings of like or dislike toward locations.
- the x-axis and y-axis of emotion map 116 indicate two-dimensional space coordinates.
- The z-axis indicates the magnitude of the feeling of like or dislike. When the z value is positive, the preference for the location is high; when the z value is negative, the location is disliked.
- The coordinate P1 is a point where the favorable feeling is high (hereinafter referred to as a "favor point") in the indoor space managed by the server 200 as the action range of the robot 100.
- A favor point may be a "safe place" such as behind a sofa or under a table, a place where people tend to gather such as the living room, or a lively place. It may also be a place where the robot was gently stroked or touched in the past.
- Although the definition of what kind of place the robot 100 prefers is arbitrary, it is generally desirable to set places favored by small children and by small animals such as dogs and cats as favor points.
- The coordinate P2 is a point at which the feeling of dislike is high (hereinafter referred to as an "aversion point").
- Aversion points may be places with loud noises such as near a television, places that easily get wet such as a bathroom or washroom, closed or dark places, and places associated with unpleasant memories of being treated roughly by a user.
- Although the definition of what kind of place the robot 100 dislikes is also arbitrary, it is generally desirable to set places feared by small children and by small animals such as dogs and cats as aversion points.
- the coordinate Q indicates the current position of the robot 100.
- the server 200 may grasp how far the robot 100 is from which external sensor 114 and in which direction.
- Alternatively, the movement distance of the robot 100 may be calculated from the number of rotations of the wheels (front wheels 102) to specify the current position, or the current position may be specified based on images obtained from the camera. When the emotion map 116 is given, the robot 100 moves in the direction drawn toward the favor point (coordinate P1) and in the direction away from the aversion point (coordinate P2).
- the emotion map 116 changes dynamically.
- the z-value (favorable feeling) at the coordinate P1 decreases with time.
- In other words, when the robot 100 reaches the favor point (coordinate P1), its "feeling" toward that place is satisfied and the robot eventually "gets bored" with it, emulating biological behavior.
- bad feelings at coordinate P2 are also alleviated with time.
- new favor points and aversion points are created, whereby the robot 100 makes a new action selection.
- the robot 100 has an "interest" at a new favor point and continuously selects an action.
- the emotion map 116 expresses the ups and downs of emotion as the internal state of the robot 100.
- The robot 100 heads toward the favor point, avoids the aversion point, stays at the favor point for a while, and eventually takes the next action.
- Such control can make the behavior selection of the robot 100 human and biological.
- the map that affects the behavior of the robot 100 (hereinafter collectively referred to as “action map”) is not limited to the emotion map 116 of the type shown in FIG.
- various behavior maps can be defined such as curiosity, feeling of fear avoidance, feeling of relief, feeling of quietness and dimness, feeling of physical comfort such as coolness and warmth.
- the destination point of the robot 100 may be determined by weighted averaging the z values of each of the plurality of action maps.
- the robot 100 may have parameters indicating the magnitudes of various emotions and senses separately from the action map. For example, when the value of the emotion parameter of loneliness is increasing, the weighting coefficient of the behavior map for evaluating a safe place may be set large, and the value of the emotion parameter may be lowered by reaching the target point. Similarly, when the value of the parameter indicating a feeling of being boring is increasing, the weighting coefficient of the behavior map for evaluating a place satisfying the curiosity may be set large.
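- A minimal sketch of this weighting idea follows, with two hypothetical 5x5 action maps and emotion parameters used directly as weights; the map contents, weights, and grid size are illustrative assumptions.

```python
import numpy as np

# Two hypothetical 5x5 action maps over the same floor grid (z > 0 attracts).
safe_place_map = np.zeros((5, 5))
safe_place_map[0, 0] = 5.0          # a "safe place" in one corner
curiosity_map = np.zeros((5, 5))
curiosity_map[4, 4] = 3.0           # something interesting in the other corner

def choose_destination(loneliness: float, boredom: float):
    """Weight each map by the matching emotion parameter, take the weighted
    sum of z values, and head for the cell with the highest combined value."""
    combined = loneliness * safe_place_map + boredom * curiosity_map
    return np.unravel_index(np.argmax(combined), combined.shape)

print(choose_destination(loneliness=0.9, boredom=0.1))  # -> (0, 0): seeks safety
print(choose_destination(loneliness=0.1, boredom=0.9))  # -> (4, 4): explores
```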
- FIG. 6 is a hardware configuration diagram of the robot 100.
- the robot 100 includes an internal sensor 128, a communicator 126, a storage device 124, a processor 122, a drive mechanism 120 and a battery 118.
- the drive mechanism 120 includes the wheel drive mechanism 370 described above.
- Processor 122 and storage 124 are included in control circuit 342.
- the units are connected to each other by a power supply line 130 and a signal line 132.
- the battery 118 supplies power to each unit via the power supply line 130. Each unit transmits and receives control signals through a signal line 132.
- the battery 118 is a secondary battery such as a lithium ion battery, and is a power source of the robot 100.
- The internal sensor 128 is an assembly of various sensors incorporated in the robot 100. Specifically, in addition to the camera 410 (the omnidirectional camera 400 and the high resolution camera 402), the microphone array 404, the temperature sensor 406, and the shape measurement sensor 408, an infrared sensor, a touch sensor, an acceleration sensor, an odor sensor, and the like are included.
- the odor sensor is a known sensor to which the principle that the electric resistance is changed by the adsorption of the molecule that is the source of the odor is applied.
- the odor sensor classifies various odors into a plurality of categories (hereinafter referred to as "odor category").
- the communication device 126 is a communication module that performs wireless communication for various external devices such as the server 200, the external sensor 114, and a portable device owned by a user.
- the storage device 124 is configured by a non-volatile memory and a volatile memory, and stores a computer program and various setting information.
- the processor 122 is an execution means of a computer program.
- The drive mechanism 120 is an actuator that controls the internal mechanisms. In addition to this, an indicator, a speaker, and the like are also mounted.
- the processor 122 performs action selection of the robot 100 while communicating with the server 200 and the external sensor 114 via the communication device 126.
- Various external information obtained by the internal sensor 128 also affects behavior selection.
- the drive mechanism 120 mainly controls the wheel (front wheel 102), the head (head frame 316) and the torso (hand 106).
- the drive mechanism 120 changes the moving direction and the moving speed of the robot 100 by changing the rotational speed and the rotational direction of each of the two front wheels 102.
- the drive mechanism 120 can also raise and lower the wheels (the front wheel 102 and the rear wheel 103). When the wheel ascends, the wheel is completely housed in the body 104, and the robot 100 abuts on the floor surface at the seating surface 108 to be in a sitting state.
- the drive mechanism 120 can lift the hand 106 by pulling the hand 106 through the wire 134. It is also possible to shake the hand by vibrating the hand 106. More complex gestures can also be expressed using multiple wires 134.
- FIG. 7 is a functional block diagram of the robot system 300.
- robot system 300 includes robot 100, server 200, and a plurality of external sensors 114.
- Each component of the robot 100 and the server 200 is realized by hardware, including computing devices such as a CPU (Central Processing Unit) and various co-processors, storage devices such as memory and storage, and wired or wireless communication lines connecting them, and by software that is stored in the storage devices and supplies processing instructions to the computing devices.
- the computer program may be configured by a device driver, an operating system, various application programs located in the upper layer of them, and a library that provides common functions to these programs.
- Each block described below indicates not a hardware unit configuration but a function unit block.
- Some of the functions of the robot 100 may be realized by the server 200, and some or all of the functions of the server 200 may be realized by the robot 100.
- the server 200 includes a communication unit 204, a data processing unit 202, and a data storage unit 206.
- the communication unit 204 takes charge of communication processing with the external sensor 114 and the robot 100.
- the data storage unit 206 stores various data.
- the data processing unit 202 executes various processes based on the data acquired by the communication unit 204 and the data stored in the data storage unit 206.
- the data processing unit 202 also functions as an interface of the communication unit 204 and the data storage unit 206.
- the data storage unit 206 includes a motion storage unit 232, a map storage unit 216, and a personal data storage unit 218.
- The robot 100 has a plurality of motion patterns (motions). A variety of motions are defined, such as shaking its hand, approaching the owner while meandering, and staring at the owner with its head tilted.
- the motion storage unit 232 stores a "motion file" that defines control content of motion. Each motion is identified by a motion ID. The motion file is also downloaded to the motion storage unit 160 of the robot 100. Which motion is to be performed may be determined by the server 200 or the robot 100.
- The motions of the robot 100 can be configured as compound motions including a plurality of unit motions.
- For example, when the robot 100 approaches the owner, this may be expressed as a combination of a unit motion of turning toward the owner, a unit motion of approaching while raising a hand, a unit motion of approaching while shaking the body, and a unit motion of sitting while raising both hands.
- The combination of these four motions realizes the behavior of "approaching the owner, raising a hand partway, and finally sitting down while shaking the body".
- In a motion file, the rotation angles and angular velocities of the actuators provided in the robot 100 are defined in association with the time axis.
- Various motions are represented by controlling each actuator with the passage of time according to a motion file (actuator control information).
- the transition time when changing from the previous unit motion to the next unit motion is called “interval".
- the interval may be defined according to the time required for unit motion change and the contents of the motion.
- the length of the interval is adjustable.
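- The following is a minimal, hypothetical sketch of what such a motion file and its playback could look like; the field names, durations, and actuator names are illustrative assumptions and not the publication's actual file format.

```python
motion_file = {
    "motion_id": "M001",
    "unit_motions": [
        {   # turn toward the owner
            "duration_s": 0.8,
            "keyframes": [  # (time_s, {actuator: angle_deg})
                (0.0, {"yaw": 0}), (0.8, {"yaw": 30}),
            ],
        },
        {   # raise a hand while approaching
            "duration_s": 1.2,
            "keyframes": [
                (0.0, {"hand": 0}), (1.2, {"hand": 45}),
            ],
        },
    ],
    "interval_s": 0.3,   # transition time between consecutive unit motions
}

def play(motion: dict) -> None:
    """Print the actuator commands in playback order (stand-in for driving
    the actual actuators through the drive mechanism)."""
    t = 0.0
    for unit in motion["unit_motions"]:
        for offset, command in unit["keyframes"]:
            print(f"t={t + offset:.1f}s -> {command}")
        t += unit["duration_s"] + motion["interval_s"]

play(motion_file)
```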
- settings relating to behavior control of the robot 100 such as when to select which motion, output adjustment of each actuator for realizing the motion, and the like are collectively referred to as “behavior characteristics”.
- the behavior characteristics of the robot 100 are defined by a motion selection algorithm, a motion selection probability, a motion file, and the like.
- the map storage unit 216 stores a plurality of action maps.
- the personal data storage unit 218 stores information of the user, in particular, the owner. Specifically, various parameters such as familiarity with the user, physical characteristics and behavioral characteristics of the user are stored. Other attribute information such as age and gender may be stored.
- the robot 100 identifies the user based on physical and behavioral features of the user.
- the robot 100 always images the surroundings with a built-in camera. Then, physical features and behavioral features of the person shown in the image are extracted.
- Physical features are visual features associated with the body, such as height, clothes the user likes to wear, presence of glasses, skin color, hair color, and ear size, and may also include other features such as average body temperature, smell, and voice quality.
- Behavioral features are features accompanying actions, such as the places the user likes, how active the user's movements are, and whether or not the user smokes.
- the robot 100 clusters users who frequently appear as “owners” based on physical features and behavioral features obtained from a large amount of image information and other sensing information.
- Although the method of identifying a user by a user ID is simple and reliable, it is premised on the user possessing a device capable of providing the user ID.
- The method of identifying the user from physical and behavioral features has the advantage that even a user who does not carry a portable device can be identified, although the image recognition processing load is large. Only one of the two methods may be adopted, or the two methods may be used together in a complementary manner to identify the user.
- users are clustered from physical features and behavioral features, and the users are identified by deep learning (multilayer neural network).
- The robot 100 has an internal parameter of closeness (familiarity) for each user.
- When the robot 100 recognizes an action indicating favor toward itself, such as being picked up or spoken to, its closeness to that user increases.
- The closeness to a user who is not involved with the robot 100, a user who behaves violently, or a user who is met infrequently becomes low.
- the data processing unit 202 includes a position management unit 208, a map management unit 210, a recognition unit 212, an operation determination unit 222, and a closeness management unit 220.
- the position management unit 208 specifies the position coordinates of the robot 100 by the method described using FIG. 4.
- the position management unit 208 may also track the user's position coordinates in real time.
- the map management unit 210 changes the parameter of each coordinate in the method described with reference to FIG. 5 for a plurality of action maps.
- the map management unit 210 manages a temperature map, which is a type of action map.
- the map management unit 210 may select one of the plurality of behavior maps, or may perform weighted averaging of z values of the plurality of behavior maps.
- For example, suppose that on one action map the z values at coordinates R1 and R2 are 4 and 3, while on another action map the z values at coordinates R1 and R2 are -1 and 3, respectively.
- the recognition unit 212 recognizes the external environment.
- The recognition of the external environment includes various recognitions such as recognition of weather and season based on temperature and humidity, and recognition of a sheltered spot (safe area) based on light quantity and temperature.
- the recognition unit 212 further includes a person recognition unit 214 and a response recognition unit 228.
- The person recognition unit 214 recognizes a person from images captured by the built-in camera of the robot 100 and extracts the person's physical features and behavioral features. Then, based on the physical feature information and behavioral feature information registered in the personal data storage unit 218, it determines which person, such as the father, the mother, or the eldest son, the imaged user, that is, the user the robot 100 is looking at, corresponds to.
- the person recognition unit 214 includes an expression recognition unit 230.
- the facial expression recognition unit 230 estimates the user's emotion by performing image recognition on the user's facial expression.
- the person recognition unit 214 also performs feature extraction on cats and dogs that are pets, for example.
- the response recognition unit 228 recognizes various response actions made to the robot 100, and classifies them as pleasant and unpleasant actions.
- the response recognition unit 228 also classifies into a positive / negative response by recognizing the owner's response to the behavior of the robot 100.
- the pleasant and unpleasant behavior is determined depending on whether the user's response behavior is comfortable or unpleasant as a living thing. For example, holding is a pleasant act for the robot 100, and kicking is an unpleasant act for the robot 100.
- the positive / negative response is determined depending on whether the user's response indicates a user's pleasant emotion or an unpleasant emotion. For example, holding is an affirmative response that indicates the user's pleasant emotion, and kicking is a negative response that indicates the user's unpleasant emotion.
- The operation determination unit 222 determines the motion of the robot 100 in cooperation with the control unit 150 of the robot 100.
- The operation determination unit 222 creates a movement target point of the robot 100 and a movement route to it based on the action map selection by the map management unit 210.
- The operation determination unit 222 may create a plurality of movement routes and then select one of them.
- The operation determination unit 222 also selects the motion of the robot 100 from the plurality of motions in the motion storage unit 232.
- Each motion is associated with a selection probability for each situation. For example, a selection method is defined such that motion A is executed with a probability of 20% when a pleasant action is made by the owner, and motion B is executed with a probability of 5% when the temperature reaches 30 degrees or more. .
- a movement target point and a movement route are determined in the action map, and a motion is selected by various events described later.
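- A minimal sketch of event-conditioned, probability-weighted motion selection along the lines of the example above follows; the table reuses the 20% and 5% figures from the text, while everything else (event names, motion IDs) is an illustrative assumption.

```python
import random
from typing import Optional

# Event -> list of (motion ID, selection probability).
SELECTION_TABLE = {
    "pleasant_act_by_owner": [("motion_A", 0.20)],
    "temperature_30_or_more": [("motion_B", 0.05)],
}

def select_motion(event: str) -> Optional[str]:
    """Return a motion ID for the event, or None when no motion fires."""
    r = random.random()
    cumulative = 0.0
    for motion_id, probability in SELECTION_TABLE.get(event, []):
        cumulative += probability
        if r < cumulative:
            return motion_id
    return None

print(select_motion("pleasant_act_by_owner"))   # "motion_A" about 20% of the time
```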
- The closeness management unit 220 manages the closeness for each user. As described above, the closeness is registered in the personal data storage unit 218 as part of the personal data. When a pleasant act is detected, the closeness management unit 220 increases the closeness to that owner. When an unpleasant act is detected, the closeness is decreased. In addition, the closeness to an owner who has not been seen for a long time gradually decreases.
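- A minimal sketch of this closeness bookkeeping is shown below; the numeric amounts, decay rate, and user names are illustrative assumptions.

```python
closeness = {"father": 60.0, "mother": 75.0}

def on_pleasant_act(user: str, amount: float = 5.0) -> None:
    """A pleasant act raises closeness (capped at 100)."""
    closeness[user] = min(100.0, closeness.get(user, 0.0) + amount)

def on_unpleasant_act(user: str, amount: float = 10.0) -> None:
    """An unpleasant act lowers closeness (floored at 0)."""
    closeness[user] = max(0.0, closeness.get(user, 0.0) - amount)

def daily_decay(seen_today: set, rate: float = 1.0) -> None:
    """Gradually lower closeness toward owners who were not seen."""
    for user in closeness:
        if user not in seen_today:
            closeness[user] = max(0.0, closeness[user] - rate)

on_pleasant_act("father")
on_unpleasant_act("mother")
daily_decay(seen_today={"father"})
print(closeness)
```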
- the robot 100 includes an internal sensor 128, a communication unit 142, a data processing unit 136, a data storage unit 148, and a drive mechanism 120.
- the internal sensor 128 is a collection of various sensors.
- the internal sensor 128 includes a microphone array 404, a camera 410, a temperature sensor 406 and a shape measurement sensor 408.
- the microphone array 404 is a unit in which a plurality of microphones are joined together, and is an audio sensor that detects a sound.
- the microphone array 404 may be any device capable of detecting sound and detecting the direction of the sound source.
- Microphone array 404 is embedded in head frame 316. Since the distance between the sound source and each microphone does not match, the sound collection timing varies. Therefore, the position of the sound source can be specified from the strength and phase of the sound in each microphone.
- the robot 100 detects the position of the sound source, in particular the direction of the sound source, by means of the microphone array 404.
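- The following minimal sketch shows the underlying geometry for just two microphones: the arrival-time difference fixes the bearing of the source. The microphone spacing and delay values are illustrative assumptions; a real array uses more microphones and the strength and phase information mentioned above.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s
MIC_SPACING = 0.08       # m between two microphones (assumption)

def source_bearing(delay_seconds: float) -> float:
    """Return the source bearing in degrees relative to the array's broadside;
    a positive delay means the sound reached one microphone earlier."""
    ratio = SPEED_OF_SOUND * delay_seconds / MIC_SPACING
    ratio = max(-1.0, min(1.0, ratio))   # guard against measurement noise
    return math.degrees(math.asin(ratio))

print(source_bearing(0.0))        # 0 degrees: source straight ahead
print(source_bearing(0.0001))     # sound hit one microphone 0.1 ms earlier
```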
- the camera 410 is a device for photographing the outside.
- the camera 410 includes an omnidirectional camera 400 and a high resolution camera 402.
- the temperature sensor 406 detects the temperature distribution of the external environment and forms an image.
- The shape measurement sensor 408 is an infrared depth sensor that projects near-infrared light from a projector and detects the reflected near-infrared light with a near-infrared camera, thereby reading the depth of the target object and hence its uneven shape.
- the communication unit 142 corresponds to the communication device 126 (see FIG. 6), and takes charge of communication processing with the external sensor 114 and the server 200.
- the data storage unit 148 stores various data.
- the data storage unit 148 corresponds to the storage device 124 (see FIG. 6).
- the data processing unit 136 executes various processes based on the data acquired by the communication unit 142 and the data stored in the data storage unit 148.
- the data processing unit 136 corresponds to a processor 122 and a computer program executed by the processor 122.
- the data processing unit 136 also functions as an interface of the communication unit 142, the internal sensor 128, the drive mechanism 120, and the data storage unit 148.
- the data storage unit 148 includes a motion storage unit 160 that defines various motions of the robot 100.
- Various motion files are downloaded from the motion storage unit 232 of the server 200 to the motion storage unit 160.
- Motion is identified by motion ID.
- In the motion files, the operation timing, operation time, operation direction, and the like of the various actuators (drive mechanism 120) are defined in time series for various motions, such as sitting by housing only the front wheels 102, raising the hand 106, making the robot 100 rotate by reversely rotating the two front wheels 102 or by rotating only one front wheel 102, shaking by rotating the front wheels 102 while they are housed, and stopping and looking back once when moving away from the user.
- the data processing unit 136 includes a recognition unit 156, a control unit 150, and a sensor control unit 172.
- Control unit 150 includes a movement control unit 152 and an operation control unit 154.
- the movement control unit 152 determines the movement direction of the robot 100 together with the operation determination unit 222 of the server 200.
- The movement based on the action maps may be determined by the server 200, and immediate movements such as avoiding an obstacle may be determined by the robot 100.
- the drive mechanism 120 drives the front wheel 102 according to the instruction of the movement control unit 152 to direct the robot 100 to the movement target point.
- The motion control unit 154 determines the motion of the robot 100 in cooperation with the operation determination unit 222 of the server 200. Some motions may be determined by the server 200, and other motions may be determined by the robot 100. Alternatively, although the robot 100 basically determines its motions, the server 200 may determine a motion when the processing load of the robot 100 is high. The base motion may be determined by the server 200 and additional motions may be determined by the robot 100. How the motion determination process is shared between the server 200 and the robot 100 may be designed according to the specifications of the robot system 300.
- the operation control unit 154 instructs the drive mechanism 120 to execute the selected motion. The drive mechanism 120 controls each actuator according to the motion file.
- The motion control unit 154 can execute a motion of lifting both hands 106 as a gesture asking for a "hug" when a user with high closeness is nearby, and, when it gets tired of the "hug", can express a motion of disliking the hug by alternately repeating reverse rotation and stopping of the left and right front wheels 102 while they remain housed.
- the drive mechanism 120 causes the robot 100 to express various motions by driving the front wheel 102, the hand 106, and the neck (head frame 316) according to the instruction of the operation control unit 154.
- the sensor control unit 172 controls the internal sensor 128. Specifically, the measurement directions of the high resolution camera 402, the temperature sensor 406, and the shape measurement sensor 408 are controlled. In accordance with the direction of the head frame 316, the measurement directions of the high resolution camera 402, the temperature sensor 406, and the shape measurement sensor 408 mounted on the head of the robot 100 change.
- the sensor control unit 172 controls the imaging direction of the high resolution camera 402 (that is, controls the movement of the head in accordance with the imaging direction).
- the sensor control unit 172 and the camera 410 function as an “imaging unit”.
- the recognition unit 156 interprets external information obtained from the internal sensor 128.
- the recognition unit 156 is capable of visual recognition (visual unit), odor recognition (olfactory unit), sound recognition (hearing unit), and tactile recognition (tactile unit).
- the recognition unit 156 periodically acquires detection information of the camera 410, the temperature sensor 406, and the shape measurement sensor 408, and detects a moving object such as a person or a pet. These pieces of information are transmitted to the server 200, and the person recognition unit 214 of the server 200 extracts physical features of the moving object. It also detects the smell of the user and the voice of the user. Smells and sounds (voices) are classified into multiple types by known methods.
- When a strong impact is applied to the robot 100, the recognition unit 156 recognizes this with the built-in acceleration sensor, and the response recognition unit 228 of the server 200 recognizes that a "violent act" has been performed by a nearby user. Even when a user lifts the robot 100 by grasping the horn 112, this may be recognized as a violent act.
- Also, when the robot 100 detects a voice directed at it, the response recognition unit 228 of the server 200 may recognize that a "calling action" has been performed on the robot.
- When a temperature around body temperature is detected, it is recognized that the user has made a "contact action", and when an upward acceleration is detected in the state where contact is recognized, it is recognized that a "hug" has been made.
- The physical contact when the user lifts the body 104 may be sensed, or a hug may be recognized from a decrease in the load applied to the front wheels 102.
- the response recognition unit 228 of the server 200 recognizes various responses of the user to the robot 100.
- some typical response actions correspond to pleasure or discomfort, affirmation or denial.
- most pleasurable actions are positive responses, and most offensive actions are negative.
- Pleasure and discomfort are related to intimacy, and affirmative and negative responses affect the action selection of the robot 100.
- a series of recognition processing including detection / analysis / judgment may be performed only by the recognition unit 212 of the server 200, or may be performed only by the recognition unit 156 of the robot 100, or both may perform the above-mentioned recognition while sharing roles. Processing may be performed.
- the closeness management unit 220 of the server 200 changes the closeness to the user.
- The closeness to a user who has performed a pleasant act increases, and the closeness to a user who has performed an unpleasant act decreases.
- The recognition unit 212 of the server 200 determines comfort/discomfort according to the response, and the map management unit 210 may change the z value of the point where the pleasant or unpleasant act was performed in the action map expressing "attachment to a place". For example, when a pleasant act is performed in the living room, the map management unit 210 may set a favor point in the living room with high probability. In this case, a positive feedback effect is realized in which the robot 100 prefers the living room, receives pleasant acts in the living room, and therefore comes to prefer the living room more and more.
- the person recognition unit 214 of the server 200 detects a moving object from various data obtained from the external sensor 114 or the internal sensor 128, and extracts its features (physical features and behavioral features). Then, a plurality of moving objects are subjected to cluster analysis based on these features. As moving objects, not only humans but also pets such as dogs and cats may be analyzed.
- the robot 100 periodically takes an image, and the person recognition unit 214 recognizes a moving object from those images, and extracts features of the moving object.
- When a moving object is detected, physical features and behavioral features are also extracted from the odor sensor, the built-in sound-collecting microphone, the temperature sensor, and the like. For example, when a moving object appears in an image, a variety of features are extracted, such as having a beard, being active in the early morning, wearing red clothes, smelling of perfume, having a loud voice, wearing glasses, wearing a skirt, having white hair, being tall, being fat, being tanned, and being on a sofa.
- the robot 100 newly recognizes a moving object (user) in a state where cluster analysis by such feature extraction is completed.
- The person recognition unit 214 of the server 200 performs feature extraction from sensing information such as images obtained from the robot 100, and determines by deep learning (a multilayer neural network) which cluster a moving object near the robot 100 corresponds to.
- In other words, forming clusters by feature extraction corresponds to cluster analysis, and fitting a newly detected moving object to one of those clusters corresponds to deep learning.
- The robot 100 sets a high degree of intimacy for people it frequently meets, people who frequently touch it, and people who frequently speak to it. On the other hand, intimacy with people it rarely sees, people who rarely touch it, violent people, and people who speak in a loud voice becomes low.
- the robot 100 changes the intimacy degree of each user based on various external information detected by sensors (vision, touch, hearing).
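As a rough illustration of this closeness management, the update can be sketched as a bounded score adjusted by recognized responses. The 0 to 100 scale, the event names, and the increments below are assumptions for illustration, not values given in the text.

```python
# Minimal sketch of closeness (intimacy) management.
CLOSENESS_MIN, CLOSENESS_MAX = 0, 100

def update_closeness(closeness, event):
    deltas = {
        "pleasant_contact": +3,    # hugging, gentle touch
        "voice_call": +1,          # being called or spoken to
        "unpleasant_contact": -5,  # being hit, rough handling
        "ignored": -1,
    }
    closeness += deltas.get(event, 0)
    return max(CLOSENESS_MIN, min(CLOSENESS_MAX, closeness))

c = 30
for e in ["voice_call", "pleasant_contact", "unpleasant_contact"]:
    c = update_closeness(c, e)
print(c)  # 29
```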
- the actual robot 100 autonomously performs complex action selection in accordance with the action map.
- the robot 100 acts while being influenced by a plurality of action maps based on various parameters such as loneliness, boredom and curiosity.
- In principle, when the influence of the action maps is excluded, or in an internal state in which the influence of the action maps is small, the robot 100 tries to approach people with high intimacy and to move away from people with low intimacy.
- The behavior of the robot 100 is categorized as follows according to closeness (a minimal sketch follows this list).
- (1) Cluster with very high intimacy: the robot 100 approaches the user (hereinafter referred to as "proximity action") and strongly expresses affection by performing an affection gesture defined in advance as a gesture showing favor toward a person.
- (2) Cluster with relatively high intimacy: the robot 100 performs only the proximity action.
- (3) Cluster with relatively low intimacy: the robot 100 takes no particular action.
- (4) Cluster with particularly low intimacy: the robot 100 performs a leaving action.
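The four categories above can be pictured as a simple threshold rule. A minimal sketch, assuming an illustrative 0 to 100 closeness scale and threshold values that are not specified in the text:

```python
# Illustrative thresholds only; the source defines four qualitative tiers,
# not these specific numbers.
def select_behavior(closeness):
    if closeness >= 80:
        return ["proximity_action", "affection_gesture"]  # (1) very high intimacy
    if closeness >= 50:
        return ["proximity_action"]                        # (2) relatively high
    if closeness >= 20:
        return []                                          # (3) relatively low
    return ["leaving_action"]                              # (4) particularly low

print(select_behavior(85), select_behavior(10))
```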
- In summary, when the robot 100 finds a user with high intimacy, it approaches that user, and conversely, when it finds a user with low intimacy, it moves away from that user.
- In this way, it is possible to express so-called "shyness toward strangers" behavior.
- For example, when a visitor (user A with low intimacy) appears, the robot 100 may move away from the visitor and head toward a family member (user B with high intimacy).
- In this case, user B can feel that the robot 100 is wary of the stranger, feels uneasy, and is relying on user B.
- Such a behavioral expression evokes in user B the joy of being chosen and relied upon, and the accompanying feeling of attachment.
- On the other hand, when user A, the visitor, visits frequently, calls to the robot, and touches it, the intimacy of the robot 100 with user A gradually increases, and the robot 100 stops performing the shyness behavior (leaving action) toward user A.
- User A can also come to feel attachment to the robot 100 by sensing that the robot 100 has become accustomed to him or her.
- For example, when the action map for finding a place that satisfies curiosity is emphasized, the robot 100 may not select an action influenced by intimacy.
- Also, when the external sensor 114 installed at the entrance detects that a user has returned home, the robot may give top priority to the action of going out to meet that user.
- The robot 100 of this embodiment maintains an appropriate sense of distance to a user in its field of view according to the user's attributes and closeness (preference), thereby emulating biologically natural behavior.
- The recognition unit 156 also functions as a "preference determination unit". In addition, by taking a gesture of looking up at the user's face at a natural angle corresponding to that distance, the robot makes the user feel attachment. The details are described below.
- FIG. 8 is a schematic view showing a method of controlling the distance to the user.
- The robot 100 extracts the user (object) from the subjects captured by the camera 410, analyzes the image, refers to the corresponding user information if available, and moves so that the distance to the user becomes appropriate.
- The height of the robot 100 in this embodiment is assumed to be about 50 centimeters. When a robot of this height visually recognizes the user's face, it necessarily tilts its head and looks up.
- To enlarge the looking-up angle while visually recognizing the user's face, the robot 100 needs to approach the user; to reduce the looking-up angle, the robot 100 needs to move away from the user.
- By controlling the forward and backward movement of the robot 100 so that the user's face continues to be viewed at a constant looking-up angle, the robot 100 can follow the user while maintaining a natural sense of distance. That is, a natural sense of distance can be maintained without measuring an explicit distance between the robot 100 and the user.
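One way to picture this angle-keeping control is as a simple feedback loop on the elevation angle at which the face is observed. A minimal sketch, assuming an illustrative gain, dead band, and target angle that are not given in the text:

```python
# Minimal sketch: keep the angle at which the face appears equal to a target
# look-up angle by moving forward/backward, without measuring distance.
def distance_keeping_step(observed_face_angle_deg, target_angle_deg=30.0,
                          dead_band_deg=2.0, gain=0.02):
    """Return a forward (+) / backward (-) velocity command in m/s."""
    error = target_angle_deg - observed_face_angle_deg
    if abs(error) <= dead_band_deg:
        return 0.0               # face already seen at the desired angle
    # Face seen too low (error > 0): user is far away -> move forward.
    # Face seen too high (error < 0): user is too close -> back away.
    return gain * error

print(distance_keeping_step(20.0))  # far away -> positive (approach)
print(distance_keeping_step(38.0))  # too close -> negative (retreat)
```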
- the celestial imaging range 418 is an imaging range by the omnidirectional camera 400.
- the omnidirectional camera 400 can capture the upper hemisphere of the robot 100 at one time.
- the recognition unit 156 analyzes an image (captured image) of the imaging region 420 which is a predetermined range including the direction of the object 414 in the celestial imaging range 418.
- In this embodiment, creatures such as humans (users) and animals (pets) that are targets of dialogue and physical contact are referred to as "objects"; it is also possible to treat only humans as objects.
- the recognition unit 156 performs image analysis to determine whether a subject having a predetermined feature is present in the imaging region 420.
- the recognition unit 156 of the robot 100 executes the process of recognizing the image of the object 414 from the imaging region 420.
- the recognition unit 212 of the server 200 may execute the image recognition, or both the recognition unit 212 of the server 200 and the recognition unit 156 of the robot 100 may execute.
- When identifying the object 414 from the subjects, the recognition unit 156 measures the temperature distribution around each subject using the temperature sensor 406 and determines whether the subject is a heat source, in particular a heat source at about 30 to 40 degrees Celsius. The recognition unit 156 also functions as a "temperature determination unit". Since warm-blooded animals such as humans and pets are heat sources, audio equipment, televisions, walls, mirrors, and the like can be excluded from the candidates for the object 414 by this temperature measurement.
- The recognition unit 156 further measures the three-dimensional shape of the subject using the shape measurement sensor 408 and determines whether the subject is an object having a predetermined shape. For example, the recognition unit 156 determines whether the subject has an uneven (concavo-convex) shape. When the subject does not have an uneven shape, it is regarded as a flat body such as a television, a wall, or a mirror, and can be excluded from the candidates for the object 414. More preferably, the shape measurement sensor 408 desirably detects features of the three-dimensional shape of the subject.
- Feature information on the face of each cluster is also stored, so, more preferably, the output of the shape measurement sensor 408 may be used to identify who the object 414 is.
- the high-resolution camera 402 picks up the target candidate.
- the angle of view is adjusted so that the entire object candidate is included at the center of the screen.
- the optical axis of the high resolution camera 402 coincides with the line of sight. For this reason, an object candidate is present in the direction of the line of sight of the robot 100.
- the recognition unit 156 specifies that the object candidate is the object 414 based on the image of the high resolution camera 402.
- A candidate that has physical and behavioral characteristics unique to a living being, such as skin color, movement, and wearing clothes, and that has parts corresponding to two eyes and one mouth, is recognized as the object 414.
- known face recognition technology is used to identify the face of the object 414.
- In face recognition, for example, the edge portions of a face are detected from the image of the object 414 to specify a face area, and the image of the face area is compared with preset feature-amount patterns (such as the arrangement of the eyes, nose, and mouth). If the degree of similarity in this comparison is equal to or greater than a reference value, it can be determined to be a face. If there are a plurality of objects 414, a plurality of faces are identified.
- the recognition unit 156 further identifies the height of the object 414 and the size of the face. Specifically, the feature point of the object 414 is extracted from the imaging screen, and the height of the feature point is specified as the “height of the object”. In the present embodiment, a nose protruding forward in the center of the face is extracted as a feature point, and the height of the object 414 is specified. In addition, the contour of the face is extracted, and the length in the vertical direction is specified as “face size”.
- The forward and backward movement of the robot 100 is controlled so that the size of the area recognized as a face (also referred to as the "face region"), that is, the area of the face region, falls within a predetermined range on the imaging screen. If the target area of the face region is made larger, the sense of distance to the object 414 becomes closer; if it is made smaller, the sense of distance to the object 414 becomes farther.
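The face-area control just described can be sketched as a band controller on the ratio of the face region to the imaging screen; the band limits and speed below are illustrative assumptions, not values from the text.

```python
# Sketch of the face-area band control described above.
def face_area_step(face_area_px, image_area_px,
                   lower_frac=0.02, upper_frac=0.04, speed=0.1):
    """Move forward when the face looks too small, back when too large."""
    ratio = face_area_px / image_area_px
    if ratio < lower_frac:
        return +speed   # face too small on screen -> user far -> approach
    if ratio > upper_frac:
        return -speed   # face too large on screen -> user close -> retreat
    return 0.0          # within the target band -> hold position

print(face_area_step(3000, 640 * 480))   # ratio ~0.0098 -> approach
```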
- The object 414 whose sense of distance is to be adjusted is specified based on the height and face-size information, and the adjustment is performed for that object.
- Specifically, a look-up angle is set in accordance with the attributes and closeness of the specified object 414, and the distance to the object 414 (also referred to as the "facing distance") is controlled so that, when the robot faces the object 414, the face of the object 414 lies on the line of sight at that look-up angle. For example, in a situation where parents and a child are gathered, it is difficult to measure an accurate distance to the child with an ultrasonic sensor when approaching the child. However, if the face of the child is visually recognized and the look-up angle is kept constant, the sense of distance can be adjusted to suit the child by controlling the area of the child's face region so that it stays within a certain range.
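Although the text does not state it explicitly, the relation among the look-up angle, the heights, and the facing distance follows from elementary trigonometry. A sketch with assumed symbols and example values:

```latex
% Assuming the target face is at height h_f, the robot's eye (camera) is at
% height h_r, and the look-up angle is \theta, the facing distance d satisfies
\tan\theta = \frac{h_f - h_r}{d}
\quad\Longrightarrow\quad
d = \frac{h_f - h_r}{\tan\theta}.
% Example (assumed values): h_f = 1.5\,\mathrm{m}, h_r = 0.4\,\mathrm{m},
% \theta = 30^\circ gives d = 1.1/\tan 30^\circ \approx 1.9\,\mathrm{m};
% raising \theta to 40^\circ shortens d to 1.1/\tan 40^\circ \approx 1.3\,\mathrm{m}.
```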
- FIG. 9 is a diagram showing an example of a method of setting a look-up angle.
- the upper and lower range of the line of sight of the robot 100 (that is, the movable range of the head) is set to ⁇ 45 degrees to +45 degrees (refer to the dotted line) with respect to the horizontal direction (refer to the two-dot chain line).
- the look-up angle ⁇ (upward angle with respect to the horizontal direction) of the robot 100 is adjusted in the range of 0 to 45 degrees.
- To make the looking-up gesture appear natural, 30 degrees (see the dashed-dotted line) is set as a reference value.
- The facing distance d differs depending on attributes such as whether the object 414 is an adult or a child, or male or female. That is, the distance to the child 414c is shorter than the distances to the adults 414m and 414f. Moreover, for adults, the distance changes according to height. In the illustrated example, the distance df to the adult female 414f is shorter than the distance dm to the adult male 414m, and the distance dc to the child 414c is shorter still.
- When a plurality of objects are present as illustrated, the robot 100 gives priority to the distance dc to the child 414c, who is the smallest. That is, the movement control unit 152 moves the robot 100 so that the distance to the child 414c becomes dc. At this time, to make the robot 100 behave naturally, when the robot 100 has approached the child 414c to within a predetermined distance (for example, within 5 m), driving of the head is started so that the look-up angle θ gradually approaches the set value.
- The speed at which the robot 100 approaches the object 414 may be made slow. This allows the object 414 to touch the robot 100 even while the facing distance is being established; that is, physical contact with the robot 100 according to the user's intention is secured.
- FIG. 10 is a diagram showing another example of the setting method of the look-up angle.
- FIG. 10(a) shows a case where a plurality of objects are present, and FIG. 10(b) shows a case where the look-up angle is changed according to closeness.
- As shown in FIG. 10(a), when, for example, the father 414m squats down near the child 414c and calls to the robot 100, it is natural to respond to the father 414m, who is expressing intimacy, and this is considered to gain empathy. Therefore, when a plurality of objects 414 are present in this way, if the specified difference Δh between their heights is equal to or less than a reference value (for example, a height on the screen corresponding to 20 cm or less), the object 414 with the larger face is used as the reference for distance control.
- In the illustrated example, the movement control unit 152 causes the robot 100 to face the father 414m and moves the robot 100 to a position where the look-up angle θ becomes the set value (for example, 30 degrees).
- The operation control unit 154 drives the head so that the look-up angle θ gradually approaches the set value in the course of that movement.
- The "height difference Δh" and the "reference value" may be set as numbers of pixels in the image processing.
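The selection rule for a plurality of objects 414 can be sketched as follows; the pixel threshold stands in for "about 20 cm on the screen" and the field names are assumptions for illustration.

```python
# Sketch of the target selection rule: prefer the lower (smaller) person,
# but if two candidates are about the same height, prefer the larger face.
def select_reference_target(targets, height_diff_threshold_px=60):
    """targets: list of dicts with 'height_px' (feature-point height on screen)
    and 'face_size_px' (vertical face length on screen)."""
    if not targets:
        return None
    if len(targets) == 1:
        return targets[0]
    by_height = sorted(targets, key=lambda t: t["height_px"])
    lowest, second = by_height[0], by_height[1]
    if abs(second["height_px"] - lowest["height_px"]) <= height_diff_threshold_px:
        # Heights comparable (e.g. a parent squatting next to a child):
        # use the target with the larger face as the distance reference.
        return max((lowest, second), key=lambda t: t["face_size_px"])
    return lowest

child = {"height_px": 300, "face_size_px": 40}
father_squatting = {"height_px": 330, "face_size_px": 70}
print(select_reference_target([child, father_squatting]) is father_squatting)  # True
```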
- the look-up angle ⁇ is variable according to the closeness of the object 414. That is, when the closeness degree of the object 414 becomes higher than before, as shown in FIG. 10B, the look-up angle ⁇ is larger than the previous angle (for example, 30 degrees: see thin dashed-dotted line) The setting is changed so that it becomes 40 degrees: thick dashed dotted line). As a result, the distance between the robot 100 and the object 414 decreases from d1 to d2. Thereby, it is possible to express that the robot 100 is more favored with the object 414 than before, just as a pet gets used to the owner and reduces the sense of distance.
- FIG. 11 is a diagram showing a setting table referred to when determining the look-up angle.
- This setting table is a data table in which the correspondence between the look-up angle and the closeness is defined.
- In the illustrated example, the standard range of closeness (referred to as the "standard closeness") is 21 to 40, and the corresponding look-up angle θ is 30 degrees.
- When the closeness of the object 414 becomes higher than the standard closeness, the look-up angle θ becomes larger than 30 degrees.
- At this time, the robot 100 makes a gesture of approaching and gazing at the object 414 more than before. This makes it possible to express that the robot 100 has more favor toward the object 414 than before.
- Conversely, when the closeness of the object 414 becomes lower than the standard closeness, the look-up angle θ becomes smaller than 30 degrees.
- In that case, the robot 100 makes a gesture of viewing the object 414 from farther away than before. This makes it possible to express dissatisfaction with, or wariness toward, the object 414's lack of affection.
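The setting table of FIG. 11 can be pictured as a simple lookup. Only the standard row (closeness 21 to 40, look-up angle 30 degrees) comes from the text; the remaining bands and angles below are illustrative assumptions.

```python
# Sketch of the setting table of FIG. 11 (closeness -> look-up angle).
LOOKUP_ANGLE_TABLE = [
    (0, 20, 20.0),    # lower than standard -> smaller angle (keeps more distance)
    (21, 40, 30.0),   # standard closeness -> reference angle (from the text)
    (41, 60, 35.0),
    (61, 100, 40.0),  # higher closeness -> larger angle (comes closer)
]

def lookup_angle(closeness):
    for lo, hi, angle in LOOKUP_ANGLE_TABLE:
        if lo <= closeness <= hi:
            return angle
    return 30.0  # fall back to the standard angle

print(lookup_angle(30), lookup_angle(75))  # 30.0 40.0
```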
- In this way, the action selection of the robot 100 can be made more human-like or lifelike.
- FIG. 12 is a flowchart illustrating the operation control of the robot 100. The processing of this figure is repeatedly executed at a predetermined control cycle.
- a case where the object 414 is a user will be described as an example.
- The internal sensor 128 periodically captures images of the surroundings of the robot 100 and measures the ambient temperature.
- the recognition unit 156 determines whether the subject is a user based on the detection information of the temperature sensor 406 and the shape measurement sensor 408. If the subject is a user (Y in S10), the recognition unit 156 recognizes the user's face by the high resolution camera 402 and tracks the user's face (S12).
- The recognition unit 156 specifies the height h of the user based on the image of the high resolution camera 402 (S14) and specifies the size of the user's face (S16). If there are a plurality of users (Y in S18) and the height difference Δh among the users is not within the reference value (N in S20), the lower user is set as the target of distance adjustment (S22). On the other hand, if the height difference Δh is within the reference value (Y in S20), the user with the larger face is set (S24). If there is only one user (N in S18), the processing of S20 to S24 is skipped. The user set here is also called the "set user".
- The recognition unit 156 acquires the information of the set user and determines the closeness (S26). If there is no information for the set user, the closeness is provisionally set to the standard closeness. Then, using that closeness, the look-up angle θ is set with reference to the setting table (FIG. 11) (S28).
- The movement control unit 152 causes the robot 100 to face the set user (S30) and moves the robot 100 in the direction of the set user (S32). During this movement, the operation control unit 154 drives the head so that the look-up angle θ gradually approaches the set value. When the user comes closer to the robot 100 than the distance corresponding to the look-up angle θ, the movement control unit 152 moves the robot 100 away from the set user so as to restore that distance. When no user is detected (N in S10), the processing of S12 to S32 is skipped and the processing of this cycle ends.
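The flow of S10 to S32 can be sketched as a single control-cycle function; the perception helpers and robot methods below are hypothetical stubs introduced for illustration, not APIs defined in the text.

```python
# Sketch of one control cycle of FIG. 12 (S10-S32), under assumed interfaces.
def control_cycle(robot, perception):
    users = perception.detect_users()                     # S10: temperature + shape check
    if not users:
        return                                            # N in S10: nothing this cycle
    for u in users:
        perception.track_face(u)                          # S12
        u.height = perception.estimate_height(u)          # S14
        u.face_size = perception.estimate_face_size(u)    # S16
    if len(users) > 1:                                    # S18
        tallest = max(users, key=lambda u: u.height)
        lowest = min(users, key=lambda u: u.height)
        if tallest.height - lowest.height > robot.height_diff_threshold:  # N in S20
            target = lowest                               # S22: adjust to the lower user
        else:                                             # Y in S20
            target = max(users, key=lambda u: u.face_size)  # S24: larger face
    else:
        target = users[0]
    closeness = perception.get_closeness(target) or robot.standard_closeness  # S26
    robot.lookup_angle = robot.angle_from_table(closeness)                    # S28
    robot.face_toward(target)                             # S30
    robot.move_until_lookup_angle(target)                 # S32: approach or retreat
```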
- As described above, the robot 100 controls the sense of distance to the user according to the user's height and face size: it keeps a greater distance from a large person and comes closer to a small person. Such operation allows a natural sense of distance from the robot 100 to be maintained as seen from the user. Because this sense of distance results from the setting of the look-up angle, the robot 100 looks at the user at a natural angle, and the user can feel at ease.
- The look-up angle changes according to the degree of closeness. As the closeness increases, the look-up angle increases, which also makes it possible to express a gesture of the robot 100 asking the user for physical contact. Conversely, when the closeness decreases, the look-up angle decreases, which makes it possible to express gestures such as the robot 100 showing wariness toward the user or acting shy as if toward a stranger.
- the present invention is not limited to the above-described embodiment and modification, and the components can be modified and embodied without departing from the scope of the invention.
- Various inventions may be formed by appropriately combining a plurality of components disclosed in the above-described embodiment and modifications. Moreover, some components may be deleted from all the components shown in the above-mentioned embodiment and modification.
- Although the robot system 300 has been described as being configured of one robot 100, one server 200, and a plurality of external sensors 114, a part of the functions of the robot 100 may be realized by the server 200, and a part or all of the functions of the server 200 may be assigned to the robot 100.
- One server 200 may control a plurality of robots 100, or a plurality of servers 200 may cooperate to control one or more robots 100.
- A third device other than the robot 100 and the server 200 may take on a part of the functions. It is also possible to regard the collection of the functions of the robot 100 and the functions of the server 200 described in FIG. 7 as one "robot" as a whole. How to allocate the plurality of functions necessary to realize the present invention to one or more pieces of hardware should be decided in view of the processing capability of each piece of hardware, the specifications required of the robot system 300, and the like.
- the “robot in a narrow sense” refers to the robot 100 not including the server 200
- the “robot in a broad sense” refers to the robot system 300.
- Many of the functions of the server 200 may be integrated into the robot 100 in the future.
- the configuration including the omnidirectional camera 400 and the high resolution camera 402 has been exemplified.
- the high resolution camera 402 may be omitted for cost reduction and the like. That is, an imaging screen including the object 414 may be cut out as a part of the celestial imaging range 418 by the omnidirectional camera 400.
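Cutting out such an imaging screen from the celestial image can be sketched as cropping a column band around the object's bearing, assuming a simple equirectangular panorama; the function name, field of view, and mapping are assumptions for illustration.

```python
import numpy as np

# Sketch: cut a sub-image around the object's bearing out of an
# equirectangular panorama from the omnidirectional camera.
def crop_toward(panorama, azimuth_deg, fov_deg=60):
    """panorama: H x W x 3 array covering 0-360 degrees of azimuth."""
    h, w = panorama.shape[:2]
    center = int((azimuth_deg % 360.0) / 360.0 * w)
    half = int(fov_deg / 360.0 * w / 2)
    cols = [(center + i) % w for i in range(-half, half)]  # wrap around 0/360
    return panorama[:, cols]

pano = np.zeros((480, 1920, 3), dtype=np.uint8)
print(crop_toward(pano, azimuth_deg=350).shape)  # (480, 320, 3)
```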
- In that case, adjustment of the look-up angle also serves as adjustment of the optical-axis direction, which simplifies the control.
- Conversely, the omnidirectional camera 400 may be omitted and the object 414 may be identified with the high resolution camera 402. In that case, however, the head must be driven constantly to capture the surroundings. Also, because tracking can start only after the object 414 enters the imaging field of view of the high resolution camera 402, the operation of the robot 100 tends to become awkward. In this respect, with the omnidirectional camera 400, the object 414 can be detected easily without moving the camera itself. It is therefore preferable to use the omnidirectional camera 400 and the high resolution camera 402 in combination, as in the above embodiment.
- the object 414 is sensed by the camera 410, the temperature sensor 406, and the shape measurement sensor 408, and the recognition unit 156 of the robot 100 executes recognition processing.
- part or all of the recognition process may be performed by the recognition unit 212 of the server 200.
- A part of the functions of the internal sensor 128 may be mounted on the external sensors 114. For example, the camera 410 may be mounted on an external sensor 114, the server 200 may analyze the image information from that external sensor 114, and the robot 100 may specify the position of the object 414 based on the analysis result.
- In the above embodiment, the omnidirectional camera 400 is provided on the horn 112 so as to move integrally with the head of the robot 100.
- the omnidirectional camera 400 may be provided at a part independent of the head (a position not influenced by the movement of the line of sight).
- the omnidirectional camera 400 may be fixed to the base plate 328 so as to protrude above the head.
- a microphone array 404 may be used in combination to recognize the object 414.
- the microphone array 404 may detect the object 414 and identify the direction to the object 414.
- microphones may be arranged at a plurality of locations of the robot 100.
- In the above embodiment, the robot 100 faces the object 414 and adjusts the sense of distance (mainly in the approaching direction).
- As a modification, control may also be performed to obtain a similar sense of distance when moving away from the object.
- For example, the robot 100 may be controlled to flee while maintaining a distance corresponding to the height of the object 414.
- the distance can be controlled using the image of the omnidirectional camera 400.
- the approach of the object 414 from the back of the robot 100 may be detected by the omnidirectional camera 400, and control may be performed to turn the robot 100 to face the object 414. Then, control such as causing the robot 100 to move backward may be performed according to the closeness of the object 414.
- When fleeing, the line of sight of the robot 100 basically faces away from the object 414; however, to simplify the control algorithm, a "virtual look-up angle" may be set and calculated on the assumption that there is a face on the back of the head. This makes it possible to use the same setting table as when facing the user. Such control can be easily realized by cutting out the corresponding part of the imaging screen of the omnidirectional camera 400.
- the distance may be controlled in accordance with the size including the width of the object 414.
- Body size here takes into account not only height but also width-wise volume. The robot keeps a relatively large distance from users with large bodies and is controlled to come relatively close to users with small bodies. Such behavior is generally consistent with that of living creatures.
- In the above embodiment, the nose of the object 414 is extracted as a feature point, and the height of the nose is specified as the height of the object 414.
- As a modification, another feature point may be used to specify the height of the object 414.
- For example, the contour of the head (face) of the object 414 may be identified, and the height of the top of the head may be taken as the height of the object 414.
- Likewise, in the above embodiment the contour of the face is extracted and its vertical length is specified as the "face size"; as a modification, the area of the face calculated from the face contour may be specified as the "face size" instead.
- In the above embodiment, an example was shown in which driving of the head starts when the robot approaches within a predetermined distance, and the look-up angle gradually approaches the set value.
- As a modification, the look-up angle of the robot 100 may be kept basically fixed, and the robot may approach the object 414 while maintaining that look-up angle. Then, when the entire body of the object 414 enters the angle of view (viewing angle) of the high resolution camera 402, the robot 100 may be stopped. With such a configuration, the frequency with which the head of the robot 100 is moved can be reduced, and the control load can be reduced.
- the distance to the object 414 is controlled so that the look-up angle ⁇ of the robot 100 becomes constant.
- the robot 100 may be provided with a distance measuring sensor for detecting the distance to the subject. Then, the distance (set distance) to the object 414 when the look-up angle becomes a set value may be calculated, and the robot 100 may be moved so that the set distance is satisfied.
- In this case, the detection information of the distance measuring sensor may be acquired sequentially, driving of the head to look up at the object 414 may be started shortly before the set distance is reached, and control may be performed so that the look-up angle reaches the set value at the moment the set distance is reached.
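This distance-sensor variant can be sketched as follows; the heights, the ramp-start margin, and the function names are assumptions introduced for illustration.

```python
import math

# Sketch: derive the set distance from the set look-up angle, then ramp the
# head pitch so it reaches the set angle exactly when the measured range
# reaches that distance.
def set_distance(face_height_m, eye_height_m, lookup_angle_deg):
    return (face_height_m - eye_height_m) / math.tan(math.radians(lookup_angle_deg))

def head_pitch_command(measured_range_m, d_set, lookup_angle_deg, ramp_start_margin_m=0.5):
    """Start raising the head shortly before the set distance is reached."""
    ramp_start = d_set + ramp_start_margin_m
    if measured_range_m >= ramp_start:
        return 0.0                                  # still far: head level
    if measured_range_m <= d_set:
        return lookup_angle_deg                     # at set distance: set angle
    progress = (ramp_start - measured_range_m) / ramp_start_margin_m
    return progress * lookup_angle_deg              # linear ramp in between

d = set_distance(1.5, 0.4, 30.0)
print(round(d, 2), head_pitch_command(d + 0.25, d, 30.0))  # ~1.91 15.0
```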
- In the above embodiment, the object 414 is a user (human), but animals such as dogs and cats may also be included. This makes it possible to express the effect of the robot 100 facing an animal at an appropriate distance.
- correction processing may be performed on the captured image so that the robot 100 can accurately recognize the object 414 even in a backlit state.
- exposure correction may be performed to brighten the entire captured image.
- correction may be made to raise (brighten) the exposure except for bright portions of the captured image.
- the exposure may be raised centering on the features (such as the face) of the object 414.
- Correction by HDR (high-dynamic-range rendering) may also be used.
- The recognition unit 156 may perform the above-described recognition processing, such as extracting the features of the object 414, based on the image after such correction. Since the captured image is not presented to the user but is used for the internal processing (image recognition) of the robot 100, blown-out highlights or color loss in image portions unrelated to the object 414 pose no particular problem.
- Although the omnidirectional camera 400 is illustrated in the above embodiment, a half-celestial-sphere (hemispherical) camera may be employed instead. Alternatively, only the upper hemisphere captured by the omnidirectional camera 400 may be used as the imaging target. However, since such celestial-sphere cameras generally produce large distortion in the captured image (celestial image), it is preferable that the recognition unit 156 perform image recognition after correcting the image to reduce that distortion.
- the “look-up angle” corresponds to the rotation angle of the pitch axis (also referred to as “pitch angle”).
- The imaging range in the vertical direction (the look-up/look-down direction) of the camera can be changed by adjusting the pitch angle. That is, when controlling the distance to the object while keeping the look-up angle of the robot constant, it is sufficient to keep the pitch angle constant and control the movement of the robot so that a specific region of the object appearing on the screen is maintained in a set area on the screen (a set position and set range).
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Human Computer Interaction (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Multimedia (AREA)
- Manipulator (AREA)
- Toys (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Description
The robot 100 in this embodiment is an autonomously acting robot that determines its actions and gestures based on the external environment and its internal state. The external environment is recognized by various sensors such as a camera and a thermo sensor. The internal state is quantified as various parameters expressing the emotions of the robot 100. These are described later.
As shown in FIG. 2, the body 104 of the robot 100 includes a base frame 308, a main body frame 310, a pair of wheel covers 312, and an outer skin 314. The base frame 308 is made of metal, constitutes the axial core of the body 104, and supports the internal mechanisms. The base frame 308 is configured by vertically connecting an upper plate 332 and a lower plate 334 with a plurality of side plates 336. Sufficient spacing is provided between the side plates 336 to allow ventilation. The battery 118, the control circuit 342, various actuators, and the like are housed inside the base frame 308.
The robot system 300 includes the robot 100, the server 200, and a plurality of external sensors 114. A plurality of external sensors 114 (external sensors 114a, 114b, ..., 114n) are installed in advance in the house. The external sensors 114 may be fixed to the wall surfaces of the house or placed on the floor. The position coordinates of the external sensors 114 are registered in the server 200. The position coordinates are defined as x, y coordinates within the house assumed to be the action range of the robot 100.
The emotion map 116 is a data table stored in the server 200. The robot 100 selects actions in accordance with the emotion map 116. The emotion map 116 indicates the magnitude of the robot 100's like or dislike of a place. The x axis and y axis of the emotion map 116 indicate two-dimensional spatial coordinates. The z axis indicates the magnitude of the like/dislike emotion. A positive z value indicates a strong liking for that place, and a negative z value indicates an aversion to that place.
The robot 100 includes the internal sensor 128, a communication device 126, a storage device 124, a processor 122, a drive mechanism 120, and the battery 118. The drive mechanism 120 includes the above-described wheel drive mechanism 370. The processor 122 and the storage device 124 are included in the control circuit 342. The units are connected to one another by a power supply line 130 and a signal line 132. The battery 118 supplies power to each unit via the power supply line 130. Each unit transmits and receives control signals via the signal line 132. The battery 118 is a secondary battery such as a lithium-ion battery and is the power source of the robot 100.
As described above, the robot system 300 includes the robot 100, the server 200, and the plurality of external sensors 114. Each component of the robot 100 and the server 200 is realized by hardware including computing units such as a CPU (Central Processing Unit) and various coprocessors, storage devices such as memory and storage, and wired or wireless communication lines connecting them, and by software that is stored in the storage devices and supplies processing instructions to the computing units. The computer programs may be configured from device drivers, an operating system, various application programs positioned in the upper layers thereof, and libraries that provide common functions to these programs. Each block described below indicates a functional block rather than a hardware unit. A part of the functions of the robot 100 may be realized by the server 200, and a part or all of the functions of the server 200 may be realized by the robot 100.
The server 200 includes a communication unit 204, a data processing unit 202, and a data storage unit 206. The communication unit 204 is in charge of communication processing with the external sensors 114 and the robot 100. The data storage unit 206 stores various data. The data processing unit 202 executes various processing based on the data acquired by the communication unit 204 and the data stored in the data storage unit 206. The data processing unit 202 also functions as an interface for the communication unit 204 and the data storage unit 206.
The robot 100 includes the internal sensor 128, a communication unit 142, a data processing unit 136, a data storage unit 148, and the drive mechanism 120. The internal sensor 128 is a collection of various sensors. The internal sensor 128 includes the microphone array 404, the camera 410, the temperature sensor 406, and the shape measurement sensor 408.
(1) Cluster with very high intimacy
The robot 100 strongly expresses affection by approaching the user (hereinafter referred to as "proximity action") and performing an affection gesture defined in advance as a gesture showing favor toward a person.
(2) Cluster with relatively high intimacy
The robot 100 performs only the proximity action.
(3) Cluster with relatively low intimacy
The robot 100 takes no particular action.
(4) Cluster with particularly low intimacy
The robot 100 performs a leaving action.
The robot 100 of this embodiment maintains an appropriate sense of distance to a user who has entered its field of view according to the user's attributes and closeness (preference), and thereby emulates biologically natural behavior. The recognition unit 156 also functions as a "preference determination unit". In addition, by taking a gesture of looking up at the user's face at a natural angle corresponding to that distance, the robot makes the user feel attachment. The details are described below.
The robot 100 extracts the user (object) from the subjects captured by the camera 410, analyzes the image, refers to the corresponding user information if available, and moves so that the distance to the user becomes appropriate. The height of the robot 100 in this embodiment is assumed to be about 50 centimeters. When a robot of this height visually recognizes the user's face, it necessarily tilts its head and looks up. To enlarge the looking-up angle while visually recognizing the user's face, the robot 100 needs to approach the user; to reduce the looking-up angle, the robot 100 needs to move away from the user. By keeping the looking-up angle constant and controlling the forward and backward movement of the robot 100 so as to maintain the viewing state of the user's face at that time, the robot 100 can follow the user while maintaining a natural sense of distance. That is, a natural sense of distance can be maintained without measuring an explicit distance between the robot 100 and the user.
The vertical range of the line of sight of the robot 100 (that is, the movable range of the head) is set to -45 degrees to +45 degrees (see the dotted lines) with respect to the horizontal direction (see the two-dot chain line). Like a living creature, this reproduces the sense of bringing a subject into a comfortable viewing angle within the movable range of the head. Accordingly, the look-up angle θ of the robot 100 (the upward angle with respect to the horizontal direction) is adjusted in the range of 0 to 45 degrees. To make the looking-up gesture appear natural, 30 degrees (see the dashed-dotted line) is set as a reference value.
As shown in FIG. 10(a), when, for example, the father 414m squats down near the child 414c and calls to the robot 100, it is natural to respond to the father 414m, who is expressing intimacy, and this is considered to gain empathy. Therefore, when a plurality of objects 414 are present in this way, if the specified difference Δh between their heights is equal to or less than a reference value (for example, a height on the screen corresponding to 20 cm or less), the object 414 with the larger face is used as the reference for distance control. In the illustrated example, the movement control unit 152 causes the robot 100 to face the father 414m and moves it to a position where the look-up angle θ becomes the set value (for example, 30 degrees). The operation control unit 154 drives the head so that the look-up angle θ gradually approaches the set value in the course of the movement. The "height difference Δh" and the "reference value" may be set as numbers of pixels in the image processing.
This setting table is a data table in which the correspondence between the look-up angle and the closeness is defined. In the illustrated example, the standard closeness (referred to as the "standard closeness") is 21 to 40, and the corresponding look-up angle θ is 30 degrees. When the closeness of the object 414 becomes higher than the standard closeness, the look-up angle θ becomes larger than 30 degrees. At this time, the robot 100 makes a gesture of approaching and gazing at the object 414 more than before. This makes it possible to express that the robot 100 has more favor toward the object 414 than before.
The processing of this figure is repeatedly executed at a predetermined control cycle. Hereinafter, a case where the object 414 is a user will be described as an example.
Claims (10)
1. An autonomously acting robot comprising: an imaging unit that images the surroundings; and a movement control unit that controls the distance to an imaged object in accordance with the size of the object.
2. The autonomously acting robot according to claim 1, further comprising: an operation control unit that controls the movement of a head; and a recognition unit that recognizes the face of the imaged object, wherein the operation control unit controls the movement of the head so that the head is at an angle of looking up at the face of the object, and the movement control unit controls the distance to the object in accordance with the height of the object.
3. The autonomously acting robot according to claim 2, wherein the movement control unit controls the distance to the object so that the look-up angle of the head falls within a preset angle range.
4. The autonomously acting robot according to claim 2 or 3, wherein, when a plurality of objects are present on the imaging screen, the movement control unit controls the distance to the objects in accordance with the height of the lower object.
5. The autonomously acting robot according to claim 2 or 3, wherein, when a plurality of objects are present on the imaging screen, the movement control unit controls the distance to the objects in accordance with the height of the object with the larger face.
6. The autonomously acting robot according to claim 4 or 5, wherein the movement control unit performs control so that the robot directly faces the object serving as the reference for the distance control.
7. The autonomously acting robot according to any one of claims 2 to 6, wherein the imaging unit includes a camera capable of imaging substantially the entire surrounding area, and the camera is provided at a position not affected by the movement of the line of sight of the head.
8. The autonomously acting robot according to any one of claims 1 to 7, further comprising a preference determination unit that determines a preference for the object, wherein the movement control unit changes the correspondence between the size of the object and the distance to be controlled in accordance with the preference for the object.
9. The autonomously acting robot according to any one of claims 1 to 8, further comprising a temperature determination unit that determines the temperature of the object, wherein the movement control unit determines, based on the temperature of the object, that the object is a target of the distance control.
10. A program for causing a computer to realize: a function of acquiring a captured image of the surroundings of a robot; a function of specifying a predetermined object in the captured image; a function of calculating a positional relationship that the robot and the object should take in accordance with the size of the imaged object; and a function of controlling the movement of the robot so as to realize the calculated positional relationship.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018538413A JP6472113B2 (ja) | 2016-09-08 | 2017-09-05 | 自然な距離感を保つ自律行動型ロボットおよびプログラム |
CN201780054987.7A CN109690435A (zh) | 2016-09-08 | 2017-09-05 | 保持自然的距离感的行为自主型机器人 |
GB1902513.9A GB2570584B (en) | 2016-09-08 | 2017-09-05 | Autonomously acting robot that maintains a natural distance |
DE112017004512.6T DE112017004512T5 (de) | 2016-09-08 | 2017-09-05 | Autonom agierender Roboter, der einen natürlichen Abstand hält |
US16/283,818 US11148294B2 (en) | 2016-09-08 | 2019-02-25 | Autonomously acting robot that maintains a natural distance |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016175163 | 2016-09-08 | ||
JP2016-175163 | 2016-09-08 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/283,818 Continuation US11148294B2 (en) | 2016-09-08 | 2019-02-25 | Autonomously acting robot that maintains a natural distance |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018047802A1 true WO2018047802A1 (ja) | 2018-03-15 |
Family
ID=61562098
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/031890 WO2018047802A1 (ja) | 2016-09-08 | 2017-09-05 | 自然な距離感を保つ自律行動型ロボット |
Country Status (6)
Country | Link |
---|---|
US (1) | US11148294B2 (ja) |
JP (2) | JP6472113B2 (ja) |
CN (1) | CN109690435A (ja) |
DE (1) | DE112017004512T5 (ja) |
GB (1) | GB2570584B (ja) |
WO (1) | WO2018047802A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020022122A1 (ja) * | 2018-07-27 | 2020-01-30 | ソニー株式会社 | 情報処理装置、行動決定方法及びプログラム |
JP7320240B2 (ja) | 2019-04-01 | 2023-08-03 | 国立大学法人豊橋技術科学大学 | ロボット |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10489638B2 (en) * | 2016-11-09 | 2019-11-26 | Nanjing Avatarmind Robot Technology Co., Ltd. | Visual tracking method and robot based on monocular gesture recognition |
NL2022442B1 (nl) * | 2019-01-24 | 2020-01-07 | Lely Patent Nv | Positiebepalingsinrichting |
US11883963B2 (en) * | 2019-06-03 | 2024-01-30 | Cushybots Corporation | Robotic platform for interactive play using a telepresence robot surrogate |
CN110757477A (zh) * | 2019-10-31 | 2020-02-07 | 昆山市工研院智能制造技术有限公司 | 一种陪护机器人的高度方位自适应调整方法及陪护机器人 |
CN111300437B (zh) * | 2019-12-06 | 2022-01-04 | 西南石油大学 | 一种图书馆图书智能收集存储集运机器人 |
US11806881B1 (en) * | 2020-06-09 | 2023-11-07 | Multimedia Led, Inc. | Artificial intelligence system for automatic tracking and behavior control of animatronic characters |
WO2022138474A1 (ja) * | 2020-12-23 | 2022-06-30 | パナソニックIpマネジメント株式会社 | ロボット制御方法、ロボット、プログラム、及び記録媒体 |
US11297247B1 (en) * | 2021-05-03 | 2022-04-05 | X Development Llc | Automated camera positioning for feeding behavior monitoring |
WO2022259595A1 (ja) * | 2021-06-10 | 2022-12-15 | ソニーグループ株式会社 | 介護ロボット |
CN114489133B (zh) * | 2022-01-26 | 2023-12-26 | 深圳市奥拓电子股份有限公司 | 一种无人机自动校正led显示屏的距离保持方法 |
WO2024004623A1 (ja) * | 2022-06-29 | 2024-01-04 | ソニーグループ株式会社 | ロボット及びロボットの制御方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH036710A (ja) * | 1989-06-05 | 1991-01-14 | Toshiba Corp | 追随移動ロボット制御装置 |
WO2000067959A1 (fr) * | 1999-05-10 | 2000-11-16 | Sony Corporation | Dispositif robotique et procede de commande associe |
JP2003275976A (ja) * | 2002-03-22 | 2003-09-30 | Matsushita Electric Ind Co Ltd | 移動作業ロボット |
JP2004230480A (ja) * | 2003-01-28 | 2004-08-19 | Sony Corp | ロボット装置およびロボット制御方法、記録媒体、並びにプログラム |
JP2011054082A (ja) * | 2009-09-04 | 2011-03-17 | Hitachi Ltd | 自律移動装置 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6367959A (ja) | 1986-09-10 | 1988-03-26 | Mitsubishi Electric Corp | プリンタ制御装置 |
JP2000323219A (ja) | 1999-05-10 | 2000-11-24 | Sony Corp | 接続装置及びロボットシステム |
JP4689107B2 (ja) * | 2001-08-22 | 2011-05-25 | 本田技研工業株式会社 | 自律行動ロボット |
JP2003205482A (ja) * | 2002-01-08 | 2003-07-22 | Fuji Photo Film Co Ltd | ペット型ロボット |
WO2007041295A2 (en) * | 2005-09-30 | 2007-04-12 | Irobot Corporation | Companion robot for personal interaction |
ES2358139B1 (es) * | 2009-10-21 | 2012-02-09 | Thecorpora, S.L. | Robot social. |
CN102134022B (zh) * | 2011-03-31 | 2013-09-25 | 浙江天下实业有限公司 | 紧急自锁宠物牵引器 |
US9517559B2 (en) * | 2013-09-27 | 2016-12-13 | Honda Motor Co., Ltd. | Robot control system, robot control method and output control method |
CN106096373A (zh) * | 2016-06-27 | 2016-11-09 | 旗瀚科技股份有限公司 | 机器人与用户的交互方法及装置 |
-
2017
- 2017-09-05 JP JP2018538413A patent/JP6472113B2/ja active Active
- 2017-09-05 DE DE112017004512.6T patent/DE112017004512T5/de active Pending
- 2017-09-05 GB GB1902513.9A patent/GB2570584B/en active Active
- 2017-09-05 WO PCT/JP2017/031890 patent/WO2018047802A1/ja active Application Filing
- 2017-09-05 CN CN201780054987.7A patent/CN109690435A/zh active Pending
-
2019
- 2019-01-18 JP JP2019006775A patent/JP2019089197A/ja active Pending
- 2019-02-25 US US16/283,818 patent/US11148294B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH036710A (ja) * | 1989-06-05 | 1991-01-14 | Toshiba Corp | 追随移動ロボット制御装置 |
WO2000067959A1 (fr) * | 1999-05-10 | 2000-11-16 | Sony Corporation | Dispositif robotique et procede de commande associe |
JP2003275976A (ja) * | 2002-03-22 | 2003-09-30 | Matsushita Electric Ind Co Ltd | 移動作業ロボット |
JP2004230480A (ja) * | 2003-01-28 | 2004-08-19 | Sony Corp | ロボット装置およびロボット制御方法、記録媒体、並びにプログラム |
JP2011054082A (ja) * | 2009-09-04 | 2011-03-17 | Hitachi Ltd | 自律移動装置 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020022122A1 (ja) * | 2018-07-27 | 2020-01-30 | ソニー株式会社 | 情報処理装置、行動決定方法及びプログラム |
US11986959B2 (en) | 2018-07-27 | 2024-05-21 | Sony Corporation | Information processing device, action decision method and program |
JP7320240B2 (ja) | 2019-04-01 | 2023-08-03 | 国立大学法人豊橋技術科学大学 | ロボット |
Also Published As
Publication number | Publication date |
---|---|
GB2570584B (en) | 2021-12-08 |
GB201902513D0 (en) | 2019-04-10 |
GB2570584A (en) | 2019-07-31 |
JP2019089197A (ja) | 2019-06-13 |
CN109690435A (zh) | 2019-04-26 |
US20190184572A1 (en) | 2019-06-20 |
JPWO2018047802A1 (ja) | 2018-10-18 |
DE112017004512T5 (de) | 2019-06-13 |
US11148294B2 (en) | 2021-10-19 |
JP6472113B2 (ja) | 2019-02-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018047802A1 (ja) | 自然な距離感を保つ自律行動型ロボット | |
JP7068709B2 (ja) | 瞳を変化させる自律行動型ロボット | |
JP7320239B2 (ja) | 音源の方向を認識するロボット | |
CN109414623B (zh) | 行为自主型机器人 | |
JP7177497B2 (ja) | 相手を見つめる自律行動型ロボット | |
JP6884401B2 (ja) | 服を着る自律行動型ロボット | |
JP6671577B2 (ja) | 人を識別する自律行動型ロボット | |
JP7236142B2 (ja) | 自律行動型ロボット | |
JP7055402B2 (ja) | ゲストを受け入れる自律行動型ロボット | |
WO2018181640A1 (ja) | ロボットの関節に好適なジョイント構造 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 2018538413 Country of ref document: JP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17848742 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 201902513 Country of ref document: GB Kind code of ref document: A Free format text: PCT FILING DATE = 20170905 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17848742 Country of ref document: EP Kind code of ref document: A1 |