WO2018207908A1 - Autonomous behavior-type robot, accessory, and robot control program - Google Patents

Autonomous behavior-type robot, accessory, and robot control program

Info

Publication number
WO2018207908A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
motion
unit
control unit
accessory
Prior art date
Application number
PCT/JP2018/018287
Other languages
French (fr)
Japanese (ja)
Inventor
正昭 田原
要 林
Original Assignee
Groove X株式会社
Priority date
Filing date
Publication date
Application filed by Groove X株式会社
Priority to JP2019517713A (JP6734607B2)
Publication of WO2018207908A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators

Definitions

  • the present invention relates to a robot that autonomously selects an action according to an internal state or an external environment.
  • the present invention is an invention completed based on the above recognition, and its main object is to provide a technology for a robot to easily and reliably recognize another robot.
  • the autonomous behavior-type robot includes a motion control unit for selecting a motion of the robot, a drive mechanism for executing the motion selected by the motion control unit, a receiver for receiving the ID of another robot transmitted in accordance with a predetermined short-range wireless communication scheme, and a recognition unit for identifying the other robot based on the received ID.
  • the autonomous behavior-type robot includes a motion control unit for selecting a motion, a drive mechanism for executing the motion selected by the motion control unit, and a transmitter that transmits an ID for identifying the robot in accordance with a predetermined short-range wireless communication scheme.
  • the autonomous behavior-type robot includes a motion control unit for selecting a motion of the robot, a drive mechanism for executing the motion selected by the motion control unit, a receiver for receiving an ID transmitted from a charger in accordance with a predetermined short-range wireless communication scheme, and a charge monitoring unit that monitors the remaining battery level of the secondary battery.
  • the operation control unit selects, as the movement target point, the charger that transmits an ID associated with the robot from among the plurality of chargers, when the remaining battery level falls to or below a predetermined threshold.
  • an autonomous behavior-type robot includes a communication connection unit that connects to a server by a first wireless communication method based on access information for the server, an operation control unit for determining a motion of the robot, a drive mechanism for executing the motion selected by the operation control unit, a receiver for receiving the ID of another robot by a second wireless communication method whose communication range is shorter than that of the first wireless communication method, and a transmitter that transmits the access information to the other robot when its ID is received.
  • the autonomous behavior-type robot includes a transmitter that transmits an ID for identifying the self-robot in accordance with a predetermined short-range wireless communication scheme.
  • a plurality of transmitters are annularly arranged in a projection formed on the head or top of the robot.
  • an autonomous behavior-type robot includes an action control unit that selects a motion of the robot, a drive mechanism that executes the motion selected by the action control unit, and a receiver that receives a robot ID from another autonomous behavior-type robot.
  • the operation control unit instructs the drive mechanism to move to a position where it can receive the position code transmitted from a transmitter installed at a predetermined position on the autonomous behavior-type robot.
  • an accessory includes a transmitter that transmits an operation command to an autonomous behavior-type robot in accordance with a predetermined short-range wireless communication scheme.
  • a robot can easily recognize another robot.
  • FIG. 1A is a front external view of a robot.
  • FIG. 1 (b) is a side view of the robot.
  • FIG. 2 is a cross-sectional view schematically illustrating the structure of a robot.
  • It is a block diagram of a robot system.
  • It is a conceptual diagram of an emotion map.
  • It is a hardware block diagram of a robot.
  • It is a functional block diagram of a robot system.
  • It is an enlarged external view of a horn.
  • It is a schematic diagram showing how several robots form a formation.
  • It is a schematic diagram for explaining the authentication process of a guest robot by a host robot.
  • It is an external view of a charging station.
  • FIG. 1A is a front external view of the robot 100.
  • FIG. 1(b) is a side external view of the robot 100.
  • the robot 100 in the present embodiment is an autonomous behavior-type robot that determines its actions and gestures based on the external environment and its internal state.
  • the external environment is recognized by various sensors such as a camera and a thermo sensor.
  • the internal state is quantified as various parameters representing the emotion of the robot 100. These will be described later.
  • the robot 100 takes the indoor area of the owner's home as its action range.
  • hereinafter, a human being related to the robot 100 is referred to as a "user", and a user who is a member of the household to which the robot 100 belongs is referred to as an "owner".
  • the body 104 of the robot 100 has an overall rounded shape, and includes an outer shell formed of a soft and elastic material such as urethane, rubber, resin, or fiber.
  • the robot 100 may be dressed. Because the body 104 is rounded, soft, and pleasant to touch, the robot 100 provides the user with a sense of security and a pleasant tactile experience.
  • the robot 100 has a total weight of 15 kilograms or less, preferably 10 kilograms or less, and more preferably 5 kilograms or less.
  • the average weight of a 13-month-old baby is just over 9 kilograms for boys and just under 9 kilograms for girls. Therefore, if the total weight of the robot 100 is 10 kilograms or less, the user can hold the robot 100 with roughly the same effort as holding a baby that cannot yet walk on its own.
  • the average weight of a baby less than 2 months old is under 5 kilograms for both boys and girls. Therefore, if the total weight of the robot 100 is 5 kilograms or less, the user can hold the robot 100 with about the same effort as holding a small infant.
  • attributes such as the appropriate weight, roundness, softness, and pleasant touch combine to realize the effect that the user can easily hold the robot 100 and is inclined to hold it.
  • it is desirable that the height of the robot 100 is 1.2 meters or less, preferably 0.7 meters or less.
  • being easy to hold is an important design concept.
  • the robot 100 includes three wheels for traveling: as shown, a pair of front wheels 102 (left wheel 102a, right wheel 102b) and one rear wheel 103.
  • the front wheel 102 is a driving wheel
  • the rear wheel 103 is a driven wheel.
  • the front wheel 102 does not have a steering mechanism, but its rotational speed and rotational direction can be individually controlled.
  • the rear wheel 103 is a so-called omni wheel, which rotates freely so that the robot 100 can move back and forth and side to side.
  • by making the rotational speed of the right wheel 102b greater than that of the left wheel 102a, the robot 100 can turn left or rotate counterclockwise; by making the rotational speed of the left wheel 102a greater than that of the right wheel 102b, the robot 100 can turn right or rotate clockwise.
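  • as an illustration (not part of the patent text), the following is a minimal Python sketch of this differential-drive turning behavior; the class name, wheel-base value, and API are assumptions.

```python
# Differential-drive sketch (hypothetical names; the patent specifies no
# API). The turn rate follows from the wheel-speed difference; the rear
# omni wheel is passive and needs no control of its own.

class FrontWheelDrive:
    def __init__(self, wheel_base_m: float):
        self.wheel_base_m = wheel_base_m  # spacing of left/right wheels

    def body_velocity(self, v_left: float, v_right: float):
        """Return (forward speed, turn rate) for wheel speeds in m/s."""
        forward = (v_left + v_right) / 2.0
        # Positive turn rate = counterclockwise: right wheel faster.
        turn_rate = (v_right - v_left) / self.wheel_base_m
        return forward, turn_rate

drive = FrontWheelDrive(wheel_base_m=0.25)
print(drive.body_velocity(0.2, 0.4))  # right faster -> left turn (CCW)
print(drive.body_velocity(0.4, 0.2))  # left faster  -> right turn (CW)
```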
  • the front wheels 102 and the rear wheel 103 can be completely housed in the body 104 by a drive mechanism (a rotation mechanism and a link mechanism). Even when traveling, most of each wheel is hidden by the body 104, but when the wheels are completely housed in the body 104, the robot 100 cannot move. That is, the body 104 descends and sits on the floor surface F as the wheels are stored. In this seated state, the flat seating surface 108 (grounding bottom surface) formed on the bottom of the body 104 abuts the floor surface F.
  • the robot 100 has two hands 106.
  • the hand 106 does not have the function of gripping an object.
  • the hand 106 can perform simple operations such as raising, shaking and vibrating.
  • the two hands 106 are also individually controllable.
  • the eye 110 can display an image with a liquid crystal element or an organic EL element.
  • the robot 100 is equipped with various sensors, such as a microphone array that can identify the direction of a sound source and an ultrasonic sensor. It also has a built-in speaker and can emit simple sounds.
  • a horn 112 is attached to the head of the robot 100. As described above, the robot 100 is lightweight, so the user can lift the robot 100 by grasping the horn 112. An omnidirectional camera is attached to the horn 112 so that the entire region above the robot 100 can be imaged at once.
  • the horn 112 incorporates a transmitter and a receiver, the details of which will be described later with reference to FIGS. 7 and 8.
  • FIG. 2 is a cross-sectional view schematically showing the structure of the robot 100.
  • the body 104 of the robot 100 includes a base frame 308, a body frame 310, a pair of resin wheel covers 312, and an outer skin 314.
  • the base frame 308 is made of metal and constitutes an axial center of the body 104 and supports an internal mechanism.
  • the base frame 308 is configured by connecting an upper plate 332 and a lower plate 334 by a plurality of side plates 336 up and down.
  • the plurality of side plates 336 are spaced sufficiently apart to allow air flow.
  • inside the base frame 308, a battery 118, a control circuit 342, and various actuators are accommodated.
  • the body frame 310 is made of a resin material and includes a head frame 316 and a body frame 318.
  • the head frame 316 has a hollow hemispherical shape and forms a head skeleton of the robot 100.
  • the body frame 318 has a stepped cylindrical shape and forms the body frame of the robot 100.
  • the body frame 318 is integrally fixed to the base frame 308.
  • the head frame 316 is assembled to the upper end of the body frame 318 so as to be relatively displaceable.
  • the head frame 316 is provided with three axes of a yaw axis 320, a pitch axis 322 and a roll axis 324, and an actuator 326 for rotationally driving each axis.
  • the actuator 326 includes a plurality of servomotors for individually driving each axis.
  • the yaw shaft 320 is driven for head-turning motion,
  • the pitch shaft 322 is driven for nodding motion,
  • the roll shaft 324 is driven for head-tilting motion.
  • a plate 325 supporting the yaw axis 320 is fixed to the top of the head frame 316.
  • the plate 325 is formed with a plurality of vents 327 for ensuring ventilation between the top and bottom.
  • a metallic base plate 328 is provided to support the head frame 316 and its internal features from below.
  • the base plate 328 is connected to the plate 325 via the cross link mechanism 329 (pantograph mechanism), and is connected to the upper plate 332 (base frame 308) via the joint 330.
  • the body frame 318 houses the base frame 308 and the wheel drive mechanism 370.
  • the wheel drive mechanism 370 includes a pivot shaft 378 and an actuator 379.
  • the lower half of the body frame 318 is narrowed so as to form, together with the wheel cover 312, the housing space S for the front wheels 102.
  • the outer skin 314 is made of urethane rubber and covers the body frame 310 and the wheel covers 312 from the outside.
  • the hands 106 are integrally molded with the outer skin 314.
  • an opening 390 for introducing external air is provided at the upper end of the outer skin 314.
  • FIG. 3 is a block diagram of the robot system 300.
  • the robot system 300 includes a robot 100, a server 200 and a plurality of external sensors 114.
  • a plurality of external sensors 114 (external sensors 114a, 114b, ..., 114n) are installed in advance in the house.
  • the external sensor 114 may be fixed to the wall of the house or may be mounted on the floor.
  • the position coordinates of each external sensor 114 are registered in the server 200. The position coordinates are defined as x, y coordinates in the house assumed as the action range of the robot 100.
  • the server 200 is installed in a house.
  • the server 200 determines the basic behavior of the robot 100 based on the information obtained from the sensors contained in the robot 100 and the plurality of external sensors 114.
  • the external sensor 114 is for reinforcing the senses of the robot 100, and the server 200 is for reinforcing the brain of the robot 100.
  • the external sensor 114 periodically transmits a wireless signal (hereinafter referred to as a “robot search signal”) including the ID of the external sensor 114 (hereinafter referred to as “beacon ID”).
  • the robot 100 sends back a radio signal (hereinafter referred to as a “robot reply signal”) including a beacon ID.
  • the server 200 measures the time from when the external sensor 114 transmits the robot search signal until the robot reply signal is received, thereby determining the distance from the external sensor 114 to the robot 100. By measuring the distances from the plurality of external sensors 114 to the robot 100, the server specifies the position coordinates of the robot 100. Of course, the robot 100 may instead periodically transmit its own position coordinates to the server 200.
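  • as an illustration, the following is a minimal least-squares trilateration sketch of this position-specification step; the patent only states that distances to several external sensors are measured, so the solver choice and all names here are assumptions.

```python
# Least-squares trilateration sketch (the solver is an assumption; the
# patent only says distances to several external sensors are measured).
import numpy as np

def locate(sensor_xy: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """Estimate (x, y) from known sensor positions and measured distances.

    Subtracting the first sensor's circle equation from the others
    linearizes the problem into A @ p = b, solvable by least squares.
    """
    x0, y0 = sensor_xy[0]
    d0 = distances[0]
    A = 2 * (sensor_xy[1:] - sensor_xy[0])
    b = (d0**2 - distances[1:]**2
         + np.sum(sensor_xy[1:]**2, axis=1) - (x0**2 + y0**2))
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

sensors = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 4.0]])  # known positions
true_pos = np.array([2.0, 1.5])
dists = np.linalg.norm(sensors - true_pos, axis=1)        # measured
print(locate(sensors, dists))  # ~ [2.0, 1.5]
```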
  • FIG. 4 is a conceptual view of the emotion map 116.
  • the emotion map 116 is a data table stored in the server 200.
  • the robot 100 selects an action according to the emotion map 116.
  • the emotion map 116 shown in FIG. 4 indicates the magnitude of the robot 100's likes and dislikes for locations.
  • the x-axis and y-axis of emotion map 116 indicate two-dimensional space coordinates.
  • the z-axis indicates the strength of the feeling: a positive z value indicates a high preference for the location, and a negative z value indicates that the location is disliked.
  • the coordinate P1 is a point of high favorable feeling (hereinafter referred to as a "favor point") in the indoor space managed by the server 200 as the action range of the robot 100.
  • a favor point may be a "safe place" such as behind a sofa or under a table, a place where people tend to gather such as a living room, or a lively place. It may also be a place where the robot was gently soothed or touched in the past.
  • although the definition of what kind of place the robot 100 prefers is arbitrary, it is generally desirable to set places favored by small children and by small animals such as dogs and cats as favor points.
  • a coordinate P2 is a point at which a bad feeling is high (hereinafter, referred to as a “disgust point”).
  • disgust points may be places with loud noise such as near a television, places that easily get wet such as baths and washrooms, closed or dark spaces, and places that recall unpleasant memories of being treated roughly by a user.
  • the definition of what places the robot 100 hates is also arbitrary, but it is generally desirable to set places feared by small children and by small animals such as dogs and cats as disgust points.
  • the coordinate Q indicates the current position of the robot 100.
  • the server 200 determines how far the robot 100 is from which external sensor 114 and in which direction it lies.
  • alternatively, the movement distance of the robot 100 may be calculated from the number of revolutions of the front wheels 102 or the rear wheel 103 to specify the current position, or the current position may be specified based on an image obtained from the camera.
  • when the emotion map 116 shown in FIG. 4 is given, the robot 100 moves in the direction of being drawn toward the favor point (coordinate P1) and in the direction away from the disgust point (coordinate P2).
  • the emotion map 116 changes dynamically.
  • the z-value (favorable feeling) at the coordinate P1 decreases with time.
  • in this way, after the robot 100 reaches the favor point (coordinate P1), it emulates the biological behavior of having its "emotion" satisfied there and eventually "getting bored" with the place.
  • bad feelings at coordinate P2 are also alleviated with time.
  • new favor points and disgust points are created, whereby the robot 100 makes new action selections.
  • the robot 100 has an "interest" at a new favor point and continuously selects an action.
  • the emotion map 116 expresses the ups and downs of emotion as the internal state of the robot 100.
  • the robot 100 aims at the favor point, avoids the disgust point, stays at the favor point for a while, and then takes the next action again.
  • such control makes the action selection of the robot 100 human-like and biological.
  • the maps that affect the behavior of the robot 100 (hereinafter collectively referred to as "action maps") are not limited to the emotion map 116 of the type shown in FIG. 4.
  • various action maps can be defined, such as curiosity, avoidance of fear, a feeling of relief, and feelings of physical comfort such as quietness, dimness, coolness, and warmth.
  • the destination point of the robot 100 may be determined by weighted averaging the z values of each of the plurality of action maps.
  • the robot 100 has parameters indicating the magnitudes of various emotions and senses separately from the action maps. For example, when the value of the loneliness emotion parameter is rising, the weighting coefficient of the action map that evaluates safe places is set large, and the value of that emotion parameter is lowered once the robot reaches the target point. Similarly, when the value of the parameter indicating boredom is rising, the weighting coefficient of the action map that evaluates places satisfying curiosity may be set large.
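  • as an illustration, the following is a minimal sketch of the weighted-average destination selection described above; the grid size, map names, and the emotion-to-weight coupling are all assumptions.

```python
# Weighted-average destination selection sketch. Grid size, map names,
# and the emotion-to-weight coupling are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
H, W = 20, 20  # hypothetical grid over the indoor action range

# z values per grid cell for each action map (positive = attractive).
action_maps = {
    "safe_place": rng.normal(size=(H, W)),
    "curiosity": rng.normal(size=(H, W)),
}

# Emotion parameters modulate the per-map weighting coefficients.
emotions = {"loneliness": 0.8, "boredom": 0.2}
weights = {"safe_place": emotions["loneliness"],
           "curiosity": emotions["boredom"]}

# Weighted average of z values across maps, then pick the best cell.
total = sum(w * action_maps[name] for name, w in weights.items())
total /= sum(weights.values())
target = np.unravel_index(np.argmax(total), total.shape)
print("movement target point:", target)
```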
  • FIG. 5 is a hardware configuration diagram of the robot 100.
  • the robot 100 includes an internal sensor 128, a communicator 126, a storage device 124, a processor 122, a drive mechanism 120 and a battery 118.
  • the drive mechanism 120 includes the wheel drive mechanism 370 described above.
  • Processor 122 and storage 124 are included in control circuit 342.
  • the units are connected to each other by a power supply line 130 and a signal line 132.
  • the battery 118 supplies power to each unit via the power supply line 130. Each unit transmits and receives control signals through a signal line 132.
  • the battery 118 is a lithium ion secondary battery and is a power source of the robot 100.
  • the internal sensor 128 is an assembly of the various sensors incorporated in the robot 100: specifically, a camera (omnidirectional camera), a microphone array, a distance measurement sensor (infrared sensor), a thermo sensor, a touch sensor, an acceleration sensor, an odor sensor, and the like.
  • the touch sensor is disposed between the outer skin 314 and the body frame 310 to detect a touch of the user.
  • the odor sensor is a known sensor applying the principle that electrical resistance changes when the molecules that are the source of an odor are adsorbed.
  • the communication device 126 is a communication module that performs wireless communication for various external devices such as the server 200, the external sensor 114, and a portable device owned by a user.
  • the communication device 126 includes a first communication device 302 in charge of communication with the server 200 and the external sensor 114, and a second communication device 304 in charge of communication with another robot 100 or the like.
  • the first communication device 302 communicates with the server 200 and the like in a non-directional communication method (first wireless communication method) according to Wi-Fi (registered trademark).
  • the second communication device 304 communicates with other robots 100 and the like by a directional communication method with a narrow communication range (second wireless communication method) based on IrDA (Infrared Data Association) (registered trademark). Details will be described later.
  • the storage device 124 is configured by a non-volatile memory and a volatile memory, and stores a computer program and various setting information.
  • the processor 122 is an execution means of a computer program.
  • the drive mechanism 120 is an actuator that controls the internal mechanisms. In addition, indicators and speakers are also installed.
  • the processor 122 performs action selection of the robot 100 while communicating with the server 200 and the external sensor 114 via the communication device 126.
  • Various external information obtained by the internal sensor 128 also affects behavior selection.
  • the drive mechanism 120 mainly controls the wheel (front wheel 102) and the head (head frame 316).
  • the drive mechanism 120 changes the rotational speed and rotational direction of each of the two front wheels 102 to change the moving direction and moving speed of the robot 100.
  • the drive mechanism 120 can also raise and lower the wheels (the front wheel 102 and the rear wheel 103). When the wheel ascends, the wheel is completely housed in the body 104, and the robot 100 abuts on the floor surface F at the seating surface 108 to be in the seating state. Also, the drive mechanism 120 controls the hand 106 via the wire 135.
  • FIG. 6 is a functional block diagram of the robot system 300.
  • the robot system 300 includes the robot 100, the server 200, the accessory 140, and the plurality of external sensors 114.
  • the accessory 140 has a function of transmitting an operation command to the robot 100.
  • the components of the robot 100, the server 200, and the accessory 140 are realized by hardware, including arithmetic units such as central processing units (CPUs) and various co-processors, storage devices such as memory and storage, and wired or wireless communication lines connecting them, and by software that is stored in the storage devices and supplies processing instructions to the arithmetic units.
  • the computer program may consist of a device driver, an operating system, various application programs located in the layer above them, and a library that provides common functions to these programs. Each block described below indicates a functional block, not a hardware-unit configuration.
  • Some of the functions of the robot 100 may be realized by the server 200, and some or all of the functions of the server 200 may be realized by the robot 100.
  • the server 200 includes a communication unit 204, a data processing unit 202, and a data storage unit 206.
  • the communication unit 204 takes charge of communication processing with the external sensor 114 and the robot 100.
  • the data storage unit 206 stores various data.
  • the data processing unit 202 executes various processes based on the data acquired by the communication unit 204 and the data stored in the data storage unit 206.
  • the data processing unit 202 also functions as an interface of the communication unit 204 and the data storage unit 206.
  • the data storage unit 206 includes a motion storage unit 232, a map storage unit 216, and a personal data storage unit 218.
  • the robot 100 has a plurality of motion patterns (motions). Various motions are defined, such as waving the hand 106, approaching the owner while meandering, and staring at the owner with its head tilted.
  • the motion storage unit 232 stores a "motion file" that defines control content of motion. Each motion is identified by a motion ID. The motion file is also downloaded to the motion storage unit 160 of the robot 100. Which motion is to be performed may be determined by the server 200 or the robot 100.
  • the motions of the robot 100 are configured as complex motions including a plurality of unit motions.
  • for example, a motion of drawing close to the owner may be expressed as a combination of a unit motion of turning toward the owner, a unit motion of approaching while raising a hand, a unit motion of approaching while shaking the body, and a unit motion of sitting down while raising both hands.
  • the combination of these four unit motions realizes a motion of "approaching the owner, raising a hand partway, and finally sitting down after shaking the body".
  • in the motion file, the rotation angle and angular velocity of each actuator provided in the robot 100 are defined along the time axis.
  • Various motions are represented by controlling each actuator with the passage of time according to a motion file (actuator control information).
  • the transition time when changing from the previous unit motion to the next unit motion is called “interval".
  • the interval may be defined according to the time required for unit motion change and the contents of the motion.
  • the length of the interval is adjustable.
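  • as an illustration, the following is a minimal sketch of a motion file played back as unit motions joined by intervals; the data layout, field names, and playback interface are assumptions, since the patent fixes no file format.

```python
# Motion-file sketch: a motion is a sequence of unit motions, each a
# list of (time, actuator, angle) keyframes, joined by intervals.
# Field names and the playback interface are assumptions.
from dataclasses import dataclass, field

@dataclass
class UnitMotion:
    keyframes: list  # (t_seconds, actuator_name, angle_degrees)

@dataclass
class Motion:
    motion_id: str
    units: list = field(default_factory=list)
    interval_s: float = 0.2  # transition time between unit motions

def play(motion, drive):
    for unit in motion.units:
        for t, actuator, angle in unit.keyframes:
            drive.set_target(actuator, angle, at=t)
        drive.wait(motion.interval_s)  # pause before next unit motion

class PrintDrive:  # stand-in for the drive mechanism 120
    def set_target(self, actuator, angle, at):
        print(f"t={at:.1f}s: {actuator} -> {angle} deg")
    def wait(self, seconds):
        print(f"(interval {seconds:.1f}s)")

raise_hand = UnitMotion([(0.0, "hand", 0), (0.5, "hand", 80)])
sit_down = UnitMotion([(0.0, "front_wheels", 0), (0.8, "front_wheels", -90)])
play(Motion("M1", [raise_hand, sit_down], interval_s=0.3), PrintDrive())
```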
  • settings relating to behavior control of the robot 100 such as when to select which motion, output adjustment of each actuator for realizing the motion, and the like are collectively referred to as “behavior characteristics”.
  • the behavior characteristics of the robot 100 are defined by a motion selection algorithm, a motion selection probability, a motion file, and the like.
  • the motion storage unit 232 stores, in addition to the motion file, a motion selection table that defines motion to be executed when various events occur.
  • one or more motions and their selection probabilities are associated with an event.
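  • as an illustration, the following is a minimal sketch of event-driven selection from such a table; event names, motion IDs, and probabilities are assumptions (the 20%/5% figures echo the example given later in this description).

```python
# Motion-selection-table sketch: each event maps to candidate motions
# with selection probabilities; leftover probability means "no motion".
# Event names, motion IDs, and probabilities are illustrative.
import random

SELECTION_TABLE = {
    "pleasant_action": [("motion_A", 0.20), ("motion_C", 0.10)],
    "temperature_over_30C": [("motion_B", 0.05)],
}

def select_motion(event):
    r = random.random()
    cumulative = 0.0
    for motion_id, prob in SELECTION_TABLE.get(event, []):
        cumulative += prob
        if r < cumulative:
            return motion_id
    return None  # no motion selected for this event

print(select_motion("pleasant_action"))
```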
  • the map storage unit 216 stores, in addition to a plurality of action maps, a map indicating the arrangement of obstacles such as chairs and tables.
  • the personal data storage unit 218 stores information on users, in particular owners. Specifically, it stores the closeness to each user and master information indicating the user's physical and behavioral characteristics. Other attribute information such as age and gender may also be stored.
  • the personal data storage unit 218 also registers familiarity not only for users but also for other robots 100.
  • the robot 100 has an internal parameter called familiarity for each user.
  • when the robot 100 recognizes an action indicating favor toward itself, such as being picked up or spoken to, its familiarity with that user increases.
  • familiarity is low with users who are not involved with the robot 100, users who behave violently, and users who are met infrequently. Familiarity with other robots 100 will be described later.
  • the data processing unit 202 includes a position management unit 208, a map management unit 210, a recognition unit 212, an operation control unit 222, a closeness management unit 220, and a state management unit 244.
  • the position management unit 208 specifies the position coordinates of the robot 100 by the method described with reference to FIG.
  • the position management unit 208 may also track the user's position coordinates in real time.
  • the state management unit 244 manages various internal parameters indicating physical states such as the charging rate, internal temperature, and processing load of the processor 122.
  • the state management unit 244 includes an emotion management unit 234.
  • the emotion management unit 234 manages various emotion parameters indicating the emotions of the robot 100 (loneliness, curiosity, desire for approval, and so on). These emotion parameters fluctuate constantly.
  • the importance of the plurality of action maps changes according to the emotion parameter, the movement target point of the robot 100 changes according to the action map, and the emotion parameter changes according to the movement of the robot 100 or the passage of time.
  • when the emotion parameter indicating loneliness rises, the emotion management unit 234 sets a large weighting coefficient for the action map that evaluates safe places.
  • when the robot 100 reaches a point where its loneliness can be relieved, the emotion management unit 234 lowers the emotion parameter indicating loneliness.
  • the various emotion parameters are also changed by the response actions described later. For example, the emotion parameter indicating loneliness declines when the robot is "held" by the owner, and gradually increases when the robot has not seen the owner for a long time.
  • the map management unit 210 changes the parameter of each coordinate in the method described with reference to FIG. 4 for a plurality of action maps.
  • the recognition unit 212 recognizes the external environment.
  • recognition of the external environment includes various kinds of recognition, such as recognition of weather and season based on temperature and humidity, and recognition of cover (a safe area) based on light quantity and temperature.
  • the recognition unit 156 of the robot 100 acquires various types of environment information by the internal sensor 128, performs primary processing on the environment information, and transfers the information to the recognition unit 212 of the server 200.
  • the recognition unit 156 of the robot 100 extracts from the captured image the image region corresponding to a moving object, in particular a person or an animal, and extracts from that region a feature vector indicating the physical and behavioral features of the moving object.
  • the feature vector component is a numerical value that quantifies various physical and behavioral features. For example, the width of the human eye is digitized in the range of 0 to 1 to form one feature vector component.
  • the method of extracting feature vectors from a captured image of a person is an application of known face recognition technology.
  • the robot 100 transmits the feature vector to the server 200.
  • the recognition unit 212 of the server 200 further includes a person recognition unit 214 and a response recognition unit 228.
  • the person recognition unit 214 compares the feature vector extracted from the image captured by the robot 100's built-in camera with the feature vectors of users (clusters) registered in advance in the personal data storage unit 218, and thereby determines which person the imaged user corresponds to (user identification processing).
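  • as an illustration, the following is a minimal sketch of this user identification step as nearest-cluster matching over feature vectors; the distance metric, threshold, and profile vectors are assumptions.

```python
# Nearest-cluster user-identification sketch. The metric (Euclidean)
# and the "unknown person" threshold are assumptions.
import numpy as np

profiles = {  # cluster id -> representative feature vector
    "profile_1_bearded_early_riser": np.array([0.9, 0.8, 0.1]),
    "profile_2_glasses_and_skirts": np.array([0.1, 0.3, 0.9]),
}

def identify(feature_vector, threshold=0.5):
    best_id, best_dist = None, float("inf")
    for pid, ref in profiles.items():
        dist = np.linalg.norm(feature_vector - ref)
        if dist < best_dist:
            best_id, best_dist = pid, dist
    # Too far from every known cluster -> likely a new person.
    return best_id if best_dist < threshold else None

print(identify(np.array([0.85, 0.75, 0.15])))  # matches profile 1
print(identify(np.array([0.5, 0.5, 0.5])))     # None: unregistered
```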
  • the person recognition unit 214 includes an expression recognition unit 230.
  • the facial expression recognition unit 230 estimates the user's emotion by performing image recognition on the user's facial expression.
  • the person recognition unit 214 also performs user identification processing on moving objects other than a person, for example, cats and dogs that are pets.
  • the response recognition unit 228 recognizes various response actions made to the robot 100, and classifies them as pleasant and unpleasant actions.
  • the response recognition unit 228 also classifies into a positive / negative response by recognizing the owner's response to the behavior of the robot 100.
  • whether a response action is pleasant or unpleasant is determined by whether it would be comfortable or uncomfortable for a living creature. For example, being held is a pleasant act for the robot 100, and being kicked is an unpleasant act for the robot 100.
  • the positive / negative response is determined depending on whether the user's response indicates a user's pleasant emotion or an unpleasant emotion. For example, being held is a positive response indicating the user's pleasant feeling, and kicking is a negative response indicating the user's unpleasant feeling.
  • the motion control unit 222 of the server 200 cooperates with the motion control unit 150 of the robot 100 to determine the motion of the robot 100.
  • the motion control unit 222 of the server 200 creates a movement target point of the robot 100 and a movement route for the movement based on the action map selection by the map management unit 210.
  • the operation control unit 222 may create a plurality of movement routes, and then select one of the movement routes.
  • the motion control unit 222 selects the motion of the robot 100 from the plurality of motions of the motion storage unit 232.
  • each motion is associated with a selection probability for each situation. For example, a selection method may be defined such that motion A is executed with a probability of 20% when the owner performs a pleasant action, and motion B is executed with a probability of 5% when the temperature reaches 30 degrees or more.
  • a movement target point and a movement route are determined by the action maps, and a motion is selected in response to the various events described later.
  • the closeness management unit 220 manages closeness for each user. As described above, closeness is registered in the personal data storage unit 218 as part of the personal data. When a pleasant act is detected, the closeness management unit 220 increases the closeness to that owner; when an unpleasant act is detected, it decreases the closeness. In addition, the closeness to an owner who has not been seen for a long time gradually decreases.
  • the closeness management unit 220 of the present embodiment manages closeness not only for owners but also for each robot 100. Familiarity with other robots 100 is also registered in the personal data storage unit 218 as part of the personal data. The closeness to another robot 100 rises or falls depending on how the robots interact, as will be described in detail later.
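  • as an illustration, the following is a minimal sketch of this closeness bookkeeping; the increment sizes, decay rate, and method names are assumptions.

```python
# Closeness-management sketch. Increment sizes, the decay rate, and
# method names are assumptions; the patent gives no concrete values.
import time

class ClosenessManager:
    def __init__(self):
        self.closeness = {}   # subject id (user or robot) -> score
        self.last_seen = {}   # subject id -> timestamp

    def on_pleasant_act(self, subject, amount=1.0):
        self.closeness[subject] = self.closeness.get(subject, 0.0) + amount
        self.last_seen[subject] = time.time()

    def on_unpleasant_act(self, subject, amount=1.0):
        self.closeness[subject] = self.closeness.get(subject, 0.0) - amount
        self.last_seen[subject] = time.time()

    def decay(self, per_day=0.1):
        """Gradually reduce closeness to subjects not seen for a while."""
        now = time.time()
        for subject, seen in self.last_seen.items():
            days_absent = (now - seen) / 86400
            self.closeness[subject] -= per_day * days_absent

mgr = ClosenessManager()
mgr.on_pleasant_act("owner_A")    # e.g. held, spoken to
mgr.on_unpleasant_act("owner_B")  # e.g. kicked, treated roughly
```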
  • the robot 100 includes a first communication unit 142, a second communication unit 134, a data processing unit 136, a data storage unit 148, an internal sensor 128, and a drive mechanism 120.
  • the first communication unit 142 corresponds to the first communication device 302
  • the second communication unit 134 corresponds to the second communication device 304 (see FIG. 5).
  • the first communication unit 142 takes charge of communication processing with the external sensor 114 and the server 200.
  • the second communication unit 134 takes charge of communication processing with another robot 100 and the accessory 140.
  • the accessory 140 in the present embodiment is described as a wristband.
  • the first communication unit 142 includes a communication connection unit 138.
  • the communication connection unit 138 establishes communication connection with the server 200 by Wi-Fi (registered trademark) (first wireless communication method).
  • the second communication unit 134 includes a transmitter 162 (transmitter) and a receiver 164 (receiver).
  • the second communication unit 134 performs near field communication with the other robot 100 or the accessory 140 by IrDA (registered trademark) (second wireless communication method).
  • the "near-field wireless communication" in the present embodiment means wireless communication whose communication range is within 5 meters, preferably within 1.5 meters, and more preferably within 1 meter.
  • the near field communication is preferably a directional communication method, and preferably ad hoc communication (direct type).
  • the data storage unit 148 stores various data.
  • the data storage unit 148 corresponds to the storage device 124 (see FIG. 5).
  • the data processing unit 136 executes various processes based on the data acquired by the first communication unit 142 and the second communication unit 134, the sensor information detected by the internal sensor 128, and the data stored in the data storage unit 148.
  • the data processing unit 136 corresponds to a processor 122 and a computer program executed by the processor 122.
  • the data processing unit 136 also functions as an interface of the first communication unit 142, the second communication unit 134, the internal sensor 128, the drive mechanism 120, and the data storage unit 148.
  • the data storage unit 148 includes a motion storage unit 160 that defines various motions of the robot 100.
  • Various motion files are downloaded from the motion storage unit 232 of the server 200 to the motion storage unit 160 of the robot 100.
  • Motion is identified by motion ID.
  • various motions are defined, such as sitting with only the front wheels 102 housed, spinning by rotating the two front wheels 102 in opposite directions or rotating only one front wheel 102 while the wheels are housed, shaking by rotating the front wheels 102 in the housed state, and stopping and looking back once when leaving the user. The operation timing, operation time, operation direction, and so on of the various actuators (drive mechanism 120) are defined in time series in the motion file.
  • Various data may also be downloaded to the data storage unit 148 from the map storage unit 216 and the personal data storage unit 218.
  • the data processing unit 136 includes a recognition unit 156, an operation control unit 150, a command selection unit 166, a robot detection unit 152, and a charge monitoring unit 154.
  • the motion control unit 150 of the robot 100 determines the motion of the robot 100 in cooperation with the motion control unit 222 of the server 200. Some motions may be determined by the server 200 and other motions by the robot 100. Basically, the robot 100 determines its own motions, but the server 200 may determine them when the processing load on the robot 100 is high. A base motion may be determined at the server 200 and additional motions at the robot 100. How the motion determination process is shared between the server 200 and the robot 100 may be designed according to the specification of the robot system 300.
  • the motion control unit 150 of the robot 100 determines the moving direction of the robot 100 together with the motion control unit 222 of the server 200.
  • movement based on the action maps may be determined by the server 200, while immediate movement such as avoiding an obstacle may be determined by the operation control unit 150 of the robot 100.
  • the drive mechanism 120 drives the front wheel 102 in accordance with an instruction from the operation control unit 150 to direct the robot 100 to the movement target point.
  • the operation control unit 150 of the robot 100 instructs the drive mechanism 120 to execute the selected motion.
  • the drive mechanism 120 controls each actuator according to the motion file.
  • the motion control unit 150 can also execute a motion of lifting both hands 106 as a gesture inviting a "hug" when a user with high familiarity is nearby, and, once tired of the hug, can express reluctance by alternately repeating reverse rotation and stopping of the left and right front wheels 102 while they remain housed.
  • the drive mechanism 120 causes the robot 100 to express various motions by driving the front wheel 102, the hand 106, and the neck (head frame 316) according to the instruction of the operation control unit 150.
  • the command selection unit 166 selects an operation command.
  • an operation command is a command that instructs another robot 100 to select a motion.
  • the command selection unit 166 may select one of a plurality of types of operation commands at arbitrary timing, or may select an operation command corresponding to an event when that event occurs. For example, when the motion control unit 150 selects a motion M, the command selection unit 166 may select an operation command X associated in advance with the motion M.
  • the transmitting unit 162 transmits the selected operation command to the other robot 100.
  • the robot detection unit 152 specifies the presence of another robot and the direction or position thereof. The details of the robot information will be described later.
  • the charge monitoring unit 154 monitors the remaining battery level of the battery 118.
  • the recognition unit 156 of the robot 100 interprets external information obtained from the internal sensor 128.
  • the recognition unit 156 is capable of visual recognition (visual unit), odor recognition (olfactory unit), sound recognition (hearing unit), and tactile recognition (tactile unit).
  • the recognition unit 156 periodically images the outside world with the built-in omnidirectional camera, and detects a moving object such as a person or a pet.
  • the recognition unit 156 includes a feature extraction unit 146.
  • the feature extraction unit 146 extracts a feature vector from the captured image of the moving object.
  • the feature vector is a set of parameters (features) indicating physical features and behavioral features of the moving object.
  • the robot system 300 clusters users who appear frequently as "owners" based on physical and behavioral features obtained from a large amount of image information and other sensing information. For example, if a bearded moving object (user) is often active in the early morning (an early riser) and rarely wears red clothes, a first profile is created: a cluster (user) that has a beard, often gets up early, and rarely wears red clothes. On the other hand, if a moving object that wears glasses often wears a skirt but never has a beard, a second profile is created: a cluster (user) that wears glasses and skirts but definitely has no beard.
  • the robot 100 does not need to recognize that the first profile is the "father"; it only needs to recognize the figure of "a cluster that has a beard, often gets up early, and rarely wears red clothes". For each profile, a feature vector characterizing the profile is defined.
  • suppose the robot 100 newly recognizes a moving object (user) in a state where such cluster analysis has been completed.
  • at this time, the person recognition unit 214 of the server 200 executes user identification processing based on the feature vector of the new moving object and determines which profile (cluster) the moving object corresponds to. For example, when a bearded moving object is detected, it is likely to be the father; if that moving object is also active in the early morning, it is even more certain to correspond to the father. On the other hand, a moving object wearing glasses may be the mother; but if that moving object has a beard, it is neither the mother nor the father, so it is determined to be a new person who has not been cluster-analyzed.
  • the formation of clusters (profiles) by feature extraction (cluster analysis) and the fitting of moving objects to clusters accompanying feature extraction may be executed concurrently.
  • in other words, the recognition unit 156 of the robot 100 selects and extracts the information necessary for recognition, while interpretation processes such as judgment are executed by the recognition unit 212 of the server 200.
  • the recognition processing may be performed only by the recognition unit 212 of the server 200, only by the recognition unit 156 of the robot 100, or by both sharing roles as described above.
  • when a strong impact is applied to the robot 100, the recognition unit 156 recognizes it via the built-in acceleration sensor, and the response recognition unit 228 of the server 200 recognizes that a nearby user has performed a "violent act". Even when the user lifts the robot 100 by grabbing the horn 112, this may be recognized as a violent act.
  • when a user facing the robot 100 speaks in a specific volume range and frequency band, the response recognition unit 228 of the server 200 may recognize that a "calling action" has been performed toward itself.
  • when a temperature around body temperature is detected, the robot recognizes that the user has performed a "contact action", and when upward acceleration is detected while the contact is recognized, it recognizes that a "hug" has been performed.
  • the robot may sense the physical contact that occurs when the user lifts the body 104, or may recognize a hug by the reduction of the load acting on the front wheels 102.
  • in summary, the robot 100 acquires the user's actions as physical information via the internal sensor 128, the response recognition unit 228 of the server 200 determines pleasantness or unpleasantness, and the recognition unit 212 of the server 200 executes user identification processing based on the feature vector.
  • the response recognition unit 228 of the server 200 recognizes various responses of the user to the robot 100.
  • some typical response actions correspond to pleasure or discomfort, affirmation or denial.
  • most pleasurable actions are positive responses, and most offensive actions are negative.
  • Pleasure and discomfort are related to intimacy, and affirmative and negative responses affect the action selection of the robot 100.
  • according to the recognized response action, the closeness management unit 220 of the server 200 changes the closeness to the user.
  • closeness to a user who performs a pleasant act increases, and closeness to a user who performs an unpleasant act decreases.
  • the closeness to the user changes depending on what action is taken from the moving object (user).
  • the robot 100 sets high closeness to people it meets frequently, people who often touch it, and people who often speak to it. On the other hand, closeness to people it rarely sees, people who seldom touch it, violent people, and people who scold in a loud voice becomes low.
  • the robot 100 changes the intimacy degree of each user based on various external information detected by sensors (vision, touch, hearing).
  • the actual robot 100 autonomously performs complex action selection in accordance with the action map.
  • the robot 100 acts while being influenced by a plurality of action maps based on various parameters such as loneliness, boredom and curiosity.
  • in principle, when the influence of the action maps is excluded, or in an internal state where their influence is small, the robot 100 tries to approach people with high closeness and to move away from people with low closeness.
  • the behavior of the robot 100 is categorized as follows according to closeness.
  • for a user with very high closeness, the robot 100 approaches the user (hereinafter referred to as a "proximity action") and strongly expresses affection by performing an affection gesture defined in advance as a gesture toward favored people.
  • for a user with relatively high closeness, the robot 100 performs only the proximity action.
  • for a user with relatively low closeness, the robot 100 performs no particular action.
  • for a user with particularly low closeness, the robot 100 performs a leaving action.
  • when the robot 100 finds a user with high closeness, it approaches that user; conversely, when it finds a user with low closeness, it moves away from that user.
  • by such behavior, the robot 100 can express so-called "shyness toward strangers".
  • when a visitor (user A with low closeness) appears, the robot 100 may move away from the visitor and head toward a family member (user B with high closeness).
  • in this case, user B can feel that the robot 100 is wary of the stranger, feels uneasy, and is relying on user B.
  • such a behavioral expression evokes in user B the joy of being chosen and relied upon, and the accompanying feeling of attachment.
  • on the other hand, when user A, the visitor, visits frequently and calls out to and touches the robot, the robot 100's closeness to user A gradually rises, and the robot 100 stops performing shy behavior (the leaving action) toward user A.
  • user A, too, can come to feel attachment to the robot 100 by sensing that the robot 100 is getting used to him.
  • the above action selection is not always executed. For example, when the action map for finding a place that satisfies curiosity is emphasized, the robot 100 may not select a behavior influenced by closeness.
  • when the external sensor 114 installed at the entrance detects that a user has come home, the robot may execute the action of greeting that user with top priority.
  • the accessory 140 is an accessory capable of transmitting an operation command to an unspecified number of robots 100.
  • the accessory 140 includes a transmitting unit 144, a receiving unit 170, and a command selection unit 172.
  • the transmitting unit 144 and the receiving unit 170 communicate with the robot 100 by IrDA (registered trademark) (second wireless communication method).
  • the receiving unit 170 receives robot information described later from the transmitting unit 162 of the robot 100.
  • the transmitting unit 144 transmits an operation command to the robot 100.
  • the operation command of the accessory 140 will be described later with reference to FIG.
  • the command selection unit 172 selects an operation command in accordance with the user's instruction. A plurality of operation commands are registered in the accessory 140 in advance. The user can choose the operation command to be transmitted by operating a button (not shown) on the accessory 140.
  • FIG. 7 is an enlarged view of the horn 112.
  • a communication device disposition surface G is formed on the horn 112.
  • IrDA is highly directional and has a short communication range of about 0.3 to 1.0 meter.
  • the communication range of IrDA (registered trademark) is limited to a short distance as compared to general wireless communication.
  • moreover, IrDA (registered trademark) cannot communicate if there is even a simple non-transmissive shield, such as a sheet of paper, between the transmitter and the receiver, so it can communicate only with a party that is not merely nearby but in a visible state. In other words, since transmission and reception are possible only when the parties can see each other, extremely natural and secure communication is possible, like a person whispering a secret.
  • FIG. 8 is a top view of the communication device disposition surface G.
  • Eight second communication devices 304 (second communication devices 304a to 304h) are annularly arranged on the circular communication device arrangement surface G.
  • the upper side of the drawing corresponds to the front of the robot 100 (front side), and the lower side of the drawing corresponds to the rear of the robot 100 (back side).
  • each second communication device 304 includes one transmitter 158 and one receiver 168. Therefore, eight transmitters 158 and eight receivers 168 are alternately arranged on the communication device disposition surface G.
  • Position codes (IDs) are set to the eight second communication devices 304 (the transmitters 158 and the receivers 168) as follows.
  • Position code of second communication device 304a (front): F
    Position code of second communication device 304b (front left): FL
    Position code of second communication device 304c (left): L
    Position code of second communication device 304d (left rear): BL
    Position code of second communication device 304e (rear): B
    Position code of second communication device 304f (right rear): BR
    Position code of second communication device 304g (right): R
    Position code of second communication device 304h (front right): FR
  • the eight transmitters 158 (transmitters 158a to 158h) correspond to the transmitting unit 162 in FIG. 6, and the eight receivers 168 (receivers 168a to 168h) correspond to the receiving unit 164.
  • the transmitter 158 transmits "robot information" and operation commands by IrDA (registered trademark). Since IrDA (registered trademark) has directivity, signals such as the robot information are transmitted periodically and simultaneously by the eight transmitters 158 in eight directions.
  • the robot information includes a “robot ID” for identifying the robot 100 and a position code (transmitter ID) for identifying the second communication device 304 (transmitter 158).
  • the transmitter 158a transmits a robot ID and a position code (F)
  • the transmitter 158b transmits a robot ID and a position code (FL).
  • the robot ID may be information that uniquely identifies the robot 100, such as a MAC address (Media Access Control address) or a serial number.
  • the receiver 168 receives robot information and operation instructions from another robot 100.
  • the receiver 168 also receives operation commands from the accessory 140.
  • when a second robot 100 receives robot information from a first robot 100, the second robot 100 can specify the transmission source, as well as its position, direction, and distance, based on that robot information.
  • the second robot 100 first specifies the first robot 100 as a transmission source based on the robot ID.
  • suppose the left receiver 168c (position code: L) of the second robot 100 receives robot information including "position code: F" from the first robot 100.
  • in this case, it is determined that the first robot 100 is positioned to the left of the second robot 100 and that the first robot 100 faces the second robot 100. This is because the left receiver 168c (position code: L) of the second robot 100 received the robot information from the transmitter 158a (position code: F) on the front of the first robot 100.
  • the robot detection unit 152 of the second robot 100 specifies the direction in which the first robot 100 is present based on the reception intensity of the robot information at each of the plurality of receivers 168. For example, if the reception intensity at the receiver 168c (left) is greater than that at any other receiver 168, the first robot 100 can be identified as being to the left of the second robot 100. The robot detection unit 152 also estimates the distance to the external robot 100 based on the reception intensity. In addition, when the reception intensity of the signal from the transmitter 158a of the first robot 100 is greater than that of the signals from the other transmitters, it can be determined that the first robot 100 faces the second robot 100.
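  • as an illustration, the following is a minimal sketch of this inference from per-receiver reception intensities; the position codes follow the arrangement above, while the intensity values and the proximity threshold are assumptions.

```python
# Direction/orientation inference sketch from IrDA reception intensity.
# Position codes follow the arrangement above; intensity values and the
# proximity threshold are illustrative assumptions.

def detect(readings):
    """readings: own receiver code -> (intensity, sender transmitter code)."""
    # The strongest own receiver tells us where the other robot is.
    own_code = max(readings, key=lambda c: readings[c][0])
    intensity, sender_code = readings[own_code]
    return {
        "direction": own_code,            # e.g. "L": robot is to our left
        "facing_us": sender_code == "F",  # strongest from its front side
        "near": intensity > 0.5,          # crude proximity from intensity
    }

# Our left receiver hears the other robot's front transmitter loudest.
print(detect({"L": (0.9, "F"), "FL": (0.4, "F"), "BL": (0.2, "FL")}))
# -> {'direction': 'L', 'facing_us': True, 'near': True}
```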
  • the transmitter 158 can transmit robot information only in a narrow area of about 1 meter. For this reason, near field communication by IrDA (registered trademark) is established only when another robot 100 (hereinafter, referred to as “external robot 100”) exists at a close distance.
  • the server 200 registers the robot ID of the external robot 100 in the “robot list”.
  • the external robot 100 whose robot ID has been registered (recognized) is also referred to as “registered robot 100”
  • the unregistered external robot 100 is also referred to as “unregistered robot 100”.
  • the robot list is shared by the server 200 and the robot 100. In contrast to the external robot 100 being recognized, the recognizing robot 100 is also referred to as the "self-robot 100".
  • the recognition unit 156 determines the external robot 100 by referring to the robot list. If a robot ID not present in the robot list is received, the external robot 100 is an unregistered robot 100. In this case, the robot 100 transmits the robot ID to the server 200, and the recognition unit 212 of the server 200 registers the newly detected robot ID in the robot list.
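  • as an illustration, the following is a minimal sketch of this registered/unregistered check against the shared robot list; the storage form and server interface are assumptions.

```python
# Robot-list sketch: decide whether a received robot ID is registered
# and report new IDs to the server. Storage and sync are assumptions.

class RobotRecognizer:
    def __init__(self, server):
        self.server = server
        self.robot_list = set(server.fetch_robot_list())  # shared list

    def on_robot_id(self, robot_id):
        if robot_id in self.robot_list:
            return "registered"    # may fire event E1 (robot ID detected)
        # Unregistered: report to the server, which updates the list.
        self.server.register_robot(robot_id)
        self.robot_list.add(robot_id)
        return "unregistered"      # may fire event E2

class FakeServer:  # stand-in for the server 200
    def __init__(self):
        self._ids = {"robot-001"}
    def fetch_robot_list(self):
        return self._ids
    def register_robot(self, robot_id):
        self._ids.add(robot_id)

rec = RobotRecognizer(FakeServer())
print(rec.on_robot_id("robot-001"))  # registered
print(rec.on_robot_id("robot-042"))  # unregistered
```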
  • a registered robot 100 is an external robot 100 that the self-robot 100 knows, in other words, one it has previously interacted with.
  • the motion control unit 150 selects any one of a plurality of types of motion associated with an event E1 of “new detection of robot ID”. Specifically, various motions such as turning around, approaching, leaving, etc. can be considered.
  • the motion control unit 150 may express, through behavior, that it has recognized the external robot 100, for example by stopping the motion in progress, changing the interval of the motion, or changing the execution speed of the motion in response to the event E1.
  • the operation control unit 150 selects any one of a plurality of types of motion associated with an event E2 of "detection of an unregistered robot ID". Specifically, various motions can be considered, such as turning around, approaching, and tilting the neck. The motion control unit 150 may express curiosity or wariness at finding an unknown robot 100, for example by stopping the motion in progress or reducing the execution speed of the motion in response to the event E2.
  • The operation control unit 150 changes the behavior characteristics of the robot 100 according to the direction in which the external robot 100 is present. For example, when an external robot 100 is behind the robot, a motion moving away from it may be executed. In other words, when the event E3, "detection of a robot ID by the receiver 168e (position code: B) located at the rear", occurs, the operation control unit 150 sets the movement target point of the robot 100 ahead, expressing the behavior of being startled by and fleeing from a robot 100 that appears behind it. The motion control unit 150 may instead execute a motion of looking backward by rotating the robot 100. Similarly, when an external robot 100 is in front of the self-robot 100, the motion control unit 150 may execute a motion approaching it.
  • the transmitting unit 162 of the robot 100 transmits an operation command to the external robot 100.
  • the motion command includes the motion ID of the motion to be executed.
  • The robot 100 may transmit an operation command at an arbitrary timing, for example at random, or may transmit an operation command when an event occurs, such as when an external robot 100 is detected.
  • When the robot 100Q receives a following command, the operation control unit 150 of the robot 100Q may move behind the robot 100P, the transmission source, and thereafter follow the robot 100P. This following operation is described in more detail with reference to FIG. 9.
  • When the robot 100Q receives the operation command X from the robot 100P, it may or may not follow the operation command X.
  • the robot 100Q may follow the motion command X with a predetermined probability.
  • The command selection unit 166 may select an operation command associated with the motion that has been selected. For example, suppose that the operation command X2 ("raise the hand 106 as well") and the operation command X3 ("stop") are associated with the motion M1 ("raising the hand 106").
  • The command selection unit 166 of the robot 100P randomly selects one of the operation commands X2 and X3. Suppose that the operation command X2 is selected.
  • The transmitting unit 162 (transmitters 158a to 158h) of the robot 100P transmits the operation command X2 to the robot 100Q (the external robot 100). A sketch of this motion-linked command selection follows below.
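A minimal sketch of this motion-linked command selection, with hypothetical motion and command tables (the command meanings are taken from the example above; the function names are illustrative):

```python
import random

# Hypothetical tables: a motion, and the operation commands associated with it.
MOTIONS = {"M1": "raise the hand 106"}
COMMANDS_FOR_MOTION = {
    "M1": ["X2",   # "raise the hand 106 as well"
           "X3"],  # "stop"
}

def perform_and_command(motion_id, transmit):
    """Execute a motion and transmit one associated operation command."""
    motion = MOTIONS[motion_id]                       # the self-robot's motion
    command = random.choice(COMMANDS_FOR_MOTION[motion_id])
    transmit(command)          # broadcast from the transmitters 158a to 158h
    return motion, command

sent = []
perform_and_command("M1", sent.append)
print(sent)  # e.g. ['X2']
```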
  • The intimacy management unit 220 manages intimacy not only toward users but also toward external robots 100.
  • The intimacy management unit 220 changes the intimacy toward an external robot 100 according to short-distance wireless communication. For example, when the robot ID of an external robot 100 is detected, the intimacy toward that external robot 100 is increased.
  • When an operation command is transmitted from the self-robot 100 to an external robot 100, it is determined whether or not the external robot 100 executes the operation command.
  • When the external robot 100 executes the operation command X, in other words, when the operation command X is accepted, the transmitting unit 162 of the external robot 100 returns an "acceptance signal" to the self-robot 100 that sent the command.
  • When the self-robot 100 receives the acceptance signal, it increases the intimacy toward the external robot 100.
  • The external robot 100 then performs motion selection in accordance with the operation command X.
  • When the external robot 100 rejects the operation command X, it returns a "rejection signal" to the self-robot 100 that sent the command.
  • When the self-robot 100 receives the rejection signal, it decreases the intimacy toward the external robot 100.
  • In this way, intimacy (or liking) toward an external robot 100 that complies with operation commands is enhanced, and intimacy toward an external robot 100 that does not comply is lowered.
  • The self-robot 100 may also reduce intimacy toward an external robot 100 over time. According to such a control method, it is possible to express the biological characteristic that the sense of closeness toward an external robot 100 gradually fades as the relationship becomes distant.
  • Whether the robot 100Q follows the operation command X from the robot 100P is influenced by the intimacy of the robot 100Q toward the robot 100P. Specifically, suppose that the robot 100Q receives the operation command X together with robot information from the robot 100P.
  • The operation control unit 150 of the robot 100Q transmits the robot ID from the first communication unit 142 to the server 200 and inquires about the intimacy of the robot 100Q toward the robot 100P.
  • The motion control unit 150 of the robot 100Q accepts the operation command X with a probability of 90% if the intimacy is 70 or more, and with a probability of 50% if the intimacy is 30 or more and less than 70. When the intimacy is less than 30, the probability of accepting the operation command X is 5% (see the sketch below).
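The probability table above translates directly into code. A minimal sketch, assuming the 90%/50%/5% figures and thresholds given in the example:

```python
import random

def acceptance_probability(intimacy):
    """Acceptance probability for an operation command, per the figures above."""
    if intimacy >= 70:
        return 0.90
    if intimacy >= 30:
        return 0.50
    return 0.05

def respond_to_command(intimacy, rng=random.random):
    accepted = rng() < acceptance_probability(intimacy)
    # Either way a reply is returned, which raises or lowers the sender's
    # intimacy toward this robot as described above.
    return "acceptance signal" if accepted else "rejection signal"

print(respond_to_command(80))  # usually an acceptance signal
print(respond_to_command(10))  # usually a rejection signal
```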
  • The robot 100 may also change its behavior characteristics according to intimacy. When the robot 100 receives the robot ID of an external robot 100, it checks the intimacy toward that external robot 100.
  • The motion control unit 150 of the robot 100 approaches the external robot 100 if the intimacy toward it is 70 or more, and turns to face the external robot 100 when the intimacy is 30 or more and less than 70.
  • When the intimacy is less than 30, the operation control unit 150 may instruct the drive mechanism 120 to move away from the external robot 100. According to such a control method, it is possible to express a behavior characteristic of approaching a friendly robot 100 and keeping away from an unfriendly one.
  • FIG. 9 is a schematic view showing how a plurality of robots 100 form a formation.
  • the first robot 100P transmits an operation command X1 instructing a "following operation".
  • the plurality of transmitters 158 mounted on the robot 100P transmit the operation command X1 in all directions.
  • the robot 100Q receives the operation command X1 from the robot 100P.
  • the robot 100Q identifies, among the plurality of transmitters 158 mounted on the robot 100P, the transmitter 158e at the rear of the robot 100P with the position code (B) included in the robot information.
  • the motion control unit 150 of the robot 100Q moves to the rear of the robot 100P and follows the robot 100P.
  • Specifically, the robot 100Q moves to a position where its front receiver 168a can receive the robot information from the rear transmitter 158e of the robot 100P at a higher reception intensity than any of its other receivers 168. In this manner, the robot 100Q moves to a position where it receives robot information from the transmitter 158e at the rear of the robot 100P and thereafter follows the robot 100P, whereby a following behavior toward the robot 100P can be expressed.
  • Based on the reception intensity of the operation command X1, the operation control unit 150 of the robot 100Q adjusts the moving speed so that the distance to the robot 100P stays within a predetermined range.
  • the robot 100Q maintains a distance such that the reception intensity of the operation command X1 from the rear transmitter 158e (position code: B) among the plurality of transmitters 158 of the robot 100P falls within a predetermined range.
  • the robot 100Q sends back an acceptance signal together with the robot ID to the robot 100P.
  • The robot 100Q further transmits (relays) the operation command X1.
  • The robot 100P that originally transmitted the operation command X1 is set to a "command mode" by its operation control unit 150.
  • A robot 100 set to the command mode does not accept operation commands from other robots 100. Therefore, in FIG. 9, the robot 100P does not respond to the operation command X1 relayed by the robot 100Q.
  • The robot 100R, which has not entered the command mode, receives the operation command X1 from the robot 100Q and follows the robot 100Q. According to such a control method, the transmission of the operation command X1 from the robot 100P can make a plurality of robots 100 move in formation.
  • The operation control unit 150 cancels the command mode after a predetermined time has elapsed. A sketch of this follow-and-relay logic is shown below.
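A minimal sketch of the follow-and-relay behavior, assuming a simplified model in which the command mode both marks the originator and prevents relay loops (the time-based cancellation of the command mode is omitted):

```python
class FormationRobot:
    """Minimal sketch of the follow-and-relay behavior with a command mode."""
    def __init__(self, name):
        self.name = name
        self.command_mode = False   # True while commanding or relaying X1
        self.following = None       # name of the robot this one follows
        self.neighbors = []         # robots within this robot's short range

    def originate_x1(self):
        """Transmit the follow command X1 as the head of the formation."""
        self.command_mode = True    # robots in command mode ignore incoming X1
        for robot in self.neighbors:
            robot.receive_x1(sender=self)

    def receive_x1(self, sender):
        if self.command_mode:
            return                  # e.g. the head ignores X1 relayed back to it
        self.following = sender.name    # move behind the sender and follow it
        self.command_mode = True        # then relay X1 to robots further away
        for robot in self.neighbors:
            robot.receive_x1(sender=self)
        # (in the embodiment, the command mode is canceled after a set time)

p, q, r = FormationRobot("P"), FormationRobot("Q"), FormationRobot("R")
p.neighbors = [q]       # only Q is within P's transmission reach
q.neighbors = [p, r]    # Q can reach both P and R
p.originate_x1()        # P -> Q; Q follows P and relays X1 to R (P ignores it)
print(q.following, r.following)  # P Q
```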
  • the following operation can be performed not only from the rear but also from the side.
  • For example, the robot 100Q and the robot 100R can follow the robot 100P on its left and right. In this case, the robot 100Q may move to a position where it receives the operation command X1 from the transmitter 158g (position code: R) on the right side of the robot 100P, and the robot 100R may move to a position where it receives the operation command X1 from the transmitter 158c (position code: L) on the left side of the robot 100P.
  • When a large number of robots 100 gather in a large event hall such as a gymnasium, these robots 100 can be made to act simultaneously.
  • Various action chains are possible, such as all robots raising the hand 106 at once, or all raising the neck at once.
  • Such action chains can make it fun for owners to bring their robots 100 together and let them interact.
  • FIG. 10 is a schematic view showing an action instruction to the robot 100 by the accessory 140.
  • The accessory 140 (a wristband) includes a plurality of transmitters 158 and a plurality of receivers 168, arranged in the same manner as on the communication device arrangement plane G shown in FIG. 8.
  • a set of transmitters 158 corresponds to the transmitting unit 144, and a set of receivers 168 corresponds to the receiving unit 170.
  • The transmitters 158 and receivers 168 of the accessory 140 are also identified by position codes.
  • the transmission unit 144 of the accessory 140 periodically transmits the operation command X.
  • The transmission reach area 190 indicates the range that the operation command X of the accessory 140 can reach.
  • The user wearing the accessory 140 can switch the operation command to be transmitted via the command selection unit 172. Suppose that the accessory 140 transmits the operation command X5, "do not enter the entry prohibited area 180".
  • The entry prohibited area 180 may be set as the range in which the robot 100 can visually recognize the accessory 140, or as a closer range around the accessory 140 within which the reception intensity of the operation command X5 is at or above a predetermined threshold.
  • Since the robot 100S receives the operation command X5, even while approaching the accessory 140 it keeps its distance and does not enter the entry prohibited area 180. The operation command X5 thus serves as an operation instruction that keeps the robot 100 away from the accessory 140.
  • The entry prohibited area 180 need not be circular; it may have another shape such as an ellipse, or its size and shape may change periodically.
  • When the robot 100 receives an operation command, it probabilistically determines acceptance or rejection and sends back an acceptance signal or a rejection signal indicating the result to the accessory 140.
  • Some commands, such as the operation command X5, may be defined as "strong operation commands" that cannot be rejected.
  • Suppose the accessory 140 transmits the operation command X4, meaning "sit down".
  • The robot 100T, located in the transmission reach area 190, retracts its front wheels 102 and sits down.
  • The command selection unit 166 of the robot 100T then selects the operation command X4 and transmits (relays) it.
  • When the robot 100U receives the operation command X4 from the robot 100T, the robot 100U also sits down.
  • Since the robot 100U is outside the transmission reach area 190, it does not receive the operation command X4 from the accessory 140 directly, but sits down through the relay of the operation command X4 by the robot 100T. According to such a control method, the operation command X of the accessory 140 can be relayed to many robots 100 without being restricted by the transmission reach area 190 of the accessory 140. For example, when a large number of robots 100 are present, transmitting the operation command X4 from the accessory 140 makes the robots 100 near the accessory 140 sit down one after another in a chain.
  • The transmission unit 144 of the accessory 140 may transmit the operation command X only on condition that robot information has been received from a robot 100. According to such a control method, the operation command X is not transmitted when no robot 100 is nearby, so the power consumption of the transmission unit 144 can be suppressed. Further, like the robot 100, the transmission unit 144 may transmit not only the operation command X but also an accessory ID identifying the accessory 140 and a transmitter ID.
  • The accessory 140 can transmit the operation command X simultaneously to the plurality of robots 100 present in the transmission reach area 190. The user of the accessory 140 can therefore enjoy the feeling of giving instructions to a plurality of robots 100 at once. For example, by transmitting an operation command X6 meaning "gather around", the user can gather the plurality of robots 100 around himself or herself, and by next transmitting the operation command X4 meaning "sit down", make the gathered robots 100 sit at the same time. Beyond these, various operation commands can be defined, such as waving the hand 106. Since the eyes 110 of the robot 100 are implemented as monitors, the expression of the eyes 110 may also be controlled by an operation command; for example, an operation command that makes the eyes 110 of the robots 100 light up simultaneously is conceivable.
  • FIG. 11 is a schematic view for explaining an authentication process of the guest robot 100W by the host robot 100V.
  • the robot system 300 including the server 200 and the host robot 100V can accept a new guest robot 100W.
  • the host robot 100V is a robot 100 that receives action support from the server 200.
  • Here, "accepting" means that the guest robot 100W, an outsider, connects to the server 200, accesses the resources (hardware, software, and data) managed by the server 200, and receives action support from the server 200.
  • After acceptance, the server 200 supports the actions of both robots 100: the host robot 100V and the guest robot 100W.
  • The following description is based on the premise that the guest robot 100W is brought to a house serving as the base of the robot system 300 (the server 200 and the host robot 100V).
  • the guest robot 100W can connect to the server 200 by receiving access information from the host robot 100V.
  • the "access information” mentioned here is an access key of a wireless local area network (LAN) to which the server 200 is connected, an IP address of the server 200, a port number, a password, and the like.
  • The first communication unit 142 of the host robot 100V connects to the server 200 by Wi-Fi (registered trademark). More specifically, the communication connection unit 138 of the host robot 100V connects to the wireless LAN (wireless environment) to which the server 200 belongs using the Wi-Fi (registered trademark) access information, and communicates with the server 200 wirelessly. To use the resources of the server 200, the host robot 100V is authenticated by the server 200. The guest robot 100W, on the other hand, does not know the access information of the server 200. When the host robot 100V receives robot information from the guest robot 100W, it transmits the access information to the guest robot 100W using IrDA (registered trademark) (S1). At this time, the communication connection unit 138 of the host robot 100V generates a guest password, and the transmission unit 162 of the host robot 100V also notifies the guest robot 100W of the guest password.
  • the host robot 100V transmits the robot ID and the guest password of the guest robot 100W to the server 200 (S2).
  • the server 200 registers the robot ID and guest password of the guest robot 100W.
  • the communication connection unit 138 of the guest robot 100W connects to the wireless LAN based on the access information and the guest password received from the host robot 100V, and connects to the server 200 wirelessly (S3).
  • The server 200 checks the robot ID and guest password notified by the host robot 100V against those presented by the guest robot 100W, and then permits the connection with the guest robot 100W.
  • In summary, when the host robot 100V detects an unregistered robot 100, it transfers the access information and the like to that unregistered robot 100 (the guest robot 100W), thereby supporting the guest robot 100W's connection to the server 200.
  • The access information may also be transmitted whenever an external robot 100 is detected, not only when an unregistered robot 100 is detected. By providing the access information in this manner, the guest robot 100W can access the network without human intervention. A sketch of this admission flow follows below.
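A minimal sketch of the S1-S3 admission flow, with hypothetical Server and HostRobot classes; the password handling is illustrative, not a security design from the embodiment:

```python
import secrets

class Server:
    """Sketch of server-side guest registration (S2) and connection check (S3)."""
    def __init__(self):
        self.guests = {}  # robot ID -> guest password registered by the host

    def register_guest(self, robot_id, guest_password):   # S2
        self.guests[robot_id] = guest_password

    def connect(self, robot_id, guest_password):          # S3
        return self.guests.get(robot_id) == guest_password

class HostRobot:
    def __init__(self, server, access_info):
        self.server = server
        self.access_info = access_info  # wireless LAN key, server IP, port, ...

    def on_robot_info_received(self, guest_robot_id):
        # S1: hand the access information and a generated guest password to the
        # guest over short-range IrDA, so only a physically present robot gets it.
        guest_password = secrets.token_hex(8)
        self.server.register_guest(guest_robot_id, guest_password)  # S2
        return self.access_info, guest_password

server = Server()
host = HostRobot(server, access_info={"ssid": "home-lan", "ip": "10.0.0.2"})
access, password = host.on_robot_info_received("guest-100W")
print(server.connect("guest-100W", password))  # True: the guest is admitted
print(server.connect("guest-100W", "wrong"))   # False: verification fails
```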
  • FIG. 12 is an external view of the charging station 250.
  • The charging station 250 (charging stations 250a and 250b) is a charger for the robot 100 and has an internal space for accommodating the robot 100.
  • the charging station 250 and the robot 100 are associated on a one-to-one basis.
  • the charging station 250 incorporates the communication device 252.
  • the communication device 252 periodically transmits the robot ID of the corresponding robot 100 using IrDA (registered trademark).
  • the charging station 250 includes a table 260, a slope 262 that smoothly bridges the top surface of the table 260 and the floor F, and a frame 254 provided around the table 260.
  • On the table 260, a marker M is attached as a guide mark for when the robot 100 enters the charging station 250.
  • the marker M is a circular area colored in a color different from that of the table 260.
  • the frame 254 includes a decorative member 256 surrounding the periphery of the table 260.
  • The decorative member 256 is formed by overlapping a large number of leaf-motif decorative pieces, giving the impression of a fence.
  • A connection terminal 258 for power feeding is provided on the table 260 at a position slightly offset from the central marker M.
  • the charge monitoring unit 154 of the robot 100 monitors the remaining battery level of the battery 118.
  • When the charge rate falls to 30% or less, the robot 100 heads for the charging station 250.
  • the robot 100 receives a robot ID from the communication device 252 incorporated in the charging station 250.
  • The robot 100 sets the charging station 250 that transmits its own robot ID as the movement target point. In FIG. 12, the robot 100 is accommodated in the charging station 250a.
  • The robot 100 captures an image of the marker M when entering the charging station 250a and controls its traveling direction using the marker M as a guide. After entry into the charging station 250a, the connection terminal 258 connects to a connection terminal provided on the bottom of the robot 100. As a result, the charging circuits of the robot 100 and the charging station 250 become conductive.
  • When entering the charging station 250a, the operation control unit 150 adjusts the orientation of the robot 100 so that the receiver 168a (front) receives the robot ID at a higher reception intensity than the other receivers 168.
  • In this manner, the robot 100 can identify "its own charging station 250" and charge there; a minimal sketch of this selection logic follows below. This allows the robot 100 to behaviorally express a biological characteristic similar to a homing instinct.
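A minimal sketch of the charger selection, assuming the 30% threshold from the example and a list of (station, robot ID) beacons received over IrDA:

```python
BATTERY_THRESHOLD = 30  # percent, per the example above

def choose_station(my_robot_id, charge_rate, heard_beacons):
    """Pick the charger broadcasting this robot's own ID once the battery is low.

    heard_beacons: (station, robot_id) pairs received over short-range IrDA.
    Returns the station to set as the movement target point, or None.
    """
    if charge_rate > BATTERY_THRESHOLD:
        return None                      # no need to charge yet
    for station, robot_id in heard_beacons:
        if robot_id == my_robot_id:      # robots and chargers pair one-to-one
            return station
    return None

beacons = [("station-250a", "robot-1"), ("station-250b", "robot-2")]
print(choose_station("robot-1", 25, beacons))  # station-250a
print(choose_station("robot-1", 80, beacons))  # None: battery is still fine
```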
  • The robot 100 and the robot system 300 including the robot 100 have been described above based on the embodiment. Individual robots 100 differ little in appearance, so it is difficult to identify an external robot 100 by image recognition. In the present embodiment, since an external robot 100 is recognized by the robot ID it transmits via IrDA (registered trademark), it can be identified easily.
  • With image recognition, the self-robot 100 reflected in a mirror might be mistaken for an external robot 100, but such misrecognition does not occur with the identification method based on the robot ID (hereinafter referred to as the "ID identification method").
  • the ID identification method also has the advantage that the processing load is lighter than image recognition.
  • The robot detection unit 152 can easily specify the direction in which an external robot 100 is present based on which of the plurality of receivers 168 receives the signal from the external robot 100 at the maximum reception intensity.
  • the robot detection unit 152 can also specify the orientation of the external robot 100 relative to the robot 100 based on the “position code of the transmitter 158” transmitted from the external robot 100.
  • The distance at which the robot 100 can receive the robot ID of an external robot 100 is the distance at which it can recognize that external robot 100. If the reception capability of the robot 100 is low, only external robots 100 located nearby can be recognized. In other words, the reception capability of the robot 100 expresses its "eyesight" (recognizable range).
  • When an external robot 100 is detected, the self-robot 100 changes its behavior characteristics. According to such a control method, it is possible to express behavior as if the self-robot 100 were changing its actions out of awareness of the presence of the external robot 100 (the "other").
  • A motion approaching the external robot 100 may be selected to express "curiosity", or "wariness" may be expressed by turning the head away or moving away from the external robot 100.
  • By managing each robot 100's intimacy toward other robots 100, likes and dislikes between robots 100 can be expressed. For example, a relationship in which the robot 100P likes the robot 100Q but the robot 100Q does not care much for the robot 100P, resembling a "human relationship", can be expressed between robots 100. In other words, the "sociality" of the robots 100 can be expressed.
  • The robot 100P transmits the operation command X to the robot 100Q, and the robot 100Q transmits the operation command X onward to another robot 100R, whereby an action chain among many robots can be realized.
  • The accessory 140 can also transmit the operation command X. In this way, a variety of expressions become possible, not only in the behavior characteristics of a single robot 100 but also in the behavior characteristics of a group.
  • The present invention is not limited to the above-described embodiment and modifications, and its components can be modified and embodied without departing from the scope of the invention.
  • Various inventions may be formed by appropriately combining a plurality of the components disclosed in the above-described embodiment and modifications. Some components may also be deleted from the full set of components shown in the embodiment and modifications.
  • Although the robot system 300 has been described as being configured of one robot 100, one server 200, and a plurality of external sensors 114, part of the functions of the robot 100 may be realized by the server 200, and part or all of the functions of the server 200 may be assigned to the robot 100.
  • One server 200 may control a plurality of robots 100, or a plurality of servers 200 may cooperate to control one or more robots 100.
  • a third device other than the robot 100 or the server 200 may have a part of the function.
  • The aggregate of the functions of the robot 100 and the functions of the server 200 described with reference to FIG. 6 can also be understood collectively as one "robot". How the plurality of functions necessary to realize the present invention are allocated to one or more pieces of hardware should be decided in view of the processing capability of each piece of hardware, the specifications required of the robot system 300, and so on.
  • In this sense, the "robot in the narrow sense" refers to the robot 100 not including the server 200, while the "robot in the broad sense" refers to the robot system 300.
  • Many of the functions of the server 200 may be integrated into the robot 100 in the future.
  • The behavior control program of the robot 100 may be provided from a predetermined server via the Internet, or by a fixed recording medium such as a CD-ROM. In either case, the behavior control program of the robot 100 may be installed on the robot 100 from a recording medium (a server, a CD-ROM, or the like) separate from the robot 100.
  • The operation command X may also be a command for executing a game played by a plurality of robots 100.
  • For example, the robot 100P transmits an operation command X7 meaning "let's play tag" to gather companions.
  • A robot 100 that accepts the operation command X7 participates in the tag game.
  • Suppose three robots, 100P to 100R, participate in the tag game.
  • The state management unit 244 of the server 200 registers the robot 100P as the "oni" (the one who is "it").
  • The communication unit 204 of the server 200 notifies the robots 100P to 100R that the robot 100P is the "oni".
  • The oni robot 100P chases the robot 100Q and the robot 100R, and the robot 100Q and the robot 100R flee from the robot 100P (the oni).
  • When touched by the robot 100P, the robot 100Q recognizes from the received robot information and the detection signal of its touch sensor that it has been touched by the "oni", and notifies the server 200 of the touch.
  • The server 200 then sets the robot 100Q as the new "oni". A sketch of this server-side state management follows below.
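A minimal sketch of the server-side "oni" state management, with a hypothetical TagGame class standing in for the state management unit 244 and the communication unit 204:

```python
class TagGame:
    """Sketch of server-side state management for the tag game."""
    def __init__(self, players):
        self.players = set(players)
        self.oni = None

    def start(self, first_oni):
        self.oni = first_oni
        self.notify_all()                     # tell every player who is "it"

    def on_touch_reported(self, toucher, touched):
        # Only a touch by the current oni transfers the role.
        if toucher == self.oni and touched in self.players:
            self.oni = touched
            self.notify_all()

    def notify_all(self):
        print(f"server: {self.oni} is now the oni")

game = TagGame({"100P", "100Q", "100R"})
game.start("100P")                      # 100P chases the others
game.on_touch_reported("100P", "100Q")  # the touched robot 100Q becomes the oni
```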
  • The operation command X need not designate a motion directly; it may instead be a start command for any of various behavior control programs (application programs) installed on the external robot 100.
  • For example, a behavior control program implementing tag (hereinafter referred to as the "oni application") may be started by an operation command, whereby a plurality of robots 100 are instructed to play the tag game.
  • Various behavior control programs are conceivable, such as the "following operation" described with reference to FIG. 9 and "hand play", in which two robots 100 touch hands 106.
  • These behavior control programs are identified by application ID.
  • The first robot 100 may designate the behavior control program to be activated by the second robot 100 by transmitting to it an operation command X specifying the application ID.
  • At this time, the first robot 100 may also activate the same behavior control program itself.
  • In this way, the robot 100 carries not only basic motions but also behavior selection programs (application programs) that define how actions are selected under specific rules, and one robot may transmit a start command to another as an operation command serving as an invitation to play.
  • The accessory 140 and a behavior selection program may be sold as a set.
  • An application ID may be associated with the accessory 140; when the purchaser of the accessory 140 accesses a predetermined server and inputs the application ID, the behavior selection program corresponding to the application ID is downloaded from the server to the robot 100. According to such an aspect, the behavior patterns of the robot 100 can be enriched by purchasing accessories 140, which gives users a reason to collect accessories 140 and provide the robot 100 with a variety of behaviors.
  • the robot 100 may execute near field communication with the external robot 100 only when the external robot 100 can be visually recognized.
  • The robot detection unit 152 of the robot 100 captures an image of the external robot 100 with its camera, and may identify the position and orientation of the external robot 100 from received robot information only when the external robot 100 has been recognized in the image.
  • Similarly, the operation control unit 150 may accept the operation command X from an external robot 100 only when image recognition of that external robot 100 has succeeded.
  • The motion control unit 150 ignores the operation command X from an external robot 100 if the direction in which the camera recognizes the external robot 100 does not coincide with the direction in which the second communication unit 134 detects it.
  • According to such a control method, the self-robot 100 performs short-distance wireless communication only with an external robot 100 it has visually confirmed, which suppresses unnecessary and unnatural reactions to short-distance wireless communication from sources other than a visible external robot 100. A sketch of this cross-check follows below.
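A minimal sketch of the direction cross-check, assuming bearings in degrees and an illustrative 30-degree tolerance (the embodiment does not specify a tolerance):

```python
def should_accept_command(camera_bearing, ir_bearing, tolerance_deg=30):
    """Accept an operation command only when the visually recognized direction
    matches the direction inferred from short-range wireless reception.

    Bearings are degrees relative to the robot's heading; None means
    "not detected". The tolerance value is illustrative.
    """
    if camera_bearing is None or ir_bearing is None:
        return False                     # sender not seen or not heard: ignore
    diff = abs(camera_bearing - ir_bearing) % 360
    diff = min(diff, 360 - diff)         # shortest angular difference
    return diff <= tolerance_deg

print(should_accept_command(270, 265))   # True: seen and heard on the left
print(should_accept_command(270, 90))    # False: directions disagree
print(should_accept_command(None, 270))  # False: the sender is not visible
```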
  • Likewise, the robot 100 may accept the operation command X from the accessory 140 only on condition that the accessory 140 is visible.
  • The operation command thus functions like "telepathy" between robots 100. According to the above control method, it is possible to express the more consistent behavior characteristic that only operation commands ("telepathy") from an external robot 100 in front of the self-robot are accepted.
  • When an external robot 100 is visually recognized, the operation control unit 150 may approach it and confirm its robot ID by near field communication.
  • The robot 100 can usually see an external robot 100 from beyond the communication range of IrDA (registered trademark). According to such a control method, it is possible to express a behavior characteristic of spotting an external robot 100 from a distance and then approaching it to confirm its identity.
  • the second communication unit 134 may communicate with the external robot 100 or the accessory 140 by another short distance wireless communication method such as ultrasonic communication, near field communication (NFC), or Bluetooth (registered trademark).
  • The robot 100 may also control home appliances in the room by IrDA signals or the like. For example, when the robot 100 receives a voice instruction of "air conditioner on" from the user, the robot 100 may start the air conditioner on the user's behalf by transmitting a power-on signal to it.
  • the accessory 140 may be any item (portable item) that can be carried by the user.
  • The accessory 140 may be a pierced earring, a mobile phone, a smartphone, a strap, a key ring, a ring, a charm, or the like.
  • The accessory 140 may transmit a plurality of operation commands at the same time; for example, the entry prohibition command and the sit-down command may be transmitted simultaneously.
  • the accessory 140 may transmit the operation command simultaneously in a plurality of directions, or may transmit the operation command using a nondirectional radio wave.
  • In the embodiment, the transmitters 158 and receivers 168 have been described as being installed in the horn 112, but they may instead be located on the head of the robot 100, or on its chest or abdomen.
  • It is desirable that the communication device arrangement plane G be substantially horizontal to the floor surface F (with an inclination angle to F within 30 degrees).
  • The accessory 140 may also be formed as a guidance device having a communication function with the robot 100, for example a magic wand or a rod imitating a baton.
  • Such an accessory 140 includes a grip portion, which the user grips and operates, and a stick (an elongated member) connected to the grip portion.
  • A light emitting unit is attached to the tip of the stick, and the grip portion is provided with a switch and a built-in battery.
  • The user operates the switch to select one of a plurality of modes. The modes include, for example, a mode for calling a robot at a distant place (hereinafter, the "calling mode"), a mode for playing with the robot by guiding it with the stick (hereinafter, the "guidance mode"), and a mode for taking the robot for a walk as if connected by a lead (string) (hereinafter, the "walk mode").
  • The command selection unit 172 selects an operation command according to the mode set by the switch and the way the stick is moved at that time.
  • The light emitting unit is formed so as to be able to emit light from an infrared LED used for communication.
  • the light emitting unit incorporates a transmitting unit 144 and a receiving unit 170 that use infrared light for communication.
  • The light emitting unit (tip portion) is also provided with an acceleration sensor for measuring the movement of the tip, and a visible-light LED.
  • a unique identification number is assigned to the accessory 140, and the identification number is registered in the built-in memory of the accessory 140.
  • The accessory 140 includes a link forming unit, which notifies the robot 100 of the identification number of the accessory 140 and establishes a connection with the robot 100.
  • the state in which the connection with the accessory 140 is established is referred to as a “link state”.
  • When the robot 100 is in the link state, it can receive operation commands from the linked accessory 140.
  • The accessory 140 transmits its identification number along with each of the various operation commands.
  • The robot 100 follows an operation command when the identification number received with it matches the identification number of the linked accessory 140 (the identification number registered as the link destination). When the identification number received with an operation command differs from that of the linked accessory 140, the operation command is ignored. Being linked thus means that the robot 100 follows instructions from that accessory 140; a sketch of this filtering follows below.
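A minimal sketch of link establishment and command filtering by identification number, with a hypothetical LinkedRobot class:

```python
class LinkedRobot:
    """Sketch of link establishment and command filtering by accessory ID."""
    def __init__(self):
        self.linked_accessory_id = None  # identification number of link target

    def establish_link(self, accessory_id):
        # e.g. via an NFC touch; overwriting cancels any existing link
        self.linked_accessory_id = accessory_id

    def cancel_link(self):
        self.linked_accessory_id = None  # e.g. after a timeout

    def on_command(self, accessory_id, command):
        # Follow only commands whose ID matches the registered link destination.
        if accessory_id != self.linked_accessory_id:
            return None                  # commands from other accessories: ignored
        return f"executing {command}"

robot = LinkedRobot()
robot.establish_link("wand-7")
print(robot.on_command("wand-7", "follow"))  # executing follow
print(robot.on_command("wand-9", "sit"))     # None: not the linked accessory
```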
  • the link formation unit communicates with the robot 100 using near field communication such as NFC.
  • The link may be established by bringing the accessory 140 into contact with, or very close to, the robot 100 using this near-field wireless communication means.
  • The robot 100 may cancel the link state by deleting the registered identification number of the linked accessory 140.
  • When another accessory 140 is brought into contact with the robot 100, the identification number of the new accessory 140 is registered by overwriting: the existing link state is canceled, and a link with the newly touched accessory 140 is established.
  • The robot 100 may automatically cancel the link state after a predetermined period has elapsed, or after a predetermined period has elapsed since the last reception of an operation command from the accessory 140.
  • An acceleration sensor incorporated in the light emitting unit (tip portion) detects movement (speed, acceleration, and moving direction) of the tip portion.
  • The user moves the stick, and an operation command is selected according to the trajectory of the tip. For example, if the stick is moved in a circle, an operation command instructing the robot to follow that circular movement is selected, and if the stick is swung in one direction, an operation command to move in the swung direction is selected.
  • In the calling mode, the accessory 140 instructs the robot 100 with which the link is established to "search", using a communication means capable of wireless communication over a range far beyond that of the transmission unit 144.
  • Specifically, the accessory 140 transmits a search instruction by broadcasting over Bluetooth (registered trademark), and then transmits an operation command instructing the robot 100 to approach the accessory 140 using infrared communication from the light emitting unit.
  • Upon arriving near the accessory 140, the robot 100 transmits a signal meaning arrival (referred to as an "arrival notification") to the receiving unit 170 of the accessory 140.
  • When the accessory 140 receives the arrival notification, it stops transmitting the search instruction, as the call is complete.
  • The accessory 140 thus includes a very-short-range wireless communication means such as NFC (hereinafter, the "first wireless communication means"), a short-range wireless communication means with a wider communication range than the first (hereinafter, the "second wireless communication means"), and a wireless communication means with a still wider communication range than the second (hereinafter, the "third wireless communication means").
  • The accessory 140 switches to the appropriate communication means according to the mode when communicating with the robot 100.
  • The accessory 140 transmits, from the light emitting unit, an operation command instructing the robot 100 to "follow". Since the distance between the accessory 140 and the robot 100 is relatively short in this case, the second wireless communication means is used. When it receives an operation command instructing following, the robot 100 moves after the user (the accessory 140) while keeping its distance to the accessory 140 substantially constant.
  • The grip portion of the accessory 140 includes a tactile realization unit (a vibrator or the like) that creates the sensation of being connected to the robot 100 by an invisible lead.
  • The tactile realization unit stimulates the sense of touch of the user's hand using techniques known as haptic technology, or haptics.
  • The tactile realization unit produces the sensation of being pulled by the robot 100, or of the robot 100 moving to the left or right, in conjunction with the following state of the robot 100. For example, when the robot 100 moves to the right, the tactile realization unit produces, in conjunction with that movement, the sensation of being pulled to the right.
  • Whenever its following state changes, the communication unit of the robot 100 transmits information specifying the content of the change (hereinafter, "motion information") to the accessory 140.
  • The tactile realization unit produces haptic feedback according to the motion information. If the motion information means "move to the right", the tactile realization unit stimulates the user's hand through the grip portion so that the user feels as if pulled to the right.
  • the robot 100 can operate autonomously even in the link state.
  • The tactile realization unit and the visible-light LED may notify the user of the establishment of a link. For example, the tactile realization unit may vibrate the grip portion in a specific vibration pattern, and the visible-light LED may light up or blink in a specific color.
  • The tactile realization unit and the visible-light LED may also notify the user of various other information, such as the power-on/off state of the accessory 140, the transmission of an operation command, and the type of the operation command transmitted.
  • the transmission unit 144 incorporated in the light emitting unit (tip end) may transmit a directional link signal.
  • When the robot 100 receives the link signal, it enters the "link state".
  • The robot 100 in the link state follows operation commands subsequently transmitted from the accessory 140. After linking the robot 100, the user can control the robot 100, as if hypnotizing it, by moving the light emitting unit (tip) of the accessory 140.
  • When the user swings the accessory 140 in a circle, the command selection unit 172 detects the circular motion and its speed with the acceleration sensor built into the light emitting unit (tip), and an operation command instructing circular motion is sent to the robot 100.
  • When the robot 100 in the link state receives this operation command, it travels in a circle on the floor.
  • the orbiting radius and the moving speed of the robot 100 are linked to the orbiting radius and the orbiting speed of the accessory 140.
  • the motion command includes information specifying the orbiting radius and the orbiting speed.
  • In the walk mode, the robot 100 in the link state follows the accessory 140 (the user).
  • The transmission unit 144 periodically transmits an operation signal instructing "following" to the linked robot 100.
  • The light emitting unit may incorporate one transmission unit 144 that transmits the link signal in the axial direction of the stick, and a plurality of transmission units 144 that transmit the operation signal radially.
  • the robot 100 chases the user (accessory 140) while keeping a constant distance.
  • The robot 100 may express that it is in the state of being walked on a lead, for example by emitting light from its built-in infrared LED.
  • the light emitting unit may emit light when linked with the robot or when the switch is pressed.
  • the user may control one robot 100 by the transmission unit 144 or may control a plurality of robots 100 collectively.
  • the user may set the plurality of robots 100 in the link state by sequentially establishing links with the plurality of robots 100 to be operated.
  • the user may instruct transmission of an operation command to a large number of unspecified robots 100 collectively.

Abstract

A robot 100P selects and executes its own motions. A robot 100Q receives a robot ID from the robot 100P and identifies the robot 100P, the transmission source, by that robot ID. The robot 100Q also receives an operation command from the robot 100P and executes a motion associated with the operation command. As a result, the behavior characteristics of the robot 100Q are influenced by the operation commands of the robot 100P. The robot 100Q transmits the operation command onward to a separate robot 100R. A communication device that transmits and receives the robot ID and the like is built into the horn 112 of each robot 100.

Description

[Title of the invention determined by ISA based on Rule 37.2] Autonomous behavior-type robot, accessory, and robot control program
The present invention relates to a robot that autonomously selects an action according to an internal state or an external environment.
Humans keep pets in search of healing. On the other hand, many people give up on keeping pets for various reasons: not enough time to care for a pet, a living environment that does not allow pets, allergies, or the pain of bereavement. If there were a robot that could play the role of a pet, it might give people who cannot keep pets the kind of healing that a pet provides (see Patent Document 1).
JP 2000-323219 A
One of the pleasures of keeping pets is watching them interact with each other. Likewise for robots, a mechanism that encourages interaction between robots should deepen owners' attachment to them. A prerequisite for interaction between robots is the ability of a robot to identify other robots.
The present invention was completed based on the above recognition, and its main object is to provide a technology that allows a robot to recognize another robot easily and reliably.
The autonomous behavior-type robot according to one aspect of the present invention includes an operation control unit for selecting a motion of the robot, a drive mechanism for executing the motion selected by the operation control unit, a receiver for receiving, from another robot, the ID of that robot transmitted in accordance with a predetermined short-distance wireless communication method, and a recognition unit for identifying the other robot by the received ID.
The autonomous behavior-type robot according to another aspect of the present invention includes an operation control unit for selecting a motion, a drive mechanism for executing the motion selected by the operation control unit, and a transmitter for transmitting an ID identifying the robot itself in accordance with a predetermined short-distance wireless communication method.
The autonomous behavior-type robot according to another aspect of the present invention includes an operation control unit for selecting a motion of the robot, a drive mechanism for executing the motion selected by the operation control unit, a receiver for receiving an ID transmitted from a charger in accordance with a predetermined short-distance wireless communication method, and a charge monitoring unit that monitors the remaining battery level of a secondary battery. When the remaining battery level falls to or below a predetermined threshold, the operation control unit selects, as the movement target point, the charger that transmits the ID associated with the robot itself from among the plurality of chargers.
The autonomous behavior-type robot according to another aspect of the present invention includes a communication connection unit that connects to a server by a first wireless communication method based on access information for the server, an operation control unit for determining a motion of the robot, a drive mechanism for executing the motion selected by the operation control unit, a receiver for receiving the ID of another robot from that robot by a second wireless communication method having a shorter communication distance than the first wireless communication method, and a transmitter for transmitting the access information to the other robot when the ID is received.
The autonomous behavior-type robot according to another aspect of the present invention includes a transmitter that transmits an ID identifying the robot itself in accordance with a predetermined short-distance wireless communication method. A plurality of transmitters are arranged in a ring in a projection formed on the head or the top of the head of the robot.
The autonomous behavior-type robot according to another aspect of the present invention includes an operation control unit that selects a motion of the robot, a drive mechanism that executes the motion selected by the operation control unit, and a receiver that receives the ID of another autonomous behavior-type robot from that robot. The operation control unit instructs the drive mechanism to move to a position where it can receive a position code transmitted from a transmitter installed at a predetermined position on the other autonomous behavior-type robot.
The accessory according to one aspect of the present invention includes a transmitter that transmits an operation command to an autonomous behavior-type robot in accordance with a predetermined short-distance wireless communication method.
According to the present invention, a robot can recognize another robot easily.
FIG. 1(a) is a front external view of a robot. FIG. 1(b) is a side external view of the robot.
FIG. 2 is a cross-sectional view schematically showing the structure of the robot.
FIG. 3 is a configuration diagram of a robot system.
FIG. 4 is a conceptual diagram of an emotion map.
FIG. 5 is a hardware configuration diagram of the robot.
FIG. 6 is a functional block diagram of the robot system.
FIG. 7 is an enlarged external view of the horn.
FIG. 8 is a top view of the communication device arrangement plane.
FIG. 9 is a schematic diagram showing how a plurality of robots form a formation.
FIG. 10 is a schematic diagram showing action instructions given to a robot by an accessory.
FIG. 11 is a schematic diagram for explaining the authentication process of a guest robot by a host robot.
FIG. 12 is an external view of a charging station.
FIG. 1(a) is a front external view of the robot 100. FIG. 1(b) is a side external view of the robot 100.
The robot 100 in the present embodiment is an autonomous behavior-type robot that determines its actions and gestures based on the external environment and its internal state. The external environment is recognized by various sensors such as a camera and a thermosensor. The internal state is quantified as various parameters expressing the emotions of the robot 100. These are described later.
In principle, the robot 100 takes the interior of its owner's home as its action range. Hereinafter, a human involved with the robot 100 is called a "user", and a user who is a member of the household to which the robot 100 belongs is called an "owner".
The body 104 of the robot 100 has an overall rounded shape and includes an outer skin formed of a soft, elastic material such as urethane, rubber, resin, or fiber. The robot 100 may be dressed in clothes. By making the body 104 round, soft, and pleasant to the touch, the robot 100 gives the user a sense of security and a pleasant tactile sensation.
The robot 100 has a total weight of 15 kilograms or less, preferably 10 kilograms or less, and more preferably 5 kilograms or less. By 13 months of age, the majority of babies start to walk on their own. The average weight of a 13-month-old baby is a little over 9 kilograms for boys and a little under 9 kilograms for girls. Therefore, if the total weight of the robot 100 is 10 kilograms or less, the user can hold the robot 100 with roughly the same effort as holding a baby that cannot yet walk. The average weight of a baby less than 2 months old is under 5 kilograms for both boys and girls. Therefore, if the total weight of the robot 100 is 5 kilograms or less, the user can hold the robot 100 with the same effort as holding an infant.
Attributes such as moderate weight, roundness, softness, and pleasant touch make the robot 100 easy, and even inviting, for the user to hold. For the same reason, it is desirable that the height of the robot 100 be 1.2 meters or less, preferably 0.7 meters or less. For the robot 100 in the present embodiment, being able to be held is an important design concept.
The robot 100 includes three wheels for three-wheeled travel: as illustrated, a pair of front wheels 102 (a left wheel 102a and a right wheel 102b) and one rear wheel 103. The front wheels 102 are driving wheels, and the rear wheel 103 is a driven wheel. The front wheels 102 have no steering mechanism, but their rotational speed and direction can be controlled individually. The rear wheel 103 is a so-called omni wheel and rotates freely to allow the robot 100 to move forward, backward, left, and right. By making the rotational speed of the right wheel 102b greater than that of the left wheel 102a, the robot 100 can turn left or rotate counterclockwise. By making the rotational speed of the left wheel 102a greater than that of the right wheel 102b, the robot 100 can turn right or rotate clockwise.
The front wheels 102 and the rear wheel 103 can be fully retracted into the body 104 by a drive mechanism (a rotation mechanism and a link mechanism). Even while traveling, most of each wheel is hidden by the body 104; when the wheels are fully retracted into the body 104, the robot 100 becomes unable to move. That is, the body 104 descends with the retracting operation of the wheels and sits on the floor surface F. In this seated state, the flat seating surface 108 (grounding bottom surface) formed on the bottom of the body 104 abuts the floor surface F.
The robot 100 has two hands 106. The hands 106 have no function of gripping objects, but can perform simple actions such as raising, waving, and vibrating. The two hands 106 can also be controlled individually.
The eyes 110 can display images using liquid crystal or organic EL elements. The robot 100 carries various sensors, such as a microphone array capable of identifying the direction of a sound source and an ultrasonic sensor. It also has a built-in speaker and can emit simple sounds.
A horn 112 is attached to the head of the robot 100. Since the robot 100 is lightweight, as described above, the user can lift the robot 100 by grasping the horn 112. An omnidirectional camera is attached to the horn 112, capable of imaging the entire region above the robot 100 at once. The horn 112 also incorporates transmitters and receivers, the details of which are described later with reference to FIGS. 7 and 8.
FIG. 2 is a cross-sectional view schematically showing the structure of the robot 100.
As shown in FIG. 2, the body 104 of the robot 100 includes a base frame 308, a main body frame 310, a pair of resin wheel covers 312, and an outer skin 314. The base frame 308 is made of metal, constitutes the axial core of the body 104, and supports the internal mechanisms. The base frame 308 is formed by connecting an upper plate 332 and a lower plate 334 vertically with a plurality of side plates 336. Sufficient spacing is provided between the side plates 336 to allow ventilation. A battery 118, a control circuit 342, and various actuators are housed inside the base frame 308.
The main body frame 310 is made of a resin material and includes a head frame 316 and a torso frame 318. The head frame 316 has a hollow hemispherical shape and forms the head skeleton of the robot 100. The torso frame 318 has a stepped cylindrical shape and forms the torso skeleton of the robot 100. The torso frame 318 is fixed integrally to the base frame 308. The head frame 316 is assembled to the upper end of the torso frame 318 so as to be displaceable relative to it.
The head frame 316 is provided with three axes, a yaw axis 320, a pitch axis 322, and a roll axis 324, and with an actuator 326 for rotationally driving each axis. The actuator 326 includes a plurality of servomotors for driving each axis individually. The yaw axis 320 is driven for head-shaking motion, the pitch axis 322 for nodding motion, and the roll axis 324 for head-tilting motion.
A plate 325 supporting the yaw axis 320 is fixed to the top of the head frame 316. A plurality of vent holes 327 is formed in the plate 325 to ensure ventilation between top and bottom.
A metal base plate 328 is provided to support the head frame 316 and its internal mechanisms from below. The base plate 328 is connected to the plate 325 via a crosslink mechanism 329 (a pantograph mechanism), and to the upper plate 332 (base frame 308) via a joint 330.
The torso frame 318 houses the base frame 308 and a wheel drive mechanism 370. The wheel drive mechanism 370 includes a pivot shaft 378 and an actuator 379. The lower half of the torso frame 318 is narrowed so as to form a storage space S for the front wheels 102 between it and the wheel covers 312.
The outer skin 314 is made of urethane rubber and covers the main body frame 310 and the wheel covers 312 from the outside. The hands 106 are integrally molded with the outer skin 314. An opening 390 for introducing outside air is provided at the upper end of the outer skin 314.
FIG. 3 is a configuration diagram of the robot system 300.
The robot system 300 includes the robot 100, a server 200, and a plurality of external sensors 114. A plurality of external sensors 114 (external sensors 114a, 114b, ..., 114n) is installed in the house in advance. The external sensors 114 may be fixed to walls of the house or placed on the floor. The position coordinates of the external sensors 114 are registered in the server 200. The position coordinates are defined as x, y coordinates within the house, which is assumed to be the action range of the robot 100.
The server 200 is installed in the house. The server 200 determines the basic actions of the robot 100 based on information obtained from the sensors built into the robot 100 and from the plurality of external sensors 114.
The external sensors 114 reinforce the sensory organs of the robot 100, and the server 200 reinforces its brain.
Each external sensor 114 periodically transmits a wireless signal (hereinafter referred to as a "robot search signal") containing the ID of the external sensor 114 (hereinafter referred to as a "beacon ID"). Upon receiving a robot search signal, the robot 100 returns a wireless signal containing the beacon ID (hereinafter referred to as a "robot reply signal"). The server 200 measures the time from when an external sensor 114 transmits the robot search signal until it receives the robot reply signal, thereby measuring the distance from that external sensor 114 to the robot 100. By measuring the distances between the robot 100 and each of the plurality of external sensors 114, the server 200 identifies the position coordinates of the robot 100.
Of course, a scheme in which the robot 100 periodically transmits its own position coordinates to the server 200 may be used instead.
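By way of a non-authoritative sketch, the distance-and-position estimation above might be implemented roughly as follows. The propagation-speed constant, function names, and sensor coordinates are assumptions for illustration only; a real deployment would also have to calibrate out the robot's processing delay from the round-trip time.

```python
import numpy as np

SIGNAL_SPEED = 3.0e8  # assumed signal propagation speed [m/s]

def distance_from_round_trip(t_round_trip_s: float) -> float:
    """Distance implied by the search-signal/reply round trip (processing delay ignored)."""
    return SIGNAL_SPEED * t_round_trip_s / 2.0

def trilaterate(sensor_xy: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """Least-squares 2D position from >= 3 sensor positions and measured distances.

    Subtracting the first sensor's circle equation from the others yields a
    linear system A p = b in the unknown position p = (x, y).
    """
    x0, y0 = sensor_xy[0]
    d0 = distances[0]
    A, b = [], []
    for (xi, yi), di in zip(sensor_xy[1:], distances[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    p, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return p

# Example: three external sensors 114 at known coordinates, measured distances.
sensors = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 4.0]])
dists = np.array([2.5, 3.2, 2.9])
print(trilaterate(sensors, dists))  # estimated (x, y) of the robot 100
```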
FIG. 4 is a conceptual diagram of an emotion map 116.
The emotion map 116 is a data table stored in the server 200. The robot 100 selects actions according to the emotion map 116. The emotion map 116 shown in FIG. 4 indicates the magnitude of the robot 100's like or dislike of each location. The x-axis and y-axis of the emotion map 116 represent two-dimensional spatial coordinates, and the z-axis represents the magnitude of the like/dislike emotion. A positive z value indicates a strong liking for the location; a negative z value indicates an aversion to it.
In the emotion map 116 of FIG. 4, the coordinate P1 is a point with high positive emotion (hereinafter referred to as a "favored point") within the indoor space that the server 200 manages as the action range of the robot 100. A favored point may be a "safe place" such as behind a sofa or under a table, a place where people tend to gather such as a living room, or a lively place. It may also be a place where the robot was gently stroked or touched in the past.
How the robot 100 defines which places it likes is arbitrary, but in general it is desirable to set places favored by small children and by small animals such as dogs and cats as favored points.
The coordinate P2 is a point with high negative emotion (hereinafter referred to as an "aversion point"). An aversion point may be a place where loud sounds occur, such as near a television; a place that is easily wetted, such as a bath or washroom; an enclosed space or a dark place; or a place tied to an unpleasant memory of being treated roughly by a user.
How the robot 100 defines which places it dislikes is also arbitrary, but in general it is desirable to set places feared by small children and by small animals such as dogs and cats as aversion points.
The coordinate Q indicates the current position of the robot 100. The server 200 identifies the position coordinates of the robot 100 from the robot search signals periodically transmitted by the plurality of external sensors 114 and the robot reply signals responding to them. For example, when the external sensor 114 with beacon ID = 1 and the external sensor 114 with beacon ID = 2 each detect the robot 100, the distances from the two external sensors 114 to the robot 100 are obtained, and the position coordinates of the robot 100 are derived from them.
Alternatively, the external sensor 114 with beacon ID = 1 may transmit robot search signals in a plurality of directions, and the robot 100 returns a robot reply signal when it receives one. The server 200 can thereby grasp in which direction, and at what distance, the robot 100 is from which external sensor 114. In another embodiment, the travel distance of the robot 100 may be calculated from the rotation count of the front wheels 102 or the rear wheel 103 to identify the current position, or the current position may be identified based on images obtained from the camera.
When the emotion map 116 shown in FIG. 4 is given, the robot 100 moves in the direction drawn toward the favored point (coordinate P1) and in the direction away from the aversion point (coordinate P2).
The emotion map 116 changes dynamically. When the robot 100 reaches the coordinate P1, the z value (positive emotion) at the coordinate P1 decreases over time. The robot 100 can thereby emulate the biological behavior of arriving at the favored point (coordinate P1), having its "emotion satisfied," and eventually "growing bored" with the place. Likewise, the negative emotion at the coordinate P2 is alleviated over time. As time passes, new favored points and aversion points emerge, and the robot 100 accordingly makes new action selections. The robot 100 takes an "interest" in new favored points and continually selects new actions.
The emotion map 116 expresses emotional ups and downs as the internal state of the robot 100. The robot 100 heads for a favored point, avoids aversion points, stays at the favored point for a while, and in time takes its next action. Such control makes the action selection of the robot 100 human-like and life-like.
Maps that affect the actions of the robot 100 (hereinafter collectively referred to as "action maps") are not limited to the emotion map 116 of the type shown in FIG. 4. For example, a variety of action maps can be defined: curiosity, the desire to avoid fear, the desire for reassurance, and the desire for physical comfort such as quietness, dimness, coolness, or warmth. The destination point of the robot 100 may then be determined by taking a weighted average of the z values of the plurality of action maps.
Apart from the action maps, the robot 100 has parameters indicating the magnitudes of various emotions and senses. For example, when the value of the loneliness emotion parameter is rising, the weighting coefficient of the action map that evaluates reassuring places is set large, and the value of this emotion parameter is lowered by reaching the target point. Similarly, when the value of the parameter indicating boredom is rising, the weighting coefficient of the action map that evaluates places satisfying curiosity may be set large.
FIG. 5 is a hardware configuration diagram of the robot 100.
The robot 100 includes an internal sensor 128, a communicator 126, a storage device 124, a processor 122, a drive mechanism 120, and a battery 118. The drive mechanism 120 includes the wheel drive mechanism 370 described above. The processor 122 and the storage device 124 are included in the control circuit 342. The units are connected to one another by a power supply line 130 and a signal line 132. The battery 118 supplies power to each unit via the power supply line 130. Each unit sends and receives control signals via the signal line 132. The battery 118 is a lithium-ion secondary battery and is the power source of the robot 100.
The internal sensor 128 is a collection of the various sensors built into the robot 100. Specifically, these include the camera (omnidirectional camera), a microphone array, a ranging sensor (infrared sensor), a thermosensor, a touch sensor, an acceleration sensor, and an odor sensor. The touch sensor is installed between the outer skin 314 and the main body frame 310 and detects a user's touch. The odor sensor is a known sensor applying the principle that electrical resistance changes with the adsorption of the molecules that are the source of an odor.
The communicator 126 is a communication module that performs wireless communication with various external devices, such as the server 200, the external sensors 114, and mobile devices owned by users. The communicator 126 includes a first communicator 302 in charge of communication with the server 200 and the external sensors 114, and a second communicator 304 in charge of communication with other robots 100 and the like. The first communicator 302 communicates with the server 200 and the like via an omnidirectional Wi-Fi (registered trademark) communication method (the first wireless communication method). The second communicator 304 communicates with other robots 100 via an IrDA (Infrared Data Association) (registered trademark) communication method (the second wireless communication method), which is directional and has a narrow communication range. Details will be described later.
The storage device 124 is composed of non-volatile and volatile memory and stores computer programs and various setting information. The processor 122 is the means of executing the computer programs. The drive mechanism 120 is the set of actuators that controls the internal mechanisms. In addition, a display, a speaker, and the like are also installed.
The processor 122 selects the actions of the robot 100 while communicating with the server 200 and the external sensors 114 via the communicator 126. Various external information obtained by the internal sensor 128 also affects action selection. The drive mechanism 120 mainly controls the wheels (front wheels 102) and the head (head frame 316). The drive mechanism 120 changes the movement direction and movement speed of the robot 100 by changing the rotational speed and rotational direction of each of the two front wheels 102. The drive mechanism 120 can also raise and lower the wheels (the front wheels 102 and the rear wheel 103). When the wheels rise, they are completely housed in the body 104, and the robot 100 contacts the floor surface F via the seating surface 108, entering the seated state. The drive mechanism 120 also controls the hands 106 via wires 135.
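A minimal sketch of the differential-drive relationship implied here, in which the rotational speeds and directions of the two front wheels 102 determine the robot's movement direction and speed. The wheel radius and track width below are assumed values, not taken from the specification.

```python
WHEEL_RADIUS_M = 0.05   # assumed front-wheel radius [m]
TRACK_WIDTH_M = 0.20    # assumed distance between the two front wheels [m]

def body_velocity(omega_left: float, omega_right: float) -> tuple[float, float]:
    """Linear and angular velocity of the body from the two front-wheel
    angular velocities [rad/s] (standard differential-drive kinematics)."""
    v_left = WHEEL_RADIUS_M * omega_left
    v_right = WHEEL_RADIUS_M * omega_right
    v = (v_left + v_right) / 2.0            # forward speed [m/s]
    w = (v_right - v_left) / TRACK_WIDTH_M  # turn rate [rad/s]
    return v, w

# Equal speeds -> straight line; opposite speeds -> turning in place.
print(body_velocity(10.0, 10.0))   # (0.5, 0.0)
print(body_velocity(-10.0, 10.0))  # (0.0, 5.0)
```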
FIG. 6 is a functional block diagram of the robot system 300.
As described above, the robot system 300 includes the robot 100, the server 200, an accessory 140, and the plurality of external sensors 114. The accessory 140 has the function of transmitting operation commands to the robot 100. The components of the robot 100, the server 200, and the accessory 140 are realized by hardware, including arithmetic units such as CPUs (Central Processing Units) and various coprocessors, storage devices such as memory and storage, and the wired or wireless communication lines connecting them, together with software that is stored in the storage devices and supplies processing instructions to the arithmetic units. The computer programs may be composed of device drivers, operating systems, various application programs positioned in the layers above them, and libraries providing common functions to these programs. Each block described below indicates a functional-unit block, not a hardware-unit configuration.
Part of the functions of the robot 100 may be realized by the server 200, and part or all of the functions of the server 200 may be realized by the robot 100.
(Server 200)
The server 200 includes a communication unit 204, a data processing unit 202, and a data storage unit 206.
The communication unit 204 is in charge of communication processing with the external sensors 114 and the robot 100. The data storage unit 206 stores various data. The data processing unit 202 executes various processes based on the data acquired by the communication unit 204 and the data stored in the data storage unit 206. The data processing unit 202 also functions as an interface for the communication unit 204 and the data storage unit 206.
The data storage unit 206 includes a motion storage unit 232, a map storage unit 216, and a personal data storage unit 218.
The robot 100 has a plurality of motion patterns (motions). A variety of motions are defined, such as waving the hands 106, approaching the owner while meandering, and staring at the owner with its head tilted.
The motion storage unit 232 stores "motion files" that define the control content of motions. Each motion is identified by a motion ID. The motion files are also downloaded to the motion storage unit 160 of the robot 100. Which motion is executed may be determined by the server 200 or by the robot 100.
Many of the motions of the robot 100 are configured as compound motions that include a plurality of unit motions. For example, when the robot 100 approaches an owner, the approach may be expressed as a combination of a unit motion of turning to face the owner, a unit motion of approaching while raising a hand, a unit motion of approaching while swaying its body, and a unit motion of sitting while raising both hands. The combination of these four motions realizes a motion of "approaching the owner, raising a hand partway, and finally sitting down after swaying the body." In a motion file, the rotation angles, angular velocities, and the like of the actuators provided in the robot 100 are defined in association with a time axis. Various motions are expressed by controlling each actuator over time according to the motion file (actuator control information).
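The following is one hypothetical way a motion file could associate actuator angles with a time axis and be played back. The file layout, actuator names, and control period are illustrative assumptions; the specification does not fix a concrete format.

```python
from bisect import bisect_right

# Hypothetical "motion file": per-actuator keyframes of (time [s], angle [deg]).
MOTION_FILE = {
    "yaw_axis_320":   [(0.0, 0.0), (0.5, 30.0), (1.0, 0.0)],
    "pitch_axis_322": [(0.0, 0.0), (1.0, 15.0)],
}

def target_angle(keyframes, t: float) -> float:
    """Linearly interpolate the commanded angle at time t between keyframes."""
    times = [k[0] for k in keyframes]
    i = bisect_right(times, t)
    if i == 0:
        return keyframes[0][1]
    if i == len(keyframes):
        return keyframes[-1][1]
    (t0, a0), (t1, a1) = keyframes[i - 1], keyframes[i]
    return a0 + (a1 - a0) * (t - t0) / (t1 - t0)

# Drive each actuator along its trajectory at a fixed control period.
for step in range(5):
    t = step * 0.25
    commands = {name: target_angle(kf, t) for name, kf in MOTION_FILE.items()}
    print(t, commands)
```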
The transition time for changing from one unit motion to the next is called an "interval." An interval may be defined according to the time required for the unit-motion change and the content of the motion. The length of an interval is adjustable.
Hereinafter, settings related to the action control of the robot 100, such as when to select which motion and the output adjustment of each actuator for realizing a motion, are collectively referred to as "behavior characteristics." The behavior characteristics of the robot 100 are defined by a motion selection algorithm, motion selection probabilities, motion files, and the like.
In addition to the motion files, the motion storage unit 232 stores a motion selection table that defines the motions to be executed when various events occur. In the motion selection table, one or more motions and their selection probabilities are associated with each event.
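A sketch of how a motion selection table of this kind might be evaluated. The event names and motion IDs are illustrative assumptions; the probability values echo the 20% and 5% example given later in this description.

```python
import random

# Hypothetical motion selection table: each event maps to candidate
# motion IDs with selection probabilities. None = "no motion this time".
MOTION_SELECTION_TABLE = {
    "pleasant_act":  [("motion_A", 0.20), ("motion_B", 0.10), (None, 0.70)],
    "temp_over_30C": [("motion_B", 0.05), (None, 0.95)],
}

def select_motion(event: str):
    """Pick a motion ID for an event according to its selection probabilities."""
    candidates = MOTION_SELECTION_TABLE.get(event, [])
    r, acc = random.random(), 0.0
    for motion_id, p in candidates:
        acc += p
        if r < acc:
            return motion_id
    return None

print(select_motion("pleasant_act"))
```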
In addition to a plurality of action maps, the map storage unit 216 stores maps indicating the placement of obstacles such as chairs and tables. The personal data storage unit 218 stores information on users, particularly owners. Specifically, it stores master information indicating the familiarity with each user and each user's physical and behavioral features. Other attribute information, such as age and gender, may also be stored. The personal data storage unit 218 also registers familiarity not only with users but also with other robots 100.
The robot 100 has an internal parameter called familiarity for each user. When the robot 100 recognizes an action showing goodwill toward itself, such as being picked up or spoken to, its familiarity with that user increases. Familiarity decreases for users who are not involved with the robot 100, users who behave roughly, and users who are met infrequently. The familiarity of one robot 100 with another robot 100 will be described later.
The data processing unit 202 includes a position management unit 208, a map management unit 210, a recognition unit 212, an operation control unit 222, a familiarity management unit 220, and a state management unit 244.
The position management unit 208 identifies the position coordinates of the robot 100 by the method described with reference to FIG. 3. The position management unit 208 may also track a user's position coordinates in real time.
The state management unit 244 manages various internal parameters, such as the charging rate, the internal temperature, and various physical states such as the processing load of the processor 122. The state management unit 244 includes an emotion management unit 234.
The emotion management unit 234 manages various emotion parameters indicating the emotions of the robot 100 (loneliness, curiosity, desire for approval, and so on). These emotion parameters fluctuate constantly. The importance of the plurality of action maps changes according to the emotion parameters, the movement target point of the robot 100 changes according to the action maps, and the emotion parameters change with the movement of the robot 100 and with the passage of time.
For example, when the emotion parameter indicating loneliness is high, the emotion management unit 234 sets a large weighting coefficient for the action map that evaluates reassuring places. When the robot 100 reaches a point on that action map where loneliness can be eliminated, the emotion management unit 234 lowers the emotion parameter indicating loneliness. The various emotion parameters also change with the responsive actions described later. For example, the emotion parameter indicating loneliness decreases when the robot is "hugged" by an owner, and increases little by little when it has not seen the owner for a long time.
The map management unit 210 changes the parameter of each coordinate on the plurality of action maps by the method described in relation to FIG. 4. The map management unit 210 may select one of the plurality of action maps, or it may take a weighted average of the z values of the plurality of action maps. For example, suppose the z values at coordinates R1 and R2 are 4 and 3 on action map A, and -1 and 3 on action map B. With a simple average, the total z value at coordinate R1 is 4 - 1 = 3 and the total z value at coordinate R2 is 3 + 3 = 6, so the robot 100 heads in the direction of coordinate R2 rather than coordinate R1.
When action map A is weighted five times as heavily as action map B, the total z value at coordinate R1 is 4 × 5 - 1 = 19 and the total z value at coordinate R2 is 3 × 5 + 3 = 18, so the robot 100 heads in the direction of coordinate R1.
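The arithmetic of this example can be checked with a short sketch; the map names and helper functions are illustrative, with the z values and weights taken from the example above.

```python
# z values at the two candidate coordinates, from the worked example.
action_map_A = {"R1": 4.0, "R2": 3.0}
action_map_B = {"R1": -1.0, "R2": 3.0}

def combined_z(coord: str, weight_A: float = 1.0, weight_B: float = 1.0) -> float:
    """Weighted sum of the two action maps' z values at a coordinate."""
    return weight_A * action_map_A[coord] + weight_B * action_map_B[coord]

def destination(weight_A: float = 1.0, weight_B: float = 1.0) -> str:
    """The coordinate with the highest combined z value becomes the target."""
    return max(action_map_A, key=lambda c: combined_z(c, weight_A, weight_B))

print(destination())            # 'R2' (simple sum: 3 vs 6)
print(destination(weight_A=5))  # 'R1' (weighted: 19 vs 18)
```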
The recognition unit 212 recognizes the external environment. Recognition of the external environment includes a variety of recognitions, such as recognizing the weather and the season based on temperature and humidity, and recognizing sheltered spots (safe zones) based on light level and temperature. The recognition unit 156 of the robot 100 acquires various environmental information with the internal sensor 128, performs primary processing on it, and then transfers it to the recognition unit 212 of the server 200.
Specifically, the recognition unit 156 of the robot 100 extracts from images the image regions corresponding to moving objects, particularly people and animals, and extracts from each extracted image region a "feature vector," a set of feature quantities indicating the physical and behavioral features of the moving object. A feature vector component (feature quantity) is a numerical value quantifying a physical or behavioral feature. For example, the width of a human eye is quantified in the range of 0 to 1 and forms one feature vector component. The method of extracting a feature vector from a captured image of a person is an application of known face recognition technology. The robot 100 transmits the feature vectors to the server 200.
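As a simplified illustration of such a feature vector, each component below quantifies one physical or behavioral feature into the range 0 to 1. The chosen components and the distance measure are assumptions for this sketch, not taken from the specification.

```python
from dataclasses import dataclass

@dataclass
class FeatureVector:
    eye_width: float      # e.g. eye width normalized against face width
    has_beard: float      # 0.0 = no beard, 1.0 = clear beard
    wears_glasses: float
    active_early: float   # fraction of sightings before early morning ends

    def as_list(self) -> list[float]:
        return [self.eye_width, self.has_beard,
                self.wears_glasses, self.active_early]

def distance(a: FeatureVector, b: FeatureVector) -> float:
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a.as_list(), b.as_list()))

observed = FeatureVector(0.4, 0.9, 0.1, 0.8)
father_profile = FeatureVector(0.4, 1.0, 0.0, 0.9)
print(distance(observed, father_profile))  # small distance -> likely the same person
```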
The recognition unit 212 of the server 200 further includes a person recognition unit 214 and a response recognition unit 228.
The person recognition unit 214 compares the feature vector extracted from the image captured by the built-in camera of the robot 100 with the feature vectors of users (clusters) registered in advance in the personal data storage unit 218, and thereby determines which person the imaged user corresponds to (user identification processing). The person recognition unit 214 includes an expression recognition unit 230. The expression recognition unit 230 estimates a user's emotion by image recognition of the user's facial expression.
The person recognition unit 214 also performs user identification processing on moving objects other than people, for example, cats and dogs kept as pets.
The response recognition unit 228 recognizes the various responsive actions performed on the robot 100 and classifies them as pleasant or unpleasant actions. The response recognition unit 228 also classifies owners' responsive actions toward the behavior of the robot 100 as positive or negative reactions.
Pleasant and unpleasant actions are distinguished by whether a user's responsive action is agreeable or disagreeable to the robot as a living thing. For example, being hugged is a pleasant action for the robot 100, and being kicked is an unpleasant action for the robot 100. Positive and negative reactions are distinguished by whether a user's responsive action indicates the user's pleasant or unpleasant emotion. For example, hugging indicates the user's pleasant emotion and is a positive reaction, while kicking indicates the user's unpleasant emotion and is a negative reaction.
The operation control unit 222 of the server 200 determines the motions of the robot 100 in cooperation with the operation control unit 150 of the robot 100. Based on the action map selection by the map management unit 210, the operation control unit 222 of the server 200 creates a movement target point for the robot 100 and a movement route to it. The operation control unit 222 may create a plurality of movement routes and then select one of them.
The operation control unit 222 selects a motion of the robot 100 from the plurality of motions in the motion storage unit 232. A selection probability is associated with each motion for each situation. For example, a selection method is defined such that motion A is executed with a probability of 20% when a pleasant action is performed by an owner, and motion B is executed with a probability of 5% when the temperature reaches 30 degrees Celsius or more.
A movement target point and a movement route are determined by the action maps, and a motion is selected according to the various events described later.
The familiarity management unit 220 manages the familiarity for each user. As described above, familiarity is registered in the personal data storage unit 218 as part of the personal data. When a pleasant action is detected, the familiarity management unit 220 raises the familiarity with that owner. When an unpleasant action is detected, familiarity is lowered. The familiarity with an owner who has not been seen for a long period also declines gradually.
The familiarity management unit 220 of the present embodiment manages familiarity not only for each owner but also for each robot 100. Familiarity with other robots 100 is also registered in the personal data storage unit 218 as part of the personal data. Familiarity with another robot 100 increases or decreases according to how the robot interacts with it; details will be described later.
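A sketch of this familiarity bookkeeping, assuming per-subject scores raised by pleasant actions, lowered by unpleasant actions, and decayed when a subject has not been seen. The increments, decay rate, and clamping range are assumed values, not taken from the specification.

```python
familiarity: dict[str, float] = {}  # subject ID (user or robot 100) -> score

def clamp(x: float, lo: float = -100.0, hi: float = 100.0) -> float:
    return max(lo, min(hi, x))

def on_pleasant_act(subject_id: str, delta: float = 5.0) -> None:
    """Pleasant action detected: raise familiarity with the subject."""
    familiarity[subject_id] = clamp(familiarity.get(subject_id, 0.0) + delta)

def on_unpleasant_act(subject_id: str, delta: float = 8.0) -> None:
    """Unpleasant action detected: lower familiarity with the subject."""
    familiarity[subject_id] = clamp(familiarity.get(subject_id, 0.0) - delta)

def daily_decay(days_unseen: dict[str, int], rate: float = 0.5) -> None:
    """Familiarity with subjects not seen for a long time declines gradually."""
    for subject_id, days in days_unseen.items():
        familiarity[subject_id] = clamp(familiarity.get(subject_id, 0.0) - rate * days)

on_pleasant_act("owner_B")
on_unpleasant_act("visitor_A")
daily_decay({"owner_C": 10})
print(familiarity)
```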
(Robot 100)
The robot 100 includes a first communication unit 142, a second communication unit 134, a data processing unit 136, a data storage unit 148, the internal sensor 128, and the drive mechanism 120.
The first communication unit 142 corresponds to the first communicator 302, and the second communication unit 134 corresponds to the second communicator 304 (see FIG. 5). The first communication unit 142 is in charge of communication processing with the external sensors 114 and the server 200. The second communication unit 134 is in charge of communication processing with other robots 100 and the accessory 140. In the present embodiment, the accessory 140 is described as being a wristband.
The first communication unit 142 includes a communication connection unit 138. The communication connection unit 138 establishes a communication connection with the server 200 via Wi-Fi (registered trademark) (the first wireless communication method). The second communication unit 134 includes a transmitting unit 162 (transmitter) and a receiving unit 164 (receiver). The second communication unit 134 performs short-range wireless communication with other robots 100 or the accessory 140 via IrDA (registered trademark) (the second wireless communication method).
"Short-range wireless communication" in the present embodiment means wireless communication whose communication range is within 5 meters at most, preferably within 1.5 meters, and more preferably within 1 meter. Although not essential, it is desirable that the short-range wireless communication be a directional communication method, and that it be ad hoc (direct) communication.
The data storage unit 148 stores various data. The data storage unit 148 corresponds to the storage device 124 (see FIG. 5). The data processing unit 136 executes various processes based on the data acquired by the first communication unit 142 and the second communication unit 134, the sensor information detected by the internal sensor 128, and the data stored in the data storage unit 148. The data processing unit 136 corresponds to the processor 122 and the computer programs executed by the processor 122. The data processing unit 136 also functions as an interface for the first communication unit 142, the second communication unit 134, the internal sensor 128, the drive mechanism 120, and the data storage unit 148.
The data storage unit 148 includes a motion storage unit 160 that defines the various motions of the robot 100.
Various motion files are downloaded from the motion storage unit 232 of the server 200 to the motion storage unit 160 of the robot 100. A motion is identified by a motion ID. In order to express various motions, such as sitting with the front wheels 102 retracted, lifting the hands 106, making the robot 100 rotate by rotating the two front wheels 102 in opposite directions or by rotating only one front wheel 102, trembling by rotating the front wheels 102 while they are retracted, and stopping once and looking back when moving away from a user, the operation timing, operation duration, operation direction, and the like of the various actuators (the drive mechanism 120) are defined chronologically in the motion files.
Various data may also be downloaded to the data storage unit 148 from the map storage unit 216 and the personal data storage unit 218.
The data processing unit 136 includes a recognition unit 156, an operation control unit 150, a command selection unit 166, a robot detection unit 152, and a charge monitoring unit 154.
The operation control unit 150 of the robot 100 determines the motions of the robot 100 in cooperation with the operation control unit 222 of the server 200. Some motions may be determined by the server 200 and other motions by the robot 100. Alternatively, the robot 100 may determine its motions, with the server 200 determining them when the processing load of the robot 100 is high. A base motion may be determined at the server 200 and additional motions at the robot 100. How the motion determination processing is divided between the server 200 and the robot 100 may be designed according to the specifications of the robot system 300.
The operation control unit 150 of the robot 100 determines the movement direction of the robot 100 together with the operation control unit 222 of the server 200. Movement based on the action maps may be determined by the server 200, and immediate movement such as avoiding an obstacle may be determined by the operation control unit 150 of the robot 100. The drive mechanism 120 drives the front wheels 102 in accordance with instructions from the operation control unit 150, thereby heading the robot 100 toward the movement target point.
The operation control unit 150 of the robot 100 instructs the drive mechanism 120 to execute the selected motion. The drive mechanism 120 controls each actuator according to the motion file.
The operation control unit 150 can execute a motion of raising both hands 106 as a gesture begging to be "hugged" when a user with high familiarity is nearby, and can also express a motion of disliking the hug, once it has grown tired of being held, by alternately repeating reverse rotation and stopping with the left and right front wheels 102 retracted. The drive mechanism 120 makes the robot 100 express various motions by driving the front wheels 102, the hands 106, and the neck (head frame 316) in accordance with instructions from the operation control unit 150.
The command selection unit 166 selects operation commands. An operation command is a command for instructing another robot 100 to select a motion. The command selection unit 166 may randomly select from a plurality of types of operation commands at arbitrary timing, or may select the operation command corresponding to an event when the event occurs. For example, when the operation control unit 150 selects a motion M, the command selection unit 166 may select an operation command X associated in advance with the motion M. The transmitting unit 162 transmits the selected operation command to other robots 100.
When the receiving unit 164 detects robot information, the robot detection unit 152 identifies the presence of another robot and the direction and position in which it exists. The details of the robot information will be described later. The charge monitoring unit 154 monitors the remaining battery level of the battery 118.
The recognition unit 156 of the robot 100 interprets external information obtained from the internal sensor 128. The recognition unit 156 is capable of visual recognition (a visual faculty), odor recognition (an olfactory faculty), sound recognition (an auditory faculty), and tactile recognition (a tactile faculty).
The recognition unit 156 periodically images the outside world with the built-in omnidirectional camera and detects moving objects such as people and pets. The recognition unit 156 includes a feature extraction unit 146. The feature extraction unit 146 extracts a feature vector from the captured image of a moving object. As described above, a feature vector is a set of parameters (feature quantities) indicating the physical and behavioral features of a moving object. When a moving object is detected, physical and behavioral features are also extracted from the odor sensor, the built-in sound-collecting microphone, the temperature sensor, and the like. For example, when a moving object appears in an image, a variety of features are extracted: having a beard, being active early in the morning, wearing red clothes, smelling of perfume, having a loud voice, wearing glasses, wearing a skirt, having gray hair, being tall, being plump, being suntanned, being on the sofa. These features are also quantified and become feature vector components.
Based on the physical and behavioral features obtained from a large amount of image information and other sensing information, the robot system 300 clusters frequently appearing users as "owners."
For example, if a moving object (user) with a beard is often active in the early morning (rises early) and rarely wears red clothes, a first profile emerges: a cluster (user) that rises early, has a beard, and rarely wears red clothes. Meanwhile, if a moving object wearing glasses often wears a skirt but never has a beard, a second profile emerges: a cluster (user) that wears glasses and skirts but definitely never has a beard.
The above is a simple illustration, but by the method described above a first profile corresponding to the father and a second profile corresponding to the mother are formed, and the robot 100 recognizes that there are at least two users (owners) in this house.
However, the robot 100 does not need to recognize that the first profile is "the father." It need only recognize the figure of "a cluster that has a beard, often rises early, and rarely wears red clothes." For each profile, a feature vector characterizing that profile is defined.
Suppose that the robot 100 newly recognizes a moving object (user) in a state where such cluster analysis has been completed.
At this time, the person recognition unit 214 of the server 200 executes user identification processing based on the feature vector of the new moving object and determines which profile (cluster) the moving object corresponds to. For example, when a moving object with a beard is detected, the probability that this moving object is the father is high. If this moving object is active in the early morning, it is even more certain that it corresponds to the father. On the other hand, when a moving object wearing glasses is detected, the moving object may be the mother. If this moving object has a beard, it is neither the mother nor the father, so it is determined to be a new person that has not been cluster-analyzed.
The formation of clusters (profiles) by feature extraction (cluster analysis) and the fitting of objects to clusters along with feature extraction may be executed concurrently.
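One way the cluster-fitting step might look in outline: assign a new feature vector to the nearest known profile, or treat it as a new person when nothing is close enough. The profiles, vectors, and match threshold below are illustrative assumptions.

```python
# Known profiles as feature vectors: [beard, early riser, wears red clothes].
profiles = {
    "profile_1_father": [1.0, 0.9, 0.0],
    "profile_2_mother": [0.0, 0.3, 0.1],
}
MATCH_THRESHOLD = 0.5  # assumed maximum squared distance for a match

def identify(feature_vector: list[float]) -> str:
    """Nearest-profile matching with a rejection threshold for unknowns."""
    def sqdist(profile_name: str) -> float:
        return sum((x - y) ** 2
                   for x, y in zip(profiles[profile_name], feature_vector))
    best = min(profiles, key=sqdist)
    return best if sqdist(best) <= MATCH_THRESHOLD else "unclustered_new_person"

print(identify([1.0, 0.8, 0.0]))  # close to profile_1_father
print(identify([1.0, 0.1, 0.9]))  # bearded but unlike either profile -> new person
```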
Of the series of recognition processes including detection, analysis, and determination, the recognition unit 156 of the robot 100 selects and extracts the information necessary for recognition, and interpretation processes such as determination are executed by the recognition unit 212 of the server 200. The recognition processing may be performed only by the recognition unit 212 of the server 200 or only by the recognition unit 156 of the robot 100, or both may execute the recognition processing while dividing the roles as described above.
When a strong impact is applied to the robot 100, the recognition unit 156 recognizes it with the built-in acceleration sensor, and the response recognition unit 228 of the server 200 recognizes that a "rough act" has been performed by a nearby user. When a user lifts the robot 100 by grasping the horn 112, this may also be recognized as a rough act. When a user directly facing the robot 100 speaks in a specific volume range and a specific frequency band, the response recognition unit 228 of the server 200 may recognize that a "speaking act" has been performed toward it. When a temperature around body temperature is detected, the robot recognizes that a "touching act" has been performed by a user, and when upward acceleration is detected while the touch is recognized, it recognizes that a "hug" has been performed. Physical contact when a user lifts the body 104 may be sensed, and a hug may also be recognized from a decrease in the load applied to the front wheels 102.
In summary, the robot 100 acquires a user's actions as physical information with the internal sensor 128, the response recognition unit 228 of the server 200 determines pleasantness or unpleasantness, and the recognition unit 212 of the server 200 executes user identification processing based on the feature vectors.
The response recognition unit 228 of the server 200 recognizes a user's various responses to the robot 100. Of the various responsive actions, some typical ones are associated with pleasantness or unpleasantness, and with affirmation or denial. In general, most responsive actions that are pleasant actions are positive reactions, and most that are unpleasant actions are negative reactions. Pleasant and unpleasant actions relate to familiarity, and positive and negative reactions affect the action selection of the robot 100.
In accordance with the responsive action recognized by the recognition unit 156, the familiarity management unit 220 of the server 200 changes the familiarity with the user. In principle, familiarity increases toward a user who performs pleasant actions and decreases toward a user who performs unpleasant actions.
The familiarity with a moving object (user) changes depending on what actions the robot receives from that user.
The robot 100 sets high familiarity for people it meets often, people who often touch it, and people who often speak to it. On the other hand, familiarity declines for people it rarely sees, people who rarely touch it, rough people, and people who scold it loudly. The robot 100 changes the familiarity of each user based on the various external information detected by its sensors (visual, tactile, auditory).
The actual robot 100 autonomously performs complex action selection in accordance with the action maps. The robot 100 acts while being influenced by the plurality of action maps based on various parameters such as loneliness, boredom, and curiosity. When the influence of the action maps is excluded, or when the robot is in an internal state in which the influence of the action maps is small, the robot 100 in principle tries to approach people with high familiarity and to move away from people with low familiarity.
The actions of the robot 100 are categorized as follows according to familiarity (a threshold sketch of this categorization follows the list).
(1) Users with very high familiarity
The robot 100 strongly expresses affection by approaching the user (hereinafter referred to as "approach action") and by performing affection gestures defined in advance as gestures showing goodwill toward people.
(2) Users with relatively high familiarity
The robot 100 performs only the approach action.
(3) Users with relatively low familiarity
The robot 100 performs no particular action.
(4) Users with particularly low familiarity
The robot 100 performs a withdrawal action.
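A sketch of the four-way categorization above as threshold logic. The numeric thresholds are assumptions for illustration; the specification defines only the ordering of the categories, not concrete values.

```python
def behavior_for(familiarity: float) -> str:
    """Map a familiarity score to one of the four behavior categories."""
    if familiarity >= 75:    # (1) very high familiarity
        return "approach action + affection gesture"
    if familiarity >= 50:    # (2) relatively high familiarity
        return "approach action only"
    if familiarity >= 25:    # (3) relatively low familiarity
        return "no particular action"
    return "withdrawal action"  # (4) particularly low familiarity

for score in (90, 60, 30, 10):
    print(score, "->", behavior_for(score))
```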
According to the above control method, the robot 100 approaches a user with high familiarity when it finds one, and conversely moves away from a user with low familiarity. Such a control method can behaviorally express so-called "shyness around strangers." When a visitor (user A with low familiarity) appears, the robot 100 may also move away from the visitor and head toward a family member (user B with high familiarity). In this case, user B can sense that the robot 100 is shy and feels uneasy, and that it relies on user B. Such behavioral expression evokes in user B the joy of being chosen and relied upon, and the accompanying feeling of attachment.
On the other hand, when the visiting user A visits frequently, speaks to the robot, and touches it, the familiarity of the robot 100 with user A gradually rises, and the robot 100 stops performing the shy behavior (withdrawal action) toward user A. User A, too, can develop an attachment to the robot 100 by sensing that the robot 100 has grown accustomed to them.
The above action selection is not necessarily executed at all times. For example, when the internal parameter indicating the curiosity of the robot 100 is high, weight is given to the action map for finding places that satisfy curiosity, so the robot 100 may not select an action influenced by familiarity. When the external sensor 114 installed in the entrance hall detects that a user has come home, the robot may execute the action of greeting the user with top priority.
(Accessory 140)
The accessory 140 (a wristband) is an accessory capable of transmitting operation commands to an unspecified number of robots 100. The accessory 140 includes a transmitting unit 144, a receiving unit 170, and an instruction selection unit 172. The transmitting unit 144 and the receiving unit 170 communicate with the robot 100 via IrDA (registered trademark) (the second wireless communication method). The receiving unit 170 receives the robot information, described later, from the transmitting unit 162 of the robot 100. The transmitting unit 144 transmits operation commands to the robot 100. The operation commands of the accessory 140 will be described later with reference to FIG. 10. The instruction selection unit 172 selects an operation command according to an instruction from the user. A plurality of operation commands is registered in the accessory 140 in advance. By operating buttons (not shown) on the accessory 140, the user can select the operation command to be transmitted.
FIG. 7 is an enlarged external view of the horn 112.
A communication device arrangement surface G is formed on the horn 112. IrDA (registered trademark) transmitters and receivers are installed on the communication device arrangement surface G. IrDA (registered trademark) is highly directional and has a short communication range of about 0.3 to 1.0 meters. The communication range of IrDA (registered trademark) is thus limited to a short distance compared with general wireless communication. Moreover, because IrDA (registered trademark) cannot communicate when even a simple opaque shield such as a sheet of paper is placed between the transmitter and the receiver, it can communicate only with a counterpart that is not merely nearby but also visible. In other words, since transmission and reception are possible only when the counterpart can be visually recognized, extremely natural and secure communication, comparable to people whispering to each other, becomes possible.
FIG. 8 is a top view of the communication device arrangement surface G.
Eight second communication devices 304 (second communication devices 304a to 304h) are arranged in a ring on the circular communication device arrangement surface G. In FIG. 8, the upper side of the drawing corresponds to the front of the robot 100 (forehead side), and the lower side of the drawing corresponds to the rear of the robot 100 (back-of-head side). Each second communication device 304 includes one transmitter 158 and one receiver 168. Therefore, eight transmitters 158 and eight receivers 168 are arranged alternately on the communication device arrangement surface G. Position codes (IDs) are set for the eight second communication devices 304 (transmitters 158 and receivers 168) as follows.
Position code of second communication device 304a (front): F
Position code of second communication device 304b (front left): FL
Position code of second communication device 304c (left): L
Position code of second communication device 304d (rear left): BL
Position code of second communication device 304e (rear): B
Position code of second communication device 304f (rear right): BR
Position code of second communication device 304g (right): R
Position code of second communication device 304h (front right): FR
The eight transmitters 158 (transmitters 158a to 158h) correspond to the transmitting unit 162 in FIG. 6, and the eight receivers 168 (receivers 168a to 168h) correspond to the receiving unit 164.
The transmitters 158 transmit "robot information" and operation commands by IrDA (registered trademark). Since IrDA (registered trademark) is directional, signals such as robot information are transmitted periodically and simultaneously in eight directions by the eight transmitters 158. The robot information includes a "robot ID" identifying the robot 100 and a position code (transmitter ID) identifying the second communication device 304 (transmitter 158). For example, the transmitter 158a transmits the robot ID and the position code (F), and the transmitter 158b transmits the robot ID and the position code (FL). The robot ID may be any information that uniquely identifies the robot 100, such as a MAC address (Media Access Control address) or a serial number.
The receivers 168 receive robot information and operation commands from other robots 100. The receivers 168 also receive operation commands from the accessory 140.
When robot information is transmitted from a first robot 100 to a second robot 100, the second robot 100 can identify the sender's identity, position, orientation, and distance from that robot information. The second robot 100 first identifies the first robot 100 as the sender from the robot ID. Suppose here that the left receiver 168c (position code: L) of the second robot 100 receives robot information containing "position code: F" from the first robot 100. In this case, the robot detection unit 152 of the second robot 100 determines that the first robot 100 is located to the left of the second robot 100 and that the first robot 100 is facing the second robot 100. This is because the left receiver 168c (position code: L) of the second robot 100 received the robot information from the front transmitter 158a (position code: F) of the first robot 100.
More precisely, the robot detection unit 152 of the second robot 100 identifies the direction in which the first robot 100 is present from the reception strengths of the robot information at the respective receivers 168. For example, if the reception strength at the receiver 168c (left) is greater than that at any other receiver 168, the first robot 100 can be determined to be to the left of the second robot 100. Furthermore, from the reception strength, the robot detection unit 152 also estimates the distance to the external robot 100. In addition, when the reception strength of the signal from the transmitter 158a of the first robot 100 is greater than that of the signals from its other transmitters, it can be determined that the first robot 100 is facing the second robot 100.
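To illustrate this direction-finding scheme, the following is a minimal sketch in Python. The bearing values, data layout, and names (POSITION_BEARINGS, estimate_peer) are hypothetical; the text specifies only that the strongest-receiving receiver indicates the peer's direction and that the peer's transmitter code indicates its orientation.

```python
# Bearing in degrees clockwise from the robot's front, assumed for each
# position code of the eight transceivers arranged in a ring (hypothetical).
POSITION_BEARINGS = {
    "F": 0, "FR": 45, "R": 90, "BR": 135,
    "B": 180, "BL": 225, "L": 270, "FL": 315,
}

def estimate_peer(received):
    """Estimate where a peer robot is and whether it faces us.

    `received` maps our receiver's position code to a tuple of
    (reception strength, position code of the peer's transmitter).
    """
    # The receiver with the strongest signal indicates the peer's direction.
    our_code = max(received, key=lambda c: received[c][0])
    strength, peer_code = received[our_code]
    return {
        "peer_direction_deg": POSITION_BEARINGS[our_code],
        # If the strongest signal came from the peer's front transmitter (F),
        # the peer is facing toward us.
        "peer_faces_us": peer_code == "F",
        # Reception strength serves as a coarse proxy for distance.
        "strength": strength,
    }

# Example from the text: our left receiver (L) hears the peer's front
# transmitter (F) most strongly, so the peer is on our left, facing us.
print(estimate_peer({"L": (0.9, "F"), "FL": (0.4, "F"), "BL": (0.2, "FL")}))
```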
Next, the operation of the robot 100 in connection with short-range wireless communication based on IrDA (registered trademark) will be described.
As mentioned above, a transmitter 158 can transmit robot information only within a narrow area of about one meter. For this reason, short-range wireless communication by IrDA (registered trademark) is established only when another robot 100 (hereinafter referred to as the "external robot 100") is at close range. The server 200 registers the robot IDs of external robots 100 in a "robot list". Hereinafter, an external robot 100 whose robot ID has been registered (recognized) is also called a "registered robot 100", and an unregistered external robot 100 an "unregistered robot 100". The robot list is shared by the server 200 and the robot 100. In contrast to the external robot 100 being recognized, the recognizing robot 100 is also referred to as the "own robot 100".
When any of the plurality of receivers 168 mounted on the own robot 100 receives robot information from an external robot 100, the recognition unit 156 identifies the external robot 100 by referring to the robot list. If a robot ID not present in the robot list is received, that external robot 100 is an unregistered robot 100. In this case, the own robot 100 transmits the robot ID to the server 200, and the recognition unit 212 of the server 200 registers the newly detected robot ID in the robot list. A registered robot 100 is an external robot 100 that the own robot 100 knows, in other words, one it has interacted with before.
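A minimal sketch of this registered/unregistered bookkeeping follows, with the server-side registration collapsed into a local set for brevity. The RobotList class and its method names are hypothetical.

```python
# Sketch of the shared "robot list" lookup (hypothetical API; the text only
# states that the list is shared between the robot and the server).
class RobotList:
    def __init__(self):
        self.known_ids = set()   # robot IDs already registered (recognized)

    def classify(self, robot_id):
        """Return 'registered' for a known peer; register an unknown one."""
        if robot_id in self.known_ids:
            return "registered"
        # Unknown ID: report it to the server, which adds it to the list.
        self.known_ids.add(robot_id)   # stands in for server-side registration
        return "unregistered"

robots = RobotList()
print(robots.classify("4C:AB:00:01"))  # unregistered -> now registered
print(robots.classify("4C:AB:00:01"))  # registered
```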
When the robot ID of an external robot 100 is detected, the operation control unit 150 selects one motion from the plural types of motion associated with the event E1, "new detection of a robot ID". Specifically, various motions such as turning around, approaching, and moving away are conceivable. In response to the event E1, the operation control unit 150 may also behaviorally express that it has recognized the external robot 100 by stopping the motion being executed, changing the interval between motions, or changing the execution speed of a motion.
In particular, when an unregistered robot ID is detected, the operation control unit 150 selects one motion from the plural types of motion associated with the event E2, "detection of an unregistered robot ID". Specifically, various motions such as turning around, approaching, and tilting the head are conceivable. In response to the event E2, the operation control unit 150 may behaviorally express curiosity or wariness at finding an unknown robot 100, for example, by stopping the motion being executed or slowing down the execution speed of a motion.
The operation control unit 150 changes the behavior characteristics of the robot 100 according to the direction in which the external robot 100 is present. For example, when an external robot 100 is behind it, the robot 100 may execute a motion of moving away from the external robot 100. In other words, when the event E3, "detection of a robot ID by the rear receiver 168e (position code: B)", occurs, the operation control unit 150 sets the movement target point of the own robot 100 ahead, thereby expressing the behavior of being startled by, and fleeing from, a robot 100 that has appeared behind it. The operation control unit 150 may instead execute a motion of turning around by rotating the robot 100. Similarly, when an external robot 100 is in front of the own robot 100, the operation control unit 150 may execute a motion of approaching the external robot 100.
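The events E1 to E3 and their candidate motions can be pictured as a lookup table. The Python sketch below is illustrative only; the motion names and the table itself are hypothetical, since the text names the events but not a concrete data structure.

```python
import random

# Hypothetical mapping of the events described above to candidate motions.
EVENT_MOTIONS = {
    "E1_new_id_detected":       ["turn_around", "approach", "move_away"],
    "E2_unregistered_id":       ["turn_around", "approach", "tilt_head"],
    "E3_id_from_rear_receiver": ["flee_forward", "turn_to_look_back"],
}

def select_motion(event):
    # One motion is chosen from the candidates associated with the event.
    return random.choice(EVENT_MOTIONS[event])

print(select_motion("E3_id_from_rear_receiver"))
```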
The transmitting unit 162 of the robot 100 transmits operation commands to external robots 100. An operation command includes the motion ID of the motion to be executed. The robot 100 may transmit an arbitrary operation command at an arbitrary timing, for example at random timing, or may transmit an operation command when an event occurs, such as when an external robot 100 is detected. For example, when a robot 100P transmits to a robot 100Q an operation command X1 instructing a following operation, the operation control unit 150 of the robot 100Q moves behind the sending robot 100P and thereafter moves so as to follow the robot 100P. The following operation will be described in more detail with reference to FIG. 9.
When the robot 100Q receives an operation command X from the robot 100P, it may or may not obey the operation command X. The robot 100Q may obey the operation command X with a predetermined probability.
When the operation control unit 150 selects a motion for the own robot 100, the command selection unit 166 may select an operation command associated with the selected motion. For example, suppose the motion M1, "raise the hand 106", is associated with the operation command X2, "raise your hand 106", and the operation command X3, "stop". When the robot 100P selects the motion M1, the command selection unit 166 of the robot 100P randomly selects either operation command X2 or X3. Suppose the operation command X2 is selected. The transmitting unit 162 (transmitters 158a to 158h) of the robot 100P transmits the operation command X2 to the robot 100Q (external robot 100). According to such a control method, when the robot 100P raises its hand 106 (motion M1), an action chain is realized in which another robot 100Q raises its hand 106 in response. As another example, when the operation command X1 instructing a "following operation" is selected for the motion M2, "move", an action chain is realized in which another robot 100Q starts moving along with the robot 100P when the robot 100P moves.
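The motion-to-command association described above might be sketched as follows. The table contents and function name are hypothetical; only the pairing of motion M1 with commands X2/X3, and of motion M2 with command X1, comes from the text.

```python
import random

# Hypothetical motion-to-command table: when a robot selects a motion for
# itself, it may also broadcast one of the operation commands linked to it.
MOTION_COMMANDS = {
    "M1_raise_hand": ["X2_raise_hand", "X3_stop"],
    "M2_move":       ["X1_follow_me"],
}

def perform_and_broadcast(motion):
    command = random.choice(MOTION_COMMANDS[motion])
    # In the real system the eight transmitters 158 would send this by IrDA.
    return motion, command

# The robot raises its hand and may tell nearby robots to do the same,
# producing the action chain described in the text.
print(perform_and_broadcast("M1_raise_hand"))
```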
As described above, the intimacy management unit 220 manages intimacy not only toward users but also toward external robots 100. The intimacy management unit 220 changes the intimacy toward an external robot 100 in response to short-range wireless communication. For example, when the own robot 100 detects the robot ID of an external robot 100, it increases its intimacy toward that external robot 100.
Further, when an operation command is transmitted from the own robot 100 to an external robot 100, the external robot 100 decides probabilistically whether to execute the operation command. When the external robot 100 executes the operation command X, in other words when it accepts the operation command X, the transmitting unit 162 of the external robot 100 returns an "acceptance signal" to the sending own robot 100. When the own robot 100 receives the acceptance signal, it increases its intimacy toward the external robot 100. The external robot 100 then performs motion selection in accordance with the operation command X.
When the external robot 100 rejects the operation command X, the external robot 100 returns a "rejection signal" to the sending own robot 100. When the own robot 100 receives the rejection signal, it decreases its intimacy toward the external robot 100. According to such a control method, intimacy (favorability) toward an external robot 100 that obeys operation commands is raised, and intimacy (favorability) toward an external robot 100 that does not obey them is lowered. In other words, the robot 100 can be given the biological characteristic of coming to like a robot 100 that listens to its commands and to dislike a robot 100 that does not.
The own robot 100 may also decrease its intimacy toward an external robot 100 over time. According to such a control method, the biological characteristic that the sense of closeness (intimacy) gradually declines toward an external robot 100 with which interaction has become infrequent can be expressed.
Whether the robot 100Q obeys an operation command X from the robot 100P is influenced by the intimacy of the robot 100Q toward the robot 100P. Specifically, suppose the robot 100Q receives the operation command X together with robot information from the robot 100P. The operation control unit 150 of the robot 100Q transmits the robot ID from the first communication unit 142 to the server 200 and inquires about the intimacy of the robot 100Q toward the robot 100P. The operation control unit 150 of the robot 100Q accepts the operation command X with a probability of 90% if the intimacy is 70 or more, and with a probability of 50% if the intimacy is 30 or more and less than 70. On the other hand, when the intimacy is less than 30, the probability of accepting the operation command X is 5%.
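These thresholds translate directly into a small acceptance function. The sketch below is a minimal illustration; only the threshold values (70, 30) and probabilities (90%, 50%, 5%) come from the text, while the function names are hypothetical.

```python
import random

def accept_probability(intimacy):
    """Probability of obeying a peer's command, per the thresholds above."""
    if intimacy >= 70:
        return 0.90
    if intimacy >= 30:
        return 0.50
    return 0.05

def decide(intimacy):
    # "accept" triggers an acceptance signal back to the sender; "reject"
    # triggers a rejection signal.
    return "accept" if random.random() < accept_probability(intimacy) else "reject"

print(decide(80))  # usually "accept"
print(decide(10))  # usually "reject"
```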
According to such a control method, a behavior characteristic can be realized whereby a robot 100 readily obeys operation commands from a robot 100 with high intimacy but is reluctant to obey operation commands from a robot 100 with low intimacy. When the mutual intimacy of the robot 100P and the robot 100Q increases, whenever one robot 100 transmits an operation command X the other robot 100 changes its motion in response, so a behavioral expression becomes possible as if the plural robots 100 had become friends and were playing together. In particular, between robots 100 with high mutual intimacy, a long action chain continues in which one robot 100 selects a motion and transmits an operation command, and the other robot 100 selects a motion in accordance with that command and in turn transmits a new operation command.
The robot 100 may also change its behavior characteristics according to intimacy. When the own robot 100 receives the robot ID of an external robot 100, it checks the intimacy toward that external robot 100. For example, the operation control unit 150 of the own robot 100 approaches the external robot 100 if the intimacy toward it is 70 or more, and sits down facing the external robot 100 when the intimacy is 30 or more and less than 70. When the intimacy toward the external robot 100 is less than 30, the operation control unit 150 may instruct the drive mechanism 120 to move away from the external robot 100. According to such a control method, a behavior characteristic of approaching a friendly robot 100 and moving away from an unfriendly robot 100 can be expressed.
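A corresponding sketch for this intimacy-dependent reaction, again using only the thresholds stated above (function and motion names hypothetical):

```python
def reaction_to_peer(intimacy):
    """Behavior on receiving a peer's robot ID, per the thresholds above."""
    if intimacy >= 70:
        return "approach_peer"
    if intimacy >= 30:
        return "sit_facing_peer"
    return "move_away_from_peer"

for level in (80, 50, 10):
    print(level, reaction_to_peer(level))
```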
FIG. 9 is a schematic view showing a plurality of robots 100 forming a line.
In FIG. 9, the leading robot 100P transmits the operation command X1 instructing a "following operation". The plurality of transmitters 158 mounted on the robot 100P transmit the operation command X1 in all directions. The robot 100Q receives the operation command X1 from the robot 100P. Among the plurality of transmitters 158 mounted on the robot 100P, the robot 100Q identifies the rear transmitter 158e of the robot 100P by the position code (B) included in the robot information. The operation control unit 150 of the robot 100Q moves to the rear of the robot 100P and follows the robot 100P. Specifically, the robot 100Q moves to a position where its front receiver 168a can receive the robot information from the rear transmitter 158e of the robot 100P with a greater reception strength than any of its other receivers 168. In this way, the robot 100Q moves to a position where it can receive robot information from the rear transmitter 158e of the robot 100P and thereafter follows the robot 100P, which makes possible the behavioral expression of the robot 100Q trailing along behind the robot 100P.
Based on the reception strength of the operation command X1, the operation control unit 150 of the robot 100Q adjusts its movement speed so that the distance to the robot 100P stays within a predetermined range. The robot 100Q maintains a distance at which the reception strength of the operation command X1 from the rear transmitter 158e (position code: B), among the plurality of transmitters 158 of the robot 100P, falls within a predetermined range. On starting to follow, the robot 100Q returns an acceptance signal together with its robot ID to the robot 100P.
The robot 100Q further transmits (relays) the operation command X1. The robot 100P that transmitted the operation command X1 is set to a "command mode" by its operation control unit 150. A robot 100 set to the command mode no longer accepts operation commands from other robots 100. Therefore, in FIG. 9, the robot 100P does not react to the operation command X1 from the robot 100Q. On the other hand, a robot 100R that has not entered the command mode receives the operation command X1 from the robot 100Q and follows the robot 100Q. According to such a control method, the transmission of the operation command X1 from the robot 100P can trigger a plurality of robots 100 to take formation behavior. The operation control unit 150 cancels the command mode after a predetermined time has elapsed.
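The relay and command-mode behavior can be sketched as follows. The class structure is hypothetical and ignores timing, reception strength, and the predetermined timeout that cancels the command mode.

```python
class Robot:
    """Sketch of the relay and 'command mode' behavior (hypothetical API)."""
    def __init__(self, name, neighbors=None):
        self.name = name
        self.neighbors = neighbors or []  # robots within IrDA range
        self.command_mode = False         # set while this robot commands
        self.follower_of = None

    def issue(self, command):
        self.command_mode = True          # a commander ignores incoming commands
        for peer in self.neighbors:
            peer.receive(command, sender=self)

    def receive(self, command, sender):
        if self.command_mode or self.follower_of:
            return                        # already commanding or following
        if command == "X1_follow":
            self.follower_of = sender.name
            # Relay so that robots outside the sender's range also join in.
            for peer in self.neighbors:
                peer.receive(command, sender=self)

r = Robot("R")
q = Robot("Q", neighbors=[r])
p = Robot("P", neighbors=[q])
p.issue("X1_follow")
print(q.follower_of, r.follower_of)   # P Q  -> formation P <- Q <- R
```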
The following operation is possible not only from behind but also from the side. For example, the robot 100P can move with the robot 100Q and the robot 100R flanking it on the left and right. In this case, the robot 100Q receives the operation command X1 from the right transmitter 158g (position code: R) of the robot 100P, and the robot 100R moves to a position where it receives the operation command X1 from the left transmitter 158c (position code: L) of the robot 100P. According to such a control method, when many robots 100 gather at a large event venue such as a gymnasium, these robots 100 can be made to act in unison. Beyond the following operation, various action chains are possible, such as raising the hands 106 all at once or tilting the heads all at once. Action chains can offer owners the fun of bringing their robots 100 together and letting them interact.
FIG. 10 is a schematic view showing action instructions given to robots 100 by the accessory 140.
Like the communication device arrangement surface G shown in FIG. 8, the accessory 140 (wristband) includes a plurality of transmitters 158 and a plurality of receivers 168. The set of transmitters 158 corresponds to the transmitting unit 144, and the set of receivers 168 corresponds to the receiving unit 170. The transmitters 158 and receivers 168 of the accessory 140 are also identified by position codes. The transmitting unit 144 of the accessory 140 periodically transmits an operation command X. The transmission reach area 190 indicates the range within which an operation command X from the accessory 140 can be received.
The user wearing the accessory 140 (wristband) can switch the operation command to be transmitted via the instruction selection unit 172. Suppose the accessory 140 transmits an operation command X5, "entry into the entry-prohibited area 180 is forbidden". The entry-prohibited area 180 may be set as the range within which a robot 100 can visually recognize the accessory 140, or as the range close enough to the accessory 140 that the reception strength of the operation command X5 is at or above a predetermined threshold. In FIG. 10, having received the operation command X5, the robot 100S stays out of the entry-prohibited area 180 even while approaching the accessory 140. The operation command X5 thus makes it possible to instruct robots 100 not to come near the accessory 140.
When the user does not want to be bothered by the robot 100 because of study or work, wearing the accessory 140 (wristband) and transmitting the operation command X5 allows the user to concentrate on the task without being disturbed by the robot 100. The entry-prohibited area 180 need not be circular. The entry-prohibited area 180 may have another shape, such as an ellipse, and its size and shape may be changed periodically. When the robot 100 receives an operation command, it probabilistically decides to accept or reject it and returns an acceptance signal or rejection signal indicating the decision to the accessory 140. However, some "strong operation commands", such as the operation command X5, may be made impossible to reject.
Suppose also that the accessory 140 transmits an operation command X4, "sit down". On receiving the operation command X4, the robot 100T located within the transmission reach area 190 retracts its front wheels 102 and sits down. The command selection unit 166 of the robot 100T selects the operation command X4 and transmits (relays) it. When the robot 100U receives the operation command X4 from the robot 100T, the robot 100U also sits down.
Since the robot 100U is outside the transmission reach area 190, it does not receive the operation command X4 from the accessory 140 itself, but it ends up sitting down because the robot 100T relays the operation command X4. According to such a control method, an operation command X from the accessory 140 can be relayed to many robots 100 without being constrained by the transmission reach area 190 of the accessory 140. For example, when many robots 100 are present, transmitting the operation command X4 from the accessory 140 can make the robots 100 sit down one after another, starting from those near the accessory 140, like falling dominoes.
The transmitting unit 144 of the accessory 140 may transmit the operation command X on condition that it has received robot information from a robot 100. According to such a control method, the operation command X is not transmitted when no robot 100 is nearby, so the power consumption of the transmitting unit 144 can be reduced. The transmitting unit 144 may also transmit, in addition to the operation command X, an accessory ID identifying the accessory 140 and a transmitter ID, just as the robot 100 does.
The accessory 140 can transmit an operation command X simultaneously to the plural robots 100 present within the transmission reach area 190. The user of the accessory 140 can therefore enjoy the sensation of commanding plural robots 100 at once. For example, by transmitting an operation command X6 meaning "come closer", the user can gather plural robots 100 around himself or herself, and by then transmitting the operation command X4 meaning "sit down", make the gathered robots 100 sit down all at once. Various other operation commands can be defined, such as waving the hand 106 or tilting the head. Since a monitor is installed in each eye 110 of the robot 100, the expression of the eyes 110 may also be controlled by an operation command. For example, an operation command that makes the eyes 110 of the robots 100 glow red all at once is conceivable.
FIG. 11 is a schematic view for explaining the process by which a host robot 100V authenticates a guest robot 100W.
The robot system 300, which includes the server 200 and the host robot 100V, can accept a new guest robot 100W. The host robot 100V is a robot 100 that receives action support from the server 200. "Accepting" here means that the guest robot 100W, an outsider, becomes able to connect to the server 200 and receive action support from it by accessing the resources (hardware, software, and data) managed by the server 200. Thereafter, the server 200 supports the actions of both robots 100: the host robot 100V and the guest robot 100W.
The following description assumes a situation in which the guest robot 100W has been brought to the house that serves as the base of the robot system 300 (the server 200 and the host robot 100V). The guest robot 100W becomes able to connect to the server 200 by receiving access information from the host robot 100V. The "access information" here includes the access key of the wireless LAN (Local Area Network) to which the server 200 is connected, the IP address of the server 200, a port number, a password, and the like.
The first communication unit 142 of the host robot 100V connects to the server 200 by Wi-Fi (registered trademark). More specifically, the communication connection unit 138 of the host robot 100V connects to the wireless LAN (wireless environment) to which the server 200 belongs using the Wi-Fi (registered trademark) access information, and communicates with the server 200 wirelessly. To use the resources of the server 200, the host robot 100V is authenticated by the server 200. The guest robot 100W, on the other hand, does not know the access information of the server 200. When the host robot 100V receives robot information from the guest robot 100W, the host robot 100V transmits the access information to the guest robot 100W by IrDA (registered trademark) (S1). At this time, the communication connection unit 138 of the host robot 100V generates a guest password, and the transmitting unit 162 of the host robot 100V notifies the guest robot 100W of the guest password as well.
The host robot 100V transmits the robot ID and guest password of the guest robot 100W to the server 200 (S2). The server 200 registers the robot ID and guest password of the guest robot 100W.
The communication connection unit 138 of the guest robot 100W connects to the wireless LAN based on the access information and the guest password received from the host robot 100V, and connects to the server 200 over the wireless connection (S3). The server 200 verifies that the robot ID and guest password notified by the guest robot 100W match those notified by the host robot 100V, and then permits the connection with the guest robot 100W.
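Steps S1 to S3 amount to a small admission protocol. The following Python sketch is illustrative only; credential formats, method names, and the Server/HostRobot classes are hypothetical, and the actual system would carry S1 over IrDA and S2/S3 over the wireless LAN.

```python
import secrets

class Server:
    def __init__(self):
        self.guests = {}                          # robot ID -> guest password

    def register_guest(self, robot_id, password): # S2: notified by the host
        self.guests[robot_id] = password

    def connect(self, robot_id, password):        # S3: attempted by the guest
        # Connection is permitted only if the credentials match.
        return self.guests.get(robot_id) == password

class HostRobot:
    def __init__(self, server, access_info):
        self.server = server
        self.access_info = access_info            # LAN key, IP, port, ...

    def admit(self, guest_id):
        password = secrets.token_hex(8)           # generated guest password
        self.server.register_guest(guest_id, password)   # S2
        return self.access_info, password         # S1: handed over by IrDA

server = Server()
host = HostRobot(server, access_info={"ssid": "home-lan", "ip": "192.0.2.1"})
access, pw = host.admit("GUEST-42")
print(server.connect("GUEST-42", pw))             # True: guest is accepted
```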
In this way, when the host robot 100V detects an unregistered robot 100, it supports the connection of the guest robot 100W to the server 200 by handing the access information and other credentials to the unregistered robot 100 (guest robot 100W). If the access information were transmitted by an omnidirectional communication method with a wide range, there would be a risk of the access information being intercepted. Since IrDA (registered trademark) is, as described above, highly directional and limited in range, the risk of interception is low. Note that the access information may be transmitted not only upon detecting an unregistered robot 100 but whenever an external robot 100 is detected. By providing the access information in this manner, the guest robot 100W can be given access to the network without human intervention.
FIG. 12 is an external view of the charging stations 250.
The charging stations 250 (charging station 250a and charging station 250b) are chargers for the robots 100, each having an internal space that accommodates a robot 100. Charging starts when a robot 100 enters a charging station 250 (charger) and assumes a predetermined posture. Each charging station 250 is paired one-to-one with a robot 100. The charging station 250 incorporates a communication device 252. The communication device 252 periodically transmits, by IrDA (registered trademark), the robot ID of the robot 100 with which it is paired.
The charging station 250 includes a table 260, a slope 262 that smoothly bridges the top surface of the table 260 and the floor surface F, and a frame 254 provided around the table 260. At the center of the table 260, a marker M is attached as a landmark for the robot 100 when it enters the charging station 250. The marker M is a circular area colored differently from the table 260.
The frame 254 includes a decorative member 256 surrounding the periphery of the table 260. The decorative member 256 is formed by layering many decorative pieces with a leaf motif, evoking the image of a hedge. A connection terminal 258 for power feeding is provided at a position slightly offset from the central marker M on the table 260.
The charge monitoring unit 154 of the robot 100 monitors the remaining battery level of the battery 118. When the remaining battery level (amount of charge) falls to or below a predetermined threshold, for example when the charge rate reaches 30% or less, the robot 100 heads for a charging station 250. The robot 100 receives a robot ID from the communication device 252 incorporated in a charging station 250. The robot 100 sets the charging station 250 that transmits its own robot ID as the movement target point. In FIG. 12, the robot 100 is docked in the charging station 250a.
When entering the charging station 250a, the robot 100 images the marker M and controls its direction of travel using the marker M as a landmark. After the robot 100 enters the charging station 250a, the connection terminal 258 connects with a connection terminal provided on the bottom of the robot 100, whereby the charging circuits of the robot 100 and the charging station 250 become electrically connected. When entering the charging station 250a, the operation control unit 150 adjusts the orientation of the robot 100 so that the front receiver 168a receives the robot ID with a higher reception strength than the other receivers 168.
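The charger-selection rule reduces to a simple check: seek a station only below the threshold, and only the station broadcasting one's own robot ID qualifies. A minimal sketch follows; the 30% threshold comes from the text, while the names and data layout are hypothetical.

```python
CHARGE_THRESHOLD = 0.30   # charge rate at which the robot seeks its charger

def pick_station(my_robot_id, charge_rate, heard_stations):
    """Return the station to head for, or None.

    `heard_stations` maps a station name to the robot ID it broadcasts
    by IrDA (each station is paired with exactly one robot).
    """
    if charge_rate > CHARGE_THRESHOLD:
        return None                       # enough battery; keep playing
    for station, paired_id in heard_stations.items():
        if paired_id == my_robot_id:      # only my own station qualifies
            return station
    return None

print(pick_station("ROBOT-01", 0.25,
                   {"station_a": "ROBOT-01", "station_b": "ROBOT-02"}))
```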
According to such a control method, even when a plurality of charging stations 250 are installed, the robot 100 can identify and charge at "its own charging station 250". A biological characteristic resembling the homing instinct can thus be behaviorally expressed in the robot 100.
The robot 100 and the robot system 300 including the robot 100 have been described above based on the embodiment.
There is little difference in appearance between robots 100. For this reason, it is difficult to identify an external robot 100 by image recognition. In the present embodiment, since an external robot 100 is recognized by the robot ID transmitted by IrDA (registered trademark), the external robot 100 can be identified easily.
With image recognition, a robot might mistake its own reflection in a mirror for an external robot 100, but with the identification method based on robot IDs (hereinafter the "ID identification method") no such misrecognition occurs. For example, when the robot 100 with robot ID = 01 receives the robot ID = 01 reflected by a mirror, it can recognize that the robot 100 in the mirror is itself and not an external robot 100. The ID identification method also has the merit of a lighter processing load than image recognition.
The robot detection unit 152 can easily determine the direction in which an external robot 100 is present from which of the plurality of receivers 168 received the signal from the external robot 100 with the greatest reception strength. The robot detection unit 152 can also determine the orientation of the external robot 100 relative to the own robot 100 from the "position code of the transmitter 158" transmitted by the external robot 100.
The distance at which the own robot 100 can receive the robot ID of an external robot 100 is the distance at which the own robot 100 can recognize the external robot 100. When the reception capability of the robot 100 is low, only nearby external robots 100 can be recognized. In other words, the reception capability of the robot 100 can express the "eyesight (recognizable range)" of the robot 100.
When the robot ID of an external robot 100 is detected, the own robot 100 changes its behavior characteristics. According to such a control method, a behavioral expression becomes possible as if the own robot 100 had changed its behavior in awareness of the presence of the external robot 100 (another). In particular, when an unregistered robot is detected, the own robot 100 may behaviorally express "curiosity" by selecting a motion of approaching the external robot 100, or "wariness" by turning its head away or moving away from the external robot 100.
By managing the intimacy of one robot 100 toward another robot 100, likes and dislikes between robots 100 can be expressed. For example, a relationship resembling a "human relationship" can be expressed between robots 100, such as the robot 100P liking the robot 100Q while the robot 100Q does not much like the robot 100P. In other words, the "sociality" between robots 100 can be expressed.
According to the present embodiment, the robot 100P transmits an operation command X to the robot 100Q, and the robot 100Q passes the operation command X on to another robot 100R, so that an action chain involving many robots can be realized. The same applies when the accessory 140 transmits the operation command X. According to such a control method, diverse expressions become possible not only for the behavior characteristics of a single robot 100 but also for the behavior characteristics of a group.
The present invention is not limited to the above embodiment and modifications, and its constituent elements can be modified and embodied without departing from the gist of the invention. Various inventions may be formed by appropriately combining the plural constituent elements disclosed in the above embodiment and modifications. Some constituent elements may also be removed from the full set of constituent elements shown in the above embodiment and modifications.
Although the robot system 300 has been described as being configured from one robot 100, one server 200, and a plurality of external sensors 114, part of the functions of the robot 100 may be realized by the server 200, and part or all of the functions of the server 200 may be assigned to the robot 100. One server 200 may control a plurality of robots 100, or a plurality of servers 200 may cooperate to control one or more robots 100.
A third device other than the robot 100 and the server 200 may take on part of the functions. The collection of the functions of the robot 100 and the functions of the server 200 described with reference to FIG. 6 can also be understood, broadly, as one "robot". How to distribute the plural functions necessary to realize the present invention across one or more pieces of hardware should be decided in view of the processing capability of each piece of hardware, the specifications required of the robot system 300, and so on.
As described above, the "robot in the narrow sense" is the robot 100 excluding the server 200, while the "robot in the broad sense" is the robot system 300. Many of the functions of the server 200 may possibly be integrated into the robot 100 in the future.
The behavior control program of the robot 100 may be provided from a predetermined server via the Internet, or may be provided on a fixed recording medium such as a CD-ROM. In either case, the behavior control program of the robot 100 may be installed on the robot 100 by being provided from a recording medium (a server, CD-ROM, or the like) separate from the robot 100.
The operation command X may be an action instruction for playing a game among a plurality of robots 100. As an example, consider a game of "tag" among plural robots 100. The robot 100P first transmits an operation command X7 meaning "I want to play tag" to gather companions. Robots 100 that accept the operation command X7 join the game of tag. Suppose three robots 100P to 100R join the game. The state management unit 244 of the server 200 first registers the robot 100P as "it". The communication unit 204 of the server 200 notifies the robots 100P to 100R that the robot 100P is "it".
The robot 100P, playing "it", chases the robot 100Q and the robot 100R, and the robot 100Q and the robot 100R run away from the robot 100P ("it"). When the robot 100P ("it") touches the robot 100Q, the robot 100Q recognizes from the reception of the robot information and the detection signal of the touch sensor that it has been tagged by "it", and notifies the server 200 that it has been tagged. The server 200 then sets the robot 100Q as "it". Such a control method makes possible the behavioral expression of robots 100 playing tag among themselves when plural robots 100 gather.
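The server-side bookkeeping for this game can be sketched as a small state machine. Class and method names here are hypothetical, and touch detection is abstracted to a method call.

```python
class TagGame:
    """Sketch of the server-side 'it' bookkeeping (hypothetical API)."""
    def __init__(self, players, first_it):
        self.players = players
        self.it = first_it                # the state management unit's record
        self.notify_all()

    def notify_all(self):
        # The server tells every participant who is currently "it".
        print(f"'it' is now {self.it}")

    def on_touch(self, toucher, touched):
        # A touch only matters when the robot that is "it" tags another
        # participant; in the real system this is detected via robot
        # information reception plus the touch sensor.
        if toucher == self.it and touched in self.players:
            self.it = touched
            self.notify_all()

game = TagGame(players={"100P", "100Q", "100R"}, first_it="100P")
game.on_touch("100P", "100Q")   # 100Q becomes "it"
```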
The operation command X need not directly specify a motion; it may be a start command for one of the various behavior control programs (application programs) installed on the external robot 100. For example, tag may be played by plural robots 100 by instructing an external robot 100 equipped with a behavior control program for playing tag (hereinafter the "tag application") to start the tag application. Beyond "tag", various behavior control programs are conceivable, such as the "following operation" described with reference to FIG. 9 and "hand play" in which two robots 100 touch their hands 106 together. These behavior control programs are identified by application IDs. The first robot 100 may instruct the second robot 100 which behavior control program to start by transmitting an operation command X specifying an application ID to the second robot 100. Then, when the second robot 100 returns an acceptance signal, the first robot 100 may start the same behavior control program itself. In this way, a robot 100 may carry not only basic motions but also action selection programs (application programs) that define how actions are selected under particular rules, and one robot may send the other an operation command as a start instruction (an invitation to play).
The accessory 140 and an action selection program may be sold as a set. For example, an application ID may be associated with the accessory 140, and when the purchaser of the accessory 140 accesses a predetermined server and enters the application ID, the action selection program corresponding to the application ID may be downloaded from that server to the robot 100. According to such an aspect, the behavior repertoire of the robot 100 can be enriched by purchasing accessories 140, so the fun of collecting accessories 140 and teaching the robot 100 various behaviors can be offered.
The robot 100 may execute short-range wireless communication with an external robot 100 only when it can visually recognize the external robot 100. Specifically, the robot detection unit 152 of the robot 100 may determine the position and orientation of the external robot 100 only when it receives robot information and the like from the external robot 100 while the external robot 100 has been imaged by the camera and recognized in the image. Likewise, the operation control unit 150 may accept an operation command X from the external robot 100 only when the external robot 100 has been recognized in an image. The operation control unit 150 may ignore an operation command X from the external robot 100 if the direction of the external robot 100 recognized by the camera does not match the direction of the external robot 100 recognized by the second communication unit 134. According to such a control method, the own robot 100 performs short-range wireless communication only with an external robot 100 it has visually confirmed, and so stops executing unnecessary and unnatural reactive behaviors toward short-range wireless communication from anything other than that external robot 100. The robot 100 may likewise accept an operation command X from the accessory 140 only on condition that the accessory 140 is visible. Operation commands function, so to speak, like "telepathy" between robots 100. The above control method can express the steadier behavior characteristic of accepting operation commands (telepathy) only from an external robot 100 that is right in front of it.
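The cross-check between the camera direction and the IrDA direction might look like the following sketch. The angular representation and the 30-degree tolerance are hypothetical; the text requires only that the two directions match.

```python
def accept_command(camera_bearing_deg, irda_bearing_deg, tolerance_deg=30):
    """Accept a peer's command only if the camera and the IrDA receivers
    agree on where the sender is (hypothetical tolerance value)."""
    if camera_bearing_deg is None:
        return False                       # sender not visually recognized
    diff = abs(camera_bearing_deg - irda_bearing_deg) % 360
    diff = min(diff, 360 - diff)           # shortest angular difference
    return diff <= tolerance_deg

print(accept_command(265, 270))   # directions agree -> command accepted
print(accept_command(90, 270))    # mismatch -> command ignored
print(accept_command(None, 270))  # not visible -> command ignored
```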
When the operation control unit 150 recognizes an external robot 100 in an image, it may approach the external robot 100 and confirm its robot ID by short-range wireless communication. The robot 100 can usually see an external robot 100 from outside the communication range of IrDA (registered trademark). According to such a control method, a behavior characteristic of spotting an external robot 100 from a distance and approaching it to confirm its identity can be expressed.
 In the present embodiment, robot information and motion commands are described as being transmitted and received by IrDA (registered trademark). As a modification, the second communication unit 134 may communicate with the external robot 100 or the accessory 140 by another short-range wireless communication method, such as ultrasonic communication, NFC (Near Field Communication), or Bluetooth (registered trademark).
 The robot 100 may control home appliances in the room using IrDA signals or the like. For example, when the robot 100 receives the voice instruction "air conditioner on" from a user, it may instruct the air conditioner to start on the user's behalf by transmitting a power-on signal to the air conditioner.
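 A minimal sketch of such appliance control follows (the phrase, the IR code value, and the transmitter interface are illustrative assumptions, not codes or APIs of any real appliance):

    IR_CODES = {"air conditioner on": 0x10AF8877}  # hypothetical phrase-to-code table

    def handle_voice(phrase, ir_transmitter):
        # On a recognized voice instruction, send the corresponding power-on signal.
        code = IR_CODES.get(phrase.lower())
        if code is not None:
            ir_transmitter.send(code)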
 The accessory 140 may be any item that a user can carry (a portable item). For example, the accessory 140 may be an earring, a mobile phone, a smartphone, a strap, a key ring, a ring, a charm, or the like. The accessory 140 may transmit a plurality of motion commands simultaneously; for example, it may transmit an entry-prohibition command and a sit-down command at the same time. The accessory 140 may transmit motion commands in a plurality of directions simultaneously, or may transmit them by omnidirectional radio waves.
 In the present embodiment, the transmitter 158 and the receiver 168 are described as being installed in the horn 112. As a modification, the transmitter 158 and the receiver 168 may be installed on the head of the robot 100, or on its chest or abdomen. In any case, it is desirable that the communication device arrangement plane G be substantially horizontal with respect to F (inclined at no more than 30 degrees relative to F).
(First modification of the accessory 140)
 The accessory 140 may be formed as a guidance device having a communication function with the robot 100, for example as a rod-shaped object imitating a magic wand or a conductor's baton. Specifically, the accessory 140 includes a grip portion, which the user holds and operates, and a stick (an elongated member) connected to the grip portion. A light emitting unit is attached to the tip of the stick.
 The grip portion includes a switch, and a battery is built into the grip portion. By operating the switch, the user selects one of a plurality of modes. The plurality of modes include, for example, a mode for calling over a robot at a distance (hereinafter, "calling mode"), a mode for playing by moving the robot with the stick (hereinafter, "guidance mode"), and a mode for taking a walk as if the robot were connected by a lead (a string) (hereinafter, "walk mode"). The instruction selection unit 172 selects a motion command according to the switch operation, the selected mode, and how the stick is being moved at the time.
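 How the instruction selection unit 172 might combine the mode and the stick movement can be sketched as follows (the mode names and gesture labels are assumptions for illustration):

    from enum import Enum, auto

    class Mode(Enum):
        CALLING = auto()    # call over a robot at a distance
        GUIDANCE = auto()   # move the robot around with the stick
        WALK = auto()       # walk the robot as if on a lead

    def select_command(mode, gesture=None):
        # Map the current mode (and, in guidance mode, the detected stick
        # gesture) to the name of a motion command.
        if mode is Mode.CALLING:
            return "search"
        if mode is Mode.WALK:
            return "follow"
        if mode is Mode.GUIDANCE and gesture is not None:
            return {"circle": "circle", "swing": "move_in_direction"}.get(gesture)
        return None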
 The light emitting unit (tip portion) is formed so that it can emit light from an infrared LED used for communication. The light emitting unit incorporates a transmission unit 144 and a reception unit 170 that use infrared light for communication. The light emitting unit (tip portion) is also provided with an acceleration sensor for measuring the movement of the tip, and a visible-light LED. In addition, a unique identification number is assigned to the accessory 140 and registered in the internal memory of the accessory 140.
 The accessory 140 includes a link forming unit. The link forming unit notifies the robot 100 of the identification number of the accessory 140 and establishes a connection with the robot 100. Hereinafter, the state in which the robot 100 has been notified of the identification number of an accessory 140 and a connection with that accessory 140 has been established is called the "linked state". When the robot 100 is in the linked state, it can accept motion commands from the linked accessory 140. After the connection with the robot 100 has been established, the accessory 140 transmits its identification number together with each motion command. The robot 100 follows a motion command when the identification number received with it matches the identification number of the linked accessory 140 (the identification number registered as the link destination). When the identification number received with a motion command differs from that of the linked accessory 140, the motion command is ignored. Being linked means that the robot 100 follows instructions from the accessory 140.
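 The identification-number check condenses to a few lines, as in this minimal sketch (the class and method names are assumptions):

    class LinkedRobot:
        def __init__(self):
            self.linked_id = None  # identification number registered as the link destination

        def on_link(self, accessory_id):
            self.linked_id = accessory_id

        def on_command(self, sender_id, command):
            # Obey a motion command only when its sender ID matches the
            # identification number registered at link time; otherwise ignore it.
            if self.linked_id is not None and sender_id == self.linked_id:
                return command
            return None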
 The link forming unit communicates with the robot 100 using short-range wireless communication such as NFC. For example, a link may be established by touching the accessory 140 to the robot 100, using their respective short-range wireless communication means. After a link has been established, when the accessory 140 is touched to the robot 100 again, the robot 100 may erase the registered identification number of the linked accessory 140, thereby dissolving the linked state. Further, when a robot 100 that is already in the linked state is touched by a different accessory 140 that is not its link target, it overwrites the registration with the identification number of the new accessory 140. At that point the existing linked state is dissolved and a link with the newly touched accessory 140 is established. The robot 100 may also dissolve the linked state automatically after a predetermined period has elapsed, or after a predetermined period has elapsed since a motion command was last received from the accessory 140.
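 The link lifecycle described here (establish by touch, dissolve by re-touch, overwrite on contact with a different accessory, expire after idling) might be managed as in this sketch (the timeout value and all names are assumptions):

    import time

    LINK_TIMEOUT_SEC = 600.0  # assumed idle period after which the link dissolves

    class LinkManager:
        def __init__(self):
            self.linked_id = None
            self.last_command_time = None

        def on_contact(self, accessory_id):
            if self.linked_id == accessory_id:
                self.linked_id = None          # re-touch by the linked accessory: unlink
            else:
                self.linked_id = accessory_id  # new accessory: overwrite the registration
                self.last_command_time = time.monotonic()

        def on_command(self, accessory_id):
            if self.linked_id == accessory_id:
                self.last_command_time = time.monotonic()

        def expire_if_idle(self):
            # Dissolve the link automatically when no command has arrived for a while.
            if (self.linked_id is not None and self.last_command_time is not None
                    and time.monotonic() - self.last_command_time > LINK_TIMEOUT_SEC):
                self.linked_id = None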
 The acceleration sensor built into the light emitting unit (tip portion) detects the movement of the tip (its speed, acceleration, and direction of movement). In the guidance mode, the user moves the stick, and a motion command is selected according to the trajectory of the tip. For example, if the stick is moved so as to draw a circle around the robot, a motion command instructing the robot to follow that movement is selected. If the stick is swung from in front of the user toward a target direction, a motion command to move in the direction of the swing is selected.
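 One crude way to tell a circle from a swing using the tip's acceleration sensor is sketched below (the thresholds and the flat two-dimensional sample format are assumptions; a real implementation would filter and integrate the raw readings):

    import math

    def classify_gesture(samples):
        # samples: list of (ax, ay) horizontal acceleration readings from the tip.
        headings = [math.atan2(ay, ax) for ax, ay in samples if (ax, ay) != (0.0, 0.0)]
        if len(headings) < 2:
            return None
        total_turn = 0.0
        for a, b in zip(headings, headings[1:]):
            total_turn += (b - a + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
        if abs(total_turn) > 1.6 * math.pi:       # heading kept rotating: nearly a full loop
            return "circle"
        if max(math.hypot(ax, ay) for ax, ay in samples) > 5.0:  # single strong burst (m/s^2)
            return "swing"
        return None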
 In the calling mode, the accessory 140 instructs the robot 100 with which a link has been established to "search", using a communication means that can reach farther than the communicable range of the transmission unit 144. For example, the accessory 140 transmits a search instruction by a Bluetooth (registered trademark) broadcast, and additionally transmits from the light emitting unit, by infrared communication, a motion command instructing the robot 100 to approach the accessory 140. When the robot 100 arrives near the accessory 140, the robot 100 transmits a signal indicating arrival (called an "arrival notification") toward the reception unit 170 of the accessory 140. After receiving the arrival notification, the accessory 140 regards the calling as complete and stops transmitting the search instruction.
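 The calling-mode exchange can be summarized as a small state machine (an illustrative sketch; the radio and infrared interfaces are assumed, not actual APIs of the disclosed device):

    class CallingSession:
        def __init__(self, radio, infrared):
            self.radio = radio        # wide-range channel (e.g., a Bluetooth broadcast)
            self.infrared = infrared  # directional IR transmitter in the light emitting unit
            self.active = False

        def start(self):
            self.active = True
            self.radio.broadcast("search")

        def tick(self):
            # Called periodically: keep guiding the robot in while the call is active.
            if self.active:
                self.infrared.send("approach")

        def on_arrival_notification(self):
            # The robot has reached the accessory; the calling is complete.
            self.active = False
            self.radio.stop_broadcast()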
 The accessory 140 includes a wireless communication means with an extremely short range, such as NFC (hereinafter, the "first wireless communication means"), a short-range wireless communication means with a wider range than the first wireless communication means (hereinafter, the "second wireless communication means"), and a wireless communication means with a wider range than the second wireless communication means (hereinafter, the "third wireless communication means"). The accessory 140 communicates with the robot 100 by switching to the appropriate communication means according to the mode.
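 This three-tier scheme suggests a simple dispatch from purpose to channel, as in this sketch (the purpose labels are assumptions):

    from enum import Enum, auto

    class Tier(Enum):
        FIRST = auto()   # extremely short range, e.g. NFC (link by touch)
        SECOND = auto()  # short range, e.g. infrared (guidance, walking)
        THIRD = auto()   # widest range, e.g. Bluetooth (calling a distant robot)

    def tier_for(purpose):
        return {
            "link_by_contact": Tier.FIRST,
            "guidance": Tier.SECOND,
            "walk_follow": Tier.SECOND,
            "calling_search": Tier.THIRD,
        }.get(purpose)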
 In the walk mode, the accessory 140 transmits from the light emitting unit a motion command instructing the robot 100 to "follow". Since the distance between the accessory 140 and the robot 100 is relatively short, the second wireless communication means is used. On receiving the motion command instructing it to follow, the robot 100 moves so as to follow the user (the accessory 140) while keeping its distance to the accessory 140 approximately constant. The grip portion of the accessory 140 includes a tactile realization unit (such as a vibrator) for producing the sensation that the robot 100 is connected to the accessory 140 by an invisible lead.
 The tactile realization unit stimulates the sense of touch of the user's hand with so-called tactile or haptics technology. In conjunction with the following state of the robot 100, the tactile realization unit produces the sensation of being pulled by the robot 100, or of the robot 100 moving left and right. For example, when the robot 100 veers to the right, the tactile realization unit produces, in conjunction with that movement, the sensation of being pulled to the right. At the moment its following state changes, the communication unit of the robot 100 transmits information specifying the content of the change (hereinafter, "motion information") to the accessory 140. The tactile realization unit produces a tactile sensation according to the motion information. If the motion information means "moving right", the tactile realization unit stimulates the user's hand through the grip portion so that the user feels as if being pulled to the right.
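 Mapping motion information to a vibration pattern might look like the following sketch (the motion-information labels and the two-motor grip layout are assumptions):

    def haptic_pattern(motion_info):
        # Translate the robot's motion information into per-motor vibration
        # durations (milliseconds); the values are illustrative.
        patterns = {
            "move_right": {"right_motor_ms": 200, "left_motor_ms": 0},
            "move_left":  {"right_motor_ms": 0,   "left_motor_ms": 200},
            "pull_ahead": {"right_motor_ms": 150, "left_motor_ms": 150},
        }
        return patterns.get(motion_info, {"right_motor_ms": 0, "left_motor_ms": 0})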
 When the switch is not being operated, the robot 100 can act autonomously even in the linked state. When a link between the accessory 140 and the robot 100 is established, the tactile realization unit and the visible-light LED may notify the user of the establishment of the link. For example, when a link is established, the tactile realization unit may vibrate the grip portion in a specific vibration pattern, and the visible-light LED may light or blink in a specific color. Similarly, the tactile realization unit and/or the visible-light LED may notify the user of various information, such as the power of the accessory 140 being switched on or off, the transmission of a motion command, and the type of motion command transmitted.
(Second modification of the accessory 140)
 When the switch of the grip portion is pressed, the transmission unit 144 built into the light emitting unit (tip portion) may transmit a directional link signal. When the user points the light emitting unit (tip portion) of the accessory 140 at the robot 100 to be operated and transmits the link signal, and the robot 100 receives the link signal, the robot enters the "linked state". Thereafter, the robot 100 in the linked state follows motion commands transmitted from the accessory 140. After linking the robot 100, the user can control the robot 100, as if it were hypnotized, by moving the light emitting unit (tip portion) of the accessory 140.
 For example, when the user circles the light emitting unit (tip portion) of the accessory 140 in the air with the switch pressed, the instruction selection unit 172 detects the circular motion and its speed with the acceleration sensor built into the light emitting unit (tip portion), and transmits to the robot 100 a motion command instructing circular motion. On receiving the motion command, the robot 100 in the linked state circles on the ground. The circling radius and the moving speed of the robot 100 are linked to the circling radius and speed of the accessory 140; the motion command includes information specifying the circling radius and the circling speed.
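 Deriving a circling radius and speed from the tip trajectory and packing them into a command could be sketched as follows (the trajectory format and the field names of the command payload are assumptions):

    import json
    import math

    def make_circle_command(sample_rate_hz, tip_positions):
        # tip_positions: list of (x, y) tip coordinates sampled at sample_rate_hz.
        xs = [p[0] for p in tip_positions]
        ys = [p[1] for p in tip_positions]
        cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)   # rough circle center
        radius = sum(math.hypot(x - cx, y - cy) for x, y in tip_positions) / len(tip_positions)
        path = sum(math.hypot(x2 - x1, y2 - y1)
                   for (x1, y1), (x2, y2) in zip(tip_positions, tip_positions[1:]))
        speed = path * sample_rate_hz / max(len(tip_positions) - 1, 1)
        return json.dumps({"command": "circle",
                           "radius_m": round(radius, 2),
                           "speed_mps": round(speed, 2)})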
 When the user carries the accessory 140 and moves without pressing the switch, the robot 100 in the linked state chases the accessory 140 (the user). While the switch is not pressed, the transmission unit 144 periodically transmits a motion signal instructing the linked robot 100 to "follow".
 The light emitting unit (tip portion) may incorporate a transmission unit 144 that transmits the link signal in the axial direction of the stick, and a plurality of transmission units 144 that transmit motion signals in the radial direction. During the following operation, the robot 100 chases the user (the accessory 140) while keeping a constant distance. At this time, the robot 100 may express that it is in the state of being "taken along by a lead" by lighting its built-in infrared LED. The light emitting unit may emit light while linked with a robot, or while the switch is pressed.
 The user may control one robot 100 via the transmission unit 144, or may control a plurality of robots 100 together. For example, the user may put a plurality of robots 100 into the linked state by establishing links with the robots 100 to be operated one after another. The user may also instruct that a motion command be transmitted to an unspecified number of robots 100 at once.

Claims (23)

1.  An autonomous behavior-type robot, comprising:
     an operation control unit that selects a motion of the robot;
     a drive mechanism that executes the motion selected by the operation control unit;
     a receiver that receives, from another robot, an ID of the other robot transmitted according to a predetermined short-range wireless communication method; and
     a recognition unit that identifies the other robot from the received ID.
2.  The autonomous behavior-type robot according to claim 1, wherein the operation control unit changes a behavioral characteristic of its own robot when the ID is received.
3.  The autonomous behavior-type robot according to claim 2, wherein the operation control unit selects a predetermined motion associated with the detection event of an unregistered ID, on condition that the received ID is an unregistered ID.
4.  The autonomous behavior-type robot according to claim 2, further comprising a robot detection unit that specifies a direction in which the other robot is present, based on detection signals from a plurality of the receivers,
     wherein the operation control unit changes a behavioral characteristic of the robot according to the specified direction when the ID of the other robot is detected.
5.  The autonomous behavior-type robot according to claim 1, wherein the receiver receives a motion command together with the ID from the other robot, and
     the operation control unit starts a predetermined application program associated with the motion command when the motion command is received.
6.  The autonomous behavior-type robot according to claim 1, further comprising a familiarity management unit that manages familiarity with other robots,
     wherein the familiarity management unit updates the familiarity of the other robot when the ID is received.
7.  The autonomous behavior-type robot according to claim 6, wherein, when an ID is received from another robot whose familiarity is at or below a predetermined threshold, the operation control unit instructs the drive mechanism to move in a direction away from the other robot.
8.  The autonomous behavior-type robot according to claim 1, wherein the receiver receives a motion command transmitted from a portable item of a user according to a predetermined short-range wireless communication method, and
     the operation control unit selects a motion corresponding to the motion command when the motion command is received.
9.  The autonomous behavior-type robot according to claim 8, further comprising a transmitter that transmits the motion command toward another robot when the motion command is received.
10.  The autonomous behavior-type robot according to claim 1, further comprising a transmitter that transmits the ID of its own robot according to the short-range wireless communication method.
11.  An autonomous behavior-type robot, comprising:
     an operation control unit that selects a motion;
     a drive mechanism that executes the motion selected by the operation control unit; and
     a transmitter that transmits an ID identifying its own robot according to a predetermined short-range wireless communication method.
12.  The autonomous behavior-type robot according to claim 11, further comprising a command selection unit that selects a motion command,
     wherein the transmitter transmits the selected motion command to another robot.
13.  An accessory, comprising a transmitter that transmits a motion command to the autonomous behavior-type robot according to claim 8, according to a predetermined short-range wireless communication method.
14.  The accessory according to claim 13, further comprising a receiver that receives, from the autonomous behavior-type robot according to claim 8, the ID of the autonomous behavior-type robot,
     wherein the transmitter transmits the motion command on condition that the ID of the autonomous behavior-type robot has been received.
15.  The accessory according to claim 13, wherein the transmitter transmits the motion command in a plurality of directions within a predetermined range, according to the short-range wireless communication method.
16.  An autonomous behavior-type robot, comprising:
     an operation control unit that selects a motion of the robot;
     a drive mechanism that executes the motion selected by the operation control unit;
     a receiver that receives an ID transmitted from a charger according to a predetermined short-range wireless communication method; and
     a charge monitoring unit that monitors the remaining battery level of a secondary battery,
     wherein, when the remaining battery level falls to or below a predetermined threshold, the operation control unit selects, as a movement target point, the charger among a plurality of chargers that transmits an ID associated with its own robot.
17.  An autonomous behavior-type robot, comprising:
     a communication connection unit that connects to a server by a first wireless communication method, based on access information for the server;
     an operation control unit that determines a motion of the robot;
     a drive mechanism that executes the motion selected by the operation control unit;
     a receiver that receives, from another robot, an ID of the other robot by a second wireless communication method having a shorter communication distance than the first wireless communication method; and
     a transmitter that transmits the access information to the other robot when the ID is received.
18.  The autonomous behavior-type robot according to claim 17, wherein the transmitter transmits the access information on condition that the received ID is an unregistered ID.
19.  An autonomous behavior-type robot, comprising a transmitter that transmits an ID identifying its own robot according to a predetermined short-range wireless communication method,
     wherein a plurality of the transmitters are arranged in a ring on the head of the robot or on a projection formed on the top of the head.
20.  The autonomous behavior-type robot according to claim 19, wherein the transmitter transmits, in addition to the ID of the robot, a position code specifying the mounting position of the transmitter.
21.  An autonomous behavior-type robot, comprising:
     an operation control unit that selects a motion of the robot;
     a drive mechanism that executes the motion selected by the operation control unit; and
     a receiver that receives, from the autonomous behavior-type robot according to claim 20, the ID of the autonomous behavior-type robot,
     wherein the operation control unit instructs the drive mechanism to move to a position at which a position code transmitted from a transmitter installed at a predetermined position on the autonomous behavior-type robot can be received.
22.  A robot control program that causes a computer to exhibit:
     a function of selecting a motion of an autonomous behavior-type robot;
     a function of receiving, from another robot, an ID of the other robot transmitted according to a predetermined short-range wireless communication method; and
     a function of identifying the other robot from the received ID.
23.  A robot control program that causes a computer to exhibit:
     a function of selecting a motion of an autonomous behavior-type robot;
     a function of receiving an ID transmitted from a charger according to a predetermined short-range wireless communication method;
     a function of monitoring the remaining battery level of a secondary battery; and
     a function of selecting, as a movement target point, the charger among a plurality of chargers that transmits an ID associated with its own robot, when the remaining battery level falls to or below a predetermined threshold.
PCT/JP2018/018287 2017-05-11 2018-05-11 Autonomous behavior-type robot, accessory, and robot control program WO2018207908A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019517713A JP6734607B2 (en) 2017-05-11 2018-05-11 Robots, portable items and robot control programs

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-094435 2017-05-11
JP2017094435 2017-05-11

Publications (1)

Publication Number Publication Date
WO2018207908A1 true WO2018207908A1 (en) 2018-11-15

Family

ID=64104804

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/018287 WO2018207908A1 (en) 2017-05-11 2018-05-11 Autonomous behavior-type robot, accessory, and robot control program

Country Status (2)

Country Link
JP (1) JP6734607B2 (en)
WO (1) WO2018207908A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20230012124A (en) * 2021-07-14 2023-01-26 엘지전자 주식회사 Moving robot, docking station and robot system including the same

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03284103A (en) * 1990-03-28 1991-12-13 Shinko Electric Co Ltd Charging control system for mobile robot
JP2001179665A (en) * 1999-12-24 2001-07-03 Toshiba Corp Intelligent robot
JP2001212782A (en) * 2000-01-31 2001-08-07 Sony Corp Robot device and control method for robot device
JP2002233978A (en) * 2001-02-02 2002-08-20 Sony Corp Movable robot and data communication control method between movable robots
JP2003140710A (en) * 2001-10-29 2003-05-16 Sony Corp Information home electrical appliance control system, database server, information home electrical appliance, and information home electrical appliance control method
JP2007160473A (en) * 2005-12-15 2007-06-28 Fujitsu Ltd Interactive object identifying method in robot and robot

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009050970A (en) * 2007-08-28 2009-03-12 Nec Access Technica Ltd Robot system and rescue robot
JP5282457B2 (en) * 2008-06-23 2013-09-04 富士通株式会社 Rescue robot system, rescue method, and rescue robot

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210064035A1 (en) * 2019-08-30 2021-03-04 Lg Electronics Inc. Method of moving robot in administrator mode and robot of implementing method
US11635759B2 (en) * 2019-08-30 2023-04-25 Lg Electronics Inc. Method of moving robot in administrator mode and robot of implementing method
TWI742644B (en) * 2020-05-06 2021-10-11 東元電機股份有限公司 Following mobile platform and method thereof
CN112677161A (zh) * 2020-12-14 2021-04-20 西安新程万创信息技术有限公司 Intelligent robot for financial consultation

Also Published As

Publication number Publication date
JPWO2018207908A1 (en) 2019-11-07
JP6734607B2 (en) 2020-08-05

Similar Documents

Publication Publication Date Title
JP6402320B2 (en) An autonomous behavioral robot
JP6884401B2 (en) Autonomous robot wearing clothes
WO2018207908A1 (en) Autonomous behavior-type robot, accessory, and robot control program
JP2019072495A (en) Autonomous travel robot understanding physical contact
WO2017169826A1 (en) Autonomous behavior robot that performs welcoming behavior
WO2018047900A1 (en) Autonomous robot which receives guest
WO2018043235A1 (en) Autonomous behavior type robot recognizing direction of sound source
WO2018047802A1 (en) Autonomous robot that maintains sense of natural distance
JP6671577B2 (en) An autonomous robot that identifies people
JP7177497B2 (en) Autonomous action robot that looks at the opponent
JP7236142B2 (en) Autonomous action robot
WO2018097089A1 (en) Autonomously acting robot changing pupils
JP2019075168A (en) Autonomously behaving robot seeking coolness
JP6755447B2 (en) Autonomous action robot with emergency stop function
JP2019214119A (en) Joint structure excellent for robot joint
WO2020129992A1 (en) Robot, charging station for robot, and landmark device
WO2018216710A1 (en) Image processing device for correcting distortion in omnidirectional image and autonomous travel robot equipped with same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18797855

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019517713

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18797855

Country of ref document: EP

Kind code of ref document: A1