WO2020129992A1 - Robot, charging station for robot, and landmark device


Info

Publication number: WO2020129992A1
Application number: PCT/JP2019/049459
Authority: WIPO (PCT)
Prior art keywords: robot, light emission, light, unit, charging
Other languages: French (fr), Japanese (ja)
Inventors: 要 林, 淳哉 林, 克則 藁谷, 博教 小川, 航平 川崎, 智彰 横山
Applicant: Groove X株式会社
Related applications: JP2020561465A (granted as JP7414285B2), JP2023216323A (published as JP2024045110A)

Classifications

    • B25J 19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; safety devices combined with or specially adapted for use in connection with manipulators
    • G05D 1/02: Control of position or course in two dimensions (control of position, course, altitude, or attitude of land, water, air, or space vehicles)
    • H02J 7/00: Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries
    • H02J 7/02: Circuit arrangements for charging batteries from AC mains by converters

Definitions

  • the present invention relates to a robot that autonomously selects an action according to an internal state or an external environment, and a charging station for charging the robot.
  • the movement of the robot is a prerequisite for the robot to have a presence as a companion like a pet.
  • the robot is expected to have a function of searching for a charging station and receiving appropriate charging without the assistance of the user (see Patent Documents 3 and 4).
  • the charging station shown in Patent Document 3 displays a predetermined mark to inform the robot of its location.
  • the robot identifies the charging station by visually recognizing this mark (see paragraphs [0093][0094] and the like). Further, the charging station of Patent Document 3 also notifies the robot of the location of the charging station by generating sound waves or radio waves (see paragraphs [0099][0100] and the like).
  • the charging station is expected to be installed in a wide variety of environments. Therefore, even if the charging station displays a predetermined mark, the robot cannot always find the mark. For example, in a room where a mark similar to that of the charging station is drawn on a wall, the robot may misidentify the position of the charging station. The same applies to radio waves and the like: the robot may mistake another, similar signal for the signal from the charging station.
  • the present invention is an invention completed based on the recognition of the above problems, and its main purpose is to provide a technique for increasing the accuracy with which a robot identifies a charging station.
  • a charging station in an aspect of the invention includes a charging space having a power supply terminal, a charging control unit that charges a secondary battery built into the robot when the robot and the power supply terminal are connected in the charging space, a light source, a light emission instruction receiving unit that receives from the robot a light emission signal designating a light emission mode, and a light emission control unit that changes the light emission mode of the light source according to the designated mode.
  • a robot in an aspect of the invention includes an operation control unit that selects a motion of the robot, a drive mechanism that executes the selected motion, a light emission instruction transmission unit that transmits a light emission signal designating a light emission mode, and a light recognition unit that recognizes external light.
  • when the secondary battery built into the robot is to be charged, the light emission instruction transmission unit transmits a light emission signal, the light recognition unit identifies external light matching the designated light emission mode, and the operation control unit determines the movement direction of the robot by treating the light emission point of that light as the location point of the charging station.
  • a landmark device in an aspect of the invention includes a light source, a light emission instruction receiving unit that receives from a robot a light emission signal designating a light emission mode, and a light emission control unit that changes the light emission mode of the light source according to the designated mode.
  • a guidance system in an aspect of the invention includes a robot, a landmark device, and a charging station.
  • the robot includes a light recognition unit that recognizes external light, an operation control unit that selects a motion of the robot, a drive mechanism that executes the selected motion, and a light emission instruction transmission unit that transmits a light emission signal designating a light emission mode.
  • when the secondary battery built into the robot is to be charged, the light emission instruction transmission unit transmits a light emission signal, the light recognition unit identifies external light matching the designated light emission mode, and the operation control unit determines the movement direction of the robot with the light emission point of that light as the movement target point.
  • the landmark device includes a light source, a light emission instruction receiving unit that receives a light emission signal from the robot, and a light emission control unit that changes the light emission mode of the light source according to the light emission mode designated by the light emission signal.
  • the charging station includes a charging space having a power supply terminal, a charging control unit that charges the secondary battery of the robot when the robot and the power supply terminal are connected in the charging space, a light source, a light emission instruction receiving unit that receives a light emission signal from the robot, and a light emission control unit that changes the light emission mode of the light source according to the light emission mode designated by the signal.
  • the light emission instruction transmission unit of the robot transmits a first light emission signal that specifies the first light emission mode.
  • the light emission control unit of the landmark device changes the light emission mode of the light source in accordance with the first light emission mode designated by the first light emission signal.
  • the light recognition unit of the robot identifies external light matching the first light emission mode, and the operation control unit of the robot determines the moving direction of the robot with the light emission point of that light as the movement target point.
  • the light emission instruction transmission unit of the robot transmits a second light emission signal that specifies the second light emission mode when the robot reaches the location point of the landmark device.
  • the light emission control unit of the charging station changes the light emission mode of the light source according to the second light emission mode designated by the second light emission signal.
  • the light recognition unit of the robot identifies external light matching the second light emission mode, and the operation control unit of the robot determines the next movement direction of the robot with the light emission point of that light as the movement target point.
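  • as a concrete illustration of the relayed guidance sequence above, the following is a minimal Python sketch. The class and method names (LightSource, SimpleRobot, find_light, move_toward, relay_guidance) are illustrative assumptions; the patent defines only the functional units, not an implementation.

```python
# A minimal sketch of the two-stage (landmark -> station) guidance above.
# Names and the mode encoding are assumptions for illustration only.

class LightSource:
    """Stands in for the landmark device or the station's light emitting unit."""
    def __init__(self, name, position):
        self.name, self.position, self.mode = name, position, None

    def emit(self, mode):
        # light emission control unit: change the mode designated by the signal
        self.mode = mode

class SimpleRobot:
    def __init__(self):
        self.position = (0.0, 0.0)

    def find_light(self, mode, sources):
        # light recognition unit: identify external light matching the mode
        return next(s for s in sources if s.mode == mode)

    def move_toward(self, point):
        # operation control unit: the light emission point is the movement target
        self.position = point

def relay_guidance(robot, landmark, station):
    sources = [landmark, station]
    # stage 1: first light emission signal -> landmark emits -> robot approaches
    landmark.emit(("blink", 0.5, "red"))
    robot.move_toward(robot.find_light(("blink", 0.5, "red"), sources).position)
    # stage 2: on reaching the landmark, second signal -> station emits
    station.emit(("blink", 1.0, "yellow"))
    robot.move_toward(robot.find_light(("blink", 1.0, "yellow"), sources).position)

relay_guidance(SimpleRobot(), LightSource("landmark", (2, 3)), LightSource("station", (5, 1)))
```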
  • a guidance system in another aspect of the invention includes a first robot, a second robot and a charging station.
  • each of the first robot and the second robot includes a light recognition unit that recognizes external light, an operation control unit that selects a motion of the robot, a drive mechanism that executes the selected motion, a light emission instruction transmission unit that transmits a light emission signal designating a light emission mode, a light source, a light emission instruction receiving unit that receives a light emission signal from the other robot, and a light emission control unit that changes the light emission mode of the light source according to the designated mode.
  • when the secondary battery built into a robot is to be charged, its light emission instruction transmission unit transmits a light emission signal, the light recognition unit identifies external light matching the designated light emission mode, and the operation control unit determines the movement direction with the light emission point of that light as the movement target point.
  • the charging station includes a charging space having a power supply terminal, a charging control unit that charges the secondary battery of the robot when the robot and the power supply terminal are connected in the charging space, a light source, a light emission instruction receiving unit that receives a light emission signal from the robot, and a light emission control unit that changes the light emission mode of the light source according to the light emission mode designated by the signal.
  • the light emission instruction transmission unit of the first robot transmits a first light emission signal designating the first light emission mode.
  • the light emission control unit of the second robot changes the light emission mode of the light source of the second robot according to the first light emission mode designated by the first light emission signal.
  • the light recognition unit of the first robot identifies external light matching the first light emission mode, and the operation control unit of the first robot determines the moving direction of the first robot with the light emission point of that light as the movement target point.
  • the light emission instruction transmission unit of the first robot transmits a second light emission signal that specifies the second light emission mode when the first robot reaches the location point of the second robot.
  • the light emission control unit of the charging station changes the light emission mode of the light source according to the second light emission mode designated by the second light emission signal.
  • the light recognition unit of the first robot identifies external light matching the second light emission mode, and the operation control unit of the first robot determines the next movement direction of the first robot with the light emission point of that light as the movement target point.
  • the robot can easily identify the location point of the charging station.
  • FIG. 6 is a schematic diagram for explaining a method of identifying a position and a traveling direction of a robot.
  • FIG. 1 is a diagram for explaining the outline of the charging system 10.
  • the charging system 10 includes a charging station 500 capable of simultaneously charging two robots 100.
  • the charging station may be simply referred to as “station”.
  • the robot 100 is a wheel-running autonomous robot.
  • the robot 100 includes two front wheels and one rear wheel.
  • the left and right front wheels are drive wheels, and the rear wheel is a driven wheel consisting of a caster (details will be described later).
  • the station 500 serves as a nest (bed) for multiple robots 100.
  • Two charging spaces 502 (a left space 502L and a right space 502R) are arranged side by side and close to each other so that the two robots 100 can be charged next to each other in a friendly manner.
  • the robot 100 returns to the nest for charging, and faces forward while charging as if to appeal to its surroundings.
  • the robot 100 backs into either of the two charging spaces 502; that is, it enters with the caster leading.
  • the charging space 502 is provided with a base 504 onto which the caster rides. With the caster on the base 504, the power supply terminal of the station 500 and the charging terminal of the robot 100 are stably connected, and charging becomes possible.
  • the robot 100 transmits a light emission signal designating a “light emission mode” such as a blinking cycle to the station 500, and the station 500 causes the light emission unit to emit light in the instructed light emission mode.
  • a “light emission mode” in the present embodiment is defined as a combination of one or more of a blinking cycle of a light emitting unit (light source), a light emitting amount, a light emitting color, and a light emitting pattern.
  • the light emission pattern may be information designating a sequence of on and off durations, such as "on for 0.5 seconds, off for 1.5 seconds, on for 1.0 second, off for 0.5 seconds", or, for a light emitting unit with three types of light sources (red, yellow, and violet), information such as "turn on the red light source and the yellow light source".
  • the light generated by the station 500 will be referred to as “guide light”.
  • the robot 100 detects the guide light and specifies the light emission point of the guide light as the location point of the station 500.
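  • to make the notion of a "light emission mode" concrete, the following is a hedged Python sketch that represents a mode as a small data structure. The field names are assumptions; the patent only defines the mode as a combination of one or more of blinking cycle, light amount, color, and pattern.

```python
from dataclasses import dataclass
from typing import Optional, Sequence

# Sketch of a "light emission mode" as data. Field names are illustrative.
@dataclass(frozen=True)
class EmissionMode:
    blink_cycle_s: Optional[float] = None      # blinking cycle in seconds
    amount: Optional[float] = None              # relative light amount, 0..1
    colors: Optional[Sequence[str]] = None      # e.g. ("red", "yellow")
    pattern: Optional[Sequence[float]] = None   # alternating on/off durations (s)

# The pattern quoted above: on 0.5 s, off 1.5 s, on 1.0 s, off 0.5 s,
# with the red and yellow sources of a three-color unit turned on.
mode = EmissionMode(colors=("red", "yellow"), pattern=(0.5, 1.5, 1.0, 0.5))
```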
  • a basic configuration of the robot 100 will be described with reference to FIGS. 2 to 5, and then a method of identifying the location point of the station 500 by the robot 100 will be mainly described.
  • FIG. 2 is a diagram showing the appearance of the robot 100.
  • FIG. 2A is a front view and FIG. 2B is a side view.
  • the robot 100 is an autonomous action type robot that determines an action based on an external environment and an internal state.
  • the external environment is recognized by various sensors such as a camera and a thermo sensor 115.
  • the internal state is quantified as various parameters expressing the emotion of the robot 100.
  • the robot 100 sets the indoor area of the owner's home as an action range.
  • a person involved in the robot 100 is called a “user”. Of the users, the owner or administrator of the robot 100 is called the “owner”.
  • the body 104 of the robot 100 has a rounded shape as a whole and includes an outer cover 314 formed of a soft, elastic material such as urethane, rubber, resin, or fiber.
  • the robot 100 may be dressed.
  • the total weight of the robot 100 is about 5 to 15 kilograms, and its height is about 0.5 to 1.2 meters. Attributes such as moderate weight, roundness, softness, and pleasant feel make the robot 100 easy to hold and make the user want to hold it.
  • the robot 100 includes a pair of front wheels 102 (left wheel 102a, right wheel 102b) and one rear wheel 103.
  • the front wheels 102 are driving wheels and the rear wheels 103 are driven wheels.
  • the front wheels 102 do not have a steering mechanism, but the rotation speed and rotation direction of the left and right wheels can be individually controlled.
  • the rear wheel 103 is a caster and is rotatable to move the robot 100 back and forth and left and right.
  • the rear wheel 103 may be an omni wheel.
  • the front wheel 102 and the rear wheel 103 can be completely housed in the body 104 by the drive mechanism (rotating mechanism, link mechanism).
  • a pair of left and right covers 312 is provided on the lower half of the body 104.
  • the cover 312 is made of a flexible and elastic resin material (rubber, silicone rubber, or the like), constitutes a soft body, and can accommodate the front wheel 102.
  • the cover 312 is formed with a slit 313 (opening) that opens from the side surface to the front surface, and the front wheel 102 can be advanced through the slit 313 and exposed to the outside.
  • when the wheels are retracted, the robot 100 cannot move; the body 104 descends and sits on the floor surface F. In this seated state, a flat seating surface 108 (ground contact bottom surface) formed on the bottom of the body 104 contacts the floor surface F.
  • the robot 100 has two arms 106. Although there is a hand at the tip of the arm 106, it does not have a function of grasping an object.
  • the arm 106 can perform simple operations such as raising, bending, waving, and vibrating by driving an actuator described later.
  • the two arms 106 can be individually controlled.
  • a face area 116 is exposed in front of the head of the robot 100.
  • the face area 116 is provided with two eyes 110.
  • the eye 110 is a device capable of displaying an image with a liquid crystal element or an organic EL element, and expressing a line of sight or a facial expression by moving a pupil or an eyelid displayed as an image.
  • a nose 109 is provided in the center of the face area 116.
  • the nose 109 is provided with an analog stick that can detect the directions up, down, left, and right, as well as being pushed in.
  • the robot 100 is provided with a plurality of touch sensors, and a user's touch can be detected on almost the entire area of the robot 100, such as the head, torso, buttocks, and arms.
  • the robot 100 is equipped with various sensors such as a microphone array and an ultrasonic sensor capable of specifying the sound source direction. It also has a built-in speaker and can emit simple voice.
  • a horn 112 is attached to the head of the robot 100.
  • an omnidirectional camera 113 is attached to the horn 112 so that the entire region above the robot 100 can be imaged at once.
  • the horn 112 also has a built-in thermo sensor 115 (thermo camera).
  • the horn 112 is provided with a plurality of modules (not shown) for infrared communication, installed in a ring facing the surroundings. The robot 100 can therefore perform infrared communication while recognizing direction.
  • the horn 112 is provided with a switch for emergency stop, and the user can perform an emergency stop of the robot 100 by pulling out the horn 112.
  • FIG. 3 is a sectional view schematically showing the structure of the robot 100.
  • the body 104 includes a main body frame 310, a pair of arms 106, a pair of covers 312, and an outer cover 314.
  • the main body frame 310 includes a head frame 316 and a body frame 318.
  • the head frame 316 has a hollow hemispherical shape and forms the head skeleton of the robot 100.
  • the body frame 318 has a rectangular tube shape and forms a body skeleton of the robot 100.
  • the lower end of the body frame 318 is fixed to the lower plate 334.
  • the head frame 316 is connected to the body frame 318 via the connection mechanism 330.
  • the body frame 318 constitutes the axis of the body 104.
  • the body frame 318 is configured by fixing a pair of left and right side plates 336 to the lower plate 334, and supports the pair of arms 106 and the internal mechanism.
  • the battery 118, the control circuit 342, various actuators, and the like are housed inside the body frame 318.
  • the bottom surface of the lower plate 334 forms the seating surface 108.
  • the body frame 318 has an upper plate 332 on its upper part.
  • a cylindrical support portion 319 having a bottom is fixed to the upper plate 332.
  • the upper plate 332, the lower plate 334, the pair of side plates 336, and the support portion 319 form a body frame 318.
  • the outer diameter of the support portion 319 is smaller than the distance between the left and right side plates 336.
  • the pair of arms 106 is integrally assembled with the annular member 340 to form an arm unit 350.
  • the annular member 340 is ring-shaped, and the pair of arms 106 are attached to it so as to extend radially outward on opposite sides of its center line.
  • the annular member 340 is coaxially inserted into the support portion 319 and placed on the upper end surfaces of the pair of side plates 336.
  • the arm unit 350 is supported by the body frame 318 from below.
  • the head frame 316 has a yaw axis 321, a pitch axis 322, and a roll axis 323.
  • rotation of the head frame 316 around the yaw axis 321 realizes a head-shaking motion, rotation around the pitch axis 322 realizes nodding, look-up, and look-down motions, and rotation (rolling) around the roll axis 323 realizes the motion of tilting the neck to the left and right.
  • the position and angle of each axis in the three-dimensional space may change according to the driving mode of the connection mechanism 330.
  • the connection mechanism 330 includes a link mechanism and is driven by a plurality of motors installed on the body frame 318.
  • the body frame 318 houses the wheel drive mechanism 370.
  • the wheel drive mechanism 370 includes a front wheel drive mechanism and a rear wheel drive mechanism that move the front wheel 102 and the rear wheel 103 into and out of the body 104, respectively.
  • the front wheels 102 and the rear wheels 103 function as a “moving mechanism” that moves the robot 100.
  • each front wheel 102 has a direct drive motor at its center, so the left wheel 102a and the right wheel 102b can be driven individually.
  • the front wheel 102 is rotatably supported by the wheel cover 105, and the wheel cover 105 is rotatably supported by the body frame 318.
  • the pair of covers 312 is provided so as to cover the body frame 318 from the left and right, and has a smooth curved surface shape so that the outline of the body 104 is rounded.
  • a closed space is formed between the body frame 318 and the cover 312, and the closed space is a storage space S for the front wheels 102.
  • the rear wheel 103 is housed in a housing space provided in the lower rear part of the body frame 318.
  • the outer cover 314 covers the main body frame 310 and the pair of arms 106 from the outside.
  • the outer cover 314 is thick enough for a person to feel its elasticity and is formed of a stretchable material such as urethane sponge. As a result, when the user holds the robot 100, he or she feels an appropriate softness and can make natural physical contact, as with a pet.
  • the outer cover 314 is attached to the main body frame 310 in such a manner that the cover 312 is exposed.
  • an opening 390 is provided at the upper end of the outer cover 314, and the horn 112 is inserted through it.
  • a touch sensor is arranged between the main body frame 310 and the outer cover 314.
  • a touch sensor is embedded in the cover 312.
  • Each of these touch sensors is a capacitance sensor and detects a touch in almost the entire area of the robot 100.
  • the touch sensor may be embedded in the outer cover 314 or may be provided inside the main body frame 310.
  • the arm 106 has a first joint 352 and a second joint 354, with an arm portion 356 between the two joints and a hand 358 at the tip of the second joint 354.
  • the first joint 352 corresponds to the shoulder joint
  • the second joint 354 corresponds to the wrist joint.
  • a motor is provided in each joint to drive the arm 356 and the hand 358, respectively.
  • the drive mechanism for driving the arm 106 includes these motors and their drive circuit 344.
  • FIG. 4 is a hardware configuration diagram of the robot 100.
  • the robot 100 includes an internal sensor 128, a communication device 126, a storage device 124, a processor 122, a drive mechanism 120, and a battery 118.
  • the drive mechanism 120 includes the connection mechanism 330 and the wheel drive mechanism 370 described above.
  • the processor 122 and the storage device 124 are included in the control circuit 342.
  • Each unit is connected to each other by a power supply line 130 and a signal line 132.
  • the battery 118 supplies power to each unit via the power supply line 130.
  • Each unit sends and receives a control signal via a signal line 132.
  • the battery 118 is a lithium-ion secondary battery and is a power source of the robot 100.
  • the internal sensor 128 is an assembly of various sensors built in the robot 100. Specifically, it is a camera, a microphone array, a distance measuring sensor (infrared sensor), a thermo sensor 115, a touch sensor, an acceleration sensor, an atmospheric pressure sensor, an odor sensor, or the like.
  • the touch sensors cover most of the area of the body 104 and detect a user's touch based on changes in capacitance.
  • the odor sensor is a known sensor that applies the principle that electric resistance changes due to adsorption of molecules that are the origin of odor.
  • the communication device 126 is a communication module that performs wireless communication with various external devices.
  • the storage device 124 includes a non-volatile memory and a volatile memory, and stores a computer program and various setting information.
  • the processor 122 is a means for executing a computer program.
  • the drive mechanism 120 includes a plurality of actuators. In addition to this, a display and speakers are also installed.
  • the drive mechanism 120 mainly controls the wheels and the head.
  • the drive mechanism 120 can change the moving direction and moving speed of the robot 100, and can also move the wheels up and down. When the wheels are lifted, the wheels are completely stored in the body 104, and the robot 100 comes into contact with the floor surface F at the seating surface 108 and becomes seated.
  • the drive mechanism 120 also controls the arm 106.
  • FIG. 5 is a functional block diagram of the robot system 300.
  • the robot system 300 includes a robot 100, a server 200, and a plurality of external sensors 114.
  • each constituent element of the robot 100 and the server 200 is realized by hardware, including a computing unit such as a CPU (Central Processing Unit) and various coprocessors, a storage device such as memory and storage, and wired or wireless communication lines connecting them, and by software that is stored in the storage device and supplies processing instructions to the computing unit.
  • the computer program may be configured by a device driver, an operating system, various application programs located in their upper layers, and a library that provides common functions to these programs. Each block described below is not a hardware-based configuration but a function-based block.
  • Some of the functions of the robot 100 may be realized by the server 200, and some or all of the functions of the server 200 may be realized by the robot 100.
  • a plurality of external sensors 114 are installed in advance in the house.
  • the server 200 manages the external sensor 114 and provides the detection value acquired by the external sensor 114 to the robot 100 as needed.
  • the robot 100 determines a basic action based on the information obtained from the internal sensor 128 and the plurality of external sensors 114.
  • the external sensor 114 is for reinforcing the sensory organs of the robot 100, and the server 200 is for reinforcing the processing capacity of the robot 100.
  • the communication device 126 of the robot 100 may periodically communicate with the server 200, and the server 200 may be responsible for the process of identifying the position of the robot 100 by the external sensor 114 (see also Patent Document 2).
  • the server 200 includes a communication unit 204, a data processing unit 202 and a data storage unit 206.
  • the communication unit 204 is in charge of communication processing with the external sensor 114 and the robot 100.
  • the data storage unit 206 stores various data.
  • the data processing unit 202 executes various processes based on the data acquired by the communication unit 204 and the data stored in the data storage unit 206.
  • the data processing unit 202 also functions as an interface for the communication unit 204 and the data storage unit 206.
  • the data storage unit 206 includes a motion storage unit 232 and a personal data storage unit 218.
  • the robot 100 has a plurality of motion patterns (motions). Various motions are defined, such as waving the arm 106, approaching the owner while meandering, and staring at the owner with its head tilted.
  • the motion storage unit 232 stores a “motion file” that defines the motion control content. Each motion is identified by a motion ID. The motion file is also downloaded to the motion storage unit 160 of the robot 100. Which motion is executed may be determined by the server 200 or the robot 100.
  • Most of the motions of the robot 100 are configured as compound motions including a plurality of unit motions.
  • for example, the motion of approaching the owner may be expressed as a combination of a unit motion of turning toward the owner, a unit motion of approaching while raising a hand, a unit motion of approaching while shaking the body, and a unit motion of sitting while raising both hands.
  • this combination of four unit motions realizes a motion of "approaching the owner, raising a hand on the way, and finally sitting down after shaking the body".
  • in a motion file, the rotation angle, angular velocity, and the like of each actuator provided in the robot 100 are defined along a time axis. Various motions are expressed by controlling each actuator over time according to the motion file (actuator control information).
  • the transition time when changing from the previous unit motion to the next unit motion is called “interval".
  • the interval may be defined according to the time required to change the unit motion and the content of the motion.
  • the length of the interval is adjustable.
  • the settings relating to the behavior control of the robot 100, such as when and which motion to select and the output adjustment of each actuator in realizing a motion, are collectively referred to as "action characteristics".
  • the behavior characteristic of the robot 100 is defined by a motion selection algorithm, a motion selection probability, a motion file, and the like.
  • the motion storage unit 232 stores, in addition to motion files, a motion selection table that defines which motion should be executed when various events occur.
  • in the motion selection table, one or more motions and their selection probabilities are associated with each event.
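  • the following is a hedged Python sketch of such a motion selection table. The event names, motion IDs, and probabilities are invented for illustration (the 20%/5% figures echo the example given later in this description); the patent does not specify a data layout.

```python
import random

# Sketch of a motion selection table: each event maps to (motion, probability)
# pairs. All identifiers and numbers here are illustrative assumptions.
MOTION_SELECTION_TABLE = {
    "pleasant_act": [("motion_A", 0.20), ("motion_C", 0.10)],   # e.g. owner hugs
    "temperature_over_30": [("motion_B", 0.05)],
}

def select_motion(event):
    """Return a motion ID for the event, or None (no motion selected)."""
    candidates = MOTION_SELECTION_TABLE.get(event, [])
    r = random.random()
    cumulative = 0.0
    for motion_id, probability in candidates:
        cumulative += probability
        if r < cumulative:
            return motion_id
    return None   # remaining probability mass: default behavior

print(select_motion("pleasant_act"))
```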
  • the personal data storage unit 218 stores user information. Specifically, master information indicating intimacy with the user and physical/behavioral characteristics of the user is stored. Other attribute information such as age and gender may be stored.
  • the robot 100 has an internal parameter called intimacy for each user.
  • when the robot 100 recognizes an action favorable to itself, such as being hugged or spoken to, its intimacy with that user increases.
  • the degree of intimacy is low toward a user who has no involvement with the robot 100, a user who behaves violently toward it, or a user whom it rarely encounters.
  • the data processing unit 202 includes a position management unit 208, a recognition unit 212, an operation control unit 222, an intimacy management unit 220, and a state management unit 244.
  • the position management unit 208 specifies the position coordinates of the robot 100.
  • the state management unit 244 manages various internal parameters such as the charging rate, the internal temperature, and various physical states such as the processing load of the processor 122.
  • the state management unit 244 manages various emotion parameters indicating the emotion (loneliness, curiosity, desire for approval, etc.) of the robot 100. These emotional parameters are always fluctuating.
  • the movement target point of the robot 100 changes according to the emotion parameter. For example, when loneliness is increasing, the robot 100 sets the place where the user is as the movement target point.
  • emotion parameters change over time.
  • various emotion parameters also change due to a response action described later.
  • the emotion parameter indicating loneliness decreases when the robot is "hugged" by the owner, and gradually increases when the robot has not visually recognized the owner for a long time.
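  • the following is a minimal Python sketch of such emotion-parameter bookkeeping, assuming (as the text suggests) values that drift over time and jump in response to events. The decay rates and event deltas are invented for illustration.

```python
# Sketch of emotion parameters that drift over time and react to events.
# Parameter names, rates, and deltas are illustrative assumptions.
emotions = {"loneliness": 0.4, "curiosity": 0.6, "approval_desire": 0.5}

DRIFT_PER_MINUTE = {"loneliness": +0.01}        # grows while no owner is seen
EVENT_DELTAS = {"hugged": {"loneliness": -0.3}}

def clamp(x):
    return min(1.0, max(0.0, x))

def tick(minutes=1):
    # time-based change of emotion parameters
    for name, rate in DRIFT_PER_MINUTE.items():
        emotions[name] = clamp(emotions[name] + rate * minutes)

def on_event(event):
    # event-based change (response actions such as a hug)
    for name, delta in EVENT_DELTAS.get(event, {}).items():
        emotions[name] = clamp(emotions[name] + delta)

tick(30)            # half an hour without visually recognizing the owner
on_event("hugged")  # loneliness drops when the robot is hugged
```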
  • the recognition unit 212 recognizes the external environment.
  • the recognition of the external environment includes various recognitions such as recognition of weather and season based on temperature and humidity, and recognition of shade (safety zone) based on light intensity and temperature.
  • the recognition unit 156 of the robot 100 acquires various environmental information by the internal sensor 128, performs primary processing on the environmental information, and transfers the environmental information to the recognition unit 212 of the server 200.
  • the recognition unit 156 of the robot 100 extracts from the captured image a moving object, in particular an image region corresponding to a person or an animal, and extracts from that region a "feature vector": a set of feature quantities indicating the physical and behavioral features of the moving object.
  • the feature vector component (feature amount) is a numerical value that quantifies various physical and behavioral features. For example, the width of the human eye is digitized in the range of 0 to 1 to form one feature vector component.
  • the method of extracting the feature vector from the captured image of a person is an application of a known face recognition technique.
  • the robot 100 transmits the feature vector to the server 200.
  • the recognition unit 212 of the server 200 determines which person an imaged user corresponds to by comparing the feature vector extracted from the image captured by the robot 100's built-in camera with the feature vectors of users (clusters) registered in advance in the personal data storage unit 218 (user identification processing). The recognition unit 212 also estimates the user's emotion by image recognition of the user's facial expression, and performs user identification processing on moving objects other than people, such as pet cats and dogs.
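  • as a hedged illustration of this user identification processing, the sketch below compares a fresh feature vector against registered users by nearest neighbor with a distance threshold. The vectors, threshold, and metric are assumptions; the patent refers only to known face recognition techniques.

```python
import math

# Sketch of user identification by feature-vector comparison.
# Registered vectors, components, and the threshold are illustrative.
registered = {
    "owner_father": [0.52, 0.31, 0.77],   # e.g. eye width and other normalized
    "owner_child":  [0.24, 0.68, 0.12],   # features, each in the range 0..1
}

def identify(feature_vector, threshold=0.35):
    """Return the closest registered user, or None if no one is close enough."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    best = min(registered, key=lambda uid: dist(feature_vector, registered[uid]))
    return best if dist(feature_vector, registered[best]) <= threshold else None

print(identify([0.50, 0.33, 0.75]))   # -> "owner_father"
print(identify([0.90, 0.90, 0.90]))   # -> None: unknown person
```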
  • the recognition unit 212 recognizes various kinds of response actions performed on the robot 100 and classifies them into pleasant and unpleasant actions.
  • the recognition unit 212 also recognizes the owner's response to the action of the robot 100, and classifies the action into an affirmative/negative response.
  • the pleasant/unpleasant behavior is determined depending on whether the user's behavior is pleasant or uncomfortable as a living thing. For example, hugging is a pleasant act for the robot 100, and being kicked is an unpleasant act for the robot 100.
  • the affirmative/negative reaction is determined by whether the user's response action indicates the user's pleasant or unpleasant feeling. Hugging is an affirmative reaction indicating the user's pleasant feeling, and being kicked is a negative reaction indicating the user's unpleasant feeling.
  • the operation control unit 222 of the server 200 cooperates with the operation control unit 150 of the robot 100 to determine the motion of the robot 100.
  • the operation control unit 222 of the server 200 creates a movement target point of the robot 100 and a movement route therefor.
  • the operation control unit 222 may create a plurality of movement routes and select any one of the movement routes.
  • the motion control unit 222 selects a motion of the robot 100 from a plurality of motions in the motion storage unit 232.
  • a selection probability is associated with each motion for each situation. For example, a selection method may be defined in which motion A is executed with a probability of 20% when the owner performs a pleasant act, and motion B is executed with a probability of 5% when the temperature is 30 degrees or more.
  • the intimacy degree management unit 220 manages the intimacy degree for each user. As described above, the degree of intimacy is registered in the personal data storage unit 218 as a part of personal data. When the pleasant behavior is detected, the intimacy degree management unit 220 increases the intimacy degree with respect to the owner. The intimacy decreases when an offensive behavior is detected. Moreover, the intimacy of owners who have not been visually recognized for a long period of time gradually decreases.
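  • the following Python sketch restates this per-user intimacy bookkeeping. The step sizes and decay rate are invented; the patent states only the directions of change (up on pleasant acts, down on unpleasant acts and long absence).

```python
# Sketch of per-user intimacy management. Scale, deltas, and decay are
# illustrative assumptions.
intimacy = {"owner_father": 60, "owner_child": 75}   # 0..100 per user

def on_response_act(user, pleasant):
    # pleasant behavior raises intimacy; offensive behavior lowers it
    delta = +5 if pleasant else -10
    intimacy[user] = max(0, min(100, intimacy[user] + delta))

def daily_decay(seen_today):
    # intimacy toward owners not visually recognized gradually decreases
    for user in intimacy:
        if user not in seen_today:
            intimacy[user] = max(0, intimacy[user] - 1)

on_response_act("owner_child", pleasant=True)   # hug detected
daily_decay(seen_today={"owner_child"})         # father not seen today
```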
  • the robot 100 includes a communication unit 142, a data processing unit 136, a data storage unit 148, an internal sensor 128, and a drive mechanism 120.
  • the communication unit 142 corresponds to the communication device 126 (see FIG. 4) and is in charge of communication processing with the external sensor 114, the server 200, and the other robot 100.
  • the data storage unit 148 stores various data.
  • the data storage unit 148 corresponds to the storage device 124 (see FIG. 4).
  • the data processing unit 136 executes various processes based on the data acquired by the communication unit 142 and the data stored in the data storage unit 148.
  • the data processing unit 136 corresponds to the processor 122 and a computer program executed by the processor 122.
  • the data processing unit 136 also functions as an interface for the communication unit 142, the internal sensor 128, the drive mechanism 120, and the data storage unit 148.
  • the data storage unit 148 includes a motion storage unit 160 that defines various motions of the robot 100.
  • Various motion files are downloaded from the motion storage unit 232 of the server 200 to the motion storage unit 160 of the robot 100.
  • the motion is identified by the motion ID.
  • the operation timing, operation time, operation direction, and the like of the various actuators are defined in time series in the motion file.
  • Various data may be downloaded from the personal data storage unit 218 to the data storage unit 148.
  • the data processing unit 136 includes a recognition unit 156 and an operation control unit 150.
  • the operation control unit 150 of the robot 100 cooperates with the operation control unit 222 of the server 200 to determine the motion of the robot 100.
  • the server 200 may determine some motions and the robot 100 may determine the others. Basically the robot 100 determines its own motions, but the server 200 may determine them when the processing load on the robot 100 is high.
  • the base motion may be determined in the server 200 and the additional motion may be determined in the robot 100. How the server 200 and the robot 100 share the motion determination process may be designed according to the specifications of the robot system 300.
  • the operation control unit 150 of the robot 100 instructs the drive mechanism 120 to execute the selected motion.
  • the drive mechanism 120 controls each actuator according to the motion file.
  • the operation control unit 150 can also execute a motion of raising both arms 106 as a gesture asking for a "hug" when a user with a high degree of intimacy is nearby, and, when it tires of the "hug", can express dislike of the hug by alternately repeating reverse rotation and stopping of the left and right front wheels 102 while they remain retracted.
  • the drive mechanism 120 drives the front wheel 102, the arm 106, and the neck (head frame 316) in accordance with an instruction from the operation control unit 150 to cause the robot 100 to express various motions.
  • the recognition unit 156 of the robot 100 interprets external information obtained from the internal sensor 128.
  • the recognition unit 156 can perform visual recognition (visual part), odor recognition (olfactory part), sound recognition (auditory part), and tactile recognition (tactile part).
  • the recognition unit 156 extracts the feature vector from the captured image of the moving object.
  • the feature vector is a set of parameters (feature amounts) indicating the physical features and behavioral features of the moving object.
  • the recognition unit 156 identifies the user from the feature vector based on a known technique described in Patent Document 2 or the like.
  • the recognition unit 156 of the robot 100 may select or extract the information necessary for recognition, while interpretation processing such as determination is executed by the recognition unit 212 of the server 200.
  • the recognition processing may be performed only by the recognition unit 212 of the server 200, only by the recognition unit 156 of the robot 100, or by both sides sharing roles as described above.
  • when a strong impact is applied to the robot 100, the recognition unit 156 detects it via the touch sensor and the acceleration sensor, and the recognition unit 212 of the server 200 recognizes that a "violent act" has been performed by a user in the vicinity.
  • likewise, the recognition unit 212 of the server 200 may recognize that a "calling action" has been performed on the robot. Further, when a temperature around body temperature is detected, it recognizes that a "contact action" has been performed by a user, and when an upward acceleration is detected while contact is recognized, it recognizes that a "hug" has been performed.
  • physical contact may be sensed when the user lifts the body 104, and a hug may be recognized when the load applied to the front wheels 102 decreases.
  • the robot 100 acquires the action of the user as physical information by the internal sensor 128, and the recognition unit 212 of the server 200 determines the comfort/discomfort.
  • the recognition unit 212 of the server 200 also executes a user identification process based on the feature vector.
  • the recognition unit 212 of the server 200 recognizes various responses of the user to the robot 100. Some typical response actions among various response actions are associated with pleasantness or discomfort, affirmation or denial. In general, most of the pleasant behaviors are affirmative reactions, and most of the offensive behaviors are negative responses. Pleasure/discomfort is related to intimacy, and positive/negative reactions influence behavior selection of the robot 100.
  • the intimacy degree management unit 220 of the server 200 changes the intimacy degree with respect to the user according to the response action recognized by the recognition unit 156.
  • the degree of intimacy with respect to the user who has performed a pleasant act increases, and the degree of intimacy with respect to the user who has performed an unpleasant act decreases.
  • FIG. 6 is a diagram showing a state in which the robot 100 is wearing the outer cover 314.
  • FIG. 6A is a right side view, FIG. 6B is a front view, and FIG. 6C is a rear view.
  • the appearance of the robot 100 is substantially symmetrical.
  • An accommodation opening 377 for accommodating the rear wheel 103 is provided below the rear portion of the body frame 318 of the robot 100.
  • a pair of charging terminals 510 are provided to the left and right of the accommodation port 377.
  • the base end of the charging terminal 510 is located inside the body frame 318, and is connected to the charging circuit via a wire (not shown).
  • the tip of the charging terminal 510 is a disk of slightly enlarged diameter, giving the terminal a button-like shape.
  • the outer cover 314 is configured by sewing the outer cover body 420 and the elastic mounting portion 422 together. Both the outer cover body 420 and the elastic mounting portion 422 are made of a flexible material.
  • the outer cover body 420 includes a bag-shaped portion 424 that covers the head frame 316, a pair of hand portions 426 extending downward from the left and right side surfaces of the bag-shaped portion 424, an extension portion 428 extending downward from the front surface of the bag-shaped portion 424, and an extension portion 430 extending downward from the back surface of the bag-shaped portion 424.
  • An opening 432 for exposing the face region 116 is provided on the front surface side of the bag-shaped portion 424.
  • the elastic mounting portion 422 constitutes the bottom portion of the outer cover 314, and connects the front and rear extending parts 428 and 430 of the outer cover main body 420 below.
  • the elastic mounting portion 422 is provided with an opening 434 at a position corresponding to the accommodation port 377.
  • a pair of holes 436 are formed below the rear portion of the elastic mounting portion 422.
  • the hole 436 has a narrow width shape like a button hole, but since the elastic mounting portion 422 is flexible, it can be widened in the width direction.
  • a pair of charging terminals 510 are inserted into these holes 436. After the charging terminal 510 is inserted into the hole 436, the hole 436 returns to its original narrow shape due to the elastic force. This prevents the head of the charging terminal 510 from being caught around the hole 436 and the outer cover 314 from coming off. That is, the charging terminal 510 is a terminal for charging and also a member for fixing the outer cover 314.
  • An infrared sensor 172 and a pair of microphones 174 are provided as an internal sensor 128 on the rear cover 107 (tail) of the robot 100. That is, the infrared sensor 172 is provided in the center of the rear cover 107, the left microphone 174L is provided on the left side, and the right microphone 174R is provided on the right side. These face the rear of the robot 100 with the rear cover 107 open and the rear wheel 103 exposed. The infrared sensor 172 and the pair of microphones 174 are used for guidance control when the robot 100 enters the station 500.
  • FIG. 7 is a perspective view showing the appearance of the station 500.
  • the station 500 incorporates the server 200 and includes decorative members such as a base 504, a main body 512, and a pair of back panels 508 (left panel 508L, right panel 508R).
  • the station 500 provides a charging function and a server function in a single housing.
  • the base 504 has a rectangular shape in plan view, and charging spaces 502 are provided on the left and right.
  • the base 504 includes a power supply terminal 530. By connecting the power supply terminal 530 and the charging terminal 510 of the robot 100, electric power is supplied from the station 500 to the robot 100.
  • a main body 512 is erected at the center of the upper surface of the base 504.
  • the main body 512 has a housing 514 whose upper half is enlarged.
  • the pair of back panels 508 are arranged on the left and right of the front surface of the housing 514.
  • the back panel 508 is detachably attached to the main body 512 via a fixing member 509.
  • the fixing member 509 is an arm-shaped member, and one end thereof is detachably fixed to the back surface of the back panel 508 and the other end is detachably fixed to the back surface of the housing 514.
  • short-distance guiding units 252 (left guiding unit 252L, right guiding unit 252R) are provided below the respective back panels 508.
  • a middle distance guiding unit 254 is installed in the center of the housing 514.
  • the short-distance guiding unit 252 includes a short-distance infrared generator that generates infrared rays at an emission angle of about 30 to 60 degrees and an ultrasonic wave generator that generates ultrasonic waves.
  • the transmission distance of infrared rays (hereinafter referred to as “short-range infrared rays”) by the short-range infrared ray generating device is about 1 meter at maximum.
  • the mid-range guiding unit 254 also includes a mid-range infrared transmitter that emits infrared rays.
  • the transmission distance of infrared rays (hereinafter, referred to as “middle-range infrared rays”) by the medium-range infrared ray generator is about 4 meters at maximum.
  • medium-range infrared rays are set to have a higher light intensity (power) than short-range infrared rays.
  • in the present embodiment the short-distance guiding unit 252 and the medium-distance guiding unit 254 are configured separately, but the two may be configured integrally.
  • the short-distance infrared rays and ultrasonic waves generated by the short-distance guidance unit 252 are collectively referred to as a "short-distance guidance signal”.
  • the middle-range infrared rays generated by the middle-range guiding unit 254 are also referred to as “middle-range guiding signal”.
  • the short-distance guidance signal and the medium-distance guidance signal are collectively called "guidance signal”.
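  • the guidance signals defined above can be summarized compactly as data, as in the following sketch. The numeric values come from the text (roughly 1 meter and 4 meter reach); the data structure itself is an illustrative assumption.

```python
from dataclasses import dataclass

# Sketch of the two guidance signals as data. Structure is an assumption;
# the ranges and media follow the description above.
@dataclass(frozen=True)
class GuidanceSignal:
    name: str
    media: tuple          # physical media carrying the signal
    max_range_m: float    # approximate maximum transmission distance

SHORT_RANGE = GuidanceSignal("short-distance guidance signal",
                             ("infrared", "ultrasonic"), 1.0)
MID_RANGE = GuidanceSignal("medium-distance guidance signal",
                           ("infrared",), 4.0)   # higher power than short-range
```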
  • a light emitting unit 256 is installed on the top of the housing 514.
  • the light emitting unit 256 includes a plurality of light sources by LEDs (Light Emitting Diodes) and notifies the robot 100 of the position of the station 500 by visible light (guide light).
  • hereinafter, it is assumed that the robot 100 periodically returns to the charging space 502 of the station 500 and is regularly charged there. In the present embodiment, the robot 100 repeats an action pattern of 45 minutes of activity followed by 45 minutes of charging (rest).
  • the process in which the robot 100 returns to the charging space 502 is referred to as the "return process"; entering the charging space 502 is also referred to as "warehousing" (stocking).
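  • the 45-minute activity / 45-minute charging rhythm can be sketched as a simple loop, as below. The timing source, loop structure, and the robot's act, start_return_process, and charge methods are illustrative assumptions, not the patent's implementation.

```python
import time

# Sketch of the activity/charge rhythm described above. The robot object's
# methods (act, start_return_process, charge) are assumed for illustration.
ACTIVITY_S = 45 * 60   # 45 minutes of activity
CHARGE_S = 45 * 60     # 45 minutes of charging (rest)

def life_cycle(robot):
    while True:                                   # runs for the robot's lifetime
        start = time.monotonic()
        while time.monotonic() - start < ACTIVITY_S:
            robot.act()                           # autonomous behavior
        robot.start_return_process()              # head back to the charging space
        robot.charge(duration_s=CHARGE_S)         # rest in the station
```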
  • FIG. 8 is a functional block diagram of the station 500 in this embodiment.
  • the robot 100 of the present embodiment acquires a large number of captured images (still images) by periodically capturing an image of the surroundings with the omnidirectional camera 113.
  • the robot 100 forms a memory (hereinafter referred to as “image memory”) based on the captured image.
  • Image memory is a collection of multiple keyframes.
  • the key frame is distribution information of feature points (feature amount) in the captured image.
  • the robot 100 of the present embodiment forms key frames using a graph-based SLAM (Simultaneous Localization and Mapping) technique that uses image features, more specifically a SLAM technique based on ORB (Oriented FAST and Rotated BRIEF) features (see Patent Document 5).
  • the robot 100 periodically forms a key frame while moving to form an aggregate of key frames, in other words, an image memory as an image feature distribution.
  • the robot 100 estimates the current point by comparing the key frame acquired at the current point with a large number of key frames already held. That is, the robot 100 performs “spatial recognition” by comparing the captured image that is actually visually recognized with the captured image (memory) that is visually recognized once, and matching the present situation with the past memory.
  • the image memory formed as a set of feature points is a so-called map.
  • the robot 100 updates the map while moving while estimating the current position.
  • the basic configuration of the robot 100 described earlier assumes position recognition by the external sensors 114 rather than by key frames; the robot 100 of the present embodiment, however, will be described as recognizing its location based only on key frames.
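  • the sketch below illustrates keyframe creation and matching with ORB features in the spirit of the approach named above. It uses OpenCV's stock ORB detector and brute-force matcher and is not the patent's implementation; the matching threshold is an assumption.

```python
import cv2

# Sketch: a keyframe as ORB keypoints + descriptors, matched by Hamming
# distance. Uses OpenCV's standard ORB/BFMatcher APIs.
orb = cv2.ORB_create(nfeatures=500)

def make_keyframe(image_gray):
    """Extract ORB keypoints and descriptors from a grayscale capture."""
    keypoints, descriptors = orb.detectAndCompute(image_gray, None)
    return keypoints, descriptors

def similarity(desc_a, desc_b, max_distance=50):
    """Count good descriptor matches between two keyframes."""
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(desc_a, desc_b)
    return sum(1 for m in matches if m.distance < max_distance)

def localize(current_desc, stored):
    # "spatial recognition": the stored keyframe that best matches the current
    # view suggests where the robot is now
    return max(stored, key=lambda kf_id: similarity(current_desc, stored[kf_id]))
```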
  • the station 500 incorporates the server 200 and the charging device 506.
  • the communication unit 204 of the station 500 is shared by the server 200 and the charging device 506.
  • the data used by the robot 100 such as a map is also shared by the plurality of robots 100.
  • the map storage unit 170 generates a common (single) map by acquiring a key frame from each robot 100.
  • a “guidance system (of the robot 100 )” is formed by the station 500 including the server 200 and the charging device 506 and one or more robots 100.
  • the guidance system may further include a landmark device 280.
  • the landmark device 280 supports the return process of the robot 100. The role of the landmark device 280 will be described later with reference to FIG.
  • any of the robots 100 can also serve as the landmark device 280.
  • a method of causing the robot 100 to function as the landmark device 280 will be described later with reference to FIG.
  • each function of the server 200 is realized by loading the program that implements the function into memory and instantiating it.
  • Various processing by the robot 100 is supplemented by the processing capacity of the server 200.
  • the server 200 can be used as a resource of the robot 100, and how the server 200's resources are used is determined dynamically according to requests from the robot 100. For example, when the robot 100 must continuously generate complex motions in response to detection values from a large number of touch sensors, the processing capacity of the processor 122 in the robot 100 may be assigned preferentially to motion selection and generation while the recognition unit 212 of the server 200 performs the image recognition of the surrounding situation.
  • each function of the server 200 may be instantiated independently for each robot 100; for example, the server 200 may prepare a recognition unit 212 for the robot 100B separately from the recognition unit 212 for the robot 100A.
  • the server 200 assists the robot 100 existing in the installation location (charge area) of the server 200.
  • the server 200 is installed in the user's home and is used for the robot 100 existing therein.
  • the robot 100 generates a map for movement, and the map is shared by the plurality of robots 100 existing in the user's home.
  • the robot 100 uses SLAM to form a map.
  • the feature points (key frames) forming the map include those derived from static structures, such as walls, that do not move, and those derived from dynamic structures, such as chairs and toys, that do move. A map is therefore not information that can be used permanently once created, but information that must be updated continuously; having the plurality of robots 100 share the map makes this updating efficient.
  • the position management unit 208 of the data processing unit 202 includes a map management unit 168.
  • the map management unit 168 manages the map based on the image memory.
  • the map management unit 168 repeats updating the map shared by the plurality of robots 100.
  • the data storage unit 206 further includes a map storage unit 170. Since the data storage unit 206 collectively stores the information of the plurality of robots 100, the robots 100 can share the various data in the data storage unit 206.
  • the map storage unit 170 stores maps.
  • the communication unit 204 includes an event receiving unit 246, a light emission instruction receiving unit 248, a warehousing request receiving unit 250, and a warehousing permission transmitting unit 258.
  • the event receiving unit 246 receives, from the robot 100, event information (environment information) indicating various events recognized by the robot 100.
  • the recognition unit 212 of the server 200 analyzes the event information.
  • the intimacy degree management unit 220 and the state management unit 244 change the intimacy degree, the emotional parameter, and the like based on the event information.
  • the motion control unit 222 selects the motion of the robot 100 based on the event information.
  • the light emission instruction receiving unit 248 receives a light emission signal from the robot 100.
  • the light emission signal is a signal that designates the light emission mode (the manner of light emission) of the guide light.
  • the warehousing request receiving unit 250 receives a warehousing request signal from the robot 100.
  • the warehousing request signal is a signal by which the robot 100 requests warehousing (return) to the station 500.
  • the warehousing permission transmission unit 258 returns a warehousing permission signal indicating the warehousing permission to the robot 100 in response to the warehousing request signal.
  • the charging device 506 includes a light emission control unit 224, a warehousing determination unit 262, a charge control unit 264, a light emitting unit 256, and a guiding unit 266.
  • the light emitting unit 256 is a light source that generates guide light.
  • the light emission control unit 224 controls the light emission mode of the light emitting unit 256.
  • the warehousing determination unit 262 determines whether warehousing is possible when the warehousing request signal is received.
  • the charging control unit 264 charges the robot 100 that has been stored.
  • the guide unit 266 corresponds to the medium-range guide unit 254 and the short-range guide unit 252.
  • the short distance guiding unit 252 includes a left guiding unit 252L and a right guiding unit 252R.
  • the light emitting unit 256 is normally lit.
  • the robot 100 recognizes the location point of the station 500 by detecting the guide light from the light emitting unit 256.
  • the robot 100 can actively confirm whether detected light is the guide light or some other light by designating the light emission mode of the guide light (details will be described later).
  • alternatively, the light emitting unit 256 may normally be kept off and configured to emit light only when a light emission signal is received.
  • the robot 100 approaches the station 500 using the guide light as a mark. Next, the robot 100 detects the medium-range infrared rays (medium-range guidance signal) from the medium-range guidance unit 254 and aligns itself with the charging space 502. When the robot 100 comes still closer to the station 500, it enters the charging space 502 based on the short-distance guidance signal (short-distance infrared rays and ultrasonic waves) from the short-distance guidance unit 252. When the charging terminal 510 and the power supply terminal are connected, the charging control unit 264 charges the battery 118 of the robot 100.
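Taken together, the return process is a staged hand-off between three guidance means. The following is a minimal state-machine sketch of that flow; the `sensors` object and its flags are illustrative assumptions, not elements of the specification:

```python
from enum import Enum, auto

class ReturnPhase(Enum):
    SEARCH_GUIDE_LIGHT = auto()   # long range: visible guide light
    MEDIUM_RANGE = auto()         # medium-range infrared alignment
    SHORT_RANGE = auto()          # short-range infrared + ultrasound, backing in
    CHARGING = auto()

def return_step(phase, sensors):
    """One iteration of the staged return process.

    `sensors` is a hypothetical object exposing boolean detection flags;
    the real robot 100 would drive its wheels between these transitions.
    """
    if phase == ReturnPhase.SEARCH_GUIDE_LIGHT and sensors.medium_ir_detected:
        return ReturnPhase.MEDIUM_RANGE
    if phase == ReturnPhase.MEDIUM_RANGE and sensors.short_ir_detected:
        return ReturnPhase.SHORT_RANGE      # turn and back into the space
    if phase == ReturnPhase.SHORT_RANGE and sensors.weak_current_detected:
        return ReturnPhase.CHARGING         # terminals 510/530 connected
    return phase
```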
  • FIG. 9 is a functional block diagram of the robot 100 according to this embodiment.
  • the robot 100 further includes a light emitting unit 138 (light source).
  • the light emitting unit 138 emits light when the robot 100 functions as a landmark device.
  • the light emitting unit 138 is provided at an arbitrary location such as the horn 112 of the robot 100.
  • the eyes 110 of the robot 100 may emit light.
  • the data processing unit 136 further includes a captured image acquisition unit 146, a light emission control unit 158, and a battery remaining amount monitoring unit 176.
  • the captured image acquisition unit 146 acquires a captured image from the omnidirectional camera 113.
  • the captured image acquisition unit 146 regularly acquires captured images.
  • the light emission control unit 158 controls the light emission mode of the light emitting unit 138.
  • the battery remaining amount monitoring unit 176 monitors the battery remaining amount (charging rate) of the battery 118.
  • the recognition unit 156 further includes an image feature acquisition unit 152, a light recognition unit 154, a distance measurement unit 162, and a position determination unit 166.
  • the image feature acquisition unit 152 generates a key frame by extracting the image feature amount from the captured image.
  • the light recognition unit 154 recognizes the light emission from the station 500 or the landmark device 280. In the present embodiment, the light recognizing unit 154 recognizes external light by the omnidirectional camera 113, but may recognize external light based on a separately provided optical sensor.
  • the distance measuring unit 162 calculates the distance and the relative angle between the robot 100 and the station 500 based on the medium-range infrared rays, the short-range infrared rays, and the ultrasonic waves.
  • the position determination unit 166 determines whether or not the current position of the robot 100 with respect to the station 500 satisfies the “position condition”.
  • the position condition is a condition for movement restriction when the robot 100 functions as a landmark device.
  • the robot 100 can estimate the direction in which the station 500 exists by using the map generated based on the key frame even when the robot 100 is away from the station 500.
  • the location point of the station 500 can be more surely specified by the guide light.
  • the robot 100 detects the medium-range guidance signal and further the short-range guidance signal. In this way, the robot 100 can estimate the accurate position of the station 500 based on the plurality of guiding means as it approaches the station 500. As a result, the certainty of the return process of the robot 100 can be increased.
  • the communication unit 142 includes a light emission instruction transmission unit 140, a light emission instruction reception unit 144, a warehousing request transmission unit 260, and a warehousing permission reception unit 268.
  • the light emission instruction transmission unit 140 transmits a light emission signal to the station 500 or the landmark device 280.
  • the light emission instruction receiving unit 144 receives a light emission signal from another robot 100.
  • the warehousing request transmission unit 260 transmits a warehousing request signal.
  • the warehousing permission reception unit 268 receives the warehousing permission signal.
  • FIG. 10 is a functional block diagram of the landmark device 280.
  • the landmark device 280 includes a light emitting unit 282 (light source), a light emission control unit 284, and a light emission instruction receiving unit 286.
  • the light emission instruction receiving unit 286 receives the light emission signal.
  • the light emission control unit 284 causes the light emitting unit 282 to emit light according to the light emission mode instructed by the light emission signal.
  • FIG. 11 is a sequence diagram showing a process in which the robot 100 searches for the station 500.
  • the light recognition unit 154 of the robot 100 searches for the guide light when starting the return process.
  • the light recognition unit 154 identifies the position of the station 500 by detecting the guide light.
  • there is a possibility that light which is not the guide light but resembles it (hereinafter referred to as “similar light”) is detected and the robot 100 misidentifies the position of the station 500.
  • the light emission instruction transmission unit 140 of the robot 100 transmits a light emission signal designating a specific light emission mode when a plurality of guide light candidates are detected, in other words, when similar light is detected (S10).
  • the light emission instruction transmission unit 140 may always designate one predetermined light emission mode in the light emission signal, or may select one of a plurality of light emission modes and designate it in the light emission signal.
  • the light emission instruction receiving unit 248 of the station 500 receives the light emission signal.
  • the light emission control unit 224 of the station 500 sets the light emission mode specified by the light emission signal in the light emission unit 256, and causes the light emission unit 256 to emit light (S12).
  • the light recognition unit 154 of the robot 100 searches the captured image for external light corresponding to the specified light emission mode (S14).
  • the light recognition unit 154 identifies the external light emitted in the designated light emission mode as the guide light.
  • the operation control unit 150 of the robot 100 sets the light emission point of the guide light as the movement target point, and the robot 100 moves toward the station 500 (S16).
  • the robot 100 can reliably recognize the guide light by designating the light emission mode by the light emission signal.
  • the station 500 can notify the robot 100 of its position by generating guide light corresponding to the light emission signal. Because the designated light emission mode can be varied from signal to signal, the robot 100 does not misidentify similar light as the guide light.
  • the station 500 may also generate the guide light in a special light emission mode that similar light is unlikely to exhibit. For example, the station 500 may constantly blink the guide light at 10 Hz. Since similar light that blinks at 10 Hz is unlikely to exist, the robot 100 may be able to identify the position of the station 500 by detecting 10 Hz light without transmitting a light emission signal.
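A detector for such a fixed blink rate can be sketched as a frequency-domain check on the brightness of a candidate pixel across successive captured images. The frame rate, tolerance, and function name below are illustrative assumptions:

```python
import numpy as np

def looks_like_10hz_beacon(brightness, fps=60.0, target_hz=10.0, tol_hz=1.0):
    """Return True if a pixel's brightness trace blinks near 10 Hz.

    `brightness` is a 1-D sequence sampled from successive captured images;
    the fps and tolerance are illustrative values, not from the patent.
    """
    x = np.asarray(brightness, dtype=float)
    x = x - x.mean()                          # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    peak = freqs[np.argmax(spectrum)]         # dominant blink frequency
    return abs(peak - target_hz) <= tol_hz
```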
  • the station 500 can guide the robot 100 to the station 500 without burdening the eyes of the user by changing the light emission mode of the guide light only when the light emission signal is received from the robot 100.
  • if the guide light cannot be identified, the light emission instruction transmission unit 140 may transmit a light emission signal designating another light emission mode.
  • the map management unit 168 of the server 200 (station 500) generates a map from the key frame (image feature information) based on the captured image.
  • the map management unit 168 sets the position of the station 500, which is the light emitting point of the guide light, as the reference point of the map.
  • the map management unit 168 can grasp the current position of the robot 100 more accurately by comparing the light emitting point of the guide light recorded on the map with the light emitting point of the guide light actually viewed by the robot 100.
  • the light emission instruction transmission unit 140 of the robot 100 may transmit a light emission signal for position confirmation even when the current position on the map is lost.
  • the map management unit 168 reconfirms the current location based on the guide light corresponding to the light emission signal.
  • the term "lost the current location” as used herein means that "the similarity between the key frame assumed from image storage at the current location P1 of the robot 100 and the key frame obtained from the actual captured image is smaller than a predetermined threshold value ( The view that is actually seen is different from the view that should be seen)."
  • the map management unit 168 may lose the map data stored in the built-in memory due to force majeure such as power-off.
  • the map management unit 168 transmits a lost signal to the robot 100 when the current position is lost.
  • the light emission instruction transmission unit 140 transmits the light emission signal when receiving the lost signal.
  • the light recognition unit 154 detects the guide light and notifies the server 200 of the direction in which the guide light is visible.
  • the map management unit 168 refers to the map (image storage) and respecifies the current position of the robot 100 from the emission direction of the guide light. For example, when the guide light is seen to the left of the robot 100, it can be specified that the station 500 (reference point) is on the left of the robot 100.
  • the map management unit 168 can recognize that the position where the reference point is viewed to the left in the map (image storage) is the current position of the robot 100.
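One way to read this re-specification step: since the station 500 is the map's reference point, any candidate pose on the map must predict the observed bearing of the guide light. A simplified consistency check, with illustrative coordinate conventions (map origin at the station, counterclockwise angles), might look like:

```python
import math

def consistent_with_bearing(pose, observed_bearing, tol=math.radians(10)):
    """Check whether a candidate map pose explains the observed guide light.

    `pose` is (x, y, heading) in a map whose origin is the station 500
    (the guide light emission point); `observed_bearing` is the direction
    of the guide light relative to the robot's heading, e.g. +90 degrees
    when the light is seen to the left. All conventions here are assumptions.
    """
    expected = math.atan2(-pose[1], -pose[0]) - pose[2]  # bearing from pose to origin
    diff = (expected - observed_bearing + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= tol

# Keep only candidate poses consistent with where the guide light is seen.
candidates = [p for p in [(2.0, 0.0, 0.0), (0.0, 2.0, 0.0)]
              if consistent_with_bearing(p, math.radians(180))]
```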
  • FIG. 12 is an enlarged perspective view of the light emitting unit 256.
  • the light emitting unit 256 includes a central light source 380 (second light source) and two side light sources 382 (left light source 382L, right light source 382R) (first light source).
  • the left light source 382L and the right light source 382R are formed on the first surface 386 as vertically long LED light sources.
  • the central light source 380 is a horizontally long LED light source, and is formed on the second surface 384, which is recessed about 5 to 10 mm behind the first surface 386.
  • the height of the light emitting unit 256 is set to be approximately the same as the height of the horn 112 (omnidirectional camera 113) of the robot 100. This makes it easier to capture the guide light with the omnidirectional camera 113. By setting the light emitting unit 256 to the same height as the horn 112 of the robot 100, the light recognizing unit 154 is less likely to mistake similar light at a height far from that of the horn 112 for the guide light.
  • FIG. 13 is a schematic diagram of the light emitting unit 256 when viewed from the front.
  • FIG. 13 shows a state where the light emitting unit 256 is viewed from the robot 100 located on the front side (front side) of the station 500.
  • the apparent distance between the left end of the central light source 380 and the right end of the left light source 382L is WL. Further, the apparent distance between the right end of the central light source 380 and the left end of the right light source 382R is WR. In a front view, WR and WL are equal.
  • the central light source 380 and the two lateral light sources 382 are visually recognized by the robot 100 symmetrically.
  • FIG. 14 is a schematic diagram of the light emitting unit 256 when viewed from the side.
  • FIG. 14 shows a state in which the light emitting unit 256 is viewed from the robot 100 located on the front right side of the station 500.
  • the first surface 386 on which the left light source 382L and the right light source 382R are installed is on the front side of the second surface 384 on which the central light source 380 is installed. Therefore, when the robot 100 is located on the side of the station 500, the apparent positions of the central light source 380 and the side light source 382 are largely displaced.
  • viewed from this position, the central light source 380 looks shorter. Further, the central light source 380 appears closer to the left light source 382L than in the front view (FIG. 13). Therefore, the distance WL becomes shorter than in the front view. As a result, from the robot 100 it looks as if WL < WR.
  • when the light emitting unit 256 is viewed from the robot 100 located on the right front side of the station 500, the two lateral light sources 382 are not symmetrical with respect to the central light source 380.
  • if all the light sources were formed on the same surface, the distances WR and WL would be reduced by the same amount when the light emitting unit 256 is viewed obliquely. Although the central light source 380 would appear shorter, the central light source 380 and the two lateral light sources 382 would still be seen symmetrically. It would therefore be difficult for the robot 100 to grasp the positional relationship between the robot 100 and the station 500 from the appearance of the guide light.
  • because the central light source 380 is recessed, the direction of the robot 100 relative to the station 500 becomes easier to read from the guide light. Specifically, when the apparent widths as seen from the robot 100 satisfy WL < WR (when the central light source 380 appears close to the left light source 382L), the robot 100 can recognize that it is located on the right front side as viewed from the station 500.
  • the distance measuring unit 162 of the robot 100 can also calculate the relative angle to the station 500 from the ratio of WR and WL. When the robot 100 is located on the left front side as viewed from the station 500, WL > WR.
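Under a simplified geometric model (a distant viewer, side light sources at ±HALF_SPAN on the first surface 386, the central source recessed by DEPTH; the names and values below are assumptions for illustration, not the patent's dimensions), the relative angle follows directly from the two apparent widths:

```python
import math

# With a distant (parallel-projection) viewer at angle t from the station's
# front normal:  WL = HALF_SPAN*cos(t) - DEPTH*sin(t),
#                WR = HALF_SPAN*cos(t) + DEPTH*sin(t).
HALF_SPAN = 0.05   # meters, illustrative side-source offset
DEPTH = 0.0075     # meters, ~the 5 to 10 mm recess described above

def relative_angle(wl, wr):
    """Estimate the robot's angle off the station's front from WL and WR.

    Positive = robot on the right front side (WL < WR); negative = left
    front side (WL > WR); zero = directly in front (WL == WR).
    """
    ratio = (wr - wl) / (wr + wl)
    return math.atan((HALF_SPAN / DEPTH) * ratio)
```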
  • FIG. 15 is a schematic diagram for explaining a method for adjusting the position of the robot 100 using a short-distance guidance signal (short-distance infrared rays and ultrasonic waves).
  • the robot 100 searches for the guided light and heads to the charging space 502 while performing alignment based on the positional relationship between the central light source 380 and the side light source 382.
  • the robot 100 detects the medium-range infrared rays transmitted from the medium-range guidance unit 254, and further approaches the station 500 with the emission point of the medium-range infrared rays as the movement target point.
  • the robot 100 approaches the station 500 in the forward direction and at a low speed.
  • in the horn 112 of the robot 100, a plurality of infrared sensors are installed in a ring facing outward toward the surroundings.
  • Each of the plurality of infrared sensors detects medium-range infrared rays, and the distance measuring unit 162 calculates the distance and angle between the robot 100 and the station 500 based on the reception intensity of each infrared sensor.
  • when the robot 100 further approaches the station 500, it turns so as to face its back toward the station 500. After that, the robot 100 moves backward toward the station 500.
  • the infrared sensor installed in the horn 112 can detect medium-range infrared rays when it is far from the station 500, but temporarily cannot detect medium-range infrared rays when it is too close to the station 500. This is because the horn 112 is out of the irradiation range of medium-range infrared rays.
  • the operation control unit 150 causes the robot 100 to turn when the infrared sensor of the horn 112 can no longer detect the medium-range infrared rays.
  • the distance measuring unit 162 may instead measure the distance between the robot 100 and the station 500 based on the image captured by the omnidirectional camera 113. Specifically, when the size of the image region corresponding to the station 500 in the captured (omnidirectional) image becomes equal to or larger than a predetermined size, the operation control unit 150 may turn the robot 100.
  • the infrared sensor 172 (see FIG. 6C) on the back surface of the robot 100 detects short-range infrared rays, and the left microphone 174L and the right microphone 174R detect ultrasonic waves.
  • the distance measuring unit 162 recognizes that the robot 100 and the station 500 are particularly close to each other by detecting the short-range infrared rays.
  • the short-distance guiding unit 252 of the station 500 simultaneously generates short-distance infrared rays (high-speed signal) and ultrasonic waves (low-speed signal). Since the propagation speed of infrared rays is higher than that of ultrasonic waves, the microphone 174 detects ultrasonic waves later than the infrared sensor 172 detects short-range infrared rays.
  • the distance measuring unit 162 calculates the distance and the angle from the robot 100 to the short-distance guiding unit 252 based on the time difference between the detection time of the short-range infrared rays and the detection time of the ultrasonic waves.
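Because the two signals are emitted simultaneously and light travel time is negligible at these ranges, the infrared detection effectively timestamps the emission, and the ultrasonic lag alone gives the range. A minimal sketch (the function name and sample values are illustrative):

```python
SPEED_OF_SOUND = 343.0  # m/s at ~20 degrees C; infrared arrival is treated
                        # as instantaneous since light is vastly faster

def distance_from_tof(ir_detect_time, ultrasound_detect_time):
    """Distance to the short-distance guiding unit 252 from the lag between
    detecting the short-range infrared (fast) and the ultrasonic wave (slow),
    both emitted at the same instant. Times are in seconds."""
    dt = ultrasound_detect_time - ir_detect_time
    return SPEED_OF_SOUND * dt

# e.g. a 2.9 ms lag corresponds to roughly 1 meter
print(distance_from_tof(0.0, 0.0029))  # ~0.99 m
```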
  • FIG. 15 shows how the robot 100 approaches the right space 502R formed around the right guiding portion 252R.
  • a difference time occurs between the time point when the right microphone 174R detects ultrasonic waves and the time point when the left microphone 174L detects ultrasonic waves.
  • the distance measuring unit 162 calculates the distance from the left microphone 174L to the right guiding unit 252R based on this difference time.
  • Distance measuring unit 162 similarly calculates the distance from right microphone 174R to right guiding unit 252R.
  • the distance measuring unit 162 specifies the relative angle between the traveling direction (backward direction) of the robot 100 and the right guiding unit 252R based on the two distances (coordinates).
  • the operation control unit 150 moves the robot 100 backward while finely adjusting the traveling direction of the robot 100 based on this relative angle.
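From the two microphone-to-beacon distances and the known microphone baseline, the position of the guiding unit in the robot's own frame, and hence the steering correction, follows from elementary geometry. A sketch, with an assumed baseline value:

```python
import math

MIC_BASELINE = 0.10  # meters between the left/right microphones 174 (illustrative)

def beacon_bearing(d_left, d_right, b=MIC_BASELINE):
    """Locate the guiding unit from the two microphone-to-beacon distances.

    Frame: microphones at (-b/2, 0) and (+b/2, 0); +y is the robot's
    backward travel direction. Returns (x, y, angle off the travel axis
    in radians); a positive angle means the beacon lies toward +x.
    """
    x = (d_left ** 2 - d_right ** 2) / (2.0 * b)
    y = math.sqrt(max(d_left ** 2 - (x + b / 2.0) ** 2, 0.0))
    return x, y, math.atan2(x, y)
```

The operation control unit 150 would then steer so as to drive this angle toward zero while backing in.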
  • the station 500 transmits the warehousing permission signal, and then sends a weak current to the power supply terminal 530 of the charging space 502 into which the robot is to be warehoused.
  • when the charging terminal 510 is connected to the power supply terminal 530 and the weak current is detected, the distance measuring unit 162 determines that warehousing is completed.
  • the robot 100 transmits a confirmation signal when a weak current is detected, and the charging control unit 264 of the station 500 starts power supply when receiving the confirmation signal.
  • as the robot 100 backs in, the microphone 174 gradually becomes unable to detect the ultrasonic waves and infrared rays output from the short-distance guiding unit 252. This is because the heights of the short-distance guiding unit 252 and the microphone 174 differ significantly, so when the station 500 and the robot 100 come too close, the microphone 174 falls outside the ultrasonic transmission range of the short-distance guiding unit 252.
  • the distance measuring unit 162 measures the time elapsed since the short-range infrared rays could no longer be detected. When the weak current cannot be detected within a predetermined reference time, the operation control unit 150 determines that warehousing has failed, advances the robot 100 away from the station 500, and re-executes the return process.
  • the operation control unit 150 may also regard warehousing as having failed and re-execute the return process when the front wheels 102 are idling (spinning without gaining traction).
  • the robot 100 may be equipped with an inertial measurement unit (IMU). When the IMU detects an anomaly during warehousing, such as the robot 100 not actually moving backward, the operation control unit 150 may re-execute the return process.
  • FIG. 16 is a time chart for explaining a control method when a plurality of robots 100 desire to return to the station 500.
  • the station 500 in this embodiment can charge two robots 100 (hereinafter, referred to as “robot 100A” and “robot 100B”) in two charging spaces 502 at the same time. However, the two robots 100A and 100B cannot return to the station 500 at the same time.
  • the station 500 warehouses the robots 100 one by one.
  • when the robot 100A is to enter the left space 502L, the guidance unit 266 generates a short-distance guidance signal (short-distance infrared rays and ultrasonic waves) from the left guiding unit 252L but does not generate one from the right guiding unit 252R. The warehousing determination unit 262 determines into which of the two charging spaces 502 a robot is warehoused.
  • the robot 100A transmits a warehousing request signal to the station 500 at time t0 in FIG. 16.
  • the warehousing determination unit 262 of the station 500 permits the robot 100A to enter the left space 502L.
  • the warehousing permission transmission unit 258 transmits a warehousing permission signal indicating that warehousing is permitted to the robot 100A at time t1.
  • the robot 100A approaches the station 500 based on the guide light and the medium-range infrared rays.
  • the guiding unit 266 of the station 500 causes the left guiding unit 252L to generate a short-distance guiding signal.
  • the robot 100A enters the left space 502L according to the short-distance guidance signal.
  • the power supply terminal 530 and the charging terminal 510 are connected, and the charging control unit 264 starts charging the robot 100A at time t3.
  • the warehousing determination unit 262 sets the period from time t0, when the warehousing request signal is received from the robot 100A, to time t3, when charging of the robot 100A starts, as a “rejection period”.
  • during the rejection period, the warehousing determination unit 262 rejects warehousing of the robot 100B.
  • the warehousing permission transmission unit 258 transmits a warehousing refusal signal indicating refusal of warehousing to the robot 100B.
  • the robot 100B transmits a warehousing request signal again after a lapse of a predetermined time.
  • the warehousing permission transmission unit 258 may permit warehousing of the robot 100B after the rejection period ends. According to such a control method, the plurality of robots 100 can be guided to the station 500 in sequence.
  • the rejection period may instead be the period from time t1 to time t3, from time t1 to time t2, or from time t0 to time t2.
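The arbitration just described reduces to a small permit/reject state holder. A sketch follows, with the rejection period taken as request (t0) through charging start (t3); the class and method names are illustrative assumptions:

```python
class WarehousingArbiter:
    """Permit one robot at a time; reject others during the rejection period."""

    def __init__(self):
        self.active_robot = None  # robot currently permitted to dock

    def handle_request(self, robot_id):
        if self.active_robot is None or self.active_robot == robot_id:
            self.active_robot = robot_id
            return "PERMIT"       # warehousing permission signal
        return "REJECT"           # warehousing refusal signal; retry later

    def charging_started(self, robot_id):
        # Charging has begun (time t3): the rejection period ends and the
        # next warehousing request may be permitted.
        if self.active_robot == robot_id:
            self.active_robot = None

arbiter = WarehousingArbiter()
print(arbiter.handle_request("100A"))  # PERMIT (t0)
print(arbiter.handle_request("100B"))  # REJECT during the rejection period
arbiter.charging_started("100A")       # t3: charging begins
print(arbiter.handle_request("100B"))  # PERMIT
```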
  • the operation control unit 150 of the robot 100B (or the operation control unit 222 of the server 200) may set the area within a predetermined distance from the station 500, for example within 2 meters, as an action-prohibited zone for the robot 100B. According to such a control method, the robot 100B can be controlled so as not to interfere with the return of the robot 100A.
  • the robot 100 may be set not to enter a predetermined range around the station 500, for example within 1 meter, except when it transmits a warehousing request signal or when it receives a warehousing permission signal and executes the return process.
  • the robot 100A may transmit a signal indicating “in return processing” to the robot 100B when the storage is permitted.
  • the robot 100B may refrain from entering the predetermined range around the station 500 when receiving this signal from the robot 100A.
  • the robot 100B may exclude the course of the robot 100A (the straight line connecting the robot 100A and the station 500) from the actionable range. In this way, the robot 100B may be controlled so as not to interfere with the return of the robot 100A by moving away from the path of the robot 100A.
  • the robot 100A may notify the robot 100B that the return process is in progress by an infrared communication device or the like included in the horn 112.
  • the robot 100A may include a light emitting unit such as an LED on the horn 112. The robot 100A may turn on this LED during the return process.
  • the robot 100B can confirm whether or not the robot 100A is in the return process by confirming the lighting state of the LED of the robot 100A.
  • in either case, the robot 100B may move away from the station 500 or away from the course of the robot 100A. That is, the movement of the robot 100A returning for charging is prioritized, and the other robots move so that the robot 100A can return to the station 500 by the shortest path.
  • FIG. 17 is a schematic diagram for explaining a method of guiding the robot 100 to the station 500 by the landmark device 280.
  • the robot 100 cannot visually recognize the guide light from the station 500.
  • the robot 100 can recognize the location point of the station 500 by using the map generated based on the key frame.
  • the landmark device 280 guides the robot 100 to the station 500, so that the robot 100 can return to the station 500 more reliably.
  • the light recognition unit 154 of the robot 100 searches for a guide light from an image captured by the omnidirectional camera 113. Since the station 500 is hidden by the shield 516, the light recognition unit 154 cannot detect the guided light.
  • the light emission instruction transmission unit 140 of the robot 100 transmits the light emission signal L0. Even if the station 500 can receive the light emission signal L0, the robot 100 cannot visually recognize (detect) the guided light because of the shield 516.
  • the light emission instruction transmission unit 140 of the robot 100 transmits a search signal when the guide light corresponding to the light emission signal L0 cannot be detected.
  • the search signal may be an optical signal that blinks in a predetermined pattern, or may be a radio wave signal.
  • the light emission instruction transmission unit 140 of the robot 100 transmits a light emission signal L1 (first light emission signal that specifies the first light emission mode) that specifies a new light emission mode.
  • when the landmark device 280 receives the search signal, the light emission control unit 284 transitions to a state of waiting for a light emission signal.
  • the light emission instruction receiving unit 286 receives the light emission signal L1
  • the light emission control unit 284 causes the light emission unit 282 to emit light according to the light emission signal L1.
  • the light recognition unit 154 of the robot 100 recognizes the light corresponding to the light emission signal L1 from the landmark device 280 (hereinafter referred to as “landmark light”). Since the landmark device 280 is not shielded from the robot 100, the robot 100 can recognize the landmark light.
  • the operation control unit 150 of the robot 100 sets the light emission point of the landmark light as the current movement target point of the robot 100. The robot 100 moves to the location point of the landmark device 280 (S20).
  • when the robot 100 enters the vicinity of the landmark device 280, for example within a range of 0.5 meters from the landmark device 280, the robot 100 transmits a light emission signal L2 (a second light emission signal that specifies a second light emission mode).
  • the station 500 generates a guide light corresponding to the light emission signal L2.
  • the robot 100 can visually recognize (detect) the guided light of the station 500 when it is near the landmark device 280.
  • the robot 100 recognizes the guide light from the station 500 and then returns to the station 500 (S22). According to such a control method, even when the guide light cannot be recognized from the robot 100's original position, the robot 100 can return to the station 500 by first approaching the landmark device 280.
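The fallback sequence of FIG. 17 can be sketched end to end as follows; `robot`, `station`, and `landmark` are hypothetical objects wrapping the signal and detection steps described above, so every method name here is an assumption for illustration:

```python
def return_via_landmark(robot, station, landmark):
    """Sketch: normal return if the guide light is visible, otherwise
    detour via the landmark device 280 (steps S20/S22 above)."""
    robot.transmit_emission_signal("L0", to=station)
    if robot.detect_light("L0"):
        return robot.approach_guide_light()   # normal return process

    robot.transmit_search_signal()            # wake the landmark device
    robot.transmit_emission_signal("L1", to=landmark)
    if robot.detect_light("L1"):              # landmark light is visible
        robot.move_to_light("L1")             # S20: approach the landmark

    robot.transmit_emission_signal("L2", to=station)
    if robot.detect_light("L2"):              # guide light now visible
        robot.approach_guide_light()          # S22: return to the station
```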
  • FIG. 18 is a schematic diagram for explaining a method of guiding the robot 100A to the station 500 by the robot 100B.
  • the robot 100B can also function as the landmark device 280.
  • the robot 100B can visually recognize the guide light.
  • the robot 100A transmits the light emission signal L0, but cannot detect the guide light. Therefore, the robot 100A transmits a light emission signal L1 (first light emission signal that specifies a first light emission mode) that specifies a new light emission mode.
  • the light emission instruction receiving unit 144 of the robot 100B receives the light emission signal L1.
  • the light emission control unit 158 of the robot 100B causes the light emitting unit 138 to emit light in accordance with the light emission signal L1.
  • the light emitted from the robot 100B functions as landmark light.
  • the robot 100A detects the landmark light of the robot 100B and approaches the robot 100B (S30).
  • the robot 100A transmits a light emission signal L2 (second light emission signal that specifies a second light emission mode) that specifies a new light emission mode near the robot 100B.
  • the station 500 generates a guide light corresponding to the light emission signal L2.
  • the robot 100A detects the guided light and returns to the station 500 (S32).
  • the specified peripheral area 520 shown in FIG. 18 indicates the range within which the guide light can be visually recognized.
  • when the position determination unit 166 of the robot 100A determines that the robot 100A has gone outside the specified peripheral area 520, it determines that the position condition is not satisfied and sends a departure signal to the robot 100B.
  • the operation control unit 150 of the robot 100B limits the action range of the robot 100B to the specified peripheral area 520.
  • the robot 100B can be made to function as a landmark device by keeping the robot 100B in the specified peripheral area 520.
  • the station 500 may always turn on the guide light.
  • the position determination unit 166 of the robot 100 may constantly detect the guide light of the station 500 with the omnidirectional camera 113, and may determine that the position condition is not satisfied (that the robot is outside the specified peripheral area 520) when the guide light is no longer visible.
  • the specified peripheral area 520 may be determined conceptually based on whether or not the guide light can be visually recognized, or may be explicitly set in advance on the map as a predetermined area around the station 500.
  • the robot 100 recognizes the location point of the station 500 by using the guide light from the station 500 as a clue.
  • the robot 100 can further distinguish the guide light from similar light by transmitting a light emission signal. Even when the remaining battery level is low, the robot 100 can return to the station 500 without being misled by similar light into wasteful movement (wasted power).
  • if the guide light were constantly generated in a special light emission mode (a light emission mode not easily mistaken for similar light), the user might find the guide light annoying.
  • such a problem can be solved by generating the guide light in the special light emission mode only when a light emission signal is received. It is also considered undesirable, in view of the influence on other electronic devices, for the charging station 500 to constantly generate radio waves.
  • if the guide light still cannot be identified, a light emission signal designating yet another light emission mode may be retransmitted. Since the station 500 changes the light emission mode according to the light emission signal, the robot 100 can reliably identify the station 500.
  • the light emitting unit 256 of the station 500 has a plurality of light sources (a central light source 380 and a side light source 382).
  • the distinctive light emission shape formed by the central light source 380 and the two side light sources 382 makes it easier for the robot 100 to identify the guide light.
  • the robot 100 can recognize the relative angle between the robot 100 and the station 500 from the appearance of the guide light.
  • the robot 100 adjusts its moving direction based on the appearance of the guide light so as to return from a position near the front of the station 500.
  • the charging device 506 and the server 200 are integrally formed. By housing the charging device 506 and the server 200 in a single housing, the entire robot system 300 can be made compact. It is desirable for the station 500 to have a certain weight, because it must withstand the impact of the robot 100 being warehoused. Incorporating the server 200 into the station 500 increases the weight of the station 500 naturally; the weight of the server 200 also contributes to the stability of the station 500.
  • the station 500 in this embodiment can charge two robots 100A and 100B at the same time. Since the station 500 does not have to be prepared for each robot 100, the station 500 can be formed compactly. In addition, since the plurality of robots 100 are sequentially stored based on the storage request signal, it is easy to prevent the plurality of robots 100 from being congested near the station 500. If the robot 100B also attempts to return during the return processing of the robot 100A, the guidance signal for the robot 100A and the guidance signal for the robot 100B may coexist.
  • the guidance unit 266 of the charging device 506 controls the left guidance unit 252L and the right guidance unit 252R so as not to generate guidance signals at the same time, thereby preventing the guidance signals from being mixed.
  • Robot 100A and robot 100B are stored in order.
  • the robot 100B is kept waiting during the return processing of the robot 100A.
  • after the robot 100A has been warehoused, the robot 100B starts the return process.
  • the robot 100A and the robot 100B thus enter the station 500 in an orderly fashion, one at a time, and after warehousing they are charged side by side.
  • the cuteness unique to two robots charging side by side can thereby be conveyed to the user.
  • the robot 100 sets the light emitting point of the guide light of the station 500 as the reference point of the map.
  • the robot 100 forms the map based on the key frame, but by using the light emission point of the guide light as the reference point, it is possible to more reliably recognize which point on the map (image storage) the current point corresponds to.
  • the fact that the station 500 informs its own location point by the guide light not only facilitates the return of the robot 100 to the station 500, but is also useful for position recognition by the robot 100.
  • the robot 100 can return to the station 500 by relying on the landmark light even when it is far away from the station 500.
  • Properly arranging one or more landmark devices 280 in the room makes it easy to expand the action range of the robot 100. In other words, even if the robot 100 is located away from the station 500, the robot 100 can safely go out as far as it can see the landmark device 280.
  • the robot 100B may function as a landmark device for the robot 100A. In this case, even if one of the robots 100A moves far away, the robot 100B can easily return to the station 500 because the robot 100B stays near the station 500.
  • conversely, the robot 100A may limit its own action range so that it does not move so far from the robot 100B that the robot 100B can no longer be visually recognized.
  • although the robot system 300 has been described as being configured by one robot 100 and one station 500 (charging device 506 and server 200), a part of the functions of the robot 100 may be realized by the server 200 of the station 500, and some or all of the functions of the server 200 may be assigned to the robot 100.
  • One server 200 may control a plurality of robots 100, or a plurality of servers 200 may cooperate to control one or more robots 100.
  • a third device other than the robot 100 and the server 200 may take part of the function.
  • the aggregate of the functions of the robot 100 and the functions of the server 200 described with reference to FIGS. 8 and 9 can be understood overall as one “robot”. How to distribute the plurality of functions required to implement the present invention across one or more pieces of hardware may be determined in consideration of the processing capability of each piece of hardware and the specifications required of the robot system 300.
  • the “robot in the narrow sense” means the robot 100 that does not include the server 200, but the “robot in the broad sense” means the robot system 300. Many of the functions of the server 200 may possibly be integrated into the robot 100 in the future.
  • in the present embodiment, the light recognition unit 154 of the robot 100 first searches for the guide light in the return process.
  • the robot 100 has been described as transmitting the light emission signal when the guided light cannot be recognized.
  • the robot 100 may always transmit a light emission signal during the return process.
  • the robot 100 may identify external light that responds to the light emission signal as guide light and start moving toward the station 500.
  • the battery remaining amount monitoring unit 176 monitors the remaining amount of the battery 118.
  • the operation control unit 150 of the robot 100 may start the return process when the remaining amount (charging rate) of the battery 118 becomes equal to or less than a predetermined threshold value.
  • the storage request transmission unit 260 of the robot 100 may transmit a storage request signal when the remaining battery level of the battery 118 becomes equal to or less than a predetermined threshold value.
  • the light emission instruction transmission unit 140 of the robot 100 may transmit a light emission signal when the battery level becomes low.
  • the robot 100 may use the light emission signal as a storage request signal.
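These battery-driven triggers amount to a simple threshold check in the monitoring loop. A sketch follows; the threshold value and method names are illustrative assumptions:

```python
RETURN_THRESHOLD = 0.20  # 20% charging rate; an illustrative value

def on_battery_poll(charging_rate, robot):
    """Battery remaining amount monitoring unit 176: below the threshold,
    kick off the return process and request warehousing (as noted above,
    the light emission signal may itself serve as the request)."""
    if charging_rate <= RETURN_THRESHOLD:
        robot.transmit_warehousing_request()
        robot.start_return_process()
```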
  • when the robot 100 enters the station 500, it may be connected by wire not only to the charging device 506 but also to the server 200.
  • the power supply terminal 530 may include not only a power line but also a data line. The same applies to the charging terminal 510 of the robot 100.
  • the server 200 and the robot 100 may send and receive data via a data line.
  • the robot 100 wirelessly transmits various data such as feature vectors and event information to the server 200.
  • various data are wirelessly transmitted from the server 200 to the robot 100. Since wired communication has a higher data transfer rate than wireless communication, wired communication is more suitable for sending and receiving large amounts of data.
  • the server 200 may acquire the image and audio data acquired by the robot 100 while the robot 100 is being charged.
  • the robot 100 may store data such as a captured image in the data storage unit 148 of the robot 100, and upload the data stored during charging to the server 200.
  • the server 200 may download an updated version of the behavior control program for the robot 100 from an external server.
  • the server 200 may send the behavior control program to the robot 100 while the robot 100 is being charged.
  • the robot 100 may install the updated version of the action control program by updating the action control program during charging and automatically restarting before the completion of charging. According to such a control method, the operation of updating the behavior control program, which is peculiar to a computer, can be casually executed during charging, in other words, while the user does not pay much attention to the robot 100.
  • the robot 100 has been described as periodically performing the return process.
  • the robot 100 determines the return timing according to the built-in scheduler.
  • the station 500 may include a return management unit (not shown).
  • the return management unit may manage the return timing of the robot 100 as a scheduler.
  • the station 500 may turn off the light emitting unit 256 during normal operation.
  • the return management unit may automatically turn on the light emitting unit 256 when the return timing approaches.
  • the robot 100 may start the return process when detecting the guide light.
  • the return timing of the robot 100 can be managed by the station 500.
  • the station 500 can avoid simultaneous return of the plurality of robots 100 by shifting the return timing of the plurality of robots 100.
  • the station 500 may generate yellow guide light when returning the robot 100A, and may generate green guide light when returning the robot 100B.
  • the robot 100 may transmit a specific signal when the guided light can be specified.
  • the communication unit 204 of the station 500 may set the guide light to be always in a light emitting state when receiving the specific signal.
  • if the light emission mode of the guide light is changed, for example if the guide light is made to blink, the user may be bothered by the guide light.
  • the station 500 may turn off the guide light when the robot 100 enters.
  • the station 500 may change the emission color of the guide light when receiving the specific signal from the robot 100A.
  • the station 500 may normally emit the guide light in blue, and may emit the guide light in red during the return process of the robot 100A.
  • the robot 100B can know whether the robot 100A is in the process of returning.
  • the robot 100B may check the color of the guide light when starting the return process, and may wait when the guide light is red. Similar information may be conveyed not only by the color of the guide light but also by other light emission modes.
  • the station 500 may turn off the light emitting unit 256 in a normal time and turn on the light emitting unit 256 only when a light emitting signal is received. According to such a control method, the power consumption of the light emitting unit 256 can be suppressed. If the guide light is set to be always off, the user is less likely to feel the guide light visually. For example, when the station 500 is installed in the bedroom, it is considered desirable to turn off the guide light during normal times.
  • the return management unit may periodically send a return signal prompting the robot 100 to return.
  • the robot 100 receives the return signal, the robot 100 starts the return process.
  • the return signal may include an ID that specifies the target robot 100.
  • the return signal may be a radio wave signal or a visible light signal.
  • the return management unit may also transmit a return signal to the robot 100B when the robot 100A returns. According to such a control method, a scene can be staged in which, when the robot 100A returns to the station 500 (nest), the robot 100B follows it back to the station 500 (nest).
  • the return management unit may transmit a return signal when the user is not around the station 500 or when the light intensity of the room is low (when no user is present, or when it can be assumed to be midnight).
  • a time period during which the robot 100 does not need to interact with the user can thus be used positively as a charging opportunity. The robot 100 is expected to be actively involved with the user, yet it must also be charged appropriately by the station 500. By setting the charging time to a period in which the user does not require the robot's involvement, the “time when the robot 100 is resting” as seen from the user can be reduced. For the user to feel that the robot 100 is always active, it is desirable to charge the robot 100 when it is out of the user's view.
  • the user may explicitly set the charging time of the robot 100 on a user terminal such as a smartphone. For example, it is assumed that the user sets the charging time from 10:00 to 10:10.
  • the user terminal transmits this schedule data to the station 500.
  • the return management unit of the station 500 registers schedule data.
  • the return management unit may transmit a return signal at 10:00 and a departure signal to the robot 100 at 10:10.
  • when the robot 100 receives the departure signal, it leaves the station 500 and resumes autonomous behavior even if charging is not complete.
  • the user can control the charging time of the robot 100 using the user terminal. For example, when there is a visitor from 10:00, the user can control the robot 100 so as not to interfere with the visitor reception by setting the corresponding time as the charging time.
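This user-set window can be sketched as a tiny scheduler that maps clock times to the two signals described above; the class name and exact matching logic are illustrative assumptions:

```python
from datetime import time

class ReturnSchedule:
    """Sketch of the return management unit honoring a user-set charging
    window (e.g. 10:00-10:10 registered from a smartphone)."""

    def __init__(self, start, end):
        self.start, self.end = start, end

    def signal_for(self, now):
        if now == self.start:
            return "RETURN_SIGNAL"     # robot begins the return process
        if now == self.end:
            return "DEPARTURE_SIGNAL"  # robot leaves even if not fully charged
        return None

sched = ReturnSchedule(time(10, 0), time(10, 10))
print(sched.signal_for(time(10, 0)))   # RETURN_SIGNAL
```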
  • the user may send a return instruction from the user terminal to the station 500.
  • the station 500 transmits a return signal to the robot 100 when receiving a return instruction from the user terminal.
  • the robot 100 starts the return process when it receives the return signal. For example, even when a child keeps playing with the robot 100 until late at night, if the parent user discreetly transmits a return instruction, the robot 100 starts the return process, which can give the child the impression that the robot 100 has become sleepy or tired and wants to return to the station 500 (nest).
  • the robot 100 can know the positional relationship with the station 500 from the guide light (optical signal), the medium-range guide signal (medium-range infrared ray), and the short-range guide signal (short-range infrared ray and ultrasonic wave).
  • the robot 100 approaches the station 500 based on the medium-range infrared rays when detecting the medium-range infrared rays.
  • the robot 100 further approaches the station 500 according to the short-distance guidance signal when detecting the short-distance infrared rays and ultrasonic waves.
  • the robot 100 may measure the distance to the station 500 with a distance measuring sensor, ignore the medium-range infrared rays when the distance from the station 500 is within a predetermined distance, and continue the return process according to the short-range infrared rays. Alternatively, when both the medium-range guidance signal and the short-range guidance signal are detected, the robot 100 may execute the return process according to whichever has the higher signal intensity.
  • the station 500 has been described as informing the robot 100 of its location point by the guide light (visible light).
  • the station 500 may notify a robot 100 at a long distance of its location point by transmitting radio waves or invisible light.
  • the robot 100 may transmit the light emission signal LA in the light emission mode A.
  • the robot 100 may retransmit the light emission signal LB in the light emission mode B when the guide light according to the light emission mode A cannot be recognized.
  • the light emitting mode A and the light emitting mode B may be the same.
  • the light emission signal LB is a signal for the landmark device 280 that includes information designating the landmark device 280.
  • the landmark device 280 When receiving the light emission signal LB, the landmark device 280 generates landmark light according to the light emission mode B.
  • the robot 100 may detect the landmark light and approach the landmark device 280. According to such a control method, the robot 100 gives the highest priority to finding the station 500, and searches for the landmark device 280 only when the station 500 cannot be found. Because it is the landmark device 280 that reacts to the light emission signal LB, the robot 100 can recognize that the approach target is not the station 500 but the landmark device 280.
  • the warehousing determination unit 262 may reject a warehousing request that the robot 100B transmits to the station 500 while the robot 100A is within a predetermined range from the station 500. This is because the robot 100A might interfere with the warehousing of the robot 100B.
  • the charging device 506 may send a departure signal to the robot 100A.
  • the robot 100A may leave the station 500 when it receives the leave signal.
  • the two charging spaces 502 are prepared for the two robots 100.
  • the two robots 100 can be charged at the same time after being stored in the charging space 502 in order.
  • such sequential warehousing is effective even when the number of robots 100 exceeds the number of charging spaces 502.
  • when warehousing requests compete, the warehousing determination unit 262 permits the warehousing of one robot 100 and rejects the warehousing of the other robot 100.
  • the two robots 100 may notify the respective battery remaining amounts to the station 500, and the warehousing determination unit 262 may prioritize the warehousing of the robot 100 having the smaller battery remaining amount.
  • in the present embodiment, two robots 100 have been described as not being able to enter the station 500 at the same time. As a modification, two robots 100 may be allowed to enter the station 500 at the same time.
  • the station 500 may transmit a warehousing permission signal designating the left space 502L to the robot 100A and a warehousing permission signal designating the right space 502R to the robot 100B. Further, even during the return processing of the robot 100A, the robot 100B may move toward the station 500 when desiring to store.
  • the two robots 100 may be set in advance in which of the two charging spaces 502 they are to be stored.
  • the storage destination may be set based on the appearance color of the robot 100 and the color of the panel 508 provided in the charging space 502.
  • the outer skin 314 of the robot 100A is dark brown, and the outer skin 314 of the robot 100B is light brown.
  • the color of the left panel 508L is dark brown and the color of the right panel 508R is light brown.
  • the robot 100A may return to the left space 502L having the same color scheme.
  • the robot 100B returns to the right space 502R.
  • in this way, the colors of the robot 100 and the rear panel 508 can be matched during charging, which contributes to the appearance of the robot system 300 and of the two robots 100 while they charge.
  • the robot 100 may recognize (store) the color of its outer skin 314.
  • the robot 100 may request the station 500, via the warehousing request transmission unit 260, to guide it toward the panel 508 whose color is the same as its own.
  • the warehousing request signal may be transmitted including a color ID, and the warehousing determination unit 262 may select the charging space 502 corresponding to the color ID as the warehousing destination.
  • the operation control unit 222 of the server 200 may record, for each of the plurality of robots 100, the number of returns to the left space 502L and to the right space 502R. If the robot 100A has been warehoused in the left space 502L more frequently than in the right space 502R during a unit period, the warehousing determination unit 262 of the station 500 may warehouse the robot 100A preferentially in the left space 502L. As a result, the robot 100B is naturally guided more easily to the right space 502R. According to such a control method, each of the robot 100A and the robot 100B gradually acquires its own favorite nest (charging space 502), so an attachment of the two robots 100 to their nests can be expressed.
  • the eye generation unit (not shown) of the robot 100 may express “sleep” by closing the eye image displayed on the eye 110 during charging.
  • the eye generation unit expresses sleep by changing the eye image to an eye-closed image when the first sleep duration has elapsed since the robot 100 was stored.
  • when the two robots 100 are charged at the same time, the charging control unit 264 of the server 200 may instruct the two robots 100 to sleep for a second sleep duration that is longer than the first sleep duration.
  • the robot 100A and the robot 100B may move their lines of sight so as to gaze at each other during simultaneous charging, or may move their arms 106 so as to touch each other.
  • the robot 100 is equipped with the thermosensor 115.
  • the station 500 may include a thermal reference that produces a constant temperature. For example, if the thermal reference is 25 degrees, the detection sensitivity of the thermo sensor 115 may be corrected by having the thermo sensor 115 detect the 25-degree thermal reference.
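Read as a simple offset calibration, this correction could look like the sketch below; the offset-only model and names are assumptions, since the specification does not state the correction formula:

```python
THERMAL_REFERENCE_C = 25.0  # constant-temperature reference on the station

def calibrated(measured_c, reference_reading_c):
    """Correct a thermo sensor 115 reading against the station's thermal
    reference: the offset observed on the known 25-degree target is
    subtracted from subsequent measurements (a simple offset model)."""
    offset = reference_reading_c - THERMAL_REFERENCE_C
    return measured_c - offset
```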
  • the light emission signal and the warehousing request signal may be transmitted as a wireless signal such as Bluetooth (registered trademark) or Wi-Fi. If the light emission signal and the like are wireless signals (radio wave signals), the robot 100 can receive the light emission signal and the like even when the robot 100 is in a position where the station 500 cannot be visually recognized.
  • Station 500 may include a speaker (not shown).
  • the robot 100 may include a sound generation requesting unit (not shown).
  • the sound generation requesting unit of the robot 100 may transmit a sound generation request signal to the station 500.
  • upon receiving the request, the sound generation control unit (not shown) of the station 500 causes the speaker to generate a sound of a predetermined frequency. It is desirable that this sound be in a frequency band that does not make the user feel uncomfortable and whose source direction can be easily estimated; it need not be an audible sound.
  • the voice direction specifying unit (not shown) of the robot 100 may estimate the direction of the station 500 by a built-in microphone array.
  • the robot 100 can estimate the location point of the station 500 even when the station 500 cannot be visually recognized from the robot 100.
  • since sound has the drawback that its source point becomes difficult to identify when it reflects off walls, it is considered preferable to use it as an auxiliary means when the guide light cannot be detected.
  • FIG. 19 is an external view of the station 550 in the modified example.
  • in the station 550, instead of the short-distance guiding unit 252 being provided under the rear panel 508, two pairs of ultrasonic wave generators 552 and infrared ray generators 554 are embedded in the base 504.
  • the ultrasonic wave generation device 552RR, the ultrasonic wave generation device 552RL, and the infrared ray generation device 554R form a short-distance guide portion 556R on the right side.
  • the ultrasonic wave generator 552LL, the ultrasonic wave generator 552LR, and the infrared ray generator 554L form a short-distance guide portion 556L on the left side.
  • the function of the infrared ray generators 554R and 554L may be performed collectively by the intermediate-distance guiding unit 254. That is, the infrared generators 554R and 554L may be omitted, and only the infrared generator of the intermediate-distance guiding unit 254 may be provided.
  • in that case, the infrared generating device of the intermediate-distance guiding unit 254 is controlled in conjunction with the operation of the left and right ultrasonic generators 552.
  • the ultrasonic wave generator 552RR and the ultrasonic wave generator 552RL are provided at positions symmetrical about the virtual center line of the right space 502R, and emit ultrasonic waves toward the front.
  • the distance between each ultrasonic wave generator 552 and the robot 100 can be calculated. Since the position of each ultrasonic generator 552 on the station 550 is fixed, the relative position of the robot 100 with respect to the station 550 can be specified. According to this modification, both the relative position and the orientation of the robot 100 with respect to the station 550 can be specified, so more accurate guidance is possible.
  • the right guiding portion 252R is arranged on the reference approach line of the taxiway. Since infrared rays have a certain directivity, they can be received within a certain fan-shaped range that includes the taxiway. Because the infrared rays spread, it becomes more difficult to determine the exact position as the distance from the station increases.
  • by providing ultrasonic generators on the left and right, the position of the robot 100 can be specified as coordinates, and the orientation of the robot 100 with respect to the station 550 can also be specified. As a result, the station 550 shown in FIG. 19 can guide the robot more accurately than the station 500.
  • FIG. 20 is a schematic diagram for explaining a method of identifying the position and the traveling direction of the robot 100.
  • the distances from the ultrasonic wave generator 552RR and the ultrasonic wave generator 552RL to the robot 100 are indicated by dotted arcs.
  • the intersection point of the two dotted arcs is the position P of the robot 100.
  • the position P is the position of either the left or right microphone 174 provided on the back surface of the robot 100.
  • the position P is the position of the left microphone 174L.
  • the left microphone 174L receives ultrasonic waves from the ultrasonic wave generation device 552RR and the ultrasonic wave generation device 552RL.
  • the left microphone 174L receives the ultrasonic wave from the ultrasonic wave generator 552RR after the infrared sensor 172 receives the infrared light.
  • the distance measuring unit 162 measures the distance A from the ultrasonic generator 552RR to the left microphone 174L based on the time difference between the two; since the infrared light arrives effectively instantaneously, this delay corresponds to the travel time of the ultrasonic wave.
  • the left microphone 174L receives the ultrasonic wave from the ultrasonic wave generator 552RL after the infrared sensor 172 receives the infrared light.
  • the distance measuring unit 162 measures the distance B from the ultrasonic generator 552RL to the left microphone 174L.
  • An intersection of a circle having a radius A from the ultrasonic generator 552RR and a circle having a radius B from the ultrasonic generator 552RL is specified as the position coordinate L of the left microphone 174L (position P).
  • the position coordinate R of the right microphone 174R can be obtained in the same way. If the position coordinates of the left microphone 174L and the right microphone 174R are known, the midpoint of the line segment connecting them can be specified as the position coordinate of the robot 100. Furthermore, as described with reference to FIG. 15, the orientation of the robot 100 with respect to an ultrasonic wave generation device 552 can be calculated based on the time difference between the ultrasonic waves detected by the left microphone 174L and the right microphone 174R. In this modification, since there are two ultrasonic generators 552 (one on the left, one on the right of the charging space), the orientation of the robot 100 with respect to each ultrasonic generator can be calculated, and numerical processing such as averaging yields a more accurate orientation (see the sketch after this list).
  • the positions of the left and right microphones 174 provided on the back surface of the robot 100 can be specified as position coordinates.
  • the direction of the normal to the line segment connecting the two microphones 174 may be specified as the orientation of the robot.
  • the frequencies of the ultrasonic waves generated by the ultrasonic generator 552RR and the ultrasonic generator 552RL are the same.
  • the infrared generator 554R emits infrared light at the same time as an ultrasonic wave is generated by an ultrasonic generator 552, and that infrared signal carries information identifying which ultrasonic generator produced the wave.
  • the infrared ray generation device 554R regularly alternates between an infrared signal indicating the right ultrasonic wave generation device 552RR and one indicating the left ultrasonic wave generation device 552RL.
  • the ultrasonic wave generator 552RR and the ultrasonic wave generator 552RL generate ultrasonic waves alternately.
  • the movement speed of the robot 100 during short-distance guidance is slower than that during normal movement.
  • the distance that the robot 100 moves between measurements is therefore extremely short. Even though the ultrasonic wave generators 552RR and 552RL generate ultrasonic waves alternately rather than simultaneously, the position of the robot 100 can still be measured almost exactly.
  • the frequencies of the ultrasonic waves generated by the ultrasonic wave generator 552RR and the ultrasonic wave generator 552RL may be different. In this case, even if the ultrasonic wave generation device 552RR and the ultrasonic wave generation device 552RL generate ultrasonic waves simultaneously and periodically, the microphone 174 can recognize which ultrasonic wave is generated.
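As a concrete illustration of the position and orientation calculation described in this modification, here is a minimal sketch in Python. It is not taken from the patent; the generator spacing, the speed of sound, and all function names are assumptions chosen for the example. It converts the infrared-to-ultrasound delays into the distances A and B, intersects the two circles to locate one microphone, and derives the robot's position and heading from the two microphone positions.

    import math

    SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumed)

    def distance_from_delay(delay_s):
        # The infrared pulse arrives effectively instantly, so the delay
        # between receiving it and the ultrasonic pulse equals the
        # ultrasonic travel time.
        return SPEED_OF_SOUND * delay_s

    def intersect_circles(p0, r0, p1, r1):
        # Intersect two circles centered at generator positions p0, p1;
        # keep the solution in front of the station (y > 0).
        dx, dy = p1[0] - p0[0], p1[1] - p0[1]
        d = math.hypot(dx, dy)
        a = (r0**2 - r1**2 + d**2) / (2 * d)   # p0 -> chord midpoint
        h = math.sqrt(max(r0**2 - a**2, 0.0))  # half-chord length
        mx, my = p0[0] + a * dx / d, p0[1] + a * dy / d
        c1 = (mx + h * dy / d, my - h * dx / d)
        c2 = (mx - h * dy / d, my + h * dx / d)
        return c1 if c1[1] > 0 else c2

    # Assumed layout: 552RR and 552RL sit 0.2 m apart, symmetric about
    # the virtual center line of the right space (the origin).
    GEN_RR = (+0.1, 0.0)
    GEN_RL = (-0.1, 0.0)

    def locate_robot(delays_left_mic, delays_right_mic):
        # Each argument is (delay from 552RR, delay from 552RL) in seconds.
        left = intersect_circles(GEN_RR, distance_from_delay(delays_left_mic[0]),
                                 GEN_RL, distance_from_delay(delays_left_mic[1]))
        right = intersect_circles(GEN_RR, distance_from_delay(delays_right_mic[0]),
                                  GEN_RL, distance_from_delay(delays_right_mic[1]))
        # Position: midpoint of the two back-mounted microphones.
        pos = ((left[0] + right[0]) / 2, (left[1] + right[1]) / 2)
        # Orientation: a normal to the segment joining the microphones
        # (which of the two normals is "forward" is a sign convention).
        seg = (right[0] - left[0], right[1] - left[1])
        heading = math.atan2(-seg[0], seg[1])
        return pos, heading

    # Example with assumed delay measurements (seconds):
    pos, heading = locate_robot((0.0030, 0.0032), (0.0031, 0.0033))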

Landscapes

  • Engineering & Computer Science (AREA)
  • Power Engineering (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • Charge And Discharge Circuits For Batteries Or The Like (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Toys (AREA)

Abstract

The present invention improves the accuracy with which a robot identifies a charging station. The charging station comprises: a charging space that has a power supply terminal; a charging control unit that charges a secondary battery built in the robot when the power supply terminal is connected to the robot in the charging space; a light emission instruction reception unit that receives, from the robot, a light emission signal designating a light emission mode; and a light emission control unit that changes a light emission mode of a light source according to the designated light emission mode. The robot comprises: an operation control unit that selects a motion; a drive mechanism that executes the selected motion; a light emission instruction transmission unit that transmits the light emission signal; and a light recognition unit that recognizes external light. The light recognition unit of the robot specifies external light corresponding to the designated light emission mode, and the operation control unit determines the moving direction of the robot by using a light emission point of the external light as a location point of the charging station.

Description

Robot, charging station for robot, and landmark device
The present invention relates to a robot that autonomously selects an action according to an internal state or an external environment, and a charging station for charging the robot.
Humans keep pets in search of healing. On the other hand, many people give up on keeping a pet for various reasons: they cannot secure enough time to care for it, their living environment does not allow pets, they have allergies, or bereavement is too painful. A robot that plays the role of a pet might give people who cannot keep a pet the kind of healing that a pet provides (see Patent Documents 1 and 2).
The movement of the robot is a prerequisite for the robot to have a presence as a companion like a pet. The robot is expected to have a function of searching out the charging station by itself and receiving appropriate charging without the assistance of the user (see Patent Documents 3 and 4).
Patent Document 1: Japanese Patent Laid-Open No. 2000-323219
Patent Document 2: International Publication No. WO 2017/169826
Patent Document 3: Japanese Patent Laid-Open No. 2001-125641
Patent Document 4: Japanese Patent Laid-Open No. 2004-151924
The charging station shown in Patent Document 3 displays a predetermined mark to inform the robot of its location, and the robot identifies the charging station by visually recognizing this mark (see paragraphs [0093] and [0094] of that document). The charging station of Patent Document 3 also notifies the robot of its location by generating sound waves or radio waves (see paragraphs [0099] and [0100]).
Charging stations are expected to be installed in a wide variety of environments. Therefore, even if a charging station displays a predetermined mark, the robot cannot always find that mark. For example, in a room where a mark similar to the charging station's mark is drawn on a wall, the robot may misidentify the position of the charging station. The same applies to radio waves and the like: the robot may mistake another signal resembling the charging station's signal for the signal from the charging station.
The present invention was completed based on the recognition of the above problems, and its main purpose is to provide a technique for increasing the accuracy with which a robot identifies a charging station.
A charging station according to an aspect of the present invention includes: a charging space having a power supply terminal; a charging control unit that charges a secondary battery built into the robot when the robot and the power supply terminal are connected in the charging space; a light source; a light emission instruction receiving unit that receives, from the robot, a light emission signal specifying a light emission mode; and a light emission control unit that changes the light emission mode of the light source according to the specified light emission mode.
A robot according to an aspect of the present invention includes: an operation control unit that selects a motion of the robot; a drive mechanism that executes the motion selected by the operation control unit; a light emission instruction transmission unit that transmits a light emission signal specifying a light emission mode; and a light recognition unit that recognizes external light.
The light emission instruction transmission unit transmits the light emission signal when the secondary battery built into the robot is to be charged. The light recognition unit identifies the external light corresponding to the specified light emission mode, and the operation control unit determines the moving direction of the robot by treating the light emission point of that external light as the location point of the charging station.
A landmark device according to an aspect of the present invention includes: a light source; a light emission instruction receiving unit that receives, from a robot, a light emission signal specifying a light emission mode; and a light emission control unit that changes the light emission mode of the light source according to the specified light emission mode.
A guidance system in an aspect of the invention includes a robot, a landmark device, and a charging station.
The robot includes: a light recognition unit that recognizes external light; an operation control unit that selects a motion of the robot; a drive mechanism that executes the motion selected by the operation control unit; and a light emission instruction transmission unit that transmits a light emission signal specifying a light emission mode.
The light emission instruction transmission unit transmits the light emission signal when the secondary battery built into the robot is to be charged. The light recognition unit identifies the external light corresponding to the specified light emission mode, and the operation control unit determines the moving direction of the robot with the light emission point of that external light as the movement target point.
The landmark device includes: a light source; a light emission instruction receiving unit that receives the light emission signal from the robot; and a light emission control unit that changes the light emission mode of the light source according to the light emission mode specified by the light emission signal.
The charging station includes: a charging space having a power supply terminal; a charging control unit that charges the secondary battery of the robot when the robot and the power supply terminal are connected in the charging space; a light source; a light emission instruction receiving unit that receives the light emission signal from the robot; and a light emission control unit that changes the light emission mode of the light source according to the light emission mode specified by the light emission signal.
The light emission instruction transmission unit of the robot transmits a first light emission signal specifying a first light emission mode. The light emission control unit of the landmark device changes the light emission mode of its light source in accordance with the first light emission mode. The light recognition unit of the robot identifies the external light in the first light emission mode, and the operation control unit of the robot determines the moving direction of the robot with the light emission point of that external light as the movement target point.
When the robot reaches the location point of the landmark device, the light emission instruction transmission unit of the robot transmits a second light emission signal specifying a second light emission mode. The light emission control unit of the charging station changes the light emission mode of its light source in accordance with the second light emission mode. The light recognition unit of the robot identifies the external light in the second light emission mode, and the operation control unit of the robot determines the next moving direction of the robot with the light emission point of that external light as the movement target point.
A guidance system in another aspect of the invention includes a first robot, a second robot, and a charging station.
Each of the first robot and the second robot includes: a light recognition unit that recognizes external light; an operation control unit that selects a motion of the robot; a drive mechanism that executes the motion selected by the operation control unit; a light emission instruction transmission unit that transmits a light emission signal specifying a light emission mode; a light source; a light emission instruction receiving unit that receives a light emission signal from another robot; and a light emission control unit that changes the light emission mode of the light source according to the light emission mode specified by the light emission signal.
The light emission instruction transmission unit transmits the light emission signal when the secondary battery built into the robot is to be charged. The light recognition unit identifies the external light corresponding to the specified light emission mode, and the operation control unit determines the moving direction of the robot with the light emission point of that external light as the movement target point.
The charging station includes: a charging space having a power supply terminal; a charging control unit that charges the secondary battery of a robot when the robot and the power supply terminal are connected in the charging space; a light source; a light emission instruction receiving unit that receives a light emission signal from a robot; and a light emission control unit that changes the light emission mode of the light source according to the light emission mode specified by the light emission signal.
The light emission instruction transmission unit of the first robot transmits a first light emission signal specifying a first light emission mode. The light emission control unit of the second robot changes the light emission mode of the second robot's light source in accordance with the first light emission mode. The light recognition unit of the first robot identifies the external light in the first light emission mode, and the operation control unit of the first robot determines the moving direction of the first robot with the light emission point of that external light as the movement target point.
When the first robot reaches the location point of the second robot, its light emission instruction transmission unit transmits a second light emission signal specifying a second light emission mode. The light emission control unit of the charging station changes the light emission mode of its light source in accordance with the second light emission mode. The light recognition unit of the first robot identifies the external light in the second light emission mode, and the operation control unit of the first robot determines the next moving direction of the first robot with the light emission point of that external light as the movement target point.
According to the present invention, it becomes easier for the robot to identify the location point of the charging station.
The above-mentioned object, as well as other objects, features, and advantages, will become clearer through the preferred embodiments described below and the accompanying drawings.
Brief Description of Drawings

  • A figure for explaining the outline of a charging system.
  • A front external view of the robot.
  • A side external view of the robot.
  • A sectional view schematically showing the structure of the robot.
  • A hardware configuration diagram of the robot in the basic configuration.
  • A functional block diagram of the robot system in the basic configuration.
  • A side view showing the robot wearing its outer skin.
  • A front view showing the robot wearing its outer skin.
  • A rear view showing the robot wearing its outer skin.
  • An external view of the station in this embodiment.
  • A functional block diagram of the station in this embodiment.
  • A functional block diagram of the robot in this embodiment.
  • A functional block diagram of the landmark device.
  • A sequence diagram showing the process by which the robot searches for the station.
  • An enlarged perspective view of the light emitting unit.
  • A schematic view of the light emitting unit seen from the front.
  • A schematic view of the light emitting unit seen from the side.
  • A schematic view for explaining the method of adjusting the robot's position using the short-distance guidance signals (short-distance infrared rays and ultrasonic waves).
  • A time chart for explaining the control method used when a plurality of robots wish to return to the station.
  • A schematic view for explaining the method of guiding the robot to the station by means of the landmark device.
  • A schematic view for explaining the method of guiding the robot to the station by means of another robot.
  • An external view of the station in a modified example.
  • A schematic view for explaining the method of identifying the position and traveling direction of the robot.
FIG. 1 is a diagram for explaining the outline of the charging system 10.
The charging system 10 includes a charging station 500 capable of charging two robots 100 at the same time. Hereinafter, the charging station may be referred to simply as the "station". The robot 100 is a wheel-driven autonomous robot with two front wheels and one rear wheel; the left and right front wheels are drive wheels, and the rear wheel is a driven wheel consisting of a caster (details will be described later).
The station 500 stages a nest (sleeping place) for a plurality of robots 100. Two charging spaces 502 (a left space 502L and a right space 502R) are arranged side by side and close together so that two robots 100 can charge next to each other in a friendly manner. The robot 100 returns to the nest to charge and faces forward while charging, showing its charm to those around it. The robot 100 backs into one of the two charging spaces 502; that is, the caster leads when it enters.
The charging space 502 is provided with a base 504 onto which the caster rides. When the caster reaches the target position on the base 504, the power supply terminal of the station 500 and the charging terminal of the robot 100 are stably connected and charging becomes possible.
The robot 100 transmits to the station 500 a light emission signal specifying a "light emission mode" such as a blinking cycle, and the station 500 causes its light emitting unit to emit light in the instructed light emission mode. In this embodiment, a "light emission mode" is defined as a combination of one or more of the blinking cycle, light emission amount, light emission color, and light emission pattern of the light emitting unit (light source).
A light emission pattern may be, for example, information specifying the distribution of on and off times, such as "on for 0.5 seconds, off for 1.5 seconds, on for 1.0 second, off for 0.5 seconds", or information specifying a combination of emission colors, such as "of the three light sources (red, yellow, and violet), turn on the red and yellow light sources". Hereinafter, the light generated by the station 500 is referred to as "guide light". The robot 100 detects the guide light and identifies the light emission point of the guide light as the location point of the station 500.
In the following, the basic configuration of the robot 100 is described with reference to FIGS. 2 to 5, and then the method by which the robot 100 identifies the location point of the station 500 is mainly described.
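To make this signaling concrete, here is a minimal sketch of how a light emission mode could be represented and checked against an observed blink sequence. This is illustrative only; the field names, tolerance value, and helper functions are assumptions, not the patent's actual data format.

    from dataclasses import dataclass, field

    @dataclass
    class LightEmissionMode:
        # On/off durations in seconds: on 0.5s, off 1.5s, on 1.0s, off 0.5s.
        pattern: list = field(default_factory=lambda: [0.5, 1.5, 1.0, 0.5])
        colors: tuple = ("red", "yellow")   # which of the station's light sources to use
        brightness: float = 1.0             # relative light emission amount

    def matches(observed_intervals, mode, tolerance=0.1):
        # The robot's light recognition unit compares the measured on/off
        # intervals of a candidate light against the requested pattern.
        if len(observed_intervals) != len(mode.pattern):
            return False
        return all(abs(o - p) <= tolerance
                   for o, p in zip(observed_intervals, mode.pattern))

    # The robot picks a mode, sends it to the station, then accepts only the
    # external light whose blink intervals match the mode it requested.
    requested = LightEmissionMode()
    print(matches([0.52, 1.47, 1.02, 0.49], requested))  # True: this is the station
    print(matches([0.5, 0.5, 0.5, 0.5], requested))      # False: some other light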
[Basic configuration]
FIG. 2 shows the appearance of the robot 100. FIG. 2A is a front view and FIG. 2B is a side view.
The robot 100 is an autonomous robot that decides its actions based on its external environment and internal state. The external environment is recognized by various sensors such as the camera and the thermosensor 115. The internal state is quantified as various parameters expressing the emotions of the robot 100. The robot 100 takes the interior of the owner's home as its range of action. Hereinafter, a person involved with the robot 100 is called a "user", and among users, the owner or administrator of the robot 100 is called the "owner".
The body 104 of the robot 100 has a rounded shape overall and includes an outer skin 314 formed of a soft, elastic material such as urethane, rubber, resin, or fiber. The robot 100 may be dressed in clothes. Its total weight is about 5 to 15 kilograms and its height about 0.5 to 1.2 meters. Attributes such as moderate weight, roundness, softness, and pleasant feel make the robot 100 easy and inviting for the user to hold.
The robot 100 includes a pair of front wheels 102 (left wheel 102a, right wheel 102b) and one rear wheel 103. The front wheels 102 are drive wheels and the rear wheel 103 is a driven wheel. The front wheels 102 have no steering mechanism, but the rotation speed and rotation direction of the left and right wheels can be controlled individually. The rear wheel 103 is a caster that rotates freely so that the robot 100 can move forward, backward, left, and right; it may be an omni wheel. By making the rotation speed of the right wheel 102b greater than that of the left wheel 102a, the robot 100 turns left or rotates counterclockwise; by making the rotation speed of the left wheel 102a greater, it turns right or rotates clockwise (see the sketch below).
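The turning behavior just described is standard differential-drive kinematics. The following sketch shows how left/right wheel speeds translate into linear and angular velocity; the wheel radius and track width are assumed values for illustration, not the robot 100's actual dimensions.

    WHEEL_RADIUS = 0.05  # meters (assumed)
    TRACK_WIDTH = 0.30   # distance between left and right wheels, meters (assumed)

    def body_velocity(omega_left, omega_right):
        # omega_* are wheel angular velocities in rad/s.
        v_left = WHEEL_RADIUS * omega_left
        v_right = WHEEL_RADIUS * omega_right
        v = (v_left + v_right) / 2             # forward speed of the body
        w = (v_right - v_left) / TRACK_WIDTH   # yaw rate; positive = counterclockwise
        return v, w

    # Right wheel faster than left -> positive yaw rate: the robot turns left
    # (counterclockwise), matching the behavior described in the text.
    print(body_velocity(10.0, 12.0))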
The front wheels 102 and the rear wheel 103 can be fully retracted into the body 104 by a drive mechanism (a rotation mechanism and a link mechanism). A pair of left and right covers 312 is provided on the lower half of the body 104. The covers 312 are made of a flexible, elastic resin material (rubber, silicone rubber, or the like), form a soft torso, and can house the front wheels 102. Each cover 312 has a slit 313 (opening) that opens from the side surface to the front surface, through which the front wheel 102 can be advanced and exposed to the outside.
Most of each wheel is hidden by the body 104 even while traveling, but when the wheels are fully retracted into the body 104, the robot 100 becomes immobile. That is, the body 104 descends and sits on the floor surface F as the wheels are retracted. In this seated state, the flat seating surface 108 (ground contact bottom surface) formed on the bottom of the body 104 contacts the floor surface F.
The robot 100 has two arms 106. There is a hand at the tip of each arm 106, but it has no function of gripping objects. Driven by actuators described later, the arms 106 can perform simple actions such as raising, bending, waving, and vibrating. The two arms 106 can be controlled individually.
A face area 116 is exposed at the front of the head of the robot 100, and two eyes 110 are provided in it. Each eye 110 can display images with a liquid crystal or organic EL element, and expresses gaze and facial expression by moving the pupil and eyelid displayed as an image. A nose 109 is provided at the center of the face area 116 and incorporates an analog stick that detects pressing in addition to all of the up, down, left, and right directions. The robot 100 is also provided with a plurality of touch sensors and can detect a user's touch over almost its entire body, including the head, torso, buttocks, and arms. It carries various sensors such as a microphone array that can identify the direction of a sound source and an ultrasonic sensor, and has a built-in speaker that can emit simple sounds.
A horn 112 is attached to the head of the robot 100. An omnidirectional camera 113 is attached to the horn 112 so that the entire area above the robot 100 can be imaged at once. The horn 112 also has a built-in thermosensor 115 (thermo camera), and is provided with a plurality of modules (not shown) for infrared communication, installed in a ring facing the surroundings, so the robot 100 can perform infrared communication while recognizing direction. Further, the horn 112 has an emergency stop switch: the user can bring the robot 100 to an emergency stop by pulling out the horn 112.
FIG. 3 is a sectional view schematically showing the structure of the robot 100.
The body 104 includes a main body frame 310, a pair of arms 106, a pair of covers 312, and the outer skin 314. The main body frame 310 includes a head frame 316 and a torso frame 318. The head frame 316 has a hollow hemispherical shape and forms the head skeleton of the robot 100. The torso frame 318 has a rectangular tube shape and forms the torso skeleton of the robot 100. The lower end of the torso frame 318 is fixed to a lower plate 334. The head frame 316 is connected to the torso frame 318 via a connection mechanism 330.
The torso frame 318 constitutes the axial core of the body 104. It is constructed by fixing a pair of left and right side plates 336 to the lower plate 334, and supports the pair of arms 106 and the internal mechanisms. The battery 118, a control circuit 342, various actuators, and the like are housed inside the torso frame 318. The bottom surface of the lower plate 334 forms the seating surface 108.
The torso frame 318 has an upper plate 332 at its top, to which a bottomed cylindrical support portion 319 is fixed. The upper plate 332, the lower plate 334, the pair of side plates 336, and the support portion 319 constitute the torso frame 318. The outer diameter of the support portion 319 is smaller than the distance between the left and right side plates 336. The pair of arms 106 is assembled integrally with an annular member 340 to form an arm unit 350. The annular member 340 is ring-shaped, and the pair of arms 106 is attached so as to be radially separated across its center line. The annular member 340 is coaxially fitted over the support portion 319 and rests on the upper end surfaces of the pair of side plates 336, so the arm unit 350 is supported from below by the torso frame 318.
The head frame 316 has a yaw axis 321, a pitch axis 322, and a roll axis 323. A head-shaking motion is realized by rotation (yawing) of the head frame 316 around the yaw axis 321; nodding, looking-up, and looking-down motions by rotation (pitching) around the pitch axis 322; and a motion of tilting the head left and right by rotation (rolling) around the roll axis 323. The position and angle of each axis in three-dimensional space can change according to how the connection mechanism 330 is driven. The connection mechanism 330 consists of a link mechanism and is driven by a plurality of motors installed on the torso frame 318.
The torso frame 318 houses a wheel drive mechanism 370, which includes a front wheel drive mechanism and a rear wheel drive mechanism for moving the front wheels 102 and the rear wheel 103 into and out of the body 104, respectively. The front wheels 102 and the rear wheel 103 function as a "moving mechanism" that moves the robot 100. Each front wheel 102 has a direct drive motor at its center, so the left wheel 102a and the right wheel 102b can be driven individually. The front wheels 102 are rotatably supported by wheel covers 105, which in turn are rotatably supported by the torso frame 318.
The pair of covers 312 is provided so as to cover the torso frame 318 from the left and right, and is given a smooth curved shape to round out the outline of the body 104. A closed space is formed between the torso frame 318 and each cover 312, and that closed space serves as the housing space S for the front wheel 102. The rear wheel 103 is housed in a housing space provided at the lower rear of the torso frame 318.
The outer skin 314 covers the main body frame 310 and the pair of arms 106 from the outside. It is thick enough for a person to feel elasticity and is formed of a stretchable material such as urethane sponge. As a result, when the user hugs the robot 100, they feel an appropriate softness and can interact with it naturally, as a person does with a pet. The outer skin 314 is attached to the main body frame 310 in such a way that the covers 312 remain exposed. An opening 390 is provided at the upper end of the outer skin 314, through which the horn 112 is passed.
Touch sensors are arranged between the main body frame 310 and the outer skin 314, and a touch sensor is embedded in each cover 312. All of these touch sensors are capacitance sensors and detect touches over almost the entire area of the robot 100. The touch sensors may instead be embedded in the outer skin 314 or arranged inside the main body frame 310.
Each arm 106 has a first joint 352 and a second joint 354, with an arm segment 356 between the two joints and a hand 358 beyond the second joint 354. The first joint 352 corresponds to a shoulder joint and the second joint 354 to a wrist joint. A motor is provided at each joint to drive the arm segment 356 and the hand 358. The drive mechanism for driving the arm 106 includes these motors and their drive circuit 344.
FIG. 4 is a hardware configuration diagram of the robot 100.
The robot 100 includes the internal sensor 128, a communication device 126, a storage device 124, a processor 122, the drive mechanism 120, and the battery 118. The drive mechanism 120 includes the connection mechanism 330 and the wheel drive mechanism 370 described above. The processor 122 and the storage device 124 are included in the control circuit 342. The units are connected to one another by a power supply line 130 and a signal line 132. The battery 118 supplies power to each unit via the power supply line 130, and each unit sends and receives control signals via the signal line 132. The battery 118 is a lithium-ion secondary battery and is the power source of the robot 100.
The internal sensor 128 is the collection of sensors built into the robot 100: a camera, a microphone array, a distance measuring sensor (infrared sensor), the thermosensor 115, touch sensors, an acceleration sensor, an atmospheric pressure sensor, an odor sensor, and so on. The touch sensors cover most of the area of the body 104 and detect a user's touch based on changes in capacitance. The odor sensor is a known sensor that applies the principle that electrical resistance changes when the molecules responsible for an odor are adsorbed.
The communication device 126 is a communication module that performs wireless communication with various external devices. The storage device 124 is composed of non-volatile and volatile memory and stores computer programs and various setting information. The processor 122 is the means of executing the computer programs. The drive mechanism 120 includes a plurality of actuators. In addition, a display, speakers, and other components are also installed.
The drive mechanism 120 mainly controls the wheels and the head. It changes the moving direction and speed of the robot 100 and can also raise and lower the wheels. When the wheels rise, they are completely housed in the body 104, and the robot 100 contacts the floor surface F at the seating surface 108 and enters the seated state. The drive mechanism 120 also controls the arms 106.
FIG. 5 is a functional block diagram of the robot system 300.
The robot system 300 includes the robot 100, a server 200, and a plurality of external sensors 114. Each component of the robot 100 and the server 200 is realized by hardware, including computing units such as a CPU (Central Processing Unit) and various coprocessors, storage devices such as memory and storage, and wired or wireless communication lines connecting them, together with software stored in the storage devices that supplies processing instructions to the computing units. The computer programs may consist of device drivers, an operating system, various application programs located in the layers above them, and libraries that provide common functions to these programs. Each block described below represents a functional block, not a hardware-level configuration.
Some of the functions of the robot 100 may be realized by the server 200, and some or all of the functions of the server 200 may be realized by the robot 100.
A plurality of external sensors 114 is installed in the house in advance. The server 200 manages the external sensors 114 and provides the robot 100 with detection values acquired by them as needed. The robot 100 decides its basic actions based on information obtained from the internal sensor 128 and the external sensors 114. The external sensors 114 reinforce the robot 100's sensory organs, and the server 200 reinforces its processing capacity. The communication device 126 of the robot 100 may communicate with the server 200 periodically, and the server 200 may take on the process of identifying the position of the robot 100 using the external sensors 114 (see also Patent Document 2).
(Server 200)
The server 200 includes a communication unit 204, a data processing unit 202, and a data storage unit 206.
The communication unit 204 is in charge of communication with the external sensors 114 and the robot 100. The data storage unit 206 stores various data. The data processing unit 202 executes various processes based on the data acquired by the communication unit 204 and the data stored in the data storage unit 206, and also functions as an interface between the communication unit 204 and the data storage unit 206.
The data storage unit 206 includes a motion storage unit 232 and a personal data storage unit 218.
The robot 100 has a plurality of motion patterns (motions). Various motions are defined, such as waving the arm 106, approaching the owner while meandering, and staring at the owner with its head tilted.
The motion storage unit 232 stores "motion files" that define the control content of each motion. Each motion is identified by a motion ID. The motion files are also downloaded to the motion storage unit 160 of the robot 100. Which motion to execute may be decided by the server 200 or by the robot 100.
Many of the motions of the robot 100 are configured as compound motions that include a plurality of unit motions. For example, the robot 100 approaching the owner may be expressed as a combination of a unit motion of turning toward the owner, a unit motion of approaching while raising a hand, a unit motion of approaching while shaking its body, and a unit motion of sitting down while raising both hands. The combination of these four motions realizes the motion "approach the owner, raise a hand partway, and finally sit down after shaking the body". In a motion file, the rotation angles, angular velocities, and the like of the actuators provided in the robot 100 are defined in association with a time axis. Various motions are expressed by controlling each actuator over time according to the motion file (actuator control information).
The transition time when changing from one unit motion to the next is called an "interval". An interval may be defined according to the time required for the unit motion change and the content of the motion, and its length is adjustable.
Hereinafter, the settings involved in controlling the behavior of the robot 100, such as when to select which motion and how to adjust the output of each actuator in realizing a motion, are collectively called "behavior characteristics". The behavior characteristics of the robot 100 are defined by a motion selection algorithm, motion selection probabilities, motion files, and the like. (A sketch of these structures follows below.)
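As an illustration of the structures just described, here is a minimal sketch of a motion file made of unit motions with intervals. The class and field names are assumptions for the example, not the patent's actual file format.

    from dataclasses import dataclass

    @dataclass
    class Keyframe:
        time_s: float   # position on the time axis
        angles: dict    # actuator name -> target rotation angle (degrees)

    @dataclass
    class UnitMotion:
        name: str
        keyframes: list  # actuator targets over time

    @dataclass
    class CompoundMotion:
        motion_id: str
        units: list      # unit motions played in order
        intervals: list  # transition time (s) before each following unit

    # The four-unit "approach the owner" example from the text:
    approach_owner = CompoundMotion(
        motion_id="M001",
        units=[
            UnitMotion("turn_toward_owner", [Keyframe(0.0, {"yaw": 30.0})]),
            UnitMotion("approach_raising_hand", [Keyframe(0.0, {"shoulder": 60.0})]),
            UnitMotion("approach_shaking_body", [Keyframe(0.0, {"roll": 10.0})]),
            UnitMotion("sit_with_both_hands_up", [Keyframe(0.0, {"shoulder": 90.0})]),
        ],
        intervals=[0.3, 0.3, 0.5],  # adjustable transition times between units
    )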
In addition to motion files, the motion storage unit 232 stores a motion selection table that defines the motions to be executed when various events occur. In the motion selection table, one or more motions and their selection probabilities are associated with each event.
The personal data storage unit 218 stores user information: specifically, master information indicating the intimacy with each user and each user's physical and behavioral characteristics. Other attribute information such as age and gender may also be stored.
The robot 100 has an internal parameter called intimacy for each user. When the robot 100 recognizes an action showing goodwill toward it, such as being picked up or spoken to, its intimacy with that user increases. Intimacy is low toward users not involved with the robot 100, users who treat it roughly, and users it meets infrequently.
The data processing unit 202 includes a position management unit 208, a recognition unit 212, an operation control unit 222, an intimacy management unit 220, and a state management unit 244.
The position management unit 208 identifies the position coordinates of the robot 100. The state management unit 244 manages various internal parameters, including physical states such as the charging rate, internal temperature, and processing load of the processor 122. It also manages various emotion parameters indicating the emotions of the robot 100 (loneliness, curiosity, desire for approval, and so on). These emotion parameters are always fluctuating, and the movement target point of the robot 100 changes according to them. For example, when loneliness is high, the robot 100 sets the place where a user is as its movement target point.
The emotion parameters change over time, and also change through the response actions described later. For example, the emotion parameter indicating loneliness decreases when the owner gives the robot a hug, and gradually increases when the robot does not see the owner for a long time.
The recognition unit 212 recognizes the external environment. This recognition includes many kinds, such as recognizing weather and season based on temperature and humidity, and recognizing sheltered spots (safe zones) based on light intensity and temperature. The recognition unit 156 of the robot 100 acquires various environmental information with the internal sensor 128, performs primary processing on it, and transfers it to the recognition unit 212 of the server 200.
Specifically, the recognition unit 156 of the robot 100 extracts from an image the regions corresponding to moving objects, particularly people and animals, and extracts from each region a "feature vector", a set of feature quantities indicating the physical and behavioral characteristics of the moving object. Each feature vector component (feature quantity) is a numerical value quantifying some physical or behavioral feature. For example, the width of a human eye is normalized to the range 0 to 1 and forms one feature vector component. The method of extracting a feature vector from a captured image of a person is an application of known face recognition techniques. The robot 100 transmits the feature vector to the server 200.
The recognition unit 212 of the server 200 determines which person the imaged user corresponds to by comparing the feature vector extracted from the image captured by the robot 100's built-in camera with the feature vectors of users (clusters) registered in advance in the personal data storage unit 218 (user identification processing). The recognition unit 212 also estimates the user's emotion by image recognition of the user's facial expression, and performs user identification processing on moving objects other than people, such as pet cats and dogs. (A sketch of this matching follows below.)
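The user identification step is essentially nearest-neighbor matching of feature vectors. The sketch below shows one plausible realization; the distance metric, threshold, and registry format are assumptions, not details given in the patent.

    import math

    # Registered users: user name -> feature vector (each component in [0, 1]).
    REGISTERED = {
        "owner_a": [0.62, 0.31, 0.85],
        "owner_b": [0.20, 0.74, 0.40],
    }

    def identify_user(feature_vector, threshold=0.25):
        # Returns the registered user whose vector is closest, or None if
        # no registered vector is near enough (an unknown person).
        best_name, best_dist = None, float("inf")
        for name, registered in REGISTERED.items():
            dist = math.dist(feature_vector, registered)  # Euclidean distance
            if dist < best_dist:
                best_name, best_dist = name, dist
        return best_name if best_dist <= threshold else None

    print(identify_user([0.60, 0.33, 0.84]))  # "owner_a"
    print(identify_user([0.95, 0.05, 0.10]))  # None: unregistered person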
The recognition unit 212 recognizes the various response actions performed on the robot 100 and classifies them as pleasant or unpleasant actions. By recognizing an owner's response to an action of the robot 100, the recognition unit 212 also classifies the response as an affirmative or negative reaction.
Pleasant and unpleasant actions are distinguished by whether a user's response action is comfortable or uncomfortable for a living creature. For example, being hugged is a pleasant action for the robot 100, and being kicked is an unpleasant action. Affirmative and negative reactions are distinguished by whether a user's response action indicates a pleasant or an unpleasant feeling on the user's part. Being hugged is an affirmative reaction indicating the user's pleasant feeling, and being kicked is a negative reaction indicating the user's unpleasant feeling.
The operation control unit 222 of the server 200 determines the motion of the robot 100 in cooperation with the operation control unit 150 of the robot 100. The operation control unit 222 creates a movement target point for the robot 100 and a movement route to it; it may create a plurality of movement routes and then select one of them.
The operation control unit 222 selects the motion of the robot 100 from the plurality of motions in the motion storage unit 232. A selection probability is associated with each motion for each situation. For example, a selection method may be defined such that motion A is executed with a probability of 20% when the owner performs a pleasant action, and motion B is executed with a probability of 5% when the temperature reaches 30 degrees or more.
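As an illustration of such a selection table, a minimal Python sketch follows; the table contents and the fallback behavior are assumptions, not part of the disclosure.

import random

# Hypothetical selection table: situation -> list of (motion ID, probability).
# Probabilities need not sum to 1; with the remaining probability no
# special motion fires and default behavior continues.
selection_table = {
    "pleasant_action_by_owner": [("motion_A", 0.20)],
    "temperature_30_or_more":   [("motion_B", 0.05)],
}

def select_motion(situation):
    r = random.random()
    cumulative = 0.0
    for motion_id, probability in selection_table.get(situation, []):
        cumulative += probability
        if r < cumulative:
            return motion_id
    return None  # fall through to default behavior

print(select_motion("pleasant_action_by_owner"))  # "motion_A" on about 20% of runs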
The intimacy management unit 220 manages a degree of intimacy for each user. As described above, intimacy is registered as part of the personal data in the personal data storage unit 218. When a pleasant action is detected, the intimacy management unit 220 raises the intimacy toward that owner; when an unpleasant action is detected, it lowers the intimacy. The intimacy toward an owner who has not been visually recognized for a long period also gradually decreases.
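A minimal sketch of this intimacy bookkeeping, assuming arbitrary step sizes, a 0-100 range, and a daily decay of one point; none of these values are fixed by the disclosure.

# Hypothetical intimacy bookkeeping; all numeric parameters are assumed.
intimacy = {"owner_A": 50}

def on_response_action(user, pleasant):
    delta = 3 if pleasant else -5
    intimacy[user] = max(0, min(100, intimacy.get(user, 0) + delta))

def on_daily_tick(seen_users):
    # Intimacy toward owners not visually recognized decays gradually.
    for user in intimacy:
        if user not in seen_users:
            intimacy[user] = max(0, intimacy[user] - 1)

on_response_action("owner_A", pleasant=True)
on_daily_tick(seen_users=set())
print(intimacy)  # {'owner_A': 52}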
(Robot 100)
The robot 100 includes a communication unit 142, a data processing unit 136, a data storage unit 148, an internal sensor 128, and a drive mechanism 120.
The communication unit 142 corresponds to the communication device 126 (see FIG. 4) and is in charge of communication processing with the external sensor 114, the server 200, and other robots 100. The data storage unit 148 stores various data and corresponds to the storage device 124 (see FIG. 4). The data processing unit 136 executes various processes based on the data acquired by the communication unit 142 and the data stored in the data storage unit 148. The data processing unit 136 corresponds to the processor 122 and the computer programs executed by the processor 122, and also functions as an interface for the communication unit 142, the internal sensor 128, the drive mechanism 120, and the data storage unit 148.
The data storage unit 148 includes a motion storage unit 160 in which the various motions of the robot 100 are defined.
Various motion files are downloaded from the motion storage unit 232 of the server 200 into the motion storage unit 160 of the robot 100; each motion is identified by a motion ID. To express the various motions (sitting down with the front wheels 102 retracted, lifting the arms 106, making the robot 100 spin by rotating the two front wheels 102 in opposite directions or by rotating only one of them, trembling by rotating the front wheels 102 while they are retracted, stopping once and looking back when moving away from the user, and so on), the operation timing, operation duration, and operation direction of the various actuators (the drive mechanism 120) are defined as a time series in a motion file.
Various data may also be downloaded into the data storage unit 148 from the personal data storage unit 218.
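As an illustration of how a motion file might encode such a time series, a Python sketch follows; the field names and the scheduling interface are hypothetical.

# Hypothetical layout of a motion file: a motion ID plus a time series of
# actuator commands (timing, duration, direction), as described above.
motion_file = {
    "motion_id": "stop_and_look_back",
    "keyframes": [
        {"t_ms": 0,   "actuator": "front_wheels", "action": "stop"},
        {"t_ms": 200, "actuator": "neck",         "action": "turn_back"},
        {"t_ms": 900, "actuator": "front_wheels", "action": "reverse",
         "duration_ms": 400},
    ],
}

class DriveMechanismStub:
    """Stand-in for the drive mechanism 120; just prints each command."""
    def schedule(self, command):
        print("schedule:", command)

def play_motion(motion, drive_mechanism):
    # Feed each time-stamped actuator command to the drive mechanism in order.
    for kf in sorted(motion["keyframes"], key=lambda k: k["t_ms"]):
        drive_mechanism.schedule(kf)

play_motion(motion_file, DriveMechanismStub())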
The data processing unit 136 includes a recognition unit 156 and an operation control unit 150.
The operation control unit 150 of the robot 100 determines the motion of the robot 100 in cooperation with the operation control unit 222 of the server 200. Some motions may be determined by the server 200 and others by the robot 100. The robot 100 may normally determine its motions while the server 200 takes over when the processing load of the robot 100 is high, or a base motion may be determined by the server 200 with an additional motion determined by the robot 100. How the motion determination processing is shared between the server 200 and the robot 100 can be designed according to the specifications of the robot system 300.
The operation control unit 150 of the robot 100 instructs the drive mechanism 120 to execute the selected motion. The drive mechanism 120 controls each actuator according to the motion file.
When a user with high intimacy is nearby, the operation control unit 150 can execute a motion of lifting both arms 106 as a gesture of asking for a hug; when the robot tires of the hug, it can express reluctance by alternately repeating reverse rotation and stopping of the left and right front wheels 102 while they remain retracted. The drive mechanism 120 drives the front wheels 102, the arms 106, and the neck (head frame 316) in accordance with instructions from the operation control unit 150, causing the robot 100 to express various motions.
The recognition unit 156 of the robot 100 interprets external information obtained from the internal sensor 128. The recognition unit 156 is capable of visual recognition (a visual organ), odor recognition (an olfactory organ), sound recognition (an auditory organ), and tactile recognition (a tactile organ).
The recognition unit 156 extracts a feature vector from a captured image of a moving object. As described above, the feature vector is a set of parameters (feature quantities) indicating the physical and behavioral characteristics of the moving object. When a moving object is detected, physical and behavioral characteristics are also extracted from the odor sensor, the built-in microphone, the temperature sensor, and the like; these characteristics are likewise quantified into feature vector components. The recognition unit 156 identifies the user from the feature vector based on known techniques such as that described in Patent Document 2.
Within the series of recognition processing comprising detection, analysis, and determination, the recognition unit 156 of the robot 100 selects and extracts the information necessary for recognition, while interpretation processing such as determination is executed by the recognition unit 212 of the server 200. The recognition processing may instead be performed entirely by the recognition unit 212 of the server 200 or entirely by the recognition unit 156 of the robot 100, or the two may share roles as described above.
When a strong impact is applied to the robot 100, the recognition unit 156 recognizes it via the touch sensor and the acceleration sensor, and the recognition unit 212 of the server 200 recognizes that a "violent act" has been performed by a nearby user. When a user grabs the horn 112 to lift the robot 100, this may also be recognized as a violent act. When a user directly facing the robot 100 speaks within a specific volume range and a specific frequency band, the recognition unit 212 of the server 200 may recognize that a "calling act" has been performed toward the robot. When a temperature around body temperature is detected, the robot recognizes that a "contact act" has been performed by a user, and when upward acceleration is detected while contact is recognized, it recognizes that a "hug" has been performed. Physical contact when the user lifts the body 104 may be sensed, or a hug may be recognized from a drop in the load applied to the front wheels 102.
In summary, the robot 100 acquires a user's actions as physical information via the internal sensor 128, the recognition unit 212 of the server 200 determines pleasantness or unpleasantness, and the recognition unit 212 also executes user identification processing based on the feature vector.
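A minimal Python sketch of this sensor-based classification; all thresholds and the exact rule ordering are assumptions.

# Hypothetical classification of response actions from sensor readings,
# following the rules described above; the numeric thresholds are assumed.
def classify_response(touching, temperature_c, upward_accel_g, strong_impact):
    if strong_impact:                                # touch sensor + accelerometer
        return "violent_act"                         # unpleasant, negative
    if touching and 30.0 <= temperature_c <= 40.0:   # around body temperature
        if upward_accel_g > 1.0:                     # lifted while being touched
            return "hug"                             # pleasant, affirmative
        return "contact_act"                         # pleasant
    return None

print(classify_response(True, 36.0, 1.5, False))  # -> "hug"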
The recognition unit 212 of the server 200 recognizes the various responses of users to the robot 100. Some typical response actions are associated with pleasantness or unpleasantness, and with affirmation or negation. In general, almost all response actions that are pleasant actions are affirmative reactions, and almost all that are unpleasant actions are negative reactions. Pleasant and unpleasant actions relate to intimacy, while affirmative and negative reactions influence the action selection of the robot 100.
According to the response action recognized by the recognition unit 156, the intimacy management unit 220 of the server 200 changes the intimacy toward the user. In principle, intimacy toward a user who performs a pleasant action rises, and intimacy toward a user who performs an unpleasant action falls.
With the above basic configuration as a premise, the implementation of the robot 100 in the present embodiment will now be described, focusing on the features and purposes of this implementation and on its differences from the basic configuration.
[Implementation of the search function for the station 500]
FIG. 6 shows the robot 100 wearing the outer cover 314. FIG. 6A is a right side view, FIG. 6B is a front view, and FIG. 6C is a rear view.
The appearance of the robot 100 is substantially bilaterally symmetrical.
An accommodation opening 377 for housing the rear wheel 103 is provided at the lower rear of the body frame 318 of the robot 100. A pair of charging terminals 510 protrude to the left and right of the accommodation opening 377. The base end of each charging terminal 510 is located inside the body frame 318 and is connected to the charging circuit via wiring (not shown). The tip of each charging terminal 510 is a disk of slightly larger diameter, giving it a button-like form.
The outer cover 314 is constructed by sewing together an outer cover body 420 and an elastic mounting portion 422, both made of flexible material. The outer cover body 420 includes a bag-shaped portion 424 that covers the head frame 316, a pair of hand portions 426 extending downward from the left and right sides of the bag-shaped portion 424, an extending portion 428 extending downward from its front, and an extending portion 430 extending downward from its back. An opening 432 for exposing the face region 116 is provided on the front side of the bag-shaped portion 424.
The elastic mounting portion 422 constitutes the bottom of the outer cover 314 and connects the front and rear extending portions 428 and 430 of the outer cover body 420 at the bottom. The elastic mounting portion 422 has an opening 434 at a position corresponding to the accommodation opening 377, and a pair of holes 436 formed at its lower rear. Each hole 436 has a narrow, buttonhole-like shape, but because the elastic mounting portion 422 is flexible, it can be stretched open in the width direction. The pair of charging terminals 510 are inserted through these holes 436; after insertion, each hole 436 returns to its original narrow shape by elastic force. The head of each charging terminal 510 thereby catches on the periphery of its hole 436, preventing the outer cover 314 from slipping off. That is, each charging terminal 510 serves both as a terminal for charging and as a member for fixing the outer cover 314.
An infrared sensor 172 and a pair of microphones 174 are provided on the rear cover 107 (the tail) of the robot 100 as part of the internal sensor 128: the infrared sensor 172 at the center of the rear cover 107, the left microphone 174L on its left, and the right microphone 174R on its right. With the rear cover 107 open and the rear wheel 103 deployed, these face the rear of the robot 100. The infrared sensor 172 and the pair of microphones 174 are used for guidance control when the robot 100 enters the station 500.
FIG. 7 is a perspective view showing the appearance of the station 500.
In the following description, for convenience, the far side of the station 500 in the robot 100's direction of approach is sometimes called the "back side", and the near side in the direction of approach the "near side" or "front side".
The station 500 incorporates the server 200 and includes a base 504 and a main body 512, as well as decorative members such as a pair of rear panels 508 (a left panel 508L and a right panel 508R). The station 500 provides the charging function and the server function in a single housing.
The base 504 is rectangular in plan view, with charging spaces 502 provided on its left and right. The base 504 includes power supply terminals 530. When a power supply terminal 530 connects with a charging terminal 510 of the robot 100, electric power is supplied from the station 500 to the robot 100.
The main body 512 stands at the center of the upper surface of the base 504 and has a housing 514 whose upper half is enlarged. The pair of rear panels 508 are arranged at the front left and right of the housing 514, each detachably attached to the main body 512 via a fixing member 509. Each fixing member 509 is an arm-shaped member, one end of which is detachably fixed to the back of a rear panel 508 and the other end to the back of the housing 514.
On the left and right of the housing 514, short-distance guiding units 252 (a left guiding unit 252L and a right guiding unit 252R) are provided below the respective rear panels 508. A medium-distance guiding unit 254 is installed at the center of the housing 514.
Each short-distance guiding unit 252 includes a short-distance infrared generator that emits infrared rays at a radiation angle of about 30 to 60 degrees, and an ultrasonic generator that emits ultrasonic waves. The transmission range of the infrared rays from the short-distance infrared generator (hereinafter "short-distance infrared rays") is about 1 meter at most. The medium-distance guiding unit 254 includes a medium-distance infrared transmitter that likewise emits infrared rays; the transmission range of its infrared rays (hereinafter "medium-distance infrared rays") is about 4 meters at most.
The medium-distance infrared rays are set to a higher light intensity (power) than the short-distance infrared rays. In the present embodiment the short-distance guiding unit 252 and the medium-distance guiding unit 254 are separate components, but the short-distance guiding unit 252 and the medium-distance infrared generator may instead be constructed as a single unit.
The short-distance infrared rays and ultrasonic waves generated by the short-distance guiding unit 252 are collectively called the "short-distance guidance signal". The medium-distance infrared rays generated by the medium-distance guiding unit 254 are called the "medium-distance guidance signal". The short-distance and medium-distance guidance signals are collectively called "guidance signals".
A light emitting unit 256 is installed at the top of the housing 514. The light emitting unit 256 includes a plurality of LED (Light Emitting Diode) light sources and conveys the position of the station 500 to the robot 100 by visible light (guide light).
The robot 100 of the present embodiment is described as periodically returning to a charging space 502 of the station 500 and periodically receiving charging from the station 500. In the present embodiment, the robot 100 repeats an action pattern of being active for 45 minutes and then charging (resting) for 15 minutes. Hereinafter, the process by which the robot 100 returns to a charging space 502 is called the "return process". The state in which the robot 100 has entered a charging space 502 and the charging terminals 510 and power supply terminals 530 are connected or connectable is called "docking".
FIG. 8 is a functional block diagram of the station 500 in the present embodiment.
The robot 100 of the present embodiment acquires a large number of captured images (still images) by periodically imaging its surroundings with the omnidirectional camera 113, and forms a memory based on these captured images (hereinafter, "image memory").
The image memory is a collection of keyframes. A keyframe is distribution information of feature points (feature quantities) in a captured image. The robot 100 of the present embodiment forms keyframes by graph-based SLAM (Simultaneous Localization and Mapping) using image features, more specifically by SLAM based on ORB (Oriented FAST and Rotated BRIEF) features (see Patent Document 5).
By periodically forming keyframes while moving, the robot 100 builds the image memory as a collection of keyframes, in other words as an image feature distribution. The robot 100 estimates its current location by comparing the keyframe acquired at that location with the many keyframes it already holds. That is, the robot 100 performs "spatial recognition" by comparing the captured image it is actually seeing with captured images it has seen before (its memory), reconciling its present situation with its past memory. The image memory formed as a collection of feature points thus serves as a map. The robot 100 updates the map as it moves while estimating its current location.
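As a rough illustration of keyframe matching only (the actual system uses graph-based SLAM on ORB features per Patent Document 5, which involves pose-graph optimization well beyond this sketch), a Python sketch using OpenCV's ORB detector and brute-force Hamming matcher follows; scoring similarity by raw match count is an assumption.

import cv2  # OpenCV, used here purely for illustration

orb = cv2.ORB_create()
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def keyframe(image):
    # A keyframe reduced to its ORB binary descriptors.
    _keypoints, descriptors = orb.detectAndCompute(image, None)
    return descriptors

def best_matching_keyframe(current, stored_keyframes):
    # Score each stored keyframe by its number of descriptor matches with
    # the current one; the best score suggests the current location.
    best_index, best_score = None, 0
    for i, stored in enumerate(stored_keyframes):
        if current is None or stored is None:
            continue
        score = len(matcher.match(current, stored))
        if score > best_score:
            best_index, best_score = i, score
    return best_index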
The robot 100 of the basic configuration is premised on recognizing its position via the external sensors 114 rather than keyframes. The robot 100 of the present embodiment is described as recognizing location based only on keyframes.
The station 500 incorporates the server 200 and the charging device 506. The communication unit 204 of the station 500 is shared between the server 200 and the charging device 506. Data used by the robot 100, such as the map, is also shared among a plurality of robots 100: the map storage unit 170 builds a common (single) map by acquiring keyframes from each robot 100.
In the present embodiment, a "guidance system (for the robot 100)" is formed by the station 500 incorporating the server 200 and the charging device 506, together with one or more robots 100. The guidance system may further include a landmark device 280, which supports the return process of the robot 100. The role of the landmark device 280 will be described later with reference to FIG. 17. When a plurality of robots 100 are present, one of the robots 100 can also serve as the landmark device 280; a method of making a robot 100 function as the landmark device 280 will be described later with reference to FIG. 18.
(Server 200)
Each function of the server 200 is realized by loading the program that implements that function into memory and instantiating it. The processing capacity of the server 200 supplements the various processes of the robot 100; the server 200 can be used as a resource of the robot 100. How the resources of the server 200 are used is determined dynamically according to requests from the robot 100. For example, when the robot 100 needs to continuously generate complex motions according to detection values from many touch sensors, the processing of the processor 122 in the robot 100 may be preferentially assigned to motion selection and generation, while image recognition of the surroundings is performed by the recognition unit 212 of the server 200. In this way, the various processes of the robot system 300 can be distributed between the robot 100 and the server 200. Each function of the server 200 is instantiated independently for each robot 100; for example, the server 200 may prepare a recognition unit 212 for robot 100B separate from the recognition unit 212 for robot 100A.
The server 200 assists the robots 100 present at its installation location (its assigned area). For example, the server 200 is installed in a user's home and used for the robots 100 present there. A robot 100 generates a map for its movement, and this map is shared by the plurality of robots 100 in the user's home. The robot 100 forms the map using SLAM. The feature points (keyframes) forming the map include those arising from static, non-moving structures such as walls, and those arising from dynamic, movable objects such as chairs and toys. A map is therefore not the kind of information that can be created once and used permanently; it must be updated constantly. Having a plurality of robots 100 share the map allows it to be updated efficiently.
The position management unit 208 of the data processing unit 202 includes a map management unit 168, which manages the map based on the image memory. The map management unit 168 repeatedly updates the map shared by the plurality of robots 100.
The data storage unit 206 further includes a map storage unit 170. Because the data storage unit 206 collectively stores the information of the plurality of robots 100, the robots 100 can share the various data in the data storage unit 206. The map storage unit 170 stores the map.
(Communication unit 204)
The communication unit 204 includes an event receiving unit 246, a light emission instruction receiving unit 248, a docking request receiving unit 250, and a docking permission transmitting unit 258.
The event receiving unit 246 receives from the robot 100 event information (environment information) indicating the various events the robot 100 has recognized. The recognition unit 212 of the server 200 analyzes the event information. Based on the event information, the intimacy management unit 220 and the state management unit 244 change the intimacy, the emotion parameters, and the like, and the operation control unit 222 selects the motion of the robot 100.
The light emission instruction receiving unit 248 receives a light emission signal from the robot 100. The light emission signal is a signal specifying the light emission mode (the manner of lighting) of the guide light. The docking request receiving unit 250 receives a docking request signal from the robot 100, by which the robot 100 requests to dock with (return to) the station 500. In response to the docking request signal, the docking permission transmitting unit 258 returns to the robot 100 a docking permission signal indicating whether docking is permitted.
(Charging device 506)
The charging device 506 includes the light emission control unit 224, a docking determination unit 262, a charging control unit 264, the light emitting unit 256, and a guiding unit 266.
The light emitting unit 256 is the light source that generates the guide light, and the light emission control unit 224 controls its light emission mode. When a docking request signal is received, the docking determination unit 262 determines whether docking is permitted. The charging control unit 264 charges a docked robot 100. The guiding unit 266 corresponds to the medium-distance guiding unit 254 and the short-distance guiding units 252; the short-distance guiding units 252 comprise the left guiding unit 252L and the right guiding unit 252R.
In the present embodiment, the light emitting unit 256 is normally lit at all times. The robot 100 recognizes the location of the station 500 by detecting the guide light from the light emitting unit 256. By specifying the light emission mode of the guide light, the robot 100 can actively confirm via the light emission signal whether a detected light is the guide light or some other light (details are given later). Alternatively, the light emitting unit 256 may be kept off at all times and configured to emit light only when a light emission signal is received.
The robot 100 approaches the station 500 using the guide light as a landmark. It then detects the medium-distance infrared rays (the medium-distance guidance signal) from the medium-distance guiding unit 254 and aligns itself with a charging space 502. As the robot 100 comes still closer to the station 500, it docks in the charging space 502 based on the short-distance guidance signal (short-distance infrared rays and ultrasonic waves) from the short-distance guiding unit 252. When the charging terminals 510 and the power supply terminals 530 are connected, the charging control unit 264 charges the battery 118 of the robot 100.
FIG. 9 is a functional block diagram of the robot 100 in the present embodiment.
The robot 100 further includes a light emitting unit 138 (a light source), which emits light when the robot 100 functions as a landmark device. The light emitting unit 138 is provided at an arbitrary location, such as the horn 112 of the robot 100; the eyes 110 of the robot 100 may also be made to emit light.
The data processing unit 136 further includes a captured image acquisition unit 146, a light emission control unit 158, and a battery level monitoring unit 176.
The captured image acquisition unit 146 periodically acquires captured images from the omnidirectional camera 113. The light emission control unit 158 controls the light emission mode of the light emitting unit 138. The battery level monitoring unit 176 monitors the remaining battery level (charging rate) of the battery 118.
The recognition unit 156 further includes an image feature acquisition unit 152, a light recognition unit 154, a distance measuring unit 162, and a position determination unit 166.
The image feature acquisition unit 152 generates keyframes by extracting image features from captured images. The light recognition unit 154 recognizes light emitted from the station 500 or the landmark device 280; in the present embodiment it recognizes external light via the omnidirectional camera 113, but it may instead do so via a separately provided optical sensor. The distance measuring unit 162 calculates the distance and relative angle between the robot 100 and the station 500 based on the medium-distance infrared rays, the short-distance infrared rays, and the ultrasonic waves. The position determination unit 166 determines whether the current location of the robot 100 relative to the station 500 satisfies a "position condition", a condition constraining the robot 100's movement while it functions as a landmark device.
Even when the robot 100 is far from the station 500, it can estimate the direction in which the station 500 lies by using the map generated from keyframes. If the robot 100 is at a position from which the station 500 is visible, the guide light lets it identify the station 500's location still more reliably. As the robot 100 approaches the station 500, it detects the medium-distance guidance signal and then the short-distance guidance signal. Thus, the closer the robot 100 gets to the station 500, the more guidance means it can draw on to estimate the station 500's exact position, which increases the reliability of the robot 100's return process.
The communication unit 142 includes a light emission instruction transmitting unit 140, a light emission instruction receiving unit 144, a docking request transmitting unit 260, and a docking permission receiving unit 268.
The light emission instruction transmitting unit 140 transmits a light emission signal to the station 500 or the landmark device 280. The light emission instruction receiving unit 144 receives light emission signals from other robots 100. The docking request transmitting unit 260 transmits the docking request signal, and the docking permission receiving unit 268 receives the docking permission signal.
FIG. 10 is a functional block diagram of the landmark device 280.
The landmark device 280 includes a light emitting unit 282 (a light source), a light emission control unit 284, and a light emission instruction receiving unit 286. The light emission instruction receiving unit 286 receives a light emission signal, and the light emission control unit 284 causes the light emitting unit 282 to emit light in the light emission mode specified by that signal.
FIG. 11 is a sequence diagram showing the process by which the robot 100 searches for the station 500.
At the start of the return process, the light recognition unit 154 of the robot 100 searches for the guide light, and identifies the position of the station 500 by detecting it. Depending on the environment in which the station 500 is installed, however, light that is not the guide light but resembles it (hereinafter, "similar light") may be detected, causing the robot 100 to misidentify the position of the station 500.
When a plurality of candidate guide lights are detected, in other words when similar light is detected, the light emission instruction transmitting unit 140 of the robot 100 transmits a light emission signal specifying a distinctive light emission mode (S10). The light emission instruction transmitting unit 140 may specify a single predetermined light emission mode in the light emission signal, or may select one of several types of light emission mode and then transmit the light emission signal.
The light emission instruction receiving unit 248 of the station 500 receives the light emission signal. The light emission control unit 224 of the station 500 sets the light emitting unit 256 to the light emission mode specified by the signal and causes it to emit light (S12). The light recognition unit 154 of the robot 100 searches the captured images for external light matching the specified light emission mode (S14) and identifies light emitted in that mode as the guide light. The operation control unit 150 of the robot 100 sets the emission point of the guide light as the movement target point, and the robot 100 moves toward the station 500 (S16).
With this control method, the robot 100 can reliably recognize the guide light by specifying its light emission mode via the light emission signal, and the station 500 can inform the robot 100 of its position by emitting guide light that matches the signal. By transmitting varied light emission signals, the robot 100 avoids mistaking similar light for the guide light.
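A minimal Python sketch of this S10 to S16 handshake; the pattern names and the station stub are hypothetical stand-ins, since the disclosure does not specify the protocol at this level of detail.

import random

BLINK_PATTERNS = ["blink_2hz", "blink_5hz", "double_pulse"]  # hypothetical modes

class StationStub:
    """Stands in for the station 500: adopts whatever emission mode is requested."""
    def set_emission_mode(self, pattern):
        self.pattern = pattern

def request_emission(station):
    # S10/S12: the robot picks an emission mode and the station adopts it.
    pattern = random.choice(BLINK_PATTERNS)
    station.set_emission_mode(pattern)
    return pattern

def find_guide_light(detected_lights, pattern):
    # S14: among (position, observed_pattern) pairs, the unique light showing
    # the requested pattern is the guide light; otherwise retry with a
    # different pattern (S10 again).
    hits = [pos for pos, p in detected_lights if p == pattern]
    return hits[0] if len(hits) == 1 else None

station = StationStub()
chosen = request_emission(station)
print(find_guide_light([((3.0, 1.0), station.pattern), ((5.0, 2.0), "steady")], chosen))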
The station 500 may also generate the guide light in a special emission mode that similar light is unlikely to exhibit. For example, the station 500 may blink the guide light constantly at 10 Hz. Since other light blinking at 10 Hz is unlikely to exist, the robot 100 might identify the position of the station 500 by detecting 10 Hz light without transmitting a light emission signal at all.
If, however, the station 500 blinked the guide light at 10 Hz at all times, the user might find the guide light annoying. By changing the emission mode of the guide light only when a light emission signal is received from the robot 100, the station 500 can guide the robot 100 to itself without straining the user's eyes.
When the robot 100 detects a plurality of external lights matching the specified emission mode, in other words when it still cannot identify the guide light after transmitting a light emission signal, the light emission instruction transmitting unit 140 simply transmits another light emission signal specifying a different emission mode.
As described above, the map management unit 168 of the server 200 (station 500) generates the map from keyframes (image feature information) based on captured images. The map management unit 168 sets the position of the station 500, the emission point of the guide light, as the reference point of the map. By comparing the emission point of the guide light recorded on the map with the emission point actually seen by the robot 100, the map management unit 168 can grasp the robot 100's current location more accurately.
The light emission instruction transmitting unit 140 of the robot 100 may also transmit a light emission signal to confirm its position when it has lost its current location on the map; the map management unit 168 then reconfirms the current location based on the guide light corresponding to that signal. "Losing the current location" here may mean that the similarity between the keyframe expected from the image memory at the robot 100's current location P1 and the keyframe obtained from the actual captured image is smaller than a predetermined threshold (the scenery actually seen differs from the scenery that should be seen). It may also mean that the map management unit 168 has lost the map data stored in its built-in memory through force majeure such as a power-off.
When the current location is lost, the map management unit 168 transmits a lost signal to the robot 100, and the light emission instruction transmitting unit 140 transmits a light emission signal upon receiving it. The light recognition unit 154 detects the guide light and notifies the server 200 of the direction in which it is visible. The map management unit 168 consults the map (image memory) and re-identifies the robot 100's current location from the direction of the guide light. For example, when the guide light is visible to the robot 100's left, the station 500 (the reference point) must be to the robot 100's left, so the map management unit 168 can recognize that the robot 100's current location is a position on the map (in the image memory) from which the reference point is seen to the left.
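A minimal Python sketch of this lost-and-relocalize behavior, assuming a scalar keyframe similarity, a hypothetical threshold, and a set of candidate poses drawn from the image memory; the bearing test keeps only poses from which the reference point would be seen in the observed direction.

import math

STATION_POS = (0.0, 0.0)  # the station 500 as the map's reference point
SIM_THRESHOLD = 0.4       # assumed keyframe-similarity threshold

def is_lost(keyframe_similarity):
    # "Losing the current location": the live keyframe no longer
    # resembles the keyframe expected from the image memory.
    return keyframe_similarity < SIM_THRESHOLD

def consistent_poses(candidates, observed_bearing_deg, tol_deg=5.0):
    # Keep candidate poses (x, y, heading_deg) from which the station
    # would be seen at the observed bearing relative to the heading.
    kept = []
    for x, y, heading in candidates:
        world = math.degrees(math.atan2(STATION_POS[1] - y, STATION_POS[0] - x))
        bearing = (world - heading + 180.0) % 360.0 - 180.0
        if abs(bearing - observed_bearing_deg) <= tol_deg:
            kept.append((x, y, heading))
    return kept

print(consistent_poses([(2.0, 0.0, 180.0), (2.0, 0.0, 90.0)], 0.0))
# -> [(2.0, 0.0, 180.0)]: only the pose facing the station is consistent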
FIG. 12 is an enlarged perspective view of the light emitting unit 256.
The light emitting unit 256 includes a central light source 380 (a second light source) and two side light sources 382 (a left light source 382L and a right light source 382R) (first light sources). The left light source 382L and the right light source 382R are formed as vertically elongated LED light sources on a first surface 386. The central light source 380 is a horizontally elongated LED light source formed on a second surface 384 that is recessed about 5 to 10 millimeters behind the first surface 386.
The height of the light emitting unit 256 is set to roughly the same height as the horn 112 (the omnidirectional camera 113) of the robot 100, so that the omnidirectional camera 113 can easily capture the guide light. Placing the light emitting unit 256 at roughly the height of the horn 112 also makes it less likely that the light recognition unit 154 will mistake for the guide light any similar light at a height far removed from that of the horn 112.
FIG. 13 is a schematic view of the light emitting unit 256 seen from the front.
FIG. 13 shows the light emitting unit 256 as seen from a robot 100 located in front of the station 500. Let WL be the apparent distance from the left end of the central light source 380 to the right end of the left light source 382L, and WR the apparent distance from the right end of the central light source 380 to the left end of the right light source 382R. In the front view, WR and WL are equal: when the robot 100 faces the station 500 squarely, the central light source 380 and the two side light sources 382 appear bilaterally symmetrical from the robot 100.
FIG. 14 is a schematic view of the light emitting unit 256 seen from the side.
FIG. 14 shows the light emitting unit 256 as seen from a robot 100 located to the right front of the station 500, in other words as seen from the robot 100 when the station 500 is located to the robot 100's right front. The first surface 386 on which the left light source 382L and right light source 382R are installed lies further forward than the second surface 384 on which the central light source 380 is installed. Consequently, when the robot 100 is positioned to the side of the station 500, the apparent positions of the central light source 380 and the side light sources 382 shift considerably.
When the robot 100 views the central light source 380 from its right side, the central light source 380 appears shorter, and it appears closer to the left light source 382L than in the front view (FIG. 13). The distance WL therefore becomes shorter than in the front view, so that from the robot 100, WL < WR. Seen from a robot 100 located to the right front of the station 500, the two side light sources 382 are no longer symmetrical about the central light source 380.
If the central light source 380 and the side light sources 382 were formed on the same surface, the distances WR and WL would both shrink by the same amount when the light emitting unit 256 is viewed obliquely. Although the central light source 380 would appear shorter, the central light source 380 and the two side light sources 382 would still appear bilaterally symmetrical, making it difficult for the robot 100 to grasp the positional relationship between itself and the station 500 from the appearance of the guide light.
By offsetting the first surface 386 bearing the side light sources 382 and the second surface 384 bearing the central light source 380 front to back as shown in FIG. 12, the robot 100 can more easily discern its direction relative to the station 500. Specifically, when the apparent widths seen from the robot 100 satisfy WL < WR (when the central light source 380 appears closer to the left light source 382L), the robot 100 can recognize that it is located to the right front as seen from the station 500. From the ratio of WR to WL, the distance measuring unit 162 can also derive the relative angle from the station 500. When the robot 100 is located to the left front as seen from the station 500, WL > WR.
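A toy geometric estimate of that relative angle, assuming the apparent gaps WL and WR have been converted to millimeters and that the only relevant geometry is the recess between the two surfaces; the disclosure does not give the actual formula used by the distance measuring unit 162.

import math

FACE_OFFSET_MM = 7.5  # assumed recess between the first and second surfaces (5-10 mm)

def relative_angle_deg(wl_mm, wr_mm, face_offset_mm=FACE_OFFSET_MM):
    # Bearing of the robot off the station's front normal.
    # wl_mm, wr_mm: the apparent gaps WL and WR converted to millimeters
    # (for instance scaled from pixels using the known side-light width).
    # A positive result means the robot is to the station's right front.
    shift = (wr_mm - wl_mm) / 2.0  # apparent sideways shift of the central source
    return math.degrees(math.atan2(shift, face_offset_mm))

print(relative_angle_deg(wl_mm=4.0, wr_mm=6.0))  # about 7.6 degrees, right front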
FIG. 15 is a schematic diagram for explaining how the robot 100's position is adjusted using the short-distance guidance signal (short-distance infrared rays and ultrasonic waves).
In the return process, the robot 100 searches for the guide light and heads for a charging space 502 while aligning itself based on the positional relationship of the central light source 380 and the side light sources 382. As the robot 100 nears the station 500, it detects the medium-distance infrared rays transmitted from the medium-distance guiding unit 254 and approaches further, taking their emission point as the movement target point; at this stage the robot 100 approaches the station 500 moving forward at low speed. A plurality of infrared sensors (not shown) are arranged in a ring around the horn 112 of the robot 100, facing outward. Each of these infrared sensors detects the medium-distance infrared rays, and the distance measuring unit 162 calculates the distance and angle between the robot 100 and the station 500 from the reception intensity at each sensor.
When the robot 100 comes still closer to the station 500, it turns so as to face away from the station 500 and then backs toward it. The infrared sensors on the horn 112 can detect the medium-distance infrared rays while far from the station 500, but temporarily lose them when the robot gets too close, because the horn 112 leaves the irradiation range of the medium-distance infrared rays. The operation control unit 150 turns the robot 100 when the horn 112's infrared sensors can no longer detect the medium-distance infrared rays. Alternatively, the distance measuring unit 162 may measure the distance between the robot 100 and the station 500 from the images captured by the omnidirectional camera 113; specifically, the operation control unit 150 may turn the robot 100 when the image region corresponding to the station 500 in the captured (omnidirectional) image reaches or exceeds a predetermined size.
When the robot 100 faces backward, the infrared sensor 172 on the back of the robot 100 (see FIG. 6C) detects the short-range infrared rays, and the left microphone 174L and the right microphone 174R each detect the ultrasonic waves. By detecting the short-range infrared rays, the distance measuring unit 162 recognizes that the robot 100 and the station 500 are especially close to each other.
The short-range guiding unit 252 of the station 500 generates the short-range infrared rays (a fast signal) and the ultrasonic waves (a slow signal) simultaneously. Since infrared rays propagate far faster than ultrasonic waves, the microphone 174 detects the ultrasonic waves later than the infrared sensor 172 detects the short-range infrared rays. The distance measuring unit 162 calculates the distance and angle from the robot 100 to the short-range guiding unit 252 based on the time difference between the detection of the short-range infrared rays and the detection of the ultrasonic waves. Since the short-range guiding unit 252 generates the infrared rays and ultrasonic waves simultaneously at a constant period, the robot 100 can continuously measure the distance and angle to the short-range guiding unit 252. FIG. 15 shows the robot 100 approaching the right space 502R formed around the right guiding unit 252R.
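Since the infrared pulse arrives essentially instantly at room scale, the detection gap is almost entirely ultrasound travel time, giving the classic flash-to-thunder range estimate; a minimal sketch:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degC

def range_from_time_gap(dt_seconds: float) -> float:
    """Distance from the robot 100 to the short-range guiding unit 252,
    from the gap between the infrared detection time (infrared sensor 172)
    and the ultrasound detection time (microphone 174): d ~= c_sound * dt.
    """
    return SPEED_OF_SOUND * dt_seconds

# Ultrasound arriving 2.9 ms after the infrared pulse -> about 1.0 m
print(range_from_time_gap(0.0029))
```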
More specifically, depending on the distance and relative angle between the right guiding unit 252R and the robot 100, a time difference arises between the moment the right microphone 174R detects the ultrasonic wave and the moment the left microphone 174L detects it. Based on this time difference, the distance measuring unit 162 calculates the distance from the left microphone 174L to the right guiding unit 252R, and likewise the distance from the right microphone 174R to the right guiding unit 252R. From these two distances (coordinates), the distance measuring unit 162 determines the relative angle between the traveling direction (reversing direction) of the robot 100 and the right guiding unit 252R. Based on this relative angle, the operation control unit 150 reverses the robot 100 while finely adjusting its traveling direction.
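One way to realize this, assuming the two microphone-to-emitter distances have already been obtained by the time-gap method above, is the standard two-circle intersection; the microphone baseline below is a hypothetical value.

```python
import math

def emitter_offset(d_left: float, d_right: float, baseline: float):
    """Position of the right guiding unit 252R relative to the robot 100,
    with the microphones 174L/174R at (-baseline/2, 0) and (+baseline/2, 0)
    and the robot's reversing direction along +y.

    Two-circle intersection:
        x = (dL^2 - dR^2) / (2 * baseline)
        y = sqrt(dL^2 - (x + baseline/2)^2)
    The relative angle used to trim the reversing direction is atan2(x, y).
    """
    x = (d_left ** 2 - d_right ** 2) / (2 * baseline)
    y = math.sqrt(max(d_left ** 2 - (x + baseline / 2) ** 2, 0.0))
    return x, y, math.degrees(math.atan2(x, y))

# Hypothetical: dL = 0.52 m, dR = 0.50 m, microphones 0.08 m apart
print(emitter_offset(0.52, 0.50, baseline=0.08))
```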
After transmitting the warehousing permission signal, the station 500 passes a weak current through the power supply terminal 530 of the charging space 502 to be entered. When the robot 100 enters the charging space 502, the charging terminal 510 connects with the power supply terminal 530, and the distance measuring unit 162 determines that entry is complete when the weak current is detected. The robot 100 transmits a confirmation signal when the weak current is detected, and the charging control unit 264 of the station 500 starts supplying power when it receives the confirmation signal.
When the rear wheel 103 of the robot 100 rides onto the station 500 and the charging terminal 510 of the robot 100 approaches the power supply terminal 530 of the station 500, the microphone 174 gradually becomes unable to detect the ultrasonic waves and infrared rays output from the short-range guiding unit 252. This is because the heights of the short-range guiding unit 252 and the microphone 174 differ considerably, so when the station 500 and the robot 100 come too close, the microphone 174 leaves the transmission range of the ultrasonic waves from the short-range guiding unit 252. The distance measuring unit 162 measures the time elapsed since the short-range infrared rays could no longer be detected. When the weak current cannot be detected within a predetermined reference time, the operation control unit 150 judges that entry has failed, moves the robot 100 forward away from the station 500, and re-executes the return process.
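The completion and failure checks described in the last two paragraphs can be summarized as a timeout loop; the 10-second reference time and the detect_weak_current callback are assumptions for illustration.

```python
import time

def wait_for_docking(detect_weak_current, reference_time: float = 10.0) -> bool:
    """Entry-completion check sketched from the text: once the short-range
    infrared has been lost, the weak current on the charging terminal 510
    must be detected within a reference time; otherwise entry is judged to
    have failed and the return process is retried.
    """
    deadline = time.monotonic() + reference_time
    while time.monotonic() < deadline:
        if detect_weak_current():
            return True   # entry complete: send confirmation, charging starts
        time.sleep(0.05)
    return False          # entry failed: move forward and redo the return

# Usage sketch (hypothetical sensor object):
#   docked = wait_for_docking(lambda: charger.weak_current_sensed())
```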
The operation control unit 150 may also judge that entry has failed and re-execute the return process when the front wheels 102 are spinning without traction. The robot 100 may be equipped with an inertial measurement unit (IMU). When the inertial measurement unit still detects movement of the robot 100 after the reference time has passed, it is highly likely that the robot 100 has not entered correctly; in this case too, the operation control unit 150 may re-execute the return process.
FIG. 16 is a time chart for explaining a control method used when a plurality of robots 100 wish to return to the station 500.

The station 500 in this embodiment can charge two robots 100 (hereinafter referred to as "robot 100A" and "robot 100B") simultaneously in its two charging spaces 502. However, the two robots 100A and 100B cannot return to the station 500 at the same time. To prevent congestion around the station 500, in this embodiment the station 500 admits the robots 100 one at a time. When the robot 100A enters the left space 502L, the guiding unit 266 generates the short-range guidance signal (short-range infrared rays and ultrasonic waves) from the left guiding unit 252L but not from the right guiding unit 252R. The warehousing determination unit 262 decides which of the two charging spaces 502 a robot is to enter.
Assume that the robot 100A transmits a warehousing request signal to the station 500 at time t0 in FIG. 16, and that the robot 100B is not in the station 500 at this point. The warehousing determination unit 262 of the station 500 permits the robot 100A to enter the left space 502L. The warehousing permission transmission unit 258 transmits a warehousing availability signal indicating permission (hereinafter called the "warehousing permission signal") to the robot 100A at time t1. The robot 100A approaches the station 500 based on the guide light and the mid-range infrared rays, and the guiding unit 266 of the station 500 generates the short-range guidance signal from the left guiding unit 252L. At time t2, the robot 100A enters the left space 502L in accordance with the short-range guidance signal. The power supply terminal 530 and the charging terminal 510 connect, and the charging control unit 264 starts charging the robot 100A at time t3.
The warehousing determination unit 262 sets the period from time t0, when the warehousing request signal was accepted from the robot 100A, to time t3, when the robot 100A starts charging, as a "rejection period." When a warehousing request signal is received from the other robot 100B during the rejection period, the warehousing determination unit 262 refuses entry of the robot 100B. The warehousing permission transmission unit 258 transmits a warehousing availability signal indicating refusal (hereinafter called the "warehousing refusal signal") to the robot 100B. When refused, the robot 100B transmits a warehousing request signal again after a predetermined time has elapsed. The warehousing permission transmission unit 258 may permit entry of the robot 100B once the rejection period has ended. With this control method, a plurality of robots 100 can be guided to the station 500 in turn.
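The rejection-period logic amounts to a small arbiter that is "busy" from request acceptance (t0) to charging start (t3); a minimal sketch of the warehousing determination unit 262's decision, with hypothetical method names:

```python
class WarehousingArbiter:
    """Admits one robot at a time: any request arriving while another
    robot's entry is in progress (t0 through t3) is refused."""

    def __init__(self):
        self.busy_with = None                # ID of the robot being guided

    def request(self, robot_id: str) -> bool:
        if self.busy_with in (None, robot_id):
            self.busy_with = robot_id
            return True                      # warehousing permission signal
        return False                         # warehousing refusal signal

    def charging_started(self, robot_id: str):
        if self.busy_with == robot_id:
            self.busy_with = None            # rejection period ends at t3

arbiter = WarehousingArbiter()
print(arbiter.request("100A"))    # True  (permitted at t1)
print(arbiter.request("100B"))    # False (refused during rejection period)
arbiter.charging_started("100A")  # t3: rejection period ends
print(arbiter.request("100B"))    # True
```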
The rejection period may instead be the period from time t1 to time t3, from time t1 to time t2, or from time t0 to time t2. When entry is refused, the operation control unit 150 of the robot 100B (or the operation control unit 222 of the server 200) may set the area within a predetermined distance of the station 500, for example within 2 meters, as a prohibited zone for the robot 100B. With this control method, the robot 100B can be kept from obstructing the return of the robot 100A.
The robot 100 may be set so as not to enter a predetermined range formed around the station 500, for example within 1 meter, except when transmitting a warehousing request signal or when executing the return process after receiving a warehousing permission signal. When permitted to enter, the robot 100A may transmit a signal indicating "return process in progress" to the robot 100B. When the robot 100B receives this signal from the robot 100A, it may refrain from entering the predetermined range that includes the station 500.
Not only during the rejection period: while the robot 100A is heading toward the station 500, the robot 100B may exclude the path of the robot 100A (the straight line connecting the robot 100A and the station 500) from its movable range. In this way, the robot 100B may be controlled to stay clear of the path of the robot 100A so as not to obstruct its return. The robot 100A may notify the robot 100B that the return process is in progress via an infrared communicator or the like included in the horn 112. Alternatively, the robot 100A may include a light emitting unit such as an LED on the horn 112 and turn this LED on during the return process. By checking the lighting state of the LED of the robot 100A, the robot 100B can determine whether the robot 100A is in the return process. When the robot 100B recognizes that the robot 100A is returning, it may move away from the station 500 or away from the path of the robot 100A. In other words, the movement of the robot 100A returning to charge is given priority, and the other robots move so that the robot 100A can return to the station 500 by the shortest route.
FIG. 17 is a schematic diagram for explaining a method of guiding the robot 100 to the station 500 by means of the landmark device 280.

In FIG. 17, there is an obstruction 516 between the station 500 and the robot 100, so the robot 100 cannot see the guide light of the station 500. The robot 100 can recognize the location of the station 500 using the map generated from key frames. In this embodiment, the landmark device 280 additionally leads the robot 100 to the station 500, making it easier for the robot 100 to return to the station 500 reliably.
In FIG. 17, first, the light recognition unit 154 of the robot 100 searches the image captured by the omnidirectional camera 113 for the guide light. Because the station 500 is hidden by the obstruction 516, the light recognition unit 154 cannot detect the guide light. Next, the light emission instruction transmission unit 140 of the robot 100 transmits the emission signal L0. Even if the station 500 were able to receive the emission signal L0, the robot 100 still could not see (detect) the guide light because of the obstruction 516.
When the light emission instruction transmission unit 140 of the robot 100 cannot detect the guide light corresponding to the emission signal L0, it transmits a search signal. The search signal may be an optical signal that blinks in a predetermined pattern, or a radio signal. Subsequently, the light emission instruction transmission unit 140 of the robot 100 transmits an emission signal L1 designating a new emission mode (a first emission signal designating a first emission mode). When the light emission instruction receiving unit 286 of the landmark device 280 receives the search signal, the light emission control unit 284 transitions to a state ready to receive an emission signal. When the light emission instruction receiving unit 286 receives the emission signal L1, the light emission control unit 284 causes the light emitting unit 282 to emit light in accordance with the emission signal L1.
The light recognition unit 154 of the robot 100 recognizes the light corresponding to the emission signal L1 from the landmark device 280 (hereinafter called the "landmark light"). Since the landmark device 280 is not hidden from the robot 100, the robot 100 can recognize the landmark light. The operation control unit 150 of the robot 100 sets the emission point of the landmark light as the robot's interim movement target point, and the robot 100 moves to the location of the landmark device 280 (S20).
When the robot 100 comes near the landmark device 280, for example within 0.5 meters of it, the robot 100 transmits an emission signal L2 designating a new emission mode (a second emission signal designating a second emission mode). The station 500 generates guide light corresponding to the emission signal L2. When the robot 100 is beside the landmark device 280, it can see (detect) the guide light of the station 500. The robot 100 recognizes the guide light of the station 500 and then returns to the station 500 (S22). With this control method, even when the guide light cannot be recognized from the robot 100's position, the robot can return to the station 500 by first approaching the landmark device 280.
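Gathering steps S20 and S22 above, the fallback flow could be written as follows; `robot` here is a hypothetical facade over the units named in the text (light recognition unit 154, light emission instruction transmission unit 140, operation control unit 150), and all method names are illustrative.

```python
def return_via_landmark(robot) -> None:
    """FIG. 17 flow as a sketch: try the station's guide light first, and
    fall back to the landmark device 280 when an obstruction hides it."""
    robot.send_emission_signal("L0")
    if robot.find_guide_light():                 # station visible: normal return
        robot.head_to_station()
        return
    robot.send_search_signal()                   # arm the landmark device 280
    robot.send_emission_signal("L1")             # first emission mode
    robot.move_to(robot.find_landmark_light())   # S20: approach the landmark
    robot.send_emission_signal("L2")             # second emission mode
    robot.move_to(robot.find_guide_light())      # S22: station now in view
```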
FIG. 18 is a schematic diagram for explaining a method of guiding the robot 100A to the station 500 by means of the robot 100B.

The robot 100B can also be made to function as a landmark device 280. In FIG. 18, there is an obstruction 516 between the station 500 and the robot 100A, so the guide light cannot be seen from the robot 100A, while the robot 100B can see it. The robot 100A transmits the emission signal L0 but cannot detect the guide light. The robot 100A therefore transmits an emission signal L1 designating a new emission mode (a first emission signal designating a first emission mode).
The light emission instruction receiving unit 144 of the robot 100B receives the emission signal L1. The light emission control unit 158 of the robot 100B causes the light emitting unit 138 to emit light in accordance with the emission signal L1, and this light functions as landmark light. The robot 100A detects the landmark light of the robot 100B and approaches the robot 100B (S30). Near the robot 100B, the robot 100A transmits an emission signal L2 designating a new emission mode (a second emission signal designating a second emission mode). The station 500 generates guide light corresponding to the emission signal L2. The robot 100A detects the guide light and returns to the station 500 (S32).
The defined peripheral area 520 shown in FIG. 18 indicates the range within which the guide light is visible. When the position determination unit 166 of the robot 100A determines that the robot 100A has left the defined peripheral area 520, it judges that a position condition is no longer satisfied and transmits a departure signal to the robot 100B. When the robot 100B receives the departure signal, the operation control unit 150 of the robot 100B (or the operation control unit 222 of the server 200) limits the movement range of the robot 100B to the defined peripheral area 520. When the robot 100A has left the defined peripheral area 520, keeping the robot 100B within that area allows the robot 100B to serve as a landmark device.
The station 500 may keep the guide light lit at all times. The position determination unit 166 of the robot 100 may constantly detect the guide light of the station 500 with the omnidirectional camera 113, and judge that the position condition is no longer satisfied (the robot has left the defined peripheral area 520) when the guide light can no longer be seen. The defined peripheral area 520 may be determined conceptually based on whether the guide light is visible, or explicitly set in advance on the map as a predetermined area around the station 500.
&lt;Summary&gt;

The robot system 300 has been described above based on an embodiment.

According to this embodiment, the robot 100 recognizes the location of the station 500 using the guide light from the station 500 as a cue. By transmitting an emission signal, the robot 100 can additionally distinguish the guide light from similar light. Even when the remaining battery charge is low, the robot 100 can return to the station 500 without being misled by similar light into wasteful movement (wasting power). If the guide light were made to emit in a special emission mode (one unlikely to be mistaken for similar light) even in normal times, the user might find the guide light bothersome. Generating the guide light in a special emission mode only when an emission signal is received resolves this problem. Constantly emitting radio waves from the charging station 500 is also considered undesirable in view of its effect on other electronic devices.
If similar light and the guide light still cannot be distinguished after an emission signal has been transmitted, an emission signal designating another emission mode may be retransmitted. Since the station 500 can change its emission mode in response to emission signals, the robot 100 can reliably identify the station 500.
The light emitting unit 256 of the station 500 has a plurality of light sources (the central light source 380 and the side light sources 382). The distinctive emission shape formed by the central light source 380 and the two side light sources 382 makes it easier for the robot 100 to identify the guide light. Furthermore, by placing the side light sources 382 in front of the central light source 380, the robot 100 can recognize the relative angle between the robot 100 and the station 500 from how the guide light appears. Based on this appearance, the robot 100 adjusts its movement direction so as to return to the station 500 from a position close to its front.
In the station 500, the charging device 506 and the server 200 are formed as one unit. Housing the charging device 506 and the server 200 in a single enclosure makes the robot system 300 as a whole compact. Since the station 500 must absorb the impact of the robot 100 entering, it is desirable for it to have a certain weight. Building the server 200 into the station 500 increases the weight of the station 500 in a useful way, and the load of the server 200 also contributes to the stability of the station 500.
The station 500 in this embodiment can charge the two robots 100A and 100B simultaneously. Since a separate station 500 need not be prepared for each robot 100, the station 500 can be kept compact. Moreover, because the plurality of robots 100 are admitted in turn based on warehousing request signals, congestion of multiple robots 100 near the station 500 is easy to prevent. If the robot 100B also tried to return while the return process of the robot 100A was in progress, the guidance signal for the robot 100A and the guidance signal for the robot 100B could become intermixed. The guiding unit 266 of the charging device 506 prevents such intermixing by controlling the left guiding unit 252L and the right guiding unit 252R so that they do not generate guidance signals at the same time.
The robots 100A and 100B enter in turn. While the return process of the robot 100A is in progress, the robot 100B waits; once the robot 100A has entered, the robot 100B starts its own return process. The robots 100A and 100B enter the station 500 politely in order, and after entering they are charged side by side. The sight of the robots 100A and 100B being charged next to each other (see FIG. 1) can convey to the user a charm unique to having two robots.
The robot 100 sets the emission point of the guide light of the station 500 as a reference point of its map. The robot 100 builds the map based on key frames, and by using the emission point of the guide light as a reference point it can recognize more reliably which point on the map (its stored imagery) its current location corresponds to. Having the station 500 announce its own location by guide light not only makes it easier for the robot 100 to return to the station 500 but is also useful for position recognition by the robot 100.
By installing a landmark device 280, the robot 100 can return to the station 500 by relying on the landmark light even when it has moved far from the station 500. Appropriately placing one or more landmark devices 280 in a room makes it easier to expand the robot 100's range of action. Put differently, even in a place far from the station 500, the robot 100 can venture out with confidence as long as it can see a landmark device 280.
Also, when the robots 100A and 100B are both present, the robot 100B may function as a landmark device for the robot 100A. In this case, even if the robot 100A moves far away, the robot 100B remaining near the station 500 makes it easier for the robot 100A to return to the station 500. The robot 100A may limit its range of action so that it never moves so far from the robot 100B that it can no longer see it.
The present invention is not limited to the above embodiment and modifications; components may be modified and embodied without departing from the gist of the invention. Various inventions may be formed by appropriately combining the plurality of components disclosed in the above embodiment and modifications, and some components may be removed from the full set of components shown therein.
Although the robot system 300 has been described as composed of one robot 100 and one station 500 (the charging device 506 and the server 200), some of the functions of the robot 100 may be realized by the server 200 of the station 500, and some or all of the functions of the server 200 may be assigned to the robot 100. One server 200 may control a plurality of robots 100, or a plurality of servers 200 may cooperate to control one or more robots 100.
A third device other than the robot 100 and the server 200 may take on part of the functions. The collection of the functions of the robot 100 and the functions of the server 200 described with reference to FIGS. 8 and 9 can also be understood broadly as a single "robot." How the plurality of functions needed to realize the present invention are allocated across one or more pieces of hardware should be determined in view of the processing capability of each piece of hardware, the specifications required of the robot system 300, and so on.
As stated above, the "robot in the narrow sense" is the robot 100 excluding the server 200, while the "robot in the broad sense" is the robot system 300. It is conceivable that many of the functions of the server 200 will be integrated into the robot 100 in the future.
[Modification]

In this embodiment, the light recognition unit 154 of the robot 100 first searches for the guide light during the return process, and the robot 100 was described as transmitting an emission signal when it cannot recognize the guide light. As a modification, the robot 100 may always transmit an emission signal during the return process, identify the external light responding to the emission signal as the guide light, and start moving toward the station 500.
The battery level monitoring unit 176 monitors the remaining charge of the battery 118. The operation control unit 150 of the robot 100 may start the return process when the remaining charge (charging rate) of the battery 118 falls to or below a predetermined threshold. The warehousing request transmission unit 260 of the robot 100 may transmit a warehousing request signal when the remaining charge of the battery 118 falls to or below a predetermined threshold. The light emission instruction transmission unit 140 of the robot 100 may transmit an emission signal when the remaining charge becomes low. The robot 100 may use the emission signal as the warehousing request signal.
When the robot 100 has entered the station 500, it may be connected by wire not only to the charging device 506 but also to the server 200. Specifically, the power supply terminal 530 may include a data line in addition to the power line, and likewise for the charging terminal 510 of the robot 100. When the charging terminal 510 of the robot 100 connects with the power supply terminal 530, the server 200 and the robot 100 may exchange data via the data line.
As described above, the robot 100 wirelessly transmits various data such as feature vectors and event information to the server 200, and various data are also wirelessly transmitted from the server 200 to the robot 100. Since wired communication offers a higher data transfer rate than wireless communication, it is better suited to exchanging large volumes of data. The server 200 may acquire the image and audio data obtained by the robot 100 while the robot 100 is charging. The robot 100 may accumulate data such as captured images in its data storage unit 148 and upload the accumulated data to the server 200 during charging.
The server 200 may download an updated version of the behavior control program of the robot 100 from an external server, and may transmit the behavior control program to the robot 100 while the robot 100 is charging. The robot 100 can install the updated behavior control program by updating it during charging and automatically restarting before charging is complete. With this control method, the distinctly computer-like task of updating the behavior control program can be carried out unobtrusively during charging, in other words, during a period when the user is paying little attention to the robot 100.
In this embodiment, the robot 100 was described as executing the return process periodically, determining the return timing according to a built-in scheduler.

As a modification, the station 500 may include a return management unit (not shown). The return management unit may manage the return timing of the robot 100 as a scheduler. The station 500 may keep the light emitting unit 256 off in normal times, and the return management unit may automatically turn the light emitting unit 256 on when the return timing approaches. The robot 100 may start the return process when it detects the guide light. With this control method, the station 500 can manage the return timing of the robot 100. By staggering the return timings of the plurality of robots 100, the station 500 can avoid their returning simultaneously. The station 500 may, for example, generate yellow guide light when returning the robot 100A and green guide light when returning the robot 100B.
The robot 100 may transmit an identification signal when it has identified the guide light. The communication unit 204 of the station 500 may set the guide light to a constantly lit state when it receives the identification signal. If the emission mode of the guide light is varied, for example by blinking it, the user may be bothered by the guide light. Returning to the constantly lit state once the identification signal is received shortens the period during which the user notices the guide light. The station 500 may turn off the guide light when the robot 100 has entered.
When the station 500 receives the identification signal from the robot 100A, it may change the color of the guide light. For example, the station 500 may normally emit the guide light in blue and emit it in red while the return process of the robot 100A is in progress. By checking the color of the guide light, the robot 100B can tell whether the return process of the robot 100A is in progress. When starting its own return process, the robot 100B checks the color of the guide light and waits if the guide light is red. The same information may be conveyed not only by the color of the guide light but also by other emission modes.
The station 500 may keep the light emitting unit 256 off in normal times and light it only when an emission signal is received. This control method reduces the power consumption of the light emitting unit 256. If the guide light is normally off, the user is less likely to find it visually bothersome. For example, when the station 500 is installed in a bedroom, it is considered desirable to keep the guide light off in normal times.
The return management unit may periodically transmit a return signal prompting the robot 100 to return, and the robot 100 starts the return process when it receives the return signal. The return signal may include an ID designating the target robot 100, and may be a radio signal or a visible light signal.
The return management unit may also transmit a return signal to the robot 100B when the robot 100A has returned. This control method can stage a "closeness" between the robots 100, as if, when the robot 100A returns to the station 500 (its nest), the robot 100B grows bored and comes back to the station 500 (the nest) as well.
The return management unit may transmit the return signal when the user is not near the station 500 or when the light level in the room is low (when no user is present, or when it is presumed to be late at night). With this control method, time periods in which the robot 100 does not need to engage with the user can be actively used as charging opportunities. While the robot 100 is expected to engage actively with the user, it also needs to be charged by the station 500 as appropriate. By charging when no user is present, in other words during periods when engagement with the user is unnecessary, the "time the robot 100 spends resting" as seen by the user can be reduced. For the user to perceive the robot 100 as active, it is desirable for the robot 100 to charge when the user is not watching.
The user may explicitly set the charging time of the robot 100 on a user terminal such as a smartphone. For example, suppose the user sets 10:00 to 10:10 as the charging time. The user terminal transmits this schedule data to the station 500, and the return management unit of the station 500 registers it. The return management unit may transmit a return signal at 10:00 and a departure signal to the robot 100 at 10:10. When the robot 100 receives the departure signal, it leaves the station 500 and resumes autonomous behavior even if charging is not complete. With this control method, the user can control the charging time of the robot 100 from the user terminal. For example, when visitors are expected from 10:00, the user can set that time as the charging time so that the robot 100 does not get in the way of receiving them.
Not limited to schedules, the user may transmit a return instruction from the user terminal to the station 500. When the station 500 receives a return instruction from the user terminal, it transmits a return signal to the robot 100, and the robot 100 starts the return process upon receiving it. For example, even when a child is playing with the robot 100 late at night, if a parent user discreetly sends a return instruction, the robot 100 starts the return process, giving the child the impression that the robot 100 has become sleepy or tired and wants to go back to the station 500 (its nest).
From the guide light (optical signal), the mid-range guidance signal (mid-range infrared rays), and the short-range guidance signal (short-range infrared rays and ultrasonic waves), the robot 100 can know its positional relationship with the station 500. When the robot 100 detects the mid-range infrared rays, it approaches the station 500 based on them; when it detects the short-range infrared rays and ultrasonic waves, it approaches the station 500 further in accordance with the short-range guidance signal.
The robot 100 may measure its distance to the station 500 with a ranging sensor, and when it comes within a predetermined distance of the station 500, ignore the mid-range infrared rays and continue the return process following the short-range infrared rays. Alternatively, when the robot 100 detects both the mid-range guidance signal and the short-range guidance signal, it may execute the return process following whichever has the higher signal strength.
In this embodiment, the station 500 was described as informing the robot 100 of its location by guide light (visible light). As a modification, the station 500 may inform a distant robot 100 of its location by transmitting radio waves (invisible to the eye).
The robot 100 may transmit an emission signal LA designating an emission mode A. When the robot 100 cannot recognize guide light in emission mode A, it may then transmit an emission signal LB designating an emission mode B. Emission mode A and emission mode B here may be identical; however, the emission signal LB is a signal directed at the landmark device 280, containing information designating the landmark device 280.
When the landmark device 280 receives the emission signal LB, it generates landmark light in accordance with emission mode B. The robot 100 may detect the landmark light and approach the landmark device 280. With this control method, the robot 100 gives top priority to searching for the station 500 and searches for the landmark device 280 only when the station 500 cannot be found. Because it is the landmark device 280 that responds to the emission signal LB, the robot 100 can recognize that the object it is approaching is the landmark device 280 rather than the station 500.
In a situation where the robot 100A is within a predetermined range of the station 500, the warehousing determination unit 262 may refuse a warehousing request when the robot 100B transmits one to the station 500, because the robot 100A might obstruct the entry of the robot 100B. In this case, the charging device 506 may transmit a departure signal to the robot 100A, and the robot 100A may move away from the station 500 when it receives the departure signal.
In this embodiment, two charging spaces 502 were described as being prepared for two robots 100; after entering the charging spaces 502 in turn, the two robots 100 can be charged simultaneously. As a modification, this take-turns entry scheme is also effective when there are more robots 100 than charging spaces 502. For example, when two robots 100 wish to enter a single charging space 502 at around the same time, the warehousing determination unit 262 permits one robot 100 to enter and refuses the other. The two robots 100 may each notify the station 500 of their remaining battery charge, and the warehousing determination unit 262 may give priority to the robot 100 with less charge remaining.
In this embodiment, the two robots 100 were described as unable to enter the station 500 at the same time, but as a modification the two robots 100 may be allowed to enter the station 500 simultaneously. The station 500 may transmit to the robot 100A a warehousing permission signal designating the left space 502L and to the robot 100B a warehousing permission signal designating the right space 502R. Furthermore, even while the return process of the robot 100A is in progress, the robot 100B may move toward the station 500 when it wishes to enter.
Which of the two charging spaces 502 each of the two robots 100 enters may be set in advance. For example, the destination may be set based on the exterior color of the robot 100 and the color of the panel 508 provided in each charging space 502. Suppose the outer skin 314 of the robot 100A is dark brown and the outer skin 314 of the robot 100B is light brown, while the left panel 508L is dark brown and the right panel 508R is light brown. The robot 100A may then return to the left space 502L, whose coloring matches its own, and the robot 100B returns to the right space 502R. With this control method, the coloring of each robot 100 and its rear panel 508 can be matched during charging, which also improves the appearance of the robot system 300 and the two robots 100 while they charge. The robot 100 may recognize (store) the color of its own outer skin 314, and may request the station 500, via the warehousing request transmission unit 260, to guide it toward the panel whose coloring matches its own. For example, the warehousing request signal may be transmitted with a color ID included, and the warehousing determination unit 262 may select the charging space 502 corresponding to the color ID as the destination.
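The color-matched assignment could look like the following sketch; the color-ID vocabulary and panel mapping are illustrative stand-ins for whatever encoding the warehousing request signal actually carries.

```python
PANEL_COLORS = {"502L": "dark_brown", "502R": "light_brown"}  # example from text

def choose_space(color_id: str) -> str | None:
    """Warehousing determination unit 262 choosing the charging space 502
    whose panel 508 matches the color ID in the warehousing request signal;
    None means no matching panel (caller falls back to any free space)."""
    for space, panel_color in PANEL_COLORS.items():
        if panel_color == color_id:
            return space
    return None

print(choose_space("dark_brown"))   # -> "502L" (robot 100A's destination)
```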
The operation control unit 222 of the server 200 may record, for each of the plurality of robots 100, the number of returns to the left space 502L and to the right space 502R. If, over a unit period, the robot 100A enters the left space 502L more often than the right space 502R, the warehousing determination unit 262 of the station 500 may preferentially admit the robot 100A to the left space 502L. As a result, the robot 100B naturally tends to be guided to the right space 502R. With this control method, the robots 100A and 100B each gradually come to have a favorite nest (charging space 502), so the two robots 100 can express an attachment to their nests through their behavior.
An eye generation unit (not shown) of the robot 100 may express "sleep" during charging by closing the eye image displayed on the eyes 110. The eye generation unit expresses sleep by changing the eye image to a closed-eye image when a first sleep-onset time has elapsed after the robot 100 entered. When the robots 100A and 100B are being charged at the same time, the charging control unit 264 of the server 200 may instruct the two robots 100 to use a second sleep-onset time longer than the first. During simultaneous charging, the robots 100A and 100B may move their gazes so as to look at each other, or move their arms 106 so as to touch each other. With this control method, the robots 100A and 100B can be shown to remain aware of each other even when they have returned to the same station 500 (nest). Such expressions make the station 500 come across not merely as a device for charging two robots but as the nest of two sibling robots.
As described above, the robot 100 carries the thermosensor 115. The station 500 may include, near the thermosensor 115, a thermal reference that produces a constant temperature. For example, if the thermal reference is 25 degrees, the detection sensitivity of the thermosensor 115 can be corrected by having the thermosensor 115 read the 25-degree thermal reference.
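The simplest reading of this correction is a per-sensor offset computed from the known reference temperature; the disclosure does not specify the correction model, so the sketch below assumes a plain offset.

```python
REFERENCE_TEMP_C = 25.0   # temperature produced by the thermal reference

def calibration_offset(reference_reading_c: float) -> float:
    """Offset to add to raw thermosensor 115 readings, derived by reading
    the station's thermal reference of known temperature."""
    return REFERENCE_TEMP_C - reference_reading_c

offset = calibration_offset(24.2)   # the sensor reads the reference 0.8 low
print(26.8 + offset)                # corrected reading: 27.6 degC
```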
The emission signal and the warehousing request signal may be transmitted as wireless signals such as Bluetooth (registered trademark) or Wi-Fi. If the emission signal and similar signals are wireless (radio) signals, the station 500 can receive them even when the robot 100 is in a position from which it cannot see the station 500.
The station 500 may incorporate a speaker (not shown), and the robot 100 may include a sound request unit (not shown). When the robot 100 cannot detect the guide light, the sound request unit of the robot 100 may transmit a sound request signal to the station 500. When a sound request receiving unit (not shown) of the station 500 detects the sound request signal, a sound control unit (not shown) of the station 500 causes the speaker to emit sound of a predetermined frequency. The sound at this time should be in a frequency band that does not make the user uncomfortable and should be easy to localize; it need not be audible. A sound direction identification unit (not shown) of the robot 100 may estimate the direction of the station 500 with a built-in microphone array. With this control method, the robot 100 can estimate the location of the station 500 even when the station 500 cannot be seen from the robot 100. Since sound has the drawback that reflections off walls make the sound source hard to localize, it is considered preferable to use it as a supplement when the guide light cannot be detected.
 FIG. 19 is an external view of a station 550 according to a modified example.
 In the station 550 of the modified example, instead of providing the short-range guiding unit 252 under the rear panel 508, two ultrasonic generators 552 and an infrared generator 554 are embedded in the base 504. The ultrasonic generator 552RR, the ultrasonic generator 552RL, and the infrared generator 554R form the right short-range guiding unit 556R. Likewise, the ultrasonic generator 552LL, the ultrasonic generator 552LR, and the infrared generator 554L form the left short-range guiding unit 556L. The differences from the station 500 shown in FIG. 7 are that two ultrasonic generators 552 are provided per short-range guiding unit 556, and that they are placed low enough to sit at roughly the same height as the infrared sensor 172 and the microphones 174 on the back of the robot 100. The functions of the infrared generators 554R and 554L may instead be performed collectively by the mid-range guiding unit 254; that is, the infrared generators 554R and 554L may be omitted and only the infrared generator of the mid-range guiding unit 254 provided. When the infrared generator of the mid-range guiding unit 254 is used as the infrared generator of the short-range guiding unit 556, it is controlled in conjunction with the operation of the left and right ultrasonic generators 552.
 The ultrasonic generators 552RR and 552RL are placed at positions symmetrical about the virtual center line of the right space 502R, and their ultrasonic waves are emitted toward the front. Providing two ultrasonic generators 552, left and right, makes it possible to calculate the distance from each ultrasonic generator 552 to the robot 100. Since the position of each ultrasonic generator 552 within the station 550 is fixed, the position of the robot 100 relative to the station 550 can then be determined. According to this modified example, both the position of the robot 100 relative to the station 550 and the orientation of the robot 100 with respect to the station 550 can be determined, enabling more accurate guidance.
 In the station 500 shown in FIG. 7, the right guiding unit 252R is arranged on the reference approach line of the approach path, and because the infrared light has a certain directivity, being able to receive it only tells the robot that it is somewhere within a fan-shaped range that includes the approach path. Since the infrared beam spreads, the farther the robot is from the station, the harder it becomes to determine its exact position. In the station 550 shown in FIG. 19, providing ultrasonic generators on the left and right allows the position of the robot 100 to be determined as coordinates, and the orientation of the robot 100 with respect to the station 550 can also be determined. As a result, the station 550 of FIG. 19 can guide the robot more accurately than the station 500 of FIG. 7.
 FIG. 20 is a schematic diagram for explaining a method of determining the position and heading of the robot 100.
 The distances from the ultrasonic generators 552RR and 552RL to the robot 100 are indicated by dotted arcs, and the intersection of the two arcs is the position P of the robot 100. Strictly speaking, the position P is the position of one of the left and right microphones 174 on the back of the robot 100; in this figure, position P is taken to be the position of the left microphone 174L. The left microphone 174L receives ultrasonic waves from each of the ultrasonic generators 552RR and 552RL. It receives the ultrasonic wave from the ultrasonic generator 552RR some time after the infrared sensor 172 receives the infrared light, and from this time difference the distance measuring unit 162 measures the distance A from the ultrasonic generator 552RR to the left microphone 174L. Similarly, the left microphone 174L receives the ultrasonic wave from the ultrasonic generator 552RL after the infrared sensor 172 receives the infrared light, and the distance measuring unit 162 measures the distance B from the ultrasonic generator 552RL to the left microphone 174L. The intersection of a circle of radius A centered on the ultrasonic generator 552RR and a circle of radius B centered on the ultrasonic generator 552RL is identified as the position coordinate L of the left microphone 174L (position P).
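 The following is a minimal sketch of the two steps just described: converting the IR-to-ultrasound delay into a range (treating the infrared travel time as effectively zero) and intersecting the two range circles. The coordinates and delays are illustrative values, not taken from the specification.

```python
# Hypothetical sketch: range from the IR/ultrasound arrival-time gap,
# then 2D position from the intersection of two range circles.
import math

SPEED_OF_SOUND = 343.0  # m/s; the infrared travel time is treated as zero


def range_from_delay(delay_s: float) -> float:
    """Distance implied by the lag between IR and ultrasound arrival."""
    return SPEED_OF_SOUND * delay_s


def intersect_circles(p1, r1, p2, r2):
    """Return the two intersection points of circles (p1, r1) and
    (p2, r2), or None if they do not intersect."""
    (x1, y1), (x2, y2) = p1, p2
    d = math.hypot(x2 - x1, y2 - y1)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return None
    a = (r1**2 - r2**2 + d**2) / (2 * d)   # distance from p1 along the baseline
    h = math.sqrt(max(0.0, r1**2 - a**2))  # offset perpendicular to the baseline
    mx = x1 + a * (x2 - x1) / d
    my = y1 + a * (y2 - y1) / d
    ox = h * (y2 - y1) / d
    oy = h * (x2 - x1) / d
    return (mx + ox, my - oy), (mx - ox, my + oy)


# Usage: emitters 20 cm apart on the base; keep the solution in front
# of the station (y > 0) as position P.
A = range_from_delay(3.2e-3)  # ~1.10 m from 552RR
B = range_from_delay(3.5e-3)  # ~1.20 m from 552RL
candidates = intersect_circles((0.10, 0.0), A, (-0.10, 0.0), B)
if candidates is not None:
    position_P = max(candidates, key=lambda p: p[1])
    print(position_P)
```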
 The position coordinate R of the right microphone 174R can be obtained in the same way. Once the position coordinates of the left microphone 174L and the right microphone 174R are known, the midpoint of the line segment between them can be identified as the position coordinate of the robot 100. Furthermore, as described with reference to FIG. 15, the orientation of the robot 100 with respect to an ultrasonic generator 552 can be calculated from the difference between the times at which the left microphone 174L and the right microphone 174R detect the ultrasonic wave. In this modified example there are two ultrasonic generators 552, left and right, so the orientation of the robot 100 with respect to each generator can be calculated and then subjected to numerical processing such as averaging to obtain a more accurate orientation.
 Also, in this modified example, the positions of the left and right microphones 174 on the back of the robot 100 can each be identified as position coordinates. The direction of the normal to the line segment connecting the two microphones 174 may then be taken as the orientation of the robot.
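 A brief sketch of this pose computation follows, assuming the two microphone coordinates have already been obtained as above. Which of the two normals points toward the robot's front depends on the mounting convention, so the sign choice here is illustrative.

```python
# Hypothetical sketch: robot pose from the two microphone coordinates.
# The midpoint gives the position; a normal to the mic-to-mic segment
# gives the heading. Axis convention and names are illustrative.
import math


def pose_from_mics(left_mic, right_mic):
    """left_mic, right_mic: (x, y) coordinates in the station frame,
    in meters. Returns ((x, y), heading) with heading in radians."""
    (lx, ly), (rx, ry) = left_mic, right_mic
    position = ((lx + rx) / 2.0, (ly + ry) / 2.0)
    sx, sy = rx - lx, ry - ly      # vector from left mic to right mic
    nx, ny = -sy, sx               # one of the segment's two normals
    heading = math.atan2(ny, nx)   # direction of that normal
    return position, heading


# Usage: mics at (0.05, 1.0) and (-0.05, 1.0) give a robot centered at
# (0.0, 1.0) with a heading along one normal of the segment.
print(pose_from_mics((0.05, 1.0), (-0.05, 1.0)))
```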
 The ultrasonic generators 552RR and 552RL generate ultrasonic waves of the same frequency. The infrared generator 554R emits infrared light at the same time as an ultrasonic generator 552 emits its ultrasonic wave, transmitting information that identifies which ultrasonic generator emitted the wave. For example, the infrared generator 554R periodically alternates between an infrared code indicating the right ultrasonic generator 552RR and one indicating the left ultrasonic generator 552RL, and, in synchronization with this, the ultrasonic generators 552RR and 552RL emit their ultrasonic waves alternately. The robot 100 moves more slowly during short-range guidance than during normal movement, and the distance it covers within one ultrasonic emission cycle is extremely short, so the position of the robot 100 can be measured almost exactly even though the ultrasonic generators 552RR and 552RL emit alternately.
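 One way to realize this alternating scheme is a simple time-multiplexed emission loop, sketched below; the cycle length and the hardware stand-in callables are assumptions for illustration, not part of the specification.

```python
# Hypothetical sketch of the time-multiplexed emission scheme: each
# slot, the IR generator broadcasts a source ID while the matching
# ultrasonic generator fires, so a single-frequency receiver can tell
# the two ranges apart by pairing each decoded ID with the next
# ultrasonic arrival. Timing values are illustrative.
import itertools
import time

CYCLE_S = 0.1  # assumed length of one emission slot per source


def station_emission_loop(ir_tx, us_tx_right, us_tx_left):
    """ir_tx(source_id), us_tx_right(), us_tx_left() are stand-ins for
    the infrared and ultrasonic hardware drivers. Runs indefinitely."""
    for source_id in itertools.cycle(("552RR", "552RL")):
        ir_tx(source_id)        # the IR code names the source...
        if source_id == "552RR":
            us_tx_right()       # ...and that source fires in this slot
        else:
            us_tx_left()
        time.sleep(CYCLE_S)
```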
 Alternatively, the ultrasonic generators 552RR and 552RL may generate ultrasonic waves of different frequencies. In that case, even if the two generators emit ultrasonic waves simultaneously and periodically, the microphones 174 can recognize which generator each wave came from.
 This application claims priority based on Japanese Patent Application No. ××××-××××××, filed ××××/×/×, the entire disclosure of which is incorporated herein by reference.

Claims (14)

  1.  A charging station comprising:
     a charging space having a power supply terminal;
     a charging control unit that charges a secondary battery built into a robot when the robot and the power supply terminal are connected in the charging space;
     a light source;
     a light emission instruction receiving unit that receives, from the robot, a light emission signal designating a light emission mode; and
     a light emission control unit that changes the light emission mode of the light source in accordance with the designated light emission mode.
  2.  The charging station according to claim 1, wherein the light emission signal designates, as the light emission mode, one or more of a light emission amount, a blinking cycle, a light emission color, and a light emission pattern of the light source.
  3.  The charging station according to claim 1 or 2, wherein the light source includes a first light source and a second light source, and the first light source is installed in front of the second light source when the charging station is viewed from the front.
  4.  The charging station according to any one of claims 1 to 3, comprising a plurality of charging spaces and capable of charging a plurality of robots simultaneously.
  5.  The charging station according to claim 4, further comprising:
     a docking request receiving unit that receives, from a first robot, a docking request signal requesting entry into a charging space;
     a docking determination unit that determines whether the first robot may enter the charging space; and
     a docking permission transmitting unit that transmits, to the first robot, a docking permission signal indicating whether entry is permitted,
     wherein the docking determination unit rejects entry of the first robot when a second robot is returning to the charging station and the second robot has not yet started charging in any charging space.
  6.  The charging station according to any one of claims 1 to 5, formed integrally with a server device that controls operation of the robot, wherein the server device comprises:
     an event receiving unit that receives, from the robot, event information recognized by the robot; and
     an operation control unit that selects a motion of the robot in accordance with the event information.
  7.  An autonomously acting robot comprising:
     an operation control unit that selects a motion of the robot;
     a drive mechanism that executes the motion selected by the operation control unit;
     a light emission instruction transmitting unit that transmits a light emission signal designating a light emission mode; and
     a light recognition unit that recognizes external light,
     wherein the light emission instruction transmitting unit transmits the light emission signal when a secondary battery built into the robot is to be charged,
     the light recognition unit identifies external light corresponding to the designated light emission mode, and
     the operation control unit determines a moving direction of the robot by taking the emission point of the external light as the location of a charging station.
  8.  The autonomously acting robot according to claim 7, further comprising:
     a captured image acquiring unit that acquires a captured image;
     an image feature acquiring unit that acquires image feature information by extracting feature points from the captured image; and
     a map management unit that generates a map based on the image feature information,
     wherein the map management unit generates the map with the emission point of the charging station as a reference.
  9.  The autonomously acting robot according to claim 8, wherein the light emission instruction transmitting unit transmits the light emission signal when the robot loses track of its current position on the map.
  10.  A landmark device comprising:
     a light source;
     a light emission instruction receiving unit that receives, from a robot, a light emission signal designating a light emission mode; and
     a light emission control unit that changes the light emission mode of the light source in accordance with the designated light emission mode.
  11.  A guidance system including a robot, a landmark device, and a charging station,
     wherein the robot comprises:
     a light recognition unit that recognizes external light;
     an operation control unit that selects a motion of the robot;
     a drive mechanism that executes the motion selected by the operation control unit; and
     a light emission instruction transmitting unit that transmits a light emission signal designating a light emission mode,
     the light emission instruction transmitting unit transmitting the light emission signal when a secondary battery built into the robot is to be charged,
     the light recognition unit identifying external light corresponding to the designated light emission mode,
     the operation control unit determining a moving direction of the robot by taking the emission point of the external light as a movement target point;
     the landmark device comprises:
     a light source;
     a light emission instruction receiving unit that receives the light emission signal from the robot; and
     a light emission control unit that changes the light emission mode of the light source in accordance with the light emission mode designated by the light emission signal;
     the charging station comprises:
     a charging space having a power supply terminal;
     a charging control unit that charges the secondary battery of the robot when the robot and the power supply terminal are connected in the charging space;
     a light source;
     a light emission instruction receiving unit that receives the light emission signal from the robot; and
     a light emission control unit that changes the light emission mode of the light source in accordance with the light emission mode designated by the light emission signal,
     wherein the light emission instruction transmitting unit of the robot transmits a first light emission signal designating a first light emission mode,
     the light emission control unit of the landmark device changes the light emission mode of its light source in accordance with the first light emission mode designated by the first light emission signal,
     the light recognition unit of the robot identifies the external light in the first light emission mode,
     the operation control unit of the robot determines a moving direction of the robot by taking the emission point of the external light in the first light emission mode as a movement target point,
     the light emission instruction transmitting unit of the robot transmits a second light emission signal designating a second light emission mode when the robot reaches the location of the landmark device,
     the light emission control unit of the charging station changes the light emission mode of its light source in accordance with the second light emission mode designated by the second light emission signal,
     the light recognition unit of the robot identifies the external light in the second light emission mode, and
     the operation control unit of the robot determines the next moving direction of the robot by taking the emission point of the external light in the second light emission mode as a movement target point.
  12.  A guidance system including a first robot, a second robot, and a charging station,
     wherein each of the first robot and the second robot comprises:
     a light recognition unit that recognizes external light;
     an operation control unit that selects a motion of the robot;
     a drive mechanism that executes the motion selected by the operation control unit;
     a light emission instruction transmitting unit that transmits a light emission signal designating a light emission mode;
     a light source;
     a light emission instruction receiving unit that receives a light emission signal from the other robot; and
     a light emission control unit that changes the light emission mode of the light source in accordance with the light emission mode designated by the light emission signal,
     the light emission instruction transmitting unit transmitting the light emission signal when a secondary battery built into the robot is to be charged,
     the light recognition unit identifying external light corresponding to the designated light emission mode,
     the operation control unit determining a moving direction of the robot by taking the emission point of the external light as a movement target point;
     the charging station comprises:
     a charging space having a power supply terminal;
     a charging control unit that charges the secondary battery of a robot when the robot and the power supply terminal are connected in the charging space;
     a light source;
     a light emission instruction receiving unit that receives a light emission signal from a robot; and
     a light emission control unit that changes the light emission mode of the light source in accordance with the light emission mode designated by the light emission signal,
     wherein the light emission instruction transmitting unit of the first robot transmits a first light emission signal designating a first light emission mode,
     the light emission control unit of the second robot changes the light emission mode of the second robot's light source in accordance with the first light emission mode designated by the first light emission signal,
     the light recognition unit of the first robot identifies the external light in the first light emission mode,
     the operation control unit of the first robot determines a moving direction of the first robot by taking the emission point of the external light in the first light emission mode as a movement target point,
     the light emission instruction transmitting unit of the first robot transmits a second light emission signal designating a second light emission mode when the first robot reaches the location of the second robot,
     the light emission control unit of the charging station changes the light emission mode of the charging station's light source in accordance with the second light emission mode designated by the second light emission signal,
     the light recognition unit of the first robot identifies the external light in the second light emission mode, and
     the operation control unit of the first robot determines the next moving direction of the first robot by taking the emission point of the external light in the second light emission mode as a movement target point.
  13.  The guidance system according to claim 12, wherein each of the first robot and the second robot further comprises a position determination unit that determines whether the robot's current position relative to the charging station satisfies a predetermined position condition, and
     when the position condition is not satisfied for the first robot, the operation control unit of the second robot sets the range within which the position condition is satisfied as the action range of the second robot.
  14.  A behavior control program for an autonomously acting robot, the program causing a computer to perform:
     a function of selecting a motion of the robot;
     a function of causing a drive mechanism to execute the selected motion;
     a function of transmitting a light emission signal designating a light emission mode;
     a function of recognizing the emission point of external light in the designated light emission mode; and
     a function of, when external light emitted in the light emission mode designated by the light emission signal is recognized, determining a moving direction of the robot by taking the emission point of the external light as the location of a charging station.
PCT/JP2019/049459 2018-12-17 2019-12-17 Robot, charging station for robot, and landmark device WO2020129992A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020561465A JP7414285B2 (en) 2018-12-17 2019-12-17 Robots and charging stations and landmark devices for robots
JP2023216323A JP2024045110A (en) 2018-12-17 2023-12-21 Robots and charging stations and landmark devices for robots

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018235567 2018-12-17
JP2018-235567 2018-12-17

Publications (1)

Publication Number Publication Date
WO2020129992A1 true WO2020129992A1 (en) 2020-06-25

Family

ID=71101980

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/049459 WO2020129992A1 (en) 2018-12-17 2019-12-17 Robot, charging station for robot, and landmark device

Country Status (2)

Country Link
JP (2) JP7414285B2 (en)
WO (1) WO2020129992A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004151924A (en) * 2002-10-30 2004-05-27 Sony Corp Autonomous mobile robot and control method for the same
JP2004216552A (en) * 2003-01-11 2004-08-05 Samsung Electronics Co Ltd Mobile robot, its autonomous travel system, and method
JP2009116634A (en) * 2007-11-07 2009-05-28 Nec Access Technica Ltd Charging control device, charging control system, and charging control method and program used therefor
US20160370804A1 (en) * 2015-01-14 2016-12-22 Varram System Co., Ltd. Mobile robot and method for docking the mobile robot with charging station
US20180191181A1 (en) * 2017-01-04 2018-07-05 Sphero, Inc. Charging unit for mobile device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112484713A (en) * 2020-10-15 2021-03-12 珊口(深圳)智能科技有限公司 Map construction method, navigation method and control system of mobile robot

Also Published As

Publication number Publication date
JPWO2020129992A1 (en) 2021-11-04
JP2024045110A (en) 2024-04-02
JP7414285B2 (en) 2024-01-16

Similar Documents

Publication Publication Date Title
US20230380383A1 (en) Animal wearable devices, systems, and methods
US11376740B2 (en) Autonomously acting robot that recognizes direction of sound source
CN109526208B (en) Action-controlled autonomous robot
JP6472113B2 (en) Autonomous robots and programs that maintain a natural sense of distance
JP6884401B2 (en) Autonomous robot wearing clothes
JP4867779B2 (en) Pet guiding robot and pet guiding method
US11135726B2 (en) Autonomously acting robot that accepts a guest
JP6557840B2 (en) Robot, server and action control program
US20210283516A1 (en) Robot that wears clothes
US11519456B2 (en) Joint structure appropriate for robot joint
JP6755447B2 (en) Autonomous action robot with emergency stop function
JP2024045110A (en) Robots and charging stations and landmark devices for robots
US11926230B2 (en) Robot charging station
JP6734607B2 (en) Robots, portable items and robot control programs

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19899300

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020561465

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19899300

Country of ref document: EP

Kind code of ref document: A1