WO2018221334A1 - Clothing manufacturing support system


Info

Publication number
WO2018221334A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
costume
unit
user
tag
Application number
PCT/JP2018/019775
Other languages
French (fr)
Japanese (ja)
Inventor
小泉 実
要 林
Original Assignee
Groove X株式会社
Application filed by Groove X株式会社
Priority to JP2019522151A (granted as JP6579538B2)
Publication of WO2018221334A1

Classifications

    • A HUMAN NECESSITIES
    • A41 WEARING APPAREL
    • A41H APPLIANCES OR METHODS FOR MAKING CLOTHES, e.g. FOR DRESS-MAKING OR FOR TAILORING, NOT OTHERWISE PROVIDED FOR
    • A41H43/00 Other methods, machines or appliances
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H11/00 Self-movable toy figures
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63H TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H3/00 Dolls
    • A63H3/36 Details; Accessories
    • A63H3/52 Dolls' houses, furniture or other equipment; Dolls' clothing or footwear
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions

Definitions

  • the present invention relates to an apparatus for supporting robot costume making.
  • as robots come to feel closer to living beings, it is expected that robots will be dressed just as pets are, and the production and sale of robot costumes may be established as a business. Some users may not be satisfied with buying ready-made costumes and may want to make original ones. There may therefore also be a business that sells only the cloth or pattern of a costume and entrusts the actual production to the user.
  • a robot's driving force and movable range are limited by its design. Wearing a costume can hinder the robot's movement and shorten its life. The inventors have come to recognize that it is effective to support the production of robot costumes with such points in mind.
  • the present invention was completed on the basis of the above recognition, and one of its objects is to provide an apparatus suitable for supporting robot costume production.
  • the costume production support apparatus includes a storage unit that holds information for guiding the production of a robot costume, and a selection unit that refers to the storage unit in accordance with the user's input request to select the costume to be produced.
  • FIG. 2 is a cross-sectional view schematically showing the structure of the robot. FIG. 3 is a front external view of the robot wearing a costume. FIG. 4 is a block diagram of the robot system. FIG. 5 is a hardware configuration diagram of the robot. FIG. 6 schematically shows the configuration of the costume production support system. FIG. 7 is a functional block diagram of the costume production support system. FIGS. 8 and 9 show the costume production support screens that the costume management server provides to the user. FIG. 10 is a block diagram showing the functions of the costume management server in detail.
  • FIG. 1 is a view showing the appearance of a robot 100 according to the embodiment.
  • FIG. 1(a) is a front view, and FIG. 1(b) is a side view.
  • the robot 100 is an autonomously acting robot that determines its actions and gestures based on the external environment and its internal state.
  • the external environment is recognized by various sensors such as a camera and a thermo sensor.
  • the internal state is quantified as various parameters that express the emotion of the robot 100.
  • the robot 100 is premised on indoor action, and for example, takes the inside of the user's house as the action range.
  • a human involved in the robot 100 is referred to as a “user”.
  • the body 104 of the robot 100 has a rounded overall shape and includes an outer skin 314 formed of a soft, resilient material. Because the body 104 is round, soft, and pleasant to touch, the robot 100 gives the user a sense of security and a pleasant tactile feel. The robot 100 can be dressed; by putting on clothes that suit the user's taste, the user can enjoy a sense of the seasons and events such as birthdays.
  • the robot 100 includes three wheels for traveling: as shown, a pair of front wheels 102 (left wheel 102a, right wheel 102b) and one rear wheel 103.
  • the front wheel 102 is a driving wheel
  • the rear wheel 103 is a driven wheel.
  • the rotational speed and rotational direction of each front wheel 102 can be controlled individually.
  • the rear wheel 103 turns freely, allowing the robot 100 to move back and forth and to turn left and right.
  • the front wheels 102 and the rear wheel 103 can be completely housed in the body 104 by drive mechanisms (a rotation mechanism and a link mechanism), not shown. Even when traveling, most of each wheel is hidden by the body 104, and when the wheels are completely housed in the body 104, the robot 100 cannot move. As the wheels retract, the body 104 descends and seats itself on the floor surface F. In this seated state, the seating surface 108 formed on the bottom of the body 104 abuts the floor surface F.
  • the robot 100 has two hands 106.
  • the hand 106 does not have the function of gripping an object.
  • the hand 106 can perform simple operations such as raising, shaking and vibrating by pulling or loosening a built-in wire (not shown).
  • the two hands 106 are also individually controllable.
  • Two eyes 110 are provided on the front of the head (face) of the robot 100.
  • the eyes 110 can display various expressions by means of liquid crystal elements or organic EL elements.
  • the robot 100 has a built-in speaker and can emit a simple voice.
  • a horn 112 is attached to the top of the head of the robot 100.
  • an omnidirectional camera is incorporated in the horn 112 and can capture the entire surroundings, vertical and horizontal, at once.
  • a high resolution camera is provided in front of the head of the robot 100 (not shown).
  • FIG. 2 is a cross-sectional view schematically showing the structure of the robot 100.
  • the body 104 of the robot 100 includes a base frame 308, a body frame 310, a pair of wheel covers 312 and an outer shell 314.
  • the base frame 308 constitutes an axial center of the body 104 and supports an internal mechanism.
  • the base frame 308 is configured by connecting the upper plate 332 and the lower plate 334 by a plurality of side plates 336.
  • the base frame 308 accommodates a battery 118, a control circuit 342, various actuators, and the like.
  • Body frame 310 includes a head frame 316 and a torso frame 318.
  • the head frame 316 has a hollow hemispherical shape and forms a head skeleton of the robot 100.
  • the torso frame 318 has a stepped cylindrical shape and forms the torso skeleton of the robot 100.
  • the torso frame 318 is fixed to the base frame 308.
  • the head frame 316 is connected to the upper plate 332 via a joint 330, internal mechanisms, and the like, and can be displaced relative to the torso frame 318.
  • the head frame 316 is provided with three axes of a yaw axis 321, a pitch axis 322, and a roll axis 323, and actuators 324, 325 for rotationally driving each axis.
  • the actuator 324 includes a servomotor for driving the yaw axis 321.
  • the actuator 325 includes a plurality of servomotors for driving the pitch axis 322 and the roll axis 323, respectively.
  • the yaw axis 321 is driven for the head-shaking motion, the pitch axis 322 for the nodding, looking-up, and looking-down motions, and the roll axis 323 for the head-tilting motion.
  • a plate 326 supported by the yaw axis 321 is fixed to the top of the head frame 316.
  • a base plate 328 is provided to support the head frame 316 and its internal mechanisms from below.
  • the base plate 328 is connected to the upper plate 332 (base frame 308) through the joint 330.
  • a support base 335 is provided on the base plate 328, and the actuators 324 and 325 and the cross link mechanism 329 (pantograph mechanism) are supported.
  • the crosslink mechanism 329 connects the actuators 324 and 325 vertically and can change the distance between them.
  • by rotating the yaw axis 321, the plate 326 and the head frame 316 rotate together (yawing), realizing the head-shaking motion.
  • by rotating the pitch axis 322, the crosslink mechanism 329 and the head frame 316 rotate together (pitching), realizing the nodding motion and the like.
  • by rotating the roll axis 323, the actuator 325 and the head frame 316 rotate together (rolling), realizing the head-tilting motion.
  • by expanding and contracting the crosslink mechanism 329, the neck can be extended and retracted.
  • Torso frame 318 houses base frame 308 and wheel drive mechanism 370.
  • Wheel drive mechanism 370 includes a front wheel drive mechanism that drives front wheel 102, a rear wheel drive mechanism that drives rear wheel 103, and an actuator 379 that drives these drive mechanisms.
  • the torso frame 318 has a smooth, curved upper half so that the outline of the body 104 is rounded. Its lower half is narrowed to form, together with the wheel covers 312, the storage space S for the front wheels 102, and it supports the pivot shaft 378 of the front wheels 102.
  • the pair of wheel covers 312 covers the lower half of the torso frame 318 from the left and right.
  • the wheel covers 312 form a smooth outer (curved) surface continuous with the upper half of the torso frame 318.
  • the upper end of the wheel cover 312 is connected along the lower end of the upper half.
  • a storage space S opened downward is formed between the side wall of the lower half and the wheel cover 312.
  • the front wheel drive mechanism includes a rotation drive mechanism for rotating the front wheel 102 and a storage operation mechanism for advancing and retracting the front wheel 102 from the storage space S.
  • the outer skin 314 covers the main body frame 310 from the outside.
  • the outer skin 314 is thick enough for a person to feel elasticity and is formed of a stretchable material such as urethane sponge. As a result, when the user holds the robot 100, it feels appropriately soft, and people naturally engage in physical contact with it as they would with a pet.
  • a capacitive touch sensor is provided between the main body frame 310 and the outer skin 314.
  • the touch sensors are provided at a plurality of locations and detect touch over substantially the entire body of the robot 100. Since the touch sensors lie inside the outer skin 314, the detection level increases as the outer skin 314 deforms.
  • the hand 106 is integrally formed with the skin 314.
  • An opening 390 is provided at the upper end of the outer skin 314.
  • the lower end of the horn 112 is connected to the head frame 316 through the opening 390.
  • the drive mechanism for driving the hand 106 includes a wire 134 embedded in the outer skin 314 and a drive circuit 340 (energization circuit) thereof.
  • the wire 134 is formed of shape memory alloy wire in the present embodiment; it shrinks and hardens when heated, and relaxes and elongates when cooled. Leads drawn from both ends of the wire 134 are connected to the drive circuit 340. When the switch of the drive circuit 340 is turned on, the wire 134 (shape memory alloy wire) is energized.
  • the wire 134 is molded or woven in so as to extend from the outer skin 314 into the hand 106. Leads are drawn from both ends of the wire 134 into the torso frame 318. One wire 134 may be provided on each of the left and right sides of the outer skin 314, or a plurality of wires 134 may be provided in parallel.
  • the arm (hand 106) can be raised by energizing the wire 134, and the arm (hand 106) can be lowered by interrupting the energization.
  • alternatively, a wire may be attached near the tip of the hand 106 and a mechanism for winding the wire provided in the torso frame 318, so that the length of the wire is adjusted by winding and paying it out.
  • in the state where the robot 100 wears no clothes and only the outer skin 314 is mounted (hereinafter also referred to as the "reference state"), the movable range of the arm is 75 degrees: it can be raised 75 degrees upward from the position where it rests against the body (hereinafter written as a movable range of "0 to 75°").
  • the head frame 316 is connected to the torso frame 318 via the base plate 328, the joint 330, and the like. As shown, a sufficient vertical distance is secured between the head frame 316 and the torso frame 318, so a large movable range (the rotational range of the head frame 316 about the pitch axis 322) can be obtained.
  • the vertical movable range of the head frame 316 in the reference state is 90 degrees: 45 degrees up and down from the posture in which the line of sight is horizontal. That is, the limit of the angle at which the robot 100 looks up is 45 degrees, and the limit of the angle at which it looks down is -45 degrees (hereinafter written as a movable range of "-45 to 45°").
  • the left-right movable range of the head frame 316 in the reference state is 150 degrees: 75 degrees to the left and right from the posture in which the line of sight faces front. That is, the limit of the angle at which the robot 100 turns its head to the right is 75 degrees, and the limit of the angle to the left is -75 degrees (hereinafter written as a movable range of "-75 to 75°").
  • the tilt movable range of the head frame 316 in the reference state is 60 degrees: 30 degrees to the left and right from the posture in which the head is upright. That is, the limit at which the robot 100 tilts its head to the right is 30 degrees, and the limit to the left is -30 degrees (hereinafter written as a movable range of "-30 to 30°").
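For concreteness, these reference-state ranges can be held as a small lookup table against which commanded angles are clamped. A minimal sketch in Python; the axis names and the function are illustrative, not identifiers from the embodiment:

```python
# Reference-state movable ranges (degrees), as described above.
# The keys and structure are illustrative; the patent defines only the values.
REFERENCE_RANGES = {
    "arm":        (0.0, 75.0),    # raised from resting against the body
    "head_pitch": (-45.0, 45.0),  # look-down .. look-up
    "head_yaw":   (-75.0, 75.0),  # left .. right of front
    "head_roll":  (-30.0, 30.0),  # tilt left .. tilt right
}

def clamp_to_range(axis: str, angle: float) -> float:
    """Clamp a commanded angle to the movable range of the given axis."""
    lo, hi = REFERENCE_RANGES[axis]
    return max(lo, min(hi, angle))
```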
  • when the robot 100 wears a costume, it adjusts its movable parts according to the costume. Details will be described later.
  • FIG. 3 is a front external view of the robot 100 when wearing a costume.
  • the user can wear various types of costumes 180 on the robot 100.
  • various types of costumes 180 are conceivable, such as a casually designed polo shirt, a formally designed suit, a winter coat or down jacket, or a costume imitating a cartoon character.
  • the robot 100 moves its movable parts by actuators such as motors.
  • there is a design upper limit to the driving power of each part. When the robot wears a polo shirt made of a light, easy-to-move fabric, the load on the movable parts is small; when it wears a coat made of a heavy, stiff material such as leather, the load on the movable parts increases.
  • depending on the design shape of the costume 180 and the fabric used, the costume may hinder proper operation control of the robot 100 wearing it.
  • for example, the driving force preset for the robot 100 may be unable to move a joint.
  • or the detection sensitivity of a sensor such as a touch sensor may change, causing problems in the robot 100's recognition of the external environment.
  • the robot 100 has clear limits on driving force and movable range; if those limits are exceeded, the robot may not move at all, or an actuator such as a motor may overheat.
  • the costume can therefore be said to be an important factor affecting the robot 100's normal operation.
  • the costume production support apparatus of the present embodiment supports the production of robot costumes and also provides information (hereinafter referred to as an "operation setting file") for adjusting the robot so that it performs proper operation control when the costume is worn.
  • the robot 100 adjusts the operation conditions of each movable unit according to the costume.
  • the operating conditions include information for canceling the influence that wearing the costume 180 has on the robot 100, such as the driving force of each movable part, its movable range, and sensitivity corrections for sensors such as the touch sensors.
  • in other words, they include information for controlling the robot 100 so that it is not adversely affected by the changes in its characteristics caused by wearing the costume 180.
  • the robot 100 can thus acquire the operating conditions (correction information) suited to the costume being worn.
  • the tag ID functions as "identification information" for acquiring an operation setting file defining operation conditions for each actuator.
  • the IC tag 182 is sewn on a specific position of the costume 180.
  • the IC tag 182 is an RFID (Radio Frequency Identifier) tag.
  • the IC tag 182 transmits its tag ID over a short range. By reading the tag ID from the IC tag 182, the robot 100 can acquire correction information for each actuator that matches the costume 180.
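The exchange implied here — read a tag ID at close range, then trade it for per-actuator correction information — can be sketched as follows. The function names and callables are assumptions for illustration; in the embodiment the request is routed through the robot management server 200:

```python
def on_costume_detected(read_tag_id, fetch_operation_settings, apply_corrections):
    """Sketch of the tag-ID exchange described above.

    read_tag_id, fetch_operation_settings, and apply_corrections are
    placeholder callables: a short-range RFID read, a query routed through
    the robot management server 200, and application of the per-actuator
    correction values, respectively.
    """
    tag_id = read_tag_id()  # None when no IC tag 182 is in range
    if tag_id is None:
        return None  # no official costume detected
    settings = fetch_operation_settings(tag_id)  # operation setting file
    apply_corrections(settings)  # driving force, movable range, sensor sensitivity
    return settings
```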
  • FIG. 4 is a block diagram of the robot system 300.
  • the robot system 300 includes a robot 100, a robot management server 200, and a plurality of external sensors 114.
  • a plurality of external sensors 114 (external sensors 114a, 114b, ..., 114n) are installed in advance in the house.
  • the external sensor 114 may be fixed to the wall of the house or may be mounted on the floor.
  • the position coordinates of each external sensor 114 are registered in advance. The position coordinates are defined as x, y coordinates within the house assumed as the robot 100's action range.
  • the robot management server 200 determines the basic behavior of the robot 100 based on the information obtained from the sensors contained in the robot 100 and the plurality of external sensors 114.
  • the external sensor 114 periodically transmits a wireless signal (hereinafter referred to as a "robot search signal") including the ID of the external sensor 114 (hereinafter referred to as a "beacon ID").
  • on receiving a robot search signal, the robot 100 sends back a wireless signal (hereinafter referred to as a "robot reply signal") including the beacon ID.
  • the robot management server 200 measures the time from when the external sensor 114 transmits the robot search signal until the robot reply signal is received, and from it derives the distance from the external sensor 114 to the robot 100. By measuring the distances between the robot 100 and a plurality of external sensors 114, the position coordinates of the robot 100 are identified.
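The position fix described here is a standard time-of-flight and multilateration computation. A minimal sketch assuming ideal timing and three sensors with known x, y coordinates; none of these names come from the embodiment:

```python
def distance_from_round_trip(rtt_seconds: float, speed: float = 3.0e8) -> float:
    """Distance implied by a search-signal/reply-signal round trip (one way = half)."""
    return rtt_seconds * speed / 2.0

def trilaterate(p1, d1, p2, d2, p3, d3):
    """Solve for (x, y) from three sensor positions and measured distances.

    Subtracting the first circle equation from the other two linearizes the
    system, leaving a 2x2 linear solve.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```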
  • FIG. 5 is a hardware configuration diagram of the robot 100.
  • the robot 100 includes an internal sensor 128, a communicator 126, a storage device 124, a processor 122, a drive mechanism 120 and a battery 118.
  • the drive mechanism 120 includes the wheel drive mechanism 370 described above.
  • Processor 122 and storage 124 are included in control circuit 342.
  • the units are connected to each other by a power supply line 130 and a signal line 132.
  • the battery 118 supplies power to each unit via the power supply line 130. Each unit transmits and receives control signals through a signal line 132.
  • the battery 118 is a lithium ion secondary battery and is a power source of the robot 100.
  • the internal sensor 128 is an assembly of the various sensors incorporated in the robot 100: specifically, a camera (the omnidirectional camera), a microphone array, a distance measurement sensor (infrared sensor), a thermo sensor, a touch sensor, an acceleration sensor, an odor sensor, and the like.
  • the touch sensor is disposed between the outer cover 314 and the body frame 310, and detects a user's touch based on a change in capacitance.
  • the odor sensor is a known sensor that applies the principle that electrical resistance changes when the molecules that are the source of an odor are adsorbed.
  • the communication device 126 is a communication module that performs wireless communication with various external devices such as the robot management server 200, the external sensor 114, and a portable device owned by a user.
  • the storage device 124 is configured by a non-volatile memory and a volatile memory, and stores a computer program and various setting information.
  • the processor 122 is an execution means of a computer program.
  • the drive mechanism 120 is the group of actuators that control the internal mechanisms. In addition, indicators and speakers are also installed.
  • the processor 122 performs action selection of the robot 100 while communicating with the robot management server 200 and the external sensor 114 via the communication device 126.
  • Various external information obtained by the internal sensor 128 also affects behavior selection.
  • the drive mechanism 120 mainly controls the wheel (front wheel 102) and the head (head frame 316).
  • the drive mechanism 120 changes the moving direction and moving speed of the robot 100 by changing the rotational speed and rotational direction of the two front wheels 102.
  • the drive mechanism 120 can also raise and lower the wheels (the front wheel 102 and the rear wheel 103). When the wheel ascends, the wheel is completely housed in the body 104, and the robot 100 abuts on the floor surface F at the seating surface 108 to be in the seating state.
  • the drive mechanism 120 also controls the hand 106 via the wire 134.
  • FIG. 6 is a view schematically showing the configuration of the costume production support system 500.
  • the costume production support system 500 provides a support service for producing an original costume of the robot 100 in response to a request from the user. This service sells the cloth or pattern of the costume to the user, and leaves it to the user to produce the costume such as cutting the cloth or sewing the cloth according to the pattern.
  • the robot management server 200 and the user terminal 250 are connected to the costume management server 400 via the Internet 510.
  • the costume management server 400 is installed at the costume service company 402 and functions as the "costume production support apparatus".
  • the costume management server 400 holds a support program for executing a support service and support data for producing various clothes for each type of robot.
  • the support data includes image data for each costume, pattern data (design information), fabric data, correction data for the robot (operation conditions), and the like.
  • the user terminal 250 may be a general purpose computer such as a laptop PC or a smart phone.
  • the user terminal 250 and the Internet 510 are connected by wire or wirelessly.
  • the outline of the support service is as follows.
  • the costume management server 400 identifies the costume of the robot 100 and its fabric based on the request from the user terminal 250, transmits the pattern data of the costume to the user, and executes the fabric delivery process.
  • in addition, a tag ID for the costume is issued, and its shipping process is executed.
  • the costume management server 400 holds an operation setting file in which operation conditions (such as a driving force and a movable range of the movable portion) of the robot 100 at the time of wearing a costume are recorded in association with a tag ID.
  • the fabric is mailed to the user's home 252 from an outside vendor (a fabric vendor).
  • the tag ID is written to the IC tag 182 by an external vendor (an IC tag issuer) and mailed to the user home 252. That is, the tag ID is provided written into the IC tag 182, in a form the user cannot read directly.
  • when the cloth and the IC tag 182 reach the user home 252, the user cuts the cloth based on the pattern and produces the costume 180. The IC tag 182 is then sewn to the designated position on the costume 180, and the costume 180 is put on the robot 100.
  • the robot 100 transmits the tag ID recorded in the IC tag 182 to the robot management server 200.
  • the robot management server 200 accesses the costume management server 400, acquires an operation setting file corresponding to the tag ID, and transmits the operation setting file to the robot 100.
  • by reflecting the operation conditions recorded in the operation setting file in its control command values, the robot 100 can operate in a manner suited to the costume without straining itself. That is, for a costume 180 produced using the costume management server 400, the robot 100 can make appropriate adjustments to the costume 180 being worn and operate properly.
  • FIG. 7 is a functional block diagram of the costume production support system 500.
  • the costume production support system 500 includes the robot system 300 and the costume management server 400.
  • the robot system 300 includes a robot 100, a robot management server 200, and a plurality of external sensors 114.
  • each component of the robot 100, the robot management server 200, and the costume management server 400 is realized by hardware, including computing units such as CPUs (Central Processing Units) and various co-processors, storage devices such as memory and storage, and wired or wireless communication lines connecting them, and by software stored in the storage devices that supplies processing instructions to the computing units.
  • the computer program may be configured by a device driver, an operating system, various application programs located in the upper layer of them, and a library that provides common functions to these programs.
  • Each block described below indicates not a hardware unit configuration but a function unit block.
  • a part of the functions of the robot 100 may be realized by the robot management server 200, or a part or all of the functions of the robot management server 200 may be realized by the robot 100.
  • the robot management server 200 includes a communication unit 204, a data processing unit 202, and a data storage unit 206.
  • the communication unit 204 takes charge of communication processing with the external sensor 114 and the robot 100.
  • the data storage unit 206 stores various data.
  • the data processing unit 202 executes various processes based on the data acquired by the communication unit 204 and the data stored in the data storage unit 206.
  • the data processing unit 202 also functions as an interface of the communication unit 204 and the data storage unit 206.
  • the data storage unit 206 includes a motion storage unit 232, a map storage unit 216, a personal data storage unit 218, and a correction information storage unit 219.
  • the robot 100 has a plurality of motion patterns (motions). Various motions are defined such as shaking the hand 106, approaching the user while meandering, staring at the user with the head held down, and the like.
  • the motion storage unit 232 stores a "motion file" that defines control content of motion. Each motion is identified by a motion ID. The motion file is also downloaded to the motion storage unit 160 of the robot 100. Which motion is to be executed may be determined by the robot management server 200 or may be determined by the robot 100. Many of the motions of the robot 100 are configured as complex motions including a plurality of unit motions.
  • the map storage unit 216 stores, in addition to the action map defining the action of the robot according to the situation, a map indicating the arrangement situation of obstacles such as a chair and a table.
  • the personal data storage unit 218 stores user information. Specifically, master information indicating the closeness to the user and the physical and behavioral characteristics of the user is stored. Other attribute information such as age and gender may be stored.
  • the robot 100 has an internal parameter of familiarity for each user.
  • when the robot 100 recognizes an action showing favor toward itself, such as being picked up or spoken to, its familiarity with that user increases.
  • familiarity is low toward users not involved with the robot 100, users who treat it roughly, and users it meets infrequently.
  • the correction information storage unit 219 stores the operation condition of the robot 100 acquired from the costume management server 400 as correction information of the movable unit (actuator).
  • the correction information includes an operation correction value (correction value for driving force) and a movable range (setting value for driving amount) of each actuator of the robot 100.
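The correction information described here pairs, for each actuator, a driving-force correction with a movable range. One way to model the records, with illustrative names only (the values echo the kind of entries shown later in FIGS. 12 and 13):

```python
from dataclasses import dataclass

@dataclass
class ActuatorCorrection:
    """Correction information for one actuator, as stored per tag ID."""
    actuator: str             # e.g. "head_yaw" (illustrative name)
    force_coefficient: float  # multiplier applied to the driving force
    range_deg: tuple          # (min, max) movable range in degrees

# Example entries mirroring the kind of data in FIGS. 12 and 13.
corrections = [
    ActuatorCorrection("head_yaw", 1.5, (-30.0, 30.0)),
    ActuatorCorrection("arm",      1.2, (0.0, 60.0)),
]
```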
  • the data processing unit 202 includes a position management unit 208, a recognition unit 212, an operation control unit 222, and a closeness management unit 220.
  • the position management unit 208 specifies the position coordinates of the robot 100 by the method described using FIG. 4.
  • the position management unit 208 may also track the user's position coordinates in real time.
  • the recognition unit 212 recognizes the external environment.
  • recognition of the external environment includes various recognitions, such as recognizing the weather and season based on temperature and humidity, and recognizing shaded spots (safe areas) based on light quantity and temperature.
  • the recognition unit 150 of the robot 100 acquires various types of environment information by the internal sensor 128, performs primary processing on the acquired information, and transfers the information to the recognition unit 212 of the robot management server 200.
  • the recognition unit 212 further includes a person recognition unit 214 and a response recognition unit 228.
  • the person recognition unit 214 compares the feature vector extracted from images captured by the robot 100's built-in camera with the feature vectors of users (clusters) registered in advance in the personal data storage unit 218, thereby determining which person the imaged user corresponds to (user identification processing).
  • the person recognition unit 214 includes an expression recognition unit 230.
  • the facial expression recognition unit 230 estimates the user's emotion by performing image recognition on the user's facial expression.
  • the response recognition unit 228 recognizes various response actions made to the robot 100 and classifies them as pleasant and unpleasant actions.
  • the response recognition unit 228 also classifies into a positive / negative response by recognizing the user's response to the behavior of the robot 100.
  • whether a response action is pleasant or unpleasant is determined by whether it would be comfortable or uncomfortable for a living creature.
  • the motion control unit 222 cooperates with the motion control unit 152 of the robot 100 to determine the motion of the robot 100.
  • the motion control unit 222 creates a movement target point of the robot 100 and a movement route for it.
  • the operation control unit 222 may create a plurality of movement routes, and then select one of the movement routes.
  • the motion control unit 222 selects the motion of the robot 100 from the plurality of motions of the motion storage unit 232.
  • the familiarity management unit 220 manages the familiarity for each user.
  • familiarity is registered in the personal data storage unit 218 as part of the personal data.
  • when a pleasant action is detected, the familiarity management unit 220 increases the familiarity with that user.
  • familiarity decreases when an unpleasant action is detected.
  • the familiarity of a user who has not been seen for a long time gradually decreases.
  • the robot 100 includes a communication unit 142, a data processing unit 136, a data storage unit 148, an internal sensor 128, and a drive mechanism 120.
  • the communication unit 142 corresponds to the communication device 126 (see FIG. 5), and takes charge of communication processing with the external sensor 114, the robot management server 200, and the other robots 100.
  • the data storage unit 148 stores various data.
  • the data storage unit 148 corresponds to the storage device 124 (see FIG. 5).
  • the data processing unit 136 executes various processes based on the data acquired by the communication unit 142 and the data stored in the data storage unit 148.
  • the data processing unit 136 corresponds to a processor 122 and a computer program executed by the processor 122.
  • the data processing unit 136 also functions as an interface of the communication unit 142, the internal sensor 128, the drive mechanism 120, and the data storage unit 148.
  • the data storage unit 148 includes a motion storage unit 160 and a correction information storage unit 162.
  • the motion storage unit 160 stores motion files that define various motions of the robot 100.
  • the motion storage unit 160 downloads various motion files from the motion storage unit 232 of the robot management server 200.
  • Motion is identified by motion ID.
  • operation timings, operation times, operation directions, etc. of various actuators are defined in time series in the motion file.
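A motion file is thus a time series of per-actuator commands. A minimal sketch of what such a file might contain; the field names are illustrative, since the text specifies only that timings, durations, and directions are defined in time series:

```python
# One motion, identified by a motion ID, as a time-ordered list of keyframes.
# Field names are illustrative; the patent states only that operation timings,
# times, and directions of the actuators are defined in time series.
motion_file = {
    "motion_id": "M001",
    "keyframes": [
        {"t_ms":   0, "actuator": "arm",      "target_deg": 0.0},
        {"t_ms": 400, "actuator": "arm",      "target_deg": 60.0},  # raise hand
        {"t_ms": 400, "actuator": "head_yaw", "target_deg": 20.0},  # turn head
        {"t_ms": 900, "actuator": "arm",      "target_deg": 0.0},   # lower hand
    ],
}
```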
  • the correction information storage unit 162 stores the operation conditions of the robot 100 acquired from the costume management server 400 via the robot management server 200.
  • a correction value file is downloaded from the correction information storage unit 219 of the robot management server 200 to the correction information storage unit 162.
  • Various data may also be downloaded to the data storage unit 148 from the map storage unit 216 and the personal data storage unit 218.
  • the data processing unit 136 includes a recognition unit 150, an operation control unit 152, an equipment detection unit 154, and a correction processing unit 156.
  • the recognition unit 150 interprets external information obtained from the internal sensor 128.
  • the recognition unit 150 can perform visual recognition (vision unit), odor recognition (olfactory unit), sound recognition (hearing unit), and tactile recognition (tactile unit).
  • the recognition unit 150 periodically captures the outside world with the built-in omnidirectional camera, and detects a moving object such as a person or a pet.
  • the recognition unit 150 extracts a feature vector from the captured image of the moving object.
  • the feature vector is a set of parameters (features) indicating physical features and behavioral features of the moving object.
  • features are also extracted from an odor sensor, a built-in sound collection microphone, a temperature sensor, and the like. These features are also quantified and become feature vector components.
  • based on these responses, the familiarity management unit 220 of the robot management server 200 changes the familiarity with the user. In principle, familiarity with a user who performs pleasant actions rises, and familiarity with a user who performs unpleasant actions falls.
  • the motion control unit 152 determines the moving direction of the robot 100 together with the motion control unit 222 of the robot management server 200.
  • movement may be determined by the robot management server 200 based on the action map, while the robot 100 determines immediate movements such as avoiding an obstacle.
  • the drive mechanism 120 drives the front wheel 102 in accordance with the instruction of the operation control unit 152 to direct the robot 100 to the movement target point.
  • the motion control unit 152 determines the motion of the robot 100 in cooperation with the motion control unit 222 of the robot management server 200. Some motions may be determined by the robot management server 200, and other motions may be determined by the robot 100. Also, although the robot 100 determines the motion, when the processing load of the robot 100 is high, the robot management server 200 may determine the motion. The base motion may be determined in the robot management server 200, and additional motion may be determined in the robot 100. How to share the motion determination process in the robot management server 200 and the robot 100 may be designed according to the specification of the robot system 300.
  • the operation control unit 152 instructs the drive mechanism 120 to execute the selected motion.
  • the drive mechanism 120 controls each actuator according to the motion file.
  • the operation control unit 152 can also execute a motion of raising both hands 106 as a gesture inviting a "hug" when a user with high familiarity is nearby, and when it tires of the hug, it can express reluctance by alternately repeating reverse rotation and stopping of the left and right front wheels 102 while they remain retracted.
  • the drive mechanism 120 causes the robot 100 to express various motions by driving the front wheel 102, the hand 106, and the neck (head frame 316) according to an instruction of the operation control unit 152.
  • when the equipment detection unit 154 reads a tag ID from the IC tag 182 sewn on the costume 180, it determines that the costume 180 is being worn.
  • the tag ID is readable at close range. When a plurality of tag IDs are read, it is determined that costumes are being layered.
  • the operation control unit 152 changes the operation setting in accordance with the correction information by a method described later.
  • the equipment detection unit 154 may detect wearing of clothes by various methods other than the IC tag 182. For example, it may be determined that the costume has been worn when the internal temperature of the robot 100 rises. The image of the clothes worn may be recognized by the camera. A capacitance sensor may be installed in a wide area of the outer skin 314, and when this capacitance sensor detects a wide range of contact, it may be determined that the costume has been worn.
  • detection of costume wearing based on physical information such as image information, temperature information, or contact information, rather than on the IC tag 182, is referred to as "physical recognition".
  • a costume whose tag ID is registered in an IC tag 182 will be referred to as an "official costume", and a costume without one as a "non-official costume"; when no particular distinction is made, it is simply called a "costume".
  • the correction processing unit 156 performs motion correction of the robot 100 so as to conform to the costume 180. That is, the control command value for each actuator is corrected based on the correction information corresponding to the tag ID.
  • the correction processing unit 156 transmits the tag ID to the robot management server 200 and downloads the corresponding correction information (the operation correction values and movable ranges).
  • the robot management server 200 downloads the correction information from the costume management server 400.
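Under one plausible reading, applying the downloaded correction information to a single command means scaling the driving force by the correction coefficient and clamping the target into the corrected movable range. The exact correction law is not spelled out in the text, so the following is an assumed sketch:

```python
def correct_command(target_deg, force, force_coefficient, range_deg):
    """Correct one actuator command using the costume's correction info.

    force_coefficient scales the driving force (1.5 means 1.5x the reference
    driving force is needed to achieve the same movement), and range_deg is
    the corrected movable range restricting the commanded angle. This mapping
    is an assumption, not a rule stated in the patent.
    """
    lo, hi = range_deg
    corrected_target = max(lo, min(hi, target_deg))  # clamp to movable range
    corrected_force = force * force_coefficient      # scale the driving force
    return corrected_target, corrected_force
```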
  • when a costume is physically recognized but no tag ID is detected, the operation control unit 152 selects a motion expressing displeasure at being dressed in a non-official costume (an "official costume wear request motion").
  • the official costume wear request motion may be, for example, a rejecting behavior such as the robot 100 shaking its body violently or refusing to move at all.
  • such a motion may be set in advance as a typical motion specific to the robot 100 (in particular, among motions that communicate something, one indicating that something is bothering it).
  • when a tag ID is subsequently detected, the correction processing unit 156 can execute the above-described correction processing.
  • the correction processing unit 156 can also execute an autonomous correction process that does not rely on the correction information. That is, the operation control unit 152 selects a predetermined correction motion, measures the output given to each actuator during its execution, and records the resulting movement amounts as operation data. The correction processing unit 156 calculates appropriate correction values based on the operation data and corrects the control command value for each actuator.
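The autonomous correction described here closes a loop from measured movement back to the command: execute the predetermined correction motion, compare the achieved movement with the commanded amount, and re-scale the output. A simple proportional version, as one assumed realization; measure_movement is a placeholder:

```python
def autonomous_correction(command_deg, measure_movement, max_iters=5, tol_deg=1.0):
    """Estimate a driving-force correction without a downloaded setting file.

    measure_movement(gain) is a placeholder that executes the predetermined
    correction motion with the given output gain and returns the achieved
    movement in degrees. The proportional update rule is an assumption.
    """
    gain = 1.0
    for _ in range(max_iters):
        achieved = measure_movement(gain)         # run the correction motion
        if abs(achieved - command_deg) <= tol_deg:
            break                                 # desired movement realized
        if achieved > 0:
            gain *= command_deg / achieved        # proportional re-scaling
    return gain                                   # correction coefficient
```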
  • the costume management server 400 includes a communication unit 410, a data processing unit 412, and a data storage unit 414.
  • the communication unit 410 takes charge of communication processing with the robot management server 200 and the user terminal 250.
  • the data storage unit 414 stores various data for guiding the production of the costume of the robot 100 as a support service.
  • the data processing unit 412 executes various processes based on the data acquired by the communication unit 410 and the data stored in the data storage unit 414.
  • the data processing unit 412 also functions as an interface between the communication unit 410 and the data storage unit 414.
  • FIGS. 8 and 9 are diagrams showing costume production support screens provided by the costume management server 400 to the user.
  • FIGS. 8 (a) to 8 (c) are diagrams showing screen transition for selecting a costume and selecting a fabric for each part.
  • FIGS. 9 (a) and 9 (b) show the purchase screen displayed after the selection of clothes and fabrics.
  • when the user accesses the support service, the costume category selection screen shown in FIG. 8(a) is displayed.
  • selection buttons for each category of clothes are displayed, such as "suit / coat", “jacket / jersey", and so on.
  • when a category is selected, the costume selection screen shown in FIG. 8(b) is displayed.
  • on the left side of the screen, the costumes belonging to the category are listed: jacket (costume ID: J001), jersey (costume ID: J002), vest (costume ID: J003), and so on.
  • on the right side of the screen, an image of the robot 100 on which the costume is to be put is displayed.
  • when a costume is selected, the fabric selection screen shown in FIG. 8(c) is displayed.
  • the left side of the screen displays fabrics that can be selected for making the jacket.
  • the selection of the fabric is performed stepwise for each part corresponding to the pattern.
  • selection buttons for each type of fabric are displayed, such as fabric A (red) (fabric ID: A001), fabric A (blue) (fabric ID: A002), and so on.
  • the fabric ID differs depending on the pattern and color arrangement.
  • the user can select any material (each pattern) for each part of the costume.
  • when the fabric selection is complete, the purchase screen for the fabric and pattern is displayed as shown in FIG. 9(a).
  • ¥1,000 is displayed as the fee for the fabric and pattern of the jacket (J001), together with "purchase", "cancel", and "return to previous screen" buttons.
  • when the purchase button is selected, billing processing is performed.
  • when the cancel button is selected, the procedure is canceled, including the preceding selections.
  • when "return to previous screen" is selected, the screen returns to the fabric selection screen, and the user can reselect the fabric.
  • when the billing processing is completed, a purchase completion screen is displayed as shown in FIG. 9(b).
  • the pattern is previewed on the right side of the screen, and a button for downloading it is displayed on the left side. The user can download the pattern file by selecting this button.
  • this screen also indicates that the fabric and the IC tag will be mailed (delivered) separately.
  • FIG. 10 is a block diagram showing the function of the costume management server 400 of FIG. 7 in detail.
  • the data processing unit 412 includes a selection unit 430, an operation condition generation unit 432, an ID generation unit 434, a management unit 436, an operation condition providing unit 438, a charge processing unit 440, an IC tag delivery request unit 442, a fabric delivery request unit 444, and a design information providing unit 446.
  • the data storage unit 414 includes a design information storage unit 420, a fabric characteristic storage unit 422, an operation condition storage unit 424, and a charge information storage unit 426.
  • in response to a request from the user terminal 250, the selection unit 430 provides the selection screens for the costume to be produced, described with reference to FIGS. 8 and 9.
  • the design information storage unit 420 holds various information on costume design, and the selection unit 430 configures a selection screen with reference to the design information storage unit 420.
  • the design information storage unit 420 holds the various information needed to build the costume selection screens and to guide costume production, such as finished images of each costume, the fabrics usable for its production, images of those fabrics, the costume's pattern data, and production procedure manuals.
  • FIG. 11A shows an example of the data structure of the costume information table stored in the design information storage unit 420.
  • the costume information table designates, for each costume, the parts of its pattern and the fabric materials usable for each part.
  • each costume is identified by a "costume ID", and pattern data is assigned to each costume ID.
  • for example, the pattern file "C001.zip" is associated with the costume ID "C001", and material A (for example, leather) or material B (for example, cloth) can be selected for the collar part; the same materials can be selected for the sleeve part.
  • FIG. 11B shows an example of the data structure of the fabric information table.
  • the fabric information table stores information on each fabric.
  • each fabric is identified by a "fabric ID".
  • the fabric ID differs by color and pattern even when the fabric material is the same. For example, for the same material A, different fabric IDs are set, such as "A001" for a red fabric and "A003" for a polka-dot fabric. Even for the same red, different fabric IDs are set for different materials, such as "A001" for material A and "B001" for material B.
  • the design information storage unit 420 functions as a “storage unit” that holds information for guiding the production of a costume.
  • the selection unit 430 of FIG. 10 refers to the costume information table and the fabric information table, presents the costume selected by the user together with the fabric options usable for producing it, and thereby determines the combination of costume and fabrics that the user desires to make.
  • the charging processing unit 440 transmits a charging process screen when the selection by the user is completed, and executes the charging process in response to the user's purchase request.
  • the design information providing unit 446 acquires the pattern file from the design information storage unit 420 on the condition that the charging process is completed, and transmits it to the user terminal 250.
  • the design information providing unit 446 functions as a "providing unit" that outputs the pattern file as "design information".
  • the charging information storage unit 426 stores information for charging processing which is executed when providing the cloth and pattern file of the costume. This information includes information on the amount of money to be charged for each costume ID, purchase history information of each user, and the like.
  • when the charging process is completed in the charge processing unit 440, the operating condition generation unit 432 receives from the selection unit 430 the information on the costume selected by the user and the fabric of each pattern part, and determines the operating conditions of the robot 100 for when that costume is worn. The operation condition generation unit 432 then generates an operation setting file based on the determined operating conditions.
  • the operating condition generation unit 432 functions as a "determination unit” that determines and outputs an operation condition of the robot, and also functions as a "generation unit” that generates an operation setting file based on the operation condition.
  • the fabric characteristic storage unit 422 stores correction information for each fabric, for each costume and part where it is used.
  • the correction information is provided as a correction value for correcting the control amount (control command value) of each actuator so that proper operation control can be performed when the robot 100 wears the costume 180.
  • the correction values are set relative to the control amounts in the reference state, in which the robot 100 wears only the outer skin 314, and include a correction coefficient for each actuator's driving force and the movable range of the part driven by that actuator.
  • FIG. 12 shows an example of the data structure of the fabric characteristic storage unit 422.
  • in the fabric characteristic storage unit 422, a costume part (a part of the pattern), the fabric material adopted for that part, the actuators (movable parts) whose operation the part affects, the correction coefficient to be set for each actuator's driving force, and the actuator's movable range are stored in association with one another.
  • when material A (for example, leather) is used for the collar, the resistance to head movement becomes relatively large. Therefore 1.5 is set as the correction coefficient of the actuator that drives the head left and right, and the left-right movable range of the head is set to -30 to 30°. As described above, the movable range in the reference state is -75 to 75°, so the control amount is greatly restricted. On the other hand, when material B (for example, cloth) is used for the collar, the resistance to head movement is relatively small; 1.2 is set as the correction coefficient, and the left-right movable range is set to -50 to 50°.
  • for a costume whose structure is more flexible than the coat (C001), the correction values of the actuators that drive the head tend to be more relaxed. The same applies to the actuators that drive the arms.
  • when the costume is a tank top (T001), it does not affect the operation of the head or the arms, so no correction is made to the actuators that drive them: the correction coefficient is set to 1.0, and the movable range matches the reference state. Even for a tank top, when material C is selected for the front part, its weight slightly affects the advance/retract control of the wheels; for this reason the correction coefficient there is 1.1.
  • the operating condition generation unit 432 of FIG. 10 refers to the fabric characteristic storage unit 422 to determine the operating condition according to the combination of the costume and the fabric.
  • costumes are manufactured by cutting the cloth according to the pattern and sewing the cut cloths together.
  • in this way, the operation condition generation unit 432 can determine the appropriate operating conditions required when the robot wears a costume produced with a combination of fabrics freely selected by the user.
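The text does not state how per-part corrections are merged into one operation setting file; one plausible rule, assumed here for illustration, is that the most restrictive part governs each actuator — the largest driving-force coefficient and the intersection of the movable ranges:

```python
def build_operation_settings(part_corrections):
    """Merge per-(part, fabric) correction rows into one entry per actuator.

    part_corrections: iterable of (actuator, coefficient, (lo, hi)) rows,
    as in FIG. 12. The merge rule (max coefficient, intersected range) is an
    assumption; the patent says only that conditions are determined per
    combination of costume and fabrics.
    """
    merged = {}
    for actuator, coeff, (lo, hi) in part_corrections:
        if actuator not in merged:
            merged[actuator] = [coeff, lo, hi]
        else:
            entry = merged[actuator]
            entry[0] = max(entry[0], coeff)  # strongest force correction wins
            entry[1] = max(entry[1], lo)     # intersect movable ranges
            entry[2] = min(entry[2], hi)
    return {a: {"coefficient": c, "range_deg": (lo, hi)}
            for a, (c, lo, hi) in merged.items()}
```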
  • the management unit 436 receives the operation setting file from the operation condition generation unit 432. Thereafter, the management unit 436 instructs the ID generation unit 434 to generate a tag ID.
  • the ID generation unit 434 receives the instruction from the management unit 436 and generates a tag ID.
  • the tag ID is identification information generated for each individual costume; tag IDs do not collide even for costumes with the same design and fabric combination. That is, a tag ID is generated for each costume produced using the costume management server 400 and uniquely identifies that costume.
  • the management unit 436 associates the tag ID generated by the ID generation unit 434 with the operation setting file generated by the operation condition generation unit 432, and stores the association in the operation condition storage unit 424.
  • FIG. 13 is a view showing an example of the data structure of the operating condition storage unit 424.
  • the operating condition storage unit 424 holds a table in which a tag ID and a correction coefficient for each actuator and a movable range are associated as an operation setting file.
  • the information held in the operation setting file is shown as a table for the sake of explanation; it suffices for the operation condition storage unit 424 to store the data in such a way that the corresponding operation setting file can be retrieved using the tag ID as a key.
  • the operation condition providing unit 438 in FIG. 10 receives the tag ID from the robot management server 200, and reads the operation setting file associated with the received tag ID from the operation condition storage unit 424. Then, the operation condition providing unit 438 transmits the operation setting file to the robot management server 200.
  • the operating condition providing unit 438 can obtain the correction coefficient and the movable range for each actuator of the robot by referring to the operating condition storage unit 424 using the tag ID as a key.
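Server-side, the operation condition storage unit 424 needs only key-value semantics: issue a unique tag ID, store the setting file under it, and return the file on request. A sketch with illustrative names; the use of a UUID for uniqueness is an assumption:

```python
import uuid

class OperationConditionStore:
    """Minimal key-value model of the operation condition storage unit 424."""

    def __init__(self):
        self._files = {}

    def register(self, operation_setting_file) -> str:
        """Issue a fresh tag ID (never reused, even for identical costumes)
        and associate it with the operation setting file."""
        tag_id = uuid.uuid4().hex
        self._files[tag_id] = operation_setting_file
        return tag_id

    def lookup(self, tag_id: str):
        """Return the operation setting file for a tag ID, or None if unknown."""
        return self._files.get(tag_id)
```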
  • the IC tag shipping request unit 442 requests the IC tag issuer to record the tag ID processed by the management unit 436 in the IC tag 182 and mail it to the user.
  • the fabric delivery request unit 444 requests the fabric supplier to mail the fabric selected via the selection unit 430 to the user. This request information may be transmitted automatically to each vendor's terminal.
  • FIG. 14 is a sequence diagram showing an outline of communication between the costume management server and the user terminal at the start of using the support service.
  • the costume management server 400 transmits the above-described selection screens in sequence (S12).
  • the user terminal 250 selects information (such as costume information) such as costumes and fabrics according to the user's input (S14), and transmits the selected information to the costume management server 400 (S16).
  • the costume management server 400 registers the selected information (S18), and transmits the purchase screen (S20).
  • the user terminal 250 transmits a purchase request (S24).
  • the costume management server 400 executes charging processing in response to the purchase request (S26), and when it is completed, transmits a charging completion notification to the user terminal 250 (S28), and reads a paper pattern file corresponding to the costume information (S30).
  • the pattern file is transmitted (S34).
  • the user terminal 250 downloads the pattern file according to the user's operation (S40) and prints the pattern (S42).
  • the costume management server 400 issues a tag ID corresponding to the costume information (S44), and creates an operation condition (operation setting file) (S46). Then, the created operation condition is registered in association with the tag ID (S48). Thereafter, the tag ID shipping process described above is executed (S50), and the fabric shipping process is executed (S52).
  • FIG. 15 is a sequence diagram showing an outline of the procedure between the IC tag issuer and fabric supplier and the user.
  • The IC tag issuing vendor writes the tag ID into the IC tag 182 (S60) and delivers the IC tag 182 to the user home 252 (S62).
  • the fabric provider prepares the corresponding fabric (S64), and sends it to the user home 252.
  • the user cuts the cloth according to the already downloaded pattern (S68), sews the costume 180 (S70), and sews the IC tag 182 on the costume 180 (S72).
  • FIG. 16 is a flowchart showing a process of correcting the movement of the robot 100 when wearing a costume.
  • The equipment detection unit 154 physically recognizes the wearing of clothes based on physical information such as image information, temperature information and contact information (S100).
  • When the wearing of clothes is not recognized (N in S100), the subsequent processing is not performed.
  • When the wearing of clothes is recognized (Y in S100), the correction processing unit 156 executes the following correction processing on the control amounts.
  • When the tag ID is detected (Y in S104), the correction processing unit 156 first executes operating condition acquisition processing, acquiring the operating condition via the robot management server 200 (S106). When the operating condition is acquired (Y in S108), correction processing is performed to reflect the correction coefficients and movable ranges included in the operating condition in the control amounts (S110). After the operation correction, the operation control unit 152 actually moves each actuator and measures its output value and operation amount; referring to the operation data indicating the relationship between output values and operation amounts, it determines whether the desired operation amounts are realized after the correction (S116).
  • When the tag ID is not detected (N in S104), the operation control unit 152 executes the above-mentioned official costume wear request motion (S112).
  • When the tag ID is still not detected after a predetermined time has elapsed (Y in S114), that is, when it is judged that the user does not intend to put an official costume on the robot 100, the process proceeds to S116 and the operation check is executed.
  • When the operation amounts are appropriate (Y in S118), the operation control unit 152 turns on the correction completion flag (S120) and updates the correction information (S124). When the operation amounts are not appropriate (N in S118), the operation control unit 152 autonomously executes correction processing (S122): it records the output value and operation amount of each actuator as operation data and performs operation correction (correction of the driving force and of the movable range) so that each actuator can realize the desired operation amount without strain. The correction information is then updated (S124).
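  • Taken together, the flow of FIG. 16 can be outlined as in the sketch below; the robot and server interfaces are hypothetical placeholders, not the embodiment's actual API.

```python
import time

def correct_for_costume(robot, server, timeout_s: float = 30.0) -> None:
    """Outline of FIG. 16: detect wearing, fetch the operating condition by
    tag ID, apply it, then verify the corrected movement."""
    if not robot.detects_wearing():          # S100: physical recognition
        return                               # N in S100: nothing further is done

    condition = None
    deadline = time.monotonic() + timeout_s
    while condition is None:
        tag_id = robot.read_tag_id()         # S104: is a tag ID detected?
        if tag_id is not None:
            condition = server.get_operating_condition(tag_id)   # S106/S108
        else:
            robot.play_wear_request_motion() # S112: ask for an official costume
            if time.monotonic() > deadline:  # Y in S114: stop waiting
                break

    if condition is not None:
        robot.apply_correction(condition)    # S110: coefficients and movable ranges

    if robot.operation_check_ok():           # S116/S118: desired amounts realized?
        robot.correction_done = True         # S120: correction completion flag
    else:
        robot.autonomous_correction()        # S122: tune using measured operation data
    robot.update_correction_info()           # S124
```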
  • FIG. 17 is a sequence diagram showing an outline of the communication among the robot, the robot management server and the costume management server in relation to the operating condition acquisition processing of S106 in FIG. 16.
  • The robot management server 200 accesses the costume management server 400, transmits the tag ID, and requests transmission of the operating condition.
  • the costume management server 400 refers to the data table based on the received tag ID, reads the corresponding operation setting file (S84), and transmits the file to the robot management server 200.
  • the robot management server 200 stores the operation condition defined in the operation setting file in the correction information storage unit 219 (S88) and transmits it to the robot 100 (S90).
  • The robot 100 stores the received operating condition in its own storage (S92).
  • The costume production support system 500 has been described above based on an embodiment. According to the present embodiment, by sequentially providing screens that guide the production of the costume 180, the user can be supported in producing a costume for the robot 100.
  • By providing the operating condition (correction information) matched to the produced costume and reflecting it in the control, the robot 100 can move in a way that is not unreasonable for the costume 180 and can exhibit its performance appropriately. Since appropriate operating conditions are set based on the type of costume and its fabric, the occurrence of problems in the actuators of the robot 100 can be prevented or suppressed. The user can therefore put various costumes 180 on the robot 100 with peace of mind and enjoy living with the robot 100.
  • The present invention is not limited to the above-described embodiment and modifications; components can be modified and embodied without departing from the scope of the invention.
  • Various inventions may be formed by appropriately combining a plurality of the components disclosed in the above-described embodiment and modifications, and some components may be omitted from the whole shown there.
  • In the above embodiment, pattern data is provided via the network as "design information" that specifies how to produce the costume.
  • As a modification, a production manual or other design information indicating the production procedure may also be provided.
  • Control information for a 3D printer may likewise be provided as design information.
  • Alternatively, the design information (pattern etc.) may be delivered to the user.
  • In the above embodiment, an example was described in which the costume management server 400 provides the user with both the fabric and the pattern of the selected costume.
  • As a modification, the fabric need not be provided, or whether to receive it may be left to the user's choice.
  • In that case, the user can purchase a fabric of the same material through another route and make the costume from a fabric whose color or pattern better suits his or her preference.
  • This allows the user to create a more original costume.
  • Even then, the costume service company 402 can charge the user for providing the pattern and the operating conditions, and the degree of freedom of the support service is enhanced.
  • The costume service company may also sell the materials (fabric and pattern) necessary for producing a costume through another route, with the costume management server providing only the operating conditions of the robot.
  • the user may purchase the robot's costume on the Internet (online) or at a store (offline), and obtain the correction ID upon the purchase.
  • the “correction ID” is identification information for acquiring the operating condition (correction information) of the robot wearing the costume.
  • the correction ID may be used to request the costume management server to provide an operating condition.
  • The user may have an IC tag and a writer that writes the correction ID to the IC tag. By attaching the IC tag in which the correction ID has been written to the robot, the operating condition of the robot can be acquired as in the above embodiment.
  • The correction ID may be provided in a form the user can directly recognize, for example as a character string.
  • In that case, the user terminal accesses the costume management server to request the operating condition.
  • The information acquired by the user terminal can then be transferred to the robot or the robot management server.
  • Alternatively, the correction ID may be provided in a form the user cannot directly recognize, for example as a tag ID as in the above embodiment.
  • In that case, the robot or the robot management server accesses the costume management server to request the operating condition.
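  • Both routes reduce to the same request: present a correction ID (or tag ID) and receive the operating condition. The following is a minimal sketch; the endpoint URL, query parameter and JSON response format are assumptions, not a published API.

```python
import json
import urllib.request

def fetch_operating_condition(correction_id: str,
                              base_url: str = "https://costume-server.example/api") -> dict:
    """Ask the costume management server for the operating condition
    associated with a correction ID."""
    url = f"{base_url}/operating-condition?correction_id={correction_id}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)   # e.g. per-actuator coefficients and movable ranges

# Route 1: the user terminal calls this and transfers the result to the robot.
# Route 2: the robot or the robot management server calls it directly.
```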
  • information for specifying the attachment position of the IC tag in the costume may be included in the design information (pattern data etc.). If the IC tag is not attached to the designated position, the robot may not be able to read the tag information.
  • In the above embodiment, an example was shown in which the user terminal 250 and the robot management server 200 are separate structures.
  • As a modification, the user terminal 250 and the robot management server 200 may be integrated.
  • For example, the user terminal 250 may be a general-purpose computer such as a laptop PC, and the robot management server 200 may be realized as part of its functions.
  • In the above embodiment, a configuration was shown in which the user terminal 250 connects to the costume management server 400 via the Internet 510 to receive the support service.
  • As a modification, the user may access the costume management server 400 directly using a terminal installed at the costume service company 402.
  • In the above embodiment, the robot has a soft outer skin of a specific shape, and a configuration was shown in which the costume is worn over that outer skin.
  • The above-described support device may also be used to produce costumes for a robot whose outer skin differs in shape or material, or for a robot that has no outer skin.
  • In the above embodiment, one form of robot was shown as the target for wearing a costume, but the support device is also applicable to other humanoid robots and to pet robots.
  • The operation condition generation unit 432 of FIG. 10 may also include in the operation setting file values for correcting sensors whose characteristics change when the costume is worn, such as a sensitivity correction for the touch sensors.
  • The IC tag shipping request unit 442 in FIG. 10 may transmit an encrypted tag ID to the vendor and request that it be written to the IC tag.
  • Since the robot 100 can decrypt the encrypted tag ID, it is possible to prevent tag IDs from being used illegitimately, for example a common tag ID being applied to a plurality of costumes.
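  • One way to realize such encryption, assuming a symmetric key shared between the costume management server and the robot (the third-party Python "cryptography" package is used for illustration):

```python
from cryptography.fernet import Fernet   # third-party "cryptography" package

# Assumption: server and robot share this key; the IC tag issuer and the
# user only ever see the ciphertext written into the IC tag.
shared_key = Fernet.generate_key()

def encrypt_tag_id(tag_id: str) -> bytes:
    """Server side: ciphertext handed to the IC tag issuer for writing."""
    return Fernet(shared_key).encrypt(tag_id.encode())

def decrypt_tag_id(token: bytes) -> str:
    """Robot side: recover the tag ID after reading the IC tag. A token
    forged without the key raises InvalidToken and can be rejected."""
    return Fernet(shared_key).decrypt(token).decode()
```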
  • A unique motion (also referred to as a "costume-specific motion") may be set for each costume to be produced.
  • A "costume-specific motion" is a specific action unique to a costume and associated with that costume.
  • A trigger condition may be set for each costume-specific motion. For example, when the costume corresponding to a tag ID is a dance costume, a motion that makes the robot dance may be set as the costume-specific motion, and dance music may be set as the trigger condition.
  • When the costume corresponding to a tag ID imitates the costume of a specific idol or actor, a motion that makes the robot imitate that idol or actor may be set as the costume-specific motion, with a corresponding trigger condition.
  • The operation setting file includes a costume-specific motion file in which the control content of the costume-specific motion is defined.
  • The robot can download the operation setting file corresponding to the worn costume by transmitting the tag ID to the robot management server or the costume management server.
  • In this way, the costume management server provides the costume-specific motion file.
  • The robot executes the costume-specific motion corresponding to the tag ID.
  • Various modes can be set for a costume-specific motion and its trigger condition.
  • For example, a motion representing a gesture of feeling hot may be set as the costume-specific motion, with a temperature of 20 degrees or more as the trigger condition.
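  • Such a registry of costume-specific motions and trigger conditions could be sketched as follows; the tag IDs, motion names and context keys are illustrative.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class CostumeMotion:
    motion_id: str
    trigger: Callable[[dict], bool]   # predicate over the sensed context

costume_motions = {
    "dance-costume-tag":  CostumeMotion("dance",
                                        lambda ctx: ctx.get("dance_music", False)),
    "summer-costume-tag": CostumeMotion("hot_gesture",
                                        lambda ctx: ctx.get("temp_c", 0) >= 20),
}

def maybe_play_costume_motion(tag_id: str, context: dict) -> None:
    entry = costume_motions.get(tag_id)
    if entry is not None and entry.trigger(context):
        print(f"execute costume-specific motion: {entry.motion_id}")

maybe_play_costume_motion("summer-costume-tag", {"temp_c": 24})   # -> hot_gesture
```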
  • FIG. 18 is a diagram illustrating a hat as a functional costume.
  • At the top of the head frame 316 of the robot 100, a hole is provided coaxially with the opening 390 of the outer skin 314, and the horn 112 is inserted through it (see FIG. 2).
  • The gap between the hole and the horn 112 serves as a ventilation passage through which outside air can be introduced into the head frame 316.
  • the outside air cools functional components (heat generating components such as a circuit board) in the head frame 316. For this reason, when producing a hat as a costume, it is preferable not to impair this cooling function.
  • The hat 520 shown in FIG. 18(a) has an insertion hole 522 at the top, through which the horn 112 of the robot 100 is inserted.
  • a mesh 524 is provided around the insertion hole 522 in the hat 520 to partially enhance air permeability.
  • The hat 530 shown in FIG. 18(b) has a mesh 526 at the rear in addition to the mesh 524 at the top.
  • air permeability can be further improved by providing the mesh 526 in accordance with the position of the exhaust port.
  • Cloth, leather or synthetic resin (plastic) can be selected as the material of each hat.
  • An example was shown in which the ventilation structure is provided in a hat in accordance with the structure of the robot 100, but the same ventilation structure may be adopted for clothes other than hats.
  • For example, clothes may be provided with a hole for inserting a tail, with mesh around the hole.
  • A ventilation improvement region may be provided in the portion of the costume that covers a ventilation passage of the robot, or in its vicinity.
  • The "ventilation improvement region" may be realized by a porous structure such as a mesh, or by making the fabric relatively thin. It is preferable to provide the ventilation improvement region at positions corresponding to the air supply port through which the robot introduces outside air and to the exhaust port through which it discharges internal air.


Abstract

According to the present invention, a clothing management server 400 is provided with: a data storage unit 414 which retains information for guiding the manufacture of clothing for a robot; a selection unit 430 which, according to a request input from a user, displays a screen for selecting the clothing that is to be manufactured; and a determination unit (an operation condition generation unit 432) which determines and outputs an operating condition of the robot on the basis of the selected clothing. The selection unit 430 may allow the user to select the textiles used to manufacture the selected clothing, and the determination unit may determine the operating condition of the robot on the basis of the selected clothing and textiles.

Description

Costume production support device
The present invention relates to an apparatus for supporting robot costume making.
Development of autonomous behavior type robots, such as humanoid robots and pet robots, that provide dialogue with and healing to human beings is in progress (see, for example, Patent Document 1). Such robots are expected to evolve their behavior by learning autonomously based on the surrounding situation and to become closer to living beings. In the near future, they may give users the kind of healing that pets do.
JP 2000-323219 A
If robots become closer to living beings, it is expected that people will dress them as they do pets, and the production and sale of such costumes may become established as a business. Some users will not be satisfied with simply buying ready-made costumes and may be interested in making original ones. A business may therefore also arise in which only the fabric or pattern of a costume is sold and the production of the costume itself is left to the user.
However, unlike human beings, robots have limits on driving force and movable range according to their performance. Wearing a costume can hinder the robot's performance and shorten its life. The inventor has come to recognize that it is effective to support the production of robot costumes with such points in mind.
The present invention was completed on the basis of the above problem recognition, and one of its objects is to provide a device suitable for supporting robot costume production.
A costume production support apparatus according to an aspect of the present invention includes a storage unit that holds information for guiding the production of a robot costume, a selection unit that refers to the storage unit in response to a user's request input and displays a screen for selecting the costume to be produced, and a determination unit that determines and outputs an operating condition of the robot based on the selected costume.
According to the present invention, it is possible to provide an apparatus suitable for supporting robot costume production.
FIG. 1 is a view showing the appearance of a robot according to an embodiment.
FIG. 2 is a cross-sectional view schematically showing the structure of the robot.
FIG. 3 is a front external view of the robot when wearing a costume.
FIG. 4 is a configuration diagram of a robot system.
FIG. 5 is a hardware configuration diagram of the robot.
FIG. 6 is a diagram schematically showing the configuration of a costume production support system.
FIG. 7 is a functional block diagram of the costume production support system.
FIG. 8 is a view showing a costume production support screen that the costume management server provides to the user.
FIG. 9 is a view showing a costume production support screen that the costume management server provides to the user.
FIG. 10 is a block diagram showing the functions of the costume management server in detail.
FIG. 11 is a diagram schematically showing a data table held by the data processing unit.
FIG. 12 is a diagram schematically showing a data table held by the data processing unit.
FIG. 13 is a diagram schematically showing a data table held by the data processing unit.
FIG. 14 is a sequence diagram showing an outline of the communication between the costume management server and the user terminal at the start of using the support service.
FIG. 15 is a sequence diagram showing an outline of the procedure among the IC tag issuer, the fabric supplier and the user.
FIG. 16 is a flowchart showing the operation correction process when the robot wears a costume.
FIG. 17 is a sequence diagram showing an outline of the communication among the robot, the robot management server and the costume management server.
FIG. 18 is a diagram illustrating a hat as a functional costume.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. In the following description, for convenience, the positional relationship of each structure may be expressed with reference to the illustrated state. In the following embodiment and its modifications, substantially identical components are given the same reference numerals, and their description may be omitted as appropriate.
FIG. 1 is a view showing the appearance of a robot 100 according to the embodiment. FIG. 1(a) is a front view, and FIG. 1(b) is a side view.
The robot 100 is an autonomous behavior type robot that determines its actions and gestures based on the external environment and its internal state. The external environment is recognized by various sensors such as a camera and a thermo sensor. The internal state is quantified as various parameters expressing the emotions of the robot 100.
The robot 100 is premised on indoor action; for example, its action range is the interior of the user's house. In the present embodiment, a person involved with the robot 100 is called a "user".
The body 104 of the robot 100 has an overall rounded shape and includes an outer skin 314 formed of a soft, elastic material. By making the body 104 round, soft and pleasant to the touch, the robot 100 gives the user a sense of security and a pleasant tactile sensation. The robot 100 can be dressed in costumes; by dressing it according to the user's preference, events such as the change of seasons and birthdays can be enjoyed.
The robot 100 includes three wheels for three-wheeled traveling: as illustrated, a pair of front wheels 102 (left wheel 102a, right wheel 102b) and one rear wheel 103. The front wheels 102 are driving wheels and the rear wheel 103 is a driven wheel. The rotational speed and rotational direction of each front wheel 102 can be controlled individually, and the rear wheel 103 rotates freely so that the robot 100 can move back and forth and left and right.
The front wheels 102 and the rear wheel 103 can be completely retracted into the body 104 by drive mechanisms (a rotation mechanism and a link mechanism), not shown. Even while traveling, most of each wheel is hidden by the body 104, and when the wheels are completely retracted the robot 100 becomes unable to move. As the wheels are retracted, the body 104 descends and seats itself on the floor surface F; in this seated state, a seating surface 108 formed on the bottom of the body 104 abuts the floor surface F.
The robot 100 has two hands 106. The hands 106 have no function of gripping objects, but can perform simple operations such as raising, waving and vibrating by pulling or loosening built-in wires (not shown). The two hands 106 can also be controlled individually.
Two eyes 110 are provided on the front of the head (face) of the robot 100. The eyes 110 display various expressions by means of liquid crystal or organic EL elements. The robot 100 has a built-in speaker and can also emit simple sounds. A horn 112 is attached to the top of the head of the robot 100. An omnidirectional camera is built into the horn 112 and can photograph all directions, up, down, left and right, at once. A high-resolution camera (not shown) is also provided on the front of the head of the robot 100.
FIG. 2 is a cross-sectional view schematically showing the structure of the robot 100.
The body 104 of the robot 100 includes a base frame 308, a main body frame 310, a pair of wheel covers 312 and the outer skin 314. The base frame 308 constitutes the axial core of the body 104 and supports the internal mechanisms. It is configured by connecting an upper plate 332 and a lower plate 334 with a plurality of side plates 336. A battery 118, a control circuit 342, various actuators and the like are housed inside the base frame 308.
The main body frame 310 includes a head frame 316 and a torso frame 318. The head frame 316 has a hollow hemispherical shape and forms the head skeleton of the robot 100. The torso frame 318 has a stepped cylindrical shape, forms the torso skeleton of the robot 100, and is fixed to the base frame 308. The head frame 316 is connected to the upper plate 332 via internal mechanisms, a joint 330 and the like, and can be displaced relative to the torso frame 318.
The head frame 316 is provided with three axes, a yaw axis 321, a pitch axis 322 and a roll axis 323, and with actuators 324 and 325 that rotationally drive each axis. The actuator 324 includes a servomotor for driving the yaw axis 321; the actuator 325 includes a plurality of servomotors for driving the pitch axis 322 and the roll axis 323. The yaw axis 321 is driven for head-shaking, the pitch axis 322 for nodding, looking up and looking down, and the roll axis 323 for tilting the head. A plate 326 supported by the yaw axis 321 is fixed to the top of the head frame 316.
A base plate 328 is provided to support the head frame 316 and its internal mechanisms from below. The base plate 328 is connected to the upper plate 332 (base frame 308) via the joint 330. A support base 335 is provided on the base plate 328 and supports the actuators 324 and 325 and a cross-link mechanism 329 (pantograph mechanism). The cross-link mechanism 329 connects the actuators 324 and 325 vertically and can change the distance between them.
By rotating the yaw axis 321, the plate 326 and the head frame 316 can be rotated together (yawing), realizing the head-shaking motion. By rotating the pitch axis 322, the cross-link mechanism 329 and the head frame 316 can be rotated together (pitching), realizing nodding and similar motions. By rotating the roll axis 323, the actuator 325 and the head frame 316 can be rotated together (rolling), realizing the head-tilting motion. By extending and contracting the cross-link mechanism 329, an extension and contraction motion of the neck can be realized.
The torso frame 318 houses the base frame 308 and a wheel drive mechanism 370. The wheel drive mechanism 370 includes a front wheel drive mechanism that drives the front wheels 102, a rear wheel drive mechanism that drives the rear wheel 103, and actuators 379 that drive these mechanisms. The upper half of the torso frame 318 is formed as a smooth curved surface to give the outline of the body 104 its roundness. The lower half of the torso frame 318 is narrowed so as to form, between it and the wheel covers 312, storage spaces S for the front wheels 102, and supports the pivot shafts 378 of the front wheels 102.
The pair of wheel covers 312 is provided so as to cover the lower half of the torso frame 318 from the left and right. Each wheel cover 312 forms a smooth outer surface (curved surface) continuous with the upper half of the torso frame 318, and its upper end is connected along the lower end of that upper half. As a result, a storage space S that opens downward is formed between the side wall of the lower half and each wheel cover 312.
The front wheel drive mechanism includes a rotation drive mechanism for rotating the front wheels 102 and a retraction mechanism for advancing and retracting the front wheels 102 with respect to the storage spaces S. By driving the front wheel drive mechanism, the front wheels 102 can be advanced from and retracted into the storage spaces S; likewise, by driving the rear wheel drive mechanism, the rear wheel 103 can be advanced and retracted.
The outer skin 314 covers the main body frame 310 from the outside. It has a thickness that lets a person feel its elasticity and is formed of a stretchable material such as urethane sponge. As a result, when the user hugs the robot 100, it feels appropriately soft, and the user can interact with it in the natural physical way one would with a pet. Capacitance-type touch sensors are provided between the main body frame 310 and the outer skin 314 at multiple locations and detect touches over almost the entire area of the robot 100. Since the touch sensors are inside the outer skin 314, the detection level rises when the outer skin 314 deforms; it is thus possible to judge the contact state, for example whether a person is hugging the robot 100 firmly or gently. The hands 106 are formed integrally with the outer skin 314. An opening 390 is provided at the upper end of the outer skin 314, and the lower end of the horn 112 is connected to the head frame 316 through it.
The drive mechanism for the hands 106 includes wires 134 embedded in the outer skin 314 and their drive circuit 340 (energization circuit). In the present embodiment each wire 134 is a shape memory alloy wire, which contracts and hardens when heated and relaxes and elongates when allowed to cool. Lead wires drawn from both ends of each wire 134 are connected to the drive circuit 340; when the switch of the drive circuit 340 is turned on, the wire 134 (shape memory alloy wire) is energized.
The wires 134 are molded or woven in so as to extend from the outer skin 314 into the hands 106, and lead wires are drawn from both ends into the torso frame 318. One wire 134 may be provided on each of the left and right sides of the outer skin 314, or several may be provided in parallel. The arm (hand 106) can be raised by energizing the wire 134 and lowered by cutting off the energization. In another form, the hand 106 may be driven by attaching a wire near its tip, providing the torso frame 318 with a winding mechanism, and adjusting the wire length by winding and unwinding.
In the present embodiment, the movable range of the arm in the state where the robot 100 wears no clothes and only the outer skin 314 is mounted (hereinafter also called the "reference state") is 75 degrees: from the position where the arm points downward against the body 104 to 75 degrees above it (hereinafter also written as a movable range of "0 to 75°").
The head frame 316 is connected to the torso frame 318 via the base plate 328, the joint 330 and the like. As illustrated, a sufficient vertical gap is secured between the head frame 316 and the torso frame 318, so the movable range (rotation range) of the head frame 316 about the pitch axis 322 can be made large.
In the present embodiment, the vertical movable range of the head frame 316 in the reference state is 90 degrees, 45 degrees up and down from the state where the line of sight is horizontal. That is, the limit of the upward angle (look-up angle) of the robot 100 is 45 degrees and the limit of the downward angle (look-down angle) is -45 degrees (hereinafter also written as a movable range of "-45 to 45°").
The left-right movable range of the head frame 316 in the reference state is 150 degrees, 75 degrees to the left and right from the state where the line of sight faces forward. That is, the limit of the angle at which the robot 100 turns right from the front is 75 degrees and the limit of the angle at which it turns left is -75 degrees (hereinafter also written as a movable range of "-75 to 75°").
Furthermore, the tilt movable range of the head frame 316 in the reference state is 60 degrees, 30 degrees to the left and right from the state where the head stands upright. That is, the limit at which the robot 100 tilts its head to the right is 30 degrees and the limit to the left is -30 degrees (hereinafter also written as a movable range of "-30 to 30°"). When the robot 100 wears a costume, it adjusts each movable part according to that costume; details will be described later.
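As a minimal sketch of how these reference-state limits might be applied in control code (the table and function are illustrative, not the embodiment's implementation):

```python
# Reference-state movable ranges quoted above, in degrees.
REFERENCE_RANGES = {
    "arm":        (0.0, 75.0),
    "head_pitch": (-45.0, 45.0),
    "head_yaw":   (-75.0, 75.0),
    "head_roll":  (-30.0, 30.0),
}

def clamp_command(axis: str, angle_deg: float, ranges=REFERENCE_RANGES) -> float:
    """Limit a position command to the movable range. When a costume is worn,
    a narrowed range from the operation setting file would be passed instead."""
    lo, hi = ranges[axis]
    return max(lo, min(hi, angle_deg))

print(clamp_command("head_pitch", 60.0))   # -> 45.0
```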
FIG. 3 is a front external view of the robot 100 wearing a costume.
The user can dress the robot 100 in various types of costumes 180: for example, a casual polo shirt, a formal suit, a winter coat or down jacket, or a costume imitating a cartoon character. The robot 100 moves its movable parts with actuators such as motors, and the driving force of each part has a design upper limit. When the robot wears a polo shirt designed to be light and easy to move in, the load on the movable parts is small; when it wears a coat made of heavy, stiff material such as leather, the load is large. Even if a costume is made to fit the robot 100, its design and fabric may prevent proper operation control when worn. In some cases the driving force preset for the robot 100 may no longer be able to move a joint at all. It is also conceivable that a change in the detection sensitivity of sensors such as the touch sensors causes problems in the robot 100's recognition of the external environment. When a person wears clothes that do not fit or have an eccentric design, they may feel hard to wear or hard to move in, but the person does not become unable to move. For the robot 100, however, there are definite limits to driving force and movable range; beyond those limits it may stop moving entirely, or actuators such as motors may overheat abnormally. What is a mere costume to a human is, for the robot 100, one of the important factors that determine whether it operates normally.
The costume production support apparatus of the present embodiment therefore supports the production of robot costumes and also provides information (hereinafter an "operation setting file") for adjusting the robot so that it can perform proper operation control when wearing the costume. Based on the operation setting file, the robot 100 adjusts the operating conditions of each movable part according to the costume. The operating conditions include information usable to cancel the effects that wearing the costume 180 has on the robot 100, such as the driving force and movable range of each movable part and sensitivity corrections for sensors such as the touch sensors. In other words, they include information for controlling the robot 100 so that changes in its characteristics caused by wearing the costume 180 do not adversely affect it. By reading the "tag ID" of the IC tag 182 attached to the costume 180, the robot 100 can acquire the operating conditions (correction information) suited to the costume being worn. The tag ID functions as "identification information" for acquiring the operation setting file that defines the operating conditions for each actuator.
That is, the IC tag 182 is sewn onto a specific position of the costume 180. The IC tag 182 is an RFID (Radio Frequency Identifier) tag and transmits its tag ID over a very short range. By reading the tag ID from the IC tag 182, the robot 100 can acquire the correction information for each actuator that matches the costume 180.
FIG. 4 is a configuration diagram of a robot system 300.
The robot system 300 includes the robot 100, a robot management server 200 and a plurality of external sensors 114. The external sensors 114 (external sensors 114a, 114b, ..., 114n) are installed in the house in advance; they may be fixed to walls or placed on the floor. The position coordinates of the external sensors 114 are registered in the robot management server 200 and are defined as x, y coordinates within the house assumed to be the action range of the robot 100.
The robot management server 200 determines the basic behavior of the robot 100 based on the information obtained from the sensors built into the robot 100 and from the plurality of external sensors 114.
Each external sensor 114 periodically transmits a wireless signal (hereinafter a "robot search signal") containing its ID (hereinafter a "beacon ID"). On receiving a robot search signal, the robot 100 returns a wireless signal containing the beacon ID (hereinafter a "robot reply signal"). The robot management server 200 measures the time from when an external sensor 114 transmits a robot search signal until it receives the robot reply signal, and thereby measures the distance from that external sensor 114 to the robot 100. By measuring the distances between the robot 100 and each of the plurality of external sensors 114, it specifies the position coordinates of the robot 100.
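For illustration, once distances to three or more external sensors are known, the position coordinates can be estimated by linear least squares. A minimal sketch assuming NumPy, with measurement noise ignored:

```python
import numpy as np

def locate(anchors: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """Estimate (x, y) from distances to three or more external sensors.
    Each distance comes from the measured round-trip time of the
    robot search / robot reply signals."""
    x0, y0 = anchors[0]
    d0 = distances[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        # Subtracting the first range equation linearizes the problem.
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol

anchors = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])
true_pos = np.array([1.0, 1.0])
d = np.linalg.norm(anchors - true_pos, axis=1)
print(locate(anchors, d))   # ~ [1.0, 1.0]
```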
FIG. 5 is a hardware configuration diagram of the robot 100.
The robot 100 includes an internal sensor 128, a communicator 126, a storage device 124, a processor 122, a drive mechanism 120 and the battery 118. The drive mechanism 120 includes the wheel drive mechanism 370 described above. The processor 122 and the storage device 124 are included in the control circuit 342. The units are connected to one another by power supply lines 130 and signal lines 132; the battery 118 supplies power to each unit via the power supply lines 130, and each unit exchanges control signals via the signal lines 132. The battery 118 is a lithium-ion secondary battery and is the power source of the robot 100.
The internal sensor 128 is the collection of the various sensors built into the robot 100: specifically, a camera (omnidirectional camera), a microphone array, a distance measurement sensor (infrared sensor), a thermo sensor, touch sensors, an acceleration sensor, an odor sensor and the like. The touch sensors are installed between the outer skin 314 and the main body frame 310 and detect the user's touch based on changes in capacitance. The odor sensor is a known sensor applying the principle that electrical resistance changes with the adsorption of the molecules that are the source of an odor.
The communicator 126 is a communication module that performs wireless communication with various external devices such as the robot management server 200, the external sensors 114 and portable devices owned by users. The storage device 124 is composed of non-volatile and volatile memory and stores computer programs and various setting information. The processor 122 executes the computer programs. The drive mechanism 120 is the set of actuators that control the internal mechanisms. A display, speakers and so on are also mounted.
The processor 122 selects the actions of the robot 100 while communicating with the robot management server 200 and the external sensors 114 via the communicator 126. Various external information obtained by the internal sensor 128 also influences action selection. The drive mechanism 120 mainly controls the wheels (front wheels 102) and the head (head frame 316). It changes the moving direction and speed of the robot 100 by changing the rotational speed and direction of each of the two front wheels 102, and can also raise and lower the wheels (front wheels 102 and rear wheel 103). When the wheels rise, they are completely stored in the body 104, and the robot 100 abuts the floor surface F at the seating surface 108 and enters the seated state. The drive mechanism 120 also controls the hands 106 via the wires 134.
FIG. 6 is a diagram schematically showing the configuration of the costume production support system 500.
The costume production support system 500 provides a support service for producing an original costume for the robot 100 in response to a request from the user. The service sells the fabric and pattern of the costume to the user and leaves the actual production, such as cutting the fabric to the pattern and sewing it, to the user. In the costume production support system 500, the robot management server 200 and a user terminal 250 are connected to a costume management server 400 via the Internet 510.
The costume management server 400 is installed at a costume service company 402 and functions as the "costume production support device". It holds a support program for executing the support service and support data for producing various costumes for each type of robot. The support data includes image data for each costume, pattern data (design information), fabric data, correction data for the robot (operating conditions) and the like.
The user terminal 250 may be a general-purpose computer such as a laptop PC, or a smartphone, and is connected to the Internet 510 by wire or wirelessly.
The outline of the support service is as follows. Based on a request from the user terminal 250, the costume management server 400 identifies the costume for the robot 100 and its fabric, transmits the pattern data of the costume to the user, and executes the fabric shipping process. In parallel, it issues the tag ID for the costume and executes its shipping process. The costume management server 400 holds, in association with the tag ID, an operation setting file recording the operating conditions of the robot 100 when wearing the costume (the driving force and movable range of each movable part, etc.). The fabric is mailed to the user home 252 by an external vendor (the fabric supplier). The tag ID is written into the IC tag 182 by another external vendor (the IC tag issuer) and mailed to the user home 252. That is, by being written into the IC tag 182, the tag ID is provided in a state that the user cannot refer to directly.
When the fabric and the IC tag 182 arrive at the user home 252, the user cuts the fabric according to the pattern and produces the costume 180, sews the IC tag 182 onto the designated position of the costume 180, and dresses the robot 100 in it. When the robot 100 detects that the costume 180 has been put on, it transmits the tag ID recorded in the IC tag 182 to the robot management server 200. The robot management server 200 accesses the costume management server 400, acquires the operation setting file corresponding to the tag ID and transmits it to the robot 100. By reflecting the operating conditions recorded in the operation setting file in its control command values, the robot 100 becomes able to move naturally in a way suited to the costume. In other words, for a costume 180 produced using the costume management server 400, the robot 100 can make the adjustments appropriate to the costume 180 it is wearing and operate properly.
FIG. 7 is a functional block diagram of the costume production support system 500.
As described above, the costume production support system 500 includes the robot system 300 and the costume management server 400, and the robot system 300 includes the robot 100, the robot management server 200 and the plurality of external sensors 114. Each component of the robot 100, the robot management server 200 and the costume management server 400 is realized by hardware, including computing units such as CPUs (Central Processing Units) and various coprocessors, storage devices such as memory and storage, and the wired or wireless communication lines connecting them, and by software that is stored in the storage devices and supplies processing instructions to the computing units. The computer programs may be composed of device drivers, operating systems, various application programs positioned in the layers above them, and libraries that provide common functions to these programs. Each block described below indicates a functional block, not a hardware configuration. Part of the functions of the robot 100 may be realized by the robot management server 200, and part or all of the functions of the robot management server 200 may be realized by the robot 100.
(Robot management server 200)
The robot management server 200 includes a communication unit 204, a data processing unit 202 and a data storage unit 206. The communication unit 204 is in charge of communication processing with the external sensors 114 and the robot 100. The data storage unit 206 stores various data. The data processing unit 202 executes various processes based on the data acquired by the communication unit 204 and the data stored in the data storage unit 206, and also functions as an interface between the communication unit 204 and the data storage unit 206.
The data storage unit 206 includes a motion storage unit 232, a map storage unit 216, a personal data storage unit 218 and the correction information storage unit 219. The robot 100 has a plurality of motion patterns (motions); various motions are defined, such as waving the hand 106, approaching the user while meandering, and staring at the user with its head tilted.
The motion storage unit 232 stores "motion files" that define the control content of each motion; each motion is identified by a motion ID. The motion files are also downloaded to the motion storage unit 160 of the robot 100. Which motion to execute may be determined by the robot management server 200 or by the robot 100. Many of the robot 100's motions are configured as composite motions containing a plurality of unit motions.
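A minimal sketch of a motion file as a composite of unit motions; the format shown is an assumption, since the embodiment does not specify one:

```python
# A "motion file" as a composite of unit motions, keyed by motion ID.
motion_files = {
    "greet": {
        "unit_motions": [
            {"actuator": "head_pitch", "target_deg": 20, "duration_s": 0.5},  # look up
            {"actuator": "arm",        "target_deg": 60, "duration_s": 0.8},  # raise hand
            {"actuator": "arm",        "target_deg": 10, "duration_s": 0.8},  # lower hand
        ],
    },
}

def play_motion(motion_id: str) -> None:
    """Execute the unit motions of a composite motion in sequence."""
    for unit in motion_files[motion_id]["unit_motions"]:
        print(f"drive {unit['actuator']} to {unit['target_deg']} deg "
              f"over {unit['duration_s']} s")

play_motion("greet")
```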
The map storage unit 216 stores, in addition to action maps that define the robot's behavior according to the situation, maps showing the arrangement of obstacles such as chairs and tables. The personal data storage unit 218 stores user information: specifically, master information indicating the intimacy with each user and each user's physical and behavioral characteristics. Other attribute information such as age and gender may also be stored.
The robot 100 has an internal parameter called intimacy for each user. When the robot 100 recognizes an action showing favor toward it, such as being picked up or spoken to, its intimacy with that user increases. Intimacy is low toward users who are not involved with the robot 100, who treat it roughly, or whom it meets infrequently.
The correction information storage unit 219 stores the operating conditions of the robot 100 acquired from the costume management server 400 as correction information for the movable parts (actuators). This correction information includes, for each actuator of the robot 100, an operation correction value (a correction value for the driving force) and a movable range (a set value for the drive amount).
The data processing unit 202 includes a position management unit 208, a recognition unit 212, an operation control unit 222, and a familiarity management unit 220. The position management unit 208 identifies the position coordinates of the robot 100 by the method described with reference to FIG. 4. The position management unit 208 may also track the user's position coordinates in real time.
The recognition unit 212 recognizes the external environment. This includes a variety of recognitions, such as recognizing weather and season based on temperature and humidity, and recognizing shaded spots (safe zones) based on light level and temperature. The recognition unit 150 of the robot 100 acquires various environmental information through the internal sensor 128, performs primary processing on it, and then transfers it to the recognition unit 212 of the robot management server 200.
The recognition unit 212 further includes a person recognition unit 214 and a response recognition unit 228. The person recognition unit 214 determines which person a photographed user corresponds to by comparing the feature vector extracted from an image captured by the robot 100's built-in camera with the feature vectors of users (clusters) registered in advance in the personal data storage unit 218 (user identification processing). The person recognition unit 214 includes an expression recognition unit 230, which estimates the user's emotion by image recognition of the user's facial expression.
The response recognition unit 228 recognizes various responsive actions taken toward the robot 100 and classifies them as pleasant or unpleasant acts. It also classifies the user's responses to the robot 100's behavior as positive or negative reactions. Pleasant and unpleasant acts are distinguished by whether the user's responsive action would be comfortable or uncomfortable for a living creature.
The operation control unit 222 determines the motions of the robot 100 in cooperation with the operation control unit 152 of the robot 100. The operation control unit 222 creates a movement target point for the robot 100 and a movement route to it. It may create a plurality of movement routes and then select one of them. The operation control unit 222 also selects the robot 100's motion from the plurality of motions in the motion storage unit 232.
The familiarity management unit 220 manages familiarity for each user. Familiarity is registered as part of the personal data in the personal data storage unit 218. When a pleasant act is detected, the familiarity management unit 220 raises the familiarity with that user; when an unpleasant act is detected, it lowers it. In addition, the familiarity of a user who has not been sighted for a long period gradually declines.
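A sketch of the familiarity update rules just described, as one function; the numeric step sizes, decay rate, and score range are invented for illustration only.

```python
def update_familiarity(score: float, event: str) -> float:
    """Adjust a per-user familiarity score based on recognized behavior.

    Illustrative rules: pleasant acts raise the score, unpleasant acts
    lower it, and long periods without sighting decay it gradually.
    """
    if event == "pleasant":      # e.g. being picked up, spoken to kindly
        score += 5.0
    elif event == "unpleasant":  # e.g. rough handling
        score -= 10.0
    elif event == "not_seen":    # periodic decay while the user is absent
        score *= 0.99
    return max(0.0, min(100.0, score))  # clamp to an assumed 0-100 range
```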
(Robot 100)
The robot 100 includes a communication unit 142, a data processing unit 136, a data storage unit 148, an internal sensor 128, and a drive mechanism 120. The communication unit 142 corresponds to the communication device 126 (see FIG. 5) and handles communication with the external sensor 114, the robot management server 200, and other robots 100. The data storage unit 148 stores various data and corresponds to the storage device 124 (see FIG. 5). The data processing unit 136 executes various processes based on the data acquired by the communication unit 142 and the data stored in the data storage unit 148; it corresponds to the processor 122 and the computer program executed by the processor 122. The data processing unit 136 also functions as an interface between the communication unit 142, the internal sensor 128, the drive mechanism 120, and the data storage unit 148.
The data storage unit 148 includes a motion storage unit 160 and a correction information storage unit 162. The motion storage unit 160 stores motion files defining the various motions of the robot 100; these files are downloaded from the motion storage unit 232 of the robot management server 200. Each motion is identified by its motion ID. To express the various motions, the operation timing, operation duration, operation direction, and the like of the various actuators (drive mechanism 120) are defined as a time series in the motion file.
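A sketch of what such a time-series motion file and its playback might look like; the keyframe schema, field names, and the callback-based player are assumptions, not the specification's file format.

```python
# One keyframe: at time t (seconds), drive the named actuators toward the
# given target angles (degrees). A motion file is a list of keyframes.
motion_file = {
    "motion_id": "M-HUG-REQUEST",
    "keyframes": [
        {"t": 0.0, "targets": {"arm_left": 0.0,  "arm_right": 0.0}},
        {"t": 0.5, "targets": {"arm_left": 60.0, "arm_right": 60.0}},
        {"t": 1.2, "targets": {"arm_left": 75.0, "arm_right": 75.0}},
    ],
}

def play(motion: dict, send_command) -> None:
    """Replay keyframes in time order through a command callback.
    A real player would also wait until each keyframe's time arrives."""
    for frame in sorted(motion["keyframes"], key=lambda f: f["t"]):
        for actuator, angle in frame["targets"].items():
            send_command(frame["t"], actuator, angle)
```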
The correction information storage unit 162 stores the operating conditions of the robot 100 acquired from the costume management server 400 via the robot management server 200. Correction value files are downloaded into it from the correction information storage unit 219 of the robot management server 200.
Various data may also be downloaded to the data storage unit 148 from the map storage unit 216 and the personal data storage unit 218.
The data processing unit 136 includes a recognition unit 150, an operation control unit 152, an equipment detection unit 154, and a correction processing unit 156. The recognition unit 150 interprets external information obtained from the internal sensor 128. It is capable of visual recognition (vision), odor recognition (smell), sound recognition (hearing), and tactile recognition (touch).
The recognition unit 150 periodically images the outside world with the built-in omnidirectional camera and detects moving objects such as people and pets. It extracts a feature vector from the captured image of a moving object. As described above, a feature vector is a set of parameters (feature quantities) indicating the physical and behavioral characteristics of a moving object. When a moving object is detected, physical and behavioral features are also extracted from the odor sensor, the built-in sound-collecting microphone, the temperature sensor, and so on. These features are likewise quantified and become components of the feature vector.
In response to the responsive actions recognized by the recognition unit 150, the familiarity management unit 220 of the robot management server 200 changes the familiarity toward the user. In principle, familiarity with a user who performs a pleasant act rises, and familiarity with a user who performs an unpleasant act falls.
The operation control unit 152 determines the moving direction of the robot 100 together with the operation control unit 222 of the robot management server 200. Movement based on the action map may be determined by the robot management server 200, while immediate movements such as avoiding an obstacle may be determined by the robot 100. The drive mechanism 120 drives the front wheels 102 in accordance with instructions from the operation control unit 152, directing the robot 100 toward the movement target point.
The operation control unit 152 determines the motions of the robot 100 in cooperation with the operation control unit 222 of the robot management server 200. Some motions may be determined by the robot management server 200 and others by the robot 100. Alternatively, the robot 100 may normally determine its motions, with the robot management server 200 taking over when the robot 100's processing load is high. A base motion may be determined by the robot management server 200 and an additional motion by the robot 100. How the motion determination processing is divided between the robot management server 200 and the robot 100 may be designed according to the specifications of the robot system 300.
The operation control unit 152 instructs the drive mechanism 120 to execute the selected motion. The drive mechanism 120 controls each actuator according to the motion file.
When a user with high familiarity is nearby, the operation control unit 152 can execute a motion that lifts both hands 106 as a gesture asking to be hugged; when it tires of the hug, it can express reluctance by alternately repeating reverse rotation and stopping with the left and right front wheels 102 retracted. The drive mechanism 120 drives the front wheels 102, the hands 106, and the neck (head frame 316) according to instructions from the operation control unit 152, letting the robot 100 express a variety of motions.
When the equipment detection unit 154 reads a tag ID from an IC tag 182 sewn onto a costume 180, it determines that the costume 180 is being worn. A tag ID is readable at close range. When a plurality of tag IDs are read, the robot is judged to be wearing layered costumes. When correction information corresponding to a tag ID already exists in the correction information storage unit 162, the operation control unit 152 changes the operation settings to match that correction information, by the method described later.
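A sketch of the tag-reading decision just described; the function and store names are illustrative assumptions.

```python
def on_tags_read(tag_ids: list[str], correction_store: dict) -> list[str]:
    """Decide the wearing state from the tag IDs read at close range.

    Returns the tag IDs whose correction information is already cached;
    layering is assumed when more than one tag is read at once.
    """
    if not tag_ids:
        return []                       # no certified costume detected
    if len(tag_ids) > 1:
        print("layered costumes detected:", tag_ids)
    return [t for t in tag_ids if t in correction_store]
```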
The equipment detection unit 154 may also detect costume wearing by various methods other than the IC tag 182. For example, it may judge that a costume is worn when the internal temperature of the robot 100 rises, or it may recognize the worn costume from camera images. A capacitance sensor may be installed over a wide area of the outer skin 314, and wearing may be judged when this sensor detects contact over a wide area. Hereinafter, detecting costume wearing based on physical information such as image, temperature, or contact information, rather than the IC tag 182, is called "physical recognition".
Hereinafter, a costume whose tag ID is registered via an IC tag 182 is called a "certified costume", and a costume without a registered tag ID is called a "non-certified costume". When no particular distinction is needed, both are simply called a "costume".
When the costume 180 is detected, the correction processing unit 156 corrects the robot 100's operation to suit the costume 180. That is, it corrects the control command value for each actuator based on the correction information corresponding to the tag ID. When the correction information for a tag ID is not in the correction information storage unit 162, the correction processing unit 156 sends the tag ID to the robot management server 200 and downloads the corresponding correction information (operation correction values and movable ranges). When the correction information is not on the robot management server 200 either, the robot management server 200 downloads it from the costume management server 400.
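A sketch of this three-level fallback; the dict-based stores stand in for the robot's local cache, the robot management server, and the costume management server, which in practice would be network requests.

```python
def fetch_correction(tag_id: str, local: dict, server: dict, costume_db: dict):
    """Resolve correction info via the fallback chain:
    robot-local cache -> robot management server -> costume management server."""
    if tag_id in local:                      # already cached on the robot
        return local[tag_id]
    if tag_id not in server:                 # server misses too:
        server[tag_id] = costume_db[tag_id]  # it fetches from the costume server
    local[tag_id] = server[tag_id]           # cache on the robot for next time
    return local[tag_id]
```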
When it is detected that the robot 100 is wearing a costume but no tag ID is detected, the costume is either a certified costume to which the IC tag 182 has not been attached, or a non-certified costume. The operation control unit 152 therefore selects a motion expressing displeasure at not formally wearing a costume (a certified-costume-wearing request motion).
The certified-costume-wearing request motion may be, for example, a rejection behavior such as the robot 100 shaking its body violently or refusing to move at all. It may be preset as the robot 100's characteristic motion for signaling something, in particular its typical motion for expressing dislike. If the user then notices that the IC tag 182 was forgotten and sews it onto the certified costume, the correction processing unit 156 can execute the correction processing described above.
On the other hand, if the costume is non-certified and no IC tag 182 can be attached, the correction information cannot be acquired. Therefore, when no tag ID is detected even after a predetermined period has elapsed following execution of the certified-costume-wearing request motion, the correction processing unit 156 executes an autonomous correction process that does not rely on correction information. That is, the operation control unit 152 selects a predetermined calibration motion, measures the output given to each actuator during its execution together with the resulting amount of movement, and records these as operation data. The correction processing unit 156 calculates appropriate correction values based on this operation data and corrects the control command value for each actuator.
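A sketch of the autonomous correction idea: run the calibration motion, compare intended and achieved movement, and derive a drive-force factor. The linear (proportional) model is an assumption; the specification does not state how the correction value is computed.

```python
def autonomous_correction(measure, nominal_output: float,
                          target_amount: float) -> float:
    """Estimate a drive-force correction factor without downloaded data.

    `measure(output)` runs the predefined calibration motion at the given
    actuator output and returns the movement actually achieved. The ratio
    of intended to achieved movement gives a first-order correction factor.
    """
    achieved = measure(nominal_output)
    if achieved <= 0.0:
        raise RuntimeError("actuator did not move; costume may block it")
    return target_amount / achieved  # >1.0 means more drive force is needed
```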
(Costume management server 400)
The costume management server 400 includes a communication unit 410, a data processing unit 412, and a data storage unit 414. The communication unit 410 handles communication with the robot management server 200 and the user terminal 250. The data storage unit 414 stores various data for guiding the production of costumes for the robot 100 as a support service. The data processing unit 412 executes various processes based on the data acquired by the communication unit 410 and the data stored in the data storage unit 414, and also functions as an interface between the communication unit 410 and the data storage unit 414.
FIGS. 8 and 9 show the costume production support screens that the costume management server 400 provides to the user. FIGS. 8(a) to 8(c) show the screen transitions for selecting a costume and then selecting a fabric for each of its parts. FIGS. 9(a) and 9(b) show the purchase screens displayed after the costume and fabrics have been selected. When the costume management server 400 is accessed from the user terminal 250, the costume category selection screen shown in FIG. 8(a) is displayed. On this screen, selection buttons are displayed for each costume category, such as "suit/coat" and "jacket/jersey".
When the user selects a costume category, the costume selection screen shown in FIG. 8(b) is displayed. In the illustrated example the user has selected "jacket/jersey", so the left side of the screen shows a selection button for each costume in that category: jacket (costume ID: J001), jersey (costume ID: J002), vest (costume ID: J003), and so on. The right side of the screen displays an image of the robot 100 on which the costume will be worn.
When the user selects a costume, the fabric selection screen shown in FIG. 8(c) is displayed. In the illustrated example the user has selected "jacket (J001)", so the left side of the screen shows the fabrics available for making that jacket. Fabric selection proceeds step by step, one pattern part at a time. In the illustrated example, for the "collar" part, a selection button is displayed for each fabric variant, such as fabric A (red) (fabric ID: A001) and fabric A (blue) (fabric ID: A002). As described above, even fabrics of the same type have different fabric IDs depending on their pattern and coloring. The user can select any fabric for each part of the costume (for each pattern piece).
The right side of the screen displays a preview image of the robot 100 wearing the costume. If fabric A (red) is selected at this point, the corresponding image file is loaded and a jacket with a red collar is previewed. When the collar selection is finished, fabric selection screens are displayed in turn for the other parts, such as the sleeves, front, and back.
When fabric selection is completed in this way, the purchase screen for the fabric and pattern is displayed, as shown in FIG. 9(a). In the illustrated example, ¥1,000 is displayed as the fee for the fabric and pattern of the jacket (J001), together with "Purchase", "Cancel", and "Return to previous screen" buttons. When the user selects the purchase button, billing processing is executed. Selecting the cancel button cancels everything, including the selections made so far. Selecting "Return to previous screen" returns to the fabric selection screen, where the user can reselect fabrics.
When the user selects the purchase button and billing is completed, a purchase completion screen is displayed, as shown in FIG. 9(b). The pattern is previewed on the right side of the screen, and a button for downloading it is displayed on the left side. By selecting this button, the user can download the pattern file. This screen also states that the fabric and the IC tag will be mailed (delivered) separately.
FIG. 10 is a block diagram showing the functions of the costume management server 400 of FIG. 7 in detail. The data processing unit 412 includes a selection unit 430, an operating condition generation unit 432, an ID generation unit 434, a management unit 436, an operating condition provision unit 438, a billing processing unit 440, an IC tag shipping request unit 442, a fabric shipping request unit 444, and a design information provision unit 446. The data storage unit 414 includes a design information storage unit 420, a fabric characteristic storage unit 422, an operating condition storage unit 424, and a billing information storage unit 426.
In response to a request from the user terminal 250, the selection unit 430 provides the selection screens, described with reference to FIGS. 8 and 9, for the costume to be produced. The design information storage unit 420 holds various information on costume design, and the selection unit 430 composes the selection screens by referring to it. The design information storage unit 420 holds the various information needed to build the costume selection screens, such as finished costume images, the fabrics usable for production, and fabric images, as well as information used to actually produce the costume, such as the costume's pattern data and a manual describing the production procedure.
FIG. 11(a) shows an example of the data structure of the costume information table stored in the design information storage unit 420. The costume information table designates, for each costume, the pattern parts and the fabric materials usable for them. A costume is identified by its "costume ID", and pattern data is assigned to each costume ID. Each pattern part is also associated with the fabric materials that can be used for it. For example, the pattern file "C001.zip" is associated with costume ID "C001", and material A (for example, leather) is designated for the collar part of the pattern. Material B (for example, cloth) is also designated for the collar, meaning that either material A or material B can be selected for the collar. Similarly, the sleeve part designates that fabrics of material A or material B can be selected.
FIG. 11(b) shows an example of the data structure of the fabric information table, which stores material information in association with each fabric. A fabric is identified by its "fabric ID". Fabric IDs differ according to color and pattern even when the material is the same: for the same material A, for example, different fabric IDs are assigned, such as "A001" for a red fabric and "A003" for a polka-dot fabric. Even for the same red color, different fabric IDs are assigned by material, such as "A001" for material A and "B001" for material B. The design information storage unit 420 functions as the "storage unit" that holds the information guiding costume production. The selection unit 430 of FIG. 10 refers to the costume information table and the fabric information table and, while presenting the costume selected by the user together with the fabric options usable for producing it, finalizes the combination of fabrics for producing the costume the user desires.
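A rough reconstruction of the two tables of FIG. 11 as in-memory data structures, with a validation helper of the kind the selection unit might use; the Python layout and the helper are illustrative assumptions.

```python
# Costume information table (FIG. 11(a)): pattern file and allowed materials.
costume_table = {
    "C001": {"pattern": "C001.zip",
             "parts": {"collar": ["A", "B"], "sleeve": ["A", "B"]}},
}

# Fabric information table (FIG. 11(b)): material and color per fabric ID.
fabric_table = {
    "A001": {"material": "A", "color": "red"},
    "B001": {"material": "B", "color": "red"},
}

def valid_choice(costume_id: str, part: str, fabric_id: str) -> bool:
    """Check that the chosen fabric's material is allowed for this part."""
    allowed = costume_table[costume_id]["parts"][part]
    return fabric_table[fabric_id]["material"] in allowed
```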
The billing processing unit 440 transmits a billing screen when the user's selection is finished and executes billing processing in response to the user's purchase request. The design information provision unit 446 acquires the pattern file from the design information storage unit 420 on condition that billing has been completed, and transmits it to the user terminal 250. The design information provision unit 446 functions as the "provision unit" that outputs the pattern file as "design information".
The billing information storage unit 426 stores information for the billing processing executed when the costume's fabric and pattern file are provided. This information includes the amount to be charged for each costume ID, each user's purchase history, and the like.
When billing is completed in the billing processing unit 440, the operating condition generation unit 432 receives from the selection unit 430 the costume selected by the user and the fabric information for each pattern part, and determines the operating conditions of the robot 100 when wearing that costume. The operating condition generation unit 432 then creates an operation setting file based on the determined operating conditions. It functions as the "determination unit" that determines and outputs the robot's operating conditions, and also as the "generation unit" that creates the operation setting file based on them.
The fabric characteristic storage unit 422 stores correction information for each fabric, organized by costume and by the part where the fabric is used. This correction information is provided as correction values for the control amounts (control command values) of each actuator, so that proper operation control is possible when the robot 100 wears the costume 180. The correction values are set relative to the control amounts in the reference state, in which the robot 100 wears only the outer skin 314, and include a correction coefficient for the actuator's driving force and the movable range of the part driven by the actuator.
FIG. 12 shows an example of the data structure of the fabric characteristic storage unit 422. For each costume type, it stores, in association, the costume part (pattern part), the fabric material adopted for that part, the actuator (movable part) whose operation that part affects, the correction coefficient to be set for that actuator's driving force, and that actuator's movable range.
For example, in the coat with costume ID "C001", using material A (for example, leather) for the collar makes the operating resistance of the head relatively large. Therefore, 1.5 is set as the correction coefficient of the actuator that drives the head left and right, and the head's lateral movable range is limited to -30 to 30°. Since, as described above, the movable range in the reference state is -75 to 75°, this places a large constraint on the control amount. On the other hand, when material B (for example, cloth) is used for the collar, the head's operating resistance is relatively small, so 1.2 is set as the correction coefficient and the lateral movable range is set to -50 to 50°.
When the costume is the jacket (J001), its structure allows more freedom than the coat (C001), so the correction values for the actuators driving the head tend to be relaxed; the same applies to the actuators driving the arms. When the costume is the tank top (T001), it does not affect the operation of the head or arms, so no correction is made for the actuators driving them; alternatively, a correction coefficient of 1.0 is used with the movable range matching the reference state. Even with a tank top, if material C is selected for the front fabric, its weight slightly affects the advance/retreat control of the wheels, so the correction coefficient is set to 1.1.
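A sketch of how such a correction might be applied to a single actuator command, using the coat C001 figures above; the function and its signature are assumptions.

```python
def corrected_command(angle: float, force: float,
                      coeff: float, lo: float, hi: float) -> tuple[float, float]:
    """Apply a costume correction to one actuator command.

    The target angle is clamped to the costume's movable range and the
    drive force is scaled by the correction coefficient (a coefficient of
    1.0 with the reference range means "no correction").
    """
    clamped = max(lo, min(hi, angle))  # restrict to the costume's movable range
    return clamped, force * coeff      # scale the drive force

# e.g. coat C001 with a leather collar: head pan limited to +/-30 degrees,
# drive force raised by a factor of 1.5 to overcome the extra resistance.
print(corrected_command(45.0, 0.4, 1.5, -30.0, 30.0))  # -> (30.0, ~0.6)
```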
The operating condition generation unit 432 of FIG. 10 refers to the fabric characteristic storage unit 422 and determines the operating conditions according to the combination of costume and fabrics. In general, a costume is made by cutting fabric according to a pattern and sewing the cut pieces together. By preparing in advance, for each pattern part, correction information derived from that part's influence on the actuators of the robot 100, the operating condition generation unit 432 can determine the appropriate operating conditions needed when the robot wears a costume made from any combination of fabrics the user chooses.
The management unit 436 receives the operation setting file from the operating condition generation unit 432 and then instructs the ID generation unit 434 to generate a tag ID, which the ID generation unit 434 does on that instruction. A tag ID is identification information generated for each garment and is never duplicated, even for the same costume with the same combination of fabrics; that is, it is generated for every costume produced using the costume management server 400 and uniquely identifies that costume. The management unit 436 stores the tag ID generated by the ID generation unit 434 in the operating condition storage unit 424, associated with the operation setting file generated by the operating condition generation unit 432.
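A sketch of per-garment unique ID issuance; using UUIDs is an assumption, since the specification only requires that IDs never repeat, even for identical costume and fabric combinations.

```python
import uuid

def issue_tag_id(issued: set) -> str:
    """Generate a tag ID that is unique per manufactured garment."""
    tag_id = f"TAG-{uuid.uuid4().hex[:12]}"
    while tag_id in issued:  # defensive re-draw on the rare collision
        tag_id = f"TAG-{uuid.uuid4().hex[:12]}"
    issued.add(tag_id)
    return tag_id
```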
FIG. 13 shows an example of the data structure of the operating condition storage unit 424, which holds a table associating each tag ID with the correction coefficient and movable range of each actuator as an operation setting file. In this figure the information held in the operation setting file is shown as a table for explanation, but the operating condition storage unit 424 need only store the data so that the corresponding operation setting file can be retrieved using the tag ID as a key.
The operating condition provision unit 438 of FIG. 10 receives a tag ID from the robot management server 200, reads the operation setting file associated with it from the operating condition storage unit 424, and transmits that file to the robot management server 200. By referring to the operating condition storage unit 424 with the tag ID as a key, the operating condition provision unit 438 can obtain the correction coefficient and movable range for each of the robot's actuators.
The IC tag shipping request unit 442 requests the IC tag issuer to record the tag ID processed by the management unit 436 on an IC tag 182 and mail it to the user. The fabric shipping request unit 444 requests the fabric supplier to mail the fabrics selected via the selection unit 430 to the user. These requests may be transmitted automatically to each supplier's terminal.
FIG. 14 is a sequence diagram showing an outline of communication between the costume management server and the user terminal at the start of using the support service.
When there is an access request from the user terminal 250 (S10), the costume management server 400 transmits the selection screens described above in sequence (S12). The user terminal 250 selects costume and fabric information (costume information) according to the user's input (S14) and transmits the selection to the costume management server 400 (S16). When the user's selection is complete, the costume management server 400 registers the selection information (S18) and transmits the purchase screen (S20).
When the user selects the purchase button (S22), the user terminal 250 transmits a purchase request (S24). The costume management server 400 executes billing processing in response to this purchase request (S26); when it is complete, the server transmits a billing completion notice to the user terminal 250 (S28) and loads the pattern file corresponding to the costume information (S30).
When the user terminal 250 requests the pattern (S32), the server transmits the pattern file (S34). The user terminal 250 downloads the pattern file in response to the user's operation (S40) and prints the pattern (S42).
Meanwhile, the costume management server 400 issues a tag ID corresponding to the costume information (S44) and creates the operating conditions (operation setting file) (S46). It then registers the created operating conditions in association with the tag ID (S48). After that, it executes the tag ID shipping processing described above (S50) and the fabric shipping processing (S52).
FIG. 15 is a sequence diagram showing an outline of the procedures among the IC tag issuer, the fabric supplier, and the user.
After receiving the shipping instruction and tag ID of S50 in FIG. 14, the IC tag issuer writes the tag ID to an IC tag 182 (S60) and delivers it to the user's home 252 (S62). Meanwhile, after receiving the shipping instruction of S52 in FIG. 14, the fabric supplier prepares the corresponding fabric (S64) and ships it to the user's home 252. On receiving these, the user cuts the fabric according to the already downloaded pattern (S68), sews the costume 180 (S70), and sews the IC tag 182 onto the costume 180 (S72).
FIG. 16 is a flowchart showing a process of correcting the movement of the robot 100 when wearing a costume.
First, the equipment detection unit 154 physically recognizes costume wearing based on physical information such as image, temperature, and contact information (S100). When costume wearing cannot be physically recognized (N in S100), the subsequent processing is not executed. When costume wearing is physically recognized (Y in S100) and the equipment detection unit 154 can also detect a tag ID (Y in S102), that is, when a certified costume is worn, the correction processing unit 156 executes the following correction processing on the control amounts of each actuator.
If the corrected flag, which indicates that correction has already been made, is off (N in S104), the correction processing unit 156 first executes an operating condition acquisition process that obtains the operating conditions via the robot management server 200 (S106). When the operating conditions are thereby acquired (Y in S108), it executes correction processing that reflects the correction coefficients and movable ranges contained in them in the control amounts (S110). After the operation correction, the operation control unit 152 actually moves each actuator and measures the output values and movement amounts. Referring to the operation data showing the relationship between output value and movement amount, it determines whether the desired movement amount is achieved after the correction (S116).
On the other hand, when the equipment detection unit 154 cannot detect a tag ID (N in S102), that is, when a non-certified costume is worn, the operation control unit 152 executes the certified-costume-wearing request motion described above (S112). If no tag ID is detected even after a predetermined time has elapsed (Y in S114), that is, when it is judged that the user has no intention of dressing the robot 100 in a certified costume, the process moves to S116 and the operation check is executed.
If the operation check of S116 shows that the movement amount is appropriate (Y in S118), the operation control unit 152 turns on the corrected flag (S120) and updates the correction information (S124). If the movement amount is not appropriate (N in S118), the operation control unit 152 autonomously executes correction processing (S122): it records each actuator's output values and movement amounts as operation data and performs operation corrections (driving-force correction and movable-range correction) so that each actuator's output causes no trouble. It then updates the correction information (S124).
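The S102-S124 portion of the FIG. 16 flow, condensed into one function as a sketch; all behavior is injected as callables so the sketch stays self-contained, and the exact flag and return handling are assumptions.

```python
def on_costume_detected(tag_id, already_corrected,
                        fetch, apply, request_motion, verify, autonomous_fix):
    """Condensed control flow of FIG. 16 after physical recognition (S100).

    `fetch` downloads the operating conditions for a tag ID, `apply` reflects
    them in the control values, `request_motion` plays the certified-costume
    request motion, `verify` checks movement adequacy, and `autonomous_fix`
    runs the self-calibration fallback.
    """
    if tag_id is not None:          # S102: certified costume worn
        if not already_corrected:   # S104: corrected flag still off
            apply(fetch(tag_id))    # S106-S110: acquire and reflect conditions
    else:                           # non-certified costume
        request_motion()            # S112: certified-costume request motion
        # S114: waiting period elapses with no tag; fall through to the check
    if verify():                    # S116-S118: movement amount adequate?
        return True                 # S120: corrected flag on, then S124 update
    autonomous_fix()                # S122: autonomous correction
    return True                     # S124: correction information updated
```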
FIG. 17 is a sequence diagram showing an outline of the communication among the robot, the robot management server, and the costume management server in the operating condition acquisition process of S106 in FIG. 16.
When the robot 100 transmits a tag ID together with a request to send the operating conditions (S80), the robot management server 200 accesses the costume management server 400, transmits the tag ID, and requests transmission of the operating conditions.
The costume management server 400 refers to its data table based on the received tag ID, reads the corresponding operation setting file (S84), and transmits it to the robot management server 200. The robot management server 200 stores the operating conditions defined in the operation setting file in the correction information storage unit 219 (S88) and transmits them to the robot 100 (S90). The robot 100 stores the operating conditions in its correction information storage unit 162 (S92).
The costume production support system 500 has been described above based on an embodiment. According to this embodiment, screens guiding the production of the costume 180 are provided in sequence, supporting the user in producing a costume for the robot 100. In particular, for a costume 180 produced with this support, the operating conditions (correction information) for when the robot 100 wears it are output. By reflecting those operating conditions in its control, the robot 100 can move without strain in a manner suited to the costume 180 and properly exhibit its performance. Because appropriate operating conditions are set based on the costume type and its fabrics, trouble in the robot 100's actuators can be prevented or suppressed. The user can therefore dress the robot 100 in various costumes 180 with peace of mind and enjoy living with the robot 100.
The present invention is not limited to the above embodiment and modifications; components can be modified and embodied without departing from the gist of the invention. Various inventions may be formed by appropriately combining a plurality of the components disclosed in the above embodiment and modifications, and some components may be deleted from the full set of components shown in them.
The above embodiment showed an example in which pattern data is provided over a network as the "design information" designating how to produce a costume. In a modification, a manual showing the production procedure or other design information may be provided. When a costume is produced with a 3D printer, the control information for that 3D printer may be provided as design information. Design information (such as patterns) may also be physically delivered.
The above embodiment showed an example in which the costume management server 400 provides the user with the fabric and pattern of the selected costume. In a modification, the fabric may not be provided, or the user may be able to choose whether it is provided. The user can then purchase fabric of the same material through another route and produce the costume with a fabric whose color or pattern better suits their taste, creating a more original costume. The costume service company 402 can charge the user for providing the pattern and operating conditions. As a result, the flexibility of the support service is also enhanced.
Alternatively, the costume service company may sell the materials needed for costume production (fabric and patterns) through another route, with the costume management server providing only the robot's operating conditions. For example, the user may purchase a robot costume on the Internet (online) or at a store (offline) and obtain a correction ID with the purchase. The "correction ID" is identification information for acquiring the operating conditions (correction information) of a robot wearing that costume; it may be used to request the operating conditions from the costume management server. In that case, the user may own an IC tag and a writer for writing the correction ID to it. By attaching the IC tag with the written correction ID to the robot, the robot's operating conditions can be acquired as in the above embodiment.
The correction ID may be provided in a form the user can directly recognize, for example as a character string. In that case, the user terminal accesses the costume management server to request the operating conditions, and the information acquired by the user terminal can be transferred to the robot or the robot management server. Alternatively, it may be provided in a form the user cannot directly recognize, such as a tag ID as in the above embodiment; in that case, the robot or the robot management server accesses the costume management server to request the operating conditions.
Although not described in the above embodiment, information designating where on the costume the IC tag should be attached may be included in the design information (pattern data, etc.). The system may be configured so that the robot cannot read the tag information if the IC tag is not attached at the designated position.
The above embodiment showed the user terminal 250 and the robot management server 200 as separate components. In a modification, they may be configured as one unit; for example, the user terminal 250 may be a general-purpose computer such as a laptop PC, part of whose functions realize the robot management server 200.
The above embodiment showed a configuration in which the user terminal 250 receives the support service by connecting to the costume management server 400 via the Internet 510. In a modification, the user may access the costume management server 400 directly using a terminal installed at the costume service company 402.
The above embodiment showed a configuration in which the robot has a soft outer skin of a specific shape and the costume is worn over that skin. In a modification, the support apparatus may be used to produce costumes for a robot whose outer skin differs in shape or material, or for a robot without an outer skin.
The above embodiment and modifications showed one form of robot as the wearer of the costume, but the support apparatus is also applicable to other humanoid robots, pet robots, and the like.
The above embodiment mainly described actuator correction, but the operating condition generation unit 432 of FIG. 10 may also include in the operation setting file values for correcting sensors and other components whose characteristics change when a costume is worn, such as a sensitivity correction for the touch sensor.
As a modification of the above embodiment, the IC tag shipping request unit 442 of FIG. 10 may transmit an encrypted tag ID to the supplier and request that it be written to the IC tag. By configuring the system so that only the robot 100 can decrypt the encrypted tag ID, it is possible to prevent a tag ID from being used fraudulently, with the same tag ID shared by multiple costumes.
Although not described in the above embodiment, a unique motion (also called a "costume-specific motion") may be set for each costume to be produced. A "costume-specific motion" is a specific action peculiar to a costume and associated with it. Furthermore, a trigger condition may be set for each costume-specific motion. For example, when the costume corresponding to a tag ID is a dance costume, a motion in which the robot dances may be set as the costume-specific motion, with the detection of dance music as its trigger condition. Alternatively, when the costume corresponding to a tag ID imitates the outfit of a particular idol or actor, a motion in which the robot impersonates that idol may be set as the costume-specific motion, with the trigger condition being that the user is detected calling out to that idol. The operation setting file may then be generated by associating the tag ID not only with the operating conditions (correction information) described above but also with the costume-specific motion and its trigger condition. The operation setting file includes a costume-specific motion file defining the control details of the costume-specific motion.
By transmitting the tag ID to the robot management server or the costume management server, the robot can download the operation setting file corresponding to the worn costume; the costume management server provides the costume-specific motion file. When the trigger condition corresponding to a tag ID is satisfied, the robot executes the costume-specific motion corresponding to that tag ID. Associating costume-specific motions with the operation setting file in this way further enriches the services related to producing robot costumes.
Various other costume-specific motions and trigger conditions can be set besides the above. For example, for the tag ID of a thick costume, a motion expressing a gesture of feeling hot may be set as the costume-specific motion, with a temperature of 20 degrees or higher as its trigger condition.
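A sketch of how costume-specific motions and their trigger conditions could be looked up at runtime; the tag IDs, motion names, and the sensed-context dict are hypothetical placeholders.

```python
# Mapping from tag ID to a costume-specific motion and its trigger condition.
# `sensed` stands for whatever context the robot currently perceives
# (detected sounds, ambient temperature, ...).
costume_motions = {
    "TAG-DANCE-01": ("dance_motion",   lambda s: "dance_music" in s["sounds"]),
    "TAG-COAT-07":  ("too_hot_motion", lambda s: s["temperature_c"] >= 20.0),
}

def trigger_costume_motion(tag_id: str, sensed: dict):
    """Return the motion to play if this costume's trigger condition holds."""
    entry = costume_motions.get(tag_id)
    if entry:
        motion, condition = entry
        if condition(sensed):
            return motion
    return None

# e.g. trigger_costume_motion("TAG-COAT-07", {"sounds": [], "temperature_c": 24})
```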
Although not described in the above embodiment, a costume that protects the robot functionally may also be a production target. FIG. 18 illustrates a hat as an example of such a functional costume.
At the crown of the head frame 316 of the robot 100, a hole is provided coaxially with the opening 390 of the outer skin 314, and the horn 112 is inserted through it (see FIG. 2). The gap between this hole and the horn 112 serves as a ventilation passage through which outside air can be introduced into the head frame 316. The outside air cools the functional components inside the head frame 316 (heat-generating components such as circuit boards). For this reason, when a hat is produced as a costume, it is preferable not to impair this cooling function.
The hat 520 shown in FIG. 18(a) has an insertion hole 522 at its crown through which the horn 112 of the robot 100 is inserted. A mesh 524 is provided around the insertion hole 522 of the hat 520, locally enhancing air permeability. The hat 530 shown in FIG. 18(b) has a mesh 526 at the rear in addition to the mesh 524 at the crown. For example, if an exhaust port is provided at the back of the head frame 316, providing the mesh 526 at a position matching that exhaust port can further improve ventilation. Cloth, leather, or synthetic resin (plastic) can be selected as the material of each hat.
Although this modification shows an example in which a ventilation structure is provided in a hat to match the structure of the robot 100, a similar ventilation structure may be adopted for costumes other than hats. For example, if a ventilation passage is formed at the tail joint of a pet-type robot, the costume may be provided with a hole through which the tail is inserted, and air permeability may be enhanced by giving the area around or near that hole a mesh structure.
That is, a costume worn by a robot may be provided with a ventilation-enhancing region at the portion covering a ventilation passage of the robot, or in the vicinity of that passage. The "ventilation-enhancing region" may be realized by a porous structure such as a mesh, or by a region of relatively thin fabric. The ventilation-enhancing region is preferably provided both at the position corresponding to the intake port through which the robot draws in outside air and at the position corresponding to the exhaust port through which it expels internal air.

Claims (9)

  1.  A costume production support device comprising:
      a storage unit that holds information for guiding the production of a robot costume;
      a selection unit that displays, in response to a user's request input, a selection screen for a costume to be produced; and
      a determination unit that determines and outputs an operating condition of the robot based on the selected costume.
  2.  The costume production support device according to claim 1, wherein the selection unit allows the user to select a fabric to be used in producing the selected costume, and
      the determination unit determines the operating condition of the robot based on the selected costume and fabric.
  3.  The costume production support device according to claim 1 or 2, wherein the operating condition includes a correction value relating to a driving force of an actuator of the robot.
  4.  The costume production support device according to any one of claims 1 to 3, wherein the operating condition includes a set value relating to a drive amount of an actuator of the robot.
  5.  The costume production support device according to any one of claims 1 to 4, further comprising:
      a generation unit that generates an operation setting file based on the determined operating condition; and
      a management unit that outputs identification information for acquiring the operation setting file.
  6.  The costume production support device according to claim 5, wherein the identification information is provided in a state in which the user cannot directly refer to it.
  7.  The costume production support device according to claim 5 or 6, wherein the identification information is written to an IC tag and provided to the user.
  8.  The costume production support device according to claim 7, further comprising a providing unit that outputs design information specifying a production method for the selected costume, wherein
      the robot has a function of acquiring, via a network, the operation setting file corresponding to the identification information written to the IC tag, and
      the design information includes information specifying a mounting position of the IC tag on the robot.
  9.  The costume production support device according to claim 8, further comprising a billing processing unit that executes billing processing, wherein the providing unit provides the design information on the condition that the billing processing by the billing processing unit has been completed.
PCT/JP2018/019775 (priority date 2017-05-31, filing date 2018-05-23) — Clothing manufacturing support system — WO2018221334A1 (en)

Priority Applications (1)

JP2019522151A (granted as JP6579538B2) — priority date 2017-05-31, filing date 2018-05-23 — Costume production support device

Applications Claiming Priority (2)

JP2017-107914 — priority date 2017-05-31
JP2017107914

Publications (1)

WO2018221334A1 (en)

Family

ID=64455776

Family Applications (1)

PCT/JP2018/019775 — WO2018221334A1 (en) — priority date 2017-05-31, filing date 2018-05-23 — Clothing manufacturing support system

Country Status (2)

JP (2) — JP6579538B2 (en)
WO (1) — WO2018221334A1 (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
JP2001191275A * — priority 1999-10-29, published 2001-07-17 — Sony Corp — Robot system, exterior, and robot device
JP2001250045A * — priority 1999-12-30, published 2001-09-14 — Sony Corp — System and method for purchase, device and method for receiving order, data selling substituting system, device and method for selling data, and computer program
JP2003001582A * — priority 2001-06-21, published 2003-01-08 — Kenji Fuse — Costume for robot
JP2007307628A * — priority 2006-05-16, published 2007-11-29 — Murata Mach Ltd — Service robot and its selling system
WO2008007588A1 * — priority 2006-07-12, published 2008-01-17 — Konami Digital Entertainment Co., Ltd. — Image display device and image display program

Cited By (3)

* Cited by examiner, † Cited by third party
JP7484376B2 — priority 2020-04-22, published 2024-05-16 — セイコーエプソン株式会社 — ROBOT SYSTEM AND ROBOT CONTROL METHOD
CN112099775A * — priority 2020-09-15, published 2020-12-18 — 上海岭先机器人科技股份有限公司 — Method for coding garment manufacturing process flow
CN112099775B * — priority 2020-09-15, published 2023-09-01 — 上海岭先机器人科技股份有限公司 — Method for coding clothes making process flow

Also Published As

JPWO2018221334A1 (en) — published 2019-11-07
JP2020007695A (en) — published 2020-01-16
JP6579538B2 (en) — published 2019-09-25


Legal Events

121 — EP: the EPO has been informed by WIPO that EP was designated in this application (ref document number 18809511; country of ref document: EP; kind code of ref document: A1)
ENP — Entry into the national phase (ref document number 2019522151; country of ref document: JP; kind code of ref document: A)
NENP — Non-entry into the national phase (ref country code: DE)
122 — EP: PCT application non-entry in European phase (ref document number 18809511; country of ref document: EP; kind code of ref document: A1)