WO2003019452A1 - Method and system for developing intelligence of robot, method and system for educating robot thereby - Google Patents


Info

Publication number
WO2003019452A1
WO2003019452A1 (PCT/KR2002/001599)
Authority
WO
WIPO (PCT)
Prior art keywords
robot
command
educatee
allowing
management server
Prior art date
Application number
PCT/KR2002/001599
Other languages
French (fr)
Inventor
Kyung-Chul Shin
Seong-Ju Park
Jong-Hyun Kim
Shin Kim
Sung-Ho Kim
Original Assignee
Yujin Robotics, Co.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yujin Robotics, Co. filed Critical Yujin Robotics, Co.
Publication of WO2003019452A1 publication Critical patent/WO2003019452A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/004 Artificial life, i.e. computing arrangements simulating life
    • G06N3/008 Artificial life, i.e. computing arrangements simulating life based on physical entities controlled by simulated intelligence so as to replicate intelligent life forms, e.g. based on robots replicating pets or humans in their appearance or behaviour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/20 Education
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Definitions

  • the present invention relates to a method and system for developing intelligence of a robot using a computer network, and more particularly to a method and system for developing intelligence of a robot using a computer network, the method and system being capable of controlling a highly intelligent robot's action by communicating information with a server system at a remote site.
  • In the 1990s, robot-related technologies shifted from industrial to non-industrial robots and advanced significantly, but many problems remained in the commercialization of robots.
  • a robot should receive a given command, identify a peripheral environment, perform a decision-making process by itself, and have artificial intelligence to carry out the above-described functions.
  • Artificial intelligence refers to a computer's capacity to carry out human-like actions and to exercise abilities such as learning or independent decision-making.
  • artificial intelligence is widely used in fields of computer games, mathematical verification, computer vision, voice recognition, natural language recognition, expert systems, robot engineering, fabrication automation, etc.
  • a system of the artificial intelligence technologies stores all knowledge and inference information, configures a model of a symbolic environment based on the stored information, and interprets situations according to the configured model.
  • the above-described artificial intelligence system has a drawback in that a programmer should predict all situations which may be encountered, and create a program based on the predicted situations. Thus, when an unpredicted event occurs, the system cannot operate appropriately.
  • the artificial intelligence technologies are studied in biomechanical and neuroscience fields to develop artificial intelligence systems, which can carry out functions corresponding to human abilities.
  • the conventional artificial intelligence system does not yet reach the intelligence of an animal, but it is expected that an improved artificial intelligence system will be put to practical use within the next several decades.
  • the present invention has been made in view of the above problems, and it is an object of the present invention to provide a method and system for developing intelligence of a robot, the method and system being capable of enabling a robot to take actions in response to a command received from a server or a manager through a computer network when an unpredicted event occurs, whereby a user feels as if the robot has intelligence similar to that of a human.
  • a system for developing intelligence of a robot comprising: a management server for storing and managing data defining a robot's actions associated with various situations, newly updating action definition data when a new situation is encountered, and managing the newly updated data; a freely moving robot connected to the management server through a radio Internet for searching for an appropriate action definition associated with a recognized command from an action definition database of the robot and another action definition database of the management server connected through a communication network when a command is generated by a decision-making process of the robot or an outside source, and taking actions based on the action definition; and a communication network for enabling data communication between the management server and the robot; a robot performing a connection with the management server, searching for the appropriate action definition from data of the management server, receiving data of the searched action definition, and taking actions based on the data of the searched action definition, if an internally programmed action definition does not exist in the robot when a new command is generated.
  • a command given to the robot may comprise: a basic command for defining one action to be performed by the robot; and an extended command based on the basic command and having a plurality of basic commands defined according to the robot's states associated with one action to be performed by the robot.
  • These commands may be realized by a script language. This command system enables a user to readily operate the robot.
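The basic/extended command scheme above can be sketched in code. The following is a minimal, hypothetical model (the class and field names are illustrative, not from the patent): an extended command maps robot states to sequences of basic commands, so one logical action such as "greet" resolves differently depending on what the robot is currently doing.

```python
from dataclasses import dataclass, field

@dataclass
class BasicCommand:
    """One primitive action, e.g. 'say' or 'raise_arm', with parameters."""
    action: str
    params: dict = field(default_factory=dict)

@dataclass
class ExtendedCommand:
    """Maps robot states to basic-command sequences for one logical action."""
    name: str
    by_state: dict  # state name -> list of BasicCommand

    def resolve(self, state: str) -> list:
        # Fall back to a 'default' branch when the current state is unlisted.
        return self.by_state.get(state, self.by_state.get("default", []))

greet = ExtendedCommand(
    name="greet",
    by_state={
        "idle": [BasicCommand("raise_arm"), BasicCommand("say", {"text": "Hello"})],
        "moving": [BasicCommand("stop"), BasicCommand("say", {"text": "Hello"})],
        "default": [BasicCommand("say", {"text": "Hello"})],
    },
)
```

A script language realizing such commands would then only need to name the extended command; the state-dependent expansion into basic commands happens inside the robot.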
  • the system may further comprise: at least one manager terminal connected to the management server for outputting data of the robot transferred from the management server, and transferring data inputted from a manager to the robot through the management server; the management server searching for a connectable manager terminal, transferring state information of the robot to a corresponding manager terminal, and transferring data of the action definition inputted from the manager to the robot, if action definition data associated with a command requested by the robot does not exist in the management server.
  • a response command to be transferred from the manager to the robot may be based on objects and events associated with data of actions to be taken by the robot and moving pictures, and a set of objects associated with one command may be transferred to the robot; and the robot may take actions after receiving all data of the command. Accordingly, when the robot executes the command, it may automatically execute the command without a time delay.
  • the manager terminal may provide a list of command execution and action definition data stored in the management server; and the manager may select action definition data to be provided to the manager terminal from the management server, or directly input the action definition data if data of an appropriate action does not exist in the list.
  • response contents inputted into the manager terminal by the manager may be transferred to the robot in the form of text; and the robot may output speech through a TTS (Text-To-Speech) function in response to the received contents in the form of text.
  • the user may experience the conversation service in a consistent manner, and hence the robot's identity can be maintained.
  • a method for developing intelligence of a robot by connecting the robot, a management server and a manager terminal through a communication network comprising the steps of: a) allowing the robot to generate a predetermined command according to a user's command or internal state change; b) allowing the robot to recognize the command generated at the step a) and search its own database for action definition data associated with the generated command; c) allowing the robot to take actions based on the action definition data if the action definition data exists in the database as a result of the search, and connecting the robot to a management server through a radio network and requesting the management server to search for the action definition data associated with the command if the action definition data does not exist in the robot; d) allowing the management server to search its own database for the action definition data associated with the command transferred from the robot and transmit corresponding action definition data to the robot if the action definition data exists in the database; and e) allowing the robot to receive the data transferred from the management server and execute the action based on the received data.
  • the method may further comprise the steps of: f) transferring the request of the robot to the manager terminal if the action definition data associated with a corresponding command does not exist in the management server at the step d); and g) allowing a manager to input, into the manager terminal, contents of actions to be performed by the robot in relation to the command transferred from the robot, and allowing the manager terminal to transfer the inputted contents to the robot through the management server; the robot taking actions associated with the command provided from the management server or the manager terminal. Accordingly, a user feels as if the robot has intelligence similar to that of a human.
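Steps a) through e), together with the manager fallback of steps f) and g), amount to a tiered lookup: the robot's own database first, then the management server's, then a human manager. A minimal sketch, assuming plain dictionaries for the databases and a callable standing in for the manager terminal (all names are hypothetical):

```python
def find_action(command, local_db, server_db, ask_manager):
    """Resolve an action definition: local DB -> server DB -> human manager."""
    if command in local_db:           # steps b/c: robot's own database
        return local_db[command]
    if command in server_db:          # step d: management server database
        action = server_db[command]
        local_db[command] = action    # cache locally so the robot 'learns' it
        return action
    action = ask_manager(command)     # steps f/g: manager supplies a definition
    server_db[command] = action       # server stores the new situation
    local_db[command] = action
    return action

local = {"sit": "fold_legs"}
server = {"dance": "wave_arms;spin"}
resolved = find_action("dance", local, server, lambda c: "ask_user")
```

Each fallback level also writes the result back to the levels below it, which is one way to read the patent's claim that the system "develops" the robot's intelligence over time.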
  • a method for developing intelligence of a robot by connecting the robot, a management server and a manager terminal through a communication network comprising the steps of: allowing a user to transfer a 1:1 conversation command to the robot; allowing the robot to recognize the 1:1 conversation command from the user and transfer a 1:1 conversation request to the management server; allowing the management server to search for a connectable manager if it receives the 1:1 conversation request from the robot, transfer the 1:1 conversation request of the robot to a corresponding manager terminal, and establish a communication path between the manager terminal and the robot; and allowing the management server to collect voice and video data of a user and transmit the collected voice and video data to the manager terminal if a one-to-one connection between the robot and the manager terminal has been established, and allowing the manager to identify a state of the user through the manager terminal and transfer a command as a response to the robot. Accordingly, a 1:1 conversation service may be naturally provided through the robot.
  • the method may further comprise the steps of: allowing the management server to provide a list of commands and action definitions to the manager terminal and allowing the manager to search for the list through the manager terminal and select an appropriate corresponding command if the appropriate corresponding command exists in the list; allowing the manager to transfer the corresponding command selected by the manager to the robot, and allowing the robot to take actions based on the action definition data; and allowing the manager to directly input a corresponding command if an appropriate corresponding command does not exist in the list provided from the management server and transmit the inputted command to the robot, and allowing the robot to take actions based on the inputted command.
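The server's role in setting up the 1:1 conversation, finding a connectable manager and pairing it with the requesting robot, might be sketched as follows (a toy model; the availability list and the session structure are illustrative assumptions, not from the patent):

```python
def route_conversation(robot_id, managers):
    """Pair the requesting robot with the first connectable manager.

    `managers` is a hypothetical list of (manager_id, available) pairs;
    returns a session dict, or None when no manager can take the request.
    """
    for manager_id, available in managers:
        if available:
            return {"robot": robot_id, "manager": manager_id}
    return None

# The server would then relay the user's voice/video data over this session.
managers = [("mgr-1", False), ("mgr-2", True), ("mgr-3", True)]
session = route_conversation("robot-7", managers)
```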
  • a method for allowing an educator to educate a specific educatee using a freely moving robot for transmitting and receiving data through a computer network and processing the data to organize information, and a server connected to the robot through the network comprising the steps of: allowing the robot to track a position of the educatee and move to an area where the educatee can be educated; if the educatee is in the area where the educatee can be educated, connecting the robot to the server and transmitting a specific identification code of the robot to the server; allowing the server to download, to the robot, educational contents corresponding to the identification code transmitted from the robot, search for a connectable educator, and connect the robot with a corresponding educator terminal; and allowing the robot to execute the downloaded educational contents and educate the educatee.
  • the method may further comprise the step of: determining whether information for requesting an educator to educate the educatee is contained in the educational contents; and if information for requesting an educator to educate the educatee is contained in the educational contents, connecting the robot with the educator through the network.
  • the method may further comprise the steps of: allowing the robot to determine whether an action command is contained in the educational contents; and if an action command is contained in the educational contents, allowing the robot to analyze a corresponding action command and perform a specific motion based on the action command.
  • the method may further comprise the steps of: allowing the educatee to input the educatee's request such as a specific question to the robot; allowing the robot to transmit the educatee's request to an educator terminal; if the educator inputs a response to the educatee's request to the educator terminal, allowing the educator terminal to transmit the response to the robot; and allowing the robot to output the response to the educatee.
  • the method may further comprise the steps of: if an education schedule based on the educational contents is terminated, transmitting an education performance result to the server; allowing the server to store the education performance result and output the education performance result to the educator terminal; allowing the educator to perform an evaluation based on the education performance result and input the evaluation information into the server; and allowing the server to output the evaluation information to the educatee through the robot.
  • the method may further comprise the step of: allowing the server to perform an evaluation based on the education performance result by itself and output the evaluation information to the educatee.
  • a system for performing dynamic education using a robot comprising: a freely moving intelligent robot for transmitting and receiving data through a computer network and processing the data to organize information; and a learning management computer for providing educational contents to the intelligent robot through the computer network; the intelligent robot outputting the educational contents provided from the learning management computer to an educatee using speech, images and motions, and the educatee expressing his or her own intention to the learning management computer through the intelligent robot.
  • the system may further comprise: an educator terminal connected to the robot through the computer network for enabling an educator to transfer an education command to the educatee in response to state information transferred from the intelligent robot; the learning management computer connecting the robot and the educator terminal in which the information can be transmitted and received.
  • the system may further comprise: an RF (Radio Frequency) module carried by the educatee for transmitting a position-tracking signal; and a position tracking device installed in the robot for tracking a position of the educatee by transmitting and receiving the position-tracking signal to and from the RF module carried by the educatee, so that the robot moves to the tracked position of the educatee.
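The patent does not specify how the RF position-tracking signal is converted into motion; one plausible sketch is signal-strength gradient following, where the robot probes neighbouring positions and moves toward the strongest beacon reading. The grid world and the quadratic signal model below are illustrative assumptions only:

```python
def step_toward_beacon(position, rssi_at):
    """One tracking step: probe the four neighbouring cells and move to the
    one with the strongest beacon reading (a gradient-following assumption;
    the patent only states that an RF position-tracking signal is exchanged)."""
    x, y = position
    candidates = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    return max(candidates, key=rssi_at)

# Toy signal model: strength falls off with squared distance to the educatee.
educatee = (5, 3)

def rssi(p):
    return -((p[0] - educatee[0]) ** 2 + (p[1] - educatee[1]) ** 2)

pos = (0, 0)
for _ in range(20):
    if pos == educatee:
        break
    pos = step_toward_beacon(pos, rssi)
```

A real implementation would work with noisy received-signal-strength samples rather than an exact distance function, but the greedy structure of the loop is the same.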
  • the robot may further comprise: an image processing device for capturing an image of the educatee, digitalizing a captured signal and transferring the digitalized captured signal to the educator terminal, whereby the educator observes a learning state of the educatee at any time, thereby performing efficient learning management.
  • a method for allowing an educator to educate a specific educatee using an intelligent freely moving robot for transmitting and receiving data through a communication network and processing the data to organize information, and a server for providing various educational information to the robot comprising the steps of: allowing the educatee to input a learning request into a computer through the robot; downloading educational contents corresponding to the learning request to the robot; and allowing the robot to output the educational contents to the educatee through speech, images and motions.
  • the method may further comprise the steps of: connecting the robot and a specific educator through the network in response to the learning request; allowing the educatee to input the educatee's request including a question into the robot; outputting the educatee's request to a corresponding educator; inputting the educator's response based on the educatee's request; and allowing the robot to output the educator's response to the educatee through speech, images and motions.
  • the method may further comprise the step of: allowing the robot to track a position of the educatee and move to an area where the educatee can be educated.
  • Fig. 1 is a view illustrating a configuration of a system for developing intelligence of a robot in accordance with the present invention.
  • Fig. 2a is a block diagram illustrating an outline of a configuration of the robot used in the present invention.
  • Fig. 2b is a block diagram illustrating a structure for controlling the robot.
  • Fig. 3 is a block diagram illustrating a detailed configuration of the robot used in the present invention.
  • Fig. 4 is a flow chart illustrating a command processing procedure for developing intelligence of the robot in accordance with the present invention.
  • Fig. 5 is a view illustrating a structure of an extended command used in the robot-intelligence developing system in accordance with the present invention.
  • Fig. 6 is a view illustrating a command in the form of script in accordance with the present invention.
  • Fig. 7 is a table illustrating a set of commands in the form of script in the robot-intelligence developing system in accordance with the present invention.
  • Fig. 8 is a flow chart illustrating a 1:1 conversation service provided through the robot-intelligence developing system in accordance with the present invention.
  • Fig. 9 is a flow chart illustrating a voice and video processing procedure performed in the robot-intelligence developing system in accordance with the present invention.
  • Fig. 10 is a view illustrating a configuration of a dynamic education service system implemented by the robot-intelligence developing system in accordance with the present invention.
  • Fig. 11 is a view illustrating an outline of a configuration of a position-tracking device carried by an educatee in an education system in accordance with the present invention.
  • Fig. 12 is a view illustrating a functional configuration of a learning management server in accordance with the present invention.
  • Fig. 13 is a flow chart illustrating a procedure of tracking a position of an educatee in accordance with a preferred embodiment of the present invention.
  • Fig. 14 is a flow chart illustrating a procedure of downloading education contents and performing a connection with an educator terminal in accordance with the present invention.
  • Figs. 15 and 16 are flow charts illustrating procedures of providing educatees with dynamic educational services using an educating robot.
  • Fig. 1 is a view illustrating a configuration of a system for developing intelligence of a robot in accordance with the present invention.
  • the robot-intelligence developing system includes: a robot 100 having a radio data communication function for being connected to a management server 300 through the radio data communication function when the robot 100 receives a command or encounters an event, which cannot be processed by a program, so that the robot 100 can take appropriate actions in response to a received command; a communication network 200 for providing a data communication path between the robot 100 and the management server 300; the management server 300 for storing command data associated with actions of the robot 100 to be taken in a plurality of situations, receiving data outputted from the robot 100, and recognizing a state of the robot 100 to search for a corresponding command or provide the corresponding command from the manager terminal 400 to the robot 100; and the manager terminal 400 connected to the management server 300 for receiving information of the robot 100 through the management server 300 and enabling a manager to command the robot 100 in response to the information.
  • a radio modem for transmitting and receiving radio data is installed in a terminal of the communication network 200 that is connected to the robot 100.
  • Another terminal is connected to the management server 300, and hardware configurations of the management server 300 and the manager terminal 400 are the same as those of the conventional Internet server and terminal.
  • the robot 100 described above includes: a battery 110 for supplying electrical power so that the robot 100 can freely move; a power supply 120 for supplying appropriate current and voltage to respective components or boards from the electrical power of the battery 110; a higher-order controller 130 for recognizing a state of the robot 100 through a signal inputted from a plurality of sensors or sensing devices mounted in the robot 100 and then generating a command for the robot 100, or/and recognizing an external input command and instructing the robot 100 to take appropriate actions in response to the recognized command; a lower-order controller 140 for controlling various driving devices in real time in response to the command from the higher-order controller 130; and a driver/sensor 150 for enabling driving wheels, arms and a head of the robot 100 to move according to a control signal of the lower-order controller 140, and sensing motions and states of the robot 100.
  • the higher-order controller 130 of the robot 100 includes a network processor 131, a command generator 132, a command processor 133 and a control communicator 134.
  • the network processor 131 connected to the management server 300 located on the Internet 200 notifies the management server 300 of states of the robot 100 and then receives a command for the robot's appropriate action from the management server 300.
  • the command generator 132 generates a command using data inputted from voice or video recognition, a keyboard or buttons, wherein the command is generated according to the external data and a change of a state of the robot 100.
  • a command is also generated on the basis of a change of a state of hardware such as a battery-consumption amount, a motor failure, etc. or a change of an internal state such as a change of a state of software, etc.
  • the command processor 133 compares a priority of the inputted command with priorities of commands being currently executed, and executes the inputted command according to the priority.
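The priority comparison performed by the command processor 133 can be sketched with a standard priority queue (the numeric priority convention, lower number means higher priority, and the class name are assumptions; the patent only states that commands execute according to priority):

```python
import heapq

class CommandProcessor:
    """Minimal priority-based command scheduler."""
    def __init__(self):
        self._queue = []
        self._seq = 0  # tie-breaker keeps FIFO order among equal priorities

    def submit(self, priority, command):
        heapq.heappush(self._queue, (priority, self._seq, command))
        self._seq += 1

    def next_command(self):
        if not self._queue:
            return None
        return heapq.heappop(self._queue)[2]

cp = CommandProcessor()
cp.submit(5, "patrol")
cp.submit(1, "low_battery_return")  # urgent internal-state command jumps ahead
cp.submit(5, "greet_user")
```

With this scheme, an internally generated urgent command (such as a low-battery event) pre-empts ordinary queued commands while equal-priority commands still run in arrival order.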
  • the command received from the command processor 133 is transferred to a controller 142 contained in the lower-order controller 140 through the control communicator 134 and an interface 141, and action control of the robot 100 is performed through the controller 142.
  • the controller 142 transfers an action control signal to a driver 143 of a corresponding device according to an action control command
  • the driver 143 drives a corresponding device 144.
  • a change of a state based on the driving of the device 144 is detected and information of the detected state change is transferred to the lower-order controller 140.
  • an embodiment of the robot 100 is shown in Fig. 3 so that the operation of the robot 100 can be better understood.
  • a detailed configuration of the robot 100 will be described with reference to Fig. 3.
  • the robot 100 basically includes an image processor 380 and a voice processor 390.
  • a controller 310 controls the operations of a storage unit 320, a motion controller 360, a motion effector 365, a key input unit 350, a communication unit 340 and an RF (Radio Frequency) module 330.
  • the image processor 380 includes an image recognizer 386 made up of a CCD (Charge-Coupled Device) camera.
  • the image recognizer 386 captures external images, and then transmits the captured images to the image processor 380.
  • the image processor 380 recognizes external situations from the images captured by the image recognizer 386 or contents of images transferred through an external telephone line on the basis of a captured signal from the image recognizer 386 or image information supplied from the communication unit 340, and then transmits a result of the recognition as an image recognition signal to the controller 310.
  • Under the control of the controller 310, the image processor 380 performs predetermined signal processing in relation to the captured signal from the image recognizer 386, and then transmits an image signal to the communication unit 340.
  • the voice processor 390 includes a microphone 396 and a speaker 395.
  • the microphone 396 collects external sounds such as sounds from a user and then transmits the collected sounds to the voice processor 390.
  • the voice processor 390 recognizes meanings of words of the collected sounds from the microphone 396 and words transferred through the external telephone line on the basis of the collected sounds supplied from the microphone 396 or voice information supplied from the communication unit 340, and then transmits a result of the recognition as a voice recognition signal to the controller 310 and the communication unit 340. Further, the voice processor 390 generates synthesized voice under the control of the controller 310 and then transmits the synthesized voice as a voice signal to the speaker 395 or the communication unit 340.
  • the communication unit 340 includes a radio modem or the like, so that the controller 310 can communicate with an external device through a telephone line, an Internet leased line, etc.
  • the key input unit 350 recognizes command contents on the basis of a code number inputted from the user, and then transmits a result of the recognition as a key input signal to the controller 310.
  • the motion controller 360 analyzes a motion control signal transferred from the controller 310 to generate a motion-driving signal, and then moves the robot 100 in forward, backward, left or right directions or moves arms and legs of the robot 100 in a predetermined manner.
  • the RF module 330 generates a position-tracking signal having a user- identification code, which can be recognized by the robot 100, and then externally transmits the position-tracking signal through the antenna 335. Moreover, the RF module 330 processes an external position-tracking signal received from the antenna 335, and then transmits the processed signal to the controller 310.
  • the storage unit 320 is a kind of auxiliary storage unit such as a hard disk, etc., and converts data transmitted from the management server 300 into a form capable of being recognized by the user.
  • the storage unit 320 includes: a browser program for displaying the converted data through the image display unit 385; contents downloaded from the management server 400; a program contained in the above-described contents; a robot control program for controlling the robot 100 according to a schedule; a position-tracking program for tracking a position of an educatee; and an OS (Operating System).
  • the sensing unit 370 includes a plurality of sensors, and senses a state and a peripheral environment of the robot 100.
  • the controller 310 analyzes a peripheral environment or detects the existence of a motion, the existence of a failure, the existence of a user or external command, a state of learning of the user and a pose and position of the robot 100 on the basis of a voice recognition signal, an image recognition signal, a reception information signal, a key input signal and a sensor signal inputted from the voice processor 390, the image processor 380, the communication unit 340, the key input unit 350, the RF module 330 and the sensing unit 370.
  • the controller 310 decides actions of the robot 100 on the basis of the robot control program in response to a result of the analysis, and moves the head, arms or legs of the robot 100 in upward, downward, left or right directions by driving the motion effector 365 on the basis of a result of the decision, or controls an operation or action of the robot 100 such as the walking of the robot 100 by driving the legs of the robot 100.
  • the controller 310 externally transfers voice based on voice information received through the communication unit 340 or synthesized voice generated by the voice processor 390 through the speaker 395, controls the image display unit 385 to display an image based on external image information received through the communication unit 340 or an image generated by the image processor 380, or transmits, to the RF module 330, a position-tracking signal to be externally transmitted.
  • the controller 310 transmits command information generated from its own device to the learning management server 400 or the educator terminal.
  • the controller has a structure including a CPU (Central Processing Unit).
  • the robot 100 configured as described above receives a command from the key input unit 350 or the voice processor 390, or determines whether a definition of action of the robot 100 based on a predetermined command exists in an internal program when a state of the robot 100 is sensed through the sensors and the predetermined command is then generated on the basis of that state. At this time, if the action of the robot 100 associated with the command is defined in the robot 100, the robot 100 performs the action based on a corresponding definition.
  • the robot 100 performs a connection with the management server 300 to search a database of the management server 300 for the action definition, receive the searched action definition and perform the action based on the received action definition.
  • if the user of the robot 100 transmits a 1:1 conversation request to the robot 100, the robot 100 is connected to the server 300 through the Internet 200, and transfers voice and video data of the user after transferring the 1:1 conversation request to the server 300. Further, the server 300 checks the 1:1 conversation request and then connects the robot 100 to a manager terminal 400. At this time, the voice and video data transferred from the robot 100 are transferred to the corresponding manager terminal 400, and the manager inputs conversation information or a command after viewing and hearing the transferred voice and video data.
  • The manager's command contents are transmitted from the manager terminal 400 to the robot 100 through a predetermined method, and the robot 100 analyzes the transmitted command, and outputs the voice or video data or executes the command.
  • the user can naturally perform a 1:1 conversation with the robot 100.
  • Fig. 4 is a flow chart illustrating the command processing procedure of the robot 100.
  • the robot 100 uses voice of the user as a main command source, but can use data inputted from another device, for example, a keyboard or buttons, as a command source. Further, the robot 100 can receive commands through various methods, including a method of generating an internal command according to an internal state of the robot 100.
  • the robot 100 searches a database mounted in an internal controller for data of the command (S402). At this time, if data of the command exists in the database (S403), the robot 100 executes the command according to the searched data (S404). On the other hand, if data of the command does not exist in the database of the robot 100, the network processor 131 is driven, the robot 100 is connected to the server 300 (S405), and the robot 100 searches a database of the server 300 for data of the command (S406).
  • the robot 100 receives data of the command from the server 300 (S408), and executes a corresponding command (S404).
  • the robot 100 outputs a message indicating that the command cannot be executed (S409).
  • the robot 100 can process a greater number of commands than are contained in its own database.
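The lookup fallback described above (and in Fig. 4) can be sketched as follows. This is an illustrative sketch only: the dictionary "databases", the class name, and the step of caching a fetched definition locally are assumptions, not the patent's implementation.

```python
class CommandProcessor:
    """Illustrative sketch of the Fig. 4 lookup flow: own database first,
    then the server's database, then a failure message."""

    def __init__(self, local_db, server_db):
        self.local_db = local_db      # commands the robot already knows
        self.server_db = server_db    # stands in for the server 300 database

    def process(self, command):
        action = self.local_db.get(command)           # search own database (S402, S403)
        if action is None:
            action = self.server_db.get(command)      # connect to server and search (S405-S407)
            if action is None:
                return "command cannot be executed"   # output failure message (S409)
            self.local_db[command] = action           # store the downloaded definition locally
        return "executing: " + action                 # execute the command (S404)
```

In this sketch anything fetched once becomes local, which mirrors how the robot's abilities are extended by downloading new action definitions from the server.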
  • the commands given to the robot 100 are implemented on the basis of a script language.
  • the robot 100 can readily implement various actions in various environments and a command can readily be extended.
  • the command given to the robot 100 includes a basic command and an extended command based on the basic command.
  • the basic command is given to the controller of the robot 100, and one basic command corresponds to one action to be performed by the robot 100.
  • the extended command exists for the case where the robot 100 takes different actions in different states of the robot 100.
  • For example, if a basic command is a movement command, the robot 100 performs different movements when the battery is almost exhausted and when battery power is sufficient.
  • An extended command includes one basic command if internal states are the same as each other. However, if internal states are different, the extended command includes the number of basic commands corresponding to the number of different internal states.
  • Fig. 5 is a view illustrating a structure of the extended command.
  • the extended command includes an extended command ID, a keyword for voice recognition for recognizing a voice command of the user, priority for command processing, basic commands corresponding to the number of internal states of the robot 100, and extended-command information.
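The Fig. 5 layout can be sketched as a minimal data structure. The field names, types, and the state-keyed mapping of basic commands are illustrative assumptions; the patent does not specify an encoding.

```python
from dataclasses import dataclass, field

@dataclass
class ExtendedCommand:
    """Sketch of the Fig. 5 extended-command record; names are illustrative."""
    command_id: int                  # extended command ID
    keyword: str                     # keyword for voice recognition
    priority: int                    # priority for command processing
    basic_commands: dict = field(default_factory=dict)  # internal state -> basic command
    info: str = ""                   # extended-command information

    def resolve(self, internal_state):
        """Pick the basic command that matches the robot's internal state."""
        return self.basic_commands[internal_state]

# the movement example from the text: different motions per battery state
move = ExtendedCommand(1, "move", priority=5, basic_commands={
    "battery_low": "move_slowly",
    "battery_ok": "move_normally",
})
```

As described above, one extended command holds one basic command per distinct internal state, so `resolve` picks the motion appropriate to the current state.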
  • Fig. 6 illustrates an exemplary command in the form of script in accordance with the present invention.
  • the robot should provide desired services to the user in addition to simple movements.
  • a unit of a new command should be configured by a plurality of commands so that one desired service can be provided to the user. That is, the unit consisting of a plurality of extended commands is needed according to commands and environments of the user.
  • Fig. 6 shows an exemplary command using the script language.
  • One extended command can correspond to one script function, and actions of the robot 100 can be defined using script functions.
  • a script can correspond to a basic command, and an expert can use basic commands to define motions of the robot in detail.
  • actions of the robot associated with logical scripts of "if, "switch", “repeat”, etc. can be readily programmed.
  • Various types of scripts are shown in Fig. 7.
  • the robot's actions configured by combining a series of actions can be readily programmed using scripts shown in Fig. 7.
  • Fig. 6 shows the exemplary command in the form of script for simple conversation between the user and the robot.
  • a name of this command is defined as "Hello", which is stored as a function. While a function of "Hello” is executed, a function of "Move” is called. Here, it is assumed that the function of "Move” is defined as a function for moving the robot toward the user. If the function of "Hello” is executed, the function of "Move” is called, and the robot moves toward the user and says “How are you?" to the user through TTS (Text-To-Speech) scripts. The robot then receives a reply of "Fine” or "Not fine” through REPLY scripts.
  • a temporary parameter "a” is stored as “1”. Further, if the robot receives a reply of "Not fine”, the temporary parameter "a” is stored as "2". If a value of the temporary parameter "a” is "1”, a message of "May I help you?" is transferred to the user through switch scripts. Further, if a value of the temporary parameter "a” is "2", MoveHome scripts are called.
  • Because "MoveHome" is defined as a function for returning the robot to an origin position, the robot returns to the origin position if it receives a reply of "Not fine" from the user.
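Rendered in Python rather than the patent's script language, the "Hello" flow described above might look like this. The `robot` interface is hypothetical; its methods stand in for the Move, TTS, REPLY, switch and MoveHome scripts.

```python
def hello(robot):
    """Python rendering of the 'Hello' script in Fig. 6 (illustrative)."""
    robot.move_to_user()              # the 'Move' function is called
    robot.say("How are you?")         # TTS script
    reply = robot.listen()            # REPLY script
    a = 1 if reply == "Fine" else 2   # temporary parameter 'a'
    if a == 1:                        # switch script on 'a'
        robot.say("May I help you?")
    else:
        robot.move_home()             # 'MoveHome' returns the robot to the origin
```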
  • an operation procedure of the robot is performed in a manner similar to the command processing procedure of Fig. 4 described above.
  • the higher-order controller 130 of the robot 100 recognizes a voice command from the user, and then sends a result of the recognition to the command processor 133.
  • the command processor 133 searches its own database for the data of the voice command. At this time, if data of the voice command is found, the command processor 133 enables the robot to execute the corresponding command. Otherwise, the command processor 133 performs a connection with the server 300 through the network processor 131 to search for a script command corresponding to the voice command.
  • action of the robot is defined in the server 300 in relation to a given command
  • the robot 100 downloads data of the definition of the action of the robot 100 from the server 300, and the actions of the robot 100 are controlled by the downloaded scripts.
  • the robot 100 periodically performs a connection with the server 300, downloads scripts of a new action definition from the server 300, and stores the downloaded scripts in its own database, thereby extending and changing abilities of the robot 100.
  • the system performs a connection with the server 300 and then receives a command associated with the action of the robot 100 from the server 300.
  • the server 300 outputs an error message indicating that the robot cannot take an appropriate action corresponding to the command. At this time, there is a problem in that the robot 100 cannot provide a desired service to the user.
  • the server 300 transmits the command transmitted from the robot 100 to the manager terminal 400 in order to request the manager terminal 400 to give a response, without transmitting the error message to the robot 100.
  • At least one manager is located at the manager terminal 400. After the manager identifies the request from the server 300, the manager provides, to the server 300 through the manager terminal 400, an appropriate response (data or a message associated with an action of the robot 100) corresponding to the command transmitted from the server 300.
  • the server 300 transfers data of the definition associated with the action of the robot 100 from the manager terminal 400 to the robot 100, and the robot 100 executes a given command from the manager.
  • the robot 100 can follow various commands, and the user is impressed by the execution abilities of the robot 100.
  • the system can provide the 1:1 conversation service to the user.
  • Fig. 8 is a flow chart illustrating a procedure of providing the 1:1 conversation service to the user through the system.
  • a command recognition device of the robot 100 recognizes the command (S802), and transfers the recognized 1:1 conversation command to the command processor 133.
  • the command processor 133 performs a connection with the server 300 through the network processor 131, and then transfers the 1:1 conversation command to the server 300 (S804). If the server 300 receives the 1:1 conversation request, the server 300 identifies a manager capable of accessing the robot 100, transfers the 1:1 conversation request of the robot 100 to the manager, and connects the terminal 400 of the manager to the robot 100 (S805).
  • the server 300 transfers the 1:1 conversation request of the robot 100 to a plurality of manager terminals 400.
  • the server 300 can connect the robot 100 with whichever manager terminal 400 gives a response to the request.
  • the server 300 transmits an error message, and the robot 100 outputs a command-execution-impossible message and terminates the 1:1 conversation procedure (S806 and S807).
  • the server 300 connects the robot to the manager terminal 400 (S808).
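Steps S805 to S808 can be sketched as a simple selection loop. The "first available manager" policy and the dictionary shapes below are assumptions; the patent only states that the request may go to several manager terminals and the robot is connected to a responding one.

```python
def connect_one_to_one(managers):
    """Forward the 1:1 request to the manager terminals and connect the
    robot to one that can respond; otherwise report an error (S806-S808)."""
    for manager in managers:
        if manager.get("available"):
            return {"status": "connected", "manager": manager["name"]}  # S808
    return {"status": "error",
            "message": "command execution impossible"}                  # S806, S807
```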
  • the robot 100 connected to the manager performs voice and video data compression to efficiently send voice and video data of the user in real time.
  • a codec for voice and video is loaded and executed. If this procedure is completed, a syntax analyzer is executed to analyze voice of the user and perform a command execution function, a function of terminating the 1:1 conversation or a function of automatically repeating the conversation. If a termination command is called in the syntax analyzer, the 1:1 conversation is terminated.
  • the robot 100 obtains video and voice data of the user from the camera and the microphone mounted therein and then transmits the obtained video and voice data to the manager terminal 400.
  • the manager identifies a state of the user from the transmitted image and voice data and then inputs a reply.
  • the reply of the manager is transferred to the robot 100 through the manager terminal 400 and the management server 300, and the robot 100 outputs sound or takes motion according to the reply (S809 to S813).
  • Because the robot 100 transfers the video and voice data of the user so that the manager can better understand a state of the user, the manager can correctly instruct the robot 100. That is, the manager receives the video and voice of the user through the manager terminal 400 in an image-chatting method, and instructs the robot 100 to output an appropriate sound or perform an appropriate motion.
  • the conversation between the user and the robot 100 may not be performed naturally.
  • serial communications between the robot 100 and the manager terminal 400 are performed through the Ethernet on the basis of TCP/IP (Transmission Control Protocol/Internet Protocol).
  • the robot 100 may not be controlled or the user may not understand what the robot 100 is doing.
  • the robot 100 should simultaneously output sound or make a facial expression while taking the action. While the robot 100 displays a moving picture or animation on the screen, the robot 100 should simultaneously take actions.
  • If the robot 100 is directly controlled through a communication network, the robot 100 cannot operate appropriately because of a communication delay.
  • the robot 100 can perform natural motions, irrespective of the communication delay.
  • When the manager controls the robot 100 through this method, the user feels as if he is talking with another human while talking with the robot 100.
  • the robot 100 receives conversation contents from the manager, and the manager decides conversation contents on the basis of voice and video information identified through the manager terminal 400.
  • the server 300 simply selects the conversation contents corresponding to command data to be provided to the manager terminal 400.
  • stored voice data can be outputted from the server 300.
  • the conversation contents selected by the manager are transferred to the robot 100 in the form of text, and hence the robot's speech is transferred to the user through a TTS function of the robot.
  • the manager directly inputs the conversation contents through the manager terminal 400 to transmit the inputted contents in the form of text to the robot 100 so that the robot's speech can be outputted.
  • the robot-intelligence developing system can be used for various fields such as management, education, etc.
  • a program for education can be programmed in the robot 100
  • data of action definitions associated with various commands can be stored in the server capable of managing educational materials.
  • the robot 100 can receive corresponding educational materials from the user and educate the user.
  • the robot 100 can be used as a conversation partner of the elderly through the 1:1 conversation service and provide an elderly-person management service.
  • Fig. 10 is a view illustrating a configuration of a dynamic education service system implemented by the robot-intelligence developing system in accordance with a preferred embodiment of the present invention.
  • the system includes an educating robot 100a, an educator terminal 400a and a learning management server 300a, which are connected through a network 200 such as the Internet. Further, an educatee 600 notifies the educating robot 100a of a position of the educatee 600 through the RF module 610.
  • the learning management server 300a is connected to the specific educatee 600 and an educator 500 through a computer network such as the Internet for communication.
  • the learning management server 300a is a computer system for providing predetermined educational contents.
  • the learning management server 300a will be described in detail with reference to Fig. 12.
  • the educating robot 100a downloads educational contents from the learning management server 300a, and outputs speech, images or motions based on the educational contents to the educatee 600.
  • the educating robot 100a performs an interface between the educator 500 and the educatee 600 for communication.
  • An internal configuration of the educating robot 100a can be understood by referring to
  • the educator terminal 400a is a personal computer connected to the educating robot 100a through the server 300a and the network 200a, and includes a communication module for data transmission and reception, an OS (Operating System) program, and a Web browser program. Accordingly, the educator terminal 400a outputs voice and image information transmitted from the educating robot 100a to the educator 500, and transmits information inputted from the educator 500 to the learning management server 300a or the educating robot 100a through the network.
  • the network 200a is referred to as the Internet, but the network 200a is not limited to the Internet and includes an intranet, an extranet, a leased line network, etc.
  • the educating robot 100a downloads educational contents from the management server 300a, analyzes the educational contents, educates the educatee through speech, images and motions, transfers voice or image data of the educatee to the educator terminal 400a through the communication unit 340, converts reply information of the educator 500 transmitted from the educator terminal 400a into the speech, images and motions of the educating robot 100a, and transfers the converted speech, images and motions to the educatee 600.
  • the educating robot 100a runs the position-tracking program according to a schedule set by the robot control program, and externally transmits a position-tracking signal through the RF module 330.
  • the educating robot 100a analyzes the position-tracking signal from an external device, tracks a position of the educatee, and moves to a place within an area where the educatee can be educated.
  • the RF module 610 for position tracking carried by the educatee 600 will be described with reference to Fig. 11.
  • the RF module 610 is a device for notifying the educating robot 100a of a position ofthe educatee 600.
  • the RF module 610 is manufactured in the form of a necklace, a bracelet, etc. and can be carried by the educatee 600.
  • an RF reception module 613 of the RF module 610 receives the position-tracking signal transmitted to the outside from the RF module 330 of the educating robot 100a, and transfers the received position-tracking signal to a signal-processing module 611.
  • the signal-processing module 611 analyzes the received position-tracking signal, extracts an educatee-identification code from a result of the analysis, generates a response signal to the position-tracking signal if the extracted identification code indicates a corresponding educatee, and transmits the response signal to the RF transmission module 612.
  • the RF transmission module 612 transmits the response signal to the outside through an antenna (not shown).
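The respond-only-to-your-own-code behaviour of the RF module 610 can be sketched as a small function. The signal format is an assumption; the patent only says the signal carries an educatee-identification code.

```python
def handle_position_signal(signal, my_educatee_code):
    """Sketch of modules 611/612: extract the educatee-identification code
    from the tracking signal and generate a response only when it matches."""
    code = signal.get("educatee_id")          # signal-processing module 611
    if code == my_educatee_code:
        return {"type": "response", "educatee_id": code}   # via RF tx module 612
    return None                               # signal meant for another educatee
```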
  • Fig. 12 shows a functional configuration of the learning management server 300a.
  • the learning management server 300a searches for, from its own database, an action definition associated with a predetermined command requested from the educating robot 100a by performing the communication with the educating robot 100a, or receives information of the action definition from the educator terminal 400a to provide the action definition information to the educating robot 100a.
  • the learning management server 300a is a computer system for providing the educational contents requested by the user, connecting the educating robot 100a to the educator terminal 400a through the network, evaluating a learning state on the basis of learning result information provided from the educating robot 100a, creating a report based on a result of the evaluation, and transferring the report to the educatee.
  • the learning management server 300a is a large-capacity computer system having a CPU, a RAM, a ROM, a network interface, a data storage unit, etc.
  • a conventional personal computer or a conventional workstation with a large-capacity memory and data processing capability can be employed as the learning management server 300a.
  • the learning management server 300a can perform a large number of tasks by executing a large number of mathematical calculations
  • Fig. 12 shows a plurality of databases and program modules loaded in a ROM of the learning management server 300a, and a data storage unit.
  • the learning management server 300a includes program modules comprising a database management module 410, an educational-content providing module 420, an educator connection module 430, a learning evaluation module 440, and a result report creation and transmission module 450, and a database system comprising an educational-content DB (DataBase) 460, an educator information DB 470 and a robot information DB 480.
  • the database management module 410 constructs the educational-content DB
  • the database management module 410 is a program for entirely managing the databases.
  • the educational-content providing module 420 is a program for identifying an educatee on the basis of the specific identification code, determining a state of learning progress of a corresponding educatee, designating educational contents to be provided to a corresponding educating robot, and downloading the designated educational contents to the educating robot 100a through the network, if the educational-content providing module 420 receives a learning request including a specific identification code from the educating robot 100a.
  • the educator connection module 430 is a program for searching the educator information database 470, designating an appropriate educator, and connecting the designated-educator terminal 500 and the educating robot 100a corresponding to the educatee through the network, if an educator connection command or a connection request from the robot 100a is contained in the educational contents.
  • the learning evaluation module 440 is a program for receiving education result information from the educating robot 100a, and storing the education result information in the robot information database 480 or transmitting the education result information to the educator terminal 400a, after the learning based on the educational contents is terminated.
  • the learning evaluation module 440 is a program for evaluating an educatee by itself on the basis of the education result information and storing a result of the evaluation in the robot information database 480.
  • the learning evaluation module 440 is contained in the learning management server 300a in Fig. 12, but the learning evaluation module 440 can be stored in the educational contents of the educating robot 100a. In this case, the result of the evaluation should be uploaded from the robot 100a to the learning management server
  • the result report creation and transmission module 450 is a program for combining a result of its own evaluation and the result of the evaluation transferred from the educator, creating a result report for the educatee, and transmitting the result report to an e-mail address or mobile phone of the educatee.
  • the educational-content database 460 individually stores and manages various educational contents to be downloaded to the educating robot 100a.
  • the database 460 stores education schedule information (or education progress information) of an individual educatee and educational-content information based on the education schedule information.
  • the educator information database 470 stores and manages lists of a plurality of clients 400 and educators currently coupled to the learning management server 300a, personal information (resident registration numbers, addresses, educational history, majors, etc.) of the educators, and schedule information of the educators.
  • the learning management server 300a can determine which educators are capable of being currently connected to the educating robot 100a by identifying the schedule information.
  • the robot information database 480 stores and manages the robot's specific identification code, personal information (a resident registration number, an academic background, an address, a name, etc.) of the educatee, a result of learning evaluation, a result report, a learning level, a state of learning progress, etc.
  • Fig. 13 shows a procedure of causing the robot to move to an area where the educatee can be educated by tracking a position of the educatee.
  • a robot control program embedded in the storage unit 320 is loaded in the controller 310 of the robot and then the controller 310 initializes the program (S1301).
  • the controller 310 determines whether a movement command is contained within robot schedule information (or whether the present time matches a learning time) (S1303).
  • the controller 310 initializes a position estimation (or tracking) module within a position-tracking program (S1305).
  • the position estimation module generates a position estimation (or tracking) signal to be transmitted to the outside, for example, using a PSK (Phase Shift Keying) modulation method, and then transmits the generated position estimation signal to the outside through the RF module 330 and the antenna 335 (S1307).
  • the position estimation signal includes an identification code for identifying an educatee.
  • the received position estimation signal is transferred to the signal-processing module 611 (S1309).
  • the signal-processing module 611 demodulates the inputted position estimation signal, identifies the specific identification code, generates a position estimation (or tracking) response signal if the position estimation signal is determined as a signal for an educatee, and transfers the generated position estimation response signal to the RF transmission module 612 (S1311).
  • the RF transmission module 612 transmits the position estimation response signal to the outside through the antenna (not shown) (S1313).
  • the antenna 335 of the robot transfers the position estimation response signal from the RF transmission module 612 to the RF module 330, and the RF module 330 analyzes the position estimation response signal to transfer a result of the analysis to the position estimation module of the controller 310 (S1315).
  • the position estimation module of the controller 310 calculates a distance between the robot and the educatee and azimuth information using field intensities, a phase difference, or a time delay from the transmitted position estimation signal and the received position estimation response signal (S1317). If the distance and the azimuth are calculated, the position estimation module generates a motion control signal from the calculation values, and transfers the motion control signal to the motion controller 360 (S1319).
  • the motion controller 360 generates a motion-driving signal based on the motion control signal, and transmits the motion-driving signal to the motion effector 365, thereby allowing the educating robot 100a to move toward the user (S1321).
  • steps S1307 to S1321 enable the robot to move to an area where the robot can educate the educatee by tracking a position of the educatee.
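One way the distance computed in S1317 could fall out of the time delay is plain time-of-flight. This is a sketch under stated assumptions: the patent also names field intensity and phase difference as possible inputs, and the processing-delay term below is hypothetical.

```python
SPEED_OF_LIGHT = 3.0e8  # metres per second, approximate

def estimate_distance(t_sent, t_received, t_processing=0.0):
    """Estimate robot-to-educatee distance from the round-trip delay of the
    position estimation signal and its response (time-of-flight sketch)."""
    round_trip = (t_received - t_sent) - t_processing  # subtract tag latency
    return SPEED_OF_LIGHT * round_trip / 2.0           # one-way distance
```

For example, a 200 ns round trip with no tag latency corresponds to a distance of about 30 metres.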
  • If the educatee has been recognized, the controller 310 generates a guide voice message of, for example, "The present time is a lesson time. Are you studying?", and outputs the guide voice message to the educatee through the speaker
  • the controller 310 of the robot forms a connection with the learning management server 300a through the communication unit 340 (S1407).
  • the controller 310 of the robot then reads a specific identification code stored in an internal memory (not shown), and transmits the read identification code to the educational-content providing module 420 (S1409).
  • the educational-content providing module 420 receiving the specific identification code from the educating robot 100a accesses the robot information database 480, and identifies an educatee corresponding to the identification code and a state of learning progress of the educatee (S1411).
  • the educational-content providing module 420 extracts educational contents appropriate for the learning progress state from the educational-content database 460, and downloads the educational contents to the educating robot 100a (S1413).
  • the controller 310 of the robot downloading the educational contents stores the contents in the storage unit 320.
  • the educational contents include multimedia information such as pictures, characters, videos, sounds, voices, etc. and robot-motion control information for controlling the robot's motions.
  • the educational contents can include an active program (i.e., an educational program) for appropriately interworking and executing the multimedia information and the robot- motion control information through the robot.
  • the storage unit of the robot can store an active program for analyzing and executing the multimedia information and the robot-motion control information.
  • the controller 310 of the robot downloading the educational contents from the learning management server 300a analyzes information of the educational contents and determines whether an educator connection command is contained in the information of the educational contents (S1415). At this time, if the educator connection command is contained in the information of the educational contents, the controller 310 of the robot transmits an educator connection request to the educator connection module of the learning management server through the communication unit.
  • the educator connection module 430 receiving the educator connection request from the robot 100a searches the educator information database 470 and determines whether an educator capable of being connected to the robot 100a exists. At this time, if an educator capable of being connected to the robot 100a exists, the educator connection module 430 connects the robot 100a and the educator terminal 400a (S1417). At the above step S1415, the robot and a specific educator are automatically and simultaneously connected to the network at the time of downloading the contents, irrespective of the robot's request.
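The availability check against the educator schedule information described earlier can be sketched as a list scan. The schedule representation (a list of busy times per educator) is an assumption for illustration.

```python
def find_available_educator(educators, lesson_time):
    """Sketch of the educator connection module 430: scan the educator
    information and return the first educator free at the lesson time."""
    for educator in educators:
        if lesson_time not in educator.get("busy_times", []):
            return educator["name"]
    return None   # no educator can currently be connected
```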
  • the controller 310 of the robot analyzes the downloaded and stored educational contents and then carries out the education for an educatee as shown in Figs. 15 and 16 (S1419).
  • if an action command is contained in the contents while the education based on the downloaded educational contents is carried out for the educatee through voice and image data outputted from the speaker 395 and the image display unit 385 (S1501 and S1502), the controller 310 of the robot generates a motion control signal based on the action command, transfers the generated motion control signal to the motion controller 360, and enables the robot 100a to take the action corresponding to the command (S1503 and S1504). That is, the robot 100a carries out the dynamic education through speech, images or gestures by moving its own body in the forward, backward, left or right directions or moving its arms or legs in a predetermined manner. On the other hand, if no action command is contained in the contents, the controller 310 returns to the above step S1501 and continuously carries out the education based on the voices and images (S1502).
  • the controller 310 of the robot determines whether its own program can process the request (S1506). That is, the controller 310 searches its own database for the user request to determine whether an action definition and action performance data exist in the database.
  • the controller 310 gives a response to the user request according to the programming if its own program can process the user request (S1507). That is, the controller 310 enables the robot to output the set speech, images and motions to the educatee.
  • the controller 310 transfers the request of the educatee 600 to the connected management server 300a (S1508).
  • the management server 300a transmits information of the action definition to the robot 100a (S1508).
  • the controller 310 inquires of the educator terminal 400a about the request and receives a response to the request from the educator 500 (S1508 and S1509).
  • Response data for the request of the educatee is transferred to the robot 100a through the network, and the controller 310 of the robot analyzes the response data so that the robot 100a can transfer the speech, images and motions to the educatee in response to a command from the educator.
  • the controller 310 of the robot transfers an education performance result to the learning evaluation module 440 of the learning management server 300a (S1514).
  • the learning evaluation module 440 stores the education performance result in a directory of a corresponding educatee of the robot information database 480, and then transmits the education performance result to the educator terminal 500 (S1515 and
  • the learning evaluation module 440 generates evaluation information for a corresponding educatee, and stores the generated evaluation information in a directory of the corresponding educatee on the basis of the education performance result stored in the robot information database 480 (S1518).
  • the educator receiving the education performance information inputs the evaluation information for the corresponding educatee into the learning evaluation module 440 of the learning management server, and the learning evaluation module stores the inputted evaluation information.
  • in a state in which the automatic evaluation information and the educator's evaluation information have been stored, the result report creation and transmission module 450 generates a result report based on the automatic evaluation information and the educator's evaluation information, and stores the generated result report in a directory of a corresponding educatee of the robot information database 480 (S1519).
  • the result report creation and transmission module 450 transmits the result report to the educatee through an e-mail or a text message of a mobile phone (S1520).
  • a robot-intelligence developing system of the present invention includes a radio modem mounted in a service robot in which various and precise action controls are needed, connects the robot to a robot management server and a manager terminal through the radio modem, and enables the robot to receive information of an action definition from the management server or the manager terminal when the robot cannot process a command or instruction, whereby functions other than those programmed in the robot can be implemented and hence a user feels as if the robot has intelligence similar to that of a human.
  • the robot-intelligence developing system of the present invention performs a one-to-one connection between the robot and the manager terminal, enables the robot to perform a command from a manager on the basis of the one-to-one connection, and enables the robot to perform human-like actions.
  • the robot-intelligence developing system of the present invention enables data communications between the robot, a learning management server, an educator terminal and other devices or units, thereby providing dynamic educational services through the robot.
  • the robot-intelligence developing system of the present invention automatically checks a learning time, tracks a position of an educatee at the learning time and educates the educatee, thereby enabling active learning management.
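The educatee-request handling enumerated above (steps S1505–S1512) reduces to a three-tier escalation: the robot's own program, then the management server, then the educator. The sketch below is a hypothetical Python rendering; the dictionaries stand in for the robot's and server's databases, and the callback stands in for the educator terminal — none of these names are from the patent.

```python
# Hypothetical sketch of the escalation path for an educatee's request
# (steps S1505-S1512). The data structures are illustrative stand-ins.

def answer_request(request, robot_db, server_db, ask_educator):
    """Return (response, source) for an educatee's request."""
    if request in robot_db:                    # S1506/S1507: robot's own program
        return robot_db[request], "robot"
    if request in server_db:                   # S1508: forward to management server
        return server_db[request], "server"
    # S1508/S1509: the server inquires of the educator terminal
    return ask_educator(request), "educator"
```

A request the robot can answer never leaves the robot, which matches the intent that network traffic is incurred only for commands the local program cannot process.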

Abstract

There are disclosed a method and system for developing intelligence of a robot and a method and system for providing an educational service using the same, the methods and systems being capable of connecting the robot of limited memory, intelligence and ability to a server, in which processing and data storage capabilities can be flexibly extended through a communication network, and controlling the robot's actions through the server. That is, the method and system search a management server when a command cannot be processed in the robot, enable the robot to execute the command after receiving an action definition associated with the command, provide various services and can educate a user through the robot using educational contents of downloaded data.

Description

METHOD AND SYSTEM FOR DEVELOPING INTELLIGENCE OF ROBOT,
METHOD AND SYSTEM FOR EDUCATING ROBOT THEREBY
Technical Field
The present invention relates to a method and system for developing intelligence of a robot using a computer network, and more particularly to a method and system for developing intelligence of a robot using a computer network, the method and system being capable of controlling a highly intelligent robot's action by communicating information with a server system at a remote site.
Background Art
Robot-related technologies shifted in the 1990s from industrial robots to non-industrial robots and advanced significantly, but many problems remained in the commercialization of robots.
In particular, to provide necessary services to a user, a robot should receive a given command, identify a peripheral environment, perform a decision-making process by itself, and have artificial intelligence to carry out the above-described functions.
Artificial intelligence means intelligence of a computer capable of carrying out human-like actions and having abilities such as learning or independent decision-making.
Research into artificial intelligence began in the mid-1950s. Now, artificial intelligence is widely used in the fields of computer games, mathematical verification, computer vision, voice recognition, natural language recognition, expert systems, robot engineering, fabrication automation, etc. Like the expert systems, a conventional artificial intelligence system stores all knowledge and inference information, configures a model of a symbolic environment based on the stored information, and interprets situations according to the configured model. However, such a system has a drawback in that a programmer should predict all situations which may be encountered and create a program based on the predicted situations. Thus, when an unpredicted event occurs, the system cannot operate appropriately.
The artificial intelligence technologies are studied in biomechanical and neuroscience fields to develop artificial intelligence systems that can carry out functions corresponding to human abilities. The conventional artificial intelligence system does not yet reach the intelligence of an animal, but it is expected that an improved artificial intelligence system will be put to practical use within the next several decades.
As robots having computers mounted therein are developed with the development of computer technologies, the operations and memories of the robots are improved. However, sensor technologies and collected-information processing technologies remain at a lower level. For example, the conventional robot cannot appropriately recognize an object through a camera mounted in the robot except in limited cases, and cannot completely recognize a natural language in relation to voice recognition. Moreover, deductive or reasoning abilities of the conventional robot are not considered.
When a user commands the robot to perform a task, a manager in the vicinity of the robot should continuously monitor the robot. Moreover, it is impossible to develop a program that considers all situations. Accordingly, a new method is seriously needed so that a service robot can have high intelligence to take appropriate actions in response to the user's commands.
Disclosure of the Invention
Therefore, the present invention has been made in view of the above problems, and it is an object of the present invention to provide a method and system for developing intelligence of a robot, the method and system being capable of enabling a robot to take actions in response to a command received from a server or a manager through a computer network when an unpredicted event occurs, whereby a user feels as if the robot has intelligence similar to that of a human.
It is another object of the present invention to provide a method and system for developing intelligence of a robot, the method and system being capable of controlling motions of the robot by allowing a manager to obtain voice and image data of the robot, thereby developing its intelligence and implementing various services.
It is yet another object to provide a method and system for educating educatees using a robot, the method and system being capable of providing a more realistic education environment to the educatees by storing an educational program in the robot so that it can educate the educatees, and enabling the robot to take actions in response to a command received from a server or a manager through a computer network where an unpredicted event arises.
The other objects and other advantages of the present invention will be more clearly understood from the following detailed description. Further, the objects and advantages of the present invention can be implemented by elements and their combinations disclosed in the accompanying claims.
In accordance with one aspect of the present invention, the above and other objects can be accomplished by the provision of a system for developing intelligence of a robot, comprising: a management server for storing and managing data defining a robot's actions associated with various situations, newly updating action definition data when a new situation is encountered, and managing the newly updated data; a freely moving robot connected to the management server through a radio Internet for searching for an appropriate action definition associated with a recognized command from an action definition database of the robot and another action definition database of the management server connected through a communication network when a command is generated by a decision-making process of the robot or an outside source, and taking actions based on the action definition; and a communication network for enabling data communication between the management server and the robot; the robot performing a connection with the management server, searching for the appropriate action definition from data of the management server, receiving data of the searched action definition, and taking actions based on the data of the searched action definition, if an internally programmed action definition does not exist in the robot when a new command is generated. The intelligence of the robot is not limited by the configurations programmed in the robot, but it may be extended according to data provided by the management server.
Preferably, a command given to the robot may comprise: a basic command for defining one action to be performed by the robot; and an extended command based on the basic command and having a plurality of basic commands defined according to the robot's states associated with one action to be performed by the robot. These commands may be realized by a script language. This command system enables a user to readily operate the robot.
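The basic/extended command structure described above can be sketched as follows. This is a hypothetical Python illustration only; the class names, states and action strings (`BasicCommand`, `raise_arm`, etc.) are invented for the example and are not part of the patent's script language.

```python
# Illustrative sketch: a basic command defines one action; an extended
# command selects among basic commands according to the robot's state.

class BasicCommand:
    """Defines a single action the robot can perform."""
    def __init__(self, name, action):
        self.name = name
        self.action = action            # e.g. "move_forward", "raise_arm"

class ExtendedCommand:
    """Maps robot states to basic commands for one logical action."""
    def __init__(self, name, state_map, default):
        self.name = name
        self.state_map = state_map      # {robot_state: BasicCommand}
        self.default = default          # fallback BasicCommand

    def resolve(self, robot_state):
        """Pick the basic command appropriate for the current state."""
        return self.state_map.get(robot_state, self.default)

# Example: a "greet" command behaves differently while the robot charges.
wave = BasicCommand("wave", "raise_arm")
nod = BasicCommand("nod", "tilt_head")
greet = ExtendedCommand("greet", {"charging": nod}, default=wave)
```

The point of the two-level structure is that the user issues one extended command ("greet") and the robot itself resolves which basic action fits its current state.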
Preferably, the system may further comprise: at least one manager terminal connected to the management server for outputting data of the robot transferred from the management server, and transferring data inputted from a manager to the robot through the management server; the management server searching for a connectable manager terminal, transferring state information of the robot to a corresponding manager terminal, and transferring data of the action definition inputted from the manager to the robot, if action definition data associated with a command requested by the robot does not exist in the management server.
Preferably, a response command to be transferred from the manager to the robot may be based on objects and events associated with data of actions to be taken by the robot and moving pictures, and a set of objects associated with one command may be transferred to the robot; and the robot may take actions after receiving all data of the command. Accordingly, when the robot executes the command, it may automatically execute the command without a time delay.
Preferably, the manager terminal may provide a list of command execution and action definition data stored in the management server; and the manager may select action definition data to be provided to the manager terminal from the management server, or directly input the action definition data if data of an appropriate action does not exist in the list.
Preferably, response contents inputted into the manager terminal by the manager may be transferred to the robot in the form of text; and the robot may output speech through a TTS (Text-To-Speech) function in response to the received contents in the form of text. Accordingly, the user may use a conversation service in a constant manner, and hence identity can be maintained.
In accordance with another aspect of the present invention, there is provided a method for developing intelligence of a robot by connecting the robot, a management server and a manager terminal through a communication network, comprising the steps of: a) allowing the robot to generate a predetermined command according to a user's command or internal state change; b) allowing the robot to recognize the command generated at the step a) and search its own database for action definition data associated with the generated command; c) allowing the robot to take actions based on the action definition data if the action definition data exists in the database as a result of the search, and connecting the robot to a management server through a radio network and requesting the management server to search for the action definition data associated with the command if the action definition data does not exist in the robot; d) allowing the management server to search its own database for the action definition data associated with the command transferred from the robot and transmit corresponding action definition data to the robot if the action definition data exists in the database; and e) allowing the robot to receive the data transferred from the management server and execute the command on the basis of the received action definition data.
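Steps a) through e) amount to a local-then-remote lookup. The sketch below is a minimal, hypothetical rendering of that flow: the dictionaries stand in for the robot's and the management server's action-definition databases, and caching the received definition locally is an assumption inferred from the patent's download-and-store behavior, not a stated step.

```python
# Minimal sketch of the local-then-server action-definition lookup
# (steps a-e). Dict lookups stand in for database searches over a
# radio network; all names are illustrative.

def resolve_action(command, local_db, server_db):
    """Return (action_definition, source) for a command, or (None, None)."""
    # b) search the robot's own database first
    if command in local_db:
        return local_db[command], "robot"
    # c)/d) otherwise request the management server to search its database
    if command in server_db:
        definition = server_db[command]
        # assumption: store the received definition so the robot can
        # execute the same command locally next time
        local_db[command] = definition
        return definition, "server"
    return None, None
```

When neither database holds a definition, the method falls through to the manager terminal, as steps f) and g) below describe.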
Preferably, the method may further comprise the steps of: f) transferring the request of the robot to the manager terminal if the action definition data associated with a corresponding command does not exist in the management server at the step d); and g) allowing a manager to input, into the manager terminal, contents of actions to be performed by the robot in relation to the command transferred from the robot, and allowing the manager terminal to transfer the inputted contents to the robot through the management server; the robot taking actions associated with the command provided from the management server or the manager terminal. Accordingly, a user feels as if the robot has intelligence similar to that of a human.
In accordance with another aspect of the present invention, there is provided a method for developing intelligence of a robot by connecting the robot, a management server and a manager terminal through a communication network, comprising the steps of: allowing a user to transfer a 1:1 conversation command to the robot; allowing the robot to recognize the 1:1 conversation command from the user and transfer a 1:1 conversation request to the management server; allowing the management server to search for a connectable manager if it receives the 1:1 conversation request from the robot, transfer the 1:1 conversation request of the robot to a corresponding manager terminal, and establish a communication path between the manager terminal and the robot; and allowing the management server to collect voice and video data of a user and transmit the collected voice and video data to the manager terminal if a one-to-one connection between the robot and the manager terminal has been established, and allowing the manager to identify a state of the user through the manager terminal and transfer a command as a response to the robot. Accordingly, a 1:1 conversation service may be naturally provided through the robot.
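The server-side pairing step of this 1:1 conversation setup — search for a connectable manager and bind it to the requesting robot — might look like the following sketch. The availability flags, identifiers and function name are invented for illustration.

```python
# Illustrative sketch: the management server scans its manager terminals
# for an available one and pairs it with the robot that requested a
# 1:1 conversation.

def establish_one_to_one(robot_id, managers):
    """Return (robot_id, manager_id) for the first available manager,
    marking that manager busy; return None if no manager is connectable."""
    for manager_id, available in managers.items():
        if available:
            managers[manager_id] = False   # manager now busy with this robot
            return (robot_id, manager_id)
    return None
```

Once the pair is established, the voice and video stream from the robot and the manager's response commands would flow over the returned communication path.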
Preferably, the method may further comprise the steps of: allowing the management server to provide a list of commands and action definitions to the manager terminal and allowing the manager to search for the list through the manager terminal and select an appropriate corresponding command if the appropriate corresponding command exists in the list; allowing the manager to transfer the corresponding command selected by the manager to the robot, and allowing the robot to take actions based on the action definition data; and allowing the manager to directly input a corresponding command if an appropriate corresponding command does not exist in the list provided from the management server and transmit the inputted command to the robot, and allowing the robot to take actions based on the inputted command.
In accordance with another aspect of the present invention, there is provided a method for allowing an educator to educate a specific educatee using a freely moving robot for transmitting and receiving data through a computer network and processing the data to organize information, and a server connected to the robot through the network, comprising the steps of: allowing the robot to track a position of the educatee and move to an area where the educatee can be educated; if the educatee is in the area where the educatee can be educated, connecting the robot to the server and transmitting a specific identification code of the robot to the server; allowing the server to download, to the robot, educational contents corresponding to the identification code transmitted from the robot, search for a connectable educator, and connect the robot with a corresponding educator terminal; and allowing the robot to execute the downloaded educational contents and educate the educatee.
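A hedged sketch of the lesson start-up decision described above: the robot keeps tracking until the educatee is in the teachable area, then connects to the server with its specific identification code and receives the matching contents. The `content_store` mapping is an invented stand-in for the server's contents database.

```python
# Illustrative sketch of the education start-up sequence: only when the
# educatee is in the teachable area does the robot connect and download
# the contents keyed by its identification code.

def start_lesson(robot_id, educatee_in_area, content_store):
    """Return the downloaded contents, or None if the educatee is absent."""
    if not educatee_in_area:
        return None                        # keep tracking the educatee
    # the server selects contents by the robot's specific identification code
    return content_store.get(robot_id, [])
```

An empty list models a robot whose code is registered but has no scheduled contents; `None` signals that tracking should continue.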
Preferably, the method may further comprise the step of: determining whether information for requesting an educator to educate the educatee is contained in the educational contents; and if information for requesting an educator to educate the educatee is contained in the educational contents, connecting the robot with the educator through the network.
Preferably, the method may further comprise the steps of: allowing the robot to determine whether an action command is contained in the educational contents; and if an action command is contained in the educational contents, allowing the robot to analyze a corresponding action command and perform a specific motion based on the action command.
Preferably, the method may further comprise the steps of: allowing the educatee to input the educatee's request such as a specific question to the robot; allowing the robot to transmit the educatee's request to an educator terminal; if the educator inputs a response to the educatee's request to the educator terminal, allowing the educator terminal to transmit the response to the robot; and allowing the robot to output the response to the educatee.
Preferably, the method may further comprise the steps of: if an education schedule based on the educational contents is terminated, transmitting an education performance result to the server; allowing the server to store the education performance result and output the education performance result to the educator terminal; allowing the educator to perform an evaluation based on the education performance result and input the evaluation information into the server; and allowing the server to output the evaluation information to the educatee through the robot. Preferably, the method may further comprise the step of: allowing the server to perform an evaluation based on the education performance result by itself and output the evaluation information to the educatee.
In accordance with another aspect of the present invention, there is provided a system for performing dynamic education using a robot, comprising: a freely moving intelligent robot for transmitting and receiving data through a computer network and processing the data to organize information; and a learning management computer for providing educational contents to the intelligent robot through the computer network; the intelligent robot outputting the educational contents provided from the learning management computer to an educatee using speech, images and motions, and the educatee expressing his own intention to the learning management computer through the intelligent robot.
Preferably, the system may further comprise: an educator terminal connected to the robot through the computer network for enabling an educator to transfer an education command to the educatee in response to state information transferred from the intelligent robot; the learning management computer connecting the robot and the educator terminal in which the information can be transmitted and received.
Preferably, the system may further comprise: an RF (Radio Frequency) module carried by the educatee for transmitting a position-tracking signal; and a position tracking device installed in the robot for tracking a position of the educatee by transmitting and receiving the position-tracking signal to and from the RF module carried by the educatee, so that the robot moves to the tracked position of the educatee.
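The tracking loop implied here can be sketched as a signal-strength threshold test: the robot moves toward the educatee's RF module until the received signal is strong enough to count as "within the teachable area". The RSSI values and the −60 dBm threshold are invented for illustration and are not specified in the patent.

```python
# Sketch of RF position tracking: treat received signal strength as a
# proxy for distance and move until the educatee is close enough.
# All numeric figures here are assumptions, not patent values.

def within_teaching_range(rssi_dbm, threshold_dbm=-60):
    """Treat a signal stronger than the threshold as 'close enough'."""
    return rssi_dbm >= threshold_dbm

def track(readings, threshold_dbm=-60):
    """Return the number of movement steps taken before arriving,
    or None if the educatee never came into range."""
    steps = 0
    for rssi in readings:
        if within_teaching_range(rssi, threshold_dbm):
            return steps
        steps += 1          # move one step toward the stronger signal
    return None
```

A real implementation would add direction finding; this sketch only captures the stop condition that triggers the lesson start.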
Preferably, the robot may further comprise: an image processing device for capturing an image of the educatee, digitalizing a captured signal and transferring the digitalized captured signal to the educator terminal, whereby the educator observes a learning state of the educatee at any time, thereby performing efficient learning management.
In accordance with yet another aspect of the present invention, there is provided a method for allowing an educator to educate a specific educatee using an intelligent freely moving robot for transmitting and receiving data through a communication network and processing the data to organize information, and a server for providing various educational information to the robot, comprising the steps of: allowing the educatee to input a learning request into a computer through the robot; downloading educational contents corresponding to the learning request to the robot; and allowing the robot to output the educational contents to the educatee through speech, images and motions.
Preferably, the method may further comprise the steps of: connecting the robot and a specific educator through the network in response to the learning request; allowing the educatee to input the educatee's request including a question into the robot; outputting the educatee's request to a corresponding educator; inputting the educator's response based on the educatee's request; and allowing the robot to output the educator's response to the educatee through speech, images and motions.
Preferably, the method may further comprise the step of: allowing the robot to track a position of the educatee and move to an area where the educatee can be educated.
Brief Description of the Drawings
The following drawings annexed to the present invention are disclosed to illustrate embodiments of the present invention, and to better understand the scope and spirit of the present invention along with the following detailed description. Accordingly, the present invention is not limited to the annexed drawings, in which:
Fig. 1 is a view illustrating a configuration of a system for developing intelligence of a robot in accordance with the present invention;
Fig. 2a is a block diagram illustrating an outline of a configuration of the robot used in the present invention;
Fig. 2b is a block diagram illustrating a structure for controlling the robot;
Fig. 3 is a block diagram illustrating a detailed configuration of the robot used in the present invention;
Fig. 4 is a flow chart illustrating a command processing procedure for developing intelligence of the robot in accordance with the present invention;
Fig. 5 is a view illustrating a structure of an extended command used in the robot-intelligence developing system in accordance with the present invention;
Fig. 6 is a view illustrating a command in the form of script in accordance with the present invention;
Fig. 7 is a table illustrating a set of commands in the form of script in the robot-intelligence developing system in accordance with the present invention;
Fig. 8 is a flow chart illustrating a 1:1 conversation service provided through the robot-intelligence developing system in accordance with the present invention;
Fig. 9 is a flow chart illustrating a voice and video processing procedure performed in the robot-intelligence developing system in accordance with the present invention;
Fig. 10 is a view illustrating a configuration of a dynamic education service system implemented by the robot-intelligence developing system in accordance with the present invention;
Fig. 11 is a view illustrating an outline of a configuration of a position-tracking device carried by an educatee in an education system in accordance with the present invention;
Fig. 12 is a view illustrating a functional configuration of a learning management server in accordance with the present invention;
Fig. 13 is a flow chart illustrating a procedure of tracking a position of an educatee in accordance with a preferred embodiment of the present invention;
Fig. 14 is a flow chart illustrating a procedure of downloading education contents and performing a connection with an educator terminal in accordance with the present invention; and
Figs. 15 and 16 are flow charts illustrating procedures of providing educatees with dynamic educational services using an educating robot.
Best Mode for Carrying Out the Invention
Now, preferred embodiments of the present invention will be described in detail with reference to the annexed drawings.
First, Fig. 1 is a view illustrating a configuration of a system for developing intelligence of a robot in accordance with the present invention.
As shown in Fig. 1, the robot-intelligence developing system includes: a robot 100 having a radio data communication function for being connected to a management server 300 through the radio data communication function when the robot 100 receives a command or encounters an event, which cannot be processed by a program, so that the robot 100 can take appropriate actions in response to a received command; a communication network 200 for providing a data communication path between the robot 100 and the management server 300; the management server 300 for storing command data associated with actions of the robot 100 to be taken in a plurality of situations, receiving data outputted from the robot 100, and recognizing a state of the robot 100 to search for a corresponding command or provide the corresponding command from the manager terminal 400 to the robot 100; and the manager terminal 400 connected to the management server 300 for receiving information of the robot 100 through the management server 300 and enabling a manager to command the robot 100 in response to the information.
A radio modem for transmitting and receiving radio data is installed in the robot 100 as a terminal connected to the communication network 200. Another terminal is connected to the management server 300, and the hardware configurations of the management server 300 and the manager terminal 400 are the same as those of a conventional Internet server and terminal.
As shown in Fig. 2a, the robot 100 described above includes: a battery 110 for supplying electrical power so that the robot 100 can freely move; a power supply 120 for supplying appropriate current and voltage to respective components or boards from the electrical power of the battery 110; a higher-order controller 130 for recognizing a state of the robot 100 through a signal inputted from a plurality of sensors or sensing devices mounted in the robot 100 and then generating a command for the robot 100, and/or recognizing an external input command and instructing the robot 100 to take appropriate actions in response to the recognized command; a lower-order controller 140 for controlling various driving devices in real time in response to the command from the higher-order controller 130; and a driver/sensor 150 for enabling driving wheels, arms and a head of the robot 100 to move according to a control signal of the lower-order controller 140, and sensing motions and states of the robot 100.
Further, a structure for controlling the robot 100 is configured as shown in Fig. 2b. First, the higher-order controller 130 of the robot 100 includes a network processor 131, a command generator 132, a command processor 133 and a control communicator 134.
The network processor 131 connected to the management server 300 located on the Internet 200 notifies the management server 300 of states of the robot 100 and then receives a command for the robot's appropriate action from the management server 300.
The command generator 132 generates a command using data inputted from voice or video recognition, a keyboard or buttons, wherein the command is generated according to the external data and a change of a state of the robot 100. For example, a command is also generated on the basis of a change of a state of hardware such as a battery-consumption amount, a motor failure, etc. or a change of an internal state such as a change of a state of software, etc.
If a specific command is generated in the command generator 132, the command is transferred to the command processor 133, so that it is determined whether the robot 100 can execute the inputted command. Further, if the robot 100 can execute the inputted command, the command processor 133 compares a priority of the inputted command with priorities of commands being currently executed, and executes the inputted command according to the priority.
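The priority handling described for the command processor 133 can be sketched with a priority queue: an incoming command is executed before lower-priority pending commands and after higher-priority ones. The class name and the numeric convention (a larger number means more urgent) are assumptions for this illustration.

```python
# Sketch of priority-ordered command processing. A max-heap is built
# from Python's min-heap by negating priorities; a counter preserves
# FIFO order among equal priorities.

import heapq

class CommandProcessor:
    def __init__(self):
        self._queue = []               # entries: (-priority, counter, command)
        self._counter = 0              # tie-breaker keeps FIFO order

    def submit(self, priority, command):
        """Queue a command; higher priority numbers run first."""
        heapq.heappush(self._queue, (-priority, self._counter, command))
        self._counter += 1

    def next_command(self):
        """Return the highest-priority pending command, or None."""
        if not self._queue:
            return None
        return heapq.heappop(self._queue)[2]
```

An urgent command submitted later (e.g. an emergency stop) thus overtakes routine commands already waiting in the queue.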
The command received from the command processor 133 is transferred to a controller 142 contained in the lower-order controller 140 through the control communicator 134 and an interface 141, and action control of the robot 100 is performed through the controller 142.
That is, if the controller 142 transfers an action control signal to a driver 143 of a corresponding device according to an action control command, the driver 143 drives a corresponding device 144. A change of a state based on the driving of the device 144 is detected and information of the detected state change is transferred to the lower-order controller 140.
Further, an embodiment of the robot 100 is shown in Fig. 3 so that an operation of the robot 100 can be better understood. Hereinafter, a detailed configuration of the robot 100 will be described with reference to Fig. 3.
The robot 100 basically includes an image processor 380, a voice processor 390, a controller 310, a storage unit 320, a motion controller 360, a motion effector 365, a key input unit 350, a communication unit 340, an RF (Radio Frequency) module 330, an antenna 335 and a sensing unit 370.
The image processor 380 includes an image recognizer 386 made up of a CCD (Charge Coupled Device), and an image display unit 385 made up of an LCD (Liquid Crystal Display). That is, the image recognizer 386 captures external images, and then transmits the captured images to the image processor 380. The image processor 380 recognizes external situations from the images captured by the image recognizer 386 or contents of images transferred through an external telephone line on the basis of a captured signal from the image recognizer 386 or image information supplied from the communication unit 340, and then transmits a result of the recognition as an image recognition signal to the controller 310.
Under the control of the controller 310, the image processor 380 performs predetermined signal processing in relation to the captured signal from the image recognizer 386, and then transmits an image signal to the communication unit 340. The voice processor 390 is configured by a microphone 396 and a speaker 395.
The microphone 396 collects external sounds such as sounds from a user and then transmits the collected sounds to the voice processor 390.
The voice processor 390 recognizes meanings of words of the collected sounds from the microphone 396 and words transferred through the external telephone line on the basis of the collected sounds supplied from the microphone 396 or voice information supplied from the communication unit 340, and then transmits a result of the recognition as a voice recognition signal to the controller 310 and the communication unit 340. Further, the voice processor 390 generates synthesized voice under the control of the controller 310 and then transmits the synthesized voice as a voice signal to the speaker 395 or the communication unit 340.
The communication unit 340 is configured by a radio modem, etc. so that the controller 310 can communicate with an external device through a telephone line, an Internet leased line, etc.
The key input unit 350 recognizes command contents on the basis of a code number inputted from the user, and then transmits a result of the recognition as a key input signal to the controller 310.
The motion controller 360 analyzes a motion control signal transferred from the controller 310 to generate a motion-driving signal, and then moves the robot 100 in forward, backward, left or right directions or moves arms and legs of the robot 100 in a predetermined manner.
The RF module 330 generates a position-tracking signal having a user- identification code, which can be recognized by the robot 100, and then externally transmits the position-tracking signal through the antenna 335. Moreover, the RF module 330 processes an external position-tracking signal received from the antenna 335, and then transmits the processed signal to the controller 310.
The storage unit 320 is a kind of auxiliary storage unit such as a hard disk, etc., and converts data transmitted from the management server 300 into a form capable of being recognized by the user. The storage unit 320 includes: a browser program for displaying the converted data through the image display unit 385; contents downloaded from the management server 300; a program contained in the above-described contents; a robot control program for controlling the robot 100 according to a schedule; a position-tracking program for tracking a position of an educatee; and an OS (Operating System) program for providing a platform executing these programs.
The sensing unit 370 includes a plurality of sensors, and senses a state and a peripheral environment of the robot 100.
The controller 310 analyzes a peripheral environment or detects the existence of a motion, the existence of a failure, the existence of a user or external command, a state of learning of the user and a pose and position of the robot 100 on the basis of a voice recognition signal, an image recognition signal, a reception information signal, a key input signal and a sensor signal inputted from the voice processor 390, the image processor 380, the communication unit 340, the key input unit 350, the RF module 330 and the sensing unit 370.
The controller 310 decides actions of the robot 100 on the basis of the robot control program in response to a result of the analysis, and moves the head, arms or legs of the robot 100 in upward, downward, left or right directions by driving the motion effector 365 on the basis of a result of the decision, or controls an operation or action of the robot 100 such as the walking of the robot 100 by driving the legs of the robot 100.
The controller 310 externally transfers voice based on voice information received through the communication unit 340 or synthesized voice generated by the voice processor 390 through the speaker 395, controls the image display unit 385 to display an image based on external image information received through the communication unit 340 or an image generated by the image processor 380, or transmits, to the RF module 330, a position-tracking signal to be externally transmitted. The controller 310 transmits command information generated from its own device to the learning management server 400 or the educator (information) terminal 500.
The controller has a structure including a CPU (Central Processing Unit), a ROM (Read Only Memory) and a RAM (Random Access Memory). The robot 100 configured as described above receives a command from the key input unit 350 or the voice processor 390, or determines whether a definition of an action of the robot 100 based on a predetermined command exists in an internal program when a state of the robot 100 is sensed through the sensors and the predetermined command is generated on the basis of that state. At this time, if the action of the robot 100 associated with the command is defined in the robot 100, the robot 100 performs the action based on the corresponding definition. Otherwise, if the action of the robot 100 corresponding to the command is not defined in a database of the robot 100, the robot 100 performs a connection with the management server 300 to search a database of the management server 300 for the action definition, receives the searched action definition and performs the action based on the received action definition.
In the system of Fig. 1 including the robot 100 configured as described above, if the user of the robot 100 transmits a 1:1 conversation request to the robot 100, the robot 100 is connected to the server 300 through the Internet 200, and transfers voice and video data of the user after transferring the 1:1 conversation request to the server 300. Further, the server 300 checks the 1:1 conversation request and then connects the robot 100 and some manager terminal 400. At this time, the voice and video data transferred from the robot 100 are transferred to a corresponding manager terminal 400, and the manager inputs conversation information or a command after viewing and hearing the transferred voice and video data. Command contents of the manager are transmitted from the manager terminal 400 to the robot 100 through a predetermined method, and the robot 100 analyzes the transmitted command, and outputs the voice or video data or executes the command. Through the above-described method, the user can naturally perform the 1:1 conversation with the robot 100.
A command processing procedure performed by the robot-intelligence developing system will be described with reference to the annexed drawings.
Fig. 4 is a flow chart illustrating the command processing procedure of the robot 100. The robot 100 uses voice of the user as a main command source, but can use data inputted from another device, for example, a keyboard or buttons, as a command source. Further, the robot 100 can receive commands through various methods, including a method for generating an internal command according to an internal state of the robot 100, etc.
If the robot 100 receives a command in any method, the robot 100 searches a database mounted in its internal controller for data of the command (S402). At this time, if data of the command exists in the database (S403), the robot 100 executes the command according to the searched data (S404). On the other hand, if data of the command does not exist in the database of the robot 100, the network processor 131 is driven, and the robot 100 is connected to the server 300 (S405) and searches for data of the command from a database of the server 300 (S406).
If data of the command exists in the database of the server 300 (S407), the robot 100 receives data of the command from the server 300 (S408), and executes a corresponding command (S404).
On the other hand, if data of the command does not exist in the database of the server 300, the robot 100 outputs a message indicating that the command cannot be executed (S409). Through the above-described method, the robot 100 can process a greater number of commands than are contained in its own database.
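The lookup-and-fallback flow of Fig. 4 can be sketched in a few lines. This is an illustrative Python sketch, not the patent's implementation; the function name, the dictionary-based databases and the local caching of a fetched definition are assumptions.

```python
# Sketch of the command processing of Fig. 4: search the robot's own
# database first (S402-S404); if the command is unknown, connect to the
# server and search its database (S405-S408); otherwise report that the
# command cannot be executed (S409). Dictionaries stand in for the
# databases; the caching of a fetched definition is an assumption.

def resolve_command(command, local_db, server_db):
    """Return the action data for a command, or None if it is unknown."""
    if command in local_db:               # S402/S403: found locally
        return local_db[command]
    if command in server_db:              # S405-S407: found on the server
        data = server_db[command]
        local_db[command] = data          # assumed: keep a local copy (S408)
        return data
    return None                           # S409: command cannot be executed

local_db = {"hello": "say:How are you?"}
server_db = {"dance": "motion:dance_sequence"}
print(resolve_command("dance", local_db, server_db))  # fetched from the server
print(resolve_command("fly", local_db, server_db))    # unknown command
```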
The commands given to the robot 100 are implemented on the basis of a script language.
Where the commands given to the robot 100 are configured in the script language, the robot 100 can readily implement various actions in various environments and a command can readily be extended.
The command given to the robot 100 includes a basic command and an extended command based on the basic command. The basic command is given to the controller of the robot 100, and one basic command corresponds to one action to be performed by the robot 100.
The extended command exists in the case where the robot 100 takes different actions in different states of the robot 100.
For example, when a basic command is a movement command, the robot 100 performs different movements in the case where the battery is almost exhausted and in the case where power of the battery is sufficient. An extended command includes one basic command if the internal states are the same. However, if the internal states are different, the extended command includes a number of basic commands corresponding to the number of different internal states. Through this structure, the user can readily understand the command system associated with the robot 100 and hence can readily operate the robot 100.

Fig. 5 is a view illustrating a structure of the extended command. The extended command includes an extended-command ID, a keyword for voice recognition for recognizing a voice command of the user, a priority for command processing, basic commands corresponding to the number of internal states of the robot 100, and extended-command information.
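Under the assumption that an extended command can be modeled as a record holding the fields listed for Fig. 5, a minimal sketch might look like the following; all class and field names are illustrative, not from the patent.

```python
# Illustrative encoding of the extended-command structure of Fig. 5:
# an ID, a voice-recognition keyword, a processing priority, one basic
# command per internal state of the robot, and extended-command
# information. The field layout is an assumption for illustration.
from dataclasses import dataclass, field

@dataclass
class ExtendedCommand:
    command_id: int
    keyword: str                 # keyword for voice recognition
    priority: int                # priority for command processing
    basic_commands: dict = field(default_factory=dict)  # internal state -> basic command
    info: str = ""               # extended-command information

    def select(self, internal_state):
        """Pick the basic command matching the robot's internal state."""
        return self.basic_commands[internal_state]

# The battery example from the text: one extended movement command,
# two basic commands depending on the internal state.
move = ExtendedCommand(
    command_id=1, keyword="move", priority=5,
    basic_commands={"battery_low": "return_to_charger",
                    "battery_ok": "walk_forward"},
    info="movement command")

print(move.select("battery_low"))   # different action when the battery is low
```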
Fig. 6 illustrates an exemplary command in the form of script in accordance with the present invention.
The robot should provide desired services to the user in addition to simple movements. In this case, a unit of a new command should be configured by a plurality of commands so that one desired service can be provided to the user. That is, a unit consisting of a plurality of extended commands is needed according to commands and environments of the user. Fig. 6 shows an exemplary command using the script language. One extended command can correspond to one script function, and actions of the robot 100 can be defined using script functions. Further, scripts can correspond to basic commands, and an expert can use basic commands to define motions of the robot in detail. In addition to scripts for controlling motions of the robot, actions of the robot associated with logical scripts such as "if", "switch", "repeat", etc. can be readily programmed.
Various types of scripts are shown in Fig. 7. The robot's actions configured by combining a series of actions can be readily programmed using scripts shown in Fig. 7.
Fig. 6 shows the exemplary command in the form of a script for a simple conversation between the user and the robot. A name of this command is defined as "Hello", which is stored as a function. While the function "Hello" is executed, a function "Move" is called. Here, it is assumed that the function "Move" is defined as a function for moving the robot toward the user. If the function "Hello" is executed, the function "Move" is called, and the robot moves toward the user and says "How are you?" to the user through TTS (Text-To-Speech) scripts. The robot then receives a reply of "Fine" or "Not fine" through REPLY scripts. If the robot receives a reply of "Fine", a temporary parameter "a" is stored as "1". Further, if the robot receives a reply of "Not fine", the temporary parameter "a" is stored as "2". If the value of the temporary parameter "a" is "1", a message of "May I help you?" is transferred to the user through switch scripts. Further, if the value of the temporary parameter "a" is "2", MoveHome scripts are called. Here, where it is assumed that "MoveHome" is defined as a function for returning the robot to an origin position, the robot returns to the origin position if the robot receives a reply of "Not fine" from the user.
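The control flow of the "Hello" script just described can be traced with a short Python sketch; the move, tts, reply and move_home callables are hypothetical stand-ins for the Move, TTS, REPLY and MoveHome script primitives.

```python
# Rough Python transcription of the "Hello" script of Fig. 6, assuming
# move(), tts(), reply() and move_home() wrap the Move, TTS, REPLY and
# MoveHome primitives described in the text.

def hello(move, tts, reply, move_home):
    move()                             # Move: approach the user
    tts("How are you?")                # TTS: synthesized speech
    answer = reply()                   # REPLY: "Fine" or "Not fine"
    a = 1 if answer == "Fine" else 2   # temporary parameter "a"
    if a == 1:                         # switch on "a"
        tts("May I help you?")
    else:
        move_home()                    # MoveHome: return to the origin

# Minimal stand-ins to trace one run of the script:
log = []
hello(move=lambda: log.append("move"),
      tts=lambda text: log.append("say:" + text),
      reply=lambda: "Fine",
      move_home=lambda: log.append("home"))
print(log)   # ['move', 'say:How are you?', 'say:May I help you?']
```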
Further, where the user gives a command to the robot 100 using voice, an operation procedure of the robot is performed in a manner similar to the command processing procedure of Fig. 4 described above.
First, the higher-order controller 130 of the robot 100 recognizes a voice command from the user, and then sends a result of the recognition to the command processor 133. The command processor 133 searches its own database for data of the voice command. At this time, if data of the voice command is found, the command processor 133 enables the robot to execute a corresponding command. Otherwise, the command processor 133 performs a connection with the server 300 through the network processor 131 to search for a script command corresponding to the voice command. Where an action of the robot is defined in the server 300 in relation to a given command, the robot 100 downloads data of the definition of the action of the robot 100 from the server 300, and the actions of the robot 100 are controlled by the downloaded scripts.
Accordingly, even though a corresponding program is not stored in the robot 100, the user feels as if the robot takes actions through its own decision-making process.
Further, the robot 100 periodically performs a connection with the server 300, downloads scripts of a new action definition from the server 300, and stores the downloaded scripts in its own database, thereby extending and changing abilities of the robot 100. As described above, where action of the robot 100 corresponding to a given command is not defined in the robot's database, the system performs a connection with the server 300 and then receives a command associated with the action of the robot 100 from the server 300.
However, where action of the robot corresponding to the command is not defined in the database of the server 300, the server 300 outputs an error message indicating that the robot cannot take an appropriate action corresponding to the command. At this time, there is a problem in that the robot 100 cannot provide a desired service to the user.
To address this problem, where data associated with an action of the robot 100 corresponding to the command does not exist in the server 300, the server 300 transmits the command transmitted from the robot 100 to the manager terminal 400 in order to request the manager terminal 400 to give a response, without transmitting the error message to the robot 100.
At least one manager is located at the manager terminal 400. After the manager identifies the request from the server 300, the manager provides, to the server 300 through the manager terminal 400, an appropriate response (data or a message associated with an action of the robot 100) corresponding to the command transmitted from the server 300.
The server 300 transfers data of the definition associated with the action of the robot 100 from the manager terminal 400 to the robot 100, and the robot 100 executes a given command from the manager.
The robot 100 can follow various commands, and the user is impressed by the execution abilities of the robot 100.
The system can provide the 1:1 conversation service to the user.
Fig. 8 is a flow chart illustrating a procedure of providing the 1:1 conversation service to the user through the system. As described above, if the 1:1 conversation command is inputted from the user to the robot 100 through voice or key inputs (S801), a command recognition device of the robot 100 recognizes the command (S802), and transfers the recognized 1:1 conversation command to the command processor 133 (S803).
If the command is the 1:1 conversation command, the command processor 133 performs a connection with the server 300 through the network processor 131, and then transfers the 1:1 conversation command to the server 300 (S804). In this case, if the server 300 receives a 1:1 conversation request, the server 300 identifies a manager capable of accessing the robot 100, transfers the 1:1 conversation request of the robot 100 to the manager, and connects the terminal 400 of the manager to the robot 100 (S805). For example, the server 300 transfers the 1:1 conversation request of the robot 100 to a plurality of manager terminals 400. The server 300 can connect the robot 100 with some manager terminal 400 giving a response to the request.
However, where no manager capable of accessing the robot 100 exists, the server 300 transmits an error message, and the robot 100 outputs a command-execution-impossible message and terminates the 1:1 conversation procedure (S806 and S807).
On the other hand, where a manager capable of accessing the robot 100 exists, the server 300 connects the robot to the manager terminal 400 (S808).
The robot 100 connected to the manager performs voice and video data compression to efficiently send voice and video data of the user in real time. To perform the voice and video data compression, a codec for voice and video is loaded and executed. If this procedure is completed, a syntax analyzer is executed to analyze voice of the user and perform a command execution function, a function of terminating the 1:1 conversation or a function of automatically repeating the conversation. If a termination command is called in the syntax analyzer, the 1:1 conversation is terminated.
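The syntax analyzer's three outcomes (execute a command, terminate the 1:1 conversation, or continue the conversation) can be sketched as a small classifier; the keywords and the command prefix below are invented for illustration only.

```python
# Sketch of the syntax analyzer described above: each recognized
# utterance of the user is classified as a termination of the 1:1
# conversation, a command to execute, or ordinary conversation to
# continue. The category keywords and the "Robot," prefix are
# assumptions, not part of the patent.

def analyze(utterance):
    text = utterance.strip().lower()
    if text in ("goodbye", "stop conversation"):
        return ("terminate", None)            # terminates the 1:1 conversation
    if text.startswith("robot,"):             # assumed explicit command prefix
        return ("execute", text.split(",", 1)[1].strip())
    return ("continue", utterance)            # ordinary conversation

print(analyze("Goodbye"))            # ends the 1:1 conversation
print(analyze("Robot, come here"))   # command execution function
print(analyze("How was school?"))    # conversation continues
```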
As described above, if a connection between the robot 100 and some manager terminal 400 is completed, the robot 100 obtains video and voice data of the user from the camera and the microphone mounted therein and then transmits the obtained video and voice data to the manager terminal 400. The manager identifies a state of the user from the transmitted image and voice data and then inputs a reply. The reply of the manager is transferred to the robot 100 through the manager terminal 400 and the management server 300, and the robot 100 outputs sound or takes a motion according to the reply (S809 to S813).
Because the robot 100 transfers the video and voice data of the user so that the manager can better understand a state of the user, the manager can correctly instruct the robot 100. That is, the manager receives the video and voice of the user through the manager terminal 400 in an image-chatting method, and instructs the robot 100 to output or perform an appropriate sound or motion.
However, where a time delay is caused, the conversation between the user and the robot 100 may not be performed naturally. For example, serial communications between the robot 100 and the manager terminal 400 are performed over the Ethernet on the basis of TCP/IP (Transmission Control Protocol/Internet Protocol). Accordingly, if the robot 100 begins to perform screen display or action only after data such as HTML (HyperText Markup Language) is received from the manager terminal 400, the robot 100 may not be controlled or the user may not understand what the robot 100 is doing. In the 1:1 conversation procedure, the robot 100 should simultaneously output sound or make a facial expression while taking the action. While the robot 100 displays a moving picture or animation on the screen, the robot 100 should simultaneously take actions.
Accordingly, if a serial communication protocol based on the HTML like the conventional Internet communication is performed, there are problems in that appropriate synchronization may not be achieved and a desired action may not be taken. To address the above-described problems, data of actions to be taken by the robot, moving pictures, etc. are based on an object or event-oriented protocol using scripts in the present invention. That is, data of action to be taken by the robot, voice, a facial expression, a moving picture or animation is considered as one object based on scripts, and objects to be simultaneously executed are recognized through JOIN scripts. Accordingly, even though data lengths of objects to be simultaneously executed are different, the objects are executed after being transferred to the robot 100.
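The JOIN behavior described above — each object fully transferred first, then the grouped objects started together — can be imitated with a thread barrier. The object representation and function names are assumptions for illustration.

```python
# Sketch of the JOIN idea: every action, sound or image is treated as a
# complete object, and the objects grouped by one JOIN are only started
# once all of them have been received, so they run in step regardless of
# their differing data lengths and transfer times.
import threading

def join_execute(objects):
    """Start all objects of one JOIN group at the same moment."""
    received = list(objects)                  # stand-in for "transfer completed"
    barrier = threading.Barrier(len(received))
    done = []
    def run(obj):
        barrier.wait()                        # every object waits for the group
        done.append(obj["name"])              # stand-in for executing the object
    threads = [threading.Thread(target=run, args=(o,)) for o in received]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return done

group = [{"name": "speech"}, {"name": "gesture"}, {"name": "animation"}]
print(sorted(join_execute(group)))   # all three objects ran together
```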
Where the robot 100 is directly controlled through a communication network, the robot 100 cannot appropriately operate because of a communication delay.
However, where a given command is transferred to the robot 100 through a script method, the robot 100 can perform natural motions, irrespective of the communication delay.
If the manager controls the robot 100 through this method, the user feels as if he talks with another human, while talking with the robot 100.
In the 1:1 conversation, the robot 100 receives conversation contents from the manager, and the manager decides conversation contents on the basis of voice and video information identified through the manager terminal 400. Where the decided conversation contents are stored in the server 300, the server 300 simply selects the conversation contents corresponding to command data to be provided to the manager terminal 400. At this time, stored voice data can be outputted from the server 300. The conversation contents selected by the manager are transferred to the robot 100 in the form of text, and hence the robot's speech is transferred to the user through a TTS function of the robot.
Accordingly, even though different managers access the robot 100, the user can always hear the robot's unique voice. Where the conversation contents are not stored in the server 300, the manager directly inputs the conversation contents through the manager terminal 400 to transmit the inputted contents in the form of text to the robot 100 so that the robot's speech can be outputted.
The robot-intelligence developing system can be used in various fields such as management, education, etc. For example, a program for education can be programmed in the robot 100, and data of action definitions associated with various commands can be stored in a server capable of managing educational materials. Thus, the robot 100 can receive corresponding educational materials from the server and educate the user. As another embodiment, the robot 100 can be used as a conversation partner of the elderly through the 1:1 conversation service and provide an elderly-person management service.
Hereinafter, a description will be given of a system for providing education services using the robot-intelligence developing system as an example of the present invention.
Fig. 10 is a view illustrating a configuration of a dynamic education service system implemented by the robot-intelligence developing system in accordance with a preferred embodiment of the present invention.
As shown in Fig. 10, the system includes an educating robot 100a, an educator terminal 400a and a learning management server 300a, which are connected through a network 200 such as the Internet. Further, an educatee 600 notifies the educating robot 100a of a position of the educatee 600 through the RF module 610.
In particular, the learning management server 300a is connected to the specific educatee 600 and an educator 500 through a computer network such as the Internet for communication. The learning management server 300a is a computer system for providing predetermined educational contents. The learning management server 300a will be described in detail with reference to Fig. 12.
The educating robot 100a downloads educational contents from the learning management server 300a, and outputs speech, images or motions based on the educational contents to the educatee 600. The educating robot 100a performs an interface between the educator 500 and the educatee 600 for communication. An internal configuration of the educating robot 100a can be understood by referring to Fig. 3.
The educator terminal 400a is a personal computer connected to the educating robot 100a through the server 300a and the network 200a, and includes a communication module for data transmission and reception, an OS (Operating System) program, and a Web browser program. Accordingly, the educator terminal 400a outputs voice and image information transmitted from the educating robot 100a to the educator 500, and transmits information inputted from the educator 500 to the learning management server 300a or the educating robot 100a through the network.
The network 200a is referred to as the Internet, but the network 200a is not limited to the Internet and includes an intranet, an extranet, a leased line network, etc.
The educating robot 100a downloads educational contents from the management server 300a, analyzes the educational contents, educates the educatee through the speech, images and motions, transfers voice or image data of the educatee to the educator terminal 400a through the communication unit 340, converts reply information of the educator 500 transmitted from the educator terminal 400a into the speech, images and motions of the educating robot 100a, and transfers the converted speech, images and motions to the educatee 600. The educating robot 100a runs the position-tracking program according to a schedule set by the robot control program, and externally transmits a position-tracking signal through the RF module 330. Further, the educating robot 100a analyzes the position-tracking signal from an external device, tracks a position of the educatee, and moves to a place within an area where the educatee can be educated. Hereinafter, the RF module 610 for position tracking carried by the educatee 600 will be described with reference to Fig. 11.
As shown in Fig. 11, the RF module 610 is a device for notifying the educating robot 100a of a position of the educatee 600. The RF module 610 is manufactured in the form of a necklace, a bracelet, etc., and can be carried by the educatee 600. Further, in Fig. 11, an RF reception module 613 of the RF module 610 receives the position-tracking signal transmitted from the RF module 330 of the educating robot 100a to the outside, and transfers the received position-tracking signal to a signal-processing module 611. The signal-processing module 611 analyzes the received position-tracking signal, extracts an educatee-identification code from a result of the analysis, generates a response signal to the position-tracking signal if the extracted identification code indicates a corresponding educatee, and transmits the response signal to the RF transmission module 612.
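The educatee-side check performed by the signal-processing module 611 amounts to answering only those signals that carry this educatee's identification code. A sketch, with an invented signal format and identification code:

```python
# Sketch of the educatee-side RF module of Fig. 11: the reception module
# hands the position-tracking signal to the signal-processing module,
# which checks the embedded identification code and answers only when
# the signal is addressed to this educatee. The dictionary-based signal
# format and the example code "EDU-042" are assumptions.

MY_ID = "EDU-042"   # identification code of this educatee (example value)

def process_tracking_signal(signal):
    """Return a response signal for our own ID, or None to stay silent."""
    if signal.get("type") != "position-tracking":
        return None                       # not a position-tracking signal
    if signal.get("educatee_id") != MY_ID:
        return None                       # addressed to someone else
    return {"type": "tracking-response", "educatee_id": MY_ID}

print(process_tracking_signal({"type": "position-tracking", "educatee_id": "EDU-042"}))
print(process_tracking_signal({"type": "position-tracking", "educatee_id": "EDU-007"}))
```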
The RF transmission module 612 transmits the response signal to the outside through an antenna (not shown).
Next, Fig. 12 shows a functional configuration of the learning management server 300a.
The learning management server 300a searches its own database for an action definition associated with a predetermined command requested from the educating robot 100a by performing communication with the educating robot 100a, or receives information of the action definition from the educator terminal 400a to provide the action definition information to the educating robot 100a. In addition, the learning management server 300a is a computer system for providing the educational contents requested by the user, connecting the educating robot 100a to the educator terminal 400a through the network, evaluating a learning state on the basis of learning result information provided from the educating robot 100a, creating a report based on a result of the evaluation, and transferring the report to the educatee.
The learning management server 300a is a large-capacity computer system having a CPU, a RAM, a ROM, a network interface, a data storage unit, etc. A conventional personal computer or a conventional workstation with a large-capacity memory and data processing capability can be employed as the learning management server 300a.
The learning management server 300a can perform a large number of tasks by executing a large number of mathematical calculations. Fig. 12 shows a plurality of databases, program modules loaded in a ROM of the learning management server 300a, and a data storage unit.
Hereinafter, the learning management server 300a will be described in detail. The learning management server 300a includes program modules having a database management module 410, an educational-content providing module 420, an educator connection module 430, a learning evaluation module 440, and a result report creation and transmission module 450, and a database system having an educational-content DB (DataBase) 460, an educator information DB 470 and a robot information DB 480.

The database management module 410 constructs the educational-content DB 460, the educator information DB 470 and the robot information DB 480, and changes the databases to reflect changed information if information of the constructed databases is changed. The database management module 410 is a program for managing the databases as a whole.

The educational-content providing module 420 is a program for identifying an educatee on the basis of the specific identification code, determining a state of learning progress of a corresponding educatee, designating educational contents to be provided to a corresponding educating robot, and downloading the designated educational contents to the educating robot 100a through the network, if the educational-content providing module 420 receives a learning request including a specific identification code from the educating robot 100a.
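The request handling of the educational-content providing module 420 can be sketched as a lookup from identification code to learning progress to contents; the record layouts and names below are assumptions, not from the patent.

```python
# Sketch of the educational-content providing module 420: on a learning
# request carrying a robot's identification code, look up the
# educatee's learning progress and pick the matching contents to
# download. Dictionaries stand in for the databases of Fig. 12.

progress_db = {"ROBOT-1": 3}                          # identification code -> lesson reached
contents_db = {3: "lesson-3.pkg", 4: "lesson-4.pkg"}  # lesson -> content package

def handle_learning_request(robot_id):
    """Return the content package for the educatee's current progress."""
    lesson = progress_db.get(robot_id)
    if lesson is None:
        return None                   # unknown identification code
    return contents_db.get(lesson)    # contents for the current progress

print(handle_learning_request("ROBOT-1"))   # package for lesson 3
```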
The educator connection module 430 is a program for searching the educator information database 470, designating an appropriate educator, and connecting the designated educator's terminal 500 and the educating robot 100a corresponding to the educatee through the network, if an educator connection command or a connection request from the robot 100a is contained in the educational contents.
The learning evaluation module 440 is a program for receiving education result information from the educating robot 100a, and storing the education result information in the robot information database 480 or transmitting the education result information to the educator terminal 400a, after the learning based on the educational contents is terminated. The learning evaluation module 440 also evaluates an educatee by itself on the basis of the education result information and stores a result of the evaluation in the robot information database 480.
The learning evaluation module 440 is contained in the learning management server 300a in Fig. 12, but the learning evaluation module 440 can be stored in the educational contents of the educating robot 100a. In this case, the result of the evaluation should be uploaded from the robot 100a to the learning management server 300a.
The result report creation and transmission module 450 is a program for combining a result of its own evaluation and the result of the evaluation transferred from the educator, creating a result report for the educatee, and transmitting the result report to an e-mail address or mobile phone of the educatee.
The educational-content database 460 individually stores and manages various educational contents to be downloaded to the educating robot 100a. The database 460 stores education schedule information (or education progress information) of an individual educatee and educational-content information based on the education schedule information.
The educator information database 470 stores and manages lists of a plurality of clients 400 and educators currently coupled to the learning management server 300a, personal information (resident registration numbers, addresses, educational history, majors, etc.) of the educators, and schedule information of the educators. The learning management server 300a can determine which educators are capable of being currently connected to the educating robot 100a by identifying the schedule information. The robot information database 480 stores and manages the robot's specific identification code, personal information (a resident registration number, an academic background, an address, a name, etc.) of the educatee, a result of learning evaluation, a result report, a learning level, a state of learning progress, etc.
Hereinafter, a process associated with a dynamic education method through the education system will be described with reference to Figs. 13 to 16.
First, Fig. 13 shows a procedure of causing the robot to move to an area where the educatee can be educated by tracking a position of the educatee.
A robot control program embedded in the storage unit 320 is loaded in the controller 310 of the robot and then the controller 310 initializes the program (S1301). The controller 310 determines whether a movement command is contained within robot schedule information (or whether the present time matches a learning time) (S1303).
At this time, if the movement command is contained within the robot schedule information, the controller 310 initializes a position estimation (or tracking) module within a position-tracking program (S1305). The position estimation module generates a position estimation (or tracking) signal to be transmitted to the outside, for example, using a PSK (Phase Shift Keying) modulation method, and then transmits the generated position estimation signal to the outside through the RF module 330 and the antenna 335 (S1307).
The position estimation signal includes an identification code for identifying an educatee.
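The PSK modulation of step S1307, carrying the educatee identification code, could be sketched as below. This is only an illustration of binary PSK: the carrier frequency, sample rate, bit rate, and the 8-bit code are assumed values, not figures from the specification.

```python
import math

def bpsk_modulate(bits, carrier_hz=125_000, sample_rate=1_000_000, bit_rate=10_000):
    """Map each bit onto a carrier segment whose phase is 0 or pi (binary PSK)."""
    samples_per_bit = sample_rate // bit_rate
    signal = []
    for i, bit in enumerate(bits):
        phase = 0.0 if bit else math.pi  # bit value selects the carrier phase
        for n in range(samples_per_bit):
            t = (i * samples_per_bit + n) / sample_rate
            signal.append(math.cos(2 * math.pi * carrier_hz * t + phase))
    return signal

# An illustrative 8-bit educatee identification code (hypothetical value).
id_code = [1, 0, 1, 1, 0, 0, 1, 0]
waveform = bpsk_modulate(id_code)
```

The receiver side (the signal-processing module 611) would demodulate such a waveform and compare the recovered bits against its own identification code before answering.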
If the RF reception module 613 of the RF module 610 attached to a body of the educatee 600 receives the position estimation signal, the received position estimation signal is transferred to the signal-processing module 611 (S1309).
The signal-processing module 611 demodulates the inputted position estimation signal, identifies the specific identification code, generates a position estimation (or tracking) response signal if the position estimation signal is determined as a signal for an educatee, and transfers the generated position estimation response signal to the RF transmission module 612 (S1311).
The RF transmission module 612 transmits the position estimation response signal to the outside through the antenna (not shown) (S1313).
The antenna 335 of the robot transfers the position estimation response signal from the RF transmission module 612 to the RF module 330, and the RF module 330 analyzes the position estimation response signal to transfer a result of the analysis to the position estimation module of the controller 310 (S1315). The position estimation module of the controller 310 calculates a distance between the robot and the educatee and azimuth information using field intensities, a phase difference, or a time delay from the transmitted position estimation signal and the received position estimation response signal (S1317). If the distance and the azimuth are calculated, the position estimation module generates a motion control signal from calculation values, and transfers the motion control signal to the motion controller 360 (S1319).
The motion controller 360 generates a motion-driving signal based on the motion control signal, and transmits the motion-driving signal to the motion effector 365, thereby allowing the educating robot 100a to move toward the user (S1321).
The procedure of steps S1307 to S1321 enables the robot to move to an area where the robot can educate the educatee by tracking a position of the educatee.
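The distance and azimuth calculation of step S1317 and the motion decision of steps S1319 to S1321 can be pictured as follows. The specification names field intensity, phase difference, and time delay as possible inputs but gives no formulas, so the round-trip-time distance formula, the two-antenna phase-difference bearing formula, and all thresholds below are assumptions for illustration only.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def estimate_distance(round_trip_s, processing_delay_s=0.0):
    """Distance from the signal's round-trip time delay (one assumed reading of S1317)."""
    return SPEED_OF_LIGHT * (round_trip_s - processing_delay_s) / 2.0

def estimate_azimuth(phase_diff_rad, wavelength_m, antenna_spacing_m):
    """Bearing from the phase difference measured between two receive antennas."""
    s = wavelength_m * phase_diff_rad / (2.0 * math.pi * antenna_spacing_m)
    return math.asin(max(-1.0, min(1.0, s)))  # clamp to asin's domain

def motion_command(distance_m, azimuth_rad, stop_radius_m=1.0):
    """Turn toward the educatee, then advance until within teaching range (S1319/S1321)."""
    if distance_m <= stop_radius_m:
        return ("stop", 0.0)
    if abs(azimuth_rad) > math.radians(10):
        return ("turn", azimuth_rad)
    return ("forward", distance_m - stop_radius_m)

# e.g. a 20 ns round trip would place the educatee about 3 m away
d = estimate_distance(20e-9)
```

The motion controller 360 would translate such a `("turn", …)` or `("forward", …)` tuple into drive signals for the motion effector 365.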
If the robot moves to an area where the robot can educate the educatee by tracking a position of the educatee through the procedure of Fig. 13, a procedure of downloading education contents and performing a connection with an educator terminal is performed through steps S1401 to S1419 of Fig. 14.
That is, if an image of the educatee is transferred to the controller 310 through the image recognizer 386 of the robot, the controller 310 recognizes the educatee (S1401). If the educatee has been recognized, the controller 310 generates a guide voice message such as "The present time is a lesson time. Are you studying?" and outputs the guide voice message to the educatee through the speaker 395 (S1403).
At this time, if the educatee expresses, to the robot 100a, an intention to study through the microphone 396 or the key input unit 350 (S1405), the controller 310 of the robot forms a connection with the learning management server 300a through the communication unit 340 (S1407).
The controller 310 of the robot then reads a specific identification code stored in an internal memory (not shown), and transmits the read identification code to the educational-content providing module 420 (S1409). The educational-content providing module 420 receiving the specific identification code from the educating robot 100a accesses the robot information database 480, and identifies an educatee corresponding to the identification code and a state of learning progress of the educatee (S1411).
After identifying the state of learning progress of the educatee, the educational-content providing module 420 extracts educational contents appropriate for the learning progress state from the educational-content database 460, and downloads the educational contents to the educating robot 100a (S1413).
The controller 310 of the robot downloading the educational contents stores the contents in the storage unit 320. Here, the educational contents include multimedia information such as pictures, characters, videos, sounds, voices, etc. and robot-motion control information for controlling the robot's motions. Further, the educational contents can include an active program (i.e., an educational program) for appropriately interworking and executing the multimedia information and the robot-motion control information through the robot. Moreover, the storage unit of the robot can store an active program for analyzing and executing the multimedia information and the robot-motion control information.
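The composition of the educational contents just described (multimedia items plus optional robot-motion control information) can be pictured with a small data structure. All class and field names below are invented for illustration; the patent does not define a concrete content format.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class MotionCommand:
    """Robot-motion control information attached to a content item (hypothetical form)."""
    action: str          # e.g. "wave_arm", "move_forward" -- illustrative names
    parameter: float = 0.0

@dataclass
class ContentItem:
    """One step of a lesson: media to present, plus an optional robot motion."""
    media_type: str      # "picture" | "text" | "video" | "sound" | "voice"
    payload: str         # file name or text body
    motion: Optional[MotionCommand] = None

@dataclass
class EducationalContents:
    educatee_id: str
    items: List[ContentItem] = field(default_factory=list)

    def action_items(self):
        """Items carrying an action command, as checked at steps S1501/S1503."""
        return [it for it in self.items if it.motion is not None]

# A two-item sample lesson (file names are placeholders).
lesson = EducationalContents(
    educatee_id="educatee-01",
    items=[
        ContentItem("voice", "greeting.wav"),
        ContentItem("video", "lesson1.mp4", motion=MotionCommand("wave_arm")),
    ],
)
```

An active program bundled with the contents would iterate over `items`, presenting each payload and dispatching any attached motion to the motion controller.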
The controller 310 of the robot downloading the educational contents from the learning management server 300a analyzes information of the educational contents and determines whether an educator connection command is contained in the information of the educational contents (S1415). At this time, if the educator connection command is contained in the information of the educational contents, the controller 310 of the robot transmits an educator connection request to the educator connection module of the learning management server through the communication unit. The educator connection module 430 receiving the educator connection request from the robot 100a searches the educator information database 470 and determines whether an educator capable of being connected to the robot 100a exists. At this time, if an educator capable of being connected to the robot 100a exists, the educator connection module 430 connects the robot 100a and the educator terminal 400a (S1417). Alternatively, at the above step S1415, the robot and a specific educator can be automatically and simultaneously connected to the network at the time of downloading the contents, irrespective of the robot's request.
In a state in which the robot 100a and the educator terminal 400a have been connected to the network, the controller 310 of the robot analyzes the downloaded and stored educational contents and then carries out the education for an educatee as shown in Figs. 15 and 16 (S1419).
That is, if an action command is contained in the contents while the education based on the downloaded educational contents for the educatee is carried out through voice and image data outputted from the speaker 395 and the image display unit 385 (S1501 and S1502), the controller 310 of the robot generates a motion control signal based on the action command, transfers the generated motion control signal to the motion controller 360, and enables the robot 100a to take an action corresponding to the command (S1503 and S1504). That is, the robot 100a carries out the dynamic education through speech, images or gestures by moving its own body in the forward, backward, left or right directions or moving its arms or legs in a predetermined manner. On the other hand, if no action command is contained in the contents, the controller 310 returns to the above step S1501 and continuously carries out the education based on the voices and images (S1502).
If a request from the educatee, such as a question, is received while the education based on the educational contents is carried out (S1505), the controller 310 of the robot determines whether its own program can process the request (S1506). That is, the controller 310 searches its own database for the user request to determine whether an action definition and action performance data exist in the database.
At this time, the controller 310 gives a response to the user request according to the programming if its own program can process the user request (S1507). That is, the controller 310 enables the robot to output the set speech, images and motions to the educatee.
On the other hand, if the program cannot process the user request, the controller 310 transfers the request of the educatee 600 to the connected management server 300a (S1508).
Accordingly, if the action definition for the request exists within the management server 300a, the management server 300a transmits information of the action definition to the robot 100a (S1508). On the other hand, if the action definition for the request does not exist within the management server 300a, the management server 300a inquires of the educator terminal 400a about the request and receives a response to the request from the educator 500 (S1508 and S1509).
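The three-level fallback of steps S1506 to S1509 (the robot's own database, then the management server, then a human educator) can be sketched as a single lookup chain. The function, parameter names, and sample entries are hypothetical.

```python
def respond_to_request(request, local_db, server_db, ask_educator):
    """Resolve an educatee's request through the three-level fallback:
    the robot's own database (S1506/S1507), then the management server
    (S1508), then the connected educator (S1508/S1509)."""
    if request in local_db:
        return ("robot", local_db[request])
    if request in server_db:
        return ("server", server_db[request])
    return ("educator", ask_educator(request))

# Illustrative databases; real entries would hold full action definitions
# (speech, images and motions), not just answer strings.
local_db = {"what is two plus two": "Four."}
server_db = {"what is the capital of Korea": "Seoul."}

answer = respond_to_request("what is the capital of Korea",
                            local_db, server_db,
                            lambda q: "(educator replies over the network)")
```

Only when both databases miss does the request reach the educator terminal, which matches the order of steps in the figure.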
Response data for the request of the educatee is transferred to the robot 100a through the network, and the controller 310 of the robot analyzes the response data (S1510), and then enables the robot 100a to output the images, voices and motions to the educatee (S1511). In the procedure of performing the above steps S1501 to S1511, because an image of the educatee is captured through the image recognizer 386 and then transmitted to the educator terminal through the communication unit 340, the educator can observe a learning state of the educatee during education. Accordingly, the educator can transfer a necessary command to the robot 100a through the educator terminal 400a without a request of the educatee, and the robot 100a can transfer the speech, images and motions to the educatee in response to a command from the educator.
On the other hand, if all education activities are terminated according to a schedule of the educational contents (S1512), the controller 310 of the robot transfers an education performance result to the learning evaluation module 440 of the learning management server 300a (S1514).
The learning evaluation module 440 stores the education performance result in a directory of a corresponding educatee of the robot information database 480, and then transmits the education performance result to the educator terminal 400a (S1515 and S1516).
The learning evaluation module 440 generates evaluation information for a corresponding educatee on the basis of the education performance result stored in the robot information database 480, and stores the generated evaluation information in a directory of the corresponding educatee (S1518).
The educator receiving the education performance information inputs the evaluation information for the corresponding educatee into the learning evaluation module 440 of the learning management server, and the learning evaluation module 440 further stores the educator's evaluation information in a directory of a corresponding educatee of the robot information database (S1517). In a state in which the automatic evaluation information and the educator's evaluation information have been stored, the result report creation and transmission module 450 generates a result report based on the automatic evaluation information and the educator's evaluation information, and stores the generated result report in a directory of a corresponding educatee of the robot information database 480 (S1519).
Further, the result report creation and transmission module 450 transmits the result report to the educatee through an e-mail or a text message to a mobile phone (S1520).
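The work of the result report creation and transmission module 450, combining the automatic evaluation with the educator's evaluation into one report for delivery by e-mail or text message, might look like the following sketch. The field names, sample values, and report layout are all invented for illustration; the patent specifies no report format.

```python
def build_result_report(auto_eval, educator_eval):
    """Combine the server's automatic evaluation (a dict of metrics) with
    the educator's free-form comments into one plain-text report."""
    lines = ["Learning result report", "-" * 22]
    lines += [f"[auto] {k}: {v}" for k, v in sorted(auto_eval.items())]
    lines += [f"[educator] {c}" for c in educator_eval]
    return "\n".join(lines)

# Hypothetical evaluation data for one educatee.
report = build_result_report(
    {"quiz_score": 85, "attendance": "100%"},
    ["Good progress on vocabulary.", "Needs practice with numbers."],
)
# The finished report would then be stored in the robot information
# database 480 and sent by e-mail or as a mobile text message (S1519/S1520).
```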
The preferred embodiments of the present invention have been described with reference to the annexed drawings. Here, the terms or words disclosed in this specification or claims are not limited to conventional meanings or meanings in a dictionary. The applicant of the present invention selected the terms or words disclosed in this specification or claims so that the present invention can be better understood. The terms or words disclosed in this specification or claims should be interpreted as meanings appropriate for the spirit of the present invention. The present invention is not limited to the above-described embodiments and configurations disclosed in the annexed drawings, and those skilled in the art will appreciate that various modifications and equivalents are possible without departing from the scope and spirit of the present invention.

As apparent from the above description, a robot-intelligence developing system of the present invention includes a radio modem mounted in a service robot in which various and precise action controls are needed, connects the robot to a robot management server and a manager terminal through the radio modem, and enables the robot to receive information of an action definition from the management server or the manager terminal when the robot cannot process a command or instruction, whereby functions in addition to those programmed in the robot can be implemented and hence a user feels as if the robot has intelligence similar to that of a human.
The robot-intelligence developing system of the present invention performs a one-to-one connection between the robot and the manager terminal, enables the robot to perform a command from a manager on the basis of the one-to-one connection, and enables the robot to perform human-like actions.
The robot-intelligence developing system of the present invention enables data communications between the robot, a learning management server, an educator terminal and other devices or units, thereby providing dynamic educational services through the robot.
The robot-intelligence developing system of the present invention automatically checks a learning time, tracks a position of an educatee at the learning time and educates the educatee, thereby enabling active learning management.

Claims

1. A system for developing intelligence of a robot, comprising: a management server for storing and managing data defining a robot's actions associated with various situations, newly updating action definition data when a new situation is encountered, and managing the newly updated data; a freely moving robot connected to the management server through a radio Internet for searching for an appropriate action definition associated with a recognized command from an action definition database of the robot and another action definition database of the management server connected through a communication network when a command is generated by a decision-making process of the robot or an outside source, and taking actions based on the action definition; and a communication network for enabling data communication between the management server and the robot; a robot performing a connection with the management server, searching for the appropriate action definition from data of the management server, receiving data of the searched action definition, and taking actions based on the data of the searched action definition, if an internally programmed action definition does not exist in the robot when a new command is generated.
2. The system as set forth in claim 1, wherein a command given to the robot comprises: a basic command for defining one action to be performed by the robot; and an extended command based on the basic command and having a plurality of basic commands defined according to the robot's states associated with one action to be performed by the robot.
3. The system as set forth in claim 2, wherein the command given to the robot is implemented on the basis of a script language.
4. The system as set forth in claim 1, wherein the system further comprises: at least one manager terminal connected to the management server for outputting data of the robot transferred from the management server, and transferring data inputted from a manager to the robot through the management server; the management server searching for a connectable manager terminal, transferring state information of the robot to a corresponding manager terminal, and transferring data of the action definition inputted from the manager to the robot, if action definition data associated with a command requested by the robot does not exist in the management server.
5. The system as set forth in claim 4, wherein the system transmits a 1:1 conversation request to the management server when the robot recognizes the 1:1 conversation request; and the management server searches for a connectable manager and establishes a communication path between a corresponding manager terminal and the robot; and voice and video data of a user inputted into the robot are transferred to the manager terminal, and the manager transfers a response inputted into the manager terminal to the robot, thereby enabling a 1:1 conversation service to be performed.
6. The system as set forth in claim 4, wherein a response command to be transferred from the manager to the robot is based on objects and events associated with data of actions to be taken by the robot and moving pictures, and a set of objects associated with one command is transferred to the robot; and the robot takes actions after receiving all data of the command.
7. The system as set forth in claim 4, wherein the manager terminal provides a list of command execution and action definition data stored in the management server; and the manager selects action definition data to be provided to the manager terminal from the management server, or directly inputs the action definition data if appropriate action definition data does not exist in the list.
8. The system as set forth in claim 4, wherein response contents inputted into the manager terminal by the manager are transferred to the robot in the form of text; and the robot outputs speech through a TTS (Text-To-Speech) function in response to the received contents in the form of text.
9. A method for developing intelligence of a robot by connecting the robot, a management server and a manager terminal through a communication network, comprising the steps of: a) allowing the robot to generate a predetermined command according to a user's command or internal state change; b) allowing the robot to recognize the command generated at the step a) and search its own database for action definition data associated with the generated command; c) allowing the robot to take actions based on the action definition data if the action definition data exists in the database as a result of the search, and connecting the robot to a management server through a radio network and requesting the management server to search for the action definition data associated with the command if the action definition data does not exist in the robot; d) allowing the management server to search its own database for the action definition data associated with the command transferred from the robot and transmit corresponding action definition data to the robot if the action definition data exists in the database; and e) allowing the robot to receive the data transferred from the management server and execute the command on the basis of the received action definition data.
10. The method as set forth in claim 9, wherein the method further comprises the steps of: f) transferring the request of the robot to the manager terminal if the action definition data associated with a corresponding command does not exist in the management server at the step d); and g) allowing a manager to input, into the manager terminal, contents of actions to be performed by the robot in relation to the command transferred from the robot, and allowing the manager terminal to transfer the inputted contents to the robot through the management server; the robot taking actions associated with the command provided from the management server or the manager terminal.
11. A method for developing intelligence of a robot by connecting the robot, a management server and a manager terminal through a communication network, comprising the steps of: allowing a user to transfer a 1:1 conversation command to the robot; allowing the robot to recognize the 1:1 conversation command from the user and transfer a 1:1 conversation request to the management server; allowing the management server to search for a connectable manager if it receives the 1:1 conversation request from the robot, transfer the 1:1 conversation request of the robot to a corresponding manager terminal, and establish a communication path between the manager terminal and the robot; and allowing the management server to collect voice and video data of a user and transmit the collected voice and video data to the manager terminal if a one-to-one connection between the robot and the manager terminal has been established, and allowing the manager to identify a state of the user through the manager terminal and transfer a command as a response to the robot.
12. The method as set forth in claim 11, wherein the method further comprises the steps of: allowing the management server to provide a list of commands and action definitions to the manager terminal and allowing the manager to search for the list through the manager terminal and select an appropriate corresponding command if the appropriate corresponding command exists in the list; allowing the manager to transfer the corresponding command selected by the manager to the robot, and allowing the robot to take actions based on the action definition data; and allowing the manager to directly input a corresponding command if an appropriate corresponding command does not exist in the list provided from the management server and transmit the inputted command to the robot, and allowing the robot to take actions based on the inputted command.
13. The method as set forth in claim 11, wherein the corresponding command transferred from the manager terminal to the robot is provided in the form of text, and the robot outputs speech corresponding to the received text through a TTS (Text-To-Speech) function.
14. A method for allowing an educator to educate a specific educatee using a freely moving robot for transmitting and receiving data through a computer network and processing the data to organize information, and a server connected to the robot through the network, comprising the steps of: allowing the robot to track a position of the educatee and move to an area where the educatee can be educated; if the educatee is in the area where the educatee can be educated, connecting the robot to the server and transmitting a specific identification code of the robot to the server; allowing the server to download, to the robot, educational contents corresponding to the identification code transmitted from the robot, search for a connectable educator, and connect the robot with a corresponding educator terminal; and allowing the robot to execute the downloaded educational contents and educate the educatee.
15. The method as set forth in claim 14, wherein the method further comprises the steps of: determining whether information for requesting an educator to educate the educatee is contained in the educational contents; and if information for requesting an educator to educate the educatee is contained in the educational contents, connecting the robot with the educator through the network.
16. The method as set forth in claim 14, wherein the method further comprises the steps of: allowing the robot to determine whether an action command is contained in the educational contents; and if an action command is contained in the educational contents, allowing the robot to analyze a corresponding action command and perform a specific motion based on the action command.
17. The method as set forth in claim 14, wherein the method further comprises the steps of: allowing the educatee to input the educatee's request such as a specific question to the robot; allowing the robot to transmit the educatee's request to an educator terminal; if the educator inputs a response to the educatee's request to the educator terminal, allowing the educator terminal to transmit the response to the robot; and allowing the robot to output the response to the educatee.
18. The method as set forth in claim 14, wherein the method further comprises the steps of: if an education schedule based on the educational contents is terminated, transmitting an education performance result to the server; allowing the server to store the education performance result and output the education performance result to the educator terminal; allowing the educator to perform an evaluation based on the education performance result and input the evaluation information into the server; and allowing the server to output the evaluation information to the educatee through the robot.
19. The method as set forth in claim 18, wherein the method further comprises the step of: allowing the server to perform an evaluation based on the education performance result by itself and output the evaluation information to the educatee.
20. A system for performing dynamic education using a robot, comprising: an educational server for providing data defining a robot's actions associated with various situations and a plurality of educational contents; and a freely moving robot for transmitting and receiving data to and from the educational server connected through a communication network and processing the data; the robot outputting the educational contents provided from the educational server to an educatee using speech, images and motions.
21. The system as set forth in claim 20, wherein the system further comprises: an educator terminal connected to the robot through the educational server for enabling an educator to transfer an education command to the educatee in response to state information transferred from the intelligent robot; the educational server connecting the robot and the educator terminal in which the information can be transmitted and received.
22. The system as set forth in claim 21, wherein the system further comprises: an RF (Radio Frequency) module carried by the educatee for transmitting a position-tracking signal; and a position tracking device installed in the robot for tracking a position of the educatee by transmitting and receiving the position-tracking signal to and from the RF module carried by the educatee, calculating values of direction and distance so that the robot moves to the tracked position of the educatee.
23. The system as set forth in claim 22, wherein the robot further comprises: an image processing device for capturing an image of the educatee, digitalizing a captured signal and transferring the digitalized captured signal to the educator terminal, whereby the educator observes a learning state of the educatee at any time, thereby performing efficient learning management.
24. A method for allowing an educator to educate a specific educatee using an intelligent freely moving robot for transmitting and receiving data through a communication network and processing the data to organize information, and a server for providing various educational information to the robot, comprising the steps of: allowing the educatee to input a learning request into a computer through the robot; downloading educational contents corresponding to the learning request to the robot; and allowing the robot to output the educational contents to the educatee through speech, images and motions.
25. The method as set forth in claim 24, wherein the method further comprises the steps of: connecting the robot and a specific educator through the network in response to the learning request; allowing the educatee to input the educatee's request including a question into the robot; outputting the educatee's request to a corresponding educator; inputting the educator's response based on the educatee's request; and allowing the robot to output the educator's response to the educatee through speech, images and motions.
26. The method as set forth in claim 24, wherein the method further comprises the step of: allowing the robot to track a position of the educatee and move to an area where the educatee can be educated.
PCT/KR2002/001599 2001-08-28 2002-08-28 Method and system for developing intelligence of robot, method and system for educating robot thereby WO2003019452A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR2001/51964 2001-08-28
KR20010051964 2001-08-28

Publications (1)

Publication Number Publication Date
WO2003019452A1 true WO2003019452A1 (en) 2003-03-06

Family

ID=19713620

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2002/001599 WO2003019452A1 (en) 2001-08-28 2002-08-28 Method and system for developing intelligence of robot, method and system for educating robot thereby

Country Status (2)

Country Link
KR (1) KR100486382B1 (en)
WO (1) WO2003019452A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100360204C (en) * 2005-06-16 2008-01-09 上海交通大学 Control system of intelligent perform robot based on multi-processor cooperation
WO2017027123A1 (en) * 2015-08-12 2017-02-16 Intel Corporation Robot with awareness of users and environment for use in educational applications
WO2017028571A1 (en) * 2015-08-20 2017-02-23 Smart Kiddo Education Limited An education system using connected toys
JP2017047519A (en) * 2015-09-04 2017-03-09 Rapyuta Robotics株式会社 Cloud robotics system, information processor, program, and method for controlling or supporting robot in cloud robotics system
CN108377251A (en) * 2018-05-08 2018-08-07 山西乐博特机器人教育科技有限公司 Educational robot tutoring system
CN110827179A (en) * 2019-10-18 2020-02-21 引力互联国际有限公司 Method, apparatus and storage medium for providing artificial intelligence education
CN110861085A (en) * 2019-11-18 2020-03-06 哈尔滨工业大学 VxWorks-based mechanical arm instruction interpreter system
CN111429263A (en) * 2019-12-11 2020-07-17 南京奥拓电子科技有限公司 Information interaction method, device, server and system for robot

Families Citing this family (8)

Publication number Priority date Publication date Assignee Title
KR20040042242A (en) * 2002-11-13 2004-05-20 삼성전자주식회사 home robot using home server and home network system having the robot
KR100627506B1 (en) * 2004-07-05 2006-09-21 한국과학기술원 Netwofk-based software tobot system with own Internet Protocol address
KR100678728B1 (en) * 2005-06-16 2007-02-05 에스케이 텔레콤주식회사 Interaction between mobile robot and user, System for same
KR100824314B1 (en) * 2006-08-01 2008-04-22 주식회사 유진로봇 Image Compositing System for Motivation Using Robot
KR101209012B1 (en) * 2012-01-31 2012-12-24 한성대학교 산학협력단 Play interface device for education using character robot
KR101456554B1 (en) * 2012-08-30 2014-10-31 한국과학기술원 Artificial Cognitive System having a proactive studying function using an Uncertainty Measure based on Class Probability Output Networks and proactive studying method for the same
KR20180018211A (en) * 2016-08-12 2018-02-21 엘지전자 주식회사 Self-learning robot
KR102026338B1 (en) * 2018-10-08 2019-09-27 엘지전자 주식회사 Artificial intelligence ficself-learning robot

Citations (5)

Publication number Priority date Publication date Assignee Title
KR20000066931A (en) * 1999-04-22 2000-11-15 윤덕용 Internet control system and method thereof for remote Mobile Robots Via internet control
US6175206B1 (en) * 1997-05-12 2001-01-16 Kawasaki Jukogyo Kabushiki Kaisha Robot information processor
KR20010095514A (en) * 2000-04-10 2001-11-07 김성헌 Toy system which have medical function by remote control
KR20020000538A (en) * 2001-12-05 2002-01-05 이정욱 Remote robot control system using ethernet
JP2002120184A (en) * 2000-10-17 2002-04-23 Human Code Japan Kk Robot operation control system on network

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11126017A (en) * 1997-08-22 1999-05-11 Sony Corp Storage medium, robot, information processing device and electronic pet system
US6149490A (en) * 1998-12-15 2000-11-21 Tiger Electronics, Ltd. Interactive toy
KR20020037618A (en) * 2000-11-15 2002-05-22 윤종용 Digital companion robot and system thereof
KR20020043982A (en) * 2000-12-05 2002-06-12 구자홍 An intelligent robotic bird based on speaking and learning
KR100423592B1 (en) * 2001-05-28 2004-03-22 주식회사 새손텔레콤 a fuzzy toy robot system and drive method studied possible using radio frequency communication

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6175206B1 (en) * 1997-05-12 2001-01-16 Kawasaki Jukogyo Kabushiki Kaisha Robot information processor
KR20000066931A (en) * 1999-04-22 2000-11-15 윤덕용 Internet control system and method thereof for remote Mobile Robots Via internet control
KR20010095514A (en) * 2000-04-10 2001-11-07 김성헌 Toy system which have medical function by remote control
JP2002120184A (en) * 2000-10-17 2002-04-23 Human Code Japan Kk Robot operation control system on network
KR20020000538A (en) * 2001-12-05 2002-01-05 이정욱 Remote robot control system using ethernet

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100360204C (en) * 2005-06-16 2008-01-09 上海交通大学 Control system of intelligent perform robot based on multi-processor cooperation
WO2017027123A1 (en) * 2015-08-12 2017-02-16 Intel Corporation Robot with awareness of users and environment for use in educational applications
US20170046965A1 (en) * 2015-08-12 2017-02-16 Intel Corporation Robot with awareness of users and environment for use in educational applications
WO2017028571A1 (en) * 2015-08-20 2017-02-23 Smart Kiddo Education Limited An education system using connected toys
JP2017047519A (en) * 2015-09-04 2017-03-09 Rapyuta Robotics株式会社 Cloud robotics system, information processor, program, and method for controlling or supporting robot in cloud robotics system
CN108377251A (en) * 2018-05-08 2018-08-07 山西乐博特机器人教育科技有限公司 Educational robot tutoring system
CN110827179A (en) * 2019-10-18 2020-02-21 引力互联国际有限公司 Method, apparatus and storage medium for providing artificial intelligence education
CN110861085A (en) * 2019-11-18 2020-03-06 哈尔滨工业大学 VxWorks-based mechanical arm instruction interpreter system
CN110861085B (en) * 2019-11-18 2022-11-15 哈尔滨工业大学 VxWorks-based mechanical arm instruction interpreter system
CN111429263A (en) * 2019-12-11 2020-07-17 南京奥拓电子科技有限公司 Information interaction method, device, server and system for robot
CN111429263B (en) * 2019-12-11 2024-04-26 南京奥拓电子科技有限公司 Robot information interaction method, device, server and system

Also Published As

Publication number Publication date
KR20030019125A (en) 2003-03-06
KR100486382B1 (en) 2005-04-29

Similar Documents

Publication Publication Date Title
WO2003019452A1 (en) Method and system for developing intelligence of robot, method and system for educating robot thereby
US11841789B2 (en) Visual aids for debugging
CN110308753B (en) Intelligent agricultural robot cloud control system and method
US20170213156A1 (en) Artificial intelligence engine having multiple independent processes on a cloud based platform configured to scale
CN109521927B (en) Robot interaction method and equipment
JP2016048417A (en) Service providing system and program
KR101970297B1 (en) Robot system for generating and representing emotion and method thereof
JP6617263B2 (en) Learning support system
CN112204654A (en) System and method for predictive-based proactive dialog content generation
JP6963778B2 (en) Service provision system and program
EP4016319A1 (en) Method and system for controlling robotic agents
JP6562328B2 (en) Support system
US20230083486A1 (en) Learning environment representations for agent control using predictions of bootstrapped latents
Fritsch et al. A flexible infrastructure for the development of a robot companion with extensible HRI-capabilities
CN111515970B (en) Interaction method, mimicry robot and related device
KR20210033809A (en) Control server and method for controlling robot using artificial neural network, and the robot implementing the same
Nguyen et al. A framework for learning to request rich and contextually useful information from humans
Hafez et al. Efficient intrinsically motivated robotic grasping with learning-adaptive imagination in latent space
Khamis et al. Remote interaction with mobile robots
JP7208603B2 (en) Service provision system and terminal
Cho et al. Implementation of human-robot VQA interaction system with dynamic memory networks
JP6899103B2 (en) Service provision system and program
Kim Ubiquitous robot: Recent progress and development
Spexard et al. A memory-based software integration for development in autonomous robotics
JP7444489B2 (en) Learning support system and service provision method

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BY BZ CA CH CN CO CR CU CZ DE DM DZ EC EE ES FI GB GD GE GH HR HU ID IL IN IS JP KE KG KP KZ LK LR LS LT LU LV MA MD MG MK MW MX MZ NO NZ OM PH PL PT RO SD SE SG SI SK SL TJ TM TN TR TT UA UG US UZ VC VN YU ZA ZM

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ UG ZM ZW AM AZ BY KG KZ RU TJ TM AT BE BG CH CY CZ DK EE ES FI FR GB GR IE IT LU MC PT SE SK TR BF BJ CF CG CI GA GN GQ GW ML MR NE SN TD TG

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LU MC NL PT SE SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP