WO2009157733A1 - Interactive learning system using robot and method of operating the same in child education

Interactive learning system using robot and method of operating the same in child education

Info

Publication number
WO2009157733A1
Authority
WO
WIPO (PCT)
Prior art keywords
child
robot
learning
cpu
letter
Prior art date
Application number
PCT/KR2009/003464
Other languages
French (fr)
Inventor
Kyung-Chul Shin
Seong-Ju Park
Kyoung-Seon Lee
Eun-Ja Hyun
So-Yune Kim
Sie-Kyung Jang
Original Assignee
Yujin Robot Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Yujin Robot Co., Ltd.
Priority to EP09770405.0A (published as EP2321817A4)
Priority to CN200980125606.5A (published as CN102077260B)
Publication of WO2009157733A1

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 - Electrically-operated educational appliances
    • G09B5/06 - Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63H - TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H3/00 - Dolls
    • A63H3/28 - Arrangements of sound-producing means in dolls; Means in dolls for producing sounds
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00 - Teaching not covered by other main groups of this subclass
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00 - Electrically-operated teaching apparatus or devices working with questions and answers
    • G09B7/02 - Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63H - TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H2200/00 - Computerized interactive toys, e.g. dolls



Abstract

An interactive learning system using a robot includes an input unit (10) that inputs signals using sensors for sensing the figure and voice of a child and the teaching material, inputs content used to teach the child, or inputs a signal when directly touched by the child; a CPU (20) that determines an action corresponding to a signal input through the input unit (10) and controls the elements to progress a teaching process; a data table (30) that provides control data called by the CPU (20) when a signal input with respect to the child is generated in the CPU (20); and a drive unit (40) that receives a control signal and drives an audio unit, a video unit, and a motor of the robot to perform the robot activities required for learning when the CPU (20) transmits the control signal based on the control data of the data table (30). A method of operating the same is also disclosed.

Description

INTERACTIVE LEARNING SYSTEM USING ROBOT AND METHOD OF OPERATING THE SAME IN CHILD EDUCATION
The present invention relates to an interactive learning system using a robot and a method of operating the same. More particularly, the present invention relates to an interactive learning system using a robot and a method of operating the same, which develops suitable content for an interaction function of the robot and provides an interactive learning situation between a child and the robot while the child learns using the robot, instead of allowing the robot to present Internet content per se or to perform a unilateral action. Since learning development during babyhood is achieved through direct experience and interaction, the present invention provides an enhanced learning system adapted to babyhood based on interaction between a robot and a child.
Learning is an act of gaining new information or skills through relatively lasting changes in behavior that result from practice or experience. In general, learning involves givers who teach, receivers, and media between them that enhance learning efficiency. The media encompass all forms and channels used to deliver information. Further, media used in a learning situation are known to represent channels for educational interaction between a teacher and a child. In particular, when media are used to deliver content for a certain instructional purpose, they are called instructional media.
The rise of digital media such as computer systems and the Internet as instructional media has brought a shift from media serving as instruments for a teacher to media acting as a teacher per se. The computer system has also developed from stationary media to portable media such as notebook computers, personal digital assistants (PDAs), etc. Furthermore, autonomously mobile robots have begun to appear in learning.
Reports on the educational effectiveness of robots indicate that infants show increased interest, concentrate for longer durations, and actively imitate actions during interaction with a robot. Infants also pay continuous attention to the robot, maintain interest in it, and exhibit interactive behavior (touching). In particular, the reports indicate that education using robots is effective in improving reading, mathematical logic, and problem-solving abilities.
Korean Patent Application No. 10-2006-0118105 (entitled "robot system for learning-aids and teacher-assistants" and hereinafter referred to as a "prior invention") discloses a robot capable of improving the aforementioned educational effect.
Not only does the above robot system provide a self-directed and creative learning environment while moving freely around a classroom and holding simple conversations with students, it also behaves like a person assisting a teacher, thereby effectively delivering education based on information & communication technology (ICT), which was introduced to improve educational efficiency for students.
The robot system of the prior invention unilaterally provides learning content through a robot and a projection television, and a student concentrates on and learns the provided content, so that educational effect thereof can be enhanced when the student over a certain age is a recipient of the education. However, if a child under a certain age is a recipient of this system, the system is unsuitable for the child in view of concentration duration, habit, and the like.
In other words, one important factor for a child learning with this system is that the child must voluntarily participate in learning. However, one-way education that merely provides educational content, as in the robot system of the prior invention, has difficulty satisfying this factor.
The present invention is conceived to solve the problems of the related art.
An aspect of the invention is to provide an interactive learning system and a method of operating the same, which enable interaction between a child and a robot to be achieved while teaching the child using the robot to induce the child to actively participate in learning and extend learning duration by causing the child to recognize the robot as a playmate.
Another aspect of the invention is to provide an interactive learning system and a method of operating the same, which enable interaction between a child and a robot to be achieved through letter board recognition and total physical response teaching while teaching the child using the robot to induce the child to actively participate in learning and extend learning duration by inducing the child to raise a letter board on which a letter is written when the letter is output as a sound or image of the robot or teaching materials.
In accordance with one aspect, the invention provides an interactive learning system using a robot which outputs a sound and an image to teach a child and performs travel and motion by driving a motor. The interactive learning system includes: an input unit that inputs a signal using a sensor for sensing a figure and voice of a child and a teaching material selected to teach a child and that inputs content to teach a child or inputs a signal when directly touched by a child during learning; a central processing unit (CPU) that determines an action corresponding to a signal input through the input unit and controls elements to progress a process of teaching a child; a data table that provides control data called by the CPU when a signal input with respect to a child is generated in the CPU; and a drive unit that receives a control signal and drives an audio unit, a video unit, and the motor of the robot to perform activities of the robot required for learning when the CPU transmits the control signal based on the control data provided from the data table. Here, the data table includes a learning data table that stores data of a process for progressing the learning using the teaching material and stores standardized data of activities of the robot while performing the process; an action pattern data table that previously stores an action pattern of a child, which can be generated during learning, and stores the action pattern of the child generated in real time and observed by the input unit; a corresponding pattern data table that previously stores data about activities of the robot to be performed corresponding to the action pattern of a child and stores a situation when the activities of the robot are performed by a corresponding approximate value corresponding to the action pattern of the child generated in real time; and a driving data table that previously stores basic data used by the CPU to drive the robot to do the activities.
The CPU may further include a storage medium to store the total learning process performed by the robot and new data input from outside.
In accordance with another aspect, the invention provides a method of operating an interactive learning system using a robot. The method includes: if an input unit senses appearance of a child and inputs an appearance signal to a central processing unit (CPU), inducing the child to voluntarily greet the robot to perform a greeting process in response to activities of the robot according to an instruction of the CPU and presenting a teaching material to the child through activities of the robot according to an instruction of the CPU to induce the child to select the teaching material; sensing the teaching material selected by the child through the input unit, calling, by the CPU, learning data from a learning data table based on the sensed data and allowing a reading process to be performed according to a voice of the robot to progress the learning based on the teaching material selected by the child; while or after reading the teaching material, performing extended activities other than reading based on the learning data of the learning data table through activities of the robot according to an instruction of the CPU; if the learning process is completed, performing a finishing activity to notify completion of the learning through activities of the robot according to an instruction of the CPU; and, when switching between the foregoing operations, performing a switching operation to attract a child's attention based on the data stored in the data table through activities of the robot according to an instruction of the CPU.
When performing each operation, if an expected action of the child is sensed by the input unit, the CPU may call the action pattern data of the child and the corresponding pattern data from the action pattern data table and allow the robot to respond to the expected action of the child through activities of the robot according to the instruction of the CPU, and if an unexpected action of the child is sensed by the input unit, the CPU may determine an approximate pattern value previously set and input to the action pattern data and the corresponding pattern data and allow activities of the robot based on the approximate pattern value to be performed.
The reading process may include a basic reading process where teaching material content is read and a careful reading process where at least one of re-reading, reading together, section reading, enunciated reading, and repeating is performed while performing the basic reading process.
Performing the extended activities may include at least one of a process of reading the teaching material content by changing a letter into an illustration, a process of changing the read letter into an illustration, a process of outputting the read letter as an image in stroke order, and a process of expressing encouragement if the child says his or her impressions after reading the teaching material.
In accordance with a further aspect, the invention provides an interactive learning system based on a total physical response (TPR) learning model using a robot that outputs a sound and an image to teach a child and performs travel and motion by driving a motor. The system includes: a letter board on which a certain letter is previously written to be provided by the robot to a child in the form of a sound, an image or combination of the sound and the image; and a robot that recognizes the letter written on the letter board and induces a child to raise the letter board, so that at least one of the sound, the image, travel and motion can be provided to the child according to results of reading the letter from the letter board to respond to the results. Here, the robot includes: an input unit that includes a letter scanner to read the letter on the letter board when a child raises the letter board, inputs a signal using a sensor for sensing a figure and voice of a child and a teaching material selected to teach a child, and inputs content to teach a child or inputs a signal when directly touched by a child during learning; a central processing unit (CPU) that determines an action corresponding to a signal input through the input unit and controls elements to progress a process of teaching a child; a data table that provides control data called by the CPU when a signal input with respect to a child is generated in the CPU; and a drive unit that receives a control signal and drives an audio unit, a video unit, and the motor of the robot to perform activities of the robot required for learning when the CPU transmits the control signal based on the control data provided from the data table.
The data table may include a letter scan data table that stores input data about a shape of the letter to recognize the letter written on the letter board; a learning data table that stores data of a process for progressing the learning with the teaching material and stores standardized data of activities of the robot while performing the process; an action pattern data table that previously stores an action pattern of a child, which can be generated during learning, and stores the action pattern of the child generated in real time and observed by the input unit; a corresponding pattern data table that previously stores data about activities of the robot to be performed corresponding to the action pattern of a child and stores a situation when the activities of the robot are performed by a corresponding approximate value corresponding to the action pattern of the child generated in real time; and a driving data table that previously stores basic data for allowing the CPU to drive the robot to perform the activities.
In accordance with yet another aspect, the invention provides a method of operating an interactive learning system using a robot that outputs a sound and an image and performs travel and motion by driving a motor. The method includes: performing a learning process using a teaching material for allowing a child to learn reading and vocabulary through the robot; while or after reading the teaching material, inducing a child to raise a letter board, on which a certain letter is written, through a voice, an image, or a combination of the voice and the image according to an instruction of a central processing unit (CPU) based on learning data of a learning data table; and if the child raises the letter board, allowing a letter scanner to read the letter on the letter board and transmit a reading signal to the CPU, and calling data from the letter scan data table and expressing determined results about the letter selected by the child through activities of the robot according to an instruction of the CPU.
The performing the learning may include: if an input unit senses appearance of a child and inputs an appearance signal to a central processing unit (CPU), inducing the child to voluntarily greet the robot to perform a greeting process in response to activities of the robot according to an instruction of the CPU and presenting a teaching material to the child through activities of the robot according to an instruction of the CPU to induce the child to select the teaching material; sensing the teaching material selected by the child through the input unit, calling, by the CPU, learning data from a learning data table based on the sensed data and allowing a reading process to be performed according to a voice of the robot to progress the learning based on the teaching material selected by the child; while or after reading the teaching material, performing extended activities other than reading based on the learning data of the learning data table through activities of the robot according to an instruction of the CPU; if the learning process is completed, performing a finishing activity to notify completion of the learning through activities of the robot according to an instruction of the CPU; and, when switching between the foregoing operations, performing a switching operation to attract a child's attention based on the data stored in the data table through activities of the robot according to an instruction of the CPU.
When performing each operation, if an expected action of the child is sensed by the input unit, the CPU may call action pattern data of the child and corresponding pattern data from an action pattern data table and allow the robot to respond to the expected action of the child through activities of the robot according to an instruction of the CPU, and if an unexpected action of the child is sensed by the input unit, the CPU may determine an approximate pattern value previously set and input to the action pattern data and the corresponding pattern data and allow activities of the robot based on the approximate pattern value to be performed.
The performing the extended activities may include performing total physical response (TPR) learning by making the child act corresponding to the letter after the child reads the letter raised by the child or by inducing the child to follow action of the robot after the robot acts corresponding to the letter.
As described above, according to one embodiment of the invention, the learning system employs an instructional design using an intelligent robot when educating a child such that the child recognizes the robot as a playmate, thereby increasing attention span during learning through natural participation and providing more effective learning.
Further, according to one embodiment of the invention, the learning system allows a child to learn a language through physical activity based on playing-mode learning with a letter board, thereby ensuring more effective learning through a total physical response (TPR) method.
Fig. 1 is a block diagram of an interactive learning system according to one exemplary embodiment of the present invention.
Fig. 2 is a block diagram of data tables in the interactive learning system according to one exemplary embodiment of the present invention.
Fig. 3 illustrates a robot to which the interactive learning system of Fig. 1 is applied.
Fig. 4 is a table showing operating status of the robot of Fig. 3.
Fig. 5 shows a flow of operating the interactive learning system according to one exemplary embodiment of the present invention.
Fig. 6 shows a child selecting teaching material through a robot during operation of the interactive learning system according to one exemplary embodiment of the present invention.
Fig. 7 is a block diagram of an interactive learning system according to another exemplary embodiment of the present invention.
Fig. 8 is a block diagram of data tables in the interactive learning system according to another exemplary embodiment of the present invention.
Fig. 9 is a flowchart of letter board learning operation in the interactive learning system according to another exemplary embodiment of the present invention.
Fig. 10 shows the letter board learning operation in the interactive learning system according to another exemplary embodiment of the present invention.
Fig. 11 shows results from the learning operation of Fig. 10.
Exemplary embodiments of the present invention will be described with reference to the accompanying drawings.
Fig. 1 is a block diagram of an interactive learning system according to one exemplary embodiment of the present invention and Fig. 2 is a block diagram of data tables in the interactive learning system according to one exemplary embodiment of the present invention.
Referring to the drawings, an interactive learning system according to this embodiment includes an input unit 10, a central processing unit (CPU) 20, a data table 30, and a drive unit 40. This system is embodied by an internal circuit of a robot. Typically, the robot includes a speaker 1 for outputting a sound; a monitor 2 and a light emitting diode (LED) module 3 for outputting an image; and a travel unit 4 and a motion unit 5 for performing travel and motion by driving a motor. In general, the travel unit 4 includes a motor, a wheel, a power transmission unit, etc., and the motion unit 5 is configured to express behaviors of an arm, a leg, winking, etc., using a motor (or a power means such as a cylinder or the like).
As shown in Fig. 3, the robot includes an emotion expression module, an image composite module, and a driving module for the motion unit 5. Particularly, such a robot has a face, a head, arms, and a wheel, which can be driven in various ways as shown in Fig. 4 to achieve basic actions for driving the travel unit 4 and the motion unit 5 described below.
In particular, it should be understood that the head, the arm, the wheel or the like may be driven by the power means, such as the motor, and change in the face may be expressed by a liquid crystal display (LCD) or a light emitting diode (LED) module provided in the form of eyes on the face.
The input unit 10 includes a sensor 11, a microphone 12, a camera 15, and a touch screen 16. Thus, the input unit 10 employs the camera 15 to sense the figure of a child who uses the robot for learning and the teaching material selected for teaching the child; the microphone 12 to sense the child's voice; and the sensor 11 to sense the child's entrance, exit, movement, or the like. A child can also input a signal through the interface unit 22.
The respective elements of the input unit 10 are generally installed throughout the robot from head to body, but for convenience of maintenance, all elements may be installed in the body. The sensor 11 includes an image sensor for sensing a figure of a child, teaching materials, etc., and a stereo microphone for sensing a position of a sound source based on a child's voice.
In the drawing, an amplifier 13 and a converter 14, connected to the sensor 11 and the microphone 12, are configured to amplify an analog signal and convert it into a digital signal.
Particularly, the touch screen 16 allows a child's guardian or educator to input educational content for a child or allows a child to directly touch and input a signal while progressing the learning.
The CPU 20 may include a microcomputer, a microprocessor, or the like to determine an action corresponding to a signal input through the input unit 10 and to progress a process for teaching a child.
Further, the CPU 20 accesses a personal computer (PC), the Internet or a network via a communication line when receiving learning data or the like, and is connected with a universal serial bus (USB) port (mostly, installed in the body of the robot) or the like when updating the data.
The CPU 20 includes a storage medium 21, such as a memory, a hard disk, or a compact disc (CD) (with a CD player), to store the overall learning process performed by the robot and new data input from outside.
In the drawing, the interface unit 22 transmits the input data of the input unit 10 in a form that can be easily processed by the CPU 20.
The data table 30 stores various control data called by the CPU 20 when a signal input with respect to a child is generated in the CPU 20.
Specifically, the data table 30 includes a learning data table 31, an action pattern data table 32, a corresponding pattern data table 33, and a driving data table 34.
The learning data table 31 stores data of a process for progressing learning with the teaching material and stores standardized data of activities of the robot while performing the process.
The action pattern data table 32 previously stores an action pattern of a child which can be generated during learning and stores the action pattern of the child generated in real time and observed by an associated element of the input unit 10.
The corresponding pattern data table 33 previously stores data about activities of the robot to be performed corresponding to the action pattern of a child and stores a situation when the activities of the robot are performed by a corresponding approximate value corresponding to the action pattern of the child generated in real time.
The driving data table 34 previously stores basic data for allowing the CPU 20 to drive the activities of the robot.
As such, the data table 30 serves, along with the CPU 20, as a controller or an actual operation processor that actually handles educational interaction. For example, the data table may be a readable and writable memory, a hard disk, or the like.
When the CPU 20 transmits a control signal based on control data provided from the data table, the drive unit 40 receives the control signal and drives an audio unit, a video unit and the motor of the robot based on the control signal, thereby allowing the robot to output a sound and an image or to perform travel and motion (hereinafter, referred to as "activities of the robot").
Here, the sound and the image output by the robot may be achieved by the speaker 1, the monitor 2 and the LED module 3, and the travel and the motion of the robot may be achieved by the travel unit 4 and the motion unit 5. To this end, the drive unit 40 includes an audio driver 41, a video driver 42, and a motor driver 43.
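To make the relationship among these units concrete, the following minimal Python sketch models the signal path from an input event through the CPU 20 and the data table 30 to the drive unit 40. The patent specifies no implementation language or API, so all class names, method names, and table entries here are illustrative assumptions.

```python
from dataclasses import dataclass, field


@dataclass
class DataTable:
    """Control data called by the CPU (cf. Fig. 2)."""
    learning: dict = field(default_factory=dict)               # table 31
    action_pattern: dict = field(default_factory=dict)         # table 32
    corresponding_pattern: dict = field(default_factory=dict)  # table 33
    driving: dict = field(default_factory=dict)                # table 34


class DriveUnit:
    """Drives the audio, video, and motor elements on a control signal."""
    def run(self, control: dict) -> None:
        if "sound" in control:
            print(f"[speaker 1]  {control['sound']}")
        if "image" in control:
            print(f"[monitor 2]  {control['image']}")
        if "motion" in control:
            print(f"[motor]      {control['motion']}")


class Cpu:
    """Maps input-unit events to robot activities via the data table."""
    def __init__(self, table: DataTable, drive: DriveUnit) -> None:
        self.table = table
        self.drive = drive

    def handle(self, event: str) -> None:
        # Call control data for the event; fall back to an
        # attention-attracting activity (cf. operation S7).
        control = self.table.learning.get(event, {"sound": "(plays music)"})
        self.drive.run(control)


table = DataTable(learning={
    "child_appears": {"sound": "Hello!", "motion": "wave arms"},
})
cpu = Cpu(table, DriveUnit())
cpu.handle("child_appears")   # greeting process (operation S1)
```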
According to the embodiment, the respective elements of the robot for operating the interactive learning system have functions as shown in Table 1.
Table 1
Main function | Contents | Details | Functions used in practical program
Motion control | Element controls | wheel/head/arms/expression/dance control | All
Voice service | Voice synthesis | Korean adult/child, Korean orally narrated fairy tale, English adult/child | Korean child voice
Voice service | Voice recognition | voice recognition/name recognition/speaker recognition/sound source recognition/handclap recognition | touch recognition, handclap recognition
Vision service | Face recognition | movement detection, face detection, face recognition, stare | Use of staring function
Vision service | Object recognition | object recognition | Use of letter board recognition
Navigation | Self-travel | navigation [landmark/obstacle recognition] | All
Information service | Robot information | robot profile, service specialized DB, remote controller/touch sensor, bumper, LCD, charging sensor, motor compulsory drive information | remote controller, touch sensor, bumper, LCD, motor compulsory movement path, service specialized DB
Media interface | | speaker volume, microphone volume, audio recording, audio reproduction, video recording, video reproduction, image capture | speaker volume, audio reproduction
System service | Application handling, file input/output | application handling, file input/output, graphic user interface (GUI) service, [screen keyboard] | All
Fig. 5 shows a flow of operating the interactive learning system according to one exemplary embodiment of the present invention.
Referring to Fig. 5, the learning system senses appearance of a child through the input unit 10 and inputs an appearance signal to the CPU 20 to induce the child to voluntarily greet the robot through activities of the robot according to an instruction of the CPU 20, thereby performing a greeting process at operation S1.
Then, the robot presents a teaching material to the child through activities of the robot according to an instruction of the CPU 20 and induces the child to select the teaching material at operation S2. The teaching material generally includes a book such as a picture book or the like, and classification data relating to a cover, pages and the like of the book are previously stored in the learning data table 31 so as to be identified by the sensor 11 or the camera 15 of the input unit 10.
When the child selects a teaching material, the input unit 10 senses the selected teaching material through the sensor 11, the camera 15, etc., and the CPU 20 calls learning data from the learning data table 31 based on the sensed data and allows a reading process to be performed according to a voice of the robot to progress the learning based on the teaching material selected by the child at operations S3-S4.
During the reading process, the child reads the book or the teaching material while watching teaching material content output through the monitor 2. The reading process is divided into a basic reading process at operation S3 and a careful reading process where at least one of re-reading, reading together, section reading, enunciated reading, and repeating is performed while performing the basic reading process at operation S4.
For example, the basic reading is a process of reading the teaching material in a simple pattern; the reading together is performed as the robot outputs a sound such as "shall we read together?" through the speaker 1; the section reading allows a letter that is difficult to pronounce, or helpful vocabulary, to be displayed and read letter by letter; and the enunciated reading is a process of reading a certain word while changing the tone or the color output as an image. Particularly, the careful reading process may be achieved by performing one or more of these processes simultaneously or independently.
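As an illustration of how the reading modes could be combined, the short sketch below encodes each mode as a function and runs them independently or together, mirroring operations S3 and S4. The function names and the sample sentence are assumptions for illustration, not taken from the patent.

```python
def basic_reading(sentence: str) -> None:
    # Operation S3: read the teaching material in a simple pattern.
    print(f"[robot reads] {sentence}")

def reading_together(sentence: str) -> None:
    # Invite the child before reading (output through the speaker 1).
    print("[speaker] Shall we read together?")
    basic_reading(sentence)

def section_reading(word: str) -> None:
    # Display and read a hard-to-pronounce word letter by letter.
    for letter in word:
        print(f"[monitor + speaker] {letter}")

def enunciated_reading(word: str, tone: str = "high", color: str = "red") -> None:
    # Read a certain word while changing its tone and displayed color.
    print(f"[speaker tone={tone}] [monitor color={color}] {word}")

sentence = "The rabbit hopped over the hill."   # invented sample text
basic_reading(sentence)                         # S3: basic reading
reading_together(sentence)                      # S4: careful reading modes,
section_reading("rabbit")                       # run together or
enunciated_reading("hopped")                    # independently
```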
While or after reading the teaching material, the robot performs extended activities other than reading based on the learning data of the learning data table 31 through activities of the robot according to an instruction of the CPU 20 at operation S5.
Such extended activities may include a process of reading the teaching material content by changing a letter into an illustration, a process of changing the read letter into an illustration, a process of outputting the read letter as an image in stroke order, a process of expressing encouragement if a child says his or her impressions after reading the teaching material, etc. These processes may likewise be performed simultaneously or independently.
If the learning process is completed, the robot performs a finishing activity to notify completion of learning through activities of the robot according to an instruction of the CPU 20 at operation S6. The finishing activity may induce a farewell address as in the foregoing greeting process or induce the child to answer a question about repetition of the learning process.
In each operation of the learning system, the robot performs a switching operation to attract a child's attention through activities of the robot according to an instruction of the CPU 20 based on the data stored in the data table 30 when switching between the operations at operation S7.
The switching operation may be achieved by an activity of outputting music to a child through the speaker 1 or allowing a child to observe an interesting image on the monitor 2. Furthermore, the switching operation may be achieved by inducing the travel unit 4, the motion unit 5, etc. of the robot to perform a certain action.
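The overall flow of Fig. 5 can be summarized as a staged pipeline with a switching activity between stages. The sketch below is a hedged illustration of that control flow, under the assumption that each of operations S1 to S6 can be modeled as a callable stage; the stage labels follow the description above.

```python
def switching_activity() -> None:
    # Operation S7: attract the child's attention between operations,
    # e.g. music through the speaker 1 or an image on the monitor 2.
    print("[robot] plays music / shows an image / performs an action")

def run_session(stages) -> None:
    for i, stage in enumerate(stages):
        stage()
        if i < len(stages) - 1:   # switch between consecutive operations
            switching_activity()

run_session([
    lambda: print("S1: greeting process"),
    lambda: print("S2: present teaching material; child selects"),
    lambda: print("S3: basic reading"),
    lambda: print("S4: careful reading"),
    lambda: print("S5: extended activities"),
    lambda: print("S6: finishing activity"),
])
```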
In each operation of the learning system, if an expected action of the child is sensed by the input unit 10, the CPU 20 calls the action pattern data of the child and the corresponding pattern data from the action pattern data table 32 and allows the robot to respond to the expected action of the child through activities of the robot according to an instruction of the CPU 20.
On the other hand, if an unexpected action of the child is sensed by the input unit 10, the CPU 20 determines an approximate pattern value previously set and input to the action pattern data and the corresponding pattern data and allows the activities of the robot based on the approximate pattern value to be performed.
The approximate pattern value is the value in the action pattern data table 32 that, as determined by the CPU 20, is closest to the previously input data for the unexpected action of the child. The activities of the robot are then performed by calling the data in the corresponding pattern data table 33 that corresponds to the approximate pattern value.
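One plausible reading of this approximate-value lookup is a nearest-neighbor match between an observed action and the patterns stored in the action pattern data table 32, followed by a lookup in the corresponding pattern data table 33. The sketch below assumes actions are encoded as small feature vectors; the patent does not specify any encoding or distance metric, so both are assumptions here.

```python
import math

# Table 32 (assumed encoding): action pattern -> feature vector.
action_patterns = {
    "raises_hand": (1.0, 0.0, 0.0),
    "walks_away":  (0.0, 1.0, 0.0),
    "speaks_loud": (0.0, 0.0, 1.0),
}
# Table 33: action pattern -> robot response.
corresponding_patterns = {
    "raises_hand": "call on the child",
    "walks_away":  "play music to regain attention",
    "speaks_loud": "lower voice and ask a question",
}

def respond(observed: tuple) -> str:
    # The nearest stored pattern plays the role of the
    # "approximate pattern value" determined by the CPU 20.
    nearest = min(action_patterns,
                  key=lambda name: math.dist(action_patterns[name], observed))
    return corresponding_patterns[nearest]

# An unexpected action closest to "walks_away":
print(respond((0.1, 0.9, 0.2)))   # -> play music to regain attention
```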
Particularly, in each operation of the learning system according to the embodiment of the invention, the activities of the robot may be achieved by performing at least one of the audio output, the video output, and the travel and motion of the robot simultaneously or independently through the drive unit 40.
Example 1
A test evaluating the development of children's understanding of a tale during operation of the learning system using the robot, as described above, was carried out; the results are shown in Table 2.
For the evaluation, four-year-old children were recruited and divided into an evaluation experimental group (for the learning system according to one embodiment of the invention) and a control group (for a multimedia-type learning system using a computer system). The test was carried out using the same book in a comfortable room for the children.
Two experts in juvenile education participated in the children's learning process, and the learning process was photographed. The test results of the experimental group and the control group were analyzed statistically using a well-known standard test method.
Table 2
Understanding of tale | Pre-test M(SD) | Post-test M(SD) | t
Experimental group (17) | 1.19 (.90) | 2.47 (.86) | .008*
Control group (17) | 1.28 (.96) | 1.31 (1.28) | .93
Note: seventeen children in each group
*P<.05 **P<.001
According to the results, the experimental group and the control group show significance levels (t) of .008 and .93, respectively, for post-test tale understanding, which indicates that only the experimental group improved significantly in tale understanding compared with the pre-test.
Example 2
Through the same process as in Example 1, the development of children's story composing ability was evaluated; the results are shown in Table 3.
Table 3
Story composing ability | Pre-test M(SD) | Post-test M(SD) | t
Experimental group (17) | 1.39 (.78) | 2.47 (.86) | .00**
Control group (17) | 1.86 (.99) | 1.86 (.54) | 1.00
Note: seventeen children in each group
*P<.05 **P<.001
According to the results, the experimental group and the control group show significance levels (t) of .00 and 1.00, respectively, for post-test story composing ability, which indicates that only the experimental group improved significantly and substantially in story composing ability compared with the pre-test.
Example 3
Through the same process as in Example 1, the development of children's word reading ability was evaluated; the results are shown in Table 4.
Table 4
Word learning effect | Pre-test M(SD) | Post-test M(SD) | t
Experimental group (17) | 3.06 (3.54) | 8.35 (5.93) | .00**
Control group (17) | 2.47 (2.72) | 3.76 (4.52) | .11
Note: seventeen children in each group
*P<.05 **P<.001
According to the results, the experimental group and the control group show significance levels (t) of .00 and .11, respectively, for post-test word reading ability, which indicates that only the experimental group improved significantly and substantially in word reading ability compared with the pre-test.
In the foregoing tests, operations of the learning system using the robot and the learning system using the multimedia are analyzed in view of interaction between a learner and each system, as shown in Table 5.
Table 5
Interaction relative to reading | Intelligent robot type | Multimedia type
Guided repeated oral reading | O | ×
Repeated reading | O | O
Pair reading | O | ×
Reading together | O | ×
Writing and reading impressive content | O | ×
Simple question | O | O
Interactive question | O | ×
Simple feedback | O | ×
Interactive feedback | O | ×
Reading a picture book shown by the child | O | ×
Simple story composition and personal activity | O | ×
Storing/feedback for child's portfolio | O | ×
Sound-writing combination | O | O
Others: following a simple action | O | ×
Referring to the results of Examples 1 to 3 and the analysis of Table 5, the child group that experienced interactive learning using the robot learning system was superior to the child group that experienced unilateral learning using the multimedia provided via a computer in view of tale understanding, story composing ability, and word reading ability.
For story composing ability, which depends on a child's vocabulary, there was a large difference between the learning effects of the two media, and the control group was not substantially changed by the learning. In other words, the learning efficiency of the interactive learning using the robot was much higher than that of the unilateral learning based on multimedia such as a computer system. Therefore, the interactive learning provided by the system of this embodiment can be used with very high efficiency in many educational institutions.
Fig. 7 is a block diagram of an interactive learning system according to another exemplary embodiment of the invention, and Fig. 8 is a block diagram of data tables in the interactive learning system according to another exemplary embodiment of the invention.
Hereinafter, the same descriptions as those for the block diagram of the interactive learning system according to the embodiment shown in Fig. 1 and the block diagram of the data tables in the interactive learning system according to the embodiment shown in Fig. 2 will be omitted, and only differences therebetween will be described.
Referring to Figs. 7 and 8, the interactive learning system according to this embodiment includes a letter board L and a robot R.
On the letter board L, a certain letter is previously written to be provided by the robot to a child in the form of a sound, an image, or a combination of the sound and the image. The letter board L is made of a light, nontoxic material so as to be easily raised by a child and is rounded at its corners to prevent a child from getting hurt. Particularly, the background color and the color of the letter written on the letter board L are chosen so that the board can be photographed or scanned by the letter scanner 17 of the robot R described below.
The robot R recognizes the letter written on the letter board L and induces a child to raise the letter board L, so that at least one of the sound, the image, the travel and the motion can be provided to the child according to results of reading the letter from the letter board L, thereby responding to the results.
In addition to the foregoing functions, the robot R includes the configurations required for a child's learning, which include an input unit 10, a central processing unit 20, a data table 30, and a drive unit 40.
Referring to Fig. 7, the camera 15 performs not only the foregoing function but also a function of inputting a letter image of the letter board L, so that the letter scanner 17 can read the letter on the letter board L based on a letter image signal. To this end, the letter scanner 17 may include a video signal processor or the like to recognize the previously written letter.
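A minimal sketch of the letter scanner 17 follows, assuming the camera frame can be reduced to a shape key that is looked up in the letter scan data table 35. A real implementation would apply template matching or OCR to the video signal, which the patent leaves unspecified; the shape keys below are invented for illustration.

```python
from typing import Optional

# Table 35 (assumed form): stored shape key -> recognized letter.
letter_scan_table = {
    "shape_A": "A",
    "shape_B": "B",
}

def scan_letter_board(frame_key: str) -> Optional[str]:
    """Return the letter recognized from a camera frame, or None."""
    # A real letter scanner 17 would process the video signal from
    # the camera 15; here the frame is already reduced to a key.
    return letter_scan_table.get(frame_key)

print(scan_letter_board("shape_A"))   # -> A
print(scan_letter_board("shape_X"))   # -> None (unrecognized board)
```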
Referring to Fig. 8, the data table 30 is configured to store various control data called by the CPU 20 when a signal input with respect to a child is generated in the CPU 20.
Specifically, the data table 30 includes a letter scan data table 35, a learning data table 31, an action pattern data table 32, a corresponding pattern data table 33, and a driving data table 34.
The letter scan data table 35 stores input data about the shape of the letter so as to recognize the letter written on the letter board.
In the interactive learning system with the foregoing configurations according to this embodiment, each element of the robot has the same functions as in Table 1, with the differences that the motion control of Table 1 employs the TPR method and that object recognition is required for learning with the letter board.
The process of operating the interactive learning system according to this embodiment is the same as that according to the foregoing embodiment shown in Fig. 5, and thus repetitive descriptions thereof will be avoided.
Fig. 9 is a flowchart of the letter board learning operation in the interactive learning system according to another exemplary embodiment of the present invention. Tables 6 to 9 present a storyboard showing the process of performing the letter board learning operation and the TPR method.
In each table, the first item is the progressing order of the story and the second item shows the scene in which the story progresses. The third item is the scenario, and the fourth item shows cautions for the scenario with regard to each scene.
Table 6
7 Recognized letter board withsentence{Put botharms up} (if a child inputs a letter board)[TTS] hmm, it's a (directive)/I will dothis![Face] happy[Head] nodding[Arms] both arms are waved up and down[TTS] Roby~/ Roby~(move by instruction)(ex., if a card "put your arms up" isinput[TTS] put both arms/ up[Arms] both arms are put up[Face] happy)[TTS] ok/ it's ok/ to do like this. [Face] normal[Head] nodding[TTS] Please select/ plays/ for me/ oneby one/ okay? Now let's get started~[Face] normal[Arms] swing a little forward andbackward [Effect] A card achild raisesis recognizedand read byTTS.
8 {Roby face} [TTS] Roby~/Roby~[TTS] (name)!! Please say my name!Start!![Arms] swing actively left and right andup and down[Face] nodding up and down [Screen] Roby blinks[Screen] A questionblinks.
Table 7
8-1 From thesecond, 8-1to 10 arerepeated{Roby's face} [TTS] Now/ what do you want/to do?[Face] leftward at an angle of 45degrees, rightward at an angle of 45degrees, and forward[Arms] swing a littleChange order[TTS] Roby~/Roby~[Face] normal[Arms] swing a little[TTS] Please select another[Arms] raise at an angle of 90 degrees and return to the origin[Arms LED] flickering and off
Order 9
Scene: {Touch the floor}
Scenario:
(If a child chooses a letter board, select one of the following responses)
[TTS] great~ / okay~ / I see~ / yeah, that's it
(action in common)
[Face] shy
[Head] nodding
[Wheel] leftward at an angle of 45 degrees → rightward at an angle of 45 degrees
(If a child does not choose any letter board, wait until he/she chooses the letter board.)
If there is no response for three seconds:
[TTS] (name)~/ I wonder/ what letter board you will choose
[Face] disappointed
[Arms] swing both arms up and down
If there is no response for five seconds:
[TTS] (name)~/ What are you doing?/ I am waiting.
[Face] disappointed
[Arms] swing both arms up and down
Cautions: [Recognition] After recognizing the letter board, the contents of the letter board are displayed on the screen. Letter board: a key word is bold and large.
Table 8
Order 10
Scene: {Touch the floor}
Scenario:
[TTS] Roby~/ Roby~
[Arms] swing actively left and right and up and down
[Head] nodding up and down
[TTS] (reading the chosen letter board)
(action corresponding to a letter board of the next line)
Cautions: [Screen] key word flickers.
Letter board (directive) and action (an underlined word is a key word; in common: [Face] happy; [Arms] swing a little left and right as long as there is no instruction):
Nod head! → [Head] nodding largely
Go ahead! → [Wheel] goes ahead and returns to the origin
Shake body! → [Wheel] twist dance
Put both arms up → [Arms] raising both arms up at an angle of 90 degrees and down
Dance! → [Wheel] techno dance 1, 2, 3, 4 combination; [Sound] background music
Shake head! → [Head] shaking vigorously left and right; [Arms] swing a little
Swing both arms! → [Arms] swing both arms vigorously.
(Scenes 8-1 to 10 are repeated five times.)
Table 9
Order 11
Scene: A child's face appears (a picture in an attendance book is usable) {child's picture}
Scenario:
[TTS] (name)!/ it is very/ interesting
[Face] happy
[Arms] swing cross
[TTS] Now/ look at/ my stomach, and move according to the letter board on the screen
[Face] happy
[TTS] Now/ are you ready?
[Arms] swing a little forward and backward
Order 12
Scene: {Give me a handclap} (should appear every time ~ check)
Scenario:
[TTS] (name)~/ (name)~
[Face] nodding
[Arms] swing cross
(the screen shows the child one of the action lists)
(one second after)
[TTS] (reading the letter board)
[Face] nodding
[Arms] cross swing
Cautions: [Screen] A key word appears on the screen while being pronounced, and a hint image is shown together with reading the other sentence. For example, in the case of "give me a handclap", the screen shows "give me" when "give me" is pronounced, and then the other sentence "a handclap" with a handclapping image thereon.
Action lists for the child (a key word is underlined):
Jump up suddenly → [Arms] cross swing
Touch the floor → [Arms] swing both hands a little
Give me a handclap! → [Arms] cross swing
Cry hurrah → [Arms] raise both arms up high
Hold my arm! → [Arms] swing both arms vigorously
Stroke my head!
Referring to Fig. 9 and Tables 6 to 9, the letter board learning operation is performed to further improve learning efficiency during the interactive learning operation. Such a letter board learning operation may be performed in the course of the basic reading process, the careful reading process, and the extended reading process.
While or after reading the teaching material, the CPU 20 provides a certain letter through a voice, an image, or a combination of the voice and the image, based on the learning data of the learning data table 31, at operation S10.
Then, a child raises the letter board L according to the proposed letter at operation S11.
If the child raises the letter board L, the letter scanner 17 reads the letter on the letter board L and transmits a reading signal to the CPU 20 at operation S12. The reading may be achieved by the letter scanner 17 processing a video signal input from the camera 15 and inputting the result to the CPU 20.
After receiving the signal, the CPU 20 calls data from the letter scan data table 35 and expresses the determined result for the letter selected or written by the child through activities of the robot at operation S13. For example, the result may be expressed by driving the motion unit 5 to change the robot's facial expression or the like while outputting a phrase such as "Great" or "Well, that is not the letter. Let's try again" as a sound or an image from the robot R.
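Pulling operations S10 to S13 together with the three- and five-second re-prompts from the storyboard in Table 8, the control flow might be sketched as follows. The `robot` facade and all of its method names are assumptions for illustration, not an API defined by the patent.

```python
# A sketch of one S10-S13 letter-board round, including the storyboard's
# timed re-prompts. The `robot` facade and its methods are assumptions.
import time

def letter_board_round(robot, proposed_letter, timeout=10.0):
    """Propose a letter, wait for the child's board, express the result."""
    robot.say(proposed_letter)                   # S10: propose the letter
    start = time.time()
    prompted_3s = prompted_5s = False
    while time.time() - start < timeout:
        board_text = robot.scan_letter_board()   # S11/S12: poll; None if no board raised
        if board_text is not None:
            if board_text == proposed_letter:    # S13: express the determined result
                robot.say("Great")
            else:
                robot.say("Well, that is not the letter. Let's try again")
            return board_text
        waited = time.time() - start
        if waited >= 3 and not prompted_3s:      # storyboard: re-prompt after 3 seconds
            robot.say("I wonder what letter board you will choose")
            prompted_3s = True
        if waited >= 5 and not prompted_5s:      # storyboard: re-prompt after 5 seconds
            robot.say("What are you doing? I am waiting.")
            prompted_5s = True
        time.sleep(0.1)
    return None
```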
The driving of the motion unit 5 may be applied to the total physical response (TPR) learning model according to recognition of the letter board. That is, the TPR learning model causes the robot to induce the child to act, as shown in Tables 6 to 9, in response to the sentences displayed on the letter board, thereby improving language learning efficiency.
In more detail, if a child raises the letter board L as shown in Fig. 10, the robot R induces the child to read the letter through the foregoing processes and to act out the read letter.
The most effective way to induce the child to act out the letter is for the robot R to perform the corresponding action itself through the motion unit 5, so that the child follows it.
For example, as shown in Fig. 11, when the letter board L raised by a child shows "shake your body", the robot R reads the letter through the foregoing processes and shakes its body through the motion unit 5, inducing the child to follow the action and thereby enabling TPR learning.
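The directive-to-action pairs in the sketch below are taken from the storyboard in Table 8; dispatching them through a motion-unit interface, and the interface itself, are illustrative assumptions.

```python
# Directive-to-action pairs from the Table 8 storyboard. The motion-unit
# dispatch interface is a hypothetical stand-in for motion unit 5.
TPR_ACTIONS = {
    "nod head": ("head", "nod_largely"),
    "go ahead": ("wheel", "forward_and_return"),
    "shake body": ("wheel", "twist_dance"),
    "put both arms up": ("arms", "raise_90_and_down"),
    "dance": ("wheel", "techno_dance_combo"),
    "shake head": ("head", "shake_vigorously"),
    "swing both arms": ("arms", "swing_vigorously"),
}

def perform_tpr(motion_unit, directive):
    """Act out a recognized directive so the child can follow the robot."""
    key = directive.lower().rstrip("!")
    unit, action = TPR_ACTIONS[key]    # raises KeyError for unknown directives
    motion_unit.run(unit, action)      # hypothetical motion-unit interface
```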
In this embodiment, Example 1, in which development of a child's tale understanding ability was evaluated by operating the learning system using the robot, Example 2, in which development of a child's story composing ability was evaluated by the same process as Example 1, and Example 3, in which development of a child's word reading ability was evaluated, show the same results as Examples 1 to 3 of the foregoing embodiment shown in Tables 3 to 5, and thus repetitive descriptions thereof will be omitted herein.
Accordingly, since the interactive learning system of this embodiment is also operated using the robot, the child group that experienced the interactive learning was superior, in tale understanding ability, story composing ability, and word reading ability, to the child group that experienced unilateral learning based on multimedia such as a computer system. For story composing ability, there was a large difference between the learning effects of the two media, and the control group was not substantially changed by the learning.
As a result, for story composing ability, which depends on a child's vocabulary, the learning efficiency of interactive learning using the robot is much higher than that of unilateral learning based on multimedia such as a computer system. Therefore, the interactive learning provided by the system according to this embodiment can be used with very high efficiency in many educational institutions.

Claims (12)

  1. An interactive learning system using a robot that outputs a sound and an image to teach a child and performs travel and motion by driving a motor, the interactive learning system comprising:
    an input unit 10 that inputs a signal using a sensor for sensing a figure and voice of a child and a teaching material selected to teach a child and that inputs contents to teach a child or inputs a signal when directly touched by a child during learning;
    a central processing unit (CPU) 20 that determines an action corresponding to a signal input through the input unit 10 and controls elements to progress a process of teaching a child;
    a data table 30 that provides control data called by the CPU 20 when a signal input with respect to a child is generated in the CPU 20, the data table 30 comprising:
    a learning data table 31 that stores data of a process for progressing the learning with the teaching material and stores standardized data of activities of the robot while performing the process,
    an action pattern data table 32 that previously stores an action pattern of a child, which can be generated during the learning, and stores the action pattern of the child generated in real time and observed by the input unit 10,
    a corresponding pattern data table 33 that previously stores data about activities of the robot to be performed corresponding to the action pattern of a child and stores a situation when the activities of the robot are performed by a corresponding approximate value corresponding to the action pattern of the child generated in real time, and
    a driving data table 34 that previously stores basic data used by the CPU 20 to drive the robot to do the activities; and
    a drive unit 40 that receives a control signal and drives an audio unit, a video unit, and the motor of the robot to perform activities of the robot required for learning when the CPU 20 transmits the control signal based on the control data provided from the data table 30.
  2. The system according to claim 1, wherein the CPU 20 further comprises a storage medium 21 to store a total learning process performed by the robot and a new data input generated from an outside.
  3. A method of operating an interactive learning system using a robot, the method comprising:
    if an input unit 10 senses appearance of a child and inputs an appearance signal to a central processing unit (CPU) 20, inducing the child to voluntarily greet the robot to perform a greeting process in response to activities of the robot according to an instruction of the CPU 20 and presenting a teaching material to the child through activities of the robot according to an instruction of the CPU 20 to induce the child to select the teaching material;
    sensing the teaching material selected by the child through the input unit 10, calling, by the CPU 20, learning data from a learning data table 31 based on the sensed data and allowing a reading process to be performed according to a voice of the robot to progress the learning based on the teaching material selected by the child;
    while or after reading the teaching material, performing extended activities other than reading based on the learning data of the learning data table 31 through activities of the robot according to an instruction of the CPU 20;
    if the learning process is completed, performing a finishing activity to notify completion of the learning through activities of the robot according to an instruction of the CPU 20; and
    when switching between the foregoing operations, performing a switching operation to attract a child's attention based on the data stored in the data table 30 through activities of the robot according to an instruction of the CPU 20.
  4. The method according to claim 3, wherein, when performing each operation, if an expected action of the child is sensed by the input unit 10, the CPU 20 calls the action pattern data of the child and the corresponding pattern data from the action pattern data table 32 and allows the robot to respond to the expected action of the child through activities of the robot according to an instruction of the CPU 20, and if an unexpected action of the child is sensed by the input unit 10, the CPU 20 determines an approximate pattern value previously set and input to the action pattern and the corresponding pattern data and allows activities of the robot based on the approximate pattern value to be performed.
  5. The method according to claim 3, wherein the reading process comprises
    a basic reading process where the contents of a teaching material are read; and
    a careful reading process where at least one of reading again, reading together, sectional reading, enunciated reading, and repeating is performed while performing the basic reading process.
  6. The method according to claim 3, wherein the performing extended activities comprises at least one of a process of reading the contents of the teaching material by changing a letter into an illustration, a process of changing the read letter into an illustration, a process of outputting the read letter as an image in a stroke order, and a process of expressing encouragement if the child says his or her impressions after reading the teaching material.
  7. An interactive learning system based on a total physical response (TPR) learning model using a robot that outputs a sound and an image to teach a child and performs travel and motion by driving a motor, the system comprising:
    a letter board L on which a certain letter is previously written to be provided by the robot to a child in the form of a sound, an image or combination of the sound and the image; and
    a robot R that recognizes the letter written on the letter board L and induces a child to raise the letter board L, so that at least one of the sound, the image, travel and motion can be provided to the child according to results of reading the letter from the letter board L to respond to the results, the robot R comprising:
    an input unit 10 that includes a letter scanner 17 to read the letter on the letter board L when a child raises the letter board L, inputs a signal using a sensor for sensing a figure and voice of a child and a teaching material selected to teach a child, and inputs contents to teach a child or inputs a signal when directly touched by a child during learning;
    a central processing unit (CPU) 20 that determines an action corresponding to a signal input through the input unit 10 and controls elements to progress a process of teaching a child;
    a data table 30 that provides control data called by the CPU 20 when a signal input with respect to a child is generated in the CPU 20; and
    a drive unit 40 that receives a control signal and drives an audio unit, a video unit, and the motor of the robot to perform activities of the robot required for learning when the CPU transmits the control signal based on the control data provided from the data table 30.
  8. The interactive learning system according to claim 7, wherein the data table 30 comprises
    a letter scan data table 35 that stores input data about a shape of the letter to recognize the letter written on the letter board L;
    a learning data table 31 that stores data of a process for progressing the learning with the teaching material and stores standardized data of activities of the robot R while performing the process;
    an action pattern data table 32 that previously stores an action pattern of a child, which can be generated during the learning, and stores the action pattern of the child generated in real time and observed by the input unit 10;
    a corresponding pattern data table 33 that previously stores data about activities of the robot to be performed corresponding to the action pattern of a child and stores a situation when the activities of the robot are performed by a corresponding approximate value corresponding to the action pattern of the child generated in real time; and
    a driving data table 34 that previously stores basic data used by the CPU 20 to drive the robot to do the activities.
  9. A method of operating an interactive learning system using a robot that outputs a sound and an image and performs travel and motion by driving a motor, the method comprising:
    performing a learning process using a teaching material for allowing a child to learn reading and vocabulary through the robot;
    while or after reading the teaching material, inducing the child to raise a letter board L, on which a certain letter is written, through a voice, an image, or a combination of the voice and the image according to an instruction of a central processing unit (CPU) 20 based on learning data of a learning data table 31; and
    if a child raises the letter board L, allowing a letter scanner 17 to read the letter on the letter board L and transmit a reading signal to the CPU 20, and calling data from a letter scan data table 35 and expressing determined results about the letter selected by the child through activities of the robot according to an instruction of the CPU 20.
  10. The method according to claim 9, wherein the performing the learning comprises
    if an input unit 10 senses appearance of a child and inputs an appearance signal to a central processing unit, inducing the child to voluntarily greet the robot to perform a greeting process in response to activities of the robot according to an instruction of the CPU 20 and presenting a teaching material to the child through activities of the robot according to an instruction of the CPU 20 to induce the child to select the teaching material;
    sensing the teaching material selected by the child through the input unit 10, calling, by the CPU 20, learning data from a learning data table 31 based on the sensed data and allowing a reading process to be performed according to a voice of the robot to progress the learning based on the teaching material selected by the child;
    while or after reading the teaching material, performing extended activities other than reading based on the learning data of the learning data table 31 through activities of the robot according to an instruction of the CPU 20;
    if the learning process is completed, performing a finishing activity to notify completion of the learning through activities of the robot according to an instruction of the CPU 20; and
    when switching between the foregoing operations, performing a switching operation to attract a child's attention based on the data stored in the data table 30 through activities of the robot according to an instruction of the CPU 20.
  11. The method according to claim 9, wherein when performing each operation, if an expected action of the child is sensed by the input unit 10, the CPU 20 calls action pattern data of the child and corresponding pattern data from an action pattern data table 32 and allows the robot to respond to the expected action of the child through activities of the robot according to an instruction of the CPU 20, and if an unexpected action of the child is sensed by the input unit 10, the CPU 20 determines an approximate pattern value previously set and input to the action pattern and the corresponding pattern data and allows activities of the robot based on the approximate pattern value to be performed.
  12. The method according to claim 10, wherein the performing the extended activities comprises performing total physical response (TPR) learning by making the child act corresponding to the letter after the child reads the letter raised by the child, or by inducing the child to follow the action of the robot R after the robot R acts corresponding to the letter.
PCT/KR2009/003464 2008-06-27 2009-06-26 Interactive learning system using robot and method of operating the same in child education WO2009157733A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP09770405.0A EP2321817A4 (en) 2008-06-27 2009-06-26 Interactive learning system using robot and method of operating the same in child education
CN200980125606.5A CN102077260B (en) 2008-06-27 2009-06-26 Interactive learning system using robot and method of operating same in child education

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
KR20080062034 2008-06-27
KR20080062033 2008-06-27
KR10-2008-0062033 2008-06-27
KR10-2008-0062034 2008-06-27
KR10-2009-0057392 2009-06-26
KR20090057392A KR101088406B1 (en) 2008-06-27 2009-06-26 Interactive learning system using robot and method of operating the same in child education

Publications (1)

Publication Number Publication Date
WO2009157733A1 true WO2009157733A1 (en) 2009-12-30

Family

ID=41812317

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2009/003464 WO2009157733A1 (en) 2008-06-27 2009-06-26 Interactive learning system using robot and method of operating the same in child education

Country Status (4)

Country Link
EP (1) EP2321817A4 (en)
KR (1) KR101088406B1 (en)
CN (1) CN102077260B (en)
WO (1) WO2009157733A1 (en)

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101119030B1 (en) * 2010-05-12 2012-03-13 (주) 퓨처로봇 Method of for editing scenario of intelligent robot and computer readable medium thereof, intelligent robot device and service method of intelligent robot
KR101333532B1 (en) * 2011-12-05 2013-11-28 (주)세스넷 Leaning device and method for using the same
KR101209012B1 (en) * 2012-01-31 2012-12-24 한성대학교 산학협력단 Play interface device for education using character robot
KR101344727B1 (en) * 2012-03-02 2014-01-16 주식회사 유진로봇 Apparatus and method for controlling intelligent robot
CN102663904A (en) * 2012-04-20 2012-09-12 江苏奇异点网络有限公司 Children entertainment system
CN102819969B (en) * 2012-08-15 2014-11-26 魔方天空科技(北京)有限公司 Implementation method for multimedia education platform and multimedia education platform system
KR101515178B1 (en) * 2013-01-14 2015-04-24 주식회사 케이티 Robot for providing face based user interface and control method thereof
KR101544044B1 (en) 2013-09-16 2015-08-13 이호현 expansion type robot for education
CN103777595A (en) * 2013-12-30 2014-05-07 深圳市德宝威科技有限公司 Robot system, robot office affair handling system, robot teaching system, robot designing system, robot engineering system and robot household system
CN104252287A (en) * 2014-09-04 2014-12-31 广东小天才科技有限公司 Interactive device and method for improving expressive ability on basis of same
WO2016206642A1 (en) * 2015-06-26 2016-12-29 北京贝虎机器人技术有限公司 Method and apparatus for generating control data of robot
KR101904453B1 (en) * 2016-05-25 2018-10-04 김선필 Method for operating of artificial intelligence transparent display and artificial intelligence transparent display
CN105894873A (en) * 2016-06-01 2016-08-24 北京光年无限科技有限公司 Child teaching method and device orienting to intelligent robot
CN106057023A (en) * 2016-06-03 2016-10-26 北京光年无限科技有限公司 Intelligent robot oriented teaching method and device for children
KR101983728B1 (en) * 2016-07-15 2019-06-04 주식회사 토이트론 Apparatus and method for operating smart toy performing command by recognizing card
CN106097793B (en) * 2016-07-21 2021-08-20 北京光年无限科技有限公司 Intelligent robot-oriented children teaching method and device
CN106295217A (en) * 2016-08-19 2017-01-04 吕佳宁 One breeds robot
CN106205237A (en) * 2016-08-31 2016-12-07 律世刚 Based on movement response and the training method of the second mother tongue of drawing reaction and device
CN106297436A (en) * 2016-09-12 2017-01-04 上海夫子云教育投资股份有限公司 A kind of intelligent robot video classes supplying system
CN107067835A (en) * 2016-11-23 2017-08-18 河池学院 A kind of children speech educational robot
KR20180089667A (en) 2017-02-01 2018-08-09 주식회사 시공미디어 Robot for providing coding education
CN107369341A (en) * 2017-06-08 2017-11-21 深圳市科迈爱康科技有限公司 Educational robot
KR102191488B1 (en) * 2017-10-27 2020-12-15 서울대학교산학협력단 Power and motion sensitized education robot
CN109300341A (en) * 2018-08-30 2019-02-01 合肥虹慧达科技有限公司 Interactive early education robot and its exchange method
CN109147433A (en) * 2018-10-25 2019-01-04 重庆鲁班机器人技术研究院有限公司 Childrenese assistant teaching method, device and robot
CN109366502B (en) * 2018-12-17 2022-04-08 广东誉丰教育科技有限公司 Network interactive education method based on artificial intelligence and robot
KR20200076169A (en) * 2018-12-19 2020-06-29 삼성전자주식회사 Electronic device for recommending a play content and operating method thereof
KR102134189B1 (en) * 2019-07-11 2020-07-15 주식회사 아들과딸 Method and apparatus for providing book contents using artificial intelligence robots
CN110619767A (en) * 2019-09-05 2019-12-27 顾柳泉 Intelligent education robot and computer readable medium
CN110751050A (en) * 2019-09-20 2020-02-04 郑鸿 Motion teaching system based on AI visual perception technology

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2702391Y (en) * 2004-02-04 2005-05-25 上海科技馆 Equipment for arithmetic game between man and robot
JP3923053B2 (en) * 2004-03-31 2007-05-30 ファナック株式会社 Robot teaching device
WO2006062274A1 (en) * 2004-12-07 2006-06-15 Rivalkorea Co., Ltd. Intelligent robot and mobile game method using the same
CN100559422C (en) * 2006-12-15 2009-11-11 华南理工大学 Possess read, the character recognition method of the educational robot of writing function
CN101411948A (en) * 2007-10-19 2009-04-22 鸿富锦精密工业(深圳)有限公司 Electronic toys
KR20090065212A (en) * 2007-12-17 2009-06-22 한국전자통신연구원 Robot chatting system and method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002351305A (en) * 2001-05-23 2002-12-06 Apollo Seiko Ltd Robot for language training
US20060257830A1 (en) * 2005-05-13 2006-11-16 Chyi-Yeu Lin Spelling robot

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2321817A4 *

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102236981A (en) * 2010-04-27 2011-11-09 上海以太软件有限公司 Digital intelligence development machine with large size liquid crystal display (LCD) and touch screen
CN101833884A (en) * 2010-05-17 2010-09-15 博雅创世(北京)智能科技有限公司 Robot teaching platform
CN102446428A (en) * 2010-09-27 2012-05-09 北京紫光优蓝机器人技术有限公司 Robot-based interactive learning system and interaction method thereof
WO2012056459A1 (en) * 2010-10-28 2012-05-03 Visionstory Ltd An apparatus for education and entertainment
US10086302B2 (en) 2011-05-17 2018-10-02 Zugworks, Inc. Doll companion integrating child self-directed execution of applications with cell phone communication, education, entertainment, alert and monitoring systems
EP2710575A4 (en) * 2011-05-17 2015-07-15 Pt Irunt Llc Child-directed learning system integrating cellular communication, education, entertainment, alert and monitoring systems
US11179648B2 (en) 2011-05-17 2021-11-23 Learning Squared, Inc. Educational device
CN102522008A (en) * 2011-11-23 2012-06-27 康佳集团股份有限公司 Multimedia interactive teaching method and system thereof, and TV
CN104575141A (en) * 2015-01-20 2015-04-29 三峡大学 Man-computer interaction auxiliary classroom teaching aid
CN105872828A (en) * 2016-03-30 2016-08-17 乐视控股(北京)有限公司 Television interactive learning method and device
CN105719519A (en) * 2016-04-27 2016-06-29 深圳前海勇艺达机器人有限公司 Robot with graded teaching function
US11511436B2 (en) 2016-08-17 2022-11-29 Huawei Technologies Co., Ltd. Robot control method and companion robot
WO2018044230A1 (en) * 2016-09-02 2018-03-08 Tan Meng Wee Robotic training apparatus and system
CN106251717A (en) * 2016-09-21 2016-12-21 北京光年无限科技有限公司 Intelligent robot speech follow read learning method and device
CN106393113A (en) * 2016-11-16 2017-02-15 上海木爷机器人技术有限公司 Robot and interactive control method for robot
JP2020511324A (en) * 2017-03-24 2020-04-16 ホアウェイ・テクノロジーズ・カンパニー・リミテッド Data processing method and device for child-rearing robot
US11241789B2 (en) 2017-03-24 2022-02-08 Huawei Technologies Co., Ltd. Data processing method for care-giving robot and apparatus
CN107547925A (en) * 2017-09-27 2018-01-05 刘伟平 A kind of video learns monitor system
TWI751511B (en) * 2019-09-05 2022-01-01 日商三菱電機股份有限公司 Inference device, machine control system and learning device
TWI833681B (en) * 2023-10-13 2024-02-21 國立勤益科技大學 Active vocabulary learning system

Also Published As

Publication number Publication date
KR101088406B1 (en) 2011-12-01
CN102077260A (en) 2011-05-25
CN102077260B (en) 2014-04-09
KR20100002210A (en) 2010-01-06
EP2321817A1 (en) 2011-05-18
EP2321817A4 (en) 2013-04-17

Similar Documents

Publication Publication Date Title
WO2009157733A1 (en) Interactive learning system using robot and method of operating the same in child education
Hyun et al. Comparative study of effects of language instruction program using intelligence robot and multimedia on linguistic ability of young children
US6517351B2 (en) Virtual learning environment for children
KR20140007347A (en) Vertically integrated mobile computer system
Bray et al. Technology and the diverse learner: A guide to classroom practice
Freed " This is the fluffy robot that only speaks french": language use between preschoolers, their families, and a social robot while sharing virtual toys
JP2020016880A (en) Dynamic-story-oriented digital language education method and system
Ao et al. Exploring the relationship between interactions and learning performance in robot-assisted language learning
de Souza Jeronimo et al. Comparing social robot embodiment for child musical education
Maxwell Beginning reading and deaf children
CN1279502C (en) An educational device
Raza Teaching listening to EFL students
Smith The Promise and Threat of Microcomputers for
McKeown Unlocking Potential: How ICT can support children with special needs
WO2013089356A1 (en) Teaching material for english education and recording medium having teaching material for english education recorded thereon
Hyun et al. Young children's perception of IrobiQ, the teacher assistive robot, with reference to speech register
CN112863267B (en) English man-machine conversation system and learning method
JP2003208084A (en) Device and method for learning foreign language
Tobin et al. How non-visual modalities can help the young visually impaired child to succeed in visual and other tasks
Sanchez et al. Social Robots in Education to Enhance Social Communications and Interaction Skills of Children with Autism-A Review
Lin et al. A Systematic Review on Oral Interactions in Robot-Assisted Language Learning. Electronics 2022, 11, 290
ZUBRYTSKA et al. INTEGRATING TECHNOLOGY INTO THE EDUCATIONAL PROCESS FOR CHILDREN WITH SPECIAL NEEDS: DRAWING ON EUROPEAN EXPERIENCE
Qiu et al. Integrating computer-based multimedia instructional design into teaching international English phonetic symbols
Yamamoto THE ROLE OF TEACHER-STUDENT RELATIONSHIPS TO CAUSE SYNCHRONY IN EFL CLASS
CN1912951A (en) Educational equipment

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200980125606.5

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09770405

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2009770405

Country of ref document: EP