WO2021111176A1 - Educational robot - Google Patents

Educational robot

Info

Publication number
WO2021111176A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
robot
educational robot
pair
multimedia content
Prior art date
Application number
PCT/IB2019/060491
Other languages
French (fr)
Inventor
Farid Premani
Original Assignee
CHOOBIN, Barry
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CHOOBIN, Barry filed Critical CHOOBIN, Barry
Priority to PCT/IB2019/060491 priority Critical patent/WO2021111176A1/en
Publication of WO2021111176A1 publication Critical patent/WO2021111176A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/0005Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/008Manipulators for service tasks
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00Electrically-operated educational appliances

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Toys (AREA)

Abstract

The present invention relates to an educational robot that includes a body which forms the main portion of the robot. The educational robot also includes a head mounted on the body which is capable of making human facial expressions. In addition, the educational robot includes a plurality of limbs that are capable of making human perceptible gestures. Further, the educational robot includes an interface housed in the body that allows the user to interact with the interface. In an example, the interface allows the user to select multimedia content from a library of multimedia content. The educational robot includes an imaging device to capture an image of the user in order to identify the user. Further, the educational robot renders the multimedia content in an interactive and intuitive way.

Description

EDUCATIONAL ROBOT
TECHNICAL FIELD
[0001] The present invention relates to an educational robot and more particularly relates to robots used in child education.
BACKGROUND OF THE INVENTION
[0002] Robots, depending on their utility, are employed in different classes of industry. One industry in which robots have become increasingly popular is education. Robots are used both at home and in classrooms to help students in their learning. They can also provide a remote connection between a teacher and students, wherein the teacher communicates through the robot.
[0003] Although robots are increasingly being used in education, known commercial robots have many drawbacks. For instance, they do not provide intuitive interaction between a student and the robot. Moreover, such robots are perceived as mere machines and are unable to attract the attention of students.
SUMMARY
[0004] This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[0005] The present invention aims to alleviate the problems associated with conventional robots used in the field of education. The present invention relates to an educational robot which is capable of interacting with a user in such a way that the interaction is human-like and user-intuitive.
[0006] Thus, the principal objective of the present invention is directed to an educational robot that is human-like and user-intuitive.
[0007] Another objective of the present invention is that the robot has multi-functional capabilities to cater to the educational needs of a student.
[0008] Yet another objective of the present invention is that the robot has entertainment capabilities for the entertainment of the student.
[0009] In one aspect, the educational robot includes a body which forms the main portion of the robot. The educational robot also includes a head portion mounted on the body which is capable of making facial expressions. Also, the educational robot includes a plurality of limbs that are capable of making human perceptible gestures. Further, the educational robot includes an interface housed in the body that allows the user to interact with the robot. In an example, the interface is a computing device with touch input that allows a user to select multimedia content from a library of multimedia content. The educational robot is also provided with a feature to deliver creative learning lessons. Furthermore, the educational robot includes a camera to capture an image of the user in order to identify the user. Further, the educational robot renders the multimedia content interactively and intuitively.
[0010] The educational robot of the present invention makes human perceptible gestures, thereby making the interaction with the user intuitive and human-friendly. Moreover, the educational robot of the present invention identifies the user and renders the multimedia content based on the identity of the user. As a result, the educational robot can tailor the multimedia content to each user.
BRIEF DESCRIPTION OF DRAWINGS
[0011] The features, aspects, and advantages of the subject matter will be better understood with regard to the following description, and accompanying figures. The use of the same reference number in different figures indicates similar or identical features and components.
[0012] Fig. 1 illustrates a schematic diagram of an educational robot, in accordance with an exemplary embodiment of the present invention.
[0013] Fig. 2 illustrates a side view of the educational robot, in accordance with an exemplary embodiment of the present invention.
[0014] Fig. 3 illustrates the head portion, in accordance with an exemplary embodiment of the present invention.
DETAILED DESCRIPTION
[0015] The above-mentioned implementations are further described herein with reference to the accompanying figures. It should be noted that the description and figures relate to exemplary implementations, and should not be construed as a limitation to the present subject matter. It is also to be understood that various arrangements may be devised that, although not explicitly described or shown herein, embody the principles of the present subject matter. Moreover, all statements herein reciting principles, aspects, and embodiments of the present subject matter, as well as specific examples, are intended to encompass equivalents thereof.
[0016] Fig. 1 illustrates a schematic diagram of an educational robot 100, in accordance with an embodiment of the present invention. The educational robot 100 can be humanoid in appearance, resembling a human body and some physical actions of the human body. Furthermore, the educational robot 100 can be used to render multimedia content to a user. Moreover, the educational robot 100 is capable of interacting with the user by making facial expressions and human perceptible gestures so that the interaction with the user is intuitive and communicative. The robot can communicate with students, teachers, and parents.
[0017] The educational robot 100 includes a body 1 that forms a major portion of the educational robot 100. The body 1 houses various components of the educational robot 100. The educational robot 100 also includes a head 2 that is mounted on the body 1. The head 2 is mounted on top of the body 1, and the shape and features of the head 2 are configured to resemble those of a human. The head 2 is shown to have a pair of eyes 3, a nose 4, and lips 5. The head 2 is capable of making human facial expressions. In one example, each of the eyes 3, the nose 4, and the lips 5 can be actuated selectively to present human facial expressions. In one example, the eyes 3, the nose 4, and the mouth 5 may be coupled to actuators, such as pneumatic, electromagnetic, or hydraulic actuators, to impart motion to them.
[0018] In one aspect, the educational robot 100 includes a plurality of limbs 6 that are configured as arms and legs of the educational robot 100. For instance, the limbs include a pair of arms and a pair of legs that can be selectively actuated to make human perceptible gestures. In one example, the limbs 6 may be coupled to actuators, such as pneumatic, electromagnetic, or hydraulic actuators, to impart motion to them. As is known, a robot comprises many parts operably connected to motors, such as servo motors, for imparting motion to the robot. The pair of legs is coupled to wheels that allow the robot 100 to move on the floor. The components of the robot can be powered by a battery, for example, a lithium-ion battery. The battery is preferably rechargeable, and a separate adaptor can be provided for recharging it.
[0019] The educational robot 100 also includes an interface 7 coupled to the upper front portion of the body 1. The interface 7 can be a touchscreen display mounted on a chest portion of the body 1. The interface 7 allows the user to interact with the robot and with the multimedia content stored in the memory of the robot 100. Alternatively, the user may request the multimedia content through an app that can be connected to the robot 100 over a wired or wireless network. In operation, the interface 7 may provide a list of the multimedia content to the user for the user to make a selection. In one example, the interface 7 also allows the user to take a test that the teacher has prepared. The interface 7 can be an LCD screen with a touch interface for input from the user. The LCD screen displays the multimedia content, lists, selections, and other content to the user. Moreover, the user can play educational games through the interface 7.
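By way of illustration only, the following minimal Python sketch shows how the content-selection flow described above might look in software; the library titles and function names are assumptions for illustration and are not part of the disclosure.

```python
# Illustrative sketch of the interface 7 content-selection flow.
# All titles and names here are hypothetical placeholders.
LIBRARY = {
    1: "Alphabet rhyme video",
    2: "Counting game",
    3: "Teacher-prepared test",
}

def list_content() -> None:
    """Display the stored multimedia library so the user can pick an item."""
    for key, title in LIBRARY.items():
        print(f"{key}. {title}")

def select_content(choice: int):
    """Return the chosen title, or None if the selection is invalid."""
    return LIBRARY.get(choice)

if __name__ == "__main__":
    list_content()
    print("Selected:", select_content(1))
```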
[0020] The educational robot 100 includes a camera 8 which captures an image of the user. The camera 8 can either be housed in the body 1 or on the head portion 2. In either implementation, the camera 8 is positioned in such a way that it captures the image of the user. The educational robot 100 also includes a processor 9 which is operably coupled to all other components of the educational robot 100 for controlling those components. The processor 9 can be a single processing unit or several units, all of which could include multiple computing units. The processor 9 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. The processor 9 also includes a storage device/memory that stores the multimedia content. In operation, the processor 9 controls the operation of the educational robot 100. For instance, the processor 9 controls the actuation of the head portion 2, the limbs 6, the interface 7, and the imaging device 8.
[0021] In one example, the processor 9 captures an image of the user through the camera 8 and, using one or more facial recognition algorithms, identifies the user. Based on the identified user, the robot behaves as programmed to interact with students, parents, or teachers. The processor 9 identifies the user to provide customized multimedia content. For instance, the processor 9 may provide one set of multimedia content to teachers and a different set to students. Such recognition can be performed by any known facial recognition technique. Also, the processor 9 controls the actuation of the limbs 6 and the head portion 2 to make gestures which, in one example, correspond to the multimedia content being rendered. For instance, the processor 9 controls the interface 7 to render a video of a rhyme for children and simultaneously controls the limbs to make gestures that perform an act corresponding to the verse of the rhyme, for example dancing. Such an act makes learning effective for children. Also, based on the multimedia content, the educational robot 100 may move its limbs 6 to present a dance form.
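By way of illustration only, the following Python sketch shows one way the facial-recognition-based customization described above could be implemented. The patent does not specify an algorithm or library; the open-source face_recognition package, the enrolled users, and the content lists below are assumptions.

```python
# Illustrative sketch: identify the user from a camera frame and return
# role-appropriate content. Library choice and all names are assumptions.
import face_recognition  # one possible off-the-shelf face recognition package

# Role-specific content libraries (hypothetical titles).
CONTENT_BY_ROLE = {
    "student": ["Alphabet rhyme video", "Counting game"],
    "teacher": ["Quiz builder", "Class progress report"],
    "parent": ["Weekly activity summary"],
}

def enroll(image_path: str):
    """Return a face encoding computed from a reference photo of a known user."""
    image = face_recognition.load_image_file(image_path)
    encodings = face_recognition.face_encodings(image)
    return encodings[0] if encodings else None

# name -> (role, face encoding); photo paths are placeholders.
KNOWN_FACES = {
    "asha": ("student", enroll("asha.jpg")),
    "mr_rao": ("teacher", enroll("mr_rao.jpg")),
}

def identify_and_select(frame):
    """Identify the user in a camera frame and return content for their role."""
    encodings = face_recognition.face_encodings(frame)
    if not encodings:
        return []
    for name, (role, known) in KNOWN_FACES.items():
        if known is not None and face_recognition.compare_faces([known], encodings[0])[0]:
            return CONTENT_BY_ROLE[role]
    return CONTENT_BY_ROLE["student"]  # default for unrecognized users
```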
[0022] The educational robot 100 also houses a printer 13 positioned under the screen to print multimedia content for the user. In one example, the printer 13 can be a color printer capable of printing content in color. Further, the printer can hold up to 100 sheets of paper, thereby making the printer self-sufficient.
[0023] The educational robot 100 includes other components that are shown in Fig. 2. Fig. 2 illustrates a side view of the educational robot 100, in accordance with one implementation of the present invention. In the illustrated implementation, the educational robot 100 includes a projection device 10 that is positioned at the back of the body 1 and is configured to project the multimedia content on a screen. Further, the projection device 10 may have a 1080p resolution and can project the multimedia content on a screen of up to 100 inches in size. In the illustrated example, the projection device 10 projects the image in such a way that the educational robot 100 does not fall in the line of sight of the user.
[0024] In one implementation, the educational robot 100 includes a speaker 11 that may output audio-based multimedia content. The speaker 11 may be a high-bass speaker with volume control to adjust the amplitude of the sound output by the speaker. Moreover, the speaker can be configured to play a welcome tone every time the educational robot 100 is booted. For instance, the speaker 11 may play: “Myself Captain Robo and I just came from ROBO Planet traveling 1 trillion miles to meet you. How are you doing today? Good! So are you ready to learn new activities with me today?”
[0025] Moreover, the educational robot 100 has a microphone 12 that can record voice from the user. The combination of the microphone 12 and the speaker 11 allows oral communication between the educational robot 100 and the user. Such a combination is useful when the educational robot 100 provides communication between kids in a classroom and a teacher at a remote location. The educational robot 100 is also well equipped to provide teaching assistance in home schooling.
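As a purely illustrative sketch (not taken from the disclosure), the boot-time greeting could be driven by an off-the-shelf text-to-speech engine such as pyttsx3; the engine choice and the function below are assumptions.

```python
# Illustrative sketch of the welcome tone played when the robot boots.
# pyttsx3 is an assumed text-to-speech engine, not specified by the patent.
import pyttsx3

WELCOME = ("Myself Captain Robo and I just came from ROBO Planet traveling "
           "1 trillion miles to meet you. How are you doing today? Good! "
           "So are you ready to learn new activities with me today?")

def play_welcome(volume: float = 0.8) -> None:
    """Speak the welcome line through the robot's speaker 11 at boot."""
    engine = pyttsx3.init()
    engine.setProperty("volume", volume)  # 0.0 (mute) to 1.0 (full volume)
    engine.say(WELCOME)
    engine.runAndWait()

if __name__ == "__main__":
    play_welcome()
```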
[0026] In one implementation, the educational robot 100 is a specialized Early Childhood Development (ECD) robot designed for kids between 0 and 8 years of age.
[0027] Fig. 3 illustrates the head portion 2, in accordance with one implementation of the present invention. As mentioned before, the head portion 2 includes a pair of eyes 3, a nose 4, and a mouth 5. Further, the pair of eyes is made to resemble human eyes. Furthermore, each eye includes a light-emitting diode (LED) that can emit light of different colors, such that each color corresponds to an expression of emotion. For instance, in a questionnaire session with the students, a green LED indicates a correct answer, red indicates an incorrect answer, and blue indicates a thinking mode.
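The color-to-feedback mapping above can be summarized in a short illustrative sketch; the mapping follows the description, while set_eye_color is a hypothetical stand-in for whatever LED driver the robot actually uses.

```python
# Illustrative sketch of the eye-LED feedback scheme (green = correct,
# red = incorrect, blue = thinking). The driver function is hypothetical.
from enum import Enum

class QuizState(Enum):
    CORRECT = "correct"
    INCORRECT = "incorrect"
    THINKING = "thinking"

EYE_COLORS = {
    QuizState.CORRECT: (0, 255, 0),    # green LED for a correct answer
    QuizState.INCORRECT: (255, 0, 0),  # red LED for an incorrect answer
    QuizState.THINKING: (0, 0, 255),   # blue LED for thinking mode
}

def set_eye_color(rgb) -> None:
    """Placeholder for the robot's real LED driver call."""
    print(f"eye LEDs -> RGB{rgb}")

def show_feedback(state: QuizState) -> None:
    """Light the eye LEDs according to the quiz state."""
    set_eye_color(EYE_COLORS[state])
```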
[0028] In addition, each eye is surrounded by a pair of eyelashes that can be moved relative to the eye. Such movement is required to produce facial expressions, such as raising the eyes. As may be understood, this movement is controlled by the processor 9.
[0029] In the present implementation, the mouth 5 can be selectively actuated by the processor 9 to mimic a talking gesture of the educational robot 100. For instance, the mouth 5 can be made to move in a lip-syncing motion when an audio clip of a rhyme or a story is played. Such motion captures the attention of the kids. Although the invention has been described with reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the subject matter, will become apparent to persons skilled in the art upon reference to the description of the subject matter. It is therefore contemplated that such modifications can be made without departing from the spirit or scope of the present invention as defined in the appended claims.

Claims

I/We Claim:
1. An educational robot comprising: a body; a head operably coupled on top of the body; a plurality of limbs operably coupled to the body, wherein the plurality of limbs makes human perceptible gestures; an interface mounted on the body to allow a user to interact with the interface, wherein the interface receives a selection of multimedia content from a plurality of multimedia content by the user and displays information to the user; an imaging device to capture an image of the user; and a processor in electronic communication with different components of the educational robot for controlling the educational robot, the processor configured to receive the captured image, identify the user, and behave as programmed based on the identified user.
2. The educational robot as claimed in claim 1 further comprising a microphone and a speaker to allow oral communication between the user and the processor.
3. The educational robot as claimed in claim 1 further comprising a projection device to project the multimedia content on a screen.
4. The education robot as claimed in claim 1 further comprising a storage device to store the plurality of multimedia content.
5. The educational robot as claimed in claim 1 further comprising a printer housed in the body and beneath the screen.
6. The educational robot as claimed in claim 1, wherein the head portion comprises a pair of eyes, a nose, and a pair of lips, and wherein each of the pair of eyes, the nose, and the pair of lips are selectively actuatable to render human facial expressions.
7. The educational robot as claimed in claim 6, wherein the pair of eyes includes light-emitting diodes (LEDs) to emit light of different colors, and wherein each emitted color corresponds to a human perceptible facial expression.
8. The educational robot as claimed in claim 1, wherein the plurality of limbs includes a pair of legs and a pair of arms, and wherein each limb of the plurality of limbs is selectively actuatable to make human perceptible gestures.
PCT/IB2019/060491 2019-12-05 2019-12-05 Educational robot WO2021111176A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2019/060491 WO2021111176A1 (en) 2019-12-05 2019-12-05 Educational robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2019/060491 WO2021111176A1 (en) 2019-12-05 2019-12-05 Educational robot

Publications (1)

Publication Number Publication Date
WO2021111176A1 true WO2021111176A1 (en) 2021-06-10

Family

ID=76221707

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2019/060491 WO2021111176A1 (en) 2019-12-05 2019-12-05 Educational robot

Country Status (1)

Country Link
WO (1) WO2021111176A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113552949A (en) * 2021-07-30 2021-10-26 北京凯华美亚科技有限公司 Multifunctional immersive audio-visual interaction method, device and system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10155166B1 (en) * 2017-09-08 2018-12-18 Sony Interactive Entertainment Inc. Spatially and user aware second screen projection from a companion robot or device
US20190118104A1 (en) * 2017-10-20 2019-04-25 Thinker-Tinker, Inc. Interactive plush character system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DAVID O. JOHNSON ET AL.: "Imitating human emotions with artificial facial expressions", INTERNATIONAL JOURNAL OF SOCIAL ROBOTICS, vol. 5, no. 4, 6 September 2013 (2013-09-06), pages 503 - 513, XP035375194, DOI: 10.1007/s12369-013-0211-1 *

Similar Documents

Publication Publication Date Title
US10896621B2 (en) Educational robot
US9381426B1 (en) Semi-automated digital puppetry control
Tanaka et al. Pepper learns together with children: Development of an educational application
Breazeal Designing sociable robots
KR101169674B1 (en) Telepresence robot, telepresence system comprising the same and method for controlling the same
Savage-Rumbaugh 17 ACQUISITION OF FUNCTIONAL SYMBOL USAGE IN APES AND CHILDREN
US9108114B2 (en) Tangible user interface and a system thereof
WO2017186001A1 (en) Education system using virtual robots
Koenig et al. Communication and knowledge sharing in human–robot interaction and learning from demonstration
US20190270026A1 (en) Automatic Mobile Robot For Facilitating Activities To Improve Child Development
Setapen Creating robotic characters for long-term interaction
Forman Observations of young children solving problems with computers and robots
Reiners et al. Experimental study on consumer-technology supported authentic immersion in virtual environments for education and vocational training
Kerzel et al. Teaching NICO how to grasp: An empirical study on crossmodal social interaction as a key factor for robots learning from humans
WO2021111176A1 (en) Educational robot
CN208697451U (en) A kind of children's early education robot
Granott Microdevelopment of co-construction of knowledge during problem solving: puzzled minds, weird creatures, and wuggles
Ihamäki et al. Social and emotional learning with a robot dog: technology, empathy and playful learning in kindergarten
Nasi et al. Pomelo, a collaborative education technology interaction robot
Degiorgi et al. Puffy—An inflatable robotic companion for pre-schoolers
Ziouzios et al. Utilizing Robotics for Learning English as a Foreign Language
Delaunay A retro-projected robotic head for social human-robot interaction
US11979448B1 (en) Systems and methods for creating interactive shared playgrounds
Garzotto et al. Integrating Virtual Worlds and Mobile Robots in Game-Based Treatment for Children with Intellectual Disability
Setiawati Using puppet as media to increase the children vocabulary

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19955015

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19955015

Country of ref document: EP

Kind code of ref document: A1