WO2016144279A1 - Virtual reality based remote learning system and method - Google Patents


Info

Publication number: WO2016144279A1
Authority: WO, WIPO (PCT)
Prior art keywords: instructor, student, virtual, virtual reality, movement sensor
Application number: PCT/TR2016/050054
Other languages: French (fr)
Inventor: Mujdehan ORS FILIZ
Original Assignee: Ors Filiz Mujdehan
Application filed by Ors Filiz Mujdehan
Publication of WO2016144279A1


Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 Electrically-operated educational appliances
    • G09B5/06 Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • G09B5/065 Combinations of audio and video presentations, e.g. videotapes, videodiscs, television systems
    • G09B5/08 Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations
    • G09B5/14 Electrically-operated educational appliances providing for individual presentation of information to a plurality of student stations with provision for individual teacher-student communication
    • G09B19/00 Teaching not covered by other main groups of this subclass
    • G09B19/003 Repetitive work cycles; Sequence of movements
    • G09B19/0038 Sports

Definitions

  • The present invention relates to remote learning methods, and particularly to virtual reality based remote learning methods and to systems for realizing said methods.
  • Remote learning methods are substantially time-saving both for students and for instructors.
  • Such systems essentially comprise digital learning materials (videos, questions, books, etc.), and said materials are shared with the student, with the help of the system servers, over the internet under the control of the instructor.
  • For instance, the instructor and the student have a live communication or video call facility, and the students can answer various multiple choice questions.
  • An object of the present invention is to provide a virtual reality based learning method where the interaction between the instructor and the student is increased.
  • Another object of the present invention is to reduce the application cost of virtual reality based learning systems and thus to make such systems available to a much larger group of users.
  • The present invention relates to a virtual reality based learning system comprising a student movement sensing system for sensing the movements of the student in a student location, a student display unit, and a student terminal connected to said student movement sensing system and said student display unit.
  • the subject matter learning system is characterized in that:
  • said learning system comprises an instructor movement sensing system for detecting the movements of the instructor in an instructor location, an instructor display unit and an instructor terminal connected to said instructor movement sensing system and to said instructor display unit,
  • said learning system comprises a server group which provides all of the two-way instantaneous communication between the instructor terminal and the student terminals over a wireless communication network, and which has at least one application server and a virtual content server,
  • at least one virtual learning application, providing transformation of the movements sensed by at least the instructor and student movement sensors into an instructor virtual character or a student virtual character related to the instructor and student, is loaded in said application server,
  • the virtual content server instantaneously transfers all of the movement sensing data and the other predetermined data reaching it through the instructor terminal and through the student terminal to the other side; thus, by means of the instructor virtual character and the student virtual character, the instructor and the student instantaneously see all of each other's movements and their interactions with the virtual objects in the virtual learning medium.
  • the instructor movement sensing system comprises at least one of an instructor hand-arm movement sensor, an instructor head movement sensor, an instructor body movement sensor, an instructor wearable movement sensor and an eye movement sensor.
  • the student movement sensing system comprises at least one of a student hand-arm movement sensor, a student head movement sensor, a student body movement sensor, a student wearable movement sensor and an eye movement sensor.
  • the instructor terminal comprises at least one of an instructor personal computer, an instructor smart cellular phone or an instructor tablet.
  • the student terminal comprises at least one of a student personal computer, a student smart cellular phone or a student tablet.
  • the instructor display unit comprises an instructor terminal screen or a virtual reality eyeglass.
  • the student display unit comprises a terminal screen or a student virtual reality eyeglass.
  • the server group moreover comprises a backup server which provides recording of the whole learning session and which is in communication with the virtual content server.
  • said other predetermined data are voice and/or written message data produced at least in the instructor location and in the student location.
  • the present invention is a virtual reality based learning method, characterized by comprising the steps of;
  • the instructor movement sensing system comprises at least one of an instructor hand-arm movement sensor, an instructor head movement sensor, an instructor body movement sensor and an instructor wearable movement sensor.
  • the student movement sensing system comprises at least one of a student hand-arm movement sensor, a student head movement sensor, a student body movement sensor and a student wearable movement sensor.
  • the instructor terminal comprises at least one of an instructor personal computer, an instructor smart cellular phone or an instructor tablet.
  • the student terminal comprises at least one of a student personal computer, a student smart cellular phone or a student tablet.
  • the instructor display unit comprises an instructor terminal screen or an instructor virtual reality eyeglass.
  • the student display unit comprises a student terminal screen or a student virtual reality eyeglass.
  • said predetermined data are voice and/or written message data produced at least in the instructor location and in the student location.
  • Figure 1 is a representative view of the subject matter system.
  • Figure 2a, 2b and 2c are the representative views of the movement sensing system, terminals and display units used by the instructor respectively in the subject matter invention.
  • Figure 2d, 2e and 2f are the representative views of the movement sensing system, terminals and display units used by the student in the subject matter invention respectively.
  • Figure 3 is the representative view of a preferred application of the subject matter system.
  • Figure 4 is the representative view of another preferred application of the subject matter system.
  • Figure 5 is a view of a virtual learning medium in the subject matter system and a representative learning session realized in this medium.
  • Figure 6 shows representative views of another preferred application of the subject matter system.
  • Figure 7 is a view of another virtual learning medium in the subject matter system and another representative learning session realized in this medium.
  • Instructor movement sensing system (30); Student movement sensing system (100)
  • Instructor body movement sensor (33); Student body movement sensor (103)
  • Instructor wearable movement sensor (34); Student wearable movement sensor (104)
  • The subject matter system essentially provides instantaneous interaction, for training purposes, of at least one instructor (20) and at least one student (90) in a virtual reality medium (130), where said instructor (20) and said student (90) are in different locations.
  • In the instructor location, there are essentially an instructor movement sensing system (30) for sensing the movements of the instructor (20), an instructor display unit (50), and an instructor terminal (40) connected to said instructor movement sensing system (30) and said instructor display unit (50).
  • In the student location (80), there are essentially a student movement sensing system (100) for detecting the movements of the student (90), a student display unit (120), and a student terminal (110) connected to said student movement sensing system (100) and to the student display unit (120).
  • The whole communication between the instructor terminal (40) and the student terminals (110) is realized through a server group (70) by means of a wireless communication network (60).
  • Said server group (70) essentially comprises an application server (71), a virtual content server (72) and a backup server (73).
  • In the application server (71), a virtual learning application is loaded; both the instructor (20) and the student (90) connect to said application server (71) and install said virtual learning application on their own terminals.
  • In the virtual content server (72), at least one, but preferably a plurality of, virtual learning media (130) are loaded for different purposes; after the virtual learning application is installed on the student terminal (110) and the instructor terminal (40), the related virtual learning medium (130) is downloaded from the virtual content server (72).
  • Another important task of the virtual content server (72) is to instantaneously transfer all of the movement sensing data and the other predetermined data (like voice, written messages and real camera images), received from the instructor (20) and the student terminal (110), to the opposite party.
  • Thus, all movements of the instructor (20) and the student (90) are instantaneously reflected to the opposite party in the virtual learning medium (130).
  • For instance, when the instructor (20) moves his/her hand, the hand of the instructor virtual character (131) moves instantaneously in the virtual learning medium (130) viewed by the student (90).
  • Such instantaneous communication provides a much more realistic and effective interaction.
  • Another task of the server group (70) is to provide recording of the whole training session.
  • For this purpose, a backup server (73), which is in instantaneous communication with the virtual content server (72), is used; thus, all training sessions are recorded, and when required, this information is provided to the authorized terminals.
  • Alternatively, a single server which realizes both functions can be used instead of the application and virtual content servers (71, 72).
  • Likewise, pluralities of application and virtual content servers (71, 72) can be used in cluster form.
  • The instructor movement sensing system (30) used in the subject matter invention preferably comprises at least one of an instructor hand-arm movement sensor (31), an instructor head movement sensor (32), an instructor body movement sensor (33) and an instructor wearable movement sensor (34).
  • The hand-arm movement sensor can essentially sense the hand, arm and finger movements of the user in three dimensions, and said sensor transfers all of said movements to the instructor terminal (40) as digital movement data.
  • For instance, the instructor hand-arm movement sensor (31) can sense when the instructor (20) extends his/her arm and makes a gripping movement with his/her fingers.
  • The product of the LeapMotion Company can be given as an example of the hand-arm movement sensor.
  • The head movement sensor can sense the head movements of the user in three dimensions, and it transfers all of these movements as digital movement data to the instructor terminal (40). For instance, if the instructor (20) turns his/her head at a specific angle or lifts his/her head upwards, this movement can be sensed by the instructor head movement sensor (32).
  • As an example of the head movement sensor, virtual reality eyeglasses can be given, which can sense the head movements by means of the gyroscope system they include.
  • The virtual reality eyeglasses produced by the Oculus Company can be given as an example of such a product.
  • Alternatively, the instructor or the student can produce the virtual eyeglass housing from cardboard, and a smart cellular phone, on which an application dividing the screen into two parts is loaded, can be placed into the section of this housing which corresponds to the eyes.
  • By means of the gyroscope characteristic of smart cellular phones, the head movements of the instructor can also be sensed in this method.
  • As an example, the Google Cardboard product and the related software application, produced by the Google Company and used together with smart cellular phones, can be given.
  • The body movement sensor can sense the movements of the parts of the user's body and the general position of the body, and it transfers all of these movements as digital movement data to the instructor terminal (40). For instance, when the instructor (20) walks forward, lifts his/her arm, or moves his/her leg, this movement can be sensed by the body movement sensor (33). In a preferred application of the present invention, the product of the Microsoft Company with the Kinect trademark can be given as an example of the body movement sensor.
  • The wearable movement sensor can sense the movements of certain parts of the user's body in a similar manner to the body movement sensor, and it transfers all of said movements as digital movement data to the instructor terminal (40).
  • The wearable movement sensor comprises at least one of a glove sensor fixed to the hands and sensors which can be fixed to the elbows, knees and shoulders. Depending on the type of sensor used, pluralities of movements, like the instructor (20) lifting his/her arm, moving his/her leg, or rotating around his/her own axis, can be sensed by the instructor wearable movement sensor (34).
  • Similarly, the student movement sensing system (100) comprises at least one of a student hand-arm movement sensor (101), a student head movement sensor (102), a student body movement sensor (103) and a student wearable movement sensor (104). Since the operation of all of these items is the same as that of the items on the instructor (20) side, it will not be described here again.
  • As the instructor terminal (40) in the subject matter system, at least one of an instructor personal computer (41), an instructor smart cellular phone (42) or an instructor tablet (43) can be used.
  • An instructor terminal screen (51) or an instructor virtual reality eyeglass (52) can be used as the instructor display unit (50).
  • The instructor terminal screen (51) can be the standard or touch screen of the instructor personal computer (41), the instructor smart cellular phone (42) or the instructor tablet (43).
  • An exemplary application of the present invention is realized as follows.
  • Here, the object is to teach chess, and a virtual learning medium designed accordingly has been loaded to the virtual content server (72).
  • In said virtual learning medium (130), as the virtual object (133), there are essentially a chess board and chess pieces provided thereon.
  • The instructor (20) and the student (90) access the application server (71) through the wireless communication network (60) by means of their user terminals, and they install the virtual learning application on their own user terminals. Afterwards, they download said virtual learning medium (130) to their own terminals through the virtual content server (72).
  • Alternatively, the instructor and student terminals are provided with authorized access to the virtual content server (72), and all of the processes are realized on the virtual content server (72).
  • A representative learning session according to the preferred application given in Figure 3 is realized as follows.
  • The instructor uses the instructor personal computer (41) as the user terminal, and the student uses the student personal computer (111) as the student terminal.
  • As the movement sensing system, an instructor hand-arm movement sensor (31) and a student hand-arm movement sensor (101) are used, which are connected to the personal computers and which are preferably LeapMotion devices.
  • When the student (90) moves his/her hand in a suitable manner for moving a chess piece, which is a virtual object (133) in the virtual learning medium (130), and moves his/her fingers for holding the related piece, the hands and fingers of the student virtual character (132), existing in the virtual learning medium (130), are moved by means of the LeapMotion device.
  • Thus, the student can lift the chess piece in the virtual learning medium (130) and put said chess piece at another location on the chess board.
  • The interaction between the hands of the student virtual character (132) and the instructor virtual character (131) and the chess pieces (virtual objects (133)) becomes possible by means of the virtual learning application loaded in the user terminals.
  • The virtual learning application is essentially coordinate-based, and said application decides whether or not to realize an interaction by taking the coordinates in the virtual learning medium as a basis.
  • The virtual content server (72) instantaneously transfers all of said movements to the personal computer (111) of the student; thus, the student (90) can see the move made by the instructor virtual character illustrated on the screen of the student personal computer (111). Voice or written messages can accompany all of these interactions. For this, a microphone, either fixed externally or provided internally in the user terminals, and an optional earphone set will be sufficient.
  • In another preferred application, instead of the instructor and student personal computers (41, 111), the instructor and student cellular phones (42, 112) are used, and instead of the personal computer screens, the instructor and student virtual reality eyeglasses (52, 122) are used as the display units. Accordingly, the parties connect the instructor hand-arm movement sensor (31), the student hand-arm movement sensor (101), the student virtual reality eyeglass (122) and the instructor virtual reality eyeglass (52) to their own cellular phones, on which the abovementioned required applications are loaded. Since the operation is substantially the same as in the application illustrated in Figure 3, no detailed description will be given here; only the differences and the advantages of said differences will be described.
  • The greatest advantage of the virtual reality eyeglass (52) is that it provides a much more realistic, immersive experience when compared with the personal computer (41) screen. Of course, this contributes favorably to the efficiency of the learning session.
  • Another advantage is that, by means of the gyroscope characteristic of the virtual reality eyeglass, the head movements of the instructor and the student (90) can be monitored and transferred to the virtual learning medium (130). Thus, for instance, the instructor can see which direction, and even which piece, the student (90) looks at, and he/she can transfer some commands and warnings to his/her student (90) without making an additional move.
  • For this purpose, indicators like an arrow which extends from the head of the virtual character towards the target can be used, so that the opposite party understands the looking direction.
  • Alternatively, the Google Cardboard product and the related software application produced by the Google Company can be used together with the smart cellular phones (42) instead of the virtual reality eyeglass (52).
  • In some cases, the student or the instructor may need to see the real world at the same time.
  • For this, the front camera of the cellular phone is used, and, when desired, the real medium image obtained from it is displayed in a small window or in full-screen mode on the screen of the smart cellular phone.
  • Thus, the user can see every direction to which he/she turns his/her head.
  • Alternatively, the images existing on the personal computers (41, 111) of the instructor and/or student are transferred to the smart cellular phone by using software like TeamViewer.
  • Moreover, an instructor eye movement sensor (35) and a student eye movement sensor (105) can be used, which sense the eye movements of the instructor and the student beyond the movements of the head.
  • Thus, both the instructor (20) and the student (90) can look at a location without having to turn their heads, and even this looking point can be shown to the other party with a suitable marking.
  • As an example of this marking, an arrow exiting the head or eyes of the looking person can be given.
  • An instantaneous representative view of the training session, as viewed by the instructor, is given.
  • In the virtual learning medium (130), there is a chess board as the virtual object (133), and there are chess pieces provided on the chess board.
  • The student virtual character (132) comprises a head, body, arm, hand and fingers.
  • The instructor preferably sees only the hand and fingers and a section of the arm of his/her own virtual character (131).
  • The student (90) can see the instructor from his/her side in the same manner.
  • In Figure 6, a different application of the present invention is described.
  • Here, the purpose is to give Pilates lessons, and a virtual learning medium prepared accordingly is loaded to the virtual content server (72).
  • In said virtual learning medium, there is generally a Pilates mat as the virtual object (133).
  • The instructor and the student each use a personal computer (41) as the user terminal, and as the movement sensing system they use body movement sensors (33, 103), connected to the personal computer (41), which are preferably Microsoft Kinect devices.
  • When the student makes a movement, the student virtual character makes the same movement in the virtual learning medium by means of the Kinect device.
  • The virtual content server (72) instantaneously transfers all of said movements to the personal computer (41) of the instructor; thus, the instructor can see the move made by the student virtual character on the screen of his/her personal computer (41).
  • If the instructor detects that the student makes a false gymnastic movement, the instructor can instantaneously intervene in the virtual learning medium.
  • For instance, the instructor warns the student (90) with a voice command and tells the student (90) to follow his/her movements.
  • Such training has pluralities of advantages when compared with real-image training realized through a camera.
  • First of all, the training can be made much more efficient by means of pluralities of additional features which can be added to the virtual learning medium. For instance, there may be a cassette object in the virtual learning medium, and when the instructor touches said object with his/her virtual hand, a learning video specific to said movement can be played.
  • Moreover, in cases where the parties do not want their real images to be seen for various reasons, such a learning system will be very useful.
  • Since the bandwidth used will be much lower when compared with a real image, a more rapid and uninterrupted communication will also be possible.
  • Furthermore, some risky teaching may first be realized in the virtual medium in order to prevent injuries, and after it is understood that the student has clearly grasped the procedure to be applied, the passage to the real medium can be realized.
  • In Figure 7, an instantaneous representative view of the learning session, as viewed by the instructor, is given.
  • The student virtual character (132) is on the Pilates mat.
  • Said virtual character (132) essentially comprises a head, body, arms, hands, legs and feet.
  • The instructor can preferably see only a section of the arm and hand of his/her own virtual character (131).
  • Likewise, the student (90) can see the instructor from his/her side.
  • The instructor (20) and the student (90) can also use setups which are different from each other.
  • For instance, the instructor may use the setup given in Figure 4 while the student (90) uses the setup given in Figure 3, or the instructor may use the setup given in Figure 5 while the student uses the setup given in Figure 3.
  • These alternatives can be developed depending on the application requirements and personal capabilities.
  • When a plurality of students attend, the positions of all these persons in the virtual learning medium are determined beforehand; thus, the instructor sees all these persons instantaneously and may follow the movements of all of them.
  • The best example of this can be designing the virtual learning medium as a classroom. In this virtual classroom, there is an instructor stand and pluralities of desks opposite said stand.
  • The desk of each student who will attend the virtual learning session is determined, and when the student connects to the system, the predetermined virtual character of said student is formed at said desk.
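The coordinate-based interaction decision described above (the virtual learning application decides whether or not an interaction occurs by comparing coordinates in the virtual learning medium) can be sketched as follows. This is an illustrative model only; the names `Vec3`, `GRIP_RADIUS` and `can_grip`, and the threshold value, are assumptions and not part of the patent.

```python
import math
from dataclasses import dataclass


@dataclass
class Vec3:
    """A point in the virtual learning medium's coordinate system."""
    x: float
    y: float
    z: float

    def distance_to(self, other: "Vec3") -> float:
        return math.sqrt((self.x - other.x) ** 2
                         + (self.y - other.y) ** 2
                         + (self.z - other.z) ** 2)


# Illustrative threshold: how close a tracked fingertip must be,
# in virtual-medium units, before a grab can register.
GRIP_RADIUS = 0.05


def can_grip(fingertip: Vec3, obj: Vec3, fingers_closed: bool) -> bool:
    # Interaction is decided purely from coordinates: the fingertip
    # reported by the hand-arm sensor must lie within GRIP_RADIUS of
    # the virtual object while the fingers are reported as closed.
    return fingers_closed and fingertip.distance_to(obj) <= GRIP_RADIUS


piece = Vec3(0.0, 0.0, 0.0)    # a chess piece in the virtual medium
near = Vec3(0.02, 0.01, 0.0)   # fingertip close to the piece
far = Vec3(0.5, 0.0, 0.0)      # fingertip far from the piece
```

A real implementation would run such a test every sensor frame, for every hand joint against every nearby virtual object.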

Abstract

The present invention essentially relates to a system and method providing instantaneous interaction of at least one instructor (20) and at least one student (90), existing in different locations, in a virtual reality medium (130) for the purpose of learning.

Description

SPECIFICATION
VIRTUAL REALITY BASED REMOTE LEARNING SYSTEM AND METHOD

The present invention relates to remote learning methods, and particularly to virtual reality based remote learning methods and to systems for realizing said methods.
PRIOR ART

Remote learning methods are substantially time-saving both for students and for instructors. Such systems essentially comprise digital learning materials (videos, questions, books, etc.), and said materials are shared with the student, with the help of the system servers, over the internet under the control of the instructor. Of course, during said sharing there is some interaction, although it is limited. For instance, the instructor and the student have a live communication or video call facility, and the students can answer various multiple choice questions.
In recent years, attempts have been made to combine such systems with virtual reality applications. In virtual reality applications, the users may interact with the objects designed in the virtual learning medium by using suitable equipment. For this reason, in general, a virtual reality eyeglass, which displays the designed digital medium on its screen, is worn, and movement sensors are used which sense the movements of different regions of the body, like the hand and arm regions. The use of virtual reality applications for learning purposes is known in the prior art. For instance, in the patent application KR101307609, a virtual learning application for scuba diving is described. In this system, equipment for providing the sense of pressure and fans for providing the perception of diving are used. In the patent application TR200903142, a system is described which integrates movement sensing and virtual reality with kinematics and computer aided design (CAD), for instance in order to evaluate an engineering design.
One of the most basic deficiencies of the present virtual reality based learning applications is that the interaction between the instructor and the student is limited, and that the systems used have a very high cost. Thus, the advantages of these systems can be enjoyed by only a very narrow group of people.

BRIEF DESCRIPTION OF THE INVENTION
An object of the present invention is to provide a virtual reality based learning method where the interaction between the instructor and the student is increased.
Another object of the present invention is to reduce application cost of virtual reality based learning systems and thus, to provide the use of such systems by a larger mass.
In order to achieve the abovementioned objects and the objects which are to be deduced from the description below, the present invention relates to a virtual reality based learning system comprising a student movement sensing system for sensing the movements of the student in a student location, a student display unit, and a student terminal connected to said student movement sensing system and said student display unit. The subject matter learning system is characterized in that:
- said learning system comprises an instructor movement sensing system for detecting the movements of the instructor in an instructor location, an instructor display unit and an instructor terminal connected to said instructor movement sensing system and to said instructor display unit,
- said learning system comprises a server group which provides all of the two-way instantaneous communication between the instructor terminal and the student terminals over a wireless communication network, and which has at least one application server and a virtual content server,
- at least one virtual learning application, providing transformation of the movements sensed by at least the instructor and student movement sensors into an instructor virtual character or student virtual character related to an instructor and student, is loaded in said application server,
- at least one virtual learning medium, where at least the instructor virtual character and student virtual character are provided in interaction, is loaded in said virtual content server,
- the virtual content server instantaneously transfers all of the movement sensing data and other predetermined data, reaching it through the instructor terminal and through the student terminal, to the other side; thus, by means of the instructor virtual character and the student virtual character, the instructor and the student instantaneously see all of each other's movements and their interactions with the virtual objects in the virtual learning medium.

In a preferred embodiment of the subject matter virtual reality based learning system, the instructor movement sensing system comprises at least one of an instructor hand-arm movement sensor, an instructor head movement sensor, an instructor body movement sensor, an instructor wearable movement sensor and an eye movement sensor.
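The relay role that the claims assign to the virtual content server can be modelled with a minimal in-memory sketch: every packet arriving from one terminal in a session is forwarded, unchanged, to every other terminal in the same session. All identifiers here (`VirtualContentRelay`, `join`, `publish`, `poll`, the session and terminal names) are illustrative assumptions; the patent does not specify an API.

```python
from collections import defaultdict


class VirtualContentRelay:
    """In-memory sketch of the virtual content server's relay role."""

    def __init__(self):
        # session_id -> {terminal_id: list of pending packets}
        self.sessions = defaultdict(dict)

    def join(self, session_id: str, terminal_id: str) -> None:
        # Each terminal gets its own inbox within the session.
        self.sessions[session_id][terminal_id] = []

    def publish(self, session_id: str, sender_id: str, packet) -> None:
        # Forward the packet to everyone in the session except the sender.
        for terminal_id, inbox in self.sessions[session_id].items():
            if terminal_id != sender_id:
                inbox.append(packet)

    def poll(self, session_id: str, terminal_id: str):
        # Drain and return everything forwarded to this terminal.
        inbox = self.sessions[session_id][terminal_id]
        pending = list(inbox)
        inbox.clear()
        return pending


relay = VirtualContentRelay()
relay.join("chess-1", "instructor")
relay.join("chess-1", "student")
relay.publish("chess-1", "instructor",
              {"kind": "movement", "joint": "right_hand"})
student_view = relay.poll("chess-1", "student")
```

A production system would of course push packets over the wireless network rather than poll in memory, but the forwarding logic is the same.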
In another preferred embodiment of the subject matter virtual reality based learning system, the student movement sensing system comprises at least one of a student hand-arm movement sensor, a student head movement sensor, a student body movement sensor, a student wearable movement sensor and a student eye movement sensor.
In another preferred embodiment of the subject matter virtual reality based learning system, the instructor terminal comprises at least one of an instructor personal computer, an instructor smart cellular phone or an instructor tablet. In another preferred embodiment of the subject matter virtual reality based learning system, the student terminal comprises at least one of a student personal computer, a student smart cellular phone or a student tablet.
In another preferred embodiment of the subject matter virtual reality based learning system, the instructor display unit comprises an instructor terminal screen or an instructor virtual reality eyeglass.
In another preferred embodiment of the subject matter virtual reality based learning system, the student display unit comprises a student terminal screen or a student virtual reality eyeglass. In another preferred embodiment of the subject matter virtual reality based learning system, the server group moreover comprises a backup server, which records the whole learning session and which is in communication with the virtual content server.
In another preferred embodiment of the subject matter virtual reality based learning system, said other predetermined data are voice and/or written message data produced at least in the instructor location and in the student location.
Moreover, the present invention is a virtual reality based learning method, characterized by comprising the steps of:
- sensing the movements of the instructor by means of an instructor movement sensing system, connected to the instructor terminal, in an instructor location and transferring all of the sensed movements to an instructor virtual character, - sensing the movements of the student by means of a student movement sensing system connected to the student terminal in a student location and transferring all of the sensed movements to a student virtual character,
- providing access of the instructor and student terminals to a predetermined virtual learning medium through a wireless communication network by means of a server group, and providing interaction of the instructor virtual character and the student virtual character with the virtual objects existing in said virtual learning medium,
- providing instantaneous transfer, by means of said server group, of all of the movement sensing data and other predetermined data reaching the server group from the instructor terminal and from the student terminal to the other party, and thus providing instantaneous viewing of all of the movements of the instructor and the student, and of the interactions of the instructor and the student with the virtual objects existing in the virtual learning medium, by means of the instructor virtual character and the student virtual character existing in the virtual learning medium.
In a preferred application of the subject matter learning method, the instructor movement sensing system comprises at least one of an instructor hand-arm movement sensor, an instructor head movement sensor, an instructor body movement sensor and an instructor wearable movement sensor.
In another preferred application of the subject matter learning method, the student movement sensing system comprises at least one of a student hand-arm movement sensor, a student head movement sensor, a student body movement sensor and a student wearable movement sensor.
In another preferred application of the subject matter learning method, the instructor terminal comprises at least one of an instructor personal computer, an instructor smart cellular phone or an instructor tablet. In another preferred application of the subject matter learning method, the student terminal comprises at least one of a student personal computer, a student smart cellular phone or a student tablet.
In another preferred application of the subject matter learning method, the instructor display unit comprises an instructor terminal screen or an instructor virtual reality eyeglass. In another preferred application of the subject matter learning method, the student display unit comprises a student terminal screen or a student virtual reality eyeglass.
In another preferred application of the subject matter learning method, there is moreover the step of recording the whole learning session by the server group.
In another preferred application of the subject matter learning method, said predetermined data are voice and/or written message data produced at least in the instructor location and in the student location.
BRIEF DESCRIPTION OF THE FIGURES
Figure 1 is a representative view of the subject matter system. Figures 2a, 2b and 2c are representative views of the movement sensing system, terminals and display units used by the instructor, respectively, in the subject matter invention.
Figures 2d, 2e and 2f are representative views of the movement sensing system, terminals and display units used by the student in the subject matter invention, respectively.
Figure 3 is the representative view of a preferred application of the subject matter system.
Figure 4 is the representative view of another preferred application of the subject matter system.
Figure 5 is a view of a virtual learning medium in the subject matter system and a representative learning session realized in this medium.
Figure 6 is a representative view of another preferred application of the subject matter system.
Figure 7 is a view of another virtual learning medium in the subject matter system and another representative learning session realized in this medium.

REFERENCE NUMBERS
10 Instructor location 80 Student location
20 Instructor 90 Student
30 Instructor movement sensing system 100 Student movement sensing system
31 Instructor hand-arm movement sensor 101 Student hand-arm movement sensor
32 Instructor head movement sensor 102 Student head movement sensor
33 Instructor body movement sensor 103 Student body movement sensor
34 Instructor wearable movement sensor 104 Student wearable movement sensor
35 Instructor eye movement sensor 105 Student eye movement sensor
40 Instructor terminal 110 Student terminal
41 Instructor personal computer 111 Student personal computer
42 Instructor smart cellular phone 112 Student smart cellular phone
43 Instructor tablet 113 Student tablet
50 Instructor display unit 120 Student display unit
51 Instructor terminal screen 121 Student terminal screen
52 Instructor virtual reality eyeglass 122 Student virtual reality eyeglass
60 Wireless communication network 130 Virtual learning medium
70 Server group 131 Instructor virtual character
71 Application server 132 Student virtual character
72 Virtual content server 133 Virtual object
73 Backup server
DETAILED DESCRIPTION OF A PREFERRED APPLICATION OF THE INVENTION
In this detailed description, the subject matter system and method are explained with references to examples, without any restrictive effect, only in order to make the subject more understandable.
The subject matter system essentially provides instantaneous interaction of at least one instructor (20) and at least one student (90) in a virtual learning medium (130) for training purposes, where said instructor (20) and said student (90) are in different locations.
Accordingly, in the instructor location (10), there is essentially an instructor movement sensing system (30) for sensing the movements of the instructor (20), an instructor display unit (50), and an instructor terminal (40) connected to said instructor movement sensing system (30) and said instructor display unit (50). In a similar manner, in the student location (80), there is essentially a student movement sensing system (100) in order to detect the movements of the student (90), a student display unit (120), and a student terminal (110) connected to said student movement sensing system (100) and to the student display unit (120).
The whole communication between the instructor terminal (40) and the student terminals (110) is realized through a server group (70) by means of a wireless communication network (60). Said server group (70) essentially comprises an application server (71), a virtual content server (72) and a backup server (73). A virtual learning application is loaded in the application server (71); both the instructor (20) and the student (90) connect to said application server (71) and install said virtual learning application on their own terminals. In the virtual content server (72), at least one but preferably a plurality of virtual learning mediums (130) is loaded for different purposes, and after the virtual learning application is installed on the student terminal (110) and the instructor terminal (40), the related virtual learning medium (130) is downloaded from the virtual content server (72). Another important task of the virtual content server (72) is to instantaneously transfer all of the movement sensing data and the other predetermined data (like voice, written messages and real camera images), received from the instructor terminal (40) and the student terminal (110), to the opposite party. By means of this, all movements of the instructor (20) and the student (90) are instantaneously reflected to the opposite party in the virtual learning medium (130). For instance, when the instructor (20) moves his/her hand, the hand of the virtual character (131) of the instructor moves instantaneously in the virtual learning medium (130) viewed by the student (90). Such instantaneous communication provides a much more realistic and effective interaction. Another task of the server group (70) is to record the whole training session. For this purpose, a backup server (73), which is in instantaneous communication with the virtual content server (72), is used; thus, all training sessions are recorded and, when required, this information is provided to the authorized terminals.
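The relay role of the virtual content server (72) described above can be sketched as follows. This is only an illustrative sketch: the class, method and field names are assumptions of this description, not part of the disclosed system; the backup server (73) is represented by a simple in-memory log.

```python
# Minimal sketch of the relay role of the virtual content server (72):
# every movement/voice/message packet received from one terminal is
# instantaneously forwarded to all other connected terminals, and a copy
# is kept for the backup server (73). All names here are illustrative.

class VirtualContentServer:
    def __init__(self):
        self.terminals = {}      # party id -> callback that delivers packets
        self.backup_log = []     # stands in for the backup server (73)

    def connect(self, party_id, deliver):
        """Register a terminal (instructor or student) with a delivery callback."""
        self.terminals[party_id] = deliver

    def submit(self, sender_id, packet):
        """Relay a data packet from one terminal to every other terminal."""
        self.backup_log.append((sender_id, packet))   # record the session
        for party_id, deliver in self.terminals.items():
            if party_id != sender_id:
                deliver(sender_id, packet)


server = VirtualContentServer()
received = []
server.connect("instructor", lambda s, p: received.append((s, p)))
server.connect("student", lambda s, p: received.append((s, p)))
# The student's hand movement reaches only the instructor's terminal.
server.submit("student", {"type": "hand", "pos": (0.1, 0.2, 0.3)})
print(received)   # [('student', {'type': 'hand', 'pos': (0.1, 0.2, 0.3)})]
```

A real deployment would replace the callbacks with network sessions over the wireless communication network (60), but the forwarding and recording logic would have this shape.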
In an alternative application of the present invention, a single server, which realizes both functions, can be used instead of the application and virtual content servers (71, 72). Alternatively, in very intensive applications, pluralities of application and virtual content servers (71, 72) can be used in cluster form. With reference to Figure 2a, the instructor movement sensing system (30), used in the subject matter invention, preferably comprises at least one of an instructor hand-arm movement sensor (31), an instructor head movement sensor (32), an instructor body movement sensor (33) and an instructor wearable movement sensor (34). The hand-arm movement sensor can essentially sense the hand, arm and finger movements of the user in three-dimensional space, and said sensor transfers all of said movements to the instructor terminal (40) as digital movement data. For instance, the instructor hand-arm movement sensor (31) can sense when the instructor (20) extends his/her arm and makes a gripping movement with his/her fingers. The product of the LeapMotion Company can be given as an example of the hand-arm movement sensor.
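The "digital movement data" produced by a hand-arm movement sensor can be pictured as follows. This is a hypothetical sketch: the packet fields, the grip heuristic and the threshold value are illustrative assumptions, not the data format of any actual sensor.

```python
# A hand-arm movement sensor reports hand and finger positions in a
# three-dimensional coordinate frame. This sketch turns such raw positions
# into an illustrative "digital movement data" packet, including a simple
# grip detection; the threshold is an assumed, illustrative value.

import math

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def make_movement_packet(palm_pos, fingertip_positions, grip_threshold=0.05):
    """Build a movement-data packet; a grip is assumed when all fingertips
    are within `grip_threshold` (metres) of the palm centre."""
    gripping = all(distance(tip, palm_pos) < grip_threshold
                   for tip in fingertip_positions)
    return {"palm": palm_pos,
            "fingertips": fingertip_positions,
            "gripping": gripping}

open_hand = make_movement_packet((0, 0, 0), [(0.08, 0, 0), (0.09, 0.01, 0)])
closed_hand = make_movement_packet((0, 0, 0), [(0.02, 0, 0), (0.03, 0.01, 0)])
print(open_hand["gripping"], closed_hand["gripping"])   # False True
```

Packets of this kind, streamed at the sensor's frame rate, are what the terminal forwards to the server group for reflection onto the virtual character.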
The head movement sensor can sense the head movements of the user in three-dimensional space, and it transfers all of these movements as digital movement data to the instructor terminal (40). For instance, if the instructor (20) turns his/her head at a specific angle or lifts his/her head upward, this movement can be sensed by the instructor head movement sensor (32). In a preferred application of the present invention, virtual reality eyeglasses, which can sense the head movements by means of the gyroscope system they include, can be given as an example of the head movement sensor. The virtual reality eyeglasses produced by the Oculus Company can be given as an example of such a product. Instead of a relatively high-cost virtual reality eyeglass, the instructor or the student can produce the eyeglass chamber from cardboard, and a smart cellular phone, on which an application dividing the screen into two pieces is loaded, can be placed into the section of this chamber which corresponds to the eyes. By means of the gyroscope characteristic of smart cellular phones, the head movements of the instructor can be sensed with this method. As an example of this application, the Google Cardboard product and the related software application, produced by the Google Company and used together with smart cellular phones, can be given.
The body movement sensor can sense the movements of all parts of the user's body and the general position of the body, and it transfers all of these movements as digital movement data to the instructor terminal (40). For instance, in case the instructor (20) walks forward, lifts his/her arm upward or moves his/her leg, this movement can be sensed by the body movement sensor (33). In a preferred application of the present invention, the product of the Microsoft Company with the Kinect trademark can be given as an example of the body movement sensor. The wearable movement sensor can sense the movements of some parts of the user's body in a similar manner to the body movement sensor, and it transfers all of said movements as digital movement data to the instructor terminal (40). The wearable movement sensor comprises at least one of a glove sensor fixed to the hands and sensors which can be fixed to the elbows, knees and shoulders. Depending on the type of the sensor used, pluralities of movements, like the instructor (20) lifting his/her arm upward, moving his/her leg, or rotating around his/her own axis, can be sensed by the instructor wearable movement sensor (34).
In a similar manner, the student movement sensing system (100), provided on the student side, comprises at least one of a student hand-arm movement sensor (101), a student head movement sensor (102), a student body movement sensor (103) and a student wearable movement sensor (104). Since the operation manner of all of these items is the same as that of the items on the instructor (20) side, a description will not be given here again.
With reference to Figure 2b, as the instructor terminal (40) in the subject matter system, at least one of an instructor personal computer (41), an instructor smart cellular phone (42) or an instructor tablet (43) can be used. In a similar manner, on the student side (Figure 2e), as the student terminal (110), at least one of a student personal computer (111), a student smart cellular phone (112) or a student tablet (113) can be used. With reference to Figure 2c, in the subject matter system, an instructor terminal screen (51) or an instructor virtual reality eyeglass (52) can be used as the instructor display unit (50). The instructor terminal screen (51) can be the standard or touch screen of the instructor personal computer (41), the instructor smart cellular phone (42) or the instructor tablet (43). As the instructor virtual reality eyeglass (52), again a product like Oculus can be used, or the Google Cardboard solution, which functions together with smart cellular phones, can be used. Since the items related to the virtual reality eyeglass (52) are mentioned above, they will not be detailed further here.
In the light of this pre-information, an exemplary application of the present invention is realized as follows. In this exemplary application, the object is to teach chess, and a virtual learning medium designed accordingly has been loaded to the virtual content server (72). In said virtual learning medium (130), as the virtual object (133), there is essentially a chess board, and chess pieces provided thereon. Prior to the learning session, the instructor (20) and the student (90) access the application server (71) through the wireless communication network (60) by means of the user terminals, and afterwards they install the virtual learning application on their own user terminals. Afterwards, they download said virtual learning medium (130) to their own user terminals through the virtual content server (72). In another application, without any download process, the instructor and student terminals are given authorized access to the virtual content server (72), and all of the processes are realized on the virtual content server (72). After this preparation, a representative learning session is realized as follows according to a preferred application given in Figure 3. In this application, the instructor uses the instructor personal computer (41) as the user terminal, and the student uses the student personal computer (111) as the student user terminal. As the movement sensing system, an instructor hand-arm movement sensor (31) and a student hand-arm movement sensor (101) are used, which are connected to the personal computers and which are preferably LeapMotion devices.
Accordingly, for instance, when the student (90) moves his/her hand in a suitable manner for moving a chess piece, which is a virtual object (133) in the virtual learning medium (130), and moves his/her fingers for holding the related piece, the hands and fingers of the virtual character (132) of the student, existing in the virtual learning medium (130), are moved by means of the LeapMotion device. Thus, the student can lift the chess piece in the virtual learning medium (130) and put said chess piece in another location on the chess board. The interaction between the hands of the student virtual character (132) and the instructor virtual character (131) and the chess pieces (virtual objects (133)) becomes possible by means of the virtual learning application loaded in the user terminals. Preferably, but not limited to this, the virtual learning application is essentially based on coordinates in the virtual learning medium, and said application decides whether or not to realize an interaction by taking the coordinates in the virtual learning medium as a basis.
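The coordinate-based interaction decision mentioned above can be sketched minimally as follows. The function name, the piece names and the interaction radius are illustrative assumptions: a piece is deemed grabbable when the virtual hand comes within a small radius of its coordinates in the virtual learning medium.

```python
# Illustrative sketch of a coordinate-based interaction decision: the
# virtual learning application compares the coordinates of the virtual
# hand with the coordinates of each virtual object (e.g. a chess piece)
# and realizes an interaction only when they are close enough.

import math

PICK_RADIUS = 0.03   # assumed interaction radius, in scene units

def try_pick(hand_pos, pieces):
    """Return the name of the piece the hand can grab, or None."""
    for name, pos in pieces.items():
        if math.dist(hand_pos, pos) <= PICK_RADIUS:
            return name
    return None

pieces = {"white_knight": (0.10, 0.00, 0.10), "black_pawn": (0.30, 0.00, 0.30)}
print(try_pick((0.11, 0.0, 0.10), pieces))   # white_knight
print(try_pick((0.50, 0.0, 0.50), pieces))   # None
```

The same coordinate test, applied every frame to the streamed movement data, is enough to decide when a piece is lifted, carried and released.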
By means of the virtual content server (72), all of these movements are instantaneously transferred to the personal computer (41) of the instructor, and thus the instructor can see the move of the virtual character of the student in the virtual learning medium illustrated on the screen of the personal computer (41). Accordingly, for instance, if the instructor recognizes that the student (90) has moved a chess piece in an erroneous manner, the instructor can immediately intervene in this in the virtual learning medium. In more detail, when the instructor (20) moves his/her hand in a suitable manner for moving the related piece, the hand and fingers of the virtual character of the instructor move by means of the LeapMotion device provided on the instructor side. Thus, in the virtual learning medium, the instructor can lift the related piece from the wrong location and put said piece in the correct location on the chess board.
Meanwhile, the virtual content server (72) instantaneously transfers all of said movements to the personal computer (111) of the student; thus, the student (90) can see the move made by the instructor virtual character and illustrated on the screen of the student personal computer (111). Voice or written messages can be included in all of these interactions. For this purpose, a microphone, which can be fixed externally or which is provided internally in the user terminals, together with an optional earphone set, will be sufficient.
In the application illustrated in Figure 4, instead of the instructor and student personal computers (41, 111), the instructor and student cellular phones (42, 112) are used, and instead of the screens of the instructor and student personal computers (41, 111), the virtual reality eyeglasses (52, 122) are used as the display units. Accordingly, the parties connect the instructor hand-arm movement sensor (31), the student hand-arm movement sensor (101), the student virtual reality eyeglass (122) and the instructor virtual reality eyeglass (52) to their own cellular phones, on which the abovementioned required applications are loaded. Since the operation is substantially the same as in the application illustrated in Figure 3, no detailed description will be given here; only the differences and the advantages of said differences will be described. The greatest advantage of the virtual reality eyeglass (52) is that it provides a much more realistic, immersive experience when compared with the screen of the personal computer (41). Of course, this provides a favorable contribution to the efficiency of the learning session. Another advantage is that, by means of the gyroscope characteristic of the virtual reality eyeglass, the head movements of the instructor and the student (90) can be monitored and transferred to the virtual learning medium (130). Thus, for instance, the instructor can see which direction, and even which piece, the student (90) looks at, and thus he/she can transfer some commands and warnings to his/her student (90) without making an additional move. In the virtual learning medium (130), indicators like an arrow, which extends from the head of the virtual character towards the target, can be used so that the opposite party can understand the looking direction.
As described in the prior sections of the detailed description, as a lower cost solution, the Google Cardboard product and the related software application, produced by the Google Company, can be used together with the smart cellular phones (42) instead of the virtual reality eyeglass (52).
In the applications of the invention where Google Cardboard and a smart cellular phone are combined, the student or the instructor may have to see the real world at the same time. For this reason, the front camera of the cellular phone is used, and when desired, the real-medium image obtained from it is displayed in a small window or in a full-screen manner on the screen of the smart cellular phone. By means of this camera image, the user can see every direction to which he/she turns his/her head. In another application, the images existing on the personal computer (41, 111) of the instructor and/or student are transferred to the smart cellular phone by using software like TeamViewer. Thus, even if there is no software related to virtual reality applications on the smart cellular phone, it can be used in accordance with the subject matter object by taking the images existing on the personal computer.
In another application of the present invention, an instructor eye movement sensor (35) and a student eye movement sensor (105) can be used, which sense the eye movements of the instructor and the student in addition to the movements of the head. Thus, both the instructor (20) and the student (90) can look at a location without having to turn their heads; and even this looking point can be shown to the other party with a suitable marking. As an example of this marking, an arrow exiting the head or eyes of the looking person can be given.
In Figure 5, an instantaneous representative view of the training session is given as viewed by the instructor. Accordingly, in the virtual learning medium (130), there is a chess board as the virtual object (133) and there is a chess piece provided on the chess board. The virtual character (132) of the student comprises the head, body, arm, hand and fingers. The instructor preferably only sees the hand and fingers and a section of the arm of his/her own virtual character (131). The student (90) can see the instructor from his/her side in the same manner.
In Figure 6, a different application of the present invention is described. In this exemplary application, the purpose is to give Pilates lessons, and a virtual learning medium prepared accordingly is loaded to the virtual content server (72). In said virtual learning medium, there is essentially a Pilates mat as the virtual object (133). Since the preparation process in the user terminals is the same as in the applications described in Figures 3 and 4, it will not be given here in detail. In this application, the instructor and the student use a personal computer (41) as the user terminal, and they use body movement sensors (33, 103), connected to the personal computer (41), which are preferably Microsoft Kinect devices, as the movement sensing system. Accordingly, for instance, when the student (90) opens his/her hands and arms simultaneously, the virtual character of the student makes the same movement in the virtual learning medium by means of the Kinect device. The virtual content server (72) instantaneously transfers all of said movements to the personal computer (41) of the instructor, and thus the instructor can see the move made by the virtual character of the student on the screen of his/her personal computer (41). Accordingly, for instance, when the instructor detects that the student makes an incorrect gymnastic movement, the instructor can instantaneously intervene in this problem in the virtual learning medium. In more detail, the instructor warns the student (90) with a voice command and tells the student (90) to follow his/her movements. Afterwards, when the instructor makes the correct movement with his/her body, this movement is sensed by the Kinect device on the instructor side and transferred to the personal computer (111) of the student through the virtual content server (72).
Thus, the student (90) can observe the body movement of the virtual character of the instructor on the screen of the student personal computer (111).
Such training has a plurality of advantages when compared with real-image training which can be realized through a camera. First of all, the training can be made much more efficient by means of a plurality of additional features which can be added to the virtual learning medium. For instance, there can be a cassette object in the virtual learning medium, and when the instructor touches said object with his/her virtual hand, a learning video specific to said movement can be displayed. Secondly, when the parties do not want to show their real images for various reasons, such a learning system will be very useful. Moreover, since the bandwidth used will be much lower when compared with a real image, a more rapid and uninterrupted communication will also be possible. Besides, some risky teachings may first be realized in the virtual medium in order to prevent injuries, and after it is understood that the student has clearly grasped the procedure to be applied, the passage to the real medium can be realized.
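The bandwidth advantage claimed above can be illustrated with rough back-of-the-envelope figures. All numbers below are illustrative estimates assumed for this sketch, not measurements of the disclosed system: a skeleton stream of joint coordinates is compared against a modest compressed video stream.

```python
# Rough, assumed figures: streaming a skeleton of joint coordinates is
# orders of magnitude lighter than streaming camera video.

JOINTS = 25              # e.g. a Kinect-style skeleton (assumed)
BYTES_PER_JOINT = 12     # three 32-bit floats (x, y, z)
FRAMES_PER_SECOND = 30

skeleton_bps = JOINTS * BYTES_PER_JOINT * FRAMES_PER_SECOND * 8
video_bps = 1_000_000    # ~1 Mbit/s, an assumed modest compressed video stream

print(f"skeleton stream: {skeleton_bps} bit/s")   # 72000 bit/s
print(f"ratio: video is ~{video_bps // skeleton_bps}x heavier")
```

Even with generous assumptions for the video side, pose data remains far lighter, which supports the "more rapid and uninterrupted communication" point above.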
In Figure 7, an instantaneous representative view of the learning session as viewed by the instructor is given. Accordingly, there is a Pilates mat as the virtual object (133) in the virtual learning medium (130), and the virtual character (132) of the student is on the Pilates mat. Said virtual character (132) essentially comprises a head, body, arms, hands, legs and feet. The instructor can preferably see only some section of the arm and hand of his/her own virtual character (131). In the same manner, the student (90) can see the instructor from his/her side. In the applications illustrated in Figures 3, 4 and 6, the instructor (20) and the student (90) can use setups which are different from each other. For instance, the instructor may use the setup given in Figure 4 and the student (90) may use the setup given in Figure 3, or the instructor may use the setup given in Figure 5 and the student may use the setup given in Figure 3. These alternatives can be developed depending on the application requirements and personal capabilities. In alternative applications of the present invention, there may be more than one person, particularly on the student side. The positions of all these persons in the virtual learning medium are determined beforehand, and thus the instructor sees all these persons instantaneously and may follow the movements of all of them. The best example of this is the design of the virtual learning medium as a classroom. In this virtual classroom, there is an instructor stand and a plurality of desks opposite said stand. The desk of each student who will attend the virtual learning session is determined, and when the system is connected, the predetermined virtual character of said student is formed at said desk. By means of the subject matter system, whose operational details are given above, from this moment on, all of the movements of that person can be detected by the instructor connected to the virtual learning session.
By means of the subject matter system, training in very different fields can be provided in a very effective manner. As additional examples of said teachings, industry-oriented teachings (assembly or welding teachings), and teachings which are dangerous when sufficient competency has not been attained (critical experiments, bomb disposal, vehicle usage), can be given. Of course, the number of said examples can be substantially increased.

Claims

1. A virtual reality based learning system comprising a student movement sensing system (100) for sensing the movements of the student (90) in a student location (80), a student display unit (120), and a student terminal (110) connected to said student movement sensing system (100) and said student display unit (120), characterized in that:
- said learning system comprises an instructor movement sensing system (30) for detecting the movements of the instructor (20) in an instructor location (10), an instructor display unit (50) and an instructor terminal (40) connected to said instructor movement sensing system (30) and to said instructor display unit (50),
- said learning system comprises a server group (70) which provides the whole double sided instantaneous communication between the instructor terminal (40) and the student terminals (110) through a wireless communication network (60) and having at least one application server (71) and a virtual content server (72),
- at least one virtual learning application, providing transformation of the movements sensed by at least the instructor and student movement sensors into an instructor virtual character (131) or student virtual character (132) related to an instructor (20) and student (90), is loaded in said application server (71),
- at least one virtual learning medium (130), where at least the instructor virtual character (131) and student virtual character (132) are provided in interaction, is loaded in said virtual content server (72),
- the virtual content server (72) instantaneously transfers all of the movement sensing data and other predetermined data, reaching itself through the instructor terminal (40) and through the student terminal (110), to the other side, and thus the instructor (20) and the student (90) instantaneously see all of the movements of each other and the interactions of them with the virtual objects (133) in the virtual learning medium (130) by means of the instructor virtual character (131) and the student virtual character (132) in the virtual learning medium (130).
2. A virtual reality based learning system according to Claim 1, wherein the instructor movement sensing system (30) comprises at least one of an instructor hand-arm movement sensor (31), an instructor head movement sensor (32), an instructor body movement sensor (33), an instructor wearable movement sensor (34) and an eye movement sensor (35).
3. A virtual reality based learning system according to Claim 1, wherein the student movement sensing system (100) comprises at least one of a student hand-arm movement sensor (101), a student head movement sensor (102), a student body movement sensor (103), a student wearable movement sensor (104) and an eye movement sensor (105).
4. A virtual reality based learning system according to Claim 1, wherein the instructor terminal (40) comprises at least one of an instructor personal computer (41), an instructor smart cellular phone (42) or an instructor tablet (43).
5. A virtual reality based learning system according to Claim 1, wherein the student terminal (110) comprises at least one of a student personal computer (111), a student smart cellular phone (112) or a student tablet (113).
6. A virtual reality based learning system according to Claim 1, wherein the instructor display unit (50) comprises an instructor terminal screen (51) or a virtual reality eyeglass (52).
7. A virtual reality based learning system according to Claim 1, wherein the student display unit (120) comprises a terminal screen (121) or a student virtual reality eyeglass (122).
8. A virtual reality based learning system according to Claim 1, wherein the server group (70) moreover comprises a backup server (73) providing recording the whole learning session and which is in communication with the virtual content server (72).
9. A virtual reality based learning system according to Claim 1, wherein said other predetermined data are voice and/or written message data produced at least in the instructor location (10) and in the student location (80).
10. A virtual reality based learning method, characterized by comprising the steps of:
- sensing the movements of the instructor (20) by means of an instructor movement sensing system (30), connected to the instructor terminal (40), in an instructor location (10) and transferring all of the sensed movements to an instructor virtual character (131),
- sensing the movements of the student (90) by means of a student movement sensing system (100), connected to the student terminal (110), in a student location (80) and transferring all of the sensed movements to a student virtual character (132),
- providing access of the instructor and student terminals (40, 110) to a predetermined virtual learning medium (130) through a wireless communication network (60) by means of a server group (70), and providing interaction of the instructor virtual character (131) and the student virtual character (132) with the virtual objects (133) existing in said virtual learning medium (130),
- instantaneously transferring all of the movement sensing data and other predetermined data, reaching the server group (70) from the instructor terminal (40) and from the student terminal (110), to the other party by means of said server group (70), so that the instructor (20) and the student (90) instantaneously view all of each other's movements and their interactions with the virtual objects (133) existing in the virtual learning medium (130) by means of the instructor virtual character (131) and the student virtual character (132) existing in the virtual learning medium (130).
11. A virtual reality based learning method according to Claim 10, wherein the instructor movement sensing system (30) comprises at least one of an instructor hand-arm movement sensor (31), an instructor head movement sensor (32), an instructor body movement sensor (33) and an instructor wearable movement sensor (34).
12. A virtual reality based learning method according to Claim 10, wherein the student movement sensing system (100) comprises at least one of a student hand-arm movement sensor (101), a student head movement sensor (102), a student body movement sensor (103) and a student wearable movement sensor (104).
13. A virtual reality based learning method according to Claim 10, wherein the instructor terminal (40) comprises at least one of an instructor personal computer (41), an instructor smart cellular phone (42) or an instructor tablet (43).
14. A virtual reality based learning method according to Claim 10, wherein the student terminal (110) comprises at least one of a student personal computer (111), a student smart cellular phone (112) or a student tablet (113).
15. A virtual reality based learning method according to Claim 10, wherein the instructor display unit (50) comprises an instructor terminal screen (51) or an instructor virtual reality eyeglass (52).
16. A virtual reality based learning method according to Claim 10, wherein the student display unit (120) comprises a student terminal screen (121) or a student virtual reality eyeglass (122).
17. A virtual reality based learning method according to Claim 10, further comprising the step of recording the whole learning session by the server group (70).
18. A virtual reality based learning method according to Claim 10, wherein said predetermined data are voice and/or written message data produced at least in the instructor location (10) and in the student location (80).
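The relay behaviour recited in claims 1 and 10 — every movement-sensor sample reaching the server group from one terminal is instantly forwarded to the other party, and the session may be recorded as in claim 8 — can be sketched as a minimal, illustrative Python model. All class and method names here (`SessionRelay`, `MovementSample`, `Terminal`, `deliver`) are hypothetical and are not part of the patent; this is a sketch of the claimed data flow, not an implementation of it.

```python
from dataclasses import dataclass, field

@dataclass
class MovementSample:
    source: str   # "instructor" or "student"
    sensor: str   # e.g. "hand-arm", "head", "body", "eye"
    pose: tuple   # simplified pose data, e.g. (x, y, z)

@dataclass
class Terminal:
    role: str
    received: list = field(default_factory=list)

    def deliver(self, sample: MovementSample) -> None:
        # A real display unit (50/120) would apply the sample to the
        # other party's virtual character; here we only record it.
        self.received.append(sample)

class SessionRelay:
    """Plays the role of the virtual content server (72): every sample
    reaching it from one terminal is forwarded to the other terminal,
    and a session log stands in for the backup server (73) of claim 8."""

    def __init__(self, instructor: Terminal, student: Terminal):
        self.instructor = instructor
        self.student = student
        self.session_log = []  # records the whole learning session

    def relay(self, sample: MovementSample) -> None:
        self.session_log.append(sample)
        target = self.student if sample.source == "instructor" else self.instructor
        target.deliver(sample)
```

For example, relaying an instructor hand-arm sample makes it appear only on the student side, mirroring how each party sees the other's virtual character move.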
PCT/TR2016/050054 2015-03-06 2016-03-03 Virtual reality based remote learning system and method WO2016144279A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TR201502739 2015-03-06
TR2015/02739 2015-03-06

Publications (1)

Publication Number Publication Date
WO2016144279A1 true WO2016144279A1 (en) 2016-09-15

Family

ID=55806743

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/TR2016/050054 WO2016144279A1 (en) 2015-03-06 2016-03-03 Virtual reality based remote learning system and method

Country Status (1)

Country Link
WO (1) WO2016144279A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5950202A (en) * 1993-09-23 1999-09-07 Virtual Universe Corporation Virtual reality network with selective distribution and updating of data to reduce bandwidth requirements
TR200903142A1 (en) 2009-01-17 2010-08-23 Lockheed Martin Corporation Enveloped, head-mounted display unit and enclosed, co-operative environment using cave.
US20130063432A1 (en) * 2010-08-26 2013-03-14 Blast Motion, Inc. Virtual reality system for viewing current and previously stored or calculated motion data
KR101307609B1 (en) 2012-03-13 2013-09-12 동신대학교산학협력단 Apparatus for educating skin scuba based on realistic contents


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
KHATTAK SAAD ET AL: "A real-time reconstructed 3D environment augmented with virtual objects rendered with correct occlusion", 2014 IEEE GAMES MEDIA ENTERTAINMENT, IEEE, 22 October 2014 (2014-10-22), pages 1 - 8, XP032738617, ISBN: 978-1-4799-7545-7, [retrieved on 20150224], DOI: 10.1109/GEM.2014.7048102 *
WEBEL SABINE ET AL: "Immersive experience of current and ancient reconstructed cultural attractions", 2013 DIGITAL HERITAGE INTERNATIONAL CONGRESS (DIGITALHERITAGE), IEEE, vol. 1, 28 October 2013 (2013-10-28), pages 395 - 398, XP032568073, ISBN: 978-1-4799-3168-2, [retrieved on 20140218], DOI: 10.1109/DIGITALHERITAGE.2013.6743766 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019023400A1 (en) * 2017-07-28 2019-01-31 Baobab Studios Inc. Systems and methods for real-time complex character animations and interactivity
US10796469B2 (en) 2017-07-28 2020-10-06 Baobab Studios Inc. Systems and methods for real-time complex character animations and interactivity
US10810780B2 (en) 2017-07-28 2020-10-20 Baobab Studios Inc. Systems and methods for real-time complex character animations and interactivity
US10818061B2 (en) 2017-07-28 2020-10-27 Baobab Studios Inc. Systems and methods for real-time complex character animations and interactivity
US10937219B2 (en) 2017-07-28 2021-03-02 Baobab Studios Inc. Systems and methods for real-time complex character animations and interactivity
EP4138063A1 (en) * 2021-08-17 2023-02-22 Sony Interactive Entertainment LLC Game-based lesson plans and learning
US11734893B2 (en) 2021-08-17 2023-08-22 Sony Interactive Entertainment LLC Curating virtual tours
CN114170859A (en) * 2021-10-22 2022-03-11 青岛虚拟现实研究院有限公司 Online teaching system and method based on virtual reality
CN114170859B (en) * 2021-10-22 2024-01-26 青岛虚拟现实研究院有限公司 Online teaching system and method based on virtual reality

Similar Documents

Publication Publication Date Title
Azmandian et al. Haptic retargeting: Dynamic repurposing of passive haptics for enhanced virtual reality experiences
Orlosky et al. Virtual and augmented reality on the 5G highway
Leithinger et al. Physical telepresence: shape capture and display for embodied, computer-mediated remote collaboration
Paterson Feel the presence: technologies of touch and distance
Bowman et al. Virtual reality: how much immersion is enough?
US9747722B2 (en) Methods for teaching and instructing in a virtual world including multiple views
Piumsomboon et al. [POSTER] CoVAR: mixed-platform remote collaborative augmented and virtual realities system with shared collaboration cues
Bowman et al. Evaluating effectiveness in virtual environments with MR simulation
WO2016144279A1 (en) Virtual reality based remote learning system and method
WO2010062117A2 (en) Immersive display system for interacting with three-dimensional content
US10537815B2 (en) System and method for social dancing
CN205246971U (en) Display device with AR shows and VR shows switching each other
CN108805766B (en) AR somatosensory immersive teaching system and method
US9690784B1 (en) Culturally adaptive avatar simulator
Kawai et al. Tsunami evacuation drill system using smart glasses
Whitman et al. Virtual reality: its usefulness for ergonomic analysis
CN106126032A (en) The display packing of a kind of interactive interface and device
CN112287767A (en) Interaction control method, device, storage medium and electronic equipment
Dewez et al. Do you need another hand? investigating dual body representations during anisomorphic 3d manipulation
US9000899B2 (en) Body-worn device for dance simulation
Verma et al. Systematic review of virtual reality & its challenges
Elsayed et al. CameraReady: Assessing the Influence of Display Types and Visualizations on Posture Guidance
Patrão et al. A virtual reality system for training operators
Aras et al. Quantitative assessment of the effectiveness of using display techniques with a haptic device for manipulating 3D objects in virtual environments
Nalluri et al. Evaluation of virtual reality opportunities during pandemic

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16718036

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16718036

Country of ref document: EP

Kind code of ref document: A1