EP3122430A1 - Systems and methods for teaching and instructing in a virtual world including multiple views - Google Patents
Systems and methods for teaching and instructing in a virtual world including multiple views
- Publication number
- EP3122430A1 (Application EP15769019.9A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- user
- image
- providing visual
- instruction
- visual instruction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/30—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/65—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
- A63F13/655—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition by importing photos, e.g. of the player
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/003—Navigation within 3D models or images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/003—Repetitive work cycles; Sequence of movements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/156—Mixing image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/04—Indexing scheme for image data processing or generation, in general involving 3D image data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30221—Sports video; Sports image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2213/00—Indexing scheme for animation
- G06T2213/08—Animation software package
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/004—Annotating, labelling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2016—Rotation, translation, scaling
Definitions
- Systems and methods are provided for generating display images for a user, for teaching and instructing in a virtual world. More particularly, they relate to the use of multiple views in the display images.
- Virtual worlds, i.e., those that recreate or represent real three-dimensional (3D) space, can be used as environments to demonstrate, coach, and guide user body motion for the purposes of exercise and rehabilitation instruction.
- These digital environments can employ the use of an animated character, or avatar, in addition to other on-screen feedback to provide real time instruction and visual representation of motion.
- This motion specifically refers to limb or body motion.
- a second 'user' animation can be placed into the virtual 3D environment to provide side by side, preferably real time, representation of real user limb and body motion, including physical orientation to the camera, real world surroundings, and location within the virtual environment (including perceived interaction with virtual elements).
- the system includes a user imaging system with the system generating an output adapted to couple to a display device.
- the systems and methods serve to guide a user body motion, such as for exercise instruction or rehabilitation purposes.
- the method includes the steps of receiving first user positional information from the user imaging system, and then generating a first mirror image of the user positional information. Additionally, the method includes generating a first instructional image having the same positional orientation as the first mirror image of the user positional information. Finally, the method and system generate a composite output display including the first mirror image of the user positional information and the first instructional image.
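The sequence of steps just described can be sketched as a minimal per-frame pipeline. The function names, list-based frame representation, and side-by-side layout below are illustrative assumptions, not an implementation taken from the patent:

```python
def mirror_image(frame):
    """Horizontally flip a frame (given as rows of pixel values)
    to produce its mirror image."""
    return [row[::-1] for row in frame]

def compose(instruction, user_mirror):
    """Place the instructional image and the mirrored user image
    side by side in one composite display frame."""
    return [a + b for a, b in zip(instruction, user_mirror)]

# Hypothetical pipeline: receive positional image data from the imaging
# system, mirror it, pair it with an instructional image rendered in the
# same orientation, and composite both into one output frame.
user_frame = [[1, 2, 3], [4, 5, 6]]            # stand-in for camera data
instruction_frame = [[7, 8, 9], [10, 11, 12]]  # stand-in for avatar render
display = compose(instruction_frame, mirror_image(user_frame))
print(display)  # [[7, 8, 9, 3, 2, 1], [10, 11, 12, 6, 5, 4]]
```

The composite here simply concatenates rows; a real renderer would blend the two images into a shared 3D scene.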
- the method additionally includes a second instructional image in a mirror image orientation to the first instructional image.
- this second image could comprise a view of the back of the instructor whose front is depicted in the first instructional image.
- the display may optionally include an image of the user, as well as both a front and back display for the instructional image.
- First, this innovation provides synchronous multi-view animations that show more than one point of view of the instructed motion in the virtual world.
- Second, it provides a virtual animated 'mirror' that displays additional angles of view and other feedback of the animated avatar.
- Third, it provides representation of both the guiding animation and the synchronized patient avatar within the confines of the virtual mirror element.
- Fourth, it provides for dynamic state changes of the mirror element as needed depending on the context of the instruction, e.g., obscuration of the reflection, zooming in and out, or other image manipulation.
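The dynamic state change of the mirror element might be modeled, under assumed state names and instruction contexts not specified in the patent, as a small state mapping:

```python
from enum import Enum, auto

class MirrorState(Enum):
    """Display states for the virtual mirror element (names assumed)."""
    OBSCURED = auto()  # matte surface, no reflection; focus on the coach
    REFLECT = auto()   # shows mirrored coach and/or patient avatars
    ZOOMED = auto()    # magnified view, e.g. on the coach's face

def mirror_state_for(context):
    """Pick a mirror state from the instruction context (illustrative)."""
    return {
        "briefing": MirrorState.OBSCURED,  # minimize distraction while talking
        "exercise": MirrorState.REFLECT,   # show synced avatars in the mirror
        "detail": MirrorState.ZOOMED,      # zoom for fine technique points
    }.get(context, MirrorState.REFLECT)
```

A mapping like this lets the application switch the mirror treatment as the session moves between instruction, exercise, and review phases.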
- Fig. 1 shows a display with a coach avatar in the foreground and the mirror portion obscured.
- Fig. 2A shows a display with the coach avatar in the foreground, and the patient and coach avatar in the mirror.
- Fig. 2B shows the image of Fig. 2A but including annotations.
- Fig. 3A shows a display with a coach avatar, but with the patient image in a mirrored orientation.
- Fig. 3B shows the display image of Fig. 3A but including annotations.
- Fig. 4 shows a display unit showing a display image of a mirror version of the user's positional information, as well as both a first instructional image having the same positional orientation as the first mirror image and a second instructional image in a mirror image orientation to the first instructional image.
- Fig. 5 shows the components of the overall system including the user position monitor and display.
- Fig. 1 shows a display with a coach avatar 20 in the foreground and the mirror portion 22 obscured.
- the mirror portion is shown bounded by a frame 24 as an optional aid for the user to appreciate that the image shown in the mirror portion 22 is rendered as a mirror image of another object (e.g., the instructor and/or the user).
- This view minimizes environmental distraction and focuses the patient on the information being conveyed by the exercise coach.
- The terms 'instructor' and 'coach' are used interchangeably, and refer to the image intended to display the desired action or position to the user of the system. This view is used when important information is being conveyed to the patient prior to, in between, or after an exercise session.
- This view can be 'zoomed' in to put more focus on the coach's face as she is talking to the patient.
- the mirror treatment may vary.
- the mirror portion is here shown in an extreme matte format so as to avoid providing 'reflected' information that is distracting or not necessary or useful for the user.
- Fig. 2A shows a display with the coach avatar 20 in the foreground on the left-hand side of the display, and the patient and coach avatar in the mirror region 22.
- This view helps to demonstrate to the patient the proper technique to safely perform a clinician prescribed exercise.
- The position and actions of the instructional avatar are depicted from the back in the foreground, and from the front in the mirrored display portion 22 of the display.
- This view is primarily used for instructing exercises and during the initial repetitions of a patient exercising.
- Depth cameras allow persons to be rendered on a screen; the individual is displayed as a mirrored version of themselves 28 in the mirrored section 22 of the display.
- Fig. 2B shows the image of Fig. 2A but including annotations. Note the connection to the coach avatar's 'right hand' as it would be in 'reality' (on the right of the body) and as it appears 'mirrored' in the mirror (on the left side of the body). This view helps the patient to understand that mirroring is occurring and helps to visually inform them that they should sync their movements with the avatar in the foreground such that they appear perfectly synced in the mirror.
- The annotations include specific identification of the avatars, identifying the 'avatar in foreground' and 'avatar in mirror', and identify the corresponding hand of the avatar ('avatar right hand').
- a rendering 28 of the user or patient (the terms are used interchangeably herein).
- the user positional information is used to generate a mirror image of the user, and then to provide a rendering 28 within the mirrored portions 22 of the display.
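The mirroring of the user's positional information can be illustrated with a small sketch. The joint names and the convention of flipping the x coordinate (and swapping left/right labels) are assumptions for illustration, not details from the patent:

```python
def mirror_pose(joints):
    """Mirror a set of named 3D joints (x, y, z) about the vertical plane.

    Left/right labels are swapped so that, as in a real mirror, the
    user's right hand appears on the left side of the mirrored rendering.
    """
    mirrored = {}
    for name, (x, y, z) in joints.items():
        if name.startswith("left_"):
            name = "right_" + name[len("left_"):]
        elif name.startswith("right_"):
            name = "left_" + name[len("right_"):]
        mirrored[name] = (-x, y, z)  # flip the horizontal coordinate
    return mirrored

pose = {"right_hand": (0.4, 1.2, 0.0), "head": (0.0, 1.7, 0.0)}
mirrored = mirror_pose(pose)
# The right hand is now labeled as the left hand and sits at negative x,
# matching the 'avatar right hand' annotation discussed for Fig. 2B.
```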
- the system compares the user positional information with the desired position and/or action as shown in the first instructional image. For example, if the user performs an action exactly as shown by the instructional image, the instructional image and the user positional images would show similarity of actions and positions.
- the differences between the desired instructional image position and/or action may be displayed to the user.
- the instructional image is to perform a squat exercise
- The user will be instructed that their action does not qualify to count as a rep (repetition) of the exercise.
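The rep-qualification logic described above can be sketched, for a squat, as a simple threshold-crossing counter. The hip-height signal and the threshold values are illustrative assumptions, not parameters from the patent:

```python
def count_squat_reps(hip_heights, down_thresh=0.6, up_thresh=0.9):
    """Count squat repetitions from a sequence of hip-height samples (meters).

    A rep counts only when the hip drops below down_thresh and then
    returns above up_thresh; a shallow squat does not qualify.
    """
    reps, descended = 0, False
    for h in hip_heights:
        if not descended and h < down_thresh:
            descended = True   # user reached the bottom of the squat
        elif descended and h > up_thresh:
            descended = False
            reps += 1          # full range of motion: count the rep
    return reps

print(count_squat_reps([1.0, 0.8, 1.0]))              # 0: too shallow
print(count_squat_reps([1.0, 0.5, 1.0, 0.55, 0.95]))  # 2: two full reps
```

A counter like this would drive the "0/3" rep display mentioned for Fig. 3A.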
- An overlay image 30, such as a box, may be used both in the foreground and in the mirror image 22 as an aid to the user to identify the user position in space and on the display.
- Fig. 3A shows a display with a coach avatar 20, but with the patient image 28 in a mirrored orientation. This state of the application is used once the patient has demonstrated competence in the exercise movement as it was instructed and initiated in the state of Fig. 2A (see above). Again, if the user positional information corresponds to the position or range of positions or actions that are deemed compliant by the system, then the user is optionally advised of the number of reps completed. As shown in Fig. 3A, "0/3" or zero of the three required reps have been performed.
- Fig. 3B shows the display image of Fig. 3A but including annotations. The images are labeled as the 'coach avatar' 20 and the 'patient' (or user) 28.
- the patient has been represented as an avatar identical to the coach avatar.
- The patient 'look' may be in any form known to those skilled in the art, ranging from a photo-realistic image of the user, to an abstracted image, avatar, or cartoon image of the user, or any arbitrary image, avatar, or cartoon.
- Fig. 4 shows a display unit 10 showing a display image of a mirror version of the user's positional information, as well as both a first instructional image having the same positional orientation as the first mirror image and a second instructional image in a mirror image orientation to the first instructional image.
- FIG. 5 shows a perspective view of components of the overall system.
- a display device 10 is oriented toward a user 12 of the system.
- a user imaging unit is oriented toward the user 12.
- The user imaging system 14 is preferably an imaging system capable of determining the spatial position of the user 12 in real time, preferably in color.
- One or more (multiple) user imaging systems may be utilized.
- the processing system 16 is coupled to the user imaging system(s) 14 to receive information relating to the user position and to compute the user positional information.
- The processing system 16 preferably comprises computer hardware and software to perform the functions described herein.
- the processing system 16 compares the desired position as depicted by the first instructional image with the user positional information, such as to determine the differences between the two positions, such as to determine whether the user is complying with the desired action as shown by the first instructional image.
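The comparison the processing system 16 performs might be sketched as a joint-by-joint distance check. The joint names, units, and tolerance below are assumed for illustration, not values from the patent:

```python
import math

def pose_deviation(user_pose, target_pose):
    """Mean Euclidean distance (in meters, by assumption) between
    corresponding named joints of the user pose and the instructed pose."""
    dists = [math.dist(user_pose[j], target_pose[j]) for j in target_pose]
    return sum(dists) / len(dists)

def is_compliant(user_pose, target_pose, tolerance=0.1):
    """Treat the user as complying with the instructional image when the
    average joint deviation is within the tolerance."""
    return pose_deviation(user_pose, target_pose) <= tolerance

target = {"hand": (0.0, 1.0, 0.0), "elbow": (0.0, 0.7, 0.0)}
user = {"hand": (0.05, 1.0, 0.0), "elbow": (0.0, 0.7, 0.0)}
print(is_compliant(user, target))  # True: mean deviation is 0.025 m
```

The same deviation value could also be displayed to the user directly, per the bullet above on showing differences between the desired and actual positions.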
- the graphic to be rendered on the display device 10 is generated by the processing system 16, and coupled to the display device 10 via the output 18.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- General Health & Medical Sciences (AREA)
- Business, Economics & Management (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Public Health (AREA)
- Primary Health Care (AREA)
- Computer Graphics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Medical Informatics (AREA)
- Computer Hardware Design (AREA)
- Epidemiology (AREA)
- Physical Education & Sports Medicine (AREA)
- Biophysics (AREA)
- Educational Technology (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Social Psychology (AREA)
- Psychiatry (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Educational Administration (AREA)
- Entrepreneurship & Innovation (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201461970609P | 2014-03-26 | 2014-03-26 | |
PCT/US2015/022504 WO2015148676A1 (en) | 2014-03-26 | 2015-03-25 | Systems and methods for teaching and instructing in a virtual world including multiple views |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3122430A1 true EP3122430A1 (en) | 2017-02-01 |
EP3122430A4 EP3122430A4 (en) | 2017-11-15 |
Family
ID=57589823
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP15769019.9A Withdrawn EP3122430A4 (en) | 2014-03-26 | 2015-03-25 | Systems and methods for teaching and instructing in a virtual world including multiple views |
Country Status (1)
Country | Link |
---|---|
EP (1) | EP3122430A4 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11990219B1 (en) | 2018-05-01 | 2024-05-21 | Augment Therapy, LLC | Augmented therapy |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8913809B2 (en) * | 2012-06-13 | 2014-12-16 | Microsoft Corporation | Monitoring physical body changes via image sensor |
-
2015
- 2015-03-25 EP EP15769019.9A patent/EP3122430A4/en not_active Withdrawn
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11990219B1 (en) | 2018-05-01 | 2024-05-21 | Augment Therapy, LLC | Augmented therapy |
Also Published As
Publication number | Publication date |
---|---|
EP3122430A4 (en) | 2017-11-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9747722B2 (en) | Methods for teaching and instructing in a virtual world including multiple views | |
US10622111B2 (en) | System and method for image registration of multiple video streams | |
Blum et al. | mirracle: An augmented reality magic mirror system for anatomy education | |
Schmalstieg et al. | Augmented reality: principles and practice | |
Gauglitz et al. | Integrating the physical environment into mobile remote collaboration | |
Juan et al. | An augmented reality system for learning the interior of the human body | |
US11749137B2 (en) | System and method for multisensory psychomotor skill training | |
US20100295921A1 (en) | Virtual Interactive Presence Systems and Methods | |
Luciano et al. | Design of the immersivetouch: a high-performance haptic augmented virtual reality system | |
Ilie et al. | Combining head-mounted and projector-based displays for surgical training | |
Rodrigue et al. | Mixed reality simulation with physical mobile display devices | |
EP3122430A1 (en) | Systems and methods for teaching and instructing in a virtual world including multiple views | |
Andersen et al. | Augmented visual instruction for surgical practice and training | |
Sherstyuk et al. | Dynamic eye convergence for head-mounted displays | |
Riva et al. | Virtual reality as telemedicine tool: technology, ergonomics and actual applications | |
Schwede et al. | HoloR: Interactive mixed-reality rooms | |
Chessa et al. | Insert your own body in the oculus rift to improve proprioception | |
Jadeja et al. | New era of teaching learning: 3D marker based augmented reality | |
Rodríguez-D’Jesús et al. | 360 video recording inside a GI endoscopy room: technical feasibility and its potential use for the acquisition of gastrointestinal endoscopy skills. Pilot experience | |
Guo et al. | A portable immersive surgery training system using RGB-D sensors | |
Andersen | Effective User Guidance Through Augmented Reality Interfaces: Advances and Applications | |
Cidota et al. | [POSTER] Affording Visual Feedback for Natural Hand Interaction in AR to Assess Upper Extremity Motor Dysfunction | |
Deakyne et al. | Development of Anaglyph 3D Functionality for Cost-Effective Virtual Reality Anatomical Education Tool | |
Wischgoll | Toward the Comparison of Different VR Devices for Visualization | |
Garcia | A novel asymmetric collaboration method for extended reality training systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20160927 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
A4 | Supplementary search report drawn up and despatched |
Effective date: 20171018 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: A63F 13/655 20140101AFI20171012BHEP Ipc: A63F 9/24 20060101ALI20171012BHEP |
|
17Q | First examination report despatched |
Effective date: 20191004 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN |
|
18W | Application withdrawn |
Effective date: 20200227 |