CN115083563A - Multi-sense-organ immersion interactive virtual reality rehabilitation training method - Google Patents

Info

Publication number
CN115083563A
CN115083563A
Authority
CN
China
Prior art keywords
child
information
virtual reality
immersive
interactive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210595889.7A
Other languages
Chinese (zh)
Inventor
欧剑
郑亚冰
罗逊
王军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN202210595889.7A
Publication of CN115083563A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/30 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 - Input arrangements with force or tactile feedback as computer generated output to the user
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/70 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 - Indexing scheme relating to G06F3/01
    • G06F2203/012 - Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychology (AREA)
  • Psychiatry (AREA)
  • Developmental Disabilities (AREA)
  • Social Psychology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a multi-sensory immersive interactive virtual reality rehabilitation training method. An immersive therapeutic rehabilitation scene is presented through a virtual reality system; the scene uses a display system built from flexible LED panels fitted with acoustically transparent (sound-penetrating) speakers. Autistic children perform interactive sensory-integration training with the virtual environment through body motion and touch, and can also interact with it through physical teaching aids; the virtual environment produces image, sound, and force-feedback responses to the child's interaction. Ultra-high-definition cameras and structured-light cameras work in concert to track the skeleton of the child's movement, track the child's face in real time, collect facial-expression and viewpoint information, and compute the hotspot region the child is watching. The invention can dynamically switch the sensory-integration training scene and can construct a training environment that integrates multiple rehabilitation methods in the form of immersive images, immersive sound, and force feedback.

Description

Multi-sense organ immersion interactive virtual reality rehabilitation training method
Technical Field
The invention relates to a multi-sensory immersive interactive virtual reality rehabilitation training method.
Background
In China, the incidence of autism among children is about 1 in 100. Rehabilitation training for autistic children relies mainly on sensory-integration training, which is usually carried out in a sensory-integration training room. Such training rooms are important places for the rehabilitation and education of special-needs children, aiming to improve intelligence, increase social adaptability, restore limb function, stimulate the students' interests, and promote their all-round development. The functional rooms generally comprise a sensory-integration training room, a multi-sensory training room, a psychological counseling room, a rhythm room, an emotion-release room, an auditory rehabilitation training room, an exercise rehabilitation room, and the like.
Usually these functionally distinct training rooms are placed in separate rooms rather than integrated into one space, are typically decorated as static environments, and rely mainly on physical props for sensory training. It is therefore difficult to intervene in the treatment of autistic children with a combination of methods.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a multi-sensory immersive interactive virtual reality rehabilitation training method. The method uses an immersive LED display to construct a fully immersive virtual reality environment in which autistic children complete personalized sensory-integration training prescriptions through multiple input modes such as interactive teaching aids, body sensing, and voice. The virtual reality environment outputs different training effects for different inputs and gives the children feedback through output modes such as real-time interactive images, an immersive sound field, physical vibration, and electrotactile stimulation. The overall rehabilitation system also performs personalized, precise recording, evaluation, and analysis of the children's training in the virtual environment through motion-analysis, dynamic face and eye-movement tracking, and voice-analysis technologies.
The purpose of the invention is achieved by the following technical solution:
A multi-sensory immersive interactive virtual reality rehabilitation training method comprises the following steps:
Step one: an immersive virtual display environment is constructed from LED panels; a virtual scene is rendered by a graphics processing program and then displayed on the LED surface.
Step two: a high-definition camera and a depth camera are installed at the top of the immersive environment and pointed at the observed subject. The cameras locate the child's spatial motion; inverse-dynamics solving yields the child's skeleton information, whose motion parameters are analyzed and converted into the child's interactive input in the virtual environment, according to which the virtual reality engine dynamically updates the scene graphics and sound effects.
Step three: the high-definition camera and depth camera spatially locate the teaching aid; the positioning information is transmitted in real time through a wireless communication module, the virtual reality engine updates, draws, and computes the virtual scene according to the teaching aid's position, and the teaching aid's vibration motor and electrotactile module are driven through the wireless communication module.
Step four: the high-definition camera and depth camera are used to compute the line of sight and its intersection with the scene, determining the gaze hotspot region and hotspot object.
Step five: the child's face landmarks are converted into FACS codes, the FACS code weights are recorded, and the child's emotional state is inferred.
Step six: the skeleton positioning information, gaze hotspot information, hotspot object information, 3D teaching-aid position information, the child's reaction time, and the expression code record are synchronized and written into a data file in JSON format; adjustment and intervention are made according to the data file, and rehabilitation performance is recorded and evaluated using motion-analysis and eye-tracking equipment.
Step seven: according to the data from step six, a positive-feedback reinforcement method strengthens the visual and sound elements that are effective for the child's rehabilitation, and the training is repeated.
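The gaze-scene intersection of step four can be sketched as a ray test against simple sphere proxies for the scene objects; the function name, the sphere-proxy geometry, and the coordinates below are illustrative assumptions, not the patent's actual implementation.

```python
import math

def intersect_gaze(origin, direction, objects):
    """Cast the estimated 3D gaze ray against sphere proxies for scene
    objects and return the name of the nearest one hit (the 'hotspot
    object'), or None.  origin/direction are 3-tuples in the room's
    local coordinate system; direction need not be normalized."""
    dlen = math.sqrt(sum(d * d for d in direction))
    d = tuple(c / dlen for c in direction)
    best = (float("inf"), None)
    for name, center, radius in objects:
        # vector from the ray origin to the sphere center
        oc = tuple(c - o for c, o in zip(center, origin))
        t = sum(a * b for a, b in zip(oc, d))       # projection onto the ray
        if t < 0:
            continue                                # object is behind the child
        closest = tuple(o + t * di for o, di in zip(origin, d))
        dist2 = sum((c - p) ** 2 for c, p in zip(center, closest))
        if dist2 <= radius * radius and t < best[0]:
            best = (t, name)
    return best[1]
```

Under this sketch, a child at the room origin looking straight at a virtual ball would yield "ball" as the hotspot object, while objects behind the child or off the gaze ray are ignored.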
Compared with the prior art, the invention has the following advantages:
1. Unlike a traditional sensory-integration training room, the invention can dynamically switch the sensory-integration training scene and can simultaneously construct a training environment that integrates multiple rehabilitation methods in the form of immersive images, immersive sound, and force feedback.
2. The invention can capture the child's skeletal motion, face orientation, and gaze direction and compute the viewpoint hotspot region without contact or wearable devices, and can analyze the child's attention state in real time and perform behavior analysis on it.
3. The structured-light depth camera tracks the spatial position of each teaching aid in real time to determine which aid the child is manipulating; the teaching aid itself provides physical vibration feedback and electrotactile stimulation, giving the child tactile stimuli that reinforce the perceptual effect.
4. The invention is mainly applied to sensory-integration training and intervention therapy for autistic children and can be deployed in hospitals and psychotherapy centers. It provides interactive sensory-integration training in an immersive virtual reality environment, dynamically adjusts the intervention according to the children's motion input, voice input, and emotional response, and records and evaluates their rehabilitation performance using motion-analysis and eye-tracking equipment.
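The wireless drive of the teaching aid's vibration motor and electrotactile module (advantage 3) could use a compact command packet such as the following sketch; the byte layout, field names, and units are assumptions, since the patent does not specify the wireless protocol.

```python
import struct

def encode_haptic_command(device_id, vibration, etactile_ua, duration_ms):
    """Pack one feedback command for a teaching aid: device id,
    vibration motor duty cycle (0.0-1.0), electrotactile current in
    microamps, and pulse duration.  The big-endian !BHHH layout is an
    assumed convention, not the patent's protocol."""
    if not 0.0 <= vibration <= 1.0:
        raise ValueError("vibration duty cycle must be in [0, 1]")
    # ! = network byte order; B = device id, H = duty in 1/65535 steps,
    # H = current (uA), H = duration (ms)
    return struct.pack("!BHHH", device_id,
                       round(vibration * 65535), etactile_ua, duration_ms)

def decode_haptic_command(packet):
    """Inverse of encode_haptic_command, as the aid's firmware might run it."""
    dev, duty, ua, ms = struct.unpack("!BHHH", packet)
    return dev, duty / 65535, ua, ms
```

A 7-byte packet like this could be sent over the wireless communication module each time the virtual reality engine decides a force-feedback effect is required.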
Drawings
FIG. 1 is a flow diagram of the multi-sensory immersive interactive virtual reality rehabilitation training;
FIG. 2 is a 3D view of the environment constructed for multi-sensory immersive interactive virtual reality rehabilitation training;
FIG. 3 is a front view of the environment constructed for multi-sensory immersive interactive virtual reality rehabilitation training;
FIG. 4 is a virtual scene constructed for virtual reality rehabilitation training;
FIG. 5 illustrates bone tracking, gaze calculation, and viewpoint hotspot calculation using information fusion techniques;
FIG. 6 shows the mounting positions of the cameras;
FIG. 7 shows the chamfered flexible LED screen.
Detailed Description
The technical solution of the present invention is further described below with reference to the accompanying drawings, but is not limited thereto; any modification or equivalent replacement that does not depart from the spirit and scope of the technical solution of the present invention shall fall within the protection scope of the present invention.
The invention provides a multi-sensory immersive interactive virtual reality rehabilitation training method in which an immersive therapeutic rehabilitation scene is presented through a virtual reality system and the autistic child is placed inside it. The rehabilitation scene uses a display system built from flexible LED panels, which themselves carry acoustically transparent speakers. The autistic child performs interactive sensory-integration training with the virtual environment through body motion and touch, and can also interact with it through physical teaching aids; the virtual environment produces image, sound, and force-feedback responses to the child's interaction. Ultra-high-definition cameras and structured-light cameras work in concert to track the skeleton of the child's movement, track the child's face in real time, collect facial-expression and viewpoint information, and compute the hotspot region the child is watching. As shown in fig. 1, the specific steps are as follows:
1. An immersive virtual display environment is constructed from LED panels on the ceiling, floor, and walls. The virtual scene is rendered by a graphics processing program and displayed on the LED surfaces, and the scene image is dynamically updated according to the 3D position of the child's viewpoint captured by the cameras, so that the image the child sees is seamless, has correct perspective, and is not distorted by the corners.
2. The child's motion information in the space is obtained with several ultra-high-definition cameras and depth cameras, and the 3D information of the child's skeleton is extracted; this skeletal 3D information is established in the local coordinate system of the immersive environment.
3. Each physical teaching aid carries an infrared-laser-based locator and weak force feedback from a vibration motor. The locator transmits its spatial position over a wireless network; this position information is also established in the local coordinate system of the immersive environment. When a force-feedback effect is required, the aid emits vibration and a slight current stimulus.
4. The high-definition cameras locate the face using a face-landmark method, perform 3D gaze estimation on the eyes in combination with the skeletal 3D information, and intersect the 3D gaze with objects in the scene to determine the child's current object of attention. Meanwhile, the system computes weights for the FACS expression codes of the child's face and infers the child's emotional state.
5. Because the child's viewpoint, gaze, teaching-aid 3D position, and skeletal 3D information are all established in the local coordinate system of the immersive environment, and the virtual scene information coincides with that coordinate system after calibration, the 3D information is unified in the spatial dimension. The system time-synchronizes the rendering of the virtual scene, the video cameras, the structured-light cameras, and the locators through a timecode technique, ensuring that the information records are unified in the time dimension and realizing information fusion.
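The timecode-based information fusion described in point 5 can be illustrated with a minimal nearest-sample alignment of the sensor streams; the tolerance value and the nearest-sample policy are illustrative choices, not taken from the patent.

```python
def fuse_by_timecode(streams, frame_tc, tolerance_ms=20):
    """Given per-sensor sample lists [(timecode_ms, payload), ...], pick
    for each stream the sample whose timecode is nearest to the render
    frame's timecode; a stream with no sample within tolerance yields
    None, so downstream consumers can see the gap."""
    fused = {}
    for name, samples in streams.items():
        best = None
        for tc, payload in samples:
            delta = abs(tc - frame_tc)
            if delta <= tolerance_ms and (best is None or delta < best[0]):
                best = (delta, payload)
        fused[name] = best[1] if best else None
    return fused
```

With all streams stamped against one shared timecode source, each rendered frame can thus be paired with the skeleton, gaze, and locator samples closest to it in time.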
The invention solves the following technical problems:
1. Immersive images, a panoramic sound field, body-sensing information, force feedback, and interactive teaching aids are spatially integrated through an information fusion technique, providing a highly immersive interactive virtual reality environment.
2. Several ultra-high-definition 8K cameras track the faces of autistic children and compute their gaze hotspots and eye-movement information in the virtual reality environment.
3. Several ultra-high-definition 8K cameras record markerless motion data of autistic children, together with the children's interaction information with physical and virtual teaching aids.
4. The highly immersive interactive virtual reality environment plays virtual reality scenes with interventional therapeutic effects.
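The markerless multi-camera motion capture in point 3 reduces, per joint, to triangulating a 3D point from rays cast by calibrated cameras. A minimal two-ray midpoint method is sketched below under the assumption that the camera centers and back-projected ray directions are already known; real systems would fuse more views and filter over time.

```python
def triangulate_midpoint(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two viewing rays, a
    minimal stand-in for multi-view markerless joint triangulation.
    o1/o2 are camera centers, d1/d2 the back-projected ray directions
    of the same joint in the two views (all 3-tuples)."""
    def dot(u, v):
        return sum(x * y for x, y in zip(u, v))
    w0 = tuple(x - y for x, y in zip(o1, o2))
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b               # zero only for parallel rays
    if abs(denom) < 1e-12:
        raise ValueError("rays are (nearly) parallel")
    t = (b * e - c * d) / denom         # parameter along ray 1
    s = (a * e - b * d) / denom         # parameter along ray 2
    p1 = tuple(o + t * di for o, di in zip(o1, d1))
    p2 = tuple(o + s * di for o, di in zip(o2, d2))
    return tuple((x + y) / 2 for x, y in zip(p1, p2))
```

Two cameras on either side of the room whose rays both point at the same joint recover that joint's 3D position in the room's local coordinate system.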
Application example:
1. A chamfered LED environment (as shown in FIG. 7) was constructed from flexible LED modules. Dimensions: 5000 mm (width) x 3000 mm (depth) x 4000 mm (height), with an 800 mm chamfer and an LED pixel pitch of 1.9 mm. The soft modules use load-bearing carbon-fiber panels and acoustically transparent speakers, so sound passes through the screen. The tracking system uses 4 to 8 cameras, installed as shown in FIG. 6.
2. Several high-definition cameras combined with a depth camera locate the child's spatial motion; inverse-dynamics solving yields the child's skeleton information, whose motion parameters are analyzed and converted into the child's interactive input in the virtual environment, according to which the virtual reality engine dynamically updates the scene graphics and sound effects.
3. Several high-definition cameras combined with the depth camera spatially locate the teaching aid. The positioning information is transmitted in real time through a wireless communication module; the system's virtual reality engine computes the relevant results according to the teaching aid's position and drives the aid's vibration motor and electrotactile module through the wireless communication module.
4. The high-definition cameras combined with the depth cameras compute the line of sight and its intersection with the scene to determine the gaze hotspot region and hotspot object.
5. The child's face landmarks were converted into FACS codes and their weights recorded.
6. The skeleton positioning information, gaze hotspot information, hotspot object information, 3D teaching-aid position information, the child's reaction time, and the expression code record are synchronized and then written into a data file in JSON format.
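One synchronized sample of the JSON record described in step 6 might look like the following sketch; the key names and field shapes are illustrative assumptions, since the patent specifies only that the synchronized data are written to a json-format file.

```python
import json

def make_session_record(timecode_ms, skeleton, gaze_hotspot, hotspot_object,
                        teaching_aids, reaction_time_ms, facs_weights):
    """Serialize one synchronized training sample to a JSON line.
    The schema is an assumed layout for the fields listed in step 6."""
    record = {
        "timecode_ms": timecode_ms,          # shared timecode of the sample
        "skeleton": skeleton,                # joint name -> [x, y, z] (m)
        "gaze_hotspot": gaze_hotspot,        # gaze hit point in the scene
        "hotspot_object": hotspot_object,    # name of the attended object
        "teaching_aids": teaching_aids,      # aid id -> [x, y, z] (m)
        "reaction_time_ms": reaction_time_ms,
        "facs": facs_weights,                # FACS action unit -> weight
    }
    return json.dumps(record, ensure_ascii=False)
```

Appending one such line per synchronized frame gives a session log that later adjustment, intervention, and evaluation tools can replay.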

Claims (3)

1. A multi-sensory immersive interactive virtual reality rehabilitation training method, characterized by comprising the following steps:
step one: constructing an immersive virtual display environment from LED panels, rendering a virtual scene through a graphics processing program, and displaying it on the LED surface;
step two: locating the child's spatial motion with a high-definition camera and a depth camera, solving through inverse dynamics to obtain the child's skeleton information, analyzing its motion parameters and converting them into the child's interactive input in the virtual environment, according to which a virtual reality engine dynamically updates the scene graphics and sound effects;
step three: spatially locating the teaching aid with the high-definition camera and depth camera, transmitting the positioning information in real time through a wireless communication module, updating, drawing, and computing the virtual scene by the virtual reality engine according to the teaching aid's position, and driving the teaching aid's vibration motor and electrotactile module through the wireless communication module;
step four: computing the line of sight and its intersection with the scene using the high-definition camera and depth camera, and determining the gaze hotspot region and hotspot object;
step five: converting the child's face landmarks into FACS codes, recording the FACS code weights, and inferring the child's emotional state;
step six: synchronizing the skeleton positioning information, gaze hotspot information, hotspot object information, 3D teaching-aid position information, the child's reaction time, and the expression code record, then writing them into a data file in JSON format, making adjustments and interventions according to the data file, and recording and evaluating rehabilitation performance using motion-analysis and eye-tracking equipment;
step seven: according to the data from step six, strengthening the visual and sound elements effective for the child's rehabilitation with a positive-feedback reinforcement method, and training repeatedly.
2. The multi-sensory immersive interactive virtual reality rehabilitation training method of claim 1, wherein in step one the immersive virtual display environment is constructed from LED panels covering the ceiling, floor, and walls.
3. The multi-sensory immersive interactive virtual reality rehabilitation training method of claim 1, wherein in step two the high-definition camera and depth camera are installed at the top of the immersive environment and pointed at the observed subject.
CN202210595889.7A 2022-05-30 2022-05-30 Multi-sense-organ immersion interactive virtual reality rehabilitation training method Pending CN115083563A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210595889.7A CN115083563A (en) 2022-05-30 2022-05-30 Multi-sense-organ immersion interactive virtual reality rehabilitation training method


Publications (1)

Publication Number Publication Date
CN115083563A true CN115083563A (en) 2022-09-20

Family

ID=83248867

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210595889.7A Pending CN115083563A (en) 2022-05-30 2022-05-30 Multi-sense-organ immersion interactive virtual reality rehabilitation training method

Country Status (1)

Country Link
CN (1) CN115083563A (en)

Similar Documents

Publication Publication Date Title
KR102045260B1 (en) Simulation method for training first aid treatment using augmented reality and virtual reality
CN109817031B (en) Limbs movement teaching method based on VR technology
KR101381594B1 (en) Education apparatus and method using Virtual Reality
US20090046140A1 (en) Mobile Virtual Reality Projector
CN105144248A (en) Information processing device and information processing method, display device and display method, and information processing system
JP7073481B2 (en) Image display system
CN106205245A (en) Immersion on-line teaching system, method and apparatus
US20090238378A1 (en) Enhanced Immersive Soundscapes Production
CN106293087B (en) A kind of information interacting method and electronic equipment
CN107103801A (en) Long-range three-dimensional scenic interactive education system and control method
CN109951718A (en) A method of it can 360 degree of panorama captured in real-time live streamings by 5G and VR technology
CN105653020A (en) Time traveling method and apparatus and glasses or helmet using same
CN112102667A (en) Video teaching system and method based on VR interaction
Kawai et al. A support system for visually impaired persons to understand three-dimensional visual information using acoustic interface
WO2017095199A1 (en) Virtual reality experience system
CN115083563A (en) Multi-sense-organ immersion interactive virtual reality rehabilitation training method
IJsselsteijn et al. A room with a cue: The efficacy of movement parallax, occlusion, and blur in creating a virtual window
CN107783639A (en) Virtual reality leisure learning system
CN108416255B (en) System and method for capturing real-time facial expression animation of character based on three-dimensional animation
KR100445846B1 (en) A Public Speaking Simulator for treating anthropophobia
CN114979568A (en) Remote operation guidance method based on augmented reality technology
KR20220083552A (en) Method for estimating and correcting 6 DoF of multiple objects of wearable AR device and AR service method using the same
KR102202357B1 (en) System and method for timulus-responsive virtual object augmentation abouut haptic device
JP2013110630A (en) Conversation video display system
KR20150066779A (en) Wearable apparatus for learning and System supporting learning including the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination