WO2021155431A1 - VR-based treatment system and method - Google Patents

VR-based treatment system and method

Info

Publication number
WO2021155431A1
Authority
WO
WIPO (PCT)
Prior art keywords
subject
virtual
condition
virtual representation
environment
Prior art date
Application number
PCT/AU2021/050082
Other languages
French (fr)
Inventor
Wilfred Maurice LAX
Dhani Christomo SUTANTO
Original Assignee
Neurotechnology Pty Ltd
Priority date
Filing date
Publication date
Priority claimed from AU2020900282A external-priority patent/AU2020900282A0/en
Application filed by Neurotechnology Pty Ltd filed Critical Neurotechnology Pty Ltd
Priority to AU2021217421A priority Critical patent/AU2021217421A1/en
Priority to EP21750287.1A priority patent/EP4100819A4/en
Priority to US17/796,928 priority patent/US20230047622A1/en
Publication of WO2021155431A1 publication Critical patent/WO2021155431A1/en


Classifications

    • G16H 50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H 50/50 - ICT for simulation or modelling of medical disorders
    • G16H 20/30 - ICT specially adapted for therapies or health-improving plans, relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G16H 20/70 - ICT specially adapted for therapies or health-improving plans, relating to mental therapies, e.g. psychological therapy or autogenous training
    • G16H 30/40 - ICT for processing medical images, e.g. editing
    • G16H 40/60 - ICT for the operation of medical equipment or devices
    • G06T 13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G06T 19/006 - Manipulating 3D models or images for computer graphics; mixed reality
    • G06T 2210/41 - Indexing scheme for image generation or computer graphics; medical
    • G06T 2215/16 - Indexing scheme for image rendering; using real world measurements to influence rendering
    • G06T 2219/004 - Indexing scheme for manipulating 3D models or images; annotating, labelling
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G02B 27/017 - Head-up displays, head mounted
    • A61B 5/107 - Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B 5/1079 - Measuring physical dimensions using optical or photographic means
    • A61B 5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1127 - Measuring movement using a particular sensing technique, using markers
    • A61B 5/1128 - Measuring movement using a particular sensing technique, using image analysis
    • A61B 5/1171 - Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • A61B 5/16 - Devices for psychotechnics; testing reaction times; devices for evaluating the psychological state
    • A61B 5/163 - Evaluating the psychological state by tracking eye movement, gaze or pupil change
    • A61B 5/165 - Evaluating the state of mind, e.g. depression, anxiety
    • A61B 5/4824 - Touch or pain perception evaluation
    • A61B 5/4848 - Monitoring or testing the effects of treatment, e.g. of medication
    • A61B 5/6803 - Sensors attached to or worn on the body surface; head-worn items, e.g. helmets, masks, headphones or goggles
    • A61B 5/6823 - Sensors specially adapted to be attached to the trunk, e.g. chest, back, abdomen, hip
    • A61B 5/6824 - Sensors specially adapted to be attached to the arm or wrist
    • A61B 5/6825 - Sensors specially adapted to be attached to the hand
    • A61B 5/6826 - Sensors specially adapted to be attached to the finger
    • A61B 5/6829 - Sensors specially adapted to be attached to the foot or ankle
    • A61B 5/6831 - Means for maintaining contact with the body; straps, bands or harnesses
    • A61B 5/6889 - Sensors mounted on external non-worn devices, e.g. rooms
    • A61B 5/744 - Notification to the user via visual displays; displaying an avatar, e.g. an animated cartoon character
    • A61M 21/02 - Devices or methods to cause a change in the state of consciousness, for inducing sleep or relaxation, e.g. by direct nerve stimulation, hypnosis, analgesia
    • A61M 2021/005 - Change in the state of consciousness induced by the sight sense using images, e.g. video
    • A61M 2205/3303 - Controlling, regulating or measuring using a biosensor
    • A61M 2205/3306 - Controlling, regulating or measuring using optical measuring means
    • A61M 2205/3584 - Communication with non-implanted data transmission devices using modem, internet or Bluetooth
    • A61M 2230/63 - Measuring parameters of the user; motion, e.g. physical activity

Definitions

  • the invention relates to a VR-based treatment system and method, and in particular to a VR-based treatment system and method for the treatment of a health condition, including the treatment or management of pain. More generally, the invention relates to an XR-based treatment system and method.
  • Virtual reality (VR) can effectively be used in the field of pain management.
  • the analgesic properties of VR have been mainly attributed to its distractive capacity. It has also been recognised that immersive VR is effective in diminishing sensations of pain.
  • VR-based interventions have been used to decrease acute pain amongst individuals undergoing painful medical procedures, including treatment of burns injuries, dental pain and physical therapy for blunt force trauma and burns injuries.
  • There remains a need for VR/XR-based systems for the effective treatment of such pain, as well as for the treatment of mental and physical health problems in general, in an immersive VR/XR-based environment.
  • a virtual reality-based treatment system for performing treatment on at least one condition of a subject, including: a virtual reality device arranged to be fitted to the subject and for immersing the subject in a virtual reality environment; at least one tracking camera configured to capture physical traits and movement of the body of the subject; a processor communicating with the virtual reality device and the at least one tracking camera; a monitor in communication with the processor, and including a user interface, wherein the processor is programmed to: generate a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject, and to render the virtual representation of the body in the virtual reality environment via the virtual reality device, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject; generate a virtual representation of the at least one condition of the subject in response to one or more inputs; overlay or render the virtual representation of the condition of the subject on the virtual representation of the body of the subject; and receive and process one or more inputs representing one or more attributes of the condition to adjust the virtual representation of the condition of the subject in the virtual reality environment to thereby assist the subject to visualise and resolve the condition.
  • a method of performing a treatment on at least one condition of a subject in an immersive virtual reality environment comprising: capturing physical traits and movement of the body of the subject; generating a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject; rendering the dynamic virtual representation of the body of the subject in the virtual reality environment, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject; generating a dynamic virtual representation of the at least one condition of the subject in response to one or more inputs; overlaying or rendering the virtual representation of the condition of the subject on the virtual representation of the body of the subject; and receiving and processing one or more inputs representing one or more attributes of the condition to adjust the virtual representation of the condition of the subject in the virtual reality environment to thereby assist the subject to visualise and resolve the condition.
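The claimed flow (capture traits and movement, render a synchronised avatar, overlay the condition, adjust it from host inputs) can be pictured as a simple session loop. The following Python sketch is purely illustrative and is not the patented implementation; the `tracker`, `headset` and `host_inputs` interfaces and all class names are assumptions.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ConditionVisual:
    """Illustrative stand-in for the virtual representation of a condition."""
    location: str = "unspecified"
    intensity: float = 0.0          # 0.0 (none) .. 1.0 (severe)
    colour: str = "red"             # e.g. red = hot, blue = cold, purple = sharp

@dataclass
class VirtualBody:
    """Dynamic virtual representation of the subject's body."""
    traits: dict                                  # captured physical traits (height, body type, ...)
    pose: dict = field(default_factory=dict)      # latest tracked joint positions
    overlay: Optional[ConditionVisual] = None     # condition rendered on top of the body

def run_session(tracker, headset, host_inputs):
    """One pass of the described flow: capture, render, overlay, adjust."""
    body = VirtualBody(traits=tracker.capture_traits())
    while host_inputs.session_active():
        body.pose = tracker.capture_pose()        # keep the avatar synchronised with movement
        for event in host_inputs.poll():          # host creates or adjusts the condition visual
            if event.kind == "create_condition":
                body.overlay = ConditionVisual(location=event.location)
            elif event.kind == "adjust_attribute" and body.overlay is not None:
                setattr(body.overlay, event.name, event.value)
        headset.render(body)                      # render the avatar plus the condition overlay
```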
  • the method may include generating virtual representations of multiple layers or components of the virtual body selected from at least two of a skin layer or component, a muscle layer or component, a nerves layer or component, an organs layer or component, a vascular layer or component, a respiratory layer or component and a skeleton layer or component, and enabling switching between virtual representations of the layers or components.
  • the visual representations of the attributes of the condition may include at least two of location, start point, end point, depth, intensity, size, speed, direction, frequency, temperature as indicated by colour and type as indicated by symbols.
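The attribute list above maps naturally onto a small record type. A hypothetical sketch follows; the field names, units and value ranges are assumptions rather than the patent's data model.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ConditionAttributes:
    """Visual attributes of a condition, following the list above (illustrative only)."""
    location: Tuple[float, float, float]       # body-space coordinates of the pain point
    start_point: Tuple[float, float, float]    # where travelling pain begins
    end_point: Tuple[float, float, float]      # where it ends, e.g. the brain
    depth: float                               # anatomical layer, 0 = skin .. 1 = skeleton
    intensity: float                           # magnitude, 0..1
    size: float                                # radius of the rendered pain zone, metres
    speed: float                               # particle travel speed, metres per second
    direction: Tuple[float, float, float]      # unit vector of travel
    frequency: float                           # pulses per second, e.g. for throbbing pain
    temperature_colour: str = "red"            # temperature indicated by colour (red = hot, blue = cold)
    symbol: str = "flame"                      # type indicated by symbol: flame, needle, hammer, pincer, ...
```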
  • the captured physical traits may include at least three of body shape, face shape, skin colour, hair colour/style, eye colour, height, weight, and gender.
  • the step of generating virtual representations of the body of the subject may include generating selectable or interchangeable direct self and mirror self-representations of the subject, the mirror representations of the subject being generated by generating an inverse image of the subject as opposed to using a virtual mirror plane.
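One way to read "generating an inverse image" rather than using a virtual mirror plane is to mirror the tracked pose directly before rendering the mirror self. A minimal sketch of that idea, assuming the pose is a dictionary of named joints and that reflection is about a vertical plane on the x axis:

```python
def mirror_pose(pose: dict, mirror_x: float = 0.0) -> dict:
    """Build a mirror-self pose by inverting the tracked pose about a vertical plane.

    `pose` maps joint names (e.g. 'left_wrist') to (x, y, z) positions. The x axis is
    assumed to run across the mirror; left/right joints are swapped so the reflection
    behaves like a real mirror image.
    """
    def swap_side(name: str) -> str:
        if name.startswith("left_"):
            return "right_" + name[len("left_"):]
        if name.startswith("right_"):
            return "left_" + name[len("right_"):]
        return name

    mirrored = {}
    for joint, (x, y, z) in pose.items():
        reflected_x = 2 * mirror_x - x          # reflect about the plane x = mirror_x
        mirrored[swap_side(joint)] = (reflected_x, y, z)
    return mirrored

# Example: the user's left wrist appears as the mirror image's right wrist.
print(mirror_pose({"left_wrist": (0.4, 1.1, 0.3)}))
```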
  • the method may include generating a virtual representation of the body of a host or treatment provider, typically based on the captured physical traits and movement of the body of the host, and rendering the virtual representation of the body of the host in the virtual reality environment;
  • the condition may include pain, chronic pain, a physical or mental ailment or disability, including various levels of paralysis or palsy, and may further include a physical or mental state which requires enhancing or therapy, such as muscle condition, mental acuity, or stress.
  • the disability may relate to amputees, and the treatment may include mental and physical training of amputees, including emulating their lost limb to train their nerves and muscles before using artificial limbs.
  • the disclosure extends to a system wherein the processor is programmed to implement any of the above methods.
  • the disclosure extends further to a non-transient storage medium readable by a processor, the storage medium storing a sequence of instructions or software executable by the processor to cause the processor to perform any of the above methods.
  • the disclosure extends to a non-transient storage medium in which the sequence of instructions or software includes: a virtual subject creator module to capture physical traits of the body of a subject and render a virtual subject including those traits; a virtual subject controller module to capture movement of the body of the subject and render a moving virtual subject using the virtual subject from the virtual human creator module; a virtual condition module to generate a virtual representation of at least one condition of the subject in response to one or more inputs, and layer it on the moving virtual subject; and a virtual environment module for providing a selectable virtual environment for the subject.
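The module split described in the preceding paragraph can be expressed as a thin set of classes. This is a hedged sketch only; the method names and placeholder data are assumptions, not the disclosed module interfaces.

```python
class VirtualSubjectCreator:
    """Captures physical traits and builds the static virtual subject."""
    def create(self, camera_frames):
        traits = {"height_m": 1.75, "body_type": "medium"}   # placeholder trait extraction
        return {"traits": traits, "mesh": "subject_mesh"}

class VirtualSubjectController:
    """Drives the virtual subject with the tracked movement of the real subject."""
    def update(self, virtual_subject, tracked_pose):
        virtual_subject["pose"] = tracked_pose
        return virtual_subject

class VirtualConditionModule:
    """Generates the condition visualisation and layers it on the moving subject."""
    def apply(self, virtual_subject, condition_inputs):
        virtual_subject["condition"] = condition_inputs
        return virtual_subject

class VirtualEnvironmentModule:
    """Provides the selectable immersive environment (forest, underwater, ...)."""
    def select(self, key):
        return {1: "on-boarding", 2: "green field", 3: "forest",
                4: "snowy mountain", 5: "underwater"}.get(key, "on-boarding")
```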
  • the software may include a virtual camera module for generating a selection of views or perspectives of the subject being treated.
  • the disclosure further extends to a virtual reality-based treatment system for performing treatment on at least one condition of a subject, including: a virtual reality device arranged to be fitted to the subject and for immersing the subject in a virtual reality environment; at least one tracking camera configured to capture physical traits and movement of the body of the subject; a processor communicating with the virtual reality device and the at least one tracking camera; a monitor in communication with the processor, and including a user interface, wherein the processor is programmed with a plurality of software modules to generate a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject, and to render the virtual representation of the body in the virtual reality environment via the virtual reality device, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject, the software modules including: a virtual subject creator module to capture physical traits of the body of a subject and render a virtual subject including those traits; a virtual subject controller module to capture movement of the body of the subject and render a moving virtual subject using the virtual subject from the virtual subject creator module; a virtual condition module to generate a virtual representation of at least one condition of the subject in response to one or more inputs, and layer it on the moving virtual subject; and a virtual environment module for providing a selectable virtual environment for the subject.
  • the disclosure further extends to an extended reality (XR) based treatment system for performing treatment on at least one condition of a subject, including: an XR device arranged to be fitted to the subject and for engaging the subject in an XR environment; at least one motion tracking device configured to capture physical traits and movement of the body of the subject; a processor communicating with the XR device and the at least one motion tracking device; a monitor in communication with the processor, and including a user interface, wherein the processor is programmed to: generate a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject, and to render the virtual representation of the body in the XR environment via the XR device, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject; generate a virtual representation of the at least one condition of the subject in response to one or more inputs; overlay or render the virtual representation of the condition of the subject on the virtual representation of the body of the subject; and receive and process one or more inputs representing one or more attributes of the condition to adjust the virtual representation of the condition of the subject in the XR environment to thereby assist the subject to visualise and resolve the condition.
  • the extended reality (XR) based treatment system includes the processor being programmed to implement any of the above methods.
  • the disclosure further extends to an extended reality (XR) based treatment system for performing treatment on at least one condition of a subject, including: an XR device arranged to be fitted to the subject and for engaging the subject in an XR environment; at least one motion tracking device configured to capture physical traits and movement of the body of the subject; a processor communicating with the XR device and the at least one motion tracking device; a monitor in communication with the processor, and including a user interface, wherein the processor is programmed with a plurality of software modules to generate a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject, and to render the virtual representation of the body in the XR environment via the XR device, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject, the software modules including: a virtual subject creator module to capture physical traits of the body of a subject and render a virtual subject including those traits; a virtual subject controller module to capture movement of the body of the subject and render a moving virtual subject using the virtual subject from the virtual subject creator module; a virtual condition module to generate a virtual representation of at least one condition of the subject in response to one or more inputs, and layer it on the moving virtual subject; and a virtual environment module for providing a selectable virtual environment for the subject.
  • the disclosure further extends to a method of performing a treatment on at least one condition of a subject in an extended reality (XR) environment comprising: capturing physical traits and movement of the body of the subject; generating a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject; rendering the dynamic virtual representation of the body of the subject in the XR environment, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject; generating a virtual representation of the at least one condition of the subject in response to one or more inputs; overlaying or rendering the virtual representation of the condition of the subject on the virtual representation of the body of the subject; and receiving and processing one or more inputs representing one or more attributes of the condition to adjust the virtual representation of the condition of the subject in the XR environment to thereby assist the subject to visualise and resolve the condition.
  • the extended reality (XR) based treatment system or method may be selected from the group comprising at least one of virtual reality (VR), augmented reality (AR) and mixed reality (MR).
  • the XR-based treatment system may further include: a database for collecting historical data; and a machine learning processor; wherein the historical data is used to train the machine learning processor so that the machine learning processor generates one or more executable treatment actions based on the one or more inputs representing one or more attributes of the at least one condition of the subject; and wherein the generated one or more executable treatment actions are provided to the processor for visualisation and resolving the condition.
  • the historical data may include one or more of XR hardware data, XR software data, user data and host data.
  • the generated one or more executable treatment actions may be fed back to the database.
  • the trained machine learning processor further generates analytical data to evaluate one or more treatment results, and the generated analytical data is fed back to the database.
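The machine-learning loop described in the last few paragraphs (historical data used to train a model, the model generating executable treatment actions, and the actions and results fed back to the database) could be outlined as below. The choice of scikit-learn, a random forest classifier and a numeric encoding of condition attributes and actions are all assumptions for illustration.

```python
# Minimal sketch of the described feedback loop, assuming scikit-learn is available
# and that treatment actions can be encoded as discrete class labels.
from sklearn.ensemble import RandomForestClassifier

def train_treatment_model(historical_records):
    """historical_records: list of (attribute_vector, chosen_action) pairs from the database.

    Attribute vectors are assumed to be numeric encodings of the condition attributes
    (location, intensity, speed, frequency, ...).
    """
    features = [attrs for attrs, _ in historical_records]
    actions = [action for _, action in historical_records]
    model = RandomForestClassifier(n_estimators=100)
    return model.fit(features, actions)

def suggest_action(model, condition_attributes, database):
    """Generate an executable treatment action and feed it back to the database."""
    action = model.predict([condition_attributes])[0]
    database.append((condition_attributes, action))   # feedback loop described above
    return action
```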
  • Figure 1a shows a schematic block diagram of one embodiment of a VR-based treatment system
  • Figure 1b is a schematic block diagram of a computer processing system forming part of the VR-based system of Figure 1a and configurable to perform various features of a VR-based treatment method of the present disclosure;
  • Figure 1c is a schematic block diagram of a computer network including the computer processing system of Figure 1b;
  • Figure 2 shows a workflow diagram incorporating an embodiment of a VR-based treatment method
  • Figure 3 shows one embodiment of a host user interface
  • Figure 4a shows a pop-up menu forming part of the interface of Figure 3 for changing camera and headset settings
  • Figure 4b shows a pop-up menu forming part of the interface of Figure 3 for allowing adjustment of the user’s view
  • Figure 4c shows a controller and part of the interface of Figure 3 for selecting pain type
  • Figure 4d shows a controller and part of the interface of Figure 3 for selecting pain attributes including magnitude and speed
  • Figures 4e, 4f, 4g, 4h and 4k show representations of respective skin, muscle, nerve, organ and skeleton layers selectable by the host user interface;
  • Figure 4m shows a pain point selector part of the interface of Figure 3 for selecting pain points
  • Figure 4ma shows an experience mode from a user perspective in which a self view of a virtual image of a user’s arm is shown as well as a reflected view of the user;
  • Figure 4n shows a virtual representation of a user showing the nervous system layer and a wrist-focused pain point with associated pain particles
  • Figure 5 shows a schematic block diagram of an embodiment of the hardware and software components of the VR-based system
  • Figure 6 shows a schematic block diagram of an embodiment of an XR-based system implemented with a machine learning software module.
  • a VR-based system 10 includes at its heart a computer processing system 12 in communication with at least one tracking camera 14.
  • the tracking camera may for example include a Microsoft Kinect 2.0 camera, a Microsoft Azure Kinect camera, or other commercially available tracking cameras to track the user’s body and movement.
  • the computer processing system 12 also communicates with a host monitor 16 with input devices/means in the form of the keyboard 16a and a mouse 16b.
  • Other input means may include a touchscreen-enabled monitor, a touchpad, and any form of remote-controlled device, including a gaming console.
  • the system further includes a VR arrangement 18 including a VR headset 20 worn by a user 22, an associated VR controller 24 which also acts as an input device, and VR trackers 26a and 26b.
  • the VR headset may be selected from a number of commercially available headsets, including for example an HTC® Vive Pro headset or a Microsoft® Mixed Reality headset with corresponding trackers, in the present example HTC Vive Pro® trackers, and a corresponding HTC Vive Pro or Microsoft Mixed Reality controller 24.
  • the trackers may include stationary trackers, such as those indicated 26a and 26b, which are configured to track the movement of the headset 20, as well as individual body trackers used to track the movement of parts of the body, such as wrist, finger, waist, or ankle trackers 28a, 28b, 28c and 28d respectively, which include corresponding straps or belts.
  • Figure 1b shows a block diagram of the computer processing system 12 configurable to implement embodiments and/or features described herein. It will be appreciated that Figure 1b does not illustrate all functional or physical components of a computer processing system. For example, no power supply or power supply interface has been depicted; however, system 12 will either carry a power supply or be configured for connection to a power supply (or both). It will also be appreciated that the particular type of computer processing system will determine the appropriate hardware and architecture, and alternative computer processing systems suitable for implementing features of the present disclosure may have additional, alternative, or fewer components than those depicted.
  • Computer processing system 12 includes at least one processing unit 12.1 which may in turn include a CPU 12.1a and a GPU 12.1b.
  • the CPU 12.1a may include at least an Intel Core i7-8700 processor, preferably an i7-9700 processor or the like, with the GPU 12.1b including at least a GTX 1080 Ti, preferably an RTX 2080 Ti or a Titan RTX.
  • the abovementioned hardware, including the VR hardware may be superseded or updated on a regular basis with hardware and technologies having improved specifications, and it is within the scope of this disclosure to include such improved and updated hardware.
  • the processing unit 12.1 may be a single computer processing device (e.g. a combined central processing unit and graphics processing unit, or other computational device), or may include a plurality of computer processing devices, such as a separate CPU and GPU as described above. In some instances all processing will be performed by processing unit 12.1; however, in other instances processing may also be performed by remote processing devices accessible and useable (either in a shared or dedicated manner) by the system 12.
  • system 12 includes a system memory 32 (e.g. a BIOS), volatile memory 34 (e.g. random access memory such as one or more RAM or DRAM modules, with a minimum of 32 GB of RAM), and non-volatile memory 36 (e.g. one or more hard disk or solid state drives).
  • System 12 also includes one or more interfaces, indicated generally by 38, via which system 12 interfaces with various devices and/or networks.
  • other devices may be integral with system 12, or may be separate.
  • connection between the device and system 12 may be via wired or wireless hardware and communication protocols, and may be a direct or an indirect (e.g. networked) connection.
  • Wired connection with other devices/networks may be by any appropriate standard or proprietary hardware and connectivity protocols.
  • system 12 may be configured for wired connection with other devices/communications networks by one or more of: USB; FireWire; eSATA; Thunderbolt; Ethernet; OS/2; Parallel; Serial; HDMI; DVI; VGA; SCSI; AudioPort. Other wired connections are possible.
  • Wireless connection with other devices/networks may similarly be by any appropriate standard or proprietary hardware and communications protocols.
  • system 12 may be configured for wireless connection with other devices/communications networks using one or more of: infrared; Bluetooth; WiFi; near field communications (NFC); Global System for Mobile Communications (GSM); Enhanced Data GSM Environment (EDGE); long term evolution (LTE); wideband code division multiple access (W-CDMA); code division multiple access (CDMA).
  • devices to which system 12 connects - whether by wired or wireless means - include one or more input devices to allow data to be input into/received by system 12 for processing by the processing unit 12.1, and one or more output devices to allow data to be output by system 12.
  • Example devices are described below, however it will be appreciated that not all computer processing systems will include all mentioned devices, and that additional and alternative devices to those mentioned may well be used.
  • system 12 may include or connect to one or more input devices by which information/data is input into (received by) system 12.
  • input devices may include keyboards, mice, trackpads, microphones, accelerometers, proximity sensors, GPS devices and the like.
  • System 12 may also include or connect to one or more output devices controlled by system 12 to output information.
  • output devices may include devices such as CRT displays, LCD displays, LED displays, plasma displays, touch screen displays, speakers, vibration modules, LEDs/other lights, and the like.
  • System 12 may also include or connect to devices which may act as both input and output devices, for example memory devices (hard drives, solid state drives, disk drives, compact flash cards, SD cards and the like) which system 12 can read data from and/or write data to, and touch screen displays which can both display (output) data and receive touch signals (input).
  • Figure 1 shows just one exemplary implementation.
  • System 12 may also connect to one or more communications networks (e.g. the Internet, a local area network, a wide area network, a personal hotspot etc.) to communicate data to and receive data from networked devices, which may themselves be other computer processing systems.
  • System 12 may be any suitable computer processing system such as, by way of non-limiting example, a server computer system, a desktop computer, a laptop computer, a netbook computer, a tablet computing device, a mobile/smart phone, a personal digital assistant, a personal media player, a set-top box, and a games console.
  • system 12 will include at least user input and output devices 40, which may be of the type described with reference to Figure 1, and a communications interface 42 for communication with a network.
  • System 12 stores or has access to computer applications (also referred to as software or programs) - i.e. computer readable instructions and data which, when executed by the processing unit 12.1, configure system 12 to receive, process, and output data.
  • Instructions and data can be stored on non-transient machine readable medium accessible to system 12.
  • instructions and data may be stored on non transient memory 36.
  • Instructions and data may be transmitted to/received by system 12 via a data signal in a transmission channel enabled (for example) by a wired or wireless network connection.
  • Applications accessible to system 12 will typically include an operating system application such as Microsoft Windows®, Apple OS X, Apple iOS, Android, Unix, or Linux.
  • System 12 also stores or has access to applications which, when executed by the processing unit 12.1 , configure system 12 to perform various computer-implemented processing operations described herein.
  • client system 46 includes a client application 48 which configures the client system 46 to perform the described client system operations
  • server system 50 includes a server application 52 which configures the server system 50 to perform the described server system operations.
  • the server application 52 communicates with a database server 54 which enables the storage and retrieval of data stored in a database 56, which may be a distributed or cloud-based database.
  • part or all of a given computer-implemented method will be performed by system 12 itself, while in other cases processing may be performed by other devices in data communication with system 12.
  • the client application 48 is designed, in combination with the hardware described in Figure 1a, to immerse the user in a virtual environment where they see a virtual representation of themselves.
  • This representation is designed to be as accurate as feasible with regard to height and body type with the aid of the body tracking camera 14. It has been established that sufficient user identity with the virtual representation can be achieved without providing strict anatomical accuracy or identical facial features.
  • One aspect which contributes significantly to this is the accurate tracking of actual body movements of the user by their virtual representation with minimal latency (i.e. a delay of typically less than 90ms, more typically 80-88ms). This provides the user with a subjective impression of simultaneity or synchronicity which enhances the immersive experience and the identity of the user with their real and mirror selves.
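The latency figure quoted above (typically under 90 ms between a tracked movement and the corresponding avatar update) is straightforward to monitor. A minimal sketch, assuming pose samples are timestamped with a monotonic clock:

```python
import time

LATENCY_BUDGET_MS = 90.0   # subjective simultaneity threshold quoted above

def check_motion_latency(capture_timestamp_s: float) -> bool:
    """Return True if the avatar update for a captured pose is within the latency budget.

    `capture_timestamp_s` is the time (from time.monotonic()) at which the tracking
    camera sampled the pose; call this just after the corresponding frame is rendered.
    """
    latency_ms = (time.monotonic() - capture_timestamp_s) * 1000.0
    return latency_ms < LATENCY_BUDGET_MS
```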
  • the virtual representation is applied to the main elements of the real self and the mirror self.
  • the application 48 helps the user to visualise their condition and also assists the host, who is typically a trained psychologist, therapist or clinician, to help educate the user and to start the condition management therapy session.
  • the application is also designed to display various visual representations of the user, including a high level or impressionist representation of the gender of the user, which is confined to male and female, as well as various layers of the user’s body, including a skin layer, a muscle layer, a nerves layer, an internal organs layer, and a skeletal layer. Additional layers may include a vascular or cardio-vascular layer, and a respiratory layer.
  • the application is further designed to provide a symbolic visual representation of the condition, such as pain, which is preferably a dynamic representation, and is overlaid on the virtual visual representation of the user. This is typically achieved by the host using the virtual reality controller 24.
  • the application may further provide a virtual visualisation of the host and the controller which is viewable through the virtual reality headset 20 as well as the monitor 16.
  • the user and host are immersed in a virtual reality environment which may initially include an on-boarding environment followed by other immersive environments described hereinafter.
  • a flow diagram is shown incorporating exemplary steps used in an embodiment of a chronic pain treatment method of the disclosure.
  • the host carries out a detailed assessment of the current pain locations and experiences of the user or client/patient, and documents these.
  • the client or user 22 then dons the VR headset 20.
  • the tracking camera 14 and associated hardware measure the user’s physical traits and track movement of the user.
  • a check is conducted at 64 to see if the tracking and positioning of the headset is correct. If not, the host adjusts the VR settings via the host monitor 16, as is shown at 66.
  • Figure 3 shows one embodiment of a host user interface or GUI 100 which is generated by the application on the host monitor 16.
  • the host interacts with the host user interface via input devices including controller 24, keypad 16a and mouse 16b.
  • the host user interface 100 includes a central display 102 which provides the host’s perspective of the virtual reality environment in which the user 22 is immersed, including a virtual representation 22.1 of the user as well as an optional virtual representation 104 of the host, which may also be viewed by the user through the VR headset 20.
  • the virtual body representation of the host may be similar to that of the user, but may also include natural elements such as fire or water, or in the case of treating children the host may adopt the appearance of an avatar in the form of a friendly robot or a familiar fantasy character.
  • the user 22 and host 104 are immersed in a forest environment 106.
  • the central display 102 is surrounded by a series of selection inputs which are controlled by the host in consultation with the user to customise the treatment of the user and to optimise the experience of the user during a treatment session.
  • Software settings inputs 108 provide respective setup, restart and quit options operable via one of the input devices. Activation of the setup or restart settings opens the pop-up menu 110 shown in Figure 4a via sensor tab 112, which allows the host to commence step 66 of Figure 2 by adjusting the height of the motion tracking camera 14 from the ground (to a height greater than one meter) and the distance of the user from the motion sensor camera 14 (a minimum of 1.2 meters), as well as entering a motion smoothing factor which determines how frequently the camera 14 updates the user’s position.
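The setup constraints mentioned here (camera mounted more than one meter above the ground, user at least 1.2 meters away, plus a motion smoothing factor) lend themselves to a small validation helper. The sketch below is illustrative; in particular, interpreting the smoothing factor as an exponential blend is an assumption.

```python
def validate_sensor_setup(camera_height_m: float, user_distance_m: float) -> list:
    """Return a list of setup problems, empty if the quoted constraints are met."""
    problems = []
    if camera_height_m <= 1.0:
        problems.append("raise the tracking camera to more than 1 m above the ground")
    if user_distance_m < 1.2:
        problems.append("position the user at least 1.2 m from the camera")
    return problems

def smooth_position(previous, sampled, smoothing_factor: float):
    """Blend a new camera sample with the previous estimate (0 = frozen, 1 = raw samples)."""
    return tuple(p + smoothing_factor * (s - p) for p, s in zip(previous, sampled))
```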
  • the save button 113 is used to save the settings.
  • An additional headset pop-up menu 114 is then activated via headset tab 116, as is shown in Figure 4b.
  • This setting allows the host to adjust the virtual position of the headset so that the user’s VR view is correct relative to their body. This is achieved by using the indicated up, down, left, right, forward and backward buttons to adjust the position of the user’s virtual camera so that the self and mirror virtual images generated of the user correspond as closely as possible with the user’s actual position, with the final adjusted position being saved.
  • the host then commences treatment at 68 using the VR software, starting treatment in the on-boarding environment at step 70.
  • the on-boarding environment is selectable via Key 1 of an environment selector 118, which includes additional Keys 2, 3, 4 and 5 for respectively enabling the selection of green field, forest, snowy mountain and underwater environments. It will be appreciated that many other possible environments may be generated.
  • the keys may be activated via any of the aforementioned input devices.
  • the weather conditions associated with the environments may also be relevant to treatment. For example a cold (snowy mountain) environment may be effective in the treatment of burns or burning pain.
  • the host then at 76 asks the user to describe their condition/problem, which in this example is pain-related. This may supplement or replace the initial assessment at step 59.
  • the user/client then describes the nature of their pain/problem and its location.
  • the exchange between the user and the host may conveniently be verbal, but may also be in writing, and may in addition operate in an environment where the user and the host are not in the same location, and the written or verbal communication is over a suitable communications network.
  • the host then creates a visualisation of the pain or problem at the described location. This may be achieved at step 80.1 using the VR controller 24, which the host points at the relevant location on the user’s body, or by using a direct selection tool on the host interface 100, including the monitor 16 and input devices 16a and 16b.
  • a pain type selector including menu 120 is displayed on the monitor, including indicated hot, cold and sharp types of pain. It will be appreciated that other pain types may also be indicated for selection, such as dull or throbbing.
  • the controller 24 has a menu button 24.1 which is repurposed as a pain type button used to select one of the above pain types.
  • the pain types may in turn be represented by colours, with hot, cold and sharp pain types being represented by red, blue and purple colours respectively. These colours may be indicated by red, blue and purple orbs 24.2, 24.3 and 24.4 extending from the tip of the controller 24 in the VR environment. Pain types may also be represented by objects or phenomena associated with creating that type of pain. For example, flames or red-hot pokers may be used to indicate burning pain, knives, needles or lightning bolts to indicate sharp and intense pain, hammers and clubs to indicate dull throbbing pain, and pincers to indicate localised surface pain.
  • the host user interface 100 also includes a pain attribute selector including a pain attribute menu or circular icon 122 with magnitude of pain from small to big as indicated by the user on the vertical axis and pain velocity or speed from slow to fast on the horizontal axis. Pain velocity may be used to indicate pain frequency in the case of a throbbing pain for instance or pain velocity in the case of a shooting pain.
  • the circular touchpad 24.6 on the controller is repurposed as a pain attribute selector, with the host altering the pain magnitude by scrolling and pressing on the touchpad 24.6, which operates in the same way as the circular icon 122.
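The colour coding and the two-axis attribute selector described in the preceding paragraphs reduce to a couple of small mappings. A hypothetical sketch (the RGB values and the axis scaling are assumptions):

```python
# Pain type -> display colour, as described above (hot = red, cold = blue, sharp = purple).
PAIN_TYPE_COLOURS = {"hot": (1.0, 0.0, 0.0), "cold": (0.0, 0.0, 1.0), "sharp": (0.5, 0.0, 0.5)}

def touchpad_to_attributes(x: float, y: float) -> dict:
    """Map a touchpad press (x, y in the range -1..1) to pain attributes.

    Following the circular icon 122: the vertical axis selects magnitude (small to big)
    and the horizontal axis selects velocity/frequency (slow to fast).
    """
    magnitude = (y + 1.0) / 2.0   # 0 = small, 1 = big
    speed = (x + 1.0) / 2.0       # 0 = slow, 1 = fast
    return {"magnitude": magnitude, "speed": speed}
```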
  • In Figure 4d it can be seen how, in the VR environment, different sized orbs 24.7, 24.8 and 24.9 are used to indicate the magnitude of pain.
  • the host interface 100 further includes a model attribute selector including a model attribute menu 124 for enabling layers of the user’s VR body to be selected at step 80.2.
  • These include a skin layer 126 of Figure 4e which is the default or starting state of the user’s VR body.
  • the skin layer is gender specific, without the associated anatomical detail, and the overall body type and height represents the body type and height of the user, based on the images of the user captured by the tracking camera 14 and processed by the computer system 12.
  • the ability of the user to identify with their virtual selves is enhanced by accurate representations of body height and type.
  • Figure 4f shows a second muscle layer 128 with representations of the muscles on the client’s body, which may be adjusted based on body type so that they conform with the skin layer.
  • Figure 4g shows a third nerves layer 130 which is used to show how pain travels through the body in response to the user’s description of that pain in the manner previously described.
  • the nerves layer 130 is scaled to conform with the size and shape of the user’s body.
  • Figure 4h shows a fourth organs layer 132. If the pain originates from a particular organ, this organ is highlighted. For example, in Figure 4h, the digestive system 132.1 is highlighted, and the pain is represented as travelling from the digestive system to the brain. It will be appreciated that various other organs can be displayed in the same manner and highlighted when relevant to the pain experienced by the user.
  • Figure 4k shows a fifth skeleton layer 134, which is the deepest layer. Bone or joint pain can be illustrated and localised using this layer.
  • the various layers enhance the user experience by allowing the user to locate their pain more precisely in 3D as well as providing a realistic virtual representation of the affected body part and its relationship with the pain being experienced by the user.
  • the host can turn the user’s mirror body on and off. This makes it easier for the user to see themselves: by looking down when wearing the VR headset 20, the user sees a virtual representation of their arms and the front portion of their body co-located with their real body, whether moving or still.
  • This is achieved by operating an experience mode toggle 136, shown in Figure 3, which allows the host to toggle between the default self view and the experience mode or mirror view, in which the user is able to see both views.
  • the experience mode is shown in which a self view of a virtual image of a user’s arm 136.1 is shown as well as a reflected view of the entire body of the user 136.2 in a virtual mirror 136.3 as experienced by the user when wearing the VR headset.
  • the combination of the self and mirror views serves to reinforce the user’s immersion in that the user is able to view themselves both directly and when reflected. Because the self and reflected views are dynamically synchronised with the actual movement of the user, this gives the user an even more immersive experience when moving, by enhancing the user’s perception that the self and mirror views are embodiments of the user.
  • the host may also include a virtual image of themselves or a fantasy representation thereof. This is achieved by operating a host attribute toggle 137.
  • the host can use video to capture both real and virtual images of the user and host where applicable to review treatment protocols after the treatment session. This may be securely stored in the database 56.
  • a pain particle or particles are created at the originating location of the pain and shown travelling to the brain. This is achieved using a direct point selector shown at 138 in Figure 3 and Figure 4m.
  • the host may use shortcut keys, such as F1, F2, F3, F4 and F5 on the keypad to access pain points on the body outline 140 of the direct point selector, corresponding to points 142 on the virtual body of the user 144.
  • the controller 24 may be used to more accurately pinpoint the exact location of pain points on the user’s body, which may include the initial step of selecting an appropriate body layer.
  • the pain type and attribute are also selected in the manner previously described, and this influences the size, colour and frequency of the pain point or zone as well as of the pain particles travelling to and/or from the pain point to the user’s brain.
  • the pain particles are configured and the experience of the user is managed by the host using treatment principles including cognitive behaviour therapy, learning therapy, neuroplasticity, and pain therapy in the VR environment that has been established.
  • at step 84, the user is asked if there are any other pain locations. If the answer is positive, the process reverts to step 78, at which the user describes the location and nature of the pain, which is then converted by the host into a form which can be readily visualised.
  • at step 86, pain particles continue to be created at the originating location(s) and are shown travelling to the brain.
  • at step 88, the host continues to explain to the user what they’re looking at and, where necessary, adjustments may be made to the visualisations, in some instances depending on user feedback.
  • Figure 4n shows a static presentation of a virtual representation of a user 144.1 showing the nervous system 146 and a wrist-focused pain point 148 which is coloured red to represent burning pain, with pain particles 150 travelling to and from the brain 152.
  • the pain particles are dynamically represented, with variations in speed and/or frequency used to indicate the nature of the pain.
  • the representation of the pain point or zone as well as the representation of the pain particles is dynamically varied to indicate, for example, an easing of the pain. This may be achieved by decreasing the magnitude of the zone and/or the pain particles, by changing the colour of the zone and pain particles from red to blue for example, and/or by slowing down the speed or frequency of the pain particles.
  • the pain point or zone and the pain particles may be caused to fade away, again to create an illusion of reduced pain.
  • treatment is completed (a session would typically take 15 to 20 minutes) and the user is off-boarded by removing the VR headset. The host then continues with the consultation session.
  • by selecting between CAMERAS 1, 2 and 3, the host is able to change their point of view of the user within the virtual world.
  • this is achieved via a camera selector interface 154 which, in the present example, uses Key 8 of the keypad to select the main mirror CAMERA 1, providing a reflected or mirror perspective from the user’s point of view, Key 9 to select the virtual host CAMERA 2, providing a perspective from the host’s point of view, and Key 0 to select the VR controller CAMERA 3, providing a perspective from the VR controller’s point of view.
  • Camera selection may also occur using a side camera change button on the controller 24.
  • the VR software 49 installed on the computer processing system or PC 12 includes various software modules, including a virtual human/user creator module 160, a virtual human/user controller module 162, a virtual pain/condition module 164, a virtual camera module 166 and a virtual environment module 168.
  • the virtual human/user creator module 160 receives inputs from the tracking camera 14 and renders at sub-module 170 the real images of the user captured by the tracking camera to generate a virtual human/user of the type illustrated, with identifiable user characteristics. These may include body shape, face shape, skin colour, hair style, hair colour, eye colour and any other desired personal user characteristics. In addition, user characteristics of height, weight, body type and gender may be entered by the host via the input hardware 16 in consultation with the user at sub-module 172, with sub-modules 170 and 172 together constituting a rendering engine for rendering the static characteristics of the user. These are then stored in a dedicated user file in secure database 174, which may be a local or remote database.
  • the virtual human/user controller module 162 generates the virtual user and its mirror image or duplicate for dynamic display through the VR headset, as well as viewing by the host. This is achieved by receiving at sub-module 176 static user data from the database 174, including body and face shape, as well as other user characteristics which have been stored in the user file in the database.
  • a body motion sub-module or sub-class 178 retrieves body motion variables from the tracking camera 14. More specific body position and motion attributes, including head position, head rotation, body position and finger movement data, are retrieved as variables at sub-module 180 from the VR headset 20 and one or more of its associated trackers 26a and 26b and 28a-d.
  • a dynamic virtual image of the user is generated by combining the above variables to effectively create a virtual user camera at sub-module 182 for display through the VR headset 20.
  • Dynamic feedback from the headset 20 and tracking camera 14 has the effect of dynamically updating the virtual image as seen by the user with minimal latency.
  • the virtual user VR camera position and rotation changes in concert with user induced movement of the VR headset to vary the view of the VR environment.
  • a layering module 184 is operated by the host via inputs 16a, 16b as previously described to enhance the visualisation of the body layer or part requiring treatment, such as skin, nerves, muscles, organs and/or bones.
  • a mirror within a virtual environment is generated as a plane displaying a reflection of the whole environment. This works like a video projection of the whole environment onto a 2D object within the environment. This doubles the graphical processing requirements, as the engine must render two images of the same environment to display the mirror effect on the screen. In the case of a VR headset with two screens, one for each eye, this again doubles the graphical processing requirements, so that the environment must effectively be rendered four times.
  • a mirroring or inverting sub-module or engine 186 generates an inverse image of the virtual body instead of using a virtual mirror plane, with the same effect of generating a mirrored virtual human or user at 188.
  • This provides flexibility to manipulate the duplicate inverse body, which can be controlled separately from the user body. For example, with a virtual mirror plane it is not possible to see one’s back when looking forward. With the duplicate virtual body technique the virtual body can be rotated to allow the user to observe and have explained to them treatments on their back side.
  • the virtual pain module 164 is used to generate virtual pain images of the type previously described with input from the host in consultation with the user.
  • the various pain parameters including pain type, speed and intensity/magnitude, are input and rendered at sub-module 190, with the start and end positions of the pain being entered at 192 and 194 via the VR controllers 24 in the process of finding the right path at 196 and rendering a pain pathway at 198.
  • a pain mirroring or inverting module 200 generates a mirrored/inverted virtual pain image at 202.
  • the virtual pain images, both direct and inverted, are layered onto the virtual human/user body image generated at the virtual human controller module at 204 and made available to the user as a composite image through the VR headset.
  • the virtual environment module 168 includes a selection of virtual environments which are selected by the host in consultation with the user, with one or more selected environments being stored in a user file in the database 174. The selected environment is then overlaid/underlaid at the virtual human controller module for display through the VR headset 20 and monitor 16.
  • the virtual camera module 166 includes a host camera sub-module 206 including the three previously described virtual software cameras 1, 2 and 3 providing the host with views from the host, user and controller perspectives.
  • the sub-module 206 may be controlled by the host via keypad and mouse inputs 16a and 16b as well as via the controller 24 as previously described.
  • the host is able to select camera type, position and rotation variables which will determine the host graphical view on the host GUI 100 on monitor 16.
  • the disclosed treatment system and method may be implemented with other real-and-virtual combined environments and human-machine interactions generated by computer technology and wearables (i.e. XR technologies), such as augmented reality (AR), mixed reality (MR) or any combination of VR, AR and MR.
  • a motion tracking device of the XR-based system may be one or more of a Microsoft Kinect 2.0 camera, a Microsoft Azure Kinect camera, a webcam, a mobile phone with LiDAR system or other commercially available tracking cameras, wearables or other sensors to track the user’s body and movement.
  • the VR headset 20 may be extended to other XR devices, including smartphones, screens and/or projectors, to display the dynamic virtual image of the user and/or to provide dynamic feedback for dynamically updating the virtual image as seen by the user.
  • the virtual environment may be created in combination with the real environment to form an XR-type environment such as an AR environment.
  • a machine learning software module 51 may be implemented with the XR-based system to facilitate automation of treatment.
  • the machine learning software module 51 may also facilitate generation of treatment reports with analytical data for the treatments that have been done for a user.
  • cloud or local database 610 may collect historical data including, for example, XR hardware data 601 generated from the XR hardware 47, XR software data 603 generated from the XR software 49, user data 605 and/or host data 607 including user’s condition history and treatment history (not necessarily the historical data from the current user in treatment).
  • the historical data may be used as ground truth to train a machine learning processor 620 by using, for example, supervised learning techniques (e.g. multilayer perceptron, support vector machine) and/or transfer learning techniques (e.g. contrastive learning approach, or graph matching); an illustrative sketch of such a training and prediction step is given after this list.
  • the trained machine learning processor 620 may then be able to provide one or more executable treatment actions 630 for a user currently in treatment based on the input data from the XR hardware 47, the XR software 49, host input and/or user input (e.g. one or more inputs representing one or more attributes of the at least one condition for treatment) from the user currently in treatment.
  • the generated executable treatment actions 630 may then be provided to the XR software 49 for visualisation and/or selectable use by the host and/or the user in treatment.
  • the generated executable treatment actions may also be fed back to the cloud/local database 610 to enrich the historical data for training the machine learning processor 620.
  • the host may be employed as “human-in-the-loop” to verify and modify the machine generated treatment actions.
  • the verified and/or modified treatment actions may also be fed back to the cloud/local database 610 to enrich the historical data.
  • the trained machine learning processor 620 may also output analytical data 640 to evaluate one or more treatment results.
  • the analytical data 640 may be used to generate treatment reports which can be provided to the user and/or host.
  • the analytical data 640 may also be fed back to the cloud/local database 610 to enrich the historical data.
  • User 2 received total pain reduction and experienced periods of being completely pain free for the first time in seven years.
  • User 3 showed a reduction in pain severity, a reduction in pain locations or extent and a reduction in the impact of pain on their daily living. However, there was still some residual pain, though at reduced levels. It was also noted that the residual pain, at reduced levels, was only located at the site of the injury, without any radiating pain.
  • applications of the treatment method and system are not confined to the treatment of pain, but may potentially be used in treating any condition which can be visualised and depicted.
  • Other applications using neuroplasticity may include rehabilitation therapy in cases of paralysis or palsy, as with stroke sufferers, the treatment of mental disorders, as well as relaxation therapy using an immersive environment.
  • the condition may relate to amputees, and the treatment may include mental and physical training of amputees, including emulating their lost limb to train their nerves and muscles before using artificial limbs.
  • the onboarding process, including tracking of the entire body of the user and the direct and reflected virtual representations of the user's body, contributes to the user believing, feeling and reacting to the virtual representations/embodiments or avatar as being the real self. It is believed that this serves to engage the brain neuroplastically to enhance the treatment process.
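By way of illustration only, and not as part of the original disclosure, the following minimal Python sketch shows one way the training and prediction step referred to in the list above might look, using scikit-learn's MLPClassifier as the supervised learner. The feature layout, treatment action identifiers and data values are entirely hypothetical; in practice the host would act as the human-in-the-loop to verify or modify any proposed action.

```python
# Illustrative sketch only: train a supervised learner on historical treatment
# records and propose an action for a new case. All values are hypothetical.
from sklearn.neural_network import MLPClassifier

# Each row: [pain_magnitude, pain_velocity, pain_type_id, prior_sessions]
historical_features = [
    [0.9, 0.8, 0, 1],
    [0.4, 0.2, 1, 3],
    [0.7, 0.6, 2, 2],
    [0.3, 0.1, 1, 5],
]
# Labels: identifiers of treatment actions previously verified by a host.
historical_actions = [0, 1, 2, 1]

model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(historical_features, historical_actions)

# Propose an action for a user currently in treatment; the host remains
# "in the loop" to verify or modify it, and the outcome can be fed back
# into the historical data store.
proposed_action = model.predict([[0.8, 0.7, 0, 1]])[0]
print("proposed treatment action id:", proposed_action)
```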

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Public Health (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Psychiatry (AREA)
  • Psychology (AREA)
  • Hospice & Palliative Care (AREA)
  • Social Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Child & Adolescent Psychology (AREA)
  • Anesthesiology (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Pain & Pain Management (AREA)
  • Physiology (AREA)
  • Educational Technology (AREA)
  • Hematology (AREA)
  • Acoustics & Sound (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)

Abstract

Disclosed herein is an extended reality (XR)-based treatment system and method for performing treatment on a subject. The XR-based system may include at least one of virtual reality (VR), augmented reality (AR) and mixed reality (MR). In one arrangement, the present disclosure assists the subject to visualise and resolve at least one condition of the subject. The disclosed system and method includes generating a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject captured by at least one motion tracking device, and rendering the virtual representation of the body in the extended reality environment via an extended reality device. The dynamic virtual representation is synchronised with the movement of the body of the subject, generating a virtual representation of at least one condition of the subject in response to one or more inputs, overlaying or rendering the virtual representation of the condition of the subject on the virtual representation of the body of the subject, and receiving and processing one or more inputs representing one or more attributes of the condition to adjust the virtual representation of the condition of the subject in the extended reality environment.

Description

VR-based treatment system and method
This application relates to and claims priority from Australian Provisional Application No. 2020900282, entitled “VR-based treatment system and method”, filed on 3 February 2020, the contents of which are hereby incorporated by reference in their entirety.
Field of the invention
The invention relates to a VR-based treatment system and method, and in particular to a VR-based treatment system and method for the treatment of a health condition, including the treatment or management of pain. More generally, the invention relates to an XR-based treatment system and method.
Background of the invention
Recent research indicates that virtual reality or VR can effectively be used in the field of pain management. The analgesic properties of VR have been mainly attributed to its distractive capacity. It has also been recognised that immersive VR is effective in diminishing sensations of pain. VR-based interventions have been used to decrease acute pain amongst individuals undergoing painful medical procedures, including treatment of burns injuries, dental pain and physical therapy for blunt force trauma and burns injuries.
The effective use of VR or more broadly XR in the treatment of chronic or persistent pain is less well documented, and there is accordingly a need for VR/XR- based systems for the effective treatment of such pain, as well as for the treatment of mental and physical health problems in general in an immersive VR/XR-based environment.
Reference to any prior art in the specification is not an acknowledgment or suggestion that this prior art forms part of the common general knowledge in any jurisdiction or that this prior art could reasonably be expected to be understood, regarded as relevant, and/or combined with other pieces of prior art by a skilled person in the art. Summary of the invention
In one aspect of the disclosure there is provided a virtual reality-based treatment system for performing treatment on at least one condition of a subject, including; a virtual reality device arranged to be fitted to the subject and for immersing the subject in a virtual reality environment; at least one tracking camera configured to capture physical traits and movement of the body of the subject; a processor communicating with the virtual reality device and the at least one tracking camera; a monitor in communication with the processor, and including a user interface, wherein the processor is programmed to: generate a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject, and to render the virtual representation of the body in the virtual reality environment via the virtual reality device, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject; generate a virtual representation of the at least one condition of the subject in response to one or more inputs; overlay or render the virtual representation of the condition of the subject on the virtual representation of the body of the subject; and receive and process one or more inputs representing one or more attributes of the condition to adjust the virtual representation of the condition of the subject in the virtual reality environment to thereby assist the subject to visualise and resolve the condition. In another aspect there is provided a method of performing a treatment on at least one condition of a subject in an immersive virtual reality environment comprising: capturing physical traits and movement of the body of the subject; generating a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject; rendering the dynamic virtual representation of the body of the subject in the virtual reality environment, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject; generating a dynamic virtual representation of the at least one condition of the subject in response to one or more inputs; overlaying or rendering the virtual representation of the condition of the subject on the virtual representation of the body of the subject; and receiving and processing one or more inputs representing one or more attributes of the condition to adjust the virtual representation of the condition of the subject in the virtual reality environment to thereby assist the subject to visualise and resolve the condition.
The method may include generating virtual representations of multiple layers or components of the virtual body selected from at least two of a skin layer or component, a muscle layer or component, a nerves layer or component, an organs layer or component, a vascular layer or component, a respiratory layer or component and a skeleton layer or component, and enabling switching between virtual representations of the layers or components.
The visual representations of the attributes of the condition may include at least two of location, start point, end point, depth, intensity, size, speed, direction, frequency, temperature as indicated by colour and type as indicated by symbols.
The captured physical traits may include at least three of body shape, face shape, skin colour, hair colour/style, eye colour, height, weight, and gender. The step of generating virtual representations of the body of the subject may include generating selectable or interchangeable direct self and mirror self-representations of the subject, the mirror representations of the subject being generated by generating an inverse image of the subject as opposed to using a virtual mirror plane. The method may include generating a virtual representation of the body of a host or treatment provider, typically based on the captured physical traits and movement of the body of the host, and rendering the virtual representation of the body of the host in the virtual reality environment;
The condition may include pain, chronic pain, a physical or mental ailment or disability, including various levels of paralysis or palsy, and may further include a physical or mental state which requires enhancing or therapy, such as muscle condition, mental acuity, or stress. The disability may relate to amputees, and the treatment may include mental and physical training of amputees, including emulating their lost limb to train their nerves and muscles before using artificial limbs. The disclosure extends to a system wherein the processor is programmed to implement any of the above methods.
The disclosure extends further to a non-transient storage medium readable by a processor, the storage medium storing a sequence of instructions or software executable by the processor to cause the processor to perform the any of the above methods.
The disclosure extends to a non-transient storage medium in which the sequence of instructions or software includes: a virtual subject creator module to capture physical traits of the body of a subject and render a virtual subject including those traits; a virtual subject controller module to capture movement of the body of the subject and render a moving virtual subject using the virtual subject from the virtual human creator module; a virtual condition module to generate a virtual representation of at least one condition of the subject in response to one or more inputs, and layer it on the moving virtual subject; and a virtual environment module for providing a selectable virtual environment for the subject.
The software may include a virtual camera module for generating a selection of views or perspectives of the subject being treated.
The disclosure further extends to a virtual reality-based treatment system for performing treatment on at least one condition of a subject, including: a virtual reality device arranged to be fitted to the subject and for immersing the subject in a virtual reality environment; at least one tracking camera configured to capture physical traits and movement of the body of the subject; a processor communicating with the virtual reality device and the at least one tracking camera; a monitor in communication with the processor, and including a user interface, wherein the processor is programmed with a plurality of software modules to generate a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject, and to render the virtual representation of the body in the virtual reality environment via the virtual reality device, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject, the software modules including; a virtual subject creator module to capture physical traits of the body of a subject and render a virtual subject including those traits; a virtual subject controller module to capture movement of the body of the subject and render a moving virtual subject using the virtual subject from the virtual human creator module; a virtual condition module to generate a virtual representation of the at least one condition of the subject in response to one or more inputs, and layer it on the moving virtual subject; and a virtual environment module for providing a selectable virtual environment for the subject.
The disclosure further extends to an extended reality (XR) based treatment system for performing treatment on at least one condition of a subject, including; an XR device arranged to be fitted to the subject and for engaging the subject in an XR environment; at least one motion tracking device configured to capture physical traits and movement of the body of the subject; a processor communicating with the XR device and the at least one motion tracking device; a monitor in communication with the processor, and including a user interface, wherein the processor is programmed to: generate a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject, and to render the virtual representation of the body in the XR environment via the XR device, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject; generate a virtual representation of the at least one condition of the subject in response to one or more inputs; overlay or render the virtual representation of the condition of the subject on the virtual representation of the body of the subject; receive and process one or more inputs representing one or more attributes of the condition to adjust the virtual representation of the condition of the subject in the XR environment to thereby assist the subject to visualise and resolve the condition.
The extended reality (XR) based treatment system includes the processor being programmed to implement any of the above methods.
The disclosure further extends to an extended reality (XR) based treatment system for performing treatment on at least one condition of a subject, including; an XR device arranged to be fitted to the subject and for engaging the subject in an XR environment; at least one motion tracking device configured to capture physical traits and movement of the body of the subject; a processor communicating with the XR device and the at least one motion tracking device; a monitor in communication with the processor, and including a user interface, wherein the processor is programmed with a plurality of software modules to generate a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject, and to render the virtual representation of the body in the XR environment via the XR device, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject, the software modules including; a virtual subject creator module to capture physical traits of the body of a subject and render a virtual subject including those traits; a virtual subject controller module to capture movement of the body of the subject and render a moving virtual subject using the virtual subject from the virtual human creator module; a virtual condition module to generate a virtual representation of the at least one condition of the subject in response to one or more inputs, and layer it on the moving virtual subject; and an XR environment module for providing a selectable XR environment for the subject.
The disclosure further extends to a method of performing a treatment on at least one condition of a subject in an extended reality (XR) environment comprising: capturing physical traits and movement of the body of the subject; generating a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject; rendering the dynamic virtual representation of the body of the subject in the XR environment, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject; generating a virtual representation of the at least one condition of the subject in response to one or more inputs; overlaying or rendering the virtual representation of the condition of the subject on the virtual representation of the body of the subject; and receiving and processing one or more inputs representing one or more attributes of the condition to adjust the virtual representation of the condition of the subject in the XR environment to thereby assist the subject to visualise and resolve the condition.
The extended reality (XR) based treatment system or method may be selected from the group comprising at least one of virtual reality (VR), augmented reality (AR) and mixed reality (MR). The XR-based treatment system may further includes: a database for collecting historical data; and a machine learning processor; wherein the historical data is used to train the machine learning processor so that the machine learning processor generates one or more executable treatment actions based on the one or more inputs representing one or more attributes of the at least one condition of the subject; and wherein the generated one or more executable treatment actions are provided to the processor for visualisation and resolving the condition.
The historical data may include one or more of XR hardware data, XR software data, user data and host data. The generated one or more executable treatment actions may be fed back to the database.
The trained machined learning processor further generates analytical data to evaluate one or more treatment results, and wherein the generated analytical data is fed back to the database.
As used herein, except where the context requires otherwise, the term "comprise" and variations of the term, such as "comprising", "comprises" and "comprised", are not intended to exclude further additives, components, integers or steps.
Further aspects of the present invention and further embodiments of the aspects described in the preceding paragraphs will become apparent from the following description, given by way of example and with reference to the accompanying drawings.
Brief description of the drawings
Figure 1a shows a schematic block diagram of one embodiment of a VR-based treatment system; Figure 1 b is a schematic block diagram of a computer processing system forming part of the VR-based system of Figure 1a and configurable to perform various features of a VR-based treatment method of the present disclosure;
Figure 1c is a schematic block diagram of a computer network including the computer processing system of Figure 1 b;
Figure 2 shows a workflow diagram incorporating an embodiment of a VR-based treatment method;
Figure 3 shows one embodiment of a host user interface;
Figure 4a shows a pop-up menu forming part of the interface of Figure 3 for changing camera and headset settings;
Figure 4b shows a pop-up menu forming part of the interface of Figure 3 for allowing adjustment of the user’s view
Figure 4c shows a controller and part of the interface of Figure 3 for selecting pain type; Figure 4d shows a controller and part of the interface of Figure 3 for selecting pain attributes including magnitude and speed;
Figures 4e, 4f, 4g, 4h and4k show representations of respective skin, muscle, nerve, organ and skeleton layers selectable by the host user interface;
Figure 4m shows a pain point selector part of the interface of Figure 3 for selecting pain points;
Figure 4ma shows an experience mode from a user perspective in which a self view of a virtual image of a user’s arm is shown as well as a reflected view of the user;
Figure 4n shows a virtual representation of a user showing the nervous system layer and a wrist-focused pain point with associated pain particles; Figure 5 shows a schematic block diagram of an embodiment of the hardware and software components of the VR-based system; and Figure 6 shows a schematic block diagram of an embodiment of an XR-based system implemented with a machine learning software module.
While the invention as claimed is amenable to various modifications and alternative forms, specific embodiments are shown by way of example in the drawings and are described in detail. It should be understood, however, that the drawings and detailed description are not intended to limit the invention to the particular form disclosed. The intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present invention as defined by the appended claims. For example, it will be appreciated that the VR technology described in this disclosure is one example of extended reality (XR) technologies, wherein the letter “X” represents a variable for any current or future computer altered reality technologies. In other words, it will be appreciated that the disclosed treatment system and method may be implemented with other real-and-virtual combined environments and human-machine interactions generated by computer technology and wearables (i.e. XR technologies), such as augmented reality (AR), mixed reality (MR) or any combination of VR, AR and MR.
Detailed description of the embodiments
It will be understood that the invention disclosed and defined in this specification extends to all alternative combinations of two or more of the individual features mentioned or evident from the text or drawings. All of these different combinations constitute various alternative aspects of the invention.
In the following description numerous specific details are set forth in order to provide a thorough understanding of the claimed invention. It will be apparent, however, that the claimed invention may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessary obscuring.
Referring first to Figure 1a, one embodiment of a VR-based system 10 includes at its heart a computer processing system 12 in communication with at least one tracking camera 14. The tracking camera may for example include a Microsoft Kinect 2.0 camera, a Microsoft Azure Kinect camera, or other commercially available tracking cameras to track the user’s body and movement. The computer processing system 12 also communicates with a host monitor 16 with input devices/means in the form of the keyboard 16a and a mouse 16b. Other input means may include a touchscreen enabled monitor, a touchpad, and any form of remote controlled device including a gaming console.
The system further includes a VR arrangement 18 including a VR headset 20 worn by a user 22, an associated VR controller 24 which also acts as an input device, and VR trackers 26a and 26b. The VR headset may be selected from a number of commercially available headsets, including for example an HTC® Vive Pro headset or a Microsoft® Mixed Reality headset with corresponding trackers, in the present example HTC Vive Pro® trackers, and a corresponding HTC Vive Pro or Microsoft Mixed Reality controller 24. The trackers may include stationary trackers, such as those indicated 26a and 26b, which are configured to track the movement of the headset 20, as well as individual body trackers used to track the movement of parts of the body, such as wrist, finger, waist, or ankle trackers 28a, 28b, 28c and 28d respectively, which include corresponding straps or belts.
Figure 1 b shows a block diagram of the computer processing system 12 configurable to implement embodiments and/or features described herein. It will be appreciated that Figure 1 b does not illustrate all functional or physical components of a computer processing system. For example, no power supply or power supply interface has been depicted, however system 12 will either carry a power supply or be configured for connection to a power supply (or both). It will also be appreciated that the particular type of computer processing system will determine the appropriate hardware and architecture, and alternative computer processing systems suitable for implementing features of the present disclosure may have additional, alternative, or fewer components than those depicted.
Computer processing system 12 includes at least one processing unit 12.1 which may in turn include a CPU 12.1 a and a GPU 12.1 b. The CPU 12.1 a may include at least an Intel core i7 8700 processor, preferably a 9700 processor or the like, with the GPU 12.1 b including at least a GTX 1080ti processor, preferably a RTX 2080ti or a Titan RTX processor. It will be appreciated that the abovementioned hardware, including the VR hardware, may be superseded or updated on a regular basis with hardware and technologies having improved specifications, and it is within the scope of this disclosure to include such improved and updated hardware.
The processing unit 12.1 may be a single computer processing device (e.g. a combined central processing unit and graphics processing unit, or other computational device), or may include a plurality of computer processing devices, such as a separate CPU and GPU as described above. In some instances all processing will be performed by processing unit 12.1 , however in other instances processing may also be performed by remote processing devices accessible and useable (either in a shared or dedicated manner) by the system 12.
Through a communications bus 30 the processing unit 12.1 is in data communication with a one or more machine readable storage (memory) devices which store instructions and/or data for controlling operation of the processing system 12. In this example system 12 includes a system memory 32 (e.g. a BIOS), volatile memory 34 (e.g. random access memory such as one or more RAM or DRAM modules with a minimum of 32MB RAM), and non-volatile memory 36 (e.g. one or more hard disk or solid state drives)
System 12 also includes one or more interfaces, indicated generally by 38, via which system 12 interfaces with various devices and/or networks. Generally speaking, other devices may be integral with system 12, or may be separate. Where a device is separate from system 12, connection between the device and system 12 may be via wired or wireless hardware and communication protocols, and may be a direct or an indirect (e.g. networked) connection.
Wired connection with other devices/networks may be by any appropriate standard or proprietary hardware and connectivity protocols. For example, system 12 may be configured for wired connection with other devices/communications networks by one or more of: USB; FireWire; eSATA; Thunderbolt; Ethernet; OS/2; Parallel; Serial; HDM I ; DVI; VGA; SCSI; AudioPort. Other wired connections are possible.
Wireless connection with other devices/networks may similarly be by any appropriate standard or proprietary hardware and communications protocols. For example, system 12 may be configured for wireless connection with other devices/communications networks using one or more of: infrared; BlueTooth; WiFi; near field communications (NFC); Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), long term evolution (LTE), wideband code division multiple access (W-CDMA), code division multiple access (CDMA). Other wireless connections are possible.
Generally speaking, and depending on the particular system in question, devices to which system 12 connects - whether by wired or wireless means - include one or more input devices to allow data to be input into/received by system 12 for processing by the processing unit 12.1 , and one or more output devices to allow data to be output by system 12. Example devices are described below, however it will be appreciated that not all computer processing systems will include all mentioned devices, and that additional and alternative devices to those mentioned may well be used.
For example, system 12 may include or connect to one or more input devices by which information/data is input into (received by) system 12. Such input devices may include keyboards, mice, trackpads, microphones, accelerometers, proximity sensors, GPS devices and the like. System 12 may also include or connect to one or more output devices controlled by system 12 to output information. Such output devices may include devices such as a CRT displays, LCD displays, LED displays, plasma displays, touch screen displays, speakers, vibration modules, LEDs/other lights, and such like. System 12 may also include or connect to devices which may act as both input and output devices, for example memory devices (hard drives, solid state drives, disk drives, compact flash cards, SD cards and the like) which system 12 can read data from and/or write data to, and touch screen displays which can both display (output) data and receive touch signals (input). Figure 1 shows just one exemplary implementation.
System 12 may also connect to one or more communications networks (e.g. the Internet, a local area network, a wide area network, a personal hotspot etc.) to communicate data to and receive data from networked devices, which may themselves be other computer processing systems.
System 12 may be any suitable computer processing system such as, by way of non limiting example, a server computer system, a desktop computer, a laptop computer, a netbook computer, a tablet computing device, a mobile/smart phone, a personal digital assistant, a personal media player, a set-top box, and a games console.
Typically, system 12 will include at least user input and output devices 40, which may be of the type described with reference to Figure 1 and a communications interface 42 for communication with a network such as network 42.
System 12 stores or has access to computer applications (also referred to as software or programs) - i.e. computer readable instructions and data which, when executed by the processing unit 12.1 , configure system 12 to receive, process, and output data. Instructions and data can be stored on non-transient machine readable medium accessible to system 12. For example, instructions and data may be stored on non transient memory 36. Instructions and data may be transmitted to/received by system 12 via a data signal in a transmission channel enabled (for example) by a wired or wireless network connection.
Applications accessible to system 12 will typically include an operating system application such as Microsoft Windows®, Apple OSX, Apple IOS, Android, Unix, or Linux.
System 12 also stores or has access to applications which, when executed by the processing unit 12.1 , configure system 12 to perform various computer-implemented processing operations described herein. For example, and referring to the networked environment of Figure 1c above, client system 46 includes a client application 48 which configures the client system 46 to perform the described client system operations, and server system 50 includes a server application 52 which configures the server system 50 to perform the described server system operations. The server application 52 communicates with a database server 54 which enables the storage and retrieval of data stored in a database 56, which may be a distributed or cloud-based database.
In some cases part or all of a given computer-implemented method will be performed by system 12 itself, while in other cases processing may be performed by other devices in data communication with system 12.
The client application 48 is designed, in combination with the hardware described in Figure 1a, to immerse the user in a virtual environment where they see a virtual representation of themselves. This representation is designed to be as accurate as feasible with regard to height and body type with the aid of the body tracking camera 14. It has been established that sufficient user identity with the virtual representation can be achieved without providing strict anatomical accuracy or identical facial features. One aspect which contributes significantly to this is the accurate tracking of actual body movements of the user by their virtual representation with minimal latency (i.e. a delay of typically less than 90ms, more typically 80-88ms). This provides the user with a subjective impression of simultaneity or synchronicity which enhances the immersive experience and the identity of the user with their real and mirror selves. The virtual representation is applied to the main elements of the real self and the mirror self.
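As a purely illustrative aside, the short sketch below shows how a latency budget of the kind mentioned above might be monitored. The 90 ms figure comes from the text; the function name and the assumption that pose capture timestamps share the renderer's monotonic clock are hypothetical.

```python
import time

LATENCY_BUDGET_S = 0.090  # upper bound noted above (typically 80-88 ms)

def pose_is_fresh(capture_timestamp_s: float) -> bool:
    """Return True if a tracked pose sample is recent enough to preserve the
    subjective impression of synchronicity; assumes the capture timestamp was
    taken from the same monotonic clock used here."""
    return (time.monotonic() - capture_timestamp_s) <= LATENCY_BUDGET_S

# Example: a pose captured 50 ms ago is within the budget.
print(pose_is_fresh(time.monotonic() - 0.050))  # True
```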
The application 48 helps the user to visualise their condition and also assists the host, who is typically a trained psychologist, therapist or clinician, to help educate the user and to start the condition management therapy session.
The application is also designed to display various visual representations of the user, including a high level or impressionist representation of the gender of the user, which is confined to male and female, as well as various layers of the user’s body, including a skin layer, muscle layer, nerves layer, internal organs layer and a skeletal layer. Additional layers may include a vascular or cardio-vascular layer, and a respiratory layer.
The application is further designed to provide a symbolic visual representation of the condition, such as pain, which is preferably a dynamic representation, and is overlaid on the virtual visual representation of the user. This is typically achieved by the host using the virtual reality controller 24. The application may further provide a virtual visualisation of the host and the controller which is viewable through the virtual reality headset 20 as well as the monitor 16. The user and host are immersed in a virtual reality environment which may initially include an on-boarding environment followed by other immersive environments described hereinafter.
Referring now to Figure 2, a flow diagram is shown incorporating exemplary steps used in an embodiment of a chronic pain treatment method of the disclosure. At initial step 59, the host carries out a detailed assessment of the current pain locations and experiences of the user or client/patient, and documents these. At step 60, the client or user 22 then dons the VR headset 20. Then at step 62, the tracking camera 14 and associated hardware measure the user’s physical traits and track movement of the user. A check is conducted at 64 to see if the tracking and positioning of the headset is correct. If not, the host adjusts the VR settings via the host monitor 16, as is shown at 66.
Figure 3 shows one embodiment of a host user interface or GUI 100 which is generated by the application on the host monitor 16. The host interacts with the host user interface via input devices including controller 24, keypad 16a and mouse 16b. The host user interface 100 includes a central display 102 which provides the host’s perspective of the virtual reality environment in which the user 22 is immersed, including a virtual representation 22.1 of the user as well as an optional virtual representation 104 of the host, which may also be viewed by the user through the VR headset 20. The virtual body representation of the host may be similar to that of the user, but may also include natural elements such as fire or water, or in the case of treating children the host may adopt the appearance of an avatar in the form of a friendly robot or a familiar fantasy character.
In the present example, the user 22 and host 104 are immersed in a forest environment 106. The central display 102 is surrounded by a series of selection inputs which are controlled by the host in consultation with the user to customise the treatment of the user and to optimise the experience of the user during a treatment session.
Software settings inputs 108 provide respective setup, restart and quit options operable via one of the input devices. Activation of the setup or restart settings opens a pop-up menu 110, shown in Figure 4a, via sensor tab 112, which allows the host to commence step 66 of Figure 2 by setting the height of the motion tracking camera 14 above the ground (greater than one meter) and the distance of the user from the motion sensor camera 14 (a minimum of 1.2 meters), as well as entering a motion smoothing factor which determines how frequently the camera 14 updates the user’s position. The save button 113 is used to save the settings.
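The following minimal sketch (not part of the original disclosure) illustrates the kind of checks implied by these settings; the data structure name and the assumption that the smoothing factor lies between 0 and 1 are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class SensorSetup:
    camera_height_m: float   # height of the tracking camera above the ground
    user_distance_m: float   # distance of the user from the camera
    motion_smoothing: float  # assumed to be a factor in the range 0..1

def validate_setup(setup: SensorSetup) -> list:
    """Return a list of problems with the setup; an empty list means it is usable."""
    problems = []
    if setup.camera_height_m <= 1.0:
        problems.append("camera should be mounted more than one meter above the ground")
    if setup.user_distance_m < 1.2:
        problems.append("user should stand at least 1.2 meters from the camera")
    if not 0.0 <= setup.motion_smoothing <= 1.0:
        problems.append("smoothing factor expected in the range 0..1")
    return problems

print(validate_setup(SensorSetup(camera_height_m=1.5, user_distance_m=2.0, motion_smoothing=0.5)))
```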
An additional headset pop-up menu 114 is then activated via headset tab 116, as is shown in Figure 4b. This setting allows the host to adjust the virtual position of the headset so that the user’s VR view is correct relative to their body. This is achieved by using the indicated up, down, left, right, and forward and backward buttons, to adjust the position of the user’s virtual camera so that the self and mirror virtual images that are generated of the user correspond as closely as possible with the user’s actual position, with the final adjusted position being saved. The host then commences treatment at 68 using the VR software, starting treatment in the on-boarding environment at step 70. The on-boarding environment is selectable via Key 1 of an environment selector 118, which includes additional Keys 2, 3, 4 and 5 for respectively enabling the selection of green field, forest, snowy mountain and underwater environments. It will be appreciated that many other possible environments may be generated. The keys may be activated via any of the aforementioned input devices.
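A minimal sketch of the environment selector mapping described above follows; the dictionary keys reflect the stated key assignments, while the function name and the default behaviour are assumptions.

```python
# Hypothetical key-to-environment mapping, following the key assignments above.
ENVIRONMENTS = {
    "1": "on-boarding",
    "2": "green field",
    "3": "forest",
    "4": "snowy mountain",
    "5": "underwater",
}

def select_environment(key: str) -> str:
    """Return the environment for a selector key, defaulting to on-boarding."""
    return ENVIRONMENTS.get(key, ENVIRONMENTS["1"])

print(select_environment("3"))  # -> "forest"
```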
At step 72, a decision is made as to whether an immersive environment is required for the session. If so, the host chooses one of the above-mentioned immersive environments at step 74, potentially taking user preferences into account, which in this case is the forest environment 106. If not, the host proceeds directly to step 76. The weather conditions associated with the environments may also be relevant to treatment. For example, a cold (snowy mountain) environment may be effective in the treatment of burns or burning pain.
The host then at 76 asks the user to describe their condition/problem, which in this example is pain-related. This may supplement or replace the initial assessment at step 59. At step 78, the user/client then describes the nature of their pain/problem and its location. The exchange between the user and the host may conveniently be verbal, but may also be in writing, and may in addition operate in an environment where the user and the host are not in the same location, and the written or verbal communication is over a suitable communications network.
At step 80, the host then creates a visualisation of the pain or problem at the described location. This may be achieved at step 80.1 using the VR controller 24, which the host points at the relevant location on the user’s body, or by using a direct selection tool on the host interface 100 including the monitor 16 and inputs 16a and 16b.
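By way of illustration, the sketch below shows one way pointing the controller at the user's body might be resolved to a named pain location, by finding the body anchor point closest to the controller's pointing ray. The anchor points, coordinates and function names are hypothetical and stand in for the application's own body model.

```python
import math

# Hypothetical anchor points (x, y, z) for locations on the virtual body.
BODY_POINTS = {
    "left wrist":  (-0.45, 1.05, 0.0),
    "right wrist": ( 0.45, 1.05, 0.0),
    "lower back":  ( 0.00, 1.10, -0.10),
    "head":        ( 0.00, 1.65, 0.0),
}

def _distance_point_to_ray(point, origin, direction):
    """Perpendicular distance from a point to a ray (direction need not be unit length)."""
    px, py, pz = (point[i] - origin[i] for i in range(3))
    dx, dy, dz = direction
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / norm, dy / norm, dz / norm
    t = max(0.0, px * dx + py * dy + pz * dz)  # clamp so we stay in front of the controller
    closest = (origin[0] + t * dx, origin[1] + t * dy, origin[2] + t * dz)
    return math.dist(point, closest)

def pick_pain_location(controller_origin, controller_direction):
    """Return the named body location closest to the controller's pointing ray."""
    return min(BODY_POINTS, key=lambda name: _distance_point_to_ray(
        BODY_POINTS[name], controller_origin, controller_direction))

# Pointing roughly at the left wrist from in front of the body.
print(pick_pain_location((-0.4, 1.0, 1.0), (0.0, 0.05, -1.0)))  # -> "left wrist"
```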
A pain type selector including menu 120 is displayed on the monitor, including indicated hot, cold and sharp types of pain. It will be appreciated that other pain types may also be indicated for selection, such as dull or throbbing. Referring to Figure 4c, the controller 24 has a menu button 24.1 which is repurposed as a pain type button used to select one of the above pain types. The pain types may in turn be represented by colours, with hot, cold and sharp pain types being represented by red, blue and purple colours respectively. These colours may be indicated by red, blue and purple orbs 24.2, 24.3 and 24.4 extending from the tip of the controller 24 in the VR environment. Pain types may also be represented by objects or phenomena associated with creating that type of pain. For example, flames or red-hot pokers may be used to indicate burning pain, knives, needles or lightning bolts to indicate sharp and intense pain, hammers and clubs to indicate dull throbbing pain, and pincers to indicate localised surface pain.
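A minimal sketch of the pain type mapping described above follows; the red, blue and purple colours for hot, cold and sharp pain follow the text, while the remaining entries (including the symbol chosen for cold pain) and the exact RGB values are illustrative assumptions.

```python
# Hypothetical mapping of pain types to display colours (RGB) and to the kinds
# of symbolic objects mentioned above; exact values are illustrative only.
PAIN_TYPES = {
    "hot":   {"colour": (255, 0, 0),     "symbol": "flame"},        # red
    "cold":  {"colour": (0, 0, 255),     "symbol": "ice crystal"},  # blue
    "sharp": {"colour": (128, 0, 128),   "symbol": "needle"},       # purple
    "dull":  {"colour": (128, 128, 128), "symbol": "hammer"},       # assumed grey
}

def pain_colour(pain_type: str) -> tuple:
    """Return the RGB colour used to depict the given pain type."""
    return PAIN_TYPES[pain_type]["colour"]

print(pain_colour("hot"))  # -> (255, 0, 0)
```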
The host user interface 100 also includes a pain attribute selector including a pain attribute menu or circular icon 122, with the magnitude of pain from small to big as indicated by the user on the vertical axis and the pain velocity or speed from slow to fast on the horizontal axis. The velocity setting may be used to indicate pain frequency in the case of a throbbing pain, for instance, or the speed of the pain in the case of a shooting pain. As is shown in Figure 4d, the circular touchpad 24.6 on the controller is repurposed as a pain attribute selector, with the host altering the pain magnitude by scrolling and pressing on the touchpad 24.6, which operates in the same way as the circular icon 122. In Figure 4d it can be seen how, in the VR environment, different sized orbs 24.7, 24.8 and 24.9 are used to indicate the magnitude of pain.
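The following sketch illustrates how a touchpad or icon position might be mapped to the two pain attributes described above; the assumption that each axis is reported in the range -1 to 1 is hypothetical.

```python
def pain_attributes_from_touchpad(x: float, y: float) -> dict:
    """Map a touchpad press position (assumed range -1..1 per axis) to pain
    attributes: vertical axis -> magnitude (small to big), horizontal axis ->
    pain velocity or speed (slow to fast), mirroring the circular icon 122."""
    def clamp(v: float) -> float:
        return max(-1.0, min(1.0, v))
    return {
        "magnitude": (clamp(y) + 1.0) / 2.0,  # 0.0 = small, 1.0 = big
        "velocity": (clamp(x) + 1.0) / 2.0,   # 0.0 = slow, 1.0 = fast
    }

print(pain_attributes_from_touchpad(x=0.8, y=-0.2))  # fast-ish, medium-small pain
```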
The host interface 100 further includes a model attribute selector including a model attribute menu 124 for enabling layers of the user’s VR body to be selected at step 80.2. These include a skin layer 126, shown in Figure 4e, which is the default or starting state of the user’s VR body. The skin layer is gender specific, without the associated anatomical detail, and the overall body type and height represent the body type and height of the user, based on the images of the user captured by the tracking camera 14 and processed by the computer system 12. The ability of the user to identify with their virtual selves is enhanced by accurate representations of body height and type.
The virtual representation of Figure 4f shows a second muscle layer 128 with representations of the muscles on the client’s body, which may be adjusted based on body type so that they conform with the skin layer. Figure 4g shows a third nerves layer 130 which is used to show how pain travels through the body in response to the user’s description of that pain in the manner previously described. The nerves layer 130 is scaled to conform with the size and shape of the user’s body.
Figure 4h shows a fourth organs layer 132. If the pain originates from a particular organ, this organ is highlighted. For example, in Figure 4a, the digestive system 132.1 is highlighted, and the pain is represented as travelling from the digestive system to the brain. It will be appreciated that various other organs can be displayed in the same manner and highlighted when relevant to the pain experienced by the user.
Figure 4k shows a fifth skeleton layer 134, which is the deepest layer. Bone or joint pain can be illustrated and localised using this layer. The various layers enhance the user experience by allowing the user to locate their pain more precisely in 3D as well as providing a realistic virtual representation of the affected body part and its relationship with the pain being experienced by the user.
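The layer switching described above can be pictured as a simple selectable state on the user’s avatar, as in the hedged Python sketch below. The enum values follow the reference numerals in the figures, but the class and method names are assumptions for illustration only.

```python
from enum import Enum, auto

class BodyLayer(Enum):
    SKIN = auto()      # 126 - default, gender-specific, no anatomical detail
    MUSCLE = auto()    # 128 - scaled to conform with the skin layer
    NERVES = auto()    # 130 - used to show pain travelling through the body
    ORGANS = auto()    # 132 - relevant organ highlighted when it is the pain source
    SKELETON = auto()  # 134 - deepest layer, for bone or joint pain

class AvatarModel:
    def __init__(self) -> None:
        self.active_layer = BodyLayer.SKIN  # skin is the starting state

    def select_layer(self, layer: BodyLayer) -> None:
        """Switch the visible layer of the user's virtual body (step 80.2)."""
        self.active_layer = layer

avatar = AvatarModel()
avatar.select_layer(BodyLayer.NERVES)
print(avatar.active_layer)
```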
As is shown at step 80.3, the host can turn the user’s mirror body on and off. This makes it easier for the user to see themselves: by looking down when wearing the VR headset 20, the user sees a virtual representation of their arms and the front portion of their body co-located with their real body, both when moving and when still. This is achieved by operating an experience mode toggle 136 in Figure 3, with which the host is able to toggle between the default self view and the experience mode or mirror view in which the user is able to see both views.
In Figure 4ma, the experience mode is shown, in which a self view of a virtual image of a user’s arm 136.1 is shown as well as a reflected view of the entire body of the user 136.2 in a virtual mirror 136.3, as experienced by the user when wearing the VR headset. The combination of the self and mirror views serves to reinforce the user’s immersion, in that the user is able to view themselves both directly and when reflected. Because the self and reflected views are dynamically synchronised with the actual movement of the user, this gives the user an even more immersive experience when moving, by enhancing the user’s perception that the self and mirror views are embodiments of the user. As previously described, in addition to varying the user’s view of their body, the host may also include a virtual image of themselves or a fantasy representation thereof. This is achieved by operating a host attribute toggle 137.
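Conceptually, the experience mode toggle 136 and host attribute toggle 137 can be thought of as independent flags controlling which representations are rendered, as in the hypothetical sketch below; the structure and names are assumptions rather than the disclosed implementation.

```python
from dataclasses import dataclass

@dataclass
class ViewState:
    show_self_view: bool = True     # direct view of the user's own virtual limbs/front
    show_mirror_view: bool = False  # reflected full-body view in the virtual "mirror"
    show_host_avatar: bool = False  # optional virtual/fantasy representation of the host

    def toggle_experience_mode(self) -> None:
        """Experience mode shows both the self view and the mirror view together."""
        self.show_mirror_view = not self.show_mirror_view

    def toggle_host_attribute(self) -> None:
        self.show_host_avatar = not self.show_host_avatar

state = ViewState()
state.toggle_experience_mode()  # user now sees both self and mirror views
print(state)
```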
As indicated at 80.4, the host can use video to capture both real and virtual images of the user and host where applicable to review treatment protocols after the treatment session. This may be securely stored in the database 56.
At step 82, a pain particle or particles are created at the originating location of the pain and shown travelling to the brain. This is achieved using a direct point selector shown at 138 in Figure 3 and Figure 4m. The host may use shortcut keys, such as F1, F2, F3, F4 and F5 on the keypad to access pain points on the body outline 140 of the direct point selector, corresponding to points 142 on the virtual body of the user 144. The controller 24 may be used to more accurately pinpoint the exact location of pain points on the user’s body, which may include the initial step of selecting an appropriate body layer. The pain type and attribute are also selected in the manner previously described, and this influences the size, colour and frequency of the pain point or zone as well as of the pain particles travelling to and/or from the pain point to the user’s brain.
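As a hedged illustration of how the shortcut keys could be mapped to pain points on the body outline 140 (corresponding to points 142 on the virtual body), consider the Python sketch below. The region names and 3D coordinates are placeholders invented for illustration, not values from the disclosure.

```python
# Hypothetical mapping of the F1-F5 shortcut keys to predefined pain points on the
# 2D body outline; each outline point corresponds to a 3D location on the virtual body.
OUTLINE_SHORTCUTS = {
    "F1": ("head",      (0.00, 1.70, 0.05)),
    "F2": ("left_arm",  (-0.45, 1.20, 0.00)),
    "F3": ("right_arm", (0.45, 1.20, 0.00)),
    "F4": ("abdomen",   (0.00, 1.05, 0.08)),
    "F5": ("left_knee", (-0.12, 0.50, 0.05)),
}

def place_pain_point(key: str):
    """Return the named region and 3D position selected by a shortcut key, if bound."""
    return OUTLINE_SHORTCUTS.get(key)

print(place_pain_point("F4"))  # ('abdomen', (0.0, 1.05, 0.08))
```

The controller 24 would then refine this coarse selection to an exact point on the selected body layer.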
At step 83 the pain particles are configured and the experience of the user is managed by the host using treatment principles including cognitive behaviour therapy, learning therapy, neuroplasticity, and pain therapy in the VR environment that has been established.
At step 84 the user is asked if there are any other pain locations. If the answer is positive, the process reverts to step 78, at which the user describes the location and nature of the pain, which is then converted by the host into a form which can be readily visualised. At step 86 pain particles continue to be created at the originating location(s) and are shown travelling to the brain. At step 88, the host continues to explain to the user what they’re looking at and, where necessary, adjustments may be made to the visualisations, depending in some instances on user feedback.
Figure 4n shows a static presentation of a virtual representation of a user 144.1 showing the nervous system 146 and a wrist-focused pain point 148 which is coloured red to represent burning pain, with pain particles 150 travelling to and from the brain 152. It will be appreciated that the pain particles are dynamically represented, with variations in speed and/or frequency used to indicate the nature of the pain. In some embodiments the representation of the pain point or zone as well as the representation of the pain particles is dynamically varied to indicate, for example, an easing of the pain. This may be achieved by decreasing the magnitude of the zone and/or the pain particles, by changing the colour of the zone and pain particles from red to blue for example, and/or by slowing down the speed or frequency of the pain particles. In some embodiments, the pain point or zone and pain particles may be caused to fade away, again to create an illusion of reduced pain.
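The easing effect described above could be prototyped as a simple interpolation over a normalised time parameter, as in the sketch below. The specific scaling factors are assumptions chosen only to illustrate the shrink/slow/fade/recolour behaviour.

```python
def ease_pain_visuals(t: float, size0: float, speed0: float):
    """
    Illustrative easing of the pain visuals over normalised time t in [0, 1]:
    the zone/particles shrink, slow down, fade away, and blend from red towards blue.
    Returns (size, speed, opacity, rgb). Purely a sketch of the described effect.
    """
    t = max(0.0, min(1.0, t))
    size = size0 * (1.0 - 0.8 * t)      # shrink the pain zone and particles
    speed = speed0 * (1.0 - 0.7 * t)    # slow the particle speed/frequency
    opacity = 1.0 - t                   # fade the visuals away
    red, blue = (255, 0, 0), (0, 0, 255)
    rgb = tuple(round(r + (b - r) * t) for r, b in zip(red, blue))
    return size, speed, opacity, rgb

print(ease_pain_visuals(0.5, size0=1.0, speed0=2.0))
```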
At step 90, treatment is completed (a session would typically take 15 to 20 minutes) and the user is off-boarded by removing the VR headset. The host then continues with the consultation session.
By virtue of three virtual cameras, CAMERAS 1, 2 and 3, the host is able to change their point of view of the user within the virtual world. This is achieved using camera selector interface 154, which in the present example uses Key 8 of the keypad to select the main mirror CAMERA 1, providing a reflected or mirror perspective from the user’s point of view, Key 9 to select the virtual host CAMERA 2, providing a perspective from the host’s point of view, and Key 0 to select VR controller CAMERA 3, providing a perspective from the VR controller’s point of view. Camera selection may also occur using a side camera change button on the controller 24.
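A minimal sketch of such key bindings, assuming a simple lookup from keypad keys to camera identifiers (the identifier strings are illustrative only):

```python
# Hypothetical key bindings for the host's virtual camera selector 154.
CAMERA_KEYS = {
    "8": "CAMERA_1_MAIN_MIRROR",    # reflected/mirror perspective from the user's point of view
    "9": "CAMERA_2_VIRTUAL_HOST",   # perspective from the host's point of view
    "0": "CAMERA_3_VR_CONTROLLER",  # perspective from the VR controller's point of view
}

def select_camera(key: str, current: str) -> str:
    """Switch the host's viewpoint if the key is bound, otherwise keep the current camera."""
    return CAMERA_KEYS.get(key, current)

print(select_camera("9", "CAMERA_1_MAIN_MIRROR"))
```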
Referring now to Figure 5, a schematic block diagram of an embodiment of the interoperable hardware and software components of the VR-based system is shown in which previously described hardware components are indicated with the same numerals. The VR software 49 installed on the computer processing system or PC 12 includes various software modules, including a virtual human/user creator module 160, a virtual human/user controller module 162, a virtual pain/condition module 164, a virtual camera module 166 and a virtual environment module 168.
The virtual human/user creator module 160 receives inputs from the tracking camera 14 and renders at sub-module 170 the real images of the user captured by the tracking camera to generate a virtual human/user of the type illustrated, with identifiable user characteristics. These may include body shape, face shape, skin colour, hair style, hair colour, eye colour and any other desired personal user characteristics. In addition, user characteristics of height, weight, body type and gender may be entered by the host via the input hardware 16 in consultation with the user at sub-module 172, with sub-modules 170 and 172 together constituting a rendering engine for rendering the static characteristics of the user. These are then stored in a dedicated user file in secure database 174, which may be a local or remote database.
The virtual human/user controller module 162 generates the virtual user and its mirror image or duplicate for dynamic display through the VR headset, as well as viewing by the host. This is achieved by receiving at sub-module 176 static user data from the database 174, including body and face shape, as well as other user characteristics which have been stored in the user file in the database. A body motion sub-module or sub-class 178 retrieves body motion variables from the tracking camera 14. More specific body position and motion attributes, including head position, head rotation, body position and finger movement data, are retrieved as variables at sub-module 180 from the VR headset 20 and one or more of its associated trackers 26a and 26b and 28a-d.
A dynamic virtual image of the user is generated by combining the above variables to effectively create a virtual user camera at sub-module 182 for display through the VR headset 20. Dynamic feedback from the headset 20 and tracking camera 14 has the effect of dynamically updating the virtual image as seen by the user with minimal latency. The virtual user VR camera position and rotation changes in concert with user induced movement of the VR headset to vary the view of the VR environment. A layering module 184 is operated by the host via inputs 16a, 16b as previously described to enhance the visualisation of the body layer or part requiring treatment, such as skin, nerves, muscles, organs and/or bones.
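The per-frame combination of stored static traits (sub-module 176) with body motion from the tracking camera (sub-module 178) and head/finger pose from the headset and trackers (sub-module 180) could be sketched as follows. The data structures and field names are assumptions for illustration, not the disclosed data model.

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class StaticTraits:      # loaded once from the user file in database 174
    height_m: float
    body_type: str
    gender: str

@dataclass
class FramePose:         # refreshed every frame from camera, headset and trackers
    joint_positions: Dict[str, Vec3] = field(default_factory=dict)  # from tracking camera 14
    head_position: Vec3 = (0.0, 0.0, 0.0)                           # from VR headset 20
    head_rotation: Vec3 = (0.0, 0.0, 0.0)
    finger_curl: Dict[str, float] = field(default_factory=dict)     # from hand trackers

def update_avatar(traits: StaticTraits, pose: FramePose) -> dict:
    """Combine static traits with the latest tracked pose into one avatar state per frame."""
    return {"traits": traits, "pose": pose}

avatar_state = update_avatar(
    StaticTraits(height_m=1.75, body_type="average", gender="female"),
    FramePose(head_position=(0.0, 1.65, 0.0)),
)
print(avatar_state["pose"].head_position)
```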
In conventional virtual reality systems, a mirror within a virtual environment is generated as a plane displaying a reflection of the whole environment. This works like a video projection of the whole environment onto a 2D object within the environment. This creates double the graphical processing requirements, as the engine is trying to render two images of the same environment to show a mirror effect to be displayed on the screen. In the case of a VR headset with two screens, one for each eye, this again doubles the graphical processing requirements, with four environments required to be rendered.
In the present disclosure, a mirroring or inverting sub-module or engine 186 generates an inverse image of the virtual body instead of using a virtual mirror plane, with the same effect of generating a mirrored virtual human or user at 188. This provides flexibility to manipulate the duplicate inverse body, which can be controlled separately from the user body. For example, with a virtual mirror plane it is not possible to see one’s back when looking forward. With the duplicate virtual body technique the virtual body can be rotated to allow the user to observe and have explained to them treatments on their back side.
There is further a reduction in graphical processing requirements needed to render the experience, with the graphical processor only needing to render the environment once to be displayed on the screen, or twice in the case of a VR headset. This enhances the performance of the system, reducing lag or latency to ensure that the user’s movements are synchronised with their virtual direct and “reflected” representations and increasing the graphic quality of the VR environment.
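The inverse-body approach of sub-module 186 can be approximated by reflecting every joint (or vertex) of the avatar about a plane, rather than re-rendering the whole scene into a reflective surface. The sketch below assumes the avatar is represented as a set of named 3D joint positions; the representation and function name are assumptions for illustration only.

```python
from typing import Dict, Tuple

Vec3 = Tuple[float, float, float]

def mirror_body(joints: Dict[str, Vec3], mirror_z: float = 2.0) -> Dict[str, Vec3]:
    """
    Produce a duplicate, inverted copy of the avatar by reflecting each joint about a
    plane at z = mirror_z, instead of rendering the whole environment a second time.
    The duplicate can then be rotated or repositioned independently of the user's body.
    """
    return {name: (x, y, 2.0 * mirror_z - z) for name, (x, y, z) in joints.items()}

joints = {"head": (0.0, 1.7, 0.4), "left_hand": (-0.4, 1.1, 0.6)}
print(mirror_body(joints))  # reflected copy facing the user
```

Because only the duplicate body is transformed, the environment itself is rendered once per eye, consistent with the reduced processing requirements described above.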
The virtual pain module 164 is used to generate virtual pain images of the type previously described with input from the host in consultation with the user. The various pain parameters, including pain type, speed and intensity/magnitude, are input and rendered at sub-module 190, with the start and end positions of the pain being entered at 192 and 194 via the VR controllers 24 in the process of finding the right path at 196 and rendering a pain pathway at 198.
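As a simplified stand-in for the path-finding step 196 and pathway rendering at 198, the sketch below places sample points on a straight line between the entered start and end positions, at which pain particles could be spawned; a real implementation would instead follow the rendered nerve pathways. The coordinates are placeholders.

```python
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

def pain_pathway(start: Vec3, end: Vec3, steps: int = 10) -> List[Vec3]:
    """Sample points from the pain origin towards the brain for particle placement."""
    return [
        tuple(s + (e - s) * i / steps for s, e in zip(start, end))
        for i in range(steps + 1)
    ]

wrist, brain = (0.45, 1.05, 0.0), (0.0, 1.70, 0.05)
path = pain_pathway(wrist, brain, steps=5)
print(path[0], path[-1])
```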
A pain mirroring or inverting module 200 generates a mirrored/inverted virtual pain image at 202. The virtual pain images, both direct and inverted, are layered onto the virtual human/user body image generated at the virtual human controller module at 204 and made available to the user as a composite image through the VR headset.
The virtual environment module 168 includes a selection of virtual environments which are selected by the host in consultation with the user, with one or more selected environments being stored in a user file in the database 174. The selected environment is then overlaid/underlaid at the virtual human controller module for display through the VR headset 20 and monitor 16.
The virtual camera module 166 includes a host camera sub-module 206 including the three previously described virtual software cameras 1, 2 and 3 providing the host with views from the host, user and controller perspectives. The sub-module 206 may be controlled by the host via keypad and mouse inputs 16a and 16b as well as via the controller 24 as previously described. The host is able to select camera type, position and rotation variables which will determine the host graphical view on the host GUI 100 on monitor 16.
It will be appreciated that the disclosed treatment system and method may be implemented with other real-and-virtual combined environments and human-machine interactions generated by computer technology and wearables (i.e. XR technologies), such as augmented reality (AR), mixed reality (MR) or any combination of VR, AR and MR.
For example, the term “VR” or “virtual reality” used above can be replaced with the term “XR” or “extended reality”, representing the disclosed treatment system and method implemented with any of the XR technologies. In particular, a motion tracking device of the XR-based system may be one or more of a Microsoft Kinect 2.0 camera, a Microsoft Azure Kinect camera, a webcam, a mobile phone with LiDAR system or other commercially available tracking cameras, wearables or other sensors to track the user’s body and movement. The VR headset 20 may be extended to other XR devices including smartphones, screens and/or projectors to display the dynamic virtual image of the user and/or to provide dynamic feedback for dynamically updating the virtual image as seen by the user. It will also be understood that the virtual environment may be created in combination with the real environment to form an XR-type environment such as an AR environment.
In some embodiments, a machine learning software module 51 may be implemented with the XR-based system to facilitate automation of treatment. The machine learning software module 51 may also facilitate generation of treatment reports with analytical data for the treatments that have been done for a user. As illustrated in Figure 6, cloud or local database 610 may collect historical data including, for example, XR hardware data 601 generated from the XR hardware 47, XR software data 603 generated from the XR software 49, user data 605 and/or host data 607 including user’s condition history and treatment history (not necessarily the historical data from the current user in treatment). The historical data may be used as ground truth to train a machine learning processor 620 by using, for example, supervised learning techniques (e.g. multilayer perceptron, support vector machine) and/or transfer learning techniques (e.g. contrastive learning approach, or graph matching).
The trained machine learning processor 620 may then be able to provide one or more executable treatment actions 630 for a user currently in treatment based on the input data from the XR hardware 47, the XR software 49, host input and/or user input (e.g. one or more inputs representing one or more attributes of the at least one condition for treatment) from the user currently in treatment. The generated executable treatment actions 630 may then be provided to the XR software 49 for visualisation and/or selectable use by the host and/or the user in treatment. The generated executable treatment actions may also be fed back to the cloud/local database 610 to enrich the historical data for training the machine learning processor 620.
To ensure safety and correctness of the treatment actions, the host may be employed as “human-in-the-loop” to verify and modify the machine generated treatment actions. The verified and/or modified treatment actions may also be fed back to the cloud/local database 610 to enrich the historical data.
The trained machine learning processor 620 may also output analytical data 640 to evaluate one or more treatment results. The analytical data 640 may be used to generate treatment reports which can be provided to the user and/or host. The analytical data 640 may also be fed back to the cloud/local database 610 to enrich the historical data.
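The training and suggestion loop of Figure 6 could be prototyped along the following lines, using a multilayer perceptron as one of the supervised learning techniques mentioned above. The feature layout, label meaning and the synthetic data are assumptions introduced purely for illustration; they are not the disclosed data or model.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Placeholder "historical data": each row stands in for concatenated XR hardware/software
# features and user/host condition attributes; each label indexes a treatment action.
rng = np.random.default_rng(0)
X_history = rng.random((200, 12))         # synthetic feature vectors (illustrative only)
y_history = rng.integers(0, 4, size=200)  # synthetic treatment-action labels

# Train the machine learning processor 620 on the historical data (supervised learning).
model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
model.fit(X_history, y_history)

# At treatment time, the current session's inputs are mapped to a suggested action,
# which the host verifies or modifies ("human-in-the-loop") before it is applied and
# fed back into the database 610 to enrich the training set.
current_inputs = rng.random((1, 12))
suggested_action = int(model.predict(current_inputs)[0])
print("suggested treatment action index:", suggested_action)
```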
Initial test results
Initial development work has established the capability of the virtual reality based treatment system and method to generate a seamless virtual reality environment for the user to be immersed in, allowing the user to identify with their self representation as well as with their mirror representation. A pilot sample of four users was tested, all of whom were suffering from chronic pain with a range of diagnoses. All of the users showed immediate transient pain reduction after a single treatment. Of the sample of four, one user subsequently dropped out. The remaining three users responded as follows to treatment over a period of 10 weeks, with one session per week lasting an average of X minutes:
User 2 achieved total pain reduction and experienced periods of being completely pain free for the first time in seven years. User 3 showed a reduction in pain severity, a reduction in pain locations or extent, and a reduction in the impact of pain on their daily living. However, there was still some residual pain, though at reduced levels. It was also noted that the residual pain at reduced levels was located only at the site of the injury, without any radiating pain.
User 4 had variable results with some improvements. They suffer from hyper-mobile joints and have ongoing pain as they continue to dislocate joints, causing acute proprioceptive pain on an ongoing basis.
Based on these initial results, the applicant is conducting ongoing trials including with regard to intensity and frequency.
It will be appreciated that applications of the treatment method and system are not confined to the treatment of pain, but may potentially be used in treating any condition which can be visualised and depicted. Other applications using neuroplasticity may include rehabilitation therapy in the case of paralysis or palsy, as is the case with stroke sufferers, treatment of mental disorders, as well as relaxation therapy using an immersive environment. The condition may relate to amputees, and the treatment may include mental and physical training of amputees, including emulating their lost limb to train their nerves and muscles before using artificial limbs.
It is believed that the onboarding process, including tracking of the entire body of the user and the direct and reflected virtual representations of the user's body, contributes to the user believing, feeling and reacting to the virtual representations/embodiments or avatar as being the real self. It is believed that this serves to engage the brain neuroplastically to enhance the treatment process.

Claims

1. A virtual reality-based treatment system for performing treatment on at least one condition of a subject, including:
a virtual reality device arranged to be fitted to the subject and for immersing the subject in a virtual reality environment;
at least one tracking camera configured to capture physical traits and movement of the body of the subject;
a processor communicating with the virtual reality device and the at least one tracking camera;
a monitor in communication with the processor, and including a user interface,
wherein the processor is programmed to:
generate a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject, and to render the virtual representation of the body in the virtual reality environment via the virtual reality device, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject;
generate a virtual representation of the at least one condition of the subject in response to one or more inputs;
overlay or render the virtual representation of the condition of the subject on the virtual representation of the body of the subject;
receive and process one or more inputs representing one or more attributes of the condition to adjust the virtual representation of the condition of the subject in the virtual reality environment to thereby assist the subject to visualise and resolve the condition.
2. A method of performing a treatment on at least one condition of a subject in an immersive virtual reality environment comprising:
capturing physical traits and movement of the body of the subject;
generating a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject;
rendering the dynamic virtual representation of the body of the subject in the virtual reality environment, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject;
generating a virtual representation of the at least one condition of the subject in response to one or more inputs;
overlaying or rendering the virtual representation of the condition of the subject on the virtual representation of the body of the subject; and
receiving and processing one or more inputs representing one or more attributes of the condition to adjust the virtual representation of the condition of the subject in the virtual reality environment to thereby assist the subject to visualise and resolve the condition.
3. A method according to claim 2 which includes generating virtual representations of multiple layers or components of the virtual body selected from at least two of a skin layer or component, a muscle layer or component, a nerves layer or component, an organs layer or component, a vascular layer or component, a respiratory layer or component and a skeleton layer or component, and enabling switching between virtual representations of the layers or components.
4. A method according to claim 2 wherein the visual representations of the attributes of the condition include at least two of location, start point, end point, depth, intensity, size, speed, direction, frequency, temperature as indicated by colour and type as indicated by symbols.
5. A method according to any one of claims 2-4 in which the captured physical traits include at least three of body shape, face shape, skin colour, hair colour/style, eye colour, height, weight, and gender.
6. A method according to any one of claims 2-5 in which the step of generating virtual representations of the body of the subject includes generating selectable or interchangeable direct self and mirror self-representations of the subject.
7. A method according to claim 6 in which the mirror representations of the subject are generated by generating an inverse image of the subject as opposed to using a virtual mirror plane.
8. A method according to either one of claims 6 or 7 in which the step of generating virtual representations of the at least one condition of the subject includes generating direct and mirror representations of the at least one condition and overlaying the direct and mirror representations of the condition on the respective direct and mirror representations of the subject.
9. A method according to claim 8 in which the mirror representations of the at least one condition are generated by generating an inverse image of the at least one condition as opposed to using a virtual mirror plane.
10. A method according to any one of claims 2-8 which includes the step of immersing the subject in the virtual reality environment, including enabling the selection of a plurality of different immersive environments, such as onboarding (neutral), underwater, green field, snow, mountain, forest, tropical island and desert for occupation by the virtual representation of the subject.
11. A method according to any one of claims 2-10 which includes generating a virtual representation of the body of a host or treatment provider, typically based on the captured physical traits and movement of the body of the host, and rendering the virtual representation of the body of the host in the virtual reality environment.
12. A method according to any one of claims 2-11 wherein the condition may include pain, chronic pain, a physical or mental ailment or disability, including amputeeism and various levels of paralysis or palsy, and may further include a physical or mental state which requires enhancing or therapy, such as muscle condition, mental acuity, or stress.
13. A system according to claim 1 wherein the processor is programmed to implement the method of any one of claims 3-11.
14. A non-transient storage medium readable by a processor, the storage medium storing a sequence of instructions or software executable by the processor to cause the processor to perform the method of any one of claims 2-11.
15. A non-transient storage medium according to claim 14 in which the sequence of instructions or software includes: a virtual subject creator module to capture physical traits of the body of a subject and render a virtual subject including those traits; a virtual subject controller module to capture movement of the body of the subject and render a moving virtual subject using the virtual subject from the virtual human creator module; a virtual condition module to generate a virtual representation of at least one condition of the subject in response to one or more inputs, and layer it on the moving virtual subject.
16. A non-transient storage medium according to claim 15 in which the software further includes: a virtual environment module for providing a selectable virtual environment for the subject, and a virtual camera module for generating a selection of views or perspectives of the subject being treated.
17. A virtual reality-based treatment system for performing treatment on at least one condition of a subject, including:
a virtual reality device arranged to be fitted to the subject and for immersing the subject in a virtual reality environment;
at least one tracking camera configured to capture physical traits and movement of the body of the subject;
a processor communicating with the virtual reality device and the at least one tracking camera;
a monitor in communication with the processor, and including a user interface,
wherein the processor is programmed with a plurality of software modules to generate a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject, and to render the virtual representation of the body in the virtual reality environment via the virtual reality device, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject, the software modules including:
a virtual subject creator module to capture physical traits of the body of a subject and render a virtual subject including those traits;
a virtual subject controller module to capture movement of the body of the subject and render a moving virtual subject using the virtual subject from the virtual human creator module;
a virtual condition module to generate a virtual representation of the at least one condition of the subject in response to one or more inputs, and layer it on the moving virtual subject; and
a virtual environment module for providing a selectable virtual environment for the subject.
18. An extended reality (XR) based treatment system for performing treatment on at least one condition of a subject, including:
an XR device arranged to be fitted to the subject and for engaging the subject in an XR environment;
at least one motion tracking device configured to capture physical traits and movement of the body of the subject;
a processor communicating with the XR device and the at least one motion tracking device;
a monitor in communication with the processor, and including a user interface,
wherein the processor is programmed to:
generate a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject, and to render the virtual representation of the body in the XR environment via the XR device, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject;
generate a virtual representation of the at least one condition of the subject in response to one or more inputs;
overlay or render the virtual representation of the condition of the subject on the virtual representation of the body of the subject;
receive and process one or more inputs representing one or more attributes of the condition to adjust the virtual representation of the condition of the subject in the XR environment to thereby assist the subject to visualise and resolve the condition.
19. The extended reality (XR) based treatment system according to claim 18 wherein the processor is programmed to implement the method of any one of claims 3 to 12.
20. An extended reality (XR) based treatment system for performing treatment on at least one condition of a subject, including:
an XR device arranged to be fitted to the subject and for engaging the subject in an XR environment;
at least one motion tracking device configured to capture physical traits and movement of the body of the subject;
a processor communicating with the XR device and the at least one motion tracking device;
a monitor in communication with the processor, and including a user interface,
wherein the processor is programmed with a plurality of software modules to generate a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject, and to render the virtual representation of the body in the XR environment via the XR device, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject, the software modules including:
a virtual subject creator module to capture physical traits of the body of a subject and render a virtual subject including those traits;
a virtual subject controller module to capture movement of the body of the subject and render a moving virtual subject using the virtual subject from the virtual human creator module;
a virtual condition module to generate a virtual representation of the at least one condition of the subject in response to one or more inputs, and layer it on the moving virtual subject; and
an XR environment module for providing a selectable XR environment for the subject.
21. A method of performing a treatment on at least one condition of a subject in an extended reality (XR) environment comprising:
capturing physical traits and movement of the body of the subject;
generating a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject;
rendering the dynamic virtual representation of the body of the subject in the XR environment, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject;
generating a virtual representation of the at least one condition of the subject in response to one or more inputs;
overlaying or rendering the virtual representation of the condition of the subject on the virtual representation of the body of the subject; and
receiving and processing one or more inputs representing one or more attributes of the condition to adjust the virtual representation of the condition of the subject in the XR environment to thereby assist the subject to visualise and resolve the condition.
22. The method according to claim 21 which includes the steps of any one of claims 3 to 12.
23. The extended reality (XR) based treatment system or method of any one of claims 18-22 selected from the group comprising at least one of virtual reality (VR), augmented reality (AR) and mixed reality (MR).
24. The XR-based treatment system according to any one of claims 18-20 and 23, further including: a database for collecting historical data; and a machine learning processor; wherein the historical data is used to train the machine learning processor so that the machine learning processor generates one or more executable treatment actions based on the one or more inputs representing one or more attributes of the at least one condition of the subject; and wherein the generated one or more executable treatment actions are provided to the processor for visualisation and resolving the condition.
25. The XR-based treatment system according to claim 24 wherein the historical data includes one or more of XR hardware data, XR software data, user data and host data.
26. The XR-based treatment system according to claim 24 or claim 25, wherein the generated one or more executable treatment actions are fed back to the database.
27. The XR-based treatment system according to any one of claims 24-26, wherein the trained machine learning processor further generates analytical data to evaluate one or more treatment results, and wherein the generated analytical data is fed back to the database.

