CN112102917B - Exercise amount visualization method and system for active rehabilitation training - Google Patents

Exercise amount visualization method and system for active rehabilitation training

Info

Publication number
CN112102917B
CN112102917B (application CN202010800941.9A)
Authority
CN
China
Prior art keywords
training
speed
rendering
marker
augmented reality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010800941.9A
Other languages
Chinese (zh)
Other versions
CN112102917A (en)
Inventor
招梓枫
朱利丰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University
Priority to CN202010800941.9A
Publication of CN112102917A
Application granted
Publication of CN112102917B
Legal status: Active

Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/30 ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 Movements or behaviour, e.g. gesture recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Psychiatry (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Epidemiology (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses an exercise-amount visualization method and system for active rehabilitation training. The system renders augmented reality content, including an augmented reality virtual compass, the guide trajectory and its fault-tolerance range, to give the patient feedback on position, speed and posture. The system determines the best position estimate of the wearable marker by corner matching and searches a corner-pose database for the best pose estimate. An augmented reality virtual compass is constructed and rendered from the actual speed and the target speed, and its pose is kept consistent with that of the wearable marker. The compass reflects the difference between the actual speed and the target speed in both magnitude and direction, guiding the patient to correct the movement speed, while the rendered trajectory guides the patient to correct the movement route. In addition, the system provides the training design module and the evaluation module required by an active rehabilitation training system.

Description

Exercise amount visualization method and system for active rehabilitation training
Technical Field
The invention relates to the fields of active rehabilitation, motion perception and augmented reality, and in particular to an exercise-amount visualization method and system for active rehabilitation training.
Background
Active rehabilitation training can re-engage injured nerve cells by guiding and correcting movement to a certain degree. With the help of training aids, the neural centers correct the movement, so that its stability and accuracy gradually improve and motor function is restored.
Augmented reality superimposes virtual objects, scenes and information onto a real scene to enhance the real visual effect. Augmented-reality-based rehabilitation training stimulates the patient to practice through human-computer interaction, and the patient transfers the learned motor skills to real scenes.
Many augmented-reality-based active rehabilitation training systems have appeared in recent years, but most of them only build simple interactive games in which augmented reality creates various virtual controlled objects (such as a ball, a cart or a stick). These systems attend only to the current position and the target position; they lack quantitative feedback and monitoring of the high-degree-of-freedom motion process, especially visualization of velocity: the magnitude and direction of the movement speed, the deviations in speed magnitude and direction, and also the historical positions, the target trajectory and the tracking deviation. Since rehabilitation exercise involves much high-dimensional motion information that cannot be observed directly, how to display and evaluate this information efficiently and intuitively is a problem that needs to be addressed.
Disclosure of Invention
The invention aims to solve the above technical problems by providing an exercise-amount visualization method and system for active rehabilitation training. The invention visualizes high-dimensional motion information through augmented reality, helping patients and trainers to understand in depth the problems present in training and the focus of the exercise.
The technical scheme is as follows: an augmented reality motion visualization method for active rehabilitation training, comprising the following steps:
step 1, a trainer designs a guide trajectory f and sets on f a sliding point P that moves along f; the speed of P along f during the simulated movement is recorded as the training target speed $\vec{v}_{target}$;
Step 2, drawing fault tolerance ranges on two sides of the guide track f by taking equidistant lines of distances d on the two sides of the guide track f as boundaries of the fault tolerance ranges, wherein d is the training error tolerance;
step 3, performing AR rendering of f and $\vec{v}_{target}$ from step 1 and of the fault-tolerance range;
step 4, acquiring training images of the patient wearing the marker M in real time with a camera, and locating M in each frame of training image to obtain the position of the center of M and the actual trajectory g;
step 5, calculating, by a difference method, the movement speed of the marker M along g as the actual speed $\vec{v}_{reality}$;
Step 6, for g and g in step 4
Figure BDA0002627354120000021
Performing AR rendering, wherein starting points of f and g are the same;
step 7, performing corner detection on the marker M in the training image and matching it against a pre-stored corner-pose database of M to obtain the real-time pose of M, forming the pose matrix $\{R_{Marker}(t) \mid p_{Marker}(t)\}$, where $R_{Marker}(t)$ and $p_{Marker}(t)$ are respectively the attitude of M and the position of the center of M at time t;
step 8, constructing the virtual compass T from $\vec{v}_{reality}$ and $\vec{v}_{calibrate}$: with the center of the marker M as the circle center of T and the speed error angle $\theta$ as the central angle, making concentric sectors $T_{reality}$ and $T_{calibrate}$ whose radii are respectively the magnitudes of the actual speed $\vec{v}_{reality}$ and of the motion-correction speed $\vec{v}_{calibrate}$, where the motion-correction speed $\vec{v}_{calibrate} = \vec{v}_{target} + \mu\,\vec{e}$, $\vec{e}$ is the distance vector from the center of M to the nearest point on f, and $\mu$ is a scale factor;
step 9, controlling the pose of the virtual compass T according to $\{R_{Marker}(t) \mid p_{Marker}(t)\}$: the attitude of T is $R_T(t) = R_{Marker}(t)$ and the position of T is $p_T(t) = p_{Marker}(t)$;
Step 10, performing AR rendering on the virtual compass T;
and step 11, after the training of the patient is finished, evaluating the training according to the similarity of f and g and the smoothness of g.
Further, in step 8: the speed error amount of the virtual compass T is $\delta = \bigl|\,\|\vec{v}_{reality}\| - \|\vec{v}_{calibrate}\|\,\bigr|$.
Further, in step 10: the actual-speed $\vec{v}_{reality}$ arrow and the sector $T_{reality}$ are rendered in one color, and the correction-speed $\vec{v}_{calibrate}$ arrow and the sector $T_{calibrate}$ in another color to show the distinction.
Further, the virtual compass T is in a transparent state.
Further, in step 11, the cross-correlation function is used to evaluate how close f and g are, and the Dirichlet energy is used to evaluate how smooth g is.
An augmented reality motion visualization system for active rehabilitation training comprises a training design module, a tracking module, a calculation module, a rendering module, an evaluation module, a camera and a display;
a training design module, for the trainer to design the guide trajectory f, the target speed $\vec{v}_{target}$ and the fault-tolerance range;
a tracking module, for processing the training images collected by the camera in real time to obtain the attitude and central position of the marker M and the actual trajectory g, and for calculating the actual speed $\vec{v}_{reality}$ by a difference method;
a calculation module, for constructing the virtual compass T from $\vec{v}_{reality}$ and $\vec{v}_{calibrate}$;
a rendering module, for performing AR rendering of f, $\vec{v}_{target}$, the fault-tolerance range, g and $\vec{v}_{reality}$;
an evaluation module, for evaluating the training according to the closeness of f and g and the smoothness of g;
and a display, for displaying the AR rendering result.
The invention provides an exercise-amount visualization method and system for active rehabilitation training with the following advantages:
1. the augmented reality virtual compass visualizes motion quantities such as the movement speed magnitude, the movement speed direction, the speed magnitude deviation, the speed direction deviation, and the hand position and posture;
2. information such as historical positions, the target trajectory, the actual trajectory and the tracking deviation is visualized through augmented reality;
3. a concrete scheme is given for the guide-trajectory design method and for a training evaluation that considers both motion difference and motion smoothness.
Drawings
FIG. 1 is a flow chart of the trainer's training design;
FIG. 2 illustrates designing the guide trajectory f by control points and setting the target speed $\vec{v}_{target}$ by a sliding point, where (a) is a schematic view of designing f by control points and (b) is a schematic view of setting $\vec{v}_{target}$ by the sliding point;
FIG. 3 is a flow chart of a method of the present invention;
FIG. 4 is a rendering schematic diagram, in which (a) is a rendering schematic diagram of a guide track f and a fault tolerance range, and (b), (c) and (d) are rendering schematic diagrams of an actual track g and a virtual compass T at three different time points;
FIG. 5 (a), (b) and (c) are schematic diagrams of three kinds of the augmented reality virtual compass T, respectively;
FIG. 6 is a system architecture diagram of the present invention.
Detailed Description
The invention is further described with reference to the accompanying drawings and the detailed description.
The invention discloses an augmented reality exercise-amount visualization method and system for active rehabilitation training. By designing an augmented reality virtual compass, the three key motion quantities of the patient's hand in rehabilitation training, namely position, speed and posture, are displayed visually, providing an intuitive real-time comparison of motion states and guiding the rehabilitation training.
As shown in FIG. 6, the system of the present invention comprises training design, tracking, calculation, rendering and evaluation modules, as well as a camera and a display. The system renders augmented reality content, including the augmented reality virtual compass, the trajectory and its fault-tolerance range, to give the patient feedback on position, speed and posture. The system determines the best position estimate of the wearable marker by corner matching and searches a corner-pose database for the best pose estimate. An augmented reality virtual compass is constructed and rendered from the actual speed and the target speed, and its pose is kept consistent with that of the wearable marker. The compass reflects the difference between the actual speed and the target speed in both magnitude and direction, guiding the patient to correct the movement speed, while the trajectory guides the patient to correct the movement route. In addition, the system provides the training design module and the evaluation module required by an active rehabilitation training system.
An augmented reality motion visualization method for active rehabilitation training, as shown in fig. 3, comprises the following steps:
(1) As shown in fig. 1, the training content design performed by the trainer specifically includes the following steps:
As shown in FIG. 2 (a), the trainer defines the positions of the control points in the training space through GUI interaction, and the key points between the control points are interpolated using spline interpolation to generate the guide-trajectory curve f:

$f(\tau) = \sum_j (x_j, y_j, z_j)\, s_j(\tau)$

where a third-order (cubic) spline basis function $s_j(\tau)$ is used on each section of the guide trajectory.
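As an illustration only, the following Python sketch shows one way such a control-point spline might be sampled in practice; the use of scipy's CubicSpline and the chord-length parameterization are assumptions of this sketch, not details given in the patent.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def make_guide_trajectory(control_points, samples=200):
    """Sample a guide trajectory f(tau) from trainer-picked control points.

    control_points: (n, 3) list/array of (x_j, y_j, z_j).
    Returns tau in [0, 1] and the sampled curve points.
    """
    pts = np.asarray(control_points, dtype=float)
    # Chord-length parameterization: tau grows with distance along the polyline.
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    tau_ctrl = np.concatenate(([0.0], np.cumsum(seg)))
    tau_ctrl /= tau_ctrl[-1]
    spline = CubicSpline(tau_ctrl, pts, axis=0)  # one cubic piece per section
    tau = np.linspace(0.0, 1.0, samples)
    return tau, spline(tau)

tau, f_samples = make_guide_trajectory(
    [[0.0, 0.0, 0.0], [0.1, 0.2, 0.0], [0.3, 0.2, 0.1], [0.5, 0.0, 0.1]]
)
```

The sampled points can serve both for rendering the guide trajectory and for the nearest-point queries used later.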
As shown in FIG. 2 (b), a sliding point P that moves along f is placed on the guide trajectory f. When the trainer moves the sliding point P along f in a simulated movement, the speed of P along f is recorded as the training target speed $\vec{v}_{target}$.
(2) Draw the fault-tolerance ranges on both sides of the guide trajectory f, taking the equidistant lines at distance d on either side of f as the boundary of the fault-tolerance range, where d is the training error tolerance.
(3) Perform AR rendering of f, $\vec{v}_{target}$ and the fault-tolerance range, as shown in FIG. 4 (a). The rendering of the augmented reality objects is implemented by calling the object rendering module of the augmented reality platform Vuforia.
(4) The patient wears the marker M on the hand; the position $p_{Marker}(t)$ and attitude $R_{Marker}(t)$ of M reflect the position and posture of the patient's hand. The camera collects training images of the patient wearing M in real time, and the position of the center of M and the actual trajectory g are obtained by locating M in each frame.
When processing an image acquired by the camera, edge detection is performed with the Sobel operator, corner detection is then performed on the edges with the FREAK operator, and the marker is identified from the corner distribution in the image. Before training, the marker is registered in the system, which stores a corner-pose database of the marker. During tracking, the system computes the corner information of each frame and determines the marker position $p_{Marker}(t)$ by comparing the corner distribution of the image with that of the marker via the Hamming distance. The identified marker is tracked on the video stream with an improved MOSSE filter; corner-based target identification is performed only in the region predicted by the filter, and the actual trajectory g of the marker center M is recorded.
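For orientation only, the following Python/OpenCV sketch illustrates binary-descriptor marker location matched by Hamming distance. It substitutes OpenCV's ORB descriptors for the patent's Sobel/FREAK pipeline, omits the improved MOSSE prediction step, and uses the placeholder file name marker.png; it is an analogy to the described approach, not the patented implementation.

```python
import cv2
import numpy as np

# Register the marker once before training: detect corners and store their
# binary descriptors; they are matched later by Hamming distance.
orb = cv2.ORB_create(nfeatures=500)
marker_img = cv2.imread("marker.png", cv2.IMREAD_GRAYSCALE)  # placeholder file
marker_kp, marker_desc = orb.detectAndCompute(marker_img, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def locate_marker(frame_gray):
    """Estimate the marker center p_Marker(t) in one frame, or return None."""
    kp, desc = orb.detectAndCompute(frame_gray, None)
    if desc is None:
        return None
    matches = matcher.match(marker_desc, desc)
    matches = [m for m in matches if m.distance < 40]  # Hamming threshold
    if len(matches) < 8:
        return None
    pts = np.float32([kp[m.trainIdx].pt for m in matches])
    return pts.mean(axis=0)  # tracked per frame to build the trajectory g
```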
(5) Calculate the movement speed of the marker along g by the difference method as the actual speed:

$\vec{v}_{reality}(t) = \dfrac{p_{Marker}(t) - p_{Marker}(t - \Delta t)}{\Delta t}$

where $p_{Marker}(t)$ is the position point at time t, $p_{Marker}(t - \Delta t)$ is the position point at time $t - \Delta t$, and $\Delta t$ is the time interval. The corner-pose database is searched and the pose estimate $R_{Marker}(t)$ is matched using an improved Hausdorff distance, the RANSAC algorithm being used on the video stream to exclude spurious corner points in certain frames; $\{R_{Marker}(t) \mid p_{Marker}(t)\}$ forms the pose matrix.
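A minimal sketch of the difference method above; the sample positions and the 30 Hz frame interval are illustrative assumptions.

```python
import numpy as np

def actual_speed(p_now, p_prev, dt):
    """Finite-difference velocity of the marker center along g.

    p_now, p_prev: p_Marker(t) and p_Marker(t - dt); dt: frame interval.
    """
    return (np.asarray(p_now, float) - np.asarray(p_prev, float)) / dt

# Illustrative values at a 30 Hz camera rate.
v_reality = actual_speed([0.32, 0.11, 0.05], [0.30, 0.10, 0.05], dt=1 / 30)
```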
As shown in FIG. 4 (b), (c) and (d), the local appearance of the actual trajectory g depends on the actual motion. If the marker moves within the fault-tolerance range, that part of the trajectory is rendered in a normal form; if the marker moves outside the fault-tolerance range, that part is rendered in an abnormal form (see the sketch below). The rendering of the augmented reality objects is implemented by calling the object rendering module of the augmented reality platform Vuforia.
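Assuming the guide trajectory f is available as a dense point sample (for example from the spline sketch above), the normal/abnormal decision might look as follows; as a by-product the function returns the nearest-point vector used later as $\vec{e}$ for the correction speed.

```python
import numpy as np

def check_tolerance(p, f_samples, d):
    """Classify a marker position against the fault-tolerance band of f.

    p: marker center; f_samples: (n, 3) dense samples of the guide trajectory;
    d: training error tolerance. Returns (inside, e), where e is the distance
    vector from p to the nearest point on f (later reused for v_calibrate).
    """
    diffs = np.asarray(f_samples, float) - np.asarray(p, float)
    dists = np.linalg.norm(diffs, axis=1)
    i = int(np.argmin(dists))
    e = diffs[i]                 # vector from p to the nearest point on f
    return dists[i] <= d, e
```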
(6) A virtual compass T is constructed and rendered.
The distance vector from the center of the marker M to the nearest point on the guide trajectory f is recorded as $\vec{e}$. The correction speed is then:

$\vec{v}_{calibrate} = \vec{v}_{target} + \mu\,\vec{e}$

where $\vec{v}_{target}$ is the target speed, $\vec{e}$ is the distance vector from the center o of the marker M to the nearest point on f, and $\mu$ is a scale factor.
Using the actual speed $\vec{v}_{reality}$ and the correction speed $\vec{v}_{calibrate}$, the parameters of the virtual compass T are calculated; the speed error amount $\delta$ and the speed error angle $\theta$ are respectively:

$\delta = \bigl|\, \|\vec{v}_{reality}\| - \|\vec{v}_{calibrate}\| \,\bigr|$

$\theta = \arccos \dfrac{\vec{v}_{reality} \cdot \vec{v}_{calibrate}}{\|\vec{v}_{reality}\|\, \|\vec{v}_{calibrate}\|}$
As shown in FIG. 5 (a), (b) and (c), with the center of the marker M as the circle center and the speed error angle $\theta$ as the central angle, concentric sectors $T_{reality}$ and $T_{calibrate}$ are drawn with radii equal to the magnitudes of the actual speed $\vec{v}_{reality}$ and of the motion-correction speed $\vec{v}_{calibrate}$, respectively, forming the virtual compass T.
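The following sketch gathers the compass quantities into one function. The definition of $\delta$ as a difference of magnitudes and the example value of $\mu$ are assumptions consistent with the text above, since the patent figures for these formulas did not survive extraction.

```python
import numpy as np

def compass_parameters(v_reality, v_target, e, mu=0.5):
    """Compute the virtual-compass quantities of step (6).

    v_reality: actual velocity from the difference method.
    v_target:  target velocity of the sliding point P.
    e:         distance vector from the marker center to the nearest point on f.
    mu:        scale factor (the value here is an arbitrary assumption).
    Returns the correction velocity, the speed error amount delta and the
    speed error angle theta.
    """
    v_reality = np.asarray(v_reality, float)
    v_calibrate = np.asarray(v_target, float) + mu * np.asarray(e, float)
    r1, r2 = np.linalg.norm(v_reality), np.linalg.norm(v_calibrate)
    delta = abs(r1 - r2)
    cos_theta = np.dot(v_reality, v_calibrate) / (r1 * r2 + 1e-12)
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    # The sectors T_reality and T_calibrate share the central angle theta
    # and use r1 and r2 as their radii.
    return v_calibrate, delta, theta
```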
At rendering time, the actual-speed $\vec{v}_{reality}$ arrow and the sector $T_{reality}$ are rendered in one color, and the correction-speed $\vec{v}_{calibrate}$ arrow and the sector $T_{calibrate}$ in another color for distinction. The size of the augmented reality virtual compass T can be enlarged or reduced by multiplying by a scale factor. The rendering of the augmented reality objects is implemented by calling the object rendering module of the augmented reality platform Vuforia.
In order to guarantee the simultaneous visibility of the augmented reality virtual compass T and the marker M, the virtual compass T is kept in a transparent state:

$r_{pixel}(u,v) = (1-\beta)\, r_{video}(u,v) + \beta\, r_T(u,v)$

$g_{pixel}(u,v) = (1-\beta)\, g_{video}(u,v) + \beta\, g_T(u,v)$

$b_{pixel}(u,v) = (1-\beta)\, b_{video}(u,v) + \beta\, b_T(u,v)$

where $(r_{video}(u,v), g_{video}(u,v), b_{video}(u,v))$ is the pixel color of the video, $(r_T(u,v), g_T(u,v), b_T(u,v))$ is the pixel color of the augmented reality virtual compass T, $(r_{pixel}(u,v), g_{pixel}(u,v), b_{pixel}(u,v))$ is the color of the pixel obtained by superimposing T on the video, $(u,v)$ are the screen coordinates of the pixel, and $\beta$ is the opacity.
The rendering of the augmented reality objects is implemented by calling the object rendering function of the augmented reality platform Vuforia, and the result is displayed on the display screen.
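The blending equations above transcribe directly into a Python sketch; the array shapes and the boolean compass mask are illustrative assumptions.

```python
import numpy as np

def blend_compass(video_rgb, compass_rgb, compass_mask, beta=0.5):
    """Alpha-blend the virtual compass T over the camera frame.

    video_rgb:    (H, W, 3) float array, camera frame.
    compass_rgb:  (H, W, 3) float array, rendered compass layer.
    compass_mask: (H, W) bool array, True where T has pixels.
    beta:         opacity from the blending equations above.
    """
    out = video_rgb.copy()
    out[compass_mask] = ((1 - beta) * video_rgb[compass_mask]
                         + beta * compass_rgb[compass_mask])
    return out
```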
(7) After training is finished, the training is evaluated according to the closeness of f and g and the smoothness of g.
The complete guide trajectory f and the actual trajectory g are plotted to understand the tracking during training, and the changes of the speed error amount $\delta$ and the speed error angle $\theta$ are plotted to understand the movement speed during training. For an arbitrary position point (x, y, z): if it lies on the guide trajectory f, let F(x, y, z) = 1; if its distance d' to f exceeds the error tolerance d, let F(x, y, z) = 0; otherwise let F(x, y, z) = 1 - d'/d. For an arbitrary position point (x, y, z), if it lies on the actual trajectory g, let G(x, y, z) = 1, otherwise G(x, y, z) = 0. The closeness of the actual trajectory to the guide trajectory is evaluated using a cross-correlation function:
$R_{GF} = \dfrac{\iiint G(x,y,z)\, F(x,y,z)\, dx\, dy\, dz}{\sqrt{\iiint G^2(x,y,z)\, dx\, dy\, dz}\; \sqrt{\iiint F^2(x,y,z)\, dx\, dy\, dz}}$
where G(x, y, z) records the actual trajectory and F(x, y, z) records the guide trajectory. Since F(x, y, z) takes the error tolerance into account, $R_{GF}$ is, compared with the conventional curve cross-correlation function, an improved fuzzy cross-correlation function that can tolerate motion deviations within the allowable range. In addition to the fuzzy cross-correlation function describing the motion difference, the Dirichlet energy is used to measure the smoothness of the actual motion trajectory:
$E(g) = \int \bigl\| g'(\tau) \bigr\|^2\, d\tau$

where $g'(\tau)$ is the derivative of the actual trajectory curve.
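A discrete sketch of both evaluation measures; the grid sampling of F and G and the normalization of the cross-correlation are assumptions, since the patent leaves the exact forms implicit.

```python
import numpy as np

def fuzzy_cross_correlation(F, G):
    """Normalized correlation of the graded tolerance field F and track field G.

    F and G are sampled on the same spatial grid (F graded by 1 - d'/d inside
    the tolerance band, G binary); the normalization is one common choice.
    """
    num = float(np.sum(F * G))
    den = np.sqrt(np.sum(F ** 2)) * np.sqrt(np.sum(G ** 2)) + 1e-12
    return num / den

def dirichlet_energy(g_samples, dtau):
    """Discrete Dirichlet energy of the sampled trajectory g; lower is smoother."""
    dg = np.diff(np.asarray(g_samples, float), axis=0) / dtau  # g'(tau)
    return float(np.sum(np.linalg.norm(dg, axis=1) ** 2) * dtau)
```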
The tracking, feedback, evaluation and other functions above are equally suitable for training two-dimensional movements. The three-dimensional interaction method is reduced in dimension, for example by setting z = 0, and the closeness of the trajectories is evaluated by integrating in the two-dimensional space, as in the usage sketch below.
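A hypothetical two-dimensional usage of the evaluation sketch above; F2d and G2d stand for the tolerance and trajectory fields rasterized on an (x, y) grid and must be filled by the caller.

```python
import numpy as np

# Hypothetical 2D usage of fuzzy_cross_correlation from the sketch above:
# the fields are sampled on an (x, y) grid with z fixed to 0.
F2d = np.zeros((200, 200))  # graded tolerance field of the 2D guide trajectory
G2d = np.zeros((200, 200))  # binary occupancy field of the 2D actual trajectory
# ... fill F2d and G2d by rasterizing f and g onto the grid ...
score = fuzzy_cross_correlation(F2d, G2d)
```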

Claims (6)

1. An augmented reality motion visualization method for active rehabilitation training, comprising the steps of:
step 1, a trainer designs a guide trajectory f and sets on f a sliding point P that moves along f; the speed of P along f during the simulated movement is recorded as the training target speed $\vec{v}_{target}$;
Step 2, drawing fault tolerance ranges on two sides of the guide track f by taking equidistant lines of distances d on the two sides of the guide track f as boundaries of the fault tolerance ranges, wherein d is the training error tolerance;
step 3, performing AR rendering of f and $\vec{v}_{target}$ from step 1 and of the fault-tolerance range;
step 4, acquiring training images of the patient wearing the marker M in real time with the camera, and locating M in each frame of training image to obtain the position of the center of M and the actual trajectory g;
step 5, calculating, by a difference method, the movement speed of the marker M along g as the actual speed $\vec{v}_{reality}$;
Step 6, for g and g in step 4
Figure FDA0002627354110000014
Performing AR rendering, wherein starting points of f and g are the same;
step 7, performing corner detection on the marker M in the training image and matching it against a pre-stored corner-pose database of M to obtain the real-time pose of M, forming the pose matrix $\{R_{Marker}(t) \mid p_{Marker}(t)\}$, where $R_{Marker}(t)$ and $p_{Marker}(t)$ are respectively the attitude of M and the position of the center of M at time t;
step 8, constructing the virtual compass T from $\vec{v}_{reality}$ and $\vec{v}_{calibrate}$: with the center of the marker M as the circle center of T and the speed error angle $\theta$ as the central angle, making concentric sectors $T_{reality}$ and $T_{calibrate}$ whose radii are respectively the magnitudes of the actual speed $\vec{v}_{reality}$ and of the motion-correction speed $\vec{v}_{calibrate}$, where the motion-correction speed $\vec{v}_{calibrate} = \vec{v}_{target} + \mu\,\vec{e}$, $\vec{e}$ is the distance vector from the center of M to the nearest point on f, and $\mu$ is a scale factor;
step 9, controlling the pose of the virtual compass T according to $\{R_{Marker}(t) \mid p_{Marker}(t)\}$: the attitude of T is $R_T(t) = R_{Marker}(t)$ and the position of T is $p_T(t) = p_{Marker}(t)$;
Step 10, performing AR rendering on the virtual compass T;
and step 11, after the training of the patient is finished, evaluating the training according to the similarity degree of f and g and the smoothness degree of g.
2. The augmented reality motion visualization method for active rehabilitation training according to claim 1, wherein in step 8 the speed error amount of the virtual compass T is $\delta = \bigl|\,\|\vec{v}_{reality}\| - \|\vec{v}_{calibrate}\|\,\bigr|$.
3. The augmented reality motion visualization method for active rehabilitation training according to claim 1, wherein in step 10 the actual-speed $\vec{v}_{reality}$ arrow and the sector $T_{reality}$ are rendered in one color, and the correction-speed $\vec{v}_{calibrate}$ arrow and the sector $T_{calibrate}$ in another color for distinction.
4. The augmented reality motion visualization method for active rehabilitation training according to claim 1, wherein the virtual compass T is in a transparent state.
5. The augmented reality motion visualization method for active rehabilitation training according to claim 1, wherein in step 11 a cross-correlation function is used to evaluate the closeness of f and g, and the Dirichlet energy is used to evaluate the smoothness of g.
6. An augmented reality motion visualization system for active rehabilitation training, characterized by comprising a training design module, a tracking module, a calculation module, a rendering module, an evaluation module, a camera and a display;
a training design module, for the trainer to design the guide trajectory f, the target speed $\vec{v}_{target}$ and the fault-tolerance range;
a tracking module, for processing the training images collected by the camera in real time to obtain the attitude and central position of the marker M and the actual trajectory g, and for calculating the actual speed $\vec{v}_{reality}$ by a difference method;
a calculation module, for constructing the virtual compass T from $\vec{v}_{reality}$ and $\vec{v}_{calibrate}$;
a rendering module, for performing AR rendering of f, $\vec{v}_{target}$, the fault-tolerance range, g and $\vec{v}_{reality}$;
an evaluation module, for evaluating the training according to the closeness of f and g and the smoothness of g;
and a display, for displaying the AR rendering result.
CN202010800941.9A 2020-08-11 2020-08-11 Exercise amount visualization method and system for active rehabilitation training Active CN112102917B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010800941.9A CN112102917B (en) 2020-08-11 2020-08-11 Exercise amount visualization method and system for active rehabilitation training

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010800941.9A CN112102917B (en) 2020-08-11 2020-08-11 Exercise amount visualization method and system for active rehabilitation training

Publications (2)

Publication Number Publication Date
CN112102917A (en) 2020-12-18
CN112102917B (en) 2022-11-04

Family

ID=73753502

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010800941.9A Active CN112102917B (en) 2020-08-11 2020-08-11 Exercise amount visualization method and system for active rehabilitation training

Country Status (1)

Country Link
CN (1) CN112102917B (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107331220A (en) * 2017-09-01 2017-11-07 国网辽宁省电力有限公司锦州供电公司 Transformer O&M simulation training system and method based on augmented reality
CN110739040A (en) * 2019-08-29 2020-01-31 北京邮电大学 rehabilitation evaluation and training system for upper and lower limbs
CN110890140A (en) * 2019-11-25 2020-03-17 上海交通大学 Virtual reality-based autism rehabilitation training and capability assessment system and method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Li Huijun et al., "Virtual environment modeling technology for upper-limb rehabilitation training robots" (上肢康复训练机器人虚拟环境建模技术), Chinese Journal of Tissue Engineering Research and Clinical Rehabilitation (中国组织工程研究与临床康复), No. 44, 2007-11-04; full text *
Chen Donglin et al., "Research on an HTC VIVE-based virtual training system for upper-limb rehabilitation" (基于HTC VIVE的上肢康复虚拟训练系统研究), Journal of Beijing Institute of Fashion Technology (Natural Science Edition) (北京服装学院学报(自然科学版)), No. 2, 2018-06-30; full text *

Also Published As

Publication number Publication date
CN112102917A (en) 2020-12-18

Similar Documents

Publication Publication Date Title
Reitmayr et al. Going out: robust model-based tracking for outdoor augmented reality
Riecke et al. Visual homing is possible without landmarks: A path integration study in virtual reality
Alves et al. Comparing spatial and mobile augmented reality for guiding assembling procedures with task validation
CN110246159A (en) The 3D target motion analysis method of view-based access control model and radar information fusion
CN106840148A (en) Wearable positioning and path guide method based on binocular camera under outdoor work environment
CN110825234A (en) Projection type augmented reality tracking display method and system for industrial scene
CN107958479A (en) A kind of mobile terminal 3D faces augmented reality implementation method
EP1435280A2 (en) A method and a system for programming an industrial robot
CN110388919B (en) Three-dimensional model positioning method based on feature map and inertial measurement in augmented reality
JP7164045B2 (en) Skeleton Recognition Method, Skeleton Recognition Program and Skeleton Recognition System
CN111178170B (en) Gesture recognition method and electronic equipment
Zhang et al. Monocular visual traffic surveillance: A review
Wang et al. Immersive human–computer interactive virtual environment using large-scale display system
Marie et al. Visual servoing on the generalized voronoi diagram using an omnidirectional camera
CN112102917B (en) Exercise amount visualization method and system for active rehabilitation training
Jiang et al. Application of virtual reality human-computer interaction technology based on the sensor in English teaching
Dang et al. Path-analysis-based reinforcement learning algorithm for imitation filming
Kohlbrecher et al. Grid-based occupancy mapping and automatic gaze control for soccer playing humanoid robots
Hadfield et al. Object assembly guidance in child-robot interaction using RGB-D based 3D tracking
CN115760919A (en) Single-person motion image summarization method based on key action characteristics and position information
CN115994944A (en) Three-dimensional key point prediction method, training method and related equipment
Roil et al. Exploring Possible Applications of ORB SLAM 2 in Education, Healthcare, and Industry: Insights into the Challenges, Features, and Effects
Meisner et al. An object state estimation for the peg transfer task in computer-guided surgical training
WO2021021085A1 (en) Modification of projected structured light based on identified points within captured image
Mitsuhashi et al. Motion curved surface analysis and composite for skill succession using RGBD camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant