CN113197571A - Gait training assessment method and device based on radar - Google Patents

Gait training assessment method and device based on radar

Info

Publication number
CN113197571A
CN113197571A (Application CN202110497172.4A)
Authority
CN
China
Prior art keywords
training
gait
foot
radar
virtual object
Prior art date
Legal status
Pending
Application number
CN202110497172.4A
Other languages
Chinese (zh)
Inventor
王俊华
王兆坤
Current Assignee
Guangzhou Xiaokang Medical Technology Co ltd
Original Assignee
Guangzhou Xiaokang Medical Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Xiaokang Medical Technology Co., Ltd.
Priority to CN202110497172.4A
Publication of CN113197571A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/112 Gait analysis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1121 Determining geometric values, e.g. centre of rotation or angular range of movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1126 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 22/00 Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements
    • A63B 22/02 Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with movable endless bands, e.g. treadmills
    • A63B 22/0235 Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with movable endless bands, e.g. treadmills driven by a motor
    • A63B 22/0242 Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with movable endless bands, e.g. treadmills driven by a motor with speed variation
    • A63B 22/025 Exercising apparatus specially adapted for conditioning the cardio-vascular system, for training agility or co-ordination of movements with movable endless bands, e.g. treadmills driven by a motor with speed variation electrically, e.g. D.C. motors with variable speed control
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 23/00 Exercising apparatus specially adapted for particular parts of the body
    • A63B 23/035 Exercising apparatus specially adapted for particular parts of the body for limbs, i.e. upper or lower limbs, e.g. simultaneously
    • A63B 23/04 Exercising apparatus specially adapted for particular parts of the body for limbs, i.e. upper or lower limbs, e.g. simultaneously for lower limbs
    • A63B 23/0405 Exercising apparatus specially adapted for particular parts of the body for limbs, i.e. upper or lower limbs, e.g. simultaneously for lower limbs involving a bending of the knee and hip joints simultaneously
    • A63B 23/0458 Step exercisers without moving parts
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 23/00 Exercising apparatus specially adapted for particular parts of the body
    • A63B 23/035 Exercising apparatus specially adapted for particular parts of the body for limbs, i.e. upper or lower limbs, e.g. simultaneously
    • A63B 23/04 Exercising apparatus specially adapted for particular parts of the body for limbs, i.e. upper or lower limbs, e.g. simultaneously for lower limbs
    • A63B 23/0405 Exercising apparatus specially adapted for particular parts of the body for limbs, i.e. upper or lower limbs, e.g. simultaneously for lower limbs involving a bending of the knee and hip joints simultaneously
    • A63B 23/0464 Walk exercisers without moving parts
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B 71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B 71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B 71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B 71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B 71/0622 Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B 71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B 71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B 71/0622 Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B 2071/0638 Displaying moving images of recorded environment, e.g. virtual environment

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Rehabilitation Therapy (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Cardiology (AREA)
  • Vascular Medicine (AREA)
  • Rehabilitation Tools (AREA)

Abstract

The invention discloses a radar-based gait training evaluation method comprising the following steps. S00: build the operating environment of the training evaluation system. S10: project a planar virtual object with a projector. S20: collect foot detection data of the gait trainer with a radar and record the acquisition time. S30: according to the training mode selected in the training evaluation system, determine the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object and generate real-time associated data. S40: output the evaluation result for the gait trainer. With this radar-based gait training evaluation method, the position of the actual stepping foot of the gait trainer is obtained by radar and its coordinates are checked for association with the position of the projected virtual object. Several interactive gait training schemes are supported, which makes gait training more engaging and varied and allows targeted, accurate gait rehabilitation training.

Description

Gait training assessment method and device based on radar
Technical Field
The invention relates to the field of training evaluation methods, in particular to a gait training evaluation method based on radar and a gait training evaluation device based on radar.
Background
Clinically, conditions such as muscle spasm, muscle weakness, joint stiffness, brain injury and spinal cord injury readily lead to movement disorders. Walking training is an important means of helping such patients recover their walking function. At present, walking rehabilitation training in hospitals mainly consists of family members or therapists physically supporting the patient during walking training. This approach not only requires manpower to report on the training, but the support also restricts the patient's movement and reduces the variety of the gait. Hospitals are therefore gradually adopting gait training equipment to assist patients with walking training, so that walking function can be restored more quickly.
Chinese patent document CN111603171A discloses a gait parameter determination method and system for lower-limb rehabilitation. The method comprises: collecting raw data from a lidar installed on a rehabilitation robot; determining the time and position at which the patient's foot lands and lifts from the raw data; and calculating the patient's gait parameters from that time and position, the gait parameters including step length, step width, swing time, standing time and support time. That patent only discloses acquiring a patient's gait parameters by lidar; it does not address accurate radar detection of the patient in association with a virtual projection.
Disclosure of Invention
To overcome the shortcomings of the prior art, the invention aims to provide a radar-based gait training evaluation method that uses radar to determine the association between the actual footsteps of a gait trainer and a virtual projection, and that supports several interactive gait training schemes, thereby making gait training more engaging and varied and enabling targeted, accurate gait rehabilitation training.
To achieve this aim, the invention adopts the following technical solution:
The invention provides a radar-based gait training evaluation method implemented according to the following steps:
S00: build the operating environment of a training evaluation system. In step S00, the training evaluation system is electrically connected to a radar gait rehabilitation training platform and controls that platform. Building the operating environment comprises: after the training evaluation system is started, the gait trainer selects either a long-distance linear walking training mode or a training pool stepping training mode through the human-computer interaction module of the training evaluation system; when the training evaluation system is configured in the long-distance linear walking training mode, the radar gait rehabilitation training platform is configured as a moving treadmill belt, and when the training evaluation system is configured in the training pool stepping training mode, the radar gait rehabilitation training platform is configured as a stationary treadmill belt or flat ground.
S10: after the gait trainer has selected a training mode, a projector projects the planar virtual object onto the radar gait rehabilitation training platform, and the preset coordinate data of the planar virtual object are stored in a storage module of the training evaluation system. In step S10, the projection range of the planar virtual object is configured to lie within a preset range centered on the standing position of the gait trainer; the long-distance linear walking training mode and the training pool stepping training mode both have different difficulty levels, and the training evaluation system adjusts the difficulty levels of the different training modes by adjusting the distance between the planar virtual object and the gait trainer, the moving speed of the planar virtual object and the number of planar virtual objects projected.
S20: acquire, by radar, instantaneous foot detection data of the gait trainer stepping on the radar gait rehabilitation training platform and record the acquisition time; the two-dimensional point set corresponding to the foot detection data is denoted {P_ij, (i = 0, 1, 2, …, n)}, and the acquisition time is recorded as t_j.
S30: a calculation module in the training evaluation system calculates the actual stepping-foot contour and the actual stepping-foot center point of the gait trainer from the foot detection data and generates real-time calculation data; in the different training modes, the training evaluation system determines the positional relationship between the real-time calculation data and the virtual object contour and virtual object standard center point of the planar virtual object and generates real-time associated data.
S40: an evaluation module in the training evaluation system inputs the real-time calculation data and the real-time associated data into a gait evaluation algorithm and, after calculation, outputs the evaluation result of the gait trainer.
In a preferred technical solution of the invention, in step S30 the actual stepping-foot center point is calculated as follows:
S301: obtain the edge point set of the actual stepping-foot contour from the two-dimensional point set; T_ij is a subset of P_ij, and the edge point set of the actual stepping-foot contour is denoted {T_ij, (i = 0, 1, 2, … n)};
S302: the calculation module fits and simplifies the edge point set of the actual stepping-foot contour to generate a simplified actual stepping-foot polygon and a simplified actual stepping-foot vertex set {Q_ij, (i = 0, 1, 2, … n)};
S303: calculate the footprint area from the simplified actual stepping-foot vertex set; the simplified vertex set contains n vertices, and the actual stepping-foot area A is calculated as

A = \frac{1}{2} \left| \sum_{i=0}^{n-1} \left( x_i y_{i+1} - x_{i+1} y_i \right) \right|

where x_i and y_i are the x and y coordinates of the i-th point of Q_ij, x_{i+1} and y_{i+1} are the x and y coordinates of the (i+1)-th point of Q_ij, and the vertex indices are taken cyclically (x_n = x_0, y_n = y_0);
S304: calculate the actual stepping-foot center point (C_x, C_y); the actual stepping-foot center point is calculated as

C_x = \frac{1}{6A} \sum_{i=0}^{n-1} \left( x_i + x_{i+1} \right) \left( x_i y_{i+1} - x_{i+1} y_i \right)

C_y = \frac{1}{6A} \sum_{i=0}^{n-1} \left( y_i + y_{i+1} \right) \left( x_i y_{i+1} - x_{i+1} y_i \right)

with x_i, y_i, x_{i+1} and y_{i+1} defined as in step S303.
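For illustration, a minimal numerical sketch of steps S303 and S304 follows, assuming the simplified vertices {Q_ij} of one acquisition are supplied as an ordered N x 2 array; the function name is illustrative and not part of the patent.

```python
# Sketch of steps S303-S304: polygon area (shoelace formula) and centroid of
# the simplified stepping-foot polygon. `vertices` holds the ordered set {Q_ij}.
import numpy as np

def footprint_area_and_center(vertices: np.ndarray):
    """Return (A, (C_x, C_y)) for an N x 2 array of polygon vertices."""
    x, y = vertices[:, 0], vertices[:, 1]
    x1, y1 = np.roll(x, -1), np.roll(y, -1)   # (i+1)-th point, indices cyclic
    cross = x * y1 - x1 * y                   # x_i*y_{i+1} - x_{i+1}*y_i
    a_signed = 0.5 * np.sum(cross)
    cx = np.sum((x + x1) * cross) / (6.0 * a_signed)
    cy = np.sum((y + y1) * cross) / (6.0 * a_signed)
    return abs(a_signed), (cx, cy)
```

Taking the absolute value of the signed area keeps A positive whether the vertices run clockwise or counter-clockwise, while the center point is unaffected because numerator and denominator change sign together.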
In a preferred technical solution of the invention, in step S301 the edge point set {T_ij, (i = 0, 1, 2, … n)} of the actual stepping-foot contour is obtained from the two-dimensional point set by a discriminant formula (rendered as an image in the original publication and not reproduced here), in which j denotes the j-th acquisition, i.e. the acquisition at time t_j; x^T_(i-1)j and y^T_(i-1)j denote the x and y coordinates of the (i-1)-th point of the subset T_ij in the j-th acquisition; x^T_ij and y^T_ij denote the x and y coordinates of the i-th point of T_ij in the j-th acquisition; and x^P_ij and y^P_ij denote the x and y coordinates of the i-th point of the subset P_ij in the j-th acquisition.
In a preferred technical solution of the invention, in step S30 the step distance is calculated as

F_d = | x_{c1} - x_{c2} |

where x_{c1} is the coordinate, in the X direction of the radar gait rehabilitation training platform, of one foot of the gait trainer in a given acquisition, and x_{c2} is the coordinate of the other foot of the gait trainer in the same acquisition;
in step S30, the step length is calculated as

F_l = | y_{c1} - ( v_j \Delta t + y_{c2} ) |

where y_{c1} is the coordinate, in the Y direction of the radar gait rehabilitation training platform, of one foot of the gait trainer in a given acquisition, y_{c2} is the coordinate of the other foot of the gait trainer in the same acquisition, v_j is the moving speed of the radar gait rehabilitation training platform, and the acquisition time is recorded as t_j with j = 0, 1, 2, … m.
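A direct sketch of these two formulas follows, assuming (as an interpretation not stated explicitly above) that Δt is the time elapsed between the acquisitions of the two feet; the function names are illustrative.

```python
# Sketch of the step-distance and step-length formulas of step S30. The term
# v_j * delta_t compensates for treadmill belt motion between the two foot
# acquisitions; delta_t is an assumed interpretation of the patent's Δt.
def step_distance(x_c1: float, x_c2: float) -> float:
    """F_d = |x_c1 - x_c2| (X direction of the platform)."""
    return abs(x_c1 - x_c2)

def step_length(y_c1: float, y_c2: float, v_j: float, delta_t: float) -> float:
    """F_l = |y_c1 - (v_j * delta_t + y_c2)| (Y direction, belt-compensated)."""
    return abs(y_c1 - (v_j * delta_t + y_c2))
```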
In the long-distance linear walking training mode, the planar virtual object is configured as a standard footprint, and the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object is configured as whether the standard footprint is stepped on and, when it is stepped on, the coincidence rate with the standard footprint.
The real-time associated data are calculated as follows:
S305: preset the virtual object standard center point for a given acquisition; the virtual object standard center point is denoted C_S(x_cs, y_cs);
S306: compare the distance between the actual stepping-foot center point (C_x, C_y) in a given acquisition and the virtual object standard center point C_S(x_cs, y_cs) with a threshold d_m. If the distance is less than d_m, the standard footprint is considered stepped on; if the distance is greater than or equal to d_m, it is considered not stepped on. The comparison formula is

\sqrt{(C_x - x_{cs})^2 + (C_y - y_{cs})^2} < d_m

S307: calculate the coincidence rate P_S of the standard footprint when it is stepped on; the calculation formula for P_S is given as an image in the original publication and is not reproduced here.
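A minimal sketch of the threshold test of step S306 follows; because the coincidence-rate formula of step S307 is not reproduced in this text, only the step-on decision is shown, and the function name is illustrative.

```python
# Sketch of steps S305-S306 in the long-distance mode: the standard footprint is
# considered stepped on when the distance between the actual stepping-foot
# center (C_x, C_y) and the standard center C_S(x_cs, y_cs) is below d_m.
import math

def stepped_on(c: tuple, c_s: tuple, d_m: float) -> bool:
    """True if sqrt((C_x - x_cs)^2 + (C_y - y_cs)^2) < d_m."""
    return math.hypot(c[0] - c_s[0], c[1] - c_s[1]) < d_m
```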
In a preferred technical solution of the invention, in the training pool stepping training mode the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object is configured either as whether the left or right foot of the gait trainer can avoid the moving planar virtual object, or as whether the planar virtual object can be stepped on, and the real-time associated data are calculated as follows:
S305: preset the virtual object standard center point for a given acquisition; the virtual object standard center point is denoted C_O(x_co, y_co);
S306: compare the distance between the actual stepping-foot center point (C_x, C_y) in a given acquisition and the virtual object standard center point C_O(x_co, y_co) with a threshold d_mo. If the distance is greater than d_mo, the avoidance is considered successful (or, in the stepping game, the object is considered not stepped on); if the distance is less than or equal to d_mo, the avoidance is considered unsuccessful (or the object is considered stepped on). The comparison formula is

\sqrt{(C_x - x_{co})^2 + (C_y - y_{co})^2} > d_{mo}

In a preferred technical solution of the invention, the radar gait rehabilitation training platform is configured as a stationary treadmill belt or flat ground. When the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object is configured as whether the left or right foot of the gait trainer can avoid the moving planar virtual object, a training area is provided on the stationary treadmill belt or flat ground and the projector projects guiding standing footprints in the training area; and/or, when the positional relationship is configured as whether the planar virtual object can be stepped on, a training area and a peripheral area are provided on the stationary treadmill belt or flat ground, the training area lies in the middle of the peripheral area, and the projector projects guiding standing footprints in the training area.
In a preferred technical solution of the invention, in step S40, in the long-distance linear walking training mode, the linear walking training score in the gait evaluation algorithm is calculated as follows:
S401: preset a standard step length value, a standard step distance value and a full score; the preset standard step length value is denoted F_lS, the preset standard step distance value is denoted F_wS, and the full score is 100 points;
S402: carry out one complete walking training session in the long-distance linear walking training mode; the storage module in the training evaluation system records, for each step of the gait trainer during the linear walking training, the coincidence rate P_S of the standard footprint stepped on, the step length F_l and the step distance F_w, and the calculation module in the training evaluation system calculates the average coincidence rate \bar{P}_S, the average step length \bar{F}_l and the average step distance \bar{F}_w;
S403: calculate the gait Score of the gait trainer from \bar{P}_S, \bar{F}_l and \bar{F}_w; the calculation formula for the Score is rendered as an image in the original publication and is not reproduced here.
In a preferred technical solution of the invention, in step S40, in the training pool stepping training mode, the stepping training score in the gait evaluation algorithm is calculated as follows:
S404: preset weighting coefficients d_n for n different difficulty levels;
S405: carry out one complete stepping training session in the training pool stepping training mode, record the difficulty levels used, and have the storage module in the training evaluation system record the total number of successful avoidances or the total number of steps C_n for each difficulty level during the session; the stepping training score is then

Score = d_1 C_1 + d_2 C_2 + … + d_n C_n
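The weighted sum of step S405 can be sketched directly; the function name is illustrative.

```python
# Sketch of Score = d_1*C_1 + ... + d_n*C_n, where d_n are the difficulty
# weighting coefficients and C_n the per-level counts of successful
# avoidances or successful steps.
def stepping_score(weights, counts) -> float:
    """Weighted sum over difficulty levels; weights and counts align by index."""
    if len(weights) != len(counts):
        raise ValueError("one weight per difficulty level is required")
    return sum(d * c for d, c in zip(weights, counts))
```

For example, with weights (1, 1.5, 2) and counts (12, 8, 5) the score is 12 + 12 + 10 = 34.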
The invention also provides a radar-based gait training evaluation device for the above radar-based gait training evaluation method, comprising:
a radar gait rehabilitation training platform, used for the rehabilitation training of the gait trainer;
a radar, used for collecting foot detection data of the gait trainer;
a projector, used for projecting the planar virtual object onto the radar gait rehabilitation training platform; and
a training evaluation system, used for guiding the gait trainer through rehabilitation training in different rehabilitation modes, the training evaluation system comprising a storage module, a calculation module, an evaluation module and a human-computer interaction module, wherein the storage module stores the preset coordinate data of the planar virtual object, the calculation module generates the real-time calculation data and the real-time associated data, the evaluation module calculates the evaluation result from the output of the calculation module, and the human-computer interaction module receives control instructions and displays the evaluation result.
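Purely as an illustration of how these four modules might be grouped in software (the patent defines them functionally and does not prescribe any particular data structures or APIs), a hypothetical sketch:

```python
# Hypothetical structural sketch only: all names below are illustrative and are
# not defined by the patent.
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

@dataclass
class StorageModule:
    virtual_object_coords: List[Tuple[float, float]] = field(default_factory=list)
    step_records: List[Dict[str, float]] = field(default_factory=list)

@dataclass
class TrainingEvaluationSystem:
    storage: StorageModule
    calculate: Callable[[list], dict]         # calculation module: real-time data
    evaluate: Callable[[dict, dict], float]   # evaluation module: score/result
    hmi_display: Callable[[str], None]        # human-computer interaction module
```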
The invention has the beneficial effects that:
the gait training evaluation method based on the radar acquires foot detection data of a gait trainer through the radar, calculates data such as an actual stepping foot central point, step distance and step length based on the foot detection data, calculates the position relation between the data and virtual objects in different training modes and generates real-time associated data, thereby obtaining the association between the footprint data of a patient acquired by the radar and virtual projection data, further achieving the purpose of accurately calculating data such as the stepping rate or the obstacle avoidance success probability in a game and the like, and further providing accurate data basis for the evaluation result of the gait trainer. Meanwhile, the gait training assessment method can support game interaction in multiple training modes, and the gait richness of the patient is greatly improved.
Drawings
FIG. 1 is a flow chart of the radar-based gait training evaluation method provided in an embodiment of the invention;
FIG. 2 is a schematic illustration of a footprint provided in an embodiment of the invention;
FIG. 3 is a schematic diagram of the training area provided in the second embodiment of the invention;
FIG. 4 is a schematic diagram of the training area provided in the third embodiment of the invention;
FIG. 5 is a functional block diagram of the radar-based gait training evaluation device provided in an embodiment of the invention.
In the figures:
1. gait rehabilitation training platform; 2. radar; 3. projector; 4. training evaluation system; 41. storage module; 42. calculation module; 43. evaluation module; 44. human-computer interaction module.
Detailed Description
The technical solution of the invention is further explained below through specific embodiments in conjunction with the accompanying drawings.
Example one
As shown in FIG. 1, this embodiment provides a radar-based gait training evaluation method for collecting and evaluating a patient's gait rehabilitation data by means of the radar 2. In this embodiment the training evaluation system 4 is in the long-distance linear walking training mode, the gait rehabilitation training platform 1 is configured as a moving treadmill belt, the patient performs a walking game on the moving belt, and the planar virtual object is configured as a planar footprint. The method is implemented according to the following steps:
Step S00: build the operating environment of the training evaluation system 4. In step S00, the training evaluation system 4 is electrically connected to the gait rehabilitation training platform 1 and controls it. Building the operating environment comprises: after the training evaluation system 4 is started, the gait trainer selects either the long-distance linear walking training mode or the training pool stepping training mode through the human-computer interaction module 44 of the training evaluation system 4; when the long-distance linear walking training mode is selected, the gait rehabilitation training platform 1 is configured as a moving treadmill belt, and when the training pool stepping training mode is selected, the gait rehabilitation training platform 1 is configured as a stationary treadmill belt or flat ground. The radar-based gait training evaluation method is carried out by a radar-based gait training evaluation device comprising the gait rehabilitation training platform 1, the radar 2, the projector 3 and the training evaluation system 4. The radar 2 is arranged directly in front of the gait rehabilitation training platform, its detection area is the upper surface of the platform and lies very close to that surface, and the projector 3 is located at one side of the platform and projects onto the upper surface of the platform. The drive module of the gait rehabilitation training platform 1, the radar 2 and the projector 3 all exchange data with the training evaluation system 4 and are controlled by the computer contained in the training evaluation system 4.
Step S10: the projector 3 projects a planar virtual object onto the gait rehabilitation training platform 1 according to the training mode selected by the gait trainer, and the preset coordinate data of the planar virtual object are stored in the storage module 41 of the training evaluation system 4. In step S10, the projection range of the planar virtual object is configured to lie within a preset range centered on the standing position of the gait trainer. Both the long-distance linear walking training mode and the training pool stepping training mode have different difficulty levels, and the training evaluation system 4 adjusts the difficulty by adjusting the distance between the planar virtual object and the gait trainer, the moving speed of the planar virtual object and the number of planar virtual objects presented. The difficulty levels of the different training modes can be combined with the diagnosis and treatment plan to achieve the desired therapeutic effect, for example by scheduling the training material from easy to difficult so that the quality of treatment improves gradually. The computer contains the storage module 41, and since the relative position of the projector 3 and the gait rehabilitation training platform is fixed once the radar-based gait training evaluation device has been set up, the preset coordinate data of the projected planar virtual object can either be preset or be entered through the human-computer interaction module 44 of the training evaluation system 4.
Step S20: collect the foot detection data of the gait trainer with the radar 2 and record the acquisition time. The two-dimensional point set corresponding to the foot detection data is denoted {P_ij, (i = 0, 1, 2, …, n)} and the acquisition time is recorded as t_j. Since the scanning period of the radar 2 is normally fixed, the index j of t_j can also be read as the acquisition count: with a fixed acquisition period, knowing the acquisition count is equivalent to knowing the acquisition time t_j. When the patient performs walking training on the gait rehabilitation training platform 1 and the scanning period of the radar 2 has been set appropriately, the radar 2 continuously scans the upper surface of the rehabilitation training platform to determine whether the feet of the gait trainer are present; when the feet are detected, the foot detection data of the gait trainer are collected and the acquisition time or scan count is recorded.
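A minimal sketch of this acquisition loop is given below; the patent does not specify a radar API, so the `read_radar_frame` callable and its point format are hypothetical stand-ins.

```python
# Illustrative sketch of step S20 only: the radar interface is assumed, not
# taken from the patent.
import time
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class FootFrame:
    t_j: float                          # acquisition time t_j (seconds)
    points: List[Tuple[float, float]]   # two-dimensional point set {P_ij}

def collect_frames(read_radar_frame, scan_period_s: float, n_frames: int) -> List[FootFrame]:
    """Scan the platform surface at a fixed period and keep frames where feet appear."""
    frames, t0 = [], time.monotonic()
    for _ in range(n_frames):
        points = read_radar_frame()     # hypothetical: returns [(x, y), ...] or []
        if points:                      # feet detected on the platform surface
            frames.append(FootFrame(t_j=time.monotonic() - t0, points=points))
        time.sleep(scan_period_s)
    return frames
```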
Step S30: the calculation module 42 in the training evaluation system 4 calculates the actual stepping-foot contour and the actual stepping-foot center point of the gait trainer from the foot detection data and generates real-time calculation data, and, according to the training mode selected in the training evaluation system 4, determines the positional relationship between the real-time calculation data and the virtual object contour and virtual object standard center point of the planar virtual object and generates real-time associated data. Different training modes usually require different real-time associated data because their evaluation criteria differ: the long-distance linear walking training mode needs to verify whether the patient's actual stepping foot lands on the planar virtual object, so it needs the actual stepping-foot center point, the step distance and the step length, whereas the training pool stepping training mode mainly needs the virtual object contour of the planar virtual object to judge whether data such as the step-on rate have been reached.
To improve the accuracy of the actual stepping-foot center point, further, in step S30 the actual stepping-foot center point is calculated as follows:
Step S301: obtain the edge point set of the actual stepping-foot contour from the two-dimensional point set. T_ij is a subset of P_ij, and the edge point set of the actual stepping-foot contour is denoted {T_ij, (i = 0, 1, 2, … n)}. As shown in FIG. 2, the edge point set of the actual stepping-foot contour outlines the footprint of the patient's actual stepping foot; the footprint determines the contour of the actual stepping foot, and the subset T_ij is obtained by the discriminant formula given below.
Step S302: fit and simplify the edge point set of the actual stepping-foot contour to generate a simplified actual stepping-foot polygon and a simplified actual stepping-foot vertex set {Q_ij, (i = 0, 1, 2, … n)}. The vertex set {Q_ij, (i = 0, 1, 2, … n)} is obtained as follows: the footprint edge is fitted and simplified into a polygon with the Douglas-Peucker algorithm, and the simplified actual stepping-foot polygon is generated from the result, so that the actual stepping-foot area and the actual stepping-foot center point can be calculated.
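A compact sketch of the Douglas-Peucker simplification used in step S302 follows, assuming the edge points are already ordered along the contour and with a user-chosen tolerance epsilon; where OpenCV is available, cv2.approxPolyDP provides an equivalent routine.

```python
# Sketch of step S302: Douglas-Peucker polyline simplification with numpy.
import numpy as np

def _perp_dist(pts: np.ndarray, a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Perpendicular distance of each row of pts to the line through a and b."""
    ab = b - a
    norm = np.hypot(ab[0], ab[1])
    if norm == 0.0:
        return np.hypot(pts[:, 0] - a[0], pts[:, 1] - a[1])
    return np.abs(ab[0] * (pts[:, 1] - a[1]) - ab[1] * (pts[:, 0] - a[0])) / norm

def douglas_peucker(points: np.ndarray, epsilon: float) -> np.ndarray:
    """Simplify an ordered 2D contour (N x 2) into a reduced vertex set {Q_ij}."""
    if len(points) < 3:
        return points
    d = _perp_dist(points[1:-1], points[0], points[-1])
    k = int(np.argmax(d)) + 1                      # index of the farthest point
    if d[k - 1] > epsilon:
        left = douglas_peucker(points[:k + 1], epsilon)
        right = douglas_peucker(points[k:], epsilon)
        return np.vstack([left[:-1], right])       # drop the duplicated split point
    return np.vstack([points[0], points[-1]])      # within tolerance: keep endpoints
```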
Step S303: calculate the footprint area from the simplified actual stepping-foot polygon and the simplified vertex set {Q_ij, (i = 0, 1, 2, … n)}. The simplified vertex set contains n vertices, and the actual stepping-foot area A is calculated by the shoelace formula

A = \frac{1}{2} \left| \sum_{i=0}^{n-1} \left( x_i y_{i+1} - x_{i+1} y_i \right) \right|

where x_i and y_i are the x and y coordinates of the i-th point of Q_ij, x_{i+1} and y_{i+1} are the x and y coordinates of the (i+1)-th point of Q_ij, and the vertex indices are taken cyclically (x_n = x_0, y_n = y_0).
Step S304: calculate the actual stepping-foot center point (C_x, C_y). The calculation formulas of the actual stepping-foot center point are

C_x = \frac{1}{6A} \sum_{i=0}^{n-1} \left( x_i + x_{i+1} \right) \left( x_i y_{i+1} - x_{i+1} y_i \right)

C_y = \frac{1}{6A} \sum_{i=0}^{n-1} \left( y_i + y_{i+1} \right) \left( x_i y_{i+1} - x_{i+1} y_i \right)

with the coordinates defined as in step S303.
To obtain an accurate edge point set {T_ij, (i = 0, 1, 2, … n)} of the actual stepping-foot contour from the two-dimensional point set, further preferably, in step S301 the edge point set is obtained from the two-dimensional point set {P_ij, (i = 0, 1, 2, …, n)} by a discriminant formula (rendered as an image in the original publication and not reproduced here), in which j denotes the j-th acquisition, i.e. the acquisition at time t_j; x^T_(i-1)j and y^T_(i-1)j denote the x and y coordinates of the (i-1)-th point of the subset T_ij in the j-th acquisition; x^T_ij and y^T_ij denote the x and y coordinates of the i-th point of T_ij in the j-th acquisition; and x^P_ij and y^P_ij denote the x and y coordinates of the i-th point of the subset P_ij in the j-th acquisition. That is, the subset of the two-dimensional point set P_ij that satisfies the discriminant formula is the edge point set {T_ij, (i = 0, 1, 2, … n)} of the actual stepping-foot contour.
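Since the discriminant itself is not reproduced here, no attempt is made to implement it; purely as a stand-in for illustration (and explicitly not the patent's edge criterion), the sketch below obtains an ordered boundary point set from the raw points {P_ij} with a convex hull, which the Douglas-Peucker step can then simplify.

```python
# Stand-in only (not the patent's discriminant): ordered boundary extraction
# via Andrew's monotone-chain convex hull, to illustrate the data flow.
import numpy as np

def boundary_points(points: np.ndarray) -> np.ndarray:
    """Return hull vertices of an N x 2 point set, ordered counter-clockwise."""
    pts = np.unique(points, axis=0)           # unique rows, lexicographically sorted
    if len(pts) < 3:
        return pts

    def cross(o, a, b):                       # z-component of (a - o) x (b - o)
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    def build(seq):
        chain = []
        for p in seq:
            while len(chain) >= 2 and cross(chain[-2], chain[-1], p) <= 0:
                chain.pop()
            chain.append(p)
        return chain

    lower, upper = build(pts), build(pts[::-1])
    return np.array(lower[:-1] + upper[:-1])  # closed ring without duplicate ends
```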
To calculate the step distance: in the radar-based gait training evaluation method provided in this embodiment, the step distance is the distance, in the X direction of the gait rehabilitation training platform 1, from one foot of the gait trainer to the other in a given acquisition. In step S30 the step distance is therefore calculated as

F_d = | x_{c1} - x_{c2} |

where x_{c1} is the X-direction coordinate (abscissa) of one foot of the gait trainer on the gait rehabilitation training platform 1 in a given acquisition, and x_{c2} is the X-direction coordinate (abscissa) of the other foot in the same acquisition.
To calculate the step length: the step length in this embodiment refers to the Y-direction coordinates of one foot relative to the other on the gait rehabilitation training platform 1 in a given acquisition. If the gait rehabilitation training platform 1 is moving, its moving speed v_j must be taken into account; if the platform also moved in the X direction, that speed would have to be considered as well, but X-direction motion is not normally present in the long-distance linear walking training mode. In step S30 the step length is calculated as

F_l = | y_{c1} - ( v_j \Delta t + y_{c2} ) |

where y_{c1} is the Y-direction coordinate of one foot of the gait trainer in a given acquisition, y_{c2} is the Y-direction coordinate of the other foot in the same acquisition, v_j is the moving speed of the gait rehabilitation training platform 1, and the acquisition time is recorded as t_j with j = 0, 1, 2, … n.
To determine accurately whether the patient has stepped on the planar footprint and to obtain the coincidence rate, further, the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object is configured as whether the planar virtual object is stepped on and, when it is stepped on, the coincidence rate with the standard footprint. The real-time associated data are calculated as follows:
Step S305: preset the virtual object standard center point for a given acquisition. The virtual object standard center point is denoted C_S(x_cs, y_cs).
Step S306: compare the distance between the actual stepping-foot center point (C_x, C_y) of the patient's actual stepping foot in a given acquisition and the standard center point C_S(x_cs, y_cs) with the threshold d_m. If the distance is less than d_m, the footprint is considered stepped on; the system then changes the colour of the screen of the human-computer interaction module 44, or of a marker on the screen, so that the gait trainer learns the result of each step in time and gains confidence, and the footprint is recorded as a standard footprint stepped on. If the distance is greater than or equal to d_m, the footprint is considered not stepped on.
The comparison formula is

\sqrt{(C_x - x_{cs})^2 + (C_y - y_{cs})^2} < d_m

Step S307: calculate the coincidence rate P_S of the standard footprint stepped on in a given acquisition; P_S is an important index for evaluating the stepping quality of the gait trainer. The calculation formula for P_S is rendered as an image in the original publication and is not reproduced here.
step S40: the evaluation module 43 in the training evaluation system 4 inputs the real-time calculation data and the real-time correlation data into the gait evaluation algorithm, and outputs the evaluation result of the gait trainer after statistical calculation. Different training modes refer to different training modes, the gait evaluation algorithm can output the score of a certain game completed by a patient after counting and analyzing the real-time calculation data and the real-time associated data, some general-purpose games can be selectively set for the patient to play, and the richness of the gait of the patient can be greatly improved after the patient finishes different types of games. In the long-distance straight-line walking training mode, the evaluation result is based on the comprehensive evaluation of the actual stepping foot outline, the actual stepping foot central point, the coincidence rate of standard footprints in stepping and the straight-line walking training score of a gait trainer each time. In the training pool stepping training mode, outputting the avoiding or stepping times of the gait trainer to the moving plane virtual object each time in the training pool stepping training scheme, carrying out comprehensive evaluation on the performance under different difficulty levels, and finally outputting the evaluation result of the gait trainer after the comprehensive evaluation.
To evaluate the patient's game result accurately, in step S40 the gait evaluation algorithm is implemented as follows.
In step S40, in the long-distance linear walking training mode, the linear walking training score is calculated as follows:
S401: preset a standard step length value, a standard step distance value and a full score; the preset standard step length value is denoted F_lS, the preset standard step distance value is denoted F_wS, and the full score is 100 points. The standard step length value can be determined from data such as the person's height, weight, shoulder width or waist circumference, and the full score can be set by the user.
S402: carry out one complete walking training session in the long-distance linear walking training mode. The storage module 41 in the training evaluation system 4 records, for each step of the gait trainer during the linear walking training, the coincidence rate P_S of the standard footprint stepped on, the step length F_l and the step distance F_w, and the calculation module 42 in the training evaluation system 4 calculates the average coincidence rate \bar{P}_S, the average step length \bar{F}_l and the average step distance \bar{F}_w.
S403: calculate the gait Score of the gait trainer from \bar{P}_S, \bar{F}_l and \bar{F}_w; the calculation formula for the Score is given as an image in the original publication and is not reproduced here.
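The aggregation of step S402 can be sketched as follows; because the final Score formula is not reproduced in this text, it is passed in as a caller-supplied function of the three averages rather than hard-coded.

```python
# Sketch of steps S402-S403: aggregate the per-step records of one complete
# walking session into the averages used by the score; the combining formula
# itself is left to the caller.
from statistics import mean
from typing import Callable, Dict, List

def straight_line_score(steps: List[Dict[str, float]],
                        score_fn: Callable[[float, float, float], float]) -> float:
    """steps: one dict per step with keys 'P_S', 'F_l', 'F_w' (recorded in S402)."""
    p_bar = mean(s["P_S"] for s in steps)
    l_bar = mean(s["F_l"] for s in steps)
    w_bar = mean(s["F_w"] for s in steps)
    return score_fn(p_bar, l_bar, w_bar)
```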
example two
The radar-based gait training evaluation method provided in this embodiment collects and evaluates a patient's gait rehabilitation data by means of the radar 2. In this embodiment the training evaluation system 4 is in an obstacle-avoidance training mode, the gait rehabilitation training platform 1 is configured as flat ground or a stationary treadmill belt, the patient plays an obstacle-avoidance game on the flat ground or stationary belt, and the planar virtual object is configured as a planar obstacle, for example a bomb or a flying ball. As shown in FIG. 3, the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object is configured as whether the left or right foot of the gait trainer can avoid the moving planar virtual object. Preferably, a training area is provided on the stationary treadmill belt or flat ground, and the projector 3 projects guiding standing footprints in the training area. The guiding standing footprints include left-and-right standing virtual footprints, upper-left and lower-right standing virtual footprints, and upper-right and lower-left standing virtual footprints; they guide the patient into position in preparation for obstacle avoidance, and the different standing postures help the gait trainer rehabilitate different body parts in a targeted way.
To calculate accurately whether the obstacle avoidance succeeds, further, the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object is configured as whether the planar virtual object can be avoided, and the real-time associated data for this positional relationship are calculated as follows:
S305: preset the virtual object standard center point for a given acquisition. The virtual object standard center point is denoted C_O(x_co, y_co).
S306: compare the distance between the actual stepping-foot center point (C_x, C_y) in a given acquisition and the virtual object standard center point C_O(x_co, y_co) with the threshold d_mo. If the distance is greater than d_mo, the avoidance is considered successful; if the distance is less than or equal to d_mo, the avoidance is considered unsuccessful. The comparison formula is

\sqrt{(C_x - x_{co})^2 + (C_y - y_{co})^2} > d_{mo}
In step S40, in the training pool stepping training mode, the stepping training score in the gait evaluation algorithm is calculated as follows:
S404: preset weighting coefficients d_n for n different difficulty levels. For example, three difficulty levels with weighting coefficients d_1 to d_3 may be preset for easy, medium and difficult; the weighting factors of the different levels are given in the table below.

Serial number | Difficulty level | Speed value | Weighting factor (d_n)
1 | Easy | Configurable by the rehabilitation therapist | 1
2 | Medium | Configurable by the rehabilitation therapist | 1.5
3 | Difficult | Configurable by the rehabilitation therapist | 2

S405: carry out one complete stepping training session in the training pool stepping training mode, record the difficulty levels used, and have the storage module 41 in the training evaluation system 4 record the total number of successful avoidances or the total number of steps C_n for each difficulty level during the session; the score is then

Score = d_1 C_1 + d_2 C_2 + … + d_n C_n
Example three
In this embodiment the training evaluation system 4 is in the training pool stepping training mode, the gait rehabilitation training platform 1 is configured as flat ground or a stationary treadmill belt, the patient plays a stepping game on the flat ground or stationary belt, and the planar virtual object is configured as a moving planar object, for example a balloon or a whack-a-mole style target. As shown in FIG. 4, the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object is configured as whether the planar virtual object can be stepped on. Preferably, a training area and a peripheral area are provided on the stationary treadmill belt or flat ground, the training area lies in the middle of the peripheral area, and the projector 3 projects guiding standing footprints in the training area. The guiding standing footprints include left-and-right standing virtual footprints, upper-left and lower-right standing virtual footprints, and upper-right and lower-left standing virtual footprints; they guide the patient into position in preparation for the stepping game, so that different body parts can be rehabilitated.
To determine accurately whether a step is effective, further, the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object is configured as whether the planar virtual object can be stepped on, and the real-time associated data for this positional relationship are calculated as follows:
S305: preset the virtual object standard center point for a given acquisition; the virtual object standard center point is denoted C_O(x_co, y_co);
S306: compare the distance between the actual stepping-foot center point (C_x, C_y) in a given acquisition and the virtual object standard center point C_O(x_co, y_co) with the threshold d_mo. If the distance is greater than d_mo, the object is considered not stepped on; if the distance is less than or equal to d_mo, the object is considered stepped on. The comparison formula is

\sqrt{(C_x - x_{co})^2 + (C_y - y_{co})^2} \le d_{mo}
In step S40, in the training pool stepping training mode, the stepping training score in the gait evaluation algorithm is calculated as follows:
S404: preset weighting coefficients d_n for n different difficulty levels. In the stepping game of the training pool stepping training mode, two factors are taken into account in the weighting: the position code and the dwell time. The position code is determined by the codes of the training area and the peripheral area, and the dwell time is the time, in seconds, between the planar virtual object appearing and disappearing. Setting the position codes of the different areas and the dwell time of the planar virtual object makes it possible to evaluate the patient's reaction ability and rehabilitation effect well.

Serial number | Difficulty level | Position code | Dwell time | Weighting factor (d_n)
1 | Easy | A, B, C or D | 20 seconds | 1
2 | Medium | E, F, G or H | 20 seconds | 1.5
3 | Difficult | E, F, G or H | 10 seconds | 2
4 | Medium | A, B, C or D | 15 seconds | 1.5
5 | Difficult | A, B, C or D | 5 seconds | 2

S405: carry out one complete stepping training session in the training pool stepping training mode, record the difficulty levels used, and have the storage module 41 in the training evaluation system 4 record the total number of successful avoidances or the total number of steps C_n for each difficulty level during the session; the score is then

Score = d_1 C_1 + d_2 C_2 + … + d_n C_n
Example four
As shown in FIG. 5, the radar-based gait training evaluation device used for the above radar-based gait training evaluation method comprises a gait rehabilitation training platform 1, a radar 2, a projector 3 and a training evaluation system 4. The gait rehabilitation training platform 1 is used for the rehabilitation training of the gait trainer, the radar 2 is used for collecting foot detection data of the gait trainer, the projector 3 is used for projecting the planar virtual object onto the gait rehabilitation training platform 1, and the training evaluation system 4 is used for guiding the gait trainer through rehabilitation training in different rehabilitation modes. The training evaluation system 4 comprises a storage module 41, a calculation module 42, an evaluation module 43 and a human-computer interaction module 44; the storage module 41 stores the preset coordinate data of the planar virtual object, the calculation module 42 generates the real-time calculation data and the real-time associated data, the evaluation module 43 calculates the evaluation result from the output of the calculation module 42, and the human-computer interaction module 44 receives control instructions and displays the evaluation result. The radar 2 is arranged directly in front of the gait rehabilitation training platform; its detection area is the upper surface of the rehabilitation training platform and lies very close to that surface, which protects the accuracy of data acquisition. The projector 3 is located at one side of the gait rehabilitation training platform and projects onto the upper surface of the platform.
In use, once the operating environment of the training evaluation system 4 based on the gait rehabilitation training platform 1 has been built, a training mode is selected through the human-computer interaction module 44. The projector 3 then projects a planar virtual object onto the gait rehabilitation training platform 1 according to the selected training mode, and the preset coordinate data of the planar virtual object are stored in the storage module 41 of the training evaluation system 4. The radar 2 collects the foot detection data of the gait trainer and records the acquisition time; the calculation module 42 in the training evaluation system 4 selectively calculates one or more of the actual stepping-foot center point, the step distance and the step length of the gait trainer from the foot detection data and generates real-time calculation data, and, according to the training mode selected in the training evaluation system 4, determines the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object and generates real-time associated data. The evaluation module 43 in the training evaluation system 4 inputs the real-time calculation data and the real-time associated data into the gait evaluation algorithm for statistical calculation, outputs the evaluation result of the gait trainer and displays it through the human-computer interaction module 44, so that the gait trainer can quickly and intuitively see the comprehensive evaluation. The training evaluation system 4 also allows the user to turn the real-time step distance and step length data into a chart in order to analyse the patient's stepping stability further.
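Tying the earlier sketches together, one evaluation cycle for a single acquisition might look like the following; the function names come from the previous hypothetical sketches, and the threshold and tolerance values are arbitrary examples.

```python
# End-to-end sketch for one radar acquisition, reusing the earlier hypothetical
# sketches (boundary_points, douglas_peucker, footprint_area_and_center,
# stepped_on) and the FootFrame record from the step S20 sketch.
import numpy as np

def evaluate_frame(frame, target_center, d_m: float, epsilon: float = 0.005):
    """Contour -> simplified polygon -> center point -> step-on decision."""
    pts = np.asarray(frame.points, dtype=float)
    polygon = douglas_peucker(boundary_points(pts), epsilon)   # steps S301-S302
    if len(polygon) < 3:
        return None                                            # too few points this scan
    area, center = footprint_area_and_center(polygon)          # steps S303-S304
    return {"t_j": frame.t_j, "area": area, "center": center,
            "stepped_on": stepped_on(center, target_center, d_m)}  # steps S305-S306
```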
While the invention has been described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the spirit and scope of the invention. The present invention is not to be limited by the specific embodiments disclosed herein, and other embodiments that fall within the scope of the claims of the present application are intended to be within the scope of the present invention.

Claims (10)

1. A gait training evaluation method based on radar is characterized by comprising the following steps:
S00: building the operating environment of a training evaluation system; in step S00, the training evaluation system is electrically connected to a radar gait rehabilitation training platform, and the radar gait rehabilitation training platform is controlled by the training evaluation system; the step of setting up the operating environment includes: after the training evaluation system is started, a gait trainer selects either a long-distance straight-line walking training mode or a training pool stepping training mode through a human-computer interaction module of the training evaluation system; when the training evaluation system is configured in the long-distance straight-line walking training mode, the radar gait rehabilitation training platform is configured as a moving treadmill belt, and when the training evaluation system is configured in the training pool stepping training mode, the radar gait rehabilitation training platform is configured as a stationary treadmill belt or flat ground;
S10: after the gait trainer selects a training mode, a projector projects the planar virtual object onto the radar gait rehabilitation training platform, and the preset coordinate data of the planar virtual object are stored in a storage module of the training evaluation system; in step S10, the projection range of the planar virtual object is configured to lie within a preset coordinate range centered on the standing position of the gait trainer; the long-distance straight-line walking training mode and the training pool stepping training mode each have different difficulty levels, and the training evaluation system adjusts the difficulty level of each training mode by adjusting the distance between the planar virtual object and the gait trainer, the moving speed of the planar virtual object, and the number of planar virtual objects projected;
S20: acquiring instantaneous foot detection data of the gait trainer stepping on the radar gait rehabilitation training platform through a radar and recording the acquisition time; wherein the two-dimensional point set corresponding to the foot detection data is denoted {P_ij, (i = 0, 1, 2, …, n)}, and the acquisition time is recorded as t_j;
S30: a calculation module in the training evaluation system calculates the actual stepping foot outline and the actual stepping foot center point of the gait trainer from the foot detection data and generates real-time calculation data, and, depending on the training mode, the training evaluation system judges and calculates the positional relationship between the real-time calculation data and the virtual object outline and virtual object standard center point of the planar virtual object, and generates real-time associated data;
S40: an evaluation module in the training evaluation system inputs the real-time calculation data and the real-time associated data into a gait evaluation algorithm for calculation and then outputs the evaluation result of the gait trainer.
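As a rough illustration of the configuration choices made in steps S00 and S10 (training mode, platform type, and the difficulty parameters of target distance, target speed and target count), the sketch below groups them into one Python structure. The field names and the scaling factors used in harder() are assumptions for illustration, not values taken from the disclosure.

from dataclasses import dataclass


@dataclass
class TrainingConfig:
    """Illustrative grouping of the parameters exposed by steps S00 and S10."""
    mode: str                  # "straight_line_walking" or "pool_stepping" (names are assumptions)
    platform: str              # "moving_belt" for straight-line walking, "static_belt_or_floor" otherwise
    target_distance_m: float   # distance between the projected virtual object and the trainer
    target_speed_mps: float    # moving speed of the planar virtual object
    target_count: int          # number of planar virtual objects projected per round


def harder(cfg: TrainingConfig) -> TrainingConfig:
    """One plausible way of raising the difficulty level: push the targets farther
    away, move them faster, and project more of them (factors chosen arbitrarily)."""
    return TrainingConfig(cfg.mode, cfg.platform,
                          cfg.target_distance_m * 1.2,
                          cfg.target_speed_mps * 1.2,
                          cfg.target_count + 1)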
2. The radar-based gait training assessment method according to claim 1, characterized in that:
in step S30, the step of calculating the actual stepping foot center point is as follows:
S301: acquiring the edge point set of the actual stepping foot outline from the two-dimensional point set; wherein T_ij is taken from P_ij, and the edge point set of the actual stepping foot outline is denoted {T_ij, (i = 0, 1, 2, …, n)};
S302: the computation module fits and simplifies the set of edge points of the actual step profile,generating a simplified actual stepping foot polygon and a simplified set of actual stepping foot vertices { Q }ij,(i=0,1,2,…n)};
S303: calculating the footprint area according to the simplified set of the vertexes of the actual stepping foot; wherein the simplified set of vertices of the actual stepping foot comprises n vertices, and the calculation formula of the actual stepping foot area a is as follows:
Figure FDA0003054844670000021
wherein x isiIs QijX coordinate, y of the ith pointi+1Is QijY coordinate, x of the (i + 1) th pointi+1Is QijX coordinate, y of the (i + 1) th pointiIs QijThe y coordinate of the ith point;
s304: calculating the actual central point of the stepping foot (C)x,Cy) (ii) a Wherein, the calculation formula of the actual step central point is as follows:
Figure FDA0003054844670000022
Figure FDA0003054844670000023
wherein x isiIs QijX coordinate, y of the ith pointi+1Is QijY coordinate, x of the (i + 1) th pointi+1Is QijX coordinate, y of the (i + 1) th pointiIs QijThe y-coordinate of the ith point.
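The area and center-point formulas of claim 2 match the standard polygon (shoelace) area and centroid formulas over the simplified vertex set {Q_ij}; the sketch below implements those standard formulas under that assumption.

def foot_area_and_center(vertices):
    """Shoelace-style area and centroid of the simplified stepping-foot polygon.

    `vertices` is the simplified vertex set {Q_ij} as (x, y) tuples in order
    around the polygon.
    """
    n = len(vertices)
    if n < 3:
        raise ValueError("a polygon needs at least three vertices")

    cross_sum = 0.0   # running sum of x_i*y_{i+1} - x_{i+1}*y_i
    cx_sum = 0.0
    cy_sum = 0.0
    for i in range(n):
        x_i, y_i = vertices[i]
        x_next, y_next = vertices[(i + 1) % n]   # wrap around to close the polygon
        cross = x_i * y_next - x_next * y_i
        cross_sum += cross
        cx_sum += (x_i + x_next) * cross
        cy_sum += (y_i + y_next) * cross

    if cross_sum == 0:
        raise ValueError("degenerate polygon")

    area = abs(cross_sum) / 2.0                  # actual stepping-foot area A
    cx = cx_sum / (3.0 * cross_sum)              # centre point C_x
    cy = cy_sum / (3.0 * cross_sum)              # centre point C_y
    return area, (cx, cy)

For example, foot_area_and_center([(0, 0), (10, 0), (10, 25), (0, 25)]) returns (250.0, (5.0, 12.5)), i.e. the area and centre of a 10 by 25 rectangle.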
3. The radar-based gait training assessment method according to claim 2, characterized in that:
in step S301, the edge point set {T_ij, (i = 0, 1, 2, …, n)} of the actual stepping foot outline is obtained from the two-dimensional point set according to the following discrimination formula:
Figure FDA0003054844670000031
where j denotes the j-th acquisition, i.e. the acquisition carried out at time t_j, and the symbols in the formula denote, respectively, the x and y coordinates of the (i-1)-th point of the subset T_ij in the j-th acquisition, the x and y coordinates of the i-th point of the subset T_ij in the j-th acquisition, and the x and y coordinates of the i-th point of the subset P_ij in the j-th acquisition.
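The discrimination formula of claim 3 is published only as an equation image, so the exact edge-point test cannot be reproduced from this text. As a stand-in, the sketch below extracts a boundary point set from one radar acquisition using a convex hull (Andrew's monotone chain); this is one plausible substitute for obtaining {T_ij} from {P_ij}, not the claimed formula.

def boundary_points(points):
    """Stand-in edge-point extraction for a single radar acquisition.

    `points` is an iterable of (x, y) tuples (the raw set {P_ij}); the return
    value is the convex-hull boundary in counter-clockwise order.
    """
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                                   # build the lower hull
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):                         # build the upper hull
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)

    return lower[:-1] + upper[:-1]                  # concatenate, dropping duplicate endpoints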
4. The radar-based gait training assessment method according to claim 2, characterized in that:
in step S30, the calculation formula of the step distance is as follows:
F_d = |x_c1 - x_c2|
wherein x_c1 denotes the coordinate, in the X direction of the radar gait rehabilitation training platform, of one foot of the gait trainer in a certain acquisition, and x_c2 denotes the coordinate, in the X direction of the radar gait rehabilitation training platform, of the other foot of the gait trainer in a certain acquisition;
in step S30, the calculation formula of the step length is as follows:
F_l = |y_c1 - (v_j·Δt + y_c2)|
wherein y_c1 denotes the coordinate, in the Y direction of the radar gait rehabilitation training platform, of one foot of the gait trainer in a certain acquisition, y_c2 denotes the coordinate, in the Y direction of the radar gait rehabilitation training platform, of the other foot of the gait trainer in a certain acquisition, v_j is the moving speed of the radar gait rehabilitation training platform and Δt is the time interval between the two acquisitions; the acquisition time is recorded as t_j, j = 0, 1, 2, …, m.
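The step distance and step length formulas of claim 4 translate directly into code; the coordinates, belt speed and inter-acquisition interval Δt are passed in as plain numbers here for illustration.

def step_distance(x_c1, x_c2):
    """Step distance F_d = |x_c1 - x_c2| between the two foot centre points (claim 4)."""
    return abs(x_c1 - x_c2)


def step_length(y_c1, y_c2, belt_speed, dt):
    """Step length F_l = |y_c1 - (v_j * dt + y_c2)|: the belt travel v_j * dt is added
    to the other foot's Y coordinate to compensate for platform movement between
    the two acquisitions."""
    return abs(y_c1 - (belt_speed * dt + y_c2))

For example, step_length(1.30, 0.65, belt_speed=0.8, dt=0.5) evaluates |1.30 - (0.8 * 0.5 + 0.65)| = 0.25.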
5. The radar-based gait training assessment method according to claim 2, characterized in that:
in the long-distance straight-line walking training mode, the planar virtual object is configured as a standard footprint, and the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object is configured as whether the standard footprint is stepped on and as the coincidence rate of the standard footprint when stepped on;
the real-time associated data is calculated by the following steps:
S305: presetting the virtual object standard center point for a certain acquisition; wherein the virtual object standard center point is denoted C_S(x_cs, y_cs);
S306: comparing the distance between the actual stepping foot center point (C_x, C_y) and the virtual object standard center point C_S(x_cs, y_cs) in a certain acquisition with the threshold d_m; if the distance between the actual stepping foot center point (C_x, C_y) and the virtual object standard center point C_S(x_cs, y_cs) is less than the threshold d_m, the standard footprint is marked as stepped on; if the distance between the actual stepping foot center point (C_x, C_y) and the virtual object standard center point C_S(x_cs, y_cs) is greater than or equal to the threshold d_m, the standard footprint is considered not stepped on; wherein the comparison formula is as follows:
√((C_x - x_cs)² + (C_y - y_cs)²) < d_m
S307: calculating the coincidence rate P_S of the standard footprint when stepped on; wherein the coincidence rate P_S is calculated according to the following formula:
Figure FDA0003054844670000042
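The threshold test of step S306 follows directly from the claim. The coincidence rate of step S307 is published only as an image, so the ratio used below (standard footprints hit divided by standard footprints projected) is an assumed stand-in for P_S.

import math


def footprint_hit(actual_center, standard_center, d_m):
    """Step S306: the standard footprint counts as stepped on when the actual
    stepping-foot centre point lies closer than d_m to the standard centre C_S."""
    (cx, cy), (xs, ys) = actual_center, standard_center
    return math.hypot(cx - xs, cy - ys) < d_m


def coincidence_rate(hits, total_footprints):
    """Assumed stand-in for the coincidence rate P_S of step S307: the fraction of
    projected standard footprints that were actually stepped on."""
    return hits / total_footprints if total_footprints else 0.0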
6. the radar-based gait training assessment method according to claim 4, characterized in that:
in the training pool stepping training mode, the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object is configured either as whether the left foot or right foot of the gait trainer can avoid the moving planar virtual object, or as whether the gait trainer can step on the planar virtual object, and the real-time associated data is calculated as follows:
S305: presetting the virtual object standard center point for a certain acquisition; wherein the virtual object standard center point is denoted C_O(x_co, y_co);
S306: comparing the distance between the actual stepping foot center point (C_x, C_y) and the virtual object standard center point C_O(x_co, y_co) in a certain acquisition with the threshold d_mo; if the distance between the actual stepping foot center point (C_x, C_y) and the virtual object standard center point C_O(x_co, y_co) is greater than the threshold d_mo, the avoidance or stepping is considered successful; if the distance between the actual stepping foot center point (C_x, C_y) and the virtual object standard center point C_O(x_co, y_co) is less than or equal to the threshold d_mo, the avoidance or stepping is considered unsuccessful; wherein the comparison formula is as follows:
√((C_x - x_co)² + (C_y - y_co)²) > d_mo
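Step S306 of claim 6 is the same kind of distance-versus-threshold test, written below exactly as claimed: success when the foot centre is farther than d_mo from the object centre. For the step-on variant one might in practice invert the comparison; the claim itself states a single '>' test for both cases.

import math


def avoidance_success(actual_center, object_center, d_mo):
    """Claim 6 comparison: avoidance (or stepping) counts as successful when the
    actual stepping-foot centre point is farther than threshold d_mo from the
    moving virtual object centre C_O, and unsuccessful otherwise."""
    (cx, cy), (xo, yo) = actual_center, object_center
    return math.hypot(cx - xo, cy - yo) > d_mo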
7. the radar-based gait training assessment method according to claim 6, characterized in that:
the radar gait rehabilitation training platform is configured as a stationary treadmill belt or flat ground;
when the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object is configured as whether the left foot or right foot of the gait trainer can avoid the moving planar virtual object, a training area is provided on the stationary treadmill belt or flat ground, and the projector projects a guiding standing footprint in the training area; and/or, when the positional relationship between the actual stepping foot of the gait trainer and the planar virtual object is configured as whether the planar virtual object can be stepped on, a training area and a peripheral area are provided on the stationary treadmill belt or flat ground, the training area being located in the middle of the peripheral area, and the projector projects a guiding standing footprint in the training area.
8. The radar-based gait training assessment method according to claim 5, characterized in that:
in step S40, in the long-distance straight-line walking training mode, the straight-line walking training score calculation process in the gait assessment algorithm is implemented as follows:
S401: presetting a standard step length value, a standard step distance value and a full score value; wherein the preset standard step length value is recorded as F_lS, the preset standard step distance value is recorded as F_wS, and the full score value is recorded as 100 points;
S402: carrying out one complete walking training in the long-distance straight-line walking training mode, wherein the storage module in the training evaluation system records, for each step of the gait trainer during the straight-line walking training, the coincidence rate P_S of the standard footprint when stepped on, the step length F_l and the step distance F_w, and the calculation module in the training evaluation system calculates the mean coincidence rate of the standard footprints when stepped on, the mean step length and the mean step distance;
S403: calculating the gait Score of the gait trainer from the mean coincidence rate of the standard footprints when stepped on, the mean step length and the mean step distance, wherein the calculation formula of the gait Score is as follows:
Figure FDA0003054844670000067
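The Score formula of step S403 is published only as an image. The sketch below keeps the claimed ingredients (the per-step coincidence rates, step lengths and step distances, their means, the preset standards F_lS and F_wS, and a 100-point full score) but combines them with assumed equal weights, so it illustrates the structure of the calculation rather than the claimed formula itself.

def straight_walk_score(coincidence_rates, step_lengths, step_widths,
                        std_step_length, std_step_width, full_score=100.0):
    """Assumed stand-in for the straight-line walking score of claim 8."""
    n = len(coincidence_rates)
    if n == 0 or len(step_lengths) != n or len(step_widths) != n:
        raise ValueError("the three per-step lists must be non-empty and equally long")

    mean_p = sum(coincidence_rates) / n      # mean coincidence rate (values in [0, 1])
    mean_l = sum(step_lengths) / n           # mean step length
    mean_w = sum(step_widths) / n            # mean step distance

    # Relative closeness of the measured means to the preset standards, clamped to [0, 1].
    length_term = max(0.0, 1.0 - abs(mean_l - std_step_length) / std_step_length)
    width_term = max(0.0, 1.0 - abs(mean_w - std_step_width) / std_step_width)

    # Equal weighting of the three terms is an assumption; the claimed formula is not reproduced here.
    return full_score * (mean_p + length_term + width_term) / 3.0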
9. the radar-based gait training assessment method according to claim 6, characterized in that:
in step S40, in the training pool step training mode, the step training score calculation process in the gait evaluation algorithm is implemented as follows:
S404: presetting weighting coefficients d_n for n different difficulty levels;
S405: carrying out one complete stepping training in the training pool stepping training mode and recording the different difficulty levels, wherein the storage module in the training evaluation system records the total number of successful avoidances or the total number of steps C_n corresponding to each difficulty level during the one complete stepping training of the gait trainer; the stepping training Score is then calculated as follows:
Score = d_1·C_1 + d_2·C_2 + … + d_n·C_n
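The weighted sum of claim 9 is straightforward to compute; the dictionary layout below is just one convenient representation of the per-level counts and weighting coefficients.

def stepping_score(counts_by_level, weights_by_level):
    """Claim 9 score: Score = d_1*C_1 + d_2*C_2 + ... + d_n*C_n, where C_n is the
    number of successful avoidances (or steps) at difficulty level n and d_n is the
    preset weighting coefficient for that level."""
    return sum(weights_by_level[level] * count
               for level, count in counts_by_level.items())

For example, stepping_score({1: 12, 2: 8}, {1: 1.0, 2: 1.5}) returns 24.0.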
10. A radar-based gait training evaluation device for use in the radar-based gait training evaluation method according to any one of claims 1 to 8, characterized by comprising:
a radar gait rehabilitation training platform, used for rehabilitation training of the gait trainer;
a radar, used for collecting foot detection data of the gait trainer;
a projector, used for projecting the planar virtual object onto the radar gait rehabilitation training platform; and
a training evaluation system, used for guiding the gait trainer through rehabilitation training in different rehabilitation modes, the training evaluation system comprising a storage module, a calculation module, an evaluation module and a human-computer interaction module, wherein the storage module is used for storing the preset coordinate data of the planar virtual object, the calculation module is used for generating the real-time calculation data and the real-time associated data, the evaluation module is used for calculating the evaluation result according to the output of the calculation module, and the human-computer interaction module is used for inputting control instructions and displaying the evaluation result.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110497172.4A CN113197571A (en) 2021-05-07 2021-05-07 Gait training assessment method and device based on radar

Publications (1)

Publication Number Publication Date
CN113197571A true CN113197571A (en) 2021-08-03

Family

ID=77029589

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110497172.4A Pending CN113197571A (en) 2021-05-07 2021-05-07 Gait training assessment method and device based on radar

Country Status (1)

Country Link
CN (1) CN113197571A (en)

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09168529A (en) * 1995-12-19 1997-06-30 Anima Kk Floor reaction force measuring device
CN101327126A (en) * 2008-07-23 2008-12-24 天津大学 Method for extracting morphologic characteristic of human body bare footprint feature
CN102038504A (en) * 2009-10-19 2011-05-04 上海理工大学 Back-supported and weight-reduced wave mode balance estimating and training method
US20160113584A1 (en) * 2013-05-30 2016-04-28 Baro Postural Instruments Srl System and method for detecting baropostural parameters
CN104298243A (en) * 2014-08-19 2015-01-21 北京理工大学 Humanoid robot uneven ground walking stability control method
CN106466219A (en) * 2015-08-17 2017-03-01 丰田自动车株式会社 Gait state determines that equipment, gait state determine method and ambulation training equipment
CN106095201A (en) * 2016-05-30 2016-11-09 安徽慧视金瞳科技有限公司 A kind of double-click detection method projecting interactive system
RU2645002C2 (en) * 2016-07-12 2018-02-15 Федеральное государственное бюджетное учреждение науки Сибирский федеральный научный центр агробиотехнологий Российской академии наук (СФНЦА РАН) Method for determining complex of parameters of cross section of objects of quasi-cylindrical form
KR20180079786A (en) * 2017-01-02 2018-07-11 한국생산기술연구원 Device for gait rehabilitation
CN107038424A (en) * 2017-04-20 2017-08-11 华中师范大学 A kind of gesture identification method
CN108037505A (en) * 2017-12-08 2018-05-15 吉林大学 A kind of night front vehicles detection method and system
CN108939511A (en) * 2018-07-18 2018-12-07 广州市三甲医疗信息产业有限公司 Four limbs recovery training method and system based on virtual reality
CN109589557A (en) * 2018-11-29 2019-04-09 广州晓康医疗科技有限公司 Based on reality environment tandem race rehabilitation training of upper limbs system and appraisal procedure
CN109758157A (en) * 2019-01-29 2019-05-17 广州晓康医疗科技有限公司 Gait rehabilitation training and estimating method and system based on augmented reality
WO2020253062A1 (en) * 2019-06-20 2020-12-24 平安科技(深圳)有限公司 Method and apparatus for detecting image border
CN111420348A (en) * 2019-12-24 2020-07-17 广州晓康医疗科技有限公司 Gait rehabilitation training instrument and using method thereof
CN111603171A (en) * 2020-06-03 2020-09-01 上海金矢机器人科技有限公司 Gait parameter determination method and system for lower limb rehabilitation
CN112617837A (en) * 2021-01-05 2021-04-09 悦动奇点(北京)健康科技有限公司 Method and device for evaluating endurance of lower limbs of human body

Non-Patent Citations (15)

* Cited by examiner, † Cited by third party
Title
LIU, BY (LIU, BOYANG) [1] ; FAN, CH (FAN, CHUNHUA) [1] ; CHEN, JW (CHEN, JIANWEI) [1] ; ZHOU, Y (ZHOU, YUN) [1] ; DONG, LH (DONG, : "Low temperature one-step synthesis of carbon co-encapsulated NiS2, NiS and S8 nanocrystals by electrophilic oxidation of nickelocene", 《MATERIALS LETTERS 》, 31 March 2015 (2015-03-31) *
QI, YC (QI, YOUCUN) [1] , [2] ; ZHANG, J (ZHANG, JIAN) [3] ; ZHANG, PF (ZHANG, PENGFEI) [1]: "A real-time automated convective and stratiform precipitation segregation algorithm in native radar coordinates", 《QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY 》, 31 October 2013 (2013-10-31) *
任娜, 张道军.: "基于空间推理及语义的图斑扣除线状地物面积关键算法及其在土地调查建库中的应用", 《安徽农业科学》, 31 December 2011 (2011-12-31) *
刘立超, 魏国粱, 张青松, 等.: "基于激光雷达的农业耕作微地貌测量装置设计与试验", 《农业机械学报》, 31 December 2019 (2019-12-31) *
古真杰: "RoboCup中型组机器人全景视觉硬件研究及设计", 《CNKI》, 15 June 2009 (2009-06-15) *
吴樊, 王超, 张红等.: "基于空间信息的SAR图像船只交通监测方法", 《遥感信息》, 31 December 2010 (2010-12-31) *
孙晓龙, 韩俊峰.: "一种基于三坐标相控阵雷达的点迹凝聚方法", 《现代导航》, 31 December 2020 (2020-12-31) *
彭瑞, 江瑞.: "关于地理实体编码设计和应用的研究", 《现代测绘》, 31 December 2012 (2012-12-31) *
王俊华, 代晶晶, 令天宇等.: "基于RS与GIS技术的西藏多龙矿集区生态环境监测研究", 《地质学报》, 31 December 2019 (2019-12-31) *
王俊华, 曾科学, 罗力等.: "步态康复机器人意向运动补偿控制策略", 《现代制造技术与装备》, 31 December 2016 (2016-12-31) *
王卓清, 李永豪, 郭元芳, 等.: "心功能不同时期患者的无创血流动力学检测分析", 《军事医学》, 31 December 2017 (2017-12-31) *
申思, 马劲松.: "GIS关系数据库SQL空间扩展算子的实现", 《计算机工程与应用》, 31 December 2005 (2005-12-31) *
薛亚昕.: "基于步态分析的下肢康复训练评估方法", 《CNKI》, 15 February 2020 (2020-02-15) *
雷道竖, 刘海波: "一种极坐标系下的激光雷达扫描匹配SLAM方法", 《中国电子科学研究院学报》, 31 December 2019 (2019-12-31) *
马炳镇.: "起伏地形下地面瞬变电磁法三维正演数值模拟研究", 《物探与化探》, 31 December 2018 (2018-12-31) *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113599773A (en) * 2021-09-22 2021-11-05 上海海压特智能科技有限公司 Gait rehabilitation training system and method based on rhythmic visual stimulation

Similar Documents

Publication Publication Date Title
CN109758157B (en) Gait rehabilitation training evaluation method and system based on augmented reality
CN109863535B (en) Motion recognition device, storage medium, and motion recognition method
CN108986884A (en) The training system and method that a kind of balanced rehabilitation and cognitive rehabilitation blend
CN107174255A (en) Three-dimensional gait information gathering and analysis method based on Kinect somatosensory technology
CN105451829A (en) Personal digital trainer for physiotheraputic and rehabilitative video games
CN105451827A (en) Rehabilitative posture and gesture recognition
US20060211462A1 (en) System and method for tracking and assessing movement skills in multidimensional space
US20020187846A1 (en) Interactive method and apparatus for tracking and analyzing a golf swing
US20160129335A1 (en) Report system for physiotherapeutic and rehabilitative video games
CN104598867A (en) Automatic evaluation method of human body action and dance scoring system
CN103227888B (en) A kind of based on empirical mode decomposition with the video stabilization method of multiple interpretational criteria
US20220392370A1 (en) Virtual reality-based ground gait training system and method
US20150157938A1 (en) Personal digital trainer for physiotheraputic and rehabilitative video games
CN105902273A (en) Hand function rehabilitation quantitative evaluation method based on hand ulnar deviation motion
CN113197571A (en) Gait training assessment method and device based on radar
Yasser et al. Smart coaching: Enhancing weightlifting and preventing injuries
CN112933581A (en) Sports action scoring method and device based on virtual reality technology
CN113663312A (en) Micro-inertia-based non-apparatus body-building action quality evaluation method
CN112568898A (en) Method, device and equipment for automatically evaluating injury risk and correcting motion of human body motion based on visual image
CN114307117A (en) Standing long jump result measuring method and device based on video
US9420963B2 (en) Apparatus and method for recognizing user's posture in horse-riding simulator
EP3179446A1 (en) Orientation estimation method, and orientation estimation device
CN103198297B (en) Based on the kinematic similarity assessment method of correlativity geometric properties
CN117503115A (en) Rehabilitation training system and training method for nerve injury
CN115985462A (en) Rehabilitation and intelligence-developing training system for children cerebral palsy

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination