CN110801233B - Human body gait monitoring method and device - Google Patents

Human body gait monitoring method and device

Info

Publication number
CN110801233B
Authority
CN
China
Prior art keywords
gait
key joint
trainer
joint point
monitored
Prior art date
Legal status
Active
Application number
CN201911072886.XA
Other languages
Chinese (zh)
Other versions
CN110801233A (en)
Inventor
杜跃斐
Current Assignee
Shanghai Electric Group Corp
Original Assignee
Shanghai Electric Group Corp
Priority date
Filing date
Publication date
Application filed by Shanghai Electric Group Corp
Priority to CN201911072886.XA
Publication of CN110801233A
Application granted
Publication of CN110801233B
Legal status: Active
Anticipated expiration

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 — Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 — Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 — Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/112 — Gait analysis

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Dentistry (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Physiology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Rehabilitation Tools (AREA)

Abstract

The invention discloses a human body gait monitoring method and device, which are intended to solve the problem that conventional human body gait monitoring requires the patient to wear a sensor, making rehabilitation training inconvenient. The method comprises the following steps: acquiring video frames of the lower limb movement of a trainer to be monitored, captured by an image acquisition device within a training time period, together with the acquisition time of each video frame; for each video frame, acquiring the two-dimensional pixel coordinates of each key joint point of the lower limb of the trainer to be monitored according to a preset deep learning training model; for each key joint point, determining the actual position coordinates of the key joint point according to its two-dimensional pixel coordinates and the position information of the image acquisition device; determining the gait cycle and gait information of each step of the trainer to be monitored according to the actual position coordinates of the key joint points and the acquisition times of the video frames; and comparing the gait information within the gait cycle with a preset gait standard value corresponding to the key joint point to obtain a comparison result.

Description

Human body gait monitoring method and device
Technical Field
The invention relates to the technical field of computer network communication, in particular to a human gait monitoring method and device.
Background
With the accelerating aging of the population in China, the number of patients with lower limb motor dysfunction caused by stroke keeps rising, and the number of patients with mental or limb injuries caused by industrial accidents, traffic accidents, diseases and the like is also increasing markedly. The lower limb rehabilitation robot, as a rehabilitation medical device, assists the patient in carrying out scientific and effective rehabilitation training so as to restore the patient's motor function.
Among the key technologies of the lower limb rehabilitation robot, acquiring the patient's gait information in real time is a key indicator for assessing the effect of rehabilitation training and for judging whether the patient is training under the required conditions. Current human gait recognition methods fall mainly into two categories: those based on biomechanical information and those based on human-computer interaction information. Existing human gait recognition techniques obtain gait information and data by mounting sensors on the lower limbs, so the patient must wear sensor equipment during rehabilitation training, which makes the training operation inconvenient.
Disclosure of Invention
In order to solve the problem that the conventional human gait monitoring method requires the patient to wear a sensor, which makes rehabilitation training inconvenient, the embodiments of the invention provide a human gait monitoring method and a human gait monitoring device.
In a first aspect, an embodiment of the present invention provides a human gait monitoring method, including:
acquiring a video frame of the lower limb movement of a trainer to be monitored acquired by using image acquisition equipment in a training time period and the acquisition time of the video frame;
for each video frame, acquiring two-dimensional pixel coordinates of each key joint point of the lower limb of the trainer to be monitored according to a preset deep learning training model;
for each key joint point, determining the actual position coordinates of the key joint point according to the two-dimensional pixel coordinates of the key joint point and the position information of the image acquisition equipment;
determining the gait cycle and gait information of each step of the trainer to be monitored according to the actual position coordinates of the key joint points and the acquisition time of the video frame;
and comparing the gait information in the gait cycle with a preset gait standard value corresponding to the key joint point to obtain a comparison result.
In the human body gait monitoring method provided by the embodiment of the invention, the lower limb rehabilitation robot acquires video frames of the lower limb movement of the trainer to be monitored, captured by an image acquisition device within a training time period, together with the acquisition time of each video frame. For each video frame, it acquires the two-dimensional pixel coordinates of each key joint point of the lower limb of the trainer to be monitored according to a preset deep learning training model; for each key joint point, it determines the actual position coordinates of the key joint point according to the two-dimensional pixel coordinates of the key joint point and the position information of the image acquisition device; it then determines the gait cycle and gait information of each step of the trainer to be monitored according to the actual position coordinates of the key joint points and the acquisition times of the video frames, the gait information being used to represent the step form; and it further compares the gait information within the gait cycle with a preset gait standard value corresponding to the key joint point to obtain a comparison result. The human body gait monitoring method provided by the embodiment of the invention can thus capture the motion of each key joint point of the lower limb of the human body by means of a preset deep learning training model, obtain the two-dimensional pixel coordinates of each key joint point of the lower limb on each video frame, and acquire the gait information of the trainer to be monitored in real time during rehabilitation training without any wearable sensor equipment; by comparing the acquired gait information with the preset gait standard values corresponding to the key joint points, the standard rate of the training posture during rehabilitation training can be evaluated and the rehabilitation training effect assessed from the comparison result.
Preferably, determining the gait cycle of each step of the trainer to be monitored according to the actual position coordinates of the key joint points and the acquisition time of the video frame, specifically comprising:
and determining the gait cycle of each step of the trainer to be monitored according to the change of the actual position coordinates of the key joint points corresponding to each video frame and the acquisition time of the video frames.
Preferably, the gait information includes a step frequency, and determining the gait information of each step of the trainer to be monitored according to the actual position coordinates of the key joint point and the acquisition time of the video frame includes:
and determining the step frequency of each step of the trainer to be monitored according to the gait cycle of each step.
Preferably, the gait information includes a stride, and determining the gait information of each step of the trainer to be monitored according to the actual position coordinates of the key joint point and the acquisition time of the video frame includes:
and determining the stride of each step of the trainer to be monitored according to the initial actual position coordinates of the key joint points in the gait cycle of each step.
Optionally, the method further comprises:
determining the training posture standard rate of the trainer to be monitored according to the comparison result;
and evaluating the rehabilitation degree of the trainer to be monitored according to the training posture standard rate.
Preferably, the determining the training posture standard rate of the trainer to be monitored according to the comparison result specifically includes:
calculating an error value between the gait of the trainer to be monitored in each gait cycle and a preset gait standard value corresponding to the key joint point;
and determining the training posture standard rate of the trainer to be monitored in the gait cycle according to the error value and the gait standard value corresponding to the key joint point.
In a second aspect, an embodiment of the present invention provides a human gait monitoring device, including:
a first acquisition unit, which is used for acquiring video frames of the lower limb movement of a trainer to be monitored, captured by an image acquisition device within a training time period, and the acquisition time of the video frames;
the second acquisition unit is used for acquiring two-dimensional pixel coordinates of each key joint point of the lower limb of the trainer to be monitored according to a preset deep learning training model aiming at each video frame;
a first determining unit, configured to determine, for each key joint point, an actual position coordinate of the key joint point according to the two-dimensional pixel coordinate of the key joint point and the position information of the image acquisition device;
the second determining unit is used for determining the gait cycle and the gait information of each step of the trainer to be monitored according to the actual position coordinates of the key joint points and the acquisition time of the video frame, wherein the gait information is used for representing the step form;
and the comparison unit is used for comparing the gait information in the gait cycle with a preset gait standard value corresponding to the key joint point to obtain a comparison result.
Preferably, the second determining unit is specifically configured to determine a gait cycle of each step of the trainer to be monitored according to the change of the actual position coordinates of the key joint point corresponding to each video frame and the acquisition time of the video frame.
Preferably, the gait information includes a step frequency, and the second determining unit is specifically configured to determine the step frequency of each step of the trainer to be monitored according to the gait cycle of each step.
Preferably, the gait information includes a stride, and the second determining unit is specifically configured to determine the stride of each step of the trainer to be monitored according to the initial actual position coordinates of the key joint points in the gait cycle of each step.
Optionally, the apparatus further comprises:
the third determining unit is used for determining the training posture standard rate of the trainer to be monitored according to the comparison result;
and the evaluation unit is used for evaluating the rehabilitation degree of the trainer to be monitored according to the training posture standard rate.
Preferably, the third determining unit is specifically configured to calculate, for each gait cycle, an error value between the gait of the trainer to be monitored in the gait cycle and a preset gait standard value corresponding to the key joint point; and determining the training posture standard rate of the trainer to be monitored in the gait cycle according to the error value and the gait standard value corresponding to the key joint point.
For the technical effects of the human gait monitoring device provided by the invention, reference may be made to the technical effects of the first aspect or of its implementation manners, which are not repeated here.
In a third aspect, an embodiment of the present invention provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the human gait monitoring method when executing the program.
In a fourth aspect, the embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps in the human gait monitoring method according to the invention.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and do not limit the invention. In the drawings:
fig. 1 is a schematic flow chart of an implementation of a human gait monitoring method according to an embodiment of the invention;
FIG. 2 is a schematic diagram of an implementation flow of determining a standard rate of training postures of a trainer to be monitored in an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a human gait monitoring device according to an embodiment of the invention;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to solve the problem that the conventional human gait monitoring method requires the patient to wear a sensor, which makes rehabilitation training inconvenient, the embodiments of the invention provide a human gait monitoring method and a human gait monitoring device.
The preferred embodiments of the present invention will be described below with reference to the accompanying drawings of the specification, it being understood that the preferred embodiments described herein are merely for illustrating and explaining the present invention, and are not intended to limit the present invention, and that the embodiments and features of the embodiments in the present invention may be combined with each other without conflict.
As shown in fig. 1, which is a schematic flow chart of an implementation of a human gait monitoring method provided by an embodiment of the present invention, the method may include the following steps:
and S11, acquiring video frames of the lower limb movement of the trainee to be monitored acquired by using the image acquisition equipment in the training time period and the acquisition time of the video frames.
During specific implementation, when a trainer to be monitored performs rehabilitation training by using the lower limb rehabilitation robot, the lower limb rehabilitation robot acquires video frames of the lower limb movement of the trainer to be monitored, which are acquired by using image acquisition equipment within a training time period, and the acquisition time of the video frames. The image acquisition device may be an image acquisition module built in the lower limb rehabilitation robot, or an external image acquisition device, such as a camera, which is not limited in the embodiments of the present invention.
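As an illustration only of this acquisition step, the following Python sketch reads frames from a camera or video file with OpenCV and records the acquisition time of each video frame; the source index, the capture duration and the use of wall-clock timestamps are assumptions of the sketch, not details given by the patent.

    import cv2
    import time

    def collect_frames(source=0, duration_s=60.0):
        """Collect video frames of the lower limb movement together with their acquisition times."""
        cap = cv2.VideoCapture(source)        # built-in camera module or an external camera
        frames = []                           # list of (acquisition_time, frame) pairs
        t_start = time.time()
        while time.time() - t_start < duration_s:
            ok, frame = cap.read()
            if not ok:
                break
            frames.append((time.time(), frame))   # acquisition time of this video frame
        cap.release()
        return frames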
And S12, acquiring two-dimensional pixel coordinates of each key joint point of the lower limb of the trainer to be monitored according to a preset deep learning training model aiming at each video frame.
In specific implementation, the lower limb rehabilitation robot acquires two-dimensional pixel coordinates of each key joint point of the lower limb of the trainer to be monitored according to a preset deep learning training model for each video frame, wherein the key joint points can be but are not limited to include: knee joint points and ankle joint points of the left leg and the right leg. The preset deep learning training model can be an OpenPose human posture recognition model, the model can recognize the positions of human key points in a video frame, and two-dimensional pixel coordinates of knee joint points and ankle joint points of a left leg and a right leg of a lower limb are recorded in real time.
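A minimal sketch of this per-frame keypoint extraction is given below. The pose_model object stands in for a pretrained two-dimensional pose estimator such as OpenPose; its predict() interface and the joint names are hypothetical placeholders rather than the real OpenPose API.

    KEY_JOINTS = ["left_knee", "right_knee", "left_ankle", "right_ankle"]

    def extract_key_joints(frames, pose_model):
        """For each (time, frame) pair, return the 2-D pixel coordinates of the key joint points."""
        tracks = []
        for t, frame in frames:
            keypoints = pose_model.predict(frame)          # hypothetical: {joint_name: (u, v)}
            tracks.append((t, {j: keypoints[j] for j in KEY_JOINTS}))
        return tracks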
And S13, determining the actual position coordinates of each key joint point according to the two-dimensional pixel coordinates of the key joint point and the position information of the image acquisition equipment.
In specific implementation, the lower limb rehabilitation robot determines the actual position coordinates of each key joint point according to the two-dimensional pixel coordinates of each key joint point and the position information of the image acquisition equipment.
Specifically, the image acquisition device captures a two-dimensional image, so the directly acquired two-dimensional pixel coordinates of each key joint point are coordinates on the image plane. According to the pinhole imaging principle, the actual size and the image size satisfy a triangle-similarity relationship, and the similarity ratio can be determined from the focal length and the distance between the image acquisition device and the human body.
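Under this pinhole / similar-triangles assumption, with the focal length f expressed in pixels and a known camera-to-subject distance Z, an offset of (u − c_u) pixels from the principal point corresponds to an actual offset of (u − c_u)·Z/f. The sketch below assumes the principal point (c_u, c_v), the focal length and the distance are known from calibration; the patent itself does not spell out the formula.

    def pixel_to_actual(u, v, f_px, distance_m, c_u, c_v):
        """Map a 2-D pixel coordinate to actual position coordinates (metres) in the camera plane."""
        x = (u - c_u) * distance_m / f_px     # similar triangles: actual offset = pixel offset * Z / f
        y = (v - c_v) * distance_m / f_px
        return x, y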
And S14, determining the gait cycle and the gait information of each step of the trainer to be monitored according to the actual position coordinates of the key joint points and the acquisition time of the video frame.
In specific implementation, for each key joint point, the lower limb rehabilitation robot determines the gait cycle and the gait information of each step of the trainer to be monitored according to the actual position coordinates of the key joint point and the acquisition time of the video frame, wherein the gait information comprises the step frequency and the stride.
Specifically, determining the gait cycle of each step of the trainer to be monitored according to the actual position coordinates of the key joint points and the acquisition time of the video frame, specifically comprising: and determining the gait cycle of each step of the trainer to be monitored according to the change of the actual position coordinates of the key joint points corresponding to each video frame and the acquisition time of the video frames.
In specific implementation, the time taken by the trainer to take one step is the gait cycle of each step.
Specifically, when the trainer to be monitored is standing still in the first video frame before starting to move, the coordinate point corresponding to the key joint point may be set as a reference point, and the time between every two consecutive passes of the key joint point through the reference point is one gait cycle.
Specifically, the gait cycle of each step can be calculated by the following formula:
ΔT_n = T_{n+1}(x=0) − T_n(x=0)
where ΔT_n represents the n-th gait cycle;
T_{n+1}(x=0) represents the moment when the key joint point passes through the reference point for the (n+1)-th time, i.e. the acquisition time of the video frame in which that pass occurs;
T_n(x=0) represents the moment when the key joint point passes through the reference point for the n-th time, i.e. the acquisition time of the video frame in which that pass occurs.
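A sketch of this gait-cycle computation, assuming the x coordinate of one key joint point has already been shifted so that the standing-still reference point lies at x = 0; counting only crossings in one direction is an interpretation of "passing through the reference point", not wording from the patent.

    def gait_cycles(times, xs):
        """times, xs: per-frame acquisition time and actual x coordinate of one key joint point.
        Returns the list of gait cycles [ΔT_1, ΔT_2, ...]."""
        crossings = []
        for i in range(1, len(xs)):
            if xs[i - 1] < 0.0 <= xs[i]:       # the joint passes the reference point x = 0
                crossings.append(times[i])     # acquisition time of that video frame
        return [t2 - t1 for t1, t2 in zip(crossings, crossings[1:])]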
And further, determining the step frequency of each step of the trainer to be monitored according to the gait cycle of each step, wherein the reciprocal of the gait cycle is the step frequency.
The specific calculation formula is as follows:
f_n = 1/ΔT_n
where f_n represents the step frequency of the trainer to be monitored in the n-th gait cycle.
and determining the stride of each step of the trainer to be monitored according to the initial actual position coordinates of the key joint points in the gait cycle of each step.
Specifically, the stride of each step of the trainer to be monitored can be calculated by the following formula:
A_n = H_n(x_max) + H_n(x_min)
where A_n represents the stride of the trainer to be monitored in the n-th gait cycle;
H_n(x_max) represents the maximum distance that the key joint point moves in the positive x-axis direction in the n-th gait cycle;
H_n(x_min) represents the maximum distance that the key joint point moves in the negative x-axis direction in the n-th gait cycle.
Through the above formulas, the step frequency and stride in each gait cycle (i.e. each step) of the trainer to be monitored can be calculated based on the knee joint point of the left leg (or the right leg), and likewise based on the ankle joint point of the left leg (or the right leg).
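Once the coordinate samples have been grouped by gait cycle, the two formulas above can be applied directly; grouping the samples per cycle and measuring x relative to the reference point are assumptions of this sketch.

    def step_frequencies(cycle_durations):
        """f_n = 1 / ΔT_n, in steps per second."""
        return [1.0 / dt for dt in cycle_durations]

    def strides(xs_per_cycle):
        """A_n = H_n(x_max) + H_n(x_min): largest forward plus largest backward excursion
        of the key joint point within the n-th gait cycle (x measured from the reference point)."""
        return [max(xs) + abs(min(xs)) for xs in xs_per_cycle]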
And S15, comparing the gait information in the gait cycle with a preset gait standard value corresponding to the key joint point to obtain a comparison result.
In specific implementation, the lower limb rehabilitation robot presets and stores the gait standard values (namely, the step frequency standard value and the stride standard value) corresponding to each key joint point in each training period (i.e. in each training time period).
Specifically, the lower limb rehabilitation robot compares the gait information obtained in step S14 with a preset gait standard value corresponding to the key joint point to obtain a comparison result.
Specifically, the step frequency of each step obtained by the trainer to be monitored based on each key joint point is compared with the step frequency standard value corresponding to each key joint point in the corresponding training period, and the stride of each step obtained by the trainer to be monitored based on each key joint point is compared with the stride standard value corresponding to each key joint point in the corresponding training period, so as to obtain the comparison result.
And further, determining the training posture standard rate of the trainer to be monitored according to the comparison result.
In specific implementation, the training posture standard rate of the trainer to be monitored can be determined according to the process shown in fig. 2, and the method can include the following steps:
and S21, calculating an error value between the gait of the trainer to be monitored in the gait cycle and a preset gait standard value corresponding to the key joint point for each gait cycle.
In specific implementation, for each gait cycle, an error value is calculated between the step frequency of the trainer to be monitored in the gait cycle, obtained based on each key joint point, and the step frequency standard value preset for that key joint point in the corresponding training period; likewise, an error value is calculated between the stride of the trainer to be monitored in the gait cycle, obtained based on each key joint point, and the stride standard value preset for that key joint point in the corresponding training period.
S22, determining the training posture standard rate of the trainer to be monitored in the gait cycle according to the error value and the gait standard value corresponding to the key joint point.
In specific implementation, for each gait cycle, the step frequency standard rate corresponding to each key joint point can be calculated as follows: subtract from 1 the absolute value of the ratio of the step frequency error value corresponding to the key joint point to the step frequency standard value. Similarly, the stride standard rate corresponding to each key joint point can be calculated as follows: subtract from 1 the absolute value of the ratio of the stride error value corresponding to the key joint point to the stride standard value.
And further, weighting and summing the step frequency standard rate and the stride standard rate corresponding to each key joint point, and calculating the training posture standard rate of the trainer to be monitored in the gait cycle, namely the training posture standard rate of each step.
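A sketch of the standard-rate calculation just described; the equal default weights are an assumption, since the patent only states that the per-joint step frequency and stride standard rates are weighted and summed.

    def standard_rate(measured, standard):
        """1 minus |error / standard|, applied to the step frequency or stride of one key joint point."""
        return 1.0 - abs((measured - standard) / standard)

    def posture_standard_rate(freq_rates, stride_rates, weights=None):
        """Weighted sum of the step frequency and stride standard rates over all key joint points."""
        rates = freq_rates + stride_rates
        if weights is None:
            weights = [1.0 / len(rates)] * len(rates)   # equal weights as a default assumption
        return sum(w * r for w, r in zip(weights, rates))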
Preferably, a training posture standard rate curve of the trainer to be monitored in each gait cycle in the training cycle can be drawn.
And further, evaluating the rehabilitation degree of the trainer to be monitored according to the training posture standard rate.
In specific implementation, the rehabilitation degree of the trainer to be monitored corresponding to the training posture standard rate is determined according to the corresponding relation between the preset training posture standard rate and the rehabilitation degree.
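The correspondence table between the training posture standard rate and the rehabilitation degree is preset; the thresholds and labels in the sketch below are purely illustrative assumptions, since the patent does not give concrete values.

    # Illustrative correspondence only; the actual table is preset by the system designer.
    REHAB_LEVELS = [(0.90, "good"), (0.75, "moderate"), (float("-inf"), "poor")]

    def rehabilitation_degree(posture_standard_rate):
        for threshold, label in REHAB_LEVELS:
            if posture_standard_rate >= threshold:
                return label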
In the human body gait monitoring method provided by the embodiment of the invention, the lower limb rehabilitation robot acquires video frames of the lower limb movement of the trainer to be monitored, captured by an image acquisition device within a training time period, together with the acquisition time of each video frame. For each video frame, it acquires the two-dimensional pixel coordinates of each key joint point of the lower limb of the trainer to be monitored according to a preset deep learning training model; for each key joint point, it determines the actual position coordinates of the key joint point according to the two-dimensional pixel coordinates of the key joint point and the position information of the image acquisition device; it then determines the gait cycle and gait information of each step of the trainer to be monitored according to the actual position coordinates of the key joint points and the acquisition times of the video frames, the gait information being used to represent the step form; and it further compares the gait information within the gait cycle with the preset gait standard values corresponding to the key joint points to obtain a comparison result. The human body gait monitoring method provided by the embodiment of the invention can thus capture the motion of each key joint point of the lower limb of the human body by means of a preset deep learning training model, obtain the two-dimensional pixel coordinates of each key joint point of the lower limb on each video frame, and acquire the gait information of the trainer to be monitored in real time during rehabilitation training without any wearable sensor equipment; by comparing the acquired gait information with the preset gait standard values corresponding to the key joint points, the standard rate of the training posture during rehabilitation training can be evaluated and the rehabilitation training effect assessed from the comparison result.
Based on the same conception, the embodiment of the invention also provides a human body gait monitoring device, and as the principle of solving the problems of the human body gait monitoring device is similar to the human body gait monitoring method, the implementation of the device can be referred to the implementation of the method, and repeated parts are not described again.
As shown in fig. 3, which is a schematic structural diagram of a human gait monitoring device according to an embodiment of the invention, the human gait monitoring device may include:
the first acquisition unit 31 is configured to acquire a video frame of the lower limb movement of the trainer to be monitored, acquired by using an image acquisition device within a training time period, and acquisition time of the video frame;
the second obtaining unit 32 is configured to obtain, for each video frame, two-dimensional pixel coordinates of each key joint point of the lower limb of the trainer to be monitored according to a preset deep learning training model;
a first determining unit 33, configured to determine, for each key joint point, actual position coordinates of the key joint point according to two-dimensional pixel coordinates of the key joint point and position information of the image acquisition device;
the second determining unit 34 is configured to determine a gait cycle and gait information of each step of the trainer to be monitored according to the actual position coordinates of the key joint point and the acquisition time of the video frame, where the gait information is used to represent a step shape;
and the comparison unit 35 is configured to compare the gait information in the gait cycle with a preset gait standard value corresponding to the key joint point, and obtain a comparison result.
Preferably, the second determining unit 34 is specifically configured to determine a gait cycle of each step of the trainer to be monitored according to the change of the actual position coordinates of the key joint point corresponding to each video frame and the acquisition time of the video frame.
Preferably, the gait information includes a step frequency, and the second determining unit 34 is specifically configured to determine the step frequency of each step of the trainer to be monitored according to the gait cycle of each step.
Preferably, the gait information includes a stride, and the second determining unit 34 is specifically configured to determine the stride of each step of the trainer to be monitored according to the initial actual position coordinates of the key joint points in the gait cycle of each step.
Optionally, the apparatus further comprises:
the third determining unit is used for determining the training posture standard rate of the trainer to be monitored according to the comparison result;
and the evaluation unit is used for evaluating the rehabilitation degree of the trainer to be monitored according to the training posture standard rate.
Preferably, the third determining unit is specifically configured to calculate, for each gait cycle, an error value between the gait of the trainer to be monitored in the gait cycle and a preset gait standard value corresponding to the key joint point; and determining the training posture standard rate of the trainer to be monitored in the gait cycle according to the error value and the gait standard value corresponding to the key joint point.
Based on the same technical concept, an embodiment of the present invention further provides an electronic device 400. Referring to fig. 4, the electronic device 400 is configured to implement the human gait monitoring method described in the foregoing method embodiment, and the electronic device 400 of this embodiment may include: a memory 401, a processor 402, and a computer program, such as a human gait monitoring program, stored in the memory and executable on the processor. The processor, when executing the computer program, implements the steps of the above-described embodiments of the human gait monitoring method, such as step S11 shown in fig. 1. Alternatively, the processor, when executing the computer program, implements the functions of the modules/units in the above-described device embodiments, for example the first acquisition unit 31.
The embodiment of the present invention does not limit the specific connection medium between the memory 401 and the processor 402. In the embodiment of the present application, the memory 401 and the processor 402 are connected by the bus 403 in fig. 4, the bus 403 is represented by a thick line in fig. 4, and the connection manner between other components is merely illustrative and is not limited thereto. The bus 403 may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 4, but this does not indicate only one bus or one type of bus.
The memory 401 may be a volatile memory (volatile memory), such as a random-access memory (RAM); the memory 401 may also be a non-volatile memory (non-volatile memory), such as, but not limited to, a read-only memory (ROM), a flash memory, a hard disk drive (HDD) or a solid-state drive (SSD); or the memory 401 may be any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory 401 may also be a combination of the above memories.
A processor 402 for implementing a human gait monitoring method as shown in fig. 1, comprising:
the processor 402 is configured to invoke the computer program stored in the memory 401 to execute step S11 shown in fig. 1, obtain a video frame of the lower limb movement of the trainer to be monitored acquired by using an image acquisition device within a training time period and an acquisition time of the video frame, step S12, for each video frame, obtain two-dimensional pixel coordinates of each key joint point of the lower limb of the trainer to be monitored according to a preset deep learning training model, step S13, for each key joint point, determine the actual position coordinates of the key joint point according to the two-dimensional pixel coordinates of the key joint point and the position information of the image acquisition device, step S14, determine the gait cycle and gait information of each step of the trainer to be monitored according to the actual position coordinates of the key joint point and the acquisition time of the video frame, wherein the gait information is used for representing a step shape, and step S15, comparing the gait information with a preset gait standard value corresponding to the key joint point to obtain a comparison result.
The embodiment of the present application further provides a computer-readable storage medium, which stores computer-executable instructions required to be executed by the processor, and includes a program required to be executed by the processor.
In some possible embodiments, aspects of the human gait monitoring method provided by the invention can also be implemented in the form of a program product, which includes program code for causing an electronic device to execute the steps of the human gait monitoring method according to the various exemplary embodiments of the invention described above when the program product runs on the electronic device. For example, the electronic device may execute: step S11 shown in fig. 1, acquiring video frames of the lower limb movement of the trainer to be monitored captured by an image acquisition device during a training period and the acquisition time of the video frames; step S12, for each video frame, acquiring the two-dimensional pixel coordinates of the key joint points of the lower limb of the trainer to be monitored according to a preset deep learning training model; step S13, determining the actual position coordinates of the key joint points according to the two-dimensional pixel coordinates of the key joint points and the position information of the image acquisition device; step S14, determining the gait cycle and gait information of each step of the trainer to be monitored according to the actual position coordinates of the key joint points and the acquisition time of the video frames, wherein the gait information is used for representing the step form; and step S15, comparing the gait information with a preset gait standard value corresponding to the key joint points to obtain a comparison result.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A program product for human gait monitoring of embodiments of the invention may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a computing device. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device over any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., over the internet using an internet service provider).
It should be noted that although several units or sub-units of the apparatus are mentioned in the above detailed description, such division is merely exemplary and not mandatory. Indeed, the features and functions of two or more of the units described above may be embodied in one unit, according to embodiments of the invention. Conversely, the features and functions of one unit described above may be further divided into embodiments by a plurality of units.
Moreover, while the operations of the method of the invention are depicted in the drawings in a particular order, this does not require or imply that the operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (10)

1. A human gait monitoring method is characterized by comprising the following steps:
acquiring a video frame of the lower limb movement of a trainer to be monitored acquired by using image acquisition equipment in a training time period and the acquisition time of the video frame;
for each video frame, acquiring two-dimensional pixel coordinates of each key joint point of the lower limb of the trainer to be monitored according to a preset deep learning training model;
for each key joint point, determining the actual position coordinates of the key joint point according to the two-dimensional pixel coordinates of the key joint point and the position information of the image acquisition equipment;
determining a gait cycle and gait information of each step of the trainer to be monitored according to the actual position coordinates of the key joint points and the acquisition time of the video frame, wherein the gait information is used for representing a step shape and comprises step frequency and stride;
comparing the gait information in the gait cycle with a preset gait standard value corresponding to the key joint point to obtain a comparison result;
determining the training posture standard rate of the trainer to be monitored according to the comparison result, which specifically comprises: for each gait cycle, calculating an error value between the gait of the trainer to be monitored in the gait cycle, obtained based on each key joint point, and the preset gait standard value corresponding to the key joint point; and determining the training posture standard rate of the trainer to be monitored in the gait cycle according to the error value and the gait standard value corresponding to the key joint point; wherein the training posture standard rate of each step is obtained by weighted summation of the step frequency standard rate and the stride standard rate corresponding to each key joint point, the step frequency standard rate corresponding to each key joint point is 1 minus the absolute value of the ratio of the step frequency error value to the step frequency standard value corresponding to the key joint point, and the stride standard rate corresponding to each key joint point is 1 minus the absolute value of the ratio of the stride error value to the stride standard value corresponding to the key joint point;
and evaluating the rehabilitation degree of the trainer to be monitored according to the training posture standard rate.
2. The method of claim 1, wherein determining the gait cycle of each step of the trainer to be monitored according to the actual position coordinates of the key joint points and the acquisition time of the video frame comprises:
and determining the gait cycle of each step of the trainer to be monitored according to the change of the actual position coordinates of the key joint points corresponding to each video frame and the acquisition time of the video frames.
3. The method as claimed in claim 2, wherein determining the gait information of each step of the trainer to be monitored according to the actual position coordinates of the key joint points and the acquisition time of the video frame comprises:
and determining the step frequency of each step of the trainer to be monitored according to the gait cycle of each step.
4. The method as claimed in claim 2, wherein determining the gait information of each step of the trainer to be monitored according to the actual position coordinates of the key joint points and the acquisition time of the video frame comprises:
and determining the stride of each step of the trainer to be monitored according to the initial actual position coordinates of the key joint points in the gait cycle of each step.
5. A human gait monitoring device, comprising:
a first acquisition unit, which is used for acquiring video frames of the lower limb movement of a trainer to be monitored, captured by an image acquisition device within a training time period, and the acquisition time of the video frames;
the second acquisition unit is used for acquiring two-dimensional pixel coordinates of each key joint point of the lower limb of the trainer to be monitored according to a preset deep learning training model aiming at each video frame;
a first determining unit, configured to determine, for each key joint point, an actual position coordinate of the key joint point according to the two-dimensional pixel coordinate of the key joint point and the position information of the image acquisition device;
the second determining unit is used for determining a gait cycle and gait information of each step of the trainer to be monitored according to the actual position coordinates of the key joint points and the acquisition time of the video frame, wherein the gait information is used for representing a step form and comprises step frequency and stride;
the comparison unit is used for comparing the gait information in the gait cycle with a preset gait standard value corresponding to the key joint point to obtain a comparison result;
the third determining unit is used for determining the training posture standard rate of the trainer to be monitored according to the comparison result;
the third determining unit is specifically configured to calculate, for each gait cycle, an error value between the gait of the trainer to be monitored in the gait cycle, which is obtained based on each key joint point, and a preset gait standard value corresponding to the key joint point; and to determine the training posture standard rate of the trainer to be monitored in the gait cycle according to the error value and the gait standard value corresponding to the key joint point; wherein the training posture standard rate of each step is obtained by weighted summation of the step frequency standard rate and the stride standard rate corresponding to each key joint point, the step frequency standard rate corresponding to each key joint point is 1 minus the absolute value of the ratio of the step frequency error value to the step frequency standard value corresponding to the key joint point, and the stride standard rate corresponding to each key joint point is 1 minus the absolute value of the ratio of the stride error value to the stride standard value corresponding to the key joint point;
and the evaluation unit is used for evaluating the rehabilitation degree of the trainer to be monitored according to the training posture standard rate.
6. The apparatus of claim 5,
the second determining unit is specifically configured to determine a gait cycle of each step of the trainer to be monitored according to the change of the actual position coordinates of the key joint point corresponding to each video frame and the acquisition time of the video frame.
7. The apparatus as claimed in claim 6, wherein the gait information includes a step frequency, and the second determining unit is specifically configured to determine the step frequency of each step of the trainer to be monitored according to the gait cycle of each step.
8. The apparatus as claimed in claim 6, wherein the gait information includes a stride, and the second determining unit is specifically configured to determine the stride of each step of the trainer to be monitored according to the initial actual position coordinates of the key joint points in the gait cycle of each step.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program implements the human gait monitoring method of any of claims 1 to 4.
10. A computer readable storage medium having stored thereon a computer program, wherein the program when executed by a processor implements the steps of the human gait monitoring method according to any of claims 1 to 4.
CN201911072886.XA 2019-11-05 2019-11-05 Human body gait monitoring method and device Active CN110801233B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911072886.XA CN110801233B (en) 2019-11-05 2019-11-05 Human body gait monitoring method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911072886.XA CN110801233B (en) 2019-11-05 2019-11-05 Human body gait monitoring method and device

Publications (2)

Publication Number Publication Date
CN110801233A CN110801233A (en) 2020-02-18
CN110801233B true CN110801233B (en) 2022-06-07

Family

ID=69501289

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911072886.XA Active CN110801233B (en) 2019-11-05 2019-11-05 Human body gait monitoring method and device

Country Status (1)

Country Link
CN (1) CN110801233B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11980790B2 (en) 2020-05-06 2024-05-14 Agile Human Performance, Inc. Automated gait evaluation for retraining of running form using machine learning and digital video data
CN112597903B (en) * 2020-12-24 2021-08-13 珠高电气检测有限公司 Electric power personnel safety state intelligent identification method and medium based on stride measurement
CN113095268B (en) * 2021-04-22 2023-11-21 中德(珠海)人工智能研究院有限公司 Robot gait learning method, system and storage medium based on video stream
CN114202772B (en) * 2021-12-07 2022-08-09 湖南长信畅中科技股份有限公司 Reference information generation system and method based on artificial intelligence and intelligent medical treatment

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109635644A (en) * 2018-11-01 2019-04-16 北京健康有益科技有限公司 A kind of evaluation method of user action, device and readable medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2515280A (en) * 2013-06-13 2014-12-24 Biogaming Ltd Report system for physiotherapeutic and rehabiliative video games
US10244990B2 (en) * 2015-09-30 2019-04-02 The Board Of Trustees Of The University Of Alabama Systems and methods for rehabilitation of limb motion
US20180177436A1 (en) * 2016-12-22 2018-06-28 Lumo BodyTech, Inc System and method for remote monitoring for elderly fall prediction, detection, and prevention
CN109325479B (en) * 2018-11-28 2020-10-16 清华大学 Step detection method and device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109635644A (en) * 2018-11-01 2019-04-16 北京健康有益科技有限公司 A kind of evaluation method of user action, device and readable medium

Also Published As

Publication number Publication date
CN110801233A (en) 2020-02-18

Similar Documents

Publication Publication Date Title
CN110801233B (en) Human body gait monitoring method and device
US10898755B2 (en) Method for providing posture guide and apparatus thereof
US10296102B1 (en) Gesture and motion recognition using skeleton tracking
Javeed et al. Wearable sensors based exertion recognition using statistical features and random forest for physical healthcare monitoring
US10911775B1 (en) System and method for vision-based joint action and pose motion forecasting
US8213678B2 (en) System and method of analyzing the movement of a user
US20230368578A1 (en) Methods and apparatus for human pose estimation from images using dynamic multi-headed convolutional attention
EP3690702A1 (en) Motion recognition and gesture prediction method and device
US20210315486A1 (en) System and Method for Automatic Evaluation of Gait Using Single or Multi-Camera Recordings
Chaudhari et al. Yog-guru: Real-time yoga pose correction system using deep learning methods
US11759126B2 (en) Scoring metric for physical activity performance and tracking
KR20190050724A (en) System and Method of Generating Blood Pressure Estimation Model, and System and Method of Blood Pressure Estimation
US11423699B2 (en) Action recognition method and apparatus and electronic equipment
CN115427982A (en) Methods, systems, and media for identifying human behavior in digital video using convolutional neural networks
CN111597975B (en) Personnel action detection method and device and electronic equipment
JP5604249B2 (en) Human body posture estimation device, human body posture estimation method, and computer program
Rahman et al. [Retracted] Automated Detection of Rehabilitation Exercise by Stroke Patients Using 3‐Layer CNN‐LSTM Model
Wei et al. Real-time limb motion tracking with a single imu sensor for physical therapy exercises
Chen et al. A motion tracking system for hand activity assessment
CN105243675A (en) Star-shaped skeleton model based pig hobbling identification method
CN113761965B (en) Motion capture method, motion capture device, electronic equipment and storage medium
Bolaños et al. A comparative analysis of pose estimation models as enablers for a smart-mirror physical rehabilitation system
CN115116039A (en) Vehicle cabin outside sight line tracking method and device, vehicle and storage medium
KR20220066535A (en) Method, Server and System for Recognizing Motion in Video
CN113544736A (en) Lower limb muscle strength estimation system, lower limb muscle strength estimation method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant