CN112741620A - Cervical spondylosis evaluation device based on limb movement - Google Patents

Cervical spondylosis evaluation device based on limb movement

Info

Publication number
CN112741620A
CN112741620A (application CN202011601461.6A)
Authority
CN
China
Prior art keywords
cervical spondylosis
module
data analysis
video
frame image
Prior art date
Legal status
Pending
Application number
CN202011601461.6A
Other languages
Chinese (zh)
Inventor
陈俊颖
梁国彦
郑书豪
Current Assignee
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN202011601461.6A
Publication of CN112741620A
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1118: Determining activity level

Abstract

The invention provides a cervical spondylosis evaluation device based on limb movement, which comprises: a storage module for storing reference data used to evaluate the severity of cervical spondylosis; a video acquisition module for acquiring a video containing the limb movement of an evaluation target; an information extraction module for splitting the video into a plurality of single-frame images; a data analysis module for calculating test parameters from the single-frame images, the data analysis module setting a plurality of key points corresponding to the limbs of the evaluation target on the single-frame images, the test parameters comprising the number of times the limbs of the evaluation target complete a preset action within a preset time period and/or motion parameters of the key points, and the data analysis module further calculating the difference between the test parameters and the reference data; and a display module for displaying an evaluation result comprising any one or more of the test parameters, the difference, and a degree of illness associated with the difference. A user can operate the cervical spondylosis evaluation device independently to evaluate the severity of his or her cervical spondylosis.

Description

Cervical spondylosis evaluation device based on limb movement
Technical Field
The invention relates to the technical field of auxiliary equipment for disease diagnosis, and in particular to a cervical spondylosis evaluation device based on limb movement.
Background
The incidence of cervical spondylosis is high among middle-aged and elderly people. When the cervical nerves are compressed, limb motor function is affected, manifesting mainly as weakness of the hands and legs, inflexible movement and a feeling of stiffness, so an evaluation of limb motor function can reflect the severity of cervical spondylosis. Clinically, the patient performs a specific limb action, and a doctor evaluates the severity of the cervical spondylosis from the frequency, amplitude and other characteristics of how the patient completes that action.
At present, counting the frequency, amplitude and other characteristics of a patient's limb actions relies on counting by eye; the person counting tires easily and is prone to missing counts. In addition, the patient depends on on-site examination by a doctor, the examination cannot be performed at home, and it is therefore restricted in both time and place.
Therefore, there is a need to provide a new cervical spondylosis assessment device based on limb movements to solve the above problems.
Disclosure of Invention
In view of the above drawbacks of the prior art, the present invention provides a cervical spondylosis evaluation device based on limb movement that a user can conveniently operate on his or her own to evaluate the severity of cervical spondylosis.
The technical solution adopted by the invention to solve this problem is as follows:
A cervical spondylosis evaluation device based on limb movement, comprising: a storage module for storing reference data used to evaluate the severity of cervical spondylosis; a video acquisition module for acquiring a video containing the limb movement of an evaluation target; an information extraction module for splitting the video into a plurality of single-frame images; a data analysis module for calculating test parameters from the single-frame images, wherein the data analysis module sets a plurality of key points corresponding to the limbs of the evaluation target on the single-frame images, the test parameters comprise the number of times the limbs of the evaluation target complete a preset action within a preset time period and/or motion parameters of the key points, and the data analysis module is further configured to calculate the difference between the test parameters and the reference data; and a display module for displaying an evaluation result comprising any one or more of the test parameters, the difference, and a degree of illness associated with the difference.
Preferably, the data analysis module captures the limb movement of the evaluation target across the plurality of single-frame images using a deep learning method based on a convolutional neural network.
Preferably, the motion parameters include one or more of a motion speed, a motion frequency, and a motion amplitude of the key points.
Preferably, the key points are arranged according to the hand structure of the evaluation target: in the single-frame image, the fingertip and the three finger joints of each finger are each provided with a key point, and the root of the palm is provided with at least one key point.
Preferably, the storage module further stores attributes of the evaluation target, the attributes including sex and age, and the degree of the condition is related to the attributes.
Preferably, the cervical spondylosis evaluation device further comprises a server and a data transmission module, the storage module, the information extraction module and the data analysis module are all arranged on the server, and the video acquisition module and the display module both communicate with the server through the data transmission module.
Preferably, the cervical spondylosis evaluation device further comprises a mobile phone or a tablet computer, and the video acquisition module and the display module are both arranged on the mobile phone or the tablet computer.
Preferably, the convolutional neural network comprises a convolutional pose machine. The convolutional pose machine comprises a plurality of stages: the first stage extracts the features of the single-frame image and calculates confidence maps of the key points, and each stage after the first takes the features of the single-frame image and the confidence maps output by the previous stage as input. After the convolutional pose machine obtains the output of the final stage, the predicted coordinate with the highest confidence in the confidence map corresponding to each key point is taken as the final result.
Preferably, the convolutional pose machine comprises an image feature extractor for extracting the features of the single-frame image, and the image feature extractor is a pre-trained VGG-19 network; in the first stage the convolutional pose machine is provided with two additional convolutional layers to obtain 128-channel features.
Preferably, the data analysis module comprises a calculation component that calculates the motion parameters of the keypoints from their coordinates in the plurality of confidence maps.
Compared with the prior art, the invention mainly has the following beneficial effects:
the user can perform a specific limb action and record it as a video with the video acquisition module; from the video, the information extraction module, the data analysis module and the storage module calculate parameters that measure how well the user completed the specific limb action and convert them into an evaluation result, which the user can view through the display module to learn the severity of his or her cervical spondylosis. The user can therefore operate the cervical spondylosis evaluation device independently to evaluate the severity of the condition; the device is convenient to use and allows a wide range of cervical spondylosis patients to assess the severity of their condition at any time, so that appropriate, timely treatment can be sought.
Drawings
In order to illustrate the solution of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without inventive effort.
Fig. 1 is a system block diagram of a cervical spondylosis evaluating apparatus according to the present invention;
FIG. 2 is a schematic diagram of the distribution of key points involved in the present invention;
FIG. 3 is a schematic diagram of the architecture of a convolutional neural network involved in the present invention;
fig. 4 is a flowchart of the operation of the cervical spondylosis evaluating apparatus according to the present invention.
Reference numerals:
100: cervical spondylosis evaluation device; 101: key point; 10: video acquisition module; 20: information extraction module; 30: data analysis module; 40: storage module; 50: display module; 60: mobile phone; 70: server.
Detailed Description
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs; the terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application; the terms "including" and "having," and any variations thereof, in the description and claims of this application and the description of the above figures are intended to cover non-exclusive inclusions. The terms "first," "second," and the like in the description and claims of this application or in the above-described drawings are used for distinguishing between different objects and not for describing a particular order.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Fig. 1 is a system block diagram of a cervical spondylosis evaluating apparatus 100 according to the present invention.
As shown in fig. 1, a cervical spondylosis evaluation device 100 based on limb movement according to a preferred embodiment of the present invention includes a video acquisition module 10, an information extraction module 20, a data analysis module 30, a storage module 40, a display module 50, a mobile phone 60, a server 70 and a data transmission module. The cervical spondylosis evaluation device 100 may be used to evaluate the severity of cervical spondylosis based on how well the evaluation target completes the limb movements. The evaluation target may be a cervical spondylosis patient.
The storage module 40 stores reference data for evaluating the severity of cervical spondylosis; the video acquisition module 10 is used to acquire a video containing the limb movement of the evaluation target; the information extraction module 20 is used to split the video into a plurality of single-frame images; the data analysis module 30 is used to calculate test parameters from the single-frame images, sets a plurality of key points 101 corresponding to the limbs of the evaluation target on the single-frame images, and is further used to calculate the difference between the test parameters and the reference data, the test parameters comprising the number of times the limbs of the evaluation target complete a preset action within a preset time period and/or motion parameters of the key points 101; the display module 50 is used to display the evaluation result, which comprises any one or more of the test parameters, the difference, and the degree of illness associated with the difference.
In this embodiment, the user can perform a specific limb action and record it as a video with the video acquisition module 10; from the video, the information extraction module 20, the data analysis module 30 and the storage module 40 calculate parameters that measure how well the user completed the specific limb action and convert them into an evaluation result, which the user can view through the display module 50 to learn the severity of his or her cervical spondylosis. The user can therefore operate the cervical spondylosis evaluation device 100 independently to evaluate the severity of the condition; the device is convenient to use and allows a wide range of cervical spondylosis patients to assess the severity of their condition at any time, so that appropriate, timely treatment can be sought.
The limb movements used to evaluate the severity of cervical spondylosis include a palm grip action or a leg stepping action. The preset time period is not less than 5 seconds and not more than 60 seconds. In some examples, the user performs a palm grip action, i.e., extends one palm, quickly makes a fist and then opens it again, repeating the grip for a number of cycles over a duration of no less than 5 seconds. In other examples, the user performs a leg stepping action, i.e., steps continuously with both legs for a duration of no less than 5 seconds. The time for which the user performs the limb action usually does not exceed 60 seconds, so that fatigue does not affect the accuracy of the cervical spondylosis evaluation. In some examples, the user may perform the limb action for 5 seconds while the video is recorded throughout, and the data analysis module 30 may define the preset time period as 5 seconds. In other examples, the preset time period may also be 10 seconds, 15 seconds or 20 seconds.
The storage module 40 stores reference data for evaluating the severity of cervical spondylosis. The reference data may be defined by a physician based on clinical diagnostic experience. In some examples, the reference data may be 10 palm grip actions in 5 seconds or 20 palm grip actions in 10 seconds. The storage module 40 also stores attributes of the evaluation target, including sex and age, and the degree of illness is associated with these attributes.
The storage module 40 may be a non-volatile memory, a flash memory, a ferroelectric random-access memory (FeRAM), a magnetic random-access memory (MRAM), a phase-change random-access memory (PRAM) or a resistive random-access memory (RRAM). This reduces the possibility of data loss due to a sudden power outage. In other examples, the storage module 40 may also be another type of readable storage medium, such as a read-only memory (ROM), a random-access memory (RAM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), a one-time programmable read-only memory (OTPROM), an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disk storage, magnetic disk storage, tape storage, or any other computer-readable medium that can be used to carry or store data. An appropriate memory can thus be selected for different situations.
The video acquisition module 10 is used to capture a video containing the limb movement of the evaluation target. The video acquisition module 10 may be a device capable of recording video, such as a mobile phone 60, a tablet computer, a pan-tilt camera or a digital camera. The video acquisition module 10 may also be a component of such a device, such as its camera module.
The information extraction module 20 is configured to split the video into a plurality of single-frame images. The information extraction module 20 includes a preprocessing component for splitting the video into a plurality of single-frame images and a checking component for checking whether a single-frame image contains a limb of the evaluation target. The limb may be a palm or a leg. In some examples, the preprocessing component splits the video into a plurality of single-frame images and transmits them to the checking component, which picks out the single-frame images containing a limb of the evaluation target and transmits them to the data analysis module 30. The checking component can discard redundant, useless single-frame images to improve processing efficiency. The checking component can determine whether a single-frame image contains a limb of the evaluation target using a deep learning method based on a convolutional neural network, which may employ ResNet-10. In some examples, the checking component may be omitted, and the information extraction module 20 splits the video into a plurality of single-frame images and transmits them directly to the data analysis module 30.
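As a rough illustration of the preprocessing and checking components, the following Python sketch splits a video into timestamped single-frame images with OpenCV and filters them through a placeholder limb check; the ResNet-10-based classifier mentioned above is not reproduced here, and the function and parameter names are hypothetical.

```python
import cv2  # OpenCV, assumed available for video decoding


def split_video_into_frames(video_path):
    """Split a recorded video into single-frame images (BGR arrays) with timestamps."""
    capture = cv2.VideoCapture(video_path)
    fps = capture.get(cv2.CAP_PROP_FPS) or 30.0  # fall back to 30 fps if metadata is missing
    frames = []
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        frames.append((index / fps, frame))  # (timestamp in seconds, image)
        index += 1
    capture.release()
    return frames


def contains_limb(frame):
    """Placeholder for the checking component; the ResNet-10-based classifier
    described above would be plugged in here."""
    return True  # assumption: keep every frame when no classifier is available


def extract_useful_frames(video_path):
    """Preprocessing plus checking: keep only frames that contain a limb."""
    return [(t, f) for t, f in split_video_into_frames(video_path) if contains_limb(f)]
```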
Fig. 2 is a schematic diagram of the distribution of the key points 101 involved in the present invention.
The data analysis module 30 is configured to calculate test parameters from the single-frame images. The data analysis module 30 sets a plurality of key points 101 corresponding to the limbs of the evaluation target on each single-frame image; the test parameters include the number of times the limbs of the evaluation target complete a preset action within a preset time period and/or motion parameters of the key points 101, and the data analysis module 30 is further configured to calculate the difference between the test parameters and the reference data. The motion parameters include one or more of the motion speed, motion frequency and motion amplitude of the key points 101. In some examples, the key points 101 are arranged according to the hand structure of the evaluation target: in the single-frame image, the fingertip and the three finger joints of each finger are each provided with a key point 101, and the root of the palm is provided with at least one key point 101. Referring to fig. 2, taking the palm grip action test as an example, 21 key points 101 may be set on the hand, including one palm-root key point and 4 key points on each finger (1 fingertip key point and 3 finger-joint key points). By detecting the hand key points 101 in each single-frame image of the video, the data analysis module 30 calculates the data needed for the condition evaluation (such as the motion speed, motion frequency and motion amplitude of the key points 101), and then combines the analysis with predefined medical knowledge to obtain the condition evaluation result. In some cases, the motion amplitude of a key point 101 may be characterized by a motion angle or a motion stroke.
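For concreteness, a hypothetical index layout for the 21 hand key points 101 described above might look like the following sketch; the specific index order is an assumption for illustration only, since the description fixes only the counts (one palm-root key point plus one fingertip and three joint key points per finger).

```python
# Hypothetical index layout for the 21 hand key points:
# index 0 is the palm root; each finger contributes 3 joint points and 1 fingertip.
FINGERS = ["thumb", "index", "middle", "ring", "little"]

HAND_KEYPOINTS = {"palm_root": 0}
for i, finger in enumerate(FINGERS):
    base = 1 + 4 * i
    HAND_KEYPOINTS[f"{finger}_joint_1"] = base
    HAND_KEYPOINTS[f"{finger}_joint_2"] = base + 1
    HAND_KEYPOINTS[f"{finger}_joint_3"] = base + 2
    HAND_KEYPOINTS[f"{finger}_tip"] = base + 3

assert len(HAND_KEYPOINTS) == 21  # matches the 21 key points set on the hand
```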
Fig. 3 is a schematic diagram of the architecture of the convolutional neural network involved in the present invention. In fig. 3, T denotes the stage, with T ≥ 2; C denotes a convolutional layer; P denotes a pooling layer; h denotes the height of the single-frame image; w denotes the width of the single-frame image; X denotes the image features extracted from the single-frame image; p (lower case) denotes the total number of joints; g denotes the classifier used to predict the 2D confidence maps; b denotes the individual prediction for each joint obtained from g; ψ denotes the 2D confidence map obtained by combining all the joint predictions contained in b; Loss f denotes the loss function.
The data analysis module 30 captures the limb movement of the evaluation target across the plurality of single-frame images using a deep learning method based on a convolutional neural network. Preferably, the convolutional neural network comprises Convolutional Pose Machines (CPMs). The convolutional pose machine comprises a plurality of stages: the first stage extracts the features of the single-frame image and calculates a confidence map for each key point 101, and each stage after the first takes the features of the single-frame image and the confidence maps output by the previous stage as input. After the convolutional pose machine obtains the output of the final stage, the predicted coordinate with the highest confidence in the confidence map corresponding to each key point 101 is taken as the final result. Preferably, the convolutional pose machine comprises an image feature extractor for extracting the features of the single-frame image, and the image feature extractor is a pre-trained VGG-19 network; in the first stage the convolutional pose machine is provided with two additional convolutional layers to obtain 128-channel features.
Specifically, CPMs are a sequential prediction framework that can learn rich spatial models. A CPM consists of a sequence of fully convolutional networks and repeatedly outputs a 2D confidence map for each key point 101. As shown in fig. 3, the CPM framework can be divided into T stages: the first stage extracts the features of the single-frame image and calculates the confidence map of each key point 101, and each later stage takes the features of the single-frame image and the 2D confidence maps output by the previous stage as input. The confidence maps provide the later stages with a non-parametric encoding of the spatial uncertainty of each key point 101 location, so that the CPM can learn a rich spatial model of the relationships between the key points 101 in the image.
In this embodiment, in order to apply the CPM model to the detection of the hand key points 101, the convolutional network structure used by the model is configured as follows: (1) the image feature extractor of stage 1 is a pre-trained VGG-19 network, to which two additional convolutional layers are added to obtain 128-channel features (for an image of size w × h, the feature map obtained in stage 1 has size w' × h' × 128, where w' = w/8 and h' = h/8); a 2D confidence map for each hand key point 101 is then generated from the resulting feature map and used as input to the subsequent stages; (2) in stages 2 to T, the confidence maps of the previous stage are concatenated with the image features and used as input, and a new 2D confidence map is then calculated (each key point 101 corresponds to one confidence map). The model uses 6 stages to detect the hand key points 101, and after the output of the final stage is obtained, the predicted coordinate with the highest confidence in the confidence map corresponding to each key point 101 is taken as the final result.
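To illustrate the stage structure described above, the following is a minimal PyTorch sketch of a CPM-style network, assuming a VGG-19 backbone truncated around conv4_4 followed by two extra convolutions giving 128-channel features at 1/8 resolution, 6 stages, and 21 hand key points. The exact layer widths, kernel sizes and backbone cut-off point are illustrative assumptions, not the patent's precise network.

```python
import torch
import torch.nn as nn
import torchvision

NUM_KEYPOINTS = 21   # the 21 hand key points described above
NUM_STAGES = 6       # the description uses 6 stages


class FeatureExtractor(nn.Module):
    """Pre-trained VGG-19 backbone plus two additional convolutions giving
    128-channel features at 1/8 resolution (w' = w/8, h' = h/8)."""
    def __init__(self):
        super().__init__()
        vgg = torchvision.models.vgg19(weights="IMAGENET1K_V1").features
        self.backbone = vgg[:27]  # three pooling layers -> spatial stride 8 (assumed cut point)
        self.extra = nn.Sequential(
            nn.Conv2d(512, 256, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(256, 128, 3, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, image):
        return self.extra(self.backbone(image))  # (N, 128, h/8, w/8)


def prediction_head(in_channels):
    """Small convolutional head producing one confidence map per key point
    (layer widths and kernel sizes are illustrative assumptions)."""
    return nn.Sequential(
        nn.Conv2d(in_channels, 128, 7, padding=3), nn.ReLU(inplace=True),
        nn.Conv2d(128, 128, 7, padding=3), nn.ReLU(inplace=True),
        nn.Conv2d(128, NUM_KEYPOINTS, 1),
    )


class ConvolutionalPoseMachine(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = FeatureExtractor()
        self.stage1 = prediction_head(128)
        self.later_stages = nn.ModuleList(
            prediction_head(128 + NUM_KEYPOINTS) for _ in range(NUM_STAGES - 1)
        )

    def forward(self, image):
        x = self.features(image)
        maps = self.stage1(x)                        # stage 1: image features only
        outputs = [maps]
        for stage in self.later_stages:              # stages 2..T: features + previous maps
            maps = stage(torch.cat([x, maps], dim=1))
            outputs.append(maps)
        return outputs                               # one confidence-map tensor per stage


def keypoint_coordinates(final_maps):
    """Take the most confident location in each final-stage confidence map."""
    n, p, h, w = final_maps.shape
    flat = final_maps.view(n, p, -1).argmax(dim=-1)
    return torch.stack([flat % w, flat // w], dim=-1)  # (N, P, 2) as (x, y) in map pixels
```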
The data analysis module 30 includes a calculation component that calculates the motion parameters of the key points 101 from their coordinates in the plurality of confidence maps. After key point 101 detection has been performed on every single-frame image of the video, the coordinates of the key points 101 in the resulting confidence maps are obtained and paired with the time point of each single-frame image; from these, the calculation component can compute motion parameters such as the motion trajectory, motion speed, motion angle or motion stroke of each key point 101, and by comparing the motion parameters with predefined medical reference information the evaluation of disease severity is obtained. In some cases, the calculation performed by the data analysis module 30 may be simplified to counting only the number of times the limbs of the evaluation target complete the preset action within the preset time period, and the severity of the evaluation target's cervical spondylosis is then evaluated from that count.
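The following sketch illustrates one way the calculation component might turn per-frame key-point coordinates into test parameters for the palm grip action. The hand-openness measure, the fingertip/palm-root indices (following the hypothetical layout sketched earlier) and the thresholds used to detect a completed grip cycle are illustrative assumptions, not values specified in this description.

```python
import numpy as np

PALM_ROOT = 0                       # indices follow the hypothetical layout sketched earlier
FINGERTIPS = [4, 8, 12, 16, 20]


def hand_openness(keypoints):
    """Mean distance from the palm root to the five fingertips for one frame;
    keypoints is a (21, 2) array of (x, y) coordinates."""
    palm = keypoints[PALM_ROOT]
    tips = keypoints[FINGERTIPS]
    return float(np.linalg.norm(tips - palm, axis=1).mean())


def grip_test_parameters(timestamps, keypoints_per_frame,
                         open_ratio=0.7, close_ratio=0.4):
    """Count completed grip actions and derive frequency and amplitude from the
    per-frame key-point coordinates; the thresholds are illustrative assumptions."""
    openness = np.array([hand_openness(k) for k in keypoints_per_frame])
    peak, trough = openness.max(), openness.min()
    count, closed = 0, False
    for value in openness:
        if not closed and value < close_ratio * peak:
            closed = True                 # hand has closed into a fist
        elif closed and value > open_ratio * peak:
            closed = False                # hand re-opened: one full grip cycle completed
            count += 1
    duration = float(timestamps[-1] - timestamps[0])
    return {
        "grip_count": count,
        "grip_frequency_hz": count / duration if duration > 0 else 0.0,
        "motion_amplitude": float(peak - trough),
    }
```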
The display module 50 is used to display the evaluation result, which includes any one or more of the test parameters, the difference and the degree of illness associated with the difference. The display module 50 may take the form of a display screen for presenting the evaluation result. For example, for a 10-second palm grip action test, the doctor sets the reference data to 20 based on clinical experience, and the evaluation target completes 15 palm grip actions within 10 seconds. The evaluation result displayed by the display module 50 may be the test parameter, for example 15. The evaluation result displayed by the display module 50 may be the difference between the test parameter and the reference data, for example -5. The evaluation result displayed by the display module 50 may also be the degree of illness associated with the difference. In some examples, the degree of illness may be expressed as one of several rating scores, such as 1, 2, 3, 4 and 5. A score of 5 may indicate that the cervical spine is normal, 4 that the cervical spondylosis is mildly severe, 3 that it is moderately severe, 2 that it is very severe, and 1 that it is extremely severe. The evaluation result displayed by the display module 50 may be any one of these scores. In other examples, the degree of illness may also be presented as a textual description.
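A minimal sketch of how the difference between the test parameter and the reference data could be mapped to the 1-to-5 scale described above follows; the cut-off values are illustrative assumptions, since the description fixes only the example (reference 20, measured 15, difference -5) and the meaning of each score.

```python
def condition_score(test_count, reference_count):
    """Map the difference between the measured grip count and the reference count
    to the 1-5 scale described above; the cut-offs are illustrative assumptions."""
    difference = test_count - reference_count
    if difference >= 0:
        return 5   # cervical spine normal
    if difference >= -3:
        return 4   # mildly severe
    if difference >= -6:
        return 3   # moderately severe
    if difference >= -10:
        return 2   # very severe
    return 1       # extremely severe


# Example from the description: reference 20 grips in 10 s, 15 completed -> difference -5
assert condition_score(15, 20) == 3
```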
The storage module 40, the information extraction module 20 and the data analysis module 30 are all arranged on the server 70, and the video acquisition module 10 and the display module 50 both communicate with the server 70 through the data transmission module. The data transmission module may be a wired network or a wireless network. The user records a video with the video acquisition module 10 and views the evaluation result through the display module 50, while the functions of the storage module 40, the information extraction module 20 and the data analysis module 30 are all carried out on the server 70 after the video is uploaded; this simplifies the requirements on the equipment held by the user and helps the cervical spondylosis evaluation device 100 to be popularized. The information extraction module 20 and the data analysis module 30 may be integrated in the same processor of the server 70. The storage module 40 may be independent of the information extraction module 20 and the data analysis module 30. In some examples, the information extraction module 20 and the data analysis module 30 may also have their own storage units. The storage module 40, the information extraction module 20 and the data analysis module 30 may instead be arranged on the mobile phone 60 or a tablet computer, so that the user can complete the cervical spondylosis evaluation on the mobile phone 60 or the tablet computer.
Fig. 4 is a flowchart of the operation of the cervical spondylosis evaluating apparatus 100 according to the present invention.
As shown in fig. 4, the workflow of the cervical spondylosis evaluating apparatus 100 according to the present invention includes: acquiring a video of limb movement containing an evaluation target (step S1); splitting the video into a plurality of single-frame images (step S2); calculating test parameters from the single frame image (step S3); the evaluation result is displayed (step S4).
At step S1, the user may record a video of the completion of a particular limb movement using video capture module 10. The specific limb action may be a palm grip action or a leg step action.
In step S2, the user may upload the video to the server 70 in the cloud via the wireless network, and the information extraction module 20 on the server 70 splits the video into a plurality of single-frame images.
In step S3, the data analysis module 30 on the server 70 captures the limb movement of the evaluation target in a plurality of single-frame images using a convolutional neural network-based deep learning method, and calculates the motion parameters of the key points 101 on the limb, the motion parameters including the motion speed, the motion frequency, and the motion amplitude of the key points 101. In some cases, the calculation of the data analysis module 30 may be simplified, and only the number of times the limbs of the assessment target complete the preset action within the preset time period may be calculated, and the severity of the cervical spondylosis of the assessment target may be assessed according to the number of times the limbs complete the preset action.
In step S4, the evaluation result is displayed on the display module 50. The assessment results include any one or more of the test parameters, the differences between the test parameters and the reference data, and the degree of the condition associated with the differences.
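Putting steps S1 to S4 together, the following orchestration sketch shows how the pieces could be combined. Every component is passed in as a callable and corresponds to the hypothetical sketches given earlier in this description, so nothing here is the patent's actual implementation.

```python
def evaluate_cervical_spondylosis(video_path, split_frames, detect_keypoints,
                                  compute_parameters, reference_count, score):
    """End-to-end flow for steps S1 to S4; all components are injected as callables
    so the sketch stays independent of any particular implementation."""
    frames = split_frames(video_path)                              # step S2: (timestamp, image) pairs
    timestamps = [t for t, _ in frames]
    keypoints = [detect_keypoints(image) for _, image in frames]   # step S3: key point detection
    parameters = compute_parameters(timestamps, keypoints)         # step S3: test parameters
    difference = parameters["grip_count"] - reference_count
    return {                                                       # step S4: evaluation result
        "test_parameters": parameters,
        "difference": difference,
        "condition_score": score(parameters["grip_count"], reference_count),
    }
```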
The video capture module 10 and the display module 50 may be both disposed on the mobile phone 60 or a tablet computer. That is, the user can record a video for completing a specific limb action using the mobile phone 60, and view the evaluation result on the mobile phone 60.
In some examples, the application interface of the cervical spondylosis evaluation device 100 may be an APP installed on the mobile phone 60: the user records a video of the specific limb action with the mobile phone 60 and uploads it from the APP to the server 70 in the cloud over a wireless network; the information extraction module 20 and the data analysis module 30 on the server 70 process and analyze the video, the evaluation result is transmitted back to the APP on the mobile phone 60 over the wireless network, and the user views it on the mobile phone 60. In some examples, the video and the evaluation result may also be transmitted to a doctor, who can view the evaluation target's completion of the limb action and the evaluation result through the display module 50 and give further assessment and treatment advice. In addition, previous evaluation results can be stored in the storage module 40 to record the progression of the cervical spondylosis, which is helpful for targeted treatment.
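The APP-to-server round trip described above could, for example, be implemented over HTTP as in the following sketch; the endpoint URL, request fields and JSON response format are purely illustrative assumptions, since the description only states that the video is uploaded over a wireless network and the evaluation result is returned.

```python
import requests  # assumed HTTP client; the description does not specify the transport protocol

SERVER_URL = "https://example-server/api/evaluate"   # hypothetical endpoint


def upload_and_evaluate(video_path, user_id):
    """Upload a recorded limb-action video to the cloud server and fetch the
    evaluation result, mirroring the APP workflow described above."""
    with open(video_path, "rb") as video_file:
        response = requests.post(
            SERVER_URL,
            files={"video": video_file},
            data={"user_id": user_id},
            timeout=120,
        )
    response.raise_for_status()
    # assumed response shape: {"test_parameters": ..., "difference": ..., "condition_score": ...}
    return response.json()
```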
In this embodiment, the user can therefore perform a specific limb action, record it as a video, and obtain an evaluation of the severity of his or her cervical spondylosis by operating the cervical spondylosis evaluation device 100 independently, which is convenient and allows a wide range of cervical spondylosis patients to assess the severity of their condition at any time so that appropriate, timely treatment can be sought. Compared with the existing approach of manual counting by a doctor, the cervical spondylosis evaluation device 100 counts more precisely and its evaluation result is more accurate. In addition, in the existing manual approach the interpretation of the count depends on the doctor's personal experience, and doctors in lower-level local hospitals may lack such experience, so the existing approach is difficult to spread and popularize; by contrast, the cervical spondylosis evaluation device 100 provided by the present application is simple to use, less dependent on individual experience, more accurate, and easier to promote and popularize. The cervical spondylosis evaluation device 100 provided by the present application uses artificial intelligence to analyze the hand or leg movement information in the video, so it does not tire or miss counts and its accuracy is better than manual counting; the score is obtained automatically and is not affected by the doctor's experience; it can be used over the network for remote diagnosis and treatment and for patients' self-examination; and the data can be archived for dynamic analysis.
It should be understood that the above embodiments are merely illustrative of some, but not all, embodiments of the invention; the appended drawings show preferred embodiments and do not limit the scope of the invention. This application may be embodied in many different forms, and the embodiments are provided so that the disclosure of the application will be thoroughly understood. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described in the foregoing embodiments or replace some of their features with equivalents. All equivalent structures made using the contents of the specification and drawings of the present application, whether applied directly or indirectly in other related technical fields, fall within the scope of protection of the present application.

Claims (10)

1. A cervical spondylosis evaluation device based on limb movement, comprising:
a storage module for storing reference data used to evaluate the severity of cervical spondylosis;
a video acquisition module for acquiring a video containing the limb movement of an evaluation target;
an information extraction module for splitting the video into a plurality of single-frame images;
a data analysis module for calculating test parameters from the single-frame images, wherein the data analysis module sets a plurality of key points corresponding to the limbs of the evaluation target on the single-frame images, the test parameters comprise the number of times the limbs of the evaluation target complete a preset action within a preset time period and/or motion parameters of the key points, and the data analysis module is further configured to calculate the difference between the test parameters and the reference data; and
a display module for displaying an evaluation result comprising any one or more of the test parameters, the difference, and a degree of illness associated with the difference.
2. The cervical spondylosis evaluating apparatus according to claim 1,
the data analysis module captures a body motion of an evaluation target in a plurality of the single-frame images by using a convolutional neural network-based deep learning method.
3. The cervical spondylosis evaluating apparatus according to claim 2,
the motion parameters include one or more of a motion speed, a motion frequency, and a motion amplitude of the keypoint.
4. The cervical spondylosis evaluating apparatus according to claim 2,
the key points are arranged according to the hand structure of the evaluation target: in the single-frame image, the fingertip and the three finger joints of each finger are each provided with a key point, and the root of the palm is provided with at least one key point.
5. The cervical spondylosis evaluating apparatus according to claim 2,
the storage module also stores attributes of the assessment target, the attributes including gender and age, and the degree of the condition is associated with the attributes.
6. The cervical spondylosis evaluating apparatus according to claim 2,
further comprising a server and a data transmission module, wherein the storage module, the information extraction module and the data analysis module are all arranged on the server, and the video acquisition module and the display module both communicate data with the server through the data transmission module.
7. The cervical spondylosis evaluating apparatus according to claim 5,
further comprising a mobile phone or a tablet computer, wherein the video acquisition module and the display module are both arranged on the mobile phone or the tablet computer.
8. The cervical spondylosis evaluation device of any one of claims 2 to 7,
the convolutional neural network comprises a convolutional pose machine, the convolutional pose machine comprises a plurality of stages, the first stage extracts the features of the single-frame image and calculates confidence maps of the key points, each stage after the first takes the features of the single-frame image and the confidence maps output by the previous stage as input, and after the convolutional pose machine obtains the output of the final stage, the predicted coordinate with the highest confidence in the confidence map corresponding to each key point is taken as the final result.
9. The cervical spondylosis evaluating apparatus according to claim 8,
the convolutional pose machine comprises an image feature extractor for extracting the features of the single-frame image, and the image feature extractor is a pre-trained VGG-19 network; in the first stage the convolutional pose machine is provided with two additional convolutional layers to obtain 128-channel features.
10. The cervical spondylosis evaluating apparatus according to claim 8,
the data analysis module includes a computation component that computes the motion parameters of the keypoints from their coordinates in the plurality of confidence maps.
CN202011601461.6A 2020-12-30 2020-12-30 Cervical spondylosis evaluation device based on limb movement Pending CN112741620A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011601461.6A CN112741620A (en) 2020-12-30 2020-12-30 Cervical spondylosis evaluation device based on limb movement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011601461.6A CN112741620A (en) 2020-12-30 2020-12-30 Cervical spondylosis evaluation device based on limb movement

Publications (1)

Publication Number Publication Date
CN112741620A true CN112741620A (en) 2021-05-04

Family

ID=75647298

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011601461.6A Pending CN112741620A (en) 2020-12-30 2020-12-30 Cervical spondylosis evaluation device based on limb movement

Country Status (1)

Country Link
CN (1) CN112741620A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105046893A (en) * 2015-08-07 2015-11-11 天津中科智能技术研究院有限公司 Sitting posture monitor and control method
WO2018158552A1 (en) * 2017-02-28 2018-09-07 Pro Sport Support Ltd System, method and markers for assessing athletic performance
CN108031109A (en) * 2017-12-25 2018-05-15 华南理工大学广州学院 A kind of game control device of interactive gait recognition method
WO2020046219A1 (en) * 2018-08-28 2020-03-05 Alanay Ahmet Novel calculation and analysis method, planning and application platform that personalizes the mathematical definition of spinal alignment and shape
CN110738192A (en) * 2019-10-29 2020-01-31 腾讯科技(深圳)有限公司 Human motion function auxiliary evaluation method, device, equipment, system and medium
CN111539941A (en) * 2020-04-27 2020-08-14 上海交通大学 Parkinson's disease leg flexibility task evaluation method and system, storage medium and terminal
CN111973151A (en) * 2020-05-30 2020-11-24 华南理工大学 Infectious disease monitoring system and method based on wearable intelligent bandage

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SHIH-EN WEI et al.: "Convolutional Pose Machines", CVPR *
张风雷: "Hand key point estimation based on pose machines and convolutional neural networks", Computer and Digital Engineering *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114305398A (en) * 2021-12-15 2022-04-12 上海长征医院 System for detecting spinal cervical spondylosis of object to be detected
CN114305398B (en) * 2021-12-15 2023-11-24 上海长征医院 System for be used for detecting spinal cord type cervical spondylosis of object to be tested

Similar Documents

Publication Publication Date Title
US11961620B2 (en) Method and apparatus for determining health status
US20230078968A1 (en) Systems and Methods for Monitoring and Evaluating Body Movement
US9629573B2 (en) Interactive virtual care
CN110738192A (en) Human motion function auxiliary evaluation method, device, equipment, system and medium
US20180184947A1 (en) Integrated Goniometry System and Method for Use of Same
JP2006305260A (en) Expression diagnosis assisting apparatus
KR102466438B1 (en) Cognitive function assessment system and method of assessing cognitive funtion
CN111887867A (en) Method and system for analyzing character formation based on expression recognition and psychological test
CN113658211A (en) User posture evaluation method and device and processing equipment
CN111524093A (en) Intelligent screening method and system for abnormal tongue picture
CN114022512A (en) Exercise assisting method, apparatus and medium
Bumacod et al. Image-processing-based digital goniometer using OpenCV
CN112562852A (en) Cervical spondylosis screening device based on limb movement
CN112741620A (en) Cervical spondylosis evaluation device based on limb movement
CN113869090A (en) Fall risk assessment method and device
CN114998816B (en) Case improvement method and device based on skeleton AI video and storage medium
Vitali et al. Quantitative assessment of shoulder rehabilitation using digital motion acquisition and convolutional neural network
CN108742548A (en) A kind of visualization Traditional Chinese medicinal wrist drawing system and device
KR20140132864A (en) easy measuring meathods for physical and psysiological changes on the face and the body using users created contents and the service model for healing and wellness using these techinics by smart devices
WO2024055192A1 (en) Method and system for marking motion data and generating motion evaluation model
CN114334082B (en) Electronic device
Alpatov et al. Chatbot for Remote Physical Knee Rehabilitation with Exercise Monitoring based on Machine Learning
JP2022105426A (en) Method for generating learning model, method for processing information and program
Machado Extraction of Biomedical Indicators from Gait Videos
KR20230161131A (en) Method for diagnosing skin based on artificial intelligence

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210504)