CN115272260A - Joint movement detection method and system based on image data - Google Patents

Joint movement detection method and system based on image data

Info

Publication number
CN115272260A
Authority
CN
China
Prior art keywords
image, angle, joint, calculating, activity
Legal status
Pending
Application number
CN202210936110.3A
Other languages
Chinese (zh)
Inventor
陈昶旭
王腾宽
豆正磊
程吉安
Current Assignee
Shanghai Shangyong Technology Co., Ltd.
Original Assignee
Shanghai Shangyong Technology Co., Ltd.
Priority date
2022-08-05
Filing date
2022-08-05
Publication date
2022-11-01
Application filed by Shanghai Shangyong Technology Co., Ltd.
Priority to CN202210936110.3A
Publication of CN115272260A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20084 Artificial neural networks [ANN]
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30008 Bone
    • G06T 2207/30196 Human being; Person

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention relates to the technical field of disability assessment and provides a joint movement detection method and system based on image data, comprising the following steps: selecting the joint to be examined and capturing image information of it; extracting a single-frame image from the image information and converting it into a grayscale image; calculating the absolute difference between each pixel's gray value and the mean gray value to obtain the brightness deviation of the single-frame image from normal illumination; brightening or darkening the single-frame image by means of a Gamma transformation; obtaining the two-dimensional or three-dimensional coordinates of each joint point in the brightness-corrected picture through an image neural network; calculating the mobility of the joint from the coordinates of the joint points and recording the maximum mobility; and calculating the disability grade from the computed mobility in combination with the provisions of the disability assessment standard. The invention saves labor cost, improves the efficiency of human joint mobility detection and thereby improves the efficiency of disability assessment.

Description

Joint movement detection method and system based on image data
Technical Field
The invention relates to the technical field of disability assessment, and in particular to a joint movement detection method and system based on image data.
Background
Joint mobility is a common test item in clinical rehabilitation medicine and disability assessment: an examiner can use it as an index of how well a patient's joint has recovered after surgery. Although manual measurement is comparatively accurate, it requires the examinee to travel to a designated testing institution and occupies several examiners per examinee, which wastes manpower and is inefficient.
Disclosure of Invention
The invention mainly solves the technical problem that manual measurement in the prior art requires several examiners per examinee, wasting manpower and yielding low efficiency, and provides a joint movement detection method and system based on image data, so as to improve the accuracy and reliability of joint movement detection, save labor cost and improve detection efficiency.
The invention provides a joint movement detection method based on image data, which comprises the following steps:
selecting the joint part to be detected and capturing image information of it;
extracting a single-frame image from the image information and converting it into a grayscale image;
calculating the absolute difference between each pixel's gray value and the mean gray value to obtain the brightness deviation of the single-frame image from normal illumination;
brightening or darkening the single-frame image by means of a Gamma transformation;
obtaining the coordinates of each joint point in the brightness-corrected picture through an image neural network;
calculating the mobility of the joint part from the coordinates of the joint points and recording the maximum mobility;
and calculating the disability grade from the computed mobility in combination with the provisions of the disability assessment standard.
Further, the selecting of a joint part to be detected and capturing of image information of it includes: the examiner having the examinee perform the joint movements according to the forensic clinical examination specification; and capturing each group of joint movements of the examinee with a camera.
Further, the extracting of a single-frame image from the image information and converting it into a grayscale image includes:
reading the single-frame image through OpenCV and converting it into a grayscale image with equation (1):
Gray = R*0.3 + G*0.59 + B*0.11 (1)
where Gray is the gray value, and R, G and B are the red, green and blue channel values of the image.
Further, the calculating of the absolute difference between each pixel's gray value and the mean gray value to obtain the brightness deviation of the single-frame image from normal illumination includes:
obtaining the brightness value and the mean absolute deviation with equations (2) and (3):
light = Σ_{i,j} (I(i,j) - 127.5) / (H*W) (2)
where I is the grayscale image (a two-dimensional array), H is the height of the grayscale image, W is its width, and light is its brightness value: light greater than 0 means the image is bright, light less than 0 means it is dark;
average(I) = Σ_{i,j} |I(i,j) - 127.5 - light| / (H*W) (3)
where average(I) is the mean absolute deviation of the pixel gray values; the larger the value, the more uneven the illumination.
Further, the brightening or darkening of the single-frame image by means of a Gamma transformation includes:
brightening or darkening the single-frame image with equation (4):
s = c*r^γ, r ∈ (0,1) (4)
where c and γ are constants, s is the processed single-channel pixel value, and r is the single-channel pixel value of the original image after normalization.
Further, the calculating of the mobility of the joint part from the coordinates of the joint points includes: the coordinates being two-dimensional coordinates;
obtaining the two-dimensional mobility of the joint part with equation (5):
angle(point_1, point_2, point_3, point_4) = arccos(((point_2 - point_1)·(point_4 - point_3)) / (dis(point_1, point_2)*dis(point_3, point_4))) (5)
where the point_k are joint points output by the model, angle is the two-dimensional plane angle of the joint, and dis is the Euclidean distance in two-dimensional space:
dis(point_a, point_b) = sqrt((x_a - x_b)^2 + (y_a - y_b)^2)
further, the calculating the degree of motion of the joint part according to the coordinates of each joint point comprises: the coordinates are three-dimensional coordinates;
obtaining the three-dimensional mobility of the joint part by the formula (6):
the three-dimensional space angle calculation formula is as follows:
Figure BDA0003783308810000033
where dis denotes the Euclidean distance in three-dimensional space,
Figure BDA0003783308810000034
further, the calculating the disability grade according to the calculated activity degree and combined with the disability identification standard specification comprises the following steps:
the following indexes are measured according to the traffic accident disability identification standard:
the calculation formula of the function losing proportion of the two hands is (the activity of the left wrist + the activity of the right wrist)/2;
the calculation formula of the wrist activity degree is (2-dorsal extension measuring and calculating angle/dorsal extension reference angle-palmflexion measuring and calculating angle/palmflexion reference angle-radial deviation measuring and calculating angle/radial deviation reference angle-ulnar deviation measuring and calculating angle/ulnar deviation reference angle)/2;
the formula for calculating the ratio of the forearm rotation function loss of the upper limb is (the rotation activity of the left elbow + the rotation activity of the right elbow)/2;
the calculation formula of the elbow rotation activity degree is 1-measuring and calculating the angle before rotation/the reference angle before rotation-measuring and calculating the angle after rotation/measuring and calculating the angle after rotation;
the maximum upper limb loss function ratio is calculated by the formula of 0.18 wrist activity, 0.12 elbow activity and 0.7 shoulder activity;
the calculation formula of the shoulder motion degree is (3) -the angle of elevation measurement in anteflexion/the reference angle of elevation measurement in anteflexion-the angle of measurement in postextension/the reference angle of measurement in postextension-the angle of elevation measurement in abduction/the reference angle of elevation measurement in abduction-the angle of measurement in adduction/the reference angle of measurement in adduction-the angle of measurement in horizontal supination/the reference angle of measurement in horizontal supination/the angle of measurement in sticking arm position/the reference angle of measurement in sticking arm position-the angle of measurement in sticking arm position/the reference angle of measurement in sticking arm position)/3.
The invention further provides a joint movement detection system based on image data for the above method, the system comprising an image acquisition module, an image processing module and a motion calculation module. The image acquisition module selects the joint part to be detected and captures image information of it. The image processing module extracts a single-frame image from the image information and converts it into a grayscale image; calculates the absolute difference between each pixel's gray value and the mean gray value to obtain the brightness deviation of the single-frame image from normal illumination; brightens or darkens the single-frame image by means of a Gamma transformation; and sends the brightness-corrected pictures to the motion calculation module. The motion calculation module obtains the coordinates of each joint point from the brightness-corrected pictures through the OpenPose model, calculates the mobility of the joint part and records the maximum mobility; it then calculates the disability grade from the computed mobility in combination with the disability assessment standard.
With the joint movement detection method and system based on image data, the positions of human joint points are identified and the joint angles are computed by an image neural network, which saves labor cost and improves detection efficiency compared with manual measurement. The invention provides both a 2D and a 3D detection scheme; in the 3D scheme, the three-dimensional coordinates of each joint point are computed through the OpenPose model and the mobility of the joint part is then calculated from them. Moreover, while the 2D scheme requires the examinee to turn towards the lens for some movements, the 3D scheme only requires the examinee to perform the movements facing forward, which further improves detection efficiency.
Drawings
FIG. 1 is a flow chart of the joint movement detection method based on image data provided by the present invention;
FIG. 2 is a schematic diagram of the structure of the joint movement detection system based on image data provided by the present invention;
FIG. 3 is an example of the body joint point positions in the OpenPose model used by the present invention;
FIG. 4 is an example of the hand joint point positions in the OpenPose model used by the present invention;
FIG. 5 shows example movement requirements and measurements of a prior-art wrist measurement made with a protractor;
FIG. 6 shows example movement requirements and measurements of a prior-art shoulder measurement made with a protractor;
FIG. 7 is a flow chart of the cooperation between the front end and the back end of the joint movement detection system based on image data in the embodiment.
Detailed Description
In order to make the technical problems solved, the technical solutions adopted and the technical effects achieved by the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some but not all of the relevant elements of the present invention are shown in the drawings.
As shown in fig. 7, this embodiment requires a GPU-equipped computer as the back end and another computer as the front end, plus a ZED stereo camera if the 3D scheme is used. The front end obtains permission to use the camera or the ZED stereo camera through web-page access and sends two-dimensional pictures or depth maps to the back end for computation. The front end can display the example movements and videos that the user needs to perform. The examinee performs the example movements as required, and the back end returns the detected angle in real time. After all movements are finished, the detection result is returned, namely the disability grade and the movement angle of each joint.
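The loop below is a minimal sketch of the front-end side of this flow: it captures webcam frames with OpenCV and posts them to the back end, which is assumed to answer with the detected angle. The endpoint URL and the JSON field name are illustrative assumptions, not part of the disclosure.

    import cv2
    import requests

    BACKEND_URL = "http://backend:8000/detect"  # hypothetical back-end endpoint

    cap = cv2.VideoCapture(0)  # ordinary webcam; a ZED camera would also supply a depth map
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # Encode the 2D frame as JPEG and send it to the GPU back end.
        _, buf = cv2.imencode(".jpg", frame)
        resp = requests.post(BACKEND_URL, data=buf.tobytes(),
                             headers={"Content-Type": "image/jpeg"})
        # The back end is assumed to return the angle detected in this frame.
        print(resp.json().get("angle"))
    cap.release()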
As shown in fig. 1, a joint movement detection method based on image data according to an embodiment of the present invention includes:
101. Selecting the joint part to be detected and capturing image information of it;
Specifically, the doctor has the examinee perform several groups of joint movements according to the forensic clinical examination specification, and each group of movements is captured with a camera or a ZED stereo camera. All movements only need to be performed facing the camera: the examinee neither has to hold the difficult positions required when measuring manually with a protractor, as in figs. 5 and 6, nor has to turn the body to different angles as a 2D-only detection system would require.
102. Extracting a single-frame image from the image information and converting it into a grayscale image;
Specifically, the single-frame image is read through OpenCV and converted into a grayscale image with equation (1):
Gray = R*0.3 + G*0.59 + B*0.11 (1)
where Gray is the gray value, and R, G and B are the red, green and blue channel values of the image.
The image read by OpenCV has three channels; the conversion collapses the three-channel picture into a single-channel grayscale picture.
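As a concrete illustration of this step, the snippet below reads a frame with OpenCV and applies the channel weights of equation (1); note that OpenCV loads images in BGR order. This is a sketch of the described step, not code from the patent.

    import cv2
    import numpy as np

    frame = cv2.imread("frame.jpg")               # OpenCV loads images as BGR
    b, g, r = cv2.split(frame.astype(np.float32))
    # Equation (1): Gray = R*0.3 + G*0.59 + B*0.11
    gray = (0.3 * r + 0.59 * g + 0.11 * b).astype(np.uint8)

In practice cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY) applies the nearly identical ITU-R BT.601 weights (0.299, 0.587, 0.114) and is the usual shortcut.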
103. Calculating the absolute difference between each pixel's gray value and the mean gray value to obtain the brightness deviation of the single-frame image from normal illumination;
Specifically, since differing light conditions in the shooting environment affect the motion calculation, the 2D image video is processed first. The brightness value and the mean absolute deviation are obtained with equations (2) and (3):
light = Σ_{i,j} (I(i,j) - 127.5) / (H*W) (2)
where I is the grayscale image (a two-dimensional array), H is the height of the grayscale image, W is its width, and light is its brightness value: light greater than 0 means the image is bright, light less than 0 means it is dark;
average(I) = Σ_{i,j} |I(i,j) - 127.5 - light| / (H*W) (3)
where average(I) is the mean absolute deviation of the pixel gray values; the larger the value, the more uneven the illumination.
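A small helper implementing this step as reconstructed above; since equations (2) and (3) are rendered as images in the original publication, the exact formulas here (deviation from the mid-gray value 127.5) are an assumption drawn from the surrounding description.

    import numpy as np

    def brightness_stats(gray: np.ndarray):
        I = gray.astype(np.float32)
        # Eq. (2) as reconstructed: mean deviation from mid-gray 127.5;
        # light > 0 means the frame is bright, light < 0 means it is dark.
        light = float(np.mean(I - 127.5))
        # Eq. (3) as reconstructed: mean absolute deviation; larger values
        # indicate more uneven illumination.
        unevenness = float(np.mean(np.abs(I - 127.5 - light)))
        return light, unevenness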
104. Brightening or darkening the single-frame image by means of a Gamma transformation;
Specifically, the single-frame image is brightened or darkened with equation (4):
s = c*r^γ, r ∈ (0,1) (4)
where c and γ are constants, s is the processed single-channel pixel value, and r is the single-channel pixel value of the original image after normalization.
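Equation (4) is the classic power-law (Gamma) transformation; the sketch below applies it to a grayscale frame. The rule for picking gamma from the light value of step 103 (gamma < 1 to brighten a dark frame, gamma > 1 to darken a bright one) is a common convention, not something the patent specifies.

    import cv2
    import numpy as np

    def gamma_adjust(img: np.ndarray, gamma: float, c: float = 1.0) -> np.ndarray:
        # Equation (4): s = c * r**gamma, with r the pixel value
        # normalized into (0, 1).
        r = img.astype(np.float32) / 255.0
        s = c * np.power(r, gamma)
        return np.clip(s * 255.0, 0.0, 255.0).astype(np.uint8)

    gray = cv2.imread("frame.jpg", cv2.IMREAD_GRAYSCALE)
    brightened = gamma_adjust(gray, gamma=0.6)  # gamma < 1 brightens; > 1 dims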
105. Obtaining the coordinates of each joint point in the brightness-corrected picture through an image neural network;
Specifically, as shown in figs. 3 and 4, the processed pictures are fed into the OpenPose model to obtain the 2D or 3D coordinates of each joint point.
106. Calculating the mobility of the joint part from the coordinates of the joint points, and recording the maximum mobility;
Specifically, the invention can be realized by either of two technical schemes, 2D or 3D.
The 2D technical scheme is as follows:
the two-dimensional mobility of the joint part is obtained with equation (5):
angle(point_1, point_2, point_3, point_4) = arccos(((point_2 - point_1)·(point_4 - point_3)) / (dis(point_1, point_2)*dis(point_3, point_4))) (5)
where the point_k are joint points output by the model, angle is the two-dimensional plane angle of the joint between the segments point_1→point_2 and point_3→point_4, and dis is the Euclidean distance in two-dimensional space:
dis(point_a, point_b) = sqrt((x_a - x_b)^2 + (y_a - y_b)^2)
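The helpers below implement equation (5) as reconstructed above (the original formula is rendered as an image): the joint angle is taken as the arccosine of the normalized dot product between the two segment vectors.

    import numpy as np

    def dis(p, q):
        # Euclidean distance in two-dimensional space.
        return float(np.hypot(p[0] - q[0], p[1] - q[1]))

    def angle(p1, p2, p3, p4):
        # Eq. (5) as reconstructed: angle between segments p1->p2 and
        # p3->p4, in degrees.
        v1 = np.subtract(p2, p1)
        v2 = np.subtract(p4, p3)
        cos_t = np.dot(v1, v2) / (dis(p1, p2) * dis(p3, p4))
        return float(np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0))))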
the wrist movements and calculation formula are shown in table 1:
TABLE 1
Figure BDA0003783308810000065
Figure BDA0003783308810000071
The elbow movements and calculation formulas are shown in Table 2:
TABLE 2
Movement | Left-side formula | Right-side formula
Flexion | angle(body_1, body_8, body_6, body_7) | angle(body_1, body_8, body_3, body_4)
Extension | angle(body_1, body_8, body_6, body_7) | angle(body_1, body_8, body_3, body_4)
Pronation | angle(hand_3, hand_4, body_1, body_6) | angle(hand_3, hand_4, body_1, body_6)
Supination | angle(hand_3, hand_4, body_1, body_6) | angle(hand_3, hand_4, body_1, body_6)
The shoulder movements and calculation formulas are shown in Table 3:
TABLE 3
Movement | Left-side formula | Right-side formula
Anteflexion elevation | angle(body_1, body_8, body_5, body_6) | angle(body_1, body_8, body_2, body_3)
Posterior extension | angle(body_1, body_8, body_5, body_6) | angle(body_1, body_8, body_2, body_3)
Abduction elevation | angle(body_1, body_8, body_5, body_6) | angle(body_1, body_8, body_2, body_3)
Adduction | angle(body_1, body_8, body_5, body_6) | angle(body_1, body_8, body_2, body_3)
Horizontal internal rotation | arccos(dis(body_6, body_7)/len_forearm) | arccos(dis(body_3, body_4)/len_forearm)
Horizontal external rotation | arccos(dis(body_6, body_7)/len_forearm) | arccos(dis(body_3, body_4)/len_forearm)
Internal rotation, arm at side | arccos(dis(body_6, body_7)/len_forearm) | arccos(dis(body_3, body_4)/len_forearm)
External rotation, arm at side | arccos(dis(body_6, body_7)/len_forearm) | arccos(dis(body_3, body_4)/len_forearm)
In Tables 1 to 3, len_forearm is the forearm length recorded in an earlier movement, calculated as dis(body_7, body_6) or dis(body_3, body_4);
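Continuing the previous sketch, the lines below evaluate two of the table entries; body stands for one person's 2D keypoints indexed per the BODY_25 layout of fig. 3 and is filled with stand-in data here for illustration.

    import numpy as np

    body = np.random.rand(25, 2) * 480   # stand-in keypoints for illustration

    # Left elbow flexion, per Table 2: angle(body_1, body_8, body_6, body_7).
    left_elbow_flexion = angle(body[1], body[8], body[6], body[7])

    # Forearm length recorded from an earlier movement, per the note above.
    len_forearm = dis(body[6], body[7])

    # Left shoulder horizontal rotation, per Table 3:
    # arccos(dis(body_6, body_7) / len_forearm), converted to degrees.
    cos_val = np.clip(dis(body[6], body[7]) / len_forearm, -1.0, 1.0)
    left_shoulder_rotation = float(np.degrees(np.arccos(cos_val)))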
The 3D technical scheme is as follows:
the three-dimensional mobility of the joint part is obtained with equation (6), the three-dimensional spatial angle formula:
angle_S(point_1, point_2, point_3, point_4) = arccos(((point_2 - point_1)·(point_4 - point_3)) / (dis(point_1, point_2)*dis(point_3, point_4))) (6)
where dis is the Euclidean distance in three-dimensional space:
dis(point_a, point_b) = sqrt((x_a - x_b)^2 + (y_a - y_b)^2 + (z_a - z_b)^2)
The plane angle is computed with the same formula after projecting the four points onto a coordinate plane; taking the x-y plane as an example:
angle_P(point_1, point_2, point_3, point_4)_{x,y} = arccos(((x_2 - x_1)(x_4 - x_3) + (y_2 - y_1)(y_4 - y_3)) / (dis_{xy}(point_1, point_2)*dis_{xy}(point_3, point_4))) (7)
where dis_{xy} is the two-dimensional Euclidean distance of the projected points. For brevity the plane angle is written as:
angle_P(body_1, body_8, body_6, body_7)_{x,y} (8)
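The 3D counterparts, again following the reconstructions of equations (6) and (7) above: angle_s works on 3D coordinates, and angle_p projects the points onto a coordinate plane (x-y by default) before measuring the angle.

    import numpy as np

    def dis3(p, q):
        # Euclidean distance in three-dimensional space.
        return float(np.linalg.norm(np.subtract(p, q)))

    def angle_s(p1, p2, p3, p4):
        # Eq. (6) as reconstructed: spatial angle between segments
        # p1->p2 and p3->p4, in degrees.
        v1 = np.subtract(p2, p1)
        v2 = np.subtract(p4, p3)
        cos_t = np.dot(v1, v2) / (dis3(p1, p2) * dis3(p3, p4))
        return float(np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0))))

    def angle_p(p1, p2, p3, p4, axes=(0, 1)):
        # Eqs. (7)-(8) as reconstructed: the same angle computed after
        # projecting every point onto one coordinate plane (default x-y).
        proj = lambda p: np.array([p[axes[0]], p[axes[1]], 0.0])
        return angle_s(proj(p1), proj(p2), proj(p3), proj(p4))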
The wrist movements and calculation formulas are shown in Table 4:
TABLE 4
Movement | Left-side formula | Right-side formula
Palmar flexion | angle_S(hand_10, hand_12, body_7, body_6) | angle_S(hand_10, hand_12, body_4, body_3)
Dorsal extension | angle_S(hand_10, hand_12, body_7, body_6) | angle_S(hand_10, hand_12, body_4, body_3)
Radial deviation | angle_S(hand_10, hand_12, body_7, body_6) | angle_S(hand_10, hand_12, body_4, body_3)
Ulnar deviation | angle_S(hand_10, hand_12, body_7, body_6) | angle_S(hand_10, hand_12, body_4, body_3)
The elbow movements and calculation formulas are shown in Table 5:
TABLE 5
Movement | Left-side formula | Right-side formula
Flexion | angle_S(body_1, body_8, body_6, body_7) | angle_S(body_1, body_8, body_3, body_4)
Extension | angle_S(body_1, body_8, body_6, body_7) | angle_S(body_1, body_8, body_3, body_4)
Pronation | angle_P(hand_3, hand_4, body_1, body_8)_{x,y} | angle_P(hand_3, hand_4, body_1, body_8)_{x,y}
Supination | angle_P(hand_3, hand_4, body_1, body_8)_{x,y} | angle_P(hand_3, hand_4, body_1, body_8)_{x,y}
The shoulder movements and calculation formulas are shown in Table 6:
TABLE 6
(The shoulder movement formulas of the 3D scheme are rendered as an image in the original publication.)
107. Calculating the disability grade from the computed mobility in combination with the disability assessment standard.
Specifically, for a traffic-accident disability assessment, the following indexes are measured according to the traffic accident disability assessment standard:
1. Bilateral hand function loss ratio = (left wrist mobility loss + right wrist mobility loss)/2;
wrist mobility loss = (2 - measured dorsal extension angle/dorsal extension reference angle - measured palmar flexion angle/palmar flexion reference angle - measured radial deviation angle/radial deviation reference angle - measured ulnar deviation angle/ulnar deviation reference angle)/2;
2. Upper-limb forearm rotation function loss ratio = (left elbow rotation mobility loss + right elbow rotation mobility loss)/2;
elbow rotation mobility loss = 1 - measured pronation angle/pronation reference angle - measured supination angle/supination reference angle;
3. Maximum upper-limb function loss ratio = 0.18*wrist mobility loss + 0.12*elbow mobility loss + 0.7*shoulder mobility loss;
shoulder mobility loss = (3 - measured anteflexion-elevation angle/anteflexion-elevation reference angle - measured posterior extension angle/posterior extension reference angle - measured abduction-elevation angle/abduction-elevation reference angle - measured adduction angle/adduction reference angle - measured horizontal rotation angle/horizontal rotation reference angle - measured arm-at-side rotation angle/arm-at-side rotation reference angle)/3.
As shown in fig. 2, the present embodiment provides a joint movement detection system based on image data, comprising an image acquisition module, an image processing module and a motion calculation module. The image acquisition module selects the joint part to be detected and captures image information of it. The image processing module extracts a single-frame image from the image information and converts it into a grayscale image; calculates the absolute difference between each pixel's gray value and the mean gray value to obtain the brightness deviation of the single-frame image from normal illumination; brightens or darkens the single-frame image by means of a Gamma transformation; and sends the brightness-corrected pictures to the motion calculation module. The motion calculation module obtains the two-dimensional or three-dimensional coordinates of each joint point from the brightness-corrected pictures through the OpenPose model, calculates the mobility of the joint part and records the maximum mobility; it then derives the movement angle of each joint from the computed mobility and calculates the disability grade in combination with the disability assessment standard.
In this embodiment, the joint movement detection system based on image data requires a GPU-equipped computer as the back end and another computer as the front end. The front end obtains permission to use the camera or the ZED stereo camera through web-page access and sends 2D images or depth maps to the back end for computation. The front end can display the example movements and videos that the user needs to perform. The examinee performs the example movements as required, and the back end returns the detected angle in real time. After all movements are finished, the detection result is returned, namely the disability grade and the movement angle of each joint.
The front end comprises three modules: video transmission, movement display and result display. The video transmission module sends the front end's data to the back end; the transmitted data consist of two parts, a 2D image and a depth map. The back end mainly comprises two modules, the image processing module and the motion calculation module. After the back end finishes processing the joint-detection data, it sends the result to the result display module of the front end for display.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: modifications of the technical solutions described in the embodiments or equivalent replacements of some or all technical features may be made without departing from the scope of the technical solutions of the embodiments of the present invention.

Claims (9)

1. A joint movement detection method based on image data, characterized by comprising:
selecting the joint part to be detected and capturing image information of it;
extracting a single-frame image from the image information and converting it into a grayscale image;
calculating the absolute difference between each pixel's gray value and the mean gray value to obtain the brightness deviation of the single-frame image from normal illumination;
brightening or darkening the single-frame image by means of a Gamma transformation;
obtaining the coordinates of each joint point in the brightness-corrected picture through an image neural network;
calculating the mobility of the joint part from the coordinates of the joint points and recording the maximum mobility;
and calculating the disability grade from the computed mobility in combination with the disability assessment standard.
2. The joint movement detection method based on image data according to claim 1, wherein the selecting of a joint part to be detected and capturing of image information of it comprises:
the examiner having the examinee perform the joint movements according to the forensic clinical examination specification;
and capturing each group of joint movements of the examinee with a camera.
3. The joint movement detection method based on image data according to claim 1, wherein the extracting of a single-frame image from the image information and converting it into a grayscale image comprises:
reading the single-frame image through OpenCV and converting it into a grayscale image with equation (1):
Gray = R*0.3 + G*0.59 + B*0.11 (1)
where Gray is the gray value, and R, G and B are the red, green and blue channel values of the image.
4. The joint movement detection method based on image data according to claim 3, wherein the calculating of the absolute difference between each pixel's gray value and the mean gray value to obtain the brightness deviation of the single-frame image from normal illumination comprises:
obtaining the brightness value and the mean absolute deviation with equations (2) and (3):
light = Σ_{i,j} (I(i,j) - 127.5) / (H*W) (2)
where I is the grayscale image (a two-dimensional array), H is the height of the grayscale image, W is its width, and light is its brightness value: light greater than 0 means the image is bright, light less than 0 means it is dark;
average(I) = Σ_{i,j} |I(i,j) - 127.5 - light| / (H*W) (3)
where average(I) is the mean absolute deviation of the pixel gray values; the larger the value, the more uneven the illumination.
5. The joint movement detection method based on image data according to claim 4, wherein the brightening or darkening of the single-frame image by means of a Gamma transformation comprises:
brightening or darkening the single-frame image with equation (4):
s = c*r^γ, r ∈ (0,1) (4)
where c and γ are constants, s is the processed single-channel pixel value, and r is the single-channel pixel value of the original image after normalization.
6. The joint movement detection method based on image data according to claim 5, wherein the calculating of the mobility of the joint part from the coordinates of the joint points comprises: the coordinates being two-dimensional coordinates;
obtaining the two-dimensional mobility of the joint part with equation (5):
angle(point_1, point_2, point_3, point_4) = arccos(((point_2 - point_1)·(point_4 - point_3)) / (dis(point_1, point_2)*dis(point_3, point_4))) (5)
where the point_k are joint points output by the model, angle is the two-dimensional plane angle of the joint, and dis is the Euclidean distance in two-dimensional space:
dis(point_a, point_b) = sqrt((x_a - x_b)^2 + (y_a - y_b)^2)
7. The joint movement detection method based on image data according to claim 5, wherein the calculating of the mobility of the joint part from the coordinates of the joint points comprises: the coordinates being three-dimensional coordinates;
obtaining the three-dimensional mobility of the joint part with equation (6), the three-dimensional spatial angle formula:
angle_S(point_1, point_2, point_3, point_4) = arccos(((point_2 - point_1)·(point_4 - point_3)) / (dis(point_1, point_2)*dis(point_3, point_4))) (6)
where dis is the Euclidean distance in three-dimensional space:
dis(point_a, point_b) = sqrt((x_a - x_b)^2 + (y_a - y_b)^2 + (z_a - z_b)^2)
8. The joint movement detection method based on image data according to claim 6 or 7, wherein the calculating of the disability grade from the computed mobility in combination with the disability assessment standard comprises:
measuring the following indexes according to the traffic accident disability assessment standard:
bilateral hand function loss ratio = (left wrist mobility loss + right wrist mobility loss)/2;
wrist mobility loss = (2 - measured dorsal extension angle/dorsal extension reference angle - measured palmar flexion angle/palmar flexion reference angle - measured radial deviation angle/radial deviation reference angle - measured ulnar deviation angle/ulnar deviation reference angle)/2;
upper-limb forearm rotation function loss ratio = (left elbow rotation mobility loss + right elbow rotation mobility loss)/2;
elbow rotation mobility loss = 1 - measured pronation angle/pronation reference angle - measured supination angle/supination reference angle;
maximum upper-limb function loss ratio = 0.18*wrist mobility loss + 0.12*elbow mobility loss + 0.7*shoulder mobility loss;
shoulder mobility loss = (3 - measured anteflexion-elevation angle/anteflexion-elevation reference angle - measured posterior extension angle/posterior extension reference angle - measured abduction-elevation angle/abduction-elevation reference angle - measured adduction angle/adduction reference angle - measured horizontal rotation angle/horizontal rotation reference angle - measured arm-at-side rotation angle/arm-at-side rotation reference angle)/3.
9. A joint movement detection system based on image data implementing the method of claim 1, the system comprising:
an image acquisition module, an image processing module and a motion calculation module;
the image acquisition module being configured to select the joint part to be detected and capture image information of it;
the image processing module being configured to extract a single-frame image from the image information and convert it into a grayscale image; calculate the absolute difference between each pixel's gray value and the mean gray value to obtain the brightness deviation of the single-frame image from normal illumination; brighten or darken the single-frame image by means of a Gamma transformation; and send the brightness-corrected picture to the motion calculation module;
the motion calculation module obtaining the coordinates of each joint point from the brightness-corrected picture through an image neural network, calculating the mobility of the joint part and recording the maximum mobility; and calculating the disability grade from the computed mobility in combination with the disability assessment standard.
CN202210936110.3A 2022-08-05 2022-08-05 Joint movement detection method and system based on image data Pending CN115272260A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210936110.3A CN115272260A (en) 2022-08-05 2022-08-05 Joint movement detection method and system based on image data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210936110.3A CN115272260A (en) 2022-08-05 2022-08-05 Joint movement detection method and system based on image data

Publications (1)

Publication Number Publication Date
CN115272260A true CN115272260A (en) 2022-11-01

Family

ID=83748789

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210936110.3A Pending CN115272260A (en) 2022-08-05 2022-08-05 Joint movement detection method and system based on image data

Country Status (1)

Country Link
CN (1) CN115272260A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116630318A (en) * 2023-07-24 2023-08-22 凯泰铭科技(北京)有限公司 Method and system for optimizing mobile terminal measurement activity
CN116630318B (en) * 2023-07-24 2023-10-13 凯泰铭科技(北京)有限公司 Method and system for optimizing mobile terminal measurement activity

Similar Documents

Publication Publication Date Title
CN104700433B (en) A kind of real-time body's whole body body motion capture method of view-based access control model and system thereof
US20210174505A1 (en) Method and system for imaging and analysis of anatomical features
Bonnechere et al. Determination of the precision and accuracy of morphological measurements using the Kinect™ sensor: comparison with standard stereophotogrammetry
CN111368810A (en) Sit-up detection system and method based on human body and skeleton key point identification
Yang et al. Automatic 3-D imaging and measurement of human spines with a robotic ultrasound system
CN102679964B (en) Gait parameter measurement system and data processing device and method thereof
CN104887238A (en) Hand rehabilitation training evaluation system and method based on motion capture
CN113139962B (en) System and method for scoliosis probability assessment
CN108742519A (en) Machine vision three-dimensional reconstruction technique skin ulcer surface of a wound intelligent auxiliary diagnosis system
US11328535B1 (en) Motion identification method and system
CN115272260A (en) Joint movement detection method and system based on image data
CN104732586B (en) A kind of dynamic body of 3 D human body and three-dimensional motion light stream fast reconstructing method
CN110960219A (en) Intelligent auxiliary diagnosis system for skin ulcer wound surfaces
CN113283373B (en) Method for enhancing limb movement parameters detected by depth camera
KR20230025656A (en) Image display system and image display method
CN113807323B (en) Accurate hand function evaluation system and method based on image recognition
CN108720825B (en) Multi-camera-based seamless detection method for non-contact vital sign parameters
CN111582081A (en) Multi-Kinect serial gait data space-time combination method and measuring device
O'Malley et al. Kinematic analysis of human walking gait using digital image processing
CN115797595A (en) Orthodontic treatment monitoring method, orthodontic treatment monitoring device, orthodontic treatment monitoring equipment and orthodontic treatment monitoring storage medium
CN115568823A (en) Method, system and device for evaluating human body balance ability
CN111184535B (en) Handheld unconstrained scanning wireless three-dimensional ultrasonic real-time voxel imaging system
Albuquerque et al. Remote Pathological Gait Classification System
Liu et al. Physical sensor difference-based method and virtual sensor difference-based method for visual and quantitative estimation of lower limb 3D gait posture using accelerometers and magnetometers
TW202143908A (en) Multi-parameter physiological signal measuring method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination