CN107392086A - Human body posture assessment method, device, system and storage device - Google Patents
- Publication number
- CN107392086A CN107392086A CN201710386839.7A CN201710386839A CN107392086A CN 107392086 A CN107392086 A CN 107392086A CN 201710386839 A CN201710386839 A CN 201710386839A CN 107392086 A CN107392086 A CN 107392086A
- Authority
- CN
- China
- Prior art keywords
- human body
- standard
- posture
- parts
- relative position
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Image Analysis (AREA)
Abstract
The invention provides a human posture assessment method, device, and storage device. The method includes: acquiring a depth image sequence of the human body to be assessed; identifying the body parts of that human body from the depth image sequence and determining a human body reference point; obtaining the relative positional relationship between the body parts and the reference point; comparing that relationship with the pre-stored standard relative positional relationship of a standard posture; and outputting the comparison result. The device includes a processor and a depth camera connected to the processor. The storage device stores program data that can be executed to implement the above method. The present invention can accurately determine the position of each body part and obtain accurate relative positional relationships, improving the accuracy of posture assessment results and thereby the training efficiency of the corresponding exercise.
Description
Technical field
The present invention relates to the field of image processing, and in particular to a human posture assessment method, device, and storage device.
Background art
In a depth image of a scene captured by a depth camera, each pixel carries depth information, i.e. the distance from the scene surface to the depth camera, so the positional information of objects in the scene can be obtained from the depth image.
In training activities such as sports, dance, and fitness, body postures need to be captured so that the postures of the body parts can be converted into data, and the posture is assessed by analyzing that data in order to improve the training effect. In the prior art, human posture is assessed from 2D image sequences. In researching and practicing the prior art, the present inventors found that postures with occlusion relationships, such as a limb held in front of the torso, cannot be accurately distinguished from 2D image sequences, which easily leads to inaccurate posture assessment results.
Summary of the invention
The present invention provides a human posture assessment method, device, and storage device, which can solve the problem of inaccurate human posture assessment results in the prior art.
To solve the above technical problem, one technical solution adopted by the present invention is to provide a human posture assessment method comprising the following steps: acquiring a depth image sequence of the human body to be assessed; identifying the body parts of the human body to be assessed from the depth image sequence and determining its human body reference point; obtaining the relative positional relationship between the body parts and the reference point; comparing that relationship with the pre-stored standard relative positional relationship of a standard posture; and outputting the comparison result.
To solve the above technical problem, another technical solution adopted by the present invention is to provide a human posture assessment device comprising a depth camera and a processor connected to the depth camera. The depth camera acquires a depth image sequence of the human body to be assessed. The processor identifies the body parts of the human body to be assessed from the depth image sequence and determines its human body reference point; obtains the relative positional relationship between the body parts and the reference point; compares that relationship with the pre-stored standard relative positional relationship of a standard posture; and outputs the comparison result.
To solve the above technical problem, a further technical solution adopted by the present invention is to provide a storage device storing program data that can be executed to implement the above method.
The beneficial effects of the invention are as follows. Unlike the prior art, the present invention processes a depth image sequence to extract the body parts and the human body reference point of a posture, and compares the relative positional relationships between the body parts and the reference point with the standard relative positional relationships of a standard posture to produce the posture assessment result. The positions of the body parts can thus be determined accurately and accurate relative positional relationships obtained, improving the accuracy of the posture assessment result and thereby the training efficiency of the corresponding exercise.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. The drawings described below are merely some embodiments of the present invention; those of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a flow diagram of an embodiment of the human posture assessment method provided by the present invention;
Fig. 2 is a flow diagram of another embodiment of the human posture assessment method provided by the present invention;
Fig. 3 is a flow diagram of step S22 in Fig. 2;
Fig. 4 is a flow diagram of step S23 in Fig. 2;
Fig. 5 is a schematic diagram of the spatial positional relationship between the centroid of the knee and the human body center in another embodiment of the human posture assessment method provided by the present invention;
Fig. 6 is a flow diagram of step S26 in Fig. 2;
Fig. 7 is a structural diagram of an embodiment of the human posture assessment device provided by the present invention.
Embodiment
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Referring to Fig. 1, Fig. 1 is a flow diagram of an embodiment of the human posture assessment method provided by the present invention. The method shown in Fig. 1 includes the following steps:
S11: acquire a depth image sequence of the human body to be assessed.
In step S11, the posture of the human body to be assessed is the posture under evaluation. The invention may be applied to postures in dynamic actions such as athletic or dance movements, or to static postures such as standing or sitting. The depth image sequence can be acquired by a depth camera. A depth image contains not only the pixel information of objects in space but also the depth information of each pixel, i.e. the distance from the object in space to the depth camera. A depth image sequence is a series of consecutive depth images over a period of time.
S12: identify the body parts of the human body to be assessed from the depth image sequence and determine its human body reference point.
Specifically, the body parts of the human body to be assessed may include the head, shoulders and neck, torso, limbs, hands, and feet, as well as joints such as the knees, elbows, wrists, ankles, and hips. The human body reference point may be the body's centroid or the human body center; this embodiment uses the human body center as the reference point for description. Of course, in some other embodiments, other specific points of the body may also be chosen as the reference point.
S13: obtain the relative positional relationship between the body parts and the human body reference point.
In one embodiment, the relative positional relationship may be the Euclidean distance and cosine distance between the centroid of a body part and the human body center, for example the Euclidean distance and cosine distance from the centroid of the head to the human body center, from the centroid of a hand to the human body center, and so on.
S14: compare the relative positional relationship with the pre-stored standard relative positional relationship of the standard posture.
The standard relative positional relationship of the standard posture may be acquired in the same way as the relationship of the posture to be assessed in steps S11 to S13. Before the first posture assessment is performed, the standard relative positional relationship of the standard posture can be saved so that it can be retrieved for comparison in step S14.
S15: output the comparison result.
In step S15, the comparison result is that the posture to be assessed is the standard posture, or that it is not. In some embodiments, an adjustment suggestion may additionally be output, including which part to adjust and in which direction, for example that the left hand should move further down, a foot should move further up, or the right arm should move further left. In other embodiments, the required movement distance may also be given explicitly, for example that the left hand should move down 5 cm.
The comparison result may be output by voice or shown on a display screen, as text or images, and these means may of course be combined, so as to prompt the user to adjust their posture.
Unlike the prior art, the present invention processes the depth image sequence to extract the body parts and the human body reference point of the posture, and compares the relative positional relationships between the body parts and the reference point with the standard relative positional relationships of the standard posture to produce the assessment result. The positions of the body parts can thus be determined accurately and accurate relative positional relationships obtained, improving the accuracy of the posture assessment result and thereby the training efficiency of the corresponding exercise.
Referring to Fig. 2, Fig. 2 is a flow diagram of another embodiment of the human posture assessment method provided by the present invention.
S21: acquire a depth image sequence of the human body to be assessed.
S22: identify the body parts of the human body to be assessed from the depth image sequence and determine its human body reference point.
This embodiment is described taking the end posture of a deep squat as an example. The human body reference point of this embodiment is the human body center.
Referring to Fig. 3, Fig. 3 is a flow diagram of step S22 in Fig. 2. Step S22 further comprises:
S221: remove the background from the depth image sequence.
For example, a blob (a connected group of pixels with similar depth values) can be preliminarily identified in the depth image as the body of the subject, and other blobs with significantly different depth values are then removed. A blob identified in this way generally must have some minimum size. However, the simple Euclidean distance between the pixel coordinates of the blob edges does not give an accurate measure of that size. The reason for the inaccuracy is that the size in pixels of a blob corresponding to an object of a given physical size increases or decreases as the distance between the object and the device changes.
Therefore, to determine the actual size of an object, the (x, y, depth) coordinates of the object are first transformed into "real-world" coordinates (xr, yr, depth) by the following formulas:

xr = (x - fovx/2) * pixel_size * depth / reference_depth
yr = (y - fovy/2) * pixel_size * depth / reference_depth

Here, fovx and fovy are the fields of view of the depth map in the x and y directions (in pixels), and pixel_size is the length subtended by a pixel at a given distance (the reference depth) from the imaging device. The size of a blob can then be determined by computing the Euclidean distance between the real-world coordinates of its edges.
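As a sketch, the transform above can be written down directly; `fovx`, `fovy`, `pixel_size`, and `reference_depth` are assumed device parameters supplied with the depth camera, not values fixed by the patent.

```python
def to_real_world(x, y, depth, fovx, fovy, pixel_size, reference_depth):
    """Map pixel coordinates (x, y) at a given depth to metric coordinates."""
    # Offset from the optical center, scaled by the pixel size at the
    # reference depth and re-scaled to the pixel's actual depth.
    xr = (x - fovx / 2) * pixel_size * depth / reference_depth
    yr = (y - fovy / 2) * pixel_size * depth / reference_depth
    return xr, yr, depth
```

Blob sizes measured on these real-world coordinates are then independent of how far the subject stands from the camera.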
Therefore, the background in the depth image can be removed by identifying the blob of the required minimum size that has the smallest average depth value among the blobs in the scene. It can be assumed that the blob closest to the depth camera is the human body; all pixels whose depth exceeds that average depth value by at least some threshold are assumed to belong to background objects, and their depth values are set to null. The threshold may be chosen according to actual needs. In addition, in some embodiments, pixels whose depth value is significantly smaller than the blob's average depth value may also be zeroed. A maximum depth may also be preset so that objects beyond it are ignored.
In some embodiments, a depth value beyond which objects are removed from the depth map can also be determined dynamically. To this end, the objects in the scene are assumed to move: any pixel whose depth has not changed over some minimum number of frames is assumed to belong to a background object. Pixels whose depth value exceeds this static depth value are considered to belong to background objects and are therefore all zeroed. Initially, all pixels in the scene may be defined as static, or all may be defined as non-static; in either case, once an object starts moving, the actual depth filter can be generated dynamically.
Of course, the background in the depth image may also be removed by other methods well known in the prior art.
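A minimal sketch of the threshold-based background removal, under simplifying assumptions: the image is a 2D list with null depths stored as 0, and the overall mean of valid depths stands in for the nearest blob's average depth (the patent first isolates the nearest blob, which is omitted here).

```python
def remove_background(depth, threshold):
    """Zero pixels deeper than the (stand-in) foreground mean plus a threshold."""
    valid = [d for row in depth for d in row if d > 0]
    mean_depth = sum(valid) / len(valid)  # stand-in for the nearest blob's mean
    return [[d if 0 < d <= mean_depth + threshold else 0 for d in row]
            for row in depth]
```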
S222: obtain the contour of the human body in the depth image sequence.
After the background is removed, the outer contour of the body can be found in the depth map by edge detection. In this embodiment, the contour of the human body is found using a two-step thresholding mechanism:
First, all pixels in the blob corresponding to the human form are traversed; if a given pixel has a valid depth value, and the difference between its depth value and that of at least one of its four connected neighbors (right, left, up, down) is greater than a first threshold, it is marked as a contour position. (The difference between a valid depth value and a null value is considered infinite.)
Then, after completing the previous step, the blob is traversed a second time; if any pixel not already marked as a contour position has a contour pixel among its eight connected neighbors, and the difference between its depth value and that of at least one of the remaining connected neighbors is greater than a second threshold (smaller than the first threshold), it is also marked as a contour position.
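The two traversals can be sketched as follows. The 2D-list image format (null depth = 0) and the threshold values are illustrative assumptions; the second pass reads marks made by the first, as described.

```python
INF = float("inf")

def diff(a, b):
    """Depth difference; valid-vs-null counts as infinite, per the text."""
    if (a == 0) != (b == 0):
        return INF
    return abs(a - b)

def find_contour(depth, t1, t2):
    h, w = len(depth), len(depth[0])
    n4 = [(0, 1), (0, -1), (1, 0), (-1, 0)]
    n8 = n4 + [(1, 1), (1, -1), (-1, 1), (-1, -1)]
    contour = [[False] * w for _ in range(h)]
    # Pass 1: strong edges against any 4-connected neighbour.
    for y in range(h):
        for x in range(w):
            if depth[y][x] == 0:
                continue
            for dy, dx in n4:
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and diff(depth[y][x], depth[ny][nx]) > t1:
                    contour[y][x] = True
                    break
    # Pass 2: weaker edges (threshold t2 < t1) touching an existing contour pixel.
    for y in range(h):
        for x in range(w):
            if contour[y][x] or depth[y][x] == 0:
                continue
            touches = any(0 <= y + dy < h and 0 <= x + dx < w and contour[y + dy][x + dx]
                          for dy, dx in n8)
            weak = any(0 <= y + dy < h and 0 <= x + dx < w and
                       diff(depth[y][x], depth[y + dy][x + dx]) > t2
                       for dy, dx in n8)
            if touches and weak:
                contour[y][x] = True
    return contour
```

This is the same hysteresis idea used in classical edge detectors: a strict first threshold seeds the contour, and a looser second threshold extends it only where it connects to existing contour pixels.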
S223: identify the torso of the human body from the contour.
Having found the outer contour of the body, the parts of the body, such as the head, torso, and limbs, are identified. The depth image is first rotated so that the body contour is in a vertical position; the purpose of the rotation is to simplify the calculations in the following steps by aligning the longitudinal axis of the body with the Y (vertical) coordinate axis. Alternatively, the following calculations can be performed relative to the longitudinal axis of the body without the rotation, as will be understood by those skilled in the art.
Before identifying the parts of the body, the 3D axis of the body can first be found, for example using the following method:
The original depth image is down-sampled into a grid of nodes, taking one node every n pixels in each direction. The depth value of each node is computed from the depth values in the n × n square centered on it: if more than half of the pixels in the square have null values, the node is set to null; otherwise, the node is set to the average of the valid depth values in the square.
The down-sampled depth image can then be further "cleaned" based on the values of adjacent nodes: if most of the neighbors of a given node have null values, that node is also set to null, even if it had a valid depth value after the preceding step.
Once these steps are complete, the longitudinal axis of the remaining nodes in the down-sampled map is found. For this, a linear least-squares fit can be performed to find the line that best fits the nodes. Alternatively, an ellipse can be fitted around the nodes and its major axis found.
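A minimal sketch of the linear least-squares fit for the longitudinal axis, assuming the non-null node positions are given as (x, y) pairs. Because the axis is near-vertical after rotation, x is regressed on y (x = a·y + b) to avoid an infinite slope.

```python
def longitudinal_axis(points):
    """Least-squares line x = a*y + b through a list of (x, y) node positions."""
    n = len(points)
    mean_x = sum(p[0] for p in points) / n
    mean_y = sum(p[1] for p in points) / n
    cov = sum((y - mean_y) * (x - mean_x) for x, y in points)
    var = sum((y - mean_y) ** 2 for x, y in points)
    a = cov / var                 # slope of x with respect to y
    b = mean_x - a * mean_y       # intercept
    return a, b
```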
After the 3D axis of the body is found, the torso is identified by measuring the thickness of the body contour in directions parallel and perpendicular to the longitudinal axis. To this end, a bounding box can be placed around the body contour, and the pixel values inside the box binarized: pixels with null depth are set to 0, and pixels with non-null depth are set to 1.
Then, a longitudinal thickness value is computed for each X value in the box by summing the binary pixel values along the corresponding vertical line, and a transverse thickness value is computed for each Y value by summing the binary pixel values along the corresponding horizontal line. A threshold is applied to the resulting values to identify along which vertical and horizontal lines the contour is relatively thick.
Where the transverse thickness of a horizontal region exceeds an X threshold and the longitudinal thickness of a vertical region exceeds a Y threshold, the intersection of that horizontal region and vertical region can be defined as the torso.
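The thickness-profile intersection can be sketched as follows, where `mask` is the binarized bounding box described above and the thresholds are illustrative. Columns whose sum exceeds the Y threshold and rows whose sum exceeds the X threshold are intersected to bound the torso.

```python
def find_torso(mask, x_thresh, y_thresh):
    """Return ((x0, x1), (y0, y1)) bounding the thick-column/thick-row intersection."""
    h, w = len(mask), len(mask[0])
    col_sums = [sum(mask[y][x] for y in range(h)) for x in range(w)]  # longitudinal
    row_sums = [sum(mask[y][x] for x in range(w)) for y in range(h)]  # transverse
    thick_cols = [x for x, s in enumerate(col_sums) if s > y_thresh]
    thick_rows = [y for y, s in enumerate(row_sums) if s > x_thresh]
    return ((min(thick_cols), max(thick_cols)),
            (min(thick_rows), max(thick_rows)))
```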
S224: identify the body parts of the human body from the torso.
After the torso is determined, the head and limbs of the body can be identified from geometric considerations: the arms are the regions connected to the left and right sides of the torso region; the head is the connected region above the torso region; the legs are the connected regions below it. The upper-left and upper-right corners of the torso region can also be tentatively identified as the shoulders.
In another embodiment, the body parts of the human body to be assessed can also be identified in the following three steps:
(1) Human body segmentation. This embodiment segments the moving human body using a combination of inter-frame differencing and background subtraction. A frame of the RGBD images is chosen in advance as the background frame, and a Gaussian model is established for each pixel. Inter-frame differencing is then applied to adjacent frames to distinguish background points from the regions that have changed in the current frame (the changed regions include uncovered background and the moving object). The changed regions are then fitted against the corresponding regions of the background-frame model to distinguish uncovered background from the moving object, and finally shadows are removed from the moving object, yielding a shadow-free moving human body. For background updating: points judged to be background by inter-frame differencing are updated according to a fixed rule; points judged to be uncovered background by background subtraction update the background frame with a larger update rate; regions corresponding to the moving object are not updated. This method can yield a relatively ideal segmentation target.
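Only the inter-frame differencing step is sketched below; the per-pixel Gaussian background model, the model fitting against the background frame, shadow removal, and background updating described above are omitted. The grayscale 2D-list frame format and the threshold are illustrative assumptions.

```python
def frame_difference(prev, curr, threshold):
    """Binary change mask: 1 where consecutive frames differ by more than threshold."""
    return [[1 if abs(c - p) > threshold else 0
             for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]
```

The resulting mask marks the changed regions (uncovered background plus the moving object), which the later fitting step then separates.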
(2) Contour extraction and analysis. After the binarized image is obtained, a classical edge detection algorithm is used to obtain the contour, for example the Canny algorithm. The Canny edge detection operator satisfies the mathematical characteristics of an optimal edge detector: for different types of edges it offers a good signal-to-noise ratio, excellent localization, a low probability of multiple responses to a single edge, and maximal suppression of false edge responses. After the segmentation regions are obtained by the segmentation algorithm, all the moving targets of interest are contained within them; extracting edges with the Canny operator only inside these regions therefore both greatly limits environmental interference and effectively improves the speed of the computation.
(3) Automatic joint labeling. After the moving target has been segmented by the differencing method and its contour extracted by the Canny edge detection operator, the human target is further analyzed with the 2D ribbon model of Maylor K. Leung and Yee-Hong Yang. The model divides the front of the human body into different regions; for example, the human body is constructed from 5 U-shaped regions representing the head and the four limbs.
Thus, by finding the body end points of the 5 U-shapes, the approximate position of the body can be determined. On the basis of the extracted contour, vector contour compression is applied to extract the needed information while retaining the most important limb features, compressing the human contour into a fixed shape, for example a contour with 8 fixed end points comprising 5 U-shaped points and 3 inverted-U-shaped points, so that salient features of the contour can be computed conveniently. A distance algorithm over adjacent end points on the contour may be used here for the compression, iteratively compressing the contour down to 8 end points.
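The patent only names a "distance algorithm over adjacent end points"; the sketch below assumes one plausible reading: repeatedly delete the vertex that lies closest to the chord between its two neighbors (i.e. whose removal changes the contour least) until the target number of end points remains.

```python
import math

def point_to_chord(p, a, b):
    """Distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    num = abs((by - ay) * (px - ax) - (bx - ax) * (py - ay))
    den = math.hypot(bx - ax, by - ay)
    return num / den if den else math.hypot(px - ax, py - ay)

def compress_contour(points, n):
    """Iteratively reduce a closed contour (list of (x, y)) to n vertices."""
    pts = list(points)
    while len(pts) > n:
        k = len(pts)
        # Cost of removing vertex i: its distance to its neighbours' chord.
        costs = [point_to_chord(pts[i], pts[i - 1], pts[(i + 1) % k])
                 for i in range(k)]
        pts.pop(costs.index(min(costs)))
    return pts
```

Collinear vertices (cost 0) are discarded first, which matches the goal of keeping only the prominent U and inverted-U end points.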
After the compressed contour is obtained, the body parts can be labeled automatically using the following algorithm:
(1) Determine the body end points of the U-shapes. A reference length M is set; a vector longer than M is considered part of the body contour, while shorter vectors are ignored. Starting from some point on the vectorized contour, a vector longer than M is found and denoted Mi, then the next such vector is found and denoted Mj, and the angle from Mi to Mj is compared. If the angle lies within a certain range (0 to 90°; note that the angle is positive here, meaning the turn is convex), the two vectors are considered to form a U end point and are recorded as one. This continues until all 5 U end points are found.
(2) Determine the three inverted-U end points. This is the same as step (1), except that the angle condition is changed from positive to negative.
(3) The positions of the head, hands, and feet are readily obtained from the U and inverted-U end points. From the physiological shape of the body, each joint can then be determined: using the angles between the arms and the body and between the head and the legs, the width and length of the torso can be determined; the neck and waist are then placed at 0.75 and 0.3 of the torso length respectively, the elbow at the midpoint between shoulder and hand, and the knee at the midpoint between waist and foot. The approximate positions of the body parts can thus be defined.
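The proportion-based joint placement above can be sketched as follows. The coordinate conventions (y growing upward, fractions measured from the bottom of the torso) are assumptions for illustration.

```python
def midpoint(a, b):
    return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

def place_joints(shoulder, hand, foot, torso_top, torso_bottom):
    """Estimate neck, waist, elbow, and knee positions from body landmarks."""
    length = torso_top[1] - torso_bottom[1]          # torso length (y up, assumed)
    neck = (torso_bottom[0], torso_bottom[1] + 0.75 * length)
    waist = (torso_bottom[0], torso_bottom[1] + 0.3 * length)
    elbow = midpoint(shoulder, hand)                 # elbow: shoulder-hand midpoint
    knee = midpoint(waist, foot)                     # knee: waist-foot midpoint
    return {"neck": neck, "waist": waist, "elbow": elbow, "knee": knee}
```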
S225: obtain the human body center from the torso and the body parts, to serve as the human body reference point.
The human body center is the geometric center of the human body in the depth image. Once the torso and the body parts have been identified, the human body center can be determined from the contour of the whole human body in the depth image, i.e. from the median of the outer edge values of the 3D human body edge.
S23: obtain the relative positional relationship between the body parts and the human body reference point.
The relative positional relationship is that between the centroids of the body parts of the posture to be assessed and the human body center; for example, it may include the Euclidean distance and cosine distance from the centroid of each body part to the human body center. The standard relative positional relationship is that between the centroids of the body parts of the standard posture and its human body center.
As shown in Fig. 4, Fig. 4 is a flow diagram of step S23 in Fig. 2. In this embodiment, step S23 includes:
S231: obtain the first coordinate value of the human body center.
In this embodiment, the first coordinate value of the human body center is its coordinate in the camera coordinate system of the depth camera.
For example, in the end posture of the deep squat, the first coordinate value of the human body center point A is (x1, y1, z1).
S232: obtain the centroids of the body parts and the second coordinate values of those centroids.
Specifically, after the body parts are identified, the centroid of each body region can be determined. The centroid of a region here refers to a representative depth or position of the region. For example, a histogram of the depth values in the region can be formed, and the depth value with the highest frequency (or the average of two or more depth values with the highest frequencies) taken as the centroid of the region. Once the centroid of a body part is determined, its coordinate in the camera coordinate system can be determined.
Note that the centroids in the present invention are centroids obtained by processing the depth image, not physical centers of mass. The centroids of the present invention may be obtained by a centroid method or by other methods; the present invention is not limited in this respect.
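The histogram-based representative depth described above can be sketched as follows: take the depth value with the highest frequency in a region, averaging when several values tie for the highest frequency.

```python
from collections import Counter

def representative_depth(region_depths):
    """Mode of the region's depth values; mean of the modes on a tie."""
    counts = Counter(region_depths)
    top = max(counts.values())
    modes = [d for d, c in counts.items() if c == top]
    return sum(modes) / len(modes)
```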
For example, in the end posture of the deep squat, the second coordinate value of the centroid B of the knee joint is (x2, y2, z2).
S233: compute the Euclidean distances and cosine distances between the centroids of the body parts and the human body center from the first and second coordinate values, so as to form the vector to be assessed for the posture of the human body.
The cosine distance, also called cosine similarity, uses the cosine of the angle between two vectors in a vector space as a measure of the difference between two individuals; the concept is borrowed in machine learning to measure differences between sample vectors. The cosine distance between two vectors can thus be represented by the cosine of the angle between them.
For example, as shown in Fig. 5, Fig. 5 is a schematic diagram of the spatial positional relationship between the centroid of the knee joint and the human body center in another embodiment of the human posture assessment method provided by the present invention. After the first and second coordinate values are obtained, the vector OA = (x1, y1, z1) of the human body center and the vector OB = (x2, y2, z2) of the centroid of the knee joint can be formed, O being the origin of the camera coordinate system.
Specifically, the Euclidean distance between the knee joint and the human body center is calculated by the following formula:

dAB = sqrt((x2 - x1)^2 + (y2 - y1)^2 + (z2 - z1)^2)

The cosine distance between OA and OB can be calculated by the following formula:

cos θ = (x1*x2 + y1*y2 + z1*z2) / (sqrt(x1^2 + y1^2 + z1^2) * sqrt(x2^2 + y2^2 + z2^2))
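The two formulas can be written as a small sketch: the Euclidean distance between the two centroid coordinates, and the cosine of the angle between the vectors OA and OB.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two 3D points a and b."""
    return math.sqrt(sum((bi - ai) ** 2 for ai, bi in zip(a, b)))

def cosine(a, b):
    """Cosine of the angle between vectors a and b (the 'cosine distance')."""
    dot = sum(ai * bi for ai, bi in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))
```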
Here, the Euclidean distance measures the absolute distance between points in space; for example, dAB measures the absolute distance between points A and B and is directly related to the position coordinates of each point. The cosine distance measures the angle between space vectors and reflects differences in direction rather than in position.
Specifically, the range of the cosine distance is [-1, 1]. A larger cosine value means a smaller angle between the two vectors, and a smaller cosine value means a larger angle. The cosine takes its maximum value 1 when the two vectors point in the same direction, and its minimum value -1 when they point in exactly opposite directions.
Of course, in the posture assessment process, the Euclidean distances and cosine distances from the centroids of other body parts, such as the hands and feet, to the human body center are generally also computed. Finally, the Euclidean distances and cosine distances from the centroids of all the required body parts to the human body center correspond one-to-one with those body parts and together form the vector X to be assessed for the posture under evaluation.
S24: obtain the standard relative positional relationship between the body parts of the human body in the standard posture and its human body center.
The centroids of the body parts and the human body center of the standard posture can be identified with methods similar to steps S22 and S23: the third coordinate values of the centroids of the body parts and the fourth coordinate value of the human body center of the standard posture are obtained, and the Euclidean distances and cosine distances between those centroids and the human body center are computed from the third and fourth coordinate values to form the standard vector A of the standard body posture.
Note that the standard relative positional relationship must be acquired in a manner consistent with that of the human body to be assessed. For example, since the human body center was chosen as the reference point for the posture to be assessed, the reference point for the standard posture must also be the human body center.
S25: saving the standard relative position relation.
The standard relative position relation is saved so that it can be retrieved for comparison in step S26.
Steps S24-S25 may be performed before or after steps S21-S23, or between any two of steps S21-S23, as long as they precede step S26.
S26: comparing the relative position relation with the pre-stored standard relative position relation of the standard posture.
Step S26 compares the posture under assessment with the pre-stored standard posture; after digitization, this becomes a comparison between the relative position relation between the body parts and the body center of the posture under assessment and the standard relative position relation between the body parts and the body center of the standard posture.
For example, let the Euclidean distance and cosine distance between the knee-joint centroid and the body center in the standard posture be d_AB′ and cos θ′. Then d_AB is compared with d_AB′, and cos θ with cos θ′.
As shown in Fig. 6, Fig. 6 is a schematic flowchart of step S26 in Fig. 2. Specifically, step S26 includes:
S261: calculating the difference between the to-be-assessed vector and the standard vector.
In step S261, R = X - A is calculated. Each entry of R is the deviation between the Euclidean distance or cosine distance of a body part and that of the corresponding body part in the standard posture.
For example, R contains the deviation between d_AB and d_AB′ and the deviation between cos θ and cos θ′, and likewise the deviations of the Euclidean distances and cosine distances between the centroids of the other body parts and the body center relative to their counterparts in the standard posture.
S262: comparing the difference with a preset threshold and judging whether the difference is less than the preset threshold.
Specifically, in step S262 a characteristic value of the vector R can be calculated and compared with a preset threshold.
In this embodiment, if the characteristic value of the difference is less than the preset threshold, the posture under assessment is similar or identical to the standard posture, i.e. it meets the requirement of the standard posture, and the method proceeds to step S263. If the characteristic value of the difference is greater than or equal to the preset threshold, the posture under assessment deviates considerably from the standard posture, and the method proceeds to step S264.
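Steps S261-S262 can be sketched as follows. The patent does not fix which "characteristic value" of R is used, so the Euclidean norm is taken here as one plausible choice; the function name is invented.

```python
import numpy as np

def compare_pose(x, a, threshold):
    """Steps S261-S262: form R = X - A, then threshold its characteristic value.

    Returns (meets_standard, r); r holds the per-entry deviations that a
    later step can inspect to generate adjustment suggestions.
    """
    r = np.asarray(x, dtype=float) - np.asarray(a, dtype=float)
    characteristic = float(np.linalg.norm(r))  # one possible characteristic value
    return characteristic < threshold, r
```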
S263: judging that the comparison result is that the body posture is the standard posture, and proceeding to step S27.
S264: judging that the comparison result is that the body posture is not the standard posture, and proceeding to step S27.
S27: outputting the comparison result.
In this embodiment, the result is output to prompt the user regardless of whether the posture under assessment meets the requirement of the standard posture.
When the body posture meets the requirement of the standard posture, the user is prompted that the standard posture has been reached. When it does not, the user is prompted that the standard posture has not been reached, together with an adjustment suggestion on how to adjust the body posture toward the standard posture. For example, for the end posture of a squat of the human body under assessment, if the characteristic value of R is greater than or equal to the preset threshold, the non-conforming items are identified by analyzing the entries of R; if the analysis shows that the hands are not raised high enough, the user is prompted to raise the hands.
This embodiment is illustrated with the end posture of a squat as an example; it can be understood that in other embodiments the posture may instead be the posture at a particular point in time during a motion, and so on.
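Turning the entries of R into prompts, as in the squat example above, might look like the following. The part labels, tolerance, and hint wording are all hypothetical; the patent does not specify how R's entries map to user prompts.

```python
import numpy as np

# Hypothetical layout: entry i of R is the Euclidean-distance deviation
# (actual minus standard) for PART_NAMES[i]; cosine terms are omitted here.
PART_NAMES = ["left knee", "right knee", "left hand", "right hand"]

def adjustment_hints(r, tolerance):
    """Name each part whose deviation exceeds the per-part tolerance."""
    hints = []
    for name, dev in zip(PART_NAMES, np.asarray(r, dtype=float)):
        if abs(dev) > tolerance:
            # dev > 0: the part sits farther from the body center than in
            # the standard posture, so it should move closer; and vice versa.
            direction = "closer to" if dev > 0 else "farther from"
            hints.append(f"move the {name} {direction} the body center")
    return hints
```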
Referring to Fig. 7, Fig. 7 is a schematic structural diagram of an embodiment of a human posture assessment apparatus provided by the present invention.
The present invention also provides a human posture assessment apparatus. The apparatus includes a depth camera 10, a processor 11 and a memory 12; the depth camera 10 and the memory 12 are connected to the processor 11.
The depth camera 10 is used to acquire the depth image sequence of the human body to be assessed. The depth image sequence may be captured by a single depth camera 10, or by a plurality of depth cameras 10 shooting from different angles.
The processor 11 is used to identify the body parts of the human body to be assessed from the depth image sequence and determine the human body reference point of the human body to be assessed; obtain the relative position relation between the body parts and the human body reference point; compare the relative position relation with the pre-stored standard relative position relation of the standard posture; and output the comparison result.
The processor 11 is further used to obtain the standard relative position relation between the body parts of the human body in the standard posture and the human body reference point.
The memory 12 is used to save the standard relative position relation.
The relative position relation is the relative position relation between the centroids of the body parts of the human body in the posture to be assessed and the human body reference point of that human body; the standard relative position relation is the relative position relation between the centroids of the body parts of the human body in the standard posture and the human body reference point of the human body in the standard posture.
The processor 11 is further used to obtain a first coordinate value of the human body reference point; obtain the centroids of the body parts and second coordinate values of the centroids; and calculate, from the first and second coordinate values, the Euclidean distances and cosine distances between the centroids of the body parts and the human body reference point, so as to form the to-be-assessed vector of the body posture of the human body.
The processor 11 is further used to calculate the Euclidean distances and cosine distances between the centroids of the body parts of the human body in the standard posture and the human body reference point, so as to form the standard vector of the standard body posture.
The processor 11 is further used to calculate the difference between the to-be-assessed vector and the standard vector and compare the difference with a preset threshold; if the difference is less than the preset threshold, the comparison result is judged to be that the body posture is the standard posture.
The processor 11 is further used, if the difference is greater than or equal to the preset threshold, to judge that the comparison result is that the body posture is not the standard posture, and to output an adjustment suggestion when outputting the comparison result.
The processor 11 is further used to remove the background in the depth image sequence; obtain the contour of the human body in the depth image sequence; identify the torso of the human body from the contour; identify the body parts of the human body from the torso; and obtain the human body reference point from the torso and the body parts.
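A much-simplified sketch of the first and last stages of that pipeline follows: background removal by depth gating, and a centroid of the foreground as a crude reference point. The torso and part segmentation stages are omitted, and every name here is a stand-in, not the patent's method.

```python
import numpy as np

def remove_background(depth, near, far):
    """Keep only pixels whose depth falls in the expected subject range."""
    return (depth > near) & (depth < far)

def reference_point(depth, mask):
    """Centroid of the foreground pixels as a crude body reference point.

    Returns (row, col, mean depth), or None if nothing was segmented.
    """
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        return None
    return (float(rows.mean()), float(cols.mean()), float(depth[mask].mean()))
```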
The present invention also provides a storage device. The storage device stores program data, and the program data can be executed to implement the human posture assessment method of any of the above embodiments.
For example, the storage device may be a portable storage medium such as a USB flash drive, a removable hard disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a magnetic disk or an optical disc. It can be understood that the storage device may also be a server or any other medium capable of storing program code.
In summary, the present invention can accurately discriminate the positions of the body parts and obtain accurate relative position relations, thereby improving the accuracy of the posture assessment result and, in turn, the training efficiency of each exercise.
The foregoing describes only embodiments of the present invention and is not intended to limit the scope of the invention. Any equivalent structure or equivalent process transformation made using the description and drawings of the present invention, whether applied directly or indirectly in other related technical fields, falls within the protection scope of the present invention.
Claims (11)
1. A human posture assessment method, characterized by comprising the following steps:
acquiring a depth image sequence of a human body to be assessed;
identifying the body parts of the human body to be assessed from the depth image sequence and determining a human body reference point of the human body to be assessed;
obtaining the relative position relation between the body parts and the human body reference point;
comparing the relative position relation with a pre-stored standard relative position relation of a standard posture;
outputting a comparison result.
2. The assessment method according to claim 1, characterized in that, before the step of comparing the relative position relation with the pre-stored standard relative position relation of the standard posture, the method further comprises:
obtaining the standard relative position relation between the body parts of the human body in the standard posture and the human body reference point;
saving the standard relative position relation.
3. The assessment method according to claim 2, characterized in that the relative position relation is the relative position relation between the centroids of the body parts of the human body in the posture to be assessed and the human body reference point of that human body, and the standard relative position relation is the relative position relation between the centroids of the body parts of the human body in the standard posture and the human body reference point of the human body in the standard posture;
the step of obtaining the relative position relation between the body parts and the human body reference point comprises:
obtaining a first coordinate value of the human body reference point;
obtaining the centroids of the body parts and second coordinate values of the centroids;
calculating, from the first coordinate value and the second coordinate values, the Euclidean distances and cosine distances between the centroids of the body parts and the human body reference point, so as to form a to-be-assessed vector of the body posture of the human body;
the step of obtaining the standard relative position relation between the body parts of the human body in the standard posture and the human body reference point comprises:
calculating the Euclidean distances and cosine distances between the centroids of the body parts of the human body in the standard posture and the human body reference point, so as to form a standard vector of the standard body posture;
the step of comparing the relative position relation with the pre-stored standard relative position relation of the standard posture comprises:
calculating the difference between the to-be-assessed vector and the standard vector;
comparing the difference with a preset threshold, and, if the difference is less than the preset threshold, judging that the comparison result is that the body posture is the standard posture.
4. The assessment method according to claim 3, characterized in that, in the step of comparing the difference with the preset threshold, if the difference is greater than or equal to the preset threshold, the comparison result is judged to be that the body posture is not the standard posture;
and the step of outputting the comparison result further comprises outputting an adjustment suggestion.
5. The assessment method according to claim 1, characterized in that the step of identifying the body parts of the human body to be assessed from the depth image sequence and determining the human body reference point of the human body to be assessed comprises:
removing the background in the depth image sequence;
obtaining the contour of the human body in the depth image sequence;
identifying the torso of the human body from the contour;
identifying the body parts of the human body from the torso;
obtaining the human body reference point from the torso and the body parts.
6. A human posture assessment apparatus, characterized by comprising a depth camera and a processor, the depth camera being connected to the processor;
wherein the depth camera is used to acquire a depth image sequence of a human body to be assessed;
the processor is used to identify the body parts of the human body to be assessed from the depth image sequence and determine a human body reference point of the human body to be assessed; obtain the relative position relation between the body parts and the human body reference point; compare the relative position relation with a pre-stored standard relative position relation of a standard posture; and output a comparison result.
7. The assessment apparatus according to claim 6, characterized by further comprising a memory connected to the processor;
the processor is used to obtain the standard relative position relation between the body parts of the human body in the standard posture and the human body reference point;
the memory is used to save the standard relative position relation.
8. The assessment apparatus according to claim 7, characterized in that the relative position relation is the relative position relation between the centroids of the body parts of the human body in the posture to be assessed and the human body reference point of that human body, and the standard relative position relation is the relative position relation between the centroids of the body parts of the human body in the standard posture and the human body reference point of the human body in the standard posture;
the processor is used to obtain a first coordinate value of the human body reference point; obtain the centroids of the body parts and second coordinate values of the centroids; and calculate, from the first coordinate value and the second coordinate values, the Euclidean distances and cosine distances between the centroids of the body parts and the human body reference point, so as to form a to-be-assessed vector of the body posture of the human body;
the processor is further used to calculate the Euclidean distances and cosine distances between the centroids of the body parts of the human body in the standard posture and the human body reference point, so as to form a standard vector of the standard body posture;
the processor is further used to calculate the difference between the to-be-assessed vector and the standard vector and compare the difference with a preset threshold; if the difference is less than the preset threshold, the comparison result is judged to be that the body posture is the standard posture.
9. The assessment apparatus according to claim 8, characterized in that the processor is further used, if the difference is greater than or equal to the preset threshold, to judge that the comparison result is that the body posture is not the standard posture, and to output an adjustment suggestion when outputting the comparison result.
10. The assessment apparatus according to claim 6, characterized in that the processor is further used to remove the background in the depth image sequence; obtain the contour of the human body in the depth image sequence; identify the torso of the human body from the contour; identify the body parts of the human body from the torso; and obtain the human body reference point from the torso and the body parts.
11. A storage device, characterized in that the storage device stores program data, and the program data can be executed to implement the method according to any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710386839.7A CN107392086B (en) | 2017-05-26 | 2017-05-26 | Human body posture assessment device, system and storage device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107392086A true CN107392086A (en) | 2017-11-24 |
CN107392086B CN107392086B (en) | 2020-11-03 |
Family
ID=60338372
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710386839.7A Active CN107392086B (en) | 2017-05-26 | 2017-05-26 | Human body posture assessment device, system and storage device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107392086B (en) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108256433A (en) * | 2017-12-22 | 2018-07-06 | 银河水滴科技(北京)有限公司 | A kind of athletic posture appraisal procedure and system |
CN108537284A (en) * | 2018-04-13 | 2018-09-14 | 东莞松山湖国际机器人研究院有限公司 | Posture assessment scoring method based on computer vision deep learning algorithm and system |
CN108573216A (en) * | 2018-03-20 | 2018-09-25 | 浙江大华技术股份有限公司 | A kind of limbs posture judgment method and device |
CN108846996A (en) * | 2018-08-06 | 2018-11-20 | 浙江理工大学 | One kind falling down detecting system and method |
CN109330602A (en) * | 2018-11-01 | 2019-02-15 | 中山市人民医院 | A kind of woman body intelligent evaluation detection device and method, storage medium |
CN109829442A (en) * | 2019-02-22 | 2019-05-31 | 焦点科技股份有限公司 | A kind of method and system of the human action scoring based on camera |
CN110278389A (en) * | 2018-03-13 | 2019-09-24 | 上海西门子医疗器械有限公司 | Imaging method, device, system and the storage medium of x-ray image |
CN110321754A (en) * | 2018-03-28 | 2019-10-11 | 西安铭宇信息科技有限公司 | A kind of human motion posture correcting method based on computer vision and system |
CN111783702A (en) * | 2020-07-20 | 2020-10-16 | 杭州叙简科技股份有限公司 | Efficient pedestrian tumble detection method based on image enhancement algorithm and human body key point positioning |
CN111862296A (en) * | 2019-04-24 | 2020-10-30 | 京东方科技集团股份有限公司 | Three-dimensional reconstruction method, three-dimensional reconstruction device, three-dimensional reconstruction system, model training method and storage medium |
CN113033552A (en) * | 2021-03-19 | 2021-06-25 | 北京字跳网络技术有限公司 | Text recognition method and device and electronic equipment |
CN113398556A (en) * | 2021-06-28 | 2021-09-17 | 浙江大学 | Push-up identification method and system |
CN113673492A (en) * | 2021-10-22 | 2021-11-19 | 科大讯飞(苏州)科技有限公司 | Human body posture evaluation method, electronic device and storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103390174A (en) * | 2012-05-07 | 2013-11-13 | 深圳泰山在线科技有限公司 | Physical education assisting system and method based on human body posture recognition |
CN104167016A (en) * | 2014-06-16 | 2014-11-26 | 西安工业大学 | Three-dimensional motion reconstruction method based on RGB color and depth image |
CN104318520A (en) * | 2014-09-28 | 2015-01-28 | 南通大学 | Pixel local area direction detection method |
KR20160098560A (en) * | 2015-02-09 | 2016-08-19 | 한국전자통신연구원 | Apparatus and methdo for analayzing motion |
CN106056053A (en) * | 2016-05-23 | 2016-10-26 | 西安电子科技大学 | Human posture recognition method based on skeleton feature point extraction |
CN106250867A (en) * | 2016-08-12 | 2016-12-21 | 南京华捷艾米软件科技有限公司 | A kind of skeleton based on depth data follows the tracks of the implementation method of system |
Also Published As
Publication number | Publication date |
---|---|
CN107392086B (en) | 2020-11-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107392086A (en) | Apparatus for evaluating, system and the storage device of human body attitude | |
CN107335192A (en) | Move supplemental training method, apparatus and storage device | |
US9235753B2 (en) | Extraction of skeletons from 3D maps | |
CN103778635B (en) | For the method and apparatus processing data | |
CN104867126B (en) | Based on point to constraint and the diameter radar image method for registering for changing region of network of triangle | |
CN109544612A (en) | Point cloud registration method based on the description of characteristic point geometric jacquard patterning unit surface | |
CN104899918B (en) | The three-dimensional environment modeling method and system of a kind of unmanned plane | |
CN104200200B (en) | Fusion depth information and half-tone information realize the system and method for Gait Recognition | |
CN108717531A (en) | Estimation method of human posture based on Faster R-CNN | |
EP2751777A1 (en) | Method for estimating a camera motion and for determining a three-dimensional model of a real environment | |
CN101604447A (en) | No-mark human body motion capture method | |
CN110060284A (en) | A kind of binocular vision environmental detecting system and method based on tactilely-perceptible | |
Yue et al. | Fast 3D modeling in complex environments using a single Kinect sensor | |
CN110059683A (en) | A kind of license plate sloped antidote of wide-angle based on end-to-end neural network | |
CN108921864A (en) | A kind of Light stripes center extraction method and device | |
CN107742306A (en) | Moving Target Tracking Algorithm in a kind of intelligent vision | |
CN107341179A (en) | Generation method, device and the storage device of standard movement database | |
CN107239744A (en) | Monitoring method, system and the storage device of human body incidence relation | |
CN107507188B (en) | Method and device for extracting image information based on machine learning | |
CN106650701A (en) | Binocular vision-based method and apparatus for detecting barrier in indoor shadow environment | |
CN111862315A (en) | Human body multi-size measuring method and system based on depth camera | |
CN107230226A (en) | Determination methods, device and the storage device of human body incidence relation | |
CN112288814A (en) | Three-dimensional tracking registration method for augmented reality | |
Senior | Real-time articulated human body tracking using silhouette information | |
Mahmoudi et al. | A new approach for cervical vertebrae segmentation |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |