CN101650776A - Method and system for tracking position of human limbs - Google Patents

Method and system for tracking position of human limbs

Info

Publication number
CN101650776A
Authority
CN
China
Prior art keywords
limbs
candidate
profile
depth
tracking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN200810146100A
Other languages
Chinese (zh)
Other versions
CN101650776B (en)
Inventor
陈柏戎
郭建春
王科翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Industrial Technology Research Institute ITRI
Original Assignee
Industrial Technology Research Institute ITRI
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Industrial Technology Research Institute ITRI filed Critical Industrial Technology Research Institute ITRI
Priority to CN200810146100
Publication of CN101650776A
Application granted
Publication of CN101650776B
Active legal status: Current
Anticipated expiration


Landscapes

  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a method for tracking the position of human limbs. The method comprises the following steps: acquiring a left image of a first limb part with a first camera, and obtaining a contour candidate position of the first limb part according to feature information of the left image; acquiring a right image of the first limb part with a second camera, and obtaining a depth candidate position of the first limb part according to depth information of the right image; and calculating the geometric relationship between the contour candidate position, the depth candidate position, and a second position of a second limb part, and deciding, according to this geometric relationship, whether the current limb position of the first limb part should be updated.

Description

Method and system for tracking the position of human limbs
Technical field
The present invention relates to an image tracking method, and more particularly to a method and system for tracking the position of human limbs.
Background
In the field of video recognition and analysis, the recognition and tracking of a person's face, hands, or body limbs has long been a very active research area. However, most existing commercial technologies interact only with the whole-body silhouette and cannot distinguish fine movements such as those of the four limbs or the fingers, which greatly limits their applications. Existing motion-sensing interaction either responds only to local motion or relies only on the whole-body contour, and is therefore hard pressed to distinguish complex actions, for example raising the left hand, raising the right hand, raising both hands, or kicking the right leg.
The present invention therefore provides a method and system for tracking the position of human limbs that can effectively track and distinguish the positions of the hands, feet, and limbs, and that offers a richer form of limb-based interaction applicable to fields such as game control, motion analysis, and multimedia presentation.
Summary of the invention
One embodiment of the invention discloses a method for tracking the position of human limbs. A left image of a first limb part is captured via a first camera, and a contour candidate position of the first limb part is obtained according to feature information of the left image. A right image of the first limb part is captured via a second camera, and a depth candidate position of the first limb part is obtained according to depth information of the right image. The geometric relationship between the contour candidate position, the depth candidate position, and a second position of a second limb part is calculated, and whether a current position of the first limb part should be updated is decided according to this geometric relationship.
A further embodiment of the invention discloses a system for tracking the position of human limbs, comprising a first camera, a second camera, and a computing device. The computing device captures a left image of a first limb part via the first camera and a right image of the first limb part via the second camera, obtains a contour candidate position of the first limb part according to feature information of the left image, obtains a depth candidate position of the first limb part according to depth information of the right image, calculates the geometric relationship between the contour candidate position, the depth candidate position, and a second position of a second limb part, and decides, according to this geometric relationship, whether to update a current position of the first limb part.
Description of drawings
Fig. 1 is a schematic diagram of the system architecture for tracking the position of human limbs according to an embodiment of the invention;
Fig. 2 is a flowchart of the steps of the method for tracking the position of human limbs according to an embodiment of the invention.
[Description of reference numerals]
100: computing device
200, 300: cameras
S201-S212: method steps
Embodiment
To make the objects, features, and advantages of the present invention more apparent, embodiments are described in detail below with reference to Fig. 1 and Fig. 2. The specification provides different embodiments to describe the technical features of different implementations of the invention. The arrangement of the elements in the embodiments is for illustration only and is not intended to limit the invention. Reference numerals are partly repeated among the embodiments for simplicity of description and do not imply any relationship between the different embodiments.
Embodiments of the invention disclose a method and system for tracking the position of human limbs.
The method and system of the embodiments use two cameras to capture successive images of the human limbs, and determine and track the limb positions according to characteristic regions of the captured images and the depth information derived from the disparity between the two images.
Fig. 1 is a schematic diagram of the system architecture for tracking the position of human limbs according to an embodiment of the invention.
The system for tracking the position of human limbs of this embodiment comprises a computing device 100 and two cameras 200 and 300. Cameras 200 and 300 capture a left image and a right image of a person in real time, respectively, and transmit the captured left and right images to the computing device 100, which determines and tracks the person's position. The method is described further with reference to the flowchart of Fig. 2.
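A minimal sketch of the stereo capture described above, assuming the two cameras are exposed to the host as OpenCV capture devices 0 and 1; the device indices, the capture_stereo_pair helper, and the use of OpenCV are illustrative assumptions rather than anything mandated by the patent:

import cv2

left_cam = cv2.VideoCapture(0)    # e.g. camera 200, supplying the left image
right_cam = cv2.VideoCapture(1)   # e.g. camera 300, supplying the right image

def capture_stereo_pair():
    # Grab one left/right frame pair to hand to the computing device 100.
    ok_left, left_img = left_cam.read()
    ok_right, right_img = right_cam.read()
    if not (ok_left and ok_right):
        raise RuntimeError("failed to read from one of the cameras")
    return left_img, right_img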
Fig. 2 is a flowchart of the steps of the method for tracking the position of human limbs according to an embodiment of the invention.
The computing device 100 captures, via camera 200, a left image of a limb part of the person (for example, the left hand, i.e. the first limb part), removes the background from the left image, and performs contour/edge processing to obtain feature information of the limb part (step S201). According to the feature information of the left image, the computing device 100 obtains a candidate position of the contour/edge of the limb part, referred to below as the contour candidate position (step S202). It then takes the position obtained by the previous tracking of the limb part (step S212), calculates the distance between the contour candidate position obtained in step S202 and that previously tracked position, and from this distance derives the contour/edge tracking confidence of the limb part, referred to below as the contour tracking confidence (step S203). The contour tracking confidence is inversely proportional to this distance: the closer the candidate is to the previously tracked position, the higher the confidence.
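A minimal sketch of the contour tracking confidence of step S203, assuming 2-D image coordinates; the patent only states that the confidence is inversely related to the distance, so the 1/(1 + d) form and the function name are illustrative assumptions:

import math

def contour_tracking_confidence(contour_candidate_xy, last_tracked_xy):
    # Distance between the contour candidate position (step S202) and the
    # position obtained by the previous tracking (step S212).
    dx = contour_candidate_xy[0] - last_tracked_xy[0]
    dy = contour_candidate_xy[1] - last_tracked_xy[1]
    distance = math.hypot(dx, dy)
    # Confidence falls as the distance grows (step S203).
    return 1.0 / (1.0 + distance)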
The computing device 100 then calculates a geometric confidence of the contour candidate position, referred to below as the contour geometry confidence, according to the position of the person's body axis or head (i.e. the second limb part) (step S204). The farther the contour candidate position is from the body axis or head position, the higher its confidence; the closer it is, the lower its confidence, although the relation is not linear. The region occupied by the person's torso can be roughly estimated from the feature information of the limb part: when the candidate lies beyond that region by more than a preset distance, the geometry confidence is high, whereas when it is near or inside that region the confidence drops sharply.
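A minimal sketch of the contour geometry confidence of step S204, assuming the torso region is available as an axis-aligned box and that a quadratic ramp stands in for the unspecified non-linear relation; the box representation, the margin value, and the quadratic shape are illustrative assumptions:

def contour_geometry_confidence(candidate_xy, torso_box, margin=40.0):
    # torso_box = (x_min, y_min, x_max, y_max): rough torso region estimated
    # from the limb feature information; margin is the preset distance.
    x, y = candidate_xy
    x_min, y_min, x_max, y_max = torso_box
    # Distance from the candidate to the torso box (0 if inside the box).
    dx = max(x_min - x, 0.0, x - x_max)
    dy = max(y_min - y, 0.0, y - y_max)
    distance = (dx * dx + dy * dy) ** 0.5
    if distance >= margin:
        return 1.0        # well outside the torso region: high confidence
    # Near or inside the torso region: confidence drops sharply (non-linear).
    return (distance / margin) ** 2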
The computing device 100 captures, via camera 300, a right image of the same limb part of the person (for example, the left hand), and calculates the depth information of the limb part from the left image and the right image (step S205). According to the depth information of the right image, the computing device 100 obtains a depth candidate position of the limb part (step S206). It then takes the position obtained by the previous tracking of the limb part (step S212), calculates the distance between the depth candidate position and that previously tracked position, and from this distance derives the depth tracking confidence of the limb part (step S207).
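A minimal sketch of steps S205 to S207, assuming rectified grayscale stereo images; OpenCV block matching is used here as one possible way to obtain the depth (disparity) information, and the nearest-region rule for picking the depth candidate is an illustrative assumption, not the patent's prescribed method:

import cv2
import numpy as np

def depth_candidate_and_confidence(left_gray, right_gray, last_tracked_xy):
    # Step S205: depth information computed from the left/right image pair.
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    # Step S206: take the pixel with the largest disparity (closest to the
    # cameras) as the depth candidate position -- an illustrative rule.
    y, x = np.unravel_index(np.argmax(disparity), disparity.shape)
    depth_candidate_xy = (float(x), float(y))
    # Step S207: depth tracking confidence from the distance to the position
    # obtained by the previous tracking (step S212).
    distance = np.hypot(depth_candidate_xy[0] - last_tracked_xy[0],
                        depth_candidate_xy[1] - last_tracked_xy[1])
    return depth_candidate_xy, 1.0 / (1.0 + distance)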
Next, the computing device 100 calculates a geometric confidence of the depth candidate position, referred to below as the depth geometry confidence, according to the body axis or head position (step S208). The computing device 100 compares the contour tracking confidence with the depth tracking confidence and the contour geometry confidence with the depth geometry confidence, and after this evaluation selects the highest of the contour/depth tracking confidences and contour/depth geometry confidences (step S209). The selected confidence is then compared with a contour/depth confidence threshold to determine whether it is greater than that threshold (step S210).
If the selected confidence is greater than the contour/depth confidence threshold, the candidate position corresponding to the selected confidence becomes the new current position of the limb part (step S211). If the selected confidence is not greater than the threshold, the tracking is regarded as invalid, and the position obtained by the previous tracking is kept as the current position of the limb part (step S212).
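A minimal sketch of the decision of steps S209 to S212, assuming the four confidences and the two candidate positions have already been computed as above; the single shared threshold value and the tie-breaking behaviour of max() are illustrative assumptions:

def update_limb_position(contour_candidate, depth_candidate,
                         contour_track_conf, contour_geom_conf,
                         depth_track_conf, depth_geom_conf,
                         last_tracked_xy, threshold=0.5):
    # Step S209: pick the candidate whose confidence is highest.
    scored = [
        (contour_track_conf, contour_candidate),
        (contour_geom_conf, contour_candidate),
        (depth_track_conf, depth_candidate),
        (depth_geom_conf, depth_candidate),
    ]
    best_conf, best_candidate = max(scored, key=lambda item: item[0])
    # Steps S210/S211: update only if the confidence clears the threshold.
    if best_conf > threshold:
        return best_candidate
    # Step S212: otherwise keep the previously tracked position.
    return last_tracked_xy

In this sketch the tracking and geometry confidences compete directly; a weighted combination would be an equally plausible reading of step S209.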
Note that the first limb part may be the head, a hand, a foot, an elbow, a finger, or another human limb part, or a marker attached to such a limb part.
Note that the first position of the first limb part, the contour candidate position, the depth candidate position, and the second position of the second limb part are positions in a two-dimensional image, and may be represented by image coordinates or image blocks.
Note that the above method may also use a turning point of the contour or edge of the first limb part as the contour candidate position.
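A minimal sketch of extracting such turning points with OpenCV, assuming a binary foreground mask of the first limb part is already available; how that mask is produced and the polygon approximation tolerance are assumptions made for illustration:

import cv2

def contour_turning_points(limb_mask, epsilon_ratio=0.01):
    # External contours of the limb region in the mask.
    contours, _ = cv2.findContours(limb_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    points = []
    for contour in contours:
        # Approximate the contour by a polygon; its vertices act as the
        # turning points used as contour candidate positions.
        epsilon = epsilon_ratio * cv2.arcLength(contour, True)
        approx = cv2.approxPolyDP(contour, epsilon, True)
        points.extend((int(p[0][0]), int(p[0][1])) for p in approx)
    return points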
Note that the above method may also calculate the depth information from the degree of difference between the two images of the first limb part.
Note that the above method may also calculate the depth information from the degree of difference between local regions of the two images of the first limb part.
Note that the above method may also calculate the depth information by means of a three-dimensional reconstruction method.
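A minimal sketch of the local-region comparison mentioned in the notes above, assuming rectified grayscale images (so matching stays on one row) and a pixel far enough from the image border; the window size, search range, and sum-of-absolute-differences cost are illustrative assumptions:

import numpy as np

def local_disparity(left_gray, right_gray, x, y, window=7, max_disp=64):
    # Compare a small region around (x, y) in the left image with
    # horizontally shifted regions in the right image.
    h = window // 2
    patch_left = left_gray[y - h:y + h + 1, x - h:x + h + 1].astype(np.int32)
    best_disp, best_cost = 0, None
    for d in range(max_disp):
        xr = x - d
        if xr - h < 0:
            break
        patch_right = right_gray[y - h:y + h + 1,
                                 xr - h:xr + h + 1].astype(np.int32)
        cost = np.abs(patch_left - patch_right).sum()  # degree of difference
        if best_cost is None or cost < best_cost:
            best_cost, best_disp = cost, d
    return best_disp  # larger disparity means the region is closer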
Note that the second limb part may be the head, the face, the neck, the Weber point, the body axis, the waist, a position above or below the hips, or any other human limb part that can serve as a reference, or a marker attached to such a limb part.
Note that the geometric relationship includes calculating the distance between the contour candidate position and the second position of the second limb part, or calculating the distance between S and Q.
Note that the geometric relationship includes calculating the distance between the contour candidate position and the second position of the second limb part, or the distance between the depth candidate position and the second position of the second limb part.
Note that the geometric relationship includes: the closer the contour candidate position or the depth candidate position is to the second position of the second limb part, the higher the possibility of updating the first position of the first limb part to the candidate at the closer distance.
Note that the geometric relationship includes: the farther the contour candidate position is from the second position of the second limb part, the higher the possibility of updating the first position of the first limb part to the contour candidate position, and the closer it is, the lower that possibility.
Note that the geometric relationship includes: the larger the gap between the coordinate of the contour candidate position along a specific axis and the coordinate of the second position of the second limb part along that axis, the higher the possibility of updating the first position of the first limb part to the contour candidate position, and the smaller the gap, the lower that possibility.
Note that the geometric relationship includes: the smaller the gap between the coordinate of the depth candidate position along a specific axis and the coordinate of the second position of the second limb part along that axis, the higher the possibility of updating the first position of the first limb part to the depth candidate position, and the larger the gap, the lower that possibility.
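A minimal sketch of the two axis-gap rules in the preceding notes, assuming the vertical image axis as the "specific axis" and a simple linear ramp as the mapping from gap to update possibility; both choices, and the scale value, are illustrative assumptions:

def axis_gap_weights(contour_xy, depth_xy, second_xy, axis=1, scale=100.0):
    # Gap along the chosen axis between each candidate position and the
    # second position (e.g. the body axis or head).
    contour_gap = abs(contour_xy[axis] - second_xy[axis])
    depth_gap = abs(depth_xy[axis] - second_xy[axis])
    # Contour candidate: a larger gap raises the update possibility.
    contour_weight = min(contour_gap / scale, 1.0)
    # Depth candidate: a smaller gap raises the update possibility.
    depth_weight = max(1.0 - depth_gap / scale, 0.0)
    return contour_weight, depth_weight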
Note that the above method also includes deciding whether the first position of the first limb part should be updated to the contour candidate position or to the depth candidate position.
The invention also provides a recording medium (for example an optical disc, a floppy disk, a removable hard drive, or the like) that records a computer-readable program for performing the method of tracking the position of human limbs described above. The program stored on the recording medium is essentially composed of a plurality of code segments (for example, an organization-setup code segment, a sign-off-form code segment, a configuration code segment, and a deployment code segment), and the functions of these code segments correspond to the steps of the above method and to the functional blocks of the above system.
Although the present invention has been disclosed above by way of embodiments, they are not intended to limit the invention. Anyone skilled in the art may make various changes and modifications without departing from the spirit and scope of the invention; the scope of protection of the invention is therefore defined by the appended claims.

Claims (40)

1. A method for tracking the position of human limbs, characterized by comprising the following steps:
capturing a left image of a first limb part via a first camera;
obtaining a contour candidate position of the first limb part according to feature information of the left image;
capturing a right image of the first limb part via a second camera;
obtaining a depth candidate position of the first limb part according to depth information of the right image;
calculating a geometric relationship between the contour candidate position, the depth candidate position, and a second position of a second limb part; and
deciding, according to the geometric relationship, whether to update a current position of the first limb part.
2. the method for tracking position of human limbs as claimed in claim 1 is characterized in that, also comprises the following steps:
The processing that this left side image is removed background and edge contour is to obtain the characteristic information at this first limbs position;
Obtain this profile position candidate at the profile/edge at this first limbs position according to this characteristic information of this left side image;
Follow the trail of a position of gained according to last time at this first limbs position, calculate one first distance between this profile position candidate and this position, and then calculate a Contour tracing confidence level at this first limbs position; And
Calculate how much confidence levels of a profile of this profile position candidate according to this second position at this second limbs position.
3. the method for tracking position of human limbs as claimed in claim 2 is characterized in that, also comprises the following steps:
Calculate the depth information at this first limbs position according to this left side image and this right side eiconometer;
Obtain this degree of depth position candidate at this first limbs position according to this depth information of this right side image;
Follow the trail of this position of gained according to last time at this first limbs position, calculate the second distance between this degree of depth position candidate and this position, and then calculate the degree of depth tracking confidence level at this first limbs position; And
Calculate a depth geometry confidence level of this degree of depth position candidate according to this second position at this second limbs position.
4. the method for tracking position of human limbs as claimed in claim 3 is characterized in that, also comprises the following steps:
This Contour tracing confidence level and this degree of depth are followed the trail of that confidence level is compared and how much confidence levels of profile are compared with the depth geometry confidence level, to choose the higher wherein confidence level of confidence level;
This confidence level of choosing is compared with a confidence level threshold value, to judge that whether this confidence level of choosing is greater than this confidence level threshold value;
If this confidence level of choosing is greater than this confidence level threshold value, then will be updated to a present position at this first limbs position to a position candidate of the confidence level that should choose; And
If this confidence level of choosing is not more than this confidence level threshold value, then followed the trail of this position of gained this present position the last time as this first limbs position.
5. the method for tracking position of human limbs as claimed in claim 2 is characterized in that, this Contour tracing confidence level and this first distance are inversely proportional to.
6. the method for tracking position of human limbs as claimed in claim 1 is characterized in that, this profile position candidate is directly proportional with this second position from this second limbs position but is not linear.
7. the method for tracking position of human limbs as claimed in claim 1 is characterized in that, this first limbs position is head, hand, pin, elbow, finger or other human body limb position, or is positioned at the label on this human body limb position.
8. the method for tracking position of human limbs as claimed in claim 1, it is characterized in that, this second position at one first position at this first limbs position, this profile position candidate, this degree of depth position candidate or this second limbs position is a bidimensional image, and represents with image coordinate or image block.
9. the method for tracking position of human limbs as claimed in claim 1 is characterized in that, comprises that also turning point with the profile at this first limbs position or edge is as this profile position candidate.
10. the method for tracking position of human limbs as claimed in claim 1 is characterized in that, comprises that also the diversity factor of two images that utilize this first limbs position is calculated this depth information.
11. the method for tracking position of human limbs as claimed in claim 1 is characterized in that, comprises that also the diversity factor of the regional area in two images that utilize this first limbs position is calculated this depth information.
12. the method for tracking position of human limbs as claimed in claim 1 is characterized in that, also comprises utilizing a three-dimensional rebuilding method to calculate this depth information.
13. the method for tracking position of human limbs as claimed in claim 1, it is characterized in that, this second limbs position is head, face, neck, weber's point, health axis, waist location, hip upper/lower positions or the human body limb position that can be used for reference, or is positioned at the label on this human body limb position.
14. the method for tracking position of human limbs as claimed in claim 1 is characterized in that, this geometric relationship comprises the distance between this second position that calculates this profile position candidate and this second limbs position or calculates the distance of S and Q.
15. the method for tracking position of human limbs as claimed in claim 1, it is characterized in that, this geometric relationship comprise the distance between this second position that calculates this profile position candidate and this second limbs position or calculate this degree of depth position candidate and this second position at this second limbs position between distance.
16. the method for tracking position of human limbs as claimed in claim 1, it is characterized in that, this geometric relationship comprises when this profile position candidate or this degree of depth position candidate and healing when near apart from this second position at this second limbs position, improves one first position that upgrades this first limbs position possibility to closer distance.
17. the method for tracking position of human limbs as claimed in claim 1, it is characterized in that, this geometric relationship comprises when this profile position candidate heals from T and improves one first position that upgrades this first limbs position possibility to this profile position candidate when far away, then reduces this possibility when near when healing.
18. the method for tracking position of human limbs as claimed in claim 1, it is characterized in that, this geometric relationship comprise this profile position candidate a specific axis to coordinate and the coordinate gap that makes progress in this specific axis of this second position at this second limbs position heal when big, improve to upgrade the possibility of one first position at this first limbs position, when gap more hour then reduces this possibility to this profile position candidate.
19. the method for tracking position of human limbs as claimed in claim 1, it is characterized in that, this geometric relationship comprise this degree of depth position candidate a specific axis to coordinate and the coordinate gap that makes progress in this specific axis of this second position at this second limbs position more hour, improve to upgrade the possibility of one first position at this first limbs position, then reduce this possibility when big when gap heals to this degree of depth position candidate.
20. the method for tracking position of human limbs as claimed in claim 1 is characterized in that, also comprises judging whether to upgrade one first position at this first limbs position to being this profile position candidate or this degree of depth position candidate.
21. A system for tracking the position of human limbs, characterized by comprising:
a first camera;
a second camera; and
a computing device that captures a left image of a first limb part via the first camera and a right image of the first limb part via the second camera, obtains a contour candidate position of the first limb part according to feature information of the left image, obtains a depth candidate position of the first limb part according to depth information of the right image, calculates a geometric relationship between the contour candidate position, the depth candidate position, and a second position of a second limb part, and decides, according to the geometric relationship, whether to update a current position of the first limb part.
22. The system for tracking the position of human limbs as claimed in claim 21, characterized in that the computing device removes the background from the left image and performs edge/contour processing to obtain the feature information of the first limb part, obtains the contour candidate position of the contour/edge of the first limb part according to the feature information of the left image, calculates, according to a position obtained by the previous tracking of the first limb part, a first distance between the contour candidate position and that position and thereby a contour tracking confidence of the first limb part, and calculates a contour geometry confidence of the contour candidate position according to the second position of the second limb part.
23. The system for tracking the position of human limbs as claimed in claim 22, characterized in that the computing device calculates the depth information of the first limb part from the left image and the right image, obtains the depth candidate position of the first limb part according to the depth information of the right image, calculates, according to the position obtained by the previous tracking of the first limb part, a second distance between the depth candidate position and that position and thereby a depth tracking confidence of the first limb part, and calculates a depth geometry confidence of the depth candidate position according to the second position of the second limb part.
24. The system for tracking the position of human limbs as claimed in claim 23, characterized in that the computing device compares the contour tracking confidence with the depth tracking confidence and the contour geometry confidence with the depth geometry confidence, selects the highest of these confidences, and compares the selected confidence with a confidence threshold to determine whether the selected confidence is greater than the confidence threshold; if the selected confidence is greater than the confidence threshold, the computing device updates the current position of the first limb part to the candidate position corresponding to the selected confidence, and if the selected confidence is not greater than the confidence threshold, the computing device takes the position obtained by the previous tracking as the current position of the first limb part.
25. The system for tracking the position of human limbs as claimed in claim 22, characterized in that the contour tracking confidence is inversely proportional to the first distance.
26. The system for tracking the position of human limbs as claimed in claim 21, characterized in that the confidence of the contour candidate position is proportional, but not linear, to its distance from the second position of the second limb part.
27. The system for tracking the position of human limbs as claimed in claim 21, characterized in that the first limb part is the head, a hand, a foot, an elbow, a finger, or another human limb part, or a marker attached to the human limb part.
28. The system for tracking the position of human limbs as claimed in claim 21, characterized in that a first position of the first limb part, the contour candidate position, the depth candidate position, or the second position of the second limb part is in a two-dimensional image and is represented by image coordinates or an image block.
29. The system for tracking the position of human limbs as claimed in claim 21, characterized in that a turning point of the contour or edge of the first limb part is used as the contour candidate position.
30. The system for tracking the position of human limbs as claimed in claim 21, characterized in that the computing device calculates the depth information from the degree of difference between two images of the first limb part.
31. The system for tracking the position of human limbs as claimed in claim 21, characterized in that the computing device calculates the depth information from the degree of difference between local regions of two images of the first limb part.
32. The system for tracking the position of human limbs as claimed in claim 21, characterized in that the computing device calculates the depth information by a three-dimensional reconstruction method.
33. The system for tracking the position of human limbs as claimed in claim 21, characterized in that the second limb part is the head, the face, the neck, the Weber point, the body axis, the waist, a position above or below the hips, or another human limb part that can serve as a reference, or a marker attached to the human limb part.
34. The system for tracking the position of human limbs as claimed in claim 21, characterized in that the geometric relationship comprises calculating the distance between the contour candidate position and the second position of the second limb part or calculating the distance between S and Q.
35. The system for tracking the position of human limbs as claimed in claim 21, characterized in that the geometric relationship comprises calculating the distance between the contour candidate position and the second position of the second limb part or the distance between the depth candidate position and the second position of the second limb part.
36. The system for tracking the position of human limbs as claimed in claim 21, characterized in that the geometric relationship comprises: the closer the contour candidate position or the depth candidate position is to the second position of the second limb part, the higher the possibility of updating a first position of the first limb part to the candidate at the closer distance.
37. The system for tracking the position of human limbs as claimed in claim 21, characterized in that the geometric relationship comprises: the farther the contour candidate position is from the second position of the second limb part, the higher the possibility of updating a first position of the first limb part to the contour candidate position, and the closer it is, the lower that possibility.
38. The system for tracking the position of human limbs as claimed in claim 21, characterized in that the geometric relationship comprises: the larger the gap between the coordinate of the contour candidate position along a specific axis and the coordinate of the second position of the second limb part along that axis, the higher the possibility of updating a first position of the first limb part to the contour candidate position, and the smaller the gap, the lower that possibility.
39. The system for tracking the position of human limbs as claimed in claim 21, characterized in that the geometric relationship comprises: the smaller the gap between the coordinate of the depth candidate position along a specific axis and the coordinate of the second position of the second limb part along that axis, the higher the possibility of updating a first position of the first limb part to the depth candidate position, and the larger the gap, the lower that possibility.
40. The system for tracking the position of human limbs as claimed in claim 21, characterized in that the computing device decides whether a first position of the first limb part should be updated to the contour candidate position or to the depth candidate position.
CN 200810146100 2008-08-12 2008-08-12 Method and system for tracking position of human limbs Active CN101650776B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 200810146100 CN101650776B (en) 2008-08-12 2008-08-12 Method and system for tracking position of human limbs

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 200810146100 CN101650776B (en) 2008-08-12 2008-08-12 Method and system for tracking position of human limbs

Publications (2)

Publication Number Publication Date
CN101650776A 2010-02-17
CN101650776B CN101650776B (en) 2013-03-20

Family

ID=41673013

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 200810146100 Active CN101650776B (en) 2008-08-12 2008-08-12 Method and system for tracking position of human limbs

Country Status (1)

Country Link
CN (1) CN101650776B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105787498A (en) * 2014-12-25 2016-07-20 财团法人车辆研究测试中心 Pedestrian detection system
CN108885087A (en) * 2016-03-28 2018-11-23 日本电气方案创新株式会社 Measuring device, measurement method and computer readable recording medium
CN111353355A (en) * 2018-12-24 2020-06-30 财团法人工业技术研究院 Motion tracking system and method
CN111632285A (en) * 2020-05-28 2020-09-08 杜颖 Joint gout treatment device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113570616B (en) * 2021-06-10 2022-05-13 北京医准智能科技有限公司 Image processing method, image processing device, electronic equipment and computer readable storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100186385B1 (en) * 1996-04-27 1999-05-01 이종수 Image cognizing apparatus
CN1753028A (en) * 2005-09-15 2006-03-29 上海交通大学 Human limb three dimensional motion parameter estimation method based on skeleton
CN101211411B (en) * 2007-12-21 2010-06-16 北京中星微电子有限公司 Human body detection process and device

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105787498A (en) * 2014-12-25 2016-07-20 财团法人车辆研究测试中心 Pedestrian detection system
CN105787498B (en) * 2014-12-25 2019-05-10 财团法人车辆研究测试中心 Pedestrian's detecting system
CN108885087A (en) * 2016-03-28 2018-11-23 日本电气方案创新株式会社 Measuring device, measurement method and computer readable recording medium
US10796449B2 (en) 2016-03-28 2020-10-06 Nec Solution Innovators, Ltd. Measurement device, measurement method, and computer readable recording medium
CN111353355A (en) * 2018-12-24 2020-06-30 财团法人工业技术研究院 Motion tracking system and method
CN111353355B (en) * 2018-12-24 2023-09-19 财团法人工业技术研究院 Motion tracking system and method
CN111632285A (en) * 2020-05-28 2020-09-08 杜颖 Joint gout treatment device

Also Published As

Publication number Publication date
CN101650776B (en) 2013-03-20

Similar Documents

Publication Publication Date Title
CN102609683B (en) Automatic labeling method for human joint based on monocular video
CN101650776B (en) Method and system for tracking position of human limbs
Chaudhari et al. Yog-guru: Real-time yoga pose correction system using deep learning methods
KR20110113152A (en) Apparatus, method and computer-readable medium providing marker-less motion capture of human
Ong et al. The efficacy of a video-based marker-less tracking system for gait analysis
CN108829233A (en) A kind of exchange method and device
KR20060021001A (en) Implementation of marker-less augmented reality and mixed reality system using object detecting method
JP2012141881A (en) Human body motion estimation device, human body motion estimation method and computer program
CN105205786B (en) A kind of picture depth restoration methods and electronic equipment
CN107195163B (en) A kind of alarm method, device and wearable device
Kang et al. Real-time tracking and recognition systems for interactive telemedicine health services
CN113229807A (en) Human body rehabilitation evaluation device, method, electronic device and storage medium
JP6635848B2 (en) Three-dimensional video data generation device, three-dimensional video data generation program, and method therefor
Alghamdi et al. Safe trajectory estimation at a pedestrian crossing to assist visually impaired people
KR101447958B1 (en) Method and apparatus for recognizing body point
Sanders et al. Kinematic parameters contributing to the production of spin in elite finger spin bowling
CN113288611B (en) Operation safety guarantee method and system based on electric wheelchair traveling scene
de Gusmao Lafayette et al. The virtual kinect
Dimiccoli Computer vision for egocentric (first-person) vision
JP2010205015A (en) Group behavior estimation device and service provision system
Wang et al. Ordered over-relaxation based Langevin Monte Carlo sampling for visual tracking
Chaki et al. Applications of binarization
Li et al. Real-time human tracking based on switching linear dynamic system combined with adaptive Meanshift tracker
Chen et al. Application of Gesture Estimation Method Based on Computer
CN110263702A (en) A kind of real-time three-dimensional gesture method for tracing based on method of geometry

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant