CN104951734A - Kinect detecting apparatus - Google Patents

Kinect detecting apparatus

Info

Publication number
CN104951734A
CN104951734A (application CN201410117933.9A)
Authority
CN
China
Prior art keywords
finger
image
body sense
point
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410117933.9A
Other languages
Chinese (zh)
Inventor
方宗舟
罗业鑫
庞宗璧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Anxuan Software Co Ltd
Original Assignee
Zhuhai Anxuan Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Anxuan Software Co Ltd filed Critical Zhuhai Anxuan Software Co Ltd
Priority to CN201410117933.9A priority Critical patent/CN104951734A/en
Publication of CN104951734A publication Critical patent/CN104951734A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The object of the present invention is to provide a body-sense (Kinect-style) detection apparatus for identifying at least one appearance feature of a human body or an animal. The apparatus comprises a lens module, an image sensing unit, a light source supply unit, an image processing unit and a body-sense identification module. The lens module is arranged beside the image sensing unit to form images, the light source supply unit provides a light source, the image processing unit controls the actions of the image sensing unit and the light source supply unit and receives the images sent by the image sensing unit, and the body-sense identification module receives the output of the image processing unit. The image processing unit controls the image sensing unit to sense images periodically, frame by frame, and controls the light source supply unit to extinguish for one frame after every n lit frames. According to the frame sensed while the light source supply unit is extinguished, it weakens the background interference of the other (lit) frames to increase their contrast, and then outputs those frames to the body-sense identification module.

Description

Body-sense detection apparatus
Technical field
The present invention relates to a body-sense detection apparatus, and in particular to a body-sense detection apparatus for identifying features of the human hand.
Background art
In recent years, systems based on image-driven gesture recognition have developed rapidly, especially gaming systems. Issuing commands by having a computer analyze the user's movements has become the most promising form of interaction, and making the interface between user and computer friendlier is an increasingly important problem. Traditional solutions, however, often require a sensor to be fitted on the user's finger; although this measure can increase the accuracy of hand-motion detection, it also increases the burden on the user. A better approach is to treat the user's hand directly as a command-issuing instrument, analyzing the user's hand movements by optical infrared sensing to input commands and control the computer's operating system or peripheral devices. However, such traditional optical image analysis methods are too complex and insufficiently stable.
Therefore, how to let users interact with a computer interface through bare-handed gestures is a problem demanding a prompt solution.
Summary of the invention
In view of the above problems, the present invention provides a body-sense detection apparatus that picks out candidate fingertip edge lines in an image and calculates a best finger direction and a best finger length, so as to identify hand features of the human body and allow users to interact with a computer interface through bare-handed gestures.
An object of the present invention is to provide a body-sense detection apparatus for identifying at least one appearance feature of a human body or an animal, comprising: a lens module, an image sensing unit, a light source supply unit, an image processing unit and a body-sense identification module. The image sensing unit is arranged beside the lens module to obtain an image, the light source supply unit provides a light source, the image processing unit controls the operation of the image sensing unit and the light source supply unit and receives the image sent by the image sensing unit, and the body-sense identification module receives the output of the image processing unit. The image processing unit controls the image sensing unit to sense the image periodically, frame by frame, controls the light source supply unit to extinguish for one frame after every n lit frames, and, according to the frame sensed by the image sensing unit while the light source supply unit is extinguished, weakens the background interference of the other frames to raise their contrast, then outputs the other frames to the body-sense identification module.
According to one embodiment of the present invention, the body-sense identification module identifies at least one hand feature of a human body by the following steps: picking out all candidate fingertip edge lines in each of multiple image blocks; calculating at least one finger edge direction from each candidate fingertip edge line; comparing each finger edge direction with a finger direction estimate obtained from the image block, saving the finger edge direction closest to the finger direction estimate as a best finger direction, and saving the finger length estimate corresponding to that closest direction as a best finger length; and judging at least one candidate finger edge line in the image blocks according to the best finger direction and the best finger length.
According to one embodiment of the present invention, the step of picking out all candidate fingertip edge lines in each image block further comprises: picking out all unidirectionally bent curves in each of the image blocks; and, among those unidirectionally bent curves, picking out at least one curve whose corresponding central angle is close to a predetermined value, as the candidate fingertip edge line.
According to one embodiment of the present invention, the predetermined value is π/2.
According to one embodiment of the present invention, the step of picking out all unidirectionally bent curves in each image block further comprises: calculating the gradient value and gradient direction of each pixel in the image block; extracting multiple image edges in the image block with the Canny edge detection operator; and judging each of the image edges one by one, such that when the gradient directions of any three consecutive points on an image edge are monotonically increasing or monotonically decreasing, the image edge is identified as a unidirectionally bent curve.
According to one embodiment of the present invention, the body-sense identification module calculates the point on the candidate fingertip edge line having the maximum distance to the line connecting the two endpoints of that edge line, takes this point as a fingertip vertex, and obtains a line through the fingertip vertex perpendicular to the line connecting the two endpoints of the candidate fingertip edge line, taking it as a finger centerline.
According to one embodiment of the present invention, the body-sense identification module obtains the intersection of the finger centerlines corresponding to the thumb and the middle finger, and takes it as a palm center point.
According to one embodiment of the present invention, the body-sense identification module, according to the distance between the palm center point and the end point of the finger centerline corresponding to the thumb, recognizes as a palm the circle formed with the palm center point as the center and that distance as the radius.
According to one embodiment of the present invention, the step of calculating at least one finger edge direction from each candidate fingertip edge line further comprises: from the two endpoints of the candidate fingertip edge line, substituting the finger length estimate and the finger direction estimate obtained from the image block into a predetermined equation to find the corresponding pixels in the image block; and calculating the at least one finger edge direction from the mean of the gradient directions of those pixels.
According to one embodiment of the present invention, the predetermined equation is:
tan(D_finger) = (y − y_n)/(x − x_n);
wherein x ∈ {x_n, …, x_n + L_max × cos(D_finger)}, y ∈ {y_n, …, y_n + L_max × sin(D_finger)}, n ∈ {1, 2}, D_finger represents a predetermined tangential direction at an endpoint of the candidate fingertip edge line, and L_max represents a predetermined maximum finger length satisfying:
L_max = W_estimate × max(R_L/W),
where W_estimate represents the distance between the two endpoints of the candidate fingertip edge line, max(R_L/W) is a predetermined value representing the maximum ratio of finger length to finger width, and the coordinates of the two endpoints of the fingertip edge line are (x_n, y_n) (n ∈ {1, 2}).
Brief description of the drawings
Fig. 1 shows a body-sense detection apparatus according to an embodiment of the present invention for identifying at least one appearance feature of a human body or an animal.
Fig. 2 shows a timing diagram used by the body-sense detection apparatus according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of the body-sense detection method performed by the body-sense identification module according to an embodiment of the present invention, picking out at least one hand feature of a human body in an image.
Fig. 4 shows a schematic diagram of a fingertip edge line.
Fig. 5 is a schematic diagram of another body-sense detection method performed by the body-sense identification module according to an embodiment of the present invention, picking out at least one hand feature of a human body in an image.
Fig. 6 shows a flowchart of the body-sense detection method performed by the body-sense identification module according to an embodiment of the present invention.
Fig. 7 shows a partial flowchart of the body-sense detection method performed by the body-sense identification module according to an embodiment of the present invention.
Fig. 8 shows another partial flowchart of the body-sense detection method performed by the body-sense identification module according to an embodiment of the present invention.
Fig. 9 shows a flowchart of another body-sense detection method performed by the body-sense identification module according to an embodiment of the present invention.
Fig. 10 shows a further partial flowchart of the body-sense detection method of an embodiment of the present invention.
Embodiment
To further illustrate the embodiments, the present invention provides drawings. These drawings form part of the disclosure of the present invention and mainly serve to illustrate the embodiments; together with the associated description in the specification, they explain the operating principles of the embodiments. With reference to these contents, those having ordinary knowledge in this art will understand other possible embodiments and the advantages of the present invention. Elements in the figures are not drawn to scale, and similar reference symbols are used to denote similar elements.
First, please refer to Fig. 1, which shows a body-sense detection apparatus according to an embodiment of the present invention for identifying at least one appearance feature of a human body or an animal. As shown in the figure, the body-sense detection apparatus 1 comprises a lens module 11, an image sensing unit 12 arranged beside the lens module 11, a light source supply unit 13, an image processing unit 14 and a body-sense identification module 15.
The lens module 11 comprises multiple lenses; through the optical characteristics of these lenses, it changes the path of the light entering from a light inlet, thereby forming an image on the image sensing unit 12.
In the present embodiment, the image sensing unit 12 is exemplified as a CMOS sensor and the light source supply unit 13 as a wall-emitting infrared light source, but the present invention is not limited to this. In other embodiments, the image sensing unit may be a sensor of another type, such as a CCD sensor, and the light source supply unit may be an indoor lamp, an RGB lamp comprising red, green and blue light sources, or a light source supply unit of another type.
The image processing unit 14 controls the operation of the light source supply unit 13 and the image sensing unit 12, making the light source supply unit 13 provide a light source and the image sensing unit 12 sense an image. In detail, the light source supply unit 13 is controlled by the image processing unit 14 to flicker in a pattern of extinguishing for one frame after every n lit frames, and the image sensing unit 12 is controlled by the image processing unit 14 to sense the image periodically, frame by frame. After each frame sensed while the light source supply unit 13 is lit, the image processing unit 14 operates to perform image processing on that frame.
Please also refer to the timing diagram of Fig. 2, which provides a demonstration example of the working timing of the image sensing unit 12, the light source supply unit 13 and the image processing unit 14 to illustrate the present embodiment. As can be seen from Fig. 2, the light source supply unit 13 in this example flickers by extinguishing for one frame after every four lit frames, and for each frame the image sensing unit 12 senses the image and sends it to the image processing unit 14. Therefore, the image processing unit 14 receives the frame sensed while the light source supply unit 13 is not lit, followed by the four frames sensed while the light source supply unit 13 is lit continuously. After receiving each of the four lit frames, the image processing unit 14 can, according to the frame sensed by the image sensing unit 12 while the light source supply unit 13 was extinguished, weaken the background interference of the lit frames to raise their contrast, and then output the lit frames to the body-sense identification module 15. After receiving the output of the image processing unit 14, the body-sense identification module 15 can perform a body-sense detection method to pick out at least one appearance feature of a human body or an animal.
By having the image processing unit 14 control the light source supply unit 13 and the image sensing unit 12 so that their operations are coordinated, the apparatus obtains both the image captured while the light source supply unit 13 is extinguished and the images captured while it is lit. The background light interference recorded in the extinguished-frame image is then used to weaken the background of the lit images, strengthening the contrast of the lit images and making the foreground object image more distinct. In other embodiments, the image processing unit 14 can further control the brightness, spectrum and color temperature of the light source supply unit 13 to optimize the contrast of each feature and provide high-quality images for object identification.
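The dark-frame subtraction described above can be sketched in a few lines. The following is an illustrative sketch only, not the patent's implementation: frames are modeled as flat lists of 8-bit gray values, and the function names are assumptions.

```python
def subtract_dark_frame(lit_frame, dark_frame):
    """Subtract the unlit (ambient-only) frame from a lit frame, clamping at 0.

    Removing the ambient component raises the relative contrast of the
    foreground illuminated by the apparatus's own light source.
    """
    return [max(0, lit - dark) for lit, dark in zip(lit_frame, dark_frame)]


def process_cycle(frames, n):
    """Given n lit frames followed by one dark frame, return the
    contrast-enhanced lit frames (the dark frame itself is not output)."""
    lit_frames, dark_frame = frames[:n], frames[n]
    return [subtract_dark_frame(f, dark_frame) for f in lit_frames]
```

In this toy model, a pixel that reads 100 under the lamp and 40 under ambient light alone is reported as 60, so purely ambient background regions fall toward zero.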
In other embodiments, if the image sensing unit operates at a frame rate of 90 fps, it is suggested that the light source supply unit flicker by extinguishing for one frame after every four lit frames, giving a report rate of 72; if the image sensing unit operates at 60 fps, it is suggested that the light source supply unit flicker by extinguishing for one frame after every nine lit frames, giving a report rate of 54. On the other hand, when the selected light source is an RGB lamp, the apparatus can adapt to environments such as indoors, outdoors, cloudy days or under spotlights: the control method is then to use the image captured while the light source supply unit is extinguished to analyze which of the infrared, red, green and blue spectra is weaker in the ambient light, and to enhance contrast by illuminating with the lamp of the weaker spectrum.
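The report rates quoted above follow directly from the lit/unlit duty cycle: with n lit frames per one dark frame, only n of every n + 1 sensed frames are output. A quick check of the two suggested settings (function name is illustrative):

```python
def report_rate(fps, n_lit):
    """Frames per second actually reported when every (n_lit + 1)-th
    sensed frame is a dark calibration frame: fps * n_lit / (n_lit + 1)."""
    return fps * n_lit // (n_lit + 1)
```

At 90 fps with four lit frames per dark frame this gives 90 × 4/5 = 72; at 60 fps with nine lit frames per dark frame it gives 60 × 9/10 = 54, matching the suggestions above.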
The body-sense identification module 15 can extract the contrast of an object image through features such as brightness, color and spectrum, and analyze the edge lines of the contrast to filter out objects with identifiable features. For the body-sense detection method performed by the body-sense identification module 15, please refer to the following description.
Please refer to Fig. 3 and Fig. 6, where Fig. 3 is a schematic diagram of the body-sense detection method performed by the body-sense identification module according to an embodiment of the present invention picking out at least one hand feature of a human body in an image, and Fig. 6 shows a flowchart of that method. As shown in the figures, each frame image can first, optionally, pass through a series of image processing steps, such as conversion from an RGB image to a 256-level grayscale image and Gaussian-convolution blurring to remove noise such as burrs and bright spots, and be divided into multiple image blocks. Then, all candidate fingertip edge lines are picked out in each of the image blocks (step S100). Please also refer to Fig. 7, which shows a partial flowchart of the body-sense detection method in more detail. As shown in Fig. 7, in order to pick out candidate fingertip edge lines, all unidirectionally bent curves are first picked out in each image block (step S110). Then, among these unidirectionally bent curves, at least one curve whose corresponding central angle is close to a predetermined value is picked out as a candidate fingertip edge line (step S120).
Please also refer to Fig. 8, which shows another partial flowchart of the body-sense detection method in more detail. Picking out all unidirectionally bent curves in each image block can proceed according to the flow of Fig. 8. In step S111, the gradient value and gradient direction of each pixel in the image block are computed. For example, Sobel convolutions can be applied to the image block in the horizontal and vertical directions, and the two cross gradients can then be used to calculate the gradient value and gradient direction of each pixel in the Euclidean plane. Then, in step S112, multiple image edges in the image block are extracted with the Canny edge detection operator; these image edges are preferably edge lines with relatively significant contrast. Then, in step S113, using the relation of gradient directions between consecutive pixels on an image edge, each of the image edges is judged one by one: when the gradient directions of any three consecutive points on an image edge are monotonically increasing or monotonically decreasing, the image edge is identified as a unidirectionally bent curve.
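Step S113 can be sketched as follows. This is an interpretive sketch, not the patent's code: it reads the "any three consecutive points monotonic" condition as requiring the gradient direction along the whole edge to increase or decrease steadily, which is what distinguishes a curve bending in one direction from a wiggling edge.

```python
def bends_in_one_direction(gradient_dirs):
    """True if the gradient directions (in radians) along an edge are
    strictly monotonic, i.e. every run of three consecutive points is
    gradually increasing or gradually decreasing (step S113)."""
    if len(gradient_dirs) < 3:
        return False  # too short to test the three-point condition
    diffs = [b - a for a, b in zip(gradient_dirs, gradient_dirs[1:])]
    return all(d > 0 for d in diffs) or all(d < 0 for d in diffs)
```

An edge whose gradient direction rises and then falls (a wiggle) is rejected, while a steadily turning fingertip-like arc is kept.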
The body-sense detection method is designed from observation of human body characteristics, so it can identify features that fit the actual human body and has good identification accuracy. For example, to identify hand features, observe the geometry of a straightened finger: the finger can be approximated as a cylinder, whose projection onto the two-dimensional image plane can be regarded as two parallel lines plus a semicircular fingertip curve. Secondly, according to general human physiology, the middle finger of the hand is the longest, so the ratio of finger length to finger width has a maximum, denoted max(R_L/W), and a value for max(R_L/W) can be preset. Observing the fingertip contour of the hand, it is suggested that the aforementioned predetermined value be set to π/2. Therefore, with Sx (x ∈ {1, 2, 3, …}) denoting the fingertip effective regions, the curves whose corresponding central angle is close to π/2 are singled out as candidate fingertip edge lines.
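The π/2 central-angle criterion can be illustrated with a simple arc test. This is a hedged sketch under the assumption that the arc length and an estimated radius of the curve are available (e.g. from a circle fit); the tolerance value and function names are illustrative, not from the patent.

```python
import math

def central_angle(arc_length, radius):
    """Central angle (radians) subtended by an arc of the given length on a
    circle of the given radius: theta = s / r."""
    return arc_length / radius

def is_fingertip_candidate(arc_length, radius, target=math.pi / 2, tol=0.3):
    """Accept a unidirectionally bent curve as a candidate fingertip edge
    line when its central angle is close to the predetermined value pi/2."""
    return abs(central_angle(arc_length, radius) - target) <= tol
```

A quarter-circle arc (central angle exactly π/2) passes, while a half-circle arc (central angle π) is rejected.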
After all candidate fingertip edge lines are picked out in each image block in step S100, at least one finger edge direction is calculated from each candidate fingertip edge line (step S200). In detail, as shown in Fig. 10, step S210 is first carried out: from the two endpoints of the candidate fingertip edge line, a finger length estimate and a finger direction estimate obtained from the image block are substituted into a predetermined equation to find the corresponding pixels in the image block. This predetermined equation is designed from the appearance features of a finger. The distance between the two endpoints of the candidate fingertip edge line can be denoted W_estimate; accordingly, the maximum finger length L_max corresponding to the candidate fingertip edge line can be denoted W_estimate × max(R_L/W), and the line segment equation representing the finger edge is:
tan(D_finger) = (y − y_n)/(x − x_n)    equation (1);
and the line segment length L is:
L = [(x_n − x) × (x_n − x) + (y_n − y) × (y_n − y)]^(1/2)    equation (2);
wherein x ∈ {x_n, …, x_n + L_max × cos(D_finger)}, y ∈ {y_n, …, y_n + L_max × sin(D_finger)}, n ∈ {1, 2}, D_finger represents a predetermined tangential direction at an endpoint of the candidate fingertip edge line, and the coordinates of the two endpoints of the fingertip edge line are (x_n, y_n) (n ∈ {1, 2}). Therefore, the finger length estimate L_estimate and a finger direction estimate D_finger_estimate obtained from the image block are substituted into equations (1) and (2); supposing L_estimate lies in (0, L_max] and D_finger_estimate in [D_finger − θ, D_finger + θ], the pixels Px corresponding to the line segment are found in the image block by fitting. Then, as described in step S220, the at least one finger edge direction is calculated from the mean G_mean of the gradient directions G_x of those pixels Px. Because the gradient direction is perpendicular to the finger edge direction, after obtaining the mean gradient direction G_mean, the finger edge direction D_estimate_edge can be obtained as G_mean ± π/2.
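Steps S210 and S220 can be sketched as follows. This is an illustrative sketch only: it rasterizes the segment of equations (1)–(2) by stepping along the direction D_finger from one endpoint, and recovers the edge direction as the mean gradient direction rotated by π/2. The function names and the unit step size are assumptions.

```python
import math

def sample_segment_pixels(x_n, y_n, d_finger, l_max):
    """Integer pixels along the segment of direction d_finger (radians) and
    length l_max starting at endpoint (x_n, y_n), per equations (1)-(2)."""
    pixels = []
    for t in range(int(l_max) + 1):
        p = (round(x_n + t * math.cos(d_finger)),
             round(y_n + t * math.sin(d_finger)))
        if p not in pixels:  # skip duplicates produced by rounding
            pixels.append(p)
    return pixels

def edge_direction_from_gradients(gradient_dirs):
    """Step S220: the finger edge direction is perpendicular to the mean
    gradient direction of the sampled pixels (G_mean + pi/2)."""
    g_mean = sum(gradient_dirs) / len(gradient_dirs)
    return g_mean + math.pi / 2
```

For a horizontal segment (direction 0) of length 3 from the origin, the sampled pixels are simply (0, 0) through (3, 0).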
Then, after obtaining the finger edge directions D_estimate_edge, in step S300 each finger edge direction D_estimate_edge is compared with the finger direction estimate D_finger_estimate obtained from the image block; the finger edge direction D_estimate_edge closest to D_finger_estimate is saved as a best finger direction D_finger_best, and the finger length estimate L_estimate corresponding to that closest direction is saved as a best finger length L_finger_best.
The above steps are carried out for each different finger (thumb, index finger, middle finger, ring finger, little finger), so each finger has its own best finger direction and best finger length. After obtaining the best finger direction D_finger_best and best finger length L_finger_best, the candidate finger edge line can be judged accordingly in the image block (step S400). The criterion can be that the fingertip edge line, together with the pixels Px forming a finger edge, whose angle lies within a range close to the best finger direction D_finger_best and whose length lies within a range close to the best finger length L_finger_best, is deemed the candidate finger edge line.
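The selection in step S300 can be sketched as a nearest-direction search. This is a minimal illustrative sketch: candidates are assumed to be (edge direction, length estimate) pairs, which is not a data structure the patent specifies.

```python
def select_best_finger(candidates, d_finger_estimate):
    """Step S300 sketch: among (edge_direction, length_estimate) pairs,
    return the pair whose edge direction is closest to the finger
    direction estimate; its length becomes the best finger length."""
    return min(candidates, key=lambda c: abs(c[0] - d_finger_estimate))
```

With candidates at directions 0.2 and 1.0 radians and an estimate of 0.9, the pair at 1.0 is kept as (D_finger_best, L_finger_best).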
Please also refer to Fig. 4 and Fig. 9, where Fig. 4 shows a schematic diagram of a fingertip edge line and Fig. 9 shows a flowchart of another body-sense detection method performed by the body-sense identification module according to an embodiment of the present invention. For brevity, only the flow differing from the previous embodiment is described here. Before step S500, steps S100, S200, S300 and S400 of the previous embodiment are carried out and are not repeated. After the candidate finger edge line is judged in the image block, the finger centerline can additionally be determined. As can be seen from Fig. 4, the fingertip vertex P3 on the candidate fingertip edge line has the maximum distance to the line connecting the two endpoints; therefore, the first step in determining the finger centerline is to calculate the point on the candidate fingertip edge line having the maximum distance to the line connecting the two endpoints of the candidate fingertip edge line, and take this point as the fingertip vertex P3 (step S500). Expressed mathematically, supposing (x_n, y_n) (n ∈ {1, 2, 3, …, m}) represents each point on the candidate fingertip edge line, the triangle area S_ΔP1P2Pn (proportional to the distance from a point to the line connecting the endpoints) can be obtained from the vector cross product as:
S_ΔP1P2Pn = [(x_2 − x_1) × (y_n − y_1) − (y_2 − y_1) × (x_n − x_1)]/2;
and the (x_n, y_n) at which S_ΔP1P2Pn reaches its maximum gives the coordinates of the fingertip vertex P3. Then, in step S600, a line through the fingertip vertex P3 perpendicular to the line connecting the two endpoints of the candidate fingertip edge line is obtained and taken as a finger centerline. For example, the length of the finger centerline can be determined according to the length of the corresponding finger edge line.
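The cross-product vertex search of steps S500 can be sketched directly from the area formula above; since the chord P1P2 is fixed, maximizing the triangle area maximizes the point-to-chord distance. Function names are illustrative.

```python
def fingertip_vertex(points):
    """Step S500 sketch: the point on the candidate fingertip edge line
    farthest from the chord P1P2, found by maximizing the triangle area
    |S_dP1P2Pn| computed from the vector cross product."""
    (x1, y1), (x2, y2) = points[0], points[-1]

    def area(p):
        xn, yn = p
        return abs((x2 - x1) * (yn - y1) - (y2 - y1) * (xn - x1)) / 2

    return max(points, key=area)
```

For a symmetric arc such as (0,0), (1,2), (2,3), (3,2), (4,0), the chord is the x-axis and the area reduces to 2·y_n, so the apex (2, 3) is returned as P3.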
Please also refer to Fig. 5, which is a schematic diagram of another body-sense detection method performed by the body-sense identification module according to an embodiment of the present invention picking out at least one hand feature of a human body in an image. For brevity, only the flow differing from the previous embodiment is described here. The present embodiment carries out steps S100, S200, S300, S400 and S500 of the previous embodiment, not repeated here. After obtaining the finger centerlines, the present embodiment finds the intersection of the finger centerlines corresponding to the thumb and the middle finger and takes it as a palm center point. In detail, first, the finger centerline of the middle finger is extended from its terminal M3, that is, the straight line S2M3 is drawn and denoted L_S2M3; then a perpendicular to L_S2M3 is drawn through the terminal M1 of the thumb's finger centerline, intersecting it at point C, and this point C is the palm center point.
In another embodiment, when the hand feature to be identified is the palm, a circle can be drawn with the palm center point C of the previous embodiment as the center and the distance between C and the terminal M1 of the thumb's finger centerline as the radius. This circle can represent an inscribed circle of the palm; thereby, the position of the palm center and the extent of the palm can be determined.
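The palm-center construction above can be sketched as a perpendicular projection: dropping a perpendicular from M1 onto the extended line S2M3 and meeting it at C is the same as projecting M1 onto that line. This is an interpretive sketch with illustrative names; the patent does not give formulas for this step.

```python
import math

def palm_center(s2, m3, m1):
    """Project the thumb centerline terminal M1 onto the extended
    middle-finger centerline through S2 and M3; the foot of the
    perpendicular is the palm center point C."""
    (x0, y0), (x1, y1) = s2, m3
    dx, dy = x1 - x0, y1 - y0
    t = ((m1[0] - x0) * dx + (m1[1] - y0) * dy) / (dx * dx + dy * dy)
    return (x0 + t * dx, y0 + t * dy)

def palm_radius(center, m1):
    """Radius of the palm's inscribed circle: distance from C to M1."""
    return math.dist(center, m1)
```

With the middle-finger line running up the y-axis and the thumb terminal at (5, 4), the palm center falls at (0, 4) and the palm circle has radius 5.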
Therefore, as can be learned from the above, the present invention, through a method flow and an apparatus designed for identification from observation of human body characteristics, can identify features that fit the actual human body and has good identification accuracy.
The above describes multiple different embodiments of the present invention, in which the various features can be implemented singly or in different combinations. The disclosure of the embodiments therefore serves to illustrate specific implementations of the principle of the present invention and should not limit the present invention to the disclosed embodiments. Furthermore, the preceding description and the accompanying drawings serve only to demonstrate the present invention and do not limit it. Changes or combinations of other elements are all possible without departing from the spirit and scope of the present invention.
[Main element symbol description]
1 body-sense detection apparatus
11 lens module
12 image sensing unit
13 light source supply unit
14 image processing unit
15 body-sense identification module
S100, S110, S111, S112, S113, S120, S200, S210, S220, S300, S400, S500, S600: steps

Claims (10)

1. A body-sense detection apparatus for identifying at least one appearance feature of a human body or an animal, characterized by comprising:
a lens module;
an image sensing unit, arranged beside the lens module, to obtain an image;
a light source supply unit, providing a light source;
an image processing unit, controlling the operation of the image sensing unit and the light source supply unit, and receiving the image sent by the image sensing unit; and
a body-sense identification module, receiving the output of the image processing unit;
wherein the image processing unit controls the image sensing unit to sense the image periodically, frame by frame, controls the light source supply unit to extinguish for one frame after every n lit frames, and, according to the frame sensed by the image sensing unit while the light source supply unit is extinguished, weakens the background interference of the other frames to raise their contrast, then outputs the other frames to the body-sense identification module.
2. The body sense detecting apparatus as claimed in claim 1, characterized in that the body sense identification module picks out all candidate fingertip edge lines in each image block of a plurality of image blocks, calculates at least one finger edge direction from each candidate fingertip edge line, compares each finger edge direction with a finger direction estimate obtained from the image block, saves the finger edge direction closest to the finger direction estimate as a best finger direction, saves the finger length estimate corresponding to that closest finger edge direction as a best finger length, and determines, according to the best finger direction and the best finger length, at least one candidate finger edge line in the image blocks.
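The selection step of claim 2 — keeping the candidate whose edge direction lies closest to the per-block direction estimate — can be sketched like this. The tuple layout and angular-difference metric are assumptions for illustration only.

```python
import math

def pick_best(candidates, estimate_dir):
    """candidates: list of (edge_direction_rad, finger_length) pairs.

    Returns the pair whose direction is closest (on the circle) to the
    finger direction estimated from the image block; its length becomes
    the best finger length of claim 2.
    """
    def ang_diff(a, b):
        d = abs(a - b) % (2 * math.pi)
        return min(d, 2 * math.pi - d)
    return min(candidates, key=lambda c: ang_diff(c[0], estimate_dir))

best_dir, best_len = pick_best(
    [(0.10, 40.0), (1.20, 35.0), (0.55, 42.0)],
    estimate_dir=0.50)
print(best_dir, best_len)  # 0.55 42.0
```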
3. The body sense detecting apparatus as claimed in claim 2, characterized in that the body sense identification module, in each of the image blocks, picks out all curves that bend in one direction and, among those curves, picks out at least one curve whose arc corresponds to a central angle close to a predetermined value, as the candidate fingertip edge line.
4. The body sense detecting apparatus as claimed in claim 3, characterized in that the predetermined value is π/2.
5. The body sense detecting apparatus as claimed in claim 3, characterized in that the body sense identification module calculates the gradient value and gradient direction of each pixel in the image block, extracts a plurality of image edges in the image block with a Canny edge detection operator, and examines each of the image edges one by one; when the gradient directions of any three consecutive points on an image edge satisfy a gradually increasing or gradually decreasing relation, the image edge is identified as a curve bending in one direction.
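The monotonicity test of claim 5 — every run of three consecutive edge points having gradually increasing or gradually decreasing gradient directions — can be sketched on a list of gradient directions. Because a triple can only be monotone if its two steps share a sign, the per-triple condition reduces to checking that all consecutive differences along the edge share one sign; the tolerance `eps` is an assumption.

```python
def bends_one_way(gradient_dirs, eps=1e-9):
    """Return True when the gradient directions along an image edge are
    strictly increasing or strictly decreasing, i.e. every three
    consecutive points satisfy the relation of claim 5, so the edge is
    identified as a curve bending in one direction."""
    if len(gradient_dirs) < 3:
        return False
    diffs = [b - a for a, b in zip(gradient_dirs, gradient_dirs[1:])]
    return all(d > eps for d in diffs) or all(d < -eps for d in diffs)

print(bends_one_way([0.1, 0.3, 0.6, 1.0]))  # True  (monotone increase)
print(bends_one_way([0.1, 0.5, 0.2, 0.4]))  # False (direction reverses)
```

In practice the gradient directions would come from the same Sobel derivatives the Canny operator computes; here they are supplied directly to keep the sketch self-contained.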
6. The body sense detecting apparatus as claimed in claim 3, characterized in that the body sense identification module finds the point on the candidate fingertip edge line that has the maximum distance to the line connecting the two endpoints of the candidate fingertip edge line, takes that point as a fingertip vertex, obtains the line passing through the fingertip vertex perpendicular to the line connecting the two endpoints of the candidate fingertip edge line, and takes it as a finger center line.
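The fingertip-vertex construction of claim 6 is the farthest point of the curve from the chord joining its endpoints. A minimal sketch, assuming the edge line is given as an ordered list of points with the endpoints first and last:

```python
import math

def fingertip_vertex(curve):
    """curve: ordered list of (x, y) points on a candidate fingertip
    edge line. Returns the point with maximum perpendicular distance to
    the chord joining the two endpoints — the fingertip vertex of
    claim 6. The finger center line is then the perpendicular to that
    chord through this point."""
    (x1, y1), (x2, y2) = curve[0], curve[-1]
    chord_len = math.hypot(x2 - x1, y2 - y1)
    def dist(p):
        # Standard point-to-line distance via the cross-product form.
        return abs((y2 - y1) * p[0] - (x2 - x1) * p[1]
                   + x2 * y1 - y2 * x1) / chord_len
    return max(curve, key=dist)

tip = fingertip_vertex([(0, 0), (1, 2), (2, 3), (3, 2), (4, 0)])
print(tip)  # (2, 3)
```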
7. The body sense detecting apparatus as claimed in claim 6, characterized in that the body sense identification module obtains the intersection point of the finger center lines corresponding to the thumb and the middle finger, and takes it as a palm center point.
8. The body sense detecting apparatus as claimed in claim 7, characterized in that the body sense identification module, according to the distance between the palm center point and the distal point of the finger center line corresponding to the thumb, recognizes as a palm the circle formed by taking the palm center point as the center and that distance as the radius.
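The palm-center point of claim 7 is the crossing of two finger center lines. A sketch of that intersection in point-plus-direction form, with illustrative coordinates (parallel lines are not handled here, an assumption this minimal version makes):

```python
def line_intersection(p1, d1, p2, d2):
    """Intersect two lines, each given by a point and a direction
    vector — here the thumb and middle-finger center lines of claim 7.
    Returns their crossing point, taken as the palm center point."""
    (x1, y1), (dx1, dy1) = p1, d1
    (x2, y2), (dx2, dy2) = p2, d2
    denom = dx1 * dy2 - dy1 * dx2            # zero only if parallel
    t = ((x2 - x1) * dy2 - (y2 - y1) * dx2) / denom
    return (x1 + t * dx1, y1 + t * dy1)

# Thumb center line through (0, 0) heading (1, 1); middle-finger
# center line through (4, 0) heading (-1, 1): they cross at (2, 2).
palm_center = line_intersection((0, 0), (1, 1), (4, 0), (-1, 1))
print(palm_center)  # (2.0, 2.0)
```

Per claim 8, the palm would then be the circle centered on `palm_center` whose radius is the distance to the distal point of the thumb's center line.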
9. The body sense detecting apparatus as claimed in claim 2, characterized in that the body sense identification module substitutes, starting from the two endpoints of the candidate fingertip edge line, the finger length estimate and the finger direction estimate obtained from the image block into a predetermined formula to find the corresponding pixels in the image block, and calculates the at least one finger edge direction from the mean of the gradient directions of those pixels.
10. The body sense detecting apparatus as claimed in claim 9, characterized in that the predetermined formula is:
tan(D_finger) = (y − y_n) / (x − x_n);
wherein x ∈ {x_n, …, x_n + L_max × cos(D_finger)}, y ∈ {y_n, …, y_n + L_max × sin(D_finger)}, n ∈ {1, 2}, D_finger represents a predetermined tangential direction at an endpoint of the candidate fingertip edge line, and L_max represents a predetermined maximum finger length satisfying the relationship:
L_max = W_estimate × max(R_L/W),
where W_estimate represents the distance between the two endpoints of the candidate fingertip edge line, max(R_L/W) is a predetermined value representing the maximum ratio of finger length to finger width, and the coordinates of the two endpoints of the fingertip edge line are (x_n, y_n) (n ∈ {1, 2}).
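The formula of claims 9 and 10 enumerates the pixels lying along the tangential direction D_finger from an endpoint, out to the length bound L_max = W_estimate × max(R_L/W). A sketch of that enumeration; stepping one unit at a time and rounding to the pixel grid are assumptions of this illustration.

```python
import math

def candidate_pixels(x_n, y_n, d_finger, w_estimate, max_ratio):
    """Enumerate integer pixels from endpoint (x_n, y_n) of a candidate
    fingertip edge line along the tangential direction d_finger, out to
    L_max = W_estimate * max(R_L/W) as in claim 10. Each generated
    point satisfies tan(D_finger) = (y - y_n)/(x - x_n) by
    construction."""
    l_max = w_estimate * max_ratio
    pixels = []
    for step in range(int(l_max) + 1):
        x = x_n + step * math.cos(d_finger)
        y = y_n + step * math.sin(d_finger)
        pixels.append((round(x), round(y)))
    return pixels

# W_estimate = 10, max ratio = 4.0 gives L_max = 40 samples past the
# endpoint, heading at 45 degrees.
pts = candidate_pixels(0, 0, math.pi / 4, 10, 4.0)
print(len(pts), pts[0], pts[-1])  # 41 (0, 0) (28, 28)
```

Per claim 9, the finger edge direction would then be the mean of the gradient directions sampled at these pixels.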
CN201410117933.9A 2014-03-26 2014-03-26 Kinect detecting apparatus Pending CN104951734A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410117933.9A CN104951734A (en) 2014-03-26 2014-03-26 Kinect detecting apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410117933.9A CN104951734A (en) 2014-03-26 2014-03-26 Kinect detecting apparatus

Publications (1)

Publication Number Publication Date
CN104951734A true CN104951734A (en) 2015-09-30

Family

ID=54166380

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410117933.9A Pending CN104951734A (en) 2014-03-26 2014-03-26 Kinect detecting apparatus

Country Status (1)

Country Link
CN (1) CN104951734A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI728459B (en) * 2019-09-06 2021-05-21 威剛科技股份有限公司 Method for assisting custom mouse making and assisting customized mouse making system


Similar Documents

Publication Publication Date Title
EP2904472B1 (en) Wearable sensor for tracking articulated body-parts
CN107077258B (en) Projection type image display device and image display method
JP6547013B2 (en) Biological information analysis apparatus and method thereof
CN102369498A (en) Touch pointers disambiguation by active display feedback
CN108496142A (en) A kind of gesture identification method and relevant apparatus
JP2010522922A (en) System and method for tracking electronic devices
US20190266798A1 (en) Apparatus and method for performing real object detection and control using a virtual reality head mounted display system
US10929078B2 (en) Electronic apparatus for generating screen image to be displayed by display apparatus and control method thereof
CN104199550A (en) Man-machine interactive type virtual touch device, system and method
CN110489027B (en) Handheld input device and display position control method and device of indication icon of handheld input device
CN106843530A (en) Optical signalling output, processing unit, method and system and imaging device
CN205080499U (en) Mutual equipment of virtual reality based on gesture recognition
CN104199548B (en) A kind of three-dimensional man-machine interactive operation device, system and method
US7853080B2 (en) System and method for identifying and labeling cluster pixels in a frame of image data for optical navigation
CN105302295A (en) Virtual reality interaction device having 3D camera assembly
CN104267802A (en) Human-computer interactive virtual touch device, system and method
CN113160260B (en) Head-eye double-channel intelligent man-machine interaction system and operation method
CN104951734A (en) Kinect detecting apparatus
US10824237B2 (en) Screen display control method and screen display control system
CN205080498U (en) Mutual equipment of virtual reality with 3D subassembly of making a video recording
CN104951731A (en) Kinect detecting method
CN104951737A (en) Kinect detecting method
CN104951739A (en) Kinect detecting method
CN104951053A (en) Body feeding detecting method
CN104951740A (en) Kinect detecting method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20150930