CN104951053A - Body-sensing detection method - Google Patents
- Publication number
- CN104951053A (Application CN201410117889.1A)
- Authority
- CN
- China
- Prior art keywords
- finger
- candidate
- edge line
- image block
- line
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Image Analysis (AREA)
Abstract
The invention provides a body-sensing detection method suitable for recognizing one or more hand features of a human body. Candidate fingertip edge lines are picked out in an image, and a best finger direction and a best finger length are calculated, so that the hand features of the human body are recognized and a user can conveniently interact with a computer operating interface using bare-handed gestures. According to the best finger direction and the best finger length, one or more candidate finger edge lines are judged in an image block. On each candidate fingertip edge line, the point having the maximum distance to the line connecting the two end points of the candidate fingertip edge line is calculated and taken as a fingertip vertex, and a perpendicular line passing through the fingertip vertex and perpendicular to the line connecting the two end points is obtained and taken as the center line of the finger.
Description
Technical field
The present invention relates to a body-sensing detection method, and in particular to a body-sensing detection method for recognizing hand features of a human body.
Background technology
In recent years, systems based on image-based gesture recognition have developed rapidly, especially gaming systems. Issuing instructions by having a computer analyze the user's actions has become the most promising interaction approach, and making the interactive interface between user and computer friendlier is an increasingly important problem. Traditional solutions, however, often require a sensor to be mounted on the user's finger; although this increases the accuracy of detecting hand motions, it also burdens the user. A better approach is to treat the user's hand itself as an instruction-issuing instrument, analyzing the user's hand movements by optical infrared sensing to input instructions and control the computer's operating system or peripheral devices. But this kind of traditional optical image analysis is too complicated and not stable enough.
Therefore, how to let a user interact with a computer interface using bare-handed gestures is a problem demanding a prompt solution.
Summary of the invention
In view of the above problems, the invention provides a body-sensing detection method that picks out candidate fingertip edge lines in an image and calculates a best finger direction and a best finger length, so as to recognize the hand features of a human body and let a user conveniently interact with a computer interface using bare-handed gestures.
An object of the invention is to provide a body-sensing detection method for recognizing at least one hand feature of a human body, characterized by comprising the following steps: picking out all candidate fingertip edge lines in each image block of a plurality of image blocks; calculating at least one finger edge direction from each candidate fingertip edge line; comparing each finger edge direction with a finger direction estimate obtained from the image block, saving the finger edge direction closest to the finger direction estimate as a best finger direction, and saving the finger length estimate corresponding to that closest finger edge direction as a best finger length; judging at least one candidate finger edge line in the image blocks according to the best finger direction and the best finger length; calculating the point on the candidate fingertip edge line having the maximum distance to the line connecting the two end points of the candidate fingertip edge line, and taking that point as a fingertip vertex; and obtaining a perpendicular line passing through the fingertip vertex and perpendicular to the line connecting the two end points of the candidate fingertip edge line, and taking it as a finger center line.
According to an implementation aspect of the invention, the step of picking out all candidate fingertip edge lines in each image block of the plurality of image blocks further comprises the following steps: picking out, in each of the image blocks, all curves bending in one direction; and picking out, among the curves bending in one direction, at least one curve whose corresponding central angle is close to a predetermined value, as the candidate fingertip edge line.
According to an implementation aspect of the invention, the predetermined value is π/2.
According to an implementation aspect of the invention, the step of calculating at least one finger edge direction from each candidate fingertip edge line further comprises the following steps: substituting, from the two end points of the candidate fingertip edge line, the finger length estimate and the finger direction estimate obtained from the image block into a predetermined equation, to find the corresponding pixels in the image block; and calculating the at least one finger edge direction from the mean value of the gradient directions of those pixels.
According to an implementation aspect of the invention, the predetermined equation is:

tan(D_finger) = (y − y_n) / (x − x_n);

where x ∈ {x_n, …, x_n + L_max × cos(D_finger)}, y ∈ {y_n, …, y_n + L_max × sin(D_finger)}, and n ∈ {1, 2}; D_finger represents a predetermined tangential direction at an end point of the candidate fingertip edge line, and L_max represents a predetermined maximum finger length satisfying the relationship:

L_max = W_estimate × max(R_L/W),

where W_estimate represents the distance between the two end points of the candidate fingertip edge line, max(R_L/W) is a predetermined value representing the maximum ratio of finger length to finger width, and the coordinates of the two end points of the fingertip edge line are (x_n, y_n) (n ∈ {1, 2}).
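To make the predetermined equation and the relationship L_max = W_estimate × max(R_L/W) concrete, here is a minimal sketch. The function name `sample_finger_edge` and the one-pixel stepping strategy are illustrative assumptions, not taken from the patent: it enumerates pixel coordinates from an end point (x_n, y_n) along the tangential direction D_finger, stopping once the segment length exceeds L_max.

```python
import math

def sample_finger_edge(xn, yn, d_finger, w_estimate, max_ratio_lw):
    """Enumerate pixel coordinates along the finger edge direction.

    (xn, yn): one end point of the candidate fingertip edge line.
    d_finger: predetermined tangential direction at that end point (radians).
    w_estimate: distance between the two end points of the edge line.
    max_ratio_lw: predetermined maximum finger length/width ratio, max(R_L/W).
    """
    l_max = w_estimate * max_ratio_lw          # L_max = W_estimate * max(R_L/W)
    pixels = []
    # Step one pixel at a time along (cos D, sin D); the tangent relation
    # tan(D_finger) = (y - y_n)/(x - x_n) then holds by construction.
    steps = int(math.ceil(l_max))
    for i in range(steps + 1):
        x = xn + i * math.cos(d_finger)
        y = yn + i * math.sin(d_finger)
        length = math.hypot(x - xn, y - yn)     # segment length from the end point
        if length > l_max:
            break
        pixels.append((round(x), round(y)))
    return pixels
```

With `sample_finger_edge(0, 0, 0.0, 10.0, 2.0)`, the sketch walks 21 pixels from (0, 0) to (20, 0) along the x-axis.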
Brief description of the drawings
Fig. 1 shows a body-sensing detection device for recognizing at least one appearance feature of a human body or an animal.
Fig. 2 shows a timing diagram used by the body-sensing detection device.
Fig. 3 is a schematic diagram of recognizing at least one hand feature of a human body in an image by the body-sensing detection method according to an embodiment of the invention.
Fig. 4 shows a schematic diagram of a fingertip edge line.
Fig. 5 is a schematic diagram of recognizing at least one hand feature of a human body in an image by the body-sensing detection method according to another embodiment of the invention.
Fig. 6 shows a flow chart of the body-sensing detection method according to an embodiment of the invention.
Fig. 7 shows an optional flow chart of the body-sensing detection method of an embodiment of the invention.
Fig. 8 shows another optional flow chart of the body-sensing detection method of an embodiment of the invention.
Fig. 9 shows a flow chart of the body-sensing detection method according to another embodiment of the invention.
Fig. 10 shows another optional flow chart of the body-sensing detection method of an embodiment of the invention.
Detailed description of the embodiments
To further illustrate the embodiments, the invention provides drawings. These drawings are part of the disclosure of the invention and mainly serve to illustrate the embodiments; together with the associated description in the specification, they explain the operating principles of the embodiments. With reference to these contents, a person of ordinary skill in the art will understand other possible embodiments and the advantages of the invention. Elements in the figures are not drawn to scale, and similar reference symbols are used to represent similar elements.
Please refer first to Fig. 1, which shows a body-sensing detection device for recognizing at least one appearance feature of a human body or an animal. As shown, the body-sensing detection device 1 comprises a lens module 11, an image sensing unit 12 disposed beside the lens module 11, a light source supply unit 13, an image processing unit 14, and a body-sensing recognition module 15.
The lens module 11 comprises a plurality of lenses; through their optical characteristics, the path of light entering from a light inlet is changed, so that an image is formed on the image sensing unit 12.
In the present embodiment, the image sensing unit 12 is exemplified as a CMOS sensor and the light source supply unit 13 as an infrared-emitting light source, but the invention is not limited thereto. In other embodiments, the image sensing unit can be a sensor of another type, such as a CCD sensor, and the light source supply unit can be an indoor lamp, an RGB lamp comprising red, green, and blue light sources, or a light source supply unit of another type.
The image processing unit 14 controls the operation of the light source supply unit 13 and the image sensing unit 12, so that the light source supply unit 13 provides a light source and the image sensing unit 12 senses an image. In detail, the light source supply unit 13 is controlled by the image processing unit 14 to blink in a pattern of n frames on, one frame off, and the image sensing unit 12 is controlled by the image processing unit 14 to sense the image periodically frame by frame; after each frame image sensed while the light source supply unit 13 is lit, the image processing unit 14 performs image processing on that frame.
Please also refer to the timing diagram of Fig. 2, which provides a demonstrative example of the working timing of the image sensing unit 12, the light source supply unit 13, and the image processing unit 14 to illustrate the present embodiment. As can be seen from Fig. 2, in this example the light source supply unit 13 blinks in a pattern of four frames on, one frame off, and for each frame the image sensing unit 12 senses the image and sends it to the image processing unit 14. The image processing unit 14 therefore receives the frame image sensed while the light source supply unit 13 is off, followed by the four frame images sensed while it is continuously lit. After receiving each of the four lit-frame images, the image processing unit 14 can use the frame image sensed while the light source supply unit 13 was off to weaken the background interference in the lit-frame images, raising their contrast, and then outputs the lit-frame images to the body-sensing recognition module 15. Upon receiving the output of the image processing unit 14, the body-sensing recognition module 15 can perform the body-sensing detection method to recognize at least one appearance feature of a human body or an animal.
By controlling the light source supply unit 13 and the image sensing unit 12 through the image processing unit 14 so that they operate in coordination, images are acquired both while the light source supply unit 13 is off and while it is lit; the background-light interference in the lit images is then weakened using the off images, strengthening the contrast of the lit images and making the foreground object image more distinct. In other embodiments, the image processing unit 14 can further control the brightness, spectrum, and color temperature of the light source supply unit 13, so as to optimize the contrast of each feature and provide high-quality images for object recognition.
In other embodiments, if the image sensing unit operates at a frame rate of 90 fps, it is suggested that the light source supply unit blink in a pattern of four frames on, one frame off, giving a report rate of 72; if the image sensing unit operates at 60 fps, it is suggested that the light source supply unit blink in a pattern of nine frames on, one frame off, giving a report rate of 54. On the other hand, when the selected light source is an RGB lamp, environments such as indoor, outdoor, cloudy, or spot-lit scenes can be accommodated; the control scheme then uses the images captured while the light source supply unit is off to analyze which of the infrared, red, green, and blue spectra is weaker in the ambient light, and enhances contrast by illuminating with the lamp of the weaker spectrum.
The body-sensing recognition module 15 can extract the contrast of an object image from features such as brightness, color, and luminous flux, and analyze the edge lines of the contrast to filter out recognizable objects. For the body-sensing detection method performed by the body-sensing recognition module 15, please refer to the following description.
Please refer to Fig. 3 and Fig. 6, where Fig. 3 is a schematic diagram of recognizing at least one hand feature of a human body in an image by the body-sensing detection method according to an embodiment of the invention, and Fig. 6 shows a flow chart of the body-sensing detection method according to an embodiment of the invention. As shown, first, each frame image can optionally be divided into a plurality of image blocks through a series of image processing steps, which may include video conversion (for example, converting an RGB image to a 256-level grayscale image) and removal of noise such as burrs and bright spots by Gaussian convolution blurring. Then, all candidate fingertip edge lines in each of the image blocks are picked out (step S100). Please also refer to Fig. 7, which shows an optional flow chart of the body-sensing detection method of this embodiment in more detail. As shown in Fig. 7, to pick out the candidate fingertip edge lines, all curves bending in one direction are first picked out in each image block (step S110). Then, among these curves, at least one curve whose corresponding central angle is close to a predetermined value is picked out as the candidate fingertip edge line (step S120).
Please also refer to Fig. 8, which shows another optional flow chart of the body-sensing detection method in more detail, concerning how the curves bending in one direction are picked out in each image block. In step S111, the gradient value and gradient direction of each pixel in the image block are computed. For example, the image block can be convolved with the Sobel operator in the horizontal and vertical directions, and the two cross gradients are then used to calculate the gradient value and gradient direction of each pixel in the Euclidean plane. Then, in step S112, a plurality of image edges in the image block are extracted with the Canny edge detection operator; these image edges are preferably edge lines of comparatively significant contrast. Then, in step S113, using the relationship between the gradient directions of consecutive pixels in an image edge, each of the image edges is judged one by one: when the gradient directions of any three consecutive points in an image edge exhibit a monotonically increasing or monotonically decreasing variation, the image edge is identified as a curve bending in one direction.
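Under one reading of the step-S113 criterion (every three consecutive points must show monotonically increasing or decreasing gradient directions), the bend-in-one-direction test can be sketched as follows. The function name and the idea of passing precomputed gradient directions directly, rather than running Sobel convolution here, are my own simplifications:

```python
def bends_in_one_direction(gradient_dirs):
    """Return True if every 3 consecutive gradient directions along the
    edge are strictly increasing or strictly decreasing (step S113)."""
    if len(gradient_dirs) < 3:
        return False
    for a, b, c in zip(gradient_dirs, gradient_dirs[1:], gradient_dirs[2:]):
        if not (a < b < c or a > b > c):
            return False
    return True
```

An edge whose gradient direction rises steadily, as it does along a circular arc, passes the test; an edge that wiggles does not.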
The body-sensing detection method is designed from observation of human body characteristics, so it can recognize features that fit the actual human body and has good recognition accuracy. For example, to recognize hand features, observe the geometry of a straight finger: the finger can be approximated as a cylinder, whose projection onto the two-dimensional image plane can be regarded as two parallel lines plus a semicircular fingertip arc. Secondly, according to general human physiology, the middle finger of a hand is the longest, so the ratio of finger length to finger width has a maximum value (denoted max(R_L/W)), which can be preset as a value. Observing the fingertip contour of a human hand, it is suggested to set the aforementioned predetermined value to π/2. Therefore, with Sx (x ∈ {1, 2, 3, …}) denoting the fingertip effective regions, the curves whose corresponding central angle is close to π/2 are identified as the candidate fingertip edge lines.
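The central-angle test (is the arc's angle close to the predetermined value π/2?) can be approximated without an explicit circle fit: take the circumradius through three sample points of the curve and divide the polyline arc length by it. This is a sketch under the assumption that the curve is densely sampled; the function names are illustrative, not from the patent.

```python
import math

def central_angle(points):
    """Estimate the central angle of a roughly circular arc.

    Uses the circumradius of the first, middle, and last sample points,
    then theta = arc_length / radius.
    """
    p1, p2, p3 = points[0], points[len(points) // 2], points[-1]
    a = math.dist(p2, p3)
    b = math.dist(p1, p3)
    c = math.dist(p1, p2)
    # Twice the triangle area via the cross product.
    area2 = abs((p2[0] - p1[0]) * (p3[1] - p1[1])
                - (p2[1] - p1[1]) * (p3[0] - p1[0]))
    radius = a * b * c / (2.0 * area2)   # circumradius R = abc / (4 * Area)
    arc_len = sum(math.dist(p, q) for p, q in zip(points, points[1:]))
    return arc_len / radius

def is_candidate_fingertip(points, predetermined=math.pi / 2, tol=0.3):
    """Accept the curve as a candidate fingertip edge line when its
    central angle is within tol of the predetermined value."""
    return abs(central_angle(points) - predetermined) <= tol
```

For a quarter of a unit circle, `central_angle` returns a value very close to π/2, so the curve is accepted.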
After the candidate fingertip edge lines in each image block have been picked out in step S100, at least one finger edge direction is calculated from each candidate fingertip edge line (step S200). In detail, as shown in Fig. 10, step S210 is first carried out: starting from the two end points of the candidate fingertip edge line, a finger length estimate and a finger direction estimate obtained from the image block are substituted into a predetermined equation, to find the corresponding pixels in the image block. The predetermined equation is designed from the appearance of a finger. The distance between the two end points of the candidate fingertip edge line is denoted W_estimate; accordingly, the maximum finger length L_max corresponding to the candidate fingertip edge line can be expressed as W_estimate × max(R_L/W), and the line-segment equation representing the finger edge is:

tan(D_finger) = (y − y_n) / (x − x_n)   equation (1);

and the segment length L is:

L = [(x_n − x) × (x_n − x) + (y_n − y) × (y_n − y)]^(1/2)   equation (2);

where x ∈ {x_n, …, x_n + L_max × cos(D_finger)}, y ∈ {y_n, …, y_n + L_max × sin(D_finger)}, and n ∈ {1, 2}; D_finger represents a predetermined tangential direction at an end point of the candidate fingertip edge line, and the coordinates of the two end points of the fingertip edge line are (x_n, y_n) (n ∈ {1, 2}). The finger length estimate L_estimate and finger direction estimate D_finger_estimate obtained from the image block are then substituted into equations (1) and (2); assuming L_estimate lies in (0, L_max] and D_finger_estimate lies in [D_finger − θ, D_finger + θ], the pixels Px matching the segment are found in the image block. Then, as described in step S220, the at least one finger edge direction is calculated from the mean value G_mean of the gradient directions G_x of those pixels Px. Since the gradient direction is perpendicular to the finger edge direction, once G_mean is obtained, the finger edge direction D_estimate_edge can be obtained as G_mean ± π/2.
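The step-S220 computation (the finger edge direction as the mean gradient direction rotated by π/2, since the gradient is perpendicular to the edge) can be sketched as follows. Using a circular mean for the angles is my own assumption, since the patent does not specify how angular wrap-around is handled:

```python
import math

def finger_edge_direction(gradient_dirs):
    """Return the finger edge direction D_estimate_edge = G_mean + pi/2.

    gradient_dirs: gradient directions (radians) of the pixels Px found
    by the line-segment matching of step S210.
    """
    # Circular mean: average the unit vectors, then take the angle,
    # so that angles near the +/-pi boundary do not cancel wrongly.
    s = sum(math.sin(g) for g in gradient_dirs)
    c = sum(math.cos(g) for g in gradient_dirs)
    g_mean = math.atan2(s, c)
    # The gradient is perpendicular to the edge, so rotate by pi/2.
    return g_mean + math.pi / 2
```

Gradient directions clustered around 0 give an edge direction of π/2, i.e. an edge running perpendicular to the gradients, as expected.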
Then, after the finger edge directions D_estimate_edge have been obtained, in step S300 each finger edge direction D_estimate_edge is compared with the finger direction estimate D_finger_estimate obtained from the image block; the finger edge direction D_estimate_edge closest to D_finger_estimate is saved as a best finger direction D_finger_best, and the finger length estimate L_estimate corresponding to that closest finger edge direction is saved as a best finger length L_finger_best.
The above steps are carried out for each different finger (thumb, index finger, middle finger, ring finger, and little finger), so each finger has its own best finger direction and best finger length. Once the best finger direction D_finger_best and best finger length L_finger_best have been obtained, the candidate finger edge lines can be judged in the image blocks accordingly (step S400). The criterion can be: fingertip edge lines whose direction lies within a range close to D_finger_best and whose length lies within a range close to L_finger_best, together with the pixels Px forming the finger edge, are deemed the candidate finger edge lines.
Please also refer to Fig. 4 and Fig. 9, where Fig. 4 shows a schematic diagram of a fingertip edge line, and Fig. 9 shows a flow chart of the body-sensing detection method according to another embodiment of the invention. For brevity, only the steps that differ from the previous embodiment are listed here. Before step S500, steps S100, S200, S300, and S400 of the previous embodiment are carried out and are not repeated. After the candidate finger edge lines have been judged in the image blocks, the finger center line can additionally be determined. As can be seen from Fig. 9, the fingertip vertex P3 on the candidate fingertip edge line has the maximum distance to the line connecting the two end points; therefore, the first step in determining the finger center line is to calculate the point on the candidate fingertip edge line having the maximum distance to the line connecting the two end points of the candidate fingertip edge line, and to take that point as the fingertip vertex P3 (step S500). Expressed mathematically, with (x_n, y_n) (n ∈ {1, 2, 3, …, m}) representing the points on the candidate fingertip edge line, a quantity SΔ_P1P2Pn proportional to the distance from a point on the candidate fingertip edge line to the line connecting the two end points can be obtained via the vector cross product as:

SΔ_P1P2Pn = [(x_2 − x_1) × (y_n − y_1) − (y_2 − y_1) × (x_n − x_1)] / 2;

the (x_n, y_n) at which SΔ_P1P2Pn attains its maximum is then the coordinate of the fingertip vertex P3. Then, in step S600, a perpendicular line passing through the fingertip vertex P3 and perpendicular to the line connecting the two end points of the candidate fingertip edge line is obtained and taken as the finger center line. For example, the length of the finger center line can be determined from the length of the corresponding finger edge line.
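Steps S500 and S600 can be sketched as below. The cross-product quantity is the SΔ_P1P2Pn expression from the description; returning the center line as the vertex plus a unit vector perpendicular to the end-point chord is an illustrative choice of representation, not mandated by the patent.

```python
import math

def fingertip_vertex(points):
    """Return the point on the candidate fingertip edge line that is
    farthest from the line joining its two end points (step S500)."""
    (x1, y1), (x2, y2) = points[0], points[-1]
    best, best_s = points[0], -1.0
    for (xn, yn) in points:
        # S = [(x2-x1)(yn-y1) - (y2-y1)(xn-x1)] / 2: the cross-product
        # (triangle-area) quantity from the description, proportional
        # to the distance from (xn, yn) to the chord P1P2.
        s = abs((x2 - x1) * (yn - y1) - (y2 - y1) * (xn - x1)) / 2.0
        if s > best_s:
            best, best_s = (xn, yn), s
    return best

def finger_center_line(points):
    """Vertex P3 plus a unit vector perpendicular to the end-point
    chord: the finger center line of step S600."""
    (x1, y1), (x2, y2) = points[0], points[-1]
    p3 = fingertip_vertex(points)
    chord = math.hypot(x2 - x1, y2 - y1)
    direction = ((y1 - y2) / chord, (x2 - x1) / chord)
    return p3, direction
```

For an arc-like point set whose apex sits above a horizontal chord, the vertex is the apex and the center-line direction is vertical.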
Please also refer to Fig. 5, which is a schematic diagram of recognizing at least one hand feature of a human body in an image by the body-sensing detection method according to another embodiment of the invention. For brevity, only the steps that differ from the previous embodiment are listed here. This embodiment carries out steps S100, S200, S300, S400, and S500 of the previous embodiment, which are not repeated. After the finger center lines are obtained, this embodiment finds the intersection point derived from the finger center lines corresponding to the thumb and the middle finger, and takes it as a palm center point. In detail, the finger center line of the middle finger is first extended from its end point M3, that is, the straight line S2M3 is drawn, denoted L_S2M3; then a perpendicular to L_S2M3 is drawn through the end point M1 of the finger center line of the thumb, meeting L_S2M3 at point C, and this point C is the palm center point.
In another embodiment, when the hand feature to be recognized is the palm, a circle can be drawn with the palm center point C of the previous embodiment as the center and the distance between point C and the end point M1 of the thumb's finger center line as the radius; this circle can represent an inscribed circle of the palm, whereby the position of the palm center and the extent of the palm can be determined.
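The palm-center construction (drop a perpendicular from the thumb center line's end point M1 onto the extended middle-finger center line L_S2M3; the foot C is the palm center, and |C − M1| serves as the inscribed-circle radius) amounts to projecting a point onto a line. A hypothetical sketch:

```python
import math

def palm_center(s2, m3, m1):
    """Foot of the perpendicular from M1 onto the line through S2 and M3.

    s2, m3: two points defining the extended middle-finger center line.
    m1: end point of the thumb's finger center line.
    Returns (C, radius), where C is the palm center point and radius is
    |C - M1|, usable as the radius of the palm's inscribed circle.
    """
    dx, dy = m3[0] - s2[0], m3[1] - s2[1]
    # Scalar projection of (m1 - s2) onto the line direction.
    t = ((m1[0] - s2[0]) * dx + (m1[1] - s2[1]) * dy) / (dx * dx + dy * dy)
    c = (s2[0] + t * dx, s2[1] + t * dy)
    radius = math.dist(c, m1)
    return c, radius
```

With the middle-finger line along the y-axis and M1 at (3, 2), the foot of the perpendicular is (0, 2) and the radius is 3.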
As can be seen from the above, the invention, through method flows and devices designed for recognition from observation of human body characteristics, can recognize features that fit the actual human body and has good recognition accuracy.
The above describes various embodiments of the invention, whose features may be implemented singly or in different combinations. The disclosure of the embodiments serves to illustrate specific implementations of the principles of the invention and should not be taken to limit the invention to the disclosed embodiments. Furthermore, the foregoing description and the accompanying drawings are for demonstration only and do not limit the invention; other changes or combinations of elements are possible without departing from the spirit and scope of the invention.
[Main element symbol description]
1 body-sensing detection device
11 lens module
12 image sensing unit
13 light source supply unit
14 image processing unit
15 body-sensing recognition module
S100, S110, S111, S112, S113, S120, S200, S210, S220, S300, S400, S500, S600: steps
Claims (5)
1. A body-sensing detection method for recognizing at least one hand feature of a human body, characterized by comprising the following steps:
picking out all candidate fingertip edge lines in each image block of a plurality of image blocks;
calculating at least one finger edge direction from each candidate fingertip edge line;
comparing each finger edge direction with a finger direction estimate obtained from the image block, saving the finger edge direction closest to the finger direction estimate as a best finger direction, and saving the finger length estimate corresponding to that closest finger edge direction as a best finger length;
judging at least one candidate finger edge line in the image blocks according to the best finger direction and the best finger length;
calculating the point on the candidate fingertip edge line having the maximum distance to the line connecting the two end points of the candidate fingertip edge line, and taking that point as a fingertip vertex; and
obtaining a perpendicular line passing through the fingertip vertex and perpendicular to the line connecting the two end points of the candidate fingertip edge line, and taking it as a finger center line.
2. The body-sensing detection method of claim 1, characterized in that the step of picking out all candidate fingertip edge lines in each image block of the plurality of image blocks further comprises the following steps:
picking out, in each of the image blocks, all curves bending in one direction; and
picking out, among the curves bending in one direction, at least one curve whose corresponding central angle is close to a predetermined value, as the candidate fingertip edge line.
3. The body-sensing detection method of claim 2, characterized in that:
the predetermined value is π/2.
4. The body-sensing detection method of claim 1, characterized in that the step of calculating at least one finger edge direction from each candidate fingertip edge line further comprises the following steps:
substituting, from the two end points of the candidate fingertip edge line, the finger length estimate and the finger direction estimate obtained from the image block into a predetermined equation, to find the corresponding pixels in the image block; and
calculating the at least one finger edge direction from the mean value of the gradient directions of those pixels.
5. The body-sensing detection method of claim 4, characterized in that:
the predetermined equation is:

tan(D_finger) = (y − y_n) / (x − x_n);

where x ∈ {x_n, …, x_n + L_max × cos(D_finger)}, y ∈ {y_n, …, y_n + L_max × sin(D_finger)}, and n ∈ {1, 2}; D_finger represents a predetermined tangential direction at an end point of the candidate fingertip edge line, and
L_max represents a predetermined maximum finger length satisfying the relationship:

L_max = W_estimate × max(R_L/W),

where W_estimate represents the distance between the two end points of the candidate fingertip edge line, max(R_L/W) is a predetermined value representing the maximum ratio of finger length to finger width, and the coordinates of the two end points of the fingertip edge line are (x_n, y_n) (n ∈ {1, 2}).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410117889.1A CN104951053A (en) | 2014-03-26 | 2014-03-26 | Body-sensing detection method
Publications (1)
Publication Number | Publication Date |
---|---|
CN104951053A true CN104951053A (en) | 2015-09-30 |
Family
ID=54165759
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410117889.1A Pending CN104951053A (en) | 2014-03-26 | 2014-03-26 | Body feeding detecting method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN104951053A (en) |
- 2014-03-26: CN application CN201410117889.1A filed; published as CN104951053A; status: Pending
Legal Events
Date | Code | Title | Description
---|---|---|---
| C06 / PB01 | Publication | Application publication date: 20150930
| WD01 | Invention patent application deemed withdrawn after publication |