CN102749994A - Indicating method for motion direction and speed strength of gesture in interaction system - Google Patents
Indicating method for motion direction and speed strength of gesture in interaction system
- Publication number
- CN102749994A CN102749994A CN2012101968966A CN201210196896A CN102749994A CN 102749994 A CN102749994 A CN 102749994A CN 2012101968966 A CN2012101968966 A CN 2012101968966A CN 201210196896 A CN201210196896 A CN 201210196896A CN 102749994 A CN102749994 A CN 102749994A
- Authority
- CN
- China
- Prior art keywords
- gesture
- motion
- speed
- motion direction
- image sequence
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses an indicating method for the motion direction and speed strength of a gesture in an interaction system. The indicating method comprises the following steps: (1) start-up; (2) a sensor acquires a segment of an image sequence and transmits it to a computation module; (3) the computation module calculates the motion direction and speed strength of the gesture in this image sequence and compares each of them against a comparison scale; (4) the comparison result is obtained, and a prompt is displayed according to the actual motion direction and speed strength of the gesture. With the indicating method disclosed by the invention, the motion direction and speed strength of the gesture can be obtained accurately and quickly, so that the interaction system reacts accurately.
Description
Technical field
The present invention relates to the field of human-computer interaction, and in particular to an indicating method for the motion direction and speed strength of a gesture in an interaction system.
Background art
Gesture is a form of communication widely used in people's daily lives and has a strong expressive capability. Most developments in human-computer interaction technology also revolve around the hand, such as the traditional mouse, keyboard, remote control, and touch screen. Some emerging gesture-based human-computer interaction approaches use an ordinary camera or a depth camera as the front-end sensor to capture an image sequence of the gesture moving in the air. Through techniques such as image processing, machine learning, and pattern recognition, the image sequence captured by the sensor is analyzed to recognize and track the gesture, which then interacts with the interface, realizing remote contactless human-computer interaction.
As the operating subject in a human-computer interaction system, a person perceives the outside world through the senses (including vision, hearing, touch, and kinesthesia), forms conscious information after processing by the brain, and then controls his or her own actions according to this information. When using traditional interaction devices such as the mouse, keyboard, remote control, or touch screen, a person relies mainly on touch, vision, and kinesthesia to perceive the current situation and uses feedback regulation to achieve motion control of a certain precision and efficiency; touch plays a very important role in this. Compared with control through traditional interaction devices, remote contactless human-computer interaction lacks this important tactile feedback because the controlling motion is performed in the air, leaving only vision and kinesthesia; since kinesthesia is a characteristic of the person, achieving accurate and efficient control of mid-air gestures depends mainly on visual feedback to strengthen the person's situational awareness.
Contactless mid-air gesture motion control maps the spatial motion information of the gesture (including motion direction and speed strength) onto the interface to realize operations such as locating, selecting, and browsing interface information. In everyday life we often find that, when writing or drawing on paper with the eyes closed, what is written or drawn rarely matches what was intended. The same problem exists in contactless mid-air gesture motion control: on the one hand, the person's visual attention is focused on the display medium rather than on the hand; on the other hand, the machine's perception of the gesture motion is imprecise. As a result, the person finds it difficult to grasp the mapping between the current motion direction and motion amplitude and the interface, and therefore cannot accomplish the control task quickly and effectively.
Summary of the invention
The purpose of the present invention is to provide an indicating method for the motion direction and speed strength of a gesture in an interaction system. With the indicating method of the present invention, the motion direction and speed strength of the gesture can be known accurately and quickly, so that the interaction system reacts accurately.
The objective of the invention is achieved as follows:
An indicating method for the motion direction and speed strength of a gesture in an interaction system, the method comprising the following steps:
(1) start-up;
(2) a sensor acquires a segment of an image sequence and sends it to a computation module;
(3) the computation module calculates the motion direction and speed strength of the gesture in this image sequence, and compares the motion direction and speed strength of the gesture against a comparison scale;
(4) the comparison result is obtained, and a prompt is displayed according to the actual motion direction and speed strength of the gesture.
In one embodiment, in step (3), the comparison scale is provided with a direction scale and a speed scale.
In one embodiment, the sensor is an ordinary camera or a depth camera.
Compared with the prior art, the indicating method for the motion direction and speed strength of a gesture in an interaction system of the present invention has the following beneficial effects:
(1) it solves the problem of capturing and effectively indicating the current motion control state of the hand (motion direction, speed strength) during gesture motion control, so that the user's awareness of the current situation is improved by the displayed information and accurate, efficient mid-air gesture motion control is achieved;
(2) when the indicated motion direction or speed strength differs greatly from the user's actual operation, the user can learn from the displayed prompt that an error has occurred, which improves the interaction experience;
(3) information such as the motion direction and speed strength is quantized into grades, which overcomes the jitter caused by the limited extraction precision of contactless motion control information and makes the operation smooth.
Description of drawings
Fig. 1 is a schematic diagram of the indicating method for the motion direction and speed strength of a gesture in an interaction system of the present invention when no prompt is displayed;
Fig. 2 is a schematic diagram of the indicating method for the motion direction and speed strength of a gesture in an interaction system of the present invention when a prompt is displayed.
Embodiment
The indicating method for the motion direction and speed strength of a gesture in an interaction system of the present invention comprises the following steps: (1) start-up; (2) a sensor acquires a segment of an image sequence and sends it to a computation module; (3) the computation module calculates the motion direction and speed strength of the gesture in this image sequence, and compares the motion direction and speed strength of the gesture against a comparison scale; (4) the comparison result is obtained, and a prompt is displayed according to the actual motion direction and speed strength of the gesture.
Wherein, in step (3), the comparison scale is provided with a direction scale and a speed scale.
The method of this embodiment is described in detail below:
As shown in Fig. 1 and Fig. 2, this embodiment defines in advance, with the gesture position at the previous moment as the center point, eight gesture motion direction intervals {1: (−22.5, 22.5]; 2: (22.5, 67.5]; 3: (67.5, 112.5]; 4: (112.5, 157.5]; 5: (157.5, −157.5]; 6: (−157.5, −112.5]; 7: (−112.5, −67.5]; 8: (−67.5, −22.5]} (unit: degrees; interval 5 wraps across ±180°), divides the movement rate into N quantized strength grades, and sets an upper limit Smax and a lower limit Smin for the strength grades.
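For illustration only (this sketch is not part of the original disclosure), one possible in-memory form of such a comparison scale, together with the lookups that quantize a raw angle and movement rate, is shown below in Python. The grade count N = 5, the limits Smin = 20 and Smax = 400 pixels per frame, and the function names are assumed example values, not details taken from the patent.

```python
import math

# A possible in-memory form of the comparison scale: eight 45-degree direction
# intervals numbered 1..8 (interval 5 wraps across +/-180 degrees), and N speed
# strength grades spread evenly between Smin and Smax.
DIRECTION_SCALE = {  # sector number -> (lower, upper] bound in degrees
    1: (-22.5, 22.5), 2: (22.5, 67.5), 3: (67.5, 112.5), 4: (112.5, 157.5),
    5: (157.5, -157.5), 6: (-157.5, -112.5), 7: (-112.5, -67.5), 8: (-67.5, -22.5),
}

N_GRADES = 5                 # assumed number of strength grades (the patent's N)
S_MIN, S_MAX = 20.0, 400.0   # assumed rate limits Smin, Smax in pixels per frame


def direction_sector(angle_deg: float) -> int:
    """Map an angle in (-180, 180] degrees to a sector 1..8 of DIRECTION_SCALE."""
    shifted = (angle_deg + 22.5) % 360.0
    # ceil realizes the (lower, upper] convention; a shifted value of 0
    # (angle -22.5) belongs to the closing sector 8.
    return int(math.ceil(shifted / 45.0)) or 8


def speed_grade(rate: float) -> int:
    """Map a movement rate to a grade 1..N_GRADES, clipped to [S_MIN, S_MAX]."""
    clipped = min(max(rate, S_MIN), S_MAX)
    step = (S_MAX - S_MIN) / N_GRADES
    return min(N_GRADES, 1 + int((clipped - S_MIN) // step))


if __name__ == "__main__":
    print(direction_sector(90.0))   # -> 3, the interval (67.5, 112.5]
    print(speed_grade(250.0))       # -> 4 under the assumed limits
```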
Then, using an ordinary camera or a depth camera as the sensor, the captured image sequence is analyzed to obtain the gesture position at each moment t, from which the motion direction and speed strength information of the gesture is obtained through calculation.
For example, given the gesture position coordinates [x(t), y(t)] at moment t and [x(t+1), y(t+1)] at moment t+1, the deviation angle can be calculated as θ = arctan((y(t+1) − y(t)) / (x(t+1) − x(t))), resolved over the full circle according to the signs of the two differences, and the motion direction at this moment is obtained from the predefined motion direction intervals; the displacement can be calculated as d = √((x(t+1) − x(t))² + (y(t+1) − y(t))²), and the speed strength grade at this moment is obtained from the predefined strength quantization rate ranges.
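As a plain-geometry illustration of the two formulas above (a minimal sketch only; the example coordinates are not taken from the patent):

```python
import math

def deviation_angle_and_displacement(x_t, y_t, x_t1, y_t1):
    """Deviation angle (degrees) and displacement between the gesture
    position at moment t and the position at moment t+1."""
    dx = x_t1 - x_t
    dy = y_t1 - y_t
    angle = math.degrees(math.atan2(dy, dx))  # atan2 resolves the quadrant from the signs of dx and dy
    displacement = math.hypot(dx, dy)         # Euclidean distance between the two positions
    return angle, displacement


if __name__ == "__main__":
    # Example: the hand moves from (100, 100) to (130, 140) between t and t+1.
    angle, disp = deviation_angle_and_displacement(100, 100, 130, 140)
    print(round(angle, 1), round(disp, 1))    # 53.1 50.0; the pair is then looked up
                                              # in the direction intervals and strength grades above
```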
As shown in Fig. 2, the human-computer interaction interface displays the extracted motion direction and speed strength information as a real-time prompt. The eight direction arrows in the figure represent the eight motion directions; when a direction is received, the arrow for that direction is highlighted (distinguished from the other arrows, for example by a different color or by blinking) to indicate the motion direction of the user's gesture. At the same time, the length of the arrow in that direction corresponds to the magnitude of the speed strength: a short arrow indicates slow motion and a long arrow indicates fast motion. The specific sizes, colors, and other aspects may differ depending on the design of the application interface.
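A text-only stand-in for this prompt display might look as follows; the real interface would draw graphical arrows, and the glyphs, the maximum length of 10, and the grade count of 5 are assumptions of this sketch rather than details given by the patent.

```python
# Eight arrow glyphs indexed 1..8 to match the eight direction intervals;
# the on-screen orientation assigned to each glyph is an assumption of this sketch.
ARROWS = {1: "→", 2: "↗", 3: "↑", 4: "↖", 5: "←", 6: "↙", 7: "↓", 8: "↘"}

N_GRADES = 5     # assumed number of speed strength grades
MAX_LENGTH = 10  # assumed maximum arrow length, in repeated glyphs


def render_prompt(sector: int, grade: int) -> str:
    """Highlight the arrow of the received direction; its length grows with the speed grade."""
    length = max(1, round(MAX_LENGTH * grade / N_GRADES))  # short arrow = slow, long arrow = fast
    lines = []
    for s, glyph in ARROWS.items():
        if s == sector:
            lines.append(f"[{glyph * length}]  <- direction {s}, speed grade {grade}")  # highlighted
        else:
            lines.append(f" {glyph}")  # idle arrow
    return "\n".join(lines)


if __name__ == "__main__":
    print(render_prompt(sector=3, grade=4))  # one highlighted, longer arrow among the eight
```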
The above embodiments express only several implementations of the present invention, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the patent claims. It should be pointed out that a person of ordinary skill in the art can make several variations and improvements without departing from the concept of the present invention, and these all belong to the protection scope of the present invention. Therefore, the protection scope of this patent shall be determined by the appended claims.
Claims (3)
1. An indicating method for the motion direction and speed strength of a gesture in an interaction system, characterized in that the method comprises the following steps:
(1) start-up;
(2) a sensor acquires a segment of an image sequence and sends it to a computation module;
(3) the computation module calculates the motion direction and speed strength of the gesture in this image sequence, and compares the motion direction and speed strength of the gesture against a comparison scale;
(4) the comparison result is obtained, and a prompt is displayed according to the actual motion direction and speed strength of the gesture.
2. The indicating method for the motion direction and speed strength of a gesture in an interaction system according to claim 1, characterized in that, in step (3), the comparison scale is provided with a direction scale and a speed scale.
3. The indicating method for the motion direction and speed strength of a gesture in an interaction system according to claim 1, characterized in that the sensor is an ordinary camera or a depth camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210196896.6A CN102749994B (en) | 2012-06-14 | 2012-06-14 | The reminding method of the direction of motion of gesture and speed intensity in interactive system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210196896.6A CN102749994B (en) | 2012-06-14 | 2012-06-14 | The reminding method of the direction of motion of gesture and speed intensity in interactive system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102749994A (en) | 2012-10-24 |
CN102749994B (en) | 2016-05-04 |
Family
ID=47030255
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210196896.6A Expired - Fee Related CN102749994B (en) | 2012-06-14 | 2012-06-14 | The reminding method of the direction of motion of gesture and speed intensity in interactive system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102749994B (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1731793A (en) * | 2005-08-17 | 2006-02-08 | 孙丹 | Method for replacing key-press operation via detecting camera mobile phone cam movement |
JP2007244534A (en) * | 2006-03-14 | 2007-09-27 | Sony Computer Entertainment Inc | Entertainment system and game controller |
US20090183125A1 (en) * | 2008-01-14 | 2009-07-16 | Prime Sense Ltd. | Three-dimensional user interface |
CN102222342A (en) * | 2010-04-16 | 2011-10-19 | 上海摩比源软件技术有限公司 | Tracking method of human body motions and identification method thereof |
CN201897800U (en) * | 2010-09-26 | 2011-07-13 | 中国科学院深圳先进技术研究院 | Gesture recognition system |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107209580A (en) * | 2015-01-29 | 2017-09-26 | 艾尔希格科技股份有限公司 | Identification system and method based on action |
CN110069137A (en) * | 2019-04-30 | 2019-07-30 | 徐州重型机械有限公司 | Gestural control method, control device and control system |
CN110069137B (en) * | 2019-04-30 | 2022-07-08 | 徐州重型机械有限公司 | Gesture control method, control device and control system |
CN111338547A (en) * | 2020-03-02 | 2020-06-26 | 联想(北京)有限公司 | Operation control method and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
CN102749994B (en) | 2016-05-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103472916B (en) | A kind of man-machine interaction method based on human body gesture identification | |
US11663784B2 (en) | Content creation in augmented reality environment | |
CN104679246B (en) | The Wearable and control method of human hand Roaming control in a kind of interactive interface | |
KR20120045667A (en) | Apparatus and method for generating screen for transmitting call using collage | |
EP2877909A1 (en) | Multimodal interaction with near-to-eye display | |
CN106406544B (en) | A kind of semanteme formula natural human-machine interaction control method and system | |
WO2013009040A3 (en) | Remote manipulation device and method using a virtual touch of a three-dimensionally modeled electronic device | |
WO2018099258A1 (en) | Method and device for flight control for unmanned aerial vehicle | |
JP2015172887A (en) | Gesture recognition device and control method of gesture recognition device | |
WO2015153673A1 (en) | Providing onscreen visualizations of gesture movements | |
CN102749994B (en) | The reminding method of the direction of motion of gesture and speed intensity in interactive system | |
CN105404384A (en) | Gesture operation method, method for positioning screen cursor by gesture, and gesture system | |
CN203552178U (en) | Wrist strip type hand motion identification device | |
WO2015030482A1 (en) | Input device for wearable display | |
Kakkoth et al. | Survey on real time hand gesture recognition | |
KR20100048747A (en) | User interface mobile device using face interaction | |
CN106547339B (en) | Control method and device of computer equipment | |
US20130187890A1 (en) | User interface apparatus and method for 3d space-touch using multiple imaging sensors | |
CN103914186A (en) | Image location recognition system | |
CN203070205U (en) | Input equipment based on gesture recognition | |
CN103136541B (en) | Based on the both hands 3 D non-contacting type dynamic gesture identification method of depth camera | |
CN104375631A (en) | Non-contact interaction method based on mobile terminal | |
CN104536568B (en) | Detect the dynamic control system of user's head and its control method | |
CN103389793A (en) | Human-computer interaction method and human-computer interaction system | |
CN104866112A (en) | Non-contact interaction method based on mobile terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | | Granted publication date: 20160504 |