CN106951072A - On-screen menu body feeling interaction method based on Kinect - Google Patents
On-screen menu body feeling interaction method based on Kinect
- Publication number
- CN106951072A (application CN201710127591.2A)
- Authority
- CN
- China
- Prior art keywords
- menu
- posture
- user
- screen
- arm
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention provides a somatosensory interaction method for menus on Kinect-based large screens. Based on principles of human motion perception, menu interaction is carried out by judging whether the user's posture matches a specific operating command. The technique does not require the user to click or slide precisely in mid-air; menu commands are selected simply through the angle between the arm and the body, and during selection the user can roughly sense this angle without intense visual attention. The technique therefore has the advantages of simple implementation and natural interaction.
Description
Technical field
The present invention relates to a menu-interaction design applicable to command selection on Kinect-based somatosensory interactive large screens.
Background technology
Somatosensory interactive large screens based on Kinect have a wide range of application scenarios, for example home gaming and entertainment, or interactive displays for visitors in exhibition centers. The Kinect motion-sensing device detects the user's body skeleton data and then recognizes body postures and actions, allowing the user to operate the large screen with the body as the input tool. The most distinctive feature of a somatosensory interactive large screen is that it requires the user to wear or hold no equipment, so the user can start interacting quickly and carry out human-computer interaction tasks.
Menu command selection is a necessary step of large-screen interaction. Current menu interface designs for Kinect large screens are often based on design patterns from desktop interactive devices. The general scheme models the palm as a mouse: the palm's position in space is mapped to cursor coordinates on the screen, so moving the palm moves the cursor accordingly to select menu items. The menu interface takes two common forms: 1) Linear menus. Based on the mainstream desktop menu design pattern, menu items are arranged linearly. To choose a menu item, the cursor must be moved to the target option and held there for some time to confirm the selection. 2) Fan-shaped (pie) menus. The basic form divides a circle into eight equal sectors, each sector corresponding to one menu item. To select a menu item, the palm-controlled cursor must slide from the center of the circle toward its arc; when the sliding distance exceeds a certain threshold, that menu item is selected.
The above techniques all model the palm as a mouse, so executing them requires positioning or moving the palm with relatively high precision to select a menu item. However, because the arm lacks physical support, precise mid-air movement of the arm and palm is a rather tiring and difficult task, which leads to low interaction efficiency. Given the wide application of somatosensory interactive large screens, it is necessary to design menu-interaction modes that better accord with human factors engineering.
Content of the invention
Purpose of the invention: To address the low efficiency of menu interaction on current Kinect somatosensory interactive large screens, the present invention proposes an on-screen menu somatosensory interaction scheme based on human factors engineering, to overcome the limitations of current menu-interaction techniques.
Technical scheme:
An on-screen menu somatosensory interaction method based on Kinect, comprising the steps:
(1) The menu waits to be activated. The user interacts with the screen through postures, and the screen judges whether the user's posture is the activation posture. If so, the menu is displayed and follow-up input commands are awaited; if not, the menu continues to wait for activation.
(2) The user issues a menu-selection command through a posture, and the screen confirms the user's posture. If there is a secondary menu, the menu interface is updated; if there is no secondary menu, the menu command is executed.
(3) The user chooses to exit the menu through a posture. The screen determines whether the posture is the menu-exit command; if so, the menu is exited; otherwise follow-up input commands continue to be awaited.
In step (1), the screen judges whether the user's posture is the activation posture as follows:
judge whether the triangle formed by the shoulder joint, elbow joint and wrist joint in the user's posture is an acute triangle; if so, the current posture is the activation posture.
The menu is circular and divided into eight parts, each part corresponding to one menu item. In step (2), the screen confirms the user's posture as follows: the arm angle is used to judge whether the user's arm lies within the menu item of the menu command to be selected, where the arm angle is the computed angle between the horizontal line and the line from the shoulder joint to the wrist joint. The arm must dwell on a menu item for 500 ms to 1000 ms to confirm the selection.
In step (3), the screen judges whether the user's posture is the menu-exit command as follows: compute the angles corresponding to the shoulder joint, elbow joint and wrist joint of both arms; if both are greater than 150°, the user's posture is the menu-exit command.
Beneficial effects: 1) The invention is easy for users to learn. The menu-activation posture and the menu-item selection posture are both based on common actions in daily life, so users can use them skillfully after brief practice. 2) The menu interaction is designed on principles of human motion perception and is easy to use. In human-computer interaction, interactive actions that demand intense visual attention (for example precise clicking and sliding) often lead to long task times and high error rates. The present technique does not require the user to click or slide precisely in mid-air; menu commands are selected simply through arm and body postures. During selection, the user can roughly sense the angle between arm and body without intense visual attention. For example, when stretching an arm out to the left or right, the user can easily form an angle of roughly 90° between arm and body. Because this interaction mode, grounded in human motion perception, does not demand intense visual attention, it can produce better interaction performance. 3) The invention targets somatosensory interactive large screens; common motion-sensing devices already meet its technical requirement (detecting body skeleton data), and no other equipment is needed, so the invention is economical and practical. 4) The software design is simple, contains no complex algorithms, and is easy to implement and port, so the design scheme has a wide range of applications.
Brief description of the drawings
Fig. 1 is a schematic diagram of the menu-interface activation mode of the invention;
wherein (a) is the interface-activation posture; (b) is the activation-posture detection diagram, in which solid lines are skeleton data detected by the motion-sensing device and dotted lines are auxiliary construction lines; (c) is the interface-exit posture.
Fig. 2 is a schematic diagram of the menu-command selection mode of the invention;
wherein (a) shows arm swings performing menu selection; (b) shows the arm-angle computation, in which solid lines are skeleton data detected by the motion-sensing device and dotted lines are auxiliary construction lines; (c) shows the on-screen menu interface, with the bold line segment and the straight dashed line used for illustration.
Fig. 3 is the somatosensory interaction flow chart of the invention.
Embodiment
The present invention is further described below in conjunction with the accompanying drawings.
The technical solution adopted in the present invention includes three parts.
First, the menu-interface activation mode. As shown in Fig. 1, the invention uses an "implicit" menu mode: the menu is normally invisible and is activated and displayed only when the user needs to select a command. The advantage of an "implicit" menu is that it occupies no screen space while inactive. The invention designs the following "implicit" menu-activation mechanism. As shown in Fig. 1(a), the user stands with arms akimbo (hands on hips) to trigger activation. This posture was chosen for two reasons: 1) it is common in daily life, so it is easy for users to remember and perform; 2) it is rarely used in existing somatosensory interaction, so it will not conflict with other interaction modes and can serve as an available menu-interface activation mode. The invention designs a simple algorithm to detect the akimbo action. As shown in Fig. 1(b), when the user stands with arms akimbo, the triangle formed by the shoulder joint, elbow joint and wrist joint (△B1A1C1 for the left arm, △B2A2C2 for the right arm) is an acute triangle, which indicates that the current posture is the akimbo posture.
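The acute-triangle test can be implemented directly from the detected skeleton joints. Below is a minimal sketch in Python of how such a check might look, assuming 2D joint coordinates projected onto the screen plane and hypothetical joint names; the description specifies only that the shoulder-elbow-wrist triangle of each arm must be acute.

```python
def is_acute_triangle(p1, p2, p3):
    """True if the triangle p1-p2-p3 (2D points) is acute."""
    sides = sorted([
        (p1[0] - p2[0]) ** 2 + (p1[1] - p2[1]) ** 2,
        (p2[0] - p3[0]) ** 2 + (p2[1] - p3[1]) ** 2,
        (p3[0] - p1[0]) ** 2 + (p3[1] - p1[1]) ** 2,
    ])
    # A triangle is acute iff the squared longest side is strictly
    # less than the sum of the squares of the other two sides.
    return sides[0] + sides[1] > sides[2]

def is_akimbo(skeleton):
    """Activation test: both arms' shoulder-elbow-wrist triangles are acute.
    `skeleton` maps (hypothetical) joint names to (x, y) coordinates."""
    return all(
        is_acute_triangle(skeleton["shoulder_" + s],
                          skeleton["elbow_" + s],
                          skeleton["wrist_" + s])
        for s in ("left", "right")
    )
```

Note that a fully extended arm gives collinear joints, for which the strict inequality correctly returns False.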
Second, the menu-command selection mode. As shown in Fig. 2, after the menu interface is activated, the invention uses a strategy that makes it easy for the user to select menu commands, comprising two designs. The first is the menu-interface design. Borrowing from the fan-shaped menu introduced in the background, the menu is designed as a circle divided into eight equal parts of 45° each, one menu item per part. The angle of the topmost menu item (∠ABO in Fig. 2(c)) is bisected by the vertical line, and the other menu items are arranged in order around it. The second is the interaction design for these menu items. To select a menu item on the screen, one arm holds the posture used by the menu-activation strategy, while the other arm stretches out into the angular range corresponding to the target menu item and dwells there for 500 ms to 1000 ms. By computing the angle between the horizontal line and the line from the shoulder joint to the wrist joint (θ in Fig. 2(b), the angle of line CE to the horizontal), the system judges which menu item's angular range the arm lies in. For the menu items directly above (represented by ∠ABO in Fig. 2(c)) and directly below (the angle directly opposite ∠ABO), either arm can make the selection. If a multi-level menu selection is involved, then after the current level is selected, the user can repeat the same interaction on the menu items to select within the sub-menu.
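As a sketch of the selection logic, the following Python fragment maps the shoulder-wrist angle to one of the eight 45° sectors (item 0 on top, bisected by the vertical, indices increasing clockwise) and confirms a selection after a dwell. The coordinate convention (y axis pointing up, as in Kinect camera space), the 500 ms dwell point within the stated 500-1000 ms range, and all names are assumptions for illustration.

```python
import math
import time

SECTOR_COUNT = 8
SECTOR_SPAN = 360.0 / SECTOR_COUNT   # 45 degrees per menu item
DWELL_S = 0.5                        # assumed dwell; description allows 0.5-1.0 s

def arm_angle(shoulder, wrist):
    """Angle (degrees) between the horizontal and the shoulder-wrist line."""
    return math.degrees(math.atan2(wrist[1] - shoulder[1],
                                   wrist[0] - shoulder[0]))

def sector_of(angle_deg):
    """Menu-item index: 0 is the top item (its 45 degrees bisected by the
    vertical), indices increasing clockwise."""
    return int(round((90.0 - angle_deg) / SECTOR_SPAN)) % SECTOR_COUNT

class DwellSelector:
    """Fires a menu index once the arm stays in one sector for DWELL_S."""
    def __init__(self):
        self.current, self.entered, self.fired = None, 0.0, False

    def update(self, shoulder, wrist):
        """Call once per skeleton frame; returns an index on selection."""
        sector = sector_of(arm_angle(shoulder, wrist))
        now = time.monotonic()
        if sector != self.current:
            self.current, self.entered, self.fired = sector, now, False
        elif not self.fired and now - self.entered >= DWELL_S:
            self.fired = True
            return sector
        return None
```

For example, an arm stretched straight out to the right gives an angle near 0°, which maps to sector 2 (90° clockwise from the top item).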
Third, the menu-interface exit mode. When the user lets both hands hang down vertically (Fig. 1(c)), the menu interface is exited. The invention designs a simple algorithm to judge the arms-down posture: compute the angles ∠B3A3C3 and ∠B4A4C4 corresponding to the shoulder joint, elbow joint and wrist joint of the two arms; if both are greater than 150°, the arms are hanging down vertically.
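A minimal sketch of this exit test, under the assumption that the measured angle is the one at the elbow joint (vertex A in the figures), which approaches 180° when the arm hangs straight:

```python
import math

EXIT_ANGLE_DEG = 150.0  # threshold from the description

def joint_angle(a, b, c):
    """Angle (degrees) at vertex b between rays b->a and b->c."""
    u = (a[0] - b[0], a[1] - b[1])
    v = (c[0] - b[0], c[1] - b[1])
    cos_t = (u[0] * v[0] + u[1] * v[1]) / (math.hypot(*u) * math.hypot(*v))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

def is_exit_posture(skeleton):
    """Both arms straight: each arm's elbow angle exceeds the threshold."""
    return all(
        joint_angle(skeleton["shoulder_" + s],
                    skeleton["elbow_" + s],
                    skeleton["wrist_" + s]) > EXIT_ANGLE_DEG
        for s in ("left", "right")
    )
```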
Fig. 3 is the somatosensory interaction flow chart of the invention. As shown in Fig. 3, the initial state is the menu waiting for activation. If the activation posture is detected, the menu is activated and displayed, and follow-up input commands are awaited. A follow-up input command falls into three cases. The first is the menu-exit command, in which case the system returns to the initial state. The second is a menu-selection command, in which case the system selects the menu command according to the user's posture; at this stage, if there is a secondary menu, the menu is updated and the system waits for input postures, and if there is no secondary menu, the command is executed while the menu returns to the initial state. The third is any other command, to which the system does not respond, remaining in the state of displaying the menu and awaiting follow-up input commands.
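The flow chart can be sketched as a small state machine. The fragment below reuses the detector helpers from the previous sketches and assumes a hypothetical menu model (`has_submenu`, `submenu`, `execute`); it also assumes the right arm is the selecting arm, whereas the description lets either arm pick the top and bottom items.

```python
WAITING, ACTIVE = "waiting_for_activation", "menu_shown"

class MenuController:
    """Drives the menu through the states of Fig. 3, one skeleton frame at a time."""
    def __init__(self, menu):
        self.state = WAITING
        self.menu = menu                 # hypothetical menu model
        self.selector = DwellSelector()

    def on_frame(self, skeleton):
        if self.state == WAITING:
            if is_akimbo(skeleton):      # activation posture detected
                self.state = ACTIVE      # show menu, await follow-up commands
            return
        if is_exit_posture(skeleton):    # case 1: menu-exit command
            self.state = WAITING         # back to the initial state
            return
        item = self.selector.update(skeleton["shoulder_right"],
                                    skeleton["wrist_right"])
        if item is not None:             # case 2: menu-selection command
            if self.menu.has_submenu(item):
                self.menu = self.menu.submenu(item)   # update the interface
            else:
                self.menu.execute(item)               # run the command
                self.state = WAITING
        # case 3: any other posture -> no response, keep showing the menu
```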
The above menu design and posture-detection methods are merely preferred embodiments of the present invention; all equivalent changes and modifications of the menu design and input postures made within the scope of the claims of the present application shall fall within the scope of the present invention.
Claims (4)
1. An on-screen menu somatosensory interaction method based on Kinect, characterized by comprising the steps:
(1) the menu waits to be activated; the user interacts with the screen through postures, and the screen judges whether the user's posture is the activation posture; if so, the menu is displayed and follow-up input commands are awaited; if not, the menu continues to wait for activation;
(2) the user issues a menu-selection command through a posture, and the screen confirms the user's posture; if there is a secondary menu, the menu interface is updated; if there is no secondary menu, the menu command is executed;
(3) the user chooses to exit the menu through a posture, and the screen determines whether it is the menu-exit command; if so, the menu is exited; otherwise follow-up input commands continue to be awaited.
2. The on-screen menu somatosensory interaction method according to claim 1, characterized in that in step (1) the screen judges whether the user's posture is the activation posture as follows:
judge whether the triangle formed by the shoulder joint, elbow joint and wrist joint in the user's posture is an acute triangle; if so, the current posture is the activation posture.
3. The on-screen menu somatosensory interaction method according to claim 1, characterized in that the menu is circular and divided into eight parts, each part corresponding to one menu item; in step (2) the screen confirms the user's posture as follows: the arm angle is used to judge whether the user's arm lies within the menu item of the menu command to be selected, where the arm angle is the computed angle between the horizontal line and the line from the shoulder joint to the wrist joint; the arm must dwell on a menu item for 500 ms to 1000 ms to confirm the selection.
4. The on-screen menu somatosensory interaction method according to claim 1, characterized in that in step (3) the screen judges whether the user's posture is the menu-exit command as follows: compute the angles corresponding to the shoulder joint, elbow joint and wrist joint of both arms; if both are greater than 150°, the user's posture is the menu-exit command.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710127591.2A CN106951072A (en) | 2017-03-06 | 2017-03-06 | On-screen menu body feeling interaction method based on Kinect |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710127591.2A CN106951072A (en) | 2017-03-06 | 2017-03-06 | On-screen menu body feeling interaction method based on Kinect |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106951072A true CN106951072A (en) | 2017-07-14 |
Family
ID=59467788
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710127591.2A Pending CN106951072A (en) | 2017-03-06 | 2017-03-06 | On-screen menu body feeling interaction method based on Kinect |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106951072A (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102929547A (en) * | 2012-10-22 | 2013-02-13 | 四川长虹电器股份有限公司 | Intelligent terminal contactless interaction method |
CN103218124A (en) * | 2013-04-12 | 2013-07-24 | 北京国铁华晨通信信息技术有限公司 | Depth-camera-based menu control method and system |
CN104460972A (en) * | 2013-11-25 | 2015-03-25 | 安徽寰智信息科技股份有限公司 | Human-computer interaction system based on Kinect |
US20160085358A1 (en) * | 2014-09-22 | 2016-03-24 | Intel Corporation | Dynamic input mode selection |
CN104866096A (en) * | 2015-05-18 | 2015-08-26 | 中国科学院软件研究所 | Method for selecting command by using upper arm extension information |
CN105404449A (en) * | 2015-07-21 | 2016-03-16 | 浙江传媒学院 | Hierarchically extendable multi-pie somatosensory menu and syntax-directed identification method therefor |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108983967A (en) * | 2018-06-20 | 2018-12-11 | 网易(杭州)网络有限公司 | Information processing method, device, storage medium and electronic equipment in VR scene |
CN110728828A (en) * | 2019-11-11 | 2020-01-24 | 中国地质大学(武汉) | Office sitting posture correction instrument and use method thereof |
CN114443038A (en) * | 2022-04-08 | 2022-05-06 | 绿城科技产业服务集团有限公司 | Zero code configurable display system based on browser |
CN114443038B (en) * | 2022-04-08 | 2023-08-18 | 绿城科技产业服务集团有限公司 | Zero code configurable display system based on browser |
CN117369649A (en) * | 2023-12-05 | 2024-01-09 | 山东大学 | Virtual reality interaction system and method based on proprioception |
CN117369649B (en) * | 2023-12-05 | 2024-03-26 | 山东大学 | Virtual reality interaction system and method based on proprioception |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Carter et al. | Pathsync: Multi-user gestural interaction with touchless rhythmic path mimicry | |
Ren et al. | 3D selection with freehand gesture | |
US20180059925A1 (en) | Enhanced 3D interfacing for remote devices | |
CN103518172B (en) | Stare auxiliary computer interface | |
Banerjee et al. | Pointable: an in-air pointing technique to manipulate out-of-reach targets on tabletops | |
KR101791366B1 (en) | Enhanced virtual touchpad and touchscreen | |
Bailly et al. | Comparing free hand menu techniques for distant displays using linear, marking and finger-count menus | |
CN115443445A (en) | Hand gesture input for wearable systems | |
WO2015113503A1 (en) | Ring-type wireless finger controller, control method and control system | |
Lv | Wearable smartphone: Wearable hybrid framework for hand and foot gesture interaction on smartphone | |
Vogel et al. | Hand occlusion on a multi-touch tabletop | |
JP2018010653A (en) | Remote control of computer device | |
EP2175229A1 (en) | Object position and orientation detection system | |
CN106951072A (en) | On-screen menu body feeling interaction method based on Kinect | |
WO2012135747A1 (en) | Multi-touch screen recognition of interactive objects, and applications thereof | |
Au et al. | Multitouch finger registration and its applications | |
Yeo et al. | Wrist: Watch-ring interaction and sensing technique for wrist gestures and macro-micro pointing | |
Hürst et al. | Multimodal interaction concepts for mobile augmented reality applications | |
Schwaller et al. | Pointing in the air: measuring the effect of hand selection strategies on performance and effort | |
Bonnet et al. | Extending the vocabulary of touch events with ThumbRock | |
CN106445118A (en) | Virtual reality interaction method and apparatus | |
CN101847057A (en) | Method for touchpad to acquire input information | |
CN109663345A (en) | Information processing method and device for 3D game | |
Bossavit et al. | Hierarchical menu selection with a body-centered remote interface | |
Fukuchi et al. | Pac-pac: pinching gesture recognition for tabletop entertainment system |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | PB01 | Publication |
 | SE01 | Entry into force of request for substantive examination |
 | RJ01 | Rejection of invention patent application after publication | Application publication date: 20170714