CN108459702A - Human-computer interaction method and system based on gesture recognition and visual feedback - Google Patents

Human-computer interaction method and system based on gesture recognition and visual feedback

Info

Publication number
CN108459702A
CN108459702A (application CN201710097594.6A; granted as CN108459702B)
Authority
CN
China
Prior art keywords
control
hand
user
cursor
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710097594.6A
Other languages
Chinese (zh)
Other versions
CN108459702B (en)
Inventor
张硕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Qiaoniu Technology Co ltd
Original Assignee
Tianjin Feng Time Interactive Technology Co Ltd Shenzhen Branch
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin Feng Time Interactive Technology Co Ltd Shenzhen Branch
Priority to CN201710097594.6A
Publication of CN108459702A
Application granted granted Critical
Publication of CN108459702B
Legal status: Active (current)
Anticipated expiration: not listed

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 — Eye tracking input arrangements
    • G06F 3/014 — Hand-worn input/output arrangements, e.g. data gloves

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are a human-computer interaction method and system based on gesture recognition and visual feedback. The disclosed method includes: in response to the line of sight being directed toward the control panel region, setting the controls on the control panel to the activated state; in response to the cursor corresponding to the hand entering the region of a first control that is in the activated state, drawing the cursor as a crosshair icon; updating the size of the crosshair icon according to the distance between the hand and the first control in the virtual coordinate system; and, in response to the distance between the hand and the first control in the virtual coordinate system being less than a threshold, generating a first event.

Description

Human-computer interaction method and system based on gesture recognition and visual feedback
Technical field
This application relates to the field of human-computer interaction, and in particular to methods and systems for human-computer interaction based on gesture recognition and visual feedback.
Background art
In human-computer interaction technology, a control is a reusable software component used to build graphical user interfaces. In general, each control corresponds to one function. For example, Fig. 1 illustrates a "confirmation" control in a two-dimensional graphical user interface. The "confirmation" control includes a prompt window, and the prompt window includes a "confirmation" button and a "cancel" button. When the "confirmation" control is invoked, the prompt window shown in Fig. 1 pops up; the user's click on the "confirmation" button or the "cancel" button is recognized to obtain the user's intended operation, thereby realizing the human-computer interaction. Slide-to-unlock technology in the prior art likewise conveys the user's input intention to an information processing device through the sliding of a hand on a touchscreen.
Novel human-computer interaction technologies continue to develop, and interaction based on gesture recognition is one of the hot spots. Hand motion can be recognized in many ways. US20100199228A1 from Microsoft (published August 5, 2010) provides a scheme in which a depth camera captures and analyzes the user's body posture, which is then interpreted as computer commands. US20080291160A1 from Nintendo (published November 27, 2008) provides a scheme that captures the position of the user's hand using an infrared sensor and an acceleration sensor. CN1276572A from Panasonic Electric Equipment Industrial Co., Ltd photographs the hand with a camera, normalizes the image, projects the normalized image into a feature space, and compares the resulting projection coordinates with the projection coordinates of pre-stored images. Fig. 2 shows the system and method for gesture recognition and spatial position perception provided by patent application CN201110100532.9 from Tianjin Fengshi Interaction Technology Co., Ltd. As shown in Fig. 2, the gesture recognition system includes: a host computer 101, a control circuit 102 of the multi-camera system, multiple cameras 103, the user's hand 104, an application program 105 running on the host computer 101, an object 106 to be operated in the application program 105, and a virtual hand cursor 107. The gesture recognition system further includes, not shown in Fig. 2, an infrared illumination source that illuminates the user's hand 104 and an infrared filter placed in front of each camera. The multiple cameras 103 capture images of the user's hand 104; the control circuit 102 processes the hand images acquired by the cameras 103 and recognizes the posture and/or position of the hand. In addition, the prior art also includes schemes that use data gloves to assist the recognition of hand postures.
Summary of the invention
In an interaction process based on gesture recognition, effective feedback needs to be given to the user, to inform the user of the state the system is in and the system's reaction to the user's input, and to guide the user in performing the next interactive action, thereby facilitating completion of the interaction. Controls are designed to facilitate application development: a control takes a gesture as input and generates an event or message as output. The event or message may indicate the user's "confirm" or "cancel" intent, or any of a variety of other user intentions. However, human biomechanics dictate that the trajectory of the user's hand in a three-dimensional interaction space cannot be perfectly straight or standardized, so existing human-computer interaction technology has difficulty effectively understanding the intent of gesture input.
In the embodiments of the present application, visual feedback is provided to the user to facilitate completion of the human-computer interaction.
According to a first aspect of the present invention, a first human-computer interaction method is provided, including: in response to the line of sight being directed toward the control panel region, setting the controls on the control panel to the activated state; in response to the cursor corresponding to the hand entering the region of a first control that is in the activated state, drawing the cursor as a crosshair icon; updating the size of the crosshair icon according to the distance between the hand and the first control in the virtual coordinate system; and, in response to the distance between the hand and the first control in the virtual coordinate system being less than a threshold, generating a first event.
According to the first human-computer interaction method of the first aspect of the present invention, there is provided a second human-computer interaction method according to the first aspect of the present invention, wherein, in response to the distance between the hand and the first control in the virtual coordinate system being less than the threshold, an effect of the first control being pressed down is also drawn.
According to the second human-computer interaction method of the first aspect of the present invention, there is provided a third human-computer interaction method according to the first aspect of the present invention, wherein, after the effect of the first control being pressed down is drawn, an effect of the first control floating back up is also drawn.
According to any of the foregoing human-computer interaction methods of the first aspect of the present invention, there is provided a fourth human-computer interaction method according to the first aspect of the present invention, wherein setting the controls on the control panel to the activated state includes drawing the controls with a floating effect.
According to any of the foregoing human-computer interaction methods of the first aspect of the present invention, there is provided a fifth human-computer interaction method according to the first aspect of the present invention, wherein setting the controls on the control panel to the activated state includes drawing a shadow around each control.
According to any of the foregoing human-computer interaction methods of the first aspect of the present invention, there is provided a sixth human-computer interaction method according to the first aspect of the present invention, including: in response to the cursor corresponding to the hand entering the region of the first control that is in the activated state, drawing an aiming frame around the first control.
According to any of the foregoing human-computer interaction methods of the first aspect of the present invention, there is provided a seventh human-computer interaction method according to the first aspect of the present invention, including: in response to the cursor corresponding to the hand entering the region of a second control that is in the activated state, drawing an aiming frame around the second control; and removing the aiming frames around controls other than the second control.
According to the foregoing first through sixth human-computer interaction methods of the first aspect of the present invention, there is provided an eighth human-computer interaction method according to the first aspect of the present invention, including: in response to the cursor corresponding to the hand moving out of the region of the first control, removing the aiming frame around the first control.
According to any of the foregoing human-computer interaction methods of the first aspect of the present invention, there is provided a ninth human-computer interaction method according to the first aspect of the present invention, including: in response to the line of sight leaving the control panel region, setting the controls on the control panel to the inactive state.
According to the foregoing first through eighth human-computer interaction methods of the first aspect of the present invention, there is provided a tenth human-computer interaction method according to the first aspect of the present invention, including: in response to the cursor corresponding to the hand leaving the region of the first control that is in the activated state, setting the control to the inactive state if the line of sight has left the control panel region.
According to the foregoing eighth through tenth human-computer interaction methods of the first aspect of the present invention, there is provided an eleventh human-computer interaction method according to the first aspect of the present invention, wherein setting a control to the inactive state includes removing the floating effect of the control.
According to any of the foregoing human-computer interaction methods of the first aspect of the present invention, there is provided a twelfth human-computer interaction method according to the first aspect of the present invention, including: after generating the first event, setting the first control back to the activated state.
According to any of the foregoing human-computer interaction methods of the first aspect of the present invention, there is provided a thirteenth human-computer interaction method according to the first aspect of the present invention, including: obtaining the position, in the virtual coordinate system, of the cursor corresponding to the hand according to the position of the hand in the real coordinate system.
According to any of the foregoing human-computer interaction methods of the first aspect of the present invention, there is provided a fourteenth human-computer interaction method according to the first aspect of the present invention, including: obtaining the line of sight in the virtual coordinate system according to the pupil orientation and/or the head posture in the real coordinate system.
According to any of the foregoing human-computer interaction methods of the first aspect of the present invention, there is provided a fifteenth human-computer interaction method according to the first aspect of the present invention, including: in response to the cursor corresponding to the hand entering the region of the first control that is in the activated state, setting the first control to the aimed state; and, only for the first control in the aimed state, detecting a first distance between the hand and the first control in the virtual coordinate system and determining the diameter of the crosshair icon according to the first distance.
According to a second aspect of the present invention, a human-computer interaction apparatus is provided, including: an activation module, configured to set the controls on the control panel to the activated state in response to the line of sight entering the control panel region; a crosshair drawing module, configured to draw the cursor as a crosshair icon in response to the cursor corresponding to the hand entering the region of a first control that is in the activated state; a crosshair update module, configured to update the size of the crosshair icon according to the distance between the hand and the first control in the virtual coordinate system; and an event generation module, configured to generate a first event in response to the distance between the hand and the first control in the virtual coordinate system being less than a threshold.
According to a third aspect of the present invention, a first human-computer interaction system according to the third aspect of the present invention is provided, including a computing unit, a display device, and a sensor module. The computing unit runs a virtual reality application to construct a virtual reality scene. The display device displays the virtual reality scene constructed by the computing unit. The sensor module perceives the direction of the user's line of sight and the posture of the user's hand in the real coordinate system. Based on the direction of the user's line of sight perceived by the sensor module, the computing unit sets the controls on the control panel to the activated state in response to the line of sight entering the control panel region. Based on the posture of the user's hand in the real coordinate system perceived by the sensor module, the computing unit determines the position of the cursor corresponding to the hand and, in response to the cursor entering the region of a first control that is in the activated state, draws the cursor as a crosshair icon. The computing unit updates the size of the crosshair icon according to the distance between the user's hand and the first control in the virtual coordinate system, and generates a first event in response to the distance between the hand and the first control in the virtual coordinate system being less than a threshold.
According to the first human-computer interaction system of the third aspect of the present invention, there is provided a second human-computer interaction system according to the third aspect of the present invention, wherein, in response to the distance between the hand and the first control in the virtual coordinate system being less than the threshold, an effect of the first control being pressed down is also drawn.
According to the second human-computer interaction system of the third aspect of the present invention, there is provided a third human-computer interaction system according to the third aspect of the present invention, wherein, after the effect of the first control being pressed down is drawn, an effect of the first control floating back up is also drawn.
According to any of the foregoing human-computer interaction systems of the third aspect of the present invention, there is provided a fourth human-computer interaction system according to the third aspect of the present invention, wherein setting the controls on the control panel to the activated state includes drawing the controls with a floating effect.
According to any of the foregoing human-computer interaction systems of the third aspect of the present invention, there is provided a fifth human-computer interaction system according to the third aspect of the present invention, wherein setting the controls on the control panel to the activated state includes drawing a shadow around each control.
According to any of the foregoing human-computer interaction systems of the third aspect of the present invention, there is provided a sixth human-computer interaction system according to the third aspect of the present invention, wherein, in response to the cursor corresponding to the hand entering the region of the first control that is in the activated state, an aiming frame is drawn around the first control.
According to any of the foregoing human-computer interaction systems of the third aspect of the present invention, there is provided a seventh human-computer interaction system according to the third aspect of the present invention, wherein, in response to the cursor corresponding to the hand entering the region of a second control that is in the activated state, an aiming frame is drawn around the second control, and the aiming frames around controls other than the second control are removed.
According to the foregoing first through sixth human-computer interaction systems of the third aspect of the present invention, there is provided an eighth human-computer interaction system according to the third aspect of the present invention, wherein, in response to the cursor corresponding to the hand moving out of the region of the first control, the aiming frame around the first control is removed.
According to any of the foregoing human-computer interaction systems of the third aspect of the present invention, there is provided a ninth human-computer interaction system according to the third aspect of the present invention, wherein, in response to the line of sight leaving the control panel region, the controls on the control panel are set to the inactive state.
According to the foregoing first through eighth human-computer interaction systems of the third aspect of the present invention, there is provided a tenth human-computer interaction system according to the third aspect of the present invention, wherein, in response to the cursor corresponding to the hand leaving the region of the first control that is in the activated state, the control is set to the inactive state if the line of sight has left the control panel region.
According to the foregoing eighth through tenth human-computer interaction systems of the third aspect of the present invention, there is provided an eleventh human-computer interaction system according to the third aspect of the present invention, wherein setting a control to the inactive state includes removing the floating effect of the control.
According to any of the foregoing human-computer interaction systems of the third aspect of the present invention, there is provided a twelfth human-computer interaction system according to the third aspect of the present invention, wherein, after the first event is generated, the first control is also set back to the activated state.
According to any of the foregoing human-computer interaction systems of the third aspect of the present invention, there is provided a thirteenth human-computer interaction system according to the third aspect of the present invention, wherein the position, in the virtual coordinate system, of the cursor corresponding to the hand is obtained according to the position of the hand in the real coordinate system.
According to any of the foregoing human-computer interaction systems of the third aspect of the present invention, there is provided a fourteenth human-computer interaction system according to the third aspect of the present invention, wherein the line of sight in the virtual coordinate system is obtained according to the pupil orientation and/or the head posture in the real coordinate system.
According to any of the foregoing human-computer interaction systems of the third aspect of the present invention, there is provided a fifteenth human-computer interaction system according to the third aspect of the present invention, wherein, in response to the cursor corresponding to the hand entering the region of the first control that is in the activated state, the first control is set to the aimed state; and, only for the first control in the aimed state, a first distance between the hand and the first control in the virtual coordinate system is detected, and the diameter of the crosshair icon is determined according to the first distance.
According to a fourth aspect of the present invention, an information processing device is provided, wherein the information processing device includes a processor, a memory, and a display device, and is further coupled to a sensor module to receive the user state perceived by the sensor module; the memory stores a program, and the processor runs the program to cause the information processing device to perform the foregoing human-computer interaction methods according to the first aspect of the present invention.
Brief description of the drawings
The present invention, together with its preferred modes of use and further objects and advantages, will be best understood by reference to the following detailed description of illustrative embodiments when read in conjunction with the accompanying drawings, in which:
Fig. 1 illustrates a "confirmation" control of a two-dimensional graphical user interface in the prior art;
Fig. 2 is a structural schematic diagram of a gesture recognition system in the prior art;
Fig. 3 is a block diagram of a human-computer interaction system based on gesture recognition according to an embodiment of the present invention;
Figs. 4A-4D are schematic diagrams of the various states of a control according to an embodiment of the present invention;
Fig. 5 is a state diagram of a control according to an embodiment of the present invention;
Figs. 6A-6C are flowcharts of methods for realizing human-computer interaction according to embodiments of the present invention;
Figs. 7A-7D are schematic diagrams of a control panel in human-computer interaction according to an embodiment of the present invention; and
Fig. 8 is a block diagram of an information processing device according to an embodiment.
Detailed description
Embodiments of the present invention are described in detail below, examples of which are shown in the accompanying drawings, where, throughout, the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary, are intended only to explain the present invention, and are not to be construed as limiting the present invention. On the contrary, the embodiments of the invention include all changes, modifications, and equivalents falling within the spirit and scope of the appended claims.
In the description of the present invention, it should be understood that the terms "first", "second", and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. It should also be noted that, unless otherwise expressly specified and limited, the terms "connected" and "coupled" are to be understood broadly: a connection may be fixed, detachable, or integral; mechanical or electrical; direct, or indirect through an intermediary. For those of ordinary skill in the art, the specific meanings of the above terms in the present invention can be understood according to the specific circumstances. Furthermore, in the description of the present invention, unless otherwise indicated, "plurality" means two or more.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code comprising one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present invention includes additional implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order, as should be understood by those skilled in the art to which the embodiments of the present invention pertain.
Fig. 3 is a block diagram of a human-computer interaction system based on gesture recognition according to an embodiment of the present invention. The human-computer interaction system according to the embodiment includes a gesture input device 310, an information processing device 320, and a display device 330, coupled to one another. In one example, the gesture input device 310 captures images of the user's hand and sends the captured images to the information processing device for processing. The information processing device 320 receives the hand images sent by the gesture input device and recognizes the gesture information of the user's hand in the images. The information processing device 320 also presents graphics and/or images to the user through the display device 330, for example drawing a virtual image of the user's hand on the display device 330. The information processing device may be, for example, a computer, a mobile phone, or a dedicated gesture recognition device. The display device 330 may be, for example, a flat-panel display, a projector, or a head-mounted display.
In another example, the gesture input device 310 perceives the position and/or posture of the user's hand, recognizes the gesture information of the user's hand, and sends the hand information to the information processing device 320. The information processing device 320 takes the user hand information provided by the gesture input device 310 as the user's input and provides output to the user through the display device 330, thereby realizing human-computer interaction. Obviously, the information processing device 320 can also interact with the user through sound, mechanical feedback, and other modalities.
As still another example, the gesture input device 310 may be, for example, a depth sensor, a distance sensor, a VR controller (such as the Oculus Rift Touch), a gamepad, a data glove (such as the CyberGlove), a motion capture system (such as OptiTrack), or a gyroscope, used to perceive the position and/or posture of the user's hand.
The gesture input device 310 further includes a line-of-sight detection device 312. The line-of-sight detection device 312 may be, for example, a helmet or headset. A gyroscope on the helmet recognizes the position and orientation of the user's head, and the direction of the user's eyes is determined as the gaze direction. As another example, the user's eyes, especially the pupils, are recognized by a video or image-capturing device, and the observed direction is taken as the gaze direction. The video or image-capturing device may be deployed on the helmet, or at a position in the human-computer interaction system from which the user's eyes can be observed. The line-of-sight detection device 312 recognizes the user's gaze direction and supplies it to the information processing device 320. Optionally, the line-of-sight detection device 312 captures raw information and supplies it to the information processing device 320, and the information processing device 320 extracts the direction of the user's line of sight from the raw information. As another example, the line-of-sight detection device 312 is independent of the gesture input device 310 and interacts with the information processing device 320 through an independent channel. Optionally, the line-of-sight detection device 312 recognizes the user's field of view and supplies it to the information processing device 320. The field of view may be a cone with its apex at the user's eyes or head and the user's line of sight as the axis of the cone.
From the gestures and/or actions performed by the user in the real world (or "real space"), gesture information (i) based on the virtual coordinate system is extracted. The gesture information (i) may be a vector, formalized as i = {c, palm, thumb, index, mid, ring, little}, where c denotes the hand shape of the whole hand (for example, a fist, five fingers spread, a victory gesture, etc.), palm represents the position information of the palm, and thumb, index, mid, ring, and little respectively represent the position information and/or orientation information of the thumb, index finger, middle finger, ring finger, and little finger. The virtual coordinate system expresses position information in the virtual world (or "virtual space") constructed by the information processing device 320, while the real coordinate system expresses the position information of objects or spaces in the real world. The virtual world constructed by the information processing device 320 may be, for example, the two-dimensional space of a two-dimensional graphical user interface, a three-dimensional space, or a virtual reality scene into which the user is merged. The real coordinate system and the virtual coordinate system may each be a two-dimensional or a three-dimensional coordinate system. The gesture information (i) may be updated at a certain frequency or time interval, or updated whenever the position and/or posture of the user's hand changes.
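By way of illustration only (this sketch is not part of the original disclosure), the gesture vector i could be represented as the following Python structure; the type names and the choice of a plain tuple for positions are assumptions:

    from dataclasses import dataclass
    from typing import Tuple

    Vec3 = Tuple[float, float, float]  # (x, y, z) position in the virtual coordinate system

    @dataclass
    class GestureInfo:
        """One sample of the gesture vector i = {c, palm, thumb, index, mid, ring, little}."""
        c: str        # hand shape of the whole hand, e.g. "fist", "five_fingers_open", "victory"
        palm: Vec3    # palm position
        thumb: Vec3   # finger positions (orientation components omitted in this sketch)
        index: Vec3
        mid: Vec3
        ring: Vec3
        little: Vec3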
On the user interface, a cursor can be displayed according to the gesture information (i) to provide visual feedback to the user. The position of the cursor on the graphical interface can be expressed as a function of the gesture information (i), for example func_a(i). Those skilled in the art will understand that the function func_a differs according to different application scenarios or settings.
For example, in a two-dimensional user interface, the position at which to draw the cursor is calculated by formula (1):
func_a(i) = c·0 + palm·0 + index.position·0.5 + mid·0 + little·0 (1)
In formula (1), index.position refers to the position of the user's index finger. It can thus be seen from formula (1) that the position of the cursor on the user interface depends only on the position of the user's index finger, and the distance the cursor moves on the user interface is half the distance the index finger moves.
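A minimal sketch of formula (1), reusing the hypothetical GestureInfo type above; only the index-finger term survives because every other weight is zero:

    def cursor_position_2d(i: GestureInfo) -> Tuple[float, float]:
        """func_a(i): the cursor moves half the distance the index finger moves."""
        x, y, _ = i.index            # index.position; all other terms have weight 0
        return (0.5 * x, 0.5 * y)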
The cursor may have a single style, for example the shape of a hand. The cursor may also have multiple styles corresponding to different hand shapes. As an example, the cursor shows a hand shape in which the index finger is extended, the other four fingers are bent, and the palm faces the user; the cursor additionally indicates the direction in which the index finger points.
Based on the gesture information (i), the relative position or distance between the user's hand and other objects in the virtual world can be determined, for example the distance between the index finger and the plane in which a control lies. As another example, the user's hand is projected, in the virtual coordinate system, onto the plane in which the control lies, to obtain the position of the cursor displayed according to the gesture information (i); and the size of the displayed cursor is determined according to the distance between the hand and the plane in which the control lies. According to an embodiment of the present application, the diameter of the cursor is related to this distance: the closer the index finger is to the plane in which the control lies, the smaller the diameter of the cursor; the farther the index finger is from that plane, the larger the diameter. Thus, the diameter of the drawn cursor provides the user with visual feedback indicating the distance between the index finger and the control in the virtual world, so that the user can know whether their action is moving the finger closer to, or farther from, the control.
From the direction of the user's line of sight in the real world, the direction of the line of sight in the virtual world is derived. Based on the position of the user's head and/or hand in the virtual world, the straight line, cylinder, or cone in which the user's line of sight lies in the virtual world is computed. The line, cylinder, or cone of the user's line of sight may intersect the plane in which the control lies, and the position or region that the user's line of sight points at or observes is determined from the location of the intersection. The control or object at the intersection location or region is the object observed by the user's line of sight. The movement of the user's line of sight in the virtual world is also recognized, to determine the region that the line of sight enters or leaves, including the region it enters or leaves within the plane in which the control lies.
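As a sketch of one way the gaze intersection could be computed (the planar panel, the parameter names, and the use of a simple ray rather than a cylinder or cone are assumptions, not the patent's wording):

    import numpy as np

    def gaze_hit_point(eye, gaze_dir, plane_point, plane_normal):
        """Intersect the gaze ray with the plane in which the control panel lies.
        Returns the intersection point, or None if the gaze is parallel to the
        plane or points away from it."""
        denom = float(np.dot(gaze_dir, plane_normal))
        if abs(denom) < 1e-6:        # gaze parallel to the panel plane
            return None
        t = float(np.dot(np.asarray(plane_point) - eye, plane_normal)) / denom
        if t < 0:                    # panel is behind the viewer
            return None
        return np.asarray(eye) + t * np.asarray(gaze_dir)

Whether the returned hit point lies inside the control panel region then decides which control, if any, the line of sight is observing.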
Figs. 4A-4D illustrate the visual feedback provided while a button according to an embodiment of the present invention interacts with the user. The button is shown to the user visually in the virtual world. The button may be two-dimensional or three-dimensional. It will be appreciated that the button is only an example; embodiments of the present invention can also be applied to controls such as menus and choice boxes.
Fig. 4A illustrates a button in the inactive state. The button is associated with a "return" event; the word "return" is shown on the button to prompt the user that triggering the button will cause a "return" event or message for going back to the previous scene or the previous step of a process. The inactive state is indicated visually: for example, the button is drawn flush against the plane in which it lies (for example, a panel or control panel), with no shadow around it. As another example, a button in the inactive state is hidden, i.e., not shown visually at all. In still another example, a button in the inactive state is nearest to the control panel; at this time, the area of the shadow around the button is smallest, the transparency of the shadow is highest, and the blur of its edge is lowest.
Fig. 4B shows the "return" button in the activated state. The activated state is indicated to the user visually. For example, the button is drawn floating up off the plane: a shadow appears around the button, and the position of the button is translated as a whole according to the rules of perspective. As another example, the button is drawn as recessed into the plane. As another example, the color of the control is changed to indicate that the button is in the activated state.
Fig. 4C illustrates the "return" button in the aimed state. The aimed state is indicated to the user visually. For example, an aiming frame appears around the button, composed of right-angle marks at the four outer corners of the button that together enclose it, indicating to the user that this button is the target of the current operation. Further, a crosshair is drawn on the button. The crosshair is displayed according to the gesture information and is one style of the cursor corresponding to the user's hand. The position of the cursor on the plane in which the button lies can be the projection of the hand onto that plane in virtual space, obtained from the gesture information and the position of the plane itself. When the cursor is located in the region of a button in the activated state, the button changes from the activated state to the aimed state, and the style of the cursor changes to the crosshair pattern. Further, the diameter of the crosshair is determined according to the distance, in virtual space, between the hand and the plane in which the button lies. For example, if the distance between the hand and the button is L, the diameter of the crosshair is 4L/3 + 10. Thus, as the hand gradually approaches the plane, the diameter of the crosshair shrinks, indicating to the user that the hand in virtual space is getting closer to the plane while the projection of the hand (the cursor) remains on the button. The user thereby knows that keeping the hand's current direction of motion in real space will bring the hand into contact with the button in virtual space. And if the cursor leaves the button region, the button changes from the aimed state back to the activated state and the cursor style changes from the crosshair back to the normal style, so the user knows whether the direction of hand movement matches their expectation.
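A sketch of the distance-to-diameter mapping described above, using the example d = 4L/3 + 10 (the clamping of negative distances is an added assumption):

    def crosshair_diameter(distance_to_panel: float) -> float:
        """Diameter of the crosshair as the hand approaches the button's plane:
        d = 4L/3 + 10, shrinking toward 10 at contact (L = 0)."""
        L = max(0.0, distance_to_panel)   # assumed clamp once the plane is reached
        return 4.0 * L / 3.0 + 10.0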
In virtual reality applications in 3D virtual space, the aimed state of a button is particularly useful. Actions or gestures made by the user in 3D space are difficult to keep stable in position, direction, and distance. By providing the aimed state, and by continuously prompting the user through the position and size of the crosshair about the direction and extent of hand movement, the user is guided to complete the click on the target button, or to adjust the target.
Fig. 4D illustrates the "return" button in the clicked state. The clicked state is indicated to the user visually. In virtual space, when the hand touches the button, or the distance from the hand to the plane in which the button lies is less than a threshold, a click on the button is triggered. An event or message is generated to indicate that the user has clicked the button; the click event or message is associated with the word "return" on the button. The crosshair is drawn as a solid circle to indicate to the user that the click on the button has been triggered. Optionally, an effect of the button being pressed, or of being pressed and bouncing back, is drawn to indicate to the user that the click has been triggered and completed. In one example, the button is pressed down and the click event or message is generated, but the button does not bounce back; the button exits the clicked state, the aiming frame around it disappears, and the cursor style changes from the crosshair pattern back to the ordinary cursor — this state of the button is referred to as "clicked". In another example, the button is pressed down, the click event or message is generated, and then, as the user's hand in virtual space moves away from the button, the button bounces back and exits the clicked state, entering the activated state or the aimed state according to whether the cursor is within the button region.
By providing the button with multiple states and giving it a different visual effect in each state, timely feedback is provided to the user, guiding the user to interact with the controls in the virtual reality system conveniently, effectively, and comfortably using gestures. Still optionally, in addition to visual feedback, a designated sound is played and/or haptic feedback is provided to the user. For example, as the distance between the hand and the plane in which the button lies changes in virtual space, the played sound changes gradually and/or its frequency changes gradually.
Besides buttons, embodiments of the present invention are also applicable to other controls, such as radio boxes, check boxes, text boxes, or any other control that can respond to a click event. For a radio box, the click action selects the radio box and deselects the other radio boxes in the same group; for a check box, the click event checks the box; and for a text box, the click event makes the text box ready to receive text input. Controls may have different shapes, and in the various states different visual effects may be used to indicate to the user which state a control is in.
For each state of a button, the state the button is in, or a change of state, can also be indicated to the user by a text prompt or by sound.
Fig. 5 is the state diagram of a control according to an embodiment of the present invention. The states of a control include the inactive state 510, the activated state 520, the aimed state 530, and the clicked state 540. After system initialization, the control is in the inactive state 510, and a cursor is drawn according to the position of the user's hand in virtual space. A control in the inactive state may be shown as in Fig. 4A, may be hidden, or may be given only a minimal indication of its existence. For a control in the inactive state, when it is detected that the user is observing the region in which the control lies or the control panel region carrying the control, the control transitions from the inactive state 510 to the activated state 520 (see also Fig. 4B). For example, the line-of-sight detection device 312 (see also Fig. 3) recognizes that the user's line of sight has entered the region in which the control lies. As another example, the line-of-sight detection device 312 provides the user's field of view, and the cross-section of the field-of-view cone with the region in which the control lies, or with the control panel region, is obtained; if the proportion of the overlap between the field of view (or its cross-section) and the control's region exceeds a threshold, or the axis of the field-of-view cone enters the region in which the control lies, it is recognized that the user's line of sight has entered that region. Activating a control by detecting that the user is looking at it greatly facilitates human-computer interaction: the user's line of sight and gestures can operate simultaneously and cooperatively to activate and operate controls.
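One way the field-of-view test described here could be sketched (treating the field of view as a cone with a given apex, axis, and half-angle; the names and the simple angle-based test, rather than the overlap-ratio variant, are assumptions):

    import numpy as np

    def gaze_activates_panel(eye, view_axis, half_angle_rad, panel_center) -> bool:
        """Activate the panel if the direction to the panel center lies within
        the field-of-view cone."""
        to_panel = np.asarray(panel_center) - np.asarray(eye)
        to_panel = to_panel / np.linalg.norm(to_panel)
        axis = np.asarray(view_axis) / np.linalg.norm(view_axis)
        return float(np.dot(to_panel, axis)) >= np.cos(half_angle_rad)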
For a control in the activated state, if it is detected that the user is no longer observing the control, or that the line of sight has left the region in which the control lies, the control changes from the activated state 520 back to the inactive state 510. For a control in the activated state, if it is detected that, as the user's hand moves, the cursor corresponding to the user's hand moves into the control's region, the control changes from the activated state 520 to the aimed state. As an example, in virtual space the cursor is projected onto the plane in which the control lies (for example, the plane of the control panel) to recognize whether the cursor has moved into the control's region. As another example, the user's hand in virtual space is projected onto the plane in which the control lies, and the cursor is drawn at the projected position of the hand on that plane.
For a control in the aimed state 530, if it is detected that, as the user's hand moves, the cursor corresponding to the user's hand moves out of the control's region, the control changes from the aimed state 530 back to the activated state 520. As an example, when a control is in the aimed state 530, the user's line of sight leaving the control panel region does not affect the state of the control. As another example, when a control is in the aimed state 530, if the user's line of sight leaves the control panel region, the control goes from the aimed state 530 to the inactive state 510. In different environments, users may need different interaction experiences.
See also Fig. 4C: for a control in the aimed state 530, an aiming frame is drawn around the control, and the cursor corresponding to the user's hand is drawn as the crosshair pattern, to provide visual feedback to the user. The size of the crosshair is related to the distance from the user's hand to the control in virtual space. Thus, through the size and position of the crosshair, the user is guided to complete the click on the control, or to avoid clicking a non-target control. Showing the position of the crosshair guides the user to keep the hand (and the corresponding cursor) from moving out of the control's region, so that the user knows whether the control about to be clicked is the one they wish to click. Showing the size of the crosshair — whose diameter shrinks as the user's hand approaches the control in virtual space — lets the user know that the hand is gradually getting closer to the control. Moreover, from the diameter of the crosshair, the user can estimate the distance (and optionally the direction) the hand still needs to move in real space to complete the click.
For a control in the aimed state 530, if the user's hand touches the control in virtual space, or — to improve robustness — the distance of the user's hand from the control is less than a specified threshold, the control goes from the aimed state 530 into the clicked state 540.
See also Fig. 4D: a control entering the clicked state 540 generates an event or message to indicate the occurrence of the click.
For a control in the clicked state 540, as time passes or once the event or message indicating the click has been generated, the control automatically enters another state (for example, an "exited click" state, not depicted). Alternatively, according to the position of the hand or the cursor, the control goes from the clicked state 540 into the aimed state 530 or the activated state 520: if the cursor is within the control's region, the control enters the aimed state 530; if the cursor is outside the control's region, the control enters the activated state 520. As another example, for a control in the clicked state 540, when the hand in virtual space leaves the plane in which the control lies, or the distance of the hand from that plane exceeds a specified threshold, the control goes from the clicked state 540 into the aimed state 530 or the activated state 520 according to where the cursor is located. It will be appreciated that Fig. 5 shows the state changes of a single control. Multiple controls can be arranged on the control panel, each maintaining its own state. For example, when the cursor moves out of the region of one control and enters the region of another, the control that gains the cursor enters the aimed state 530, and the control that loses the cursor enters the activated state 520.
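The transitions of Fig. 5 could be sketched as the following state machine; the enumeration names and the per-frame inputs are illustrative, and of the two variants described above, the one in which the aimed state survives the gaze leaving the panel is the one shown:

    from enum import Enum, auto

    class ControlState(Enum):
        INACTIVE = auto()    # 510
        ACTIVATED = auto()   # 520
        AIMED = auto()       # 530
        CLICKED = auto()     # 540

    def next_state(state, gaze_on_panel, cursor_on_control, hand_distance,
                   click_threshold):
        """One step of the Fig. 5 state diagram for a single control."""
        if state == ControlState.INACTIVE:
            return ControlState.ACTIVATED if gaze_on_panel else state
        if state == ControlState.ACTIVATED:
            if not gaze_on_panel:
                return ControlState.INACTIVE
            return ControlState.AIMED if cursor_on_control else state
        if state == ControlState.AIMED:
            if hand_distance < click_threshold:
                return ControlState.CLICKED          # the click event is generated here
            return state if cursor_on_control else ControlState.ACTIVATED
        # CLICKED: fall back according to where the cursor is
        return ControlState.AIMED if cursor_on_control else ControlState.ACTIVATED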
Fig. 6A is a flowchart of realizing human-computer interaction according to an embodiment of the present invention. A control panel is set up in virtual space, with one or more controls on it. The control panel has a planar shape and is set at a designated position in virtual space according to the demands of the interaction. When the virtual space is initialized, or in response to the setting and/or creation of the control panel, the control panel and the one or more controls on it are initialized, and the controls are set to the inactive state 510 (see also Fig. 5). A cursor corresponding to the user's hand is also drawn in virtual space, its position determined by the position of the user's hand in virtual space.
In response to the user's line of sight entering the control panel region, the one or more controls on the control panel are set to the activated state 520 (S610). The activation is shown to the user by drawing the one or more activated controls as floating off or recessed into the control panel, drawing shadows around them, and/or changing them from hidden to shown.
As the user moves the hand, the cursor moves. The user moves the cursor into the region in which a control lies by moving the hand. In response to the cursor entering the region of one of the controls on the control panel, that control is set to the aimed state 530, and the cursor is drawn as the crosshair pattern (see Fig. 4C) (S611). An aiming frame is drawn around the control in the aimed state 530 to indicate to the user that the control has been aimed at.
While the control is in the aimed state, as the user moves the hand, the distance in virtual space between the hand and the aimed control on the control panel changes accordingly. The size of the crosshair pattern is updated according to the distance between the user's hand and the control in the aimed state (S612): the closer the hand is to the aimed control, the smaller the diameter of the crosshair pattern; the farther the hand is from the aimed control, the larger the diameter.
Optionally, aiming frames are drawn around the aimed control to show the change in the hand's distance from it. For example, as the hand approaches the aimed control, multiple aiming frames are drawn repeatedly from large to small; and as the hand moves away from the aimed control, multiple aiming frames are drawn repeatedly from small to large.
When the hand's position in virtual space touches the aimed control, or the hand's distance from the aimed control is less than a threshold, the control is set from the aimed state to the clicked state, and an event or message indicating the click is generated (see also Fig. 4D) (S613). The crosshair is drawn as a solid circle to visually show the user that the click has occurred. Optionally, an effect of the control being pressed down and/or bouncing back is also drawn to visually show the user that the click has occurred.
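Combining steps S610-S613, one frame of this flow could look like the following sketch; the control object, its contains method, and the event tuples are hypothetical names, and next_state and crosshair_diameter are reused from the sketches above:

    def update_panel(controls, gaze_on_panel, cursor_pos, hand_distance,
                     click_threshold):
        """One frame of the Fig. 6A flow: S610 activate, S611 aim,
        S612 resize the crosshair, S613 click."""
        events = []
        for control in controls:
            over = control.contains(cursor_pos)       # cursor projected onto the panel
            prev = control.state
            control.state = next_state(prev, gaze_on_panel, over,
                                       hand_distance, click_threshold)
            if control.state == ControlState.AIMED:
                control.crosshair_diameter = crosshair_diameter(hand_distance)  # S612
            if prev != ControlState.CLICKED and control.state == ControlState.CLICKED:
                events.append(("click", control))     # S613: event on entering clicked
        return events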
Fig. 6B is a flowchart of realizing human-computer interaction according to a further embodiment of the present invention. As shown in Fig. 6B, the interaction method starts at step S620 and includes the following steps. In step S621, the controls are initialized: the controls on the control panel are set to the inactive state.
Optionally, one or more controls are set on the control panel according to the demands of the interaction. As an example, a control in the inactive state is shown as lying flush against the control panel with no shadow around it, indicating its "inactive" state. Next, in step S622, it is detected whether the user's line of sight enters the control panel region. As an example, the user's viewing angle is detected with the line-of-sight detection device to determine whether the user's line of sight is directed toward the one or more controls on the control panel, and thereby whether it enters the control panel region.
If it is detected that the user's line of sight enters the control panel region, the flow proceeds to step S623, in which the one or more controls on the control panel are set from the inactive state 510 to the activated state 520 (see also Fig. 5). As an example, a control in the "activated" state is shown floating off the control panel; at the same time, a shadow appears around the control, and the control's position is translated as a whole according to the rules of perspective. As another example, the control is shown recessed into the plane. As another example, the color of the control is changed to indicate that it is in the "activated" state.
Next, in step S624, it is detected whether the cursor corresponding to the user's hand enters the region of a control on the control panel.
As an example, the cursor corresponding to the user's hand is drawn in virtual space, and the position of the cursor in virtual space depends on the position of the user's hand. As the user moves the hand, the position of the cursor moves. As an example, the position of the cursor or of the user's hand in virtual space is projected onto the plane in which the control panel (and thus the control) lies, and whether the cursor has moved into a control's region is recognized from the projection on that plane.
When it is detected that the cursor enters a control's region on the control panel, the flow proceeds to step S625, in which the control on the control panel is set to the "aimed" state 530 (see also Fig. 5), and the cursor is drawn as the crosshair icon. For example, if the position of the cursor's projection on the control panel plane overlaps one of the controls, or its distance from the edge of one of the controls is less than a threshold, the cursor is considered to have entered the region of that control. As an example, a control in the "aimed" state is shown with an aiming frame around it. Optionally, the aiming frame is composed of right-angle marks at the four outer corners of the control that together enclose it, indicating to the user that this control is the target of the current operation. Further, the crosshair icon is drawn on the control. The crosshair icon is displayed according to the gesture information and is one style of the cursor corresponding to the user's hand. Optionally, the crosshair icon is drawn at the projection of the hand position onto the control panel in virtual space, and is rendered as an annulus cursor whose size changes with the movement of the hand. As another example, the crosshair icon may be rendered as a rectangle with a cross or "X" pattern at its center.
For a control in the "aimed" state 530, if it is detected that, as the user's hand moves, the cursor corresponding to the user's hand moves out of the control's region (step S629), the control in the "aimed" state on the control panel changes from the "aimed" state 530 back to the "activated" state 520.
For a control in the "aimed" state 530, in step S626 the size of the crosshair icon is updated according to the distance in virtual space between the user's hand and the control in the "aimed" state. As an example, the crosshair icon is an annulus cursor whose diameter changes as a function of the distance between the hand and the control, while the ring width of the annulus remains unchanged. For example, if the distance between the hand and the control is L, the function for the diameter of the annulus cursor is 4L/3 + 10. Thus, as the hand gradually approaches the control panel, the diameter of the annulus cursor shrinks, indicating to the user that the hand in virtual space is getting closer to the plane while the projection of the hand (the cursor) on the control panel plane remains on the control in the "aimed" state. Optionally, the diameter of the annulus cursor changes with the movement of the hand while the ring width remains unchanged, i.e., the difference between the diameters of the outer circle and the inner circle remains constant.
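A sketch of the annulus geometry described here: the outer diameter follows the example function 4L/3 + 10 (reusing crosshair_diameter from the sketch above), while the ring width, an assumed parameter, stays constant:

    def annulus_radii(distance_to_control: float, ring_width: float = 2.0):
        """Outer and inner radii of the annulus cursor. The overall diameter
        shrinks as the hand approaches; the ring width stays constant."""
        outer_radius = crosshair_diameter(distance_to_control) / 2.0  # (4L/3 + 10) / 2
        inner_radius = max(0.0, outer_radius - ring_width)
        return outer_radius, inner_radius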
In step S627, it is judged whether the distance in virtual space between the hand and the control in the "aimed" state is less than a threshold. The threshold may be 0 or another designated value. The hand touching the control in virtual space, or the distance from the hand to the control on the control panel falling below the threshold, indicates that the user intends to click the control. If step S627 judges that the distance between the user's hand and the control is less than the threshold, then in step S628 the control on the control panel is set to the "clicked" state, and the event or message corresponding to the control is generated to indicate that the user has clicked the control.
As an example, when the hand touches the control, or the distance from the hand to the control on the control panel is less than the threshold, the crosshair icon (the annulus cursor) is drawn as a solid circle to indicate to the user that the click on the control has been triggered. In response to the "clicked" state, the control is drawn falling back toward the control panel, the shadow around it disappears, and the click event or message is generated. Further optionally, the control that has fallen back to the control panel does not bounce back; the control is set to an "exited click" state, the aiming frame around the control disappears, and the cursor style changes from the crosshair icon back to the ordinary cursor. As another example, in response to the "clicked" state, the control is drawn being pressed down and the click event or message is generated; then, as the user's hand in virtual space moves away from the control, the control is drawn bouncing back, and the control is set to the "activated" or "aimed" state according to whether the projection of the cursor on the control panel plane is within the control's region.
Optionally, during the user's interaction with the control panel in virtual space, in response to detecting that the user's line of sight leaves the control panel region, the controls on the control panel are set to the inactive state 510 (step S630).
Fig. 6 C are the flow charts of the method for realization human-computer interaction according to yet another embodiment of the invention.The embodiment of the present invention carries The man-machine interaction method of confession can provide the scene of the panel of fixed control panel suitable for virtual control, it is preferable that control plane Distance of the plate apart from user is no more than the length of user's arm.As an example, control panel has flat shape, on control panel It is provided with one or more controls.According to the demand of human-computer interaction, designated position of the setting control panel in Virtual Space.User It is interacted by hand and the control on control panel.
When the virtual space is initialized, or in response to the setting and/or creation of the control panel, the control panel and the one or more controls on it are initialized, and the controls are set to the "inactive" state 510. In response to detecting that the user's gaze enters the control panel region, all controls on the control panel are set to the "activated" state 520 (steps S641-S642); specifically, the controls are drawn floating up from the control panel, and shadows are drawn around them.
After the controls on the control panel are activated, detection begins of whether the projection of the user's index finger in virtual space onto the plane of the control panel falls on one of the controls (step S643). If the detection result is yes, that control is set to the "aiming" state. As an example, to give the user visual feedback that the control is in the "aiming" state, an aiming frame is drawn at the four corners of the control, the cursor is drawn in the ring cursor pattern, and the size of the ring cursor is updated as the hand moves. Suppose the diameter of the ring cursor is d and the distance in virtual space from the index finger to the control is L; the diameter d of the ring cursor varies as a function of L (step S644), for example d = 4L/3+10. As the distance from the index finger to the control keeps decreasing, the diameter of the ring cursor decreases with it, indicating to the user that the hand is gradually approaching the control panel, while the position of the ring cursor remains at the projected position of the index finger on the plane of the control panel. When the distance L from the index finger to the control is 0 or less than a specified threshold, it represents that the user intends to touch the control on the control panel with the index finger. In response, an event indicating that the control has been clicked is generated, and the clicked control is set to the "clicked" state. To indicate visually to the user that the control has been set to the "clicked" state, the control is drawn falling back to the control panel and bouncing back to its original position, and the event associated with the control is triggered (steps S645-S646). After the event associated with the control is triggered, the method returns to step S643 and continues to detect whether the projection of the user's index finger in virtual space onto the plane of the control panel falls on one of the controls. If in step S645 the distance L from the index finger to the control is detected to be not less than the threshold, the control remains in the "activated" state, and the method returns to step S643 to continue detecting whether the projection of the user's index finger onto the plane of the control panel falls on one of the controls.
In step S643 it is detected whether the projection of the user's index finger falls on one of the controls. If the detection result is no, i.e. the index finger's projection does not fall on any control on the control panel, it continues to be detected whether the user's gaze has left the control panel region (S647). If the user's gaze has left the control panel region, the controls on the control panel are drawn falling back onto it, the shadows around the controls disappear, and the controls are set to the "inactive" state (steps S647-S648). The method then returns to step S641 and resumes detecting whether the user's gaze enters the control panel region.
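Putting steps S641-S648 together, the Fig. 6C flow can be sketched as a single update function. The Control class and the gaze/projection/distance inputs below are hypothetical stand-ins for what the eye-tracking and gesture-input pipeline would supply; the sketch is illustrative, not the claimed implementation.

    from dataclasses import dataclass

    @dataclass
    class Control:
        name: str
        state: str = "inactive"   # inactive / activated / aiming / clicked

    def panel_step(controls, gaze_in_region, aimed, finger_distance, threshold=0.0):
        """One iteration of the Fig. 6C flow. `aimed` is the control under
        the index finger's projection (or None); `finger_distance` is L."""
        events = []
        if not gaze_in_region:                    # S647
            for c in controls:
                c.state = "inactive"              # S648: fall back, shadows gone
            return events                         # resume gaze detection at S641
        for c in controls:
            if c.state == "inactive":
                c.state = "activated"             # S641-S642: float up
        if aimed is not None:                     # S643
            aimed.state = "aiming"                # aiming frame + ring cursor
            # S644: redraw the ring cursor with d = 4*finger_distance/3 + 10
            if finger_distance <= threshold:      # S645
                aimed.state = "clicked"           # S646: fall back and bounce
                events.append(("click", aimed.name))
        return events

For example, panel_step(buttons, True, buttons[3], 0.0) would activate the panel and click the fourth button.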
Figs. 7A-7D are schematic diagrams of the control panel in human-computer interaction according to embodiments of the present invention. To facilitate human-computer interaction, the control panel and the buttons on it are displayed visually to the user in virtual space. As an example, the control panel includes four buttons: "play", "pause", "forward" and "back". The pattern of a button may be two-dimensional or three-dimensional. It will be appreciated that buttons serve only as an example; the embodiments of the present invention may also be applied to controls such as menus and choice boxes. The number of buttons may be four, or any number may be arranged according to the demands of human-computer interaction. The cursor is rendered at the projection, onto the plane of the control panel, of the hand's position in virtual space; the cursor may take many shapes, for example a hand, an arrow or a ring.
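Placing the cursor at the projection of the hand onto the plane of the control panel, as just described, reduces to a point-to-plane projection. A minimal numpy sketch, with the plane described by an assumed origin point and normal vector:

    import numpy as np

    def cursor_on_panel(hand_pos, panel_origin, panel_normal):
        """Project the hand's virtual-space position onto the control panel
        plane. Returns the projected cursor position and the hand-to-plane
        distance, which can serve as L in the ring-cursor mapping above."""
        n = np.asarray(panel_normal, dtype=float)
        n /= np.linalg.norm(n)
        offset = np.asarray(hand_pos, dtype=float) - np.asarray(panel_origin, dtype=float)
        dist = float(np.dot(offset, n))
        return np.asarray(hand_pos, dtype=float) - dist * n, abs(dist)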
Fig. 7A shows the control panel with its buttons in the "inactive" state. When the virtual space is initialized, or in response to the setting and/or creation of the control panel, the control panel is displayed in virtual space as in Fig. 7A. That the buttons are in the inactive state is indicated visually, for example by showing the buttons lying flat against the control panel and/or without shadows around them. As another example, when the buttons on the control panel are in the "inactive" state, the control panel and/or the controls on it are hidden, so as to reduce the disturbance to the user.
Fig. 7B shows the control panel with its buttons in the "activated" state. In response to the user's gaze entering the control panel region, the control panel is displayed in virtual space as in Fig. 7B. That the buttons are in the "activated" state is indicated to the user visually. Optionally, the buttons are shown floating up from the plane, for example with shadows presented around them and/or their positions translated as a whole according to the rules of perspective.
Fig. 7C shows the control panel with its "back" button in the "aiming" state, the remaining buttons staying in the "activated" state. In response to the cursor entering the region of the "back" button, the "back" button changes from the "activated" state to the "aiming" state, and the control panel is displayed in virtual space as in Fig. 7C. That the button is in the "aiming" state is indicated to the user visually; for example, an aiming frame is drawn around the button. The aiming frame consists of rectangular marks that appear outside the button's four corners and surround them, indicating to the user that this button is the target of the current user operation. Further, the crosshair icon is also drawn on the button.
Fig. 7D shows the control panel with its "back" button in the "clicked" state, the remaining buttons keeping the "activated" state. In response to the user's hand touching the "back" button in virtual space, or the hand-to-button distance falling below the threshold, the "back" button changes from the "aiming" state to the "clicked" state: the crosshair cursor is drawn as a solid circle, and the event associated with the "back" button is triggered. The control panel is displayed in virtual space as in Fig. 7D, indicating visually to the user that the button is in the "clicked" state and showing visually that the click event or message associated with the "back" button has been generated, or that the click action has completed. For example, the aiming frame is drawn around the button, the button falls back onto the control panel, and the shadow around the button disappears, indicating to the user that the click on the button has been triggered and has completed. Optionally, the effect of the button falling back to the control panel and bouncing up again indicates to the user that the click on the button has been triggered and completed.
As one embodiment, after the "back" button enters the "clicked" state, if the user withdraws the hand (the hand-to-button distance in virtual space exceeds the threshold), the "back" button is set to the "activated" state, the control panel is again displayed in virtual space as shown in Fig. 7B, and detection of the user's operations continues.
As another embodiment, after the "back" button enters the "clicked" state, if the user withdraws the hand (the hand-to-button distance in virtual space exceeds the threshold), the "back" button is set to the "exit click" state. A button in the "exit click" state may differ from buttons in the "activated" or "aiming" state: for example, the button lies on the control panel without floating up automatically, and when the cursor is located within the region of a button in the "exit click" state, the cursor is shown in the normal cursor pattern rather than the ring cursor pattern.
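The cursor patterns named across Figs. 7B-7D and the "exit click" variant can be collected in one lookup; the style names below are assumptions for illustration.

    def cursor_style(control_state: str) -> str:
        """Cursor pattern shown while the cursor is over a control: ring
        cursor while aiming, solid circle on click, and the normal cursor
        over a control in the 'exit click' state."""
        return {
            "aiming": "ring",            # Fig. 7C
            "clicked": "solid_circle",   # Fig. 7D
            "exit_click": "normal",      # no automatic float-up
        }.get(control_state, "normal")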
Optionally, the operation panel, the controls and the cursor of embodiments of the present invention may be given different colors, with notable differences among the three, so that the user can easily distinguish and operate them.
Optionally, according to embodiments of the present application, when the user interacts using the fingers, the fingers are kept visible to the gesture input device 310 so that the palm does not occlude them, which facilitates recognition of the hand posture and finger positions.
Fig. 8 is a block diagram of an information processing device realizing an embodiment of the present invention. In an embodiment according to the present invention, information processing device 800 generates controls on a user interface, identifies user gesture information (i) or receives gesture information (i) provided by a gesture input/gesture recognition device, identifies the user's instructions, and provides feedback to interact with the user. The information processing device 800 shown in Fig. 8 is a computer. The computer is only one example of a suitable computing environment and is not intended to imply any limitation on the use or scope of functionality of the present invention. Nor should the information processing device shown in Fig. 8 be interpreted as having any dependency or requirement relating to any one of the illustrated components or combinations thereof.
The information processing device 800 includes a memory 812, one or more processors 814, one or more presentation components 816, I/O components 820 and a power supply 822, coupled directly or indirectly to a bus 810. The bus 810 may represent one or more buses (such as an address bus, a data bus, or a combination thereof). In practice the boundaries between the various components are not necessarily as definite as in Fig. 8; for example, the I/O components 820 could equally be regarded as a presentation component such as a display device, and a processor may have its own memory. The diagram of Fig. 8 merely illustrates an exemplary computer system that can be used in connection with one or more embodiments of the present invention.
The information processing device 800 typically includes a variety of memory 812. By way of example and not limitation, the memory 812 may include: random access memory (RAM), read-only memory (ROM), electronically erasable programmable read-only memory (EEPROM), flash memory, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical or holographic media, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. The computer storage media may be non-volatile.
The information processing device 800 includes one or more processors 814 that read data from various entities such as the bus 810, the memory 812 or the I/O components 820. The one or more presentation components 816 present data indications to a user or other device; exemplary presentation components 816 include a display device, speaker, printing component, vibrating component, flat-panel display, projector, head-mounted display and the like. The presentation components 816 may also be I/O ports for coupling a display device, speaker, printing component, vibrating component, flat-panel display, projector, head-mounted display and the like. Exemplary I/O components 820 include a camera, microphone, joystick, game pad, satellite dish antenna, scanner, printer, wireless device and the like.
Although embodiments of the present invention have been shown and described, those of ordinary skill in the art will understand that various changes, modifications, substitutions and variations may be made to these embodiments without departing from the principles and spirit of the present invention; the scope of the present invention is defined by the appended claims and their equivalents.

Claims (10)

1. A human-computer interaction method, comprising:
in response to a gaze being directed toward a control panel region, setting controls on the control panel to an activated state;
in response to a cursor corresponding to a hand entering the region of a first control in the activated state, drawing the cursor as a crosshair icon, and updating the size of the crosshair icon according to the distance between the hand and the first control in a virtual coordinate system; and
in response to the distance between the hand and the first control in the virtual coordinate system being less than a threshold, generating a first event.
2. The method according to claim 1, wherein
in response to the distance between the hand and the first control in the virtual coordinate system being less than the threshold, an effect of the first control being pressed is also drawn.
3. The method according to one of claims 1-2, further comprising:
in response to the cursor corresponding to the hand entering the region of the first control in the activated state, drawing an aiming frame around the first control.
4. The method according to one of claims 1-3, further comprising:
in response to the gaze leaving the control panel region, setting the controls on the control panel to an inactive state.
5. The method according to one of claims 1-4, wherein
the position, in the virtual coordinate system, of the cursor corresponding to the hand is obtained according to the position of the hand in a real coordinate system.
6. The method according to one of claims 1-5, wherein
the gaze in the virtual coordinate system is obtained according to the pupil direction and/or the head posture in the real coordinate system.
7. The method according to one of claims 1-6, wherein
in response to the cursor corresponding to the hand entering the region of the first control in the activated state, the first control is set to an aiming state; and only for the first control in the aiming state is a first distance between the hand and the first control in the virtual coordinate system detected, the diameter of the crosshair icon being determined according to the first distance.
8. A human-computer interaction apparatus, comprising:
an activation module, for setting controls on a control panel to an activated state in response to a gaze entering a control panel region;
a crosshair drawing module, for drawing a cursor as a crosshair icon in response to the cursor corresponding to a hand entering the region of a first control in the activated state;
a crosshair update module, for updating the size of the crosshair icon according to the distance between the hand and the first control in a virtual coordinate system; and
an event generation module, for generating a first event in response to the distance between the hand and the first control in the virtual coordinate system being less than a threshold.
9. A human-computer interaction system, comprising a computing unit, a display device and a sensor module;
the computing unit is configured to run a virtual reality application to construct a virtual reality scene;
the display device is configured to display the virtual reality scene constructed by the computing unit;
the sensor module is configured to perceive the direction of the user's gaze and the posture of the user's hand in a real coordinate system;
the computing unit, based on the direction of the user's gaze perceived by the sensor module, sets the controls on a control panel to an activated state in response to the gaze entering a control panel region;
the computing unit, based on the posture of the user's hand in the real coordinate system perceived by the sensor module, determines the position of a cursor corresponding to the hand, and, in response to the cursor corresponding to the hand entering the region of a first control in the activated state, draws the cursor as a crosshair icon;
the computing unit updates the size of the crosshair icon according to the distance between the user's hand and the first control in a virtual coordinate system, and, in response to the distance between the hand and the first control in the virtual coordinate system being less than a threshold, generates a first event.
10. An information processing device, comprising a processor, a memory and a display device, the information processing device further being coupled to a sensor module and receiving the user state perceived by the sensor module;
the memory stores a program, and the processor runs the program to cause the information processing device to perform the method according to one of claims 1-7.
CN201710097594.6A 2017-02-22 2017-02-22 Man-machine interaction method and system based on gesture recognition and visual feedback Active CN108459702B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710097594.6A CN108459702B (en) 2017-02-22 2017-02-22 Man-machine interaction method and system based on gesture recognition and visual feedback

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710097594.6A CN108459702B (en) 2017-02-22 2017-02-22 Man-machine interaction method and system based on gesture recognition and visual feedback

Publications (2)

Publication Number Publication Date
CN108459702A true CN108459702A (en) 2018-08-28
CN108459702B CN108459702B (en) 2024-01-26

Family

ID=63220832

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710097594.6A Active CN108459702B (en) 2017-02-22 2017-02-22 Man-machine interaction method and system based on gesture recognition and visual feedback

Country Status (1)

Country Link
CN (1) CN108459702B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090231278A1 (en) * 2006-02-08 2009-09-17 Oblong Industries, Inc. Gesture Based Control Using Three-Dimensional Information Extracted Over an Extended Depth of Field
CN101441513A (en) * 2008-11-26 2009-05-27 北京科技大学 System for performing non-contact type human-machine interaction by vision
CN102375683A (en) * 2010-08-20 2012-03-14 索尼公司 Information processing device, computer program product, and display control method for cursor control
CN103347437A (en) * 2011-02-09 2013-10-09 普莱姆森斯有限公司 Gaze detection in a 3d mapping environment
CN102426480A (en) * 2011-11-03 2012-04-25 康佳集团股份有限公司 Man-machine interactive system and real-time gesture tracking processing method for same
CN104808795A (en) * 2015-04-29 2015-07-29 王子川 Gesture recognition method for reality-augmented eyeglasses and reality-augmented eyeglasses system
CN105955461A (en) * 2016-04-25 2016-09-21 乐视控股(北京)有限公司 Interactive interface management method and system

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109725724A (en) * 2018-12-29 2019-05-07 百度在线网络技术(北京)有限公司 There are the gestural control method and device of screen equipment
CN109725723A (en) * 2018-12-29 2019-05-07 百度在线网络技术(北京)有限公司 Gestural control method and device
CN109725724B (en) * 2018-12-29 2022-03-04 百度在线网络技术(北京)有限公司 Gesture control method and device for screen equipment
TWI776522B (en) * 2021-03-23 2022-09-01 宏達國際電子股份有限公司 Method for interacting with virtual environment, electronic device, and computer readable storage medium
CN113760137A (en) * 2021-06-16 2021-12-07 荣耀终端有限公司 Cursor display method and electronic equipment
CN113760137B (en) * 2021-06-16 2022-08-05 荣耀终端有限公司 Cursor display method and electronic equipment
US11989385B2 (en) 2021-06-16 2024-05-21 Honor Device Co., Ltd. Cursor display method and electronic device

Also Published As

Publication number Publication date
CN108459702B (en) 2024-01-26

Similar Documents

Publication Publication Date Title
US20220382379A1 (en) Touch Free User Interface
US10015402B2 (en) Electronic apparatus
US9685005B2 (en) Virtual lasers for interacting with augmented reality environments
US10324293B2 (en) Vision-assisted input within a virtual world
US20210011556A1 (en) Virtual user interface using a peripheral device in artificial reality environments
US10257423B2 (en) Method and system for determining proper positioning of an object
CN105229582B (en) Gesture detection based on proximity sensor and image sensor
KR101791366B1 (en) Enhanced virtual touchpad and touchscreen
JP6539816B2 (en) Multi-modal gesture based interactive system and method using one single sensing system
EP2480955B1 (en) Remote control of computer devices
CN108536273A Gesture-based human-machine menu interaction method and system
JP4513830B2 (en) Drawing apparatus and drawing method
EP2558924B1 (en) Apparatus, method and computer program for user input using a camera
CN108459702A (en) Man-machine interaction method based on gesture identification and visual feedback and system
KR20130001176A (en) System and method for close-range movement tracking
CN103347437A (en) Gaze detection in a 3d mapping environment
KR20220032059A (en) Touch free interface for augmented reality systems
KR20100106203A (en) Multi-telepointer, virtual object display device, and virtual object control method
US20160073017A1 (en) Electronic apparatus
EP2590060A1 (en) 3D user interaction system and method
US20160231807A1 (en) Electronic apparatus
JP2014026355A (en) Image display device and image display method
JP6534011B2 (en) INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD
JP6519075B2 (en) INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD
CN110673810B (en) Display device, display method and device thereof, storage medium and processor

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20191121

Address after: 300450 room 203b-3, building 3, No.4, Haitai development road 2, Huayuan Industrial Zone (outside the ring), Binhai high tech Zone, Binhai New Area, Tianjin

Applicant after: TIANJIN SHARPNOW TECHNOLOGY Co.,Ltd.

Address before: 518000 Guangdong, Shenzhen, Nanshan District science and technology south twelve road Konka R & D building 12 floor, A2

Applicant before: TIANJIN FENGSHI HUDONG TECHNOLOGY Co.,Ltd. SHENZHEN BRANCH

SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20210120

Address after: 518000 B1018, 99 Dahe Road, Runcheng community, Guanhu street, Longhua District, Shenzhen City, Guangdong Province

Applicant after: Shenzhen laimile Intelligent Technology Co.,Ltd.

Address before: Room 203b-3, building 3, No.4, Haitai development road 2, Huayuan Industrial Zone, Binhai high tech Zone, Binhai New Area, Tianjin 300450

Applicant before: Tianjin Sharpnow Technology Co.,Ltd.

TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20210917

Address after: 518000 509, xintengda building, building M8, Maqueling Industrial Zone, Maling community, Yuehai street, Nanshan District, Shenzhen City, Guangdong Province

Applicant after: Shenzhen qiaoniu Technology Co.,Ltd.

Address before: 518000 B1018, 99 Dahe Road, Runcheng community, Guanhu street, Longhua District, Shenzhen City, Guangdong Province

Applicant before: Shenzhen laimile Intelligent Technology Co.,Ltd.

GR01 Patent grant
GR01 Patent grant