CN107918481A - Man-machine interaction method and system based on gesture identification - Google Patents


Info

Publication number
CN107918481A
CN107918481A
Authority
CN
China
Prior art keywords
control
guide rail
user
sliding block
rail area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610878812.5A
Other languages
Chinese (zh)
Other versions
CN107918481B (en)
Inventor
党建勋
刘津甦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Qiaoniu Technology Co ltd
Original Assignee
Tianjin Feng Time Interactive Technology Co Ltd Shenzhen Branch
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin Feng Time Interactive Technology Co Ltd Shenzhen Branch filed Critical Tianjin Feng Time Interactive Technology Co Ltd Shenzhen Branch
Priority to CN201610878812.5A priority Critical patent/CN107918481B/en
Publication of CN107918481A publication Critical patent/CN107918481A/en
Application granted granted Critical
Publication of CN107918481B publication Critical patent/CN107918481B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Abstract

A human-machine interaction method and system based on gesture recognition are disclosed. In the disclosed method, a control recognizes the user's command from the user's gesture information. The control includes an entrance area and a rail area; the entrance area divides the user interface into a first part and a second part, where the first part does not include the rail area and the second part includes the rail area. The method includes: in response to a cursor moving from the first part of the user interface through the entrance area into the second part, the control enters an activated state and a slider is displayed in the rail area of the control; in response to the slider moving out of the rail area through a first end of the rail area, the control generates a first event.

Description

Human-machine interaction method and system based on gesture recognition
Technical field
The present invention relates to the field of human-computer interaction, and in particular to methods and systems for human-computer interaction using controls based on gesture recognition.
Background technology
In human-computer interaction technology, a control is a reusable software component for building a graphical user interface. In general, one control corresponds to one function. For example, Fig. 1 illustrates a "confirm" control in a two-dimensional graphical user interface. The "confirm" control includes a prompt window, and the prompt window includes a "confirm" button and a "cancel" button. When the "confirm" control is invoked, the prompt window shown in Fig. 1 pops up, the user's click on the "confirm" or "cancel" button is recognized to obtain the user's operation intention, and human-computer interaction is thereby realized. In the prior-art slide-to-unlock technique, the user conveys an input intention to an information processing device by sliding a hand on a touchscreen.
Novel human-computer interaction technologies continue to develop, and interaction based on gesture recognition is one of the hot spots. Hand motion can be recognized in many ways. US20100199228A1 from Microsoft (published August 5, 2010) provides a scheme that captures and analyzes the user's body posture with a depth camera and interprets it as computer commands. US20080291160A1 from Nintendo (published November 27, 2008) provides a scheme that captures the position of the user's hand using an infrared sensor and an acceleration sensor. CN1276572A from Panasonic provides a scheme that photographs the hand with a camera, performs normalized analysis on the image, projects the normalized image into a space, and compares the resulting projection coordinates with the projection coordinates of pre-stored images. Fig. 2 shows the system and method for perceiving gestures and spatial position provided by patent application CN201110100532.9 from Tianjin Fengshi Interaction Technology Co., Ltd. As shown in Fig. 2, the gesture recognition system includes: a host computer 101, a multi-camera control circuit 102, multiple cameras 103, the user's hand 104, an application 105 running on the host 101, an object 106 to be operated in the application 105, and a virtual hand cursor 107. The gesture recognition system further includes, not shown in Fig. 2, infrared illumination sources for illuminating the user's hand 104 and an infrared filter placed in front of each camera. The multiple cameras 103 capture images of the user's hand 104; the control circuit 102 processes the hand images gathered by the cameras 103 and recognizes the posture and/or position of the hand. In addition, the prior art also includes schemes that use data gloves to assist hand-posture recognition.
Summary of the invention
In the process of interaction based on gesture recognition, controls need to be designed to facilitate the development of applications. A control takes gestures as input and produces events or messages as output. An event or message may indicate the user's "confirm" or "cancel" operation intention, or indicate user intentions with a variety of different meanings. Moreover, because human biomechanics prevent the user's hand from tracing straight or standardized trajectories in the interaction space, existing human-computer interaction technology has difficulty effectively understanding the intention behind gesture input.
According to a first aspect of the invention, there is provided a first human-machine interaction method based on gesture recognition, in which a control recognizes the user's command from the user's gesture information. The control includes an entrance area and a rail area; the entrance area divides the user interface into a first part and a second part, the first part not including the rail area and the second part including the rail area. The method includes: in response to a cursor moving from the first part of the user interface through the entrance area into the second part, the control enters an activated state and a slider is displayed in the rail area of the control; in response to the slider moving out of the rail area through a first end of the rail area, the control generates a first event.
According to the first method of the first aspect of the invention, there is provided a second human-machine interaction method based on gesture recognition, including: in response to the slider moving out of the rail area through a second end of the rail area, the control generates a second event.
According to any of the foregoing methods of the first aspect of the invention, there is provided a third method, including: in response to the slider moving out of the rail area through the first end or the second end of the rail area, the control enters an inactive state.
According to any of the foregoing methods of the first aspect of the invention, there is provided a fourth method, including: initializing the control, whereupon the control enters an inactive state.
According to any of the foregoing methods of the first aspect of the invention, there is provided a fifth method, wherein the first end is the part of the rail area close to the entrance area.
According to the second method of the first aspect of the invention, there is provided a sixth method, wherein the second end is the part of the rail area away from the entrance area.
According to any of the foregoing methods of the first aspect of the invention, there is provided a seventh method, including: displaying a cursor on the user interface according to the gesture information, the gesture information indicating the position and/or posture of the user's hand extracted from hand images of the user captured by an image acquisition device.
According to any of the foregoing methods of the first aspect of the invention, there is provided an eighth method, including: drawing the slider in the rail area according to the projected position of the cursor on the center line of the rail area.
According to any of the foregoing methods of the first aspect of the invention, there is provided a ninth method, including: in response to the control entering the activated state, hiding the cursor on the user interface and changing the appearance of the control, so as to prompt the user that the control has entered the activated state.
According to any of the foregoing methods of the first aspect of the invention, there is provided a tenth method, including: in response to the control entering the inactive state, displaying the cursor on the user interface at a position according to the gesture information.
According to any of the foregoing methods of the first aspect of the invention, there is provided an eleventh method, including: in response to the slider moving out of the rail area, hiding the slider.
According to any of the foregoing methods of the first aspect of the invention, there is provided a twelfth method, including: in the activated state of the control, in response to a "grasp" action indicated by the gesture information, fixing the cursor on the slider and drawing the slider according to the gesture information.
According to the first method of the first aspect of the invention, there is provided a thirteenth method, wherein the rail area has multiple first ends.
According to the second method of the first aspect of the invention, there is provided a fourteenth method, wherein the rail area has multiple second ends.
According to a second aspect of the invention, there is provided a first human-machine interaction apparatus based on gesture recognition, in which a control recognizes the user's command from the user's gesture information. The control includes an entrance area and a rail area; the entrance area divides the user interface into a first part and a second part, the first part not including the rail area and the second part including the rail area. The apparatus includes: an activation module for, in response to a cursor moving from the first part of the user interface through the entrance area into the second part, putting the control into an activated state and displaying a slider in the rail area of the control; and an event generation module for, in response to the slider moving out of the rail area through a first end of the rail area, causing the control to generate a first event.
According to the first apparatus of the second aspect of the invention, there is provided a second apparatus, including: a second-event generation module for, in response to the slider moving out of the rail area through a second end of the rail area, causing the control to generate a second event.
According to any of the foregoing apparatus of the second aspect of the invention, there is provided a third apparatus, including: a deactivation module for, in response to the slider moving out of the rail area through the first end or the second end of the rail area, putting the control into an inactive state.
According to any of the foregoing apparatus of the second aspect of the invention, there is provided a fourth apparatus, including: an initialization module for initializing the control, whereupon the control enters an inactive state.
According to any of the foregoing apparatus of the second aspect of the invention, there is provided a fifth apparatus, wherein the second end is the part of the rail area close to the entrance area.
According to the second apparatus of the second aspect of the invention, there is provided a sixth apparatus, wherein the first end is the part of the rail area away from the entrance area.
According to any of the foregoing apparatus of the second aspect of the invention, there is provided a seventh apparatus, further including: a device for displaying a cursor on the user interface according to the gesture information, the gesture information indicating the position and/or posture of the user's hand extracted from hand images of the user captured by an image acquisition device.
According to any of the foregoing apparatus of the second aspect of the invention, there is provided an eighth apparatus, including: a slider drawing module for drawing the slider in the rail area according to the projected position of the cursor on the center line of the rail area.
According to any of the foregoing apparatus of the second aspect of the invention, there is provided a ninth apparatus, including: an appearance changing module for, in response to the control entering the activated state, hiding the cursor on the user interface, playing a specified sound, displaying specified text, and/or providing haptic feedback, so as to prompt the user that the control has entered the activated state.
According to any of the foregoing apparatus of the second aspect of the invention, there is provided a tenth apparatus, including: a cursor display module for, in response to the control entering the inactive state, displaying the cursor on the user interface at a position according to the gesture information, playing a specified sound, displaying specified text, and/or providing haptic feedback.
According to any of the foregoing apparatus of the second aspect of the invention, there is provided an eleventh apparatus, including: a slider hiding module for hiding the slider in response to the slider moving out of the rail area.
According to any of the foregoing apparatus of the second aspect of the invention, there is provided a twelfth apparatus, including: a cursor fixing module for, in the activated state of the control, in response to a "grasp" action indicated by the gesture information, fixing the cursor on the slider and drawing the slider according to the gesture information.
According to the first apparatus of the second aspect of the invention, there is provided a thirteenth apparatus, wherein the rail area has multiple first ends.
According to the second apparatus of the second aspect of the invention, there is provided a fourteenth apparatus, wherein the rail area has multiple second ends.
According to a third aspect of the invention, there is provided an information processing device, the information processing device including a processor, a memory, and a display device. The information processing device is further coupled to a gesture recognition device and receives the gesture information provided by the gesture recognition device. The memory stores a program, and the processor runs the program to cause the information processing device to perform any of the foregoing human-machine interaction methods according to the first aspect of the invention.
According to a fourth aspect of the invention, there is provided a computer program which, when run by the processor of an information processing device, causes the information processing device to perform one of the foregoing human-machine interaction methods according to the first aspect of the invention.
Brief description of the drawings
The invention, its preferred modes of use, and its further objects and advantages will be best understood by reference to the following detailed description of illustrative embodiments when read together with the accompanying drawings, in which:
Fig. 1 illustrates a "confirm" control of a two-dimensional graphical user interface in the prior art;
Fig. 2 is a structural diagram of a gesture recognition system in the prior art;
Fig. 3 is a block diagram of a human-machine interaction system based on gesture recognition according to an embodiment of the invention;
Fig. 4 is a schematic diagram of a gesture-recognition-based control in a two-dimensional user interface according to an embodiment of the invention;
Figs. 5A-5D are schematic diagrams of the various states of a gesture-recognition-based control in a two-dimensional user interface according to an embodiment of the invention;
Fig. 6 is a flow chart of a human-machine interaction method based on gesture recognition in a two-dimensional user interface according to an embodiment of the invention;
Fig. 7 is a schematic diagram of a gesture-recognition-based control in a three-dimensional user interface according to an embodiment of the invention;
Figs. 8A-8D are schematic diagrams of the various states of a gesture-recognition-based control in a three-dimensional user interface according to an embodiment of the invention; and
Fig. 9 is a block diagram of an information processing device implementing an embodiment of the invention.
Detailed description of the embodiments
Embodiments of the invention are described in detail below, with examples shown in the accompanying drawings, where throughout the same or similar labels denote the same or similar elements or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary and are intended only to explain the invention; they are not to be construed as limiting the invention. On the contrary, the embodiments of the invention include all changes, modifications, and equivalents falling within the spirit and scope of the appended claims.
In the description of the invention, it should be understood that the terms "first", "second", etc. are used for descriptive purposes only and are not to be understood as indicating or implying relative importance. It should also be noted that, unless otherwise expressly specified and limited, the terms "connected" and "connection" are to be interpreted broadly: for example, as a fixed connection, a detachable connection, or an integral connection; as a mechanical connection or an electrical connection; as a direct connection or an indirect connection through an intermediary. Persons of ordinary skill in the art can understand the specific meanings of the above terms in the invention according to the specific circumstances. In addition, in the description of the invention, unless otherwise indicated, "multiple" means two or more.
Any process or method description in a flow chart, or otherwise described herein, is to be understood as representing a module, fragment, or portion of code including one or more executable instructions for realizing specific logical functions or steps of the process; and the scope of the preferred embodiments of the invention includes other realizations in which functions may be performed out of the order shown or discussed, including substantially concurrently or in the reverse order according to the functions involved, as will be understood by persons skilled in the field to which the embodiments of the invention pertain.
Fig. 3 is a block diagram of the human-machine interaction system based on gesture recognition according to an embodiment of the invention. The human-machine interaction system according to the embodiment includes a gesture input device 310, an information processing device 320, and a display device 330, coupled to one another. In one example, the gesture input device 310 captures images of the user's hand and sends the acquired images to the information processing device for processing. The information processing device 320 receives the hand images sent by the gesture input device and recognizes the gesture information of the user's hand in the images. The information processing device 320 also presents graphics and/or images to the user through the display device 330, for example drawing a virtual image of the user's hand on the display device 330. The information processing device may be, for example, a computer, a mobile phone, or a dedicated gesture recognition device. The display device 330 may be, for example, a flat-panel display, a projector, or a head-mounted display.
In another example, the gesture input device 310 perceives the position and/or posture of the user's hand, recognizes the gesture information of the user's hand, and sends the user hand information to the information processing device 320. The information processing device 320 treats the user hand information provided by the gesture input device 310 as input provided by the user, and provides output to the user through the display device 330, thereby realizing human-computer interaction. Obviously, the information processing device 320 can also interact with the user through sound, mechanical feedback, and other forms.
As still another example, the gesture input device 310 may be, for example, a depth sensor, a distance sensor, a VR controller (e.g. Oculus Rift Touch), a gamepad, a data glove (e.g. CyberGlove), a motion capture system (e.g. OptiTracker), or a gyroscope, for perceiving the position and/or posture of the user's hand.
From the gestures and/or actions the user makes in the real world, gesture information (i) based on a virtual coordinate system is extracted. The gesture information (i) may be a vector, formalized as i = {c, palm, thumb, index, mid, ring, little}, where c represents the shape of the whole hand, for example a fist, five fingers open, or a victory gesture; palm represents the position information of the palm; and thumb, index, mid, ring, and little represent the position and/or orientation information of the thumb, index finger, middle finger, ring finger, and little finger respectively. The virtual coordinate system expresses position information in the virtual world constructed by the information processing device 320, while a real coordinate system expresses the position information of objects or space in the real world. The virtual world constructed by the information processing device 320 may be, for example, the two-dimensional space of a two-dimensional graphical user interface, a three-dimensional space, or a virtual reality scene merging the user. The real coordinate system and the virtual coordinate system may each be two-dimensional or three-dimensional. The gesture information (i) may be updated at a fixed frequency or time interval, or whenever the position and/or posture of the user's hand changes.
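The gesture-information vector described above can be sketched as a simple data structure. This is only an illustrative rendering of i = {c, palm, thumb, index, mid, ring, little}; the field names and the use of 2-D tuples are assumptions, not the patent's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Finger:
    # position in the virtual coordinate system; orientation as a unit vector
    position: tuple = (0.0, 0.0)
    orientation: tuple = (0.0, 1.0)

@dataclass
class GestureInfo:
    """Gesture information (i): overall hand shape plus per-finger data."""
    c: str = "open"            # hand shape, e.g. "fist", "open", "victory"
    palm: tuple = (0.0, 0.0)   # palm position
    thumb: Finger = field(default_factory=Finger)
    index: Finger = field(default_factory=Finger)
    mid: Finger = field(default_factory=Finger)
    ring: Finger = field(default_factory=Finger)
    little: Finger = field(default_factory=Finger)

# example: a fist with the index finger tracked at (120, 80)
g = GestureInfo(c="fist", index=Finger(position=(120.0, 80.0)))
```

Such a record would be refreshed at the update frequency mentioned above, or on each change of hand position/posture.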
On the user interface, a cursor can be displayed according to the gesture information (i) to provide visual feedback to the user. The position of the cursor on the graphical interface is expressed as a function of the gesture information (i), for example func_a(i). Those skilled in the art will understand that the function func_a differs according to different application scenarios or settings.
For example, in a two-dimensional user interface, the position at which to draw the cursor is calculated by formula (1):
func_a(i) = c*0 + palm*0 + index.position*0.5 + mid*0 + little*0 (1)
In formula (1), index.position is the position of the user's index finger. Thus, by formula (1), the cursor position on the user interface depends only on the user's index finger position, and the distance the cursor moves on the user interface is half the distance the user's index finger moves.
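A minimal sketch of formula (1): since every term except index.position carries a zero weight, the mapping reduces to scaling the index finger position by 0.5.

```python
def func_a(index_position):
    """Cursor position per formula (1): only the index finger matters,
    and on-screen displacement is half the finger's displacement."""
    x, y = index_position
    return (0.5 * x, 0.5 * y)
```

For example, moving the index finger 200 units to the right moves the cursor 100 units.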
The cursor may have a single style, such as the shape of a hand, or a variety of styles corresponding to different hand shapes.
With reference to Figs. 4-6, the following explains how a control is operated by gestures in a two-dimensional user interface.
Fig. 4 is a schematic diagram of the gesture-recognition-based control in a two-dimensional user interface according to an embodiment of the invention. Referring to Fig. 4, the control includes an entrance area and a rail area. The entrance area in Fig. 4 is a line segment; in another example, the entrance area may be a curve. The entrance area divides the two-dimensional plane of the user interface into two parts: the side including the rail area is called the rail side, and the other side is called the free side. In Fig. 4, the rail area is a rectangle. Obviously, in other examples, the rail area may have other shapes, such as a line segment, a triangle, or an ellipse. The entrance area and the rail area may be drawn in the two-dimensional user interface to show the user where the control is located. In another example, the entrance area and/or the rail area may be hidden so as not to affect the content displayed in the user interface. The rail area adjoins or is adjacent to the entrance area. The part of the rail area close to the entrance area is called the inlet end, and the part of the rail area away from the entrance area is called the outlet end. In yet another example, to make the user's mid-air hand operation easier to recognize, the entrance area and rail area of the control are rendered as a slot or bell-mouth shape, making it easy to guide the user to move the cursor into the rail area by gesture.
In the example of Fig. 4, the rail area further includes a slider. The slider can move along the rail area.
Figs. 5A-5D are schematic diagrams of the various states of the gesture-recognition-based control in a two-dimensional user interface according to an embodiment of the invention.
The gesture-recognition-based control has an activated state and an inactive state. The inactive state is the initial state of the control. Fig. 5A illustrates the control in the inactive state, together with the cursor associated with the gesture information (i). Notice that in Fig. 5A no slider is drawn in the rail area, or the slider is hidden. The absence of a slider in the rail area can serve as a prompt to the user that the control is in the inactive state.
When the user's gesture causes the cursor to enter the rail side from the free side through the entrance area, the control transitions from the inactive state to the activated state. The control receives events indicating the gesture information (i) and recognizes the change of cursor position caused by the gesture information (i). When the gesture information (i) causes the cursor position to enter the rail side from the free side of the control via the entrance area, the control state is changed to the activated state, and the control is drawn in its activated state.
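The activation transition just described can be sketched as a small state machine. The class and method names are illustrative assumptions; the condition encodes both the free-side-to-rail-side move through the entrance (which activates) and the bypass case described later (which does not).

```python
class SlideControl:
    """Sketch of the control's state logic (assumed structure)."""

    def __init__(self):
        self.state = "inactive"   # the initial state of the control
        self.slider_visible = False

    def on_cursor_moved(self, prev_side, new_side, via_entrance):
        """prev_side/new_side: 'free' or 'rail'; via_entrance: whether the
        cursor crossed through the entrance area on this move."""
        # Only a free -> rail move through the entrance activates the control;
        # entering the rail side around the entrance, or from the rail side,
        # leaves it inactive.
        if (self.state == "inactive" and prev_side == "free"
                and new_side == "rail" and via_entrance):
            self.state = "active"
            self.slider_visible = True
```

A move that bypasses the entrance area keeps the control inactive and the slider hidden.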
Alternatively, when the user's gesture causes the cursor to enter the rail side from the free side through the entrance area, a slider is drawn in the rail area and the cursor is still displayed; the user moves the cursor onto the slider by gesture and, through a "grasp" action, fixes the cursor on the slider, after which the slider follows the cursor. The "grasp" action is not essential: in one embodiment, when the control is in the activated state, the slider follows the cursor, or the control changes the slider position based on the gesture information (i). Still alternatively, as the slider moves, a specified sound is played, the visual appearance changes, and/or haptic feedback is provided to the user. For example, as the slider moves toward the outlet end, the volume and/or frequency of the played sound gradually rises, and as the slider moves away from the outlet end, the volume and/or frequency of the played sound gradually falls.
Fig. 5B shows the control in the activated state. In the activated state, the rail area includes a slider. The slider is associated with the gesture information (i) as a prompt to the user of the hand position, and the cursor is hidden. As an example, the position at which to draw the slider is determined according to the rule for determining the cursor drawing position. Furthermore, the slider drawing position is confined to the rail area, so that the slider appears to move along the rail. In the example of Fig. 5B, the rail area includes a center line; the projected position of the (undrawn) cursor on the center line is used as the position at which to draw the slider in the rail area. Alternatively, the appearance of the control is changed to prompt the user that the control has been activated, for example by drawing a shadow along the edge of the control, changing the color of the control area, and/or displaying specified text. Still alternatively, haptic feedback is provided to the user and/or a specified sound is played to prompt the user that the control has been activated.
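Projecting the hidden cursor onto the rail's center line, with the result confined to the rail, can be sketched as follows. The clamping to the segment reflects the statement above that the slider drawing position is limited to the rail area; the function name is an assumption.

```python
def slider_position(cursor, line_start, line_end):
    """Project the (hidden) cursor onto the rail center line segment and
    clamp to the segment; the returned point is where the slider is drawn."""
    ax, ay = line_start
    bx, by = line_end
    dx, dy = bx - ax, by - ay
    length_sq = dx * dx + dy * dy
    # parameter t of the orthogonal projection onto the infinite line
    t = ((cursor[0] - ax) * dx + (cursor[1] - ay) * dy) / length_sq
    # clamp so the drawn slider never leaves the rail area
    t = max(0.0, min(1.0, t))
    return (ax + t * dx, ay + t * dy)
```

With a horizontal center line from (0, 0) to (10, 0), a cursor at (5, 3) draws the slider at (5, 0), and a cursor past the end draws it at the rail's boundary.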
If the user's gesture causes the cursor to enter the guide rail area from the rail side, or causes the cursor to bypass the entrance area and enter the rail side, the control in the inactive state is not changed to the activated state; no slider is shown, and the cursor remains displayed.
In an embodiment according to the present invention, if the user, by gesture, moves the slider out of the guide rail through the outlet end of the guide rail area, this represents a "confirm" command issued by the user to the control; and if the user, by gesture, moves the slider out of the guide rail through the inlet end of the guide rail area, this represents a "cancel" command issued by the user to the control.
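The confirm/cancel distinction reduces to a check of which end the slider crossed when it left the rail. A sketch in terms of the slider's progress parameter along the rail (the event names follow the patent; the parameterization is an assumption):

```python
def exit_event(t):
    """Classify the slider's situation given its unclamped progress t
    along the rail's center line, where t in [0, 1] is on the rail,
    t < 0 is past the inlet end, and t > 1 is past the outlet end."""
    if t > 1.0:
        return "confirm"   # slider left through the outlet end
    if t < 0.0:
        return "cancel"    # slider left through the inlet end
    return None            # slider still inside the guide rail area
```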
Fig. 5C illustrates the control receiving a "confirm" command. For a control in the activated state, as an example, the user moves the index finger to the right; the position of the slider (shown in dotted outline in Fig. 5C) correspondingly moves to the right along the guide rail area, thereby providing visual feedback that lets the user know whether the control has correctly recognized his or her intention. When, following the movement of the user's index finger, the control detects that the slider has moved out of the guide rail area through its outlet end, the control generates an event representing the "confirm" command. By processing the event representing the "confirm" command, the operation associated with the control is acknowledged. As the slider leaves the guide rail area, the slider is hidden; for example, the slider is drawn only within the guide rail area, and the part of the slider outside the guide rail area is hidden. Alternatively, the appearance of the control is changed to prompt the user that the control has recognized the user's intention and generated the "confirm" event; for example, the control area flashes, the color of the control area changes, and/or a specified text is displayed. Still alternatively, haptic feedback is provided to the user and/or a specified sound is played to prompt the user that the control has recognized the user's intention. Further, as the slider leaves the guide rail area, the state of the control transitions to the inactive state, and the cursor is drawn to track the user's gesture.
Fig. 5D illustrates the control receiving a "cancel" command. For a control in the activated state, as an example, the user moves the index finger to the left; the position of the slider (shown in dotted outline in Fig. 5D) correspondingly moves to the left along the guide rail area, thereby providing visual feedback that lets the user know whether the control has correctly recognized his or her intention. When, following the movement of the user's index finger, the control detects that the slider has moved out of the guide rail area through its inlet end, the control generates an event representing the "cancel" command, or generates no event at all, indicating that the user has not issued a "confirm" command, or that the user has abandoned or withdrawn the original attempt. By processing the event representing the "cancel" command, the operation associated with the control is cancelled or ignored. As the slider leaves the guide rail area, the slider is hidden. Alternatively, the appearance of the control is changed and/or haptic feedback is provided to the user, to prompt the user that the control has recognized the user's intention and generated the "cancel" event. Further, as the slider leaves the guide rail area, the state of the control transitions to the inactive state, and the cursor is drawn to track the user's gesture.
In another embodiment according to the present invention, the guide rail area has a cross shape. In the activated state, when the slider leaves the guide rail from the right or from above, the control generates a "confirm" event; and when the slider leaves the guide rail from the left or from below, the control generates a "cancel" event.
One of ordinary skill in the art will realize that the guide rail area may have multiple outlets. When the slider leaves through some of the outlets, the control generates a "confirm" event; when the slider leaves through the others, the control generates a "cancel" event. To inform the user of the different meanings of the outlets, a different indication of the meaning of each outlet may be provided to the user in each outlet direction by visual, audio and/or haptic feedback.
In still another embodiment according to the present invention, the guide rail area has a cross shape with multiple outlets; each branch of the cross region corresponds to one outlet, and each outlet indicates a different meaning, or the multiple outlets indicate multiple meanings. For example, when the slider leaves through the left outlet, the control generates a "cancel" event, indicating that the user abandons the attempt to, say, play music; when the slider leaves through the right outlet, the control generates a "mute" event, indicating that the user wishes to immediately reduce the volume of the audio output of a certain application to 0; when the slider leaves through the upper outlet, the control generates a "set to high sampling rate" event; and when the slider leaves through the lower outlet, the control generates a "set to low sampling rate" event. In response to receiving the events output by the control indicating the different meanings or different commands, the application program performs the corresponding processing.
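The cross-shaped, multi-outlet variant amounts to a lookup from exit direction to event. A sketch under the assumption that the four branches are labelled by direction; the event names follow the example above, but which event sits on which branch is illustrative:

```python
# Hypothetical assignment of the four cross branches to the events
# named in the example; the exact direction of each is an assumption.
OUTLET_EVENTS = {
    "left": "cancel",
    "right": "mute",
    "up": "set_high_sampling_rate",
    "down": "set_low_sampling_rate",
}

def cross_exit_event(direction):
    """Return the event generated when the slider leaves through the
    outlet in the given direction, or None if that direction is not
    an outlet of this control."""
    return OUTLET_EVENTS.get(direction)
```

An application would register a handler per event name and dispatch on the returned value.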
Fig. 6 is a flow chart of the gesture-recognition-based human-machine interaction method in a two-dimensional user interface according to an embodiment of the present invention. To use the control according to an embodiment of the present invention, the control is initialized (610). Control initialization includes drawing the control on the user interface, for example drawing the control shown in Fig. 5A, and configuring the control to receive the gesture information (i). Optionally, a cursor is also drawn on the user interface, with its drawing position associated with the gesture information (i). In another example, the cursor is drawn on the user interface by the program to which the control belongs, or by another program. The control receives the gesture information (i) and obtains the cursor position from it. In response to the cursor moving from the free side through the entrance area to the rail side, the control enters the activated state (620). Fig. 5B shows the control in the activated state. Optionally, when the control enters the activated state, the appearance of the control is also changed, a specified sound is produced, and/or haptic feedback is provided, to prompt the user that the control has entered the activated state. The control also draws the slider in the guide rail area. The drawing position of the slider is confined to the guide rail area, so that the slider appears to move along the guide rail, and the slider is made to follow the user's gesture. As an example, the position at which the slider is drawn is determined according to the rule for determining the cursor drawing position. Further, the projection of the cursor position onto the center line of the guide rail area is used as the drawing position of the slider.
The control obtains the slider position from the gesture information (i), and detects whether the slider has moved out of the guide rail area through one of its ends (640). Referring to Fig. 5C, when the control detects that the slider has moved out of the guide rail area through the outlet end, the control generates a first event (650); as an example, the first event may be a "confirm" event, a "mute" event, or the like. And referring to Fig. 5D, when the control detects that the slider has moved out of the guide rail area through the inlet end, the control generates a second event (660). The second event may be a "cancel" event, a "set to high sampling rate" event, or the like.
In step 650, upon generating the first event, the control enters the inactive state. Optionally, the appearance of the control is changed, a specified sound is produced, and/or haptic feedback is provided, to prompt the user that the control has recognized the user's intention and generated the first event. Also optionally, the cursor is drawn to track the user's gesture.
In step 660, upon generating the second event, the control enters the inactive state. Optionally, the appearance of the control is changed, a specified sound is produced, and/or haptic feedback is provided, to prompt the user that the control has recognized the user's intention and generated the second event. Also optionally, the cursor is drawn to track the user's gesture.
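The flow of Fig. 6 (steps 610-660) can be sketched as a small state machine driven by successive cursor positions. The 1-D geometry below (free side at x < 0, entrance crossing at x = 0, rail spanning 0 to 1 with the outlet end at x = 1) is a simplifying assumption for illustration:

```python
class RailControl:
    """Minimal 1-D sketch of the Fig. 6 flow: the free side is x < 0,
    the entrance area is the crossing at x = 0, and the guide rail
    area spans 0 <= x <= 1 with the outlet end at x = 1."""

    def __init__(self):
        self.active = False        # step 610: control starts inactive
        self.last_x = None
        self.events = []

    def update(self, x):
        if not self.active:
            # Step 620: activate only on a free-side -> rail-side
            # crossing through the entrance (not on rail-side entry).
            if self.last_x is not None and self.last_x < 0 <= x <= 1:
                self.active = True
        else:
            # Steps 640-660: detect which end the slider leaves from.
            if x > 1:
                self.events.append("first_event")   # confirm (650)
                self.active = False
            elif x < 0:
                self.events.append("second_event")  # cancel (660)
                self.active = False
        self.last_x = x
```

A cursor that simply appears on the rail side without crossing the entrance never activates the control, matching the behavior described for Fig. 5 above.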
In a further embodiment, the user creates a space in the virtual world according to an embodiment of the present invention, and/or sets or changes the position of the control. The user can place the control at a position convenient to operate; for example, at the position of the cursor when the user's arm is fully extended to the side. This makes it convenient for the user to issue commands such as "confirm"/"cancel", without affecting the operation of other objects in the virtual world.
With reference to Figs. 7-8, an embodiment of the gesture-recognition-based control according to the present invention in a three-dimensional user interface is described. Fig. 7 is a schematic diagram of the gesture-recognition-based control in a three-dimensional user interface according to an embodiment of the present invention. Referring to Fig. 7, the control comprises an entrance area and a guide rail area. The entrance area in Fig. 7 is a finite rectangular plane. In another example, the entrance area may be a region on a curved surface, or a region in a plane enclosed by a closed curve. In Fig. 7, the plane containing the entrance area divides the three-dimensional space of the user interface into two parts: the side containing the guide rail area is called the rail side, and the other side is called the free side. In Fig. 7 the guide rail area is a cuboid. Obviously, in other examples, the guide rail area may have other shapes, such as a cylinder, a sphere, an ellipsoid, and the like. The entrance area and the guide rail area may be drawn on the three-dimensional user interface, to indicate to the user where the control is located. The guide rail area may blend with objects of the three-dimensional user interface, for example a vase or a mailbox in the user interface. In another example, the entrance area and/or the guide rail area may be hidden, so as not to affect the content shown in the user interface. The guide rail area adjoins or is adjacent to the entrance area. The part of the guide rail area close to the entrance area is called the inlet end, and the part of the guide rail area away from the entrance area is called the outlet end.
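In three dimensions, deciding whether the cursor is on the free side or the rail side reduces to the sign of its distance from the entrance plane, and activation requires the cursor's path to pass through the finite entrance rectangle itself. A sketch assuming an axis-aligned rectangular entrance in the z = 0 plane (all coordinates and dimensions are illustrative):

```python
def side_of_entrance(point):
    """Return which side of the entrance plane (z = 0) a 3-D point
    lies on: 'free' for z < 0, 'rail' for z >= 0."""
    return "rail" if point[2] >= 0 else "free"

def crosses_entrance(p_from, p_to, width=2.0, height=1.0):
    """True if the segment p_from -> p_to passes from the free side to
    the rail side *through* the finite rectangular entrance
    |x| <= width/2, |y| <= height/2 in the z = 0 plane."""
    z0, z1 = p_from[2], p_to[2]
    if not (z0 < 0 <= z1):              # must go free side -> rail side
        return False
    t = -z0 / (z1 - z0)                 # intersection with the plane
    x = p_from[0] + t * (p_to[0] - p_from[0])
    y = p_from[1] + t * (p_to[1] - p_from[1])
    return abs(x) <= width / 2 and abs(y) <= height / 2
```

A segment that reaches the rail side outside the rectangle "bypasses the entrance area" in the sense used above and therefore does not activate the control.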
In the example of Fig. 7, the guide rail area further includes a slider, which can move along the guide rail area. Optionally, the control further includes a guide rail line; in Fig. 7, the guide rail line is the center line along the long-axis direction of the cuboid guide rail area. One endpoint of the guide rail line lies on the entrance area. The slider moves along the guide rail line.
Figs. 8A-8D are schematic diagrams of the various states of the gesture-recognition-based control in a three-dimensional user interface according to embodiments of the present invention.
Fig. 8A illustrates the control in the inactive state, together with the cursor associated with the gesture information (i). For a control in the inactive state, no slider is drawn in the guide rail area, or the slider is hidden. The absence of a slider in the guide rail area serves as a prompt informing the user that the control is in the inactive state.
When the user's gesture causes the cursor to enter the rail side from the free side through the entrance area, the control transitions from the inactive state to the activated state. The control receives events indicating the gesture information (i) and recognizes the resulting change of the cursor position. When the gesture information (i) causes the cursor position to enter the rail side from the free side of the control via the entrance area, the control state is changed to the activated state, and the control is drawn in its activated form.
Alternatively, when the user's gesture causes the cursor to enter the rail side from the free side through the entrance area, a slider is drawn in the guide rail area while the cursor continues to be displayed; the user moves the cursor onto the slider by gesture and, through a "grasp" action, fixes the cursor to the slider; thereafter the slider follows the cursor.
Fig. 8B illustrates the control in the activated state. In the activated state, the guide rail area contains a slider. The slider is associated with the gesture information (i) so as to prompt the user about the position of his or her hand, and the cursor is hidden. As an example, the position at which the slider is drawn is determined according to the rule for determining the cursor drawing position. Further, the drawing position of the slider is confined to the guide rail area, so that the slider appears to move along the guide rail. In the example of Fig. 8B, the guide rail area contains a guide rail line; the position of the (undrawn) cursor is projected onto the guide rail line to obtain the position at which the slider is drawn in the guide rail area. Alternatively, the appearance of the control is changed to prompt the user that the control has been activated and has entered the activated state; for example, a shadow is drawn along the edge of the control, the color of the control area is changed, a specified sound is played, and/or a specified text is displayed.
If the user's gesture causes the cursor to enter the guide rail area from the rail side, or causes the cursor to bypass the entrance area and enter the rail side, the control in the inactive state is not changed to the activated state; no slider is shown, and the cursor remains displayed.
In an embodiment according to the present invention, if the user, by gesture, moves the slider out of the guide rail through the outlet end of the guide rail area, this represents a "confirm" command issued by the user to the control; and if the user, by gesture, moves the slider out of the guide rail through the inlet end of the guide rail area, this represents a "cancel" command issued by the user to the control. Also optionally, if the user's gesture attempts to move the slider out of the guide rail area through a region other than the inlet end or the outlet end, the slider is confined within the guide rail area, and the control remains in the activated state.
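The confinement described here can be sketched in terms of the slider's projection onto the guide rail line: sideways motion only shifts the projection along the line, so the slider never leaves the rail laterally, and an event fires only when the projection passes an end. The rail endpoints below are illustrative assumptions:

```python
def update_slider_3d(cursor, inlet=(0.0, 0.0, 0.0), outlet=(0.0, 0.0, 1.0)):
    """Project a 3-D cursor onto the guide rail line inlet -> outlet.

    Returns (drawn_position, event): the slider position actually
    drawn (confined to the rail), and the command event if the
    projection passed an end of the rail, else None."""
    d = tuple(o - i for i, o in zip(inlet, outlet))
    length_sq = sum(c * c for c in d)
    t = sum((c - i) * dc for c, i, dc in zip(cursor, inlet, d)) / length_sq
    if t < 0:
        return inlet, "cancel"       # slider left via the inlet end
    if t > 1:
        return outlet, "confirm"     # slider left via the outlet end
    pos = tuple(i + t * dc for i, dc in zip(inlet, d))
    return pos, None
```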
Fig. 8C illustrates the control receiving a "confirm" command. For a control in the activated state, as an example, the user moves the index finger to the right; the position of the slider (shown in dotted outline in Fig. 8C) correspondingly moves to the right along the guide rail area, thereby providing visual feedback that lets the user know whether the control has correctly recognized his or her intention. When, following the movement of the user's index finger, the control detects that the slider has moved out of the guide rail area through its outlet end, the control generates an event representing the "confirm" command. By processing the event representing the "confirm" command, the operation associated with the control is acknowledged. As the slider leaves the guide rail area, the slider is hidden. Further, as the slider leaves the guide rail area, the state of the control transitions to the inactive state, and the cursor is drawn to track the user's gesture.
Fig. 8D illustrates the control receiving a "cancel" command. For a control in the activated state, as an example, the user moves the index finger to the left; the position of the slider (shown in dotted outline in Fig. 8D) correspondingly moves to the left along the guide rail area, thereby providing visual feedback that lets the user know whether the control has correctly recognized his or her intention. When, following the movement of the user's index finger, the control detects that the slider has moved out of the guide rail area through its inlet end, the control generates an event representing the "cancel" command, or generates no event at all, indicating that the user has not issued a "confirm" command, or that the user has abandoned or withdrawn the original attempt. By processing the event representing the "cancel" command, the operation associated with the control is cancelled or ignored. As the slider leaves the guide rail area, the slider is hidden. Further, as the slider leaves the guide rail area, the state of the control transitions to the inactive state, and the cursor is drawn to track the user's gesture.
One of ordinary skill in the art will realize that the guide rail area may have multiple outlets. When the slider leaves through some of the outlets, the control generates a "confirm" event; when the slider leaves through the others, the control generates a "cancel" event. To inform the user of the different meanings of the outlets, a different indication of the meaning of each outlet may be provided to the user in each outlet direction by visual, audio and/or haptic feedback.
In still another embodiment according to the present invention, the guide rail area has multiple outlets, each outlet indicating a different meaning, or the multiple outlets indicating multiple meanings. For example, when the slider leaves through the first outlet, the control generates a "cancel" event; when the slider leaves through the second outlet, the control generates a "mute" event; when the slider leaves through the third outlet, the control generates a "set to high sampling rate" event; and when the slider leaves through the fourth outlet, the control generates a "set to low sampling rate" event. In response to receiving the events output by the control indicating the different meanings or different commands, the application program performs the corresponding processing.
In another embodiment of the present invention, the control is displayed in the three-dimensional user interface as follows. Initially, when the control is in the inactive state, the cursor is displayed, the entrance area is hidden, and the guide rail area is shown. When the control transitions from the inactive state to the activated state, the gesture cursor is faded, the slider is shown on the guide rail area, and the guide rail area and/or the guide rail line are displayed. At the moment the control enters the activated state, a special effect is triggered, comprising: gradually fading out the cursor, and displaying the guide rail area and/or the guide rail line. When the user confirms or cancels the operation associated with the current control, another special effect is triggered, comprising: highlighting the slider and then gradually fading it out until it disappears, gradually restoring the display of the cursor, and gradually reverting the guide rail area and/or the guide rail line segment to their appearance when the control is in the inactive state.
In another embodiment according to the present invention, how a user operates, by gesture, the control provided by an embodiment of the present invention in an application is described. (1) The control in the inactive state and the cursor associated with the gesture information (i) are displayed in the user interface of the display device. (2) The user changes the gesture or moves the hand, observing the changes of the control and the cursor on the display device. By changing the gesture or moving the hand, the user steers the cursor from the free side through the entrance area into the guide rail area, and the control is activated; upon activation, the user interface prompts the user that the control has been activated. (3) With the control in the activated state, the user controls the slider by changing the gesture or moving the hand, so that the slider moves along the guide rail area. To perform a confirm operation, the user moves the slider out through the outlet end of the guide rail area; to perform a cancel operation, the user moves the slider out through the inlet end of the guide rail area. Optionally, when the confirm or cancel operation is performed, the user interface prompts the user with the operation performed.
Fig. 9 is a block diagram of an information processing device implementing an embodiment of the present invention. In an embodiment according to the present invention, the information processing device 900 generates the control on the user interface, recognizes the user's gesture information (i) or receives the gesture information (i) provided by a gesture input/gesture recognition device, recognizes the user's command, and provides feedback to the user so as to interact with the user. The information processing device 900 shown in Fig. 9 is a computer. The computer is only one example of a suitable computing environment, and is not intended to imply any limitation on the use or scope of functionality of the present invention. Nor should the information processing device shown in Fig. 9 be interpreted as having any dependency or requirement relating to any one of the illustrated components or combinations of components.
The information processing device 900 includes a memory 912, one or more processors 914, one or more presentation components 916, I/O components 920 and a power supply 922, coupled directly or indirectly to a bus 910. The bus 910 may represent one or more kinds of buses (such as an address bus, a data bus, or a combination thereof). In practice, the boundaries between the various components are not necessarily as depicted in Fig. 9; for example, a presentation component such as a display device may be regarded as an I/O component 920, and a processor may have its own memory. The inventors recognize that this is the nature of the art, and reaffirm that the diagram of Fig. 9 merely illustrates an exemplary computer system that can be used in connection with one or more embodiments of the present invention.
The information processing device 900 generally includes a variety of memories 912. By way of example and not limitation, the memory 912 may include: random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical or holographic media, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. The computer storage medium may be non-volatile.
The information processing device 900 includes one or more processors 914, which read data from various entities such as the bus 910, the memory 912 or the I/O components 920. The one or more presentation components 916 present data indications to the user or to other devices. Exemplary presentation components 916 include a display device, a loudspeaker, a printing component, a vibration component, a flat-panel display, a projector, a head-mounted display, and the like. The presentation components 916 may also be I/O ports used to couple a display device, a loudspeaker, a printing component, a vibration component, a flat-panel display, a projector, a head-mounted display, and the like. Exemplary I/O components 920 include a camera, a microphone, a joystick, a game pad, a satellite dish antenna, a scanner, a printer, a wireless device, and the like.
The control based on gesture recognition according to the present invention may also be implemented in a gesture recognition device or a gesture input device. The gesture recognition device or gesture input device may be integrated into input devices such as a keyboard, a mouse or a remote controller.
Although embodiments of the present invention have been shown and described, those of ordinary skill in the art will understand that various changes, modifications, replacements and variations can be made to these embodiments without departing from the principles and spirit of the present invention, the scope of which is defined by the appended claims and their equivalents.

Claims (10)

  1. A human-machine interaction method based on gesture recognition, wherein a control recognizes a user's command based on the user's gesture information, the control comprising an entrance area and a guide rail area, the entrance area dividing a user interface into a first part and a second part, the first part not containing the guide rail area and the second part containing the guide rail area; the method comprising:
    in response to a cursor moving from the first part of the user interface through the entrance area to the second part of the user interface, the control entering an activated state, and a slider being displayed in the guide rail area of the control;
    in response to the slider moving out of the guide rail area through a first end of the guide rail area, the control generating a first event.
  2. The method according to claim 1, further comprising:
    in response to the slider moving out of the guide rail area through a second end of the guide rail area, the control generating a second event.
  3. The method according to one of claims 1-2, further comprising:
    in response to the slider moving out of the guide rail area through the first end or the second end of the guide rail area, the control entering an inactive state.
  4. The method according to one of claims 1-3, wherein
    a cursor is displayed on the user interface according to the gesture information, the gesture information indicating the position and/or posture of the user's hand, extracted from hand images of the user captured by an image acquisition device.
  5. The method according to one of claims 1-4, further comprising:
    in response to the control entering the activated state, hiding the cursor on the user interface and changing the appearance of the control, playing a specified sound, displaying a specified text, and/or providing haptic feedback, to prompt the user that the control has entered the activated state.
  6. The method according to one of claims 1-5, further comprising:
    in response to the control entering the inactive state, displaying the cursor on the user interface at the position indicated by the gesture information, playing a specified sound, displaying a specified text, and/or providing haptic feedback.
  7. The method according to one of claims 1-6, further comprising:
    in the activated state of the control, in response to a "grasp" action indicated by the gesture information, fixing the cursor on the slider, and drawing the slider according to the gesture information.
  8. The method according to one of claims 1-7, wherein the guide rail area has a plurality of first ends.
  9. A human-machine interaction apparatus based on gesture recognition, wherein a control recognizes a user's command based on the user's gesture information, the control comprising an entrance area and a guide rail area, the entrance area dividing a user interface into a first part and a second part, the first part not containing the guide rail area and the second part containing the guide rail area; the apparatus comprising:
    an activation module, configured to, in response to a cursor moving from the first part of the user interface through the entrance area to the second part of the user interface, put the control into an activated state and display a slider in the guide rail area of the control;
    an event generation module, configured to, in response to the slider moving out of the guide rail area through a first end of the guide rail area, cause the control to generate a first event.
  10. An information processing device, comprising a processor, a memory and a display device, the information processing device further being coupled to a gesture recognition device and receiving gesture information provided by the gesture recognition device;
    the memory storing a program which, when run by the processor, causes the information processing device to perform the method according to one of claims 1-8.
CN201610878812.5A 2016-10-08 2016-10-08 Man-machine interaction method and system based on gesture recognition Active CN107918481B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610878812.5A CN107918481B (en) 2016-10-08 2016-10-08 Man-machine interaction method and system based on gesture recognition


Publications (2)

Publication Number Publication Date
CN107918481A true CN107918481A (en) 2018-04-17
CN107918481B CN107918481B (en) 2022-11-11

Family

ID=61891617

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610878812.5A Active CN107918481B (en) 2016-10-08 2016-10-08 Man-machine interaction method and system based on gesture recognition

Country Status (1)

Country Link
CN (1) CN107918481B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108564901A (en) * 2018-06-22 2018-09-21 南京达斯琪数字科技有限公司 A kind of real time human-machine interaction holographic display system
CN111176500A (en) * 2018-11-13 2020-05-19 青岛海尔洗衣机有限公司 Display control method of slider in touch screen
CN112394811A (en) * 2019-08-19 2021-02-23 华为技术有限公司 Interaction method for air-separating gesture and electronic equipment
CN113760137A (en) * 2021-06-16 2021-12-07 荣耀终端有限公司 Cursor display method and electronic equipment
WO2022127478A1 (en) * 2020-12-17 2022-06-23 Huawei Technologies Co., Ltd. Methods and systems for multi-precision discrete control of user interface control element of gesture-controlled device
US11681428B2 (en) 2020-09-11 2023-06-20 Tencent Technology (Shenzhen) Company Limited Location adjustment method and apparatus for control in application, device, and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090187824A1 (en) * 2008-01-21 2009-07-23 Microsoft Corporation Self-revelation aids for interfaces
WO2009108894A1 (en) * 2008-02-27 2009-09-03 Gesturetek, Inc. Enhanced input using recognized gestures
US20110310010A1 (en) * 2010-06-17 2011-12-22 Primesense Ltd. Gesture based user interface
KR101154137B1 (en) * 2010-12-17 2012-06-12 곽희수 User interface for controlling media using one finger gesture on touch pad
EP2474950A1 (en) * 2011-01-05 2012-07-11 Softkinetic Software Natural gesture based user interface methods and systems
WO2012159254A1 (en) * 2011-05-23 2012-11-29 Microsoft Corporation Invisible control
US20130179781A1 (en) * 2012-01-06 2013-07-11 Microsoft Corporation Edge-based hooking gestures for invoking user interfaces
EP2976690A1 (en) * 2013-03-21 2016-01-27 Sony Corporation Head-mounted device for user interactions in an amplified reality environment
KR20160046725A (en) * 2014-10-21 2016-04-29 삼성전자주식회사 Display device and method for controlling display device


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108564901A (en) * 2018-06-22 2018-09-21 Nanjing Dasiqi Digital Technology Co., Ltd. Real-time human-machine interaction holographic display system
CN111176500A (en) * 2018-11-13 2020-05-19 Qingdao Haier Washing Machine Co., Ltd. Display control method for a slider on a touch screen
CN112394811A (en) * 2019-08-19 2021-02-23 Huawei Technologies Co., Ltd. Mid-air gesture interaction method and electronic device
CN112394811B (en) * 2019-08-19 2023-12-08 Huawei Technologies Co., Ltd. Mid-air gesture interaction method and electronic device
US11681428B2 (en) 2020-09-11 2023-06-20 Tencent Technology (Shenzhen) Company Limited Location adjustment method and apparatus for control in application, device, and storage medium
WO2022127478A1 (en) * 2020-12-17 2022-06-23 Huawei Technologies Co., Ltd. Methods and systems for multi-precision discrete control of user interface control element of gesture-controlled device
CN113760137A (en) * 2021-06-16 2021-12-07 Honor Device Co., Ltd. Cursor display method and electronic device
CN113760137B (en) * 2021-06-16 2022-08-05 Honor Device Co., Ltd. Cursor display method and electronic device

Also Published As

Publication number Publication date
CN107918481B (en) 2022-11-11

Similar Documents

Publication Publication Date Title
CN107918481A (en) Man-machine interaction method and system based on gesture identification
US11048333B2 (en) System and method for close-range movement tracking
US9910498B2 (en) System and method for close-range movement tracking
Rautaray Real time hand gesture recognition system for dynamic applications
Karam A taxonomy of gestures in human computer interactions
LaViola 3d gestural interaction: The state of the field
TWI411935B (en) System and method for generating control instruction by identifying user posture captured by image pickup device
JP2013037675A5 (en)
US20140068526A1 (en) Method and apparatus for user interaction
CN103440033B (en) Method and apparatus for man-machine interaction based on bare hands and a monocular camera
CN107918482A (en) Method and system for avoiding overstimulation in immersive VR systems
CN111258420B (en) Information interaction method, head-mounted device and medium
CN108536273A (en) Gesture-based man-machine menu interaction method and system
CN107272890A (en) Man-machine interaction method and device based on gesture identification
CN107015743A (en) Floating key control method and terminal
CN112152894B (en) Household appliance control method based on virtual reality and virtual reality system
CN111240483B (en) Operation control method, head-mounted device, and medium
KR101525011B1 (en) Tangible virtual reality display control device based on NUI, and method thereof
JP7155613B2 (en) Information processing device and program
CN110717993A (en) Interaction method, system and medium of split type AR glasses system
CN115496850A (en) Household equipment control method, intelligent wearable equipment and readable storage medium
Spanogianopoulos et al. Human computer interaction using gestures for mobile devices and serious games: A review
CN109144598A (en) Electronics mask man-machine interaction method and system based on gesture
KR20200121304A (en) Information processing device, information processing method and program
Chen Universal Motion-based control and motion recognition

Legal Events

Date Code Title Description
PB01 Publication
TA01 Transfer of patent application right

Effective date of registration: 20191122

Address after: 300450 room 203b-3, building 3, No.4, Haitai development road 2, Huayuan Industrial Zone (outside the ring), Binhai high tech Zone, Binhai New Area, Tianjin

Applicant after: TIANJIN SHARPNOW TECHNOLOGY Co.,Ltd.

Address before: 518000 12th Floor, Block A2, Konka (Kang Jia) R&D Building, Keji South 12th Road, Shenzhen City, Guangdong Province

Applicant before: TIANJIN FENGSHI HUDONG TECHNOLOGY Co.,Ltd. SHENZHEN BRANCH

SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210118

Address after: 518000 B1018, 99 Dahe Road, Runcheng community, Guanhu street, Longhua District, Shenzhen City, Guangdong Province

Applicant after: Shenzhen laimile Intelligent Technology Co.,Ltd.

Address before: Room 203b-3, building 3, No.4, Haitai development road 2, Huayuan Industrial Zone, Binhai high tech Zone, Binhai New Area, Tianjin 300450

Applicant before: Tianjin Sharpnow Technology Co.,Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20210917

Address after: 518000 509, xintengda building, building M8, Maqueling Industrial Zone, Maling community, Yuehai street, Nanshan District, Shenzhen City, Guangdong Province

Applicant after: Shenzhen qiaoniu Technology Co.,Ltd.

Address before: 518000 B1018, 99 Dahe Road, Runcheng community, Guanhu street, Longhua District, Shenzhen City, Guangdong Province

Applicant before: Shenzhen laimile Intelligent Technology Co.,Ltd.

GR01 Patent grant