Prior-art VR systems concentrate on the product industrial design, interaction design, and content design that deliver an immersive experience. When experiencing stimulating VR content, for example watching a horror film or playing an intense game immersively, some users may develop psychological discomfort or resistance, and some users become dizzy after prolonged exposure to the 3D scenes of a VR system. Prior-art VR systems lack effective technical means to address these problems and ignore the discomfort the user is undergoing, which leads users to resent the virtual reality experience or causes overstimulation of the user's mind. The prior art also does not consider how to recognize that a user is being overstimulated during an immersive VR experience, nor how to respond to it.
According to a first aspect of the invention, there is provided a first human-computer interaction method based on gesture recognition, wherein a control recognizes the user's commands from the user's gesture information. The control comprises an entrance region and a rail region; the entrance region divides the user interface into a first part and a second part, the first part not containing the rail region and the second part containing the rail region. The method comprises: in response to the cursor moving from the first part of the user interface through the entrance region into the second part, the control enters an activated state and a slider is displayed in the rail region of the control; and in response to the slider leaving the rail region through the first end of the rail region, the control generates a first event.
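As a non-authoritative illustration of the first method, the activation and first-event logic can be sketched as follows. The class name, the event names, and the modelling of the entrance region as a vertical boundary line at `entrance_x` (free side to its left, rail side to its right) are assumptions made for this example only, not part of the specification.

```python
class SliderControl:
    """Sketch of the first-aspect control: entrance region + rail region."""

    def __init__(self, entrance_x):
        # The entrance region is modeled as a vertical line at entrance_x:
        # the first part (free side) is x < entrance_x,
        # the second part (rail side) is x >= entrance_x.
        self.entrance_x = entrance_x
        self.active = False          # controls start in the inactive state
        self.slider_visible = False
        self.events = []

    def on_cursor_move(self, old_x, new_x):
        # Cursor crosses from the first part to the second part through
        # the entrance region: activate and show the slider.
        if not self.active and old_x < self.entrance_x <= new_x:
            self.active = True
            self.slider_visible = True

    def on_slider_exit(self, end):
        # Slider leaving the rail region via the first end generates
        # the first event; the control then deactivates and hides it.
        if self.active and end == "first":
            self.events.append("first_event")
            self.active = False
            self.slider_visible = False
```

A typical interaction: the cursor crosses the entrance, the control activates, and dragging the slider out through the first end fires the event.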
According to the first human-computer interaction method based on gesture recognition of the first aspect of the invention, there is provided a second human-computer interaction method based on gesture recognition according to the first aspect of the invention, comprising: in response to the slider leaving the rail region through the second end of the rail region, the control generates a second event.
According to the foregoing human-computer interaction methods of the first aspect of the invention, there is provided a third human-computer interaction method based on gesture recognition according to the first aspect of the invention, comprising: in response to the slider leaving the rail region through the first end or the second end of the rail region, the control enters an inactive state.
According to the foregoing human-computer interaction methods of the first aspect of the invention, there is provided a fourth human-computer interaction method according to the first aspect of the invention, comprising: initializing the control, whereupon the control enters the inactive state.
According to the foregoing human-computer interaction methods of the first aspect of the invention, there is provided a fifth human-computer interaction method according to the first aspect of the invention, wherein the first end is the part of the rail region close to the entrance region.
According to the second human-computer interaction method based on gesture recognition of the first aspect of the invention, there is provided a sixth human-computer interaction method based on gesture recognition according to the first aspect of the invention, wherein the second end is the part of the rail region away from the entrance region.
According to the foregoing human-computer interaction methods of the first aspect of the invention, there is provided a seventh human-computer interaction method according to the first aspect of the invention, comprising: displaying a cursor on the user interface according to the gesture information, the gesture information indicating the position and/or posture of the user's hand extracted from hand images captured by an image acquisition device.
According to the foregoing human-computer interaction methods of the first aspect of the invention, there is provided an eighth human-computer interaction method according to the first aspect of the invention, comprising: drawing the slider in the rail region according to the projected position of the cursor on the center line of the rail region.
According to the foregoing human-computer interaction methods of the first aspect of the invention, there is provided a ninth human-computer interaction method according to the first aspect of the invention, comprising: in response to the control entering the activated state, hiding the cursor on the user interface and changing the appearance of the control, so as to prompt the user that the control has entered the activated state.
According to the foregoing human-computer interaction methods of the first aspect of the invention, there is provided a tenth human-computer interaction method according to the first aspect of the invention, comprising: in response to the control entering the inactive state, displaying the cursor on the user interface at the position indicated by the gesture information.
According to the foregoing human-computer interaction methods of the first aspect of the invention, there is provided an eleventh human-computer interaction method according to the first aspect of the invention, comprising: in response to the slider leaving the rail region, hiding the slider.
According to the foregoing human-computer interaction methods of the first aspect of the invention, there is provided a twelfth human-computer interaction method according to the first aspect of the invention, comprising: in the activated state of the control, in response to a "grasp" action indicated by the gesture information, fixing the cursor on the slider and dragging the slider according to the gesture information.
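The "grasp" behaviour of the twelfth method can be sketched as a single update step. The function name, the representation of the rail's center line as a horizontal line at `rail_y`, and the `"grasp"` hand-type label are assumptions of this example, not terms from the specification.

```python
def grasp_update(active, hand_type, gesture_pos, rail_y):
    """While the control is active and the gesture indicates a grasp,
    drag the slider to follow the gesture (projected onto the rail's
    center line) and keep the cursor fixed on the slider.

    Returns (cursor_pos, slider_pos); slider_pos is None when no drag
    is in progress.
    """
    if active and hand_type == "grasp":
        slider = (gesture_pos[0], rail_y)  # projection onto the center line
        cursor = slider                    # cursor pinned to the slider
        return cursor, slider
    # Otherwise the cursor simply tracks the gesture position.
    return gesture_pos, None
```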
According to the first human-computer interaction method based on gesture recognition of the first aspect of the invention, there is provided a thirteenth human-computer interaction method according to the first aspect of the invention, wherein the rail region has a plurality of first ends.
According to the second human-computer interaction method based on gesture recognition of the first aspect of the invention, there is provided a fourteenth human-computer interaction method according to the first aspect of the invention, wherein the rail region has a plurality of second ends.
According to a second aspect of the invention, there is provided a first human-computer interaction apparatus based on gesture recognition, wherein a control recognizes the user's commands from the user's gesture information. The control comprises an entrance region and a rail region; the entrance region divides the user interface into a first part and a second part, the first part not containing the rail region and the second part containing the rail region. The apparatus comprises: an activation module, for causing the control to enter the activated state and displaying a slider in the rail region of the control in response to the cursor moving from the first part of the user interface through the entrance region into the second part; and an event generation module, for causing the control to generate a first event in response to the slider leaving the rail region through the first end of the rail region.
According to the first human-computer interaction apparatus based on gesture recognition of the second aspect of the invention, there is provided a second human-computer interaction apparatus according to the second aspect of the invention, comprising: a second-event generation module, for causing the control to generate a second event in response to the slider leaving the rail region through the second end of the rail region.
According to the foregoing human-computer interaction apparatuses of the second aspect of the invention, there is provided a third human-computer interaction apparatus according to the second aspect of the invention, comprising: a deactivation module, for causing the control to enter the inactive state in response to the slider leaving the rail region through the first end or the second end of the rail region.
According to the foregoing human-computer interaction apparatuses of the second aspect of the invention, there is provided a fourth human-computer interaction apparatus according to the second aspect of the invention, comprising: an initialization module, for initializing the control, whereupon the control enters the inactive state.
According to the foregoing human-computer interaction apparatuses of the second aspect of the invention, there is provided a fifth human-computer interaction apparatus according to the second aspect of the invention, wherein the first end is the part of the rail region close to the entrance region.
According to the second human-computer interaction apparatus of the second aspect of the invention, there is provided a sixth human-computer interaction apparatus according to the second aspect of the invention, wherein the second end is the part of the rail region away from the entrance region.
According to the foregoing human-computer interaction apparatuses of the second aspect of the invention, there is provided a seventh human-computer interaction apparatus according to the second aspect of the invention, further comprising: a device for displaying a cursor on the user interface according to the gesture information, the gesture information indicating the position and/or posture of the user's hand extracted from hand images captured by an image acquisition device.
According to the foregoing human-computer interaction apparatuses of the second aspect of the invention, there is provided an eighth human-computer interaction apparatus according to the second aspect of the invention, comprising: a slider drawing module, for drawing the slider in the rail region according to the projected position of the cursor on the center line of the rail region.
According to the foregoing human-computer interaction apparatuses of the second aspect of the invention, there is provided a ninth human-computer interaction apparatus according to the second aspect of the invention, comprising: an appearance change module, for, in response to the control entering the activated state, hiding the cursor on the user interface, playing a designated sound, displaying designated text, and/or providing haptic feedback, so as to prompt the user that the control has entered the activated state.
According to the foregoing human-computer interaction apparatuses of the second aspect of the invention, there is provided a tenth human-computer interaction apparatus according to the second aspect of the invention, comprising: a cursor display module, for, in response to the control entering the inactive state, displaying the cursor on the user interface at the position indicated by the gesture information, playing a designated sound, displaying designated text, and/or providing haptic feedback.
According to the foregoing human-computer interaction apparatuses of the second aspect of the invention, there is provided an eleventh human-computer interaction apparatus according to the second aspect of the invention, comprising: a slider hiding module, for hiding the slider in response to the slider leaving the rail region.
According to the foregoing human-computer interaction apparatuses of the second aspect of the invention, there is provided a twelfth human-computer interaction apparatus according to the second aspect of the invention, comprising: a cursor fixing module, for, in the activated state of the control, in response to a "grasp" action indicated by the gesture information, fixing the cursor on the slider and dragging the slider according to the gesture information.
According to the first human-computer interaction apparatus of the second aspect of the invention, there is provided a thirteenth human-computer interaction apparatus according to the second aspect of the invention, wherein the rail region has a plurality of first ends.
According to the second human-computer interaction apparatus of the second aspect of the invention, there is provided a fourteenth human-computer interaction apparatus according to the second aspect of the invention, wherein the rail region has a plurality of second ends.
According to a third aspect of the invention, there is provided an information processing device comprising a processor, a memory, and a display device. The information processing device is further coupled to a gesture recognition device and receives the gesture information the gesture recognition device provides. The memory stores a program, and the processor runs the program so as to cause the information processing device to perform any of the foregoing human-computer interaction methods according to the first aspect of the invention.
According to a fourth aspect of the invention, there is provided a computer program which, when run by the processor of an information processing device, causes the information processing device to perform one of the foregoing human-computer interaction methods according to the first aspect of the invention.
According to a fifth aspect of the invention, there is provided a first method of avoiding user overstimulation in an immersive virtual reality system according to the fifth aspect of the invention, comprising: presenting a virtual reality scene; and, in response to recognizing that the user is experiencing discomfort, changing the virtual reality scene.
According to the first method of avoiding user overstimulation in an immersive virtual reality system of the fifth aspect of the invention, there is provided a second method of avoiding user overstimulation in an immersive virtual reality system according to the fifth aspect of the invention, wherein the user's discomfort is recognized by recognizing that the user's head is moving in a designated pattern.
According to the first or second method of avoiding user overstimulation in an immersive virtual reality system of the fifth aspect of the invention, there is provided a third method of avoiding user overstimulation in an immersive virtual reality system according to the fifth aspect of the invention, wherein the user's discomfort is recognized by recognizing that the user's hand is moving in a designated pattern.
According to one of the first to third methods of avoiding user overstimulation in an immersive virtual reality system of the fifth aspect of the invention, there is provided a fourth method of avoiding user overstimulation in an immersive virtual reality system according to the fifth aspect of the invention, wherein the user's discomfort is recognized by the captured audio exhibiting designated characteristics and/or containing speech that includes designated phrases or sentences.
According to one of the first to fourth methods of avoiding user overstimulation in an immersive virtual reality system of the fifth aspect of the invention, there is provided a fifth method of avoiding user overstimulation in an immersive virtual reality system according to the fifth aspect of the invention, wherein the user's discomfort is recognized by receiving an indication from the user via an interactive device.
According to one of the first to fifth methods of avoiding user overstimulation in an immersive virtual reality system of the fifth aspect of the invention, there is provided a sixth method of avoiding user overstimulation in an immersive virtual reality system according to the fifth aspect of the invention, wherein the virtual reality scene is changed by switching, attenuating, and/or closing the video content being played.
According to the sixth method of avoiding user overstimulation in an immersive virtual reality system of the fifth aspect of the invention, there is provided a seventh method of avoiding user overstimulation in an immersive virtual reality system according to the fifth aspect of the invention, wherein the virtual reality scene is changed by switching, attenuating, and/or closing the audio content being played.
According to the sixth or seventh method of avoiding user overstimulation in an immersive virtual reality system of the fifth aspect of the invention, there is provided an eighth method of avoiding user overstimulation in an immersive virtual reality system according to the fifth aspect of the invention, wherein the virtual reality scene is changed by popping up a prompt window.
According to the second method of avoiding user overstimulation in an immersive virtual reality system of the fifth aspect of the invention, there is provided a ninth method of avoiding user overstimulation in an immersive virtual reality system according to the fifth aspect of the invention, wherein the user's head moving in a designated pattern is recognized by detecting that the speed and/or acceleration of the user's head movement exceeds a threshold.
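The ninth method's threshold test can be sketched from sampled head positions. A one-dimensional position track, finite-difference estimates of speed and acceleration, and the specific threshold values are simplifying assumptions of this example only.

```python
def head_motion_exceeds(positions, dt, speed_thr, accel_thr):
    """Return True when estimated head speed or acceleration exceeds
    its threshold, i.e. the head moved in the designated pattern.

    positions: head position samples (1-D, for simplicity) taken at a
    fixed interval dt.
    """
    # Finite-difference speed between consecutive samples.
    speeds = [abs(b - a) / dt for a, b in zip(positions, positions[1:])]
    # Finite-difference acceleration between consecutive speed samples.
    accels = [abs(b - a) / dt for a, b in zip(speeds, speeds[1:])]
    return any(v > speed_thr for v in speeds) or any(a > accel_thr for a in accels)
```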
According to the third method of avoiding user overstimulation in an immersive virtual reality system of the fifth aspect of the invention, there is provided a tenth method of avoiding user overstimulation in an immersive virtual reality system according to the fifth aspect of the invention, wherein the user's hand moving in a designated pattern is recognized by detecting that the speed and/or acceleration of the user's hand movement exceeds a threshold, or that the distance of the hand from the head or eyes stays continuously below a threshold.
According to the fourth method of avoiding user overstimulation in an immersive virtual reality system of the fifth aspect of the invention, there is provided an eleventh method of avoiding user overstimulation in an immersive virtual reality system according to the fifth aspect of the invention, wherein the user's discomfort is recognized by detecting that the frequency and/or loudness of the captured audio exceeds a threshold, or by recognizing that words occurring in the audio belong to a designated set.
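The eleventh method's decision can be sketched as a disjunction of the three audio cues. All parameter names, units, and threshold values here are illustrative assumptions; the specification does not fix how the frequency or loudness features are computed.

```python
def audio_indicates_discomfort(peak_freq_hz, loudness_db, words,
                               freq_thr, loud_thr, stop_words):
    """Return True when captured audio suggests discomfort: its peak
    frequency or loudness exceeds a threshold, or any recognized word
    belongs to the designated set (e.g. 'stop', 'help')."""
    return (peak_freq_hz > freq_thr
            or loudness_db > loud_thr
            or any(w in stop_words for w in words))
```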
According to the fifth method of avoiding user overstimulation in an immersive virtual reality system of the fifth aspect of the invention, there is provided a twelfth method of avoiding user overstimulation in an immersive virtual reality system according to the fifth aspect of the invention, wherein the user's discomfort is recognized by recognizing that the user presses a designated button of the interactive device, that the user's grip force on the interactive device exceeds a threshold, and/or that the user performs a throwing and/or flinging action with the interactive device.
According to a sixth aspect of the invention, there is provided a first immersive virtual reality system according to the sixth aspect of the invention, comprising a computing unit, a display device, and a sensor module. The computing unit runs a virtual reality application to construct a virtual reality scene, and the sensor module perceives the state of the user. The computing unit also runs a program that recognizes, from the user state perceived by the sensor module, whether the user is experiencing discomfort and, in response to recognizing that the user is experiencing discomfort, changes the constructed virtual reality scene.
According to the first immersive virtual reality system of the sixth aspect of the invention, there is provided a second immersive virtual reality system according to the sixth aspect of the invention, wherein the computing unit changes the virtual reality scene constructed by the virtual reality application by instructing the virtual reality application.
According to the first immersive virtual reality system of the sixth aspect of the invention, there is provided a third immersive virtual reality system according to the sixth aspect of the invention, wherein the computing unit changes the virtual reality scene constructed by the virtual reality application by instructing the display device.
According to one of the first to third immersive virtual reality systems of the sixth aspect of the invention, there is provided a fourth immersive virtual reality system according to the sixth aspect of the invention, wherein the sensor module comprises a head posture acquisition device for capturing and outputting the posture of the user's head.
According to the fourth immersive virtual reality system of the sixth aspect of the invention, there is provided a fifth immersive virtual reality system according to the sixth aspect of the invention, wherein the sensor module comprises a gesture acquisition device for capturing and outputting the user's gestures.
According to the fifth immersive virtual reality system of the sixth aspect of the invention, there is provided a sixth immersive virtual reality system according to the sixth aspect of the invention, wherein the sensor module comprises an audio capture device for capturing the sounds and/or speech the user utters, and/or for recognizing the words and phrases in the speech.
According to the sixth immersive virtual reality system of the sixth aspect of the invention, there is provided a seventh immersive virtual reality system according to the sixth aspect of the invention, further comprising an interactive device for indicating a button press by the user, the user's grip force exceeding a threshold, speed, acceleration, and/or a throwing and/or flinging action the user performs with the interactive device.
According to the fourth immersive virtual reality system of the sixth aspect of the invention, there is provided an eighth immersive virtual reality system according to the sixth aspect of the invention, wherein the computing unit also runs a program that obtains the posture of the user's head through the sensor module, recognizes from the head posture that the user's head is moving in a designated pattern when the speed and/or acceleration of the head movement exceeds a threshold, and takes the head moving in the designated pattern as the basis for recognizing that the user is experiencing discomfort.
According to the fifth immersive virtual reality system of the sixth aspect of the invention, there is provided a ninth immersive virtual reality system according to the sixth aspect of the invention, wherein the computing unit also runs a program that obtains the posture of the user's hand through the sensor module, and recognizes from the hand posture that the user is experiencing discomfort when the speed and/or acceleration of the hand movement exceeds a threshold, or when the distance of the hand from the head stays continuously below a threshold.
According to the sixth immersive virtual reality system of the sixth aspect of the invention, there is provided a tenth immersive virtual reality system according to the sixth aspect of the invention, wherein the computing unit also runs a program that obtains, through the sensor module, the audio the user utters, and recognizes that the user is experiencing discomfort by detecting that the frequency and/or loudness of the captured audio exceeds a threshold, or by recognizing that words occurring in the audio belong to a designated set.
According to the seventh immersive virtual reality system of the sixth aspect of the invention, there is provided an eleventh immersive virtual reality system according to the sixth aspect of the invention, wherein the computing unit also runs a program that obtains an indication from the interactive device so as to recognize that the user is experiencing discomfort.
According to one of the first to eleventh immersive virtual reality systems of the sixth aspect of the invention, there is provided a twelfth immersive virtual reality system according to the sixth aspect of the invention, wherein the computing unit changes the constructed virtual reality scene by switching, attenuating, and/or closing the video content being played, and/or by switching, attenuating, and/or closing the audio content being played, and/or by popping up a prompt window.
According to a seventh aspect of the invention, there is provided an apparatus for avoiding user overstimulation in an immersive virtual reality system, comprising: a presentation module, for presenting a virtual reality scene; and a change module, for changing the virtual reality scene in response to recognizing that the user is experiencing discomfort.
According to an eighth aspect of the invention, there is provided an information processing device comprising a processor, a memory, and a display device. The information processing device is further coupled to a sensor module and receives the user state the sensor module perceives. The memory stores a program, and the processor runs the program so as to cause the information processing device to perform one of the methods of avoiding user overstimulation in an immersive virtual reality system provided according to the fifth aspect of the invention.
Embodiments
Embodiments of the present invention are described in detail below, and examples of the embodiments are shown in the drawings, in which identical or similar reference numerals throughout denote identical or similar elements, or elements having identical or similar functions. The embodiments described below with reference to the drawings are exemplary, intended only to explain the present invention, and are not to be construed as limiting the invention. On the contrary, the embodiments of the invention include all changes, modifications, and equivalents falling within the spirit and scope of the appended claims.
In the description of the present invention, it is to be understood that the terms "first", "second", and so on are used only for descriptive purposes and are not to be understood as indicating or implying relative importance. It should also be noted that, unless otherwise explicitly specified and limited, the terms "connected" and "connection" are to be understood broadly: a connection may, for example, be fixed, detachable, or integral; mechanical or electrical; direct, or indirect through an intermediary. For those of ordinary skill in the art, the specific meaning of the above terms in the present invention can be understood according to the specific circumstances. In addition, in the description of the present invention, unless otherwise indicated, "a plurality of" means two or more.
Any process or method described in a flowchart or otherwise herein is to be understood as representing a module, segment, or portion of code comprising one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present invention includes further implementations in which functions may be performed out of the order shown or discussed, including substantially concurrently or in reverse order depending on the functions involved, as will be understood by those skilled in the art to which the embodiments of the invention pertain.
Fig. 3 is a block diagram of a human-computer interaction system based on gesture recognition according to an embodiment of the present invention. The system comprises a gesture input device 310, an information processing device 320, and a display device 330 coupled to one another. In one example, the gesture input device 310 captures images of the user's hand and sends the acquired images to the information processing device for processing. The information processing device 320 receives the hand images sent by the gesture input device and recognizes the gesture information of the user's hand in the images. The information processing device 320 also presents graphics and/or images to the user through the display device 330, for example drawing a virtual image of the user's hand on the display device 330. The information processing device may be, for example, a computer, a mobile phone, or a dedicated gesture recognition device. The display device 330 may be, for example, a flat-panel display, a projector, or a head-mounted display.
In another example, the gesture input device 310 perceives the position and/or posture of the user's hand, recognizes the gesture information of the user's hand, and sends the user hand information to the information processing device 320. The information processing device 320 interprets the user hand information provided by the gesture input device 310 as input provided by the user, and provides output to the user through the display device 330, thereby realizing human-computer interaction. Obviously, the information processing device 320 may also interact with the user through sound, haptics, and other modalities.
As still another example, the gesture input device 310 may be, for instance, a depth sensor, a distance sensor, a VR controller (such as the Oculus Rift Touch), a gamepad, a data glove (such as the CyberGlove), a motion capture system (such as OptiTrack), or a gyroscope, for perceiving the position and/or posture of the user's hand.
From the gestures and/or motions the user performs in the real world, gesture information (i) based on a virtual coordinate system is extracted. The gesture information (i) may be a vector, formalized as i = {c, palm, thumb, index, mid, ring, little}, where c denotes the hand shape of the whole hand (for example, a fist, the five fingers spread open, or a victory gesture), palm denotes the position information of the palm, and thumb, index, mid, ring, and little denote the position information and/or orientation information of the thumb, index finger, middle finger, ring finger, and little finger, respectively. The virtual coordinate system expresses position information in the virtual world constructed by the information processing device 320, while a real coordinate system expresses the position information of objects or space in the real world. The virtual world constructed by the information processing device 320 may be, for example, a two-dimensional graphical user interface, a three-dimensional space, or a virtual reality scene immersing the user. The real coordinate system and the virtual coordinate system may each be two-dimensional or three-dimensional. The gesture information (i) may be updated at a fixed frequency or time interval, or whenever the position and/or posture of the user's hand changes.
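A possible concrete encoding of the gesture information i = {c, palm, thumb, index, mid, ring, little} is sketched below. The field names follow the text; the use of a Python dict, 3-D coordinate tuples, and per-finger position/orientation sub-fields are assumptions of this example, not prescribed by the text.

```python
# Illustrative encoding of one gesture-information sample (i).
gesture_info = {
    "c": "point",                # overall hand shape, e.g. fist / open / point
    "palm": (0.10, 0.25, 0.40),  # palm position in the virtual coordinate system
    # Each finger carries position and/or orientation information.
    "thumb":  {"position": (0.08, 0.27, 0.41), "orientation": (0.0, 1.0, 0.0)},
    "index":  {"position": (0.12, 0.30, 0.42), "orientation": (0.0, 0.0, 1.0)},
    "mid":    {"position": (0.12, 0.31, 0.40), "orientation": (0.0, 0.0, 1.0)},
    "ring":   {"position": (0.11, 0.31, 0.39), "orientation": (0.0, 0.0, 1.0)},
    "little": {"position": (0.10, 0.30, 0.38), "orientation": (0.0, 0.0, 1.0)},
}
```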
On the user interface, a cursor may be displayed according to the gesture information (i), providing visual feedback to the user. The position of the cursor on the graphical interface is expressed as a function of the gesture information (i), for example func_a(i). Those skilled in the art will appreciate that the function func_a differs according to different application scenarios or settings.
For example, in a two-dimensional user interface, the position at which to draw the cursor is calculated by formula (1):

func_a(i) = c*0 + palm*0 + thumb*0 + index.position*0.5 + mid*0 + ring*0 + little*0    (1)

In formula (1), index.position denotes the position of the user's index finger. It follows from formula (1) that the cursor's position on the user interface depends only on the position of the user's index finger, and that the distance the cursor moves on the user interface is half the distance the index finger moves.
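Formula (1) reduces to scaling the index-finger position by 0.5, which can be written directly. The dict-based encoding of the gesture information, with an `"index"` entry holding a `"position"` pair, is an assumption of this example.

```python
def func_a(i):
    """Formula (1): the cursor position depends only on the index-finger
    position, scaled by 0.5; every other component of i has zero weight."""
    x, y = i["index"]["position"]
    return (0.5 * x, 0.5 * y)
```

Moving the index finger by some distance therefore moves the cursor by half that distance, as the text states.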
The cursor may have a single appearance, such as the shape of a hand, or it may have multiple appearances corresponding to different hand shapes.
With reference to Figs. 4-6, the following explains how a control is operated by gestures in a two-dimensional user interface.
Fig. 4 is a schematic diagram of a gesture-recognition-based control in a two-dimensional user interface according to an embodiment of the present invention. Referring to Fig. 4, the control includes an entrance area and a guide rail area. In Fig. 4 the entrance area is a line segment; in another example, it may be a curve. The entrance area divides the two-dimensional plane of the user interface into two parts: the side including the guide rail area is referred to as the rail side, and the opposite side as the free side. In Fig. 4 the guide rail area is a rectangle; obviously, in other examples it may have other shapes, such as a line segment, a triangle or an ellipse. The entrance area and the guide rail area may be drawn on the two-dimensional user interface to show the user where the control is located. In another example, the entrance area and/or the guide rail area may be hidden so as not to affect the content displayed on the user interface. The guide rail area is close or adjacent to the entrance area. The part of the guide rail area near the entrance area is referred to as the inlet end, and the part away from the entrance area as the outlet end. In yet another example, to make the user's mid-air hand operations easier to recognize, the entrance area and guide rail area of the control are rendered as a notch or bell-mouth shape, guiding the user to move the cursor into the guide rail area by gesture.
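The activation condition — the cursor must pass from the free side through the entrance segment, not around it — can be sketched as a segment-crossing test. The coordinates and the side convention below (the free side taken as the positive half-plane of the directed entrance segment) are illustrative assumptions.

```python
def _cross(o, a, b):
    # z-component of the cross product (a-o) x (b-o); its sign tells
    # which side of the directed line o->a the point b lies on
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def entered_through_entrance(prev, cur, p1, p2):
    """True only when the cursor moved from the free side to the rail side
    AND its path actually crossed the entrance segment p1-p2, so a cursor
    that bypasses the entrance does not activate the control."""
    side_prev = _cross(p1, p2, prev)
    side_cur = _cross(p1, p2, cur)
    if not (side_prev > 0 >= side_cur):      # free side -> rail side only
        return False
    d1 = _cross(prev, cur, p1)               # the movement segment must
    d2 = _cross(prev, cur, p2)               # straddle the entrance segment
    return (d1 > 0) != (d2 > 0)
```

A movement that crosses the dividing line above or below the entrance segment, or in the rail-to-free direction, leaves the control inactive, consistent with the behavior described for Fig. 5A.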
In the example of Fig. 4, the guide rail area further includes a slider, which can move along the guide rail area.
Figs. 5A-5D are schematic diagrams of the various states of the gesture-recognition-based control in a two-dimensional user interface according to embodiments of the present invention.
The gesture-recognition-based control has an activated state and an inactive state, the inactive state being its initial state. Fig. 5A illustrates the control in the inactive state, together with the cursor associated with the gesture information (i). Note that in Fig. 5A no slider is drawn on the guide rail area, or the slider is hidden. The absence of the slider from the guide rail area serves as a prompt informing the user that the control is in the inactive state.
When the user's gesture moves the cursor from the free side through the entrance area onto the rail side, the control transitions from the inactive state to the activated state. The control receives events indicating the gesture information (i) and recognizes the resulting changes of the cursor position. When the gesture information (i) moves the cursor position from the free side of the control through the entrance area onto the rail side, the control's state is changed to activated, and the control is drawn in its activated state.
Alternatively, when the user's gesture moves the cursor from the free side through the entrance area onto the rail side, the slider is drawn on the guide rail area and the cursor remains displayed; the user moves the cursor onto the slider by gesture and, through a "grasp" action, fixes the cursor to the slider, after which the slider follows the cursor. The "grasp" action is not essential: in one embodiment, when the control is in the activated state the slider simply follows the cursor, or the control changes the slider position based on the gesture information (i). Still alternatively, as the slider moves, a designated sound is played, the visual appearance is changed, and/or haptic feedback is provided to the user. For example, as the slider moves toward the outlet end, the volume of the played sound gradually increases and/or its frequency rises, and as the slider moves back toward the inlet end, the volume gradually decreases and/or the frequency falls.
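The rising and falling audio feedback can be sketched as a mapping from the slider's normalized progress toward the outlet end to a volume and a frequency. The base values and the linear ramps are illustrative assumptions; the text only specifies the direction of change.

```python
def audio_feedback(t, base_vol=0.2, base_freq=220.0):
    """Volume and frequency of the feedback sound for slider progress t,
    where t = 0 is the inlet end and t = 1 the outlet end: both rise as
    the slider approaches the outlet, and fall back as it returns."""
    t = max(0.0, min(1.0, t))                 # clamp to the rail
    volume = base_vol + (1.0 - base_vol) * t  # ramps from base_vol up to 1.0
    frequency = base_freq * (1.0 + t)         # ramps from 220 Hz up to 440 Hz
    return volume, frequency
```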
Fig. 5B shows the control in the activated state. In the activated state, the guide rail area includes the slider. The slider is associated with the gesture information (i) as a prompt of the user's hand position, and the cursor is hidden. As an example, the position at which to draw the slider is determined according to the rule for determining the cursor drawing position. Further, the slider's drawing position is confined to the guide rail area, so that the slider appears to move along the guide rail. In the example of Fig. 5B, the guide rail area includes a center line; the projection of the (undrawn) cursor's position onto this center line is used as the position at which to draw the slider in the guide rail area. Alternatively, the appearance of the control is changed to prompt the user that the control has been activated, for example by drawing a shadow along the edge of the control, changing the color of the control area, and/or displaying designated text. Still alternatively, haptic feedback is provided to the user and/or a designated sound is played to prompt the user that the control has been activated.
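The projection of the hidden cursor onto the guide rail's center line can be sketched as a standard 2-D vector projection, with the result clamped so the drawn slider stays on the rail while the gesture wanders. The unclamped scalar projection can separately serve as the exit test used for the "confirm"/"cancel" commands described below.

```python
def slider_position(cursor, rail_start, rail_end):
    """Return the slider's drawing position: the projection of the (hidden)
    cursor onto the center line rail_start-rail_end, clamped to the
    segment so the slider appears to move along the guide rail."""
    vx, vy = rail_end[0] - rail_start[0], rail_end[1] - rail_start[1]
    wx, wy = cursor[0] - rail_start[0], cursor[1] - rail_start[1]
    t = (wx * vx + wy * vy) / (vx * vx + vy * vy)   # scalar projection
    t = max(0.0, min(1.0, t))                       # confine to the rail
    return (rail_start[0] + t * vx, rail_start[1] + t * vy)
```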
If the user's gesture moves the cursor into the guide rail area from the rail side, or moves the cursor onto the rail side while bypassing the entrance area, the control in the inactive state is not changed to the activated state; the slider is not shown, and the cursor remains displayed.
In an embodiment according to the present invention, if the user's gesture moves the slider out of the guide rail through the outlet end of the guide rail area, this represents a "confirm" command issued by the user to the control; if the user's gesture moves the slider out of the guide rail through the inlet end of the guide rail area, this represents a "cancel" command.
Fig. 5C illustrates the control receiving a "confirm" command. With the control in the activated state, as an example, the user moves the forefinger to the right, and the slider's position (shown in dotted outline in Fig. 5C) moves correspondingly to the right along the guide rail area, thereby giving the user visual feedback so that the user knows whether the control has correctly recognized his or her intention. When, following the movement of the user's forefinger, the control detects that the slider has moved out of the guide rail area through the outlet end, the control generates an event representing the "confirm" command. By processing this event, the operation associated with the control is confirmed. As the slider moves out of the guide rail area, the slider is hidden; for example, the slider is drawn only within the guide rail area, and the part of the slider beyond the guide rail area is hidden. Alternatively, the appearance of the control is changed to prompt the user that the control has recognized the user's intention and generated the "confirm" event, for example by flashing the control area, changing its color, and/or displaying designated text. Still alternatively, haptic feedback is provided and/or a designated sound is played to prompt the user that the control has recognized the user's intention. Further, as the slider moves out of the guide rail area, the control transitions to the inactive state, and the cursor is drawn to track the user's gesture.
Fig. 5D illustrates the control receiving a "cancel" command. With the control in the activated state, as an example, the user moves the forefinger to the left, and the slider's position (shown in dotted outline in Fig. 5D) moves correspondingly to the left along the guide rail area, thereby giving the user visual feedback so that the user knows whether the control has correctly recognized his or her intention. When, following the movement of the user's forefinger, the control detects that the slider has moved out of the guide rail area through the inlet end, the control generates an event representing the "cancel" command, or generates no event, indicating that the user has not issued a "confirm" command, or that the user has abandoned or withdrawn the original attempt. By processing the event representing the "cancel" command, the operation associated with the control is cancelled or ignored. As the slider moves out of the guide rail area, the slider is hidden. Alternatively, the appearance of the control is changed and/or haptic feedback is provided to prompt the user that the control has recognized the user's intention and generated the "cancel" event. Further, as the slider moves out of the guide rail area, the control transitions to the inactive state, and the cursor is drawn to track the user's gesture.
In another embodiment according to the present invention, the guide rail area has a cross shape. In the activated state, when the slider moves out of the guide rail from the right side or from above, the control generates a "confirm" event; when the slider moves out from the left side or from below, the control generates a "cancel" event.
One of ordinary skill in the art will realize that the guide rail area may have multiple outlets. When the slider moves out through some outlets, the control generates a "confirm" event; when it moves out through others, the control generates a "cancel" event. To inform the user of the different meanings of the outlets, a distinct indication of each outlet's meaning may be provided to the user in each outlet direction by visual, audio and/or haptic feedback.
In still another embodiment according to the present invention, the guide rail area has a cross shape with multiple outlets, each branch of the cross corresponding to one outlet; each outlet indicates a different meaning, or the multiple outlets indicate multiple meanings. For example, when the slider moves out through the top outlet, the control generates a "cancel" event, indicating that the user has abandoned the attempt to play music; when the slider moves out through the right outlet, the control generates a "mute" event, indicating that the user wishes to immediately reduce the volume of the audio output of a certain application program to 0; when the slider moves out through the left outlet, the control generates a "set to high sampling rate" event; and when the slider moves out through the bottom outlet, the control generates a "set to low sampling rate" event. In response to receiving the events output by the control, which indicate different meanings or different commands, the application program performs the corresponding processing.
Fig. 6 is a flowchart of the gesture-recognition-based human-computer interaction method in a two-dimensional user interface according to an embodiment of the present invention. To use a control according to an embodiment of the present invention, the control is initialized (610). The initialization includes drawing the control on the user interface, for example the control shown in Fig. 5A, and enabling the control to receive the gesture information (i). Optionally, a cursor is also drawn on the user interface, its drawing position being associated with the gesture information (i). In another example, the cursor is drawn on the user interface by the program using the control or by another program. The control receives the gesture information (i) and obtains the cursor position from it. In response to the cursor moving from the free side through the entrance area onto the rail side, the control enters the activated state (620). Fig. 5B shows the control in the activated state. Optionally, when the control enters the activated state, its appearance is changed, a designated sound is produced, and/or haptic feedback is provided, to prompt the user that the control has entered the activated state. The control also draws the slider on the guide rail area. The slider's drawing position is confined to the guide rail area, so that the slider appears to move along the guide rail, and the slider is made to follow the user's gesture. As an example, the slider's drawing position is determined according to the rule for determining the cursor drawing position; further, the projection of the cursor position onto the center line of the guide rail area is used as the slider's drawing position.
The control obtains the slider's position from the gesture information (i), and detects whether, and through which end, the slider has moved out of the guide rail area (640). Referring to Fig. 5C, when the control detects that the slider has moved out through the outlet end of the guide rail area, the control generates a first event (650); as an example, the first event may be a "confirm" event, a "mute" event, or the like. Referring to Fig. 5D, when the control detects that the slider has moved out through the inlet end of the guide rail area, the control generates a second event (660). The second event may be, for example, a "cancel" event or a "set to high sampling rate" event.
In step 650, upon generating the first event, the control enters the unactivated state. Optionally, the appearance of the control is changed, a designated sound is produced, and/or haptic feedback is provided, to prompt the user that the control has recognized the user's intention and generated the first event. And optionally, the cursor is drawn to track the user's gesture.
In step 660, upon generating the second event, the control enters the unactivated state. Optionally, the appearance of the control is changed, a designated sound is produced, and/or haptic feedback is provided, to prompt the user that the control has recognized the user's intention and generated the second event. And optionally, the cursor is drawn to track the user's gesture.
In a further embodiment, the user creates a space in the virtual world according to an embodiment of the present invention, and/or sets or changes the position of the control. The user can place the control at a position that is convenient to operate, for example the position of the cursor when the user's arm is fully extended to the side. This not only makes it easy for the user to issue commands such as "confirm"/"cancel", but also avoids interfering with the operation of other objects in the virtual world.
With reference to Figs. 7-8, the following describes embodiments of the gesture-recognition-based control according to the present invention in a three-dimensional user interface. Fig. 7 is a schematic diagram of the gesture-recognition-based control in a three-dimensional user interface according to an embodiment of the present invention. Referring to Fig. 7, the control includes an entrance area and a guide rail area. In Fig. 7 the entrance area is a finite rectangular plane; in another example, it may be a region of a curved or flat surface bounded by a closed curve. In Fig. 7, the plane containing the entrance area divides the three-dimensional space of the user interface into two parts: the side including the guide rail area is referred to as the rail side, and the opposite side as the free side. In Fig. 7 the guide rail area is a cuboid; obviously, in other examples it may have other shapes, such as a cylinder, a sphere or a spheroid. The entrance area and the guide rail area may be drawn on the three-dimensional user interface to show the user where the control is located. The guide rail area may also blend with objects of the three-dimensional user interface, for example a vase or a mailbox in the user interface. In another example, the entrance area and/or the guide rail area may be hidden so as not to affect the content displayed on the user interface. The guide rail area is close or adjacent to the entrance area. The part of the guide rail area near the entrance area is referred to as the inlet end, and the part away from the entrance area as the outlet end.
In the example of Fig. 7, the guide rail area further includes a slider, which can move along the guide rail area. Optionally, the control further includes a guide rail line; in Fig. 7, the guide rail line is the center line along the long axis of the cuboid guide rail area. One endpoint of the guide rail line lies on the entrance area. The slider moves along the guide rail line.
Figs. 8A-8D are schematic diagrams of the various states of the gesture-recognition-based control in a three-dimensional user interface according to embodiments of the present invention.
Fig. 8A illustrates the control in the inactive state, together with the cursor associated with the gesture information (i). On the control in the inactive state, no slider is drawn on the guide rail area, or the slider is hidden. The absence of the slider from the guide rail area serves as a prompt informing the user that the control is in the inactive state.
When the user's gesture moves the cursor from the free side through the entrance area onto the rail side, the control transitions from the inactive state to the activated state. The control receives events indicating the gesture information (i) and recognizes the resulting changes of the cursor position. When the gesture information (i) moves the cursor position from the free side of the control through the entrance area onto the rail side, the control's state is changed to activated, and the control is drawn in its activated state.
Alternatively, when the user's gesture moves the cursor from the free side through the entrance area onto the rail side, the slider is drawn on the guide rail area and the cursor remains displayed; the user moves the cursor onto the slider by gesture and, through a "grasp" action, fixes the cursor to the slider, after which the slider follows the cursor.
Fig. 8B shows the control in the activated state. In the activated state, the guide rail area includes the slider. The slider is associated with the gesture information (i) as a prompt of the user's hand position, and the cursor is hidden. As an example, the position at which to draw the slider is determined according to the rule for determining the cursor drawing position. Further, the slider's drawing position is confined to the guide rail area, so that the slider appears to move along the guide rail. In the example of Fig. 8B, the guide rail area includes the guide rail line; the projection of the (undrawn) cursor's position onto the guide rail line is used as the position at which to draw the slider in the guide rail area. Alternatively, the appearance of the control is changed to prompt the user that the control has been activated, for example by drawing a shadow along the edge of the control, changing the color of the control area, playing a designated sound, and/or displaying designated text.
If the user's gesture moves the cursor into the guide rail area from the rail side, or moves the cursor onto the rail side while bypassing the entrance area, the control in the inactive state is not changed to the activated state; the slider is not shown, and the cursor remains displayed.
In an embodiment according to the present invention, if the user's gesture moves the slider out of the guide rail through the outlet end of the guide rail area, this represents a "confirm" command issued by the user to the control; if the user's gesture moves the slider out of the guide rail through the inlet end, this represents a "cancel" command. And optionally, if the user's gesture attempts to move the slider out of the guide rail area through a region other than the inlet end or the outlet end, the slider is confined within the guide rail area and the state of the control remains unchanged.
Fig. 8C illustrates the control receiving a "confirm" command. With the control in the activated state, as an example, the user moves the forefinger to the right, and the slider's position (shown in dotted outline in Fig. 8C) moves correspondingly to the right along the guide rail area, thereby giving the user visual feedback so that the user knows whether the control has correctly recognized his or her intention. When, following the movement of the user's forefinger, the control detects that the slider has moved out of the guide rail area through the outlet end, the control generates an event representing the "confirm" command. By processing this event, the operation associated with the control is confirmed. As the slider moves out of the guide rail area, the slider is hidden. Further, as the slider moves out of the guide rail area, the control transitions to the inactive state, and the cursor is drawn to track the user's gesture.
Fig. 8D illustrates the control receiving a "cancel" command. With the control in the activated state, as an example, the user moves the forefinger to the left, and the slider's position (shown in dotted outline in Fig. 8D) moves correspondingly to the left along the guide rail area, thereby giving the user visual feedback so that the user knows whether the control has correctly recognized his or her intention. When, following the movement of the user's forefinger, the control detects that the slider has moved out of the guide rail area through the inlet end, the control generates an event representing the "cancel" command, or generates no event, indicating that the user has not issued a "confirm" command, or that the user has abandoned or withdrawn the original attempt. By processing the event representing the "cancel" command, the operation associated with the control is cancelled or ignored. As the slider moves out of the guide rail area, the slider is hidden. Further, as the slider moves out of the guide rail area, the control transitions to the inactive state, and the cursor is drawn to track the user's gesture.
One of ordinary skill in the art will realize that the guide rail area may have multiple outlets. When the slider moves out through some outlets, the control generates a "confirm" event; when it moves out through others, the control generates a "cancel" event. To inform the user of the different meanings of the outlets, a distinct indication of each outlet's meaning may be provided to the user in each outlet direction by visual, audio and/or haptic feedback.
In still another embodiment according to the present invention, the guide rail area has multiple outlets, each outlet indicating a different meaning, or the multiple outlets indicating multiple meanings. For example, when the slider moves out through the first outlet, the control generates a "cancel" event; through the second outlet, a "mute" event; through the third outlet, a "set to high sampling rate" event; and through the fourth outlet, a "set to low sampling rate" event. In response to receiving the events output by the control, which indicate different meanings or different commands, the application program performs the corresponding processing.
In another embodiment of the present invention, the control is displayed in the three-dimensional user interface as follows. Initially, while the control is in the unactivated state, the cursor is displayed, the entrance area is hidden, and the guide rail area is shown. When the control transitions from the unactivated state to the activated state, the gesture cursor is attenuated, the slider is shown on the guide rail area, and the guide rail area and/or the guide rail line are shown. At the moment the control enters the activated state, a special effect is triggered, including gradually fading out the cursor and displaying the guide rail area and/or the guide rail line. When the user confirms or cancels the operation associated with the current control, the control triggers another special effect, including: the slider gradually fades and disappears after being highlighted, the display of the cursor is gradually restored, and the guide rail area and/or the guide rail line segment are progressively restored to the pattern used when the control is in the inactive state.
In another embodiment according to the present invention, the following illustrates how the control provided by an embodiment of the present invention is operated by gesture in an application. (1) The control in the unactivated state and the cursor associated with the gesture information (i) are displayed in the user interface of the display device. (2) The user changes the gesture or moves the hand, and observes the resulting changes of the control and the cursor on the display device; by changing the gesture or moving the hand, the user steers the cursor from the free side through the entrance area into the guide rail area, and the control is activated. Upon activation, the user interface prompts the user that the control has been activated. (3) With the control in the activated state, the user controls the slider by changing the gesture or moving the hand, so that the slider moves along the guide rail area. To perform a confirm operation, the user moves the slider out through the outlet end of the guide rail area; to perform a cancel operation, the user moves the slider out through the inlet end. Optionally, when the confirm or cancel operation is performed, the user interface prompts the user with the operation performed.
Fig. 9 is a block diagram of an information processing equipment implementing an embodiment of the present invention. In an embodiment according to the present invention, the information processing equipment 900 generates the control on the user interface, recognizes the user's gesture information (i) or receives the gesture information (i) provided by a gesture input/gesture recognition device, recognizes the user's commands, and provides feedback to the user while interacting with the user. The information processing equipment 900 shown in Fig. 9 is a computer. The computer is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the invention. Nor should the information processing equipment shown in Fig. 9 be interpreted as having any dependency or requirement relating to any one of the illustrated components or combinations of components.
The information processing equipment 900 includes a memory 912, one or more processors 914, one or more presentation components 916, I/O components 920 and a power supply 922, coupled directly or indirectly to a bus 910. The bus 910 may represent one or more kinds of buses (such as an address bus, a data bus, or a combination thereof). In practice, the various components are not necessarily delineated in the manner shown in Fig. 9; for example, a presentation component such as a display device may be regarded as an I/O component 920. In addition, a processor may have its own memory. The inventors recognize that such is the nature of the art, and reaffirm that the diagram of Fig. 9 merely illustrates an exemplary computer system that can be used in connection with one or more embodiments of the present invention.
The information processing equipment 900 typically includes a variety of memories 912. By way of example and not limitation, the memory 912 may include: random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory, compact disc read-only memory (CDROM), digital versatile disc (DVD) or other optical or holographic media, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. The computer storage media may be non-volatile.
The information processing equipment 900 includes one or more processors 914, which read data from various entities such as the bus 910, the memory 912 or the I/O components 920. The one or more presentation components 916 present data indications to a user or another device. Exemplary presentation components 916 include display devices, loudspeakers, printing components, vibrating components, flat-panel displays, projectors, head-mounted displays, and the like. A presentation component 916 may also be an I/O port used to couple a display device, loudspeaker, printing component, vibrating component, flat-panel display, projector, head-mounted display, etc. Illustrative I/O components 920 include cameras, microphones, joysticks, game pads, satellite dish antennas, scanners, printers, wireless devices, and the like.
The gesture-recognition-based control according to the present invention can also be implemented in a gesture recognition device or a gesture input device. The gesture recognition device or gesture input device may be integrated into input devices such as keyboards, mice and remote controls.
Figure 10 is a block diagram of a virtual reality system according to an embodiment of the present invention. The virtual reality system is composed of hardware and software running on the hardware; for clarity, Fig. 10 illustrates a hardware layer and a software layer separated by a dotted line. The hardware layer shows the hardware devices of the virtual reality system according to an embodiment of the present invention, and the software layer shows the software of the virtual reality system.
The virtual reality system includes a computing unit 1010, a display device 1020, a memory/storage device 1030 and a sensor module 1040 coupled to one another. The computing unit 1010 includes a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit) for accelerating tasks such as video display, image recognition and scene rendering. The computing unit 1010 may include one or more CPUs and one or more GPUs. The display device 1020 presents graphics and picture content to the user under the direction of the computing unit 1010, presenting the virtual reality scene. The display device 1020 may be, for example, a head-mounted display, a projector or virtual reality glasses. The memory/storage device 1030 may include: random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory, compact disc read-only memory (CDROM), digital versatile disc (DVD) or other optical or holographic media, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. The memory/storage device 1030 stores programs as well as temporary data and/or result data generated during program execution.
The sensor module 1040 detects the user's state and provides input to the virtual reality system regarding that state. The sensor module 1040 includes multiple sensors and/or input devices; for example, it includes a head pose acquisition device 1042, a gesture acquisition device 1044, an audio/speech acquisition device 1046 and/or an interactive device 1048.
As an example, the head pose acquisition device 1042 includes a gyroscope inside the head-mounted display, which provides the VR system in real time, at a certain frequency, with head pose information such as the position, orientation and pitch of the user's head. Optionally, the head pose acquisition device also processes the head pose information to recognize head actions of the user, for example recognizing from the head pose information actions such as shaking the head, violently rocking the head, or quickly tilting the head backward. In another example, the head pose acquisition device 1042 includes a video capture device which, by capturing images of the user's head, recognizes the pose and/or actions of the head.
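Recognizing a vigorous head shake from gyroscope samples can be sketched as counting high-rate direction reversals of the yaw angle. The sampling interval, angular-rate threshold and reversal count below are illustrative assumptions, not values from the text.

```python
def violent_shake(yaw_samples, dt=0.02, rate_thresh=3.0, min_reversals=3):
    """Flag a vigorous head shake from yaw-angle samples (radians) taken at
    interval dt seconds: the angular rate must repeatedly exceed
    rate_thresh rad/s while reversing direction."""
    rates = [(b - a) / dt for a, b in zip(yaw_samples, yaw_samples[1:])]
    reversals = 0
    prev_sign = 0
    for r in rates:
        if abs(r) < rate_thresh:       # slow drift does not count
            continue
        sign = 1 if r > 0 else -1
        if prev_sign and sign != prev_sign:
            reversals += 1             # fast left<->right direction change
        prev_sign = sign
    return reversals >= min_reversals
```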
The gesture acquisition device 1044 may include the gesture recognition system provided in Chinese patent application CN201110100532.9, which is hereby incorporated by reference in its entirety. The gesture acquisition device 1044 can also be a data glove (CyberGlove). The gesture acquisition device 1044 extracts a gesture posture (i) from gestures and/or actions made by the user in the real world. The gesture posture (i) can be a vector, formalized as i = {C, palm, thumb, index, mid, ring, little}, where C represents the hand type of the whole hand (for example, a fist, five fingers open, a victory gesture, etc.), palm represents the position information of the palm, and thumb, index, mid, ring, and little represent the position information and/or orientation information of the thumb, forefinger, middle finger, ring finger, and little finger, respectively.
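The gesture-posture vector i = {C, palm, thumb, index, mid, ring, little} described above can be modeled, for example, as a simple record type. This is a minimal sketch: the field names and the use of 3-D positions (rather than full orientations) for the fingers are illustrative assumptions, not part of the embodiment.

```python
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class GesturePosture:
    """One sample of the gesture-posture vector
    i = {C, palm, thumb, index, mid, ring, little}."""
    hand_type: str   # C: overall hand shape, e.g. "fist", "open", "victory"
    palm: Vec3       # position of the palm
    thumb: Vec3      # per-finger positions (orientations omitted in this sketch)
    index: Vec3
    mid: Vec3
    ring: Vec3
    little: Vec3

# Example sample: an open hand with the palm at the origin
i = GesturePosture("open", (0.0, 0.0, 0.0),
                   (0.02, 0.08, 0.0), (0.01, 0.10, 0.0),
                   (0.0, 0.11, 0.0), (-0.01, 0.10, 0.0), (-0.02, 0.08, 0.0))
assert i.hand_type == "open"
```

Such a record would be emitted by the gesture acquisition device once per frame, at the device's sampling frequency.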
The audio/speech acquisition device 1046 can be a microphone/microphone array, a speech processing/recognition module, a language processing module, and the like. The audio/speech acquisition device 1046 is used to identify audio information of the user and/or to recognize emergency utterances of the user such as screams, cries, shouts, and complaints. The audio/speech acquisition device 1046 can also extract features such as the power and frequency of the audio, and/or convert speech into text.
The interactive device 1048 can be, for example, a VR controller (such as the Oculus Rift Touch), a gamepad, a data glove (such as CyberGlove), or a motion capture system (such as OptiTrack), for identifying instructions of the user. For example, when the user presses a button or makes a predetermined action/gesture, this indicates that the user has given a specified instruction to the virtual reality system. In another example, the interactive device 1048 includes a pressure sensor that can perceive the magnitude of the user's grip, and the tension or discomfort of the user is identified from the user's grip strength. In another example, the interactive device 1048 includes an inertial sensor; from velocity and/or acceleration it can perceive interactive actions such as the user violently shaking or throwing/flinging the interactive device, and by detecting these actions the degree to which the user is frightened is identified.
The VR application software 1060 runs on the computing unit 1010 to build the virtual reality scene and to interact with the user. The stimulation detection and processing module 1070 collects the information obtained from the sensor module 1040, identifies the degree to which the user is upset or the discomfort the user is experiencing, and issues instructions to the VR application software 1060, instructing the VR application software to change the constructed virtual reality scene so as to mitigate the stimulation to the user or reduce the discomfort caused to the user. The stimulation detection and processing module 1070 can instruct the VR application software 1060 to mask part or all of the VR scene to reduce the stimulation to the user. Alternatively, the stimulation detection and processing module 1070 instructs the VR application software 1060 to pop up a dialog box, or to suspend or pause the construction of the VR scene. The stimulation detection and processing module 1070 may further instruct the VR application software 1060 to weaken the current VR scene, for example by reducing the degree of realism of the current VR scene or by changing the content of the VR scene to reduce its horror or violence.
Optionally, the stimulation detection and processing module 1070 directly changes the VR scene displayed by the display device 1020 without the assistance of the VR application software 1060, so that embodiments of the present invention can be implemented even for third-party VR application software that provides no corresponding interface or control mechanism. The stimulation detection and processing module 1070 may also instruct the audio device to adjust the audio being played, for example reducing the volume, changing the rhythm, or playing cheerful and light-hearted music, without the assistance of the VR application software 1060.
The stimulation detection and processing module 1070 can poll the sensor module 1040 to identify the degree to which the user is upset or the discomfort the user is experiencing. In another example, when the sensor module perceives that the user is upset or experiencing discomfort, or under a specified condition (for example, when a timer expires), the sensor module sends a message or an interrupt signal to the stimulation detection and processing module 1070.
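The polling mode described above can be sketched as follows. The sensor-query function `read_discomfort_level`, the fused 0..1 discomfort score, and the threshold value are illustrative assumptions; the interrupt-driven alternative would instead push an event from the sensor side.

```python
import time

DISCOMFORT_THRESHOLD = 0.7   # illustrative threshold on a 0..1 discomfort score

def read_discomfort_level(sensor_module) -> float:
    """Hypothetical query returning a fused 0..1 discomfort score
    from the sensor module."""
    return sensor_module.get("discomfort", 0.0)

def poll_sensors(sensor_module, on_overstimulation, period_s=0.1, max_polls=None):
    """Poll the sensor module at a fixed period; invoke the callback
    whenever the discomfort score exceeds the threshold."""
    polls = 0
    while max_polls is None or polls < max_polls:
        level = read_discomfort_level(sensor_module)
        if level > DISCOMFORT_THRESHOLD:
            on_overstimulation(level)
        polls += 1
        if max_polls is None or polls < max_polls:
            time.sleep(period_s)

# Example: a single poll over a mock sensor reading
events = []
poll_sensors({"discomfort": 0.9}, events.append, period_s=0.0, max_polls=1)
assert events == [0.9]
```

A production system would typically run such a loop on its own thread and bound the polling frequency to the sensors' sampling rate.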
Figure 11 is a flow chart of a method for preventing a virtual reality system from overstimulating a user according to an embodiment of the present invention. According to an embodiment of the invention, the VR application 1060 (see also Figure 10) builds a VR scene (1110). The stimulation detection and processing module 1070 identifies, through the sensor module 1040, whether the user in the VR scene is experiencing discomfort and/or the degree of that discomfort (1120). When necessary, to avoid overstimulating the user and to reduce the user's discomfort in the VR scene, the constructed VR scene is changed (1130) to reduce the stimulation to the user, thereby alleviating the user's tension/discomfort.
Figure 12 is a flow chart of another method for preventing a virtual reality system from overstimulating a user according to an embodiment of the present invention. The VR application 1060 (see also Figure 10) builds a VR scene (1210). The user's head pose is obtained (1220) through the head pose capture device 1042 (see also Figure 10). As an example, the head pose capture device 1042 samples the head pose at a fixed frequency, and the head pose of frame t is denoted P_head(t). The head pose includes 3 degrees of freedom, i.e. P_head(t) = (ψ, θ, φ), where ψ, θ, and φ respectively represent the angles by which the head rotates around the yaw axis, the pitch axis, and the roll axis. In another example, the head pose includes 6 degrees of freedom, i.e. P_head(t) = (ψ, θ, φ, x, y, z); in addition to the angles by which the head rotates around the three axes, x, y, and z respectively represent the offsets by which the head translates along the yaw axis, the pitch axis, and the roll axis.
From the head pose, the motion pattern of the head is identified. When a specified motion pattern appears in the user's head, it is considered that the user is experiencing tension or discomfort (1221). In one example, a frightened user may rapidly swing the head backward; this head motion pattern can be identified by detecting that the speed of the user's head movement exceeds a specified threshold τ1, for example P_head(t) − P_head(t−1) > τ1, where P_head(t) takes the offset along the roll axis in the head pose of frame t. In another example, whether the difference between the head pose of frame t and the head pose of frame t−n in the offset along the roll axis exceeds the threshold τ1 is compared, to identify whether the speed of the user's head movement exceeds the threshold τ1, where t and n are positive integers.
In another example, a frightened user may continuously and violently shake the head to express resistance to the VR scene being seen. The motion pattern of continuously and violently shaking the head can be identified by detecting that the acceleration of the user's head rotation exceeds a specified threshold τ'1, for example P_head(t) + P_head(t−2) − 2·P_head(t−1) > τ'1, where P_head(t) takes the angle of rotation around the yaw axis in the head pose of frame t. In still another example, whether the second-order difference of the rotation angle around the yaw axis between the head pose of frame t and the head pose of frame t−n exceeds the threshold τ'1 is compared.
If, from the head pose of the user, it is identified that the user is experiencing tension or discomfort, the VR scene is changed (1230).
Optionally, the user's gesture posture is obtained (1222) through the gesture acquisition device 1044 (see also Figure 10). As an example, the gesture acquisition device 1044 provides a gesture posture (i) at a specified frequency. The gesture posture (i) can be a vector, formalized as i = {C, palm, thumb, index, mid, ring, little}, where C represents the hand type of the whole hand (for example, a fist, five fingers open, a victory gesture, etc.), palm represents the position information of the palm, and thumb, index, mid, ring, and little represent the position information and/or orientation information of the thumb, forefinger, middle finger, ring finger, and little finger, respectively. In another example, the gesture posture includes 3 degrees of freedom, denoted P_palm(t) = (ψ, θ, φ), where ψ, θ, and φ respectively represent the angles by which the hand rotates around the yaw axis, the pitch axis, and the roll axis. As still another example, the gesture posture includes 6 degrees of freedom, i.e. P_palm(t) = (ψ, θ, φ, x, y, z); in addition to the angles by which the hand rotates around the three axes, x, y, and z respectively represent the offsets by which the hand translates along the yaw axis, the pitch axis, and the roll axis.
From the gesture posture, the motion pattern of the user's hand is identified. When a specified motion pattern appears in the user's hand, it is considered that the user is experiencing tension or discomfort (1223). In one example, a frightened user may rapidly push the hand forward, wishing to ward off a perceived danger; this hand motion pattern can be identified by detecting that the speed of the user's hand movement exceeds a specified threshold τ2, for example P_palm(t) − P_palm(t−1) > τ2, where P_palm(t) takes the offset along the roll axis in the hand posture of frame t. In another example, a frightened user may continuously and violently wave the hand to express resistance to the VR content being seen; this hand motion pattern can be identified by detecting that the acceleration of the user's hand movement exceeds a threshold τ'2, for example P_palm(t) + P_palm(t−2) − 2·P_palm(t−1) > τ'2, where P_palm(t) takes the angle of rotation around the roll axis in the hand posture of frame t. In another example, the behavior of the user covering the eyes implies that the user wants to avoid the content being seen; the eye-covering hand motion pattern can be identified by detecting that the user's palm stays in front of the eyes for a certain time, for example when the average distance between the hand position in the hand postures of T consecutive frames and the eye position q is less than a threshold τ''2, expressed as (1/T)·Σ |P_palm(k) − q| < τ''2 over the T frames, where T is a positive integer. Optionally, the eye position q is represented by the head pose provided by the head pose capture device 1042, or the eyes are recognized from the head pose and the position of the eyes is obtained.
If, through the motion pattern of the user's hand, it is identified that the user is experiencing tension or discomfort, the VR scene is changed (1230).
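The eye-covering test described above can be sketched as a windowed average distance between palm and eye positions. Euclidean distance and the particular positions and thresholds are illustrative assumptions.

```python
def covering_eyes(palm_positions, q, T, tau2pp):
    """Eye-covering test: the average distance between the palm position and
    the eye position q over the last T frames is below tau2pp, i.e.
    (1/T) * sum_k ||P_palm(k) - q|| < tau2pp."""
    if len(palm_positions) < T:
        return False  # not enough history yet
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    recent = palm_positions[-T:]
    return sum(dist(p, q) for p in recent) / T < tau2pp

eyes = (0.0, 1.6, 0.0)
near = [(0.0, 1.6, 0.05)] * 5   # palm held just in front of the eyes
far = [(0.3, 1.0, 0.4)] * 5     # hand at rest, away from the face
assert covering_eyes(near, eyes, T=5, tau2pp=0.1)
assert not covering_eyes(far, eyes, T=5, tau2pp=0.1)
```

Requiring T consecutive frames, rather than a single sample, filters out a hand that merely passes in front of the face.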
According to another embodiment of the invention, a control according to the embodiment shown in Figure 4 or Figure 7 of the present invention is constructed in the VR scene, and the control is activated by recognizing the user's gesture. When the user instructs the control with a "confirm" command by gesture, this indicates that the user is tense or overstimulated. In response, the VR scene is changed to reduce the stimulation to the user and relieve the user's tension. When the user instructs the control with a "cancel" command by gesture, this indicates that the user can accept the content of the VR scene, and for the time being no change to the constructed VR scene is needed.
Optionally, the user's audio/speech is obtained (1224) through the audio/speech acquisition device 1046 (see also Figure 10). As an example, the audio/speech acquisition device 1046 provides the captured audio, its acoustic characteristics, the speech, and/or the text recognized from the speech to the stimulation detection and processing module 1070 (see also Figure 10). The acoustic characteristics include the frequency, amplitude, and so on of the audio.
If the captured audio signal exhibits specified characteristics, or the speech contains specified words and phrases, it is considered that the user is experiencing tension or discomfort (1225). In one example, a frightened user may call out words such as "help". A word set Y is provided, whose elements are words and phrases that a user experiencing tension or discomfort may say or call out. The audio x made by the user is captured, the words and phrases in the audio x are recognized and converted into a word y; if the word y is in the predefined word set Y, it is considered that the user is expressing that he or she is experiencing tension, feels uncomfortable with the VR content, or resists it.
In another example, a frightened user may scream or shout. The frequency and/or loudness of the audio made by the user is identified; if the frequency and/or loudness exceeds a prescribed threshold, it is considered that the user is expressing that he or she is experiencing tension, feels uncomfortable with the VR content, or resists it.
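The two audio tests above, keyword matching against the word set Y and a loudness threshold, can be sketched as follows. The word list, the use of RMS amplitude as the loudness measure, and the threshold value are illustrative assumptions; a real system would also examine frequency.

```python
DISTRESS_WORDS = {"help", "stop", "no"}   # illustrative word set Y
LOUDNESS_THRESHOLD = 0.8                  # illustrative normalized RMS threshold

def distress_from_words(recognized_words):
    """Keyword test: any recognized word belongs to the predefined set Y."""
    return any(w.lower() in DISTRESS_WORDS for w in recognized_words)

def distress_from_loudness(samples, threshold=LOUDNESS_THRESHOLD):
    """Loudness test: the RMS amplitude of the audio frame exceeds the
    prescribed threshold (a scream/shout heuristic)."""
    rms = (sum(s * s for s in samples) / len(samples)) ** 0.5
    return rms > threshold

assert distress_from_words(["please", "HELP"])
assert not distress_from_words(["nice", "view"])
assert distress_from_loudness([0.9, -0.9, 0.9, -0.9])
assert not distress_from_loudness([0.1, -0.1, 0.1, -0.1])
```

The recognized-word list would come from the speech recognition stage of the audio/speech acquisition device.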
If, through the audio, it is identified that the user is experiencing tension or discomfort, the VR scene is changed (1230).
Still alternatively, the signal of the interactive device 1048 (see also Figure 10) is obtained (1226). The signal of the interactive device indicates an instruction of the user, and the instruction of the user indicates the user's discomfort (1227). In one example, the user is informed that, when feeling uncomfortable in the VR system, he or she can inform the VR system of the discomfort by pressing a button or making a predetermined action/gesture. For example, a button is provided on the interactive device 1048 such as the user's VR controller, gamepad, or data glove; the user presses the button, and the interactive device 1048 sends a signal to the stimulation detection and processing module 1070 indicating that the user has, through the button, issued an indication of discomfort. In another example, when the user's body is tense or uncomfortable, the user may tightly grip a hand-held object. The interactive device 1048 includes a pressure sensor and is held by the user. When the perceived grip of the user exceeds a threshold, or continuously exceeds a threshold for a long time, the interactive device 1048 sends a signal to the stimulation detection and processing module 1070 indicating the user's discomfort (1227).
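The sustained-grip condition above, force over a threshold continuously for a minimum duration, can be sketched with a run-length test over the pressure-sensor samples. The threshold and frame count are illustrative parameters.

```python
def grip_distress(grip_samples, threshold, min_frames):
    """Sustained-grip test: the grip force stays above the threshold for at
    least min_frames consecutive samples (the duration condition described
    in the text)."""
    run = 0
    for g in grip_samples:
        run = run + 1 if g > threshold else 0
        if run >= min_frames:
            return True
    return False

# A three-frame squeeze above 8.0 N triggers; brief spikes do not
assert grip_distress([2.0, 9.0, 9.5, 9.2, 3.0], threshold=8.0, min_frames=3)
assert not grip_distress([2.0, 9.0, 3.0, 9.5, 3.0], threshold=8.0, min_frames=3)
```

Requiring consecutive frames distinguishes an anxious clench from the momentary grip spikes of ordinary controller use.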
If, through the signal of the interactive device, it is identified that the user is experiencing tension or discomfort, the VR scene is changed (1230).
Still alternatively, the motion pattern of the user's hand is identified from the motion pattern of the interactive device held by the user. When a specified motion pattern appears in the user's hand, it is considered that the user is experiencing tension or discomfort. For example, a gyroscope is provided on the interactive device and provides the pose of the handheld device in real time or at a certain frequency. Alternatively, images of the interactive device are captured, and the pose of the interactive device is recognized from the images. Indication signals such as ultrasonic waves, infrared light, or visible light can also be produced on the interactive device, so that the pose of the handheld device is identified by receiving the indication signals. From the pose of the handheld device, the motion pattern of the handheld device is obtained. If the speed and/or acceleration of the handheld device exceeds a threshold, it is considered that the user holding the handheld device is experiencing tension or discomfort.
It is to be noted that it has been proposed above, by way of example, that whether the user is experiencing tension or discomfort can be identified through the user's head motion pattern, the user's hand motion pattern, the user's audio/speech features, or the signal of the interactive device. One of ordinary skill in the art will realize that the above means of identifying that the user is experiencing tension or discomfort can be combined. By combining one or more of the above means, whether the user is experiencing tension or discomfort can be identified more accurately, more concisely, or more effectively.
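One simple way to combine the individual detectors, as the text suggests, is a voting scheme over the per-channel boolean results. The majority-vote rule and the minimum-vote parameter are an illustrative choice; the text does not prescribe a particular fusion method.

```python
def fused_distress(channel_results, min_votes=2):
    """Report distress only when at least min_votes of the boolean
    per-channel detector results agree, so that a single noisy channel
    (e.g. one spurious head jerk) does not trigger a scene change."""
    return sum(bool(r) for r in channel_results) >= min_votes

# Channels in order: head motion, hand motion, audio, interactive device
assert fused_distress([True, False, True, False])       # two channels agree
assert not fused_distress([True, False, False, False])  # one channel may be noise
```

Weighted voting or a learned classifier over the raw channel scores would be natural refinements of the same idea.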
In response to identifying that the user is experiencing tension or discomfort, the VR scene is changed (1230). In an embodiment according to the present invention, the VR scene is changed by a variety of means. In one example, the VR scene is changed by switching, weakening, and/or closing the currently presented video content (1232), to reduce the stimulation to the user and relieve the user's tension. For example, a horror VR scene currently shown to the user is switched to relaxing content; the fidelity/resolution of the current VR scene is reduced so that the currently presented VR scene becomes blurred; or the brightness of the current VR scene is changed. In another example, the VR scene is changed by switching, weakening, and/or closing the currently playing audio content (1234), to reduce the stimulation to the user. For example, tense, stimulating music is changed into light, cheerful music; a sound source that causes tension or stimulation is moved away from the user; the volume is changed; or the playing of sound is even stopped. In still another example, a prompt window is popped up to the user (1236), to inquire about the user's wishes and to relieve the user's tension. For example, a pop-up window asks the user whether he or she is overly tense and whether a rest is needed. The various means provided above can be combined to change the VR scene.
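The escalation among the video (1232), audio (1234), and prompt (1236) means can be sketched as a dispatch from a distress score to a mitigation plan. The tier boundaries and action names are illustrative; the embodiments do not mandate any particular ordering of the means.

```python
MITIGATIONS = {
    "video": ["switch_to_calm_content", "reduce_fidelity", "dim_brightness"],
    "audio": ["soften_music", "move_source_away", "lower_volume", "stop_sound"],
    "prompt": ["ask_if_tense", "offer_rest"],
}

def plan_mitigation(distress_level):
    """Map a 0..1 distress score to escalating actions: mild -> soften
    the audio; moderate -> also weaken the video; severe -> also
    prompt the user (an illustrative escalation policy)."""
    plan = []
    if distress_level > 0.3:
        plan.append(("audio", MITIGATIONS["audio"][0]))
    if distress_level > 0.6:
        plan.append(("video", MITIGATIONS["video"][0]))
    if distress_level > 0.8:
        plan.append(("prompt", MITIGATIONS["prompt"][0]))
    return plan

assert plan_mitigation(0.2) == []
assert plan_mitigation(0.5) == [("audio", "soften_music")]
assert len(plan_mitigation(0.9)) == 3
```

Escalating gradually keeps mild interventions invisible to the user while reserving the interruption of a dialog box for clear distress.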
In still another embodiment according to the present invention, the stimulation detection and processing module 1070 (see Figure 10) also monitors the VR scene. When the VR scene displays tense or stimulating content, the sensor module 1040 is accessed to identify whether the user is experiencing discomfort. Alternatively, the VR application 1060 informs the stimulation detection and processing module 1070 about its content; for example, when tense or stimulating content is about to appear or is appearing, the stimulation detection and processing module 1070 is notified to start or strengthen the detection of the user, so as to effectively identify whether the user is experiencing discomfort.
Although embodiments of the present invention have been shown and described, those of ordinary skill in the art will understand that various changes, modifications, replacements, and variations can be made to these embodiments without departing from the principles and spirit of the present invention, the scope of which is defined by the appended claims and their equivalents.