CN105867613A - Head control interaction method and apparatus based on virtual reality system - Google Patents
- Publication number
- CN105867613A CN105867613A CN201610159807.9A CN201610159807A CN105867613A CN 105867613 A CN105867613 A CN 105867613A CN 201610159807 A CN201610159807 A CN 201610159807A CN 105867613 A CN105867613 A CN 105867613A
- Authority
- CN
- China
- Prior art keywords
- head
- user
- head control
- time
- reticle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/012—Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention discloses a head-control interaction method and apparatus based on a virtual reality system. The method includes the following steps: when the virtual reality system is in a head-control state, head motion data of a user are obtained in real time; the position at which a head-control reticle displayed on a display interface is currently located, and the dwell time of the reticle at that position, are determined according to the head motion data; and the interactive operation performed by the user is recognized according to the current position and dwell time of the reticle, and the function corresponding to the interactive operation is executed. Because the current position and dwell time of the head-control reticle are derived from the user's head motion data, and the interactive operation is recognized from them, the user's head-controlled interaction with the display interface can be recognized effectively; the interaction method is simple and effective, and the user experience is improved.
Description
Technical field
The invention belongs to the technical field of virtual reality, and particularly relates to a head-control interaction method and apparatus based on a virtual reality system.
Background technology
Virtual reality (VR) technology is an advanced, digitized human-machine interface technology. Its characteristic is a lifelike virtual environment produced by a computer: an artificial environment built on visual experience and incorporating hearing, touch, and other senses in a comprehensive perception. A person can perceive the computer-generated virtual world through multiple sensory channels such as vision, hearing, touch, and acceleration, and can also interact with the virtual world through natural means such as movement, voice, facial expression, gesture, and gaze, thereby producing an immersive experience. Virtual reality technology is a comprehensive development of computer technology, sensor technology, human-computer interaction technology, artificial intelligence technology, and other technologies. At present it is applied in many fields, including military affairs, medicine, education, entertainment, manufacturing, and engineering training, and it is regarded as one of the important technologies affecting people's lives now and in the future.
However, at present, virtual reality systems lack an effective and simple interaction mode, which limits the development and application of virtual reality technology.
Summary of the invention
The present invention provides a head-control interaction method and apparatus based on a virtual reality system, in order to solve the problem in the prior art that virtual reality systems lack an effective and simple interaction mode.
A first aspect of the present invention provides a head-control interaction method based on a virtual reality system. The method includes:
when the virtual reality system is in a head-control state, obtaining head motion data of a user in real time;
determining, according to the head motion data, the position at which a head-control reticle displayed on a display interface is currently located, and the dwell time of the head-control reticle at that position;
recognizing, according to the position at which the head-control reticle is currently located and the dwell time, the interactive operation performed by the user, and executing the function corresponding to the interactive operation.
In a first feasible implementation of the first aspect, the step of determining, according to the head motion data, the position at which the head-control reticle displayed on the display interface is currently located and the dwell time of the head-control reticle at that position includes:
parsing the head motion data to determine the trajectory of the user's head movement and the pause time in the trajectory;
using the trajectory of the user's head movement to control the change of position of the head-control reticle, taking the position after the change as the position at which the head-control reticle is currently located, and taking the pause time as the dwell time of the head-control reticle at that position.
In a second feasible implementation of the first aspect, the step of recognizing, according to the position at which the head-control reticle is currently located and the dwell time, the interactive operation performed by the user includes:
determining the item on the display interface corresponding to the position at which the head-control reticle is currently located;
determining, according to the dwell time, the interactive operation performed by the user on the item.
In combination with the second feasible implementation of the first aspect, in a third feasible implementation of the first aspect, the step of determining, according to the dwell time, the interactive operation performed by the user on the item includes:
if the dwell time is greater than a preset first value, determining that the user performs a select operation on the item;
if the dwell time is less than the preset first value, determining that the user does not perform a select operation on the item.
In combination with the first, second, or third feasible implementation of the first aspect, in a fourth feasible implementation of the first aspect, the step of obtaining the head motion data of the user in real time includes:
obtaining in real time the head motion data of the user sensed by a head motion sensor;
or,
obtaining in real time the head motion data of the user captured by a camera.
A second aspect of the present invention provides a head-control interaction apparatus based on a virtual reality system. The apparatus includes:
an acquisition module, configured to obtain the head motion data of a user in real time when the virtual reality system is in a head-control state;
a determination module, configured to determine, according to the head motion data, the position at which a head-control reticle displayed on a display interface is currently located, and the dwell time of the head-control reticle at that position;
a recognition module, configured to recognize, according to the position at which the head-control reticle is currently located and the dwell time, the interactive operation performed by the user, and to execute the function corresponding to the interactive operation.
In a first feasible implementation of the second aspect, the determination module includes:
a parsing determination module, configured to parse the head motion data and determine the trajectory of the user's head movement and the pause time in the trajectory;
a change determination module, configured to use the trajectory of the user's head movement to control the change of position of the head-control reticle, to take the position after the change as the position at which the head-control reticle is currently located, and to take the pause time as the dwell time of the head-control reticle at that position.
In a second feasible implementation of the second aspect, the recognition module includes:
an item determination module, configured to determine the item on the display interface corresponding to the position at which the head-control reticle is currently located;
an operation determination module, configured to determine, according to the dwell time, the interactive operation performed by the user on the item.
In combination with the second feasible implementation of the second aspect, in a third feasible implementation of the second aspect, the operation determination module is specifically configured to:
if the dwell time is greater than a preset first value, determine that the user performs a select operation on the item;
if the dwell time is less than the preset first value, determine that the user does not perform a select operation on the item.
In combination with the first, second, or third feasible implementation of the second aspect, in a fourth feasible implementation of the second aspect, the acquisition module is specifically configured to: obtain in real time the head motion data of the user sensed by a head motion sensor; or obtain in real time the head motion data of the user captured by a camera.
As can be seen from the above embodiments of the invention, when the virtual reality system is in the head-control state, the head motion data of the user are obtained in real time; the position at which the head-control reticle displayed on the display interface is currently located, and the dwell time of the reticle at that position, are determined according to the head motion data; the interactive operation performed by the user is recognized according to that position and dwell time; and the function corresponding to the interactive operation is executed. Head-control interaction between the user and the virtual reality system can thus be realized effectively; the interaction mode is simple and convenient, and the user experience is improved.
Brief description of the drawings
In order to explain the embodiments of the present invention or the technical solutions of the prior art more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flow diagram of the head-control interaction method based on a virtual reality system in the first embodiment of the invention;
Fig. 2 is a schematic flow diagram of the head-control interaction method based on a virtual reality system in the second embodiment of the invention;
Fig. 3 is a schematic structural diagram of the head-control interaction apparatus based on a virtual reality system in the third embodiment of the invention;
Fig. 4 is a schematic structural diagram of the head-control interaction apparatus based on a virtual reality system in the fourth embodiment of the invention.
Detailed description of the invention
To make the objects, features, and advantages of the present invention more apparent and understandable, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings in the embodiments. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
Referring to Fig. 1, which is a schematic flow diagram of the head-control interaction method based on a virtual reality system in the first embodiment of the invention, the method includes:
Step 101: when the virtual reality system is in a head-control state, obtaining the head motion data of a user in real time.
In embodiments of the present invention, the virtual reality system comprises a virtual reality helmet or virtual reality glasses. Both the helmet and the glasses comprise a display, and the display is able to present a virtual environment to the user, so that the user can have an immersive experience.
In embodiments of the present invention, the user can interact with the aforementioned display in the virtual reality system by means of head control ("head control" for short), so as to control what the display shows.
In embodiments of the present invention, the head-control interaction method based on a virtual reality system is implemented by a head-control interaction apparatus based on a virtual reality system (hereinafter: head-control interaction apparatus). This apparatus is a device in the virtual reality system; for example, it can be a PC, a notebook computer, a tablet computer, a mobile terminal, or the like.
When the virtual reality system is in the head-control state, the head-control interaction apparatus obtains the head motion data of the user in real time.
In embodiments of the present invention, the head-control interaction apparatus can specifically obtain in real time the head motion data of the user sensed by a head motion sensor.
Specifically, the virtual reality helmet or virtual reality glasses in the virtual reality system are provided with a head motion sensor, which can sense the motion of the user's head, so that when the user's head movement changes, the head motion sensor senses the head motion data of the user in real time and transmits the sensed head motion data to the head-control interaction apparatus.
The motion of the user's head includes rotation actions and small-amplitude deflection actions of the head.
The head motion sensor can be a gyroscope, which can be used to locate the head movement of the user in 360-degree space and can sense rotation about the X, Y, and Z axes. Rotation about the Z axis refers to rotation of the user's head about the vertical axis, i.e. actions that produce a change in rotation angle, such as left-right head turns performed while the user is moving or essentially stationary, including left-right head turns while the user stands upright or sits with head and body undergoing no, or only very small, displacement. Rotation about the X, Y, and Z axes refers, beyond rotation about the Z axis, also to movement about the axes perpendicular to the upright direction of the user's head, i.e. any head action producing a rotation angle, a deflection angle, or a pitch angle, such as head turns, nodding, tilting the head back, and head-shaking actions, again including such actions performed while the user stands upright or sits with head and body undergoing no, or only very small, displacement. The gyroscope can sense the above actions of the user's head and generate the head motion data.
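The gyroscope passage above can be sketched as a small data model: angular rates about the X, Y, and Z axes are sampled and integrated into head orientation angles. This is a minimal illustrative sketch, not the patent's implementation; the field names, the simple Euler integration, and the sample values are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class HeadSample:
    wx: float  # angular rate about X axis, rad/s (pitch, e.g. nodding)
    wy: float  # angular rate about Y axis, rad/s (roll, e.g. head tilt)
    wz: float  # angular rate about Z axis, rad/s (yaw, left-right turn)
    dt: float  # sample interval, s

def integrate(samples):
    """Accumulate gyroscope rates into head orientation angles (radians)."""
    pitch = roll = yaw = 0.0
    for s in samples:
        pitch += s.wx * s.dt
        roll  += s.wy * s.dt
        yaw   += s.wz * s.dt
    return pitch, roll, yaw

# Half a second of turning the head left at 0.2 rad/s about the Z axis:
samples = [HeadSample(0.0, 0.0, 0.2, 0.1) for _ in range(5)]
pitch, roll, yaw = integrate(samples)
```

A real head tracker would use the sensor's fused rotation-vector output rather than raw Euler integration, which drifts; the sketch only shows the shape of the data flow.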
Alternatively, the head-control interaction apparatus can obtain the head motion data of the user captured by a camera. Specifically, the virtual reality system is provided with a camera that can capture images of the user's head in real time; after the camera captures the head motion data (image data) of the user, it transmits the head motion data to the head-control interaction apparatus.
The camera is preferably an infrared camera, accompanied by an infrared emitter that works with it. In use, the infrared camera and the infrared emitter face the user, so that the range of motion of the user's head falls within the shooting range of the infrared camera. The infrared emitter projects infrared light uniformly within the shooting range, and the infrared camera records the speckle data within that range, i.e. the head motion data of the user, and transmits the obtained head motion data to the head-control interaction apparatus.
When the uniform infrared light continuously emitted by the infrared emitter into the shooting range strikes a rough object (such as the surface of a human body or of clothing), it forms random reflected spots, i.e. speckle. Speckle is highly random, and its pattern changes with distance: the speckle at any two points in space forms different patterns, which is equivalent to marking the entire space. Thus, whenever an object enters the shooting range and moves, its recorded position can be determined. Based on this principle, the data obtained by the infrared camera can determine the head movement of the user, thereby achieving the acquisition of the user's head motion data.
Step 102: determining, according to the head motion data, the position at which the head-control reticle displayed on the display interface is currently located, and the dwell time of the reticle at that position.
In the virtual reality system, when the user's head moves, the scene elements in the user's field of view do not move along with the field of view, but the head-control reticle always remains at the center of the user's field of view. Therefore, when the user's head moves, the reticle and the scene elements move relative to one another, and the head-control interaction apparatus can determine, based on the head motion data of the user, the position at which the head-control reticle displayed on the display interface is currently located and the dwell time of the reticle at that position.
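Since the reticle stays at the center of the view while the scene stays fixed, the reticle's position on the display interface is a function of head orientation. The sketch below maps head yaw/pitch linearly onto interface coordinates; the interface size, the angular sensitivity constant, and the linear mapping itself are illustrative assumptions, not taken from the patent.

```python
INTERFACE_W, INTERFACE_H = 1920, 1080  # assumed interface size, pixels
PIXELS_PER_RADIAN = 800.0              # assumed angular sensitivity

def reticle_position(yaw: float, pitch: float):
    """Map head yaw/pitch (radians) to reticle coordinates on the interface."""
    x = INTERFACE_W / 2 + yaw * PIXELS_PER_RADIAN
    y = INTERFACE_H / 2 - pitch * PIXELS_PER_RADIAN  # pitching up moves the reticle up
    # Clamp so the reticle never leaves the display interface.
    x = max(0, min(INTERFACE_W, x))
    y = max(0, min(INTERFACE_H, y))
    return x, y

pos = reticle_position(0.0, 0.0)  # looking straight ahead: interface center
```

A production system would project through the headset's camera transform instead of a linear map, but the dwell logic that follows only needs some such position function.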
Step 103: recognizing, according to the position at which the head-control reticle is currently located and the dwell time, the interactive operation performed by the user, and executing the function corresponding to the interactive operation.
In embodiments of the present invention, after determining the position at which the head-control reticle is currently located and the dwell time, the head-control interaction apparatus recognizes the interactive operation performed by the user according to that position and the dwell time, and executes the function corresponding to the interactive operation. For example: if the interactive operation is a "like" operation, the item on which the interactive operation acts is liked; if the interactive operation is the launch operation of an application, the selected application is launched.
In other words, when performing an interactive operation in the head-control state, the user realizes head control by controlling the position of the head-control reticle and its dwell time at that position.
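Step 103 can be sketched as a lookup-and-dispatch: find the item under the reticle, and run its bound function once the dwell exceeds a threshold. The item names, rectangular hit regions, handlers, and the 1.5 s threshold are all illustrative assumptions; only the "like"/application-launch examples come from the text above.

```python
SELECT_THRESHOLD_S = 1.5  # assumed "first preset value"

# item name -> ((x0, y0, x1, y1) region on the interface, bound function)
ITEMS = {
    "like_button": ((100, 100, 200, 150), lambda: "liked"),
    "app_icon":    ((300, 100, 400, 200), lambda: "app launched"),
}

def recognize_and_execute(x, y, dwell_s):
    """Find the item at (x, y); execute its function if the dwell exceeds the threshold."""
    if dwell_s <= SELECT_THRESHOLD_S:
        return None  # dwell too short: no interactive operation recognized
    for (x0, y0, x1, y1), handler in ITEMS.values():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return handler()
    return None  # reticle is not over any item
```

The same dispatch shape accommodates other operations (a "like", an application launch) by binding different handlers to different regions.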
In embodiments of the present invention, when the virtual reality system is in the head-control state, the head-control interaction apparatus obtains the head motion data of the user in real time, determines according to the head motion data the position at which the head-control reticle displayed on the display interface is currently located and the dwell time of the reticle at that position, recognizes according to that position and dwell time the interactive operation performed by the user, and executes the function corresponding to the interactive operation, so that head-control interaction can be realized effectively and the user experience is improved.
Referring to Fig. 2, which is a schematic flow diagram of the head-control interaction method based on a virtual reality system in the second embodiment of the invention, the method includes:
Step 201: when the virtual reality system is in a head-control state, obtaining the head motion data of a user in real time.
The content of step 201 is similar to that of step 101 in the first embodiment shown in Fig. 1 and is not repeated here.
Step 202: parsing the head motion data to determine the trajectory of the user's head movement and the pause time in the trajectory.
Step 203: using the trajectory of the user's head movement to control the change of position of the head-control reticle, taking the position after the change as the position at which the head-control reticle is currently located, and taking the pause time as the dwell time of the reticle at that position.
In embodiments of the present invention, by parsing the head motion data of the user, the head-control interaction apparatus determines the trajectory of the user's head movement and the pause time in the trajectory, uses the trajectory to control the change of position of the head-control reticle, takes the position after the change as the position at which the reticle is currently located, and takes the pause time as the dwell time of the reticle at that position. For example: if the trajectory of the user's head movement is a leftward turn and the pause time after the turn is 5 s, the change of position of the head-control reticle on the display interface is determined according to this leftward turn, the position at which the reticle is located after the change is determined, and the pause time of 5 s is taken as the dwell time of the reticle at that position.
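Steps 202-203 can be sketched as replaying a parsed trajectory: turn events move the reticle and reset the dwell, while pause events accumulate it. The event format and the per-turn step size are illustrative assumptions; the leftward-turn-then-5-s-pause input mirrors the example above.

```python
STEP_PX = 50  # assumed horizontal reticle displacement per turn step

def apply_trajectory(reticle_x, events):
    """Return (new reticle x position, dwell time in s) after a parsed trajectory."""
    dwell = 0.0
    for kind, value in events:
        if kind == "turn_left":
            reticle_x -= STEP_PX * value  # value: number of turn steps
            dwell = 0.0                   # movement interrupts the dwell
        elif kind == "pause":
            dwell += value                # value: pause duration, s
    return reticle_x, dwell

# The embodiment's example: a leftward turn followed by a 5 s pause.
x, dwell = apply_trajectory(960, [("turn_left", 2), ("pause", 5.0)])
```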
Step 204: determining the item on the display interface corresponding to the position at which the head-control reticle is currently located.
Step 205: determining, according to the dwell time, the interactive operation performed by the user on the item, and executing the function corresponding to the interactive operation.
In embodiments of the present invention, the head-control interaction apparatus determines the item on the display interface corresponding to the position at which the head-control reticle is currently located, determines according to the dwell time the interactive operation performed by the user on the item, and executes the function corresponding to the interactive operation. For example, if the dwell time is greater than a preset first value, it is determined that the user performs a select operation on the item; if the dwell time is less than the preset first value, it is determined that the user does not perform a select operation on the item.
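The threshold rule of step 205 reduces to a single comparison. The sketch below shows it with an assumed threshold of 1.5 s (the patent only speaks of a "preset first value" and does not fix a number).

```python
SELECT_THRESHOLD_S = 1.5  # assumed "first preset value"

def classify_dwell(dwell_time_s: float) -> str:
    """Return the interactive operation recognized for a given dwell time."""
    if dwell_time_s > SELECT_THRESHOLD_S:
        return "select"  # the user selects the item under the reticle
    return "no-op"       # dwell too short: no select operation performed

print(classify_dwell(2.0))
print(classify_dwell(0.5))
```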
The dwell time can serve as a measure of "degree": for example, the longer the dwell time, the greater the probability that the item corresponding to the head-control reticle is selected. Moreover, so that the user can perceive the dwell time of the reticle at its current position, the dwell time can also be represented by an animation effect, so that the user can subjectively experience the interaction with the display interface.
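The animation-effect idea above amounts to exposing the current dwell as a progress value that the interface can render, for example as a ring filling around the reticle. The percentage representation and the 1.5 s threshold in this sketch are assumptions.

```python
SELECT_THRESHOLD_S = 1.5  # assumed selection threshold, s

def dwell_progress(dwell_time_s: float) -> int:
    """Percentage of the selection threshold reached, capped at 100."""
    return min(100, int(dwell_time_s / SELECT_THRESHOLD_S * 100))
```

Rendering this value each frame lets the user see how close the current dwell is to triggering a selection, which is the feedback the paragraph above describes.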
In embodiments of the present invention, when the virtual reality system is in the head-control state, the head-control interaction apparatus obtains the head motion data of the user in real time, parses the head motion data to determine the trajectory of the user's head movement and the pause time in the trajectory, uses the trajectory to control the change of position of the head-control reticle, takes the position after the change as the position at which the reticle is currently located, takes the pause time as the dwell time of the reticle at that position, determines the item on the display interface corresponding to the reticle's current position, determines according to the dwell time the interactive operation performed by the user on the item, and executes the function corresponding to the interactive operation. The head-control reticle can thus be controlled based on the head motion data of the user to realize interaction with the display interface, i.e. head control, effectively improving the user experience.
Referring to Fig. 3, Fig. 3 is a schematic structural diagram of the head-control interaction apparatus based on a virtual reality system provided by the third embodiment of the invention, which includes an acquisition module 301, a determination module 302, and a recognition module 303. Each functional module is described in detail as follows:
The acquisition module 301 is configured to obtain the head motion data of a user in real time when the virtual reality system is in a head-control state.
In embodiments of the present invention, the virtual reality system comprises a virtual reality helmet or virtual reality glasses. Both comprise a display, and the display is able to present a virtual environment to the user, so that the user can have an immersive experience.
In embodiments of the present invention, the user can interact with the aforementioned display in the virtual reality system by means of head control ("head control" for short), so as to control what the display shows.
In embodiments of the present invention, the head-control interaction apparatus is a device in the virtual reality system; for example, it can be a PC, a notebook computer, a tablet computer, a mobile terminal, or the like.
When the virtual reality system is in the head-control state, the acquisition module 301 obtains the head motion data of the user in real time.
In embodiments of the present invention, the acquisition module 301 can specifically obtain in real time the head motion data of the user sensed by a head motion sensor.
Specifically, the virtual reality helmet or virtual reality glasses in the virtual reality system are provided with a head motion sensor, which can sense the motion of the user's head, so that when the user's head movement changes, the head motion sensor senses the head motion data of the user in real time and transmits the sensed head motion data to the acquisition module 301 in the head-control interaction apparatus.
The motion of the user's head includes rotation actions and small-amplitude deflection actions of the head.
The head motion sensor can be a gyroscope, which can be used to locate the head movement of the user in 360-degree space and can sense rotation about the X, Y, and Z axes. Rotation about the Z axis refers to rotation of the user's head about the vertical axis, i.e. actions that produce a change in rotation angle, such as left-right head turns performed while the user is moving or essentially stationary, including left-right head turns while the user stands upright or sits with head and body undergoing no, or only very small, displacement. Rotation about the X, Y, and Z axes refers, beyond rotation about the Z axis, also to movement about the axes perpendicular to the upright direction of the user's head, i.e. any head action producing a rotation angle, a deflection angle, or a pitch angle, such as head turns, nodding, tilting the head back, and head-shaking actions, again including such actions performed while the user stands upright or sits with head and body undergoing no, or only very small, displacement. The gyroscope can sense the above actions of the user's head and generate the head motion data.
Alternatively, the acquisition module 301 can obtain the head motion data of the user captured by a camera. Specifically, the virtual reality system is provided with a camera that can capture images of the user's head in real time; after capturing the head motion data (image data) of the user, the camera transmits the head motion data to the head-control interaction apparatus.
The camera is preferably an infrared camera, accompanied by an infrared emitter that works with it. In use, the infrared camera and the infrared emitter face the user, so that the range of motion of the user's head falls within the shooting range of the infrared camera. The infrared emitter projects infrared light uniformly within the shooting range, and the infrared camera records the speckle data within that range, i.e. the head motion data of the user, and transmits the obtained head motion data to the head-control interaction apparatus.
When the uniform infrared light continuously emitted by the infrared emitter into the shooting range strikes a rough object (such as the surface of a human body or of clothing), it forms random reflected spots, i.e. speckle. Speckle is highly random, and its pattern changes with distance: the speckle at any two points in space forms different patterns, which is equivalent to marking the entire space. Thus, whenever an object enters the shooting range and moves, its recorded position can be determined. Based on this principle, the data obtained by the infrared camera can determine the head movement of the user, thereby achieving the acquisition of the user's head motion data.
Determining module 302, configured to determine, according to the head movement data, the position at which the head control foresight shown on the display interface is currently located and the stay time of the head control foresight at that position;
In the virtual reality system, when the user's head moves, the scene elements in the user's field of view do not move along with the view, whereas the head control foresight always remains at the center of the user's field of view. Consequently, when the user's head moves, the head control foresight and the scene elements move relative to one another, so the determining module 302 in the head control interaction apparatus can determine, based on the user's head movement data, the position at which the head control foresight shown on the display interface is currently located and the stay time of the head control foresight at that position.
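A minimal sketch of what the determining module 302 computes, under assumed conventions: head yaw/pitch map linearly to interface coordinates, and the stay time accumulates while the foresight remains within a small tolerance of its previous position. The scale factor and tolerance values are invented for illustration.

```python
class ForesightTracker:
    """Track the head control foresight's interface position and its
    stay time at that position, from successive head angles."""

    def __init__(self, px_per_rad=500.0, tolerance=10.0):
        self.px_per_rad = px_per_rad  # assumed angle-to-pixel calibration
        self.tolerance = tolerance    # pixel jitter still counted as "staying"
        self.position = (0.0, 0.0)
        self.stay_time = 0.0

    def update(self, yaw, pitch, dt):
        """Feed one head-angle sample (radians) taken dt seconds
        after the previous one; returns (position, stay_time)."""
        x = yaw * self.px_per_rad
        y = -pitch * self.px_per_rad
        px, py = self.position
        if abs(x - px) <= self.tolerance and abs(y - py) <= self.tolerance:
            self.stay_time += dt   # foresight is dwelling at this position
        else:
            self.stay_time = 0.0   # foresight moved to a new position
        self.position = (x, y)
        return self.position, self.stay_time
```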
Identification module 303, configured to identify, according to the position at which the head control foresight is currently located and the stay time, the interactive operation performed by the user, and to execute the function corresponding to that interactive operation.
In this embodiment of the present invention, after the position at which the head control foresight is currently located and the stay time have been determined, the identification module 303 identifies, from that position and stay time, the interactive operation performed by the user, and executes the function corresponding to this interactive operation. For example, if the interactive operation is a "like" operation, the item targeted by the operation is liked; if the interactive operation is the launch operation of an application, the selected application is launched.
Here, the user performs interactive operations in the head control state, i.e. achieves head control, by controlling the position of the head control foresight and its stay time at that position.
In this embodiment of the present invention, while the virtual reality system is in the head control state, the acquisition module 301 in the head control interaction apparatus obtains the user's head movement data in real time; the determining module 302 determines from this head movement data the position at which the head control foresight shown on the display interface is currently located and the stay time of the head control foresight at that position; and the identification module 303 identifies, from this position and stay time, the interactive operation performed by the user and executes the function corresponding to this interactive operation. Head control interaction is thereby realized effectively, improving the user's experience.
Refer to Fig. 4, a structural diagram of a head control interaction apparatus based on a virtual reality system in a fourth embodiment of the present invention. The apparatus includes the acquisition module 301, determining module 302 and identification module 303 of the third embodiment shown in Fig. 3; content similar to that described for the third embodiment is not repeated here.
In this embodiment of the present invention, the determining module 302 includes:
Parsing determining module 401, configured to parse the head movement data and determine the track of the user's head movement and the pause times in that track;
Change determining module 402, configured to use the track of the user's head movement to control changes in the position of the head control foresight, to take the position after such a change as the position at which the head control foresight is currently located, and to take the pause time as the stay time of the head control foresight at that position.
In this embodiment of the present invention, the parsing determining module 401 in the determining module 302 parses the user's head movement data and determines the track of the user's head movement and the pause times in that track; the change determining module 402 then uses the track of the user's head movement to control the change in the position of the head control foresight, takes the changed position as the position at which the head control foresight is currently located, and takes the pause time as the stay time of the head control foresight at that position. For example, if the track of the user's head movement is a turn to the left and the pause time after the turn is 5 s, the change in the position of the head control foresight on the display interface is determined from the leftward turn, the changed position is taken as the position at which the head control foresight is currently located, and the pause time of 5 s is taken as the stay time of the head control foresight at that position.
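One way the pause-detection step of module 401 could look, assuming the head movement data arrives as timestamped samples of the foresight track; the displacement threshold below is an invented value, not one from the patent.

```python
def trailing_pause(track, eps=2.0):
    """Duration of the pause at the end of a foresight track.

    track: list of (t, x, y) samples ordered by time t (seconds)
    eps:   maximum displacement (same units as x, y) still counted
           as "paused"
    Returns how long the position has stayed within eps of the
    final sample.
    """
    if not track:
        return 0.0
    t_end, x_end, y_end = track[-1]
    pause_start = t_end
    # walk backwards until the position leaves the eps neighbourhood
    for t, x, y in reversed(track[:-1]):
        if abs(x - x_end) > eps or abs(y - y_end) > eps:
            break
        pause_start = t
    return t_end - pause_start
```

In the 5 s example above, the samples after the leftward turn would all lie within `eps` of the final position, so the function would report a pause of 5 s.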
In this embodiment of the present invention, the identification module 303 includes:
Item determining module 403, configured to determine the item on the display interface corresponding to the position at which the head control foresight is currently located;
Operation determining module 404, configured to determine, according to the stay time, the interactive operation performed by the user on that item.
Specifically, the operation determining module 404 is configured to:
determine that the user performs a select operation on the item if the stay time is greater than a preset first value; and
determine that the user does not perform a select operation on the item if the stay time is less than the preset first value.
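The rule implemented by the operation determining module 404 reduces to a comparison against the preset first value; the 2-second default below is an assumption for illustration, not a value given in the patent.

```python
def decide_operation(stay_time, first_value=2.0):
    """Return 'select' if the stay time exceeds the preset first
    value, else None (no select operation is performed)."""
    return "select" if stay_time > first_value else None
```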
In this embodiment of the present invention, the acquisition module 301 is specifically configured to obtain in real time the user's head movement data sensed by a head movement sensor, or to obtain in real time the user's head movement data captured by a camera.
In this embodiment of the present invention, the item determining module 403 determines, from the position at which the head control foresight is currently located, the item corresponding to that position on the display interface, and the operation determining module 404 determines, according to the stay time, the interactive operation performed by the user on that item and executes the function corresponding to the interactive operation. For example, if the stay time is greater than the preset first value, it is determined that the user performs a select operation on the item; if the stay time is less than the preset first value, it is determined that the user does not perform a select operation on the item.
The stay time can also serve as a measure of "degree": the longer the stay time, the higher the probability that the item corresponding to the head control foresight will be selected. So that the user can perceive the stay time of the head control foresight at its current position, this stay time may also be represented by an animation effect, letting the user subjectively experience the interaction with the display interface.
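Such an animation effect can be driven by a simple progress fraction, e.g. a ring around the foresight that fills as the dwell approaches the selection threshold. The function and its default threshold below are illustrative assumptions only.

```python
def dwell_progress(stay_time, first_value=2.0):
    """Fraction in [0, 1] of the selection threshold reached,
    suitable for driving a filling-ring animation around the
    head control foresight."""
    if first_value <= 0:
        return 1.0  # degenerate threshold: treat as already complete
    return min(stay_time / first_value, 1.0)
```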
In this embodiment of the present invention, when the virtual reality system is in the head control state, the acquisition module 301 in the head control interaction apparatus obtains the user's head movement data in real time; the parsing determining module 401 parses this head movement data and determines the track of the user's head movement and the pause times in that track; the change determining module 402 uses the track of the user's head movement to control the change in the position of the head control foresight, takes the changed position as the position at which the head control foresight is currently located, and takes the pause time as the stay time of the head control foresight at that position; the item determining module 403 determines the item on the display interface corresponding to the position at which the head control foresight is currently located; and the operation determining module 404 determines, according to the stay time, the interactive operation performed by the user on that item and executes the function corresponding to this interactive operation. The head control foresight can thus be controlled from the user's head movement data so as to interact with the display interface, i.e. head control is realized, effectively improving the user's experience.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into modules is only a division by logical function, and other divisions are possible in an actual implementation; multiple modules or components may be combined or integrated into another system, and some features may be omitted or not executed. Furthermore, the couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, apparatuses or modules, and may be electrical, mechanical or of other forms.
The modules described as separate components may or may not be physically separate, and the components shown as modules may or may not be physical modules; they may be located in one place or distributed over multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional modules in the embodiments of the present invention may be integrated into one processing module, may each exist physically on their own, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module.
If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to execute all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a portable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
It should be noted that, for brevity, the foregoing method embodiments are all described as a series of combined actions; however, those skilled in the art should know that the present invention is not limited by the described order of actions, because according to the present invention some steps may be performed in another order or simultaneously. Those skilled in the art should also know that the embodiments described in this specification are preferred embodiments, and that the actions and modules involved are not necessarily all required by the present invention.
Each of the above embodiments has its own emphasis in description; for parts not detailed in one embodiment, refer to the related descriptions of the other embodiments.
The above is a description of a head control interaction method and apparatus based on a virtual reality system provided by the present invention. Those skilled in the art may, following the ideas of the embodiments of the present invention, make changes in the specific implementation and the scope of application; in summary, the content of this specification should not be construed as limiting the present invention.
Claims (10)
1. A head control interaction method based on a virtual reality system, characterised in that the method comprises:
when the virtual reality system is in a head control state, obtaining head movement data of a user in real time;
determining, according to the head movement data, a position at which a head control foresight shown on a display interface is currently located and a stay time of the head control foresight at the position;
identifying, according to the position at which the head control foresight is currently located and the stay time, an interactive operation performed by the user, and executing a function corresponding to the interactive operation.
2. The head control interaction method according to claim 1, characterised in that the step of determining, according to the head movement data, the position at which the head control foresight shown on the display interface is currently located and the stay time of the head control foresight at the position comprises:
parsing the head movement data to determine a track of the user's head movement and a pause time in the track;
using the track of the user's head movement to control a change in the position of the head control foresight, taking the changed position of the head control foresight as the position at which the head control foresight is currently located, and taking the pause time as the stay time of the head control foresight at the position.
3. The head control interaction method according to claim 1, characterised in that the step of identifying, according to the position at which the head control foresight is currently located and the stay time, the interactive operation performed by the user comprises:
determining an item on the display interface corresponding to the position at which the head control foresight is currently located;
determining, according to the stay time, the interactive operation performed by the user on the item.
4. The head control interaction method according to claim 3, characterised in that the step of determining, according to the stay time, the interactive operation performed by the user on the item comprises:
if the stay time is greater than a preset first value, determining that the user performs a select operation on the item;
if the stay time is less than the preset first value, determining that the user does not perform a select operation on the item.
5. The method according to any one of claims 1 to 4, characterised in that the step of obtaining the head movement data of the user in real time comprises:
obtaining in real time the head movement data of the user sensed by a head movement sensor;
or,
obtaining in real time the head movement data of the user captured by a camera.
6. A head control interaction apparatus based on a virtual reality system, characterised in that the apparatus comprises:
an acquisition module, configured to obtain head movement data of a user in real time when the virtual reality system is in a head control state;
a determining module, configured to determine, according to the head movement data, a position at which a head control foresight shown on a display interface is currently located and a stay time of the head control foresight at the position;
an identification module, configured to identify, according to the position at which the head control foresight is currently located and the stay time, an interactive operation performed by the user, and to execute a function corresponding to the interactive operation.
7. The head control interaction apparatus according to claim 6, characterised in that the determining module comprises:
a parsing determining module, configured to parse the head movement data and determine a track of the user's head movement and a pause time in the track;
a change determining module, configured to use the track of the user's head movement to control a change in the position of the head control foresight, to take the changed position of the head control foresight as the position at which the head control foresight is currently located, and to take the pause time as the stay time of the head control foresight at the position.
8. The head control interaction apparatus according to claim 6, characterised in that the identification module comprises:
an item determining module, configured to determine an item on the display interface corresponding to the position at which the head control foresight is currently located;
an operation determining module, configured to determine, according to the stay time, the interactive operation performed by the user on the item.
9. The head control interaction apparatus according to claim 8, characterised in that the operation determining module is specifically configured to:
determine that the user performs a select operation on the item if the stay time is greater than a preset first value;
determine that the user does not perform a select operation on the item if the stay time is less than the preset first value.
10. The apparatus according to any one of claims 6 to 9, characterised in that the acquisition module is specifically configured to: obtain in real time the head movement data of the user sensed by a head movement sensor; or obtain in real time the head movement data of the user captured by a camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610159807.9A CN105867613A (en) | 2016-03-21 | 2016-03-21 | Head control interaction method and apparatus based on virtual reality system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105867613A true CN105867613A (en) | 2016-08-17 |
Family
ID=56624707
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610159807.9A Pending CN105867613A (en) | 2016-03-21 | 2016-03-21 | Head control interaction method and apparatus based on virtual reality system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105867613A (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106371612A (en) * | 2016-10-11 | 2017-02-01 | 惠州Tcl移动通信有限公司 | Virtual reality glasses and menu control method |
CN106445142A (en) * | 2016-09-27 | 2017-02-22 | 网易(杭州)网络有限公司 | Interaction method, system and terminal equipment in virtual reality |
CN106681506A (en) * | 2016-12-26 | 2017-05-17 | 惠州Tcl移动通信有限公司 | Interaction method of non-VR application in terminal equipment and terminal equipment |
CN106873767A (en) * | 2016-12-30 | 2017-06-20 | 深圳超多维科技有限公司 | The progress control method and device of a kind of virtual reality applications |
CN107454947A (en) * | 2016-09-26 | 2017-12-08 | 深圳市大疆创新科技有限公司 | Unmanned aerial vehicle (UAV) control method, wear-type show glasses and system |
WO2018068673A1 (en) * | 2016-10-11 | 2018-04-19 | 传线网络科技(上海)有限公司 | Interaction control method and device based on virtual reality |
CN108012195A (en) * | 2016-11-01 | 2018-05-08 | 北京星辰美豆文化传播有限公司 | A kind of live broadcasting method, device and its electronic equipment |
CN108926828A (en) * | 2017-05-24 | 2018-12-04 | 胡敏君 | Team sport Virtual Reality Training System |
WO2021168922A1 (en) * | 2020-02-25 | 2021-09-02 | 上海唯二网络科技有限公司 | Method and apparatus for achieving man-machine interaction using head-mounted vr display device |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102789313A (en) * | 2012-03-19 | 2012-11-21 | 乾行讯科(北京)科技有限公司 | User interaction system and method |
CN102968245A (en) * | 2012-10-29 | 2013-03-13 | Tcl集团股份有限公司 | Method and device for cooperatively controlling mouse touch and method and system for smart television interaction |
US20140002351A1 (en) * | 2012-07-02 | 2014-01-02 | Sony Computer Entertainment Inc. | Methods and systems for interaction with an expanded information space |
CN105046283A (en) * | 2015-08-31 | 2015-11-11 | 宇龙计算机通信科技(深圳)有限公司 | Terminal operation method and terminal operation device |
CN105301778A (en) * | 2015-12-08 | 2016-02-03 | 北京小鸟看看科技有限公司 | Three-dimensional control device, head-mounted device and three-dimensional control method |