CN107967089A - Virtual reality interface display method - Google Patents
Virtual reality interface display method
- Publication number
- CN107967089A CN107967089A CN201711387177.1A CN201711387177A CN107967089A CN 107967089 A CN107967089 A CN 107967089A CN 201711387177 A CN201711387177 A CN 201711387177A CN 107967089 A CN107967089 A CN 107967089A
- Authority
- CN
- China
- Prior art keywords
- virtual reality
- virtual
- field angle
- keyboard
- reality interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Abstract
The invention discloses a virtual reality interface display method comprising the following steps: A, detecting whether an object is present in a preset area; B, if not, stopping display of the virtual reality interface; C, if so, measuring the distance between the object and the virtual reality glasses; D, if the distance is less than a preset distance threshold, displaying the virtual reality interface in the display area of the virtual reality glasses. With this method, the user can view content on the virtual reality interface at an optimal field angle without turning the head, so the interface is displayed adaptively according to the user's binocular field angle and the user experience is improved. The user can also interact with the virtual reality device without a peripheral handle, which strengthens the immersive virtual reality experience. In addition, when the user puts on the virtual reality glasses, the glasses can be started automatically, saving power-on time and making the virtual reality glasses more intelligent and convenient to use.
Description
Technical field
The present invention relates to the technical field of virtual reality, and specifically to a virtual reality interface display method.
Background technology
Virtual reality technology is a computer simulation technique for creating and experiencing a virtual world. A computer generates a simulated environment: an interactive system of three-dimensional dynamic scenes and entity behaviour based on multi-source information fusion, in which the user is immersed. Virtual reality combines simulation technology with computer graphics, human-machine interface technology, multimedia technology, sensing technology, network technology and other technologies, and mainly involves the simulated environment, perception, natural skills and sensing devices. In the prior art, however, when the user frequently switches the content shown in the virtual reality display area through the user interface, the head must be moved frequently, and the user must manually power on the glasses in advance before any content can be seen in them, which is cumbersome and insufficiently intelligent and convenient.
Summary of the invention
It is an object of the invention to provide a virtual reality interface display method that solves the problems raised in the background art above.
To achieve the above object, the present invention provides the following technical solution: a virtual reality interface display method comprising the following steps:
A. Detect whether an object is present in a preset area;
B. If not, stop displaying the virtual reality interface;
C. If so, measure the distance between the object and the virtual reality glasses;
D. If the distance is less than a preset distance threshold, display the virtual reality interface in the display area of the virtual reality glasses;
E. If the distance is greater than the preset distance threshold, stop displaying the virtual reality interface;
F. Determine the target horizontal binocular field angle and target vertical binocular field angle of the current user;
G. Construct a three-dimensional space in the virtual display space using the target horizontal binocular field angle and target vertical binocular field angle;
H. Acquire the to-be-identified motion parameters produced by the user in a preset space;
I. Identify the spatial interaction gesture corresponding to the to-be-identified motion parameters;
J. Determine the control instruction corresponding to the spatial interaction gesture;
K. Display the virtual reality interface corresponding to the control instruction as a curved surface in the three-dimensional space;
L. Display a virtual keyboard in the virtual reality interface, the keyboard having at least one virtual key, each virtual key corresponding one-to-one with a physical key of a physical keyboard;
M. When any physical key of the physical keyboard is detected as triggered, mark the corresponding virtual key on the virtual keyboard and display the character corresponding to that virtual key in the virtual reality interface.
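Steps A to E above reduce to a single display decision. The following is a minimal Python sketch of that decision, assuming a hypothetical presence/distance sensor; the function name `should_display` and the 50 mm threshold are illustrative stand-ins, since the patent does not fix the sensor type or the preset threshold value:

```python
# Steps A-E as one decision: show the interface only when an object is
# detected in the preset area AND it is closer than the preset threshold.
# The threshold value below is an assumption; the patent does not fix one.
DISTANCE_THRESHOLD_MM = 50.0

def should_display(object_present, distance_mm=None):
    """Return True when the VR interface should be displayed (steps A-E)."""
    if not object_present:      # step B: no object -> stop displaying
        return False
    # step C has measured distance_mm between the object and the glasses
    return distance_mm < DISTANCE_THRESHOLD_MM   # step D vs. step E
```

A wear-detection loop would call `should_display` periodically and power the display area on or off accordingly.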
Preferably, before step F, the method further comprises: ① acquiring target body part data of the current user; ② querying, from the correspondence between body part data and horizontal binocular field angle, the target horizontal binocular field angle corresponding to the target body part data; ③ querying, from the correspondence between body part data and vertical binocular field angle, the target vertical binocular field angle corresponding to the target body part data.
Preferably, the field angle in the horizontal direction corresponds to a visible area of 90 degrees, and the field angle in the vertical direction corresponds to a visible area of 60 degrees.
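Steps F and G can be sketched as a table lookup followed by a visible-area computation. In the sketch below, the table keys and the non-default angles are hypothetical; only the 90-degree horizontal and 60-degree vertical values come from the preferred embodiment above:

```python
import math

# Hypothetical correspondence tables from (bucketed) body part data to
# binocular field angles; the "average" entries use the patent's preferred
# 90-degree horizontal and 60-degree vertical field angles.
HORIZONTAL_FOV_DEG = {"narrow": 85.0, "average": 90.0, "wide": 95.0}
VERTICAL_FOV_DEG = {"narrow": 55.0, "average": 60.0, "wide": 65.0}

def viewport_half_extents(body_bucket, view_distance):
    """Half-width and half-height of the visible area at view_distance."""
    h = math.radians(HORIZONTAL_FOV_DEG[body_bucket])
    v = math.radians(VERTICAL_FOV_DEG[body_bucket])
    # half-extent of a symmetric frustum: d * tan(fov / 2)
    return (view_distance * math.tan(h / 2.0),
            view_distance * math.tan(v / 2.0))
```

The two half-extents define the rectangle the constructed three-dimensional space must cover at a given viewing distance.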
Preferably, step H comprises: ① emitting a micro electromagnetic wave signal that covers the preset space; ② acquiring the reflected signal produced by the user moving in the preset space; ③ determining the user's to-be-identified motion parameters from the micro electromagnetic wave signal and the reflected signal.
Preferably, step J comprises: determining the control instruction corresponding to the spatial interaction gesture according to a preset correspondence between spatial interaction gesture information and control instructions.
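The preset correspondence in step J is naturally a table lookup. A sketch with illustrative gesture names and instruction strings (the patent enumerates neither):

```python
# Preset correspondence between spatial interaction gestures and control
# instructions (step J). All entries are illustrative assumptions.
GESTURE_TO_INSTRUCTION = {
    "swipe_left": "previous_page",
    "swipe_right": "next_page",
    "pinch": "select",
    "open_palm": "go_home",
}

def control_instruction_for(gesture):
    """Look up the control instruction preset for a gesture (step J)."""
    return GESTURE_TO_INSTRUCTION.get(gesture)  # None if unrecognised
```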
Preferably, any two positions on the curved virtual reality interface corresponding to the control instruction are at an equal distance from the current user's eyes.
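A curved interface whose points are all equidistant from the user's eyes is a spherical patch centred on the eyes. The sketch below samples such a patch; the sampling density and the forward-axis convention are arbitrary choices, not taken from the patent:

```python
import math

def curved_interface_points(radius, h_fov_deg=90.0, v_fov_deg=60.0,
                            cols=5, rows=3):
    """Sample points on a spherical patch centred on the wearer's eyes.

    Every returned point lies exactly `radius` from the origin (the eyes),
    which is the equidistance property of the curved interface.
    """
    pts = []
    for r in range(rows):
        for c in range(cols):
            # angular offsets across the field of view, centred on +z (forward)
            yaw = math.radians((c / (cols - 1) - 0.5) * h_fov_deg)
            pitch = math.radians((r / (rows - 1) - 0.5) * v_fov_deg)
            x = radius * math.cos(pitch) * math.sin(yaw)
            y = radius * math.sin(pitch)
            z = radius * math.cos(pitch) * math.cos(yaw)
            pts.append((x, y, z))
    return pts
```

Because x² + y² + z² collapses to radius² for every sample, each interface point is the same distance from the eyes, as the clause above requires.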
Preferably, after step L, the method further comprises: if an operating body is detected over the physical keyboard, displaying on the virtual keyboard an input icon indicating the positional relationship between the operating body and the physical keyboard.
Preferably, displaying on the virtual keyboard the input icon indicating the positional relationship between the operating body and the physical keyboard comprises: ① when the operating body is detected to be a human hand, displaying a hand icon at the corresponding location on the virtual keyboard; ② detecting first position information of the operating body relative to the physical keyboard; ③ converting the first position information into second position information of the input icon relative to the virtual keyboard; ④ displaying the input icon on the virtual keyboard according to the second position information.
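Steps ② to ④ above reduce to a coordinate transform from the physical keyboard's surface to the virtual keyboard's surface. A sketch assuming a simple proportional mapping; the patent does not specify the actual transform:

```python
# Convert the operating body's position over the physical keyboard (first
# position information) into icon coordinates on the virtual keyboard
# (second position information). A pure proportional mapping is assumed.

def to_virtual_position(first_pos, physical_size, virtual_size):
    """Map an (x, y) point on the physical keyboard onto the virtual one."""
    fx, fy = first_pos
    pw, ph = physical_size
    vw, vh = virtual_size
    # same relative offset on both keyboards
    return (fx / pw * vw, fy / ph * vh)
```

The returned pair is the second position information at which the hand icon would be drawn.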
Compared with the prior art, the beneficial effects of the present invention are as follows:
With this method, the user can view content on the virtual reality interface at an optimal field angle without turning the head, so the interface is displayed adaptively according to the user's binocular field angle and the user experience is improved. The user can interact with the virtual reality device without a peripheral handle, which strengthens the immersive virtual reality experience. In addition, when the user puts on the virtual reality glasses, the glasses can be started automatically, saving power-on time and making the virtual reality glasses more intelligent and convenient to use.
Brief description of the drawings
Fig. 1 is a flow diagram of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawing. The described embodiments are obviously only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative work fall within the scope of protection of the present invention.
Referring to Fig. 1, a virtual reality interface display method comprises the following steps:
A. Detect whether an object is present in a preset area;
B. If not, stop displaying the virtual reality interface;
C. If so, measure the distance between the object and the virtual reality glasses;
D. If the distance is less than a preset distance threshold, display the virtual reality interface in the display area of the virtual reality glasses;
E. If the distance is greater than the preset distance threshold, stop displaying the virtual reality interface;
F. Determine the target horizontal binocular field angle and target vertical binocular field angle of the current user;
G. Construct a three-dimensional space in the virtual display space using the target horizontal binocular field angle and target vertical binocular field angle;
H. Acquire the to-be-identified motion parameters produced by the user in a preset space;
I. Identify the spatial interaction gesture corresponding to the to-be-identified motion parameters;
J. Determine the control instruction corresponding to the spatial interaction gesture;
K. Display the virtual reality interface corresponding to the control instruction as a curved surface in the three-dimensional space;
L. Display a virtual keyboard in the virtual reality interface, the keyboard having at least one virtual key, each virtual key corresponding one-to-one with a physical key of a physical keyboard;
M. When any physical key of the physical keyboard is detected as triggered, mark the corresponding virtual key on the virtual keyboard and display the character corresponding to that virtual key in the virtual reality interface.
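Steps L and M above can be sketched as a one-to-one key table plus a trigger handler. The key codes and characters below are illustrative assumptions:

```python
# One-to-one correspondence between physical keys and virtual keys (step L);
# the key codes and characters are illustrative, not from the patent.
VIRTUAL_KEYS = {"KEY_A": "a", "KEY_B": "b", "KEY_ENTER": "\n"}
highlighted = set()

def on_physical_key(key_code):
    """Step M: mark the paired virtual key and return its character."""
    char = VIRTUAL_KEYS[key_code]   # one-to-one lookup (step L)
    highlighted.add(key_code)       # mark the virtual key on the VR keyboard
    return char                     # character to render in the interface
```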
Before step F, the method further comprises: ① acquiring target body part data of the current user; ② querying, from the correspondence between body part data and horizontal binocular field angle, the target horizontal binocular field angle corresponding to the target body part data; ③ querying, from the correspondence between body part data and vertical binocular field angle, the target vertical binocular field angle corresponding to the target body part data.
The field angle in the horizontal direction corresponds to a visible area of 90 degrees, and the field angle in the vertical direction corresponds to a visible area of 60 degrees.
Step H comprises: ① emitting a micro electromagnetic wave signal that covers the preset space; ② acquiring the reflected signal produced by the user moving in the preset space; ③ determining the user's to-be-identified motion parameters from the micro electromagnetic wave signal and the reflected signal.
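The patent leaves the signal processing in step H unspecified. As one plausible reading, the motion parameter could be a radial velocity recovered from the Doppler shift between the emitted wave and its reflection; the sketch below is a stand-in under that assumption, not the patent's method:

```python
SPEED_OF_LIGHT = 3.0e8  # m/s

def radial_velocity(emitted_hz, reflected_hz):
    """Radial speed of the moving hand from the Doppler relation
    v = c * df / (2 * f0).

    This formula is an assumption; the patent only states that the motion
    parameter is determined from the emitted and reflected signals.
    """
    doppler_shift = reflected_hz - emitted_hz
    return SPEED_OF_LIGHT * doppler_shift / (2.0 * emitted_hz)
```

For example, with a 24 GHz carrier (an arbitrary choice), a 160 Hz shift corresponds to about 1 m/s of motion toward the sensor.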
Step J comprises: determining the control instruction corresponding to the spatial interaction gesture according to a preset correspondence between spatial interaction gesture information and control instructions.
Any two positions on the curved virtual reality interface corresponding to the control instruction are at an equal distance from the current user's eyes.
After step L, the method further comprises: if an operating body is detected over the physical keyboard, displaying on the virtual keyboard an input icon indicating the positional relationship between the operating body and the physical keyboard.
Displaying on the virtual keyboard the input icon indicating the positional relationship between the operating body and the physical keyboard comprises: ① when the operating body is detected to be a human hand, displaying a hand icon at the corresponding location on the virtual keyboard; ② detecting first position information of the operating body relative to the physical keyboard; ③ converting the first position information into second position information of the input icon relative to the virtual keyboard; ④ displaying the input icon on the virtual keyboard according to the second position information.
In use, with this method, the user can view content on the virtual reality interface at an optimal field angle without turning the head, so the interface is displayed adaptively according to the user's binocular field angle and the user experience is improved. The user can interact with the virtual reality device without a peripheral handle, which strengthens the immersive virtual reality experience. In addition, when the user puts on the virtual reality glasses, the glasses can be started automatically, saving power-on time and making the virtual reality glasses more intelligent and convenient to use.
Although embodiments of the present invention have been shown and described, those of ordinary skill in the art will understand that various changes, modifications, substitutions and variations can be made to these embodiments without departing from the principles and spirit of the present invention; the scope of the present invention is defined by the appended claims.
Claims (8)
- 1. A virtual reality interface display method, characterized in that the method comprises the following steps: A, detecting whether an object is present in a preset area; B, if not, stopping display of the virtual reality interface; C, if so, measuring the distance between the object and the virtual reality glasses; D, if the distance is less than a preset distance threshold, displaying the virtual reality interface in the display area of the virtual reality glasses; E, if the distance is greater than the preset distance threshold, stopping display of the virtual reality interface; F, determining the target horizontal binocular field angle and target vertical binocular field angle of the current user; G, constructing a three-dimensional space in the virtual display space using the target horizontal binocular field angle and target vertical binocular field angle; H, acquiring the to-be-identified motion parameters produced by the user in a preset space; I, identifying the spatial interaction gesture corresponding to the to-be-identified motion parameters; J, determining the control instruction corresponding to the spatial interaction gesture; K, displaying the virtual reality interface corresponding to the control instruction as a curved surface in the three-dimensional space; L, displaying a virtual keyboard in the virtual reality interface, the keyboard having at least one virtual key, each virtual key corresponding one-to-one with a physical key of a physical keyboard; M, when any physical key of the physical keyboard is detected as triggered, marking the corresponding virtual key on the virtual keyboard and displaying the character corresponding to that virtual key in the virtual reality interface.
- 2. The virtual reality interface display method according to claim 1, characterized in that, before step F, the method further comprises: ① acquiring target body part data of the current user; ② querying, from the correspondence between body part data and horizontal binocular field angle, the target horizontal binocular field angle corresponding to the target body part data; ③ querying, from the correspondence between body part data and vertical binocular field angle, the target vertical binocular field angle corresponding to the target body part data.
- 3. The virtual reality interface display method according to claim 1, characterized in that the field angle in the horizontal direction corresponds to a visible area of 90 degrees, and the field angle in the vertical direction corresponds to a visible area of 60 degrees.
- 4. The virtual reality interface display method according to claim 1, characterized in that step H comprises: ① emitting a micro electromagnetic wave signal that covers the preset space; ② acquiring the reflected signal produced by the user moving in the preset space; ③ determining the user's to-be-identified motion parameters from the micro electromagnetic wave signal and the reflected signal.
- 5. The virtual reality interface display method according to claim 1, characterized in that step J comprises: determining the control instruction corresponding to the spatial interaction gesture according to a preset correspondence between spatial interaction gesture information and control instructions.
- 6. The virtual reality interface display method according to claim 1, characterized in that any two positions on the curved virtual reality interface corresponding to the control instruction are at an equal distance from the current user's eyes.
- 7. The virtual reality interface display method according to claim 1, characterized in that, after step L, the method further comprises: if an operating body is detected over the physical keyboard, displaying on the virtual keyboard an input icon indicating the positional relationship between the operating body and the physical keyboard.
- 8. The virtual reality interface display method according to claim 7, characterized in that displaying on the virtual keyboard the input icon indicating the positional relationship between the operating body and the physical keyboard comprises: ① when the operating body is detected to be a human hand, displaying a hand icon at the corresponding location on the virtual keyboard; ② detecting first position information of the operating body relative to the physical keyboard; ③ converting the first position information into second position information of the input icon relative to the virtual keyboard; ④ displaying the input icon on the virtual keyboard according to the second position information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711387177.1A CN107967089A (en) | 2017-12-20 | 2017-12-20 | Virtual reality interface display method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107967089A (en) | 2018-04-27 |
Family
ID=61994638
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711387177.1A Pending CN107967089A (en) | 2017-12-20 | 2017-12-20 | A kind of virtual reality interface display methods |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107967089A (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130335318A1 (en) * | 2012-06-15 | 2013-12-19 | Cognimem Technologies, Inc. | Method and apparatus for doing hand and face gesture recognition using 3d sensors and hardware non-linear classifiers |
CN106227345A (en) * | 2016-07-21 | 2016-12-14 | 深圳市金立通信设备有限公司 | A kind of virtual reality glasses and control method thereof |
CN106569654A (en) * | 2016-10-09 | 2017-04-19 | 深圳市金立通信设备有限公司 | Virtual reality interface display method and virtual reality device |
CN106951069A (en) * | 2017-02-23 | 2017-07-14 | 深圳市金立通信设备有限公司 | The control method and virtual reality device of a kind of virtual reality interface |
CN107479715A (en) * | 2017-09-29 | 2017-12-15 | 广州云友网络科技有限公司 | The method and apparatus that virtual reality interaction is realized using gesture control |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109710075A (en) * | 2018-12-29 | 2019-05-03 | 北京诺亦腾科技有限公司 | A kind of method and device showing content in VR scene |
CN109710075B (en) * | 2018-12-29 | 2021-02-09 | 北京诺亦腾科技有限公司 | Method and device for displaying content in VR scene |
CN112748795A (en) * | 2019-10-30 | 2021-05-04 | 厦门立达信照明有限公司 | Somatosensory simulation method and system |
CN112748795B (en) * | 2019-10-30 | 2022-05-27 | 厦门立达信照明有限公司 | Somatosensory simulation method and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20180427 |