CN111580669A - Interaction method and device for virtual reality and augmented reality mobile end plane application - Google Patents


Info

Publication number
CN111580669A
CN111580669A (application CN202010398550.9A)
Authority
CN
China
Prior art keywords
data
application
key
module
touch screen
Prior art date
Legal status
Pending
Application number
CN202010398550.9A
Other languages
Chinese (zh)
Inventor
蔡小飞
张琦
曹俊
赖小松
Current Assignee
Nanjing Ruiyue Information Technology Co ltd
Original Assignee
Nanjing Ruiyue Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Nanjing Ruiyue Information Technology Co ltd
Priority to CN202010398550.9A
Publication of CN111580669A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The invention provides an interaction method and device for flat (2D) applications on virtual reality and augmented reality mobile terminals.

Description

Interaction method and device for virtual reality and augmented reality mobile end plane application
Technical Field
The invention belongs to the technical field of virtual reality, and particularly relates to an interaction method for flat (2D, or "plane") applications on virtual reality and augmented reality mobile terminals.
Background
Virtual reality technology comprehensively draws on computer graphics, photoelectric imaging, sensing, computer simulation, artificial intelligence, and other technologies, aiming to provide users with a vivid, multi-sensory virtual world through a variety of interactive devices.
In recent years, virtual reality technology has developed rapidly. Host-based virtual reality uses a high-performance PC or game console as its computing core; relying on powerful hardware, it can deliver an excellent immersive experience, but its high cost and cumbersome supporting equipment have kept it from wide adoption.
As the hardware performance of mobile virtual reality devices has continued to improve, the performance gap between mobile platforms and PCs or game consoles has gradually narrowed, and high-quality content from PC and console VR platforms is steadily being ported to mobile VR. Even so, many users find VR applications insufficiently rich, so demand for 2D content in VR is substantial; video applications in particular suit users watching in bed and can avoid the dizziness some users experience. For example, Samsung's PhoneCast application lets a user's flat applications run on the GearVR: users can run mobile games and other applications in VR, streaming video to a huge virtual screen. It creates a flat window in the VR environment in which an Android application can run, so a user plays an Android game much as if watching a movie. However, this technique is still limited: it cannot adapt to all applications, and many operations of a flat application cannot be completed.
Current virtual reality (VR) systems mainly target the operation and display of three-dimensional applications. An ordinary phone or tablet application can be installed, but the user cannot use its functions, mainly because the application picture must be rendered separately for the left and right eye screens and the user cannot perform touch-screen operations as on a phone. Although much high-quality content has been ported to VR platforms as mobile VR hardware has improved, many users still find the application content insufficient, and porting common flat applications to native VR applications is very costly. How to let virtual reality devices interact with traditional flat applications while supporting multiple interactive devices has therefore become a key technical problem.
Roughly ten system-level basic applications must be pre-installed in a virtual reality system to provide complete system functionality. At present most system-level applications are VR applications developed with Unity; they are large and memory-hungry, which makes the firmware very large. Flat applications with the same functions can replace some of these VR applications, and entry-level VR viewing devices could even use flat applications exclusively. Moreover, VR applications place higher demands on developers, and a stable VR application takes a long time to build, whereas flat-application development is relatively mature; using flat applications can save customers considerable development time and allows a VR system to be customized quickly.
Disclosure of Invention
The technical problem to be solved by the invention is to provide an interaction method and device for flat applications on virtual reality and augmented reality mobile terminals, so that various flat applications can be used more easily and naturally in a virtual reality system.
The technical solution adopted by the invention to achieve this purpose is as follows:
an interaction method for flat applications on virtual reality and augmented reality mobile terminals, comprising the following steps:
Step 1: start the flat application:
for a mouse application, start the virtual mouse module;
for a touch-screen application, start the virtual touch-screen module;
for a key application, start the key processing module;
Step 2: receive data and determine the data type:
if it is key data, go to step 3;
if it is sensor data, pass it to the sensor module to compute sensor pose data, then go to step 4;
if it is Bluetooth controller data, go to step 5;
if it is touch-pad data, go to step 6;
Step 3: determine the current application type:
if the current application is a mouse application, pass the key data to the virtual mouse module and go to step 7;
if the current application is a touch-screen application, pass the key data to the virtual touch-screen module and go to step 8;
if the current application is a key application, pass the key data to the key-value processing module and go to step 9;
Step 4: determine the current application type:
if the current application is a mouse application, pass the sensor pose data to the virtual mouse module and go to step 7;
if the current application is a touch-screen application, pass the sensor pose data to the virtual touch-screen module and go to step 8;
Step 5: determine whether the Bluetooth controller data is button data or pose data; if button data, go to step 3; if pose data, go to step 4;
Step 6: determine the current application type:
if the current application is a mouse application, pass the touch-pad data to the virtual mouse module and go to step 7;
if the current application is a touch-screen application or a key application, convert the touch gesture into key data and pass it to the key-value processing module;
Step 7: create a virtual mouse and set its movement boundary and initial position;
for sensor pose data or controller pose data, compute the virtual mouse coordinates and then send a mouse event;
for touch-pad data, key data, and controller button data, send a mouse event directly;
the mouse application responds to the mouse event;
Step 8: create a virtual touch screen and set the touch-screen boundary;
for controller pose data, compute the intersection of the controller ray with the application; for sensor pose data, compute the intersection of the gaze ray with the application; then compute the touch-screen coordinates, convert them into a touch-screen event, and send it;
convert key data, controller button data, and the output of the key-value processing module into touch-screen events and send them;
the touch-screen application responds to the touch-screen event;
Step 9: the key-value processing module sends the event, and the key application responds to it.
Further, in the interaction method for flat applications on virtual reality and augmented reality mobile terminals, the sensor data in step 2 comprises gyroscope data and/or accelerometer data, which are filtered by a filtering algorithm.
An interaction device for flat applications on virtual reality and augmented reality mobile terminals comprises a data acquisition module and a data processing module connected in sequence, with the output of the data processing module connected to the flat application;
the data acquisition module comprises physical keys, a Bluetooth controller, a touch pad, and a sensor module; the data processing module comprises a sensor processing module, a key processing module, a virtual mouse module, and a virtual touch-screen module; and the flat applications comprise mouse applications, touch-screen applications, and key applications;
the input of the sensor processing module is connected to the sensor module; the input of the key processing module is connected to the physical keys, the Bluetooth controller, and the touch pad, and its output corresponds to the key applications and touch-screen applications; the input of the virtual mouse module is connected to the physical keys, the Bluetooth controller, the touch pad, and the output of the sensor processing module, and its output corresponds to the mouse applications; the virtual touch-screen module is connected to the physical keys, the Bluetooth controller, and the output of the sensor processing module, and its output corresponds to the touch-screen applications.
Further, in the interaction device, the physical keys comprise an up key, a down key, a left key, a right key, a confirm key, and a return key.
Further, in the interaction device, the data collected by the Bluetooth controller comprises controller button data and controller pose data.
Further, in the interaction device, the sensor module comprises a filtering module and a gyroscope and/or an accelerometer, the gyroscope and the accelerometer each being connected to the filtering module.
Further, in the interaction device, the virtual mouse module may be replaced by an external mouse.
Compared with the prior art, the above technical solution has the following technical effects:
the interaction method and device enable interaction between a virtual reality mobile terminal and flat applications, so that various flat applications can be used more easily and naturally in a virtual reality system, while multiple interactive devices are supported.
Drawings
Fig. 1 is a flow chart of step 1 of the interaction method for flat applications on virtual reality and augmented reality mobile terminals according to the present invention.
Fig. 2 is a flow chart of step 3 of the interaction method.
Fig. 3 is a flow chart of step 4 of the interaction method.
Fig. 4 is a flow chart of step 5 of the interaction method.
Fig. 5 is a flow chart of step 6 of the interaction method.
Fig. 6 is a flow chart of step 7 of the interaction method.
Fig. 7 is a flow chart of step 8 of the interaction method.
Fig. 8 is a processing flow chart of the sensor processing module in the interaction method.
Fig. 9 is a schematic diagram of the interaction device for flat applications on virtual reality and augmented reality mobile terminals according to the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, where identical reference numerals denote identical or similar elements or elements with identical or similar functions throughout. The embodiments described below with reference to the drawings are illustrative; they serve to explain the present invention and are not to be construed as limiting it.
An interaction method for flat applications on virtual reality and augmented reality mobile terminals comprises the following steps:
Step 1: start the flat application, as shown in Fig. 1:
for a mouse application, start the virtual mouse module;
for a touch-screen application, start the virtual touch-screen module;
for a key application, start the key processing module;
Step 2: receive data and determine the data type:
if it is key data, go to step 3;
if it is sensor data, pass it to the sensor module to compute sensor pose data and go to step 4; the sensor data comprises gyroscope data and/or accelerometer data and is filtered by a filtering algorithm, as shown in Fig. 8;
if it is Bluetooth controller data, go to step 5;
if it is touch-pad data, go to step 6 (a minimal dispatch sketch follows this step);
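As an illustration only, the routing in steps 2 through 6 can be sketched as a single dispatch switch. This is a minimal sketch under assumed names: InputDispatcher, the enums, and the stub methods are hypothetical, not taken from the patent; Java is used because the flat applications described are Android applications.

```java
/** Hypothetical sketch of the step 2 dispatch; class and method names are
 *  illustrative assumptions, not the patent's actual implementation. */
public class InputDispatcher {

    enum DataType { KEY, SENSOR, BLUETOOTH_CONTROLLER, TOUCHPAD }
    enum AppType  { MOUSE_APP, TOUCHSCREEN_APP, KEY_APP }

    private final AppType currentApp;

    InputDispatcher(AppType currentApp) { this.currentApp = currentApp; }

    /** Step 2: receive raw data and route it by data type. */
    void onData(DataType type, byte[] payload) {
        switch (type) {
            case KEY:                       // step 3
                routeKeyData(payload);
                break;
            case SENSOR:                    // step 2, then step 4
                routePoseData(computePose(payload));
                break;
            case BLUETOOTH_CONTROLLER:      // step 5: button packet or pose packet
                if (isButtonPacket(payload)) routeKeyData(payload);
                else routePoseData(computePose(payload));
                break;
            case TOUCHPAD:                  // step 6
                routeTouchpadData(payload);
                break;
        }
    }

    /** Step 3: key data goes to the module matching the current application. */
    private void routeKeyData(byte[] keyData) {
        switch (currentApp) {
            case MOUSE_APP:       /* virtual mouse module, step 7 */        break;
            case TOUCHSCREEN_APP: /* virtual touch-screen module, step 8 */ break;
            case KEY_APP:         /* key-value processing module, step 9 */ break;
        }
    }

    /** Step 4: pose data feeds the virtual mouse or the virtual touch screen. */
    private void routePoseData(float[] pose) { /* steps 7 and 8 */ }

    private void routeTouchpadData(byte[] d) { /* step 6 */ }
    private boolean isButtonPacket(byte[] d) { return d.length > 0 && d[0] == 0; }
    private float[] computePose(byte[] d)    { return new float[3]; }
}
```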
Step 3: determine the current application type, as shown in Fig. 2:
if the current application is a mouse application, pass the key data to the virtual mouse module and go to step 7;
if the current application is a touch-screen application, pass the key data to the virtual touch-screen module and go to step 8;
if the current application is a key application, pass the key data to the key-value processing module and go to step 9;
Step 4: determine the current application type, as shown in Fig. 3:
if the current application is a mouse application, pass the sensor pose data to the virtual mouse module and go to step 7;
if the current application is a touch-screen application, pass the sensor pose data to the virtual touch-screen module and go to step 8;
Step 5: determine whether the Bluetooth controller data is button data or pose data, as shown in Fig. 4; if button data, go to step 3; if pose data, go to step 4;
Step 6: determine the current application type, as shown in Fig. 5:
if the current application is a mouse application, pass the touch-pad data to the virtual mouse module and go to step 7;
if the current application is a touch-screen application or a key application, convert the touch gesture into key data and pass it to the key-value processing module, for example as sketched below;
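One plausible gesture-to-key conversion, offered only as an assumption about how step 6 could map touch-pad swipes to Android key codes (the patent does not fix a particular mapping):

```java
import android.view.KeyEvent;

/** Hypothetical step 6 helper: classify a touch-pad swipe by its dominant
 *  axis and translate it into an Android D-pad key code. */
final class GestureToKey {
    private GestureToKey() {}

    static int swipeToKeyCode(float dx, float dy) {
        if (Math.abs(dx) >= Math.abs(dy)) {
            return dx > 0 ? KeyEvent.KEYCODE_DPAD_RIGHT
                          : KeyEvent.KEYCODE_DPAD_LEFT;
        }
        // touch coordinates grow downward, so a positive dy is a downward swipe
        return dy > 0 ? KeyEvent.KEYCODE_DPAD_DOWN
                      : KeyEvent.KEYCODE_DPAD_UP;
    }

    /** A tap (negligible movement) maps naturally to the confirm key. */
    static int tapKeyCode() { return KeyEvent.KEYCODE_DPAD_CENTER; }
}
```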
Step 7: create a virtual mouse and set its movement boundary and initial position, as shown in Fig. 6;
for sensor pose data or controller pose data, compute the virtual mouse coordinates and then send a mouse event;
for touch-pad data, key data, and controller button data, send a mouse event directly;
the mouse application responds to the mouse event. A sketch of the coordinate computation follows.
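The patent does not give the coordinate formula, so the following is only one reasonable reading of step 7: project the head or controller yaw/pitch onto the plane through a tangent mapping and clamp the result to the movement boundary. The field-of-view value is an assumed parameter.

```java
/** Hypothetical step 7 mapping from pose angles to virtual mouse pixels. */
final class VirtualMouse {
    private static final double HALF_FOV = Math.toRadians(45); // assumed value

    /** yaw/pitch in radians; the boundary is the screen size w x h in pixels. */
    static float[] poseToMouse(double yaw, double pitch, int w, int h) {
        // tangent mapping: the gaze direction pierces a plane one unit away
        double x = (0.5 + 0.5 * Math.tan(yaw)   / Math.tan(HALF_FOV)) * w;
        double y = (0.5 - 0.5 * Math.tan(pitch) / Math.tan(HALF_FOV)) * h;
        // clamp to the mouse movement boundary set when the mouse is created
        x = Math.max(0, Math.min(w - 1, x));
        y = Math.max(0, Math.min(h - 1, y));
        return new float[] { (float) x, (float) y };
    }
}
```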
Step 8: create a virtual touch screen and set the touch-screen boundary, as shown in Fig. 7;
for controller pose data, compute the intersection of the controller ray with the application plane; for sensor pose data, compute the intersection of the gaze ray with the application plane; then compute the touch-screen coordinates, convert them into a touch-screen event, and send it (see the sketch after this step);
convert key data, controller button data, and the output of the key-value processing module into touch-screen events and send them;
the touch-screen application responds to the touch-screen event;
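The ray-plane intersection in step 8 is standard geometry: for a ray with origin o and direction d, and a plane with center c and normal n, the hit parameter is t = ((c - o)·n)/(d·n) and the hit point is o + t·d. The sketch below applies this and converts the hit into pixel coordinates; all names and the plane parameterization are illustrative assumptions.

```java
/** Hypothetical step 8 helper: intersect a controller or gaze ray with the
 *  application plane and convert the hit point to touch-screen pixels.
 *  Plane: center c, unit right axis u, unit up axis v, unit normal n,
 *  half-extents hw/hh in world units; pxW/pxH is the virtual screen size. */
final class RayToTouch {
    static float[] hitToPixels(float[] o, float[] d,
                               float[] c, float[] u, float[] v, float[] n,
                               float hw, float hh, int pxW, int pxH) {
        float denom = dot(d, n);
        if (Math.abs(denom) < 1e-6f) return null;   // ray parallel to the plane
        float t = (dot(c, n) - dot(o, n)) / denom;
        if (t < 0) return null;                     // plane is behind the ray
        float[] rel = { o[0] + t * d[0] - c[0],
                        o[1] + t * d[1] - c[1],
                        o[2] + t * d[2] - c[2] };
        float lu = dot(rel, u), lv = dot(rel, v);
        if (Math.abs(lu) > hw || Math.abs(lv) > hh) return null; // outside boundary
        // map [-hw, hw] x [-hh, hh] to pixels; screen y grows downward
        float x = (0.5f + 0.5f * lu / hw) * pxW;
        float y = (0.5f - 0.5f * lv / hh) * pxH;
        return new float[] { x, y };
    }

    private static float dot(float[] a, float[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }
}
```

On Android, the resulting coordinates could then be wrapped in a MotionEvent (for example via MotionEvent.obtain with ACTION_DOWN, ACTION_MOVE, and ACTION_UP) and injected into the flat application's window; this delivery detail is likewise an assumption, as the patent only says the coordinates are "converted into a touch-screen event and sent".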
Step 9: the key-value processing module sends the event, and the key application responds to it.
An interaction device for flat applications on virtual reality and augmented reality mobile terminals is shown in Fig. 9. It comprises a data acquisition module and a data processing module connected in sequence, with the output of the data processing module connected to the flat application.
The data acquisition module comprises physical keys, a Bluetooth controller, a touch pad, and a sensor module. The physical keys comprise up, down, left, right, confirm, and return keys; the data collected by the Bluetooth controller comprises controller button data and controller pose data; and the sensor module comprises a filtering module and a gyroscope and/or an accelerometer, each connected to the filtering module (a generic filter sketch follows these paragraphs). The data processing module comprises a sensor processing module, a key processing module, a virtual mouse module, and a virtual touch-screen module, and the flat applications comprise mouse applications, touch-screen applications, and key applications.
The input of the sensor processing module is connected to the sensor module. The input of the key processing module is connected to the physical keys, the Bluetooth controller, and the touch pad, and its output corresponds to the key applications and touch-screen applications. The input of the virtual mouse module is connected to the physical keys, the Bluetooth controller, the touch pad, and the output of the sensor processing module, and its output corresponds to the mouse applications. The virtual touch-screen module is connected to the physical keys, the Bluetooth controller, and the output of the sensor processing module, and its output corresponds to the touch-screen applications.
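The description says only that gyroscope and/or accelerometer data pass through a filtering algorithm (Fig. 8) without naming one. A complementary filter is a common choice for this kind of fusion; the sketch below is that generic technique, not the patent's specific algorithm.

```java
/** Generic complementary filter fusing gyroscope rates with the gravity
 *  direction from the accelerometer; offered as one plausible filtering
 *  algorithm, since the patent does not specify one. */
final class OrientationFilter {
    private static final float ALPHA = 0.98f; // gyro weight, assumed value
    private float pitch, roll;                // radians

    /** gx/gy: gyro rates (rad/s); ax/ay/az: accel (m/s^2); dt: seconds. */
    void update(float gx, float gy, float ax, float ay, float az, float dt) {
        // short-term: integrate the gyroscope rates
        float gyroPitch = pitch + gx * dt;
        float gyroRoll  = roll  + gy * dt;
        // long-term: gravity direction (valid when the device barely accelerates)
        float accPitch = (float) Math.atan2(-ax, Math.sqrt(ay * ay + az * az));
        float accRoll  = (float) Math.atan2(ay, az);
        // blend: the gyro tracks fast motion, the accelerometer cancels drift
        pitch = ALPHA * gyroPitch + (1 - ALPHA) * accPitch;
        roll  = ALPHA * gyroRoll  + (1 - ALPHA) * accRoll;
    }

    float pitch() { return pitch; }
    float roll()  { return roll;  }
}
```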
In another embodiment, the virtual mouse module is replaced by an external mouse.
The foregoing describes only embodiments of the present invention and is not intended to limit it; any modification, equivalent substitution, or improvement made within the spirit and principles of the invention shall fall within its scope of protection.

Claims (7)

1. An interaction method for flat applications on virtual reality and augmented reality mobile terminals, characterized by comprising the following steps:
Step 1: start the flat application:
for a mouse application, start the virtual mouse module;
for a touch-screen application, start the virtual touch-screen module;
for a key application, start the key processing module;
Step 2: receive data and determine the data type:
if it is key data, go to step 3;
if it is sensor data, pass it to the sensor module to compute sensor pose data and go to step 4;
if it is Bluetooth controller data, go to step 5;
if it is touch-pad data, go to step 6;
Step 3: determine the current application type:
if the current application is a mouse application, pass the key data to the virtual mouse module and go to step 7;
if the current application is a touch-screen application, pass the key data to the virtual touch-screen module and go to step 8;
if the current application is a key application, pass the key data to the key-value processing module and go to step 9;
Step 4: determine the current application type:
if the current application is a mouse application, pass the sensor pose data to the virtual mouse module and go to step 7;
if the current application is a touch-screen application, pass the sensor pose data to the virtual touch-screen module and go to step 8;
Step 5: determine whether the Bluetooth controller data is button data or pose data; if button data, go to step 3; if pose data, go to step 4;
Step 6: determine the current application type:
if the current application is a mouse application, pass the touch-pad data to the virtual mouse module and go to step 7;
if the current application is a touch-screen application or a key application, convert the touch gesture into key data and pass it to the key-value processing module;
Step 7: create a virtual mouse and set its movement boundary and initial position;
for sensor pose data or controller pose data, compute the virtual mouse coordinates and then send a mouse event;
for touch-pad data, key data, and controller button data, send a mouse event directly;
the mouse application responds to the mouse event;
Step 8: create a virtual touch screen and set the touch-screen boundary;
for controller pose data, compute the intersection of the controller ray with the application; for sensor pose data, compute the intersection of the gaze ray with the application; then compute the touch-screen coordinates, convert them into a touch-screen event, and send it;
convert key data, controller button data, and the output of the key-value processing module into touch-screen events and send them;
the touch-screen application responds to the touch-screen event;
Step 9: the key-value processing module sends the event, and the key application responds to it.
2. The method of claim 1, wherein the sensor data in step 2 comprises gyroscope data and/or accelerometer data, which are filtered by a filtering algorithm.
3. An interaction device for flat applications on virtual reality and augmented reality mobile terminals, characterized by comprising a data acquisition module and a data processing module connected in sequence, with the output of the data processing module connected to the flat application;
the data acquisition module comprises physical keys, a Bluetooth controller, a touch pad, and a sensor module; the data processing module comprises a sensor processing module, a key processing module, a virtual mouse module, and a virtual touch-screen module; and the flat applications comprise mouse applications, touch-screen applications, and key applications;
the input of the sensor processing module is connected to the sensor module; the input of the key processing module is connected to the physical keys, the Bluetooth controller, and the touch pad, and its output corresponds to the key applications and touch-screen applications; the input of the virtual mouse module is connected to the physical keys, the Bluetooth controller, the touch pad, and the output of the sensor processing module, and its output corresponds to the mouse applications; and the virtual touch-screen module is connected to the physical keys, the Bluetooth controller, and the output of the sensor processing module, with its output corresponding to the touch-screen applications.
4. The device of claim 3, wherein the physical keys comprise an up key, a down key, a left key, a right key, a confirm key, and a return key.
5. The device of claim 3, wherein the data collected by the Bluetooth controller comprises controller button data and controller pose data.
6. The device of claim 3, wherein the sensor module comprises a filtering module and a gyroscope and/or an accelerometer, the gyroscope and the accelerometer each being connected to the filtering module.
7. The device of claim 3, wherein the virtual mouse module is replaced by an external mouse.
CN202010398550.9A 2020-05-12 2020-05-12 Interaction method and device for virtual reality and augmented reality mobile end plane application Pending CN111580669A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010398550.9A CN111580669A (en) 2020-05-12 2020-05-12 Interaction method and device for virtual reality and augmented reality mobile end plane application

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010398550.9A CN111580669A (en) 2020-05-12 2020-05-12 Interaction method and device for virtual reality and augmented reality mobile end plane application

Publications (1)

Publication Number Publication Date
CN111580669A true CN111580669A (en) 2020-08-25

Family

ID=72123065

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010398550.9A Pending CN111580669A (en) 2020-05-12 2020-05-12 Interaction method and device for virtual reality and augmented reality mobile end plane application

Country Status (1)

Country Link
CN (1) CN111580669A (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102664988A (en) * 2012-03-23 2012-09-12 中国科学院软件研究所 Three-dimensional interaction method based on intelligent mobile phone and system thereof
CN103345312A (en) * 2013-07-03 2013-10-09 张帆 System and method with intelligent terminal as host, mouse and touch panel at the same time
CN105807915A (en) * 2016-02-24 2016-07-27 北京小鸟看看科技有限公司 Control method and control device of virtual mouse, and head-mounted display equipment
CN107783674A (en) * 2016-08-27 2018-03-09 杨博 A kind of augmented reality exchange method and action induction felt pen
CN106445114A (en) * 2016-08-31 2017-02-22 华勤通讯技术有限公司 Virtual interactive device and virtual interactive system
CN106657609A (en) * 2016-11-18 2017-05-10 成都六维人科技有限公司 Virtual reality device and control device and method thereof
CN107145227A (en) * 2017-04-20 2017-09-08 腾讯科技(深圳)有限公司 The exchange method and device of virtual reality scenario
CN110832441A (en) * 2017-05-19 2020-02-21 奇跃公司 Keyboard for virtual, augmented and mixed reality display systems

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114201104A (en) * 2021-12-13 2022-03-18 杭州灵伴科技有限公司 Virtual application interface updating method, head-mounted display device assembly and medium

Similar Documents

Publication Publication Date Title
US10348795B2 (en) Interactive control management for a live interactive video game stream
US10016679B2 (en) Multiple frame distributed rendering of interactive content
EP2671148B1 (en) Apparatus, systems and methods for presenting displayed image information of a mobile media device on a large display and control of the mobile media device therefrom
US9158391B2 (en) Method and apparatus for controlling content on remote screen
KR20140147095A (en) Instantiable gesture objects
CN108885521A (en) Cross-environment is shared
CN108776544B (en) Interaction method and device in augmented reality, storage medium and electronic equipment
US20150301730A1 (en) Object Suspension Realizing Method and Device
CN103984494A (en) System and method for intuitive user interaction among multiple pieces of equipment
EP4207083A1 (en) Elastic object rendering method and apparatus, device, and storage medium
CN102109924B (en) Method of generating multi-touch signal, data transmission connecting apparatus, and control system
CN107807774A (en) The control method and split type glasses of a kind of Split type intelligent glasses
CN111580669A (en) Interaction method and device for virtual reality and augmented reality mobile end plane application
CN109718554A (en) A kind of real-time rendering method, apparatus and terminal
WO2024016924A1 (en) Video processing method and apparatus, and electronic device and storage medium
US20080082991A1 (en) Computer interface system using multiple independent hardware and virtual human-computer input devices and related enabling subroutines
CN109542218B (en) Mobile terminal, human-computer interaction system and method
WO2014059842A1 (en) Information processing system and processing method therefor
TW202328872A (en) Metaverse content modality mapping
US20140358250A1 (en) Method and device for the control of at least one appliance by at least one other appliance, appliance and system implementing such a device
CN115624740A (en) Virtual reality equipment, control method, device and system thereof, and interaction system
CN115120979A (en) Display control method and device of virtual object, storage medium and electronic device
CN112287708A (en) Near Field Communication (NFC) analog card switching method, device and equipment
CN112073812A (en) Application management method on smart television and display device
US20230377248A1 (en) Display control method and apparatus, terminal, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20200825)