CN113144592B - Interaction method of VR equipment and mobile equipment - Google Patents

Interaction method of VR equipment and mobile equipment

Info

Publication number
CN113144592B
CN113144592B (application CN202110237149.1A)
Authority
CN
China
Prior art keywords
equipment
mobile device
data
mobile
calibration data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110237149.1A
Other languages
Chinese (zh)
Other versions
CN113144592A (en)
Inventor
吕淼 (Lyu Miao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing IQIYI Intelligent Technology Co Ltd
Original Assignee
Nanjing IQIYI Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing IQIYI Intelligent Technology Co Ltd
Priority to CN202110237149.1A
Publication of CN113144592A
Application granted
Publication of CN113144592B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454: Digital output to display device involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80: Features of games specially adapted for executing a specific type of game
    • A63F2300/8082: Virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides an interaction method for a VR device and a mobile device. A mapping relationship between the data types of the VR device and the data types of the mobile device is calibrated on a server. The server acquires key data from the VR device and the mobile device in real time, converts the key data of the VR device into first calibration data that the mobile device can identify according to the mapping relationship, converts the key data of the mobile device into second calibration data that the VR device can identify, and sends the calibration data to the mobile device and the VR device respectively so that each can update and synchronize its picture in real time. By using the server for unified data acquisition and conversion, the invention not only supports entertainment and interaction between a VR device and various mobile devices, overcoming the limitation that different devices have different interaction modes so that users can interact across multiple terminals in a familiar way, but also preserves the characteristic VR experience and can support more users across different game modes.

Description

Interaction method of VR equipment and mobile equipment
Technical Field
The invention relates to the technical field of computer application, in particular to an interaction method of VR equipment and mobile equipment.
Background
VR (Virtual Reality) technology is an important branch of simulation technology. Through VR, a virtual environment can be created in which a three-dimensional dynamic visual scene interacts with physical behaviors, immersing the user and giving a feeling of being inside the virtual world. VR technology is widely used across industries such as medicine, entertainment, and education; for example, in the repair of large machinery, the assembled structure of the machinery can be shown through a VR device.
An important part of VR technology is the implementation of VR interaction, that is, interaction between the user and the simulated environment. At present, VR interaction is mainly realized as follows: the user sends operation instructions to the VR device through a handheld key controller, thereby interacting with the simulated environment.
With the popularization of VR devices and upgrades to their hardware, VR content has become increasingly diverse and users demand richer entertainment from VR devices. However, most content is still single-player, local content; compared with most current entertainment games, it lacks real-time online multiplayer networking, so users miss out on multi-user cooperation, competition, and real-time interaction when experiencing VR content.
Through VR, users can enjoy an immersive visual experience. However, because each VR user generates a large amount of real-time data, real-time multi-user interaction in a VR environment has long been a difficult development challenge: on the one hand, network data must be exchanged in real time; on the other hand, each party's state must be matched and displayed in real time through the users' virtual avatars in the VR environment. At present, VR devices can only interconnect with other VR devices in real time, supporting interaction among only a small number of people in the same space.
Disclosure of Invention
The invention aims to provide an interaction method for a VR device and a mobile device that enables the two to interact with each other and breaks the limitation on the number of participants.
In order to achieve the above object, the present invention provides an interaction method between a VR device and a mobile device, including:
calibrating a mapping relation between the data type of the VR equipment and the data type of the mobile equipment on a server;
the server acquires key data of the VR equipment and the mobile equipment in real time;
the server converts the key data of the VR equipment into first calibration data which can be identified by the mobile equipment according to the mapping relation, converts the key data of the mobile equipment into second calibration data which can be identified by the VR equipment according to the mapping relation, and sends the first calibration data and the second calibration data to the mobile equipment and the VR equipment respectively; and
the mobile device and the VR device update and synchronize their pictures in real time according to the first calibration data and the second calibration data, respectively.
Optionally, the key data of the VR device includes: one or more of virtual character information data controlled by the VR device, real-time position information data of a handle of the VR device, and real-time position information data of a head display of the VR device.
Optionally, the second calibration data includes: one or more of a current location, an action, and a behavioral event of a virtual character controlled by the VR device.
Optionally, the key data of the mobile device includes: the virtual character information data controlled by the mobile device and/or the virtual joystick touch information data of the mobile device.
Optionally, the first calibration data includes: one or more of a current location, an action, and a behavioral event of the virtual character controlled by the mobile device.
Optionally, the first calibration data and the second calibration data are respectively used to characterize the expected behavior results of the virtual characters controlled by the VR device and the mobile device.
Optionally, after the mobile device and the VR device update and synchronize their pictures in real time according to the first calibration data and the second calibration data, respectively, the virtual character controlled by the mobile device (in the VR device's picture) and the virtual character controlled by the VR device (in the mobile device's picture) each execute the corresponding expected behavior result.
Optionally, the mapping relationship between the data type of the VR device and the data type of the mobile device is used to characterize the conversion relationship of the interaction manner between the VR device and the mobile device.
Optionally, the conversion relation of the interaction mode between the VR device and the mobile device includes: one or more of converting the 6DOF spatial displacement of the VR device to virtual joystick movement of the mobile device, converting the gyroscope rotation of the VR device to virtual joystick view angle movement of the mobile device, converting the handle function button of the VR device to virtual joystick function button of the mobile device, and converting the handle spatial location of the VR device to virtual joystick function button of the mobile device.
In the interaction method of the VR device and the mobile device provided by the invention, a mapping relationship between the data type of the VR device and the data type of the mobile device is calibrated on the server; the server acquires key data of the VR device and the mobile device in real time, converts the key data of the VR device into first calibration data which can be identified by the mobile device according to the mapping relationship, converts the key data of the mobile device into second calibration data which can be identified by the VR device, and sends the first and second calibration data to the mobile device and the VR device respectively for updating and synchronizing their pictures in real time. By using the server for unified data acquisition and conversion, the invention not only supports entertainment and interaction between a VR device and various mobile devices, overcoming the limitation that different devices have different interaction modes so that users can interact across multiple terminals in a familiar way, but also preserves the characteristic VR experience and can support more users across different game modes.
Drawings
Fig. 1 is a flowchart of an interaction method between VR device and mobile device provided in an embodiment of the present invention;
fig. 2 is a block diagram of a data synchronization structure of a VR device and a mobile device according to an embodiment of the present invention;
fig. 3 is a block diagram of a data conversion structure between a VR device and a mobile device according to an embodiment of the present invention.
Detailed Description
Specific embodiments of the present invention are described in more detail below with reference to the drawings. The advantages and features of the invention will become more apparent from the following description. It should be noted that the drawings are greatly simplified and not drawn to precise scale; they are intended merely to aid in describing the embodiments of the invention conveniently and clearly.
Fig. 1 is a flowchart of an interaction method between VR device and mobile device provided in this embodiment. As shown in fig. 1, this embodiment provides an interaction method between a VR device and a mobile device, including:
step S1: calibrating a mapping relation between the data type of the VR equipment and the data type of the mobile equipment on a server;
step S2: the server acquires key data of the VR equipment and the mobile equipment in real time;
step S3: the server converts the key data of the VR equipment into first calibration data which can be identified by the mobile equipment according to the mapping relation, converts the key data of the mobile equipment into second calibration data which can be identified by the VR equipment according to the mapping relation, and sends the first calibration data and the second calibration data to the mobile equipment and the VR equipment respectively; and
step S4: the mobile device and the VR device update and synchronize their pictures in real time according to the first calibration data and the second calibration data, respectively.
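The four steps can be sketched as a minimal server-side loop. This is an illustrative reconstruction under stated assumptions, not the patent's implementation; all function names and mapping keys (`calibrate_mapping`, `server_tick`, `"6dof_displacement"`, and so on) are invented for the example:

```python
# Sketch of steps S1-S4. A server calibrates a mapping between VR and
# mobile data types, collects key data from both sides, converts each
# side's data into calibration data the other side can identify, and
# returns it for real-time picture synchronization on the clients.
# All names are illustrative assumptions.

def calibrate_mapping():
    # Step S1: calibrate the VR-type -> mobile-type mapping.
    return {
        "6dof_displacement": "joystick_move",
        "gyro_rotation": "joystick_view",
        "handle_button": "joystick_button",
    }

def server_tick(mapping, vr_key_data, mobile_key_data):
    # Steps S2-S3: for one real-time tick, convert the VR key data into
    # first calibration data (for the mobile device) and the mobile key
    # data into second calibration data (for the VR device).
    first_calibration = {mapping[k]: v for k, v in vr_key_data.items()}
    inverse = {v: k for k, v in mapping.items()}
    second_calibration = {inverse[k]: v for k, v in mobile_key_data.items()}
    # Step S4 happens on the clients, which update and synchronize
    # their pictures from the calibration data they receive.
    return first_calibration, second_calibration
```

In this sketch the conversion is a simple key rename; the patent's conversion also rearranges and matches the data onto the same characters, which is omitted here.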
Fig. 2 is a block diagram of the data synchronization structure of the VR device and the mobile device provided in this embodiment, and fig. 3 is a block diagram of their data conversion structure. The interaction method provided in this embodiment is described in detail below with reference to figs. 2 and 3.
Referring to figs. 2 and 3, step S1 is performed first: a mapping relationship between the data type of the VR device and the data type of the mobile device is calibrated on the server. Specifically, the VR device and the mobile device have completely different interaction modes. In the same virtual space, for example, controlling a character through the 6DOF spatial displacement and view-angle changes generated by a VR device differs entirely from controlling a character through a mobile device's virtual joystick, yet the resulting interaction behavior can be kept consistent. When a game is played, the VR device and the mobile device control virtual characters through different input modes; to synchronize their pictures, the key data must be converted and matched onto the same characters.
The mapping relationship between the data type of the VR device and the data type of the mobile device is used to characterize the conversion relationship of the interaction mode between the VR device and the mobile device. For example, the conversion relationship of the interaction manner between the VR device and the mobile device includes: one or more of converting the 6DOF spatial displacement of the VR device to virtual joystick movement of the mobile device, converting the gyroscope rotation of the VR device to virtual joystick view angle movement of the mobile device, converting the handle function button of the VR device to virtual joystick function button of the mobile device, and converting the handle spatial location of the VR device to virtual joystick function button of the mobile device.
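One of the listed conversions, 6DOF spatial displacement to virtual joystick movement, could look like the following sketch. The projection-and-clamp rule is an assumption made for illustration; the patent does not specify a numeric conversion:

```python
def displacement_to_joystick(dx, dy, dz, max_range=1.0):
    # Map a VR 6DOF spatial displacement (x, y, z) onto a 2D virtual
    # joystick vector with each axis clamped to [-1, 1].
    # The vertical component dy is discarded: walking forward in VR
    # becomes pushing the virtual joystick "up" on the mobile side.
    jx = max(-1.0, min(1.0, dx / max_range))
    jy = max(-1.0, min(1.0, dz / max_range))
    return jx, jy
```

With `max_range=1.0`, a half-metre sidestep maps to a half-deflected joystick, and any displacement beyond one metre saturates the axis.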
Next, step S2 is executed: the server acquires the key data of the VR device and the mobile device in real time. Specifically, while the VR device and the mobile device play a game, the server actively obtains, in real time, the key data input by both. For example, the key data of the VR device includes one or more of: virtual character information data controlled by the VR device, real-time position information data of the handle of the VR device, and real-time position information data of the head display of the VR device. The key data of the mobile device includes the virtual character information data controlled by the mobile device and/or the virtual joystick touch information data of the mobile device.
It should be understood that the key data of the VR device and the mobile device may be of other kinds, and may be adjusted according to different game kinds, which are not illustrated here.
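The key-data payloads described above can be modeled as simple records; the field names below are assumptions chosen to mirror the description's wording, not types defined by the patent:

```python
from dataclasses import dataclass

@dataclass
class VRKeyData:
    # Key data collected from the VR side, per the description.
    character_info: dict     # virtual character information data
    handle_position: tuple   # real-time handle position
    headset_position: tuple  # real-time head-display position

@dataclass
class MobileKeyData:
    # Key data collected from the mobile side.
    character_info: dict     # virtual character information data
    joystick_touch: tuple    # virtual-joystick touch information
```

As the description notes, other kinds of key data could be added per game type; these records would simply gain fields.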
Next, step S3 is performed. The server converts the key data of the VR device into first calibration data which can be identified by the mobile device according to the mapping relation, wherein the first calibration data comprises one or more of the current position, action and behavior event of the virtual character controlled by the mobile device. The server further converts key data of the mobile device into second calibration data which can be identified by the VR device according to the mapping relationship, wherein the second calibration data comprises: one or more of a current location, an action, and a behavioral event of a virtual character controlled by the VR device. It should be understood that when the server converts the key data of the VR device into the first calibration data and converts the key data of the mobile device into the second calibration data, it is necessary to perform integrated data arrangement and matching, which is not repeated here.
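Since each calibration record carries the controlled character's current position, action, and behavior event, building one might look like this sketch (the dictionary layout and default values are assumptions):

```python
def to_calibration(position, character_info):
    # Package key data into calibration data as the description lists
    # it: the controlled character's current position, action, and
    # behavior event. Missing fields fall back to illustrative defaults.
    return {
        "position": position,
        "action": character_info.get("action", "idle"),
        "event": character_info.get("event"),  # None if no event fired
    }
```

The same shape serves both directions: first calibration data describes the VR-controlled character for the mobile device, and second calibration data describes the mobile-controlled character for the VR device.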
Then, since the first calibration data and the second calibration data can be respectively identified by the VR device and the mobile device, the server can send the first calibration data and the second calibration data to the mobile device and the VR device in real time.
And then executing step S4, wherein the mobile device and the VR device update and synchronize the pictures in real time according to the first calibration data and the second calibration data respectively, so that picture synchronization among different clients is realized.
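On the client side, step S4 amounts to merging the received calibration data into the local scene state so that both screens show the same character state. A minimal sketch with an assumed state layout:

```python
def apply_calibration(local_state, calibration):
    # Step S4 sketch: merge the calibration data received from the
    # server into the client's local scene state; the renderer then
    # draws the next frame from the updated state.
    updated = dict(local_state)  # leave the caller's state intact
    updated["remote_character"] = calibration
    return updated
```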
It should be appreciated that, in this embodiment, the server is utilized to perform unified data acquisition and conversion, which not only can support entertainment and interaction between the VR device and various mobile devices, but also solves the limitation of different interaction modes of different devices, so that a user can perform multi-terminal interaction experience in a familiar manner, and meanwhile, the characteristic experience of VR is maintained, and more users can be supported in different game modes.
The mobile device may be a single mobile device or a plurality of mobile devices; the present invention does not limit this.
Further, the first calibration data and the second calibration data are used to characterize the expected behavior results of the virtual characters controlled by the VR device and the mobile device, respectively. After the mobile device and the VR device update and synchronize their pictures in real time according to the first calibration data and the second calibration data, the virtual character controlled by the mobile device (as shown in the VR device) and the virtual character controlled by the VR device (as shown in the mobile device) each execute the corresponding expected behavior result.
Specifically, the first calibration data and the second calibration data correspond to all control and interaction-event triggers and results of the virtual characters controlled by the VR device and the mobile device, matched to the two devices' different input hardware. They also include feedback on the state of the controlled characters, which is conventional game numerical data. In addition, interaction with physical props in the space can produce continuous physics calculations; because physics calculations inherently diverge between machines, the distributed results mainly consist of authoritative host-side data used to keep the multiple clients consistent.
In this way, after the pictures are updated and synchronized in real time, the virtual character controlled by each device executes the corresponding expected behavior result on the other device's picture, thereby realizing real-time interaction between the mobile device and the VR device.
In summary, in the interaction method between the VR device and the mobile device provided in this embodiment, a mapping relationship between the data type of the VR device and the data type of the mobile device is calibrated on a server; the server obtains key data of the VR device and the mobile device in real time, converts the key data of the VR device into first calibration data that can be identified by the mobile device according to the mapping relationship, converts the key data of the mobile device into second calibration data that can be identified by the VR device, and sends the first and second calibration data to the mobile device and the VR device respectively for updating and synchronizing their pictures in real time. By using the server for unified data acquisition and conversion, this method not only supports entertainment and interaction between a VR device and various mobile devices, overcoming the limitation that different devices have different interaction modes so that users can interact across multiple terminals in a familiar way, but also preserves the characteristic VR experience and can support more users across different game modes.
The foregoing is merely a preferred embodiment of the present invention and is not intended to limit it in any way. Any equivalent substitution or modification made by a person skilled in the art to the technical solution and technical content disclosed herein, without departing from the technical solution of the invention, still falls within the protection scope of the invention.

Claims (5)

1. A method of interaction between a VR device and a mobile device, comprising:
calibrating a mapping relation between the data type of the VR equipment and the data type of the mobile equipment on a server;
the server acquires key data of the VR equipment and the mobile equipment in real time;
the server converts the key data of the VR equipment into first calibration data which can be identified by the mobile equipment according to the mapping relation, converts the key data of the mobile equipment into second calibration data which can be identified by the VR equipment according to the mapping relation, and sends the first calibration data and the second calibration data to the mobile equipment and the VR equipment respectively; and
the mobile device and the VR device update and synchronize their pictures in real time according to the first calibration data and the second calibration data, respectively;
the key data of the VR device includes: one or more of virtual character information data controlled by the VR device, real-time position information data of a handle of the VR device, and real-time position information data of a head display of the VR device;
the second calibration data includes: one or more of a current location, an action, and a behavioral event of a virtual character controlled by the VR device;
the key data of the mobile device include: the virtual character information data controlled by the mobile device and/or the virtual joystick touch information data of the mobile device;
the first calibration data includes: one or more of a current location, an action, and a behavioral event of the virtual character controlled by the mobile device.
2. The method of claim 1, wherein the first calibration data and the second calibration data are respectively used to characterize expected behavior results of the virtual characters controlled by the VR device and the mobile device.
3. The method of interaction between a VR device and a mobile device according to claim 2, wherein after the mobile device and the VR device update and synchronize their pictures in real time according to the first calibration data and the second calibration data, respectively, the virtual character controlled by the mobile device in the VR device and the virtual character controlled by the VR device in the mobile device execute the corresponding expected behavior results, respectively.
4. The method of interaction between a VR device and a mobile device of claim 1, wherein a mapping between a data type of the VR device and a data type of the mobile device is used to characterize a translation relationship of an interaction style between the VR device and the mobile device.
5. The method of interaction between a VR device and a mobile device of claim 4, wherein the conversion relationship of interaction between the VR device and the mobile device comprises: one or more of converting the 6DOF spatial displacement of the VR device to virtual joystick movement of the mobile device, converting the gyroscope rotation of the VR device to virtual joystick view angle movement of the mobile device, converting the handle function button of the VR device to virtual joystick function button of the mobile device, and converting the handle spatial location of the VR device to virtual joystick function button of the mobile device.
CN202110237149.1A 2021-03-03 2021-03-03 Interaction method of VR equipment and mobile equipment Active CN113144592B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110237149.1A CN113144592B (en) 2021-03-03 2021-03-03 Interaction method of VR equipment and mobile equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110237149.1A CN113144592B (en) 2021-03-03 2021-03-03 Interaction method of VR equipment and mobile equipment

Publications (2)

Publication Number Publication Date
CN113144592A CN113144592A (en) 2021-07-23
CN113144592B (en) 2023-05-16

Family

ID=76884078

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110237149.1A Active CN113144592B (en) 2021-03-03 2021-03-03 Interaction method of VR equipment and mobile equipment

Country Status (1)

Country Link
CN (1) CN113144592B (en)

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9551873B2 (en) * 2014-05-30 2017-01-24 Sony Interactive Entertainment America Llc Head mounted device (HMD) system having interface with mobile computing device for rendering virtual reality content
CN105739703A (en) * 2016-02-02 2016-07-06 北方工业大学 Virtual reality somatosensory interaction system and method for wireless head-mounted display equipment
CN106445098A (en) * 2016-07-29 2017-02-22 北京小米移动软件有限公司 Control method and control apparatus used for head-mounted device, and mobile device
CN106293101A (en) * 2016-09-30 2017-01-04 陈华丰 A kind of man-machine interactive system for head-mounted display and method
CN107343192A (en) * 2017-07-20 2017-11-10 武汉市陆刻科技有限公司 A kind of 3D solids interpolation model and VR mobile terminal interaction methods and system
CN108646997A (en) * 2018-05-14 2018-10-12 刘智勇 A method of virtual and augmented reality equipment is interacted with other wireless devices
US11733824B2 (en) * 2018-06-22 2023-08-22 Apple Inc. User interaction interpreter
CN111199561B (en) * 2020-01-14 2021-05-18 上海曼恒数字技术股份有限公司 Multi-person cooperative positioning method and system for virtual reality equipment
CN111897435B (en) * 2020-08-06 2022-08-02 陈涛 Man-machine identification method, identification system, MR intelligent glasses and application

Also Published As

Publication number Publication date
CN113144592A (en) 2021-07-23

Similar Documents

Publication Publication Date Title
US11443472B2 (en) Time-dependent client inactivity indicia in a multi-user animation environment
CN100442313C (en) Three-dimensional virtual space simulator, three-dimensional virtual space simulation program, and computer readable recording medium where the program is recorded
JP5101737B2 (en) Apparatus and method for interworking between virtual reality services
CN109885367B (en) Interactive chat implementation method, device, terminal and storage medium
CN111298435B (en) Visual field control method for VR game, VR display terminal, device and medium
Handa et al. Immersive technology–uses, challenges and opportunities
CN107850948A (en) Mixed reality is social
KR102415719B1 (en) Metaverse server for displaying avatar status information for virtual counseling envuronment
JP2006201912A (en) Processing method for three-dimensional virtual object information providing service, three-dimensional virtual object providing system, and program
Joselli et al. An architecture for game interaction using mobile
Webel et al. Immersive experience of current and ancient reconstructed cultural attractions
JP2024012545A (en) Information processing system, information processing method, and program
Kim et al. Virtual world control system using sensed information and adaptation engine
KR20020073313A (en) Method and apparatus for producing avatar on terminal background screen and community communications method and system using the same, and method of performing games using avatar
CN113144592B (en) Interaction method of VR equipment and mobile equipment
KR20180005356A (en) Method and program for providing game by touch screen
CN115624740A (en) Virtual reality equipment, control method, device and system thereof, and interaction system
CN115526967A (en) Animation generation method and device for virtual model, computer equipment and storage medium
CN113457127A (en) Control method and device of virtual role, electronic equipment and storage medium
Sakamoto et al. Human interaction issues in a digital-physical hybrid world
JP6457603B1 (en) Image processing program, image processing apparatus, and image processing method
Lin et al. Space connection: a new 3D tele-immersion platform for web-based gesture-collaborative games and services
Perl Distributed Multi-User VR With Full-Body Avatars
KR20100116251A (en) Method for processing damage motion of bone animation character
KR100449808B1 (en) Immersive virtual environment system based on Internet environment, projection and stepper

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant