CN111530089B - Multimedia VR interaction method and system - Google Patents


Info

Publication number
CN111530089B
CN111530089B (granted from application CN202010326318.4A)
Authority
CN
China
Prior art keywords
user
coordinate
virtual
information
multimedia
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010326318.4A
Other languages
Chinese (zh)
Other versions
CN111530089A (en)
Inventor
李洪涛
史浩浩
李�瑞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Langxing Digital Technology Co ltd
Original Assignee
Shenzhen Langxing Digital Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Langxing Digital Technology Co ltd filed Critical Shenzhen Langxing Digital Technology Co ltd
Priority to CN202010326318.4A priority Critical patent/CN111530089B/en
Publication of CN111530089A publication Critical patent/CN111530089A/en
Application granted granted Critical
Publication of CN111530089B publication Critical patent/CN111530089B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/85 Providing additional services to players
    • A63F 13/87 Communicating with other players during game play, e.g. by e-mail or chat
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F 13/32 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using local area network [LAN] connections
    • A63F 13/327 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using local area network [LAN] connections using wireless networks, e.g. Wi-Fi or piconet
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/70 Game security or game management aspects
    • A63F 13/79 Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L 65/1066 Session management
    • H04L 65/1069 Session establishment or de-establishment
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/131 Protocols for games, networked simulations or virtual reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/366 Image reproducers using viewer tracking
    • H04N 13/383 Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/40 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network
    • A63F 2300/404 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network characterized by a local network connection
    • A63F 2300/405 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of platform network characterized by a local network connection being a wireless ad hoc network, e.g. Bluetooth, Wi-Fi, Pico net
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/50 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F 2300/55 Details of game data or player data management
    • A63F 2300/5546 Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F 2300/8082 Virtual reality
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention discloses a multimedia VR interaction method and system comprising one or more processors, one or more servers, a plurality of cameras surrounding a virtual space, and two or more VR devices. The cameras comprise a virtual sphere, the servers store computer instructions executable by the processors, and the system is configured to track a user's gaze in a VR environment.

Description

Multimedia VR interaction method and system
Technical Field
The invention relates to a multimedia VR interaction method and system, and belongs to the technical field of virtual reality (VR).
Background
Virtual Reality (VR) is a computer simulation technology for creating and experiencing a virtual world: a computer generates an interactive three-dimensional dynamic view, and the simulation of its physical behavior immerses the user in the environment.
However, existing systems in this field do not support voice communication between users over a local area network connection, and users cannot see the facial model of the person they are communicating with.
Disclosure of Invention
In order to achieve the above object, the invention is realized by the following technical scheme: a multimedia VR interaction method and system comprising one or more processors, one or more servers, a plurality of cameras surrounding a virtual space, and two or more VR devices. The cameras comprise a virtual sphere, the servers store computer instructions executable by the processors, and the system is configured to track a user's gaze in a VR environment. When the computer-executable instructions run, the system performs at least the following operations:
1) One or more database servers are established through the server, and all VR hardware devices are connected to a local area network;
2) An environment coordinate system of the VR is generated within the virtual space;
a hotspot is generated in the virtual space, the hotspot comprising the user's own hotspot coordinates and other users' hotspot coordinates;
3) The system tracks view information received from one or more sensors in the user's VR hardware device, the view information following the direction of the user's gaze in the real world;
4) The tracked view information is mapped onto the environment coordinate system, and the user's gaze is determined based on the mapping.
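The four operations above can be sketched as follows. This is a minimal illustration only, not the patented implementation: the function names, the yaw/pitch sensor format, the hotspot positions (assumed relative to the user), and the 5-degree angular threshold are all assumptions made for the sketch.

```python
import math

def gaze_direction(yaw_deg, pitch_deg):
    """Map sensor yaw/pitch (degrees) to a unit gaze vector in the
    VR environment coordinate system (hypothetical sensor format)."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

def determine_gaze(yaw_deg, pitch_deg, hotspots, threshold_deg=5.0):
    """Steps 3) and 4): map tracked view information onto the hotspot
    coordinates and return the hotspot (if any) the gaze points at."""
    g = gaze_direction(yaw_deg, pitch_deg)
    for name, pos in hotspots.items():
        norm = math.sqrt(sum(c * c for c in pos))
        if norm == 0:
            continue
        # cosine of the angle between the gaze and the hotspot direction
        d = (pos[0] * g[0] + pos[1] * g[1] + pos[2] * g[2]) / norm
        if math.degrees(math.acos(max(-1.0, min(1.0, d)))) <= threshold_deg:
            return name
    return None
```

A real headset would supply the orientation through its own SDK; only the mapping and hit-decision logic are sketched here.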
In order to optimize the technical scheme, the further measures taken are as follows:
According to a preferred form, the VR environment coordinate system comprises an X coordinate, a Y coordinate, a D coordinate, a T1 coordinate and a T2 coordinate. The X coordinate is defined as a horizontal (latitude) value, the Y coordinate as a vertical (longitudinal) value, the D coordinate as the distance from the user to a virtual element in the virtual three-dimensional environment, the T1 coordinate as virtual time, and the T2 coordinate as real time. The computer instructions are executed by the processor to generate, within the virtual space, a user coordinate system and a user name; the user coordinate system is associated with a virtual sphere at the position of a camera surrounding the virtual space. The processor calculates X1, Y1 and D1 coordinates to generate the user coordinate system, and the user name corresponds to the VR hardware device.
According to a preferred mode, the database server stores user registration information, the VR hardware device comprises VR glasses, and the VR glasses are provided with switch buttons.
According to a preferred form, the user coordinate system comprises a user position relative to a sphere in three-dimensional space.
According to a preferred form, the processor generates a field-of-view dataset from the view information received by one or more sensors in the user's VR hardware device; the dataset includes a center point defining the user's gaze, and the system confirms whether this center point hits another user's hotspot coordinates.
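A hedged sketch of the field-of-view dataset just described: a view rectangle whose center point defines the gaze, with a hit test against a hotspot projected to screen coordinates. The pixel radius and the screen-space projection are assumptions not stated in the patent.

```python
from dataclasses import dataclass

@dataclass
class FieldOfView:
    """Hypothetical field-of-view dataset: a rectangular view whose
    center point defines the user's gaze (screen coordinates)."""
    width: int
    height: int

    @property
    def center(self):
        # the gaze center point of the field of view
        return (self.width / 2, self.height / 2)

    def center_hits(self, hotspot_xy, radius=40):
        """Confirm whether the gaze center point hits another user's
        hotspot, projected to screen coordinates (assumed pixels)."""
        cx, cy = self.center
        dx, dy = hotspot_xy[0] - cx, hotspot_xy[1] - cy
        return dx * dx + dy * dy <= radius * radius
```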
According to a preferred form, the VR hardware device includes an audio module and an information exchange module, where the audio module corresponds to the information exchange module and the information exchange module resides in the server.
According to a preferred form, the method comprises the following steps:
S1, user information is registered and stored in one or more databases; the database server is also pre-loaded with designed 3D facial makeup and 3D scenes;
S2, the user's face is photographed by a camera, converted into a 3D facial makeup by software, and imported into the user information database;
S3, the user wears VR hardware equipment connected to the local area network, searches for hotspots in the local area network, and can observe other users' coordinate systems and user names;
S4, the user selects a hotspot with the gaze center point and presses the switch button to confirm it; the other user then decides via the switch button whether to establish the connection;
S5, after the other user confirms the connection, voice communication is carried out through the audio module, and the other user's registered 3D facial makeup is exported by the processor to the display screen of the VR glasses.
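The steps S1 to S5 can be sketched as a minimal in-memory flow. Every name here (the class, the dictionary standing in for the database, the boolean standing in for the switch button) is a hypothetical stand-in for the servers, cameras and VR glasses of the actual system.

```python
class InteractionFlow:
    """Minimal sketch of the S1-S5 interaction flow (illustrative only)."""

    def __init__(self):
        self.users = {}          # S1: user registration "database"
        self.connections = set()

    def register(self, name, facial_makeup):
        # S1/S2: store user info and the user's 3D facial makeup
        self.users[name] = {"face": facial_makeup, "hotspot": None}

    def publish_hotspot(self, name, coords):
        # S3: the user's hotspot becomes observable on the LAN
        self.users[name]["hotspot"] = coords

    def observable_hotspots(self, viewer):
        # S3: other users' coordinates and user names, as seen by `viewer`
        return {n: u["hotspot"] for n, u in self.users.items()
                if n != viewer and u["hotspot"] is not None}

    def request_connection(self, a, b, b_accepts):
        # S4: A selects B's hotspot; B decides via the switch button
        if b_accepts:
            # S5: the connection opens and B's facial makeup is returned
            self.connections.add(frozenset((a, b)))
            return self.users[b]["face"]
        return None
```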
Advantageous effects
(1) According to the invention, a VR environment coordinate system is generated in the virtual space, giving the virtual space a sense of distance, so that the user experiences the same sense of distance as in reality.
(2) The invention also generates hotspots in the virtual space and tags each user with a user name and user hotspot coordinates, so that users can perceive the presence of other players.
(3) A user can select a facial makeup from the 3D facial makeup gallery, or autonomously take a photograph with the camera and register it as a 3D facial makeup, giving the gallery diversity. When a user communicates with other players, this 3D facial makeup is displayed, adding playful interest and serving as a means of identification between users.
(4) After users agree to establish a connection, they can communicate by voice through the VR hardware devices, improving the player experience and providing a social effect.
Drawings
In order to more clearly illustrate the technical solution of the embodiments of the invention, the drawings used in the description of the embodiments are briefly introduced below, so that other features, objects and advantages of the invention become more apparent:
Fig. 1 is a schematic structural diagram of the multimedia VR interaction system according to the invention.
Fig. 2 is a schematic diagram of the method of the invention.
Detailed Description
The invention is further described below in connection with the detailed embodiments, in order to make its technical means, creative features, objects and effects easy to understand.
Referring to figs. 1-2, the invention provides a multimedia VR interaction method and system:
The multimedia VR interaction system comprises a processor, a server, five cameras 1 surrounding the virtual space, and two VR devices. The cameras 1 comprise virtual spheres, the server stores computer instructions executable by the processor, and the system tracks the user's gaze in the VR environment. When the computer instructions run, the system performs at least the following operations:
1) A database server is established by the server, and both VR hardware devices 2 are connected to a local area network. A VR hardware device 2 is, for example but not limited to, a computer system with the following components: CPU, GPU, RAM, hard disk drives, etc.; a head-mounted display (dedicated, or a non-dedicated mobile-device display); input devices (e.g., keyboard, mouse, game controller or microphone); haptic devices and sensors (e.g., touch pad, position sensors such as gyroscope, accelerometer, magnetometer); information exchange modules (e.g., voice communication, radio frequency, Wi-Fi, mobile cellular, broadband, together with device operating systems and software); and a virtual player for operating the simulated 3D virtual reality environment. Such a virtual player may include software for operating the virtual reality environment; exemplary virtual-player software may include, but is not limited to, a spherical video player or a rendering engine configured to render three-dimensional images.
2) An environment coordinate system of the VR is generated within the virtual space.
A hotspot is generated in the virtual space, the hotspot comprising the user's own hotspot coordinates and other users' hotspot coordinates.
3) The system tracks view information received from one or more sensors in the user's VR hardware device 2; the view information follows the direction of the user's gaze in the real world.
4) The tracked view information is mapped onto the environment coordinate system, and whether the user's gaze includes a hotspot is determined based on the mapping.
The VR environment coordinate system includes an X coordinate, a Y coordinate, a D coordinate, a T1 coordinate and a T2 coordinate, where the X coordinate is defined as a horizontal (latitude) value, the Y coordinate as a vertical (longitudinal) value, the D coordinate as the distance from the user to a virtual element in the virtual three-dimensional environment, the T1 coordinate as virtual time, and the T2 coordinate as real time. The computer instructions are executed by the processor to generate a user coordinate system and a user name in the virtual space; the user coordinate system is associated with a virtual sphere at the position of a camera 1 surrounding the virtual space. The processor calculates X1, Y1 and D1 coordinates to generate the user coordinate system, the user name corresponds to a VR hardware device 2, and the user coordinate system established by the server is positioned within the VR environment coordinate system. The database server stores user registration information. The VR hardware device 2 includes VR glasses provided with a switch button. The user coordinate system includes a user position relative to a sphere in three-dimensional space. The processor generates a field-of-view dataset from view information received from one or more sensors in the user's VR hardware device 2; the dataset includes a center point defining the user's gaze, and the system confirms whether this center point hits another user's hotspot coordinates. The VR hardware device 2 further comprises an audio module and an information exchange module, where the audio module corresponds to the information exchange module and the information exchange module resides in the server.
The method comprises the following steps:
S1, user information (e.g., the user's identity information, mobile phone number, etc.) is registered and stored in one or more databases; the database server is also pre-loaded with designed 3D facial makeup and 3D scenes;
S2, the face of user A/B is photographed by a camera, converted into a 3D facial makeup by software, and imported into the user information database. User A/B may select a pre-edited 3D facial makeup and 3D scene, or a dedicated 3D facial makeup may be produced from the photograph of user A/B and imported into the database;
S3, user A wears VR hardware equipment 2 connected to the local area network, searches for hotspots in the local area network, and can observe user B's coordinate system and user name;
S4, user A selects user B's hotspot with the gaze center point and presses the switch button to confirm it; user B then decides via the switch button whether to establish the connection;
S5, after user B confirms the connection, voice communication with user A is carried out through the audio module, and the 3D facial makeup registered by users B and A is exported by the processor to the display screens of the VR glasses.
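The mutual confirmation between user A and user B over the LAN might look like the following sketch, modeled as a tiny TCP exchange on localhost. The message names ("CONNECT"/"ACCEPT"/"REFUSE") are invented for illustration; the patent does not specify a wire protocol.

```python
import socket
import threading

def user_b_endpoint(server_sock, accept=True):
    """User B waits for a connection request and presses the
    (simulated) switch button to accept or refuse it."""
    conn, _ = server_sock.accept()
    with conn:
        if conn.recv(64) == b"CONNECT" and accept:
            conn.sendall(b"ACCEPT")   # B confirms; a voice channel may open
        else:
            conn.sendall(b"REFUSE")

def user_a_request(port):
    """User A, having selected B's hotspot, asks to establish the
    connection and reports whether B accepted."""
    with socket.create_connection(("127.0.0.1", port)) as s:
        s.sendall(b"CONNECT")
        return s.recv(64) == b"ACCEPT"

def demo(accept=True):
    """Run one A-to-B handshake on an ephemeral localhost port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    port = srv.getsockname()[1]
    t = threading.Thread(target=user_b_endpoint, args=(srv, accept))
    t.start()
    ok = user_a_request(port)
    t.join()
    srv.close()
    return ok
```

In the real system the confirmation would travel through the information exchange module and the switch button on the VR glasses; only the request/confirm shape is sketched here.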
Electrical components such as the virtual sphere and the sensors belong to the prior art and are not what the invention seeks to protect, so the applicant does not describe them in detail.
The specific embodiments described herein are offered by way of example only. Those skilled in the art may make various modifications, additions or substitutions to the described embodiments without departing from the scope of the invention as defined in the accompanying claims.

Claims (6)

1. A multimedia VR interactive system, characterized by comprising: one or more processors, one or more servers, a plurality of cameras surrounding a virtual space, and two or more VR devices, the cameras comprising a virtual sphere, the servers storing computer instructions executable by the processors, the system being configured to track a user's gaze in a VR environment, and the system performing at least the following operations when the computer-executable instructions are executed:
1) One or more database servers are established through the server, and all VR hardware devices are connected to a local area network;
2) An environment coordinate system of the VR is generated within the virtual space;
a hotspot is generated in the virtual space, the hotspot comprising the user's own hotspot coordinates and other users' hotspot coordinates;
3) The system tracks view information received from one or more sensors in the user's VR hardware device, the view information following the direction of the user's gaze in the real world;
4) The tracked view information is mapped onto the environment coordinate system, and the user's gaze is determined based on the mapping;
the VR environment coordinate system includes: an X-coordinate, a Y-coordinate, a D-coordinate, a T1-coordinate, a T2-coordinate, the X-coordinate being defined as a latitude value, the Y-coordinate being defined as a portrait value, the D-coordinate being defined as a virtual element distance value of a user in a virtual three-dimensional environment, the T1-coordinate being defined as a virtual time, the T2-coordinate being defined as a real-time, the computer instructions being executed by the processor to generate a user coordinate system within the virtual space associated with a virtual sphere in a camera position surrounding the virtual space, and a user name, the processor being executed to calculate the X1 coordinate, the Y1 coordinate, the D1 coordinate generating the user coordinate system, the user name corresponding to the VR hardware device.
2. The multimedia VR interactive system of claim 1, wherein: the user registration information is stored in the database server, the VR hardware device comprises VR glasses, and the VR glasses are provided with switch buttons.
3. The multimedia VR interactive system of claim 2, wherein: the user coordinate system includes a user position relative to a sphere in three-dimensional space.
4. The multimedia VR interactive system of claim 3, wherein: the processor generates a field-of-view dataset from view information received by one or more sensors in the user's VR hardware device, the field-of-view dataset including a center point defining the user's gaze, and the system confirms whether the center point hits another user's hotspot coordinates.
5. The multimedia VR interactive system of claim 4, wherein: the VR hardware device comprises an audio module and an information exchange module, wherein the audio module corresponds to the information exchange module, and the information exchange module is arranged in the server.
6. A multimedia VR interaction method using the system of claim 5, characterized by comprising the following steps:
S1, user information is registered and stored in one or more databases; the database server is also pre-loaded with designed 3D facial makeup and 3D scenes;
S2, the user's face is photographed by a camera, converted into a 3D facial makeup by software, and imported into the user information database;
S3, the user wears VR hardware equipment connected to the local area network, searches for hotspots in the local area network, and can observe other users' coordinate systems and user names;
S4, the user selects a hotspot with the gaze center point and presses the switch button to confirm it; the other user then decides via the switch button whether to establish the connection;
S5, after the other user confirms the connection, voice communication is carried out through the audio module, and the other user's registered 3D facial makeup is exported by the processor to the display screen of the VR glasses.
CN202010326318.4A 2020-04-23 2020-04-23 Multimedia VR interaction method and system Active CN111530089B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010326318.4A CN111530089B (en) 2020-04-23 2020-04-23 Multimedia VR interaction method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010326318.4A CN111530089B (en) 2020-04-23 2020-04-23 Multimedia VR interaction method and system

Publications (2)

Publication Number Publication Date
CN111530089A (en) 2020-08-14
CN111530089B (en) 2023-08-22

Family

ID=71969779

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010326318.4A Active CN111530089B (en) 2020-04-23 2020-04-23 Multimedia VR interaction method and system

Country Status (1)

Country Link
CN (1) CN111530089B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106774830A (en) * 2016-11-16 2017-05-31 网易(杭州)网络有限公司 Virtual reality system, voice interactive method and device
CN107194964A (en) * 2017-05-24 2017-09-22 电子科技大学 A kind of VR social intercourse systems and its method based on real-time body's three-dimensional reconstruction
CN109683706A (en) * 2018-12-10 2019-04-26 中车青岛四方机车车辆股份有限公司 A kind of method and system of the more people's interactions of virtual reality

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US9818225B2 (en) * 2014-09-30 2017-11-14 Sony Interactive Entertainment Inc. Synchronizing multiple head-mounted displays to a unified space and correlating movement of objects in the unified space


Non-Patent Citations (1)

Title
"Research on the Application of Interaction Feedback Mechanisms in Virtual Social Avatar Design: the Case of Zepeto" (交互反馈机制在虚拟社交形象设计中的应用研究—以Zepeto为例); Zhang Zhenrong (张真榕); Industrial Design (《工业设计》); 2019-05-07; pp. 132-133 *

Also Published As

Publication number Publication date
CN111530089A (en) 2020-08-14

Similar Documents

Publication Publication Date Title
RU2719454C1 (en) Systems and methods for creating, translating and viewing 3d content
CN110354489B (en) Virtual object control method, device, terminal and storage medium
JP7008730B2 (en) Shadow generation for image content inserted into an image
US20150070274A1 (en) Methods and systems for determining 6dof location and orientation of head-mounted display and associated user movements
US11734791B2 (en) System and method for rendering perspective adjusted views
TW201835723A (en) Graphic processing method and device, virtual reality system, computer storage medium
CN102749990A (en) Systems and methods for providing feedback by tracking user gaze and gestures
JP2022505998A (en) Augmented reality data presentation methods, devices, electronic devices and storage media
CN108771866B (en) Virtual object control method and device in virtual reality
CN109426343B (en) Collaborative training method and system based on virtual reality
CN111836110B (en) Method and device for displaying game video, electronic equipment and storage medium
TWI758869B (en) Interactive object driving method, apparatus, device, and computer readable storage meidum
CN112637665B (en) Display method and device in augmented reality scene, electronic equipment and storage medium
JPWO2007139074A1 (en) Three-dimensional game display system, display method, and display program
JPWO2008016064A1 (en) Game device, object display method and display program in game device
WO2022267729A1 (en) Virtual scene-based interaction method and apparatus, device, medium, and program product
CN113082707A (en) Virtual object prompting method and device, storage medium and computer equipment
TWI630505B (en) Interactive augmented reality system and portable communication device and interaction method thereof
CN108983974A (en) AR scene process method, apparatus, equipment and computer readable storage medium
US20220032188A1 (en) Method for selecting virtual objects, apparatus, terminal and storage medium
JP2022507502A (en) Augmented Reality (AR) Imprint Method and System
CN111530089B (en) Multimedia VR interaction method and system
WO2022166173A1 (en) Video resource processing method and apparatus, and computer device, storage medium and program
Lo et al. From off-site to on-site: A Flexible Framework for XR Prototyping in Sports Spectating
CN114425162A (en) Video processing method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant