CN111744180A - Method and device for loading virtual game, storage medium and electronic device


Info

Publication number
CN111744180A
CN111744180A (application number CN202010603335.8A)
Authority
CN
China
Prior art keywords
game
scene
virtual
game client
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010603335.8A
Other languages
Chinese (zh)
Inventor
刘俊杰
王平戈
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Perfect World Chongqing Interactive Technology Co ltd
Original Assignee
Perfect World Chongqing Interactive Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Perfect World Chongqing Interactive Technology Co ltd
Priority to CN202010603335.8A
Publication of CN111744180A
Legal status: Pending


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/843 Special adaptations for executing a specific game genre or game mode involving concurrently two or more players on the same game device, e.g. requiring the use of a plurality of controllers or of a specific view of game data for each player
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/53 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of basic data processing
    • A63F2300/538 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of basic data processing for performing operations on behalf of the game client, e.g. rendering
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082 Virtual reality

Abstract

The invention provides a method and a device for loading a virtual game, a storage medium and an electronic device. The method projects the virtual game level scene and the virtual characters presented on a game client based on scene information of a real scene, and performs game data interaction within the virtual game level scene. This solves the technical problem in the related art that a virtual scene cannot be projected into a real scene, and provides a scheme for projecting an entire game scene into the real world. The scheme offers more game strategies and playability for multi-player interactive games in the virtual game level scene, realizes multi-player AR level clearing in the virtual game level scene by projecting virtual characters controlled by other game clients, and reduces the limitations of AR gameplay.

Description

Method and device for loading virtual game, storage medium and electronic device
Technical Field
The invention relates to the technical field of computers, in particular to a method and a device for loading a virtual game, a storage medium and an electronic device.
Background
In the related art, AR (Augmented Reality) is a technology for adding a virtual world created by a computer program to the real world captured by a camera. ARKit is a framework that projects such a virtual world onto the real world through a camera view.
Two approaches to scene management are common in the related art. In the first, the native iOS API is used for scene management (including virtual object management): virtual objects use the ARKit coordinate system directly, and built-in processing binds a virtual object to a real object (in practice, a real plane), so that the virtual object appears to sit directly on top of the real one. During synchronization, the WorldMap carries the virtual object information together with the coordinate-system information (in practice, a reference object) required for synchronization. In the second approach, Unity performs virtual scene management: the position of the real camera is obtained through ARKit and applied directly to Unity's virtual camera. Virtual objects are generated by Unity, and their coordinates are converted from the ARKit coordinate system to the Unity coordinate system. Binding a virtual object to a real plane must be implemented by the developer. During synchronization, the WorldMap carries only the coordinate-system information, and synchronization of the virtual object information is completed by custom logic.
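The ARKit-to-Unity coordinate conversion mentioned above can be illustrated with a minimal sketch. In a real project this would be C# inside Unity; the Python below is purely illustrative and the function names are invented for the example. A common convention for mapping ARKit's right-handed frame (x right, y up, z toward the viewer) to Unity's left-handed frame (z forward) is to flip the z axis:

```python
def arkit_to_unity_position(p):
    """Convert a position from ARKit's right-handed frame (x right, y up,
    z toward the viewer) to Unity's left-handed frame (z forward) by
    flipping the z axis."""
    x, y, z = p
    return (x, y, -z)

def arkit_to_unity_rotation(q):
    """Convert a rotation quaternion (x, y, z, w) between the same two
    frames; the common convention negates the z and w components."""
    x, y, z, w = q
    return (x, y, -z, -w)

# Example: a camera 1.5 m in front of the ARKit origin (negative z in ARKit)
# lands at positive z in Unity:
print(arkit_to_unity_position((0.0, 1.2, -1.5)))  # (0.0, 1.2, 1.5)
```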
Most AR applications in the related art simply add a small number of virtual objects to the "real" world; they rarely project an entire virtual scene directly into the "real" world, which limits the possibilities of AR gameplay. Two-player AR applications generally require both players to be in the same location and handle players in different locations poorly (this is less a technical problem than a gap in gameplay design). AR applications also rarely combine "real" inputs such as blowing at the phone, shaking the phone, or aiming with the phone's viewpoint, and therefore lack realism. Because of the accuracy limitations of ARKit, when two players play together in the same place and move over a large range, the positions of the virtual objects each player sees in the real world diverge noticeably. And because ARKit depends on device stability, a large shake of the phone introduces large positioning errors from which ARKit cannot recover, spoiling the whole AR experience.
In view of the above problems in the related art, no effective solution has been found at present.
Disclosure of Invention
The embodiment of the invention provides a method and a device for loading a virtual game, a storage medium and an electronic device.
According to an embodiment of the present invention, there is provided a method of loading a virtual game, including: acquiring scene information of a real scene through a first game client; projecting the virtual game level scene presented on the first game client into the real scene according to the scene information and the corresponding projection information; synchronizing the scene information of the real scene and the corresponding projection information acquired by the first game client to a second game client through a game server; adding a first virtual character controlled by the first game client into the virtual game level scene; adding a second virtual character controlled by the second game client into the virtual game level scene; and performing game data interaction between the first virtual character controlled by the first game client and the second virtual character controlled by the second game client in the virtual game level scene.
Optionally, the method further includes: synchronizing the virtual game level scene presented on the first game client to the second game client through the game server; projecting the virtual game level scene presented on the second game client into the real scene; and synchronizing data of the first virtual character controlled by the first game client to the second game client through the game server.
Optionally, before synchronizing the virtual game level scene presented on the first game client to the second game client through the game server, the method further includes: acquiring first position information of the first game client and second position information of the second game client; judging, according to the first position information and the second position information, whether the first game client and the second game client are at the same position; and if they are not at the same position, determining to synchronize the virtual game level scene presented on the first game client to the second game client through the game server.
Optionally, synchronizing the virtual game level scene presented on the first game client to the second game client through the game server includes at least one of: synchronizing a virtual line of sight presented on a first game client to a second game client through a game server; the virtual trajectory of the item presented on the first game client is synchronized to the second game client through the game server.
Optionally, after projecting the virtual game level scene presented on the first game client into the real scene, the method further includes: sending the scene information and the corresponding projection information of the first game client to the game server, wherein the game server is configured to establish and store the association between the scene information and projection information and a virtual level room.
Optionally, synchronizing the scene information of the real scene and the corresponding projection information acquired by the first game client to the second game client through the game server includes: determining that the second game client has entered the virtual level room where the first game client is located and that the second game client and the first game client are at the same physical position; if the scene information and the corresponding projection information have already been synchronized by the first game client, sending, by the game server, the scene information and the corresponding projection information to the second game client; and if not, sending them to the second game client after the game server determines that the first game client has completed synchronization.
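The send-or-wait behavior described above can be sketched as follows. This is an illustrative model only; the class and method names are invented and not part of the patent:

```python
class RoomSyncGate:
    """Server-side gating sketch: forward the host's scene/projection info
    to a joining client immediately if the host has already uploaded it,
    otherwise hold the client until the host completes synchronization."""

    def __init__(self):
        self.scene_info = None   # host's scene + projection info, once synced
        self.waiting = []        # clients that joined before the host synced
        self.delivered = []      # (client, info) pairs actually sent

    def host_synced(self, info):
        """Host uploads its info; flush any clients that were held."""
        self.scene_info = info
        for client in self.waiting:
            self.delivered.append((client, info))
        self.waiting.clear()

    def client_joined(self, client):
        """A second client enters the virtual level room."""
        if self.scene_info is not None:      # host already synced: send now
            self.delivered.append((client, self.scene_info))
        else:                                # otherwise wait for the host
            self.waiting.append(client)

gate = RoomSyncGate()
gate.client_joined("client_B")       # joins before the host has synced
gate.host_synced({"anchor": "table"})
print(gate.delivered)                # client_B now receives the info
```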
Optionally, projecting the virtual game level scene presented on the first game client into the real scene according to the scene information and the corresponding projection information includes: keeping the projection matrix of the virtual camera of the virtual game level scene consistent with the projection matrix of the real camera of the first game client, wherein the scene information includes information of the real camera and the projection information includes information of the virtual camera; determining a first reference point of the real scene in a world coordinate system and a second reference point in a screen coordinate system corresponding to the first reference point in the virtual game level scene; and converting the screen coordinate system into the world coordinate system according to the first reference point and the second reference point, and projecting the converted virtual game level scene into the real scene.
Optionally, converting the screen coordinate system into the world coordinate system according to the first reference point and the second reference point includes: calculating the element scale ratio K between the screen coordinate system and the world coordinate system; and converting the screen coordinate system into the world coordinate system according to the following formula: P3 = P1 + (P2 - P0) × K, wherein P3 is the coordinate position of the virtual camera, P1 is the coordinate position of the second reference point, P2 is the coordinate position of the real camera, and P0 is the coordinate position of the first reference point.
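A worked example of the formula, applied per coordinate axis (the function name is illustrative, not from the patent):

```python
def virtual_camera_position(p1, p0, p2, k):
    """P3 = P1 + (P2 - P0) * K, applied per axis.
    P1 = second reference point (in the virtual scene), P0 = first reference
    point (in the real scene), P2 = real camera position, K = scale ratio
    between the screen and world coordinate systems."""
    return tuple(a + (b - c) * k for a, b, c in zip(p1, p2, p0))

# If the real camera is 2 m from the real reference point and the scene is
# scaled down by K = 0.5, the virtual camera sits 1 m from the virtual
# reference point along the same direction:
print(virtual_camera_position((0, 0, 0), (0, 0, 0), (0, 0, 2), 0.5))  # (0.0, 0.0, 1.0)
```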
Optionally, the game data interaction between the first virtual character controlled by the first game client and the second virtual character controlled by the second game client in the virtual game level scene includes: acquiring interaction information of the virtual game level scene through the first game client or the second game client; generating a control instruction for the virtual game level scene based on the interaction information; and controlling the virtual game level scene according to the control instruction.
Optionally, when the interaction information is a viewpoint ray, generating the control instruction for the virtual game level scene based on the interaction information includes: locating the position of the intersection point between the viewpoint ray and the virtual game level scene; determining the intersection point position as the target position of the first virtual character's movement; and generating a movement instruction according to the target position, wherein the movement instruction is used to instruct the first virtual character to move to the target position.
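A simplified stand-in for this step, intersecting the viewpoint ray with a horizontal ground plane instead of the full level geometry (illustrative only; a real engine would raycast against the scene's colliders):

```python
def ray_ground_intersection(origin, direction, ground_y=0.0):
    """Intersect a viewpoint ray with the horizontal plane y = ground_y and
    return the hit point, or None if the ray never reaches the plane.
    The hit point would serve as the character's movement target."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy == 0:
        return None                    # ray parallel to the plane
    t = (ground_y - oy) / dy
    if t < 0:
        return None                    # plane is behind the viewpoint
    return (ox + dx * t, oy + dy * t, oz + dz * t)

# Camera 2 m above the floor, looking 45 degrees downward and forward:
print(ray_ground_intersection((0.0, 2.0, 0.0), (0.0, -1.0, 1.0)))  # (0.0, 0.0, 2.0)
```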
Optionally, when the interaction information is the ambient volume produced by blowing into the microphone, generating the control instruction for the virtual game level scene based on the interaction information includes: triggering an ambient wind field according to the ambient volume, wherein the strength of the ambient wind field corresponds to the magnitude of the ambient volume; and generating a wind animation parameter according to the ambient wind field and generating a first rendering instruction, wherein the first rendering instruction is used to instruct rendering of the wind animation special effect in the virtual game level scene using the wind animation parameter.
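The volume-to-wind mapping can be sketched as a thresholded, clamped linear function; the threshold and normalization values below are invented for the example:

```python
def wind_strength(volume, threshold=0.1, max_volume=1.0):
    """Map normalized microphone volume (0..1) to a wind-field strength.
    Below the threshold no wind is triggered; above it the strength grows
    linearly with the volume and is clamped to 1.0."""
    if volume <= threshold:
        return 0.0
    return min((volume - threshold) / (max_volume - threshold), 1.0)

print(wind_strength(0.05))               # 0.0  (too quiet to trigger wind)
print(round(wind_strength(0.55), 3))     # 0.5  (mid-range blow)
print(wind_strength(2.0))                # 1.0  (clamped at full strength)
```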
Optionally, when the interaction information is the orientation information of the first game client or the second game client, generating the control instruction for the virtual game level scene based on the interaction information includes: determining a movement track of the first game client or the second game client according to the orientation information; and searching a preset special effect library for the character special effect matching the movement track and generating a second rendering instruction, wherein the second rendering instruction is used to instruct rendering of the character special effect in the virtual game level scene.
Optionally, the game data interaction between the first virtual character controlled by the first game client and the second virtual character controlled by the second game client in the virtual game level scene includes: capturing a screenshot or a screen recording of the virtual game level scene displayed on the screen of the first game client or the second game client; and sharing or saving the captured picture.
Optionally, after adding the second virtual character controlled by the second game client to the virtual game level scene, the method further includes: acquiring identification information of a virtual element in the virtual game level scene and position information of the virtual element in the real scene; storing the identification information and the position information in the game server after associating them with each other; and, when the first game client enters the virtual game level scene again, generating the virtual element in the real scene according to the identification information and the position information.
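The save-and-regenerate behavior described above can be sketched as a small store keyed by element identifier (illustrative names, not from the patent):

```python
class VirtualElementStore:
    """Sketch of persisting a virtual element's identifier together with its
    position in the real scene, so the element can be regenerated at the
    same spot when the client re-enters the level."""

    def __init__(self):
        self._records = {}   # element id -> real-world position

    def save(self, element_id, real_position):
        """Associate the element's id with its real-scene position."""
        self._records[element_id] = real_position

    def restore(self):
        """On re-entry, return every stored element with its saved
        position so it can be regenerated in the real scene."""
        return [(eid, pos) for eid, pos in self._records.items()]

store = VirtualElementStore()
store.save("tree_01", (1.0, 0.0, 2.5))
print(store.restore())  # [('tree_01', (1.0, 0.0, 2.5))]
```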
Optionally, after adding the second virtual character controlled by the second game client to the virtual game level scene, the method further includes: storing world map data of the virtual game level scene at a first time, or acquiring the world map data of the virtual game level scene at the first time from the second game client, wherein the world map data is used for generating an AR view of the virtual game level scene; and reconstructing or correcting the AR view of the virtual game level scene at a second time according to the world map data.
According to another embodiment of the present invention, there is provided an apparatus for loading a virtual game, including: an acquisition module, configured to acquire scene information of a real scene through a first game client; a projection module, configured to project the virtual game level scene presented on the first game client into the real scene according to the scene information and the corresponding projection information; a synchronization module, configured to synchronize the scene information of the real scene and the corresponding projection information acquired by the first game client to a second game client through a game server; a joining module, configured to add a first virtual character controlled by the first game client and a second virtual character controlled by the second game client into the virtual game level scene; and a control module, configured to control the game data interaction of the first virtual character controlled by the first game client and the second virtual character controlled by the second game client in the virtual game level scene.
Optionally, the apparatus further comprises: a second synchronization module, configured to synchronize the virtual game level scene presented on the first game client to the second game client through the game server; and a processing module, configured to project the virtual game level scene presented on the second game client into the real scene and to synchronize data of the first virtual character controlled by the first game client to the second game client through the game server.
Optionally, the apparatus further comprises: a first acquisition module, configured to acquire first position information of the first game client and second position information of the second game client before the second synchronization module synchronizes the virtual game level scene presented on the first game client to the second game client through the game server; a judging module, configured to judge, according to the first position information and the second position information, whether the first game client and the second game client are at the same position; and a determining module, configured to determine, if they are not at the same position, to synchronize the virtual game level scene presented on the first game client to the second game client through the game server.
Optionally, the second synchronization module includes at least one of: a first synchronization unit for synchronizing the virtual sight line presented on the first game client to the second game client through the game server; and the second synchronization unit is used for synchronizing the virtual track of the object presented on the first game client to the second game client through the game server.
Optionally, the apparatus further comprises: a sending module, configured to send the scene information and the corresponding projection information of the first game client to the game server after the projection module projects the virtual game level scene presented on the first game client into the real scene, wherein the game server is configured to establish and store the association between the scene information and projection information and the virtual level room.
Optionally, the first synchronization module includes: a first determining unit, configured to determine that the second game client has entered the virtual level room where the first game client is located and that the second game client and the first game client are at the same physical position; and a sending unit, configured to send the scene information and the corresponding projection information to the second game client if they have already been synchronized by the first game client, and, if not, to send them to the second game client after the game server determines that the first game client has completed synchronization.
Optionally, the projection module includes: a holding unit, configured to keep the projection matrix of the virtual camera of the virtual game level scene consistent with the projection matrix of the real camera of the first game client, wherein the scene information includes information of the real camera and the projection information includes information of the virtual camera; a second determining unit, configured to determine a first reference point of the real scene in a world coordinate system and a second reference point in a screen coordinate system corresponding to the first reference point in the virtual game level scene; and a conversion unit, configured to convert the screen coordinate system into the world coordinate system according to the first reference point and the second reference point and to project the converted virtual game level scene into the real scene.
Optionally, the conversion unit includes: a calculating subunit, configured to calculate the element scale ratio K between the screen coordinate system and the world coordinate system; and a converting subunit, configured to convert the screen coordinate system into the world coordinate system according to the following formula: P3 = P1 + (P2 - P0) × K, wherein P3 is the coordinate position of the virtual camera, P1 is the coordinate position of the second reference point, P2 is the coordinate position of the real camera, and P0 is the coordinate position of the first reference point.
Optionally, the control module includes: an acquisition unit, configured to acquire interaction information of the virtual game level scene through the first game client or the second game client; a generating unit, configured to generate a control instruction for the virtual game level scene based on the interaction information; and a control unit, configured to control the virtual game level scene according to the control instruction.
Optionally, when the interaction information is a viewpoint ray, the generating unit includes: the positioning subunit is used for positioning the intersection point position of the viewpoint ray to the virtual game level scene; a first determining subunit, configured to determine the intersection point position as a target position of the first virtual character movement; and the first generation subunit is used for generating a movement instruction according to the target position, wherein the movement instruction is used for indicating the first virtual character to move to the target position.
Optionally, when the interaction information is the ambient volume produced by blowing into the microphone, the generating unit includes: a triggering subunit, configured to trigger an ambient wind field according to the ambient volume, wherein the strength of the ambient wind field corresponds to the magnitude of the ambient volume; and a second generating subunit, configured to generate a wind animation parameter according to the ambient wind field and generate a first rendering instruction, wherein the first rendering instruction is used to instruct rendering of the wind animation special effect in the virtual game level scene using the wind animation parameter.
Optionally, when the interaction information is the orientation information of the first game client or the second game client, the generating unit includes: a second determining subunit, configured to determine the movement track of the first game client or the second game client according to the orientation information; and a third generating subunit, configured to search a preset special effect library for the character special effect matching the movement track and to generate a second rendering instruction, wherein the second rendering instruction is used to instruct rendering of the character special effect in the virtual game level scene.
Optionally, the control module includes: a capture unit, configured to capture a screenshot or a screen recording of the virtual game level scene displayed on the screen of the first game client or the second game client; and a sharing unit, configured to share or save the captured picture.
Optionally, the apparatus further comprises: a second acquisition module, configured to acquire identification information of a virtual element in the virtual game level scene and position information of the virtual element in the real scene after the joining module adds the second virtual character controlled by the second game client to the virtual game level scene; a first storage module, configured to store the identification information and the position information in the game server after associating them with each other; and a generating module, configured to generate the virtual element in the real scene according to the identification information and the position information when the first game client enters the virtual game level scene again.
Optionally, the apparatus further comprises: a second storage module, configured to store world map data of the virtual game level scene at a first time after the joining module adds the second virtual character controlled by the second game client to the virtual game level scene, or to acquire the world map data of the virtual game level scene at the first time from the second game client, wherein the world map data is used for generating an AR view of the virtual game level scene; and a building module, configured to reconstruct or correct the AR view of the virtual game level scene at a second time according to the world map data.
According to a further embodiment of the present invention, there is also provided a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
According to yet another embodiment of the present invention, there is also provided an electronic device, including a memory in which a computer program is stored and a processor configured to execute the computer program to perform the steps in any of the above method embodiments.
By the above method and apparatus, the virtual game level scene and the virtual characters displayed on the game client are projected based on the scene information of the real scene, and game data interaction is carried out in the virtual game level scene. This solves the technical problem in the related art that a virtual scene cannot be projected into a real scene and provides a scheme for projecting an entire game scene into the real world, offering more game strategies and playability for multi-player interactive games in the virtual game level scene, realizing multi-player AR level clearing in the virtual game level scene by projecting virtual characters controlled by other game clients, and reducing the limitations of multi-player AR gameplay.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a block diagram of a hardware structure of a mobile phone loaded with a virtual game according to an embodiment of the present invention;
FIG. 2 is a flow diagram of a method of loading a virtual game according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating the effect of sharing a virtual game by two players according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating the effects of scene projection according to an embodiment of the present invention;
FIG. 5 is an illustration of the effect of projecting a level scene onto a desktop according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of coordinate system conversion according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating the effect of controlling the movement of a character through a viewpoint ray according to an embodiment of the present invention;
FIG. 8 is a diagram illustrating the effect of melting ice and snow by blowing according to an embodiment of the present invention;
FIG. 9 is a diagram illustrating the effect of releasing a character skill by blowing according to an embodiment of the present invention;
FIG. 10 is a diagram illustrating the effect of releasing character skills via gravity sensing according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of saving a game screen by screen capture according to an embodiment of the invention;
FIG. 12 is an effect diagram of associating virtual elements in a real scene according to an embodiment of the present invention;
fig. 13 is a block diagram of an apparatus for loading a virtual game according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. It is apparent that the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other in the absence of conflict.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
The method provided by the first embodiment of the present application may be executed on a mobile phone, a tablet, a computer, or a similar electronic terminal. Taking execution on a mobile phone as an example, fig. 1 is a block diagram of the hardware structure of a mobile phone loading a virtual game according to an embodiment of the present invention. As shown in fig. 1, the handset 10 may include one or more processors 102 (only one is shown in fig. 1; the processor 102 may include, but is not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA) and a memory 104 for storing data, and optionally may also include a transmission device 106 for communication functions and an input-output device 108. It will be understood by those skilled in the art that the structure shown in fig. 1 is merely illustrative and does not limit the structure of the mobile phone. For example, the handset 10 may also include more or fewer components than shown in FIG. 1, or have a different configuration from that shown in FIG. 1.
The memory 104 may be used to store programs of the mobile phone, for example, software programs and modules of application software, such as the program corresponding to the method for loading a virtual game in an embodiment of the present invention. The processor 102 executes various functional applications and data processing by running the programs stored in the memory 104, thereby implementing the method described above. The memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the handset 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof. In the present embodiment, the processor 102 is configured to control the target virtual character to perform a specified operation to complete the game task in response to human-machine interaction instructions and the game strategy. The memory 104 is used for storing program scripts of the electronic game, configuration information, attribute information of the virtual characters, and the like.
The transmission device 106 is used for receiving or transmitting data via a network. Specific examples of such networks may include wireless networks provided by the communications provider of the handset 10. In one example, the transmission device 106 includes a Network adapter (NIC), which can be connected to other Network devices through a base station so as to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used for communicating with the internet in a wireless manner.
Optionally, the input/output device 108 further includes a human-computer interaction screen, which is used for acquiring human-computer interaction instructions through a human-computer interaction interface and for presenting game pictures in a game task.
in this embodiment, a method for loading a virtual game is provided, and fig. 2 is a flowchart of a method for loading a virtual game according to an embodiment of the present invention, as shown in fig. 2, the flowchart includes the following steps:
step S202, scene information of a real scene is collected through a first game client;
the real scene of the embodiment is a real scene of a physical space where the first game client is located, and is a three-dimensional world which actually exists, such as a floor, a wall, a road and the like.
Step S204, projecting the virtual game stage scene presented on the first game client into a real scene according to the scene information and the corresponding projection information;
when a user plays a virtual game at a game client, a virtual game stage scene including virtual information such as a map, roles, props, weather and the like is displayed on a screen of the game client, a three-dimensional view picture is presented through 3D rendering, a virtual and real combined mixed scene, namely an AR scene, can be realized by fusing the virtual game stage scene with a real scene in a certain proportion, the virtual game stage scene is projected into the real scene according to projection information, and the user can see the view of the virtual game stage scene in the real scene.
Step S206, synchronizing scene information of the real scene and corresponding projection information acquired by the first game client to the second game client through the game server;
through data synchronization, a virtual game level scene presented on a second game client can be projected into a real scene.
Step S208, adding a first virtual character controlled by a first game client into a virtual game stage scene, and adding a second virtual character controlled by a second game client into the virtual game stage scene;
step S210, a first virtual character controlled by a first game client and a second virtual character controlled by a second game client perform game data interaction in a virtual game level scene.
The game script of the virtual game level scene requires two or more players to clear it. Multi-player interaction in the AR scene can be realized by adding the virtual characters controlled by each game client to the virtual game level scene and synchronizing the game data. Fig. 3 is an effect diagram of a two-player shared virtual game according to an embodiment of the present invention, where the virtual game level scene includes a first virtual character controlled by a first game client and a second virtual character controlled by a second game client, and the first game client and the second game client are controlled by a first user and a second user, respectively.
Through the above steps, the virtual game level scene and the virtual characters presented on the game client are projected based on scene information of the real scene, and game data interaction is carried out within the virtual game level scene. This solves the technical problem in the related art that a virtual scene cannot be projected into a real scene, provides a scheme for projecting an entire game scene into the real scene, offers more game strategies and greater playability for multi-player interactive games in the virtual game level scene, realizes multi-player AR level completion in the virtual game level scene by projecting the virtual characters controlled by other game clients, and reduces the limitations of multi-player AR gameplay.
In this embodiment, a game client (e.g., a mobile terminal such as a mobile phone) is used to build the coordinates of the virtual scene in real space and then perform scene projection. In some examples, a real-world coordinate system is built through the ARKit tool, and a virtual game level scene, also called a Unity scene, is created and rendered through the Unity tool (a development engine for virtual games); the virtual game level scene is located in the Unity coordinate system. To implement the conversion from the world coordinate system to the Unity coordinate system and "project" the entire Unity scene into the "real" world, a mapping relationship is set between the position and rotation of the game virtual camera and those of the real device camera. Since a scene is allowed to be "projected" to an arbitrary real location, in order for the scene to "look" fixed at a certain location in the real world, it is necessary to "align" a point of the virtual scene (generally the center point of the scene) with a point of the real world (that is, the point of the virtual scene and the point of the real world coincide exactly when displayed on the screen).
In an implementation of this embodiment, the scheme further includes: synchronizing the virtual game level scene presented on the first game client to the second game client through the game server; and projecting the virtual game stage scene presented on the second game client into the real scene, and synchronizing the first virtual character data controlled by the first game client to the second game client through the game server.
Optionally, before synchronizing the virtual game level scene presented on the first game client to the second game client through the game server, it may also be determined from GPS location information whether the two clients are at the same location, and the synchronization is performed only if they are not. This includes: acquiring first position information of the first game client and second position information of the second game client; judging, according to the first position information and the second position information, whether the first game client and the second game client are at the same position; and if they are not at the same position, determining that the virtual game level scene presented on the first game client is to be synchronized to the second game client through the game server. After projecting the virtual game level scene presented on the first game client into the real scene, the method further comprises: sending the scene information and the corresponding projection information of the first game client to the game server, where the game server is used to establish and store the association between the scene information, the projection information and the virtual level room.
In this embodiment, projecting a virtual game level scene presented on a first game client to a real scene according to scene information to obtain an AR scene, includes:
s11, keeping the projection matrix of the virtual camera of the virtual game level scene consistent with the projection matrix of the real camera of the first game client, wherein the scene information comprises the real camera, and the projection information comprises the virtual camera;
First, the projection matrix of the virtual camera is kept consistent with that of the real camera; otherwise, the projection cannot be achieved by a relatively simple camera coordinate transformation.
S12, determining a first reference point of the real scene in a world coordinate system and a second reference point corresponding to the first reference point in the virtual game level scene in a screen coordinate system;
and S13, converting the screen coordinate system into a world coordinate system according to the first reference point and the second reference point, and projecting the converted virtual game stage scene to a real scene.
In one embodiment of the present embodiment, converting the screen coordinate system into the world coordinate system according to the first reference point and the second reference point includes: calculating the element scale factor K between the screen coordinate system and the world coordinate system; and converting the screen coordinate system into the world coordinate system according to the following formula: P3 = P1 + (P2 − P0) × K, where P3 is the coordinate position of the virtual camera, P1 is the coordinate position of the second reference point, P2 is the coordinate position of the real camera, and P0 is the coordinate position of the first reference point. By converting the coordinate system, conversion between virtual and real scenes in different coordinate systems can be realized.
Assume the position of the scene center point is P1 (Unity coordinate system), its corresponding position in the real world is P0 (ARKit coordinate system), and the scale factor between items in the virtual scene and items in the real world is K. Then the relationship between the virtual camera position P3 (Unity coordinate system) and the real camera (including the depth sensor) position P2 (ARKit coordinate system, i.e., the world coordinate system) is P3 = P1 + (P2 − P0) × K (the rotation of the camera is calculated similarly). The effect of "projecting a scene" can be achieved by maintaining this relationship between the virtual camera and the real camera. Fig. 4 is an effect diagram of scene projection in the embodiment of the present invention, including a real scene and a virtual game level scene. Taking a flat desktop as the real scene, fig. 5 is an effect diagram of projecting a level scene onto the desktop according to the embodiment of the present invention, where the left side shows the first stage of rendering and the right side shows the second stage, by which a complete virtual game level scene is projected.
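The position relation above can be sketched in a few lines (a minimal illustration of the formula P3 = P1 + (P2 − P0) × K; the function name and the tuple representation of positions are our own, not part of the patent):

```python
def virtual_camera_position(p1, p0, p2, k):
    """Map the real camera position into the virtual scene's coordinate system.

    p1: scene center point in the virtual (Unity) coordinate system
    p0: corresponding anchor point in the real-world (ARKit) coordinate system
    p2: real camera position in the ARKit coordinate system
    k:  scale factor between virtual-scene units and real-world units
    """
    # P3 = P1 + (P2 - P0) * K, applied per coordinate axis
    return tuple(v1 + (v2 - v0) * k for v1, v0, v2 in zip(p1, p0, p2))
```

Applying the same offset-and-scale rule to the camera rotation keeps the virtual view aligned with the device camera, which is what makes the scene appear anchored to the real location.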
FIG. 6 is a schematic diagram of coordinate system transformation according to an embodiment of the present invention. In a three-dimensional space, objects exist in the form of element models, and each model has its own coordinate system (analogous to the bounds of each figure in a two-dimensional environment); this is the model coordinate system. Multiple objects sharing one environment, for example multiple elements of a game scene in the same virtual scene, each have a corresponding coordinate position in that environment (analogous to the frame of each figure in a two-dimensional environment); this is the world coordinate system, or environment coordinate system. Because real-world objects are in a three-dimensional environment but must be drawn as two-dimensional pictures on a screen, the model needs to be observed from a certain angle to obtain a two-dimensional picture: a virtual camera is set, which observes the model from a given position and angle and captures a two-dimensional picture; this defines the observation coordinate system. The current environment is then placed into a normalized device coordinate system, whose coordinate range is fixed by the system and mapped into a cube, so the models in the scene are mapped into that cube for normalization; models outside the field of view are then clipped. After clipping, the result is finally mapped into the screen coordinate system, the picture is scan-converted into a buffer, and the GPU reads the data from the buffer and displays it, finally projecting it into the real scene.
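The chain of transforms described above (model → world → observation → normalized device → screen) can be illustrated with a deliberately simplified sketch; the translation-only model matrix and the orthographic mapping below are our simplifications for readability, not the projection actually used by the engine:

```python
def model_to_world(p, model_pos):
    # Model coordinates -> world coordinates (translation-only model
    # matrix, a simplification: real engines use a full 4x4 transform).
    return tuple(a + b for a, b in zip(p, model_pos))

def world_to_view(p, camera_pos):
    # World coordinates -> observation (camera) coordinates, for a camera
    # at camera_pos looking down the world axes (no rotation, for brevity).
    return tuple(a - b for a, b in zip(p, camera_pos))

def view_to_ndc(p, half_extent):
    # Orthographic mapping of the visible cube into the normalized device
    # cube [-1, 1]^3; points outside this range would be clipped.
    return tuple(a / half_extent for a in p)

def ndc_to_screen(p, width, height):
    # NDC -> screen pixels; y is flipped because screen y grows downwards.
    x, y = p[0], p[1]
    return ((x + 1) * 0.5 * width, (1 - y) * 0.5 * height)
```

Chaining the four functions takes a model-space point all the way to a pixel position, mirroring the pipeline the figure describes.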
The WorldMap is used for storing the spatial information data scanned by the game client, such as planes, feature points, anchor points and other information, so that the application can be recovered and continue running after an interruption. It can also be transmitted to other game clients so that the other party sees the same AR view in the same environment, achieving a sharing effect. After an AR session has been open for a period of time, if ARKit has identified enough scene features, the scene features identified by the AR session can be stored in the WorldMap format. Other AR sessions (either another AR session opened later on the same device, or AR sessions of other devices) may then attempt to resolve the WorldMap. If the resolution succeeds (that is, the scene "seen" by the new AR session can be matched to the scene corresponding to the WorldMap), the new AR session can keep its coordinate system consistent with that of the AR session which produced the WorldMap, providing a basis for the subsequent sharing of the scene among multiple players and the persistent storage of virtual articles.
In this embodiment, the game play data interaction between the first virtual character controlled by the first game client and the second virtual character controlled by the second game client in the virtual game level scene includes: acquiring interactive information of a virtual game stage scene through a first game client or a second game client; generating a control instruction of the virtual game level scene based on the interactive information; and controlling the virtual game stage scene according to the control instruction.
Alternatively, the first virtual character of the first game client and the second virtual character of the second game client may be Player-Controlled Characters (PCCs) or Non-Player Characters (NPCs). When the first virtual character is controlled, its attributes can be changed through control instructions, for example changing the character's movement state, releasing a skill special effect, changing the character's attack and defense state, adjusting the field of view, switching the viewing angle, and so on.
In an interactive scene, when the interactive information is a viewpoint ray, the control instruction for generating the virtual game stage scene based on the interactive information comprises: positioning the intersection point position of the viewpoint ray to the virtual game level scene; determining the intersection point position as a target position of the first virtual character movement; and generating a movement instruction according to the target position, wherein the movement instruction is used for indicating the first virtual character to move to the target position.
In the interactive scene, the movement of the character in the virtual game can be controlled through the movement of a viewpoint ray: when the player holds down the screen, viewpoint ray movement is turned on. The ray is emitted from the game virtual camera into the game virtual scene on the screen, and the position where the ray intersects the scene is the target position of the character controlled by the player. When the ray has an intersection point with the scene, the client broadcasts the start point and end point of the ray to all other clients through the server, so the other clients can also see the virtual "line of sight". If two players play the game in the same place, the lines of sight of the other players appear to be emitted from the corresponding players' mobile phones, because the coordinates of the virtual world and the coordinates of the real world (ARKit coordinates) have been converted. If the two players are in different locations, a ray can still be seen emitted from a certain location, as if there were a "hidden player". Fig. 7 is an effect diagram of controlling the movement of a character through a viewpoint ray according to the embodiment of the present invention: as the viewpoint ray moves, the main controlled character in the virtual game moves correspondingly.
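As an illustration of locating the intersection of the viewpoint ray with the scene, the sketch below intersects a ray with a horizontal ground plane standing in for the level geometry (a real engine would ray-cast against the full scene mesh; the function and the ground-plane assumption are ours):

```python
def ray_ground_intersection(origin, direction, ground_y=0.0):
    """Intersect a viewpoint ray with the horizontal plane y = ground_y.

    Returns the intersection point (the character's move target), or None
    when the ray is parallel to the plane or points away from it.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dy == 0:
        return None          # ray parallel to the ground plane
    t = (ground_y - oy) / dy
    if t < 0:
        return None          # intersection lies behind the ray origin
    return (ox + dx * t, ground_y, oz + dz * t)
```

The start point (the camera position) and the computed end point are what the client would broadcast to the other clients via the server.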
In an interactive scene, when the interactive information is the ambient volume picked up by the microphone when blown into, generating the control instruction of the virtual game level scene based on the interactive information comprises: triggering an ambient wind field according to the ambient volume, where the strength of the ambient wind field corresponds to the level of the ambient volume; and generating wind animation parameters according to the ambient wind field and generating a first rendering instruction, where the first rendering instruction is used for instructing rendering of the wind animation special effect in the virtual game level scene according to the wind animation parameters.
This interactive scene realizes blowing interaction with the character: whether a player blows into the mobile phone can be determined by analyzing the volume received by the microphone over a period of time (both the level and the fluctuation of the volume are larger while blowing). When the player's blowing is detected, if the character the player is "looking at" at that moment (in practice, the character being shot by the player's mobile phone camera, or the character controlled by the player's mobile phone) is relatively close, it is triggered to play a blown-over animation, which looks as if the character is blown down by the player; a wind field can also be triggered to make decorations on the character or in the scene flutter. Fig. 8 is an effect diagram of the special effect of melting ice and snow by blowing: according to the embodiment of the invention, ice and snow melting in the virtual scene can be controlled by blowing into the microphone. Fig. 9 is an effect diagram of releasing character skills by blowing: specific skills of a virtual character can be triggered by blowing into the microphone.
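A blow detector of the kind described, keying on both the level and the fluctuation of the microphone volume over a window, might look like the following sketch (the thresholds and the normalized-volume input are illustrative assumptions, not values from the patent):

```python
def detect_blow(volume_samples, level_threshold=0.6, var_threshold=0.02):
    """Heuristic blow detector over a window of normalized mic volumes.

    Blowing into a microphone typically produces both a high average level
    and large fluctuations, so both conditions are checked; a steady loud
    tone (high level, low variance) is not treated as blowing.
    """
    if not volume_samples:
        return False
    n = len(volume_samples)
    mean = sum(volume_samples) / n
    variance = sum((v - mean) ** 2 for v in volume_samples) / n
    return mean > level_threshold and variance > var_threshold
```

In the game, a positive detection would trigger the blown-over animation or the wind field described above.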
In an interactive scenario, when the interactive information is the orientation information of the first game client, the generating of the control instruction of the virtual game level scenario based on the interactive information includes: determining the movement track of the first game client or the second game client according to the direction information; and searching the role special effect matched with the moving track in a preset special effect library, and generating a second rendering instruction, wherein the second rendering instruction is used for instructing the role special effect to be rendered in the virtual game level scene.
This interactive scene realizes shaking the mobile phone to trigger special effects such as the lightness skill (qinggong). Whether a player shakes the mobile phone upwards is estimated by analyzing the gravitational acceleration of the mobile phone over a period of time. When the player shakes the mobile phone upwards, the character controlled by the player plays a lightness-skill action, which looks as if the player has flung the character into the air. In addition, other device movements, such as flipping the mobile phone, can trigger effects such as the character crawling or rolling. Fig. 10 is an effect diagram of releasing character skills through gravity sensing according to the embodiment of the present invention: a skill of the virtual character can be triggered by shaking the game device, achieving a "flying over eaves and walls" special effect.
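The upward-shake detection described above can be sketched as a simple peak test on accelerometer readings along the gravity axis (the threshold is an illustrative assumption, not a value from the patent):

```python
def detect_upward_shake(accel_y_samples, threshold=12.0):
    """Detect an upward flick of the device from vertical accelerometer data.

    A resting device reads roughly +9.8 m/s^2 on the gravity axis; a sharp
    upward shake produces a peak well above that, so any sample exceeding
    the threshold is taken as a shake.
    """
    return any(a > threshold for a in accel_y_samples)
```

A production detector would likely smooth the signal and debounce repeated triggers, but the peak test captures the idea of estimating the gesture from gravity-sensor readings over a window.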
Besides the camera, positioning module and microphone, the triggering hardware of this embodiment may also include the touch screen, volume keys, power key, mute key, navigation keys and the like, and a correspondence between each key and its instruction can be preset. In addition to directly acquiring sensor states, the sensor data within a certain period can be analyzed and processed to obtain continuous data such as gestures and postures. Genuinely input interactive information is thus combined into the AR game strategy, making the augmented reality more realistic.
In some examples, the current AR scene may also be photographed and shared, and the game play data interaction between the first virtual character controlled by the first game client and the second virtual character controlled by the second game client in the virtual game level scene includes: intercepting or recording a display picture of a virtual game stage scene on a display screen of a first game client or a second game client; and sharing or saving the display picture.
Because the virtual articles and the actual scene are both displayed through the mobile phone, the photographing and sharing function is in effect a simple screen capture function. After the screen is captured, a shareable picture is obtained, which can be saved locally and shared in various ways. FIG. 11 is a diagram illustrating saving a game screen by screen capture according to an embodiment of the present invention.
Optionally, after adding the second virtual character controlled by the second game client to the virtual game level scene, the method further includes: acquiring identification information of a virtual element in a virtual game level scene and position information of the virtual element in a real scene; after the identification information and the position information are associated, storing the identification information and the position information in a game server; and when the first game client enters the virtual game level scene again, generating a virtual element in the real scene according to the identification information and the position information.
In some gaming strategies for virtual gaming level scenarios, a player is allowed to place a virtual item on a virtual scene that appears as if it were placed directly in a real location, as the virtual scene is associated with a real physical location. The items placed by the players are synchronized to other clients through the server, so each player can see the items placed by other players, and if two players are in the same place, the items seen by the players are in the same position in the real world due to the synchronized scenes of the two players.
In addition, the server records all items that the player has placed in a certain room (records item ID, item location), and stores it with WorldMap data.
When the player next enters the AR mode of the virtual game again at the same position or a nearby position, the previously stored scene data can be selected for use, and the server then synchronizes the WorldMap and all previously stored item data to the client. If the client successfully resolves the WorldMap through ARKit and re-establishes the coordinate system, it regenerates all virtual articles at the corresponding positions using the received virtual-article information of the scene. To the player, the virtual articles appear to still exist where they were last placed in the real world. Fig. 12 is an effect diagram of associating virtual elements in a real scene according to an embodiment of the present invention.
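The server-side bookkeeping described above, recording item IDs and positions per room so they can be replayed to a returning client, can be sketched as follows (an in-memory stand-in; a real server would persist this alongside the WorldMap data and push updates to every client in the room):

```python
class RoomItemStore:
    """Record of items placed in each room, stored with the WorldMap data."""

    def __init__(self):
        self._rooms = {}  # room_id -> list of (item_id, position) tuples

    def place_item(self, room_id, item_id, position):
        # Called when a player places a virtual article in the scene.
        self._rooms.setdefault(room_id, []).append((item_id, position))

    def items_for(self, room_id):
        # Returned on re-entry so the client can regenerate every virtual
        # article at its stored position after resolving the WorldMap.
        return list(self._rooms.get(room_id, []))
```

The same lookup serves a new player joining the room, which is why the room owner's previously placed items are visible to them.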
When a new player joins the room owner's room, the new player, if in the same location, can see what the room owner previously placed in the room through the same mechanism.
In one embodiment of the present embodiment, view restoration and repair is performed by stored historical world map data. After joining a second virtual character controlled by a second game client to the virtual game level scene, the method further comprises: storing world map data of the virtual game level scene at the first time, or acquiring the world map data of the virtual game level scene at the first time from the second game client, wherein the world map data is used for generating an AR view of the virtual game level scene; and reconstructing or correcting the AR view of the virtual game stage scene at the second time according to the world map data.
For various reasons, the ARKit session occasionally encounters an abnormality in its coordinate tracking function. In order for the player to still have an opportunity to restore the AR session in this case (the AR session is the bridge connecting the underlying data and the AR view), the ARKit mechanism that can parse a WorldMap and reconstruct the coordinate system is utilized: the player is allowed to download the WorldMap-related data again and perform scene reconstruction, and the reconstruction process is the same as that of a player joining a room in a multiplayer game.
In this embodiment, there are two sources of WorldMap data: data previously uploaded by a first game client; and the second game client synchronizes the data. To this end, the player is allowed to extract and upload WorldMap data at any time, and is also allowed to download and reconstruct WorldMap data of other players in the room.
With the correction mechanism of this embodiment, the game's AR gameplay has a higher fault tolerance and can repair and correct itself, avoiding situations where the game cannot proceed due to an abnormal AR session. By adding a re-correction mechanism, the chance that the AR experience cannot proceed normally due to a damaged ARSession is reduced, which is an effective compensation scheme for problems inherent to ARKit.
Optionally, determining the real scene according to the location information of the second game client includes: acquiring position information of a second game client; judging whether the second game client and the first game client are at the same physical position or not according to the position information; if the second game client and the first game client are at the same physical position, determining a real scene according to scene information of the first game client; and if the second game client and the first game client are not at the same physical position, determining the real scene according to the scene information of the current position acquired by the second game client.
According to the above embodiments, projecting a virtual game level scene to a real scene includes the following examples:
example one: if the second game client and the first game client are in the same real scene, the world map information and the scene projection information of the first game client are obtained from the game server, and the virtual game level scene is projected to the real scene according to the world map information and the scene projection information;
example two: and if the second game client and the first game client are not in the same real scene, constructing a world map according to scene information acquired by the second game client, and projecting the virtual game level scene to the real scene according to the world map.
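The same-location decision in the two examples above can be sketched with a great-circle distance test on the two clients' GPS fixes (the haversine formula; the 50 m radius is an illustrative threshold, not a value specified by the patent):

```python
import math

def same_physical_location(lat1, lon1, lat2, lon2, radius_m=50.0):
    """Decide whether two clients share a physical location from GPS fixes.

    Computes the haversine great-circle distance between the two points
    and compares it against a proximity radius.
    """
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    distance = 2 * r * math.asin(math.sqrt(a))
    return distance <= radius_m
```

A true result corresponds to example one (reuse the first client's WorldMap and projection information), a false result to example two (the second client builds its own world map).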
After the first game client logs into the virtual game level scene, the player may invite other players to join, either face to face or remotely, or a second game client may actively join through automatic matching by the game server, so that multiple players share the same level in one game.
The following describes the game interaction flow of a multi-player shared-level game: when player B (corresponding to the second game client) wants to see the same scene as player A (corresponding to the first game client) in the same place, in addition to synchronizing the ARKit coordinate system through the WorldMap, it is necessary to synchronize player A's projection point (P0 in the above embodiment) through the game server; the scene anchor point (P1) and the scale factor K are identical for all clients. With this information, player B processes the relationship between the positions of the virtual camera and the real camera using the same rule, and the two players can see the same virtual scene in the same real place.
In terms of game strategy, forms such as a "room", invitations, and automatic matching can be used: each player can create a room and then project a scene on their own. Other players may be invited into the room, or may actively enter a friend's room. After entering another player's room, the joining player indicates whether the two players are in the same place.
If the two players are in the same place, the WorldMap information and the scene projection information of the room owner are distributed to the new player through the server, after which the two players can see the same virtual scene in the same real place.
If the two players are in different locations, the new player performs scene projection on their own. In this way, although the effect of "seeing the same virtual object at the same physical location" cannot be achieved, all virtual objects in the game scene can still be synchronized normally. Each player can still see and interact with the other players, jointly transform the virtual scene, and jointly complete the game task.
In this embodiment, synchronizing the virtual game level scene presented on the first game client to the second game client via the game server comprises at least one of: synchronizing a virtual line of sight presented on a first game client to a second game client through a game server; the virtual trajectory of the item presented on the first game client is synchronized to the second game client through the game server.
In order to guarantee the game experience of two players in different places to the greatest extent, most data synchronization is based on virtual scenes rather than real coordinates, for example: synchronization of virtual lines of sight (a player can see the virtual lines of sight of other players) and synchronization of item trajectories (in some play modes, a player emits a virtual item that follows a trajectory). Even when the two players are physically in different places, each can feel that the other is nearby.
After player A creates a virtual room on the first game client, the server synchronizes the room information to all of the player's friends, who can choose to enter the room. After player A projects the scene to a real physical location, the client collects the WorldMap at that time and sends it to the server together with the associated scene data (the scene projection point); the server associates these data with the room and stores them.
In some scenarios of this embodiment, synchronizing, by the game server, the scene information of the real scene and the corresponding casting information collected by the first game client to the second game client includes: determining that a second game client enters a virtual level room where a first game client is located, and the second game client and the first game client are at the same physical position; if the first game client side synchronizes the scene information and the corresponding projection information, the game server sends the scene information and the corresponding projection information to the second game client side; if the first game client does not synchronize the scene information and the corresponding projection information, the game server sends the scene information and the corresponding projection information to the second game client after determining that the first game client completes synchronization.
Suppose new player B joins A's virtual room on the second game client, and both are in the same physical location. If A has already synchronized its own scene information, the server sends these data directly to B; otherwise, B waits until A projects the scene and synchronizes the information, after which B receives the corresponding data. When B obtains the data, the client attempts to parse the WorldMap through ARKit and associates the world seen by A with the world seen by B; after the association succeeds, the two players can see the same world. If B and A are not in the same place, B performs scene projection on its own, after which the two players see the same virtual scene.
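The join flow above can be sketched as a small piece of server-side room state: a co-located joiner either receives the owner's WorldMap and projection data immediately or waits until the owner finishes synchronizing, while a remotely located joiner is told to project a scene locally. All class and method names here are hypothetical; the patent describes only the behaviour, not an API.

```python
class VirtualRoom:
    """Minimal sketch of the server-side room state described above."""

    def __init__(self, owner_id):
        self.owner_id = owner_id
        self.scene_data = None  # WorldMap + projection point, once synced
        self.waiting = []       # same-place joiners awaiting scene data
        self.sent = {}          # joiner_id -> data actually delivered

    def on_owner_synced(self, scene_data):
        # The owner finished projecting the scene and uploaded its data;
        # release every joiner that was waiting for it.
        self.scene_data = scene_data
        for joiner_id in self.waiting:
            self.sent[joiner_id] = scene_data
        self.waiting.clear()

    def join(self, joiner_id, same_place):
        if not same_place:
            # Different location: the joiner projects a scene on its own.
            self.sent[joiner_id] = "project_locally"
        elif self.scene_data is not None:
            # Owner already synced: deliver the shared scene data at once.
            self.sent[joiner_id] = self.scene_data
        else:
            # Owner not yet synced: the joiner waits.
            self.waiting.append(joiner_id)
```

A real implementation would transmit serialized WorldMap bytes and handle disconnects; the string payloads here only mark which path each joiner took.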
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
Example 2
In this embodiment, a device for loading a virtual game is further provided, which is used to implement the foregoing embodiments and preferred embodiments; descriptions that have already been given are not repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
Fig. 13 is a block diagram of an apparatus for loading a virtual game according to an embodiment of the present invention, as shown in fig. 13, the apparatus including: an acquisition module 130, a projection module 132, a first synchronization module 134, a join module 136, a control module 138, wherein,
the acquisition module 130 is configured to acquire scene information of a real scene through a first game client;
a projection module 132, configured to project the virtual game level scene presented on the first game client into the real scene according to the scene information and the corresponding projection information;
a first synchronization module 134, configured to synchronize scene information of a real scene and corresponding projection information acquired by a first game client to a second game client through a game server;
a joining module 136, configured to join a first virtual character controlled by a first game client into the virtual game level scene, and join a second virtual character controlled by a second game client into the virtual game level scene;
and the control module 138 is configured to control game data interaction between a first virtual character controlled by the first game client and a second virtual character controlled by the second game client in the virtual game level scene.
Optionally, the apparatus further comprises: the second synchronization module is used for synchronizing the virtual game stage scene presented on the first game client to the second game client through the game server; and the processing module is used for projecting the virtual game level scene presented on the second game client into the real scene and synchronizing the first virtual character data controlled by the first game client to the second game client through the game server.
Optionally, the apparatus further comprises: the first acquisition module is used for acquiring first position information of the first game client and second position information of the second game client before the second synchronization module synchronizes the virtual game level scene presented on the first game client to the second game client through the game server; the judging module is used for judging whether the first game client and the second game client are at the same position or not according to the first position information and the second position information; and the determining module is used for determining that the virtual game level scene presented on the first game client is synchronized to the second game client through the game server if the first game client and the second game client are not at the same position.
Optionally, the second synchronization module includes at least one of: a first synchronization unit for synchronizing the virtual sight line presented on the first game client to the second game client through the game server; and the second synchronization unit is used for synchronizing the virtual track of the object presented on the first game client to the second game client through the game server.
Optionally, the apparatus further comprises: and the sending module is used for sending the scene information and the corresponding projection information of the first game client to a game server after the projection module projects the virtual game level scene presented on the first game client into the real scene, wherein the game server is used for establishing and storing the incidence relation between the scene information and the projection information and the virtual level room.
Optionally, the first synchronization module includes: the first determining unit is used for determining that the second game client enters a virtual level room where the first game client is located, and the second game client and the first game client are at the same physical position; a sending unit, configured to send the scene information and the corresponding projection information to the second game client by the game server if the scene information and the corresponding projection information have been synchronized by the first game client; if the scene information and the corresponding projection information are not synchronized by the first game client, the game server sends the scene information and the corresponding projection information to the second game client after determining that the first game client is synchronized.
Optionally, the first synchronization module includes: a holding unit, configured to keep a projection matrix of a virtual camera of the virtual game level scene consistent with a projection matrix of a real camera of the first game client, where the scene information includes the real camera, and the projection information includes the virtual camera; the second determining unit is used for determining a first reference point of the real scene in a world coordinate system and a second reference point, corresponding to the first reference point, in a screen coordinate system in the virtual game level scene; and the conversion unit is used for converting the screen coordinate system into the world coordinate system according to the first reference point and the second reference point and projecting the converted virtual game level scene to the real scene.
Optionally, the conversion unit includes: the calculating subunit is used for calculating the element proportion K of the screen coordinate system and the world coordinate system; a converting subunit, configured to convert the screen coordinate system into the world coordinate system according to the following formula: P3 = P1 + (P2 − P0) × K; wherein P3 is the coordinate position of the virtual camera, P1 is the coordinate position of the second reference point, P2 is the coordinate position of the real camera, and P0 is the coordinate position of the first reference point.
Optionally, the control module includes: the acquisition unit is used for acquiring the interaction information of the virtual game stage scene through the first game client side or the second game client side; the generating unit is used for generating a control instruction of the virtual game stage scene based on the interaction information; and the control unit is used for controlling the virtual game stage scene according to the control instruction.
Optionally, when the interaction information is a viewpoint ray, the generating unit includes: the positioning subunit is used for locating the position of the intersection point of the viewpoint ray with the virtual game level scene; a first determining subunit, configured to determine the intersection point position as the target position of the first virtual character's movement; and the first generation subunit is used for generating a movement instruction according to the target position, wherein the movement instruction is used for instructing the first virtual character to move to the target position.
Optionally, when the interaction information is an ambient volume blown into a microphone, the generating unit includes: the triggering subunit is used for triggering an environment wind field according to the ambient volume, wherein the strength of the environment wind field corresponds to the magnitude of the ambient volume; and the second generation subunit is used for generating a wind animation parameter according to the environment wind field and generating a first rendering instruction, wherein the first rendering instruction is used for instructing a wind animation special effect to be rendered in the virtual game level scene using the wind animation parameter.
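As an illustration of the volume-to-wind mapping described above, the sketch below normalizes a microphone volume reading into a wind-field strength and derives wind-animation parameters from it. The decibel range, parameter names, and scaling factors are assumptions, since the patent does not specify concrete values.

```python
def wind_field_from_volume(volume_db, min_db=30.0, max_db=90.0):
    """Map an ambient microphone volume (dB, hypothetical range) to a
    normalized wind-field strength in [0, 1]: a louder blow into the
    microphone produces a stronger environment wind field."""
    clamped = max(min_db, min(volume_db, max_db))
    return (clamped - min_db) / (max_db - min_db)

def wind_animation_params(strength):
    """Derive illustrative wind-animation parameters from field strength."""
    return {
        "sway_amplitude": 0.5 * strength,        # how far foliage bends
        "gust_frequency": 1.0 + 4.0 * strength,  # gusts per second
    }
```

In an actual client, the resulting parameters would be packed into the first rendering instruction and handed to the renderer each frame rather than returned as a plain dictionary.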
Optionally, when the interaction information is the direction information of the first game client or the second game client, the generating unit includes: the second determining subunit is used for determining the movement track of the first game client or the second game client according to the direction information; and the third generating subunit is configured to search a preset special effect library for the role special effect matched with the movement track, and generate a second rendering instruction, where the second rendering instruction is used to instruct rendering of the role special effect in the virtual game level scene.
Optionally, the control module includes: the capturing unit is used for taking a screenshot of or recording a display picture of the virtual game level scene on a display screen of the first game client or the second game client; and the sharing unit is used for sharing or saving the display picture.
Optionally, the apparatus further comprises: the second obtaining module is used for obtaining the identification information of a virtual element in the virtual game level scene and the position information of the virtual element in the real scene after the joining module adds the second virtual character controlled by the second game client to the virtual game level scene; the first storage module is used for storing the identification information and the position information in a game server after associating them; and the generating module is used for generating the virtual element in the real scene according to the identification information and the position information when the first game client enters the virtual game level scene again.
Optionally, the apparatus further comprises: the second storage module is used for storing world map data of the virtual game level scene at the first time after the joining module joins a second virtual character controlled by a second game client into the virtual game level scene, or acquiring the world map data of the virtual game level scene at the first time from the second game client, wherein the world map data is used for generating an AR view of the virtual game level scene; and the building module is used for reconstructing or correcting the AR view of the virtual game level scene at the second time according to the world map data.
According to another embodiment of the present invention, there is provided a system for loading a virtual game, including a first game client and a second game client, and a game server including the apparatus as described in the above embodiments.
It should be noted that, the above modules may be implemented by software or hardware, and for the latter, the following may be implemented, but not limited to: the modules are all positioned in the same processor; alternatively, the modules are respectively located in different processors in any combination.
Example 3
Embodiments of the present invention also provide a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
Alternatively, in the present embodiment, the storage medium may be configured to store a computer program for executing the steps of:
S1, acquiring scene information of a real scene through a first game client;
S2, projecting the virtual game stage scene presented on the first game client into the real scene according to the scene information and the corresponding projection information;
S3, synchronizing the scene information of the real scene and the corresponding projection information collected by the first game client to the second game client through the game server;
S4, adding a first virtual character controlled by a first game client into the virtual game stage scene, and adding a second virtual character controlled by a second game client into the virtual game stage scene;
and S5, controlling game play data interaction between the first virtual character controlled by the first game client and the second virtual character controlled by the second game client in the virtual game stage scene.
Optionally, in this embodiment, the storage medium may include, but is not limited to: various media capable of storing computer programs, such as a usb disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Embodiments of the present invention also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
S1, acquiring scene information of a real scene through a first game client;
S2, projecting the virtual game stage scene presented on the first game client into the real scene according to the scene information and the corresponding projection information;
S3, synchronizing the scene information of the real scene and the corresponding projection information collected by the first game client to the second game client through the game server;
S4, adding a first virtual character controlled by a first game client into the virtual game stage scene, and adding a second virtual character controlled by a second game client into the virtual game stage scene;
and S5, controlling game play data interaction between the first virtual character controlled by the first game client and the second virtual character controlled by the second game client in the virtual game stage scene.
Optionally, the specific examples in this embodiment may refer to the examples described in the above embodiments and optional implementation manners, and this embodiment is not described herein again.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present application, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present application and it should be noted that those skilled in the art can make several improvements and modifications without departing from the principle of the present application, and these improvements and modifications should also be considered as the protection scope of the present application.
Embodiments of the present invention also include these and other aspects as specified in the following numbered clauses:
1. a method of loading a virtual game, comprising:
acquiring scene information of a real scene through a first game client;
projecting the virtual game stage scene presented on the first game client into the real scene according to the scene information and the corresponding projection information;
synchronizing scene information of a real scene and corresponding projection information acquired by a first game client to a second game client through a game server;
adding a first virtual role controlled by a first game client into the virtual game level scene;
adding a second virtual role controlled by a second game client into the virtual game level scene;
and the first virtual character controlled by the first game client side and the second virtual character controlled by the second game client side perform game data interaction in the virtual game level scene.
2. The method of clause 1, further comprising:
synchronizing the virtual game level scene presented on the first game client to the second game client through the game server;
and projecting the virtual game stage scene presented on the second game client into the real scene, and synchronizing the first virtual character data controlled by the first game client to the second game client through the game server.
3. The method of clause 2, wherein before the virtual game level scene presented on the first game client is synchronized to the second game client through the game server, the method further comprises:
acquiring first position information of a first game client and second position information of a second game client;
judging whether the first game client and the second game client are at the same position or not according to the first position information and the second position information;
and if the first game client side and the second game client side are not at the same position, determining that the virtual game level scene presented on the first game client side is synchronized to the second game client side through the game server.
4. The method of clause 2, synchronizing the virtual game level scene presented on the first game client to the second game client via the game server comprises at least one of:
synchronizing a virtual line of sight presented on a first game client to a second game client through a game server;
the virtual trajectory of the item presented on the first game client is synchronized to the second game client through the game server.
5. The method of clause 1, synchronizing scene information of a real scene and corresponding casting information collected by a first game client to a second game client through a game server comprises:
determining that the second game client enters a virtual level room where the first game client is located, and the second game client and the first game client are at the same physical position;
if the scene information and the corresponding projection information are synchronized by the first game client, the game server sends the scene information and the corresponding projection information to the second game client; if the scene information and the corresponding projection information are not synchronized by the first game client, the game server sends the scene information and the corresponding projection information to the second game client after determining that the first game client is synchronized.
6. The method of clause 1, projecting the virtual game level scene presented on the first game client into the real scene according to the scene information and the corresponding projection information, comprising:
keeping a projection matrix of a virtual camera of the virtual game level scene consistent with a projection matrix of a real camera of the first game client, wherein the scene information comprises the real camera, and the projection information comprises the virtual camera;
determining a first reference point of the real scene in a world coordinate system and a second reference point corresponding to the first reference point in the virtual game level scene in a screen coordinate system;
and converting the screen coordinate system into the world coordinate system according to the first reference point and the second reference point, and projecting the converted virtual game level scene to the real scene.
7. The method of clause 6, converting the screen coordinate system to the world coordinate system according to the first reference point and the second reference point comprises:
calculating the element proportion K of the screen coordinate system and the world coordinate system;
converting the screen coordinate system to the world coordinate system according to the following formula:
P3=P1+(P2–P0)*K;
wherein P3 is the coordinate position of the virtual camera, P1 is the coordinate position of the second reference point, P2 is the coordinate position of the real camera, and P0 is the coordinate position of the first reference point.
8. The method of clause 1, wherein the game play data interaction of a first virtual character controlled by a first game client with a second virtual character controlled by a second game client in the virtual game level scene comprises:
acquiring the interactive information of the virtual game stage scene through the first game client or the second game client;
generating a control instruction of the virtual game level scene based on the interaction information;
and controlling the virtual game level scene according to the control instruction.
9. The method of clause 8, wherein generating the control instruction for the virtual game level scene based on the interaction information when the interaction information is a viewpoint ray comprises:
positioning the intersection point position of the viewpoint ray to the virtual game level scene;
determining the intersection point position as a target position of the first virtual character movement;
and generating a movement instruction according to the target position, wherein the movement instruction is used for indicating the first virtual character to move to the target position.
10. The method of clause 8, wherein, when the interaction information is an ambient volume blown into a microphone, generating the control instruction of the virtual game level scene based on the interaction information comprises:
triggering an environment wind field according to the ambient volume, wherein the strength of the environment wind field corresponds to the magnitude of the ambient volume;
and generating a wind animation parameter according to the environment wind field, and generating a first rendering instruction, wherein the first rendering instruction is used for instructing a wind animation special effect to be rendered in the virtual game level scene using the wind animation parameter.
11. The method of clause 8, wherein generating the control instruction of the virtual game level scene based on the interaction information when the interaction information is the orientation information of the first game client or the second game client comprises:
determining a movement track of the first game client or the second game client according to the direction information;
and searching the role special effect matched with the moving track in a preset special effect library, and generating a second rendering instruction, wherein the second rendering instruction is used for instructing the role special effect to be rendered in the virtual game level scene.
12. The method of clause 1, after projecting the virtual game level scene presented on the first game client into the real scene, further comprising:
and sending the scene information and the corresponding projection information of the first game client to a game server, wherein the game server is used for establishing and storing the incidence relation between the scene information and the projection information and a virtual level room.
13. The method of clause 1, wherein the game play data interaction of a first virtual character controlled by a first game client with a second virtual character controlled by a second game client in the virtual game level scene comprises:
taking a screenshot of or recording a display picture of the virtual game level scene on a display screen of the first game client or the second game client;
and sharing or saving the display picture.
14. The method of clause 1, after joining a second virtual character controlled by a second game client to the virtual game level scene, further comprising:
acquiring identification information of a virtual element in the virtual game level scene and position information of the virtual element in the real scene;
after the identification information is associated with the position information, storing the identification information and the position information in a game server;
and when the first game client enters the virtual game stage scene again, generating the virtual element in the real scene according to the identification information and the position information.
15. The method of clause 1, after joining a second virtual character controlled by a second game client to the virtual game level scene, further comprising:
storing world map data of the virtual game level scene at a first time, or acquiring the world map data of the virtual game level scene at the first time from the second game client, wherein the world map data is used for generating an AR view of the virtual game level scene;
and reconstructing or correcting the AR view of the virtual game level scene at the second time according to the world map data.
16. An apparatus for loading a virtual game, comprising:
the acquisition module is used for acquiring scene information of a real scene through the first game client;
the projection module is used for projecting the virtual game level scene presented on the first game client into the real scene according to the scene information and the corresponding projection information;
the first synchronization module is used for synchronizing scene information of a real scene and corresponding projection information acquired by the first game client to the second game client through the game server;
the joining module is used for joining a first virtual character controlled by a first game client into the virtual game level scene and joining a second virtual character controlled by a second game client into the virtual game level scene;
and the control module is used for controlling the interaction of game data of a first virtual character controlled by the first game client and a second virtual character controlled by the second game client in the virtual game level scene.
17. The apparatus of clause 16, further comprising:
the second synchronization module is used for synchronizing the virtual game level scene presented on the first game client to the second game client through the game server;
and the processing module is used for projecting the virtual game level scene presented on the second game client into the real scene and synchronizing the first virtual character data controlled by the first game client to the second game client through the game server.
18. The apparatus of clause 16, the first synchronization module comprising:
a holding unit, configured to keep a projection matrix of a virtual camera of the virtual game level scene consistent with a projection matrix of a real camera of the first game client, where the scene information includes the real camera, and the projection information includes the virtual camera;
the second determining unit is used for determining a first reference point of the real scene in a world coordinate system and a second reference point, corresponding to the first reference point, in a screen coordinate system in the virtual game level scene;
and the conversion unit is used for converting the screen coordinate system into the world coordinate system according to the first reference point and the second reference point and projecting the converted virtual game level scene to the real scene.
19. A storage medium having a computer program stored therein, wherein the computer program is arranged to perform the method of any of clauses 1 to 15 when run.
20. An electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the method of any of clauses 1 to 15.

Claims (10)

1. A method of loading a virtual game, comprising:
acquiring scene information of a real scene through a first game client;
projecting the virtual game level scene presented on the first game client into the real scene according to the scene information and the corresponding projection information;
synchronizing scene information of a real scene and corresponding projection information acquired by a first game client to a second game client through a game server;
adding a first virtual character controlled by a first game client into the virtual game level scene;
adding a second virtual character controlled by a second game client into the virtual game level scene;
and performing, in the virtual game level scene, game data interaction between the first virtual character controlled by the first game client and the second virtual character controlled by the second game client.
2. The method of claim 1, further comprising:
synchronizing the virtual game level scene presented on the first game client to the second game client through the game server;
and projecting the virtual game level scene presented on the second game client into the real scene, and synchronizing the first virtual character data controlled by the first game client to the second game client through the game server.
3. The method of claim 2, wherein prior to synchronizing the virtual game level scene presented on the first game client to the second game client through the game server, the method further comprises:
acquiring first position information of a first game client and second position information of a second game client;
determining, according to the first position information and the second position information, whether the first game client and the second game client are at the same position;
and if the first game client and the second game client are not at the same position, synchronizing the virtual game level scene presented on the first game client to the second game client through the game server.
4. The method of claim 2, wherein synchronizing the virtual game level scene presented on the first game client to the second game client via the game server comprises at least one of:
synchronizing a virtual line of sight presented on a first game client to a second game client through a game server;
and synchronizing a virtual trajectory of an item presented on the first game client to the second game client through the game server.
5. The method of claim 1, wherein synchronizing scene information of the real scene and corresponding casting information collected by the first game client to the second game client via the game server comprises:
determining that the second game client has entered a virtual level room where the first game client is located and that the second game client and the first game client are at the same physical position;
if the first game client has synchronized the scene information and the corresponding projection information, sending, by the game server, the scene information and the corresponding projection information to the second game client; if the first game client has not yet synchronized them, sending, by the game server, the scene information and the corresponding projection information to the second game client after determining that the first game client has completed synchronization.
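Purely as an illustration of the gating logic in claim 5 (the patent specifies no data model, so every name below is an assumption), a server-side sketch might look like:

```python
from dataclasses import dataclass, field

@dataclass
class Client:
    # Hypothetical client state; field names are ours, not the patent's.
    client_id: str
    location: str          # physical position reported by the device
    has_synced: bool = False
    inbox: list = field(default_factory=list)

def sync_scene_to_second_client(first, second, room):
    """Forward the first client's scene and projection information to a
    second client that is in the same virtual level room and at the same
    physical position, but only once the first client has synchronized."""
    if second.location != first.location:        # not at the same physical position
        return False
    if second.client_id not in room["members"]:  # not in the first client's level room
        return False
    if not first.has_synced:                     # first client not yet synchronized
        return False
    second.inbox.append((room["scene_info"], room["projection_info"]))
    return True
```

In the claim's "not yet synchronized" branch the server defers rather than drops the request; this sketch simply returns False so a caller can retry once the first client reports completion.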
6. The method of claim 1, wherein projecting the virtual game level scene presented on the first game client into the real scene according to the scene information and the corresponding projection information comprises:
keeping a projection matrix of a virtual camera of the virtual game level scene consistent with a projection matrix of a real camera of the first game client, wherein the scene information comprises the real camera, and the projection information comprises the virtual camera;
determining a first reference point of the real scene in a world coordinate system and a second reference point corresponding to the first reference point in the virtual game level scene in a screen coordinate system;
and converting the screen coordinate system into the world coordinate system according to the first reference point and the second reference point, and projecting the converted virtual game level scene to the real scene.
7. The method of claim 6, wherein converting the screen coordinate system to the world coordinate system based on the first reference point and the second reference point comprises:
calculating the element proportion K of the screen coordinate system and the world coordinate system;
converting the screen coordinate system to the world coordinate system according to the following formula:
P3 = P1 + (P2 - P0) * K;
wherein P3 is the coordinate position of the virtual camera, P1 is the coordinate position of the second reference point, P2 is the coordinate position of the real camera, and P0 is the coordinate position of the first reference point.
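The conversion formula of claim 7 is a straight vector expression; a sketch with NumPy follows, where the variable names track the claim and everything else is our own:

```python
import numpy as np

def screen_to_world(p0, p1, p2, k):
    """P3 = P1 + (P2 - P0) * K, where P0 is the first reference point of the
    real scene in the world coordinate system, P1 the corresponding second
    reference point in the screen coordinate system, P2 the real camera
    position, K the element ratio of the two coordinate systems, and the
    result P3 the coordinate position of the virtual camera."""
    p0, p1, p2 = (np.asarray(p, dtype=float) for p in (p0, p1, p2))
    return p1 + (p2 - p0) * k
```

With K precomputed from matched reference distances, this places the virtual camera so the projected level scene lines up with the real scene.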
8. An apparatus for loading a virtual game, comprising:
the acquisition module is used for acquiring scene information of a real scene through the first game client;
the projection module is used for projecting the virtual game level scene presented on the first game client into the real scene according to the scene information and the corresponding projection information;
the first synchronization module is used for synchronizing scene information of a real scene and corresponding projection information acquired by the first game client to the second game client through the game server;
the joining module is used for joining a first virtual character controlled by a first game client into the virtual game level scene and joining a second virtual character controlled by a second game client into the virtual game level scene;
and the control module is used for controlling the interaction of game data of a first virtual character controlled by the first game client and a second virtual character controlled by the second game client in the virtual game level scene.
9. A storage medium, in which a computer program is stored, wherein the computer program is arranged to perform the method of any of claims 1 to 7 when executed.
10. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and wherein the processor is arranged to execute the computer program to perform the method of any of claims 1 to 7.
CN202010603335.8A 2020-06-29 2020-06-29 Method and device for loading virtual game, storage medium and electronic device Pending CN111744180A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010603335.8A CN111744180A (en) 2020-06-29 2020-06-29 Method and device for loading virtual game, storage medium and electronic device


Publications (1)

Publication Number Publication Date
CN111744180A 2020-10-09

Family

ID=72677882

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010603335.8A Pending CN111744180A (en) 2020-06-29 2020-06-29 Method and device for loading virtual game, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN111744180A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112256128A (en) * 2020-10-22 2021-01-22 武汉科领软件科技有限公司 Interactive effect development platform
CN112492231A (en) * 2020-11-02 2021-03-12 重庆创通联智物联网有限公司 Remote interaction method, device, electronic equipment and computer readable storage medium
CN112717375A (en) * 2021-01-04 2021-04-30 厦门梦加网络科技股份有限公司 Game special effect realization method
CN113535171A (en) * 2021-07-23 2021-10-22 上海米哈游璃月科技有限公司 Information searching method, device, equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140213361A1 (en) * 2013-01-25 2014-07-31 Tencent Technology (Shenzhen) Company Limited Method, device, and system for interacting with a virtual character in smart terminal
CN106984043A (en) * 2017-03-24 2017-07-28 武汉秀宝软件有限公司 The method of data synchronization and system of a kind of many people's battle games
CN108830939A (en) * 2018-06-08 2018-11-16 杭州群核信息技术有限公司 A kind of scene walkthrough experiential method and experiencing system based on mixed reality
WO2019019248A1 (en) * 2017-07-28 2019-01-31 深圳市瑞立视多媒体科技有限公司 Virtual reality interaction method, device and system
CN110665219A (en) * 2019-10-14 2020-01-10 网易(杭州)网络有限公司 Operation control method and device for virtual reality game


Similar Documents

Publication Publication Date Title
CN111744202A (en) Method and device for loading virtual game, storage medium and electronic device
US11948260B1 (en) Streaming mixed-reality environments between multiple devices
CN105843396B (en) The method of multiple view is maintained on shared stabilization Virtual Space
US11620800B2 (en) Three dimensional reconstruction of objects based on geolocation and image data
CN111744180A (en) Method and device for loading virtual game, storage medium and electronic device
CN106984043B (en) Data synchronization method and system for multiplayer battle game
CN111638793B (en) Display method and device of aircraft, electronic equipment and storage medium
US9947139B2 (en) Method and apparatus for providing hybrid reality environment
CN111694430A (en) AR scene picture presentation method and device, electronic equipment and storage medium
CN111228811B (en) Virtual object control method, device, equipment and medium
CN108771866B (en) Virtual object control method and device in virtual reality
US20230050933A1 (en) Two-dimensional figure display method and apparatus for virtual object, device, and storage medium
CN112933606A (en) Game scene conversion method and device, storage medium and computer equipment
US20230245385A1 (en) Interactive method and apparatus based on virtual scene, device, and medium
CN110545363B (en) Method and system for realizing multi-terminal networking synchronization and cloud server
CN111569414B (en) Flight display method and device of virtual aircraft, electronic equipment and storage medium
KR20220125540A (en) A method for providing a virtual space client-based mutual interaction service according to location interlocking between objects in a virtual space and a real space
CN112843687B (en) Shooting method, shooting device, electronic equipment and storage medium
CN111651048B (en) Multi-virtual object arrangement display method and device, electronic equipment and storage medium
CN112843682B (en) Data synchronization method, device, equipment and storage medium
US20230117046A1 (en) Videographer mode in online games
CN117861200A (en) Information processing method and device in game, electronic equipment and storage medium
CN116912463A (en) 3D avatar processing method, apparatus, electronic device, and readable storage medium
KR20220125539A (en) Method for providing mutual interaction service according to location linkage between objects in virtual space and real space
KR20220125541A (en) Method for providing mutual interaction service based on augmented reality client according to location linkage between objects in virtual space and real space

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination