WO2024060799A1 - Multi-device collaboration method, and client - Google Patents

Multi-device collaboration method, and client

Info

Publication number
WO2024060799A1
Authority
WO
WIPO (PCT)
Prior art keywords
client
scene
coordinate system
virtual
real
Prior art date
Application number
PCT/CN2023/106607
Other languages
French (fr)
Chinese (zh)
Inventor
郑亚 (Zheng Ya)
冯艳妮 (Feng Yanni)
董亚龙 (Dong Yalong)
胡潇 (Hu Xiao)
魏记 (Wei Ji)
Original Assignee
Huawei Cloud Computing Technologies Co., Ltd. (华为云计算技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Cloud Computing Technologies Co., Ltd. (华为云计算技术有限公司)
Publication of WO2024060799A1

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/216: Input arrangements characterised by their sensors, purposes or types, using geographical information, e.g. location of the game device or player using GPS
    • A63F 13/23: Input arrangements for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F 13/31: Communication aspects specific to video games, e.g. between several handheld game devices at close range
    • A63F 13/428: Processing input control signals by mapping them into game commands, involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A63F 13/52: Controlling the output signals based on the game progress, involving aspects of the displayed game scene
    • A63F 13/5372: Controlling the output signals using indicators, for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
    • A63F 13/56: Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F 13/843: Special adaptations involving concurrently two or more players on the same game device, e.g. requiring the use of a plurality of controllers or of a specific view of game data for each player
    • A63F 13/847: Cooperative playing, e.g. requiring coordinated actions from several players to achieve a common goal
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/1062: Input arrangements for converting player-generated signals into game device control signals, specially adapted to a type of game, e.g. steering wheel
    • A63F 2300/205: Details of the game platform, for detecting the geographical location of the game platform
    • A63F 2300/308: Details of the user interface
    • A63F 2300/40: Characterised by details of platform network
    • A63F 2300/537: Details of game servers, for exchanging game data using a messaging service, e.g. e-mail, SMS, MMS
    • A63F 2300/69: Involving elements of the real world in the game world, e.g. measurement in live races, real video
    • A63F 2300/8023: The game being played by multiple players at a common site, e.g. in an arena, theatre, shopping mall using a large public display
    • A63F 2300/8082: Virtual reality

Definitions

  • This application relates to the field of Internet technology, and in particular to a multi-device collaboration method and client.
  • multiple clients can all be virtual reality (VR) clients.
  • Each VR client can access the virtual collaboration scene and present the collaboration scene in VR mode to the user.
  • when a virtual location in the collaborative scene includes a virtual object, each VR client can present the object at that virtual location in its respective VR scene, so that users of each VR client can see the object.
  • However, this collaboration method between multiple VR clients is relatively simple, and the user experience is poor.
  • In view of this, this application provides a multi-device collaboration method and client, which can realize collaboration between augmented reality (AR) clients and VR clients, breaking the barriers between AR scenes and VR scenes.
  • an embodiment of the present application provides a multi-device collaboration method, which is characterized in that the method includes:
  • the first AR client presents a first AR scene, and the first AR scene is a first collaborative scene in AR mode;
  • the first VR client presents a first VR scene, and the first VR scene is the first collaborative scene in VR mode;
  • When the first real position of the first AR scene includes a first object, the first VR client presents the first object at a first virtual position of the first VR scene, where the first real position and the first virtual position are the same position in the first collaborative scene; and/or, when the second virtual position of the first VR scene includes a second object, the first AR client presents the second object at a second real position of the first AR scene, where the second virtual position and the second real position are the same position in the first collaborative scene.
  • the first VR scene may be a virtual scene obtained by processing the first AR scene.
  • the first AR client and the first VR client can access the same first collaboration scene: the first AR client presents the first AR scene and the first VR client presents the first VR scene, that is, the first AR client and the first VR client can collaboratively present the first collaborative scene, where the first VR scene can be a virtual scene obtained by processing the first AR scene.
  • In this case, the first VR client presents the first object at the first virtual position of the first VR scene; that is, the first VR client can synchronously present the first object from the first AR scene in the first VR scene, so that the user can see the first object from the first AR scene on the first VR client.
  • Similarly, the first AR client presents the second object at the second real position of the first AR scene; that is, the first AR client can synchronously present the second object from the first VR scene in the first AR scene, so that the user can see the second object from the first VR scene on the first AR client. Therefore, the first AR client and the first VR client can each present objects from the opposite end of the first collaborative scene, so that the user of the first AR client and the user of the first VR client can both see those objects, realizing collaboration and interoperability between AR clients and VR clients.
  • the first object in the first AR scene may be a real object, a combined virtual and real object, or a virtual object, and the first object presented in the first VR scene may be a virtual object.
  • the first object in the first AR scene may be a user of the first AR client, and the first object presented in the first VR scene may be an avatar corresponding to the user.
  • the first object is an object that moves synchronously with the first AR client (for example, the first AR client is tied to the first object, or the first object carries the first AR client)
  • In this case, the position information of the first object and the position information of the first AR client may be the same, and the rotation information of the first object and the rotation information of the first AR client may be the same.
  • the first AR client can generate or collect at least one of the position information and rotation information of the first object.
  • the second object in the first VR scene may be a virtual object, and the second object in the first AR scene may also be a virtual object.
  • the first VR client can receive user input, generate or collect at least one of position information and rotation information of the second object.
  • the local coordinate system of the first AR client may be the third coordinate system
  • the local coordinate system of the first VR client may be the fourth coordinate system
  • the first AR client and the first VR client are both located in a common coordinate system.
  • the common coordinate system may include a second coordinate system.
  • the first AR client and the first VR client can convert the position information of the first object and the second object between the third coordinate system and the fourth coordinate system through the second coordinate system.
  • the first AR client can present the first object at the first real position in the first AR scene based on the position information of the first object in the third coordinate system, and the first VR client can present the first object in the first VR scene based on the position information of the first object in the fourth coordinate system.
  • the first AR client and the first VR client can convert the rotation information of the first object and the second object between the third coordinate system and the fourth coordinate system through the second coordinate system.
  • the first AR client can present the first object at the first real position in the first AR scene based on the rotation information and position information of the first object in the third coordinate system.
  • The first VR client can present the first object in the first VR scene based on the rotation information and position information of the first object in the fourth coordinate system, so that the angle of the first object in the first AR scene matches the rotation angle of the first AR client, and the angle of the first object in the first VR scene matches the rotation angle of the first VR client, thereby improving the authenticity of presenting the first object and the user experience.
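As an illustrative sketch (not part of the claims), the conversion described above can be modeled with homogeneous transforms: each client's local frame (the third or fourth coordinate system) is related to the common second coordinate system by a rigid transform, and a position is converted by going local, to common, to local. The frame origins and yaw angles below are made-up example values.

```python
import numpy as np

def frame_to_common(yaw_deg, origin):
    """4x4 homogeneous transform mapping a client's local frame (rotated
    about the vertical axis by yaw_deg, origin at `origin` in common
    coordinates) into the common (second) coordinate system."""
    t = np.radians(yaw_deg)
    T = np.eye(4)
    T[:3, :3] = [[np.cos(t), -np.sin(t), 0.0],
                 [np.sin(t),  np.cos(t), 0.0],
                 [0.0,        0.0,       1.0]]
    T[:3, 3] = origin
    return T

def convert_position(p_src, src_to_common, dst_to_common):
    """Local (third) -> common (second) -> local (fourth) coordinates."""
    p = np.append(p_src, 1.0)
    return (np.linalg.inv(dst_to_common) @ src_to_common @ p)[:3]

# Hypothetical calibration: AR frame rotated 90 degrees, origin at (10, 0, 0);
# VR frame happens to coincide with the common frame.
ar_to_common = frame_to_common(90.0, [10.0, 0.0, 0.0])
vr_to_common = frame_to_common(0.0, [0.0, 0.0, 0.0])

# First object at (1, 0, 0) in the AR client's local coordinate system.
p_vr = convert_position(np.array([1.0, 0.0, 0.0]), ar_to_common, vr_to_common)
```

The same pipeline applies to rotation information by composing the rotation parts of the transforms instead of the translation.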
  • the method further includes any one or more of the following:
  • When a first event corresponding to the first object occurs in the first AR scene, the first VR client presents the first event in the first VR scene; or,
  • When a second event corresponding to the second object occurs in the first VR scene, the first AR client presents the second event in the first AR scene.
  • the first event includes any one or more of the following:
  • the motion event of the first object includes the movement of the first object from the first real position to a third real position in the first AR scene, and the first VR client presenting the first event in the first VR scene includes:
  • the first VR client presents the movement process of the first object moving from the first virtual position in the first VR scene to a third virtual position, and the third virtual position and the third real position are the same position in the first collaborative scene.
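The movement process above can be sketched as linear interpolation between the mapped start and end positions; the step count and coordinates here are illustrative values, not from the application.

```python
def interpolate_motion(start, end, steps):
    """Yield intermediate positions so the peer client can present the
    object's movement process rather than an abrupt jump."""
    for i in range(1, steps + 1):
        a = i / steps
        yield tuple(s + a * (e - s) for s, e in zip(start, end))

# First virtual position -> third virtual position, as mapped from the
# first and third real positions of the first AR scene.
path = list(interpolate_motion((0.0, 0.0, 0.0), (2.0, 4.0, 0.0), steps=4))
```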
  • the second event includes any one or more of the following:
  • In this way, the first VR client and the first AR client improve the authenticity of the game scene they collaboratively present, thereby improving the user experience.
  • the motion event of the second object includes the movement of the second object from the second virtual position to a fourth virtual position in the first VR scene, and the first AR client presenting the second event in the first AR scene includes:
  • the first AR client presents the movement process of the second object moving from the second real position in the first AR scene to a fourth real position, where the fourth real position and the fourth virtual position are the same position in the first collaborative scene.
  • the first object is a real object or a combined virtual and real object
  • the first VR client presenting the first object at the first virtual position of the first VR scene includes:
  • the first VR client presents the virtual image corresponding to the first object in the first virtual position of the first VR scene.
  • the second object is a virtual object
  • the first AR client presents the second object at a second real location of the first AR scene, including:
  • the first AR client presents the second object or a combined virtual and real image corresponding to the second object in the second real position of the first AR scene.
  • the first object is a user of the first AR client
  • the second object is a player-controlled character controlled by the first VR client. That is to say, in the first AR scene, the user of the first AR client can interact with the player-controlled character controlled by the first VR client, and in the first VR scene, the player-controlled character controlled by the first VR client and the user of the first AR client can see and interact with each other. This enables users of VR clients and users of AR clients to see and interact with each other in the same collaborative scene, breaking the barriers between AR scenes and VR scenes.
  • the first AR scene includes at least part of the real scene
  • the first VR scene includes a virtual scene obtained by processing the at least part of the real scene.
  • the first VR scene may be a virtual scene obtained by processing the real scene in the first AR scene through a view synthesis model, such as a neural radiance fields (NeRF) network.
  • the view synthesis model can re-model and render the actual scene, thereby outputting a virtual scene corresponding to the real scene.
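As a minimal sketch of the rendering step such a view synthesis model performs (the per-sample densities and colors below are invented inputs; a real NeRF would predict them with a trained network), a pixel color is composited along a ray by volume-rendering quadrature:

```python
import numpy as np

def volume_render(sigmas, colors, deltas):
    """NeRF-style quadrature: C = sum_i T_i * (1 - exp(-sigma_i * delta_i)) * c_i,
    where T_i is the transmittance accumulated before sample i."""
    alphas = 1.0 - np.exp(-sigmas * deltas)
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    weights = trans * alphas
    return weights @ colors, weights

sigmas = np.array([0.0, 10.0, 10.0])     # empty space, then a dense surface
colors = np.array([[0.0, 0.0, 0.0],
                   [1.0, 0.2, 0.2],      # the surface's color
                   [0.2, 0.2, 1.0]])     # sample occluded behind the surface
deltas = np.ones(3)                      # sample spacing along the ray
pixel, weights = volume_render(sigmas, colors, deltas)
```

The first dense sample absorbs nearly all the weight, so the pixel takes its color and the occluded sample contributes almost nothing.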
  • before the first AR client presents the first AR scene (the first AR scene being the first collaborative scene in AR mode), the method further includes:
  • the first AR client accesses the first collaborative scene based on the real position of the first AR client in the first coordinate system, and the first VR client accesses the first collaborative scene based on the real position of the first VR client in the first coordinate system, where the real position of the first AR client in the first coordinate system and the real position of the first VR client in the first coordinate system are in the same AOI (area of interest) area; or,
  • the first AR client accesses the first collaborative scene based on the real position of the first AR client in the first coordinate system, and the first VR client accesses the first collaborative scene based on the virtual position of the first VR client in the first coordinate system, where the real position of the first AR client in the first coordinate system and the virtual position of the first VR client in the first coordinate system are in the same AOI area.
  • the first coordinate system may be a common coordinate system where the first AR client and the first VR client are located.
  • the first AR client can access the first collaborative scene based on the real position of the first AR client in the first coordinate system
  • the first VR client can access the first collaborative scene based on the real position or the virtual position of the first VR client in the first coordinate system. This enables the first AR client and the first VR client to access the same first collaboration scene regardless of whether they are at the same location in the real world, realizing cross-region and cross-device collaboration and improving the flexibility and user experience of multi-device collaboration.
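A toy version of the AOI check could look like the following; the grid size, positions, and the choice of a uniform grid are invented for illustration, since the application does not specify how AOI areas are delimited.

```python
def aoi_cell(position, cell_size=100.0):
    """Map a position in the first coordinate system onto a grid cell;
    clients whose positions fall in the same cell join the same
    first collaborative scene."""
    return tuple(int(c // cell_size) for c in position)

def same_scene(pos_a, pos_b, cell_size=100.0):
    return aoi_cell(pos_a, cell_size) == aoi_cell(pos_b, cell_size)

# AR client's real position vs. the VR client's chosen virtual position.
joined = same_scene((130.0, 40.0), (170.0, 90.0))  # both in cell (1, 0)
apart = same_scene((130.0, 40.0), (230.0, 40.0))   # cells (1, 0) and (2, 0)
```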
  • the method further comprises:
  • the first AR client presents a third object in the first AR scene
  • the first VR client presents a virtual image corresponding to the third object in the first VR scene
  • the third object is an object included in the second AR scene presented by the second AR client
  • the third object is a real object or a combination of virtual and real objects.
  • the method further includes:
  • the first AR client presents a fourth object in the first AR scene
  • the first VR client presents the fourth object in the first VR scene
  • the fourth object is an object included in the second VR scene presented by a second VR client
  • the fourth object is a virtual object.
  • the first AR client presents the second object in a second real-world location of the first AR scene, including:
  • the first AR client obtains the position of the second object in a second coordinate system, and the second coordinate system is a public coordinate system;
  • the first AR client converts the position of the second object in the second coordinate system into the second real position of the second object in the third coordinate system.
  • the third coordinate system is the local coordinate system corresponding to the first AR client.
  • the first VR client presenting the first object at the first virtual position of the first VR scene includes:
  • the first VR client obtains the position of the first object in a second coordinate system, and the second coordinate system is a public coordinate system;
  • the first VR client converts the position of the first object in the second coordinate system into the first virtual position of the first object in the fourth coordinate system.
  • the fourth coordinate system is the local coordinate system corresponding to the first VR client.
  • the method further includes:
  • the first AR client converts the first real position of the first object in the third coordinate system into the position of the first object in the second coordinate system;
  • the first AR client sends the position of the first object in the second coordinate system to the service server;
  • the first VR client obtains the position of the first object in the second coordinate system, including:
  • the first VR client obtains the position of the first object in the second coordinate system from the service server.
  • the method further includes:
  • the first VR client converts the second virtual position of the second object in the fourth coordinate system into the position of the second object in the second coordinate system
  • the first VR client sends the position of the second object in the second coordinate system to the service server;
  • the first AR client obtains the position of the second object in the second coordinate system, including:
  • the first AR client obtains the position of the second object in the second coordinate system from the service server.
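The round trip through the service server can be sketched with an in-memory stand-in; the `ServiceServer` class and its methods are hypothetical (the application does not define the server's interface), and the frames are simplified to translation-only offsets between the third, second, and fourth coordinate systems.

```python
class ServiceServer:
    """In-memory stand-in for the service server, relaying positions
    expressed in the common (second) coordinate system."""
    def __init__(self):
        self._positions = {}

    def publish(self, object_id, position):
        self._positions[object_id] = position

    def fetch(self, object_id):
        return self._positions[object_id]

server = ServiceServer()

# First AR client: third (local) -> second (common), then publish.
ar_origin = (10.0, 0.0, 0.0)       # AR frame origin in common coordinates
first_real_pos = (1.0, 2.0, 0.0)   # first object in the third coordinate system
server.publish("first_object",
               tuple(o + p for o, p in zip(ar_origin, first_real_pos)))

# First VR client: fetch, then second (common) -> fourth (local).
vr_origin = (-5.0, 0.0, 0.0)       # VR frame origin in common coordinates
common_pos = server.fetch("first_object")
first_virtual_pos = tuple(c - o for c, o in zip(common_pos, vr_origin))
```

The symmetric path (first VR client publishing the second object's position, first AR client fetching it) uses the same two conversions in the opposite direction.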
  • embodiments of the present application provide a multi-device collaboration method, applied to the first AR device, and the method includes:
  • when the second virtual position of the first VR scene includes a second object, the second object is presented at a second real position of the first AR scene, where the second virtual position and the second real position are the same position in the first collaborative scene;
  • the first VR scene is a scene presented by the first VR client, and the first VR scene is the first collaborative scene in VR mode.
  • the method further includes:
  • When a second event corresponding to the second object occurs in the first VR scene, the first AR client presents the second event in the first AR scene.
  • the second event includes any one or more of the following:
  • embodiments of the present application provide a multi-device collaboration method, applied to the first VR device.
  • the method includes:
  • when the first real position of the first AR scene includes a first object, the first object is presented at the first virtual position of the first VR scene, where the first real position and the first virtual position are the same position in the first collaborative scene;
  • the first AR scene is a scene presented by a first AR client, and the first AR scene is the first collaborative scene in AR mode.
  • the method further includes:
  • When a first event corresponding to the first object occurs in the first AR scene, the first event is presented in the first VR scene.
  • the first event includes any one or more of the following:
  • embodiments of the present application provide a system including a first AR device and a first VR device;
  • the first AR client presents a first AR scene, and the first AR scene is a first collaborative scene in AR mode;
  • the first VR client presents a first VR scene, and the first VR scene is the first collaborative scene in VR mode;
  • When the first real position of the first AR scene includes a first object, the first VR client presents the first object at the first virtual position of the first VR scene, where the first real position and the first virtual position are the same position in the first collaborative scene; and/or, when the second virtual position of the first VR scene includes a second object, the first AR client presents the second object at a second real position of the first AR scene, where the second virtual position and the second real position are the same position in the first collaborative scene.
  • embodiments of the present application provide a device that has the function of realizing the client behavior in any possible implementation of the first aspect, or the function of realizing the client behavior in any possible implementation of the second aspect.
  • Functions can be implemented by hardware, or by hardware executing corresponding software.
  • Hardware or software includes one or more modules or units corresponding to the above functions. For example, a transceiver module or unit, a processing module or unit, an acquisition module or unit, etc.
  • embodiments of the present application provide a client, including: a memory and a processor.
  • the memory is used to store a computer program; the processor is used to execute, when calling the computer program, the method described in any one of the first aspect or any one of the second aspect.
  • embodiments of the present application provide a chip system.
  • the chip system includes a processor, the processor is coupled to a memory, and the processor executes a computer program stored in the memory to implement the method described in any one of the first aspect or any one of the second aspect.
  • the chip system may be a single chip or a chip module composed of multiple chips.
  • embodiments of the present application provide a computer-readable storage medium on which a computer program is stored.
  • when the computer program is executed by a processor, the method described in any one of the first aspect or any one of the second aspect is implemented.
  • embodiments of the present application provide a computer program product.
  • when the computer program product is run on a client, it causes the client to execute the method described in any one of the first aspect or any one of the second aspect.
  • Figure 1 is a schematic structural diagram of a client provided by an embodiment of the present application.
  • Figure 2 is a schematic structural diagram of a server provided by an embodiment of the present application.
  • Figure 3 is a schematic diagram of a game scene provided by an embodiment of the present application.
  • Figure 4 is a schematic diagram of another game scene provided by an embodiment of the present application.
  • Figure 5 is a schematic structural diagram of a multi-device collaboration system provided by an embodiment of the present application.
  • Figure 6 is a schematic structural diagram of another multi-device collaboration system provided by an embodiment of the present application.
  • Figure 7 is a schematic structural diagram of another multi-device collaboration system provided by an embodiment of the present application.
  • Figure 8 is a flow chart of a multi-device collaboration method provided by an embodiment of the present application.
  • Figure 9 is a schematic diagram of another game scene provided by an embodiment of the present application.
  • Figure 10 is a flow chart of a method of accessing a game scene provided by an embodiment of the present application.
  • Figure 11 is a schematic diagram of a login interface provided by an embodiment of the present application.
  • Figure 12 is a schematic diagram of a camp selection interface provided by an embodiment of the present application.
  • Figure 13 is a schematic diagram of a game lobby interface provided by an embodiment of the present application.
  • Figure 14 is a flow chart of a method for presenting an AR scene provided by an embodiment of the present application.
  • Figure 15 is a flow chart of a method for presenting a VR scene provided by an embodiment of the present application.
  • Figure 16 is a flow chart of a method for presenting an AR scene provided by an embodiment of the present application.
  • Figure 17 is a flow chart of a method for presenting a VR scene provided by an embodiment of the present application.
  • Figure 18 is a schematic diagram of an AR scene provided by an embodiment of the present application.
  • Figure 19 is a schematic diagram of a VR scene provided by an embodiment of the present application.
  • Figure 20 is a flow chart of a method for presenting game objects provided by an embodiment of the present application.
  • Figure 21 is a flow chart of another method of presenting game objects provided by an embodiment of the present application.
  • Figure 22 is a flow chart of another method of presenting game objects provided by an embodiment of the present application.
  • Figure 23 is a schematic diagram of another AR scene provided by an embodiment of the present application.
  • Figure 24 is a schematic diagram of another VR scene provided by an embodiment of the present application.
  • Figure 25 is a flow chart of an object interaction method provided by an embodiment of the present application.
  • Figure 26 is a schematic diagram of another AR scene provided by an embodiment of the present application.
  • Figure 27 is a schematic diagram of another VR scene provided by an embodiment of the present application.
  • the multi-device collaboration method provided in the embodiment of the present application can be applied to the client and the server.
  • clients may include AR clients and VR clients.
  • the VR client can use computer technology to simulate and generate a three-dimensional virtual world that provides users with visual, auditory, tactile and other sensory simulations. Users can naturally interact with the virtual world with the help of special input/output devices.
  • the AR client can overlay the virtual world with the real world, making it difficult for users to distinguish between real and fake. Users can interact with the virtual world and/or the real world.
  • AR clients and VR clients can be implemented as mobile phones, tablets, wearable devices (such as smart glasses, smart helmets, etc.), vehicle-mounted devices, laptops, ultra-mobile personal computers (UMPC), netbooks, personal digital assistants (PDA), and other devices.
  • FIG. 1 is a schematic structural diagram of a client 100 provided by this application.
  • the client 100 may include a processor A110, an external memory interface 120, an internal memory 121, an antenna 1, a communication module A160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a button 190, a motor 191, and a camera 193 , display screen 194, and subscriber identification module (subscriber identification module, SIM) card interface 195, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the client 100 .
  • the client 100 may include more or fewer components than shown in the figures, or combine some components, or split some components, or arrange different components.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor A 110 may include one or more processing units.
  • the processor A 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the controller may be the nerve center and command center of the client 100.
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor A 110 for storing instructions and data.
  • the memory in the processor A 110 is a cache memory. This memory can hold instructions or data that the processor A 110 has just used or cycled through. If the processor A 110 needs to use the instructions or data again, it can call them directly from this memory, which reduces repeated accesses and the waiting time of the processor A 110, thus improving the efficiency of the system.
  • the communication module A 160 can be used for communication between various internal modules of the client 100, or communication between the client 100 and other external electronic devices. Exemplarily, if the client 100 communicates with other electronic devices via a wireless connection, the communication module A 160 can provide solutions for wireless communications including 2G/3G/4G/5G applied on the client 100, and/or, the communication module A 160 can provide solutions for wireless communications including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared technology (IR), etc. applied on the client 100.
  • the antenna 1 of the client 100 is coupled with the communication module A 160 so that the client 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the client 100 implements display functions through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, which connects the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor A 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos, etc.
  • Display 194 includes a display panel.
  • the display panel can use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the client 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the client 100 can implement the shooting function through the ISP, camera 193, video codec, GPU, display screen 194 and application processor.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter opens, light is transmitted through the lens to the camera's photosensitive element, the optical signal is converted into an electrical signal, and the photosensitive element passes the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also perform algorithm optimization on image noise, brightness, and skin color, and can optimize the exposure, color temperature, and other parameters of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
  • Camera 193 is used to capture still images or video.
  • the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
  • the photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • the DSP converts digital image signals into image signals in standard formats such as RGB and YUV.
  • the client 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • the client 100 can superimpose the virtual image with the image captured by the camera 193, and display the superimposed image on the display screen 194, thereby realizing a display effect that combines virtual and real such as augmented reality.
  • if the display screen 194 is a transparent display screen, the client 100 can project a virtual image onto the display screen 194; when the user's line of sight passes through the display screen 194, the projected virtual image is superimposed on the real world, thereby achieving a display effect that combines virtual and real, such as augmented reality.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the client 100.
  • the external memory card communicates with the processor A 110 through the external memory interface 120 to implement the data storage function, such as saving music and video files in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor A 110 executes instructions stored in the internal memory 121 to execute various functional applications and data processing of the client 100 .
  • the internal memory 121 may include a program storage area and a data storage area. The program storage area can store an operating system and at least one application program required for a function (such as a sound playback function or an image playback function).
  • the storage data area can store data created during use of the client 100 (such as audio data, phone book, etc.).
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, universal flash storage (UFS), etc.
  • the client 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor A 110, or some functional modules of the audio module 170 may be provided in the processor A 110.
  • the speaker 170A, also called a "horn", is used to convert audio electrical signals into sound signals. The client 100 can listen to music or hands-free calls through the speaker 170A.
  • the receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • when the client 100 answers a call or a voice message, the voice can be heard by bringing the receiver 170B close to the human ear.
  • the microphone 170C, also called a "mic" or "mike", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the client 100 may be provided with at least one microphone 170C. In other embodiments, the client 100 may be provided with two microphones 170C, which in addition to collecting sound signals, may also implement a noise reduction function. In other embodiments, the client 100 can also be equipped with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions, etc.
  • the headphone interface 170D is used to connect wired headphones.
  • the headphone interface 170D can be a USB interface, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • FIG. 2 is a schematic structural diagram of a server 200 provided by this application.
  • the server 200 may include a memory 210, a processor B 220, a communication module B 230, and the like.
  • Processor B 220 may include one or more processing units, and memory 210 may be used to store program code and data. In the embodiment of the present application, the processor B 220 can execute computer execution instructions stored in the memory 210 for controlling and managing the actions of the server 200.
  • the communication module B 230 can be used to implement communication between the server 200 and other devices such as the client 100.
  • the embodiment of the present application does not specifically limit the structure of the server 200.
  • the server 200 may also include more or less components than shown in the figures, or combine some components, or split some components, or arrange different components.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • Client-server is an important service architecture pattern.
  • the client can connect to the server through the network, the client can request network services from the server, and the server can respond to the request to provide the client with the requested data.
  • Multiple clients can communicate through the server to present collaborative scenarios to users.
  • the collaborative scene can be a scene presented by multiple clients at the same time.
  • multiple clients can simultaneously present at least part of the information in the collaborative scene, so that users of multiple clients can see at least part of the information at the same time.
  • collaborative scenarios may include game scenarios, entertainment scenarios, or conference scenarios.
  • collaborative scenarios may also include more or fewer scenarios.
  • the collaborative scenarios may also include scenes in other application fields, such as industrial production.
  • each user's client can obtain game data from the server, and based on the game data, present the user with a game scene, as well as one or more player characters and other objects distributed in the game scene.
  • the game scene can be used to indicate the environment, area, terrain, building, etc. where the game is located.
  • Each object in the game scene can interact in the game scene.
  • the object may include an object controlled by the user, such as a player character (PC).
  • the object may include a non-user-controlled object, such as a non-player character (NPC).
  • the object may include other objects that can interact with the user, such as various game props such as vehicles, tools, treasure chests, etc.
  • At least one of the interactive objects is a user-controlled object.
  • the user can control an object in the game scene to interact with another object, such as the user's player-controlled character dancing with another user's player-controlled character.
  • for example, the user's player-controlled character opens a treasure chest, or the user's player-controlled character and another user's player-controlled character drive the same vehicle together, etc.
  • the interactive objects may not be objects controlled by the user.
  • the game scene includes two animals, a "wolf" and a "sheep", neither of which is controlled by a user. When the "wolf" encounters the "sheep", it can hunt the "sheep".
  • each user's client corresponds to different directions and positions in the game scene.
  • the position corresponding to the client can be understood as the position of the user of the client in the game scene
  • the direction corresponding to the client can be understood as the direction in which the user of the client observes the game scene.
  • the client may, based on the position and direction corresponding to the client, present to the user at least part of the game scene corresponding to that position and direction, together with one or more objects in that part of the game scene; that is, based on the position of the client's user in the game scene and the direction in which the user observes, the user is presented with the part of the game scene and the objects that the user would observe at that position and from that direction.
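The per-client view described above can be approximated with a simple field-of-view test. This is an illustrative 2D sketch only; the function name, angle convention, and the planar simplification are assumptions, not the patent's rendering method:

```python
import math

def visible_objects(position, facing_deg, fov_deg, objects, max_range):
    """Return the names of objects a user could see from `position`
    while facing `facing_deg` (degrees, counter-clockwise from +x)."""
    seen = []
    for name, (ox, oy) in objects.items():
        dx, dy = ox - position[0], oy - position[1]
        dist = math.hypot(dx, dy)
        if dist == 0 or dist > max_range:
            continue  # too far away (or the observer itself)
        bearing = math.degrees(math.atan2(dy, dx))
        # smallest signed angle between the facing direction and the object
        off = (bearing - facing_deg + 180) % 360 - 180
        if abs(off) <= fov_deg / 2:
            seen.append(name)
    return seen
```

With the Figure 3 layout in mind, a user at the origin facing one direction sees only the objects inside the view cone, which is why client A and client B render different pictures of the same scene.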
  • the game screen displayed by client A can be as shown in Figure 3.
  • the game screen of client A is a third-person perspective picture, which can be understood as the game scene that the user of client A can see at the current position and direction.
  • the game screen of client A includes the player-controlled character A 301 corresponding to client A, the player-controlled character B 302 corresponding to another client B, and the treasure box 303, wherein the player-controlled character B 302 is located to the northwest in front of the player-controlled character A 301, and the treasure box 303 is located to the northeast in front of the player-controlled character A 301.
  • the game screen displayed by another client B can be as shown in Figure 4.
  • the game screen of client B is a third-person perspective screen, which can be understood as the game scene that the user of client B can see at the current position and direction.
  • the game screen of client B includes the player-controlled character B 302 and the treasure box 303 corresponding to client B.
  • the treasure box 303 is located due east of the player-controlled character B 302. Comparing Figure 3 and Figure 4, it can be seen that although client A and client B are in the same game scene, the direction and position corresponding to client A differ from those corresponding to client B; therefore, the game screen of client A also differs from the game screen of client B.
  • the game screen of client A can also display the movement of player-controlled character B 302 simultaneously; that is, player-controlled character A 301 observes the movement of player-controlled character B 302. If player-controlled character B 302 opens the treasure box 303, the process of player-controlled character B 302 opening the treasure box 303 and the state change of the treasure box 303 can be displayed simultaneously on the game screen of client A and the game screen of client B; that is, player-controlled character A 301 observes player-controlled character B 302 opening the treasure box 303.
  • the game screen shown in Figure 3 includes the player-controlled character A 301 corresponding to client A.
  • the game screen shown in Figure 4 includes the player-controlled character B 302 corresponding to client B.
  • in some embodiments, the game screen shown in Figure 3 may not include the player-controlled character A 301 corresponding to client A, and the game screen shown in Figure 4 may not include the player-controlled character B 302 corresponding to client B. This is more in line with the visual experience that humans cannot see themselves under normal circumstances, giving users an immersive gaming experience.
  • This embodiment of the present application provides a multi-device collaboration system, which can realize the aforementioned collaboration scenario.
  • the system can include multiple clients 100 and a server 200, and each client 100 is connected to the server 200 via the network 300.
  • the multiple clients 100 include an AR client 510 and a VR client 520 .
  • the AR client 510 may include a mobile phone 530 and smart glasses 540
  • the VR client 520 may include a smart large screen 550 and a laptop 560 .
  • the AR client 510 and the VR client 520 can also include more or fewer devices; the mobile phone 530 and the smart glasses 540 can also serve as the VR client 520, and the smart large screen 550 and the laptop 560 can also serve as the AR client 510.
  • the AR client 510 can present an AR scene to the user, and the VR client 520 can present a VR scene to the user.
  • the AR scene and the VR scene can be the same collaborative scene presented in different ways on the AR client 510 and the VR client 520: the AR scene can be a virtual-real combined scene in which a virtual scene is superimposed on a real scene, and the VR scene can be a purely virtual scene.
  • the object can be a virtual object, a real object, or a combination of virtual and real objects in the AR scene, and the object can be a virtual object in the VR scene.
  • the AR scene includes a real person, and the person can be presented as a virtual humanoid image corresponding to the real person in the VR scene.
  • the AR scene includes a virtual and real horse composed of a real support frame and a virtual projection. The horse can be presented as a virtual horse in the VR scene.
  • the AR scene includes a virtual humanoid character. Since the humanoid character is also a virtual object, the humanoid character is also a virtual humanoid character when presented in the VR scene.
  • the server 200 can provide multiple clients 100 with data required for multi-device collaboration.
  • the server 200 can provide each client 100 with location information and direction information corresponding to each client 100 , provide each client 100 with scene data for presenting a collaborative scene, and provide each client 100 with the collaborative scene.
  • the server 200 may be omitted, and various functions of the server 200 may be integrated on at least one of the multiple clients 100 .
  • any client of the plurality of clients 100 (for example, any AR client 510) can serve as both a client and a server; the other clients of the plurality of clients 100 can be connected to that client through the network 300, and that client can provide the data required for multi-device collaboration to the multiple clients 100 in the same or a similar manner as the server 200.
  • the multiple clients 100 include multiple mobile phones 530 and a laptop 560, where the laptop 560 serves as both a client and a server. The user of the laptop 560 creates a game scene, and the multiple mobile phones 530 serve as clients to access the game scene. The users of each mobile phone 530 and of the laptop 560 can play games in the game scene created by the laptop 560.
  • FIG. 6 is a structural diagram of a multi-device collaboration system provided by an embodiment of the present application.
  • the multi-device collaboration system may include a client 100, a gateway 600, and a server 200.
  • the gateway 600 is used to implement communication between the client 100 and the server 200 .
  • the gateway 600 may be a network device in the aforementioned network 300.
  • the client 100 can establish a communication connection with the server 200 through the gateway 600, and the client 100 can communicate with the server 200 through the communication connection.
  • the gateway 600 may be a service mesh-based gateway.
  • a service mesh is an infrastructure layer that handles communication between services.
  • a service mesh can be composed of a series of lightweight network proxies.
  • the communication connection may be a long connection; that is, once established, it can be maintained for a long time or permanently, so that when the client 100 subsequently communicates with the server 200, there is no need to establish a new communication connection, which saves the time and resource consumption of connection requests and reduces communication delay.
  • the gateway 600 can be deployed in a Docker-based container to achieve dynamic scaling, so as to support message routing and forwarding in massive-request scenarios.
  • the server 200 may include a first synchronization module 210, a second synchronization module 220, and a public service module 230.
  • the first synchronization module 210 may be used to implement area of interest (AOI) synchronization of multiple clients 100 in the first coordinate system.
  • AOI can be used to indicate regional geographical entities in the map, such as a residential area, a university, an office building, an industrial park, a comprehensive shopping mall, a hospital, a scenic spot or a stadium, etc.
  • the AOI of the residential area can be used to indicate the outline and area of the residential area.
  • the first coordinate system may be a coordinate system in the real world.
  • the first coordinate system may be a common coordinate system.
  • the first coordinate system may include a GPS coordinate system.
  • the first synchronization module 210 can serialize the latitude and longitude data in the map through the geohash spatial index algorithm, thereby dividing the map into different areas, each of which can be used as an AOI area.
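As one way to picture this division, a minimal sketch of the standard public geohash encoding follows: latitude and longitude are serialized into an interleaved bit string and mapped to base32 characters, and cells sharing a prefix of a given length form one area. These are the details of the common public algorithm, not necessarily the exact implementation used by the first synchronization module 210:

```python
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"  # geohash base32 alphabet

def geohash_encode(lat, lon, precision=6):
    """Standard geohash: alternately bisect the longitude and latitude
    ranges, interleaving the bits, 5 bits per base32 character."""
    lat_range, lon_range = [-90.0, 90.0], [-180.0, 180.0]
    even, code = True, []          # even bit positions encode longitude
    bit_count, ch = 0, 0
    while len(code) < precision:
        rng, val = (lon_range, lon) if even else (lat_range, lat)
        mid = (rng[0] + rng[1]) / 2
        if val >= mid:
            ch = (ch << 1) | 1
            rng[0] = mid
        else:
            ch <<= 1
            rng[1] = mid
        even = not even
        bit_count += 1
        if bit_count == 5:         # emit one base32 character per 5 bits
            code.append(BASE32[ch])
            bit_count, ch = 0, 0
    return "".join(code)
```

Longer codes refine shorter ones, so a code of a chosen precision can serve directly as the identifier of the AOI area containing the client.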
  • the first synchronization module 210 may determine the AOI area where the client 100 is located, and the client 100 included in each AOI area.
  • an AOI area may correspond to one or more collaborative scenes, and a collaborative scene may correspond to one or more AOI areas. When a client is in an AOI area, it may access the collaborative scene corresponding to the AOI area.
  • the server 200 configures three AOI areas through the first synchronization module 210: urban area A, urban area B, and urban area C. Each urban area can be used as an independent game scene.
  • the client 100 can access the game scene.
  • the second synchronization module 220 may be used to implement AOI synchronization of multiple clients 100 in the second coordinate system, and the synchronization accuracy of the second synchronization module 220 may be higher than the synchronization accuracy of the first synchronization module 210 .
  • the second synchronization module 220 can be specifically used to determine more accurately the position of a client 100 in the collaborative scene, as well as the positions of different clients 100 in the collaborative scene, and then detect whether a certain client 100 should present content related to other clients 100, and/or whether to send content related to that client 100 to other clients 100 so that the other clients 100 can present content related to that client 100.
  • the second coordinate system may be a real-world coordinate system.
  • the second coordinate system may be a common coordinate system.
  • the second coordinate system may be a UTM coordinate system.
  • the second synchronization module 220 can implement AOI synchronization of multiple clients 100 in the second coordinate system through a nine-square grid algorithm.
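A nine-square-grid algorithm typically maps each position to a grid cell and treats a client's cell plus its eight neighbours as the area of interest. The sketch below illustrates that idea under assumed names and an assumed cell size; it is not the application's implementation.

```python
def grid_cell(x, y, cell_size=100.0):
    """Map a position in the second (e.g. UTM) coordinate system to a grid cell."""
    return (int(x // cell_size), int(y // cell_size))

def nearby_clients(position, all_positions, cell_size=100.0):
    """Return the ids of clients in the same cell or one of its 8 neighbours."""
    cx, cy = grid_cell(*position, cell_size)
    result = []
    for client_id, (x, y) in all_positions.items():
        ox, oy = grid_cell(x, y, cell_size)
        if abs(ox - cx) <= 1 and abs(oy - cy) <= 1:
            result.append(client_id)
    return result

positions = {"A": (50.0, 50.0), "B": (150.0, 50.0), "C": (950.0, 950.0)}
# B sits in a cell adjacent to A's cell; C is far outside the 3x3 neighbourhood.
assert nearby_clients(positions["A"], positions) == ["A", "B"]
```

Only the clients returned for a given position need to receive that client's synchronization updates, which is why this scheme supports higher-precision AOI synchronization than coarse area division.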
  • the public service module 230 may include one or more public services, such as terrain resource services, event bus services, etc.
  • the public service module 230 may provide one or more public services to one or more clients 100 based on at least one of the first synchronization module 210 and the second synchronization module 220 .
  • At least part of the data in the server 200 can be stored in memory, thereby reducing or avoiding complex addressing operations when obtaining that data and improving the efficiency of obtaining it.
  • FIG. 7 is a structural diagram of a multi-device collaboration system provided by an embodiment of the present application. The system shown in FIG. 7 is based on the system shown in FIG. 6 , and describes the first synchronization module 210 in detail.
  • the first synchronization module 210 may include a regional service module 211, a regional management module 212 and a public access module 213.
  • the area service module 211 can be used to provide multiple divided AOI areas, including the location, boundary, and size of each AOI area, and the positional relationship between any two AOI areas.
  • the regional management module 212 can serve as a regional configuration center to manage the AOI regions included in the regional service module 211, including regional division, load balancing, elastic computing, etc.
  • the area management module 212 may be used to divide the map into regions in the first coordinate system for different collaborative scenarios, thereby achieving physical and logical isolation between different collaborative scenarios.
  • the public access module 213 can be used to access one or more public service functions based on multiple AOI areas in the regional service module 211, such as chat function, friend function, log monitoring function, etc.
  • each client and each server described below can implement multi-device collaboration in a manner similar to or the same as that in the game scenario.
  • Figure 8 is a flowchart of a multi-device collaboration method provided in an embodiment of the present application. It should be noted that the method is not limited to Figure 8 and the specific sequence described below. It should be understood that in other embodiments, the order of some steps in the method can be interchanged according to actual needs, or some steps can be omitted or deleted. The method includes the following steps:
  • the first AR client and the first VR client can access the same game scene, so that the game scene can be presented to respective users, so that the respective users can play in the game scene.
  • S801 is an optional step.
  • the first AR client and the first VR client may always remain in a state of accessing the game scene.
  • S802 The first AR client presents the first AR scene, and the first VR client presents the first VR scene.
  • the first AR scene can be a game scene in AR mode
  • the first VR scene can be a game scene in VR mode. That is, the first AR scene and the first VR scene can be the same game scene, presented in different modes in the first AR client and the first VR client respectively.
  • the first AR scene includes at least part of a realistic scene.
  • the at least part of the real scene may be a scene captured by a camera of the first AR client.
  • the first VR scene may be a virtual scene corresponding to the first AR scene.
  • the first VR scene may be a virtual scene obtained by processing the real scene included in the first AR scene.
  • the first VR scene may be a virtual scene obtained by processing the real scene in the first AR scene through a view synthesis model such as a NeRF network.
  • the view synthesis model may remodel and render the real scene, thereby outputting a virtual scene corresponding to the real scene.
  • the first AR scene and the first VR scene may be as shown in Figure 9.
  • the first AR scene can be a real scene captured by the camera of the first AR client in real time, including real buildings, corridors, trees, etc.
  • the first VR scene is a virtual scene obtained by processing the first AR scene, including virtual buildings corresponding to real buildings, virtual corridors corresponding to real corridors, and virtual trees corresponding to real trees.
  • the first AR scene and the first VR scene are scenes presented by different clients; since the positions and angles corresponding to different clients are different, the specific content included in the first AR scene and the first VR scene also varies.
  • the first AR scene and the first VR scene in Figure 9 can be understood as different game scenes seen by users in two opposite directions.
  • the way in which the first AR client presents the first AR scene can also refer to the relevant description in Figure 14 below, and the way in which the first VR client presents the first VR scene can also refer to the relevant description in Figure 15 below.
  • when the first real position of the first AR scene includes the first object, the first VR client presents the first object at the first virtual position of the first VR scene, and the first real position and the first virtual position are the same position in the game scene; and/or, when the second virtual position of the first VR scene includes the second object, the first AR client presents the second object at the second real position of the first AR scene, and the second virtual position and the second real position are the same position in the game scene.
  • the first VR client presents the first object at the first virtual position of the first VR scene; that is, the first VR client can synchronously present the first object from the first AR scene in the first VR scene, so that the user can see, from the first VR client, the first object from the first AR scene.
  • the second virtual position of the first VR scene includes the second object
  • the first AR client presents the second object at the second real position of the first AR scene; that is, the first AR client can synchronously present the second object from the first VR scene in the first AR scene, so that the user can see, from the first AR client, the second object from the first VR scene. Therefore, both the first AR client and the first VR client can present objects from the peer end of the game scene, so that both the user of the first AR client and the user of the first VR client can see those objects, realizing collaboration and interoperability between the AR client and the VR client.
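The bidirectional presentation described above can be sketched as shared scene state keyed by positions in the common game-scene coordinate system; every client, AR or VR, renders the same state at the same place. The class and field names below are illustrative assumptions, not part of this application.

```python
class SharedGameScene:
    """Minimal sketch: objects live at positions in the common game-scene
    coordinate system, and every client views the same shared state."""
    def __init__(self):
        self.objects = {}  # object_id -> position in the common coordinate system

    def place(self, object_id, position):
        self.objects[object_id] = position

    def view(self):
        # Each client presents every object at the shared position, whether it
        # renders a real scene (AR client) or a virtual one (VR client).
        return dict(self.objects)

scene = SharedGameScene()
scene.place("first_object", (10.0, 2.0))    # placed via the AR client
scene.place("second_object", (12.0, 2.0))   # placed via the VR client
ar_view = scene.view()
vr_view = scene.view()
assert ar_view == vr_view  # both clients see both objects at the same positions
```

In practice each client would additionally convert the shared position into its own local rendering frame, as described for the coordinate systems later in this document.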
  • the real staircase includes virtual character A 701 and virtual character B 702, and virtual character B 702 is in front of virtual character A 701.
  • the same position on the virtual staircase also includes virtual character A 701 and virtual character B 702, and virtual character B 702 is in front of virtual character A 701.
  • the objects in the first AR scene may be virtual objects, real objects, or objects that combine virtuality and reality.
  • Real objects may be real objects in the first AR scene, such as real people, animals, objects, buildings, etc., such as real buildings, stairs, and trees in the first AR scene shown in Figure 9.
  • the virtual object can be an image superimposed on the real scene or an image projected in the real scene, such as virtual character A 701 and virtual character B 702 in the first AR scene as shown in Figure 9.
  • the combined virtual-real object may be a combination of an object that actually exists in the first AR scene and an image on the display screen, or a combination of an object that actually exists in the first AR scene and a virtual image projected onto that object.
  • the first AR scene includes a realistic torch frame
  • the first AR client projects a virtual image of the flame on the torch frame
  • the torch frame and the virtual image of the flame are combined into a virtual and real torch.
  • the objects in the first VR scene may be virtual objects.
  • the first object may be an object controlled by the first AR client, for example, the first object may be a player-controlled character controlled by the first AR client.
  • the first object may not be an object controlled by the first AR client, for example, the first object may be a user of the first AR client, a real animal or object in the first AR scene, etc.
  • the second object may be an object controlled by the first VR client.
  • the second object may be a player-controlled character controlled by the first VR client.
  • the second object may not be an object controlled by the first VR client.
  • the second object may be a character controlled by a non-player.
  • the first object may be the user of the first AR client.
  • the first VR client may present the first object at the first virtual position of the first VR scene.
  • the first object of the first VR scene can be a virtual image corresponding to the user of the first AR client, which can enable the user of the first VR client to see, in the first VR scene, a virtual image corresponding to the user of the first AR client; the second object is a player-controlled character controlled by the first VR client, and the first AR client presents the second object at the second real position of the first AR scene, which can enable the user of the first AR client to see, in the first AR scene, the player-controlled character controlled by the first VR client. Therefore, multiple users from multiple different types of clients can interact with each other.
  • the first object in the first AR scene may be a real object or a virtual combined object, and the first VR client may present the virtual image corresponding to the first object at the first virtual position of the first VR scene.
  • the first object in the first AR scene is a virtual object, and the first VR client can present the virtual image of the first object in the first VR scene.
  • the second object in the first VR scene is a virtual object
  • the first AR client can present, at the second real position of the first AR scene, the second object or an image of a combined virtual-real object corresponding to the second object.
  • when a first event corresponding to the first object occurs in the first AR scene, the first VR client can also present the first event in the first VR scene. That is, the first VR client can present, synchronously with the first AR client, the first event occurring in the first AR scene. For example, when the first object in the first AR scene is a real object and the first event occurs on that real object, the first event can also occur on the virtual image of the first object in the first VR scene, and the first event is presented in the first VR scene by the first VR client. Presenting the first event synchronously improves the realism with which the first VR client and the first AR client collaboratively present the game scene, thereby improving the user experience.
  • the first event may be at least one of a status update event of the first object, a movement event of the first object, and an interaction event between the first object and other objects.
  • a status update event of the first object may be used to indicate a status change of the first object.
  • the status update event of the first object can be used to indicate changes in the first object's body shape, posture, hairstyle, clothing, etc., or numerical changes such as the first object's health value, stamina value, endurance value, skill cooldown time, remaining number of props, and task completion progress.
  • the state of the first object and the specific content of the state change can be determined by relevant technical personnel such as the game developer. The embodiment of the present application does not limit the state of the first object and the specific content of the state change.
  • the movement event of the first object can be used to indicate the movement process of the first object, such as its speed, direction, movement route, and so on.
  • the movement event of the first object includes the first object moving from the first real position in the first AR scene to a third real position, and the first VR client can present the process of the first object moving from the first virtual position in the first VR scene to a third virtual position, where the third virtual position and the third real position are the same position in the game scene.
  • interaction events between the first object and other objects may be used to indicate the interaction process between the first object and other objects.
  • the other objects may be any interactive objects, and the interaction method between the first object and any object may differ based on the object.
  • if the object interacting with the first object is a character, the interaction method may include dialogue, dancing, attack, assistance, etc.; or, if the object interacting with the first object is a real object, the interaction method may include operating the object; or, if the object interacting with the first object is an animal, the interaction method may include stroking, feeding, driving, attacking, riding, etc.
  • the interactive objects in the game scene and the interaction methods for different objects can be determined by relevant technical personnel such as game developers. The embodiments of this application do not specifically limit the types of interactive objects and the corresponding interaction methods.
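Events like these could be propagated through a publish/subscribe mechanism such as the event bus service mentioned for the public service module 230. The following is a minimal sketch under assumed names; the event schema is an illustrative assumption, not the application's protocol.

```python
class EventBus:
    """Minimal publish/subscribe bus: publishers and subscribers are decoupled."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, event):
        for callback in self.subscribers:
            callback(event)

bus = EventBus()
vr_presented = []
# The first VR client subscribes so it can present events synchronously.
bus.subscribe(vr_presented.append)

# The first AR client publishes a movement event of the first object.
bus.publish({"object": "first_object", "type": "move",
             "from": (1.0, 0.0), "to": (3.0, 0.0)})
assert vr_presented[0]["type"] == "move"
```

Status update events and interaction events could flow through the same channel, with each client rendering the received event in its own (AR or VR) presentation of the scene.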
  • the first AR client may present, in the first AR scene, a second event corresponding to the second object; that is, when the second event corresponding to the second object occurs in the first VR scene, the first AR client can present, synchronously with the first VR client, the second event occurring in the first VR scene.
  • the second event may be at least one of a status update event of the second object, a movement event of the second object, and an interaction event between the second object and other objects.
  • for the second event, please refer to the description of the first event; details are not repeated here.
  • the first event and the second event may also include more or fewer events.
  • the way in which the first VR client presents the first object at the first virtual position of the first VR scene, and the way in which the motion event of the first object is presented can also refer to the relevant descriptions in the following Figures 16, 19 and 20;
  • the way in which the first AR client presents the second object at the second real position of the first AR scene, and the way in which the motion event of the second object is presented can also refer to the relevant descriptions in the following Figures 17-18 and 21;
  • the way in which the first AR client and the first VR client present the state change event of the first object can also refer to the relevant descriptions in the following Figures 22-25.
  • the first AR client and the first VR client can access the same game scene, the first AR client presents the first AR scene, and the first VR client presents the first VR scene; that is, the first AR client and the first VR client can collaboratively present a game scene, where the first VR scene can be a virtual scene obtained by processing the first AR scene.
  • the first VR client presents the first object at the first virtual position of the first VR scene; that is, the first VR client can synchronously present the first object from the first AR scene in the first VR scene, so that the user can see, from the first VR client, the first object from the first AR scene.
  • the first AR client presents the second object at the second real position of the first AR scene; that is, the first AR client can synchronously present the second object from the first VR scene in the first AR scene, so that the user can see, from the first AR client, the second object from the first VR scene. Therefore, the first AR client and the first VR client can present objects from the peer end of the game scene, so that the user of the first AR client and the user of the first VR client can see those objects, realizing collaboration and interoperability between the AR client and the VR client.
  • FIG. 10 is a flow chart of a method for accessing a game scene provided by an embodiment of the present application. It should be noted that this method is not limited to the specific sequence shown in Figure 10 and described below. It should be understood that, in other embodiments, the order of some steps of the method can be interchanged according to actual needs, or some steps can be omitted or deleted.
  • the method includes the following steps:
  • the user can start the client by clicking on the icon of the client in the terminal device.
  • the client can also be started in other ways.
  • the embodiments of this application do not limit the way in which the client is started.
  • S1001 is an optional step. When S1001 is omitted, the client can reside in the terminal device, thereby improving the efficiency of the client accessing the game scene.
  • the client updates local resources.
  • the client can query whether there is updated data when starting up. If there is updated data, the updated data is obtained and the local resources are updated based on the updated data. It is understandable that the client can also update the local resources at other times, or not actively update the resources.
  • S1002 is an optional step.
  • the client can also verify the security and integrity of local resources. If the verification passes, continue to perform subsequent steps. Otherwise, the client can repair the resource and continue to perform subsequent steps.
  • a client can check whether at least one of the permissions required for the client to run has been granted. If all permissions have been authorized, the client continues to perform subsequent steps; if one or more permissions have not been authorized, the client can obtain the one or more permissions and then continue to perform subsequent steps.
  • the at least one permission includes positioning permission, permission to use the network, permission to read and write storage devices, permission to read and write photo albums, permission to obtain device information of the terminal device, permission to call the microphone and camera, etc.
  • the at least one permission may also include more or fewer permissions. The embodiment of this application does not limit the number of permissions verified by the client and the specific types of permissions.
  • the client can also enable at least one function on the terminal device.
  • the at least one function may be used to ensure proper functioning of the client.
  • the at least one function may include a positioning function, a network function, and the like.
  • the at least one function may also include more or fewer functions. The embodiments of this application do not limit the number and specific types of functions enabled by the client.
  • S1003 is an optional step. When S1003 is omitted, the efficiency of client login can be improved.
  • the client can obtain the first login credential information of the SMS server from the SMS server through a network request such as a hypertext transfer protocol (HTTP) request, and obtain the second login credential information of the business server from the business server based on the first login credential information.
  • the login credential information is used to prove that the user name and password provided by the client are correct.
  • the client avoids frequently providing user names and passwords to the server by providing login credential information to the server.
  • the login credential information can be a token, and the token can be a string generated by the server.
  • the first login credential information may be a character string generated by the SMS server, and the second login credential information may be a character string generated by the business server.
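A server-generated credential string of this kind is often an HMAC-signed token, so the server can verify it without the client resending the user name and password. The sketch below illustrates that pattern; the function names, token format, and key are illustrative assumptions, not the application's protocol.

```python
import hashlib
import hmac

SECRET_KEY = b"server-side-secret"  # illustrative; held only by the server

def issue_token(username):
    """Generate a credential string the client can present instead of a password."""
    signature = hmac.new(SECRET_KEY, username.encode(), hashlib.sha256).hexdigest()
    return f"{username}:{signature}"

def verify_token(token):
    """Check that the token was produced by this server and is untampered."""
    username, _, signature = token.partition(":")
    expected = hmac.new(SECRET_KEY, username.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(signature, expected)

token = issue_token("player01")
assert verify_token(token)
assert not verify_token("player01:forged-signature")
```

Under this scheme, the SMS server and the business server would each sign tokens with their own keys, which is consistent with the first and second login credential information being distinct strings.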
  • the client name is "Starlight Tower".
  • the user opens “Starlight Tower” on the mobile phone, thereby displaying the login interface shown in Figure 11, which includes a mobile phone number input box 901, a verification code input box 902, a verification code refresh button 903, a login button 904 and a registration button 905.
  • the user can enter a pre-registered mobile phone number in the mobile phone number input box 901, then click the verification code refresh button 903, enter the obtained verification code into the verification code input box 902, and then click the login button 904 to complete the login.
  • the registration button 905 to complete the registration and then log in.
  • the client can also log in to the business server in other ways, and the embodiment of the present application does not limit the way in which the client logs in to the business server.
  • the client obtains role data from the business server.
  • the client can obtain role data from the business server and select the role.
  • the client can obtain role data from the business server based on the second login credential information. If there is no role data corresponding to the client in the business server, it is possible that the user of the client has not created a role, so the client can create a role and thereby generate role data.
  • the role data can be role-related data.
  • role data may include role index information and role names.
  • the client can also obtain role data from the business server through other means, and the role data is not limited to the role index information and role name mentioned above.
  • the embodiment of the present application does not limit the client to obtain role data from the server and the specific content included in the role data.
  • S1006 The client selects a game scene and a method of accessing the game scene.
  • the client can establish a transmission control protocol (TCP) connection with the service server.
  • the lobby scene can include one or more game scenes, and the client can choose to access any one of the game scenes.
  • after the client selects the game scene to access, it can further choose to access the game scene through AR or VR.
  • if the client chooses to access the game scene through AR, the client can access the game scene as an AR client; if the client chooses to access the game scene through VR, the client can access the game scene as a VR client.
  • the lobby scene includes the Jet Arena scene and the classic scene.
  • if the client selects the Jet Arena scene, it can further select the AR Arena or the VR Arena.
  • if the client selects a classic scene, it can further select an AR classic scene or a VR classic scene.
  • the mobile phone can display the camp selection interface shown in Figure 12.
  • the camp selection interface includes a blue camp button 906 and a red camp button 907 .
  • the user can join the blue camp by clicking the blue camp button 906, or click the red camp button 907 to join the red camp.
  • the game lobby interface as shown in Figure 13 can be displayed.
  • the game lobby interface includes user information 913 such as the user's avatar, name, and level, and a game map 908; the game map 908 includes 5 game scenes, and the user can click on any of the game scenes to select it.
  • the game scene 909 currently located in the middle of the game map 908 displays the user's avatar, which means that the user currently selects the game scene 909.
  • the game lobby interface also includes an access method switching button 910. The user can click the access method switching button 910 to choose to access the game scene through AR or VR. For example, the current access method switching button 910 displays "AR", which means that the game scene is currently accessed through AR.
  • the game lobby interface also includes a backpack button 911. The user can click the backpack button 911 to open a virtual backpack, which can include virtual game props and assets, etc.
  • the game lobby interface also includes a message button 912. When the user clicks the message button 912, one or more messages can be displayed to the user. These can be system messages from the server or messages from other users.
  • the client may also select the game scene and/or the method of accessing the game scene in other ways, and the embodiments of the present application do not limit the method of selecting the game scene and accessing the game scene by the client.
  • the client may first select the method of accessing the game scene, and then select the game scene to be accessed.
  • the client is configured to act only as an AR client or a VR client, so it is not necessary to select the method of accessing the game scene in S1006.
  • FIG. 14 is a flow chart of a method for presenting an AR scene provided by an embodiment of the present application. It should be noted that this method is not limited to the specific sequence shown in Figure 14 and described below. It should be understood that, in other embodiments, the order of some steps of the method can be interchanged according to actual needs, or some steps can be omitted or deleted.
  • the method includes the following steps:
  • a first AR client sends first positioning information collected by the first AR client to a resource server.
  • the first positioning information may be used to indicate the position of the first AR client in the first coordinate system.
  • the first positioning information may be used to indicate the AOI area where the first AR client is located, such as the country, province, city or administrative region where the first AR client is located.
  • the first coordinate system is a GPS coordinate system, and the first positioning information may be the first GPS coordinates.
  • the first AR client may also send first range information to the resource server, and the first range information may be used to indicate the range of the game scene presented by the first AR client.
  • the resource server returns the first digital resource to the first AR client based on the first positioning information.
  • the resource server may obtain the first digital resource from the stored digital resource collection based on the first positioning information, and return the first digital resource to the first AR client.
  • the first digital resource can be used to present at least part of the scene in the game scene.
  • the resource server may obtain the first digital resource from the stored digital resource collection based on the first positioning information and the first range information.
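Selecting digital resources from a stored collection based on positioning and range information could, for example, filter resources whose anchors fall within the presented range of the scene. The resource layout, names, and radius semantics below are illustrative assumptions.

```python
import math

def resources_in_range(position, scene_range, resource_index):
    """Return the digital resources whose anchor lies within the presented range.

    `resource_index` maps resource ids to anchor coordinates in the
    first coordinate system; `scene_range` is a radius around `position`.
    """
    px, py = position
    selected = []
    for resource_id, (x, y) in resource_index.items():
        if math.hypot(x - px, y - py) <= scene_range:
            selected.append(resource_id)
    return selected

index = {"building_model": (0.0, 10.0),
         "corridor_model": (0.0, 40.0),
         "distant_tree": (0.0, 500.0)}
assert resources_in_range((0.0, 0.0), 100.0, index) == ["building_model", "corridor_model"]
```

A real resource server would likely index resources spatially (e.g. by AOI area) rather than scanning the whole collection, but the selection criterion is the same.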
  • the first AR client sends the first positioning information and the first environment image collected by the first AR client to the positioning server.
  • the first AR client may send the first positioning information and the first environment image collected by the first AR client to the positioning server, thereby requesting the positioning server to perform more accurate positioning of the first AR client.
  • the first AR client can photograph the scene where the first AR client is located through the camera, thereby obtaining the first environment image.
  • the positioning server may be a visual positioning service (VPS) server.
  • the first AR client compresses the first environment image, thereby sending the compressed first environment image to the positioning server.
  • the amount of data required to be transmitted can be reduced, and positioning efficiency and real-time performance can be improved.
  • the positioning server returns the second positioning information to the first AR client.
  • the positioning server may determine the second positioning information based on the first environment image and the first positioning information, and return the second positioning information to the first AR client.
  • the second positioning information may indicate the position of the first AR client more accurately than the first positioning information.
  • the second positioning information may be used to indicate the specific location of the first AR client in the area indicated by the first positioning information.
  • the second positioning information may be used to indicate the position of the first AR client in the second coordinate system.
  • the second coordinate system is a Universal Transverse Mercator (UTM) grid coordinate system, and the second positioning information may be the first UTM coordinates.
  • the positioning server may also determine the first rotation information based on the first environment image and the first positioning information, and return the first rotation information to the first AR client.
  • the first rotation information may be used to indicate the rotation angle of the first AR client in the second coordinate system.
  • S1405 The first AR client converts the second positioning information into third positioning information.
  • the first AR client can perform coordinate system conversion on the second positioning information, thereby obtaining the third positioning information in the third coordinate system.
  • the third positioning information may be used to indicate the position of the first AR client in a third coordinate system, and the third coordinate system may be a coordinate system with the camera of the first AR client as a reference.
  • the third coordinate system can be understood as the local coordinate system of the first AR client, and the third positioning information can be understood as the local coordinates of the first AR client.
  • the first AR client may convert the first rotation information into the second rotation information.
  • the second rotation information may be used to indicate the rotation angle of the first AR client in the third coordinate system.
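The conversion from the second (UTM) coordinate system into the camera-referenced third coordinate system amounts to a translation by the camera's world position followed by a rotation by the camera's yaw. The following 2D ground-plane sketch illustrates this under assumed names; a full implementation would also handle elevation and the remaining rotation axes.

```python
import math

def utm_to_local(point, camera_origin, camera_yaw_deg):
    """Convert a point in the second (UTM) coordinate system into the
    third, camera-referenced coordinate system (2D, ground plane only)."""
    dx = point[0] - camera_origin[0]
    dy = point[1] - camera_origin[1]
    yaw = math.radians(camera_yaw_deg)
    # Rotate the world-frame offset into the camera frame.
    local_x = dx * math.cos(yaw) + dy * math.sin(yaw)
    local_y = -dx * math.sin(yaw) + dy * math.cos(yaw)
    return (round(local_x, 6), round(local_y, 6))

# With no rotation, local coordinates are just the offset from the camera.
assert utm_to_local((305.0, 4102.0), (300.0, 4100.0), 0.0) == (5.0, 2.0)
# With the camera rotated 90 degrees, the axes swap.
assert utm_to_local((305.0, 4100.0), (300.0, 4100.0), 90.0) == (0.0, -5.0)
```

Applying this conversion to the second positioning information (and the analogous rotation to the first rotation information) yields the third positioning information and second rotation information used to load the first digital resource.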
  • the first AR client loads the first digital resource based on the third positioning information and presents the first AR scene.
  • the first AR client can load the first digital resource based on the third positioning information, thereby presenting the first AR scene, so that the user of the first AR client can accurately see, based on their location, the objects at various positions around them, that is, see the first AR scene that combines virtuality and reality.
  • the first AR client can load the first digital resource based on the third positioning information and the second rotation information, so that each object in the first AR scene matches the rotation angle of the camera of the first AR client, improving the realism of the first AR scene and the user experience.
  • FIG. 15 is a flow chart of a method for presenting a VR scene provided by an embodiment of the present application. It should be noted that this method is not limited to the specific sequence shown in Figure 15 and described below. It should be understood that, in other embodiments, the order of some steps of the method can be interchanged according to actual needs, or some steps can be omitted or deleted.
  • the method includes the following steps:
  • the first VR client sends preset fourth positioning information to the resource server.
  • the fourth positioning information may be used to indicate the position of the first VR client in the first coordinate system.
  • the fourth positioning information may be used to indicate the AOI area where the first VR client is located, such as the country, province, city or administrative region where the first VR client is located.
  • the first coordinate system is a GPS coordinate system, and the fourth positioning information may be a second GPS coordinate.
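For a rough sense of how a GPS position (first coordinate system) relates to a planar UTM-like position (second coordinate system), the sketch below uses a local equirectangular approximation around a reference point. This is an assumption-laden stand-in: real deployments would use a proper UTM conversion, and the Earth-radius constant and function names are invented for illustration.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius; an assumption of this sketch

def gps_to_planar(lon_deg, lat_deg, ref_lon_deg, ref_lat_deg):
    """Project a GPS coordinate onto a local planar frame (metres) using an
    equirectangular approximation around a reference point."""
    lat0 = math.radians(ref_lat_deg)
    x = math.radians(lon_deg - ref_lon_deg) * EARTH_RADIUS_M * math.cos(lat0)
    y = math.radians(lat_deg - ref_lat_deg) * EARTH_RADIUS_M
    return (x, y)
```

The approximation is only reasonable over small areas (roughly a city block to a few kilometres), which matches the per-AOI use described here.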
  • the first VR client may also send second range information to the resource server, and the second range information may be used to indicate the range of the game scene presented by the first VR client.
  • the fourth positioning information may be obtained by the first VR client receiving a user submission.
  • for example, the first VR client is actually located in Shanghai, but it can still receive GPS coordinates submitted by the user that belong to Beijing, so that, to the resource server, the first VR client is a client located in Beijing.
  • the resource server returns the second digital resource to the first VR client based on the fourth positioning information.
  • the resource server may obtain the second digital resource from the stored digital resource collection based on the fourth positioning information, and return the second digital resource to the first VR client.
  • the second digital resource can be used to present at least part of the scene in the game scene.
  • the resource server may obtain the second digital resource from the stored digital resource set based on the fourth positioning information and the second range information.
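One way the resource server's lookup could work is sketched below: it maps the reported GPS coordinate to an AOI key and returns the digital resources registered under that key, optionally filtered by the range information. The AOI bounds, resource records, and all names here are invented for illustration; the patent does not specify this data layout.

```python
# Hypothetical per-AOI resource store; positions are (lon, lat) pairs.
RESOURCES = {
    "beijing": [
        {"id": "temple_model", "pos": (116.39, 39.91)},
        {"id": "street_props", "pos": (116.41, 39.90)},
    ],
}

def aoi_for(gps):
    # Stand-in for a real reverse-geocoding step (GPS coordinate -> AOI key).
    lon, lat = gps
    return "beijing" if 115.0 < lon < 118.0 and 39.0 < lat < 41.0 else "unknown"

def fetch_resources(gps, max_deg=None):
    """Return the resources for the client's AOI, optionally limited to a
    bounding box of max_deg degrees around the reported position."""
    items = RESOURCES.get(aoi_for(gps), [])
    if max_deg is None:
        return items
    lon, lat = gps
    return [r for r in items
            if abs(r["pos"][0] - lon) <= max_deg and abs(r["pos"][1] - lat) <= max_deg]
```

The optional `max_deg` filter plays the role of the second range information: without it the whole AOI's resources are returned, with it only the resources within the presented scene range.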
  • the first VR client converts the preset fifth positioning information into sixth positioning information.
  • the fifth positioning information may indicate the position of the first VR client more accurately than the fourth positioning information.
  • the fifth positioning information may be used to indicate the specific location of the first VR client in the AOI area indicated by the fourth positioning information.
  • the fifth positioning information may be used to indicate the position of the first VR client in the second coordinate system.
  • the second coordinate system is a UTM coordinate system.
  • the fifth positioning information may be the second UTM coordinates.
  • the fifth positioning information may be stored in a local file of the first VR client, or the fifth positioning information may be obtained from the service server.
  • the position indicated by the fifth positioning information can be understood as the spawn point, respawn point, refresh point, etc. of the first VR client in the first area.
  • the first VR client can perform coordinate system conversion on the fifth positioning information, thereby obtaining the sixth positioning information in the fourth coordinate system.
  • the sixth positioning information may be used to indicate the position of the first VR client in a fourth coordinate system, and the fourth coordinate system may be a coordinate system with the first VR client as a reference.
  • the fourth coordinate system can be understood as the local coordinate system of the first VR client, and the sixth positioning information can be understood as the local coordinates of the first VR client.
  • the first VR client loads the second digital resource based on the sixth positioning information, thereby presenting the first VR scene.
  • the first VR client can load the second digital resource based on the sixth positioning information, thereby presenting the first VR scene, so that the user of the first VR client can accurately see, based on his position, the objects at various locations around him, that is, see the fully virtual first VR scene.
  • the first VR client can load the second digital resource based on the sixth positioning information and the third rotation information, so that each object in the first VR scene matches the rotation angle of the camera of the first VR client, thereby improving the realism of the first VR scene and the user experience.
  • the third rotation information can be obtained by the first VR client receiving a user submission.
  • the first VR client can obtain the second digital resource through the preset fourth positioning information, which makes it possible for the first VR client to collaborate with the first AR client even when, in reality, the first VR client is not in the same AOI area as the first AR client. It can be understood that in other implementations, if in reality the first VR client and the first AR client are in the same AOI area, the first VR client can also obtain the fourth positioning information through collection, in a manner similar to that of the first AR client.
  • the first AR client obtains the first digital resource from the resource server based on the first positioning information
  • the first VR client obtains the second digital resource from the resource server based on the fourth positioning information
  • the first positioning information and the fourth positioning information are both in the first coordinate system, so, for the resource server, the first AR client and the first VR client are in the same coordinate system, and therefore the first AR client and the first VR client can present the same game scene.
  • the first AR client loads the first digital resource based on the third positioning information to present the first AR scene
  • the first VR client loads the second digital resource based on the sixth positioning information to present the first VR scene
  • the third positioning information is obtained by converting the second positioning information in the second coordinate system.
  • the sixth positioning information is obtained by converting the fifth positioning information in the second coordinate system.
  • the second positioning information and the fifth positioning information belong to the same coordinate system, so, for the service server, the first AR client and the first VR client are both in the same coordinate system, which allows the first AR scene presented by the first AR client and the first VR scene presented by the first VR client to be synchronized.
  • for example, if the character corresponding to the first AR client and the character corresponding to the first VR client are near each other in the game scene, then when the character corresponding to the first AR client sees the character corresponding to the first VR client in the first AR scene, the character corresponding to the first VR client can also see the character corresponding to the first AR client in the first VR scene; that is, the character corresponding to the first AR client and the character corresponding to the first VR client can interact with each other in the game scene.
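The reason the two scenes stay consistent can be illustrated numerically: each client keeps its own local frame but exchanges positions in the shared second coordinate system, so the same game-world point maps to a well-defined spot in either local frame. The frame origins below are invented values, and the frames are simplified to pure translations.

```python
def local_to_shared(p, origin):
    """Lift a client-local 2D point into the shared coordinate system."""
    return (p[0] + origin[0], p[1] + origin[1])

def shared_to_local(p, origin):
    """Drop a shared-frame 2D point into a client's local frame."""
    return (p[0] - origin[0], p[1] - origin[1])

ar_origin = (500.0, 300.0)   # first AR client's local-frame origin (shared frame)
vr_origin = (480.0, 310.0)   # first VR client's local-frame origin (shared frame)

# The AR user stands at local (2, 3) in the AR client's frame; the VR client
# receives the shared-frame position and maps it into its own frame.
shared = local_to_shared((2.0, 3.0), ar_origin)
vr_local = shared_to_local(shared, vr_origin)
```

Because both directions pass through the same shared frame, any object visible at a given game-scene position on one client resolves to the corresponding position on the other.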
  • Figure 16 is a flow chart of a method for collaboratively presenting game objects provided by an embodiment of the present application.
  • This method can be used to implement: when the first object is at the first real position in the first AR scene, the first VR client presents the first object at the first virtual position in the first VR scene; the method can also be used to present motion events of the first object.
  • in this method, the first object is the player-controlled character corresponding to the first AR client, and the positioning information corresponding to the first AR client is the positioning information corresponding to the first object.
  • this method is not limited to the specific order described in Figure 16 and below. It should be understood that in other embodiments, the order of some steps of the method can be exchanged according to actual needs, or some of the steps can be omitted or deleted.
  • the method includes the following steps:
  • the service server sends the second positioning information corresponding to the first object to the first VR client.
  • the first object may be at a first realistic location in the first AR scene.
  • the business server can send the second positioning information of the first object in the second coordinate system to the first VR client. Since, to the business server, the first AR client and the first VR client are both in the second coordinate system, the first VR client can present the first object in the first VR scene based on the second positioning information.
  • the second positioning information may correspond to the first real position of the first object in the first AR scene.
  • the service server may also send the first rotation information corresponding to the first object to the first VR client.
  • the service server may send at least one of the second positioning information and the first rotation information to the first VR client by broadcasting.
  • the first VR client converts the second positioning information into seventh positioning information.
  • the first VR client can perform coordinate system conversion on the second positioning information, thereby obtaining seventh positioning information in the fourth coordinate system.
  • the seventh positioning information may indicate the position of the first object in the fourth coordinate system, and the seventh positioning information may also be understood as the local coordinates of the first object.
  • the first VR client may convert the first rotation information into fourth rotation information.
  • the fourth rotation information may indicate the rotation angle of the first object in the fourth coordinate system; it can also be understood that the fourth rotation information indicates the local rotation angle of the first object on the first VR client.
  • the first VR client presents the first object at the first virtual position in the first VR scene indicated by the seventh positioning information.
  • the first VR client may display the first object based on the fourth rotation information in the first VR scene.
  • the first VR client may display the first object based on the fourth rotation information at the first virtual position in the first VR scene indicated by the seventh positioning information.
  • the first AR client controls the first object to move in the first AR scene.
  • the moved first object is at the third real position in the first AR scene, and the positioning information corresponding to the third real position is the new third positioning information.
  • the user of the first AR client can hold or wear the first AR client.
  • when the user moves, the first AR client is also driven to move.
  • the movement of the user and the first AR client can be understood as the movement of the first object.
  • the first object can also move in the first AR scene in other ways.
  • the first AR client controls the movement of the first object in the first AR scene, and the rotation information corresponding to the moved first object is the new second rotation information.
  • S1605 The first AR client converts the new third positioning information into new second positioning information.
  • the first AR client can perform coordinate system conversion on the new third positioning information, thereby obtaining new second positioning information in the second coordinate system. Since, to the business server, the first AR client and the first VR client are both in the second coordinate system, the first VR client can present the motion status of the first object in the first VR scene based on the new second positioning information.
  • the first AR client may convert the new second rotation information into new first rotation information.
  • S1606 The first AR client sends new second positioning information to the service server.
  • the first AR client may send new first rotation information to the service server.
  • S1607 The service server sends new second positioning information to the first VR client.
  • the service server may send new first rotation information to the first VR client.
  • S1608 The first VR client converts the new second positioning information into the new seventh positioning information.
  • the first VR client may convert the new first rotation information into new fourth rotation information.
  • the first VR client presents the first object at the third virtual position in the first VR scene indicated by the new seventh positioning information.
  • the third virtual position in the first VR scene may correspond to the third real position in the first AR scene.
  • the third virtual position and the third real position are the same position in the game scene.
  • the first VR client may present the first object in the first VR scene based on the new fourth rotation information.
  • the first VR client may present the first object based on the new fourth rotation information at a third virtual position in the first VR scene indicated by the new seventh positioning information.
  • in this way, when the first object moves from the first real position in the first AR scene to the third real position in the first AR scene, the first object also moves from the first virtual position in the first VR scene to the third virtual position in the first VR scene; that is, the instantaneous position of the first object in the first AR client and its position changes at different times are visible to the first VR client.
  • the motion status of the first object can be presented synchronously by executing at least part of the above-mentioned steps S1601-S1609 multiple times.
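The relay pattern of steps S1601-S1609 can be sketched as a minimal in-process publish/broadcast loop: the AR side publishes the first object's position in the shared frame, the service server broadcasts it, and the VR side converts each update into its local frame before rendering. The classes and the translation-only frame conversion are illustrative assumptions, not the patent's protocol.

```python
class ServiceServer:
    """Toy stand-in for the service server: fans updates out to subscribers."""
    def __init__(self):
        self.subscribers = []

    def broadcast(self, shared_pos):
        for callback in self.subscribers:
            callback(shared_pos)

class VRClient:
    """Receives shared-frame positions and renders them in its local frame."""
    def __init__(self, origin):
        self.origin = origin
        self.rendered = []   # history of local-frame positions rendered

    def on_update(self, shared_pos):
        local = (shared_pos[0] - self.origin[0], shared_pos[1] - self.origin[1])
        self.rendered.append(local)

server = ServiceServer()
vr = VRClient(origin=(100.0, 100.0))
server.subscribers.append(vr.on_update)

# The AR client moves the first object and publishes each new shared position.
for shared_pos in [(101.0, 102.0), (103.0, 102.0)]:
    server.broadcast(shared_pos)
```

Repeating the publish/broadcast/convert cycle on every movement is exactly what lets the VR client show the first object's motion continuously, as the steps above describe.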
  • Figure 17 is a flow chart of a method for collaboratively presenting game objects provided by an embodiment of the present application.
  • This method can be used to implement: when the second object is at the second virtual position in the first VR scene, the first AR client presents the second object at the second real position in the first AR scene; the method can also be used to present motion events of the second object.
  • the second object is the player-controlled character corresponding to the first VR client (that is, the second object is a virtual object), and the positioning information corresponding to the first VR client is the positioning information corresponding to the second object.
  • this method is not limited to the specific order described in Figure 17 and below. It should be understood that in other embodiments, the order of some steps of the method can be exchanged according to actual needs, or some of the steps can be omitted or deleted.
  • the method includes the following steps:
  • the service server sends the fifth positioning information corresponding to the second object to the first AR client.
  • the second object may be in a second virtual location in the first VR scene.
  • the business server may send the fifth positioning information of the second object in the second coordinate system to the first AR client. Since, to the business server, the first AR client and the first VR client are both in the second coordinate system, the first AR client can present the second object in the first AR scene based on the fifth positioning information.
  • the fifth positioning information may correspond to the second virtual position of the second object in the first VR scene.
  • the service server may send fifth rotation information to the first AR client.
  • the first VR client may convert the third rotation information to obtain the fifth rotation information, and send the fifth rotation information to the business server.
  • the fifth rotation information may indicate the rotation angle of the second object in the second coordinate system.
  • the service server may send at least one of the fifth positioning information and the fifth rotation information to the first AR client by broadcasting.
  • the first AR client converts the fifth positioning information into the eighth positioning information.
  • the first AR client may perform coordinate system conversion on the fifth positioning information to obtain the eighth positioning information in the third coordinate system.
  • the eighth positioning information may indicate the position of the second object in the third coordinate system, and the eighth positioning information may also be understood as the local coordinates of the second object.
  • the first AR client may convert the fifth rotation information into sixth rotation information.
  • the sixth rotation information may be the rotation angle of the second object in the third coordinate system, or it may be understood that the sixth rotation information indicates the rotation angle of the second object local to the first AR client.
  • the first AR client presents the second object at the second real position in the first AR scene indicated by the eighth positioning information.
  • the first AR client may present the second object based on the sixth rotation information at a second real-world location in the first AR scene.
  • the first AR client may present the second object based on the sixth rotation information at a second real-world position in the first AR scene indicated by the eighth positioning information.
  • the first VR client controls the second object to move in the first VR scene.
  • the moved second object is at the fourth virtual position in the first VR scene, and the positioning information corresponding to the fourth virtual position is the new sixth positioning information.
  • the first VR client can receive movement instructions submitted by the user through peripheral devices such as a keyboard, mouse, gamepad, remote control, microphone, smart glasses, helmet, or motion-sensing equipment, and then, based on the movement instructions, control the second object to move in the first VR scene.
  • the type of movement instruction may include one or more types of key operations, touch operations, gesture operations, eye movement operations, and brain waves.
  • the first VR client can also receive movement instructions submitted by the user through other methods, or can control the movement of the second object in the first VR scene through other methods.
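As one hypothetical example of turning key-operation movement instructions into motion of the second object, the sketch below maps WASD-style key events to 2D displacement. The key bindings, step size, and function names are all invented; the patent only requires that some movement instruction drives the object.

```python
# Assumed key bindings: each key maps to a unit direction in the scene plane.
MOVES = {"w": (0, 1), "s": (0, -1), "a": (-1, 0), "d": (1, 0)}

def apply_moves(pos, keys, step=1.0):
    """Apply a sequence of key events to a 2D position; unknown keys are ignored."""
    x, y = pos
    for k in keys:
        dx, dy = MOVES.get(k, (0, 0))
        x += dx * step
        y += dy * step
    return (x, y)
```

The resulting position would then be the new sixth positioning information, converted to the shared frame and relayed as described in the following steps.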
  • the first VR client controls the second object to move in the first VR scene, and the rotation information corresponding to the moved second object is the new third rotation information.
  • S1705 The first VR client converts the new sixth positioning information into new fifth positioning information.
  • the first VR client can perform coordinate system conversion on the new sixth positioning information, thereby obtaining new fifth positioning information in the second coordinate system. Since, to the business server, the first AR client and the first VR client are both in the second coordinate system, the first AR client can present the motion status of the second object in the first AR scene based on the new fifth positioning information.
  • the first VR client may convert the new third rotation information into new fifth rotation information.
  • S1706 The first VR client sends new fifth positioning information to the service server.
  • the first VR client may also send new fifth rotation information to the service server.
  • S1707 The service server sends new fifth positioning information to the first AR client.
  • the service server may also send new fifth rotation information to the first AR client.
  • the first AR client converts the new fifth positioning information into the new eighth positioning information.
  • the first AR client may also convert the new fifth rotation information into new sixth rotation information.
  • the first AR client presents the second object at the fourth real position in the first AR scene indicated by the new eighth positioning information.
  • the fourth real position in the first AR scene may correspond to the fourth virtual position in the first VR scene.
  • the first AR client may present the second object at a fourth real-world location in the first AR scene based on the new sixth rotation information.
  • the first AR client may present the second object based on the new sixth rotation information at a fourth real-world position in the first AR scene indicated by the new eighth positioning information.
  • in this way, when the second object moves from the second virtual position in the first VR scene to the fourth virtual position in the first VR scene, the second object also moves from the second real position in the first AR scene to the fourth real position in the first AR scene; that is, the instantaneous position of the second object in the first VR client and its position changes at different times are visible to the first AR client.
  • the motion status of the second object can be presented synchronously by executing at least part of the above-mentioned steps S1701-S1709 multiple times.
  • in summary, the instantaneous position of the first object in the first AR client and its position changes at different times are visible to the first VR client, and the instantaneous position of the second object in the first VR client and its position changes at different times are visible to the first AR client. Therefore, when the user of the first AR client sees the virtual character corresponding to the first VR client in the first AR scene, the user of the first VR client can also see the virtual character corresponding to the first AR client in the first VR scene; that is, the virtual characters controlled by the real user of the first AR client and the real user of the first VR client can interact with each other in the game scene.
  • Figure 18 shows an example of the first AR scene.
  • Figure 19 shows an example of the first VR scene.
  • the first AR scene shown in Figure 18 is a game scene presented by the first AR client from a first-person perspective.
  • the first AR scene includes the second object 1601 corresponding to the first VR client, but does not include the first object 1602 corresponding to the first AR client. That is to say, the user of the first AR client can see other users playing together, but cannot see the user himself.
  • the second object 1601 presented in the first AR scene is a virtual object.
  • the first VR scene shown in FIG. 19 is a game scene presented by the first VR client from a third-person perspective.
  • the first VR scene includes both the second object 1601 corresponding to the first VR client and the first object 1602 corresponding to the first AR client, wherein the first object 1602 is a virtual object corresponding to the user of the first AR client. That is, the user of the first AR client can see other users playing together, and can also see the user himself.
  • when the user of the first VR client controls the second object 1601 to move in the first VR scene, the first AR client also presents the movement process of the second object 1601 in the first AR scene. As shown in Figure 18, the second object 1601 moves from a position near the left of the first AR scene to a position near the middle.
  • similarly, when the user of the first AR client (that is, the first object 1602) moves in the first AR scene, the first VR client also presents the movement process of the first object 1602 in the first VR scene. As shown in Figure 19, the first object 1602 moves from a position near the right of the first VR scene to a position near the middle.
  • Figure 20 is a flowchart of a method for collaboratively presenting game objects provided in an embodiment of the present application.
  • the method can be used to implement: when the first real position of the first AR scene includes the first object, the first VR client presents the first object at the first virtual position of the first VR scene.
  • the method can also be used to present motion events of the first object.
  • the first object is an object that is not controlled by the first AR client and the first VR client.
  • the first object can be a non-player-controlled character, or the first object can be a vehicle in the first AR scene or other objects that are not controlled by the first AR client.
  • the method is not limited to Figure 20 and the specific order described below. It should be understood that in other embodiments, the order of some steps in the method can be interchanged according to actual needs, or some steps can be omitted or deleted.
  • the method includes the following steps:
  • the first AR client obtains ninth positioning information corresponding to the first object.
  • the ninth positioning information corresponds to the third real position of the first object in the first AR scene.
  • the ninth positioning information may be used to indicate the position of the first object in the third coordinate system.
  • the first AR client may obtain seventh rotation information corresponding to the first object.
  • the seventh rotation information may be used to indicate the rotation angle of the first object in the third coordinate system.
  • if the first object is an object that is not controlled by the first AR client, such as a real vehicle in the first AR scene, then the manner in which the first AR client obtains the ninth positioning information corresponding to the first object can be the same as or similar to the manner of obtaining the second positioning information corresponding to the first AR client, and the manner in which the first AR client obtains the seventh rotation information can be the same as or similar to the manner of obtaining the first rotation information corresponding to the first AR client.
  • if the first object is a virtual object not controlled by the first AR client, then at least one of the ninth positioning information and the seventh rotation information corresponding to the first object can be generated by a preset movement strategy corresponding to the first object. Of course, in practical applications, the first AR client can also obtain at least one of the ninth positioning information and the seventh rotation information corresponding to the first object through other methods.
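A preset movement strategy for such an uncontrolled virtual object could, for instance, walk the object along fixed waypoints and derive a heading (rotation) from the direction of travel. The patent does not prescribe any particular strategy; everything below, including the waypoint values, is an invented example.

```python
import math

def waypoint_strategy(waypoints):
    """Return a tick function that yields (position, heading_deg) per step,
    cycling through the given waypoints."""
    def tick(step):
        pos = waypoints[step % len(waypoints)]
        nxt = waypoints[(step + 1) % len(waypoints)]
        # Heading points toward the next waypoint, measured in degrees.
        heading = math.degrees(math.atan2(nxt[1] - pos[1], nxt[0] - pos[0]))
        return pos, heading % 360.0
    return tick

# An NPC that patrols a small triangle.
npc = waypoint_strategy([(0.0, 0.0), (5.0, 0.0), (5.0, 5.0)])
```

Each `tick` result would supply the ninth positioning information and seventh rotation information for one update, which the client then converts to the shared frame as in the surrounding steps.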
  • the first AR client converts the ninth positioning information into the tenth positioning information.
  • the tenth positioning information may be used to indicate the position of the first object in the second coordinate system.
  • the first AR client may convert the seventh rotation information into the eighth rotation information.
  • the eighth rotation information may be used to indicate the rotation angle of the first object in the second coordinate system.
  • S2003 The first AR client sends the tenth positioning information to the service server.
  • the first AR client may also send eighth rotation information to the service server.
  • the service server sends the tenth positioning information to the first VR client.
  • the service server may also send the eighth rotation information to the first VR client.
  • the first VR client converts the tenth positioning information into the eleventh positioning information.
  • the eleventh positioning information may be used to indicate the position of the first object in the fourth coordinate system.
  • the first VR client may convert the eighth rotation information into ninth rotation information.
  • the ninth rotation information may be used to indicate the rotation angle of the first object in the fourth coordinate system.
  • the first VR client presents the first object at the first virtual position in the first VR scene indicated by the eleventh positioning information.
  • the first VR client may present the first object based on the ninth rotation information in the first VR scene.
  • the first VR client may present the first object based on the ninth rotation information at the first virtual position in the first VR scene indicated by the eleventh positioning information.
  • the first AR client acquires new ninth positioning information after the movement of the first object.
  • the new ninth positioning information corresponds to the third real position of the first object after movement in the first AR scene.
  • the first AR client may also obtain new seventh rotation information corresponding to the first object after the movement.
  • if the first object is an object that is not controlled by the first AR client, such as a real vehicle in the first AR scene, then the first object can move on its own.
  • if the first object is a virtual object not controlled by the first AR client, then the first object moves based on the preset movement strategy corresponding to the first object.
  • the first object may also move based on other control methods. The embodiments of this application do not limit the reasons for the movement of the first object.
  • the first AR client converts the new ninth positioning information into the new tenth positioning information.
  • the first AR client may convert the new seventh rotation information into new eighth rotation information.
  • S2009 The first AR client sends new tenth positioning information to the service server.
  • the first AR client may send new eighth rotation information to the service server.
  • S2010 The service server sends new tenth positioning information to the first VR client.
  • the service server may send new eighth rotation information to the first VR client.
  • the first VR client converts the new tenth positioning information into the new eleventh positioning information.
  • the first VR client may convert the new eighth rotation information into new ninth rotation information.
  • the first VR client presents the first object at the third virtual position in the first VR scene indicated by the new eleventh positioning information.
  • the first VR client may present the first object based on the new ninth rotation information in the first VR scene.
  • the first VR client may present the first object based on the new ninth rotation information at a third virtual position in the first VR scene indicated by the new eleventh positioning information.
  • in this way, when the first object moves from the first real position in the first AR scene to the third real position in the first AR scene, the first object also moves from the first virtual position in the first VR scene to the third virtual position in the first VR scene; that is, the instantaneous position of the first object in the first AR client and its position changes at different times are visible to the first VR client. It can be understood that the motion status of the first object can be presented synchronously by executing at least part of the above-mentioned steps S2001-S2012 multiple times.
  • Figure 21 is a flow chart of a method for collaboratively presenting game objects provided by an embodiment of the present application.
  • This method can be used to implement: when the second object is at the second virtual position in the first VR scene, the first AR client presents the second object at the second real position in the first AR scene; the method can also be used to present motion events of the second object.
  • the second object is an object that is not controlled by the first VR client.
  • the second object can be a non-player controlled character, etc.
  • this method is not limited to the specific order described in Figure 21 and below. It should be understood that in other embodiments, the order of some steps of the method can be exchanged according to actual needs, or some of the steps can be omitted or deleted.
  • the method includes the following steps:
  • the first VR client obtains twelfth positioning information corresponding to the second object.
  • the twelfth positioning information corresponds to the second virtual position of the second object in the first VR scene.
  • the twelfth positioning information may be used to indicate the position of the second object in the fourth coordinate system.
  • the first VR client may obtain the tenth rotation information corresponding to the second object.
  • the tenth rotation information may be used to indicate the rotation angle of the second object in the fourth coordinate system.
  • If the second object is a virtual object that is not controlled by the first VR client, at least one of the twelfth positioning information and the tenth rotation information corresponding to the second object can be generated by a preset motion strategy corresponding to the second object.
  • the first VR client can also obtain at least one of the twelfth positioning information and the tenth rotation information corresponding to the second object by other means.
  • S2102 The first VR client converts the twelfth positioning information into thirteenth positioning information.
  • the thirteenth positioning information may be used to indicate the position of the second object in the second coordinate system.
  • the first VR client may convert the tenth rotation information into eleventh rotation information, and the eleventh rotation information may be used to indicate the rotation angle of the second object in the second coordinate system.
  • S2103 The first VR client sends the thirteenth positioning information corresponding to the second object to the service server.
  • the first VR client may send eleventh rotation information to the service server.
  • S2104 The service server sends the thirteenth positioning information corresponding to the second object to the first AR client.
  • the service server may send the eleventh rotation information to the first AR client.
  • S2105 The first AR client converts the thirteenth positioning information into the fourteenth positioning information.
  • the fourteenth positioning information may be used to indicate the position of the second object in the third coordinate system.
  • the first AR client may convert the eleventh rotation information into the twelfth rotation information.
  • the twelfth rotation information may be used to indicate the rotation angle of the second object in the third coordinate system.
  • S2106 The first AR client presents the second object at the second real position in the first AR scene indicated by the fourteenth positioning information.
  • the first AR client may present the second object based on the twelfth rotation information at the second real position in the first AR scene.
  • That is, the first AR client may present the second object based on the twelfth rotation information at the second real position in the first AR scene indicated by the fourteenth positioning information.
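Steps S2102 and S2105 both perform a change of coordinate frame: VR-local (fourth) system to the shared second system, then shared second system to AR-local (third) system. As an illustrative sketch only, not a method fixed by this application, such a conversion can be modeled as a translation plus a yaw rotation; the `Frame` structure, the function names, and the yaw-only rotation are all assumptions:

```python
import math
from dataclasses import dataclass

@dataclass
class Frame:
    """Pose of a client-local coordinate system inside the shared
    (second) coordinate system: origin offset plus yaw about the
    vertical (y) axis. Names are illustrative, not from the application."""
    ox: float
    oy: float
    oz: float
    yaw: float  # radians

def local_to_shared(f: Frame, x: float, y: float, z: float):
    """Map a point from a client-local system (e.g. the fourth
    coordinate system) into the shared second coordinate system."""
    c, s = math.cos(f.yaw), math.sin(f.yaw)
    return (f.ox + c * x + s * z, f.oy + y, f.oz - s * x + c * z)

def shared_to_local(f: Frame, x: float, y: float, z: float):
    """Inverse mapping: shared second system -> client-local system
    (e.g. the third coordinate system of the AR client)."""
    dx, dy, dz = x - f.ox, y - f.oy, z - f.oz
    c, s = math.cos(f.yaw), math.sin(f.yaw)
    return (c * dx - s * dz, dy, s * dx + c * dz)

# A VR-local position is lifted into the shared system, relayed by the
# server, then localized by the AR client into its own system.
vr_frame = Frame(10.0, 0.0, 5.0, math.pi / 2)
ar_frame = Frame(-3.0, 0.0, 2.0, 0.0)
shared = local_to_shared(vr_frame, 1.0, 0.0, 0.0)
ar_local = shared_to_local(ar_frame, *shared)
```

The rotation information would be converted analogously, by adding or subtracting the frame's yaw offset.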
  • S2107 The first VR client obtains new twelfth positioning information after the movement of the second object.
  • the new twelfth positioning information corresponds to the fourth virtual position of the second object after movement in the first VR scene.
  • the first VR client obtains new tenth rotation information corresponding to the second object after the movement.
  • If the second object is a virtual object not controlled by the first VR client, the second object moves based on the preset movement strategy corresponding to the second object.
  • the second object may also move based on other control methods. The embodiments of this application do not limit the reasons for the movement of the second object.
  • S2108 The first VR client converts the new twelfth positioning information into new thirteenth positioning information.
  • the first VR client may convert the new tenth rotation information into new eleventh rotation information.
  • S2109 The first VR client sends new thirteenth positioning information to the service server.
  • the first VR client may send new eleventh rotation information to the service server.
  • S2110 The service server sends new thirteenth positioning information to the first AR client.
  • the service server may send new eleventh rotation information to the first AR client.
  • S2111 The first AR client converts the new thirteenth positioning information into new fourteenth positioning information.
  • the first AR client may further convert the new eleventh rotation information into new twelfth rotation information.
  • S2112 The first AR client presents the second object at the fourth real position in the first AR scene indicated by the new fourteenth positioning information.
  • the first AR client may present the second object at a fourth real-world location in the first AR scene based on the new twelfth rotation information.
  • the first AR client may present the second object based on the new twelfth rotation information at a fourth real-world position in the first AR scene indicated by the new fourteenth positioning information.
  • In this way, when the second object moves from the second virtual position in the first VR scene to the fourth virtual position in the first VR scene, the second object also moves from the second real position in the first AR scene to the fourth real position in the first AR scene. In other words, the instantaneous position of the second object in the first VR client, and its position changes over time, are visible to the first AR client. It can be understood that the motion status of the second object can be presented synchronously by performing at least some of the above-mentioned steps S2101-S2112 multiple times.
  • FIG. 22 is a flow chart of a method for collaboratively presenting game objects provided by an embodiment of the present application. It can be understood that the following takes the first object in the game scene as an example to illustrate the manner in which the AR client and the VR client collaboratively present the first object and the status update event of the first object.
  • the AR client and the VR client can collaboratively present the object and the state update event of the object in a similar or identical manner.
  • this method is not limited to the specific order described in Figure 22 and below. It should be understood that in other embodiments, the order of some steps of the method can be exchanged according to actual needs, or some of the steps can be omitted or deleted.
  • the method includes the following steps:
  • S2201 The business server sends the identity information corresponding to the first object to the first AR client and the first VR client.
  • the business server can determine whether the first object is within the range indicated by the first range information based on the positions of the first object and the first AR client in the second coordinate system and the first range information corresponding to the first AR client, and if so, send the identity information corresponding to the first object to the first AR client.
  • Similarly, the business server may determine, based on the positions of the first object and the first VR client in the second coordinate system and the second range information corresponding to the first VR client, whether the first object is within the range indicated by the second range information, and if so, send the identity information corresponding to the first object to the first VR client.
  • the service server may also determine whether to send the identity information of the first object to the first AR client and/or the first VR client through other methods.
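The range check described above might be sketched as follows. This is a minimal illustration under assumptions not fixed by the application: the range information is modeled as a simple spherical radius, and the data layout and names are invented for the example:

```python
import math

def within_range(obj_pos, client_pos, radius):
    """True if obj_pos lies within `radius` of client_pos, both given
    in the shared second coordinate system. The spherical radius is a
    stand-in for the range information."""
    return math.dist(obj_pos, client_pos) <= radius

def clients_to_notify(obj_pos, clients):
    """Select which clients should receive the object's identity
    information. `clients` maps client id -> (position, range radius);
    the structure is illustrative only."""
    return [cid for cid, (pos, radius) in clients.items()
            if within_range(obj_pos, pos, radius)]
```

A server could run such a check whenever an object's shared-coordinate position changes, notifying only the clients whose range covers the object.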
  • the identity information of the first object may include at least one of a user ID, a name, and a role ID corresponding to the first object.
  • the identity information of the first object may also include more or less information indicating the identity of the first object.
  • S2202 The first AR client and the first VR client obtain resource data corresponding to the first object based on the identity information corresponding to the first object.
  • the resource data corresponding to the first object can be used to present the first object.
  • the resource data may include an avatar corresponding to the first object.
  • the resource data may include props, assets, status, etc. corresponding to the first object.
  • the resource data may also include more or less data for presenting the first object.
  • If the first AR client and the first VR client both locally store at least part of the resource data corresponding to the first object, the first AR client can obtain the at least part of the resource data from its own local storage.
  • the first VR client can obtain at least part of the resource data locally from the first VR client.
  • If the resource server stores at least part of the resource data corresponding to the first object, the first AR client and the first VR client can obtain the at least part of the resource data from the resource server.
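A local-first lookup with a resource-server fallback, as described above, might look like the following sketch. The resource keys (`avatar`, `props`, `status`) and function names are illustrative assumptions, not fields defined by the application:

```python
def get_resource_data(identity, local_cache, fetch_remote):
    """Local-first resource lookup: pieces of the object's resource
    data already cached on the client are used directly; anything
    missing is requested from the resource server via fetch_remote.
    All names are illustrative."""
    data = dict(local_cache.get(identity, {}))
    missing = {"avatar", "props", "status"} - data.keys()
    if missing:
        data.update(fetch_remote(identity, missing))
    return data
```

The same helper serves both clients: each passes its own cache, so a client that already holds the avatar only fetches the remaining pieces.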
  • S2203 The first AR client and the first VR client present the first object based on the resource data corresponding to the first object.
  • the first AR client and the first VR client may render and display the virtual image corresponding to the first object, thereby presenting the first object.
  • If the first object is a real object in the first AR scene, what is presented by the first AR client is the first object captured by the first AR client through the camera, or the user of the first AR client can directly see the first object; therefore, the first AR client may not present the virtual image corresponding to the first object.
  • the first AR client and the first VR client may display identity information corresponding to the first object.
  • the first AR client and the first VR client can display the status of the first object, such as health value, mana value, stamina value, endurance value, remaining items, etc.
  • S2204 When a status update event of the first object is triggered, the service server notifies the first AR client and the first VR client of the status update event of the first object.
  • the status update event of the first object may be an event triggered in the first AR scene.
  • For example, the first object is a real user corresponding to the first AR client, and the real user obtains a prop in the first AR scene, that is, the prop balance of the first object changes; or, the first object is a treasure box in the first AR scene, and if the real user corresponding to the first AR client opens the treasure box, the treasure box can change from an unopened state to an open state.
  • the status update event of the first object may be an event triggered in the first VR scene.
  • the first object is a treasure box, and the player corresponding to the first VR client controls the character to open the treasure box in the first VR scene.
  • S2205 The first AR client and the first VR client update the status of the first object, that is, present the status update event of the first object.
  • the first AR client can update the first object presented in the first AR scene
  • the first VR client can update the first object presented in the first VR scene.
  • If the first object is a real object, the first AR client can capture the state change of the first object through the camera, or the user of the first AR client can directly see the state change of the first object; therefore, the first AR client may not update the state of the first object.
  • the first AR client and the first VR client can obtain the identity information corresponding to the first object, and obtain the resource data corresponding to the first object based on the identity information.
  • the first AR client can present the first object in the first AR scene, and the first VR client can present the first object in the first VR scene, so that the user corresponding to the first AR client and the first VR client can both see the image and status of the first object.
  • the first AR client can update the status of the first object in the first AR scene
  • the first VR client can update the status of the first object in the first VR scene, so that the user corresponding to the first AR client and the first VR client can both see the status change of the first object, thereby improving the accuracy of the collaborative presentation of the first object.
  • Similarly, the business server can send the identity information corresponding to the second object to the first AR client and the first VR client. The first AR client and the first VR client can obtain the resource data corresponding to the second object based on the identity information corresponding to the second object, and present the second object based on that resource data, so that both the user corresponding to the first AR client and the user corresponding to the first VR client can see the image and status of the second object.
  • Moreover, the business server can notify the first AR client and the first VR client of the status update event of the second object, and the first AR client and the first VR client can present the status update event of the second object, thereby updating the status of the second object, so that both the user corresponding to the first AR client and the user corresponding to the first VR client can see the status change of the second object, which improves the accuracy of the collaborative presentation of the second object.
  • the first AR scene presented by the first AR client is as shown in Figure 23, and the first VR scene presented by the first VR client is as shown in Figure 24.
  • the first object is a real user of the first AR client and a virtual image in the first VR scene; the second object is a player-controlled character corresponding to the first VR client and a virtual image in both the first AR scene and the first VR scene.
  • the upper left corner of the first AR scene also includes identity information such as the avatar A 1603 and character name A 1604 corresponding to the first object 1602, and also includes status information such as the health value A 1605 corresponding to the first object 1602.
  • above the head of the second object 1601 in the first AR scene are identity information such as the avatar B 1606 and character name B 1607 corresponding to the second object 1601, and status information such as the health value B 1608 corresponding to the second object 1601.
  • the upper left corner of the first VR scene also includes identity information such as the avatar B 1606 and character name B 1607 corresponding to the second object 1601, and also includes status information such as the health value B 1608 corresponding to the second object 1601.
  • above the head of the first object 1602 in the first VR scene are identity information such as the avatar A 1603 and character name A 1604 corresponding to the first object 1602, and status information such as the health value A 1605 corresponding to the first object 1602.
  • the health value A 1605 and other status information corresponding to the first object 1602 are consistent between the first AR scene shown in Figure 23 and the first VR scene shown in Figure 24, and the health value B 1608 and other status information corresponding to the second object 1601 are also consistent between the first AR scene shown in Figure 23 and the first VR scene shown in Figure 24.
  • the health value A 1605 corresponding to the first object 1602 is in an incomplete state
  • the health value B 1608 corresponding to the second object 1601 is in a full state.
  • both the first AR scene and the first VR scene can update the health value A 1605 of the first object 1602.
  • the health value A 1605 of the first object 1602 is updated to a full state in both scenes.
  • both the first AR scene and the first VR scene can update the health value B 1608 of the second object 1601.
  • in the first AR scene after the status update on the right side of Figure 23 and the first VR scene after the status update on the right side of Figure 24, the health value B 1608 of the second object 1601 is updated to the incomplete state.
  • It should be noted that FIG. 23 and FIG. 24 describe only possible ways of presenting the first object and the second object, the status update event of the first object, and the status update event of the second object in the first AR scene and the first VR scene, without limiting the presentation of the first object and the second object or of their status update events.
  • FIG. 25 is a flow chart of an object interaction method provided by an embodiment of the present application. It can be understood that the interaction method between the first object and the second object will be described below by taking the first object shooting at the second object in the game scene as an example. It should be noted that this method is not limited to the specific order described in Figure 25 and below. It should be understood that in other embodiments, the order of some steps of the method can be exchanged according to actual needs, or some of the steps can be omitted or deleted. The method includes the following steps:
  • S2501 The first AR client responds to the attack command of the first object and sends the bullet information corresponding to the first object to the business server.
  • the first object is an object controlled by the user corresponding to the first AR client, and the first AR client can receive attack instructions submitted by the user.
  • the first object may not be an object controlled by the user, and the first AR client may obtain the attack instruction from the preset interaction strategy corresponding to the first object.
  • the first AR client can also obtain the attack instructions of the first object through other methods.
  • the attack instruction of the first object may be used to indicate bullet information corresponding to the first object.
  • the bullet information may include the bullet ID, the position of the bullet in the second coordinate system, and the firing direction of the bullet in the second coordinate system.
  • the bullet information may also include more or less information, such as one or more of bullet type, bullet flight speed, bullet flight distance, etc.
  • S2502 The business server sends the bullet information corresponding to the first object to the first VR client.
  • the service server may send the bullet information corresponding to the first object to the first VR client by broadcasting.
  • Alternatively, the service server may determine, based on the second positioning information corresponding to the first AR client (that is, the position of the first AR client in the second coordinate system) and the fifth positioning information corresponding to the first VR client (that is, the position of the first VR client in the second coordinate system), whether the first VR client is in the vicinity of the first AR client, and if so, send the bullet information corresponding to the first object to the first VR client.
  • For example, the business server can mesh the game scene based on the second coordinate system, determine the grid in which the first VR client is located in the second coordinate system, and determine whether the first AR client is located in the same grid; if so, the business server can send the bullet information corresponding to the first object to the first VR client.
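The grid-meshing idea can be sketched as follows. The cell size, the use of 2D ground cells, and the inclusion of adjacent cells alongside the shooter's own cell are all assumptions made for illustration:

```python
from collections import defaultdict

CELL = 10.0  # grid cell size in shared-coordinate units (assumed)

def cell_of(x, z):
    """Grid cell containing a shared-coordinate ground position."""
    return (int(x // CELL), int(z // CELL))

def neighbours(cell):
    """The cell itself plus its 8 surrounding cells."""
    cx, cz = cell
    return {(cx + dx, cz + dz) for dx in (-1, 0, 1) for dz in (-1, 0, 1)}

def recipients(shooter_pos, client_positions):
    """Clients in the shooter's cell or an adjacent cell receive the
    bullet information. `client_positions` maps id -> (x, z)."""
    index = defaultdict(list)
    for cid, (x, z) in client_positions.items():
        index[cell_of(x, z)].append(cid)
    out = []
    for c in neighbours(cell_of(*shooter_pos)):
        out.extend(index[c])
    return out
```

Bucketing clients by cell keeps the per-shot cost proportional to the local population rather than to all connected clients.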
  • S2503 The first VR client presents the bullet shot by the first object in the first VR scene based on the bullet information corresponding to the first object.
  • the first VR client may localize the bullet information corresponding to the first object, including converting the position of the bullet in the second coordinate system into the position of the bullet in the fourth coordinate system, and converting the firing direction of the bullet in the second coordinate system into the firing direction of the bullet in the fourth coordinate system.
  • the first VR client presents the bullet shot by the first object in the first VR scene based on the bullet ID, the position of the bullet in the fourth coordinate system, and the firing direction of the bullet in the fourth coordinate system.
  • the first VR client can instantiate a bullet fired by the first object and simulate the trajectory of the bullet. In some embodiments, the first VR client may present the auditory and/or visual effects of the first object's shooting.
  • It can be understood that the first AR client can also present the bullet shot by the first object in the first AR scene based on the bullet information corresponding to the first object, in the same or a similar manner as the first VR client in S2503.
  • S2504 If the second object is in the trajectory of a bullet fired by the first object, the bullet hits the second object.
  • the first VR client can detect whether the bullet fired by the first object hits the second object through a collision detection algorithm.
  • the first VR client can also detect whether the bullet fired by the first object hits the second object through other methods.
  • the business server or the first AR client can also determine whether the bullet fired by the first object hits the second object.
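One common form of the collision detection mentioned above is a ray/sphere test. This sketch, in which the names and the sphere approximation of the second object are assumptions, checks whether the bullet's flight line passes within a hit radius of the target before exceeding its range:

```python
import math

def bullet_hits(origin, direction, target_center, target_radius, max_range):
    """Ray/sphere test: does a bullet fired from `origin` along the
    normalized `direction` pass within `target_radius` of
    `target_center` before travelling `max_range`? A simple stand-in
    for the collision detection a client might use."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    cx, cy, cz = target_center
    # Project the vector origin->center onto the flight direction.
    t = (cx - ox) * dx + (cy - oy) * dy + (cz - oz) * dz
    if t < 0 or t > max_range:
        return False  # closest approach is behind the muzzle or beyond range
    px, py, pz = ox + t * dx, oy + t * dy, oz + t * dz
    return math.dist((px, py, pz), target_center) <= target_radius
```

In practice an engine's physics system would replace this, but the geometry is the same: find the closest approach of the trajectory to the target and compare it with the hit radius.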
  • S2505 The business server obtains the health value change information corresponding to the second object based on the shooting result of the first object.
  • the health value change information corresponding to the second object may include the health value change amount and/or the changed health value.
  • S2506 The business server sends health value change information corresponding to the second object to the first AR client and the first VR client.
  • For example, the service server may determine, based on the second positioning information corresponding to the first AR client (that is, the position of the first AR client in the second coordinate system) and the fifth positioning information corresponding to the first VR client (that is, the position of the first VR client in the second coordinate system), whether the first VR client is in the vicinity of the first AR client, and if so, send the health value change information corresponding to the second object to the first AR client.
  • S2507 The first AR client and the first VR client update the health value of the second object based on the health value change information corresponding to the second object.
  • the first AR client can present the result of the reduction of the health value corresponding to the second object in the first AR scene, and can also present the auditory effect and/or the visual effect when the health value is reduced.
  • the first VR client can present the result of the reduced health value corresponding to the second object in the first VR scene, and can also present the auditory effect and/or visual effect when the health value is reduced.
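Since the health value change information may include a change amount and/or a changed value, applying it on a client could look like the following sketch; the field names and the clamping range are assumptions for illustration:

```python
def apply_health_change(current, change):
    """Apply server-sent health value change information: either a
    delta ('amount') or an absolute value ('new_value') may be
    present; the field names are assumed. Health is clamped to
    [0, 100] for the example."""
    if "new_value" in change:
        value = change["new_value"]
    else:
        value = current + change.get("amount", 0)
    return max(0, min(100, value))
```

Preferring the absolute value when both fields are present keeps all clients convergent on the server's authoritative state even if a delta message was lost.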
  • It should be noted that when the business server obtains the bullet information corresponding to the first object, it can send the bullet information corresponding to the first object to more clients in a manner similar or identical to the manner in which it sends the bullet information to the first VR client, thereby detecting whether the bullet fired by the first object hits the object corresponding to each of the more clients.
  • Similarly, when the business server obtains the health value change information corresponding to the second object, it can send the health value change information to more clients in a manner similar or identical to the manner in which it sends it to the first AR client, so that the further clients may present the state change event of the health value change of the second object.
  • the first VR client may send bullet information corresponding to the second object to the business server.
  • the business server sends the bullet information corresponding to the second object to the first AR client.
  • the first AR client presents the bullet fired by the second object in the first AR scene based on the bullet information corresponding to the second object.
  • the first AR client sends the identity information of the first object to the business server.
  • the business server obtains the health value change information corresponding to the first object based on the shooting result of the second object.
  • the business server sends health value change information corresponding to the first object to the first AR client and the first VR client.
  • the first AR client and the first VR client update the health value of the first object based on the health value change information corresponding to the first object.
  • The above description takes as examples, in the case where the first AR client and the first VR client access the same game scene, the manner in which the first VR client presents the first object at the first virtual position of the first VR scene when the first real position of the first AR scene includes the first object, and the manner in which the first AR client presents the second object at the second real position of the first AR scene when the second virtual position of the first VR scene includes the second object. However, it is understandable that in actual applications, the clients accessing the game scene may also include more AR clients and/or more VR clients.
  • For example, the first AR client, the first VR client, and the second AR client access the same game scene. When the fifth real position of the second AR scene presented by the second AR device includes the third object, the first AR client can present the third object at the fifth real position in the first AR scene, and the first VR client can present the third object at the fifth virtual position in the first VR scene, wherein the fifth real position and the fifth virtual position are the same position in the game scene.
  • Moreover, when the third event corresponding to the third object occurs in the second AR scene, the first VR client can present the third event in the first VR scene, and the first AR client can present the third event in the first AR scene.
  • the third event may include at least one of a status update event of the third object, a movement event of the third object, and an interaction event between the third object and other objects.
  • For another example, the first AR client, the first VR client, and the second VR client access the same game scene. When the sixth virtual position of the second VR scene presented by the second VR device includes the fourth object, the first AR client can present the fourth object at the sixth real position in the first AR scene, and the first VR client can present the fourth object at the sixth virtual position in the first VR scene, wherein the sixth real position and the sixth virtual position are the same position in the game scene.
  • Moreover, when a fourth event corresponding to the fourth object occurs in the second VR scene, the first VR client may present the fourth event in the first VR scene, and the first AR client may present the fourth event in the first AR scene.
  • the fourth event may include at least one of a status update event of the fourth object, a movement event of the fourth object, and an interaction event between the fourth object and other objects.
  • In this way, the first AR client can present, in the first AR scene, the second object from the first VR scene and the third object from the second AR scene, and the first VR client can present, in the first VR scene, the first object from the first AR scene and the fourth object from the second VR scene, thereby realizing collaborative presentation of the same game scene by multiple clients of multiple types, so that multiple users can meet, interact, and play with each other in the same game scene through multiple different types of clients.
  • the first AR scene as shown in Figure 26 also includes a third object 1603 from the second AR scene, where the third object 1603 is a real user corresponding to the second AR client.
  • the first VR scene also includes the third object 1603 from the second AR scene, and in the first VR scene the third object 1603 is a virtual image corresponding to the real user.
  • An embodiment of the present application also provides a client.
  • the client includes: a memory and a processor.
  • the memory is used to store the computer program; the processor is used to execute the method described in the above method embodiment when the computer program is called.
  • the client provided in this embodiment can execute the above method embodiments; its implementation principles and technical effects are similar and will not be repeated here.
  • an embodiment of the present application further provides a chip system, which includes a processor coupled to a memory, and the processor executes a computer program stored in the memory to implement the method described in the above method embodiment.
  • the chip system may be a single chip or a chip module composed of multiple chips.
  • Embodiments of the present application also provide a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, the method described in the above method embodiment is implemented.
  • An embodiment of the present application also provides a computer program product.
  • When the computer program product is run on a client, the client implements the method described in the above method embodiment.
  • If the above-mentioned integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • this application can implement all or part of the processes in the methods of the above embodiments by instructing relevant hardware through a computer program.
  • the computer program can be stored in a computer-readable storage medium.
  • When the computer program is executed by a processor, the steps of each of the above method embodiments may be implemented.
  • the computer program includes computer program code, which may be in the form of source code, object code, executable file or some intermediate form.
  • the computer-readable storage medium may at least include: any entity or device capable of carrying the computer program code to the camera device/client, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example, a USB flash drive, a mobile hard disk, a magnetic disk, or an optical disc.
  • the disclosed apparatus/devices and methods can be implemented in other ways.
  • the apparatus/equipment embodiments described above are only illustrative.
  • the division of modules or units is only a logical function division; in actual implementation, there may be other division methods, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be realized through some interfaces; the indirect coupling or communication connection of devices or units may be in electrical, mechanical, or other forms.
  • the term “if” may be interpreted as “when”, “once”, “in response to determining”, or “in response to detecting”, depending on the context. Similarly, the phrase “if determined” or “if [the described condition or event] is detected” may be interpreted, depending on the context, to mean “once determined”, “in response to a determination”, “once [the described condition or event] is detected”, or “in response to detection of [the described condition or event]”.


Abstract

The present application relates to the technical field of terminals, and provides a multi-device collaboration method and a client. The method comprises: a first AR client presenting a first AR scene, wherein the first AR scene is a first collaborative scene in an AR mode; a first VR client presenting a first VR scene, wherein the first VR scene is the first collaborative scene in a VR mode; when a first real location of the first AR scene comprises a first object, the first VR client presenting the first object at a first virtual location of the first VR scene, wherein the first real location and the first virtual location are the same location in the first collaborative scene; and/or, when a second virtual location of the first VR scene comprises a second object, the first AR client presenting the second object at a second real location of the first AR scene, wherein the second virtual location and the second real location are the same location in the first collaborative scene. The technical solution provided in the present application can realize collaboration between an AR client and a VR client.

Description

Multi-device collaboration method and client
This application claims priority to the Chinese patent application No. 202211168295.4, entitled "Multi-device collaboration method and client", filed with the State Intellectual Property Office on September 23, 2022, the entire contents of which are incorporated herein by reference.
Technical field
This application relates to the field of Internet technology, and in particular to a multi-device collaboration method and client.
Background
With the continuous development of client and Internet technologies, multiple clients can collaborate with each other, meeting the needs of multi-user interaction such as games.
In the existing technology, the multiple clients may all be virtual reality (VR) clients. Each VR client can access a virtual collaborative scene and present the collaborative scene in VR mode to the user. When a virtual position in the collaborative scene includes a virtual object, each VR client can present the object at that virtual position in its own VR scene, so that the users of the VR clients can see the object. However, the collaboration mode among multiple VR clients is relatively limited, and the user experience is poor.
Summary
In view of this, this application provides a multi-device collaboration method and client, which can realize collaboration between an augmented reality (AR) client and a VR client, breaking the barrier between AR scenes and VR scenes.
To achieve the above objective, in a first aspect, an embodiment of this application provides a multi-device collaboration method, the method including:
a first AR client presents a first AR scene, where the first AR scene is a first collaborative scene in AR mode;
a first VR client presents a first VR scene, where the first VR scene is the first collaborative scene in VR mode;
when a first real position of the first AR scene includes a first object, the first VR client presents the first object at a first virtual position of the first VR scene, where the first real position and the first virtual position are the same position in the first collaborative scene; and/or, when a second virtual position of the first VR scene includes a second object, the first AR client presents the second object at a second real position of the first AR scene, where the second virtual position and the second real position are the same position in the first collaborative scene.
The first VR scene may be a virtual scene obtained by processing the first AR scene.
In this embodiment of this application, the first AR client and the first VR client can access the same first collaborative scene: the first AR client presents the first AR scene and the first VR client presents the first VR scene, that is, the first AR client and the first VR client can collaboratively present the first collaborative scene, where the first VR scene may be a virtual scene obtained by processing the first AR scene. When the first real position of the first AR scene includes the first object, the first VR client presents the first object at the first virtual position of the first VR scene; that is, the first VR client can synchronously present the first object from the first AR scene in the first VR scene, so that the user can see, from the first VR client, the first object from the first AR scene. When the second virtual position of the first VR scene includes the second object, the first AR client presents the second object at the second real position of the first AR scene; that is, the first AR client can synchronously present the second object from the first VR scene in the first AR scene, so that the user can see, from the first AR client, the second object from the first VR scene. Therefore, the first AR client and the first VR client can each present objects from the peer's view of the first collaborative scene, so that the user of the first AR client and the user of the first VR client can both see the object, realizing collaboration and interoperability between the AR client and the VR client.
In some implementations, the first object in the first AR scene may be a real object, a combined virtual-and-real object, or a virtual object, and the first object presented in the first VR scene may be a virtual object. In some implementations, the first object in the first AR scene may be the user of the first AR client, and the first object presented in the first VR scene may be an avatar corresponding to that user. If the first object is an object that moves synchronously with the first AR client (for example, the first AR client is strapped to the first object, or the first object carries the first AR client), the position information of the first object may be the same as the position information of the first AR client, and the rotation information of the first object may be the same as the rotation information of the first AR client. If the first object is not an object that moves synchronously with the first AR client, the first AR client can generate or collect at least one of the position information and the rotation information of the first object.
In some implementations, the second object in the first VR scene may be a virtual object, and the second object in the first AR scene may also be a virtual object. The first VR client can receive as user input, generate, or collect at least one of the position information and the rotation information of the second object.
In some implementations, the local coordinate system of the first AR client may be a third coordinate system, the local coordinate system of the first VR client may be a fourth coordinate system, and the common coordinate system shared by the first AR client and the first VR client may include a second coordinate system. Through the second coordinate system, the first AR client and the first VR client can convert the position information of the first object and the second object between the third coordinate system and the fourth coordinate system. Taking the first object as an example, the first AR client can present the first object at the first real position in the first AR scene based on the position information of the first object in the third coordinate system, and the first VR client can present the first object in the first VR scene based on the position information of the first object in the fourth coordinate system.
In some implementations, through the second coordinate system, the first AR client and the first VR client can also convert the rotation information of the first object and the second object between the third coordinate system and the fourth coordinate system. Taking the first object as an example, the first AR client can present the first object at the first real position in the first AR scene based on the rotation information and position information of the first object in the third coordinate system, and the first VR client can present the first object in the first VR scene based on the rotation information and position information of the first object in the fourth coordinate system, so that the angle of the first object in the first AR scene matches the rotation angle of the first AR client, and the angle of the first object in the first VR scene matches the rotation angle of the first VR client, improving the realism of presenting the first object and the user experience.
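The position conversion between the clients' local coordinate systems via the common second coordinate system can be sketched as follows. This is a minimal illustration only, not the patented implementation: the homogeneous-transform representation, the yaw-only rotation, the calibration values, and the function names are all assumptions made for the example.

```python
import numpy as np

def make_transform(yaw_deg, translation):
    """Build a 4x4 homogeneous transform (local frame -> common frame)
    from a yaw rotation about the z-axis, in degrees, and a 3D translation."""
    theta = np.radians(yaw_deg)
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = translation
    return T

def convert_position(pos_local_a, a_to_common, common_to_b):
    """Map a point from client A's local coordinate system into client B's
    local coordinate system via the shared (second) coordinate system."""
    p = np.append(pos_local_a, 1.0)   # homogeneous coordinates
    p_common = a_to_common @ p        # e.g. third (AR-local) -> second (common)
    p_b = common_to_b @ p_common      # second (common) -> fourth (VR-local)
    return p_b[:3]

# Illustrative calibration: the AR client's local frame is rotated 90 degrees
# and offset by 1 m from the common frame; the VR client's frame is offset by 2 m.
ar_to_common = make_transform(90.0, [1.0, 0.0, 0.0])
common_to_vr = np.linalg.inv(make_transform(0.0, [0.0, 2.0, 0.0]))

first_object_in_ar = np.array([0.5, 0.0, 0.0])
print(convert_position(first_object_in_ar, ar_to_common, common_to_vr))
```

Rotation information could be carried through the same transforms by composing the rotation parts of the matrices; only the translation path is shown here.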
In some implementations, the method further includes any one or more of the following:
when a first event corresponding to the first object occurs in the first AR scene, the first VR client presents the first event in the first VR scene; or,
when a second event corresponding to the second object occurs in the first VR scene, the first AR client presents the second event in the first AR scene.
In some implementations, the first event includes any one or more of the following:
a status update event of the first object; or,
a motion event of the first object; or,
an interaction event between the first object and other objects.
In some implementations, the motion event of the first object includes the first object moving from the first real position to a third real position in the first AR scene, and the first VR client presenting the first event in the first VR scene includes:
the first VR client presents the motion process of the first object moving from the first virtual position to a third virtual position in the first VR scene, where the third virtual position and the third real position are the same position in the first collaborative scene.
In some implementations, the second event includes any one or more of the following:
a status update event of the second object; or,
a motion event of the second object; or,
an interaction event between the second object and other objects.
By synchronously presenting at least one of the first event and the second event, the first VR client and the first AR client improve the realism of the game scene they collaboratively present, thereby improving the user experience.
In some implementations, the motion event of the second object includes the second object moving from the second virtual position to a fourth virtual position in the first VR scene, and the first AR client presenting the second event in the first AR scene includes:
the first AR client presents the motion process of the second object moving from the second real position to a fourth real position in the first AR scene, where the fourth real position and the fourth virtual position are the same position in the first collaborative scene.
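Presenting a motion event implies that the peer client animates the object between the two synchronized positions rather than teleporting it. A minimal sketch of such interpolation is shown below; the start/end positions and step count are made up for illustration, and the patent does not specify how the motion process is rendered.

```python
import numpy as np

def interpolate_motion(start, end, steps):
    """Yield intermediate positions so that the peer client can render the
    synchronized movement as a continuous motion process rather than a jump."""
    start, end = np.asarray(start, dtype=float), np.asarray(end, dtype=float)
    for i in range(1, steps + 1):
        t = i / steps
        yield (1.0 - t) * start + t * end

# The second object moves from the second virtual position to the fourth one;
# the AR client replays the same path between the corresponding real positions.
path = list(interpolate_motion([0.0, 0.0, 0.0], [2.0, 1.0, 0.0], steps=4))
print(path[-1])  # the final step lands exactly at the target position
```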
In some implementations, the first object is a real object or a combined virtual-and-real object, and the first VR client presenting the first object at the first virtual position of the first VR scene includes:
the first VR client presents the avatar corresponding to the first object at the first virtual position of the first VR scene.
In some implementations, the second object is a virtual object, and the first AR client presenting the second object at the second real position of the first AR scene includes:
the first AR client presents, at the second real position of the first AR scene, the second object or a combined virtual-and-real image corresponding to the second object.
In some implementations, the first object is the user of the first AR client, and/or the second object is a player-controlled character controlled by the first VR client. That is, the user of the first AR client can see and interact with the player-controlled character controlled by the first VR client in the first AR scene, and the player-controlled character controlled by the first VR client can see and interact with the user of the first AR client in the first VR scene, realizing mutual visibility and interaction between the user of the VR client and the user of the AR client in the same collaborative scene and breaking the barrier between AR scenes and VR scenes.
In some implementations, the first AR scene includes at least part of a real scene, and the first VR scene includes a virtual scene obtained by processing the at least part of the real scene.
In some implementations, the first VR scene may be a virtual scene obtained by processing the real scene in the first AR scene through a view synthesis model such as a neural radiance fields (NeRF) network. The view synthesis model can re-model and re-render the real scene, thereby outputting a virtual scene corresponding to that real scene.
In some implementations, before the first AR client presents the first AR scene, where the first AR scene is the first collaborative scene in AR mode, the method further includes:
the first AR client accesses the first collaborative scene based on the real position of the first AR client in a first coordinate system, and the first VR client accesses the first collaborative scene based on the real position of the first VR client in the first coordinate system, where the real position of the first AR client in the first coordinate system and the real position of the first VR client in the first coordinate system are in the same area of interest (AOI) region; or,
the first AR client accesses the first collaborative scene based on the real position of the first AR client in a first coordinate system, and the first VR client accesses the first collaborative scene based on a virtual position of the first VR client in the first coordinate system, where the real position of the first AR client in the first coordinate system and the virtual position of the first VR client in the first coordinate system are in the same AOI region.
The first coordinate system may be a common coordinate system shared by the first AR client and the first VR client. The first AR client can access the first collaborative scene based on its real position in the first coordinate system, and the first VR client can access the first collaborative scene based on its real position or virtual position in the first coordinate system, so that regardless of whether the first AR client and the first VR client are at the same location in the real world, they can access the same first collaborative scene, realizing cross-region, cross-device collaboration and improving the flexibility of multi-device collaboration and the user experience.
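The AOI-based access rule above can be sketched as a simple region-membership check. The square-cell model, the class and function names, and the coordinates are illustrative assumptions; the patent does not define the shape of an AOI region.

```python
from dataclasses import dataclass

@dataclass
class AOIRegion:
    """An area-of-interest (AOI) cell of the first coordinate system,
    modelled here as an axis-aligned square for illustration."""
    x_min: float
    y_min: float
    size: float

    def contains(self, x: float, y: float) -> bool:
        return (self.x_min <= x < self.x_min + self.size and
                self.y_min <= y < self.y_min + self.size)

def can_join_same_scene(region: AOIRegion, ar_pos, vr_pos) -> bool:
    """Both clients join the same collaborative scene only if their positions
    (real for the AR client, real or virtual for the VR client) fall in one AOI region."""
    return region.contains(*ar_pos) and region.contains(*vr_pos)

region = AOIRegion(0.0, 0.0, 100.0)
print(can_join_same_scene(region, (10.0, 20.0), (55.0, 99.0)))  # → True
```

Because the VR client may use a virtual position, it can "stand" in the AOI region without being there physically, which is what enables the cross-region collaboration described above.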
In some implementations, the method further includes:
the first AR client presents a third object in the first AR scene, and the first VR client presents an avatar corresponding to the third object in the first VR scene, where the third object is an object included in a second AR scene presented by a second AR client, and the third object is a real object or a combined virtual-and-real object.
In some implementations, the method further includes:
the first AR client presents a fourth object in the first AR scene, and the first VR client presents the fourth object in the first VR scene, where the fourth object is an object included in a second VR scene presented by a second VR client, and the fourth object is a virtual object.
In some implementations, the first AR client presenting the second object at the second real position of the first AR scene includes:
the first AR client obtains the position of the second object in a second coordinate system, where the second coordinate system is a common coordinate system;
the first AR client converts the position of the second object in the second coordinate system into the second real position of the second object in a third coordinate system, where the third coordinate system is the local coordinate system corresponding to the first AR client.
In some implementations, the first VR client presenting the first object at the first virtual position of the first VR scene includes:
the first VR client obtains the position of the first object in a second coordinate system, where the second coordinate system is a common coordinate system;
the first VR client converts the position of the first object in the second coordinate system into the first virtual position of the first object in a fourth coordinate system, where the fourth coordinate system is the local coordinate system corresponding to the first VR client.
In some implementations, the method further includes:
the first AR client converts the first real position of the first object in the third coordinate system into the position of the first object in the second coordinate system;
the first AR client sends the position of the first object in the second coordinate system to a service server;
and the first VR client obtaining the position of the first object in the second coordinate system includes:
the first VR client obtains the position of the first object in the second coordinate system from the service server.
In some implementations, the method further includes:
the first VR client converts the second virtual position of the second object in the fourth coordinate system into the position of the second object in the second coordinate system;
the first VR client sends the position of the second object in the second coordinate system to the service server;
and the first AR client obtaining the position of the second object in the second coordinate system includes:
the first AR client obtains the position of the second object in the second coordinate system from the service server.
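The server-relayed synchronization in the steps above can be sketched as a minimal publish/fetch relay. The class and method names are assumptions for illustration; a real service server would also handle sessions, AOI filtering, and rotation information.

```python
class ServiceServer:
    """Minimal relay: clients publish object positions expressed in the
    common (second) coordinate system; peers fetch them by object id."""
    def __init__(self):
        self._positions = {}

    def publish(self, object_id, position_common):
        self._positions[object_id] = tuple(position_common)

    def fetch(self, object_id):
        return self._positions.get(object_id)

server = ServiceServer()
# The AR client converts the first object's local position into the common
# frame (conversion omitted here) and uploads it.
server.publish("first_object", (1.0, -1.5, 0.0))
# The VR client later fetches it and converts it into its own local frame.
print(server.fetch("first_object"))  # → (1.0, -1.5, 0.0)
```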
In a second aspect, an embodiment of this application provides a multi-device collaboration method applied to a first AR device, the method including:
presenting a first AR scene, where the first AR scene is a first collaborative scene in AR mode;
when a second virtual position of a first VR scene includes a second object, presenting the second object at a second real position of the first AR scene, where the second virtual position and the second real position are the same position in the first collaborative scene;
where the first VR scene is a scene presented by a first VR client, and the first VR scene is the first collaborative scene in VR mode.
In some implementations, the method further includes:
when a second event corresponding to the second object occurs in the first VR scene, the first AR client presents the second event in the first AR scene.
In some implementations, the second event includes any one or more of the following:
a status update event of the second object; or,
a motion event of the second object; or,
an interaction event between the second object and other objects.
In a third aspect, an embodiment of this application provides a multi-device collaboration method applied to a first VR device, the method including:
presenting a first VR scene, where the first VR scene is a first collaborative scene in VR mode;
when a first real position of a first AR scene includes a first object, presenting the first object at a first virtual position of the first VR scene, where the first real position and the first virtual position are the same position in the first collaborative scene;
where the first AR scene is a scene presented by a first AR client, and the first AR scene is the first collaborative scene in AR mode.
In some implementations, the method further includes:
when a first event corresponding to the first object occurs in the first AR scene, presenting the first event in the first VR scene.
In some implementations, the first event includes any one or more of the following:
a status update event of the first object; or,
a motion event of the first object; or,
an interaction event between the first object and other objects.
In a fourth aspect, an embodiment of this application provides a system including a first AR device and a first VR device;
the first AR client presents a first AR scene, where the first AR scene is a first collaborative scene in AR mode;
the first VR client presents a first VR scene, where the first VR scene is the first collaborative scene in VR mode;
when a first real position of the first AR scene includes a first object, the first VR client presents the first object at a first virtual position of the first VR scene, where the first real position and the first virtual position are the same position in the first collaborative scene; and/or, when a second virtual position of the first VR scene includes a second object, the first AR client presents the second object at a second real position of the first AR scene, where the second virtual position and the second real position are the same position in the first collaborative scene.
In a fifth aspect, an embodiment of this application provides an apparatus that has the function of implementing the client behavior in any possible implementation of the first aspect, or of implementing the client behavior in any possible implementation of the second aspect. The function can be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules or units corresponding to the above functions, for example, a transceiver module or unit, a processing module or unit, or an acquisition module or unit.
In a sixth aspect, an embodiment of this application provides a client, including a memory and a processor, where the memory is configured to store a computer program, and the processor is configured to, when invoking the computer program, perform the method described in any one of the first aspect or the method described in any one of the second aspect.
In a seventh aspect, an embodiment of this application provides a chip system, where the chip system includes a processor coupled to a memory, and the processor executes a computer program stored in the memory to implement the method described in any one of the first aspect or the method described in any one of the second aspect.
The chip system may be a single chip, or a chip module composed of multiple chips.
In an eighth aspect, an embodiment of this application provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, implements the method described in any one of the first aspect or the method described in any one of the second aspect.
In a ninth aspect, an embodiment of this application provides a computer program product which, when run on a client, causes the client to perform the method described in any one of the first aspect or the method described in any one of the second aspect.
It can be understood that, for the beneficial effects of the second to ninth aspects, reference may be made to the relevant description in the first aspect, and details are not repeated here.
Brief Description of the Drawings
Figure 1 is a schematic structural diagram of a client provided by an embodiment of the present application;
Figure 2 is a schematic structural diagram of a server provided by an embodiment of the present application;
Figure 3 is a schematic diagram of a game scene provided by an embodiment of the present application;
Figure 4 is a schematic diagram of another game scene provided by an embodiment of the present application;
Figure 5 is a schematic structural diagram of a multi-device collaboration system provided by an embodiment of the present application;
Figure 6 is a schematic structural diagram of another multi-device collaboration system provided by an embodiment of the present application;
Figure 7 is a schematic structural diagram of another multi-device collaboration system provided by an embodiment of the present application;
Figure 8 is a flow chart of a multi-device collaboration method provided by an embodiment of the present application;
Figure 9 is a schematic diagram of another game scene provided by an embodiment of the present application;
Figure 10 is a flow chart of a method for accessing a game scene provided by an embodiment of the present application;
Figure 11 is a schematic diagram of a login interface provided by an embodiment of the present application;
Figure 12 is a schematic diagram of a camp selection interface provided by an embodiment of the present application;
Figure 13 is a schematic diagram of a game lobby interface provided by an embodiment of the present application;
Figure 14 is a flow chart of a method for presenting an AR scene provided by an embodiment of the present application;
Figure 15 is a flow chart of a method for presenting a VR scene provided by an embodiment of the present application;
Figure 16 is a flow chart of a method for presenting an AR scene provided by an embodiment of the present application;
Figure 17 is a flow chart of a method for presenting a VR scene provided by an embodiment of the present application;
Figure 18 is a schematic diagram of an AR scene provided by an embodiment of the present application;
Figure 19 is a schematic diagram of a VR scene provided by an embodiment of the present application;
Figure 20 is a flow chart of a method for presenting game objects provided by an embodiment of the present application;
Figure 21 is a flow chart of another method for presenting game objects provided by an embodiment of the present application;
Figure 22 is a flow chart of another method for presenting game objects provided by an embodiment of the present application;
Figure 23 is a schematic diagram of another AR scene provided by an embodiment of the present application;
Figure 24 is a schematic diagram of another VR scene provided by an embodiment of the present application;
Figure 25 is a flow chart of an object interaction method provided by an embodiment of the present application;
Figure 26 is a schematic diagram of another AR scene provided by an embodiment of the present application;
Figure 27 is a schematic diagram of another VR scene provided by an embodiment of the present application.
Detailed Description of Embodiments
The multi-device collaboration method provided in the embodiments of the present application can be applied to a client and a server.
In some embodiments, the clients may include an AR client and a VR client. The VR client can use computer technology to simulate and generate a three-dimensional virtual world that provides the user with visual, auditory, tactile, and other sensory simulations; with the help of special input/output devices, the user can interact naturally with the virtual world. The AR client can overlay the virtual world onto the real world, producing an effect in which the real and the virtual are hard to distinguish, and the user can interact with the virtual world and/or the real world.
In some embodiments, the AR client and the VR client may be implemented as a mobile phone, a tablet computer, a wearable device (such as smart glasses or a smart helmet), a vehicle-mounted device, a laptop computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), or other devices. The embodiments of this application place no restriction on the specific types of the AR client and the VR client.
Please refer to Figure 1, which is a schematic structural diagram of a client 100 provided by this application. The client 100 may include a processor A 110, an external memory interface 120, an internal memory 121, an antenna 1, a communication module A 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a button 190, a motor 191, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
It can be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the client 100. In other embodiments of the present application, the client 100 may include more or fewer components than shown in the figure, or combine some components, or split some components, or use a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor A 110 may include one or more processing units. For example, the processor A 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the client 100. The controller can generate operation control signals based on instruction operation codes and timing signals, completing the control of instruction fetching and execution.
A memory may also be provided in the processor A 110 for storing instructions and data. In some embodiments, the memory in the processor A 110 is a cache. This memory can hold instructions or data that the processor A 110 has just used or uses cyclically. If the processor A 110 needs to use the instruction or data again, it can be called directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor A 110, thereby improving the efficiency of the system.
The communication module A 160 can be used for communication between the internal modules of the client 100, or for communication between the client 100 and other external electronic devices. For example, if the client 100 communicates with other electronic devices via a wireless connection, the communication module A 160 can provide wireless communication solutions applied to the client 100 including 2G/3G/4G/5G, and/or wireless communication solutions applied to the client 100 including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite systems (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
In some embodiments, the antenna 1 of the client 100 is coupled with the communication module A 160, so that the client 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
The client 100 implements the display function through the GPU, the display screen 194, the application processor, and so on. The GPU is a microprocessor for image processing and connects the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor A 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc. In some embodiments, the client 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The client 100 can implement the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and so on.
The ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, light is transmitted through the lens to the photosensitive element of the camera, the optical signal is converted into an electrical signal, and the photosensitive element passes the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image, and can optimize parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. An object passes through the lens to produce an optical image that is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and then passes the electrical signal to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the client 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
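As an illustrative aside (not part of the application text), the format conversion step performed by the DSP can be pictured with the standard BT.601 full-range equations for mapping one YUV sample to RGB; the function below is a hypothetical sketch rather than the claimed implementation:

```python
def yuv_to_rgb(y: float, u: float, v: float) -> tuple:
    """Convert one BT.601 full-range YUV sample (0-255, U/V centered at 128) to RGB."""
    u -= 128.0
    v -= 128.0
    r = y + 1.402 * v
    g = y - 0.344136 * u - 0.714136 * v
    b = y + 1.772 * u
    # Clamp each channel to the valid 8-bit range.
    clamp = lambda x: max(0, min(255, round(x)))
    return clamp(r), clamp(g), clamp(b)
```

For example, a neutral gray sample (Y = 128, U = V = 128) maps to the gray RGB pixel (128, 128, 128).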
In some embodiments, the client 100 can superimpose a virtual image on the image captured by the camera 193 and display the superimposed image on the display screen 194, thereby achieving a display effect combining the virtual and the real, such as augmented reality. Alternatively, in other embodiments, the display screen 194 is a transparent display screen; the client 100 can project a virtual image onto the display screen 194, and when the user's line of sight passes through the display screen 194, the projected virtual image is superimposed on the real image, thereby achieving a display effect combining the virtual and the real, such as augmented reality.
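The superimposition described above amounts to alpha blending a rendered virtual layer over the camera image before display. The following is a minimal per-pixel sketch; the function name and the [0, 1] alpha convention are assumptions made for illustration, not details taken from the application:

```python
def blend_pixel(real, virtual, alpha):
    """Alpha-blend one virtual RGB pixel over one real (camera) RGB pixel.

    alpha is in [0, 1]; 1.0 means the virtual content fully covers the real pixel,
    0.0 means the camera image shows through unchanged.
    """
    return tuple(round(alpha * v + (1 - alpha) * r) for r, v in zip(real, virtual))
```

Applying this to every pixel of the camera frame yields the combined virtual-and-real image that is then shown on the display screen 194.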
The external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the client 100. The external memory card communicates with the processor A 110 through the external memory interface 120 to implement a data storage function, for example, saving files such as music and videos on the external memory card.
The internal memory 121 may be used to store computer-executable program code, where the executable program code includes instructions. The processor A 110 executes various functional applications and data processing of the client 100 by running the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area can store an operating system and an application program required by at least one function (such as a sound playback function or an image playback function). The data storage area can store data created during the use of the client 100 (such as audio data and a phone book). In addition, the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The client 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, the application processor, and so on.
The audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor A 110, or some functional modules of the audio module 170 may be provided in the processor A 110.
The speaker 170A, also called a "horn", is used to convert an audio electrical signal into a sound signal. The client 100 can listen to music or listen to a hands-free call through the speaker 170A.
The receiver 170B, also called an "earpiece", is used to convert an audio electrical signal into a sound signal. When the client 100 answers a call or a voice message, the voice can be heard by bringing the receiver 170B close to the ear.
The microphone 170C, also called a "mic", is used to convert a sound signal into an electrical signal. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C. The client 100 may be provided with at least one microphone 170C. In other embodiments, the client 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the client 100 may be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording functions, and so on.
The headphone interface 170D is used to connect wired headphones. The headphone interface 170D may be a USB interface, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
Please refer to Figure 2, which is a schematic structural diagram of a server 200 provided by this application. The server 200 may include a memory 210, a processor B 220, a communication module B 230, and so on.
The processor B 220 may include one or more processing units, and the memory 210 is used to store program code and data. In the embodiments of the present application, the processor B 220 can execute computer-executable instructions stored in the memory 210 to control and manage the actions of the server 200.
The communication module B 230 can be used to implement communication between the server 200 and other devices such as the client 100.
It should be understood that, apart from the components or modules listed in Figure 2, the embodiments of the present application do not specifically limit the structure of the server 200. In other embodiments of the present application, the server 200 may also include more or fewer components than shown in the figure, or combine some components, or split some components, or use a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
To facilitate understanding of the technical solutions in the embodiments of the present application, the application scenarios of the embodiments of the present application are first introduced below.
Client-server is an important service architecture pattern. In this architecture, a client can connect to a server through a network; the client can request network services from the server, and the server can respond to the request and provide the client with the requested data. Multiple clients can communicate with one another through the server, thereby presenting a collaborative scene to their users. A collaborative scene is a scene presented by multiple clients at the same time. In some implementations, multiple clients can simultaneously present at least part of the information in the collaborative scene, so that the users of the multiple clients can see that information at the same time. In some implementations, when any one of the multiple clients performs an operation on the scene that causes a change in the scene, the other clients also present this change synchronously. In some implementations, collaborative scenes may include game scenes, entertainment scenes, or conference scenes; of course, in practical applications there may be more or fewer kinds of collaborative scenes, for example, scenes applied to other fields such as industrial production.
Taking a game-based collaborative scene as an example, each user's client can obtain game data from the server and, based on the game data, present to the user the game scene and objects such as one or more player characters distributed in the game scene. The game scene can be used to indicate the environment, area, terrain, buildings, and so on where the game takes place, and the objects in the game scene can interact with one another in the scene. In some implementations, an object may be controlled by a user, such as a player character (PC). In some implementations, an object may be a non-user-controlled object, such as a non-player character (NPC). In some implementations, an object may be another object that can interact with the user, such as various game props including vehicles, tools, and treasure chests. In some implementations, at least one of the interacting objects is a user-controlled object; a user can control an object in the game scene to interact with another object, for example, the user's player-controlled character dances with another user's player-controlled character, the user's player-controlled character opens a treasure chest, or the user's player-controlled character and another user's player-controlled character drive the same vehicle together. In some implementations, none of the interacting objects is user-controlled; for example, the game scene includes two kinds of animals, a "wolf" and a "sheep", neither of which is controlled by a user, and when the "wolf" encounters the "sheep", it can hunt the "sheep".
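The synchronization behavior described above, in which one client's action is reflected on every other client, can be sketched as a server that keeps the authoritative object state and broadcasts every change to all connected clients. The class and method names below are hypothetical; the application does not specify this design:

```python
class SceneServer:
    """Toy server holding authoritative state for a collaborative scene."""

    def __init__(self):
        self.objects = {}   # object_id -> current state dict
        self.clients = []   # per-client update callbacks

    def connect(self, on_update):
        """Register a client; it will receive every subsequent state change."""
        self.clients.append(on_update)

    def apply_change(self, object_id, new_state):
        """Record a change and broadcast it so every client stays in sync."""
        self.objects[object_id] = new_state
        for notify in self.clients:
            notify(object_id, new_state)
```

For example, when one client's character opens the treasure chest, a call such as `apply_change("chest", {"open": True})` pushes the new state to every registered client, so each game screen shows the chest opening.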
In some implementations, each user's client corresponds to a different direction and position in the game scene. The position corresponding to a client can be understood as the position of that client's user in the game scene, and the direction corresponding to a client can be understood as the direction in which that client's user observes the game scene. When a client presents the game scene to its user, the client can, based on its corresponding position and direction, present to the user at least the part of the game scene corresponding to that position and direction, together with one or more objects included in that part of the scene. That is, based on the position of the client's user in the game scene and the direction in which that user observes, the user is shown at least the part of the game scene and the objects that would be observed at that position and from that direction.
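A minimal way to picture this position- and direction-dependent presentation is a horizontal field-of-view test: an object is shown only if the angle from the client's viewing direction to the object falls within half the field of view. The flat 2-D geometry and the 90-degree default are assumptions made for this sketch, not details from the application:

```python
import math

def in_view(client_pos, view_dir_deg, obj_pos, fov_deg=90.0):
    """Return True if obj_pos is inside the client's horizontal field of view."""
    dx = obj_pos[0] - client_pos[0]
    dy = obj_pos[1] - client_pos[1]
    angle_to_obj = math.degrees(math.atan2(dy, dx))
    # Smallest signed difference between the two angles, in (-180, 180].
    diff = (angle_to_obj - view_dir_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0
```

Under this sketch, two clients standing at different positions or facing different directions naturally see different subsets of the same scene, which is the behavior illustrated by Figures 3 and 4 below.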
For example, in a multiplayer battle game, the game screen displayed by client A may be as shown in Figure 3. The game screen of client A is a third-person-perspective screen and can be understood as the game scene that the user of client A can see at the current position and direction. The game screen of client A includes the player-controlled character A 301 corresponding to client A, the player-controlled character B 302 corresponding to another client B, and a treasure chest 303, where the player-controlled character B 302 is to the northwest in front of the player-controlled character A 301, and the treasure chest 303 is to the northeast in front of the player-controlled character A 301. The game screen displayed by the other client B may be as shown in Figure 4. The game screen of client B is a third-person-perspective screen and can be understood as the game scene that the user of client B can see at the current position and direction. The game screen of client B includes the player-controlled character B 302 corresponding to client B and the treasure chest 303, where the treasure chest 303 is due east of the player-controlled character B 302. Comparing Figure 3 and Figure 4, although client A and client B are in the same game scene, the direction and position corresponding to client A are different from those corresponding to client B; therefore, the game screen of client A is also different from the game screen of client B. If the user of client B controls the player-controlled character B 302 to move, the movement of the player-controlled character B 302 can also be displayed synchronously on the game screen of client A; that is, the player-controlled character A 301 observes the player-controlled character B 302 moving. If the player-controlled character B 302 opens the treasure chest 303, the process of the player-controlled character B 302 opening the treasure chest 303 and the state change of the treasure chest 303 can be displayed synchronously on the game screens of client A and client B; that is, the player-controlled character A 301 observes the player-controlled character B 302 opening the treasure chest 303.
It can be understood that the game screens shown in Figures 3 and 4 are both third-person-perspective screens; therefore, the game screen shown in Figure 3 includes the player-controlled character A 301 corresponding to client A, and the game screen shown in Figure 4 includes the player-controlled character B 302 corresponding to client B. However, if the game screens shown in Figures 3 and 4 were both first-person-perspective screens, the game screen shown in Figure 3 would not include the player-controlled character A 301 corresponding to client A, and the game screen shown in Figure 4 would not include the player-controlled character B 302 corresponding to client B. This is more consistent with the visual experience that, under normal circumstances, people cannot see themselves, giving users an immersive game experience.
Please refer to Figure 5. An embodiment of the present application provides a multi-device collaboration system, which can realize the collaborative scenes described above. The system may include multiple clients 100 and a server 200, and each client 100 is connected to the server 200 through a network 300.
The multiple clients 100 include an AR client 510 and a VR client 520. The AR client 510 may include a mobile phone 530 and smart glasses 540, and the VR client 520 may include a smart large screen 550 and a laptop 560. It should be noted that, in practical applications, the AR client 510 and the VR client 520 may also include more or fewer devices; the mobile phone 530 and the smart glasses 540 may also serve as the VR client 520, and the smart large screen 550 and the laptop 560 may also serve as the AR client 510. The AR client 510 can present an AR scene to the user, and the VR client 520 can present a VR scene to the user, where the AR scene and the VR scene can be different presentations of the same collaborative scene on the AR client 510 and the VR client 520 respectively: the AR scene can be a scene combining the virtual and the real in which a real scene and a virtual scene are superimposed, and the VR scene can be a virtual scene. In some implementations, the same object can be a virtual object, a real object, or an object combining the virtual and the real in the AR scene, while in the VR scene it is a virtual object. For example, the AR scene includes a real person, and in the VR scene that person can be presented as a virtual humanoid image corresponding to the real person. For another example, the AR scene includes a horse combining the virtual and the real, composed of a real support frame and a virtual projection; in the VR scene, the horse can be presented as a virtual horse. For another example, the AR scene includes a virtual humanoid character; since the humanoid character is itself a virtual object, it is also presented as a virtual humanoid character in the VR scene.
The server 200 can provide the multiple clients 100 with the data required for multi-device collaboration. For example, the server 200 can provide each client 100 with the position information and direction information corresponding to each client 100, provide each client 100 with the scene data used to present the collaborative scene, provide each client 100 with the state data of the objects in the collaborative scene, and so on.
In some implementations, the server 200 may be omitted, and its functions may be integrated into at least one of the multiple clients 100. In some implementations, any one of the multiple clients 100 (for example, any AR client 510) can act as both a client and a server; the other clients among the multiple clients 100 can connect to it through the network 300, and it can provide the data required for multi-device collaboration to the multiple clients 100 in the same or a similar manner as the server 200. For example, still taking a game-based collaborative scene as an example, suppose the multiple clients 100 include multiple mobile phones 530 and a laptop 560, where the laptop 560 acts as both a client and a server. The user of the laptop 560 creates a game scene on the laptop 560, and the multiple mobile phones 530 access that game scene as clients, so that the users of each mobile phone 530 and of the laptop 560 can all play in the game scene created by the laptop 560.
Please refer to FIG. 6, which is a structural diagram of a multi-device collaboration system provided by an embodiment of the present application. The multi-device collaboration system may include a client 100, a gateway 600, and a server 200.
The gateway 600 is used to implement communication between the client 100 and the server 200. The gateway 600 may be a network device in the aforementioned network 300.
The client 100 can establish a communication connection with the server 200 through the gateway 600, and can communicate with the server 200 through that connection.
In some implementations, the gateway 600 may be a gateway based on a service mesh. A service mesh is an infrastructure layer that can be used to handle inter-service communication, and may be composed of a series of lightweight network proxies.
In some implementations, the communication connection may be a long-lived connection, that is, once established, the connection can be maintained for a long time or permanently, so that when the client 100 subsequently communicates with the server 200, no new connection needs to be established, saving the time and resources consumed by connection requests and reducing communication latency.
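The benefit of a long-lived connection can be sketched with a minimal TCP echo exchange (the server behavior and message contents below are illustrative assumptions, not the actual gateway protocol of this application): the client sends several requests over one connection without reconnecting in between.

```python
import socket
import threading

def run_echo_server(ready: threading.Event, port_holder: list) -> None:
    """Minimal echo server: one long-lived connection serves many requests."""
    srv = socket.create_server(("127.0.0.1", 0))
    port_holder.append(srv.getsockname()[1])
    ready.set()
    conn, _ = srv.accept()          # accept a single client
    with conn:
        while True:
            data = conn.recv(1024)
            if not data:            # client closed the long-lived connection
                break
            conn.sendall(b"echo:" + data)
    srv.close()

def client_two_requests(port: int) -> list:
    """Send two requests over ONE connection -- no reconnect in between."""
    replies = []
    with socket.create_connection(("127.0.0.1", port)) as sock:
        for msg in (b"pos-update", b"state-update"):
            sock.sendall(msg)
            replies.append(sock.recv(1024))   # wait for each reply in turn
    return replies

ready = threading.Event()
port_holder: list = []
t = threading.Thread(target=run_echo_server, args=(ready, port_holder), daemon=True)
t.start()
ready.wait()
replies = client_two_requests(port_holder[0])
```

Because the client waits for each reply before sending the next request, both exchanges reuse the same socket; only the first request pays the connection-establishment cost.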
In some implementations, the gateway 600 can be deployed in containers based on Docker, enabling dynamic scaling out and in to handle message routing and forwarding under massive request volumes.
In some implementations, the server 200 may include a first synchronization module 210, a second synchronization module 220, and a public service module 230.
The first synchronization module 210 can be used to implement area-of-interest (AOI) synchronization of the multiple clients 100 in a first coordinate system. An AOI can be used to indicate an area-shaped geographic entity on a map, such as a residential community, a university, an office building, an industrial park, a shopping mall, a hospital, a scenic spot, or a stadium. Taking a residential community as an example, the AOI of the community can be used to indicate the outline and area of the community. The first coordinate system may be a coordinate system in the real world, and is a common (shared) coordinate system.
In some implementations, the first coordinate system may include a GPS coordinate system.
In some implementations, the first synchronization module 210 can serialize the latitude and longitude data of the map using a geohash spatial index algorithm, thereby dividing the map into different regions, each of which can serve as an AOI region.
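The geohash serialization mentioned above can be sketched as follows. This is the standard geohash encoding (not code from this application): longitude and latitude are bisected alternately, the resulting bits are interleaved, and every 5 bits become one base32 character, so points sharing a prefix fall into the same rectangular cell and the prefix can serve as an AOI region identifier.

```python
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"  # geohash alphabet (no a, i, l, o)

def geohash_encode(lat: float, lon: float, precision: int = 6) -> str:
    """Bisect longitude and latitude alternately, interleave the bits, and
    pack every 5 bits into one base32 character."""
    lat_lo, lat_hi = -90.0, 90.0
    lon_lo, lon_hi = -180.0, 180.0
    code = []
    use_lon, bit_count, value = True, 0, 0   # geohash starts with a longitude bit
    while len(code) < precision:
        if use_lon:
            mid = (lon_lo + lon_hi) / 2
            if lon >= mid:
                value, lon_lo = (value << 1) | 1, mid
            else:
                value, lon_hi = value << 1, mid
        else:
            mid = (lat_lo + lat_hi) / 2
            if lat >= mid:
                value, lat_lo = (value << 1) | 1, mid
            else:
                value, lat_hi = value << 1, mid
        use_lon = not use_lon
        bit_count += 1
        if bit_count == 5:                   # 5 bits -> one base32 character
            code.append(BASE32[value])
            bit_count, value = 0, 0
    return "".join(code)
```

Shortening the precision widens the cell, so a coarse prefix gives a coarse AOI region and a longer prefix gives a finer one.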
In some implementations, the first synchronization module 210 can determine the AOI region in which a client 100 is located, as well as the clients 100 included in each AOI region. In some implementations, one AOI region may correspond to one or more collaborative scenes, and one collaborative scene may correspond to one or more AOI regions; when a client is located in a given AOI region, it can access the collaborative scene corresponding to that AOI region.
For example, in a multiplayer battle game, the server 200 configures three AOI regions through the first synchronization module 210: urban area A, urban area B, and urban area C, where each urban area can serve as an independent game scene. When the client 100 is located in urban area A, the client 100 can access the corresponding game scene.
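The bookkeeping behind this example can be sketched as a small directory that tracks which clients are in which AOI region and which scenes each region grants access to (class, region, and scene names below are illustrative, not from the application):

```python
from collections import defaultdict

# Hypothetical mapping from AOI region ids (e.g. geohash prefixes or named
# urban areas) to the collaborative scenes they grant access to.
REGION_TO_SCENES = {
    "city-A": ["battle-scene-A"],
    "city-B": ["battle-scene-B"],
    "city-C": ["battle-scene-C"],
}

class AoiDirectory:
    """Tracks clients per AOI region and the scenes each client may join."""

    def __init__(self, region_to_scenes):
        self.region_to_scenes = region_to_scenes
        self.region_members = defaultdict(set)   # region id -> set of client ids
        self.client_region = {}                  # client id -> region id

    def update_client(self, client_id, region):
        """Move a client into a region, removing it from its old region."""
        old = self.client_region.get(client_id)
        if old is not None:
            self.region_members[old].discard(client_id)
        self.client_region[client_id] = region
        self.region_members[region].add(client_id)

    def scenes_for(self, client_id):
        """Scenes the client may access, given its current AOI region."""
        return self.region_to_scenes.get(self.client_region.get(client_id), [])

directory = AoiDirectory(REGION_TO_SCENES)
directory.update_client("phone-530", "city-A")
```

When a client's reported position moves it into a different AOI region, `update_client` keeps both lookup directions (region to members, client to region) consistent.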
The second synchronization module 220 can be used to implement AOI synchronization of the multiple clients 100 in a second coordinate system, and the synchronization precision of the second synchronization module 220 may be higher than that of the first synchronization module 210. Specifically, the second synchronization module 220 can be used to more precisely determine the position of a client 100 in the collaborative scene and the relative positions of different clients 100 in the scene, and then to detect whether a given client 100 should present content related to other clients 100, and/or whether content related to that client 100 should be sent to other clients 100 so that the other clients 100 can present it. The second coordinate system may be a real-world coordinate system, and may be a common (shared) coordinate system.
In some implementations, the second coordinate system may be a UTM coordinate system.
In some implementations, the second synchronization module 220 can implement AOI synchronization of the multiple clients 100 in the second coordinate system through a nine-square-grid (3×3 neighborhood) algorithm.
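A common way to realize such a nine-square-grid check (sketched here under the assumption of planar positions such as UTM easting/northing and an illustrative cell size; neither is specified by the application) is to bucket positions into fixed-size cells and synchronize two clients only when one lies in the 3×3 neighborhood of the other:

```python
CELL_SIZE = 50.0  # meters; illustrative grid granularity, an assumption

def cell_of(x: float, y: float) -> tuple:
    """Map a planar (e.g. UTM easting/northing) position to a grid cell."""
    return (int(x // CELL_SIZE), int(y // CELL_SIZE))

def nine_grid(cell: tuple) -> set:
    """The cell itself plus its 8 neighbours: the 3x3 'nine-square grid'."""
    cx, cy = cell
    return {(cx + dx, cy + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)}

def visible_to_each_other(pos_a: tuple, pos_b: tuple) -> bool:
    """Two clients synchronize state only if B's cell lies in A's nine-grid."""
    return cell_of(*pos_b) in nine_grid(cell_of(*pos_a))
```

Restricting synchronization to the nine-grid bounds the number of peers each client must be updated about, regardless of how many clients share the scene.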
The public service module 230 may include one or more public services, such as a terrain resource service and an event bus service. The public service module 230 can provide one or more public services to one or more clients 100 based on at least one of the first synchronization module 210 and the second synchronization module 220.
In some implementations, at least part of the data in the server 200 can be stored in memory, so that when that data is retrieved, complex addressing operations are reduced or avoided and the efficiency of retrieving the data is improved, which in turn improves the efficiency of multi-device collaboration.
Please refer to FIG. 7, which is a structural diagram of a multi-device collaboration system provided by an embodiment of the present application. The system shown in FIG. 7 builds on the system shown in FIG. 6 and describes the first synchronization module 210 in detail.
The first synchronization module 210 may include a region service module 211, a region management module 212, and a public access module 213.
The region service module 211 can be used to provide the multiple AOI regions that have already been divided, including the position, boundary, and size of each AOI region, as well as the positional relationship between any two AOI regions.
The region management module 212 can serve as a region configuration center to manage the AOI regions included in the region service module 211, including region division, load balancing, and elastic computing.
In some implementations, the region management module 212 can divide the map into regions in the first coordinate system for different collaborative scenes, thereby achieving physical isolation and logical isolation between different collaborative scenes.
The public access module 213 can be used to access one or more public service functions, such as a chat function, a friend function, and a log monitoring function, based on the multiple AOI regions in the region service module 211.
The technical solutions of the present application are described in detail below with specific embodiments. The following specific embodiments can be combined with one another, and identical or similar concepts or processes may not be described again in some embodiments.
It should be noted that the following description of the multi-device collaboration method provided by the embodiments of the present application takes only the case where the first collaborative scene is a game scene as an example. When the first collaborative scene is another type of scene, the clients and servers described below can implement multi-device collaboration in a manner similar or identical to that in the game scene.
Please refer to FIG. 8, which is a flowchart of a multi-device collaboration method provided by an embodiment of the present application. It should be noted that the method is not limited to the specific order shown in FIG. 8 and described below; it should be understood that in other embodiments, the order of some steps of the method can be interchanged according to actual needs, or some steps can be omitted or deleted. The method includes the following steps:
S801: A first AR client and a first VR client access the same game scene.
The first AR client and the first VR client can access the same game scene, so that each can present the game scene to its own user and the respective users can play in that game scene.
The manner in which the first AR client and the first VR client access the same game scene may also refer to the related descriptions of FIG. 10 to FIG. 13 below.
It should be noted that S801 is an optional step. For example, in some implementations, the first AR client and the first VR client may always remain in a state of being connected to the game scene.
S802: The first AR client presents a first AR scene, and the first VR client presents a first VR scene.
The first AR scene may be the game scene in AR mode, and the first VR scene may be the game scene in VR mode; that is, the first AR scene and the first VR scene may be the results of presenting the same game scene in different modes on the first AR client and the first VR client, respectively.
In some implementations, the first AR scene includes at least part of a real scene. In some implementations, that part of the real scene may be a scene captured by a camera of the first AR client.
In some implementations, the first VR scene may be a virtual scene corresponding to the first AR scene. In some implementations, the first VR scene may be a virtual scene obtained by processing the real scene included in the first AR scene. In some implementations, the first VR scene may be a virtual scene obtained by processing the real scene in the first AR scene through a view synthesis model such as a NeRF network. The view synthesis model can re-model and re-render the real scene, thereby outputting a virtual scene corresponding to that real scene.
For example, the first AR scene and the first VR scene may be as shown in FIG. 9. The first AR scene may be a real scene captured in real time by the camera of the first AR client, including real buildings, corridors, and trees. The first VR scene is a virtual scene obtained by processing the first AR scene, including virtual buildings corresponding to the real buildings, virtual corridors corresponding to the real corridors, and virtual trees corresponding to the real trees. It can be understood that, since the first AR scene and the first VR scene are presented by different clients, and different clients correspond to different positions and viewing angles, the specific content included in the first AR scene and the first VR scene also differs. The first AR scene and the first VR scene in FIG. 9 can be understood as the different views of the game scene seen by two users facing in opposite directions.
The manner in which the first AR client presents the first AR scene may also refer to the related description of FIG. 14 below, and the manner in which the first VR client presents the first VR scene may also refer to the related description of FIG. 15 below.
S803: When a first real position in the first AR scene includes a first object, the first VR client presents the first object at a first virtual position in the first VR scene, where the first real position and the first virtual position are the same position in the game scene; and/or, when a second virtual position in the first VR scene includes a second object, the first AR client presents the second object at a second real position in the first AR scene, where the second virtual position and the second real position are the same position in the game scene.
When the first real position in the first AR scene includes the first object, the first VR client presents the first object at the first virtual position in the first VR scene; that is, the first VR client can synchronously present the first object from the first AR scene in the first VR scene, so that a user can see, from the first VR client, the first object from the first AR scene. When the second virtual position in the first VR scene includes the second object, the first AR client presents the second object at the second real position in the first AR scene; that is, the first AR client can synchronously present the second object from the first VR scene in the first AR scene, so that a user can see, from the first AR client, the second object from the first VR scene. Therefore, both the first AR client and the first VR client can present objects from the peer's game scene, so that both the user of the first AR client and the user of the first VR client can see those objects, achieving collaborative interoperation between AR clients and VR clients.
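One way the "same position in the game scene" correspondence can work is for each client to keep the offset of its local origin in a shared scene coordinate frame: positions are published in scene coordinates and converted back to local coordinates for rendering. The sketch below uses simple 2-D translation offsets with made-up values (real systems would use full rigid transforms; the names and numbers are illustrative assumptions):

```python
# Where each client's local origin sits in the shared scene frame (assumed).
AR_ORIGIN_IN_SCENE = (100.0, 200.0)
VR_ORIGIN_IN_SCENE = (400.0, 50.0)

def local_to_scene(local: tuple, origin: tuple) -> tuple:
    """Convert a client-local position to the shared scene frame."""
    return (local[0] + origin[0], local[1] + origin[1])

def scene_to_local(scene: tuple, origin: tuple) -> tuple:
    """Convert a shared scene position back to a client-local position."""
    return (scene[0] - origin[0], scene[1] - origin[1])

# The AR client observes the first object at a real position in its own frame,
# publishes the scene-frame position, and the VR client renders it locally.
scene_pos = local_to_scene((3.0, 4.0), AR_ORIGIN_IN_SCENE)
vr_render_pos = scene_to_local(scene_pos, VR_ORIGIN_IN_SCENE)
```

Because both clients agree on the scene frame, the first real position and the first virtual position denote one point even though each client renders it in its own local coordinates.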
For example, in the first AR scene shown in FIG. 9, a real staircase includes virtual character A 701 and virtual character B 702, with virtual character B 702 in front of virtual character A 701. In the first VR scene shown in FIG. 9, the same position on the virtual staircase also includes virtual character A 701 and virtual character B 702, with virtual character B 702 in front of virtual character A 701.
It should be noted that, since the first AR scene is a combined virtual-real scene, the objects in the first AR scene may be virtual objects, real objects, or combined virtual-real objects. A real object may be an object that physically exists in the first AR scene, such as a real person, animal, item, or building, for example the real buildings, stairs, and trees in the first AR scene shown in FIG. 9. A virtual object may be an image superimposed on the real scene or an image projected into the real scene, such as virtual character A 701 and virtual character B 702 in the first AR scene shown in FIG. 9. A combined virtual-real object may be an object formed by combining an object that physically exists in the first AR scene with an image on a display, or by combining such a physical object with a virtual image projected onto it. For example, the first AR scene includes a real torch frame, the first AR client projects a virtual image of a flame onto the frame, and the frame and the virtual flame image combine into a virtual-real torch. Since the first VR scene is a virtual scene, the objects in the first VR scene may be virtual objects.
In some implementations, the first object may be an object controlled by the first AR client; for example, the first object may be a player-controlled character controlled by the first AR client. Of course, in practical applications, the first object may also not be an object controlled by the first AR client; for example, the first object may be the user of the first AR client, or a real animal or item in the first AR scene.
In some implementations, the second object may be an object controlled by the first VR client; for example, the second object may be a player-controlled character controlled by the first VR client. Of course, in practical applications, the second object may also not be an object controlled by the first VR client; for example, the second object may be a non-player character.
The first object may be the user of the first AR client. For example, when the user of the first AR client is at the first real position in the first AR scene, the first VR client can present the first object at the first virtual position in the first VR scene; for example, the first object in the first VR scene may be an avatar corresponding to the user of the first AR client, so that the user of the first VR client can see, in the first VR scene, the avatar corresponding to the user of the first AR client. The second object may be a player-controlled character controlled by the first VR client, and the first AR client presents the second object at the second real position in the first AR scene, so that the user of the first AR client can see, in the first AR scene, the player-controlled character controlled by the first VR client. Therefore, multiple users from multiple different types of clients can see and interact with one another.
In some implementations, the first object in the first AR scene may be a real object or a combined virtual-real object, and the first VR client can present an avatar corresponding to the first object at the first virtual position in the first VR scene. In some implementations, the first object in the first AR scene is a virtual object, and the first VR client can present the virtual image of the first object in the first VR scene.
In some implementations, the second object in the first VR scene is a virtual object, and the first AR client can, at the second real position in the first AR scene, present the second object itself or a combined virtual-real representation corresponding to the second object.
In some implementations, when a first event corresponding to the first object occurs in the first AR scene, the first VR client can also present the first event in the first VR scene; that is, the first VR client can present, in synchronization with the first AR client, the first event occurring in the first AR scene. For example, when the first object in the first AR scene is a real object and the first event happens to that real object, the first event can also be applied to the avatar of the first object in the first VR scene, and the first event can thus be presented in the first VR scene of the first VR client. By presenting the first event synchronously, the first VR client and the first AR client improve the realism with which they collaboratively present the game scene, thereby improving the user experience.
In some implementations, the first event may be at least one of a state update event of the first object, a movement event of the first object, and an interaction event between the first object and another object.
In some implementations, a state update event of the first object can be used to indicate a state change of the first object.
For example, a state update event of the first object can be used to indicate appearance changes such as the first object's body shape, figure, hairstyle, or clothing, and numerical changes such as the first object's health points, stamina, endurance, skill cooldown time, remaining number of items, or quest progress. It should be noted that the state of the first object and the specific content of its state changes can be determined by relevant technical personnel such as a game developer; the embodiments of the present application do not limit the state of the first object or the specific content of the state change.
In some implementations, a movement event of the first object can be used to indicate the movement process of the first object, such as its speed, direction, and route. In some implementations, the movement event of the first object includes the first object moving from the first real position in the first AR scene to a third real position, and the first VR client can present the movement of the first object from the first virtual position in the first VR scene to a third virtual position, where the third virtual position and the third real position are the same position in the game scene.
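State update events and movement events like those above could be carried between clients as small serialized messages. The sketch below uses JSON with field names that are assumptions for illustration; the application does not specify a wire format:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class MoveEvent:
    """A movement of one object between two positions in the shared scene frame."""
    object_id: str
    from_pos: tuple      # position in the shared scene coordinate system
    to_pos: tuple
    speed: float         # scene units per second

    def to_message(self) -> str:
        return json.dumps({"type": "move", **asdict(self)})

@dataclass
class StateUpdateEvent:
    """A partial state change, e.g. health points or skill cooldown."""
    object_id: str
    changes: dict        # e.g. {"hp": 80, "skill_cooldown_s": 3.5}

    def to_message(self) -> str:
        return json.dumps({"type": "state_update", **asdict(self)})

msg = MoveEvent("character-A-701", (103.0, 204.0), (110.0, 204.0), 1.5).to_message()
```

A receiving client dispatches on the `type` field: a move event drives the object's animation from `from_pos` to `to_pos`, and a state update event patches only the listed fields.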
In some implementations, an interaction event between the first object and another object can be used to indicate the interaction process between the first object and that object.
The other object may be any interactable object, and the manner in which the first object interacts with a given object may differ depending on that object. For example, if the object interacting with the first object is a player-controlled character or a non-player character, the interaction may include dialogue, dancing, attacking, and assisting; if the object interacting with the first object is a real item, the interaction may include operating the item; and if the object interacting with the first object is an animal, the interaction may include petting, feeding, driving away, attacking, and riding. It should be noted that the interactable objects in the game scene and the interaction manners for different objects can be determined by relevant technical personnel such as a game developer; the embodiments of the present application do not specifically limit the types of interactable objects or the corresponding interaction manners.
In some implementations, when a second event corresponding to the second object occurs in the first VR scene, the first AR client can present the second event corresponding to the second object in the first AR scene; that is, the first AR client can present, in synchronization with the first VR client, the second event occurring in the first VR scene. By presenting the second event synchronously, the first VR client and the first AR client improve the realism with which they collaboratively present the game scene, thereby improving the user experience.
In some implementations, the second event may be at least one of a state update event of the second object, a movement event of the second object, and an interaction event between the second object and another object. For the second event, refer to the description related to the first event, which is not repeated here.
It can be understood that the first event and the second event may also include more or fewer events.
The manner in which the first VR client presents the first object at the first virtual position in the first VR scene, and the manner in which it presents the movement event of the first object, may also refer to the related descriptions of FIG. 16, FIG. 19, and FIG. 20 below; the manner in which the first AR client presents the second object at the second real position in the first AR scene, and the manner in which it presents the movement event of the second object, may also refer to the related descriptions of FIG. 17, FIG. 18, and FIG. 21 below; the manner in which the first AR client and the first VR client present the state change event of the first object may also refer to the related descriptions of FIG. 22 to FIG. 25 below.
In the embodiments of the present application, the first AR client and the first VR client can access the same game scene; the first AR client presents the first AR scene and the first VR client presents the first VR scene, that is, the two clients can collaboratively present the game scene, where the first VR scene may be a virtual scene obtained by processing the first AR scene. When the first real position in the first AR scene includes the first object, the first VR client presents the first object at the first virtual position in the first VR scene; that is, the first VR client can synchronously present the first object from the first AR scene in the first VR scene, so that a user can see it from the first VR client. When the second virtual position in the first VR scene includes the second object, the first AR client presents the second object at the second real position in the first AR scene; that is, the first AR client can synchronously present the second object from the first VR scene in the first AR scene, so that a user can see it from the first AR client. Therefore, the first AR client and the first VR client can each present objects from the peer's game scene, so that the users of both clients can see those objects, achieving collaborative interoperation between AR clients and VR clients.
Please refer to FIG. 10, which is a flowchart of a method for accessing a game scene provided by an embodiment of the present application. It should be noted that the method is not limited to the specific order shown in FIG. 10 and described below; it should be understood that in other embodiments, the order of some steps of the method can be interchanged according to actual needs, or some steps can be omitted or deleted. The method includes the following steps:
S1001: The client starts.
The user can start the client by, for example, tapping the client's icon on the terminal device. Of course, in practical applications, the client can also be started in other ways; the embodiments of this application do not limit the way in which the client is started.
It should be noted that S1001 is an optional step. When S1001 is omitted, the client can be resident on the terminal device, thereby improving the efficiency with which the client accesses the game scene.
S1002: The client updates local resources.
To ensure stable operation, the client may check for update data at startup; if update data exists, the client obtains the update data and updates the local resources based on it. It should be understood that the client may also update the local resources at other times, or not actively update them; S1002 is an optional step.
In some implementations, the client may also verify the security and integrity of the local resources. If the verification passes, the subsequent steps continue to be performed; otherwise, the client may repair the resources before continuing with the subsequent steps.
S1003: The client performs permission verification.
The client may check whether each of at least one permission required for the client to run has been granted. If so, the subsequent steps continue to be performed; if one or more permissions have not been granted, the client may obtain the one or more permissions before continuing with the subsequent steps. In some implementations, the at least one permission includes the positioning permission, the permission to access the network, the permission to read and write the storage device, the permission to read and write the photo album, the permission to obtain device information of the terminal device, the permission to use the microphone and the camera, and the like. Of course, in practical applications, the at least one permission may include more or fewer permissions; the embodiments of this application do not limit the number of permissions verified by the client or their specific types.
In some implementations, the client may also enable at least one function on the terminal device, which can be used to ensure the normal operation of the client. In some implementations, the at least one function may include a positioning function, a network function, and the like. Of course, in practical applications, the at least one function may include more or fewer functions; the embodiments of this application do not limit the number or specific types of functions enabled by the client.
It should be noted that S1003 is an optional step; when S1003 is omitted, the efficiency of client login can be improved.
S1004: The client logs in to the business server.
In some implementations, the client may obtain first login credential information of an SMS server from the SMS server through a network request such as a hypertext transfer protocol (HTTP) request, and obtain second login credential information of the business server from the business server based on the first login credential information.
The login credential information is used to prove that the user name and password provided by the client are correct. By providing login credential information to the server, the client avoids frequently providing the user name and password. In some implementations, the login credential information may be a token, and the token may be a character string generated by the server. In some implementations, the first login credential information may be a character string generated by the SMS server, and the second login credential information may be a character string generated by the business server.
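The two-step credential flow above can be sketched as follows. This is a minimal illustrative sketch only: the class names, method signatures, and in-memory token stores are assumptions for illustration, not part of the embodiment, and real SMS-code verification is elided.

```python
import secrets

class SmsServer:
    """Issues the first login credential (a server-generated string)."""
    def __init__(self):
        self.issued = set()

    def login(self, phone: str, code: str) -> str:
        # Verification of the phone number and SMS code is elided here.
        first_token = secrets.token_hex(16)   # first login credential information
        self.issued.add(first_token)
        return first_token

class BusinessServer:
    """Exchanges a valid first credential for the second credential."""
    def __init__(self, sms_server: SmsServer):
        self.sms_server = sms_server
        self.sessions = {}

    def login(self, first_token: str) -> str:
        if first_token not in self.sms_server.issued:
            raise PermissionError("first credential not recognized")
        second_token = secrets.token_hex(16)  # second login credential information
        self.sessions[second_token] = first_token
        return second_token

sms = SmsServer()
biz = BusinessServer(sms)
t1 = sms.login("13800000000", "123456")
t2 = biz.login(t1)
# Subsequent requests carry t2 instead of the user name and password.
```

The point of the sketch is the indirection: once `t2` is held, neither server needs to see the user name and password again.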
For example, the client is named "Starlight Tower". The user opens "Starlight Tower" on a mobile phone, which displays the login interface shown in FIG. 11. The login interface includes a mobile phone number input box 901, a verification code input box 902, a verification code refresh button 903, a login button 904, and a registration button 905. The user can enter a pre-registered mobile phone number in the mobile phone number input box 901, tap the verification code refresh button 903, enter the obtained verification code into the verification code input box 902, and then tap the login button 904 to complete the login. Of course, if the user has not yet registered, the user can tap the registration button 905 to complete registration before logging in. It should be understood that, in practical applications, the client may also log in to the business server in other ways; the embodiments of this application do not limit the way in which the client logs in to the business server.
S1005: The client obtains role data from the business server.
The client can obtain role data from the business server and select a role.
The client can obtain the role data from the business server based on the second login credential information. If the business server has no role data corresponding to the client, the client's user may not yet have created a role; in that case, the client can create a role and thereby generate role data.
The role data may be role-related data. In some implementations, the role data may include role index information and a role name.
It should be understood that, in practical applications, the client may also obtain role data from the business server in other ways, and the role data is not limited to the role index information and role name mentioned above; the embodiments of this application do not limit how the client obtains role data from the server or the specific content included in the role data.
S1006: The client selects a game scene and a way of accessing the game scene.
In some implementations, the client may establish a transmission control protocol (TCP) connection with the business server. When the TCP connection succeeds, the client enters a lobby scene. The lobby scene may include one or more game scenes, and the client can choose to access any one of them. When selecting the game scene to access, the client can further choose whether to access the game scene through AR or through VR. If the client chooses to access the game scene through AR, the client can access the game scene as an AR client; if the client chooses to access the game scene through VR, the client can access the game scene as a VR client. For example, as shown in FIG. 10, the lobby scene includes a jet arena scene and a classic scene. When the client selects the jet arena scene, it can further select an AR arena or a VR arena; when the client selects the classic scene, it can further select an AR classic scene or a VR classic scene.
For example, when the user logs in successfully through the login interface shown in FIG. 11, the mobile phone may display the camp selection interface shown in FIG. 12. The camp selection interface includes a blue camp button 906 and a red camp button 907; the user can join the blue camp by tapping the blue camp button 906, or join the red camp by tapping the red camp button 907. After the user selects a camp, the game lobby interface shown in FIG. 13 may be displayed. The game lobby interface includes user information 913 such as the user's avatar, name, and level, and a game map 908, and the game map 908 includes five game scenes; the user can select any one of the game scenes by tapping it. In FIG. 13, the game scene 909 currently located in the middle of the game map 908 displays the user's avatar, indicating that the user has currently selected the game scene 909. The game lobby interface also includes an access mode switching button 910; the user can tap the access mode switching button 910 to choose whether to access the game scene through AR or through VR. For example, when the access mode switching button 910 displays "AR", the game scene is currently accessed through AR. The game lobby interface also includes a backpack button 911; the user can tap the backpack button 911 to open a virtual backpack, which can include virtual game props, assets, and the like. The game lobby interface also includes a message button 912; when the user taps the message button 912, one or more messages can be shown to the user, which may be system messages from the server or messages from other users.
It should be understood that, in practical applications, the client may also select the game scene and/or the way of accessing the game scene in other manners; the embodiments of this application do not limit how the client selects the game scene or the way of accessing it. For example, in some other implementations, the client may first select the way of accessing a game scene and then select the game scene to be accessed. Alternatively, in some other implementations, the client is configured to act only as an AR client or only as a VR client, so that the way of accessing the game scene does not need to be selected in S1006.
Please refer to FIG. 14, which is a flowchart of a method for presenting an AR scene according to an embodiment of the present application. It should be noted that the method is not limited to the specific order shown in FIG. 14 and described below. It should be understood that, in other embodiments, the order of some steps of the method may be swapped according to actual needs, and some of the steps may also be omitted or deleted. The method includes the following steps:
S1401: The first AR client sends first positioning information collected by the first AR client to the resource server.
The first positioning information may be used to indicate the position of the first AR client in a first coordinate system. In some implementations, the first positioning information may be used to indicate the AOI region in which the first AR client is located, such as the country, province, city, or administrative district in which the first AR client is located. In some implementations, the first coordinate system is a GPS coordinate system, and the first positioning information may be first GPS coordinates.
In some implementations, the first AR client may also send first range information to the resource server, and the first range information may be used to indicate the range of the game scene presented by the first AR client.
S1402: The resource server returns a first digital resource to the first AR client based on the first positioning information.
The resource server can obtain the first digital resource from a stored digital resource collection based on the first positioning information, and return the first digital resource to the first AR client. The first digital resource can be used to present at least part of the game scene.
In some implementations, the resource server may obtain the first digital resource from the stored digital resource collection based on the first positioning information and the first range information.
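A resource lookup of this kind can be sketched as a distance filter over a stored collection. This is an illustrative sketch only: the resource table, coordinates, and the equirectangular distance approximation are assumptions made for the example, not details of the embodiment.

```python
import math

# Stored digital resource collection: resource id -> (lat, lon).
RESOURCES = {
    "tower_model":  (39.9087, 116.3975),
    "arena_ground": (39.9090, 116.3980),
    "far_statue":   (31.2304, 121.4737),   # a different city entirely
}

def approx_distance_m(p, q):
    """Rough equirectangular distance in metres; adequate at city scale."""
    lat1, lon1 = map(math.radians, p)
    lat2, lon2 = map(math.radians, q)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return 6371000 * math.hypot(x, y)

def select_resources(position, range_m):
    """Return ids of resources within range_m of the reported position."""
    return [rid for rid, pos in RESOURCES.items()
            if approx_distance_m(position, pos) <= range_m]

# First positioning information (GPS) plus first range information.
first_digital_resource = select_resources((39.9088, 116.3976), range_m=500)
```

Only the two nearby resources are selected; the resource in the other AOI region is excluded by the range filter.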
S1403: The first AR client sends the first positioning information collected by the first AR client, together with a first environment image, to the positioning server.
The first AR client can send the first positioning information it collected and the first environment image to the positioning server, thereby requesting the positioning server to position the first AR client more precisely.
The first AR client can photograph, through a camera, the site at which the first AR client is located, thereby obtaining the first environment image.
In some implementations, the positioning server may be a visual positioning service server.
In some implementations, the first AR client compresses the first environment image and sends the compressed first environment image to the positioning server. Compressing the first environment image reduces the amount of data to be transmitted, improving positioning efficiency and the real-time performance of positioning.
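The compress-before-upload step can be sketched as follows. This is a minimal sketch assuming the raw frame is available as bytes; a real client would more likely re-encode to a lossy image format such as JPEG, but lossless zlib compression suffices to illustrate the data-volume reduction.

```python
import zlib

def compress_environment_image(raw: bytes) -> bytes:
    """What the client sends instead of the raw frame."""
    return zlib.compress(raw, level=6)

def decompress_environment_image(blob: bytes) -> bytes:
    """What the positioning server recovers on receipt."""
    return zlib.decompress(blob)

# A synthetic highly redundant "frame" standing in for a camera image.
raw_frame = bytes(range(256)) * 1024          # 256 KiB
payload = compress_environment_image(raw_frame)

assert len(payload) < len(raw_frame)          # less data on the wire
assert decompress_environment_image(payload) == raw_frame
```

The trade-off is CPU time on the client against transmission volume; for positioning, smaller payloads generally mean lower round-trip latency.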
S1404: The positioning server returns second positioning information to the first AR client.
The positioning server can determine the second positioning information based on the first environment image and the first positioning information, and return the second positioning information to the first AR client. The second positioning information can indicate the position of the first AR client more precisely than the first positioning information. In some implementations, the second positioning information may be used to indicate the specific position of the first AR client within the region indicated by the first positioning information.
The second positioning information may be used to indicate the position of the first AR client in a second coordinate system. In some implementations, the second coordinate system is a universal transverse mercator (UTM) grid system coordinate system, and the second positioning information may be first UTM coordinates.
In some implementations, the positioning server may also determine first rotation information based on the first environment image and the first positioning information, and return the first rotation information to the first AR client. The first rotation information may be used to indicate the rotation angle of the first AR client in the second coordinate system.
S1405: The first AR client converts the second positioning information into third positioning information.
The first AR client can perform a coordinate system conversion on the second positioning information to obtain third positioning information in a third coordinate system. The third positioning information may be used to indicate the position of the first AR client in the third coordinate system, and the third coordinate system may be a coordinate system referenced to the camera of the first AR client. In some implementations, the third coordinate system can be understood as the local coordinate system of the first AR client, and the third positioning information can be understood as the local coordinates of the first AR client.
In some implementations, the first AR client may convert the first rotation information into second rotation information. The second rotation information may be used to indicate the rotation angle of the first AR client in the third coordinate system.
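The conversion from the shared second (UTM) coordinate system into the client's local third (camera-referenced) coordinate system can be sketched in two dimensions. The pose model here, a planar offset plus a single heading angle, is an illustrative assumption; a real client would work with a full 3-D pose.

```python
import math

def utm_to_local(point_utm, client_utm, client_heading_rad):
    """Translate by the client's own UTM position, then rotate the offset
    into the camera frame by the client's heading."""
    dx = point_utm[0] - client_utm[0]
    dy = point_utm[1] - client_utm[1]
    c, s = math.cos(-client_heading_rad), math.sin(-client_heading_rad)
    return (c * dx - s * dy, s * dx + c * dy)

# With the client's heading at zero, a point 10 m east of the client
# stays 10 m along the local x axis.
assert utm_to_local((500010.0, 4000000.0), (500000.0, 4000000.0), 0.0) == (10.0, 0.0)
```

The same translate-then-rotate pattern covers both the positioning information (S1405) and the rotation information: local rotation is the UTM rotation minus the client's own heading.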
S1406: The first AR client loads the first digital resource based on the third positioning information and presents the first AR scene.
The first AR client can load the first digital resource based on the third positioning information, thereby presenting the first AR scene, so that the user of the first AR client can, based on the user's location, accurately see objects at the surrounding positions, that is, see the first AR scene in which the virtual and the real are combined.
In some implementations, the first AR client can load the first digital resource based on the third positioning information and the second rotation information, so that the objects in the first AR scene match the rotation angle of the camera of the first AR client, improving the realism of the first AR scene and the user experience.
Please refer to FIG. 15, which is a flowchart of a method for presenting a VR scene according to an embodiment of the present application. It should be noted that the method is not limited to the specific order shown in FIG. 15 and described below. It should be understood that, in other embodiments, the order of some steps of the method may be swapped according to actual needs, and some of the steps may also be omitted or deleted. The method includes the following steps:
S1501: The first VR client sends preset fourth positioning information to the resource server.
The fourth positioning information may be used to indicate the position of the first VR client in the first coordinate system. In some implementations, the fourth positioning information may be used to indicate the AOI region in which the first VR client is located, such as the country, province, city, or administrative district in which the first VR client is located. In some implementations, the first coordinate system is a GPS coordinate system, and the fourth positioning information may be second GPS coordinates.
In some implementations, the first VR client may also send second range information to the resource server, and the second range information may be used to indicate the range of the game scene presented by the first VR client.
In some implementations, the fourth positioning information may be obtained by the first VR client from a user submission. For example, even if the first VR client is located in Shanghai, it can still receive user-submitted GPS coordinates belonging to Beijing, so that, from the resource server's point of view, the first VR client is a client located in Beijing.
S1502: The resource server returns a second digital resource to the first VR client based on the fourth positioning information.
The resource server can obtain the second digital resource from the stored digital resource collection based on the fourth positioning information, and return the second digital resource to the first VR client. The second digital resource can be used to present at least part of the game scene.
In some implementations, the resource server may obtain the second digital resource from the stored digital resource collection based on the fourth positioning information and the second range information.
S1503: The first VR client converts preset fifth positioning information into sixth positioning information.
The fifth positioning information can indicate the position of the first VR client more precisely than the fourth positioning information. In some implementations, the fifth positioning information may be used to indicate the specific position of the first VR client within the AOI region indicated by the fourth positioning information.
The fifth positioning information may be used to indicate the position of the first VR client in the second coordinate system. In some implementations, the second coordinate system is a UTM coordinate system, and the fifth positioning information may be second UTM coordinates.
In some implementations, the fifth positioning information may be stored in a local file of the first VR client, or the fifth positioning information may be obtained from the business server. In some implementations, the position indicated by the fifth positioning information can be understood as a spawn point, respawn point, or refresh point of the first VR client in the first region.
The first VR client can perform a coordinate system conversion on the fifth positioning information to obtain sixth positioning information in a fourth coordinate system. The sixth positioning information may be used to indicate the position of the first VR client in the fourth coordinate system, and the fourth coordinate system may be a coordinate system referenced to the first VR client. In some implementations, the fourth coordinate system can be understood as the local coordinate system of the first VR client, and the sixth positioning information can be understood as the local coordinates of the first VR client.
S1504: The first VR client loads the second digital resource based on the sixth positioning information, thereby presenting the first VR scene.
The first VR client can load the second digital resource based on the sixth positioning information, thereby presenting the first VR scene, so that the user of the first VR client can, based on the user's position, accurately see objects at the surrounding positions, that is, see the fully virtual first VR scene.
In some implementations, the first VR client can load the second digital resource based on the sixth positioning information and third rotation information, so that the objects in the first VR scene match the rotation angle of the camera of the first VR client, improving the realism of the first VR scene and the user experience. The third rotation information may be obtained by the first VR client from a user submission.
In the method shown in FIG. 15 above, the first VR client can obtain the second digital resource through the preset fourth positioning information, so that even if, in reality, the first VR client is not in the same AOI region as the first AR client, it can still collaborate with the first AR client. It should be understood that, in some other implementations, if the first VR client and the first AR client are in fact in the same AOI region, the first VR client may also obtain the fourth positioning information through collection, in a manner similar to that of the first AR client.
As can be seen from FIG. 14 and FIG. 15 together, the first AR client obtains the first digital resource from the resource server based on the first positioning information, and the first VR client obtains the second digital resource from the resource server based on the fourth positioning information, where the first positioning information and the fourth positioning information are both in the first coordinate system. Therefore, to the resource server, the first AR client and the first VR client are in the same coordinate system, so the first AR client and the first VR client can present the same game scene. Furthermore, the first AR client loads the first digital resource based on the third positioning information to present the first AR scene, and the first VR client loads the second digital resource based on the sixth positioning information to present the first VR scene, where the third positioning information is obtained by converting the second positioning information in the second coordinate system, and the sixth positioning information is obtained by converting the fifth positioning information in the second coordinate system. Since the second positioning information and the fifth positioning information belong to the same coordinate system, the first AR client and the first VR client are both anchored in the same coordinate system, which allows the first AR scene presented by the first AR client to be synchronized with the first VR scene presented by the first VR client. For example, if the role corresponding to the first AR client and the role corresponding to the first VR client face each other in the game scene, then when the role corresponding to the first AR client sees the role corresponding to the first VR client in the first AR scene, the role corresponding to the first VR client can also see the role corresponding to the first AR client in the first VR scene; that is, the role corresponding to the first AR client and the role corresponding to the first VR client can see and interact with each other in the game scene.
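Why a shared second coordinate system keeps the two scenes consistent can be shown with a small numeric sketch. All coordinates below are made-up example values: each client subtracts its own origin to get local coordinates, yet the relative geometry between the two roles comes out identical on both sides.

```python
def to_local(point_utm, origin_utm):
    """Local coordinates = shared UTM coordinates minus the client's own origin."""
    return (point_utm[0] - origin_utm[0], point_utm[1] - origin_utm[1])

ar_origin = (500000.0, 4000000.0)   # first AR client's UTM position
vr_origin = (500020.0, 4000010.0)   # first VR client's preset UTM position

ar_role = (500005.0, 4000002.0)     # role of the AR user, in UTM
vr_role = (500018.0, 4000008.0)     # role of the VR user, in UTM

# Each client expresses both roles in its own local frame...
ar_view = (to_local(ar_role, ar_origin), to_local(vr_role, ar_origin))
vr_view = (to_local(ar_role, vr_origin), to_local(vr_role, vr_origin))

def delta(a, b):
    return (b[0] - a[0], b[1] - a[1])

# ...but the displacement between the two roles is the same in both views,
# so the roles see each other at consistent positions.
assert delta(*ar_view) == delta(*vr_view)
```

Because translation by a per-client origin preserves differences of positions, any two clients anchored in the same second coordinate system agree on where objects stand relative to one another.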
Please refer to FIG. 16, which is a flowchart of a method for collaboratively presenting a game object according to an embodiment of the present application. The method can be used to implement the following: when the first real position in the first AR scene includes the first object, the first VR client presents the first object at the first virtual position in the first VR scene. The method can also be used to present a motion event of the first object. Here, taking as an example that the first object at the first real position in the first AR scene is a real user of the first AR client located at the first real position (that is, the first object is a real object), the positioning information corresponding to the first AR client is the positioning information corresponding to the first object. It should be noted that the method is not limited to the specific order shown in FIG. 16 and described below. It should be understood that, in other embodiments, the order of some steps of the method may be swapped according to actual needs, and some of the steps may also be omitted or deleted. The method includes the following steps:
S1601: The business server sends second positioning information corresponding to the first object to the first VR client.
The first object may be located at the first real position in the first AR scene. The business server can send the second positioning information of the first object in the second coordinate system to the first VR client. Since, to the business server, the first AR client and the first VR client are both in the second coordinate system, the first VR client can present the first object in the first VR scene based on the second positioning information.
The second positioning information may correspond to the first real position of the first object in the first AR scene.
In some implementations, the business server may send first rotation information corresponding to the first object to the first VR client.
In some implementations, the service server may broadcast at least one of the second positioning information and the first rotation information to the first VR client.
S1602: The first VR client converts the second positioning information into seventh positioning information.

The first VR client may perform a coordinate system conversion on the second positioning information to obtain the seventh positioning information in the fourth coordinate system.

The seventh positioning information may indicate the position of the first object in the fourth coordinate system; it may also be understood as the local coordinates of the first object.

In some implementations, the first VR client may convert the first rotation information into fourth rotation information. The fourth rotation information may be the rotation angle of the first object in the fourth coordinate system; in other words, the fourth rotation information indicates the rotation angle of the first object local to the first VR client.
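For illustration, the conversion in S1602 from the shared second coordinate system to the client-local fourth coordinate system can be sketched as follows. This is a minimal sketch under the assumption that the two coordinate systems differ by a translation (the local origin expressed in shared coordinates) and a yaw rotation about the vertical axis; the function names, the anchor parameters, and the yaw-only rotation model are illustrative assumptions, not part of this application.

```python
import math

def world_to_local(pos_world, anchor_origin, anchor_yaw_deg):
    """Convert a shared-system position into local coordinates:
    translate by the local anchor's origin, then undo its yaw."""
    dx = pos_world[0] - anchor_origin[0]
    dy = pos_world[1] - anchor_origin[1]
    dz = pos_world[2] - anchor_origin[2]
    yaw = math.radians(anchor_yaw_deg)
    # Rotate by -yaw about the vertical (y) axis.
    local_x = dx * math.cos(-yaw) - dz * math.sin(-yaw)
    local_z = dx * math.sin(-yaw) + dz * math.cos(-yaw)
    return (local_x, dy, local_z)

def world_yaw_to_local(yaw_world_deg, anchor_yaw_deg):
    """Rotation-information conversion: subtract the anchor's yaw,
    wrapped into [0, 360)."""
    return (yaw_world_deg - anchor_yaw_deg) % 360.0
```

The same pattern would apply to converting the first rotation information into the fourth rotation information: the shared-system angle minus the local anchor's orientation.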
S1603: The first VR client presents the first object at the first virtual position in the first VR scene indicated by the seventh positioning information.

In some implementations, the first VR client may display the first object in the first VR scene based on the fourth rotation information.

In some implementations, the first VR client may display the first object based on the fourth rotation information at the first virtual position in the first VR scene indicated by the seventh positioning information.
S1604: The first AR client controls the first object to move in the first AR scene. After the movement, the first object is at a third real position in the first AR scene, and the positioning information corresponding to the third real position is new third positioning information.

In some implementations, the user of the first AR client may hold or wear the first AR client, so that when the user moves in the first AR scene the first AR client moves along with the user; the movement of the user and of the first AR client can thus be understood as the movement of the first object. Of course, in practical applications, the first object may also move in the first AR scene in other ways.

In some implementations, the first AR client controls the first object to move in the first AR scene, and the rotation information corresponding to the first object after the movement is new second rotation information.

S1605: The first AR client converts the new third positioning information into new second positioning information.
The first AR client may perform a coordinate system conversion on the new third positioning information to obtain the new second positioning information in the second coordinate system. Because, from the perspective of the service server, the first AR client and the first VR client share the second coordinate system, the first VR client can present the motion of the first object in the first VR scene based on the new second positioning information.
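The conversion in S1605 runs in the opposite direction to that of S1602: from the client-local third coordinate system into the shared second coordinate system. A minimal sketch, under the same illustrative assumption of a translation-plus-yaw relationship between the two systems (the anchor parameters and yaw-only model are assumptions, not part of this application):

```python
import math

def local_to_world(pos_local, anchor_origin, anchor_yaw_deg):
    """Convert a client-local position into the shared coordinate system:
    apply the anchor's yaw, then translate by its origin."""
    yaw = math.radians(anchor_yaw_deg)
    wx = pos_local[0] * math.cos(yaw) - pos_local[2] * math.sin(yaw)
    wz = pos_local[0] * math.sin(yaw) + pos_local[2] * math.cos(yaw)
    return (wx + anchor_origin[0],
            pos_local[1] + anchor_origin[1],
            wz + anchor_origin[2])
```

Under these assumptions, `local_to_world` is the exact inverse of the shared-to-local conversion, which is what keeps the two clients' views of the same object consistent.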
In some implementations, the first AR client may convert the new second rotation information into new first rotation information.

S1606: The first AR client sends the new second positioning information to the service server.

In some implementations, the first AR client may send the new first rotation information to the service server.

S1607: The service server sends the new second positioning information to the first VR client.

In some implementations, the service server may send the new first rotation information to the first VR client.

S1608: The first VR client converts the new second positioning information into new seventh positioning information.

In some implementations, the first VR client may convert the new first rotation information into new fourth rotation information.
S1609: The first VR client presents the first object at the third virtual position in the first VR scene indicated by the new seventh positioning information.

The third virtual position in the first VR scene may correspond to the third real position in the first AR scene; the third virtual position and the third real position are the same position in the game scene.

In some implementations, the first VR client may present the first object in the first VR scene based on the new fourth rotation information.

In some implementations, the first VR client may present the first object based on the new fourth rotation information at the third virtual position in the first VR scene indicated by the new seventh positioning information.
In this embodiment of this application, when the first object moves from the first real position to the third real position in the first AR scene, the first object also moves from the first virtual position to the third virtual position in the first VR scene. That is, the instantaneous position of the first object in the first AR client, as well as its position changes at different moments, are visible to the first VR client. It can be understood that the motion of the first object can be presented synchronously by performing at least some of the above steps S1601-S1609 multiple times.
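One relay round of steps S1601-S1609 can be sketched as a pipeline: the AR client converts its local position into the shared second coordinate system, the service server relays it unchanged, and the VR client converts it into its own local system. In the sketch below the two transforms are assumed to be pure translations by each client's anchor offset; the anchor values and function names are illustrative assumptions, not part of this application.

```python
def ar_to_vr_sync(ar_local_pos, ar_to_shared, shared_to_vr_local):
    """One synchronization round: AR local -> shared -> VR local."""
    shared_pos = ar_to_shared(ar_local_pos)    # conversion on the AR client (S1605)
    # The service server relays shared_pos unchanged (S1606/S1607).
    return shared_to_vr_local(shared_pos)      # conversion on the VR client (S1608)

# Illustrative anchors (assumptions): each client's local origin
# expressed in the shared second coordinate system.
AR_ANCHOR = (10.0, 0.0, 5.0)
VR_ANCHOR = (-3.0, 0.0, 2.0)

def to_shared(p):
    return tuple(a + b for a, b in zip(p, AR_ANCHOR))

def to_vr_local(p):
    return tuple(a - b for a, b in zip(p, VR_ANCHOR))
```

Repeating this round whenever the first object moves is what makes its instantaneous position, and the changes between moments, visible on the VR side.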
Referring to FIG. 17, which is a flowchart of a method for collaboratively presenting game objects provided by an embodiment of this application, the method may be used to implement the following: when the second virtual position in the first VR scene includes a second object, the first AR client presents the second object at the second real position in the first AR scene. The method may also be used to present a motion event of the second object. Here, the second object is the player-controlled character corresponding to the first VR client (that is, the second object is a virtual object), and the positioning information corresponding to the first VR client is the positioning information corresponding to the second object. It should be noted that the method is not limited to the specific order shown in FIG. 17 and described below. It should be understood that, in other embodiments, the order of some steps of the method may be interchanged according to actual needs, or some steps may be omitted or deleted. The method includes the following steps:
S1701: The service server sends fifth positioning information corresponding to the second object to the first AR client.

The second object may be located at the second virtual position in the first VR scene. The service server may send the fifth positioning information of the second object in the second coordinate system to the first AR client. Because, from the perspective of the service server, the first AR client and the first VR client share the second coordinate system, the first AR client can present the second object in the first AR scene based on the fifth positioning information.

The fifth positioning information may correspond to the second virtual position of the second object in the first VR scene.

In some implementations, the service server may send fifth rotation information to the first AR client. The first VR client may convert third rotation information to obtain the fifth rotation information and send the fifth rotation information to the service server; the fifth rotation information may indicate the rotation angle of the second object in the second coordinate system.

In some implementations, the service server may broadcast at least one of the fifth positioning information and the fifth rotation information to the first AR client.
S1702: The first AR client converts the fifth positioning information into eighth positioning information.

The first AR client may perform a coordinate system conversion on the fifth positioning information to obtain the eighth positioning information in the third coordinate system.

The eighth positioning information may indicate the position of the second object in the third coordinate system; it may also be understood as the local coordinates of the second object.

In some implementations, the first AR client may convert the fifth rotation information into sixth rotation information. The sixth rotation information may be the rotation angle of the second object in the third coordinate system; in other words, the sixth rotation information indicates the rotation angle of the second object local to the first AR client.

S1703: The first AR client presents the second object at the second real position in the first AR scene indicated by the eighth positioning information.

In some implementations, the first AR client may present the second object based on the sixth rotation information at the second real position in the first AR scene.

In some implementations, the first AR client may present the second object based on the sixth rotation information at the second real position in the first AR scene indicated by the eighth positioning information.
S1704: The first VR client controls the second object to move in the first VR scene. After the movement, the second object is at a fourth virtual position in the first VR scene, and the positioning information corresponding to the fourth virtual position is new sixth positioning information.

In some implementations, the first VR client may receive a movement instruction submitted by the user through a peripheral device such as a keyboard, mouse, gamepad, remote control, microphone, smart glasses, helmet, or somatosensory device, and then control the second object to move in the first VR scene based on the movement instruction. In some implementations, the type of the movement instruction may include one or more of key operations, touch operations, gesture operations, eye-movement operations, and brain waves. Of course, in practical applications, the first VR client may also receive the movement instruction submitted by the user in other ways, or may control the movement of the second object in the first VR scene in other ways.
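However the movement instruction arrives (key press, touch, gesture, and so on), it ultimately updates the second object's local position in the fourth coordinate system. A minimal dispatch sketch; the instruction names, direction vectors, and step size are illustrative assumptions, not part of this application:

```python
# Assumed instruction vocabulary mapped to local direction vectors.
MOVE_VECTORS = {
    "forward": (0.0, 0.0, 1.0),
    "back":    (0.0, 0.0, -1.0),
    "left":    (-1.0, 0.0, 0.0),
    "right":   (1.0, 0.0, 0.0),
}

def apply_move(pos, instruction, step=0.5):
    """Return the object's new local position after one movement instruction."""
    direction = MOVE_VECTORS.get(instruction)
    if direction is None:
        return pos  # unrecognized instruction: position unchanged
    return tuple(p + step * d for p, d in zip(pos, direction))
```

The resulting local position is the new sixth positioning information that S1705 then converts into the shared second coordinate system.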
In some implementations, the first VR client controls the second object to move in the first VR scene, and the rotation information corresponding to the second object after the movement is new third rotation information.

S1705: The first VR client converts the new sixth positioning information into new fifth positioning information.
The first VR client may perform a coordinate system conversion on the new sixth positioning information to obtain the new fifth positioning information in the second coordinate system. Because, from the perspective of the service server, the first AR client and the first VR client share the second coordinate system, the first AR client can present the motion of the second object in the first AR scene based on the new fifth positioning information.
In some implementations, the first VR client may convert the new third rotation information into new fifth rotation information.

S1706: The first VR client sends the new fifth positioning information to the service server.

In some implementations, the first VR client may also send the new fifth rotation information to the service server.

S1707: The service server sends the new fifth positioning information to the first AR client.

In some implementations, the service server may also send the new fifth rotation information to the first AR client.

S1708: The first AR client converts the new fifth positioning information into new eighth positioning information.

In some implementations, the first AR client may also convert the new fifth rotation information into new sixth rotation information.

S1709: The first AR client presents the second object at the fourth real position in the first AR scene indicated by the new eighth positioning information.

The fourth real position in the first AR scene may correspond to the fourth virtual position in the first VR scene.

In some implementations, the first AR client may present the second object at the fourth real position in the first AR scene based on the new sixth rotation information.

In some implementations, the first AR client may present the second object based on the new sixth rotation information at the fourth real position in the first AR scene indicated by the new eighth positioning information.
In this embodiment of this application, when the second object moves from the second virtual position to the fourth virtual position in the first VR scene, the second object also moves from the second real position to the fourth real position in the first AR scene. That is, the instantaneous position of the second object in the first VR client, as well as its position changes at different moments, are visible to the first AR client. It can be understood that the motion of the second object can be presented synchronously by performing at least some of the above steps S1701-S1709 multiple times.

Combining the methods shown in FIG. 16 and FIG. 17, the instantaneous position of the first object in the first AR client and its position changes at different moments are visible to the first VR client, and the instantaneous position of the second object in the first VR client and its position changes at different moments are visible to the first AR client. Therefore, when the user of the first AR client sees the virtual character corresponding to the first VR client in the first AR scene, the user of the first VR client can also see the virtual character corresponding to the first AR client in the first VR scene. That is, the real user of the first AR client and the virtual character controlled by the real user of the first VR client can see and interact with each other in the game scene.
For example, FIG. 18 shows the first AR scene, and FIG. 19 shows the first VR scene.

In the first AR scene shown in FIG. 18, the first AR scene is the game scene presented by the first AR client from a first-person perspective. The first AR scene includes the second object 1601 corresponding to the first VR client, but does not include the first object 1602 corresponding to the first AR client. That is, the user of the first AR client can see the other users playing together, but cannot see himself or herself. The second object 1601 presented in the first AR scene is a virtual object.
In the first VR scene shown in FIG. 19, the first VR scene is the game scene presented by the first VR client from a third-person perspective. The first VR scene includes both the second object 1601 corresponding to the first VR client and the first object 1602 corresponding to the first AR client, where the first object 1602 is a virtual object corresponding to the user of the first AR client. That is, the user of the first VR client can see the other users playing together as well as that user's own character.
When the user of the first VR client controls the second object 1601 to move in the first VR scene, the first AR client also presents the movement of the second object 1601 in the first AR scene. As shown in FIG. 18, the second object 1601 moves from a position near the left of the first AR scene to a middle position.

When the user of the first AR client (that is, the first object 1602) moves in the first AR scene, the first VR client also presents the movement of the first object 1602 in the first VR scene. As shown in FIG. 19, the first object 1602 moves from a position near the right of the first VR scene to a position near the middle.
Referring to FIG. 20, which is a flowchart of a method for collaboratively presenting game objects provided by an embodiment of this application, the method may be used to implement the following: when the first real position in the first AR scene includes a first object, the first VR client presents the first object at the first virtual position in the first VR scene. The method may also be used to present a motion event of the first object. Here, the first object is an object that is controlled by neither the first AR client nor the first VR client; for example, the first object may be a non-player character, or may be an object such as a vehicle in the first AR scene that is not controlled by the first AR client. It should be noted that the method is not limited to the specific order shown in FIG. 20 and described below. It should be understood that, in other embodiments, the order of some steps of the method may be interchanged according to actual needs, or some steps may be omitted or deleted. The method includes the following steps:
S2001: The first AR client obtains ninth positioning information corresponding to the first object. The ninth positioning information corresponds to the first real position of the first object in the first AR scene.
The ninth positioning information may be used to indicate the position of the first object in the third coordinate system.

In some implementations, the first AR client may obtain seventh rotation information corresponding to the first object. The seventh rotation information may be used to indicate the rotation angle of the first object in the third coordinate system.

In some implementations, the first object is an object that is not controlled by the first AR client, such as a real vehicle in the first AR scene; in that case, the manner in which the first AR client obtains the ninth positioning information corresponding to the first object may be the same as or similar to the manner of obtaining the second positioning information corresponding to the first AR client, and the manner in which the first AR client obtains the seventh rotation information may be the same as or similar to the manner of obtaining the first rotation information corresponding to the first AR client. In other implementations, the first object is a virtual object that is not controlled by the first AR client; in that case, at least one of the ninth positioning information and the seventh rotation information corresponding to the first object may be generated by a preset movement strategy corresponding to the first object. Of course, in practical applications, the first AR client may also obtain at least one of the ninth positioning information and the seventh rotation information corresponding to the first object in other ways.
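As one illustration of such a preset movement strategy, the ninth positioning information for a virtual non-player object could be generated each tick from a fixed path. The circular patrol path and all of its parameters below are illustrative assumptions, not part of this application:

```python
import math

def preset_position(t, center=(0.0, 0.0, 0.0), radius=2.0, period=8.0):
    """Position on a circular patrol path at time t (seconds),
    completing one loop per period."""
    angle = 2.0 * math.pi * (t % period) / period
    return (center[0] + radius * math.cos(angle),
            center[1],
            center[2] + radius * math.sin(angle))
```

Sampling `preset_position` at each frame yields the sequence of local positions that the first AR client would then convert and forward in steps S2002-S2003.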
S2002: The first AR client converts the ninth positioning information into tenth positioning information.

The tenth positioning information may be used to indicate the position of the first object in the second coordinate system.

In some implementations, the first AR client may convert the seventh rotation information into eighth rotation information. The eighth rotation information may be used to indicate the rotation angle of the first object in the second coordinate system.

S2003: The first AR client sends the tenth positioning information to the service server.

In some implementations, the first AR client may also send the eighth rotation information to the service server.

S2004: The service server sends the tenth positioning information to the first VR client.

In some implementations, the service server may also send the eighth rotation information to the first VR client.

S2005: The first VR client converts the tenth positioning information into eleventh positioning information.

The eleventh positioning information may be used to indicate the position of the first object in the fourth coordinate system.

In some implementations, the first VR client may convert the eighth rotation information into ninth rotation information. The ninth rotation information may be used to indicate the rotation angle of the first object in the fourth coordinate system.

S2006: The first VR client presents the first object at the first virtual position in the first VR scene indicated by the eleventh positioning information.

In some implementations, the first VR client may present the first object in the first VR scene based on the ninth rotation information.

In some implementations, the first VR client may present the first object based on the ninth rotation information at the first virtual position in the first VR scene indicated by the eleventh positioning information.
S2007: The first AR client obtains new ninth positioning information after the first object moves. The new ninth positioning information corresponds to the third real position of the first object in the first AR scene after the movement.

In some implementations, the first AR client may obtain new seventh rotation information corresponding to the movement of the first object.

In some implementations, the first object is an object that is not controlled by the first AR client, such as a real vehicle in the first AR scene; in that case, the first object can move on its own. In other implementations, the first object is a virtual object that is not controlled by the first AR client; in that case, the first object moves based on the preset movement strategy corresponding to the first object. Of course, in practical applications, the first object may also move based on other control methods; this embodiment of this application does not limit the cause of the movement of the first object.
S2008: The first AR client converts the new ninth positioning information into new tenth positioning information.

In some implementations, the first AR client may convert the new seventh rotation information into new eighth rotation information.

S2009: The first AR client sends the new tenth positioning information to the service server.

In some implementations, the first AR client may send the new eighth rotation information to the service server.

S2010: The service server sends the new tenth positioning information to the first VR client.

In some implementations, the service server may send the new eighth rotation information to the first VR client.

S2011: The first VR client converts the new tenth positioning information into new eleventh positioning information.

In some implementations, the first VR client may convert the new eighth rotation information into new ninth rotation information.

S2012: The first VR client presents the first object at the third virtual position in the first VR scene indicated by the new eleventh positioning information.

In some implementations, the first VR client may present the first object in the first VR scene based on the new ninth rotation information.

In some implementations, the first VR client may present the first object based on the new ninth rotation information at the third virtual position in the first VR scene indicated by the new eleventh positioning information.
In this embodiment of this application, when the first object moves from the first real position to the third real position in the first AR scene, the first object also moves from the first virtual position to the third virtual position in the first VR scene. That is, the instantaneous position of the first object in the first AR client, as well as its position changes at different moments, are visible to the first VR client. It can be understood that the motion of the first object can be presented synchronously by performing at least some of the above steps S2001-S2012 multiple times.
请参照图21,为本申请实施例所提供的一种协同呈现游戏对象的方法的流程图,该方法可以用于实现:当第一VR场景的第二虚拟位置包括第二对象时,第一AR客户端在第一AR场景的第二现实位置呈现第二对象,该方法还可以用于呈现第二对象的运动事件。其中,第二对象为不受第一VR客户端控制的对象,比如第二对象可以为非玩家控制角色等。需要说明的是,该方法并不以图21以及以下所述的具体顺序为限制,应当理解,在其它实施例中,该方法其中部分步骤的顺序可以根据实际需要相互交换,或者其中的部分步骤也可以省略或删除。该方法包括如下步骤:Please refer to Figure 21, which is a flow chart of a method for collaboratively presenting game objects provided by an embodiment of the present application. This method can be used to implement: when the second virtual position of the first VR scene includes a second object, the first The AR client presents the second object in the second real-world position of the first AR scene, and this method can also be used to present motion events of the second object. Among them, the second object is an object that is not controlled by the first VR client. For example, the second object can be a non-player controlled character, etc. It should be noted that this method is not limited to the specific order described in Figure 21 and below. It should be understood that in other embodiments, the order of some steps of the method can be exchanged with each other according to actual needs, or some of the steps can be It can also be omitted or deleted. The method includes the following steps:
S2101,第一VR客户端获取第二对象对应的第十二定位信息,第十二定位信息与第二对象在第一VR场景中的第二虚拟位置对应。S2101. The first VR client obtains twelfth positioning information corresponding to the second object. The twelfth positioning information corresponds to the second virtual position of the second object in the first VR scene.
其中,第十二定位信息可以用于指示第二对象在第四坐标系中的位置。Wherein, the twelfth positioning information may be used to indicate the position of the second object in the fourth coordinate system.
在一些实施方式中,第一VR客户端可以获取第二对象对应的第十旋转信息。其中,第十旋转信息可以用于指示第二对象在第四坐标系中的旋转角度。In some implementations, the first VR client may obtain the tenth rotation information corresponding to the second object. The tenth rotation information may be used to indicate the rotation angle of the second object in the fourth coordinate system.
在一些实施方式中,第二对象为不由第一VR客户端控制的虚拟对象,那么第二对象对应的第十二定位信息和第十旋转信息中的至少一个,可以由与第二对象对应的预设运动策略生成的。当然,在实际应用中,第一VR客户端也可以通过其他方式获取第二对象对应的第十二定位信息和第十旋转信息中的至少一个。In some embodiments, the second object is a virtual object that is not controlled by the first VR client, then at least one of the twelfth positioning information and the tenth rotation information corresponding to the second object can be generated by a preset motion strategy corresponding to the second object. Of course, in actual applications, the first VR client can also obtain at least one of the twelfth positioning information and the tenth rotation information corresponding to the second object by other means.
S2102. The first VR client converts the twelfth positioning information into thirteenth positioning information.
The thirteenth positioning information may be used to indicate the position of the second object in the second coordinate system.
In some implementations, the first VR client may convert the tenth rotation information into eleventh rotation information, where the eleventh rotation information may be used to indicate the rotation angle of the second object in the second coordinate system.
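The conversion in S2102 (and the analogous conversions in later steps) can be sketched as a rigid transform between two coordinate systems: a rotation about the vertical axis plus a translation. The transform parameters, function names, and example values below are illustrative assumptions; in a real deployment they would come from scene registration, which this application does not prescribe here.

```python
import math

def make_transform(yaw_deg, tx, ty, tz):
    """Rigid transform from a source coordinate system to a target one:
    a rotation about the vertical (y) axis followed by a translation.
    The yaw and translation values are illustrative."""
    return (math.radians(yaw_deg), tx, ty, tz)

def convert_position(t, pos):
    """Map a position (x, y, z) from the source system into the target system."""
    yaw, tx, ty, tz = t
    x, y, z = pos
    cx = math.cos(yaw) * x + math.sin(yaw) * z
    cz = -math.sin(yaw) * x + math.cos(yaw) * z
    return (cx + tx, y + ty, cz + tz)

def convert_yaw(t, obj_yaw_deg):
    """Rotation information converts by adding the yaw offset between systems."""
    return (obj_yaw_deg + math.degrees(t[0])) % 360.0

# Twelfth positioning info: position of the second object in the fourth
# (VR-scene) coordinate system; the result plays the role of the
# thirteenth positioning info in the second (shared) coordinate system.
fourth_to_second = make_transform(90.0, 10.0, 0.0, -5.0)
p_fourth = (1.0, 0.0, 2.0)
p_second = convert_position(fourth_to_second, p_fourth)
```

The AR-side conversion in S2105 would use the same machinery with a transform from the second coordinate system into the third one.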
S2103. The first VR client sends the thirteenth positioning information corresponding to the second object to the service server.
In some implementations, the first VR client may send the eleventh rotation information to the service server.
S2104. The service server sends the thirteenth positioning information corresponding to the second object to the first AR client.
In some implementations, the service server may send the eleventh rotation information to the first AR client.
S2105. The first AR client converts the thirteenth positioning information into fourteenth positioning information.
The fourteenth positioning information may be used to indicate the position of the second object in the third coordinate system.
In some implementations, the first AR client may convert the eleventh rotation information into twelfth rotation information, where the twelfth rotation information may be used to indicate the rotation angle of the second object in the third coordinate system.
S2106. The first AR client presents the second object at the second real position in the first AR scene indicated by the fourteenth positioning information.
In some implementations, the first AR client may present the second object at the second real position in the first AR scene based on the twelfth rotation information.
In some implementations, the first AR client may present the second object, based on the twelfth rotation information, at the second real position in the first AR scene indicated by the fourteenth positioning information.
S2107. The first VR client obtains new twelfth positioning information after the second object moves, where the new twelfth positioning information corresponds to the fourth virtual position of the moved second object in the first VR scene.
In some implementations, the first VR client obtains new tenth rotation information corresponding to the moved second object.
In some implementations, the second object is a virtual object that is not controlled by the first VR client, and the second object moves based on the preset motion strategy corresponding to the second object. Of course, in practical applications, the second object may also move based on other control methods; this embodiment of this application does not limit the cause of the movement of the second object.
S2108. The first VR client converts the new twelfth positioning information into new thirteenth positioning information.
In some implementations, the first VR client may convert the new tenth rotation information into new eleventh rotation information.
S2109. The first VR client sends the new thirteenth positioning information to the service server.
In some implementations, the first VR client may send the new eleventh rotation information to the service server.
S2110. The service server sends the new thirteenth positioning information to the first AR client.
In some implementations, the service server may send the new eleventh rotation information to the first AR client.
S2111. The first AR client converts the new thirteenth positioning information into new fourteenth positioning information.
In some implementations, the first AR client may further convert the new eleventh rotation information into new twelfth rotation information.
S2112. The first AR client presents the second object at the fourth real position in the first AR scene indicated by the new fourteenth positioning information.
In some implementations, the first AR client may present the second object at the fourth real position in the first AR scene based on the new twelfth rotation information.
In some implementations, the first AR client may present the second object, based on the new twelfth rotation information, at the fourth real position in the first AR scene indicated by the new fourteenth positioning information.
In this embodiment of this application, when the second object moves from the second virtual position in the first VR scene to the fourth virtual position in the first VR scene, the second object also moves from the second real position in the first AR scene to the fourth real position in the first AR scene. That is, the instantaneous position of the second object in the first VR client, as well as its position changes at different moments, are visible to the first AR client. It can be understood that the motion of the second object can be presented synchronously by performing at least some of the foregoing steps S2101 to S2112 multiple times.
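The repeated execution of S2101 to S2112 can be sketched as a per-tick pipeline: sample the virtual position, convert it to the shared coordinate system, hand it through the service server, convert it to the AR-scene coordinate system, and render. The function names, callable interfaces, and fixed coordinate offsets below are illustrative assumptions, not part of this application.

```python
from typing import Callable, List, Tuple

Vec3 = Tuple[float, float, float]

def sync_object_motion(
    sample_virtual_pos: Callable[[], Vec3],
    virtual_to_shared: Callable[[Vec3], Vec3],
    shared_to_real: Callable[[Vec3], Vec3],
    render_at: Callable[[Vec3], None],
    ticks: int,
) -> None:
    """Repeats the S2101-S2112 pipeline a fixed number of times. All
    callables are illustrative stand-ins for the VR client, the coordinate
    conversions, and the AR client's renderer."""
    for _ in range(ticks):
        p_virtual = sample_virtual_pos()         # twelfth positioning info
        p_shared = virtual_to_shared(p_virtual)  # thirteenth positioning info
        # ... the hop through the service server would happen here ...
        p_real = shared_to_real(p_shared)        # fourteenth positioning info
        render_at(p_real)

# Example: an NPC walking along x; the AR side sees every intermediate position.
positions: List[Vec3] = []
path = iter([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)])
sync_object_motion(
    sample_virtual_pos=lambda: next(path),
    virtual_to_shared=lambda p: (p[0] + 10.0, p[1], p[2]),  # assumed calibration offset
    shared_to_real=lambda p: (p[0] - 4.0, p[1], p[2]),      # assumed calibration offset
    render_at=positions.append,
    ticks=3,
)
```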
Referring to Figure 22, which is a flowchart of a method for collaboratively presenting game objects provided by an embodiment of this application. It can be understood that the following uses the first object in a game scene as an example to describe the manner in which the AR client and the VR client collaboratively present the first object and a status update event of the first object. When the first collaborative scene is another type of collaborative scene, or for objects other than the first object (including the second object), the AR client and the VR client may collaboratively present that object and its status update event in a similar or identical manner. It should be noted that the method is not limited to the specific order shown in Figure 22 and described below. It should be understood that in other embodiments, the order of some steps of the method may be exchanged according to actual needs, and some steps may also be omitted or deleted. The method includes the following steps:
S2201. The service server sends identity information corresponding to the first object to the first AR client and the first VR client.
In some implementations, the service server may determine, based on the positions of the first object and the first AR client in the second coordinate system and first range information corresponding to the first AR client, whether the first object is within the range indicated by the first range information; if so, the service server sends the identity information corresponding to the first object to the first AR client.
In some implementations, the service server may determine, based on the positions of the first object and the first VR client in the second coordinate system and second range information corresponding to the first VR client, whether the first object is within the range indicated by the second range information; if so, the service server sends the identity information corresponding to the first object to the first VR client.
Of course, in practical applications, the service server may also determine, in other ways, whether to send the identity information of the first object to the first AR client and/or the first VR client.
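The range checks in the two implementations above can be sketched as follows, under the assumption (for illustration only) that a client's range information is a radius around the client's position in the second coordinate system; the function names are hypothetical.

```python
import math

def within_range(obj_pos, client_pos, range_radius):
    """Is the first object inside the range indicated by a client's range
    information? Here the range information is assumed to be a radius
    around the client in the second coordinate system."""
    dx = obj_pos[0] - client_pos[0]
    dy = obj_pos[1] - client_pos[1]
    dz = obj_pos[2] - client_pos[2]
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= range_radius

def recipients(obj_pos, clients):
    """Return the clients that should receive the first object's identity
    information. `clients` maps client id -> (position, range_radius)."""
    return [cid for cid, (pos, r) in clients.items()
            if within_range(obj_pos, pos, r)]
```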
In some implementations, the identity information of the first object may include at least one of a user ID, a name, and a role ID corresponding to the first object. Of course, in practical applications, the identity information of the first object may also include more or less information used to indicate the identity of the first object.
S2202. The first AR client and the first VR client obtain resource data corresponding to the first object based on the identity information corresponding to the first object.
The resource data corresponding to the first object may be used to present the first object. In some implementations, the resource data may include an avatar corresponding to the first object. In some implementations, the resource data may include props, assets, status, and the like corresponding to the first object. Of course, in practical applications, the resource data may also include more or less data used to present the first object.
In some implementations, both the first AR client and the first VR client locally store at least part of the resource data corresponding to the first object; in that case, the first AR client may obtain the at least part of the resource data locally, and the first VR client may likewise obtain the at least part of the resource data locally.
In some implementations, a resource server stores at least part of the resource data corresponding to the first object; in that case, the first AR client and the first VR client may obtain the at least part of the resource data from the resource server.
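The two storage cases above suggest a simple lookup order: prefer resource data stored locally on the client, and fall back to the resource server otherwise. The cache shape, function names, and fetch interface below are illustrative assumptions, not part of this application.

```python
from typing import Callable, Dict, Optional

def get_resource_data(
    identity: str,
    local_cache: Dict[str, dict],
    fetch_from_resource_server: Callable[[str], Optional[dict]],
) -> Optional[dict]:
    """Obtain resource data for an object identified by its identity
    information: try the client's local store first, then the resource
    server; cache server results for subsequent presentations."""
    data = local_cache.get(identity)
    if data is not None:
        return data
    data = fetch_from_resource_server(identity)
    if data is not None:
        local_cache[identity] = data
    return data
```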
S2203. The first AR client and the first VR client present the first object based on the resource data corresponding to the first object.
In some implementations, the first AR client and the first VR client may render and display the avatar corresponding to the first object, thereby presenting the first object. In some implementations, the first object is a real object in the first AR scene; what the first AR client presents is the first object captured by its camera, or the user of the first AR client can see the first object directly, so the first AR client may not present the avatar corresponding to the first object.
In some implementations, the first AR client and the first VR client may display the identity information corresponding to the first object.
In some implementations, the first AR client and the first VR client may display the status of the first object, such as health value, mana value, stamina value, endurance value, remaining props, and the like.
S2204. When a status update event of the first object is triggered, the service server notifies the first AR client and the first VR client of the status update event of the first object.
In some implementations, the status update event of the first object may be an event triggered in the first AR scene. For example, the first object is the real user corresponding to the first AR client, and the real user obtains a prop in the first AR scene, that is, the remaining props of the first object change; or, the first object is a treasure chest in the first AR scene, the real user corresponding to the first AR client opens the treasure chest, and the treasure chest may change from an unopened state to an opened state.
In some implementations, the status update event of the first object may be an event triggered in the first VR scene. For example, the first object is a treasure chest, and the player-controlled character corresponding to the first VR client opens the treasure chest in the first VR scene.
S2205. The first AR client and the first VR client update the status of the first object, that is, present the status update event of the first object.
The first AR client may update the first object presented in the first AR scene, and the first VR client may update the first object presented in the first VR scene.
In some implementations, the first object is a real object; the first AR client can capture the status change of the first object through its camera, or the user of the first AR client can see the status change of the first object directly, so the first AR client may not update the status of the first object.
In this embodiment of this application, the first AR client and the first VR client can obtain the identity information corresponding to the first object and, based on that identity information, obtain the resource data corresponding to the first object. The first AR client can present the first object in the first AR scene, and the first VR client can present the first object in the first VR scene, so that both the user corresponding to the first AR client and the user of the first VR client can see the image and status of the first object. Moreover, when a status update event of the first object is triggered, the first AR client can update the status of the first object in the first AR scene, and the first VR client can update the status of the first object in the first VR scene, so that both users can see the status change of the first object, which improves the accuracy of collaboratively presenting the first object.
It can be understood that the service server may send identity information corresponding to the second object to the first AR client and the first VR client. The first AR client and the first VR client may obtain resource data corresponding to the second object based on that identity information and present the second object based on that resource data, so that both the user corresponding to the first AR client and the user of the first VR client can see the image and status of the second object. When a status update event of the second object is triggered, the service server may notify the first AR client and the first VR client of the status update event of the second object, and the first AR client and the first VR client may present the status update event, thereby updating the status of the second object, so that both users can see the status change of the second object, which improves the accuracy of collaboratively presenting the second object.
For example, the first AR scene presented by the first AR client is shown in Figure 23, and the first VR scene presented by the first VR client is shown in Figure 24. The first object is a real user of the first AR client and appears as an avatar in the first VR scene; the second object is the player-controlled character corresponding to the first VR client and appears as an avatar in both the first AR scene and the first VR scene.
As can be seen from Figure 23, the upper left corner of the first AR scene further includes identity information such as avatar A 1603 and character name A 1604 corresponding to the first object 1602, and status information such as health value A 1605 corresponding to the first object 1602. Above the head of the second object 1601 in the first AR scene are identity information such as avatar B 1606 and character name B 1607 corresponding to the second object 1601, and status information such as health value B 1608 corresponding to the second object 1601. As can be seen from Figure 24, the upper left corner of the first VR scene further includes identity information such as avatar B 1606 and character name B 1607 corresponding to the second object 1601, and status information such as health value B 1608 corresponding to the second object 1601. Above the head of the first object 1602 in the first VR scene are identity information such as avatar A 1603 and character name A 1604 corresponding to the first object 1602, and status information such as health value A 1605 corresponding to the first object 1602. Moreover, status information such as health value A 1605 corresponding to the first object 1602 is consistent between the first AR scene shown in Figure 23 and the first VR scene shown in Figure 24, and status information such as health value B 1608 corresponding to the second object 1601 is likewise consistent between the first AR scene shown in Figure 23 and the first VR scene shown in Figure 24.
Refer to the first AR scene before the status update on the left side of Figure 23 and the first VR scene before the status update on the left side of Figure 24. Health value A 1605 corresponding to the first object 1602 is in a depleted state, and health value B 1608 corresponding to the second object 1601 is in a full state.
Continue to refer to the first AR scene after the status update on the right side of Figure 23 and the first VR scene after the status update on the right side of Figure 24. When the first object 1602 uses a health restoration prop in the first AR scene, both the first AR scene and the first VR scene can update health value A 1605 of the first object 1602; in the updated scenes on the right side of Figure 23 and Figure 24, health value A 1605 of the first object 1602 is updated to a full state. When the second object 1601 triggers a health reduction prop in the first VR scene, both the first AR scene and the first VR scene can update health value B 1608 of the second object 1601; in the updated scenes on the right side of Figure 23 and Figure 24, health value B 1608 of the second object 1601 is updated to a depleted state.
It should be noted that this embodiment of this application uses Figure 23 and Figure 24 only to illustrate possible ways in which the first AR scene and the first VR scene may present the first object and the second object, the status update event of the first object, and the status update event of the second object, and does not limit the manner of presenting the first object and the second object, the status update event of the first object, or the status update event of the second object.
Referring to Figure 25, which is a flowchart of an object interaction method provided by an embodiment of this application. It can be understood that the following uses the first object shooting at the second object in a game scene as an example to describe the manner in which the first object and the second object interact. It should be noted that the method is not limited to the specific order shown in Figure 25 and described below. It should be understood that in other embodiments, the order of some steps of the method may be exchanged according to actual needs, and some steps may also be omitted or deleted. The method includes the following steps:
S2501. In response to an attack instruction of the first object, the first AR client sends bullet information corresponding to the first object to the service server.
In some implementations, the first object is an object controlled by the user corresponding to the first AR client, and the first AR client can receive the attack instruction submitted by the user.
In some implementations, the first object may not be an object controlled by that user, and the first AR client may obtain the attack instruction from a preset interaction strategy corresponding to the first object. Of course, in practical applications, the first AR client may also obtain the attack instruction of the first object in other ways.
In some implementations, the attack instruction of the first object may be used to indicate the bullet information corresponding to the first object.
In some implementations, the bullet information may include a bullet ID, the position of the bullet in the second coordinate system, and the firing direction of the bullet in the second coordinate system. Of course, in practical applications, the bullet information may also include more or less information, for example, one or more of the bullet type, the bullet's flight speed, the bullet's flight distance, and the like.
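The bullet information enumerated above can be sketched as a small record; the optional fields mirror the "more information" the text mentions, and the field names and default values are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class BulletInfo:
    """Bullet information: a bullet ID plus the position and firing
    direction in the second (shared) coordinate system, with optional
    extras (type, speed, range) whose defaults here are illustrative."""
    bullet_id: str
    position: Tuple[float, float, float]
    direction: Tuple[float, float, float]
    bullet_type: str = "default"
    speed: float = 50.0       # assumed units: metres per second
    max_range: float = 100.0  # assumed units: metres
```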
S2502. The service server sends the bullet information corresponding to the first object to the first VR client.
In some implementations, the service server may send the bullet information corresponding to the first object to the first VR client by broadcasting.
In some implementations, the service server may determine, based on second positioning information corresponding to the first AR client (that is, the position of the first AR client in the second coordinate system) and fifth positioning information corresponding to the first VR client (that is, the position of the first VR client in the second coordinate system), whether the surroundings of the first AR client include the first VR client; if so, the service server sends the bullet information corresponding to the first object to the first VR client.
In some implementations, the service server may divide the game scene into grid cells based on the second coordinate system and determine whether the position of the first VR client in the second coordinate system falls within the nine-square grid (the 3×3 block of cells) centered on the cell containing the position of the first AR client in the second coordinate system, that is, whether the surroundings of the first AR client include the first VR client. When the surroundings of the first AR client include the first VR client, the service server may send the bullet information corresponding to the first object to the first VR client.
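The nine-square-grid test above can be sketched as follows: map each position in the second coordinate system to a grid cell, then check whether the two cells differ by at most one cell in each horizontal direction. The cell size and function names are illustrative assumptions.

```python
import math

CELL_SIZE = 10.0  # metres per grid cell; an assumed tuning parameter

def grid_cell(pos):
    """Map a position in the second coordinate system to its (x, z) grid cell."""
    return (math.floor(pos[0] / CELL_SIZE), math.floor(pos[2] / CELL_SIZE))

def in_nine_grid(center_pos, other_pos):
    """True if `other_pos` lies in the 3x3 block of cells centered on the
    cell containing `center_pos` -- the nine-square-grid test above."""
    cx, cz = grid_cell(center_pos)
    ox, oz = grid_cell(other_pos)
    return abs(ox - cx) <= 1 and abs(oz - cz) <= 1
```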
S2503. The first VR client presents the bullet fired by the first object in the first VR scene based on the bullet information corresponding to the first object.
In some implementations, the first VR client may localize the bullet information corresponding to the first object, including converting the position of the bullet in the second coordinate system into the position of the bullet in the fourth coordinate system, and converting the firing direction of the bullet in the second coordinate system into the firing direction of the bullet in the fourth coordinate system. The first VR client then presents the bullet fired by the first object in the first VR scene based on the bullet ID, the position of the bullet in the fourth coordinate system, and the firing direction of the bullet in the fourth coordinate system.
In some implementations, the first VR client may instantiate the bullet fired by the first object and simulate the bullet's trajectory. In some implementations, the first VR client may present auditory and/or visual effects of the first object's shooting.
In some implementations, the first AR client may also present the bullet fired by the first object in the first AR scene, based on the bullet information corresponding to the first object, in the same or a similar manner as the first VR client does in S2503.
S2504. When the bullet fired by the first object hits the second object, the first VR client sends the identity information of the second object to the service server.
In some implementations, if the second object is in the trajectory of the bullet fired by the first object, the bullet has hit the second object.
In some implementations, the first VR client may use a collision detection algorithm to detect whether the bullet fired by the first object hits the second object. Of course, in practical applications, the first VR client may also detect, in other ways, whether the bullet fired by the first object hits the second object.
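One possible collision detection algorithm for S2504, sketched under the assumption that the bullet path is treated as a ray segment and the second object as a bounding sphere; real engines would typically use a physics engine's ray cast, and all names and parameters below are hypothetical.

```python
import math

def bullet_hits_target(origin, direction, target_center, target_radius, max_range):
    """Report a hit when the closest approach of the bullet's ray segment
    to the target's bounding-sphere center is within the sphere radius."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    # Normalise the firing direction.
    n = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / n, dy / n, dz / n
    # Project the vector to the target onto the ray to find the closest point.
    tx = target_center[0] - ox
    ty = target_center[1] - oy
    tz = target_center[2] - oz
    t = tx * dx + ty * dy + tz * dz
    if t < 0 or t > max_range:
        return False  # target is behind the muzzle or beyond bullet range
    cx, cy, cz = ox + t * dx, oy + t * dy, oz + t * dz
    dist = math.sqrt((target_center[0] - cx) ** 2
                     + (target_center[1] - cy) ** 2
                     + (target_center[2] - cz) ** 2)
    return dist <= target_radius
```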
It can be understood that, in practical applications, the service server or the first AR client may instead determine whether the bullet fired by the first object hits the second object.
S2505. The service server obtains health value change information corresponding to the second object based on the shooting result of the first object.
In some implementations, the health value change information corresponding to the second object may include the amount of the health value change and/or the health value after the change.
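Applying the health value change information described above could look like the following sketch, which accepts either a delta or an absolute value; the clamping to [0, max] and the function name are illustrative assumptions.

```python
def apply_health_change(current_hp, max_hp, change=None, new_value=None):
    """Apply health value change information: the server may send a delta
    (`change`), an absolute value (`new_value`), or both; an absolute
    value takes precedence. The result is clamped to [0, max_hp]."""
    if new_value is not None:
        hp = new_value
    elif change is not None:
        hp = current_hp + change
    else:
        hp = current_hp
    return max(0, min(max_hp, hp))
```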
S2506. The service server sends the health value change information corresponding to the second object to the first AR client and the first VR client.
In some implementations, the service server may determine, based on the second positioning information corresponding to the first AR client (that is, the position of the first AR client in the second coordinate system) and the fifth positioning information corresponding to the first VR client (that is, the position of the first VR client in the second coordinate system), whether the surroundings of the first VR client include the first AR client; if so, the service server sends the health value change information corresponding to the second object to the first AR client.
S2507. The first AR client and the first VR client update the health value of the second object based on the health value change information corresponding to the second object.
The first AR client may present, in the first AR scene, the result after the health value corresponding to the second object is reduced, and may also present auditory and/or visual effects of the health value reduction. The first VR client may likewise present, in the first VR scene, the result after the health value corresponding to the second object is reduced, and may also present auditory and/or visual effects of the health value reduction.
It can be understood that, when the service server obtains the bullet information corresponding to the first object, it may send that bullet information to further clients in a manner similar or identical to the manner in which it sends the bullet information to the first VR client, so as to detect whether the first object hits each of those further clients. Likewise, when the service server obtains the health value change information corresponding to the second object, it may send that health value change information to further clients in a manner similar or identical to the manner in which it sends it to the first AR client, so that those further clients can present the state change event in which the health value of the second object changes.
In some implementations, in response to an attack instruction of the second object, the first VR client may send bullet information corresponding to the second object to the service server. The service server sends the bullet information corresponding to the second object to the first AR client. Based on that bullet information, the first AR client presents, in the first AR scene, the bullet fired by the second object. When the bullet fired by the second object hits the first object, the first AR client sends the identity information of the first object to the service server. The service server obtains, based on the shooting result of the second object, the health value change information corresponding to the first object, and sends it to the first AR client and the first VR client. The first AR client and the first VR client then update the health value of the first object based on that health value change information.
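The server-side round trip described in the last two paragraphs can be sketched as below. All names (`BusinessServer`, `on_shot_fired`, the event dictionaries) are illustrative assumptions; the patent does not prescribe these data structures. The server relays a fired bullet to every client, and when a client reports a hit, it computes and broadcasts the resulting health change.

```python
class BusinessServer:
    def __init__(self, initial_health):
        self.health = dict(initial_health)  # object id -> current health value
        self.clients = {}                   # client id -> delivered event inbox

    def register(self, client_id):
        self.clients[client_id] = []

    def broadcast(self, event):
        # Push the event to every connected client (AR or VR alike).
        for inbox in self.clients.values():
            inbox.append(event)

    def on_shot_fired(self, shooter_id, bullet_info):
        # Relay the bullet so each client can render it and test for a local hit.
        self.broadcast({"type": "bullet", "from": shooter_id, "info": bullet_info})

    def on_hit_reported(self, victim_id, damage):
        # A client reported a hit on `victim_id`: compute the health change
        # and broadcast it so all clients update the victim's health value.
        self.health[victim_id] -= damage
        self.broadcast({"type": "health_change", "object": victim_id,
                        "delta": -damage, "value": self.health[victim_id]})
```

Each client, on receiving a `health_change` event, would update its local presentation of the named object as in step S2507.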
The above embodiments describe, for the case where the first AR client and the first VR client access the same game scene, the manner in which the first VR client presents the first object at the first virtual position of the first VR scene when the first real position of the first AR scene includes the first object, and the manner in which the first AR client presents the second object at the second real position of the first AR scene when the second virtual position of the first VR scene includes the second object. It can be understood, however, that in practical applications the clients accessing the game scene may further include more AR clients and/or more VR clients.
In other implementations of any of the above embodiments, the first AR client, the first VR client, and a second AR client access the same game scene. When a fifth real position in a second AR scene presented by the second AR device includes a third object, the first AR client may present the third object at the fifth real position in the first AR scene, and the first VR client may present the third object at a fifth virtual position in the first VR scene, where the fifth real position and the fifth virtual position are the same position in the game scene.
In other implementations of any of the above embodiments, when a third event corresponding to the third object occurs in the second AR scene, the first VR client may present the third event in the first VR scene, and the first AR client may present the third event in the first AR scene.
In other implementations of any of the above embodiments, the third event may include at least one of a status update event of the third object, a motion event of the third object, and an interaction event between the third object and other objects.
In other implementations of any of the above embodiments, the first AR client, the first VR client, and a second VR client access the same game scene. When a sixth virtual position in a second VR scene presented by the second VR device includes a fourth object, the first AR client may present the fourth object at a sixth real position in the first AR scene, and the first VR client may present the fourth object at the sixth virtual position in the first VR scene, where the sixth real position and the sixth virtual position are the same position in the game scene.
In other implementations of any of the above embodiments, when a fourth event corresponding to the fourth object occurs in the second VR scene, the first VR client may present the fourth event in the first VR scene, and the first AR client may present the fourth event in the first AR scene.
In other implementations of any of the above embodiments, the fourth event may include at least one of a status update event of the fourth object, a motion event of the fourth object, and an interaction event between the fourth object and other objects.
That is, the first AR client can present, in the first AR scene, the second object from the first VR scene and the third object from the second AR scene, and the first VR client can present, in the first VR scene, the first object from the first AR client and the fourth object from the second VR scene. This achieves collaborative presentation of the same game scene by multiple clients of multiple types, so that multiple users can see, interact with, and play with one another in the same game scene through clients of different types.
For example, the first AR scene shown in FIG. 26 further includes a third object 1603 from the second AR scene, where the third object 1603 is the real user corresponding to the second AR client. The first VR scene shown in FIG. 27 also includes the third object 1603 from the second AR scene, where the third object 1603 is a virtual avatar corresponding to that real user.
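The "same position in the game scene" shared across heterogeneous clients rests on each client converting between its own local coordinate system and the public (second) coordinate system. A minimal sketch of that conversion follows; it assumes, for simplicity, that each local frame differs from the public frame only by a translation offset, whereas a real system would typically apply a full rigid transform (rotation plus translation).

```python
def local_to_public(local_pos, origin_offset):
    """Convert a position in a client's local coordinate system to the public
    coordinate system, given the local origin expressed in public coordinates."""
    return tuple(l + o for l, o in zip(local_pos, origin_offset))

def public_to_local(public_pos, origin_offset):
    """Inverse conversion: a public-coordinate position into a client's local frame."""
    return tuple(p - o for p, o in zip(public_pos, origin_offset))
```

A position reported by one client can thus be lifted into the public frame, sent via the service server, and projected into any other client's local frame, so AR and VR clients render the object at the same scene position.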
Based on the same inventive concept, an embodiment of this application further provides a client. The client includes a memory and a processor; the memory is configured to store a computer program, and the processor is configured to perform the method described in the above method embodiments when invoking the computer program.
The client provided in this embodiment can perform the above method embodiments; its implementation principles and technical effects are similar and are not repeated here.
Based on the same inventive concept, an embodiment of this application further provides a chip system. The chip system includes a processor coupled to a memory, and the processor executes a computer program stored in the memory to implement the method described in the above method embodiments.
The chip system may be a single chip or a chip module composed of multiple chips.
An embodiment of this application further provides a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, the method described in the above method embodiments is implemented.
An embodiment of this application further provides a computer program product. When the computer program product runs on a client, the client is caused to implement, when executing it, the method described in the above method embodiments.
If the above integrated units are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the above embodiments of this application may be implemented by a computer program instructing the relevant hardware. The computer program may be stored in a computer-readable storage medium, and when executed by a processor, the steps of each of the above method embodiments can be implemented. The computer program includes computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable storage medium may include at least: any entity or apparatus capable of carrying the computer program code to the photographing apparatus/client, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disc.
In the above embodiments, each embodiment is described with its own emphasis. For parts that are not detailed or recorded in one embodiment, reference may be made to the relevant descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in conjunction with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or in software depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of this application.
In the embodiments provided in this application, it should be understood that the disclosed apparatus/device and method may be implemented in other ways. For example, the apparatus/device embodiments described above are merely illustrative; the division into modules or units is only a division by logical function, and there may be other division manners in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Furthermore, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
It should be understood that, when used in the specification of this application and the appended claims, the term "comprising" indicates the presence of the described features, integers, steps, operations, elements, and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or combinations thereof.
It should also be understood that the term "and/or" as used in the specification of this application and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes these combinations.
As used in the specification of this application and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, as meaning "once it is determined", "in response to determining", "once [the described condition or event] is detected", or "in response to detecting [the described condition or event]".
In addition, in the description of the specification of this application and the appended claims, the terms "first", "second", "third", and so on are used only to distinguish between descriptions, and shall not be understood as indicating or implying relative importance.
Reference in the specification of this application to "one embodiment", "some embodiments", or the like means that a particular feature, structure, or characteristic described in connection with that embodiment is included in one or more embodiments of this application. Accordingly, the phrases "in one embodiment", "in some embodiments", "in some other embodiments", "in further embodiments", and the like appearing in different places in this specification do not necessarily all refer to the same embodiment, but rather mean "one or more but not all embodiments", unless otherwise specifically emphasized. The terms "include", "comprise", "have", and their variants all mean "including but not limited to", unless otherwise specifically emphasized.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of this application.

Claims (26)

  1. A multi-device collaboration method, characterized in that the method comprises:
    a first AR client presents a first AR scene, wherein the first AR scene is a first collaborative scene in AR mode;
    a first VR client presents a first VR scene, wherein the first VR scene is the first collaborative scene in VR mode;
    when a first real position of the first AR scene includes a first object, the first VR client presents the first object at a first virtual position of the first VR scene, wherein the first real position and the first virtual position are the same position in the first collaborative scene; and/or, when a second virtual position of the first VR scene includes a second object, the first AR client presents the second object at a second real position of the first AR scene, wherein the second virtual position and the second real position are the same position in the first collaborative scene.
  2. The method according to claim 1, characterized in that the method further comprises any one or more of the following:
    when a first event corresponding to the first object occurs in the first AR scene, the first VR client presents the first event in the first VR scene; or,
    when a second event corresponding to the second object occurs in the first VR scene, the first AR client presents the second event in the first AR scene.
  3. The method according to claim 2, characterized in that the first event includes any one or more of the following:
    a status update event of the first object; or,
    a motion event of the first object; or,
    an interaction event between the first object and other objects.
  4. The method according to claim 3, characterized in that the motion event of the first object includes the first object moving from the first real position in the first AR scene to a third real position, and the first VR client presenting the first event in the first VR scene comprises:
    the first VR client presents the movement process of the first object moving from the first virtual position in the first VR scene to a third virtual position, wherein the third virtual position and the third real position are the same position in the first collaborative scene.
  5. The method according to any one of claims 2-4, characterized in that the second event includes any one or more of the following:
    a status update event of the second object; or,
    a motion event of the second object; or,
    an interaction event between the second object and other objects.
  6. The method according to claim 5, characterized in that the motion event of the second object includes the second object moving from the second virtual position in the first VR scene to a fourth virtual position, and the first AR client presenting the second event in the first AR scene comprises:
    the first AR client presents the movement process of the second object moving from the second real position in the first AR scene to a fourth real position, wherein the fourth real position and the fourth virtual position are the same position in the first collaborative scene.
  7. The method according to any one of claims 1-6, characterized in that the first object is a real object or an object combining the virtual and the real, and the first VR client presenting the first object at the first virtual position of the first VR scene comprises:
    the first VR client presents, at the first virtual position of the first VR scene, a virtual avatar corresponding to the first object.
  8. The method according to any one of claims 1-7, characterized in that the second object is a virtual object, and the first AR client presenting the second object at the second real position of the first AR scene comprises:
    the first AR client presents, at the second real position of the first AR scene, the second object or an image of the second object combining the virtual and the real.
  9. The method according to any one of claims 1-8, characterized in that the first object is a user of the first AR client, and/or the second object is a player-controlled character controlled by the first VR client.
  10. The method according to any one of claims 1-9, characterized in that the first AR scene includes at least part of a real scene, and the first VR scene includes a virtual scene obtained by processing the at least part of the real scene.
  11. The method according to any one of claims 1-10, characterized in that, before the first AR client presents the first AR scene, the first AR scene being the first collaborative scene in AR mode, the method further comprises:
    the first AR client accesses the first collaborative scene based on a real position of the first AR client in a first coordinate system, and the first VR client accesses the first collaborative scene based on a real position of the first VR client in the first coordinate system, wherein the real position of the first AR client in the first coordinate system and the real position of the first VR client in the first coordinate system are located in the same area of interest (AOI) region; or,
    the first AR client accesses the first collaborative scene based on a real position of the first AR client in a first coordinate system, and the first VR client accesses the first collaborative scene based on a virtual position of the first VR client in the first coordinate system, wherein the real position of the first AR client in the first coordinate system and the virtual position of the first VR client in the first coordinate system are located in the same AOI region.
  12. The method according to any one of claims 1-11, characterized in that the method further comprises:
    the first AR client presents a third object in the first AR scene, and the first VR client presents a virtual avatar corresponding to the third object in the first VR scene, wherein the third object is an object included in a second AR scene presented by a second AR client, and the third object is a real object or an object combining the virtual and the real.
  13. The method according to any one of claims 1-12, characterized in that the method further comprises:
    the first AR client presents a fourth object in the first AR scene, and the first VR client presents the fourth object in the first VR scene, wherein the fourth object is an object included in a second VR scene presented by a second VR client, and the fourth object is a virtual object.
  14. The method according to any one of claims 1-13, characterized in that the first AR client presenting the second object at the second real position of the first AR scene comprises:
    the first AR client obtains a position of the second object in a second coordinate system, the second coordinate system being a public coordinate system;
    the first AR client converts the position of the second object in the second coordinate system into the second real position of the second object in a third coordinate system, the third coordinate system being a local coordinate system corresponding to the first AR client.
  15. The method according to claim 14, characterized in that the method further comprises:
    the first VR client converts the second virtual position of the second object in a fourth coordinate system into the position of the second object in the second coordinate system, the fourth coordinate system being a local coordinate system corresponding to the first VR client;
    the first VR client sends the position of the second object in the second coordinate system to a service server;
    the first AR client obtaining the position of the second object in the second coordinate system comprises:
    the first AR client obtains the position of the second object in the second coordinate system from the service server.
  16. The method according to any one of claims 1-15, characterized in that the first VR client presenting the first object at the first virtual position of the first VR scene comprises:
    the first VR client obtains a position of the first object in a second coordinate system, the second coordinate system being a public coordinate system;
    the first VR client converts the position of the first object in the second coordinate system into the first virtual position of the first object in a fourth coordinate system, the fourth coordinate system being a local coordinate system corresponding to the first VR client.
  17. The method according to claim 16, characterized in that the method further comprises:
    the first AR client converts the first real position of the first object in a third coordinate system into the position of the first object in the second coordinate system, the third coordinate system being a local coordinate system corresponding to the first AR client;
    the first AR client sends the position of the first object in the second coordinate system to a service server;
    the first VR client obtaining the position of the first object in the second coordinate system comprises:
    the first VR client obtains the position of the first object in the second coordinate system from the service server.
  18. A multi-device collaboration method, characterized in that it is applied to a first AR device, and the method comprises:
    presenting a first AR scene, wherein the first AR scene is a first collaborative scene in AR mode;
    when a second virtual position of a first VR scene includes a second object, presenting the second object at a second real position of the first AR scene, wherein the second virtual position and the second real position are the same position in the first collaborative scene;
    wherein the first VR scene is a scene presented by a first VR client, and the first VR scene is the first collaborative scene in VR mode.
  19. The method according to claim 18, characterized in that the method further comprises:
    when a second event corresponding to the second object occurs in the first VR scene, the first AR client presents the second event in the first AR scene.
  20. The method according to claim 19, characterized in that the second event includes any one or more of the following:
    a status update event of the second object; or,
    a motion event of the second object; or,
    an interaction event between the second object and other objects.
  21. A multi-device collaboration method, characterized in that it is applied to a first VR device, and the method comprises:
    presenting a first VR scene, wherein the first VR scene is a first collaborative scene in VR mode;
    when a first real position of a first AR scene includes a first object, presenting the first object at a first virtual position of the first VR scene, wherein the first real position and the first virtual position are the same position in the first collaborative scene;
    wherein the first AR scene is a scene presented by a first AR client, and the first AR scene is the first collaborative scene in AR mode.
  22. The method according to claim 21, characterized in that the method further comprises:
    when a first event corresponding to the first object occurs in the first AR scene, presenting the first event in the first VR scene.
  23. The method according to claim 22, characterized in that the first event includes any one or more of the following:
    a status update event of the first object; or,
    a motion event of the first object; or,
    an interaction event between the first object and other objects.
  24. A client, characterized in that it comprises a memory and a processor, wherein the memory is configured to store a computer program, and the processor is configured to, when invoking the computer program, perform the method according to any one of claims 18-20 or the method according to any one of claims 21-23.
  25. A computer-readable storage medium on which a computer program is stored, characterized in that, when the computer program is executed by a processor, the method according to any one of claims 18-20 or the method according to any one of claims 21-23 is implemented.
  26. A computer program product, characterized in that, when the computer program product runs on an electronic device, the electronic device is caused to perform the method according to any one of claims 18-20 or the method according to any one of claims 21-23.
PCT/CN2023/106607 2022-09-23 2023-07-10 Multi-device collaboration method, and client WO2024060799A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211168295.4A CN115581913A (en) 2022-09-23 2022-09-23 Multi-device cooperation method and client
CN202211168295.4 2022-09-23

Publications (1)

Publication Number Publication Date
WO2024060799A1 true WO2024060799A1 (en) 2024-03-28

Family

ID=84779033

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/106607 WO2024060799A1 (en) 2022-09-23 2023-07-10 Multi-device collaboration method, and client

Country Status (2)

Country Link
CN (1) CN115581913A (en)
WO (1) WO2024060799A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115581913A (en) * 2022-09-23 2023-01-10 华为技术有限公司 Multi-device cooperation method and client

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019199569A1 (en) * 2018-04-09 2019-10-17 Spatial Inc. Augmented reality computing environments
US20200128106A1 (en) * 2018-05-07 2020-04-23 EolianVR, Inc. Device and content agnostic, interactive, collaborative, synchronized mixed reality system and method
CN111199561A (en) * 2020-01-14 2020-05-26 上海曼恒数字技术股份有限公司 Multi-person cooperative positioning method and system for virtual reality equipment
KR20220096381A (en) * 2020-12-31 2022-07-07 한국전자기술연구원 Method for Creating a Cooperating VR-AR Space with Synchronizing Objects and Interactions
CN115581913A (en) * 2022-09-23 2023-01-10 华为技术有限公司 Multi-device cooperation method and client


Also Published As

Publication number Publication date
CN115581913A (en) 2023-01-10

Similar Documents

Publication Publication Date Title
US10609334B2 (en) Group video communication method and network device
US11571620B2 (en) Using HMD camera touch button to render images of a user captured during game play
EP3595789B1 (en) Virtual reality system using an actor and director model
US20180225880A1 (en) Method and Apparatus for Providing Hybrid Reality Environment
CN111462307B (en) Virtual image display method, device, equipment and storage medium of virtual object
JP7145976B2 (en) Virtual object information display method and its application program, device, terminal and server
CN111835531B (en) Session processing method, device, computer equipment and storage medium
CN107683449A (en) The personal space content that control is presented via head mounted display
CN110365666A (en) Multiterminal fusion collaboration command system of the military field based on augmented reality
US20230141166A1 (en) Data Sharing Method and Device
WO2024060799A1 (en) Multi-device collaboration method, and client
CN111603771A (en) Animation generation method, device, equipment and medium
WO2022267729A1 (en) Virtual scene-based interaction method and apparatus, device, medium, and program product
US11943282B2 (en) System for providing synchronized sharing of augmented reality content in real time across multiple devices
CN113457173B (en) Remote teaching method, remote teaching device, computer equipment and storage medium
CN113599819A (en) Prompt message display method, device, equipment and storage medium
US20240015260A1 (en) Dynamically switching between rgb and ir capture
CN113599810B (en) Virtual object-based display control method, device, equipment and medium
CN112156463A (en) Role display method, device, equipment and medium
US12001750B2 (en) Location-based shared augmented reality experience system
US20230342100A1 (en) Location-based shared augmented reality experience system
CN107132656A (en) A kind of helmet and its data processing method
US11998833B2 (en) Generating collectible items based on location information
US20230076281A1 (en) Generating collectible items based on location information
US20230177788A1 (en) 3d models for augmented reality (ar)

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23867094

Country of ref document: EP

Kind code of ref document: A1