WO2022193612A1 - Method and apparatus for motion processing of a game character, storage medium, and computer device

Method and apparatus for motion processing of a game character, storage medium, and computer device

Info

Publication number: WO2022193612A1
Authority: WO (WIPO/PCT)
Prior art keywords: game, motion, data, moving image, game character
Application number: PCT/CN2021/121092
Other languages: English (en), Chinese (zh)
Inventors: 曾浩强, 林栋国, 余婉, 陈星雨
Original Assignee: 天津亚克互动科技有限公司
Application filed by 天津亚克互动科技有限公司
Publication of WO2022193612A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation
    • G06T 13/20 3D [Three Dimensional] animation
    • G06T 13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/005 General purpose rendering architectures

Definitions

  • the present invention relates to the field of computer technology, and in particular, to a method and device for motion processing of game characters, a storage medium, and a computer device.
  • AR: Augmented Reality
  • In AR gameplay, players can walk around in the real world with a mobile device and drive the virtual game character to walk a corresponding distance in the corresponding direction in the game world.
  • However, the existing method flattens the virtual scene, leaves no obstacles in it, and prohibits the character from jumping up and down; otherwise the virtual character would clip through the virtual scene and produce visible glitches. As a result, AR gameplay cannot be applied to all game scenarios, and scenes must be specially created for AR gameplay.
  • In view of this, the present invention provides a motion processing method and device, storage medium, and computer equipment for a game character. It requires no special game scene model to be built for the AR mode and, at the same time, avoids the clipping caused when the game character bluntly follows the player's moving path in the real world. This not only improves the playability of the game and adds gameplay, but also preserves the display effect of the game.
  • According to one aspect, a motion processing method for a game character comprises: in response to an AR mode interactive operation request in a game client, invoking a game engine in the game client to render a real-time 3D game virtual scene; and, according to the real-time real displacement data of the game player in the real world, driving the game character controlled by the game player to perform non-interspersed movement in the 3D game virtual scene.
  • According to another aspect, a motion processing device for a game character comprises: a request response module for invoking a game engine in the game client to render a real-time 3D game virtual scene in response to an AR mode interactive operation request in the game client; and a motion driving module for driving the game character controlled by the game player to perform non-interspersed motion in the 3D game virtual scene according to the player's real-time real displacement data in the real world.
  • Also provided is a computer-readable medium on which a computer program is stored; when the program is executed by a processor, the steps of the above method for processing the motion of a game character are implemented.
  • Also provided is a computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the program, the steps of the above motion processing method of the game character are implemented.
  • Also provided is a computer program product comprising a computer program; when the computer program is executed by a processor, the steps of the above method for processing the motion of a game character are implemented.
  • With the above method and device, storage medium, and computer equipment, in response to the AR mode interactive operation request in the game client, the game engine renders the 3D game virtual scene in real time.
  • The game client collects the game player's real-time real displacement data in the real world and uses it, together with the motion surface data file of the game scene, to drive the game character to perform non-interspersed movement in the game.
  • Compared with running the AR mode in an existing scene, which degrades the motion performance, or modeling a scene specifically for the AR mode, the embodiment of the present invention needs no special game scene model for the AR mode. It only needs the pre-built motion surface data file to analyze whether the game character can follow the player's real-world movement trajectory within the 3D game virtual scene, and renders the moving image frames accordingly; the implementation is simple and the threshold is low. This avoids the clipping that results when the game character bluntly follows the player's real-world moving path in the 3D game virtual scene, improves the game's playability and adds gameplay while preserving the display effect, and, through the AR game mode, brings players an extraordinary gaming experience combining the virtual and the real.
  • FIG. 1 schematically shows a flowchart of a method for processing the motion of a game character according to an embodiment of the present invention;
  • FIG. 2 schematically shows a diagram of dotted ray emission according to an embodiment of the present invention;
  • FIG. 3 schematically shows the structure of a motion processing apparatus for a game character according to an embodiment of the present invention;
  • Figure 4 schematically shows a block diagram of a computer device for implementing the method according to the invention.
  • Figure 5 schematically shows a block diagram of a computer program product implementing the method according to the invention.
  • a motion processing method for a game character includes:
  • Step 101: in response to the AR mode interactive operation request in the game client, call the game engine in the game client to render a real-time 3D game virtual scene;
  • Step 102: according to the real-time real displacement data of the game player in the real world and the motion surface data file corresponding to the 3D game virtual scene, drive the game character controlled by the player to perform non-interspersed motion in the 3D game virtual scene.
  • the embodiments of the present invention may be applied to a game client, and the game client may include intelligent electronic devices such as smart phones and tablet computers.
  • When the game client runs the game, in response to an AR mode interactive operation request for a 3D game virtual scene in the game, the game engine renders the corresponding 3D game virtual scene in real time.
  • In some embodiments, the motion processing method of the game character may further include: Step 103, reading a motion surface data file corresponding to the 3D game virtual scene, wherein the motion surface data file is used to indicate the movable positions of the 3D game virtual scene.
  • The 3D game virtual scene may be a specific game world in the game, such as a player's exclusive home, and the motion surface data file may be a set of data pre-established from the landform data of the 3D game virtual scene to reflect its movable positions.
  • The motion surface data file may contain data of different kinds. For example, it may include information such as points that can be passed, points that cannot be passed, landform boundaries, and landform heights in the 3D game virtual scene.
  • Alternatively, the motion surface data file may include only the points that can be passed in the 3D game virtual scene.
  • the motion feasibility of the game character in the three-dimensional game virtual scene can be judged through the motion surface data file, and in some embodiments, it can be determined whether the game character can move to a certain position.
  • The real displacement data generated while the player holds the game client and moves can be used to determine the target position that the game character in AR mode should reach when simulating the player's real movement; that is, in AR mode, the player controls the movement of the game character in the virtual world by moving while holding the game client.
  • The parameter used to control the movement of the game character may be the real displacement data collected by the game client at a preset frame rate.
  • step 102 may include:
  • Step 102-1: collect the real displacement data corresponding to the game client at the preset frame rate, and generate, frame by frame, the target position of the game character in the 3D game virtual scene according to the real displacement data and the character's initial position in the scene;
  • Step 102-2: according to the motion surface data file and the target position, use the game engine to render, frame by frame, the moving image frames corresponding to the game character in the 3D game virtual scene.
  • In other words, the game character is controlled to move in the 3D game virtual scene according to the real displacement data, advancing toward the corresponding target position along the indicated moving direction and distance.
  • The initial position is determined in different ways. For example, if the game character is already in the 3D game virtual scene before the game client responds to the AR mode interactive operation request (that is, the request instructs the AR mode to run in the current game scene where the character is located), the initial position may be the character's position at the moment the client responds to the request. As another example, if the character is located in a different game scene before the client responds (that is, the request instructs the character to switch from the current game scene to the 3D game virtual scene and run the AR mode there), the initial position may be preset by the game developer.
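As a concrete illustration of Steps 102-1 and 102-2, here is a minimal sketch of the per-frame drive loop. The helper names `collect_displacement` and `render_frame` are hypothetical stand-ins for the client's displacement sampling and the engine's rendering call; they do not come from the patent.

```python
# Minimal sketch of the per-frame drive loop (Steps 102-1 and 102-2).
# `collect_displacement` returns this frame's real displacement (dx, dy);
# `render_frame` renders the frame and reports whether the target was movable.
def drive_character(initial_pos, num_frames, collect_displacement, render_frame):
    x, y = initial_pos
    for _ in range(num_frames):
        dx, dy = collect_displacement()   # real displacement for this frame
        target = (x + dx, y + dy)         # target position in the virtual scene
        if render_frame(target):          # True when the target is movable
            x, y = target                 # the character actually moved there
    return (x, y)
```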
  • However, the walking environment in the real world differs from that in the game world: if the game character's movement in the game world completely followed the real displacement data of the game client, the character might pass through obstacles such as walls or walk off the map.
  • Therefore, the target position of the game character corresponding to each piece of real displacement data is first calculated from the client's real displacement data and the character's initial position in the 3D game virtual scene. The motion surface data file, which records the scene's movable positions, is then used to analyze whether the character can move to that target position, that is, whether the target position belongs to the movable positions of the 3D game virtual scene, and the moving image frame is rendered based on the analysis result.
  • When it is determined that the character can move to the target position, a moving image frame of the game character moving there can be rendered; when the target position is not a movable position, that is, when the character should not move there, a moving image frame can be rendered in which the character stops moving at its current position or stands still.
  • In this way, the game character will not appear to pass through obstacles or move off the map, which improves the authenticity of the character's movement. The character in the AR-mode 3D game virtual scene can follow the player's movement in the real world without a dedicated AR map having to be developed, while the character is kept from bluntly tracing the player's real-world moving path; this improves playability and adds gameplay while preserving the game's display effect.
  • In the embodiment of the present invention, in response to the AR mode interactive operation request, the motion surface data file corresponding to the indicated 3D game virtual scene is read; at the same time, the real displacement data corresponding to the game client is acquired at the preset frame rate, the character's target position in the scene is determined frame by frame from that data, the motion surface data file is used to analyze whether the target position is movable, and the corresponding moving image frame of the game character is rendered based on the analysis result.
  • Compared with running the AR mode in an existing scene, which degrades the motion performance, or modeling a scene specifically for the AR mode, the embodiment of the present invention needs no special game scene model for the AR mode. It only needs the pre-built motion surface data file to analyze whether the game character can follow the player's real-world movement trajectory in the 3D game virtual scene, and renders the moving image frames accordingly. This avoids the clipping caused when the character rigidly follows the player's real-world moving path in the 3D game virtual scene, improves playability and adds gameplay while preserving the display effect, and, through the AR game mode, brings players an extraordinary gaming experience combining the virtual and the real.
  • step 102-1 may include:
  • Step 102-1.1: acquire the positioning data collected by the positioning device of the game client at the preset frame rate, and calculate, frame by frame, the real displacement data corresponding to the game client;
  • Step 102-1.2: determine the virtual displacement data of the game character corresponding to the real displacement data according to a preset scale factor, and determine the character's target position in the 3D game virtual scene from the virtual displacement data and the initial position.
  • A positioning device in the game client (such as a GPS module) can collect positioning data at the preset frame rate, and the real displacement data is calculated frame by frame from the collected data. The positioning data may be longitude and latitude, and the real displacement data reflects the user's moving direction and moving distance.
  • So that the game character can follow the player's real movement in the real world, the virtual displacement data corresponding to the real displacement data is calculated according to the preset scale factor. If the preset scale factor is 1, the virtual displacement data equals the real displacement data: the player moves 1 meter in some direction in the real world, and the virtual displacement data likewise represents a 1-meter move in that direction. If the preset scale factor is 10, the player moves 1 meter in the real world and the game character moves 10 meters in the game. The character thus follows the player's real-world movement trajectory proportionally, giving the player an immersive game experience; a sketch of this conversion follows.
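The following is a minimal sketch of Steps 102-1.1 and 102-1.2, assuming the positioning data are latitude/longitude fixes and that an equirectangular approximation is adequate for short per-frame steps; the patent does not specify the conversion details.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius in meters (assumed constant)

def real_displacement(fix_a, fix_b):
    """Real displacement (east, north) in meters between two (lat, lon) fixes,
    using an equirectangular approximation (adequate for short steps)."""
    (lat0, lon0), (lat1, lon1) = fix_a, fix_b
    east = math.radians(lon1 - lon0) * EARTH_RADIUS_M * math.cos(math.radians(lat0))
    north = math.radians(lat1 - lat0) * EARTH_RADIUS_M
    return (east, north)

def virtual_displacement(real_disp, scale_factor=1.0):
    """Scale factor 1: the character mirrors the player; 10: 1 m of real-world
    movement becomes 10 m of in-game movement."""
    return (real_disp[0] * scale_factor, real_disp[1] * scale_factor)
```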
  • Determining the target position from the virtual displacement data and the initial position differs from generating it directly from the real displacement data (as in the embodiment above) only in that the real displacement data is first transformed into the corresponding virtual displacement data; the target position is then calculated from the transformed data, and the specific calculation is not repeated here. In other embodiments, the target position can be calculated directly from the real displacement data and the initial position; that is, step 102-1 can be replaced with: determining the character's target position in the 3D game virtual scene according to the real displacement data and the initial position.
  • In order to improve the motion performance of the game character in the 3D game virtual scene and make its motion smoother, the real displacement data can be interpolated and the interpolation result used to determine the above target position.
  • "Collecting the real displacement data corresponding to the game client at a preset frame rate" may comprise: acquiring first positioning data and second positioning data from the positioning data, wherein the first sampling frame corresponding to the first positioning data and the second sampling frame corresponding to the second positioning data differ by a preset number; interpolating between the first and second positioning data to obtain interpolation position data matching the preset number; and calculating from it the interpolation displacement data corresponding to the game client, the real displacement data including the interpolation displacement data.
  • For example, with a preset number of 10, the first sample is taken as the first positioning data and the tenth sample as the second positioning data. The two are interpolated to compute the 8 interpolation points between them, and the first positioning data, the 8 interpolation points, and the second positioning data together serve as 10 pieces of interpolation position data; the subsequent calculation of the virtual displacement data and the target position then uses the displacement data obtained by interpolation.
  • Controlling the character according to the interpolation displacement data helps make the character's movement trajectory smoother and improves the motion performance.
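A small sketch of this interpolation follows, assuming latitude/longitude samples (the sample format is illustrative). With a preset number of 10 it returns the first fix, the 8 interior points, and the second fix:

```python
def interpolate_fixes(first_fix, second_fix, preset_number=10):
    """Evenly interpolate between two positioning samples taken
    `preset_number` frames apart, endpoints included."""
    (lat0, lon0), (lat1, lon1) = first_fix, second_fix
    steps = preset_number - 1
    return [(lat0 + (lat1 - lat0) * i / steps,
             lon0 + (lon1 - lon0) * i / steps) for i in range(preset_number)]
```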
  • The first and second positioning data can also be collected directly at a fixed time interval. For example, with a preset frame rate of 40 Hz and a preset number of 10, the fixed time interval is 0.25 seconds: a positioning sample is taken every 0.25 seconds as the first and then the second positioning data, which reduces the amount of data the positioning device must collect.
  • The first and second positioning data may also be acquired during different moving stages of the player in the real world. For example, acquiring them during the player's starting stage avoids stop-and-go at start-up causing the game character to stutter in the game world, and acquiring them during the player's stopping stage prevents a sudden stop by the player from making the character halt abruptly with poor visual effect.
  • In some embodiments, the motion surface data file may include the plane coordinates and height data corresponding to the movable positions in the 3D game virtual scene, determined from the scene's landform data; Step 103 may then include: loading, in the 3D game virtual scene, a motion surface corresponding to the motion surface data file, wherein the motion surface is hidden when the scene is displayed.
  • The motion surface data file may be pre-generated from the landform data of the 3D game virtual scene and may include the plane coordinates and height data of each movable position; a pair of plane coordinates and the corresponding height data represents the movable height of that position in the scene space. For example, steps in the scene are movable positions, and the plane coordinates and height data corresponding to a step give the position of the character's feet on the step when the character moves there.
  • Based on the motion surface data file, the corresponding motion surface can be loaded in the game and hidden, so that the player cannot see it through the game client's display screen; the motion surface is used only to analyze whether a given target position is movable. In some embodiments, the motion surface can also be displayed in the game so that the player can see it, helping the player avoid positions off the motion surface as much as possible and ensuring smooth navigation within the game world.
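One possible in-memory layout for the motion surface data is sketched below. The patent specifies only that plane coordinates and height data of movable positions are stored; the dictionary shape and the concrete values are assumptions for illustration.

```python
# Movable plane coordinates mapped to their movable heights. A step at
# (12, 35) places the character's feet at height 0.5; a cell with two
# heights could be ground level plus a walkable bridge above it.
motion_surface_data = {
    (12, 34): [0.0],
    (12, 35): [0.5],
    (20, 40): [0.0, 8.0],
}

def movable_heights(plane_xy):
    """Heights at which the character may stand at this plane coordinate;
    an empty list means the position is not movable."""
    return motion_surface_data.get(plane_xy, [])
```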
  • step 102-2 may include:
  • Step 102-2-A1: generate a dotted ray based on the target position, and emit the dotted ray toward the motion surface to perform ray dotting, wherein the dotted ray is perpendicular to the plane in which the target position lies;
  • Step 102-2-A2: if the dotted ray intersects the motion surface, render a first moving image frame of the game character moving to the target position, the moving image frames including the first moving image frame;
  • Step 102-2-A3: if the dotted ray does not intersect the motion surface, render, according to the preset collision motion rule, a second moving image frame in which the game character moves at the corresponding current position, the moving image frames including the second moving image frame.
  • The dotted ray is generated in the vertical direction through the target position (its endpoint can be at the highest point of the 3D game virtual scene with the direction vertically downward, or at the lowest point with the direction vertically upward) and emitted for dotting. If the dotting ray can hit the motion surface, the ray and the surface intersect, as shown in FIG. 2: assume the character's initial position is point A, the target position is point B, and the motion surface is S. The dotted ray emitted upward from point B meets S at point B', indicating that the character can move to the target position, so the first moving image frame of the character moving to the target position (here, specifically B') is rendered. The height data corresponding to the target position should be taken into account as the character moves, so that the movement follows the topography of the scene.
  • Otherwise, the second moving image frame can be rendered according to the preset collision motion rule, which may specify, for this case, that the game character walks on the spot, stops on the spot, and so on. This dotted-ray detection method ensures that the game character exhibits no clipping in the game scene, with small system overhead and high efficiency.
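A hedged sketch of the dotting-ray test follows. The `raycast_down` query is an assumed stand-in for a game engine's physics raycast; it returns the hit height B' or None.

```python
# Steps 102-2-A1 to 102-2-A3: cast a vertical ray through the target
# position; a hit on the motion surface means the position is movable.
def advance_one_frame(character, target_xy, scene_top_height, motion_surface):
    hit_height = motion_surface.raycast_down(target_xy, origin_height=scene_top_height)
    if hit_height is not None:
        # First moving image frame: the ray meets the surface at B'.
        character.move_to((target_xy[0], target_xy[1], hit_height))
    else:
        # Second moving image frame: preset collision rule, e.g. walk in place.
        character.walk_in_place()
```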
  • step 102-2-A2 may include:
  • Step 102-2-A2.1: obtain the height of the intersection point of the dotted ray and the motion surface;
  • Step 102-2-A2.2: if the intersection height matches the AR-mode riding state information of the 3D game virtual scene, render a third moving image frame in which the game character moves to the target position in the riding state corresponding to the intersection height, the first moving image frame including the third moving image frame;
  • Step 102-2-A2.3: if the intersection height does not match the AR-mode riding state information, render, according to the preset collision motion rules, a fourth moving image frame in which the game character moves at the corresponding current position, the first moving image frame including the fourth moving image frame.
  • That is, the height information of the generated intersection can be used to further analyze whether the target position is a movable position; when it matches, the third moving image frame, in which the character moves to the target position in the riding state corresponding to the intersection height, can be rendered.
  • For example, the 3D game virtual scene may be a manor, a city, etc. in the game, supporting game characters walking, riding, and so on over the surface, and also supporting air travel such as qinggong ("light work") and aircraft. If the intersection height falls within the height range that supports surface travel, image frames of the character moving to the target position by walking, riding, or another surface travel method can be rendered. If the intersection height does not match the AR-mode riding state information, for instance when it falls within a height range that supports only air travel, then even though the target position is movable in the normal mode, moving there in AR mode is not allowed, and the fourth moving image frame is rendered according to the preset collision motion rules.
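A small sketch of the height check follows; the height bands and state names are purely illustrative assumptions, since the patent does not give concrete values.

```python
# Assumed AR-mode riding state information: height ranges supporting
# surface travel. Heights outside them (e.g. air-travel-only altitudes)
# fall back to the preset collision motion rule in AR mode.
AR_SURFACE_BANDS = [(0.0, 2.0)]

def ar_riding_state(intersection_height):
    """Return a surface riding state when the hit height matches AR mode
    (third frame), else None (fourth frame via the collision rule)."""
    for low, high in AR_SURFACE_BANDS:
        if low <= intersection_height <= high:
            return "surface"   # walk / ride to the target position
    return None
```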
  • In some embodiments, the motion surface may also be loaded as a navigation motion surface, and may include a 2D navigation mesh motion surface and a 3D voxel motion surface. The 2D navigation mesh motion surface may represent the movable surface of the 3D game virtual scene, over which the game character can move by walking, riding, and the like; the 3D voxel motion surface reflects the connectivity of the movable voxels in the scene, and connected voxel grids can be traversed by surface travel, air travel, and other modes. Based on either surface, pathfinding for the game character in the game world can be realized.
  • step 102-2 may include:
  • Step 102-2-B1: if the target position matches the 2D navigation mesh motion surface, obtain the preset riding information of the target position on that surface; determine the target riding state of the game character from the preset riding information and the character's current riding state; and, based on the 2D navigation mesh motion surface, render a fifth moving image frame in which the character moves to the target position in the target riding state, the moving image frames including the fifth moving image frame;
  • Step 102-2-B2: if the target position does not match the 2D navigation mesh motion surface, render a sixth moving image frame in which the game character moves at the corresponding current position, the moving image frames including the sixth moving image frame.
  • Here the target position is represented by in-game plane coordinates. If the projection of the 2D navigation mesh motion surface onto the plane contains the target position, or the dotted ray emitted from the target position intersects the 2D navigation mesh motion surface, the target position matches that surface, meaning the game character can move to the target position, and the vertical projection of the target position point onto the surface is obtained.
  • The preset riding information corresponding to the vertical projection point on the 2D navigation mesh motion surface may include walking, riding, boating, and the like. It is further determined whether the character's current riding state belongs to the states indicated by the preset riding information: if so, the current riding state is used as the target riding state; if not, the target riding state is switched to one indicated by the preset riding information, for example from walking to riding. The fifth moving image frame is then rendered in which the character moves to the target position in the target riding state (the target position here being the vertical projection of the target position onto the 2D navigation mesh motion surface).
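A sketch of this 2D-navmesh branch follows; the `navmesh` interface is an assumed stand-in for an engine's navigation query API, not an API named in the patent.

```python
# Steps 102-2-B1/B2: project the target onto the 2D navigation mesh and
# pick a riding state allowed by the preset riding information.
def navmesh_move(character, target_xy, navmesh):
    projection = navmesh.project(target_xy)      # vertical projection, or None
    if projection is None:
        character.move_in_place()                # sixth frame: collision rule
        return
    allowed = navmesh.riding_info(projection)    # e.g. {"walk", "ride", "boat"}
    state = (character.riding_state
             if character.riding_state in allowed
             else sorted(allowed)[0])            # switch to an allowed state
    character.move_to(projection, riding_state=state)   # fifth frame
```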
  • step 102-2 may include:
  • Step 102-2-C1: if the target position matches the 3D voxel motion surface, obtain the preset riding information of the target position on that surface; determine the target riding state of the game character from the preset riding information and the character's current riding state; determine, based on the 3D voxel motion surface, the pathfinding information for the character to move from the current position to the target position; and render, according to the pathfinding information, a seventh moving image frame in which the character moves to the target position in the target riding state, the moving image frames including the seventh moving image frame;
  • Step 102-2-C2: if the target position does not match the 3D voxel motion surface, render an eighth moving image frame in which the game character moves at the corresponding current position, the moving image frames including the eighth moving image frame.
  • The 3D voxel motion surface comprises the connectable voxels in the 3D game virtual scene. If the projection of the 3D voxel motion surface onto the plane contains the target position, or the dotted ray emitted from the target position can hit the surface, the target position matches the 3D voxel motion surface, meaning the game character can move to the target position. The preset riding information corresponding to the vertical projection of the target position point onto the 3D voxel motion surface is then obtained; it may include surface travel methods such as walking, riding, and boating, and may also include air travel methods such as qinggong ("light work") and aircraft.
  • When determining the target riding state for the character's move to the target position, a riding state in the preset riding information that is the same as, or similar to, the current riding state is preferred: if the current state is walking, walking is prioritized as the target state, riding is considered second, and air travel modes last. This maintains continuity of the riding state, avoids unnecessary state switching in the game, and improves the user experience.
  • Pathfinding information for the character's movement from the current position to the target position is generated based on the 3D voxel motion surface, and the seventh moving image frame is rendered in which the character moves to the target position in the target riding state. The target position here refers to the vertical projection voxel of the target position on the 3D voxel motion surface; if the target position corresponds to several projection voxels, one of them may be selected as the target voxel on the shortest-path-first principle. Otherwise, the eighth moving image frame is rendered according to the preset collision motion rule.
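Pathfinding over the voxel connectivity can be illustrated with a plain breadth-first search, which also realizes the shortest-path-first principle on an unweighted voxel graph; the `neighbors` function is an assumed accessor for connected voxels.

```python
from collections import deque

def find_voxel_path(start, goal, neighbors):
    """Breadth-first search over connected voxels (Step 102-2-C1).
    Returns the voxel path from start to goal, or None when unreachable
    (in which case the eighth frame follows the collision rule)."""
    previous = {start: None}
    frontier = deque([start])
    while frontier:
        voxel = frontier.popleft()
        if voxel == goal:
            path = []
            while voxel is not None:
                path.append(voxel)
                voxel = previous[voxel]
            return path[::-1]
        for nxt in neighbors(voxel):
            if nxt not in previous:
                previous[nxt] = voxel
                frontier.append(nxt)
    return None
```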
  • step 102-2 may further include:
  • Step 102-2-D1: obtain the viewing angle data derived from the angular motion detection device of the game client at the preset frame rate;
  • Step 102-2-D2: according to the motion surface data file and the target position, render frame by frame the moving image frames of the game character matching the viewing angle data, wherein the moving image frames are used to display the 3D game virtual scene matching the viewing angle data together with the character's movement actions.
  • The game client can acquire not only real displacement data but also angular motion data, through an angular motion detection device (such as a gyroscope), and process it into corresponding viewing angle data so as to render moving image frames matching that data. The angular motion of the game client is thus reflected in the moving image frames: the player can change the viewing angle on the game world by turning the phone (when the game client is a mobile phone), and the frames show the environment of the 3D game virtual scene matching the user's real-time perspective, letting the player browse the game world better and improving the user experience.
  • Steps 102-2-D1 and D2 can be combined with steps 102-2-A2.1 to 102-2-A2.3, steps 102-2-B1 to B2, and steps 102-2-C1 to C2 to render moving image frames matching the target position (or the current position and the preset collision motion rules), the riding state, and the viewing angle data, so that the player can rotate the viewing angle at will while the character moves, without needing any wearable aid.
  • Step 102-2-D1 may include: acquiring the angular motion data collected by the angular motion detection device at the preset frame rate, and generating, according to the camera movement category of the 3D game virtual scene, the viewing angle data corresponding to the angular motion data.
  • When the camera movement category includes first-person camera movement, the viewing angle data is the first-person perspective of the game character in the 3D game virtual scene and changes with the angular motion data on the basis of the character's preset initial viewing angle. When the camera movement category includes third-person camera movement, the viewing angle data is a view of the game character from a third-person perspective at a preset position and preset angle relative to the character's position and direction, the character's direction changing with the angular motion data on the basis of its preset initial direction.
  • In other words, the viewing angle data is obtained by processing the angular motion data collected by the angular motion detection device, and it should be calculated according to the camera movement category corresponding to the 3D game virtual scene or selected by the user. First-person camera movement means the image displayed by the moving image frame corresponds to the first-person perspective of the game character, that is, the game world environment the character "sees"; the viewing angle data of the first frame is the character's preset initial viewing angle, and subsequent viewing angle data changes correspondingly with the angular motion data on that basis.
  • Third-person camera movement means the displayed image corresponds to a third-person perspective "looking at" the game character, that is, it shows the character and its surroundings as "seen" by a virtual third person. The third-person perspective is anchored to the character's position and direction: the virtual third person is "located" at a preset position and angle relative to the character and "looks" at the character, and that relative position and angle remain unchanged. The third-person perspective follows the character's displacement and viewing-angle rotation in the game world, changing the virtual third person's position and angle in world coordinates while remaining fixed in the coordinate system whose origin is the game character. The character's direction changes with the angular motion data on the basis of its preset initial direction (if the angular motion data indicates a rotation of some number of degrees, the character rotates by the same number of degrees), and the third-person perspective changes accordingly, providing a richer AR-mode display effect and more gameplay.
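The sketch below illustrates both camera movement categories with a yaw-only simplification; the attribute names (`yaw`, `preset_offset`, `preset_angle`) are assumptions for illustration.

```python
# Per-frame view update from gyroscope angular motion (Steps 102-2-D1/D2).
# `gyro_delta_yaw` is this frame's device rotation in degrees.
def update_view(camera, character, gyro_delta_yaw, category="first_person"):
    if category == "first_person":
        # The view starts at the character's preset initial angle and
        # accumulates the device's angular motion frame by frame.
        camera.yaw += gyro_delta_yaw
        camera.position = character.position
    else:
        # Third person: the character turns with the device, while the
        # camera keeps its preset offset and angle in the character's
        # local frame (fixed relative to the character).
        character.yaw += gyro_delta_yaw
        camera.position = character.position + camera.preset_offset.rotated(character.yaw)
        camera.yaw = character.yaw + camera.preset_angle
```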
  • In some embodiments, the motion surface data file may include the plane coordinates and height data corresponding to the movable positions in the 3D game virtual scene, determined from the scene's landform data, and the file may be modified according to the real-time environment of the 3D game scene.
  • step 102-2 may include:
  • Step 102-2-E1: if the target position matches the plane coordinates of a movable position, generate a ninth moving image frame in which the game character moves to the target position according to the height data corresponding to that position, the moving image frames including the ninth moving image frame;
  • Step 102-2-E2: if the target position does not match the plane coordinates of any movable position, generate, according to the preset collision motion rule, a tenth moving image frame in which the character moves at the corresponding current position, the moving image frames including the tenth moving image frame.
  • In this way, the game character can be driven to perform non-interspersed motion directly from the data in the motion surface data file: when the target position is a movable position, the character is simply driven to move to the height-data position corresponding to that target position.
  • The plane coordinates and height data of movable positions saved in the motion surface data file can be modified, added, or deleted as the game scene changes in real time. For example, when a table is placed in the scene, its footprint becomes a non-movable position, and the relevant data in the file can be modified accordingly, so that even if the scene's environment changes, the character driven by the file still performs non-interspersed movement. The motion connectivity computed from the real-time-modified file thus drives the character's motion, making its movement in the virtual world more realistic.
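Continuing the illustrative `motion_surface_data` layout sketched earlier, a runtime update for the table example in the text might look as follows; the helper names are hypothetical.

```python
def on_obstacle_placed(footprint_cells):
    """A placed table makes its footprint cells non-movable."""
    for xy in footprint_cells:
        motion_surface_data.pop(xy, None)

def on_obstacle_removed(footprint_cells, ground_height=0.0):
    """Removing the obstacle restores the cells as movable positions."""
    for xy in footprint_cells:
        motion_surface_data[xy] = [ground_height]
```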
  • In some embodiments, gameplay in which the player enters the real world from the 3D game virtual scene can also be provided. The method may further include: S1, in response to a request to open a transmission gate from the 3D game virtual scene to the real world, acquiring a first real-time real image frame corresponding to the real world, saving it as a first texture, and acquiring a first real-time virtual image frame corresponding to the 3D game virtual scene (the first real-time virtual image frame being a moving image frame), wherein the first texture is used to render a preset portal model corresponding to the transmission gate; S2, rendering, according to the first texture, the preset portal model, and the first real-time virtual image frame, a first real-time rendered image frame that includes the portal.
  • The images collected from the real world are stored in device memory in the form of textures so that the game engine can render them onto the preset portal model: the real-world real-time environment is displayed inside the gate, while the real-time environment of the 3D game virtual scene is rendered outside the portal. The image displayed in the game therefore looks unified, with no sense of separation, and the real and virtual environments are shown in real time inside and outside the door, creating a more "real" portal effect. Through the portal, players can observe the other world's environment and location in real time, which improves the player's experience, enhances the display effect of the game screen, and increases the game's playability; through the AR game mode it brings players an extraordinary experience combining the virtual and the real and provides technical support for more gameplay.
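A hedged sketch of the portal compositing (S1/S2) follows; every `engine` and `camera` call is an assumed stand-in for a real engine API.

```python
# S1/S2: the camera frame becomes a texture on the preset portal model,
# which is composited into the rendered virtual frame.
def render_portal_frame(engine, device_camera, portal_model, virtual_scene):
    real_frame = device_camera.capture()                # first real-time real image frame
    first_texture = engine.texture_from_image(real_frame)
    portal_model.set_texture(first_texture)             # real world shows inside the gate
    virtual_frame = engine.render(virtual_scene)        # first real-time virtual image frame
    return engine.composite(virtual_frame, portal_model)  # first real-time rendered frame
```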
  • The game screen may or may not include the virtual portal: in one case the portal can be seen from the game character's perspective in the virtual world, and in the other it cannot. The position and direction of the virtual portal in the 3D game virtual scene can be determined by the player's choice; once they are fixed, the shape and size of the portal displayed in the game change with the character's position and direction in the scene.
  • a motion processing device for a game character includes:
  • a request-response module used for invoking the game engine in the game client to render a real-time 3D game virtual scene in response to an AR mode interactive operation request in the game client;
  • the motion driving module is used for driving the game character controlled by the game player to perform non-interspersed motion in the three-dimensional game virtual scene according to the real-time real displacement data of the game player in the real world.
  • In some embodiments, the motion processing apparatus of the game character may further include: a file reading module, configured to read, when the game engine in the game client is invoked to render the real-time 3D game virtual scene, the motion surface data file corresponding to that scene.
  • In some embodiments, the motion driving module can be used to: collect the real displacement data corresponding to the game client at the preset frame rate, and generate, frame by frame, the target position of the game character in the 3D game virtual scene from the real displacement data and the character's initial position in the scene; and, according to the motion surface data file and the target position, render frame by frame, through the game engine, the moving image frames corresponding to the character in the scene.
  • In some embodiments, the motion driving module is further configured to: acquire the positioning data collected by the positioning device of the game client at the preset frame rate, and calculate frame by frame the real displacement data corresponding to the game client; and determine, according to a preset scale factor, the virtual displacement data of the character corresponding to the real displacement data, and determine the character's target position in the 3D game virtual scene from the virtual displacement data and the initial position.
  • In some embodiments, the motion driving module is further configured to: acquire first positioning data and second positioning data from the positioning data, the first sampling frame corresponding to the first positioning data and the second sampling frame corresponding to the second positioning data differing by a preset number; interpolate between the first and second positioning data to obtain interpolation position data matching the preset number; and calculate from it the interpolation displacement data corresponding to the game client, the real displacement data including the interpolation displacement data.
  • In some embodiments, the motion surface data file may include the plane coordinates and height data corresponding to the movable positions in the 3D game virtual scene, determined from the scene's landform data; the file reading module can be configured to load, in the 3D game virtual scene, the motion surface corresponding to the motion surface data file, the motion surface being hidden when the scene is displayed.
  • In some embodiments, the motion driving module may be further configured to: generate a dotted ray based on the target position and emit it toward the motion surface to perform ray dotting, the dotted ray being perpendicular to the plane in which the target position lies; if the dotted ray intersects the motion surface, render the first moving image frame in which the game character moves to the target position; and if it does not intersect, render, according to the preset collision motion rule, a second moving image frame in which the character moves at the corresponding current position, the moving image frames including the second moving image frame.
  • In some embodiments, the motion driving module may further be used to: obtain the height of the intersection of the dotted ray and the motion surface; if the intersection height matches the AR-mode riding state information of the 3D game virtual scene, render a third moving image frame in which the character moves to the target position in the riding state corresponding to that height, the first moving image frame including the third moving image frame; and otherwise render a fourth moving image frame in which the character moves at the corresponding current position, the first moving image frame including the fourth moving image frame.
  • In some embodiments, the motion surface may include a 2D navigation mesh motion surface, and the motion driving module may also be used to: if the target position matches the 2D navigation mesh motion surface, obtain the preset riding information of the target position on that surface, determine the target riding state of the character from the preset riding information and the character's current riding state, and render, based on the 2D navigation mesh motion surface, a fifth moving image frame in which the character moves to the target position in the target riding state, the moving image frames including the fifth moving image frame; and if the target position does not match that surface, render, according to the preset collision motion rule, a sixth moving image frame in which the character moves at the corresponding current position, the moving image frames including the sixth moving image frame.
  • In some embodiments, the motion surface may include a 3D voxel motion surface, and the motion driving module may also be used to: if the target position matches the 3D voxel motion surface, obtain the preset riding information of the target position on that surface, determine the target riding state of the character from the preset riding information and the character's current riding state, determine, based on the 3D voxel motion surface, the pathfinding information for the character to move from the current position to the target position, and render, according to the pathfinding information, a seventh moving image frame in which the character moves to the target position in the target riding state, the moving image frames including the seventh moving image frame; and if the target position does not match that surface, render, according to the preset collision motion rule, an eighth moving image frame in which the character moves at the corresponding current position, the moving image frames including the eighth moving image frame.
  • In some embodiments, the motion surface data file may include the plane coordinates and height data corresponding to the movable positions in the 3D game virtual scene, determined from the scene's landform data, and may be modified according to the real-time environment of the 3D game scene. The motion driving module can then also be used to: if the target position matches the plane coordinates of a movable position, generate a ninth moving image frame in which the character moves to the target position according to the corresponding height data, the moving image frames including the ninth moving image frame; and if it does not match, generate, according to the preset collision motion rule, a tenth moving image frame in which the character moves at the corresponding current position, the moving image frames including the tenth moving image frame.
  • In some embodiments, the motion driving module is further configured to: acquire the viewing angle data derived from the angular motion detection device of the game client at the preset frame rate; and, according to the motion surface data file and the target position, render frame by frame the moving image frames of the game character matching the viewing angle data, the moving image frames being used to display the 3D game virtual scene matching the viewing angle data and the movement actions of the game character.
  • In some embodiments, the motion driving module may be further configured to: acquire the angular motion data collected by the angular motion detection device at the preset frame rate, and generate, according to the camera movement category of the 3D game virtual scene, the viewing angle data corresponding to the angular motion data. When the camera movement category includes first-person camera movement, the viewing angle data is the first-person perspective of the character in the scene and changes with the angular motion data on the basis of the character's preset initial viewing angle; when it includes third-person camera movement, the viewing angle data is a view of the character from a third-person perspective at a preset position and preset angle relative to the character's position and direction, the character's direction changing with the angular motion data on the basis of its preset initial direction.
  • Various component embodiments of the present invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof.
  • a microprocessor or a digital signal processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components in the motion processing device of the game character according to the embodiment of the present invention.
  • The present invention can also be implemented as programs/instructions (e.g., computer programs/instructions and computer program products) for performing some or all of the methods described herein. Such programs/instructions implementing the present invention may be stored on a computer-readable medium or may exist in the form of one or more signals; such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
  • Based on the above methods shown in FIGS. 1 to 2, according to an embodiment of the present invention, a computer-readable medium is correspondingly provided, on which a computer program is stored; when the computer program is executed by a processor, the motion processing method of the game character shown in FIGS. 1 to 2 is implemented.
  • Computer-readable media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology.
  • Information may be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), Flash Memory or other memory technology, Compact Disc Read Only Memory (CD-ROM), Digital Versatile Disc (DVD) or other optical storage, Magnetic tape cartridges, disk storage, quantum memory, graphene-based storage media or other magnetic storage devices or any other non-transmission media can be used to store information that can be accessed by computing devices.
  • The technical solution of the present invention can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which may be a CD-ROM, USB flash drive, removable hard disk, etc.) and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to execute the methods described in the various implementation scenarios of the present invention.
  • The computer device includes a memory and a processor; the memory is used to store the computer program, and the processor is used to execute the computer program to implement the above-described method for processing the motion of a game character as shown in FIGS. 1 to 2.
  • FIG. 4 schematically shows a computer device that can implement the method for processing the motion of a game character according to the present invention
  • the computer device includes a processor 410 and a computer-readable medium in the form of a memory 420.
  • the memory 420 is an example of a computer-readable medium having a storage space 430 for storing the computer program 431 .
  • When the computer program 431 is executed by the processor 410, each step in the above-described method for processing the motion of a game character can be implemented.
  • the computer device may further include a user interface, a network interface, a camera, a radio frequency (Radio Frequency, RF) circuit, a sensor, an audio circuit, a WI-FI module, and the like.
  • the user interface may include a display screen (Display), an input unit such as a keyboard (Keyboard), etc., and the optional user interface may also include a USB interface, a card reader interface, and the like.
  • Optional network interfaces may include standard wired interfaces, wireless interfaces (such as Bluetooth interfaces, WI-FI interfaces), and the like.
  • The structure shown does not constitute a limitation on the computer device, which may include more or fewer components, combine certain components, or arrange the components differently.
  • the storage medium may also include an operating system and a network communication module.
  • An operating system is a program that manages and maintains the hardware and software resources of the computer device, supporting the operation of the information processing program and other software and/or programs.
  • The network communication module is used to implement communication among the components inside the storage medium, as well as communication with other hardware and software in the physical device.
  • FIG. 5 schematically shows a block diagram of a computer program product implementing the method according to the present invention.
  • The computer program product includes a computer program 510 that, when executed by a processor, such as the processor 410 shown in FIG. 4, can implement each step in the above-described motion processing method for a game character.
  • The present invention can be implemented by means of software plus a necessary general-purpose hardware platform, and can also be implemented by hardware. In response to the AR mode interactive operation request, the motion surface data file corresponding to the 3D game virtual scene indicated by the request is read; at the same time, the real displacement data corresponding to the game client is acquired at a preset frame rate, and based on that displacement data the target position of the game character in the 3D game virtual scene is determined frame by frame.
  • The read motion surface data file is used to analyze whether the target position is movable, and the moving image frame corresponding to the game character is rendered based on the analysis result.
  • Compared with the degraded motion performance caused by running the AR mode directly in an existing scene, or with modeling scenes specifically for the AR mode, the embodiment of the present invention needs no special game scene model for the AR mode: the pre-built motion surface data file alone suffices to analyze whether the game character can follow the player's real-world movement trajectory within the 3D game virtual scene, so that the moving image frames can be rendered while avoiding the clipping glitches caused by the game character rigidly following the player's real-world moving path. This not only improves the playability of the game and enriches the gameplay, but also guarantees the game's display effect.
  • The AR game mode thus brings players an extraordinary gaming experience that blends the virtual and the real.
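  • To make the per-frame flow described above concrete, the following is a minimal Python sketch of the loop. All identifiers (MotionSurfaceData, get_real_displacement, render_frame) and the default frame rate are illustrative assumptions, not names taken from this disclosure; the movability analysis is reduced to a lookup against pre-built movable-surface data.

```python
# Illustrative sketch only: identifiers and the frame rate are assumptions.
import time
from dataclasses import dataclass
from typing import Callable, Set, Tuple


@dataclass
class Vec3:
    x: float
    y: float
    z: float


class MotionSurfaceData:
    """Pre-built movable-surface data for one 3D game virtual scene."""

    def __init__(self, movable_cells: Set[Tuple[int, int]], cell_size: float = 0.5) -> None:
        self.movable_cells = movable_cells  # grid cells the character may occupy
        self.cell_size = cell_size

    def is_movable(self, pos: Vec3) -> bool:
        """Analyze whether a target position lies on a movable surface."""
        cell = (int(pos.x // self.cell_size), int(pos.z // self.cell_size))
        return cell in self.movable_cells


def run_ar_motion_loop(
    surface: MotionSurfaceData,
    get_real_displacement: Callable[[], Tuple[float, float]],
    render_frame: Callable[[Vec3], None],
    frame_rate: int = 30,
) -> None:
    """Drive the game character from the player's real-world displacement, frame by frame."""
    character_pos = Vec3(0.0, 0.0, 0.0)
    frame_time = 1.0 / frame_rate
    while True:
        # Real displacement of the player since the last frame (e.g. from AR tracking).
        dx, dz = get_real_displacement()
        target = Vec3(character_pos.x + dx, character_pos.y, character_pos.z + dz)
        # Follow the player's path only where the motion surface allows it,
        # so the character does not interpenetrate the scene geometry.
        if surface.is_movable(target):
            character_pos = target
        render_frame(character_pos)  # render the moving image frame for this position
        time.sleep(frame_time)
```

  • A production engine would query the motion surface data file for walkable geometry (including surface heights, so the character can step up or down) rather than a flat grid, but the control flow matches the description: acquire displacement at the preset frame rate, analyze the target position for movability, and render only non-interpenetrating movement.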
  • The accompanying drawings are only schematic diagrams of preferred implementation scenarios, and the modules or processes in the accompanying drawings are not necessarily required to implement the present invention.
  • The modules in the device of an implementation scenario may be distributed in that device according to the description of the implementation scenario, or may be located, with corresponding changes, in one or more devices different from the one in that implementation scenario.
  • The modules of the above implementation scenarios may be combined into one module, or may be further split into multiple sub-modules.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention discloses a motion processing method and apparatus for a game character, a storage medium, and a computer device. The method comprises: in response to an AR mode interaction operation request in a game client, invoking a game engine from the game client to perform rendering, so as to obtain a real-time three-dimensional game virtual scene; and, according to real-time real displacement data of a game player in the real world and a motion surface data file corresponding to the three-dimensional game virtual scene, driving a game character controlled by the game player to perform non-interpenetrating movement in the three-dimensional game virtual scene. The present invention helps avoid problems caused by a game character strictly following the movement path of a player in the real world, thereby improving the presentation effect of the game.
PCT/CN2021/121092 2021-03-16 2021-09-27 Motion processing method and apparatus for a game character, storage medium, and computer device WO2022193612A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110282546.0 2021-03-16
CN202110282546.0A CN112862935B (zh) Motion processing method and apparatus for a game character, storage medium, and computer device

Publications (1)

Publication Number Publication Date
WO2022193612A1 true WO2022193612A1 (fr) 2022-09-22

Family

ID=75994795

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/121092 WO2022193612A1 (fr) Motion processing method and apparatus for a game character, storage medium, and computer device

Country Status (2)

Country Link
CN (1) CN112862935B (fr)
WO (1) WO2022193612A1 (fr)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112862935B (zh) * 2021-03-16 2023-03-17 天津亚克互动科技有限公司 Motion processing method and apparatus for a game character, storage medium, and computer device
CN113821345B (zh) * 2021-09-24 2023-06-30 网易(杭州)网络有限公司 Method and apparatus for rendering a movement trajectory in a game, and electronic device
CN114125552A (zh) * 2021-11-30 2022-03-01 完美世界(北京)软件科技发展有限公司 Method and apparatus for generating video data, storage medium, and electronic device
TWI799195B (zh) * 2021-12-10 2023-04-11 宅妝股份有限公司 Method and system for realizing a third-person perspective with virtual objects
CN116414223A (zh) * 2021-12-31 2023-07-11 中兴通讯股份有限公司 Interaction method and apparatus in three-dimensional space, storage medium, and electronic device
CN116328309B (zh) * 2023-03-27 2023-10-13 广州美术学院 Game interaction method addressing the higher-order needs of visually impaired people in cultural, agricultural, and tourism virtual scenes

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101780321A (zh) * 2009-12-30 2010-07-21 永春至善体育用品有限公司 Method for producing a highly immersive virtual real scene for sports and fitness equipment, and interactive system and method based on the virtual real scene
US10022628B1 (en) * 2015-03-31 2018-07-17 Electronic Arts Inc. System for feature-based motion adaptation
CN109478341A (zh) * 2016-07-13 2019-03-15 株式会社万代南梦宫娱乐 Simulation system, processing method, and information storage medium
CN110772791A (zh) * 2019-11-05 2020-02-11 网易(杭州)网络有限公司 Route generation method and apparatus for a three-dimensional game scene, and storage medium
CN112862935A (zh) * 2021-03-16 2021-05-28 天津亚克互动科技有限公司 Motion processing method and apparatus for a game character, storage medium, and computer device

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104658038B (zh) * 2015-03-12 2019-01-18 南京梦宇三维技术有限公司 Intelligent production method and system for three-dimensional digital content based on motion capture
CN107441714A (zh) * 2017-06-01 2017-12-08 杨玉苹 Image processing method and apparatus for realizing an AR first-person shooter game, and shooter game battle system and working method thereof
CN112198959A (zh) * 2017-07-28 2021-01-08 深圳市瑞立视多媒体科技有限公司 Virtual reality interaction method, apparatus, and system
CN108303719A (zh) * 2018-01-30 2018-07-20 上海电力学院 Method for determining whether the dynamic position of a monitoring terminal exceeds a virtual fence
CN108427501B (zh) * 2018-03-19 2022-03-22 网易(杭州)网络有限公司 Movement control method and apparatus in virtual reality
EP3644322B1 (fr) * 2018-10-25 2023-12-27 Tata Consultancy Services Limited Method and system for interpreting neural interaction involving proprioceptive adaptation during a dual-task paradigm
CN110280014B (zh) * 2019-05-21 2022-09-13 西交利物浦大学 Method for reducing dizziness in a virtual reality environment
CN110665219A (zh) * 2019-10-14 2020-01-10 网易(杭州)网络有限公司 Operation control method and apparatus for a virtual reality game
CN111167120A (zh) * 2019-12-31 2020-05-19 网易(杭州)网络有限公司 Method and apparatus for processing a virtual model in a game
CN111249729B (zh) * 2020-02-18 2023-10-20 网易(杭州)网络有限公司 Display method and apparatus for a game character, electronic device, and storage medium
CN111318022B (zh) * 2020-03-19 2023-04-14 网易(杭州)网络有限公司 Game scene generation method and apparatus, electronic device, and storage medium
CN111744202A (zh) * 2020-06-29 2020-10-09 完美世界(重庆)互动科技有限公司 Method and apparatus for loading a virtual game, storage medium, and electronic device
CN112316424B (zh) * 2021-01-06 2021-03-26 腾讯科技(深圳)有限公司 Game data processing method, apparatus, and storage medium

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116036601A (zh) * 2023-01-28 2023-05-02 腾讯科技(深圳)有限公司 Game processing method and apparatus, computer device, and storage medium
CN116036601B (zh) * 2023-01-28 2023-06-09 腾讯科技(深圳)有限公司 Game processing method and apparatus, computer device, and storage medium
CN117357894A (zh) * 2023-11-01 2024-01-09 北京畅游天下网络技术集团有限公司 Three-dimensional scene generation method, apparatus, device, and medium
CN117357894B (zh) * 2023-11-01 2024-03-29 北京畅游天下网络技术集团有限公司 Three-dimensional scene generation method, apparatus, device, and medium
CN117695633A (zh) * 2023-12-18 2024-03-15 广西壮族自治区地图院 Synchronous matching method and system based on a game engine and a GIS 3D engine
CN117695633B (zh) * 2023-12-18 2024-05-28 广西壮族自治区地图院 Synchronous matching method and system based on a game engine and a GIS 3D engine

Also Published As

Publication number Publication date
CN112862935B (zh) 2023-03-17
CN112862935A (zh) 2021-05-28

Similar Documents

Publication Publication Date Title
WO2022193612A1 (fr) Motion processing method and apparatus for a game character, storage medium, and computer device
US20210252398A1 (en) Method and system for directing user attention to a location based game play companion application
JP7273068B2 (ja) Multi-server cloud virtual reality (VR) streaming
KR101610702B1 (ko) Sprite strip renderer
US11250617B1 (en) Virtual camera controlled by a camera control device
US20170354893A1 (en) Generating challenges using a location based game play companion application
JP7339318B2 (ja) In-game location-based game play companion application
US9369543B2 (en) Communication between avatars in different games
KR20150108842A (ko) Mixed reality filtering
JP7249975B2 (ja) Method and system for directing user attention to a location-based game play companion application
CN112933606B (zh) Game scene conversion method and apparatus, storage medium, and computer device
CN110832442A (zh) Optimized shadows and adaptive mesh skinning in a foveated rendering system
WO2022000971A1 (fr) Camera movement mode switching method and apparatus, computer program, and readable medium
US20220395756A1 (en) Building a dynamic social community based on similar interaction regions of game plays of a gaming application
US10293259B2 (en) Control of audio effects using volumetric data
CN112891940B (zh) Image data processing method and apparatus, storage medium, and computer device
CN109417651B (zh) Generating challenges using a location-based game companion application
Chung Metaverse XR Components
WO2023030106A1 (fr) Object display method and apparatus, electronic device, and storage medium
CN110314377B (zh) Method and apparatus for randomly generating an object movement path in three-dimensional space
CN118105689A (zh) Virtual reality-based game processing method and apparatus, electronic device, and storage medium
CN116764215A (zh) Virtual object control method, apparatus, device, storage medium, and program product
CN115359164A (zh) Method, system, electronic device, and storage medium for presenting an object at the center of a screen

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21931187

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21931187

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 29/02/2024)