WO2022193612A1 - Motion processing method and apparatus for game character, and storage medium and computer device - Google Patents

Motion processing method and apparatus for game character, and storage medium and computer device

Info

Publication number
WO2022193612A1
Authority
WO
WIPO (PCT)
Prior art keywords
game
motion
data
moving image
game character
Prior art date
Application number
PCT/CN2021/121092
Other languages
French (fr)
Chinese (zh)
Inventor
曾浩强
林栋国
余婉
陈星雨
Original Assignee
天津亚克互动科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 天津亚克互动科技有限公司
Publication of WO2022193612A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00: Animation
    • G06T13/20: 3D [Three Dimensional] animation
    • G06T13/40: 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/005: General purpose rendering architectures

Definitions

  • the present invention relates to the field of computer technology, and in particular, to a method and device for motion processing of game characters, a storage medium, and a computer device.
  • AR: Augmented Reality
  • the player can walk around in the real world holding a mobile device, driving the virtual game character to walk a corresponding distance in the corresponding direction in the game world.
  • in the existing method, the virtual scene must be made flat, with no obstacles in the scene, and the character must be prohibited from jumping up and down; otherwise, the virtual character would clip through the virtual scene and visual goofs would appear. As a result, AR gameplay cannot be applied to all game scenarios, and scenes must be specially created for AR gameplay.
  • the present invention provides a motion processing method and apparatus, a storage medium, and computer equipment for a game character, which do not require a dedicated game scene model to be built for the AR mode and, at the same time, avoid the clipping goofs caused by the game character rigidly following the player's real-world movement path, which not only improves the playability of the game and increases the gameplay, but also ensures the display effect of the game.
  • a motion processing method for a game character, comprising: in response to an AR mode interactive operation request in a game client, invoking a game engine in the game client to render a real-time 3D game virtual scene; and, according to the real-time real displacement data of the game player in the real world, driving the game character controlled by the game player to perform non-interpenetrating motion in the 3D game virtual scene.
  • a motion processing apparatus for a game character, comprising: a request response module, configured to invoke a game engine in the game client to render a real-time 3D game virtual scene in response to an AR mode interactive operation request in the game client; and a motion driving module, configured to drive the game character controlled by the game player to perform non-interpenetrating motion in the 3D game virtual scene according to the real-time real displacement data of the game player in the real world.
  • a computer-readable medium on which a computer program is stored, wherein the program, when executed by a processor, implements the steps of the above-mentioned motion processing method for a game character.
  • a computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the above-mentioned motion processing method for a game character when executing the program.
  • a computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the steps of the above-mentioned motion processing method for a game character.
  • according to the game character motion processing method and apparatus, storage medium, and computer device, in response to the AR mode interactive operation request in the game client, the game engine renders the 3D game virtual scene in real time.
  • the game client collects the real-time real displacement data of the game player in the real world, and uses the real displacement data together with the motion surface data file of the game scene to drive the game character to perform non-interpenetrating motion in the game.
  • compared with the poor motion-performance effect caused by running the AR mode in an existing scene, or with modeling a scene exclusively for the AR mode to avoid goofs, the embodiment of the present invention does not need to create a dedicated game scene model for the AR mode; it only needs to use the pre-built motion surface data file to analyze whether the game character can follow the player's real-world movement trajectory in the 3D game virtual scene, so as to render the moving image frames. The implementation is simple and the threshold is low, and it avoids the game character rigidly following the player's real-world movement path in the 3D game virtual scene. While improving the playability of the game and increasing the gameplay, the game display effect is guaranteed, and the AR game mode brings players an extraordinary gaming experience combining the virtual and the real.
  • FIG. 1 schematically shows a flowchart of a method for processing the motion of a game character according to an embodiment of the present invention;
  • FIG. 2 schematically shows a diagram of dotted ray emission according to an embodiment of the present invention;
  • FIG. 3 schematically shows a structural diagram of a motion processing apparatus for a game character according to an embodiment of the present invention;
  • Figure 4 schematically shows a block diagram of a computer device for implementing the method according to the invention.
  • Figure 5 schematically shows a block diagram of a computer program product implementing the method according to the invention.
  • a motion processing method for a game character includes:
  • Step 101: in response to the AR mode interactive operation request in the game client, invoking the game engine in the game client to render a real-time 3D game virtual scene;
  • Step 102: driving, according to the real-time real displacement data of the game player in the real world and the motion surface data file corresponding to the 3D game virtual scene, the game character controlled by the game player to perform non-interpenetrating motion in the 3D game virtual scene.
  • the embodiments of the present invention may be applied to a game client, and the game client may include intelligent electronic devices such as smart phones and tablet computers.
  • when the game client runs the game, in response to an AR mode interactive operation request for a 3D game virtual scene in the game, the game engine performs real-time rendering of the corresponding 3D game virtual scene.
  • the motion processing method for the game character may include: Step 103, reading a motion surface data file corresponding to the 3D game virtual scene, wherein the motion surface data file is used to indicate the movable positions of the 3D game virtual scene.
  • the 3D game virtual scene may be a specific game world in the game, such as the player's exclusive homestead in the game, and the motion surface data file may be a set of data pre-established based on the landform data of the 3D game virtual scene to reflect the movable positions of the 3D game virtual scene.
  • the motion surface data file may contain data with different meanings.
  • the motion surface data file may include information such as passable points, impassable points, landform boundaries, and landform heights in the 3D game virtual scene.
  • alternatively, the motion surface data file may include only the passable points in the 3D game virtual scene.
  • the motion feasibility of the game character in the three-dimensional game virtual scene can be judged through the motion surface data file, and in some embodiments, it can be determined whether the game character can move to a certain position.
  • the real displacement data generated when the player moves while holding the game client can be used to analyze the target position that the game character in the AR mode should reach if the player's real movement is simulated; that is, in the AR mode, the player controls the movement of the game character in the virtual world by moving while holding the game client.
  • the parameters used to control the movement of the game character may be the real displacement data collected by the game client at a preset frame rate.
  • step 102 may include:
  • Step 102-1: collecting the real displacement data corresponding to the game client at the preset frame rate, and generating, frame by frame, the target position of the game character in the 3D game virtual scene according to the real displacement data and the initial position of the game character in the 3D game virtual scene;
  • Step 102-2: rendering, frame by frame, through the game engine, the moving image frames corresponding to the game character in the 3D game virtual scene according to the motion surface data file and the target position.
  • the game character is controlled to move in the 3D game virtual scene toward the corresponding target position, according to the moving direction and moving distance indicated by the real displacement data.
  • the initial position may be determined in different ways. For example, if the game character is already in the 3D game virtual scene before the game client responds to the AR mode interactive operation request, that is, the request is used to instruct running the AR mode in the current game scene where the game character is located, the initial position may be the position of the game character when the game client responds to the request. As another example, if the game character is located in another game scene before the client responds to the request, that is, the request is used to instruct the game character to switch from the current game scene to the 3D game virtual scene and run the AR mode there,
  • the initial position at this time may be a position preset by the game developer.
  • the walking environment in the real world is different from that in the game world; if the movement of the game character in the game world is controlled by completely following the real displacement data of the game client, the game character may pass through obstacles such as walls or walk out of the map.
  • therefore, the target position of the game character in the 3D game virtual scene corresponding to each piece of real displacement data is calculated from the real displacement data of the game client and the initial position of the game character in the 3D game virtual scene, and the movable positions indicated in the motion surface data file are then used to analyze whether the game character can move to the target position, that is, whether the target position belongs to a movable position in the 3D game virtual scene, so that the moving image frame is obtained by image rendering based on the analysis result.
  • when it is determined that the target position is a movable position, a moving image frame of the game character moving to the target position can be rendered; when it is determined that the target position is not a movable position, that is, the game character should not move to the target position, a moving image frame in which the game character stops moving or stands still can be rendered. In this way, the game character will not appear to pass through obstacles, move out of the map, and so on, which improves the authenticity of the game character's movement, so that the game character in the 3D game virtual scene in AR mode can not only follow the player's movement in the real world, but also avoid rigidly following the player's real-world movement path, without the need to re-develop a dedicated map for the AR mode; this improves the playability of the game and increases the gameplay while ensuring the game display effect.
  • in the embodiment of the present invention, in response to the AR mode interactive operation request, the motion surface data file corresponding to the 3D game virtual scene indicated by the request is read; at the same time, the real displacement data corresponding to the game client is acquired at the preset frame rate, the target position of the game character in the 3D game virtual scene is determined frame by frame based on the real displacement data, the read motion surface data file is used to analyze whether the target position is movable, and the corresponding moving image frame of the game character is rendered based on the analysis result.
  • compared with the poor motion-performance effect caused by running the AR mode in an existing scene, or with modeling a scene exclusively for the AR mode, the embodiment of the present invention does not need to create a dedicated game scene model for the AR mode; it only needs to use the pre-built motion surface data file to analyze whether the game character can follow the player's real-world movement trajectory in the 3D game virtual scene, so as to render the moving image frames, and it avoids the goofs caused by the game character rigidly following the player's real-world movement path in the 3D game virtual scene. This not only improves the playability of the game and increases the gameplay, but also ensures the game display effect, and the AR game mode brings players an extraordinary gaming experience combining the virtual and the real.
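  • The per-frame flow described above can be sketched as follows; this is a minimal illustration in which the class and function names (MotionSurface, drive_character_frame) are assumptions chosen for the sketch, not taken from the patent.

```python
# A minimal per-frame sketch of the drive loop described above: the player's
# real displacement is scaled into the virtual world, a target position is
# computed, and the pre-built motion surface data decides whether a "move"
# frame or a "collision rule" frame is rendered. All names are assumptions.
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass
class Vec3:
    x: float
    y: float
    z: float

class MotionSurface:
    """Pre-built motion surface data: movable plane coordinates -> movable height."""
    def __init__(self, movable_heights: Dict[Tuple[float, float], float]):
        self.movable_heights = movable_heights

    def height_at(self, x: float, z: float) -> Optional[float]:
        # None means the (x, z) plane coordinate is not a movable position.
        return self.movable_heights.get((round(x, 2), round(z, 2)))

def drive_character_frame(position: Vec3, real_dx: float, real_dz: float,
                          scale: float, surface: MotionSurface) -> Vec3:
    """One frame: real displacement -> target position -> movability check."""
    target_x = position.x + real_dx * scale   # scale real displacement into the game
    target_z = position.z + real_dz * scale
    height = surface.height_at(target_x, target_z)
    if height is not None:
        # Movable: render the character moving to the target (at its height).
        return Vec3(target_x, height, target_z)
    # Not movable: apply the preset collision rule (stand still / walk on the spot).
    return position
```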
  • step 102-1 may include:
  • Step 102-1.1 Acquire the positioning data collected by the positioning device of the game client according to the preset frame rate, and calculate the real displacement data corresponding to the game client frame by frame;
  • Step 102-1.2: determining the virtual displacement data of the game character corresponding to the real displacement data according to a preset scale factor, and determining the target position of the game character in the 3D game virtual scene according to the virtual displacement data and the initial position.
  • a positioning device (such as a GPS module) in the game client can be used to collect positioning data at the preset frame rate, and the real displacement data is calculated frame by frame based on the collected positioning data; the positioning data collected by the positioning device can be longitude and latitude data, and the real displacement data can reflect the moving direction and moving distance of the user.
  • the game character can follow the real movement of the player in the real world, and the virtual displacement data corresponding to the real displacement data can be calculated according to the preset scale factor.
  • for example, if the preset scale factor is 1, the virtual displacement data is the same as the real displacement data: when the player moves 1 meter in a certain direction in the real world, the virtual displacement data also represents moving 1 meter in that direction.
  • for another example, if the preset scale factor is 10, the player moving 1 meter in the real world causes the game character to move 10 meters in the game, so that the game character can follow the player's real-world movement trajectory in proportion, bringing the player an immersive game experience.
  • the method of determining the target position based on the virtual displacement data and the initial position differs from the method of generating the target position according to the real displacement data and the initial position in the above embodiment in that the real displacement data is first transformed into the corresponding virtual displacement data, and the target position is then calculated from the virtual displacement data obtained by the transformation; the specific calculation method will not be repeated here.
  • alternatively, the target position can be calculated directly based on the real displacement data and the initial position, that is, step 102-1 can be replaced with: determining the target position of the game character in the 3D game virtual scene according to the real displacement data and the initial position.
  • in order to improve the motion-performance effect of the game character in the 3D game virtual scene and make the motion of the game character smoother, the real displacement data can be interpolated, and the interpolation result can be used to determine the above-mentioned target position.
  • "collecting the real displacement data corresponding to the game client at a preset frame rate" may be: acquiring first positioning data and second positioning data from the positioning data, wherein the first sampling frame corresponding to the first positioning data and the second sampling frame corresponding to the second positioning data differ by a preset number; performing interpolation based on the first positioning data and the second positioning data to obtain interpolation position data matching the preset number, and calculating the interpolation displacement data corresponding to the game client according to the interpolation position data, wherein the real displacement data includes the interpolation displacement data.
  • the first positioning data and the second positioning data may be obtained from the positioning data according to the preset number. For example, if the preset number is 10, the first sampled positioning data is used as the first positioning data and the tenth sampled positioning data is used as the second positioning data; the first positioning data and the second positioning data are interpolated to calculate the 8 interpolation data between them, and the first positioning data, the 8 pieces of interpolation data, and the second positioning data are used as 10 pieces of interpolation position data. When the virtual displacement data and the target position are subsequently calculated, the displacement data obtained by interpolation is used for the calculation.
  • the movement of the game character can then be controlled according to the interpolation displacement data, which helps make the movement trajectory of the game character smoother and improves the motion-performance effect.
  • alternatively, the first positioning data and the second positioning data can also be collected directly at a fixed time interval. For example, if the preset frame rate is 40 Hz and the preset number is 10, the fixed time interval is 0.25 seconds, and the first positioning data and the second positioning data are obtained 0.25 seconds apart, so as to reduce the data collection amount of the positioning device.
  • the first positioning data and the second positioning data may be obtained at different movement stages of the player in the real world. For example, the first positioning data and the second positioning data may be obtained during the player's starting stage, to prevent the player's stop-and-go at the start from causing the game character to stutter in the game world; or the first positioning data and the second positioning data may be obtained during the player's stopping stage, to prevent a sudden stop by the player from causing the game character to stop abruptly with poor motion performance.
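  • The interpolation described above can be illustrated with the following sketch, assuming simple linear interpolation between two latitude/longitude fixes; the function names and the example coordinates are assumptions for illustration only.

```python
# Two positioning samples a preset number of frames apart are linearly
# interpolated into per-frame positions, from which per-frame displacements
# are derived.
from typing import List, Tuple

def interpolate_positions(first_fix: Tuple[float, float],
                          second_fix: Tuple[float, float],
                          preset_number: int) -> List[Tuple[float, float]]:
    """Return preset_number positions from first_fix to second_fix, inclusive."""
    lat0, lon0 = first_fix
    lat1, lon1 = second_fix
    steps = preset_number - 1
    return [(lat0 + (lat1 - lat0) * i / steps,
             lon0 + (lon1 - lon0) * i / steps) for i in range(preset_number)]

def displacements(positions: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Per-frame displacement between consecutive interpolated positions."""
    return [(b[0] - a[0], b[1] - a[1]) for a, b in zip(positions, positions[1:])]

# Example with the preset number 10: the two fixes plus 8 interpolated points.
frames = interpolate_positions((39.0842, 117.2010), (39.0843, 117.2012), 10)
per_frame_moves = displacements(frames)   # smooth per-frame displacements
```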
  • the motion surface data file may include plane coordinates and height data corresponding to movable positions in the 3D game virtual scene, determined based on the landform data of the 3D game virtual scene; step 103 may include: loading, in the 3D game virtual scene, the motion surface corresponding to the motion surface data file, wherein the motion surface is hidden when the 3D game virtual scene is displayed.
  • the motion surface data file may be pre-generated according to the landform data of the 3D game virtual scene, and the file may include the plane coordinates and height data of each movable position in the 3D game virtual scene; a plane coordinate and its corresponding height data represent the movable height of a certain movable position in the 3D game virtual scene space. For example, steps (stairs) in the 3D game virtual scene are movable positions, and the plane coordinates and height data corresponding to the steps reflect the position of the game character's feet on the steps when the character moves to this position.
  • the motion surface corresponding to the file can be loaded in the game based on the motion surface data file; the motion surface can be hidden in the game, so the player cannot see the motion surface through the game client display screen, and the motion surface is only used to analyze whether a certain target position is movable.
  • alternatively, the motion surface can also be displayed in the game, so that the player can see it and pay attention to avoiding non-movable positions as much as possible, guaranteeing smooth navigation within the game world.
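  • A possible shape for such a motion surface data file, and a loader that keeps the resulting surface hidden, is sketched below; the JSON layout is assumed for illustration, since the source does not prescribe a concrete format.

```python
# Loads plane coordinates + movable heights and builds a hidden surface record.
import json

def load_motion_surface(path, visible=False):
    """Read plane coordinates + movable heights and build a surface record."""
    with open(path, "r", encoding="utf-8") as f:
        data = json.load(f)   # e.g. {"movable": [{"x": 4.0, "z": 7.5, "height": 0.45}, ...]}
    return {
        "heights": {(p["x"], p["z"]): p["height"] for p in data["movable"]},
        # Hidden by default: used only for movability analysis, never drawn.
        "visible": visible,
    }

# Example record: a step (stair) cell at plane coordinate (4.0, 7.5) whose
# movable height is 0.45 above the scene origin.
example_file_content = {"movable": [{"x": 4.0, "z": 7.5, "height": 0.45}]}
```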
  • step 102-2 may include:
  • Step 102-2-A1: generating a dotted ray based on the target position, and emitting the dotted ray toward the motion surface to perform ray dotting, wherein the dotted ray is perpendicular to the plane where the target position is located;
  • Step 102-2-A2: if the dotted ray intersects the motion surface, rendering a first moving image frame in which the game character moves to the target position, wherein the moving image frame includes the first moving image frame;
  • Step 102-2-A3: if the dotted ray does not intersect the motion surface, rendering, according to a preset collision motion rule, a second moving image frame in which the game character moves at the corresponding current position, wherein the moving image frame includes the second moving image frame.
  • the dotted ray is generated in the vertical direction according to the target position (the endpoint of the dotted ray can be the highest point of the 3D game virtual scene with the direction vertically downward, or the endpoint can be taken at the lowest point with the direction vertically upward), and the dotted ray is emitted for dotting. If the dotted ray can hit the motion surface, that is, the dotted ray and the motion surface intersect, as shown in FIG. 2: assume that the initial position of the game character is point A, the target position is point B, and the motion surface is S; the dotted ray is emitted upward from point B and has an intersection point B' with S, indicating that the game character can move to the target position, and the first moving image frame of the game character moving to the target position (here, specifically B') is rendered, where the height data corresponding to the target position should be considered when the game character moves to the target position, to ensure that the game character moves based on the topography of the scene.
  • otherwise, if the dotted ray does not intersect the motion surface, the second moving image frame can be rendered according to the preset collision motion rule.
  • the preset collision motion rule may be that, in this case, the game character walks on the spot, stops on the spot, and so on.
  • this dotted ray detection method can ensure that the game character exhibits no clipping in the game scene, with small system overhead and high efficiency.
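  • The dotted ray check can be sketched as follows, with ray_hit_height standing in for the engine's raycast against the loaded motion surface; all names are illustrative assumptions.

```python
# A vertical ray is cast from the target position B; the character moves only
# if the ray hits the motion surface S (intersection point B'), as in FIG. 2.
def ray_hit_height(surface_heights, x, z):
    """Vertical dotted ray at plane coordinate (x, z): hit height B', or None."""
    return surface_heights.get((x, z))

def resolve_move(surface_heights, current_pos, target_xz):
    hit = ray_hit_height(surface_heights, *target_xz)
    if hit is not None:
        # Ray intersects the motion surface: render the first moving image
        # frame, moving the character to B' = (target_x, hit height, target_z).
        return ("move", (target_xz[0], hit, target_xz[1]))
    # No intersection: fall back to the preset collision motion rule,
    # e.g. walking on the spot or stopping at the current position.
    return ("collision_rule", current_pos)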
  • step 102-2-A2 may include:
  • Step 102-2-A2.1: obtaining the height of the intersection point of the dotted ray and the motion surface;
  • Step 102-2-A2.2: if the intersection height matches the AR mode riding state information of the 3D game virtual scene, rendering, in the riding state corresponding to the intersection height, a third moving image frame in which the game character moves to the target position, wherein the first moving image frame includes the third moving image frame;
  • Step 102-2-A2.3: if the intersection height does not match the AR mode riding state information, rendering, according to the preset collision motion rule, a fourth moving image frame in which the game character moves at the corresponding current position, wherein the first moving image frame includes the fourth moving image frame.
  • the height information of the intersection point can be used to further analyze whether the target position is a movable position; if so, the third moving image frame in which the game character moves to the target position in the riding state corresponding to the intersection height can be rendered.
  • for example, the 3D game virtual scene is a manor, a city, or the like in the game, which supports game characters traveling on the surface by walking, riding, and so on, and also supports game characters traveling in the air by means such as qinggong (light-body skill) or aircraft. If the intersection height is within the height range that supports the surface travel mode, image frames of the game character moving to the target position by walking, riding, or other surface travel methods can be rendered. If the intersection height does not match the riding state information in the AR mode, it means that, although the target position is a movable position in the normal mode, moving to it is not supported in the AR mode, and the fourth moving image frame should be rendered according to the preset collision motion rule, for example, when the intersection height is within a height range that only supports the air travel mode.
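  • The intersection-height check against the AR mode riding state information can be illustrated as below, where the height ranges assigned to each travel mode are invented for the sketch, not patent data.

```python
# Each travel mode supported in the AR mode is assumed to map to a height
# range; the hit height selects the riding state or falls back to the rule.
AR_RIDING_RANGES = {
    "walk": (0.0, 0.5),    # surface travel: ground-level heights
    "ride": (0.0, 0.5),
    "boat": (-0.2, 0.1),   # water-surface heights
}

def riding_state_for_height(hit_height):
    for state, (low, high) in AR_RIDING_RANGES.items():
        if low <= hit_height <= high:
            return state
    return None   # e.g. an air-travel-only height: not matched in AR mode

state = riding_state_for_height(0.3)     # "walk" -> render the third frame
blocked = riding_state_for_height(8.0)   # None  -> render the fourth frame
```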
  • the motion surface may also be loaded as a navigation motion surface, and the motion surface may include a 2D navigation mesh motion surface and a 3D voxel motion surface. The 2D navigation mesh motion surface may represent the areas of the 3D game virtual scene where the game character can move on the surface by walking, riding, and so on, while the 3D voxel motion surface can reflect the connectivity of each movable voxel in the 3D game virtual scene, that is, between connected voxels the character can move in various ways such as surface travel, air travel, and so on. Based on the 2D navigation mesh motion surface or the 3D voxel motion surface, pathfinding of the game character in the game world can be realized.
  • step 102-2 may include:
  • Step 102-2-B1: if the target position matches the 2D navigation mesh motion surface, obtaining the preset riding information of the target position corresponding to the 2D navigation mesh motion surface, determining the target riding state of the game character according to the preset riding information and the current riding state of the game character, and rendering, based on the 2D navigation mesh motion surface, a fifth moving image frame in which the game character moves to the target position in the target riding state, wherein the moving image frame includes the fifth moving image frame;
  • Step 102-2-B2: if the target position does not match the 2D navigation mesh motion surface, rendering a sixth moving image frame in which the game character moves at the corresponding current position, wherein the moving image frame includes the sixth moving image frame.
  • if the projection of the 2D navigation mesh motion surface onto the plane includes the target position represented by the in-game plane coordinates, or the dotted ray emitted based on the target position intersects the 2D navigation mesh motion surface, the target position matches the 2D navigation mesh motion surface, which means that the game character can move to the target position. In this case, the preset riding information corresponding to the vertical projection position point of the target position on the 2D navigation mesh motion surface is obtained; the preset riding information may include walking, riding, boating, and so on. It is further determined whether the current riding state of the game character belongs to the riding states indicated by the preset riding information: if so, the current riding state of the game character is used as the target riding state; if not, the target riding state of the game character is switched to one indicated by the preset riding information, such as switching from the walking state to the riding state, so as to render the fifth moving image frame in which the game character moves to the target position in the target riding state (the target position here refers to the vertical projection position of the target position on the 2D navigation mesh motion surface).
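  • The 2D navigation mesh branch can be sketched as follows, with an assumed cell dictionary carrying the preset riding information.

```python
# Project the target position onto the navmesh, read that cell's preset riding
# information, and keep the current riding state when allowed, else switch.
def match_navmesh(navmesh_cells, target_xz):
    """Return the navmesh cell whose planar projection contains the target, if any."""
    return navmesh_cells.get(target_xz)

def choose_riding_state(current_state, allowed_states):
    # Keep the current riding state when the cell permits it; otherwise switch.
    return current_state if current_state in allowed_states else allowed_states[0]

navmesh_cells = {(3.0, 4.0): {"allowed": ["walk", "ride"], "height": 0.0}}
cell = match_navmesh(navmesh_cells, (3.0, 4.0))
if cell is not None:
    target_state = choose_riding_state("boat", cell["allowed"])   # -> "walk"
    # render the fifth moving image frame moving in target_state
else:
    pass  # render the sixth moving image frame at the current position
```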
  • step 102-2 may include:
  • Step 102-2-C1: if the target position matches the 3D voxel motion surface, obtaining the preset riding information of the target position corresponding to the 3D voxel motion surface, determining the target riding state of the game character according to the preset riding information and the current riding state of the game character, determining, based on the 3D voxel motion surface, the pathfinding information for the game character to move from the current position to the target position, and rendering, according to the pathfinding information, a seventh moving image frame in which the game character moves to the target position in the target riding state, wherein the moving image frame includes the seventh moving image frame;
  • Step 102-2-C2: if the target position does not match the 3D voxel motion surface, rendering an eighth moving image frame in which the game character moves at the corresponding current position, wherein the moving image frame includes the eighth moving image frame.
  • the 3D voxel motion surface includes connectable voxels in the 3D game virtual scene.
  • if the projection of the 3D voxel motion surface onto the plane includes the target position, or the dotted ray emitted based on the target position can hit the 3D voxel motion surface, the target position matches the 3D voxel motion surface, which means that the game character can move to the target position. In this case, the preset riding information corresponding to the vertical projection position point of the target position on the 3D voxel motion surface is obtained; the preset riding information may include surface travel methods such as walking, riding, and boating, and may also include air travel methods such as qinggong (light-body skill) and aircraft. When determining the target riding state for the game character to move to the target position, the riding state in the preset riding information that is the same as or similar to the current riding state is preferentially selected: for example, if the current riding state is walking, the walking state is prioritized as the target riding state, riding is considered second, and air travel modes are considered last, so as to maintain the continuity of the riding state, avoid unnecessary riding state switching in the game, and improve the user experience.
  • pathfinding information for the movement of the game character from the current position to the target position is generated based on the 3D voxel motion surface.
  • the seventh moving image frame in which the game character moves to the target position in the target riding state is then rendered (the target position here refers to the vertical projection voxel position of the target position on the 3D voxel motion surface; if multiple vertical projection voxel positions correspond to the target position, one of them may be selected as the target voxel position based on the shortest-path-first principle).
  • otherwise, if the target position does not match the 3D voxel motion surface, the eighth moving image frame is rendered according to the preset collision motion rule.
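  • The 3D voxel branch can be sketched as follows, assuming a breadth-first search as one shortest-path-first realization and an assumed riding-state preference order.

```python
# Prefer a riding state close to the current one (surface modes before air
# modes), then find a shortest path through the connected voxels.
from collections import deque

RIDING_PREFERENCE = ["walk", "ride", "boat", "qinggong", "aircraft"]

def choose_target_state(current_state, allowed_states):
    if current_state in allowed_states:
        return current_state
    return next(s for s in RIDING_PREFERENCE if s in allowed_states)

def shortest_voxel_path(adjacency, start, goal):
    """Breadth-first search over connected voxels (shortest-path-first)."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path          # pathfinding information used to render frames
        for neighbour in adjacency.get(path[-1], ()):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(path + [neighbour])
    return None                  # unreachable: fall back to the collision rule
```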
  • step 102-2 may further include:
  • Step 102-2-D1: obtaining the viewing angle data derived from the angular motion detection device of the game client at the preset frame rate;
  • Step 102-2-D2: rendering, frame by frame, according to the motion surface data file and the target position, the moving image frames of the game character matching the viewing angle data, wherein the moving image frames are used to display the 3D game virtual scene matching the viewing angle data and the movement actions of the game character.
  • the game client can not only acquire real displacement data but also acquire angular motion data through an angular motion detection device (such as a gyroscope) and process it into corresponding viewing angle data, so as to render the moving image frames matching the viewing angle data. In this way, the angular motion of the game client can be reflected in the moving image frames, and the player can change the viewing angle of the game world by turning the mobile phone (when the game client is a mobile phone), so that the moving image frames show the environment picture in the 3D game virtual scene that matches the user's real-time perspective, allowing the player to better browse the game world and improving the user experience.
  • steps 102-2-D1 and D2 can be combined with steps 102-2-A2.1 to 102-2-A2.3, steps 102-2-B1 to B2, and steps 102-2-C1 to C2 to render moving image frames that match the target position (or the current position and the preset collision motion rule), the riding state, and the viewing angle data, so that the game character can satisfy the player's need to rotate the viewing angle at will while moving, without causing goofs.
  • step 102-2-D1 may include: acquiring the angular motion data collected by the angular motion detection device at the preset frame rate; and generating, according to a camera movement category, the viewing angle data corresponding to the angular motion data.
  • when the camera movement category includes first-person camera movement, the viewing angle data is the first-person perspective of the game character in the 3D game virtual scene, and the viewing angle data changes correspondingly with the angular motion data on the basis of the preset initial viewing angle of the game character; when the camera movement category includes third-person camera movement, the viewing angle data is the third-person perspective looking at the game character from a preset position and preset angle corresponding to the position and direction of the game character, and the direction of the game character changes correspondingly with the angular motion data on the basis of the preset initial direction of the game character.
  • the viewing angle data is obtained by processing the angular motion data collected by the angular motion detection device.
  • the viewing angle data should be calculated based on the camera movement category corresponding to the 3D game virtual scene or selected by the user.
  • if the camera movement category is first-person camera movement, the image displayed by the moving image frame is the image corresponding to the first-person perspective of the game character, that is, the game world environment that the game character "sees"; the viewing angle data of the first moving image frame is the preset initial viewing angle of the game character, and the subsequent viewing angle data changes correspondingly with the angular motion data on the basis of the preset initial viewing angle.
  • if the camera movement category is third-person camera movement, the image displayed by the moving image frame is the image corresponding to a third-person perspective "looking at" the game character, that is, showing the game character and the character's surrounding environment that a virtual third person "sees". The third-person perspective is based on the position and direction of the game character: the virtual third person is "located" at a preset position and preset angle relative to the game character, and the position and angle at which the virtual third person "looks" at the game character remain relatively unchanged. That is, the third-person perspective follows the displacement and viewing angle rotation of the game character in the game world, which changes the position and angle of the virtual third person in game world coordinates, but the virtual third person does not move relative to the coordinate system established with the game character as the origin. The direction of the game character changes correspondingly with the angular motion data on the basis of the preset initial direction (the game character rotates by the same number of degrees that the angular motion data indicates), and the third-person perspective changes accordingly following the above rule, thereby providing a richer AR mode display effect and more gameplay.
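  • The derivation of viewing angle data from the gyroscope's angular motion data can be sketched as follows for both camera movement categories, reduced to a single yaw angle for brevity; the third-person offsets are assumed values.

```python
# First-person: the view angle itself follows the device rotation.
# Third-person: the character's facing follows the device rotation, while the
# virtual observer keeps a fixed offset in the character's local frame.
import math

def first_person_view(initial_yaw, gyro_delta_yaw):
    return initial_yaw + gyro_delta_yaw

def third_person_view(char_pos, char_yaw, gyro_delta_yaw,
                      offset_back=3.0, offset_up=1.5):
    yaw = char_yaw + gyro_delta_yaw
    cam_x = char_pos[0] - offset_back * math.sin(math.radians(yaw))
    cam_z = char_pos[2] - offset_back * math.cos(math.radians(yaw))
    cam_y = char_pos[1] + offset_up
    return (cam_x, cam_y, cam_z), yaw   # observer position and look direction
```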
  • the motion surface data file may include plane coordinates and height data corresponding to movable positions in the 3D game virtual scene, determined based on the landform data of the 3D game virtual scene; the motion surface data file may be modified according to the real-time environment of the 3D game scene.
  • step 102-2 may include:
  • Step 102-2-E1: if the target position matches the plane coordinates of a movable position, generating, according to the height data corresponding to the target position, a ninth moving image frame in which the game character moves to the target position, wherein the moving image frame includes the ninth moving image frame;
  • Step 102-2-E2: if the target position does not match the plane coordinates of a movable position, generating, according to the preset collision motion rule, a tenth moving image frame in which the game character moves at the corresponding current position, wherein the moving image frame includes the tenth moving image frame.
  • in this embodiment, the game character is driven to move without clipping directly based on the data in the motion surface data file: when the target position is a movable position, the game character is simply driven to move to the position given by the height data corresponding to the target position.
  • the plane coordinates and height data of the movable position saved in the motion surface data file can be modified, added, and deleted according to the real-time changes of the game scene.
  • for example, when a table is placed in the scene, the position where the table is placed becomes a non-movable position, and the relevant data in the motion surface data file can be modified accordingly, so that when the file is used to drive the movement of the game character, the game character can be guaranteed to move without clipping even if the environment of the game scene changes.
  • the motion connectivity of the game character is calculated from the motion surface data file modified in real time, so as to drive the motion of the character and make the motion of the game character in the virtual world more realistic.
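  • Run-time modification of the motion surface data can be illustrated as below, using the same assumed dictionary layout as the earlier sketches, not the patent's file format.

```python
# Modify the movable set when the scene changes (e.g. a table is placed).
def block_position(surface_heights, x, z):
    """Remove a plane coordinate from the movable set (now obstructed)."""
    surface_heights.pop((x, z), None)

def add_position(surface_heights, x, z, height):
    """Register a newly movable plane coordinate with its movable height."""
    surface_heights[(x, z)] = height

surface_heights = {(4.0, 7.5): 0.45}
block_position(surface_heights, 4.0, 7.5)      # a table now occupies (4.0, 7.5)
add_position(surface_heights, 5.0, 7.5, 0.45)  # e.g. a newly walkable spot
```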
  • a gameplay of entering the real world from the 3D game virtual scene can also be provided.
  • the method may further include: S1, in response to a request for opening a portal from the 3D game virtual scene to the real world, acquiring a first real-time real image frame corresponding to the real world, saving the first real-time real image frame as a first texture, and acquiring a first real-time virtual image frame corresponding to the 3D game virtual scene (the first real-time virtual image frame is a moving image frame), wherein the first texture is used to render a preset portal model corresponding to the portal; S2, rendering, according to the first texture, the preset portal model, and the first real-time virtual image frame, a first real-time rendered image frame including the portal.
  • the images collected from the real world are stored in the device memory in the form of textures, so that the game engine can render them onto the preset portal model through rendering technology, achieving the effect that the real-world real-time environment is displayed inside the portal while the real-time environment of the 3D game virtual scene is rendered outside the portal.
  • the image displayed in the game thus has a highly unified effect, with no sense of separation.
  • the real-time environments of the real and virtual worlds can be displayed in real time inside and outside the portal, creating a more "real" portal effect. Through the portal, the player can observe the environment of the other world in real time, which helps to improve the player's experience, enhance the display effect of the game screen, and enhance the playability of the game; through the AR game mode, it brings players an extraordinary gaming experience combining the virtual and the real, and provides technical support for increasing gameplay.
  • the game screen includes a virtual portal or does not include a virtual portal.
  • one is a situation where the virtual portal can be seen from the perspective of the game character in the virtual world, and the other is a situation where the virtual portal cannot be seen from the perspective of the game character in the virtual world.
  • the position and direction of the virtual portal in the 3D game virtual scene can be determined by the player's choice; once the position and direction of the virtual portal are determined, the shape and size of the virtual portal displayed in the game change with the different positions and directions of the game character in the 3D game virtual scene.
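  • The portal composition can be sketched as follows; the engine and camera_device objects and their methods are placeholders rather than a real engine's API.

```python
# The real-world camera frame is cached as a texture and drawn on the preset
# portal model, while the rest of the frame shows the virtual scene.
def render_portal_frame(engine, camera_device, portal_model, virtual_scene):
    real_frame = camera_device.capture()                       # first real-time real image frame
    portal_texture = engine.texture_from_image(real_frame)     # saved as the first texture
    engine.set_material_texture(portal_model, portal_texture)  # real world shown inside the gate
    virtual_frame = engine.render_scene(virtual_scene)         # first real-time virtual image frame
    # Composite: the virtual scene outside the portal, the real-world texture inside it.
    return engine.composite(virtual_frame, portal_model)
```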
  • a motion processing device for a game character includes:
  • a request-response module used for invoking the game engine in the game client to render a real-time 3D game virtual scene in response to an AR mode interactive operation request in the game client;
  • the motion driving module is used for driving the game character controlled by the game player to perform non-interpenetrating motion in the 3D game virtual scene according to the real-time real displacement data of the game player in the real world.
  • the motion processing apparatus for the game character may further include: a file reading module, configured to read the motion surface data file corresponding to the real-time 3D game virtual scene rendered by invoking the game engine in the game client, wherein the motion surface data file is used to indicate the movable positions of the 3D game virtual scene.
  • the motion driving module can be configured to: collect the real displacement data corresponding to the game client at the preset frame rate, and generate, frame by frame, the target position of the game character in the 3D game virtual scene according to the real displacement data and the initial position of the game character in the 3D game virtual scene; and render, frame by frame, through the game engine, the moving image frames corresponding to the game character in the 3D game virtual scene according to the motion surface data file and the target position.
  • the motion driving module is further configured to: acquire the positioning data collected by the positioning device of the game client at the preset frame rate, and calculate, frame by frame, the real displacement data corresponding to the game client; and determine, according to a preset scale factor, the virtual displacement data of the game character corresponding to the real displacement data, and determine the target position of the game character in the 3D game virtual scene according to the virtual displacement data and the initial position.
  • the motion driving module is further configured to: acquire first positioning data and second positioning data from the positioning data, wherein the first sampling frame corresponding to the first positioning data and the second sampling frame corresponding to the second positioning data differ by a preset number; and perform interpolation based on the first positioning data and the second positioning data to obtain interpolation position data matching the preset number, and calculate, according to the interpolation position data, the interpolation displacement data corresponding to the game client, wherein the real displacement data includes the interpolation displacement data.
  • the motion surface data file may include plane coordinates and height data corresponding to movable positions in the 3D game virtual scene determined based on the landform data of the 3D game virtual scene;
  • the file reading module can be configured to: load, in the 3D game virtual scene, the motion surface corresponding to the motion surface data file, wherein the motion surface is hidden when the 3D game virtual scene is displayed.
  • the motion driving module may be further configured to: generate a dotted ray based on the target position, and emit the dotted ray toward the motion surface to perform ray dotting, wherein the dotted ray is perpendicular to the plane where the target position is located;
  • if the dotted ray intersects the motion surface, a first moving image frame in which the game character moves to the target position is rendered, wherein the moving image frame includes the first moving image frame; if the dotted ray does not intersect the motion surface, a second moving image frame in which the game character moves at the corresponding current position is rendered according to the preset collision motion rule, wherein the moving image frame includes the second moving image frame.
  • the motion driving module may further be used to: obtain the height of the intersection of the dotted ray and the motion surface;
  • if the intersection height matches the riding state information in the AR mode of the 3D game virtual scene, a third moving image frame in which the game character moves to the target position in the riding state corresponding to the intersection height is rendered, wherein the first moving image frame includes the third moving image frame;
  • if the intersection height does not match the riding state information in the AR mode, a fourth moving image frame in which the game character moves at the corresponding current position is rendered according to the preset collision motion rule, and in some implementations, the first moving image frame includes the fourth moving image frame.
  • the motion surface may include a two-dimensional navigation mesh motion surface; the motion driving module may also be used to:
  • if the target position matches the 2D navigation mesh motion surface, the preset riding information of the target position corresponding to the 2D navigation mesh motion surface is obtained, the target riding state of the game character is determined according to the preset riding information and the current riding state of the game character, and, based on the 2D navigation mesh motion surface, a fifth moving image frame in which the game character moves to the target position in the target riding state is rendered, wherein the moving image frame includes the fifth moving image frame;
  • if the target position does not match the 2D navigation mesh motion surface, a sixth moving image frame in which the game character moves at the corresponding current position is rendered according to the preset collision motion rule, wherein the moving image frame includes the sixth moving image frame.
  • the motion surface may include a three-dimensional voxel motion surface; the motion driving module may also be used to:
  • if the target position matches the 3D voxel motion surface, the preset riding information of the target position corresponding to the 3D voxel motion surface is obtained, the target riding state of the game character is determined according to the preset riding information and the current riding state of the game character, the pathfinding information for the game character to move from the current position to the target position is determined based on the 3D voxel motion surface, and a seventh moving image frame in which the game character moves to the target position in the target riding state is rendered according to the pathfinding information, wherein the moving image frame includes the seventh moving image frame;
  • if the target position does not match the 3D voxel motion surface, an eighth moving image frame in which the game character moves at the corresponding current position is rendered according to the preset collision motion rule, wherein the moving image frame includes the eighth moving image frame.
  • the motion surface data file may include plane coordinates and height data corresponding to movable positions in the 3D game virtual scene determined based on the landform data of the 3D game virtual scene; the motion surface data file can be modified according to the real-time environment of the three-dimensional game scene;
  • the motion drive module can also be used for:
  • if the target position matches the plane coordinates of a movable position, a ninth moving image frame in which the game character moves to the target position is generated according to the height data corresponding to the target position, wherein the moving image frame includes the ninth moving image frame;
  • if the target position does not match the plane coordinates of a movable position, a tenth moving image frame in which the game character moves at the corresponding current position is generated according to the preset collision motion rule, wherein the moving image frame includes the tenth moving image frame.
  • the motion driving module is further configured to: acquire the viewing angle data obtained by the angular motion detection device of the game client according to the preset frame rate; according to the motion surface data file and the target position, rendering the moving image frames of the game character matching the viewing angle data frame by frame, wherein the moving image frames are used to display the three-dimensional game virtual scene matching the viewing angle data and the movement actions of the game character .
  • the motion driving module may be further configured to: acquire the angular motion data collected by the angular motion detection device at the preset frame rate; and generate, according to the camera movement category, the viewing angle data corresponding to the angular motion data.
  • when the camera movement category includes first-person camera movement, the viewing angle data is the first-person perspective of the game character in the 3D game virtual scene, and the viewing angle data changes correspondingly with the angular motion data on the basis of the preset initial viewing angle of the game character;
  • when the camera movement category includes third-person camera movement, the viewing angle data is a third-person perspective looking at the game character from a preset position and preset angle corresponding to the position and direction of the game character, and the direction of the game character changes correspondingly with the angular motion data on the basis of the preset initial direction of the game character.
  • Various component embodiments of the present invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof.
  • a microprocessor or a digital signal processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components in the motion processing device of the game character according to the embodiment of the present invention.
  • the present invention can also be implemented as programs/instructions (e.g., computer programs/instructions and computer program products) for performing some or all of the methods described herein.
  • such programs/instructions implementing the present invention may be stored on a computer-readable medium, or may exist in the form of one or more signals; such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
  • based on the above methods shown in FIG. 1 to FIG. 2, correspondingly, according to an embodiment of the present invention, a computer-readable medium is also provided, on which a computer program is stored, and when the computer program is executed by a processor, the motion processing method for a game character shown in FIG. 1 to FIG. 2 is implemented.
  • computer-readable media include persistent and non-persistent media and removable and non-removable media, and information storage may be implemented by any method or technology.
  • Information may be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), Flash Memory or other memory technology, Compact Disc Read Only Memory (CD-ROM), Digital Versatile Disc (DVD) or other optical storage, Magnetic tape cartridges, disk storage, quantum memory, graphene-based storage media or other magnetic storage devices or any other non-transmission media can be used to store information that can be accessed by computing devices.
  • the technical solution of the present invention can be embodied in the form of a software product, and the software product can be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) and includes several instructions used to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the various implementation scenarios of the present invention.
  • the computer device includes a memory and a processor; the memory is used to store the computer program; the processor is used to execute the computer program to implement the above-mentioned method for processing game characters as shown in FIG. 1 to FIG. 2 .
  • FIG. 4 schematically shows a computer device that can implement the method for processing the motion of a game character according to the present invention
  • the computer device includes a processor 410 and a computer-readable medium in the form of a memory 420.
  • the memory 420 is an example of a computer-readable medium having a storage space 430 for storing the computer program 431 .
  • the computer program 431 is executed by the processor 410, each step in the above-described method for processing the motion of a game character can be implemented.
  • the computer device may further include a user interface, a network interface, a camera, a radio frequency (Radio Frequency, RF) circuit, a sensor, an audio circuit, a WI-FI module, and the like.
  • the user interface may include a display screen (Display), an input unit such as a keyboard (Keyboard), etc., and the optional user interface may also include a USB interface, a card reader interface, and the like.
  • Optional network interfaces may include standard wired interfaces, wireless interfaces (such as Bluetooth interfaces, WI-FI interfaces), and the like.
  • the structure of the computer device described above does not constitute a limitation on the computer device, which may include more or fewer components, combine some components, or have a different arrangement of components.
  • the storage medium may also include an operating system and a network communication module.
  • an operating system is a program that manages and maintains the hardware and software resources of the computer device, and supports the running of information processing programs and other software and/or programs.
  • the network communication module is used to realize the communication between various components inside the storage medium, as well as the communication with other hardware and software in the physical device.
  • Figure 5 schematically shows a block diagram of a computer program product implementing the method according to the invention.
  • The computer program product includes a computer program 510 that, when executed by a processor such as the processor 410 shown in FIG. 4, implements each step of the above-described motion processing method for a game character.
  • The present invention can be implemented by means of software plus a necessary general-purpose hardware platform, and can also be implemented by hardware. In response to the AR mode interactive operation request, the motion surface data file corresponding to the 3D game virtual scene indicated by the request is read; at the same time, the real displacement data corresponding to the game client is acquired at a preset frame rate, the target position of the game character in the 3D game virtual scene is determined frame by frame based on the real displacement data, the read motion surface data file is used to analyze whether the target position is movable, and the moving image frame corresponding to the game character is rendered based on the analysis result.
  • Compared with running the AR mode in an existing scene, which leads to clipping in the motion presentation, or building a scene model separately for the AR mode to avoid such clipping, the embodiments of the present invention do not need to build a special game scene model for the AR mode; they only need to use the pre-built motion surface data file to analyze whether the game character can follow, in the 3D game virtual scene, the player's movement trajectory in the real world, and render the moving image frames accordingly. This avoids the clipping caused by the game character rigidly following the player's real-world moving path in the 3D game virtual scene, improves the playability of the game and adds gameplay, and at the same time ensures the display effect of the game.
  • The AR game mode thus brings players an extraordinary gaming experience that combines the virtual and the real.
  • The accompanying drawings are only schematic diagrams of a preferred implementation scenario, and the modules or processes in the accompanying drawings are not necessarily required for implementing the present invention.
  • The modules in the apparatus of an implementation scenario may be distributed in the device of the implementation scenario according to the description of the implementation scenario, or may be changed correspondingly and located in one or more devices different from that of the present implementation scenario.
  • the modules of the above implementation scenarios may be combined into one module, or may be further split into multiple sub-modules.

Abstract

Disclosed in the present invention are a motion processing method and apparatus for a game character, and a storage medium and a computer device. The method comprises: in response to an AR mode interaction operation request in a game client, calling a game engine in the game client to perform rendering, so as to obtain a real-time three-dimensional game virtual scene; and according to real-time true displacement data of a game player in the real world and a motion surface data file corresponding to the three-dimensional game virtual scene, driving a game character controlled by the game player to perform non-interpenetration motion in the three-dimensional game virtual scene. The present invention helps to avoid clipping caused by a game character rigidly following the moving path of a player in the real world, thereby improving the presentation effect of a game.

Description

Motion processing method and device for game character, storage medium, and computer equipment
Cross Reference
This application claims priority to the Chinese patent application filed on March 16, 2021, with application number 202110282546.0 and entitled "Motion Processing Method and Apparatus for Game Character, Storage Medium, and Computer Device", the entire content of which is incorporated herein by reference.
Technical Field
本发明涉及计算机技术领域,尤其是涉及到一种游戏角色的运动处理方法及装置、存储介质、计算机设备。The present invention relates to the field of computer technology, and in particular, to a method and device for motion processing of game characters, a storage medium, and a computer device.
Background
随着计算机技术的快速发展,AR(Argument Reality,增强现实)技术开始相继应用于各行各业中,包括:军事、医疗、影视、游戏等。在游戏领域的AR应用中,玩家可以拿着移动设备在现实世界中走动,驱动游戏虚拟角色在游戏世界中对应方向行走对应距离。With the rapid development of computer technology, AR (Argument Reality, Augmented Reality) technology has begun to be applied in all walks of life, including: military, medical, film and television, games and so on. In AR applications in the game field, players can walk around in the real world with a mobile device, and drive the game virtual character to walk a corresponding distance in the corresponding direction in the game world.
现有方法为了防止虚拟角色和虚拟场景穿插,会把虚拟场景做成平坦的,并且虚拟场景中不会有阻挡物,并且会禁止角色上下跳动。否则虚拟角色会和虚拟场景穿插,会出现穿帮。这样使AR玩法没法应用到所有的游戏场景,必须为AR玩法特殊制作场景。In order to prevent the virtual character from interspersed with the virtual scene, the existing method will make the virtual scene flat, and there will be no obstacles in the virtual scene, and the character will be prohibited from jumping up and down. Otherwise, the virtual character will be interspersed with the virtual scene, and there will be gangs. In this way, AR gameplay cannot be applied to all game scenarios, and scenes must be specially created for AR gameplay.
Summary of the Invention
有鉴于此,本发明提供了一种游戏角色的运动处理方法及装置、存储介质、计算机设备,无需为AR模式建立特殊的游戏场景模型,同时,避免了游戏角色生硬的跟随玩家在现实世界中的移动路径而导致的穿帮,在提高游戏可玩性、增加游戏玩法的同时,保证了游戏展示效果。In view of this, the present invention provides a motion processing method and device, storage medium, and computer equipment for a game character, which does not require the establishment of a special game scene model for the AR mode, and at the same time, avoids the game character bluntly following the player in the real world. The piercing caused by the moving path of the game not only improves the playability of the game and increases the gameplay, but also ensures the display effect of the game.
根据本发明的一个方面,提供了一种游戏角色的运动处理方法,包括:响应于游戏客户端内的AR模式交互操作请求,调用所述游戏客户端内游戏引擎渲染得到实时三维游戏虚拟场景;以及根据游戏玩家在真实世界中实时的真实位移数据,驱动所述游戏玩家控制的游戏角色在三维游戏虚拟场景中进行无穿插运动。According to an aspect of the present invention, there is provided a motion processing method of a game character, comprising: in response to an AR mode interactive operation request in a game client, invoking a game engine in the game client to render a real-time 3D game virtual scene; And according to the real-time real displacement data of the game player in the real world, the game character controlled by the game player is driven to perform non-interspersed movement in the three-dimensional game virtual scene.
根据本发明的另一方面,提供了一种游戏角色的运动处理装置,包括:请求响应模块,用于响应于游戏客户端内的AR模式交互操作请求,调用所述游戏客户端内游戏引擎渲染得到实时三维游戏虚拟场景;运动驱动模块,用于根据游戏玩家在真实世界中实时的真实位移数据,驱动所述游戏玩家控制的游戏角色在三维游戏虚拟场景中进行无穿插运动。According to another aspect of the present invention, a motion processing device for a game character is provided, comprising: a request response module for invoking a game engine in the game client to render in response to an AR mode interactive operation request in the game client A real-time 3D game virtual scene is obtained; a motion driving module is used to drive the game character controlled by the game player to perform non-interspersed motion in the 3D game virtual scene according to the real-time real displacement data of the game player in the real world.
According to yet another aspect of the present invention, there is provided a computer-readable medium on which a computer program is stored, where the program, when executed by a processor, implements the steps of the above motion processing method for a game character.
According to still another aspect of the present invention, there is provided a computer device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the program, implements the steps of the above motion processing method for a game character.
According to a further aspect of the present invention, there is provided a computer program product, including a computer program, where the computer program, when executed by a processor, implements the steps of the above motion processing method for a game character.
The beneficial effects of the present invention are as follows. With the above technical solutions, the motion processing method and apparatus for a game character, the storage medium, and the computer device according to some embodiments of the present invention respond to an AR mode interactive operation request in a game client by rendering the 3D game virtual scene in real time through the game engine; at the same time, the game client collects the real-time real displacement data of the game player in the real world, and the real displacement data and the motion surface data file of the game scene are used to drive the game character to perform non-interspersed (clipping-free) motion in the game. Compared with running the AR mode in an existing scene, which leads to clipping in the motion presentation, or building a scene model separately for the AR mode to avoid such clipping, the embodiments of the present invention do not need to build a special game scene model for the AR mode; they only need to use the pre-built motion surface data file to analyze whether the game character can follow, in the 3D game virtual scene, the player's real-world movement trajectory, which is simple to implement and has a low threshold, and then render the moving image frames accordingly. This avoids the clipping caused by the game character rigidly following the player's real-world moving path in the 3D game virtual scene, improves the playability of the game and adds gameplay while ensuring the display effect of the game, and brings players an extraordinary gaming experience combining the virtual and the real through the AR game mode.
The above description is only an overview of the technical solutions of the present invention. In order to understand the technical means of the present invention more clearly so that they can be implemented according to the contents of the specification, and to make the above and other objects, features, and advantages of the present invention more apparent and understandable, specific embodiments of the present invention are given below.
Description of the Drawings
The accompanying drawings described herein are used to provide a further understanding of the present invention and constitute a part of the present invention. The exemplary embodiments of the present invention and their descriptions are used to explain the present invention and do not constitute an improper limitation of the present invention. In the accompanying drawings:
FIG. 1 schematically shows a flowchart of a motion processing method for a game character according to an embodiment of the present invention;
FIG. 2 schematically shows a diagram of dotting-ray emission according to an embodiment of the present invention;
FIG. 3 schematically shows a structural diagram of a motion processing apparatus for a game character according to an embodiment of the present invention;
FIG. 4 schematically shows a block diagram of a computer device for implementing the method according to the present invention; and
FIG. 5 schematically shows a block diagram of a computer program product implementing the method according to the present invention.
Detailed Description
Hereinafter, the present invention will be described in detail with reference to the accompanying drawings and in conjunction with the embodiments. It should be noted that, in the case of no conflict, the embodiments of the present invention and the features in the embodiments may be combined with each other.
According to an embodiment of the present invention, a motion processing method for a game character is provided. As shown in FIG. 1, the method includes:
Step 101: in response to an AR mode interactive operation request in a game client, invoking a game engine in the game client to render a real-time 3D game virtual scene;
Step 102: according to the real-time real displacement data of the game player in the real world and a motion surface data file corresponding to the 3D game virtual scene, driving the game character controlled by the game player to perform non-interspersed motion in the 3D game virtual scene.
The embodiments of the present invention may be applied to a game client, and the game client may be an intelligent electronic device such as a smart phone or a tablet computer. When the game client runs the game, in response to an in-game AR mode interactive operation request for a 3D game virtual scene, the corresponding 3D game virtual scene is rendered in real time by the game engine.
In some implementations, before image rendering is performed, the motion processing method for a game character may include: Step 103, reading the motion surface data file corresponding to the 3D game virtual scene, where the motion surface data file is used to indicate the movable positions of the 3D game virtual scene.
In some implementations, the 3D game virtual scene may be a specific game world in the game, for example, a player-exclusive home in the game, and the motion surface data file may be a set of data that is pre-built based on the landform data of the 3D game virtual scene and reflects the movable positions of the 3D game virtual scene. In some implementations, the motion surface data file may contain data with different meanings; for example, it may include information such as the points that can be passed, the points that cannot be passed, the landform boundaries, and the landform heights in the 3D game virtual scene; for another example, it may include only the points that can be passed in the 3D game virtual scene. The feasibility of the game character's motion in the 3D game virtual scene can be judged through the motion surface data file; in some implementations, it can be determined whether the game character can move to a certain position.
In some embodiments of the present invention, the real displacement data generated when the player moves while holding the game client can be used to analyze the target position that the game character in the AR mode should reach if it simulates the player's real movement; that is, in the AR mode, the player controls the movement of the game character in the virtual world by moving while holding the game client. In some implementations, the parameter used to control the movement of the game character may be the real displacement data collected by the game client at a preset frame rate.
In some implementations, step 102 may include:
Step 102-1: collecting the real displacement data corresponding to the game client at the preset frame rate, and generating, frame by frame, the target position of the game character in the 3D game virtual scene according to the real displacement data and the initial position of the game character in the 3D game virtual scene;
Step 102-2: rendering, frame by frame through the game engine, the moving image frames corresponding to the game character in the 3D game virtual scene according to the motion surface data file and the target position.
In the embodiment of the present invention, the game character is controlled to move in the 3D game virtual scene according to the real displacement data: taking the initial position of the game character in the 3D game virtual scene as the initial control point, the game character is controlled to move toward the corresponding target position according to the moving direction and moving distance indicated by the real displacement data. In some implementations, the initial position is determined differently in different application scenarios. For example, if the game character is already in the 3D game virtual scene before the game client responds to the AR mode interactive operation request, that is, the request instructs the AR mode to run in the current game scene where the game character is located, the initial position may be the position of the game character when the game client responds to the request. For another example, if the game character is in another game scene before the client responds to the request, that is, the request instructs the game character to switch from the current game scene to the 3D game virtual scene and run the AR mode there, the initial position may be a teleportation point position in the 3D game virtual scene preset by the game developer.
In some implementations, under the above motion control method, since the player moves in the real world and the walking environment of the real world differs from that of the game world, controlling the game character's movement in the game world strictly according to the real displacement data of the game client may cause the game character to pass through obstacles such as walls or to walk out of the map range. Therefore, to solve the problem of such clipping during movement and the resulting poor player experience, the target position in the 3D game virtual scene corresponding to each piece of real displacement data can first be calculated according to the real displacement data of the game client and the initial position of the game character in the 3D game virtual scene, and then whether the game character can move to the target position, that is, whether the target position belongs to a movable position in the 3D game virtual scene, is analyzed according to the movable positions indicated by the motion surface data file, so that image rendering is performed based on the analysis result to obtain the moving image frame. For example, when it is determined that the game character can move to the target position, a moving image frame of the game character moving to the target position may be rendered; and when it is determined that the target position is not a movable position, that is, the game character should not move to the target position, a moving image frame in a form such as the game character stopping moving or marking time in place may be rendered. In this way, in the game animation that the player sees on the game client display screen, the game character will not appear to pass through obstacles or move outside the map range, which improves the realism of the game character's movement, so that in the AR mode the game character in the 3D game virtual scene can not only follow the player's movement in the real world, but also, without re-developing a dedicated map for the AR mode, avoid the clipping caused by the game character rigidly following the player's real-world moving path, improving the playability of the game and adding gameplay while ensuring the display effect of the game.
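As a rough illustration of the per-frame flow described above, the following Python sketch (the names `MotionSurface`, `is_movable`, and `drive_character_frame` are hypothetical and not part of the patented implementation) computes the target position from the current position and the displacement of one frame, checks it against the motion surface data, and decides which kind of moving image frame to render.

```python
from dataclasses import dataclass

@dataclass
class MotionSurface:
    """Hypothetical container for the motion surface data file:
    a mapping from (x, y) plane coordinates to walkable height."""
    heights: dict  # {(x, y): z}

    def is_movable(self, x: float, y: float) -> bool:
        # Quantize to the file's grid resolution; assumed to be 1 unit here.
        return (round(x), round(y)) in self.heights

def drive_character_frame(surface: MotionSurface, current_pos, displacement):
    """One frame of AR-mode driving.

    current_pos  -- (x, y) of the character in the 3D game virtual scene
    displacement -- (dx, dy) real displacement of the player for this frame
    Returns the position to render and a label describing the frame type.
    """
    target = (current_pos[0] + displacement[0], current_pos[1] + displacement[1])
    if surface.is_movable(*target):
        # Render the character moving to the target position.
        return target, "move_to_target"
    # Target is not a movable position: apply the preset collision motion rule,
    # e.g. keep the character at its current position (mark time / stop).
    return current_pos, "collision_rule"
```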
By applying the technical solution of this embodiment, in response to the AR mode interactive operation request, the motion surface data file corresponding to the 3D game virtual scene indicated by the request is read; at the same time, the real displacement data corresponding to the game client is acquired at a preset frame rate, the target position of the game character in the 3D game virtual scene is determined frame by frame based on the real displacement data, the read motion surface data file is used to analyze whether the target position is movable, and the moving image frame corresponding to the game character is rendered based on the analysis result. Compared with running the AR mode in an existing scene, which leads to clipping in the motion presentation, or building a scene model separately for the AR mode to avoid such clipping, the embodiments of the present invention do not need to build a special game scene model for the AR mode; they only need to use the pre-built motion surface data file to analyze whether the game character can follow, in the 3D game virtual scene, the player's real-world movement trajectory, so as to render the moving image frames. This avoids the clipping caused by the game character rigidly following the player's real-world moving path in the 3D game virtual scene, improves the playability of the game and adds gameplay while ensuring the display effect of the game, and brings players an extraordinary gaming experience combining the virtual and the real through the AR game mode.
In some implementations, step 102-1 may include:
Step 102-1.1: acquiring, at the preset frame rate, the positioning data collected by a positioning device of the game client, and calculating the real displacement data corresponding to the game client frame by frame;
Step 102-1.2: determining virtual displacement data of the game character corresponding to the real displacement data according to a preset scale factor, and determining the target position of the game character in the 3D game virtual scene according to the virtual displacement data and the initial position.
In this embodiment, the positioning device (for example, a GPS module) in the game client may collect positioning data at the preset frame rate to obtain the positioning data of the game client, and the real displacement data of the game client is calculated frame by frame based on the collected positioning data. The positioning data collected by the positioning device may be longitude and latitude data, and the real displacement data can reflect the user's moving direction and moving distance. Furthermore, so that the game character can move following the player's real movement in the real world, the virtual displacement data corresponding to the real displacement data may be calculated according to a preset scale factor. For example, if the preset scale factor is 1, the virtual displacement data is the same as the real displacement data: when the player moves 1 meter in a certain direction in the real world, the virtual displacement data also represents a movement of 1 meter in that direction. If the preset scale factor is 10, when the player moves 1 meter in the real world, the game character moves 10 meters in the game. In this way, the game character can proportionally follow the player's movement trajectory in the real world, bringing the player an immersive game experience. The way of determining the target position based on the virtual displacement data and the initial position differs from the way of generating the target position according to the real displacement data and the initial position in the foregoing embodiment in that the real displacement data is first converted into the corresponding virtual displacement data, and the target position is then calculated according to the converted virtual displacement data; the specific calculation is not repeated here.
In some implementations, when the preset scale factor is 1, the target position may be calculated directly based on the real displacement data and the initial position; that is, step 102-1 may be replaced with: determining the target position of the game character in the 3D game virtual scene according to the real displacement data and the initial position.
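A minimal sketch of the scale-factor mapping described above, assuming the displacement is expressed as a planar (dx, dy) in meters (the function names are illustrative only):

```python
def real_to_virtual_displacement(real_dx: float, real_dy: float,
                                 scale: float = 1.0) -> tuple:
    """Map the player's real-world displacement to the character's
    virtual displacement using the preset scale factor.
    scale = 1  -> 1 m in the real world moves the character 1 unit
    scale = 10 -> 1 m in the real world moves the character 10 units
    """
    return real_dx * scale, real_dy * scale

def target_position(initial_pos: tuple, real_dx: float, real_dy: float,
                    scale: float = 1.0) -> tuple:
    """Target position = initial position + scaled displacement."""
    vdx, vdy = real_to_virtual_displacement(real_dx, real_dy, scale)
    return initial_pos[0] + vdx, initial_pos[1] + vdy
```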
In some embodiments, to improve the motion presentation of the game character in the 3D game virtual scene and make its motion smoother, the real displacement data may also be interpolated, and the interpolation result is used to determine the above target position. In some implementations, "collecting the real displacement data corresponding to the game client at the preset frame rate" may be: acquiring first positioning data and second positioning data from the positioning data, where the first sampling frame corresponding to the first positioning data and the second sampling frame corresponding to the second positioning data differ by a preset number; performing interpolation based on the first positioning data and the second positioning data to obtain interpolated position data matching the preset number, and calculating interpolated displacement data corresponding to the game client according to the interpolated position data, where the real displacement data includes the interpolated displacement data.
In some embodiments, the first positioning data and the second positioning data may be obtained from the positioning data according to the preset number. For example, if the preset number is 10, the 1st sampled positioning data is taken as the first positioning data and the 10th sampled positioning data is taken as the second positioning data; 8 interpolated data points between the first positioning data and the second positioning data are then calculated, and the first positioning data, the 8 interpolated data points, and the second positioning data are taken as 10 pieces of interpolated displacement data. When the virtual displacement data and the target position are subsequently calculated, the interpolated displacement data obtained by interpolation is used, so that the movement of the game character can later be controlled according to the interpolated displacement data, which helps make the movement trajectory of the game character smoother and improves the motion presentation.
It should be noted that the first positioning data and the second positioning data may also be collected directly at a fixed time interval. For example, if the preset frame rate is 40 Hz, the preset number is 10, and the fixed time interval is 0.25 seconds, the first positioning data may be acquired directly at 0 seconds and the second positioning data at 0.25 seconds, so as to reduce the amount of data collected by the positioning device.
In some implementations, to ensure the presentation of the game character in different motion phases, the first positioning data and the second positioning data may be acquired in different phases of the player's real-world movement. For example, they are acquired in the player's starting phase, to prevent the player's stop-and-go at the start from causing the game character's movement in the game world to stutter; for another example, they are acquired in the player's stopping phase, to prevent the player's sudden stop from causing the game character to stop abruptly and present poorly.
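A hedged sketch of the interpolation step described above, assuming the two positioning samples have already been converted to planar coordinates; the helper names and the choice of linear interpolation are assumptions, not mandated by the text:

```python
def interpolate_positions(first_pos: tuple, second_pos: tuple,
                          preset_number: int = 10) -> list:
    """Return `preset_number` interpolated positions from first_pos to
    second_pos (both endpoints included), e.g. 10 points with 8 new
    points inserted between two sampled positioning fixes."""
    (x0, y0), (x1, y1) = first_pos, second_pos
    points = []
    for i in range(preset_number):
        t = i / (preset_number - 1)          # 0.0 .. 1.0
        points.append((x0 + (x1 - x0) * t, y0 + (y1 - y0) * t))
    return points

def interpolated_displacements(points: list) -> list:
    """Per-frame displacement between consecutive interpolated positions;
    these serve as the real displacement data for the following frames."""
    return [(b[0] - a[0], b[1] - a[1]) for a, b in zip(points, points[1:])]
```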
In some implementations of the present invention, the motion surface data file may include plane coordinates and height data, determined based on the landform data of the 3D game virtual scene, corresponding to the movable positions in the 3D game virtual scene. Step 103 may include: loading, in the 3D game virtual scene, a motion surface corresponding to the motion surface data file, where the motion surface is hidden when the 3D game virtual scene is displayed.
In some embodiments, the motion surface data file may be generated in advance according to the landform data of the 3D game virtual scene, and the file may contain the plane coordinates and height data of each movable position in the 3D game virtual scene; the plane coordinates and the corresponding height data represent, in the space of the 3D game virtual scene, the movable height of a movable position. For example, a step in the 3D game virtual scene is a movable position, and the plane coordinates and height data corresponding to the step reflect where the game character's feet land when it moves onto that step. In response to the AR mode interactive operation request, the motion surface corresponding to the file can be loaded into the game based on the motion surface data file. The motion surface may be hidden in the game, so that the player cannot see it through the game client display screen and it is used only to analyze whether a certain target position is movable; alternatively, the motion surface may be displayed in the game, so that the player can see it, which helps the player try to avoid moving toward positions outside the motion surface and ensures a smooth and fluent tour of the game world.
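To make the contents of such a file concrete, here is one possible, purely illustrative in-memory representation that stores a walkable height per plane coordinate; the JSON layout and field names are assumptions rather than the patent's actual format:

```python
import json
from dataclasses import dataclass

@dataclass
class MotionSurfaceData:
    """Plane coordinates of movable positions and their walkable heights."""
    heights: dict  # {(x, y): z}

    @classmethod
    def load(cls, path: str) -> "MotionSurfaceData":
        # Assumed file layout: [{"x": 1, "y": 2, "z": 0.5}, ...]
        with open(path, "r", encoding="utf-8") as f:
            entries = json.load(f)
        return cls({(e["x"], e["y"]): e["z"] for e in entries})

    def height_at(self, x: float, y: float):
        """Walkable height at (x, y), or None if the position is not movable."""
        return self.heights.get((round(x), round(y)))
```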
In some embodiments, whether the character can move to the target position can be judged by emitting a dotting ray toward the motion surface. In some implementations, step 102-2 may include:
Step 102-2-A1: generating a dotting ray based on the target position, and emitting the dotting ray toward the motion surface for ray dotting, where the dotting ray is perpendicular to the plane in which the target position lies;
Step 102-2-A2: if the dotting ray intersects the motion surface, rendering a first moving image frame in which the game character moves to the target position, where the moving image frames include the first moving image frame;
Step 102-2-A3: if the dotting ray does not intersect the motion surface, rendering, according to a preset collision motion rule, a second moving image frame in which the game character moves at its corresponding current position, where the moving image frames include the second moving image frame.
In the above embodiment, a dotting ray is generated in the vertical direction from the target position (the endpoint of the dotting ray may be the highest point of the 3D game virtual scene with the ray pointing vertically downward, or the endpoint may be taken at the lowest point with the ray pointing vertically upward), and the dotting ray is emitted for dotting. If the dotting ray can hit the motion surface, that is, the dotting ray intersects the motion surface, as shown in FIG. 2, assume that the initial position of the game character is point A, the target position is point B, and the motion surface is S; a dotting ray emitted upward from point B intersects S at point B', which indicates that the game character can move to the target position, and the first moving image frame of the game character moving to the target position (here, specifically, B') is rendered. When the game character moves to the target position, the height data corresponding to the target position should be taken into account, so that the game character moves according to the landform of the scene. If the dotting ray cannot hit the motion surface, that is, the dotting ray does not intersect the motion surface, it indicates that clipping would occur if the game character moved to the target position; in this case, the second moving image frame can be rendered according to the preset collision motion rule. In some implementations, the preset collision motion rule may be that, in this case, the game character walks in place, remains stationary in place, and so on. The dotting-ray detection method can guarantee a non-interspersed motion effect of the game character in the game scene, with low system overhead and high efficiency.
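Building on the `MotionSurfaceData` sketch above, the following shows one way such a vertical "dotting ray" test could be expressed against a gridded motion surface; a real game engine would use its own raycast API, so this is only a stand-in under that assumption:

```python
def dotting_ray_hit(surface, target_xy: tuple):
    """Cast a vertical ray at the target's plane coordinates.

    Returns the intersection height (the walkable height at that
    coordinate) if the ray hits the motion surface, or None otherwise.
    """
    return surface.height_at(*target_xy)

def render_decision(surface, current_pos: tuple, target_xy: tuple) -> dict:
    hit_height = dotting_ray_hit(surface, target_xy)
    if hit_height is not None:
        # First moving image frame: move to (x, y, hit_height) so the
        # character follows the landform height rather than a flat plane.
        return {"frame": "move", "position": (*target_xy, hit_height)}
    # Second moving image frame: preset collision rule, e.g. walk in place.
    return {"frame": "collision", "position": current_pos}
```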
In different game scenes, there may be certain differences between the riding state information corresponding to the normal mode and that corresponding to the AR mode. Therefore, in some implementations, step 102-2-A2 may include:
Step 102-2-A2.1: acquiring the height of the intersection point of the dotting ray and the motion surface;
Step 102-2-A2.2: if the intersection height matches AR mode riding state information of the 3D game virtual scene, rendering a third moving image frame in which the game character moves to the target position in a riding state corresponding to the intersection height, where the first moving image frame includes the third moving image frame;
Step 102-2-A2.3: if the intersection height does not match the AR mode riding state information, rendering, according to the preset collision motion rule, a fourth moving image frame in which the game character moves at its corresponding current position, where the first moving image frame includes the fourth moving image frame.
In this embodiment, when the dotting ray intersects the motion surface, the height information of the resulting intersection point can be used to further analyze whether the target position is a movable position. In some implementations, after the intersection height is obtained, it is judged whether the intersection height matches the AR mode riding state information corresponding to the 3D game virtual scene. If they match, the third moving image frame in which the game character moves to the target position in the riding state corresponding to the intersection height can be rendered. For example, the 3D game virtual scene is a manor, a city, or the like in the game; in the normal mode, the game character is supported in surface travel modes such as walking and riding a mount, as well as aerial travel modes such as qinggong (light-body skill) and aircraft, whereas in the AR mode only surface travel modes such as walking or mounts are supported. If the intersection height falls within the height range that supports surface travel, image frames of the game character moving to the target position by walking, riding a mount, or another surface travel mode can be rendered. If the intersection height does not match the AR mode riding state information, it means that although the target position is a movable position in the normal mode, moving to it in the AR mode should instead be rendered as the corresponding fourth moving image frame according to the preset collision motion rule, for example when the intersection height falls within a height range that only supports aerial travel.
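A small sketch of that extra height check, with the AR mode riding state information modeled simply as an allowed height range; this representation is an assumption made for illustration:

```python
def matches_ar_riding_state(hit_height: float,
                            ar_height_range: tuple = (0.0, 2.0)) -> bool:
    """True if the intersection height lies in the height range that the
    AR mode supports for surface travel (walking, mounts)."""
    low, high = ar_height_range
    return low <= hit_height <= high

def render_with_riding_check(current_pos, target_xy, hit_height,
                             ar_height_range=(0.0, 2.0)) -> dict:
    if matches_ar_riding_state(hit_height, ar_height_range):
        # Third moving image frame: move in the surface riding state.
        return {"frame": "move_surface", "position": (*target_xy, hit_height)}
    # Fourth moving image frame: fall back to the preset collision rule.
    return {"frame": "collision", "position": current_pos}
```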
According to some embodiments of the present invention, the motion surface may also be loaded as a navigation motion surface. The motion surface may include a 2D navigation mesh motion surface and a 3D voxel motion surface, where the 2D navigation mesh motion surface can represent the area of the 3D game virtual scene on which the game character can move by surface travel modes such as walking and riding a mount, while the 3D voxel motion surface can reflect the connectivity of the movable voxels in the 3D game virtual scene, that is, connected voxel cells can be traversed by various modes such as surface travel and aerial travel. Based on the 2D navigation mesh motion surface or the 3D voxel motion surface, pathfinding of the game character in the game world can be realized.
In some implementations, when the motion surface includes a 2D navigation mesh motion surface, step 102-2 may include:
Step 102-2-B1: if the target position matches the 2D navigation mesh motion surface, acquiring preset riding information of the target position corresponding to the 2D navigation mesh motion surface, determining a target riding state of the game character according to the preset riding information and the current riding state of the game character, and rendering, based on the 2D navigation mesh motion surface, a fifth moving image frame in which the game character moves to the target position in the target riding state, where the moving image frames include the fifth moving image frame;
Step 102-2-B2: if the target position does not match the 2D navigation mesh motion surface, rendering a sixth moving image frame in which the game character moves at its corresponding current position, where the moving image frames include the sixth moving image frame.
In this embodiment, for a target position represented by in-game plane coordinates, when the projection of the 2D navigation mesh motion surface onto the plane includes the target position, or when the dotting ray emitted based on the target position intersects the 2D navigation mesh motion surface, the target position is considered to match the 2D navigation mesh motion surface, which means the game character can move to the target position. When it is determined that the target position matches the 2D navigation mesh motion surface, the preset riding information corresponding to the point at which the target position projects vertically onto the 2D navigation mesh motion surface is acquired; for example, the preset riding information may include surface travel modes such as walking, riding a mount, and rowing a boat. It is then further judged whether the current riding state of the game character belongs to the riding states indicated by the preset riding information; if so, the current riding state of the game character is taken as the target riding state, and if not, the target riding state of the game character is switched to a travel mode indicated by the preset riding information, for example from the walking state to the riding state, so that the fifth moving image frame in which the game character moves in the target riding state to the target position (here, the target position refers to the vertical projection of the target position onto the 2D navigation mesh motion surface) is rendered. When the projection of the 2D navigation mesh motion surface onto the plane does not include the target position, or the dotting ray emitted based on the target position has no intersection with the 2D navigation mesh motion surface, the target position is considered not to match the 2D navigation mesh motion surface, which means the game character cannot move to the target position; in that case, the sixth moving image frame is rendered according to the preset collision motion rule, in the same manner as the second moving image frame, which is not repeated here.
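A hedged sketch of the riding-state selection on a 2D navmesh match; the `allowed_riding` lookup, the state names, and the fallback choice are invented for illustration:

```python
def choose_riding_state(current_state: str, allowed_riding: set) -> str:
    """Keep the current riding state if the navmesh point allows it,
    otherwise switch to one of the allowed states (e.g. walk -> ride)."""
    if current_state in allowed_riding:
        return current_state
    return sorted(allowed_riding)[0]  # deterministic fallback

def navmesh_step(navmesh_hit, current_state: str, current_pos, target_xy):
    """navmesh_hit: (projected_position, allowed_riding) if the target
    matches the 2D navmesh motion surface, or None if it does not."""
    if navmesh_hit is None:
        # Sixth moving image frame: preset collision motion rule.
        return {"frame": "collision", "position": current_pos}
    projected_pos, allowed_riding = navmesh_hit
    state = choose_riding_state(current_state, allowed_riding)
    # Fifth moving image frame: move in the chosen riding state.
    return {"frame": "move", "position": projected_pos, "riding": state}
```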
In some implementations, when the motion surface includes a 3D voxel motion surface, step 102-2 may include:
Step 102-2-C1: if the target position matches the 3D voxel motion surface, acquiring preset riding information of the target position corresponding to the 3D voxel motion surface, determining a target riding state of the game character according to the preset riding information and the current riding state of the game character, determining, based on the 3D voxel motion surface, pathfinding information for the game character to move from the current position to the target position, and rendering, according to the pathfinding information, a seventh moving image frame in which the game character moves to the target position in the target riding state, where the moving image frames include the seventh moving image frame;
Step 102-2-C2: if the target position does not match the 3D voxel motion surface, rendering an eighth moving image frame in which the game character moves at its corresponding current position, where the moving image frames include the eighth moving image frame.
In this embodiment, the 3D voxel motion surface contains the connectable voxels in the 3D game virtual scene. When the projection of the 3D voxel motion surface onto the plane includes the target position, or the dotting ray emitted based on the target position can hit the 3D voxel motion surface, the target position is considered to match the 3D voxel motion surface, which means the game character can move to the target position. When it is determined that the target position matches the 3D voxel motion surface, the preset riding information corresponding to the point at which the target position projects vertically onto the 3D voxel motion surface is acquired; for example, the preset riding information may include surface travel modes such as walking, riding a mount, and rowing a boat, and may also include aerial travel modes such as qinggong and aircraft. The target riding state of the game character when moving to the target position is then determined according to the current riding state of the game character in combination with the preset riding information. When determining the target riding state, a riding state in the preset riding information that is the same as or similar to the current riding state is preferred; for example, if the current riding state is walking, walking is preferred as the target riding state, riding a mount is considered next, and aerial travel is considered last, so as to maintain the continuity of the riding state, avoid unnecessary riding state switching in the game, and improve user experience. After the target riding state is determined, pathfinding information for the game character to move from the current position to the target position is generated based on the 3D voxel motion surface; in some implementations, a shortest-path-first principle may be adopted, and the seventh moving image frame in which the game character moves in the target riding state to the target position (here, the target position refers to the voxel position onto which the target position projects vertically on the 3D voxel motion surface; if the target position corresponds to multiple vertical projection voxel positions, one of them may be selected as the target voxel position, also according to the shortest-path-first principle) is rendered according to the determined pathfinding information. Similarly to step 102-2-B2, when the target position does not match the 3D voxel motion surface, the game character cannot move to the target position, and the eighth moving image frame is rendered according to the preset collision motion rule.
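The shortest-path-first pathfinding over connected voxels could, for example, be a breadth-first search over the voxel connectivity. The sketch below assumes unit-cost moves between 6-connected voxels and is an illustration rather than the patent's actual algorithm:

```python
from collections import deque

def voxel_shortest_path(connected: set, start: tuple, goal: tuple):
    """Breadth-first search over connectable voxels.

    connected -- set of (x, y, z) voxel cells that are movable
    Returns the list of voxels from start to goal, or None if unreachable.
    """
    if start not in connected or goal not in connected:
        return None
    neighbors = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                 (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    queue = deque([start])
    came_from = {start: None}
    while queue:
        cur = queue.popleft()
        if cur == goal:
            # Reconstruct the path back to the start.
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dx, dy, dz in neighbors:
            nxt = (cur[0] + dx, cur[1] + dy, cur[2] + dz)
            if nxt in connected and nxt not in came_from:
                came_from[nxt] = cur
                queue.append(nxt)
    return None
```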
To give the player a more immersive experience, in some implementations step 102-2 may further include:
Step 102-2-D1: acquiring, at the preset frame rate, viewing angle data obtained through an angular motion detection device of the game client;
Step 102-2-D2: rendering, frame by frame, the moving image frames of the game character that match the viewing angle data according to the motion surface data file and the target position, where the moving image frames are used to display the 3D game virtual scene matching the viewing angle data and the motion of the game character.
In this embodiment, the game client can not only acquire real displacement data but also collect angular motion data through an angular motion detection device (for example, a gyroscope) and process it into corresponding viewing angle data, so that moving image frames matching the viewing angle data are rendered and thus reflect the angular motion of the game client. The player can change the viewing angle from which the game world is observed by rotating the mobile phone (when the game client is a mobile phone), so that the moving image frames can show the environment of the 3D game virtual scene matching the user's real-time viewing angle, enabling the player to better tour the game world and improving user experience.
It should be noted that the above steps 102-2-D1 and D2 can be combined with steps 102-2-A2.1 to 102-2-A2.3, steps 102-2-B1 to B2, and steps 102-2-C1 to C2 to render moving image frames that match the target position (or the current position and the preset collision motion rule), the riding state, and the viewing angle data, so that the game character can satisfy the player's need to rotate the viewing angle freely while its motion remains free of clipping.
In some implementations of the present invention, step 102-2-D1 may include: acquiring, at the preset frame rate, the angular motion data collected by the angular motion detection device; and generating the viewing angle data corresponding to the angular motion data according to the camera movement category corresponding to the 3D game virtual scene. In some implementations, when the camera movement category includes first-person camera movement, the viewing angle data is the first-person viewing angle of the game character in the 3D game virtual scene, and the viewing angle data changes correspondingly with the angular motion data on the basis of a preset initial viewing angle of the game character; when the camera movement category includes third-person camera movement, the viewing angle data is a third-person viewing angle looking at the game character from a preset position and preset angle corresponding to the position and direction of the game character, and the direction of the game character changes correspondingly with the angular motion data on the basis of a preset initial direction of the game character.
In this embodiment, the viewing angle data is obtained by processing the angular motion data collected by the angular motion detection device. In some implementations, the viewing angle data should be calculated based on the camera movement category corresponding to the 3D game virtual scene or selected by the user. A first-person camera movement category means that the image shown by the moving image frame corresponds to the first-person viewing angle of the game character, that is, the game world environment "seen" by the game character; the viewing angle data of the first moving image frame is the preset initial viewing angle of the game character, and subsequent viewing angle data changes correspondingly with the angular motion data on the basis of the preset initial viewing angle. A third-person camera movement category means that the image shown by the moving image frame corresponds to a third-person viewing angle "looking at" the game character, that is, it shows the game character and its surroundings as "seen" by a virtual third person. The third-person viewing angle takes the position and direction of the game character as reference: the virtual third person is "located" at a preset position and preset angle relative to the game character, and the position and angle from which the virtual third person "looks at" the game character remain unchanged relative to the position and angle of the game character. That is, the third-person viewing angle changes the position and angle of the virtual third person in game-world coordinates following the displacement and viewing angle rotation of the game character in the game world, but the virtual third person stays fixed in the coordinate system established with the game character as the origin. The direction of the game character changes correspondingly with the angular motion data on the basis of its preset initial direction (the game character rotates by the same number of degrees indicated by the angular motion data), and the third-person viewing angle changes with it according to the above rule. This provides richer AR mode presentation effects and more gameplay.
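As an illustrative sketch of the two camera movement categories, assuming a yaw-only simplification and a fixed third-person offset behind and above the character (both assumptions made for brevity, not specified by the text):

```python
import math

def first_person_view(initial_yaw_deg: float, gyro_delta_deg: float) -> float:
    """First-person camera: the view direction is the character's preset
    initial viewing angle plus the accumulated angular motion."""
    return (initial_yaw_deg + gyro_delta_deg) % 360.0

def third_person_camera(char_pos: tuple, char_yaw_deg: float,
                        back_offset: float = 5.0, height: float = 2.0) -> dict:
    """Third-person camera: a virtual observer kept at a preset position and
    angle relative to the character, so it follows the character's
    displacement and rotation while staying fixed in the character's
    own coordinate frame. char_pos is an (x, y, z) world position."""
    yaw = math.radians(char_yaw_deg)
    cam_x = char_pos[0] - back_offset * math.cos(yaw)
    cam_y = char_pos[1] - back_offset * math.sin(yaw)
    return {"position": (cam_x, cam_y, char_pos[2] + height),
            "look_at": char_pos}
```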
In some embodiments, the motion surface data file may include plane coordinates and height data corresponding to movable positions in the three-dimensional game virtual scene, determined based on the landform data of the three-dimensional game virtual scene; the motion surface data file may be modified according to the real-time environment of the three-dimensional game scene.
In some embodiments, step 102-2 may include:
Step 102-2-E1: if the target position matches the plane coordinates of a movable position, generating, according to the height data corresponding to the target position, a ninth moving image frame in which the game character moves to the target position, where the moving image frames include the ninth moving image frame;
Step 102-2-E2: if the target position does not match the plane coordinates of a movable position, generating, according to the preset collision motion rule, a tenth moving image frame in which the game character moves at its corresponding current position, where the moving image frames include the tenth moving image frame.
In this embodiment, the game character can also be driven to perform non-interspersed motion directly based on the data in the motion surface data file: when the target position is a movable position, the game character is simply driven to the height indicated by the height data corresponding to that target position. In addition, the plane coordinates and height data of the movable positions stored in the motion surface data file can be modified, added, or deleted according to real-time changes in the game scene. For example, if a table is placed in a movable area so that the table's location becomes a position where motion is not allowed, the relevant data in the motion surface data file can be modified, so that when the file is used to drive the motion of the game character, the character is still guaranteed to perform non-interspersed motion even though the environment of the game scene has changed. Using this motion surface data file, which can be modified in real time, to compute the motion connectivity of the game character and drive the character's motion makes the motion of the game character in the virtual world more realistic.
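As a hedged illustration only (the disclosure does not specify a file format), the sketch below assumes the motion surface data file can be held in memory as a mapping from quantized plane coordinates to a walkable height, and shows how a target position could be resolved against it and how an entry could be removed when the real-time environment changes; every name here is hypothetical.

```python
# Hypothetical in-memory view of a motion surface data file:
# (x, y) plane coordinates quantized to a grid -> walkable height at that cell.
motion_surface = {
    (0, 0): 0.0,
    (0, 1): 0.0,
    (1, 1): 0.5,   # e.g. a low step the character may move onto
}

def quantize(pos, cell=1.0):
    return (round(pos[0] / cell), round(pos[1] / cell))

def resolve_target(target_xy):
    """Return the height to move to if the target is a movable position,
    or None so the caller can fall back to the preset collision motion rule."""
    return motion_surface.get(quantize(target_xy))

# Real-time modification: a table is placed on cell (1, 1), so that cell is
# no longer a movable position and its entry is removed from the data.
motion_surface.pop((1, 1), None)

print(resolve_target((0.2, 0.9)))   # 0.0  -> generate frame moving to that height
print(resolve_target((1.0, 1.0)))   # None -> apply the preset collision motion rule
```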
According to some embodiments of the present invention, in the AR mode, in addition to the game character moving in the three-dimensional game virtual scene following the real displacement of the player in the real world, gameplay for entering the real world from the three-dimensional game virtual scene may also be provided. In some embodiments, the method may further include: S1, in response to a request to open a portal from the three-dimensional game virtual scene to the real world, acquiring a first real-time real image frame corresponding to the real world and saving the first real-time real image frame as a first texture, and acquiring a first real-time virtual image frame corresponding to the three-dimensional game virtual scene (the first real-time virtual image frame being a moving image frame), where the first texture is used to render a preset portal model corresponding to the portal; S2, rendering, according to the first texture, the preset portal model, and the first real-time virtual image frame, a first real-time rendered image frame containing the portal.
In some embodiments, the images collected in the real world are stored in the device memory in the form of textures, so that the game engine can render them onto the preset portal model through rendering technology, displaying the real-time real-world environment inside the portal while rendering the image of the real-time three-dimensional game virtual scene outside the portal. The image displayed in the game is therefore highly unified, with no sense of separation, and the real-time environments of the real and virtual worlds can be shown inside and outside the portal at the same time, creating a more "realistic" portal effect. Through the portal, players can observe in real time the environment and location of the world on the other side, which helps improve the player's experience, enhances the presentation of the game picture, and improves the playability of the game. Through the AR game mode, players are given an extraordinary game experience combining the virtual and the real, providing technical support for adding gameplay.
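The following sketch is only a schematic of the order of operations described above, not actual engine code: the camera capture, portal model, and compositing step are replaced by stand-in Python objects so the example is self-contained, and all names are assumptions.

```python
def capture_camera_frame():
    """Stub for the device camera: returns raw pixel data of the real world."""
    return b"\x00" * (4 * 16 * 16)  # placeholder 16x16 RGBA frame

class PortalModel:
    """Stand-in for the preset portal model; stores the texture applied to it."""
    def __init__(self):
        self.texture = None

def render_frame(portal, virtual_frame):
    """Composite: the virtual scene everywhere, the real-world texture inside
    the portal quad. Returned as a simple description instead of pixels."""
    return {"base": virtual_frame, "portal_interior": portal.texture}

portal = PortalModel()
real_frame = capture_camera_frame()        # first real-time real image frame
portal.texture = real_frame                # saved as the first texture
virtual_frame = "moving image frame of the 3D game virtual scene"
composited = render_frame(portal, virtual_frame)  # first real-time rendered frame
print(type(composited["portal_interior"]), composited["base"])
```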
According to some embodiments of the present invention, based on the positions and directions of the game character and the portal in the game world, the game picture may either contain the virtual portal or not contain it. In some embodiments, one case is that the virtual portal can be seen from the perspective of the game character in the virtual world, and the other case is that the virtual portal cannot be seen from that perspective. In some embodiments, the position and direction of the virtual portal in the three-dimensional game virtual scene can be determined by the player's selection; once the position and direction of the virtual portal are determined, the shape and size of the virtual portal shown in the game change with the different positions and directions of the game character in the three-dimensional game virtual scene.
As a specific implementation of the method in FIG. 1, as shown in FIG. 3, according to an embodiment of the present invention, a motion processing apparatus for a game character is provided, and the apparatus includes:
a request response module, configured to invoke, in response to an AR mode interactive operation request in a game client, the game engine in the game client to render a real-time three-dimensional game virtual scene;
a motion driving module, configured to drive, according to the real-time real displacement data of the game player in the real world, the game character controlled by the game player to perform non-interspersed motion in the three-dimensional game virtual scene.
In some embodiments, the motion processing apparatus for a game character may further include: a file reading module, configured to read, before the game engine in the game client is invoked to render the real-time three-dimensional game virtual scene, a motion surface data file corresponding to the three-dimensional game virtual scene, where the motion surface data file is used to indicate the movable positions of the three-dimensional game virtual scene.
The motion driving module may be configured to: collect, at a preset frame rate, the real displacement data corresponding to the game client, and generate, frame by frame, the target position of the game character in the three-dimensional game virtual scene according to the real displacement data and the initial position of the game character in the three-dimensional game virtual scene; and render, frame by frame through the game engine according to the motion surface data file and the target position, the moving image frames corresponding to the game character in the three-dimensional game virtual scene.
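A minimal sketch of the per-frame driving loop implied by this module description is given below, assuming hypothetical callbacks for displacement sampling, movability checks, and rendering; it is not the apparatus itself, only an illustration of the order in which the steps could run.

```python
import time

def drive_character(initial_pos, sample_displacement, is_movable, render, fps=30):
    """Minimal per-frame loop: sample real displacement, derive a target
    position, check it against the motion surface data, render one frame."""
    pos = initial_pos
    frame_time = 1.0 / fps
    for _ in range(3):                   # a few frames for illustration
        dx, dy = sample_displacement()   # real displacement since the last frame
        target = (pos[0] + dx, pos[1] + dy)
        if is_movable(target):
            pos = target                 # non-interspersed move to the target
            render(pos, collided=False)
        else:
            render(pos, collided=True)   # preset collision motion rule
        time.sleep(frame_time)
    return pos

final = drive_character(
    initial_pos=(0.0, 0.0),
    sample_displacement=lambda: (0.3, 0.0),
    is_movable=lambda p: p[0] < 0.7,
    render=lambda p, collided: print("frame at", p, "collided" if collided else ""),
)
print("final position:", final)
```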
In some embodiments, the motion driving module may be further configured to: acquire, at the preset frame rate, the positioning data collected by the positioning device of the game client, and calculate, frame by frame, the real displacement data corresponding to the game client; and determine, according to a preset proportional coefficient, the virtual displacement data of the game character corresponding to the real displacement data, and determine the target position of the game character in the three-dimensional game virtual scene according to the virtual displacement data and the initial position.
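For illustration only, assuming the preset proportional coefficient is a single scalar that converts real-world metres into game-world units, the mapping from real displacement to a target position could look like the following sketch.

```python
def to_target_position(initial_pos, real_displacement, scale=2.0):
    """Virtual displacement = real displacement * preset proportional coefficient;
    target position = initial position + virtual displacement."""
    vx = real_displacement[0] * scale
    vy = real_displacement[1] * scale
    return (initial_pos[0] + vx, initial_pos[1] + vy)

# The player walks 1.5 m east in the real world; with a coefficient of 2.0
# the character is driven 3.0 game units east from its initial position.
print(to_target_position((100.0, 40.0), (1.5, 0.0)))   # (103.0, 40.0)
```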
In some embodiments, the motion driving module may be further configured to: acquire first positioning data and second positioning data from the positioning data, where the first sampling frame corresponding to the first positioning data and the second sampling frame corresponding to the second positioning data differ by a preset number of frames; and interpolate based on the first positioning data and the second positioning data to obtain interpolated position data matching the preset number, and calculate, according to the interpolated position data, the interpolated displacement data corresponding to the game client, where the real displacement data includes the interpolated displacement data.
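A hedged sketch of this interpolation step is shown below: it assumes simple linear interpolation between the two positioning samples, which the disclosure does not mandate, and derives per-frame interpolated displacement data from the interpolated positions. When positioning fixes arrive less often than rendering frames, such interpolation keeps the per-frame displacement data smooth between fixes.

```python
def interpolate_positions(first_fix, second_fix, preset_count):
    """Linearly interpolate 'preset_count' intermediate positions between two
    positioning samples taken 'preset_count' frames apart, then convert them
    into per-frame interpolated displacement data."""
    positions = []
    for i in range(1, preset_count + 1):
        t = i / preset_count
        positions.append((
            first_fix[0] + (second_fix[0] - first_fix[0]) * t,
            first_fix[1] + (second_fix[1] - first_fix[1]) * t,
        ))
    prev = first_fix
    displacements = []
    for p in positions:
        displacements.append((p[0] - prev[0], p[1] - prev[1]))
        prev = p
    return positions, displacements

pos, disp = interpolate_positions((0.0, 0.0), (1.0, 0.5), preset_count=5)
print(disp)   # five equal per-frame displacement vectors
```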
In some embodiments, the motion surface data file may include plane coordinates and height data corresponding to movable positions in the three-dimensional game virtual scene, determined based on the landform data of the three-dimensional game virtual scene.
The file reading module may be configured to: load, in the three-dimensional game virtual scene, the motion surface corresponding to the motion surface data file, where the motion surface is hidden when the three-dimensional game virtual scene is displayed.
In some embodiments, the motion driving module may be further configured to: generate a dotting ray based on the target position, and emit the dotting ray at the motion surface to perform ray dotting, where the dotting ray is perpendicular to the plane in which the target position lies;
if the dotting ray intersects the motion surface, render a first moving image frame in which the game character moves to the target position, where the moving image frames include the first moving image frame;
if the dotting ray does not intersect the motion surface, render, according to a preset collision motion rule, a second moving image frame in which the game character moves at its corresponding current position, where the moving image frames include the second moving image frame.
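By way of example only, if the hidden motion surface were approximated by horizontal rectangular patches, the ray dotting described above could be sketched as follows; the patch representation and the choice of the topmost intersection are assumptions of this example, not requirements of the disclosure.

```python
# Each motion-surface patch: an axis-aligned rectangle at a fixed height.
# This is only an illustrative stand-in for the hidden motion surface.
patches = [
    {"x": (0.0, 5.0), "y": (0.0, 5.0), "height": 0.0},   # ground
    {"x": (2.0, 3.0), "y": (2.0, 3.0), "height": 1.0},   # a platform
]

def ray_dot(target_xy):
    """Cast a ray perpendicular to the target's plane (straight down) and
    return the height of the intersection with the motion surface, or None."""
    hits = [p["height"] for p in patches
            if p["x"][0] <= target_xy[0] <= p["x"][1]
            and p["y"][0] <= target_xy[1] <= p["y"][1]]
    return max(hits) if hits else None   # take the topmost surface when stacked

print(ray_dot((2.5, 2.5)))   # 1.0  -> render the first moving image frame
print(ray_dot((9.0, 9.0)))   # None -> apply the preset collision motion rule
```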
In some embodiments, in the case where the dotting ray intersects the motion surface, the motion driving module may be further configured to: acquire the height of the intersection point of the dotting ray and the motion surface;
if the height of the intersection point matches the AR mode riding state information of the three-dimensional game virtual scene, render a third moving image frame in which the game character moves to the target position in the riding state corresponding to the height of the intersection point, where the first moving image frame includes the third moving image frame;
if the height of the intersection point does not match the AR mode riding state information, render, according to the preset collision motion rule, a fourth moving image frame in which the game character moves at its corresponding current position, where, in some embodiments, the first moving image frame includes the fourth moving image frame.
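The height check against the AR mode riding state information might be organised as in the sketch below, which assumes, purely for illustration, that the riding state information is a list of height ranges each mapped to a riding state.

```python
# Hypothetical AR mode riding state information: height ranges the character
# is allowed to move onto, and the riding state used when doing so.
riding_states = [
    {"min_h": 0.0, "max_h": 0.2, "state": "walk"},
    {"min_h": 0.2, "max_h": 1.2, "state": "climb"},
]

def pick_riding_state(intersection_height):
    """Match the intersection height against the riding state info; None means
    the move is rejected and the preset collision motion rule applies instead."""
    for entry in riding_states:
        if entry["min_h"] <= intersection_height <= entry["max_h"]:
            return entry["state"]
    return None

print(pick_riding_state(1.0))   # 'climb' -> third moving image frame
print(pick_riding_state(3.5))   # None    -> fourth moving image frame
```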
In some embodiments, the motion surface may include a two-dimensional navigation mesh motion surface; the motion driving module may be further configured to:
if the target position matches the two-dimensional navigation mesh motion surface, acquire the preset riding information of the target position corresponding to the two-dimensional navigation mesh motion surface, determine the target riding state of the game character according to the preset riding information and the current riding state of the game character, and render, based on the two-dimensional navigation mesh motion surface, a fifth moving image frame in which the game character moves to the target position in the target riding state, where the moving image frames include the fifth moving image frame;
if the target position does not match the two-dimensional navigation mesh motion surface, render, according to a preset collision motion rule, a sixth moving image frame in which the game character moves at its corresponding current position, where the moving image frames include the sixth moving image frame.
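A minimal sketch of matching the target position against a two-dimensional navigation mesh follows, assuming triangular navmesh polygons that each carry preset riding information; the point-in-triangle test and the polygon layout are illustrative assumptions only.

```python
def _sign(p, a, b):
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def in_triangle(p, tri):
    d1 = _sign(p, tri[0], tri[1])
    d2 = _sign(p, tri[1], tri[2])
    d3 = _sign(p, tri[2], tri[0])
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

# Hypothetical 2D navigation mesh: each polygon carries preset riding info.
navmesh = [
    {"tri": [(0, 0), (4, 0), (0, 4)], "riding": "walk"},
    {"tri": [(4, 0), (4, 4), (0, 4)], "riding": "swim"},
]

def match_navmesh(target_xy, current_riding):
    """Return the target riding state when the target lies on the navmesh,
    otherwise None so the preset collision motion rule applies (sixth frame)."""
    for poly in navmesh:
        if in_triangle(target_xy, poly["tri"]):
            # A fuller implementation would combine the preset riding info with
            # current_riding (e.g. trigger a transition when they differ).
            return poly["riding"]
    return None

print(match_navmesh((1.0, 1.0), "walk"))   # 'walk' -> fifth moving image frame
print(match_navmesh((9.0, 9.0), "walk"))   # None
```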
In some embodiments, the motion surface may include a three-dimensional voxel motion surface; the motion driving module may be further configured to:
if the target position matches the three-dimensional voxel motion surface, acquire the preset riding information of the target position corresponding to the three-dimensional voxel motion surface, determine the target riding state of the game character according to the preset riding information and the current riding state of the game character, determine, based on the three-dimensional voxel motion surface, pathfinding information for the game character to move from the current position to the target position, and render, according to the pathfinding information, a seventh moving image frame in which the game character moves to the target position in the target riding state, where the moving image frames include the seventh moving image frame;
if the target position does not match the three-dimensional voxel motion surface, render, according to a preset collision motion rule, an eighth moving image frame in which the game character moves at its corresponding current position, where the moving image frames include the eighth moving image frame.
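As an assumed, simplified stand-in for the three-dimensional voxel motion surface, the sketch below stores walkable voxels in a mapping and uses breadth-first search as the pathfinding step; the disclosure does not prescribe any particular pathfinding algorithm, so this is only one possible realisation.

```python
from collections import deque

# Hypothetical 3D voxel motion surface: walkable voxel coordinates, each
# tagged with preset riding information.
voxels = {
    (0, 0, 0): "walk", (1, 0, 0): "walk", (2, 0, 0): "walk",
    (2, 1, 0): "walk", (2, 1, 1): "climb", (2, 2, 1): "walk",
}

def find_path(start, goal):
    """Breadth-first search over face-adjacent walkable voxels; returns the
    voxel path used as pathfinding information, or None if unreachable."""
    if goal not in voxels:
        return None   # no match -> preset collision motion rule (eighth frame)
    queue, came_from = deque([start]), {start: None}
    while queue:
        cur = queue.popleft()
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for d in ((1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            nxt = (cur[0] + d[0], cur[1] + d[1], cur[2] + d[2])
            if nxt in voxels and nxt not in came_from:
                came_from[nxt] = cur
                queue.append(nxt)
    return None

path = find_path((0, 0, 0), (2, 2, 1))
print(path, "->", voxels[(2, 2, 1)] if path else "collision rule")
```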
In some embodiments, the motion surface data file may include plane coordinates and height data corresponding to movable positions in the three-dimensional game virtual scene, determined based on the landform data of the three-dimensional game virtual scene; the motion surface data file may be modified according to the real-time environment of the three-dimensional game scene.
The motion driving module may be further configured to:
if the target position matches the plane coordinates of a movable position, generate, according to the height data corresponding to the target position, a ninth moving image frame in which the game character moves to the target position, where the moving image frames include the ninth moving image frame;
if the target position does not match the plane coordinates of a movable position, generate, according to the preset collision motion rule, a tenth moving image frame in which the game character moves at its corresponding current position, where the moving image frames include the tenth moving image frame.
In some embodiments, the motion driving module may be further configured to: acquire, at the preset frame rate, the viewing angle data obtained by the angular motion detection device of the game client; and render, frame by frame according to the motion surface data file and the target position, the moving image frames of the game character matching the viewing angle data, where the moving image frames are used to display the three-dimensional game virtual scene matching the viewing angle data and the motion actions of the game character.
In some embodiments, the motion driving module may be further configured to: acquire, at the preset frame rate, the angular motion data collected by the angular motion detection device; and generate, according to the camera movement category corresponding to the three-dimensional game virtual scene, the viewing angle data corresponding to the angular motion data.
In some embodiments, when the camera movement category includes first-person camera movement, the viewing angle data is the first-person perspective of the game character in the three-dimensional game virtual scene, and the viewing angle data changes correspondingly with the angular motion data on the basis of a preset initial viewing angle of the game character;
when the camera movement category includes third-person camera movement, the viewing angle data is a third-person perspective looking at the game character from a preset position and a preset angle corresponding to the position and direction of the game character, and the direction of the game character changes correspondingly with the angular motion data on the basis of a preset initial direction of the game character.
It should be noted that, for other corresponding descriptions of the functional units involved in the motion processing apparatus for a game character according to an embodiment of the present invention, reference may be made to the corresponding descriptions in the methods of FIG. 1 to FIG. 2, which will not be repeated here.
Various component embodiments of the present invention may be implemented in hardware, in software modules running on one or more processors, or in a combination thereof. Those skilled in the art should understand that a microprocessor or a digital signal processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components of the motion processing apparatus for a game character according to an embodiment of the present invention. The present invention may also be implemented as programs/instructions (for example, computer programs/instructions and computer program products) of a device or apparatus for performing part or all of the methods described herein. Such programs/instructions implementing the present invention may be stored on a computer-readable medium, or may exist in the form of one or more signals; such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
Based on the methods shown in FIG. 1 to FIG. 2 above, correspondingly, according to an embodiment of the present invention, a computer-readable medium is further provided, on which a computer program is stored; when the computer program is executed by a processor, the motion processing method for a game character shown in FIG. 1 to FIG. 2 above is implemented.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic disk storage, quantum memory, graphene-based storage media, or other magnetic storage devices or any other non-transmission media, which can be used to store information that can be accessed by a computing device.
Based on this understanding, the technical solution of the present invention may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to execute the methods described in the various implementation scenarios of the present invention.
Based on the methods shown in FIG. 1 to FIG. 2 above and the virtual apparatus embodiment shown in FIG. 3, in order to achieve the above purpose, according to an embodiment of the present invention, a computer device is further provided, which may be a personal computer, a server, a network device, etc.; the computer device includes a memory and a processor, the memory being configured to store a computer program, and the processor being configured to execute the computer program to implement the motion processing method for a game character shown in FIG. 1 to FIG. 2 above.
FIG. 4 schematically shows a computer device that can implement the motion processing method for a game character according to the present invention. In some embodiments, the computer device includes a processor 410 and a computer-readable medium in the form of a memory 420. The memory 420 is an example of a computer-readable medium and has a storage space 430 for storing a computer program 431. When the computer program 431 is executed by the processor 410, the steps of the motion processing method for a game character described above can be implemented.
In some embodiments, the computer device may further include a user interface, a network interface, a camera, a radio frequency (RF) circuit, sensors, an audio circuit, a Wi-Fi module, and the like. The user interface may include a display screen (Display), an input unit such as a keyboard (Keyboard), and the like; the optional user interface may also include a USB interface, a card reader interface, and the like. The network interface may optionally include a standard wired interface, a wireless interface (such as a Bluetooth interface or a Wi-Fi interface), and the like.
Those skilled in the art can understand that the structure of a computer device according to an embodiment of the present invention does not constitute a limitation on the computer device, and the computer device may include more or fewer components, combine certain components, or arrange the components differently.
The storage medium may also include an operating system and a network communication module. The operating system is a program that manages and maintains the hardware and software resources of the computer device and supports the operation of the information processing program and other software and/or programs. The network communication module is used to implement communication between the components within the storage medium, as well as communication with other hardware and software in the physical device.
FIG. 5 schematically shows a block diagram of a computer program product implementing the method according to the present invention. The computer program product includes a computer program 510; when the computer program 510 is executed by a processor such as the processor 410 shown in FIG. 4, the steps of the motion processing method for a game character described above can be implemented.
Through the description of the above embodiments, those skilled in the art can clearly understand that the present invention can be implemented by means of software plus a necessary general-purpose hardware platform, or by hardware: in response to an AR mode interactive operation request, the motion surface data file corresponding to the three-dimensional game virtual scene indicated by the AR mode interactive operation request is read; at the same time, the real displacement data corresponding to the game client is acquired at a preset frame rate, the target position of the game character in the three-dimensional game virtual scene is determined frame by frame based on the real displacement data, the read motion surface data file is used to analyze whether the target position allows motion, and the moving image frames corresponding to the game character are rendered based on the analysis result. Compared with the prior art, in which running the AR mode in an existing scene causes visible flaws in the motion performance, or a scene is modeled separately for the AR mode to avoid such flaws, the embodiments of the present invention do not require a special game scene model to be built for the AR mode. Only the pre-built motion surface data file is needed to analyze whether the game character in the three-dimensional game virtual scene can follow the player's movement trajectory in the real world, and the moving image frames are rendered accordingly, avoiding the visual flaws caused by the game character rigidly following the player's movement path in the real world in the three-dimensional game virtual scene. While improving the playability of the game and adding gameplay, the display effect of the game is guaranteed, and the AR game mode brings players an extraordinary game experience combining the virtual and the real.
Those skilled in the art can understand that the accompanying drawing is only a schematic diagram of a preferred implementation scenario, and the modules or processes in the accompanying drawing are not necessarily required to implement the present invention. Those skilled in the art can understand that the modules in the apparatus of an implementation scenario may be distributed in the apparatus of that implementation scenario as described, or may be correspondingly changed to be located in one or more apparatuses different from that of the implementation scenario. The modules of the above implementation scenario may be combined into one module, or may be further split into multiple sub-modules.
Specific embodiments of this specification have been described above, and other embodiments are within the scope of the appended claims. In some cases, the actions or steps recited in the claims may be performed in an order different from that in the embodiments and still achieve the desired results. In addition, the processes depicted in the figures do not necessarily require the specific order shown, or a sequential order, to achieve the desired results. In certain embodiments, multitasking and parallel processing are also possible or advantageous.
It should also be noted that the terms "comprise", "include", or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that includes a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a..." does not exclude the presence of additional identical elements in the process, method, article, or device that includes the element.
It should be understood that the above-described embodiments are only intended to illustrate the present invention and not to limit it. Those skilled in the art can also implement the present invention in other ways without departing from the basic spirit and characteristics of the present invention. The scope of the present invention shall be determined by the appended claims, and any modifications, equivalent substitutions, improvements, etc. made within the spirit and principles of one or more embodiments of this specification shall be covered therein.

Claims (17)

  1. A motion processing method for a game character, comprising:
    in response to an AR mode interactive operation request in a game client, invoking a game engine in the game client to render a real-time three-dimensional game virtual scene; and
    driving, according to real-time real displacement data of a game player in the real world and a motion surface data file corresponding to the three-dimensional game virtual scene, a game character controlled by the game player to perform non-interspersed motion in the three-dimensional game virtual scene.
  2. The method according to claim 1, wherein, before the invoking the game engine in the game client to render the real-time three-dimensional game virtual scene, the method further comprises:
    reading a motion surface data file corresponding to the three-dimensional game virtual scene, the motion surface data file being used to indicate movable positions of the three-dimensional game virtual scene;
    the driving, according to the real-time real displacement data of the game player in the real world, the game character controlled by the game player to perform non-interspersed motion in the three-dimensional game virtual scene comprises:
    collecting, at a preset frame rate, the real displacement data corresponding to the game client, and generating, frame by frame, a target position of the game character in the three-dimensional game virtual scene according to the real displacement data and an initial position of the game character in the three-dimensional game virtual scene;
    rendering, frame by frame through the game engine according to the motion surface data file and the target position, moving image frames corresponding to the game character in the three-dimensional game virtual scene.
  3. The method according to claim 2, wherein the collecting, at the preset frame rate, the real displacement data corresponding to the game client, and determining, frame by frame, the target position of the game character in the three-dimensional game virtual scene according to the real displacement data and the initial position of the game character in the three-dimensional game virtual scene comprises:
    acquiring, at the preset frame rate, positioning data collected by a positioning device of the game client, and calculating, frame by frame, the real displacement data corresponding to the game client;
    determining, according to a preset proportional coefficient, virtual displacement data of the game character corresponding to the real displacement data, and determining the target position of the game character in the three-dimensional game virtual scene according to the virtual displacement data and the initial position.
  4. The method according to claim 3, wherein the acquiring, at the preset frame rate, the positioning data collected by the positioning device of the game client, and calculating, frame by frame, the real displacement data corresponding to the game client comprises:
    acquiring first positioning data and second positioning data from the positioning data, a first sampling frame corresponding to the first positioning data and a second sampling frame corresponding to the second positioning data differing by a preset number;
    interpolating based on the first positioning data and the second positioning data to obtain interpolated position data matching the preset number, and calculating, according to the interpolated position data, interpolated displacement data corresponding to the game client, the real displacement data comprising the interpolated displacement data.
  5. The method according to claim 2, wherein the motion surface data file comprises plane coordinates and height data corresponding to movable positions in the three-dimensional game virtual scene determined based on landform data of the three-dimensional game virtual scene, and the reading the motion surface data file corresponding to the three-dimensional game virtual scene comprises:
    loading, in the three-dimensional game virtual scene, a motion surface corresponding to the motion surface data file, the motion surface being hidden when the three-dimensional game virtual scene is displayed.
  6. The method according to claim 5, wherein the rendering, frame by frame through the game engine according to the motion surface data file and the target position, the moving image frames corresponding to the game character in the three-dimensional game virtual scene comprises:
    generating a dotting ray based on the target position, and emitting the dotting ray at the motion surface to perform ray dotting, the dotting ray being perpendicular to the plane in which the target position lies;
    if the dotting ray intersects the motion surface, rendering a first moving image frame in which the game character moves to the target position, the moving image frames comprising the first moving image frame;
    if the dotting ray does not intersect the motion surface, rendering, according to a preset collision motion rule, a second moving image frame in which the game character moves at its corresponding current position, the moving image frames comprising the second moving image frame.
  7. The method according to claim 6, wherein, if the dotting ray intersects the motion surface, the generating of the first moving image frame in which the game character moves to the target position comprises:
    acquiring a height of an intersection point of the dotting ray and the motion surface;
    if the height of the intersection point matches AR mode riding state information of the three-dimensional game virtual scene, rendering a third moving image frame in which the game character moves to the target position in a riding state corresponding to the height of the intersection point, the first moving image frame comprising the third moving image frame;
    if the height of the intersection point does not match the AR mode riding state information, rendering, according to the preset collision motion rule, a fourth moving image frame in which the game character moves at its corresponding current position, the first moving image frame comprising the fourth moving image frame.
  8. The method according to claim 5, wherein the motion surface comprises a two-dimensional navigation mesh motion surface, and the rendering, frame by frame through the game engine according to the motion surface data file and the target position, the moving image frames corresponding to the game character in the three-dimensional game virtual scene comprises:
    if the target position matches the two-dimensional navigation mesh motion surface, acquiring preset riding information of the target position corresponding to the two-dimensional navigation mesh motion surface, determining a target riding state of the game character according to the preset riding information and a current riding state of the game character, and rendering, based on the two-dimensional navigation mesh motion surface, a fifth moving image frame in which the game character moves to the target position in the target riding state, the moving image frames comprising the fifth moving image frame;
    if the target position does not match the two-dimensional navigation mesh motion surface, rendering, according to a preset collision motion rule, a sixth moving image frame in which the game character moves at its corresponding current position, the moving image frames comprising the sixth moving image frame.
  9. The method according to claim 5, wherein the motion surface comprises a three-dimensional voxel motion surface, and the rendering, frame by frame through the game engine according to the motion surface data file and the target position, the moving image frames corresponding to the game character in the three-dimensional game virtual scene comprises:
    if the target position matches the three-dimensional voxel motion surface, acquiring preset riding information of the target position corresponding to the three-dimensional voxel motion surface, determining a target riding state of the game character according to the preset riding information and a current riding state of the game character, determining, based on the three-dimensional voxel motion surface, pathfinding information for the game character to move from a current position to the target position, and rendering, according to the pathfinding information, a seventh moving image frame in which the game character moves to the target position in the target riding state, the moving image frames comprising the seventh moving image frame;
    if the target position does not match the three-dimensional voxel motion surface, rendering, according to a preset collision motion rule, an eighth moving image frame in which the game character moves at its corresponding current position, the moving image frames comprising the eighth moving image frame.
  10. The method according to claim 2, wherein the motion surface data file comprises plane coordinates and height data corresponding to movable positions in the three-dimensional game virtual scene determined based on landform data of the three-dimensional game virtual scene; the motion surface data file is modifiable according to the real-time environment of the three-dimensional game scene; and
    the rendering, frame by frame through the game engine according to the motion surface data file and the target position, the moving image frames corresponding to the game character in the three-dimensional game virtual scene comprises:
    if the target position matches the plane coordinates of a movable position, generating, according to the height data corresponding to the target position, a ninth moving image frame in which the game character moves to the target position, the moving image frames comprising the ninth moving image frame;
    if the target position does not match the plane coordinates of a movable position, generating, according to the preset collision motion rule, a tenth moving image frame in which the game character moves at its corresponding current position, the moving image frames comprising the tenth moving image frame.
  11. The method according to claim 2, wherein the generating, frame by frame according to the motion surface data file and the target position, the moving image frames corresponding to the game character in the three-dimensional game virtual scene comprises:
    acquiring, at the preset frame rate, viewing angle data obtained by an angular motion detection device of the game client;
    rendering, frame by frame according to the motion surface data file and the target position, moving image frames of the game character matching the viewing angle data, the moving image frames being used to display the three-dimensional game virtual scene matching the viewing angle data and motion actions of the game character.
  12. The method according to claim 11, wherein the acquiring, at the preset frame rate, the viewing angle data obtained by the angular motion detection device of the game client comprises:
    acquiring, at the preset frame rate, angular motion data collected by the angular motion detection device;
    generating, according to a camera movement category corresponding to the three-dimensional game virtual scene, the viewing angle data corresponding to the angular motion data.
  13. The method according to claim 12, wherein
    when the camera movement category comprises first-person camera movement, the viewing angle data is a first-person perspective of the game character in the three-dimensional game virtual scene, the viewing angle data changing correspondingly with the angular motion data on the basis of a preset initial viewing angle of the game character;
    when the camera movement category comprises third-person camera movement, the viewing angle data is a third-person perspective looking at the game character from a preset position and a preset angle corresponding to the position and direction of the game character, the direction of the game character changing correspondingly with the angular motion data on the basis of a preset initial direction of the game character.
  14. A motion processing apparatus for a game character, comprising:
    a request response module, configured to invoke, in response to an AR mode interactive operation request in a game client, a game engine in the game client to render a real-time three-dimensional game virtual scene;
    a motion driving module, configured to drive, according to real-time real displacement data of a game player in the real world, a game character controlled by the game player to perform non-interspersed motion in the three-dimensional game virtual scene.
  15. A computer-readable medium having a computer program stored thereon, the computer program, when executed by a processor, implementing the steps of the method according to any one of claims 1 to 13.
  16. A computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor, when executing the computer program, implementing the steps of the method according to any one of claims 1 to 13.
  17. A computer program product comprising a computer program which, when executed by a processor, implements the steps of the method according to any one of claims 1 to 13.
PCT/CN2021/121092 2021-03-16 2021-09-27 Motion processing method and apparatus for game character, and storage medium and computer device WO2022193612A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110282546.0 2021-03-16
CN202110282546.0A CN112862935B (en) 2021-03-16 2021-03-16 Game role movement processing method and device, storage medium and computer equipment

Publications (1)

Publication Number Publication Date
WO2022193612A1 WO2022193612A1 (en)

Family

ID=75994795

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/121092 WO2022193612A1 (en) 2021-03-16 2021-09-27 Motion processing method and apparatus for game character, and storage medium and computer device

Country Status (2)

Country Link
CN (1) CN112862935B (en)
WO (1) WO2022193612A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112862935B (en) * 2021-03-16 2023-03-17 天津亚克互动科技有限公司 Game role movement processing method and device, storage medium and computer equipment
CN113821345B (en) * 2021-09-24 2023-06-30 网易(杭州)网络有限公司 Method and device for rendering moving track in game and electronic equipment
CN114125552A (en) * 2021-11-30 2022-03-01 完美世界(北京)软件科技发展有限公司 Video data generation method and device, storage medium and electronic device
TWI799195B (en) * 2021-12-10 2023-04-11 宅妝股份有限公司 Method and system for implementing third-person perspective with a virtual object
CN116414223A (en) * 2021-12-31 2023-07-11 中兴通讯股份有限公司 Interaction method and device in three-dimensional space, storage medium and electronic device
CN116328309B (en) * 2023-03-27 2023-10-13 广州美术学院 High-order demand game interaction method aiming at visual impairment crowd Wen Nong travel virtual scene

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104658038B (en) * 2015-03-12 2019-01-18 南京梦宇三维技术有限公司 3-dimensional digital content intelligence production method and manufacturing system based on motion capture
CN107441714A (en) * 2017-06-01 2017-12-08 杨玉苹 A kind of image processing method and its device, shooting game fighting system and its method of work for realizing AR first person shooting games
CN107479699A (en) * 2017-07-28 2017-12-15 深圳市瑞立视多媒体科技有限公司 Virtual reality exchange method, apparatus and system
CN108303719A (en) * 2018-01-30 2018-07-20 上海电力学院 A method of judging whether monitoring client dynamic position exceeds virtual fence
CN108427501B (en) * 2018-03-19 2022-03-22 网易(杭州)网络有限公司 Method and device for controlling movement in virtual reality
EP3644322B1 (en) * 2018-10-25 2023-12-27 Tata Consultancy Services Limited Method and system for interpreting neural interplay involving proprioceptive adaptation during a dual task paradigm
CN110280014B (en) * 2019-05-21 2022-09-13 西交利物浦大学 Method for reducing dizziness in virtual reality environment
CN110665219A (en) * 2019-10-14 2020-01-10 网易(杭州)网络有限公司 Operation control method and device for virtual reality game
CN111167120A (en) * 2019-12-31 2020-05-19 网易(杭州)网络有限公司 Method and device for processing virtual model in game
CN111249729B (en) * 2020-02-18 2023-10-20 网易(杭州)网络有限公司 Game character display method and device, electronic equipment and storage medium
CN111318022B (en) * 2020-03-19 2023-04-14 网易(杭州)网络有限公司 Game scene generation method and device in game, electronic device and storage medium
CN111744202A (en) * 2020-06-29 2020-10-09 完美世界(重庆)互动科技有限公司 Method and device for loading virtual game, storage medium and electronic device
CN112316424B (en) * 2021-01-06 2021-03-26 腾讯科技(深圳)有限公司 Game data processing method, device and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101780321A (en) * 2009-12-30 2010-07-21 永春至善体育用品有限公司 Method for making high-presence virtual reality of exercise fitness equipment, and interactive system and method based on virtual reality
US10022628B1 (en) * 2015-03-31 2018-07-17 Electronic Arts Inc. System for feature-based motion adaptation
CN109478341A (en) * 2016-07-13 2019-03-15 株式会社万代南梦宫娱乐 Simulation system, processing method and information storage medium
CN110772791A (en) * 2019-11-05 2020-02-11 网易(杭州)网络有限公司 Route generation method and device for three-dimensional game scene and storage medium
CN112862935A (en) * 2021-03-16 2021-05-28 天津亚克互动科技有限公司 Game character motion processing method and device, storage medium and computer equipment

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116036601A (en) * 2023-01-28 2023-05-02 腾讯科技(深圳)有限公司 Game processing method and device, computer equipment and storage medium
CN116036601B (en) * 2023-01-28 2023-06-09 腾讯科技(深圳)有限公司 Game processing method and device, computer equipment and storage medium
CN117357894A (en) * 2023-11-01 2024-01-09 北京畅游天下网络技术集团有限公司 Three-dimensional scene generation method, device, equipment and medium
CN117357894B (en) * 2023-11-01 2024-03-29 北京畅游天下网络技术集团有限公司 Three-dimensional scene generation method, device, equipment and medium

Also Published As

Publication number Publication date
CN112862935A (en) 2021-05-28
CN112862935B (en) 2023-03-17

Similar Documents

Publication Publication Date Title
WO2022193612A1 (en) Motion processing method and apparatus for game character, and storage medium and computer device
US20210252398A1 (en) Method and system for directing user attention to a location based game play companion application
JP7273068B2 (en) Multi-server cloud virtual reality (VR) streaming
KR101610702B1 (en) Sprite strip renderer
US11250617B1 (en) Virtual camera controlled by a camera control device
US9369543B2 (en) Communication between avatars in different games
KR20150108842A (en) Mixed reality filtering
JP7249975B2 (en) Method and system for directing user attention to location-based gameplay companion applications
CN112933606B (en) Game scene conversion method and device, storage medium and computer equipment
CN110832442A (en) Optimized shading and adaptive mesh skin in point-of-gaze rendering systems
JP2023159344A (en) In-game location based game play companion application
WO2022000971A1 (en) Camera movement switching mode method and apparatus, computer program and readable medium
US20220395756A1 (en) Building a dynamic social community based on similar interaction regions of game plays of a gaming application
US10293259B2 (en) Control of audio effects using volumetric data
CN112891940B (en) Image data processing method and device, storage medium and computer equipment
JP6959267B2 (en) Generate challenges using a location-based gameplay companion application
Chung Metaverse XR Components
WO2023030106A1 (en) Object display method and apparatus, electronic device, and storage medium
CN110314377B (en) Method and device for randomly generating object moving path in three-dimensional space
CN116764215A (en) Virtual object control method, device, equipment, storage medium and program product
CN115359164A (en) Method, system, electronic device and storage medium for presenting object in screen center

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21931187

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21931187

Country of ref document: EP

Kind code of ref document: A1