WO2016206491A1 - Method, apparatus and storage medium for determining a motion trajectory of a target object - Google Patents

Method, apparatus and storage medium for determining a motion trajectory of a target object

Info

Publication number
WO2016206491A1
WO2016206491A1 (PCT/CN2016/081694)
Authority
WO
WIPO (PCT)
Prior art keywords
control point
training
direction vector
vector
determining
Prior art date
Application number
PCT/CN2016/081694
Other languages
English (en)
French (fr)
Inventor
谢思远
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 filed Critical 腾讯科技(深圳)有限公司
Priority to KR1020177016338A priority Critical patent/KR101975689B1/ko
Priority to JP2017538656A priority patent/JP6735760B2/ja
Publication of WO2016206491A1 publication Critical patent/WO2016206491A1/zh
Priority to US15/624,245 priority patent/US10354393B2/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55Controlling game characters or game objects based on the game progress
    • A63F13/57Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/573Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/215Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25Output arrangements for video game devices
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35Details of game servers
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/20Linear translation of whole images or parts thereof, e.g. panning
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/822Strategy games; Role-playing games
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/64Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A63F2300/646Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car for calculating the trajectory of an object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30241Trajectory
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/30ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising

Definitions

  • the present invention relates to the field of computer technologies, and in particular, to a method, an apparatus, and a storage medium for determining a target object motion trajectory.
  • a role can be selected in the application, and the selected role can be referred to as a target object.
  • the terminal may determine a motion trajectory of the target object on the motion track based on the user's control of the target object, thereby controlling the target object motion through the navigation module of the application.
  • Embodiments of the present invention provide a method, an apparatus, and a storage medium for determining a target object motion trajectory.
  • a method for determining a target object motion trajectory comprising:
  • acquiring, based on a current position of the target object on a motion track, a first control point and a second control point on the motion track, the first control point and the second control point being adjacent control points; acquiring a first direction vector and a second direction vector, the first direction vector being a unit direction vector at the first control point and the second direction vector being a unit direction vector at the second control point; and determining, based on the first control point, the second control point, the first direction vector, and the second direction vector, a motion trajectory of the target object on the motion track through a specified spline curve interpolation model.
  • an apparatus for determining a target object motion trajectory, comprising one or more processors and a storage medium storing operation instructions, where, when the operation instructions in the storage medium are executed, the one or more processors perform the following steps:
  • acquiring, based on a current position of the target object on a motion track, a first control point and a second control point on the motion track, the first control point and the second control point being adjacent control points; acquiring a first direction vector and a second direction vector, the first direction vector being a unit direction vector at the first control point and the second direction vector being a unit direction vector at the second control point; and determining, based on the first control point, the second control point, the first direction vector, and the second direction vector, a motion trajectory of the target object on the motion track through a specified spline curve interpolation model.
  • a non-transitory computer readable storage medium having computer executable instructions stored thereon which, when executed in a computer, cause the computer to perform the following steps:
  • acquiring, based on a current position of the target object on a motion track, a first control point and a second control point on the motion track, the first control point and the second control point being adjacent control points; acquiring a first direction vector and a second direction vector, the first direction vector being a unit direction vector at the first control point and the second direction vector being a unit direction vector at the second control point; and determining, based on the first control point, the second control point, the first direction vector, and the second direction vector, a motion trajectory of the target object on the motion track through a specified spline curve interpolation model.
  • When the actual motion trajectory of the target object is a straight line, the first direction vector and the second direction vector are the same, the higher-order terms in the specified spline curve interpolation model cancel out, and the generated actual motion trajectory of the target object is a straight line rather than a curve, so that the actual motion trajectory of the target object is the same as the theoretical motion trajectory, which improves the accuracy of determining the motion trajectory of the target object.
  • FIG. 1 is a flowchart of a method for determining a target object motion trajectory according to an embodiment of the present invention
  • FIG. 2 is a flowchart of another method for determining a target object motion trajectory according to an embodiment of the present invention
  • FIG. 3 is a schematic diagram of a line generated by a CatmullRom spline curve algorithm according to an embodiment of the present invention
  • FIG. 4 is a schematic diagram of a target object motion track interface according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of another target object motion trajectory determining interface according to an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of a line generated by a specified spline curve interpolation model according to an embodiment of the present invention.
  • FIG. 7 is a schematic structural diagram of a target object motion trajectory determining apparatus according to an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of another target object motion trajectory determining apparatus according to an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of still another target object motion trajectory determining apparatus according to an embodiment of the present invention.
  • FIG. 1 is a flowchart of a method for determining a target object motion trajectory according to an embodiment of the present invention.
  • the method can be applied to a terminal, which can be a mobile phone, a tablet computer, a palmtop computer, etc.
  • the method includes steps 101 to 103.
  • In step 101, a first control point and a second control point on the motion track are acquired based on a current position of the target object on the motion track, the first control point and the second control point being adjacent control points.
  • In step 102, a first direction vector and a second direction vector are acquired, the first direction vector being a unit direction vector at the first control point and the second direction vector being a unit direction vector at the second control point.
  • In step 103, based on the first control point, the second control point, the first direction vector, and the second direction vector, the motion trajectory of the target object on the motion track is determined through a specified spline curve interpolation model.
  • According to the technical solution provided in this embodiment of the present invention, when the actual motion trajectory of the target object is a straight line, the first direction vector and the second direction vector are the same, the higher-order terms in the specified spline curve interpolation model cancel out, and the generated actual motion trajectory of the target object is a straight line rather than a curve, so that the actual motion trajectory of the target object is the same as the theoretical motion trajectory, which improves the accuracy of determining the motion trajectory of the target object.
  • Optionally, before the motion trajectory of the target object on the motion track is determined through the specified spline curve interpolation model based on the first control point, the second control point, the first direction vector, and the second direction vector, the method may further include: obtaining a spline curve interpolation model to be trained;
  • determining a training distance, a first training vector, and a second training vector based on a first training control point, a second training control point, a third training control point, and a fourth training control point, where the first training control point, the second training control point, the third training control point, and the fourth training control point may be arranged in sequence; and
  • the specified spline interpolation model is determined based on the second training control point, the third training control point, the training distance, the first training vector, the second training vector, and the training spline interpolation model.
  • determining the training distance, the first training vector, and the second training vector based on the first training control point, the second training control point, the third training control point, and the fourth training control point may include:
  • determining the distance between the second training control point and the third training control point as the training distance; determining a unit direction vector of the line connecting the first training control point and the third training control point as the first training vector; and determining a unit direction vector of the line connecting the second training control point and the fourth training control point as the second training vector.
  • determining the specified spline interpolation model based on the second training control point, the third training control point, the training distance, the first training vector, the second training vector, and the training spline interpolation model may include:
  • determining each parameter of the training spline curve interpolation model based on the second training control point, the third training control point, the training distance, the first training vector, and the second training vector; and determining the specified spline curve interpolation model based on each parameter of the training spline curve interpolation model and the training spline curve interpolation model.
  • Optionally, determining, based on the first control point, the second control point, the first direction vector, and the second direction vector, the motion trajectory of the target object on the motion track through the specified spline curve interpolation model may include: determining, based on the first control point and the second control point, an interpolation ratio of the current motion distance between the first control point and the second control point; determining the distance between the first control point and the second control point to obtain a control point distance; and determining the motion trajectory of the target object through the following specified spline curve interpolation model:
  • P(U) = P_{i-1} + λEU + (3P_i - 3P_{i-1} - λF - 2λE)U² + (-2P_i + 2P_{i-1} + λF + λE)U³
  • where P(U) is the motion trajectory of the target object, U is the interpolation ratio of the current motion distance between the first control point and the second control point, P_{i-1} is the first control point, P_i is the second control point, λ is the control point distance, E is the first direction vector, and F is the second direction vector.
  • Optionally, the method may further include: when the first direction vector is equal to the second direction vector, the specified spline curve interpolation model is P(U) = P_{i-1} + λVU, and the motion trajectory of the target object on the motion track is a straight line, where V is the first direction vector or the second direction vector.
  • All of the foregoing optional technical solutions may be combined in any manner to form optional embodiments of the present invention, which are not described one by one in the embodiments of the present invention.
  • FIG. 2 is a flowchart of a method for determining a target object motion trajectory according to an embodiment of the present invention.
  • the method is applied to a terminal, which may be a mobile phone, a tablet computer, a palmtop computer, etc.
  • the method includes steps 201 to 206.
  • In step 201, a training spline curve interpolation model is acquired.
  • When the actual motion trajectory of the target object is a straight line, the line generated by the CatmullRom spline curve algorithm is a curve rather than a straight line. For example, as shown in FIG. 3, when the actual motion trajectory of the target object is a straight line from A to B, the line generated by the CatmullRom spline curve algorithm first goes from A to A' and then from A' to B, so the moving target object has to fall back from A to A' before moving from A' to B. Therefore, in the embodiment of the present invention, before the motion trajectory of the target object is determined, a specified spline curve interpolation model may be retrained and generated, so that when the actual motion trajectory of the target object is a straight line, the line generated by the specified spline curve interpolation model is also a straight line. Before the specified spline curve interpolation model is trained, the training spline curve interpolation model is obtained.
  • For example, the training spline curve interpolation model can be:
  • P(U) = C0 + C1U + C2U² + C3U³
  • where U is the interpolation ratio and is a floating point number between 0 and 1, P(U) is the motion trajectory, and C0, C1, C2, and C3 are the parameters of the training spline curve interpolation model. When U is specified, P(U) is a point on the motion trajectory; for example, when U is 0, P(0) is the starting point of the motion trajectory, and when U is 1, P(1) is the end point of the motion trajectory.
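As a minimal illustrative sketch (not code from the patent), the training model and its derivative, which step 203 below uses for the boundary direction vectors, can be evaluated per coordinate as follows; representing points and coefficients as 3-component tuples is an assumption for illustration:

```python
# Sketch of the training spline curve interpolation model
# P(U) = C0 + C1*U + C2*U^2 + C3*U^3 and its derivative
# P'(U) = C1 + 2*C2*U + 3*C3*U^2, evaluated component-wise.

def eval_spline(coeffs, u):
    """Evaluate P(U) = C0 + C1*U + C2*U^2 + C3*U^3."""
    c0, c1, c2, c3 = coeffs
    return tuple(a + b * u + c * u ** 2 + d * u ** 3
                 for a, b, c, d in zip(c0, c1, c2, c3))

def eval_spline_derivative(coeffs, u):
    """Evaluate P'(U) = C1 + 2*C2*U + 3*C3*U^2."""
    _, c1, c2, c3 = coeffs
    return tuple(b + 2 * c * u + 3 * d * u ** 2
                 for b, c, d in zip(c1, c2, c3))
```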
  • In step 202, a training distance, a first training vector, and a second training vector are determined based on a first training control point, a second training control point, a third training control point, and a fourth training control point, where the first training control point, the second training control point, the third training control point, and the fourth training control point are control points arranged in sequence on the training track.
  • The specified spline curve interpolation model can be derived based on a training motion trajectory. The first training control point, the second training control point, the third training control point, and the fourth training control point are acquired, where the second training control point is the starting point of the training motion trajectory, the third training control point is the end point of the training motion trajectory, the first training control point is the control point before and adjacent to the second training control point, and the fourth training control point is the control point after and adjacent to the third training control point. The distance between the second training control point and the third training control point is determined as the training distance; the unit direction vector of the line connecting the first training control point and the third training control point is determined as the first training vector; and the unit direction vector of the line connecting the second training control point and the fourth training control point is determined as the second training vector.
  • For example, the first training control point is P_{i-2}, the second training control point is P_{i-1}, the third training control point is P_i, and the fourth training control point is P_{i+1}. The distance between the second training control point and the third training control point is λ, that is, the training distance is λ; the unit vector of the line connecting the first training control point and the third training control point is E, that is, the first training vector is E; and the unit vector of the line connecting the second training control point and the fourth training control point is F, that is, the second training vector is F.
  • In step 203, the specified spline curve interpolation model is determined based on the second training control point, the third training control point, the training distance, the first training vector, the second training vector, and the training spline curve interpolation model.
  • After each parameter of the training spline curve interpolation model is determined, the specified spline curve interpolation model is determined based on each parameter of the training spline curve interpolation model and the training spline curve interpolation model.
  • As mentioned in step 201 above, P(0) is the starting point of the motion trajectory and P(1) is the end point of the motion trajectory. Therefore, for the training motion trajectory between the second training control point and the third training control point, P(0) is the second training control point P_{i-1}, and P(1) is the third training control point P_i. In addition, the direction at the second training control point is the direction of the line connecting the third training control point and the first training control point, and the direction at the third training control point is the direction of the line connecting the fourth training control point and the second training control point.
  • To obtain the direction vector at the second training control point and the direction vector at the third training control point, a derivative operation is performed on the training spline curve interpolation model, giving the derivative model P'(U) = C1 + 2C2U + 3C3U², where the direction vector at the second training control point is P'(0) and the direction vector at the third training control point is P'(1). Therefore, P(0) = C0 = P_{i-1}, P(1) = C0 + C1 + C2 + C3 = P_i, P'(0) = C1 = λE, and P'(1) = C1 + 2C2 + 3C3 = λF, from which the parameters of the training spline curve interpolation model are determined as C0 = P_{i-1}, C1 = λE, C2 = 3P_i - 3P_{i-1} - λF - 2λE, and C3 = -2P_i + 2P_{i-1} + λF + λE. Substituting these parameters into the training spline curve interpolation model gives the specified spline curve interpolation model:
  • P(U) = P_{i-1} + λEU + (3P_i - 3P_{i-1} - λF - 2λE)U² + (-2P_i + 2P_{i-1} + λF + λE)U³
  • where P(U) is the training motion trajectory, U is the interpolation ratio, P_{i-1} is the starting point of the training motion trajectory (that is, the second training control point), P_i is the end point of the training motion trajectory (that is, the third training control point), λ is the distance between the second training control point and the third training control point (that is, the training distance), E is the first training vector, and F is the second training vector.
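A minimal sketch of this derivation in code (function and variable names are illustrative, not from the patent): given the boundary conditions P(0) = P_{i-1}, P(1) = P_i, P'(0) = λE, and P'(1) = λF, the coefficients and the specified model follow directly:

```python
# Sketch of step 203: solve the boundary conditions for C0..C3 and evaluate
# the resulting specified spline curve interpolation model.

def solve_specified_spline(p_prev, p_curr, lam, e, f):
    """Return (C0, C1, C2, C3) for the segment from p_prev to p_curr."""
    c0 = p_prev
    c1 = tuple(lam * ei for ei in e)
    c2 = tuple(3 * pc - 3 * pp - lam * fi - 2 * lam * ei
               for pc, pp, fi, ei in zip(p_curr, p_prev, f, e))
    c3 = tuple(-2 * pc + 2 * pp + lam * fi + lam * ei
               for pc, pp, fi, ei in zip(p_curr, p_prev, f, e))
    return c0, c1, c2, c3

def specified_spline_point(p_prev, p_curr, lam, e, f, u):
    """Evaluate P(U) of the specified spline curve interpolation model."""
    c0, c1, c2, c3 = solve_specified_spline(p_prev, p_curr, lam, e, f)
    return tuple(a + b * u + c * u ** 2 + d * u ** 3
                 for a, b, c, d in zip(c0, c1, c2, c3))
```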
  • The specified spline curve interpolation model can be trained through the above steps 201-203.
  • In addition, the above steps 201-203 need not be performed only when the motion trajectory of the target object is determined; the specified spline curve interpolation model may also be obtained in advance through the method of the above steps 201-203 and stored, so that the specified spline curve interpolation model can be obtained directly when the motion trajectory of the target object is determined later. The timing of determining the specified spline curve interpolation model is not specifically limited in the embodiment of the present invention.
  • In addition, after the specified spline curve interpolation model is determined, the spline curve of the motion track on which the target object is located can be generated through the specified spline curve interpolation model, and the spline curve may include a plurality of control points. Then, the terminal may determine the motion trajectory of the target object on the motion track through the following steps.
  • In step 204, based on the current position of the target object on the motion track, a first control point and a second control point on the motion track are acquired, the first control point and the second control point being adjacent control points.
  • In the embodiment of the present invention, the terminal may determine the position of the target object on the motion track every specified duration, thereby obtaining the motion trajectory of the target object on the motion track. The specified duration is set in advance; for example, the specified duration may be 1 second, so that the motion trajectory of the target object on the motion track, determined based on the position of the target object on the motion track every second, is relatively smooth.
  • In addition, the speed of the target object on the motion track is constant; therefore, after the target object starts moving from the starting point of the motion track, the current position of the target object on the motion track can be calculated every specified duration based on the speed of the target object.
  • The motion track may include a plurality of control points. In the embodiment of the present invention, for each of the plurality of control points, the distance between the control point and the starting point of the motion track may be calculated and divided by the total length of the motion track to obtain the length ratio of the control point, and the correspondence between each control point and its length ratio may be stored.
  • For example, the motion track includes 10 control points and the total length of the motion track is 100 meters. From the first control point to the tenth control point, the distances between the control points P1 to P10 and the starting point of the motion track are 0 meters, 10 meters, 20 meters, 35 meters, 45 meters, 60 meters, 70 meters, 80 meters, 90 meters, and 100 meters, respectively (P1 is at the starting point and P10 is at the end point). Dividing each distance by the total length of 100 meters, the length ratios of the control points are 0%, 10%, 20%, 35%, 45%, 60%, 70%, 80%, 90%, and 100%, respectively, and the correspondence between each control point and its length ratio is stored as shown in Table 1 below.
  • Table 1
    Control point: P1   P2   P3   P4   P5   P6   P7   P8   P9   P10
    Length ratio:  0%   10%  20%  35%  45%  60%  70%  80%  90%  100%
  • Optionally, acquiring the first control point and the second control point on the motion track based on the current position of the target object on the motion track may include: obtaining, based on the current position of the target object on the motion track, the distance of the target object from the starting point of the motion track, and dividing this distance by the total length of the motion track to obtain the length ratio of the current distance of the target object; selecting, from the stored correspondence, the two length ratios adjacent to this length ratio; determining the control point corresponding to the smaller of the two selected length ratios as the first control point; and determining the control point corresponding to the larger of the two selected length ratios as the second control point.
  • For example, the distance of the target object from the starting point of the motion track is 50 meters; dividing the distance of 50 meters by the total length of the motion track of 100 meters gives a length ratio of 50% for the current moving distance. The two stored length ratios adjacent to 50% are 45% and 60%, so the control point P5 corresponding to 45% is determined as the first control point and the control point P6 corresponding to 60% is determined as the second control point.
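A small sketch of this lookup, using the 10-control-point example above (the bisect-based search is an assumed implementation detail; the text only states that the two adjacent length ratios are selected):

```python
import bisect

length_ratios = [0.00, 0.10, 0.20, 0.35, 0.45, 0.60, 0.70, 0.80, 0.90, 1.00]

def adjacent_control_points(distance_from_start, total_length):
    """Return 0-based indices of the first and second control points."""
    ratio = distance_from_start / total_length       # e.g. 50 / 100 = 0.5
    hi = bisect.bisect_right(length_ratios, ratio)   # first ratio greater than 0.5
    return hi - 1, hi                                # last ratio <= 0.5, and the next one

first, second = adjacent_control_points(50, 100)
# first == 4 (P5, length ratio 45%), second == 5 (P6, length ratio 60%)
```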
  • In step 205, a first direction vector and a second direction vector are acquired, the first direction vector being a unit direction vector at the first control point and the second direction vector being a unit direction vector at the second control point.
  • the terminal may acquire the first direction vector and the second direction vector in multiple manners.
  • For example, a derivative operation can be performed on the specified spline curve interpolation model to obtain the derivative model of the specified spline curve interpolation model; the interpolation ratio of the first control point is calculated and substituted into the derivative model of the specified spline curve interpolation model to obtain the first direction vector; similarly, the interpolation ratio of the second control point is calculated and substituted into the derivative model of the specified spline curve interpolation model to obtain the second direction vector.
  • For another example, the first direction vector can be obtained by calculating, based on the coordinates of the second control point and the coordinates of the third control point, the unit direction vector of the line connecting the second control point and the third control point, where the third control point is the control point before and adjacent to the first control point; similarly, the second direction vector can be obtained by calculating, based on the coordinates of the fourth control point and the coordinates of the first control point, the unit direction vector of the line connecting the fourth control point and the first control point, where the fourth control point is the control point after and adjacent to the second control point. The manners of acquisition are not listed one by one in this embodiment of the present invention.
  • The method for calculating the interpolation ratio of the first control point and the interpolation ratio of the second control point is similar to the method for calculating the interpolation ratio of the current distance of the target object between the first control point and the second control point; for details, refer to step 206, which is not elaborated here in the embodiment of the present invention.
  • Optionally, calculating the unit direction vector of the line connecting the second control point and the third control point based on the coordinates of the second control point and the coordinates of the third control point to obtain the first direction vector may include: subtracting the coordinates of the third control point from the coordinates of the second control point to obtain the direction vector between the second control point and the third control point, and normalizing the direction vector between the second control point and the third control point into a unit vector to obtain the first direction vector. The method for calculating the second direction vector is similar, and the embodiment of the present invention will not elaborate on it.
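A minimal sketch of this subtraction-and-normalization (the function name and tuple representation are assumptions for illustration):

```python
# Sketch of the second manner described in step 205: the unit direction vector
# of the line between two control points, obtained by coordinate subtraction
# followed by normalization.
import math

def unit_direction(from_point, to_point):
    """Return the unit vector pointing from from_point toward to_point."""
    delta = tuple(t - f for f, t in zip(from_point, to_point))
    norm = math.sqrt(sum(d * d for d in delta))
    return tuple(d / norm for d in delta)

# e.g. first direction vector: unit_direction(third_control_point, second_control_point)
```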
  • In step 206, based on the first control point, the second control point, the first direction vector, and the second direction vector, the motion trajectory of the target object on the motion track is determined through the specified spline curve interpolation model.
  • Optionally, the terminal determines, based on the first control point and the second control point, the interpolation ratio of the current moving distance of the target object between the first control point and the second control point; determines the distance between the first control point and the second control point to obtain the control point distance; and determines, based on the first control point, the second control point, the interpolation ratio of the current moving distance of the target object between the first control point and the second control point, the control point distance, the first direction vector, and the second direction vector, the motion trajectory of the current motion of the target object through the following specified spline curve interpolation model:
  • P(U) = P_{i-1} + λEU + (3P_i - 3P_{i-1} - λF - 2λE)U² + (-2P_i + 2P_{i-1} + λF + λE)U³
  • In the above formula, P(U) is the motion trajectory of the current motion of the target object, U is the interpolation ratio of the current moving distance of the target object between the first control point and the second control point, P_{i-1} is the starting point of the motion trajectory of the current motion, that is, the first control point, P_i is the end point of the motion trajectory of the current motion of the target object, that is, the second control point, λ is the distance between the first control point and the second control point, that is, the control point distance, E is the unit direction vector at the first control point, that is, the first direction vector, and F is the unit direction vector at the second control point, that is, the second direction vector.
  • Optionally, the specific operation of determining, by the terminal, the interpolation ratio of the current distance of the target object between the first control point and the second control point based on the first control point and the second control point may include: subtracting the length ratio of the first control point from the length ratio of the current distance of the target object to obtain a first ratio; subtracting the length ratio of the first control point from the length ratio of the second control point to obtain a second ratio; and dividing the first ratio by the second ratio to obtain the interpolation ratio of the current distance of the target object between the first control point and the second control point.
  • In addition, the control point distance can be calculated based on the coordinates of the first control point and the coordinates of the second control point; for the specific calculation method, reference may be made to the related art, which is not described in detail in the embodiment of the present invention.
  • Continuing the above example, subtracting the length ratio 45% of the first control point P5 from the length ratio 50% of the current distance of the target object gives a first ratio of 5%; subtracting the length ratio 45% of the first control point P5 from the length ratio 60% of the second control point P6 gives a second ratio of 15%; and dividing the first ratio 5% by the second ratio 15% gives an interpolation ratio of 33% for the current distance of the target object between the first control point P5 and the second control point P6.
  • The distance between the first control point P5 and the second control point P6 is calculated to be 15 meters, that is, the control point distance is 15 meters. Then, based on the first control point P5 (0, 0, 45), the second control point P6 (0, 0, 60), the interpolation ratio 33% of the current moving distance of the target object between the first control point and the second control point, the control point distance 15, the first direction vector (0, 0, 1), and the second direction vector (0, 0, 1), the position of the target object at the current moving distance is determined through the above specified spline curve interpolation model to be (0, 0, 49.95), thereby determining the motion trajectory of the target object, that is, the position (0, 0, 49.95) of the target object at the current moving distance between the first control point and the second control point shown in FIG. 4.
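The worked example can be reproduced with a short sketch that evaluates the specified spline curve interpolation model directly (the helper function below is illustrative, not code from the patent):

```python
# Reproducing the example: P5 = (0, 0, 45), P6 = (0, 0, 60), control point
# distance 15, both direction vectors (0, 0, 1), interpolation ratio 0.33.

def specified_model(p_prev, p_curr, lam, e, f, u):
    """Evaluate P(U) = P_{i-1} + lam*E*U + (3P_i - 3P_{i-1} - lam*F - 2*lam*E)*U^2
    + (-2P_i + 2P_{i-1} + lam*F + lam*E)*U^3, component-wise."""
    point = []
    for pp, pc, ei, fi in zip(p_prev, p_curr, e, f):
        c2 = 3 * pc - 3 * pp - lam * fi - 2 * lam * ei
        c3 = -2 * pc + 2 * pp + lam * fi + lam * ei
        point.append(pp + lam * ei * u + c2 * u ** 2 + c3 * u ** 3)
    return tuple(point)

p5, p6 = (0.0, 0.0, 45.0), (0.0, 0.0, 60.0)
print(specified_model(p5, p6, 15.0, (0.0, 0.0, 1.0), (0.0, 0.0, 1.0), 0.33))
# -> approximately (0.0, 0.0, 49.95), matching the value given above
```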
  • When the actual motion trajectory of the target object is a straight line, the line generated by the specified spline curve interpolation model is a straight line; therefore, no matter what value U takes between 0 and 1, the phenomenon that the motion trajectory of the target object becomes a curve does not occur, which improves the accuracy of determining the motion trajectory. The generated line goes directly from A to B, and the target moving object also moves directly from A to B, so the fallback phenomenon shown in FIG. 3 does not occur.
  • In addition, the rate of change of the spline curve generated by the CatmullRom spline curve algorithm is not constant, that is, the speed of the target object's motion determined through the CatmullRom spline curve algorithm also changes. As shown in the figure, the gaps between the points from A' to B change from small to large, that is, for a constant specified duration, the speed at which the target object moves between A' and B changes from slow to fast, whereas the speed of the target object's movement needs to be constant, that is, the rate of change of the spline curve must be constant.
  • the target object motion trajectory determining method may be applied to a scenario of a network game that interacts with an application server.
  • Optionally, the system for determining the motion trajectory may include a terminal and an application server, where the terminal and the application server are connected through a network. The terminal can determine the motion trajectory of the target object based on the foregoing method and can synchronize the motion trajectory of the target object to the application server, so that the application server performs a specified operation; for example, when the target object reaches the end point of the motion track, the application server can reward the user corresponding to the terminal.
  • Certainly, the motion trajectory determining method can also be applied to other scenarios, for example, a scenario in which the terminal runs a standalone game, that is, the system for determining the motion trajectory includes only the terminal.
  • the embodiment of the present invention does not specifically limit the scenario for determining the motion track.
  • According to the method provided in this embodiment of the present invention, when the actual motion trajectory of the target object is a straight line, the first direction vector and the second direction vector are the same, the higher-order terms in the specified spline curve interpolation model cancel out, and the generated actual motion trajectory of the target object is a straight line rather than a curve, so that the actual motion trajectory of the target object is the same as the theoretical motion trajectory, which improves the accuracy of determining the motion trajectory of the target object. In addition, the rate of change of the motion trajectory determined through the specified spline curve interpolation model is constant, so the phenomenon that the speed of the target object is not constant does not occur.
  • FIG. 7 is a schematic diagram of a target object motion trajectory determining apparatus according to an embodiment of the present invention.
  • the apparatus includes:
  • a first acquiring module 701 configured to acquire, according to a current position of the target object on the motion track, a first control point and a second control point on the motion track, where the first control point and the second control point are adjacent control points;
  • a second obtaining module 702, configured to obtain a first direction vector and a second direction vector, where the first direction vector is a unit direction vector at the first control point and the second direction vector is a unit direction vector at the second control point; and
  • a first determining module 703, configured to determine a motion trajectory of the target object on the motion track through a specified spline curve interpolation model based on the first control point, the second control point, the first direction vector, and the second direction vector.
  • the apparatus may further include:
  • a third obtaining module 704 configured to acquire a training spline interpolation model
  • a second determining module 705, configured to determine a training distance, a first training vector, and a second training vector based on a first training control point, a second training control point, a third training control point, and a fourth training control point, where the first training control point, the second training control point, the third training control point, and the fourth training control point may be arranged in sequence; and
  • the third determining module 706 is configured to determine a specified spline interpolation model based on the second training control point, the third training control point, the training distance, the first training vector, the second training vector, and the training spline interpolation model.
  • Optionally, the second determining module 705 can include:
  • a first determining unit, configured to determine the distance between the second training control point and the third training control point as the training distance;
  • a second determining unit, configured to determine the unit direction vector of the line connecting the first training control point and the third training control point as the first training vector; and
  • a third determining unit, configured to determine the unit direction vector of the line connecting the second training control point and the fourth training control point as the second training vector.
  • the third determining module 706 can include:
  • a fourth determining unit configured to determine each parameter of the training spline curve interpolation model based on the second training control point, the third training control point, the training distance, the first training vector, and the second training vector;
  • the fifth determining unit is configured to determine a specified spline interpolation model based on each parameter of the training spline interpolation model and the training spline interpolation model.
  • the first determining module 703 may include:
  • a sixth determining unit configured to determine, according to the first control point and the second control point, an interpolation ratio of a current moving distance of the target object on the motion track between the first control point and the second control point;
  • a seventh determining unit configured to determine a distance between the first control point and the second control point, to obtain a control point distance
  • an eighth determining unit, configured to determine, based on the first control point, the second control point, the interpolation ratio of the current moving distance of the target object on the motion track between the first control point and the second control point, the control point distance, the first direction vector, and the second direction vector, the motion trajectory of the target object through the following specified spline curve interpolation model:
  • P(U) = P_{i-1} + λEU + (3P_i - 3P_{i-1} - λF - 2λE)U² + (-2P_i + 2P_{i-1} + λF + λE)U³
  • where P(U) is the motion trajectory of the current motion of the target object, U is the interpolation ratio of the current moving distance of the target object on the motion track between the first control point and the second control point, P_{i-1} is the first control point, P_i is the second control point, λ is the control point distance, E is the first direction vector, and F is the second direction vector.
  • Optionally, the apparatus may further include a module configured such that, when the first direction vector is equal to the second direction vector, the specified spline curve interpolation model is P(U) = P_{i-1} + λVU and the motion trajectory of the target object on the motion track is a straight line, where V is the first direction vector or the second direction vector.
  • According to the apparatus provided in this embodiment of the present invention, when the actual motion trajectory of the target object is a straight line, the first direction vector and the second direction vector are the same, the higher-order terms in the specified spline curve interpolation model cancel out, and the generated actual motion trajectory of the target object is a straight line rather than a curve, so that the actual motion trajectory of the target object is the same as the theoretical motion trajectory, which improves the accuracy of determining the motion trajectory of the target object.
  • It should be noted that the target object motion trajectory determining apparatus provided in the above embodiment is illustrated only by the division of the above functional modules when the motion trajectory is determined. In practical applications, these functions may be assigned to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above.
  • In addition, the target object motion trajectory determining apparatus provided in the above embodiment belongs to the same concept as the target object motion trajectory determining method embodiment; for the specific implementation process, refer to the method embodiment, and details are not described herein again.
  • FIG. 9 is a structural block diagram of a target object motion trajectory determining apparatus according to an embodiment of the present invention.
  • The apparatus may be a terminal. The terminal 900 may include a communication unit 910, a memory 920 including one or more computer readable storage media, an input unit 930, a display unit 940, a sensor 950, an audio circuit 960, a wireless communication unit 970 such as a WiFi (Wireless Fidelity) module, a processor 980 including one or more processing cores, a power supply 990, and other components.
  • A person skilled in the art can understand that the terminal structure shown in FIG. 9 does not constitute a limitation on the terminal, and the terminal may include more or fewer components than those illustrated, combine some components, or have a different component arrangement.
  • the communication unit 910 can be used for transmitting and receiving information and receiving and transmitting signals during a call.
  • the communication unit 910 can be an RF (Radio Frequency) circuit, a router, a modem, or the like.
  • When the communication unit is an RF circuit, it includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), and the like.
  • the communication unit 910 can also communicate with the network and other devices through wireless communication.
  • The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System of Mobile communication), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (Short Messaging Service), and so on.
  • the memory 920 can be used to store software programs and modules, and the processor 980 executes various functional applications and data processing by running software programs and modules stored in the memory 920.
  • the memory 920 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may be stored according to The data created by the use of the terminal 900 (such as audio data, phone book, etc.) and the like.
  • memory 920 can include high speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, memory 920 may also include a memory controller to provide access to memory 920 by processor 980 and input unit 930.
  • the input unit 930 can be configured to receive input numeric or character information and to generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function controls.
  • the input unit 930 can include a touch sensitive surface 931 as well as other input devices 932.
  • a touch-sensitive surface 931 also referred to as a touch display or trackpad, can collect touch operations on or near the user (eg, the user uses a finger, stylus, etc., any suitable object or accessory on the touch-sensitive surface 931 or The operation near the touch-sensitive surface 931) and drive the corresponding connecting device according to a preset program.
  • Optionally, the touch-sensitive surface 931 can include two parts: a touch detection device and a touch controller.
  • The touch detection device detects the touch orientation of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts the touch information into contact coordinates, sends them to the processor 980, and can receive commands from the processor 980 and execute them.
  • the touch sensitive surface 931 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic waves.
  • the input unit 930 can also include other input devices 932.
  • other input devices 932 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackballs, mice, joysticks, and the like.
  • Display unit 940 can be used to display information entered by the user or information provided to the user and various graphical user interfaces of terminal 900, which can be constructed from graphics, text, icons, video, and any combination thereof.
  • The display unit 940 can include a display panel 941. Optionally, the display panel 941 may be configured in the form of an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like.
  • Further, the touch-sensitive surface 931 can cover the display panel 941; when the touch-sensitive surface 931 detects a touch operation on or near it, the touch operation is transmitted to the processor 980 to determine the type of the touch event, and the processor 980 then provides a corresponding visual output on the display panel 941 according to the type of the touch event.
  • Although the touch-sensitive surface 931 and the display panel 941 are implemented as two separate components to implement input and output functions, in some embodiments the touch-sensitive surface 931 can be integrated with the display panel 941 to implement the input and output functions.
  • Terminal 900 can also include at least one type of sensor 950, such as a light sensor, motion sensor, and other sensors.
  • the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 941 according to the brightness of the ambient light, and the proximity sensor may turn off the display panel 941 and/or the backlight when the terminal 900 moves to the ear.
  • As one type of motion sensor, a gravity acceleration sensor can detect the magnitude of acceleration in all directions (usually three axes) and can detect the magnitude and direction of gravity when it is stationary.
  • The terminal 900 can also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described herein again.
  • An audio circuit 960, a speaker 961, and a microphone 962 can provide an audio interface between the user and the terminal 900.
  • The audio circuit 960 can transmit an electrical signal converted from received audio data to the speaker 961, and the speaker 961 converts it into a sound signal for output. Conversely, the microphone 962 converts a collected sound signal into an electrical signal, which the audio circuit 960 receives and converts into audio data; the audio data is then output to the processor 980 for processing and, for example, transmitted to another terminal via the communication unit 910, or output to the memory 920 for further processing.
  • the audio circuit 960 may also include an earbud jack to provide communication of the peripheral earphones with the terminal 900.
  • the terminal may be configured with a wireless communication unit 970, which may be a WIFI module.
  • the WIFI is a short-range wireless transmission technology, and the terminal 900 can help the user to send and receive emails, browse web pages, and access streaming media through the wireless communication unit 970, which provides wireless broadband Internet access for users.
  • Although the wireless communication unit 970 is shown in the drawing, it can be understood that it is not an essential part of the terminal 900 and may be omitted as needed without changing the essence of the invention.
  • The processor 980 is the control center of the terminal 900; it connects various parts of the entire terminal by using various interfaces and lines, and performs various functions of the terminal 900 and processes data by running or executing software programs and/or modules stored in the memory 920 and invoking data stored in the memory 920, thereby performing overall monitoring of the terminal.
  • the processor 980 may include one or more processing cores; preferably, the processor 980 may integrate an application processor and a modem processor, where the application processor mainly processes an operating system, a user interface, an application, and the like.
  • the modem processor primarily handles wireless communications. It will be appreciated that the above described modem processor may also not be integrated into the processor 980.
  • The terminal 900 also includes a power supply 990 (such as a battery) that supplies power to the various components. Preferably, the power supply can be logically coupled to the processor 980 through a power management system, so that functions such as charging, discharging, and power consumption management are performed through the power management system. The power supply 990 may also include any one or more of a DC or AC power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
  • the terminal 900 may further include a camera, a Bluetooth module, and the like, and details are not described herein again.
  • Specifically, in this embodiment, the terminal further includes one or more programs, the one or more programs being stored in the memory and configured to be executed by one or more processors, and the one or more programs contain instructions for performing the target object motion trajectory determining method provided by the embodiment of the present invention, including:
  • acquiring, based on the current position of the target object on the motion track, a first control point and a second control point on the motion track, the first control point and the second control point being adjacent control points; acquiring a first direction vector and a second direction vector, the first direction vector being a unit direction vector at the first control point and the second direction vector being a unit direction vector at the second control point; and determining, based on the first control point, the second control point, the first direction vector, and the second direction vector, the motion trajectory of the target object on the motion track through a specified spline curve interpolation model.
  • Optionally, before determining the motion trajectory of the target object on the motion track through the specified spline curve interpolation model based on the first control point, the second control point, the first direction vector, and the second direction vector, the method further includes: obtaining a spline curve interpolation model to be trained;
  • determining a training distance, a first training vector, and a second training vector based on a first training control point, a second training control point, a third training control point, and a fourth training control point, where the first training control point, the second training control point, the third training control point, and the fourth training control point may be arranged in sequence; and
  • determining the specified spline curve interpolation model based on the second training control point, the third training control point, the training distance, the first training vector, the second training vector, and the training spline curve interpolation model.
  • determining the training distance, the first training vector, and the second training vector based on the first training control point, the second training control point, the third training control point, and the fourth training control point may include:
  • determining the distance between the second training control point and the third training control point as the training distance; determining a unit direction vector of the line connecting the first training control point and the third training control point as the first training vector; and determining a unit direction vector of the line connecting the second training control point and the fourth training control point as the second training vector.
  • determining the specified spline interpolation model based on the second training control point, the third training control point, the training distance, the first training vector, the second training vector, and the spline curve interpolation model to be trained may include:
  • determining each parameter of the spline curve interpolation model to be trained based on the second training control point, the third training control point, the training distance, the first training vector, and the second training vector; and determining the specified spline curve interpolation model based on each parameter and the spline curve interpolation model to be trained.
  • Optionally, determining, based on the first control point, the second control point, the first direction vector, and the second direction vector, the motion trajectory of the target object on the motion track through the specified spline curve interpolation model may include: determining, based on the first control point and the second control point, an interpolation ratio of the current motion distance between the first control point and the second control point; determining the distance between the first control point and the second control point to obtain a control point distance; and determining the motion trajectory of the target object through the following specified spline curve interpolation model:
  • P(U) = P_{i-1} + λEU + (3P_i - 3P_{i-1} - λF - 2λE)U² + (-2P_i + 2P_{i-1} + λF + λE)U³
  • where P(U) is the motion trajectory of the target object, U is the interpolation ratio of the current motion distance between the first control point and the second control point, P_{i-1} is the first control point, P_i is the second control point, λ is the control point distance, E is the first direction vector, and F is the second direction vector.
  • Optionally, the method may further include: when the first direction vector is equal to the second direction vector, the specified spline curve interpolation model is P(U) = P_{i-1} + λVU, and the motion trajectory of the target object on the motion track is a straight line, where V is the first direction vector or the second direction vector.
  • When the actual motion trajectory of the target object is a straight line, the first direction vector and the second direction vector are the same, the higher-order terms in the specified spline curve interpolation model cancel out, and the generated actual motion trajectory of the target object is a straight line rather than a curve, so that the actual motion trajectory of the target object is the same as the theoretical motion trajectory, which improves the accuracy of determining the motion trajectory of the target object.
  • a person skilled in the art may understand that all or part of the steps for implementing the above embodiments may be completed by hardware, or may be completed by a program instructing the related hardware; the program may be stored in a computer readable storage medium.
  • the storage medium mentioned may be a read only memory, a magnetic disk or an optical disk or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Manipulator (AREA)
  • Processing Or Creating Images (AREA)
  • Image Analysis (AREA)

Abstract

A method, an apparatus, and a storage medium for determining a motion trajectory of a target object. The method includes: acquiring, based on a current position of a target object on a motion track, a first control point and a second control point on the motion track, the first control point and the second control point being adjacent control points (101); acquiring a first direction vector and a second direction vector, the first direction vector being a unit direction vector at the first control point and the second direction vector being a unit direction vector at the second control point (102); and determining, based on the first control point, the second control point, the first direction vector, and the second direction vector, a motion trajectory of the target object on the motion track through a specified spline curve interpolation model (103). Through the specified spline curve interpolation model, when the actual trajectory is a straight line, the generated line is a straight line rather than a curve, which improves the accuracy of determining the motion trajectory of the target object.

Description

Method, apparatus and storage medium for determining a motion trajectory of a target object
This application claims priority to Chinese Patent Application No. 201510354868.6, filed with the Chinese Patent Office on June 24, 2015 and entitled "Method and apparatus for determining a game motion trajectory", which is incorporated herein by reference in its entirety.
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a method, an apparatus, and a storage medium for determining a motion trajectory of a target object.
Background
With the rapid development of network technologies, more and more mobile game applications have appeared, such as various parkour games. When using such an application, a user can select a character in the application, and the selected character can be referred to as a target object. The terminal may then determine a motion trajectory of the target object on a motion track based on the user's control of the target object, and thereby control the motion of the target object through the navigation module of the application.
Summary
Embodiments of the present invention provide a method, an apparatus, and a storage medium for determining a motion trajectory of a target object.
In one aspect, a method for determining a motion trajectory of a target object is provided, the method including:
acquiring, based on a current position of a target object on a motion track, a first control point and a second control point on the motion track, the first control point and the second control point being adjacent control points;
acquiring a first direction vector and a second direction vector, the first direction vector being a unit direction vector at the first control point and the second direction vector being a unit direction vector at the second control point; and
determining, based on the first control point, the second control point, the first direction vector, and the second direction vector, a motion trajectory of the target object on the motion track through a specified spline curve interpolation model.
In another aspect, an apparatus for determining a motion trajectory of a target object is provided, the apparatus including one or more processors and a storage medium storing operation instructions, where, when the operation instructions in the storage medium are run, the processors perform the following steps:
acquiring, based on a current position of a target object on a motion track, a first control point and a second control point on the motion track, the first control point and the second control point being adjacent control points;
acquiring a first direction vector and a second direction vector, the first direction vector being a unit direction vector at the first control point and the second direction vector being a unit direction vector at the second control point; and
determining, based on the first control point, the second control point, the first direction vector, and the second direction vector, a motion trajectory of the target object on the motion track through a specified spline curve interpolation model.
In still another aspect, a non-transitory computer readable storage medium storing computer executable instructions is provided; when the executable instructions are run in a computer, the following steps are performed:
acquiring, based on a current position of a target object on a motion track, a first control point and a second control point on the motion track, the first control point and the second control point being adjacent control points;
acquiring a first direction vector and a second direction vector, the first direction vector being a unit direction vector at the first control point and the second direction vector being a unit direction vector at the second control point; and
determining, based on the first control point, the second control point, the first direction vector, and the second direction vector, a motion trajectory of the target object on the motion track through a specified spline curve interpolation model.
According to the technical solutions provided in the embodiments of this application, when the actual motion trajectory of the target object is a straight line, the first direction vector and the second direction vector are the same, the higher-order terms in the specified spline curve interpolation model cancel out, and the generated actual motion trajectory of the target object is a straight line rather than a curve, so that the actual motion trajectory of the target object is the same as the theoretical motion trajectory, which improves the accuracy of determining the motion trajectory of the target object.
Brief Description of the Drawings
FIG. 1 is a flowchart of a method for determining a motion trajectory of a target object according to an embodiment of the present invention;
FIG. 2 is a flowchart of another method for determining a motion trajectory of a target object according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a line generated by the CatmullRom spline curve algorithm according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a target object motion trajectory interface according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of another target object motion trajectory determining interface according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a line generated by a specified spline curve interpolation model according to an embodiment of the present invention;
FIG. 7 is a schematic structural diagram of an apparatus for determining a motion trajectory of a target object according to an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of another apparatus for determining a motion trajectory of a target object according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of still another apparatus for determining a motion trajectory of a target object according to an embodiment of the present invention.
Detailed Description
To make the objectives, technical solutions, and advantages of the present invention clearer, the implementations of the present invention are described in further detail below with reference to the accompanying drawings.
FIG. 1 is a flowchart of a method for determining a motion trajectory of a target object according to an embodiment of the present invention. The method can be applied to a terminal, and the terminal may be a mobile phone, a tablet computer, a palmtop computer, or the like. Referring to FIG. 1, the method includes steps 101 to 103.
In step 101, based on a current position of a target object on a motion track, a first control point and a second control point on the motion track are acquired, the first control point and the second control point being adjacent control points.
In step 102, a first direction vector and a second direction vector are acquired, the first direction vector being a unit direction vector at the first control point and the second direction vector being a unit direction vector at the second control point.
In step 103, based on the first control point, the second control point, the first direction vector, and the second direction vector, a motion trajectory of the target object on the motion track is determined through a specified spline curve interpolation model.
According to the technical solution provided in this embodiment of the present invention, when the actual motion trajectory of the target object is a straight line, the first direction vector and the second direction vector are the same, the higher-order terms in the specified spline curve interpolation model cancel out, and the generated actual motion trajectory of the target object is a straight line rather than a curve, so that the actual motion trajectory of the target object is the same as the theoretical motion trajectory, which improves the accuracy of determining the motion trajectory of the target object.
可选地,基于第一控制点、第二控制点、第一方向向量和第二方向向量,通过指定样条曲线插值模型,确定目标对象当前在该运动轨道上的运动轨迹之前,还可以包括:获取待训练样条曲线插值模型;
基于第一训练控制点、第二训练控制点、第三训练控制点和第四训练控制点，确定训练距离、第一训练向量和第二训练向量，第一训练控制点、第二训练控制点、第三训练控制点和第四训练控制点为训练轨道上依次排列的控制点；
基于第二训练控制点、第三训练控制点、训练距离、第一训练向量、第二训练向量和训练样条曲线插值模型,确定指定样条曲线插值模型。
可选地,基于第一训练控制点、第二训练控制点、第三训练控制点和第四训练控制点,确定训练距离、第一训练向量和第二训练向量,可以包括:
将第二训练控制点与第三训练控制点之间的距离,确定为训练距离;
将第一训练控制点与第三训练控制点之间连线的单位方向向量,确定为第一训练向量;
将第二训练控制点与第四训练控制点之间连线的单位方向向量，确定为第二训练向量。
可选地,基于第二训练控制点、第三训练控制点、训练距离、第一训练向量、第二训练向量和训练样条曲线插值模型,确定指定样条曲线插值模型,可以包括:
基于第二训练控制点、第三训练控制点、训练距离、第一训练向量和第二训练向量,确定训练样条曲线插值模型的各个参数;
基于训练样条曲线插值模型的各个参数和训练样条曲线插值模型,确定指定样条曲线插值模型。
可选地,基于第一控制点、第二控制点、第一方向向量和第二方向向量,通过指定样条曲线插值模型,确定目标对象当前在该运动轨道上的运动轨迹,可以包括:
基于第一控制点和第二控制点,确定该当前运动距离在第一控制点与第二控制点之间的插值比例;
确定第一控制点与第二控制点之间的距离,得到控制点距离;
基于第一控制点、第二控制点、该当前运动距离在第一控制点与第二控制点之间的插值比例、控制点距离、第一方向向量和第二方向向量,通过如下的指定样条曲线插值模型,确定目标对象当前运动的运动轨迹;
P(U)=Pi-1+λEU+(3Pi-3Pi-1-λF-2λE)U²+(-2Pi+2Pi-1+λF+λE)U³
其中,P(U)为目标对象的运动轨迹,U为该当前运动距离在第一控制点与第二控制点之间的插值比例,Pi-1为第一控制点,Pi为第二控制点,λ为控制点距离,E为第一方向向量,F为第二方向向量。
可选地,该方法还可以包括:
当第一方向向量与第二方向向量相等时,该指定样条曲线插值模型为:P(U)=Pi-1+λVU,目标对象在运动轨道上的运动轨迹为直线,其中,V为第一方向向量或者第二方向向量。
上述所有可选技术方案,均可按照任意结合形成本发明的可选实施例,本发明实施例对此不再一一赘述。
图2是本发明实施例提供的一种目标对象运动轨迹确定方法流程图。该方法应用于终端中,该终端可以为手机、平板电脑、掌上电脑等,参见图2,该方法包括步骤201至步骤206。
在步骤201中,获取训练样条曲线插值模型。
当目标对象的实际运动轨迹为直线时，通过CatmullRom样条曲线算法生成的线条为曲线，而并非直线。比如，如图3所示，当目标对象的实际运动轨迹为从A至B的直线时，通过CatmullRom样条曲线算法生成的线条是先从A至A'、再从A'至B的线条，而对于目标运动对象，则需要先从A回退到A'，再从A'运动至B。因此，在本发明实施例中，确定目标对象的运动轨迹之前，可以重新训练并生成一个指定样条曲线插值模型，使指定样条曲线插值模型在目标对象的实际运动轨迹为直线时生成的线条也为直线。而在训练指定样条曲线插值模型之前，先获取训练样条曲线插值模型。
比如,训练样条曲线插值模型可以为:
P(U)=C0+C1U+C2U²+C3U³
其中，U为插值比例，且为0-1的浮点数，P(U)为运动轨迹，C0、C1、C2和C3为训练样条曲线插值模型的各个参数，且当U指定时，P(U)为该运动轨迹上的一个点，比如，当U为0时，P(0)为该运动轨迹的起点，当U为1时，P(1)为该运动轨迹的终点。
在步骤202中,基于第一训练控制点、第二训练控制点、第三训练控制点和第四训练控制点,确定训练距离、第一训练向量和第二训练向量,第一训练控制点、第二训练控制点、第三训练控制点和第四训练控制点为训练轨道上依次排列的控制点。
可以基于训练运动轨迹得到指定样条曲线插值模型。获取第一训练控制点、第二训练控制点、第三训练控制点和第四训练控制点,第二训练控制点为训练运动轨迹的起点,第三训练控制点为该训练运动轨迹的终点,第一训练控制点为第二训练控制点之前,且与第二训练控制点相邻的控制点,第四训练控制点为第三训练控制点之后,且与第三训练控制点相邻的控制点。将第二训练控制点与第三训练控制点之间的距离,确定为该训练距离;将第一训练控制点与第三训练控制点之间连线的单位方向向量,确定为第一训练向量;将第二训练控制点与第四训练控制点之间连线的单位方向向量,确定为第二训练向量。
比如，第一训练控制点为Pi-2，第二训练控制点为Pi-1，第三训练控制点为Pi，第四训练控制点为Pi+1，第二训练控制点与第三训练控制点之间的距离为λ，也即是，该训练距离为λ，第一训练控制点与第三训练控制点之间连线的单位方向向量为E，也即是，第一训练向量为E，第二训练控制点与第四训练控制点之间连线的单位方向向量为F，也即是，第二训练向量为F。
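为便于理解，下面给出一个极简的Python示意代码（仅为说明性草图，并非本发明实施例的具体实现；其中对numpy库的使用、函数名training_quantities及参数命名均为示例性假设），用于根据依次排列的四个训练控制点计算训练距离λ、第一训练向量E和第二训练向量F：

```python
import numpy as np

def training_quantities(p_prev2, p_prev1, p_cur, p_next):
    """由依次排列的四个训练控制点Pi-2、Pi-1、Pi、Pi+1，计算训练距离λ、第一训练向量E和第二训练向量F。"""
    p_prev2, p_prev1, p_cur, p_next = (
        np.asarray(p, dtype=float) for p in (p_prev2, p_prev1, p_cur, p_next))
    lam = np.linalg.norm(p_cur - p_prev1)                        # λ：第二与第三训练控制点之间的距离
    e = (p_cur - p_prev2) / np.linalg.norm(p_cur - p_prev2)      # E：第一至第三训练控制点连线的单位方向向量
    f = (p_next - p_prev1) / np.linalg.norm(p_next - p_prev1)    # F：第二至第四训练控制点连线的单位方向向量
    return lam, e, f
```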
在步骤203中,基于第二训练控制点、第三训练控制点、训练距离、第一训练向量、第二训练向量和训练样条曲线插值模型,确定指定样条曲线插值模型。
当确定训练样条曲线插值模型的各个参数后,基于训练样条曲线插值模型的各个参数和待训练样条曲线插值模型,确定指定样条曲线插值模型。
上述步骤201中提到，P(0)为运动轨迹的起点，P(1)为运动轨迹的终点，因此，对于第二训练控制点与第三训练控制点之间的训练运动轨迹，P(0)为第二训练控制点Pi-1，P(1)为第三训练控制点Pi。另外，第二训练控制点处的方向为第三训练控制点与第一训练控制点之间的连线所在的方向，第三训练控制点处的方向为第四训练控制点与第二训练控制点之间的连线所在的方向。而为了获取第二训练控制点处的方向向量和第三训练控制点处的方向向量，对训练样条曲线插值模型进行导数运算，得到训练样条曲线插值模型的导数模型为P'(U)=C1+2C2U+3C3U²，而第二训练控制点处的方向向量为P'(0)，第三训练控制点处的方向向量为P'(1)。因此，可以得到P(0)=C0=Pi-1，P(1)=C0+C1+C2+C3=Pi，P'(0)=C1=λE，P'(1)=C1+2C2+3C3=λF，进而确定训练样条曲线插值模型的各个参数为：C0=Pi-1，C1=λE，C2=3Pi-3Pi-1-λF-2λE，C3=-2Pi+2Pi-1+λF+λE。
之后，可以将训练样条曲线插值模型的各个参数，代入训练样条曲线插值模型，得到指定样条曲线插值模型为：P(U)=Pi-1+λEU+(3Pi-3Pi-1-λF-2λE)U²+(-2Pi+2Pi-1+λF+λE)U³。
其中，P(U)为训练运动轨迹，U为插值比例，Pi-1为训练运动轨迹的起点，也即是，第二训练控制点，Pi为该训练运动轨迹的终点，也即是，第三训练控制点，λ为第二训练控制点与第三训练控制点之间的距离，也即是，训练距离，E为第一训练向量，F为第二训练向量。
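基于上述推导，下面给出一个极简的Python示意代码（仅为说明性草图，并非本发明实施例的具体实现；函数名specified_spline_coeffs、interpolate及变量命名均为示例性假设），用于由起点Pi-1、终点Pi以及两端单位方向向量E、F构造指定样条曲线插值模型的参数，并按P(U)计算插值点：

```python
import numpy as np

def specified_spline_coeffs(p_prev, p_cur, e_vec, f_vec):
    """计算指定样条曲线插值模型的参数C0、C1、C2、C3。"""
    p_prev, p_cur = np.asarray(p_prev, float), np.asarray(p_cur, float)
    e_vec, f_vec = np.asarray(e_vec, float), np.asarray(f_vec, float)
    lam = np.linalg.norm(p_cur - p_prev)                           # 控制点距离λ
    c0 = p_prev                                                    # C0 = Pi-1
    c1 = lam * e_vec                                               # C1 = λE
    c2 = 3 * p_cur - 3 * p_prev - lam * f_vec - 2 * lam * e_vec    # C2 = 3Pi-3Pi-1-λF-2λE
    c3 = -2 * p_cur + 2 * p_prev + lam * f_vec + lam * e_vec       # C3 = -2Pi+2Pi-1+λF+λE
    return c0, c1, c2, c3

def interpolate(coeffs, u):
    """按P(U)=C0+C1U+C2U²+C3U³计算插值比例为u时的位置。"""
    c0, c1, c2, c3 = coeffs
    return c0 + c1 * u + c2 * u ** 2 + c3 * u ** 3
```

可以验证，当E与F相同（记为V）时，C2与C3均为零向量，插值结果退化为直线Pi-1+λVU上的点，与下文关于直线情形的讨论一致。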
其中,通过上述步骤201-203可以训练得到指定样条曲线插值模型,另外,上述步骤201-203不仅可以在确定目标对象运动轨迹时执行,还可以事先通过上述步骤201-203的方法得到指定样条曲线插值模型,并存储该指定样条曲线插值模型,之后,确定目标对象运动轨迹时,可以直接获取该指定样条曲线插值模型,本发明实施例对确定指定样条曲线插值模型的时序不做具体限定。另外,确定指定样条曲线插值模型之后,可以通过指定样条曲线插值模型,生成目标对象所在运动轨道的样条曲线,该样条曲线上可以包括多个控制点。之后,终端还可以通过如下的步骤,确定目标对象在该运动轨道上的运动轨迹,具体如下。
在步骤204中,基于目标对象在运动轨道上的当前位置,获取该运动轨道上的第一控制点和第二控制点,第一控制点和第二控制点为相邻控制点。
在本发明实施例中，终端可以每隔指定时长，确定目标对象在运动轨道上的运动位置，从而得到该目标对象在该运动轨道上的运动轨迹，该指定时长是事先设置的，比如，指定时长可以为1秒，如此，基于该目标对象每秒在该运动轨道上的运动位置，可以使所确定的目标对象在该运动轨道上的运动轨迹比较光滑。
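结合下文所述目标对象在运动轨道上速度恒定的设定，下面给出一个极简的Python示意代码（仅为说明性草图，并非本发明实施例的具体实现；函数名sample_distances及参数均为示例性假设），用于示意每隔指定时长、基于恒定速度计算目标对象距轨道起点的当前距离：

```python
def sample_distances(track_length, speed, interval=1.0):
    """每隔指定时长interval，基于恒定速度speed计算目标对象在运动轨道上距起点的当前距离。"""
    distances, current = [], 0.0
    while current <= track_length:
        distances.append(current)        # 记录本次采样时距起点的距离
        current += speed * interval      # 恒定速度下，每个时长前进speed*interval
    return distances
```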
当该方法应用于网络游戏中时,为了保证网络游戏的显示效果,目标对象在运动轨道上的速度是恒定的,因此,当该目标对象从该运动轨道的起点开始运动时,每隔指定时长,可以基于该目标对象的速度,计算该目标对象在运动轨道上的当前位置。另外,如上述所述,该运动轨道上可以包括多个控制点,而在本发明实施例中,对于该多个控制点中的每个控制点,可以计算该控制点与该运动轨道的起点之间的距离,并将该距离除以该运动轨道的总长度,得到该控制点的长度比例,进而可以存储每个控制点与每个控制点的长度比例之间的对应关系。比如,该运动轨道包括10个控制点,该运动轨道的总长度为100米,且从第1个控制点到第10个控制点,每个控制点在该运动轨道的起点P1、10米处P2、20米处P3、35米处P4、45米处P5、60米处P6、70米处P7、80米处P8、90米处P9、终点P10,也即是,每个控制点与该运动轨道的起点之间的距离分别为0米、10米、20米、35米、45米、60米、70米、80米、90米、100米。将每个控制点与该运动轨道的起点之间的距离分别除以该运动轨道的总长度,得到每个控制点的长度比例为0%、10%、20%、35%、45%、60%、70%、80%、90%、100%,进而将每个控制点与每个控制点的长度比例存储在如下表1所示的控制点与长度比例之间的对应关系中。
表1
控制点 长度比例
P1 0%
P2 10%
P3 20%
P4 35%
P5 45%
P6 60%
P7 70%
P8 80%
P9 90%
P10 100%
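上述控制点与长度比例之间的对应关系，可以用如下极简的Python示意代码生成（仅为说明性草图，并非本发明实施例的具体实现；其中以相邻控制点间直线距离的累加近似各控制点距起点的距离，该近似以及函数名length_ratio_table均为示例性假设）：

```python
import numpy as np

def length_ratio_table(control_points):
    """计算每个控制点距轨道起点的距离占轨道总长度的比例，返回{控制点序号: 长度比例}的对应关系（对应上文表1）。"""
    pts = [np.asarray(p, dtype=float) for p in control_points]
    cumulative = [0.0]
    for a, b in zip(pts[:-1], pts[1:]):
        cumulative.append(cumulative[-1] + np.linalg.norm(b - a))   # 累加相邻控制点间距离
    total = cumulative[-1]                                           # 运动轨道的总长度
    return {i + 1: d / total for i, d in enumerate(cumulative)}      # 序号从1开始，对应P1~P10
```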
基于目标对象在运动轨道上的当前位置，获取该运动轨道上的第一控制点和第二控制点的操作可以包括：根据目标对象在运动轨道上的当前位置获取目标对象距离起点的距离，将该距离除以该运动轨道的总长度，得到目标对象当前距离的长度比例。获取控制点与长度比例之间的对应关系中存储的多个长度比例，将该多个长度比例与目标对象当前距离的长度比例进行比较，从该多个长度比例中，选择与目标对象当前距离的长度比例相邻的两个长度比例，进而可以将选择的两个长度比例中较小的长度比例对应的控制点确定为第一控制点，将选择的两个长度比例中较大的长度比例对应的控制点确定为第二控制点。
基于上述的例子,比如,目标对象在该运动轨道距离起点的距离为50米,将该距离50米除以该运动轨道的总长度100米,得到该当前运动距离的长度比例为50%。获取控制点与长度比例之间的对应关系中存储的长度比例,得到多个长度比例分别为0%、10%、20%、35%、45%、60%、70%、80%、90%、100%,将该多个长度比例与该当前运动距离的长度比例进行比较,从该多个长度比例中,选择与该当前运动距离的长度比例相邻的两个长度比例为45%和60%,进而将选择的两个长度比例中较小的长度比例45%对应的控制点P5确定为第一控制点,将选择的两个长度比例中较大的长度比例60%对应的控制点P6确定为第二控制点,也即是如图4所示,确定该运动轨道上的第5个控制点P5为第一控制点,确定该运动轨道上的第6个控制点P6为第二控制点。
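上述选取相邻控制点的过程，可以用如下极简的Python示意代码表示（仅为说明性草图，并非本发明实施例的具体实现；函数名find_adjacent_control_points及参数均为示例性假设）：

```python
from bisect import bisect_right

def find_adjacent_control_points(ratio_table, current_ratio):
    """从控制点与长度比例的对应关系中，选出与当前长度比例相邻的两个控制点：
    较小长度比例对应的控制点为第一控制点，较大者为第二控制点。"""
    items = sorted(ratio_table.items(), key=lambda kv: kv[1])   # 按长度比例升序排列
    ratios = [ratio for _, ratio in items]
    k = bisect_right(ratios, current_ratio)                     # 第一个大于当前比例的位置
    k = min(max(k, 1), len(ratios) - 1)                         # 边界保护
    return items[k - 1][0], items[k][0]                         # (第一控制点, 第二控制点)

# 例如：当前长度比例为50%时，返回长度比例45%对应的P5和60%对应的P6
```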
在步骤205中,获取第一方向向量和第二方向向量,第一方向向量为第一控制点处的单位方向向量,第二方向向量为第二控制点处的单位方向向量。
在本发明实施例中,终端可以通过多种方式获取第一方向向量和第二方向向量。比如,可以对指定样条曲线插值模型进行导数运算,得到指定样条曲线插值模型的导数模型,并计算第一控制点的插值比例,将第一控制点的插值比例代入指定样条曲线插值模型的导数模型,得到第一方向向量,同理,计算第二控制点的插值比例,将第二控制点的插值比例代入指定样条曲线插值模型的导数模型,得到第二方向向量。再比如,还可以基于第二控制点的坐标和第三 控制点的坐标,计算第二控制点与第三控制点之间的连线的单位方向向量,得到第一方向向量,第三控制点为第一控制点之前,且与第一控制点相邻的控制点;同理,可以基于第四控制点的坐标和第一控制点的坐标,计算第四控制点与第一控制点之间的连线的单位方向向量,得到第二方向向量,第四控制点为第二控制点之后,且与第二控制点相邻的控制点。本发明实施例对此不再一一列出。
在本发明实施例中，计算第一控制点的插值比例和第二控制点的插值比例的方法，与计算该目标对象的当前距离在第一控制点与第二控制点之间的插值比例方法相同，具体详见步骤206，本发明实施例在此不再进行详细阐述。另外，基于第二控制点的坐标和第三控制点的坐标，计算第二控制点与第三控制点之间的连线的单位方向向量，得到第一方向向量的方法可以包括：将第二控制点的坐标减去第三控制点的坐标，得到第二控制点与第三控制点之间的方向向量，将第二控制点与第三控制点之间的方向向量进行单位化，得到第一方向向量。计算第二方向向量的方法类似，本发明实施例对此同样不再进行详细阐述。
在步骤206中,基于第一控制点、第二控制点、第一方向向量和第二方向向量,通过指定样条曲线插值模型,确定目标对象在该运动轨道上的运动轨迹。
具体地,终端基于第一控制点和第二控制点,确定该目标对象的当前距离在第一控制点与第二控制点之间的插值比例;确定第一控制点与第二控制点之间的距离,得到控制点距离;并基于第一控制点、第二控制点、该目标对象的当前距离在第一控制点与第二控制点之间的插值比例、控制点距离、第一方向向量和第二方向向量,通过如下的指定样条曲线插值模型,确定目标对象当前运动的运动轨迹;
P(U)=Pi-1+λEU+(3Pi-3Pi-1-λF-2λE)U²+(-2Pi+2Pi-1+λF+λE)U³
其中,上式中的P(U)为目标对象的运动轨迹,U为该目标对象的当前运动距离在第一控制点与第二控制点之间的插值比例,Pi-1为该目标对象当前运动的运动轨迹的起点,即,第一控制点,Pi为该目标对象当前运动的运动轨迹的终点,即,第二控制点,λ为第一控制点与第二控制点之间的距离,即,控制点距离,E为第一控制点处的单位方向向量,即,第一方向向量,F为第二控 制点处的单位方向向量,即,第二方向向量。
其中,终端基于第一控制点和第二控制点,确定该目标对象的当前距离在第一控制点与第二控制点之间的插值比例的具体操作可以包括:将该目标对象的当前距离的长度比例减去第一控制点的长度比例,得到第一比例,将第二控制点的长度比例减去第一控制点的长度比例,得到第二比例,将第一比例除以第二比例,得到该目标对象的当前距离在第一控制点与第二控制点之间的插值比例。
另外,可以基于第一控制点的坐标和第二控制点的坐标计算控制点距离,具体的计算方法可以参考相关技术,本发明实施例对此不再进行详细阐述。
比如，将该目标对象的当前距离的长度比例50%减去第一控制点P5的长度比例45%，得到第一比例为5%，将第二控制点P6的长度比例60%减去第一控制点P5的长度比例45%，得到第二比例为15%，将第一比例5%除以第二比例15%，得到该目标对象的当前距离在第一控制点P5与第二控制点P6之间的插值比例为33%。假如，第一控制点P5的坐标为(0,0,45)，第二控制点P6的坐标为(0,0,60)，第一方向向量和第二方向向量均为(0,0,1)，计算第一控制点P5与第二控制点P6之间的距离为15米，也即是，控制点距离为15米，之后，基于第一控制点P5(0,0,45)、第二控制点P6(0,0,60)、该目标对象的当前运动距离在第一控制点与第二控制点之间的插值比例33%、该控制点距离15、第一方向向量(0,0,1)和第二方向向量(0,0,1)，通过上述的指定样条曲线插值模型，确定目标对象的当前运动距离所在的位置为(0,0,49.95)，进而确定目标对象的运动轨迹，也即是，从图4中的第一控制点P5与第二控制点P6之间确定目标对象的当前运动距离所在的位置(0,0,49.95)为a点，得到图5所示的界面示意图（实际应用中，各个控制点可能不会显示）。
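沿用上述示例数据，可以用如下极简的Python示意代码复现该插值计算（仅为说明性草图，并非本发明实施例的具体实现；其中插值比例按上文取33%即0.33计算）：

```python
import numpy as np

p5 = np.array([0.0, 0.0, 45.0])          # 第一控制点P5
p6 = np.array([0.0, 0.0, 60.0])          # 第二控制点P6
e = f = np.array([0.0, 0.0, 1.0])        # 第一方向向量和第二方向向量
lam = np.linalg.norm(p6 - p5)            # 控制点距离λ = 15
u = 0.33                                  # 当前运动距离在P5与P6之间的插值比例

c2 = 3 * p6 - 3 * p5 - lam * f - 2 * lam * e      # 本例中为零向量
c3 = -2 * p6 + 2 * p5 + lam * f + lam * e         # 本例中为零向量
position = p5 + lam * e * u + c2 * u ** 2 + c3 * u ** 3
print(position)                           # 约为[0, 0, 49.95]，与上文计算结果一致
```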
值得注意的是,当目标对象的实际运动轨迹为直线时,也即是,第一方向向量和第二方向向量相等时,将第一方向向量和第二方向向量统一用V表示,且Pi-Pi-1为第一控制点与第二控制点之间的距离,可以用λV表示,因此,上述指定样条曲线插值模型为:
P(U)=Pi-1+λEU+(3Pi-3Pi-1-λF-2λE)U²+(-2Pi+2Pi-1+λF+λE)U³
=Pi-1+λVU+(3λV-λV-2λV)U²+(-2λV+λV+λV)U³
=Pi-1+λVU+(3λV-3λV)U²+(-2λV+2λV)U³
=Pi-1+λVU
此时,该指定样条曲线插值模型为一条直线,因此,不管U为0-1中的任何值,均不会出现目标对象的运动轨迹为曲线的现象,提高了确定运动轨迹的准确率。如图6所示,当实际运动轨迹为从A至B的直线时,通过指定样条曲线插值模型,生成的线条是直接从A至B的线条,而对于目标运动对象,也是直接从A运动至B,不会出现图3所示的回退现象。
另外，当实际运动轨迹为直线时，由于CatmullRom样条曲线算法生成的样条曲线的变化率是变化的，也即是，通过CatmullRom样条曲线算法确定目标对象运动的速度也是变化的，如图3所示，从A'到B之间进行插值的点的间隙是从小到大变化的，也即是，对于恒定的指定时长，目标对象在A'到B之间运动的速度也是从慢到快变化的。而在导航系统中，目标对象运动的速度需要恒定，也即是，样条曲线的变化率必须恒定。而在本发明实施例中，对上述直线情况下的指定样条曲线插值模型P(U)=Pi-1+λVU进行导数运算之后，得到该指定样条曲线插值模型的导数模型为P'(U)=λV，为一常数，因此，通过本发明实施例中的指定样条曲线插值模型，生成的样条曲线的变化率恒定，也即是，目标对象运动的速度也是匀速的，不会出现目标对象运动速度不一致的现象。如图6所示，从A到B之间进行插值的点的间隙是相等的，也即是，对于恒定的指定时长，目标对象在A到B之间运动的速度也是恒定的。
需要说明的是,本发明实施例提供的目标对象运动轨迹确定方法可以应用于与应用服务器进行交互的网络游戏的场景中,此时,确定运动轨迹的系统可以包括终端和应用服务器,该终端和应用服务器之间通过网络连接,该终端可以基于上述方法确定目标对象的运动轨迹,并可以将目标对象的运动轨迹同步到应用服务器,使应用服务器执行指定操作,比如,当该运动轨迹达到运动轨道上的指定位置时,应用服务器可以向该终端对应的用户进行奖励等。当然,该运动轨迹确定方法还可以应用于其他的场景,比如,该终端单机使用网络游戏的场景,也即是,确定运动轨迹的系统仅包括该终端。本发明实施例对确定运动轨迹的场景不做具体限定。
根据本实施例所提供的技术方案,当目标对象的实际运动轨迹为直线时,第一方向向量和第二方向向量相同,指定样条曲线插值模型中的多次变量会抵消,所生成的目标对象的实际运动轨迹是直线,而并非曲线,进而使目标对象的实际运动轨迹与理论运动轨迹相同,提高了确定目标对象运动轨迹的准确率。另外,通过指定样条曲线插值模型确定的运动轨迹的速度是恒定的,不会出现目标对象的速度非恒定的现象。
图7是本发明实施例提供的一种目标对象运动轨迹确定装置,参见图7,该装置包括:
第一获取模块701,用于基于目标对象在运动轨道上的当前位置,获取该运动轨道上的第一控制点和第二控制点,第一控制点和第二控制点为相邻控制点;
第二获取模块702,用于获取第一方向向量和第二方向向量,第一方向向量为第一控制点处的单位方向向量,第二方向向量为第二控制点处的单位方向向量;
第一确定模块703,用于基于第一控制点、第二控制点、第一方向向量和第二方向向量,通过指定样条曲线插值模型,确定目标对象在该运动轨道上的运动轨迹。
可选地,参见图8,该装置还可以包括:
第三获取模块704,用于获取训练样条曲线插值模型;
第二确定模块705，用于基于第一训练控制点、第二训练控制点、第三训练控制点和第四训练控制点，确定训练距离、第一训练向量和第二训练向量，第一训练控制点、第二训练控制点、第三训练控制点和第四训练控制点为训练轨道上依次排列的控制点；
第三确定模块706,用于基于第二训练控制点、第三训练控制点、训练距离、第一训练向量、第二训练向量和训练样条曲线插值模型,确定指定样条曲线插值模型。
可选地,第二确定模块705可以包括:
第一确定单元，用于将第二训练控制点与第三训练控制点之间的距离，确定为训练距离；
第二确定单元,用于将第一训练控制点与第三训练控制点之间连线的单位方向向量,确定为第一训练向量;
第三确定单元，用于将第二训练控制点与第四训练控制点之间连线的单位方向向量，确定为第二训练向量。
可选地,第三确定模块706可以包括:
第四确定单元,用于基于第二训练控制点、第三训练控制点、训练距离、第一训练向量和第二训练向量,确定训练样条曲线插值模型的各个参数;
第五确定单元,用于基于训练样条曲线插值模型的各个参数和训练样条曲线插值模型,确定指定样条曲线插值模型。
可选地,第一确定模块703可以包括:
第六确定单元,用于基于第一控制点和第二控制点,确定该目标对象在该运动轨道上的当前运动距离在第一控制点与第二控制点之间的插值比例;
第七确定单元,用于确定第一控制点与第二控制点之间的距离,得到控制点距离;
第八确定单元,用于基于第一控制点、第二控制点、该目标对象在该运动轨道上的当前运动距离在第一控制点与第二控制点之间的插值比例、控制点距离、第一方向向量和第二方向向量,通过如下的指定样条曲线插值模型,确定目标对象的运动轨迹;
P(U)=Pi-1+λEU+(3Pi-3Pi-1-λF-2λE)U²+(-2Pi+2Pi-1+λF+λE)U³
其中,P(U)为目标对象当前运动的运动轨迹,U为该目标对象在该运动轨道上的当前运动距离在第一控制点与第二控制点之间的插值比例,Pi-1为第一控制点,Pi为第二控制点,λ为控制点距离,E为第一方向向量,F为第二方向向量。
可选地,该装置还可以包括:
当第一方向向量与第二方向向量相等时,指定样条曲线插值模型为:P(U)=Pi-1+λVU,目标对象在运动轨道上的运动轨迹为直线,其中,V为第一方向向量或者第二方向向量。
根据本发明实施例所提供的技术方案，当目标对象的实际运动轨迹为直线时，第一方向向量和第二方向向量相同，指定样条曲线插值模型中的多次变量会抵消，所生成的目标对象的实际运动轨迹是直线，而并非曲线，进而使目标对象的实际运动轨迹与理论运动轨迹相同，提高了确定目标对象运动轨迹的准确率。
需要说明的是:上述实施例提供的游戏运动轨迹确定装置在运动轨迹确定时,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。另外,上述实施例提供的游戏运动轨迹确定装置与游戏运动轨迹确定方法实施例属于同一构思,其具体实现过程详见方法实施例,这里不再赘述。
请参考图9,其示出了本发明一个实施例提供的目标对象运动轨迹确定装置的结构方框图,该装置可以为终端,终端900可以包括通信单元910、包括有一个或一个以上计算机可读存储介质的存储器920、输入单元930、显示单元940、传感器950、音频电路960、WIFI(Wireless Fidelity,无线保真)模块970、包括有一个或者一个以上处理核心的处理器980、以及电源990等部件。本领域技术人员可以理解,图9中示出的终端结构并不构成对终端的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。
通信单元910可用于收发信息或通话过程中,信号的接收和发送,该通信单元910可以为RF(Radio Frequency,射频)电路、路由器、调制解调器、等网络通信设备。特别地,当通信单元910为RF电路时,将基站的下行信息接收后,交由一个或者一个以上处理器980处理;另外,将涉及上行的数据发送给基站。通常,作为通信单元的RF电路包括但不限于天线、至少一个放大器、调谐器、一个或多个振荡器、用户身份模块(SIM)卡、收发信机、耦合器、LNA(Low Noise Amplifier,低噪声放大器)、双工器等。此外,通信单元910还可以通过无线通信与网络和其他设备通信。所述无线通信可以使用任一通信标准或协议,包括但不限于GSM(Global System of Mobile communication,全球移动通讯系统)、GPRS(General Packet Radio Service,通用分组无线服务)、CDMA(Code Division Multiple Access,码分多址)、WCDMA(Wideband Code  Division Multiple Access,宽带码分多址)、LTE(Long Term Evolution,长期演进)、电子邮件、SMS(Short Messaging Service,短消息服务)等。存储器920可用于存储软件程序以及模块,处理器980通过运行存储在存储器920的软件程序以及模块,从而执行各种功能应用以及数据处理。存储器920可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等;存储数据区可存储根据终端900的使用所创建的数据(比如音频数据、电话本等)等。此外,存储器920可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。相应地,存储器920还可以包括存储器控制器,以提供处理器980和输入单元930对存储器920的访问。
输入单元930可用于接收输入的数字或字符信息,以及产生与用户设置以及功能控制有关的键盘、鼠标、操作杆、光学或者轨迹球信号输入。优选地,输入单元930可包括触敏表面931以及其他输入设备932。触敏表面931,也称为触摸显示屏或者触控板,可收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触敏表面931上或在触敏表面931附近的操作),并根据预先设定的程式驱动相应的连接装置。可选的,触敏表面931可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成触点坐标,再送给处理器980,并能接收处理器980发来的命令并加以执行。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型实现触敏表面931。除了触敏表面931,输入单元930还可以包括其他输入设备932。优选地,其他输入设备932可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆等中的一种或多种。
显示单元940可用于显示由用户输入的信息或提供给用户的信息以及终端900的各种图形用户接口,这些图形用户接口可以由图形、文本、图标、视频和其任意组合来构成。显示单元940可包括显示面板941,可选的,可以采用LCD(Liquid Crystal Display,液晶显示器)、OLED(Organic Light-Emitting  Diode,有机发光二极管)等形式来配置显示面板941。进一步的,触敏表面931可覆盖显示面板941,当触敏表面931检测到在其上或附近的触摸操作后,传送给处理器980以确定触摸事件的类型,随后处理器980根据触摸事件的类型在显示面板941上提供相应的视觉输出。虽然在图9中,触敏表面931与显示面板941是作为两个独立的部件来实现输入和输入功能,但是在某些实施例中,可以将触敏表面931与显示面板941集成而实现输入和输出功能。
终端900还可包括至少一种传感器950,比如光传感器、运动传感器以及其他传感器。光传感器可包括环境光传感器及接近传感器,其中,环境光传感器可根据环境光线的明暗来调节显示面板941的亮度,接近传感器可在终端900移动到耳边时,关闭显示面板941和/或背光。作为运动传感器的一种,重力加速度传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别手机姿态的应用(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等;至于终端900还可配置的陀螺仪、气压计、湿度计、温度计、红外线传感器等其他传感器,在此不再赘述。
音频电路960、扬声器961,传声器962可提供用户与终端900之间的音频接口。音频电路960可将接收到的音频数据转换后的电信号,传输到扬声器961,由扬声器961转换为声音信号输出;另一方面,传声器962将收集的声音信号转换为电信号,由音频电路960接收后转换为音频数据,再将音频数据输出处理器980处理后,经通信单元910以发送给比如另一终端,或者将音频数据输出至存储器920以便进一步处理。音频电路960还可能包括耳塞插孔,以提供外设耳机与终端900的通信。
为了实现无线通信,该终端上可以配置有无线通信单元970,该无线通信单元970可以为WIFI模块。WIFI属于短距离无线传输技术,终端900通过无线通信单元970可以帮助用户收发电子邮件、浏览网页和访问流式媒体等,它为用户提供了无线的宽带互联网访问。虽然图中示出了无线通信单元970,但是可以理解的是,其并不属于终端900的必须构成,完全可以根据需要在不改变发明的本质的范围内而省略。
处理器980是终端900的控制中心,利用各种接口和线路连接整个手机的 各个部分,通过运行或执行存储在存储器920内的软件程序和/或模块,以及调用存储在存储器920内的数据,执行终端900的各种功能和处理数据,从而对手机进行整体监控。可选的,处理器980可包括一个或多个处理核心;优选的,处理器980可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器980中。
终端900还包括给各个部件供电的电源990（比如电池），优选的，电源可以通过电源管理系统与处理器980逻辑相连，从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。电源990还可以包括一个或一个以上的直流或交流电源、再充电系统、电源故障检测电路、电源转换器或者逆变器、电源状态指示器等任意组件。
尽管未示出,终端900还可以包括摄像头、蓝牙模块等,在此不再赘述。
在本实施例中,终端还包括有一个或者一个以上的程序,这一个或者一个以上程序存储于存储器中,且经配置以由一个或者一个以上处理器执行,所述一个或者一个以上程序包含用于进行本发明实施例提供的游戏运动轨迹确定方法的指令,包括:
基于目标对象在运动轨道上的当前位置,获取该运动轨道上的第一控制点和第二控制点,第一控制点和第二控制点为相邻控制点;
获取第一方向向量和第二方向向量,第一方向向量为第一控制点处的单位方向向量,第二方向向量为第二控制点处的单位方向向量;
基于第一控制点、第二控制点、第一方向向量和第二方向向量,通过指定样条曲线插值模型,确定目标对象在该运动轨道上的运动轨迹。
可选地,基于第一控制点、第二控制点、第一方向向量和第二方向向量,通过指定样条曲线插值模型,确定目标对象当前在该运动轨道上的运动轨迹之前,还包括:
获取待训练样条曲线插值模型;
基于第一训练控制点、第二训练控制点、第三训练控制点和第四训练控制点，确定训练距离、第一训练向量和第二训练向量，第一训练控制点、第二训练控制点、第三训练控制点和第四训练控制点为训练轨道上依次排列的控制点；
基于第二训练控制点、第三训练控制点、训练距离、第一训练向量、第二训练向量和训练样条曲线插值模型,确定指定样条曲线插值模型。
可选地,基于第一训练控制点、第二训练控制点、第三训练控制点和第四训练控制点,确定训练距离、第一训练向量和第二训练向量,可以包括:
将第二训练控制点与第三训练控制点之间的距离,确定为训练距离;
将第一训练控制点与第三训练控制点之间连线的单位方向向量,确定为第一训练向量;
将第二训练控制点与第四训练控制点之间连线的单位方向向量，确定为第二训练向量。
可选地,基于第二训练控制点、第三训练控制点、该训练距离、第一训练向量、第二训练向量和待训练样条曲线插值模型,确定指定样条曲线插值模型,可以包括:
基于第二训练控制点、第三训练控制点、训练距离、第一训练向量和第二训练向量,确定训练样条曲线插值模型的各个参数;
基于训练样条曲线插值模型的各个参数和训练样条曲线插值模型,确定指定样条曲线插值模型。
可选地,基于第一控制点、第二控制点、第一方向向量和第二方向向量,通过指定样条曲线插值模型,确定目标对象当前在该运动轨道上的运动轨迹,可以包括:
基于第一控制点和第二控制点,确定该当前运动距离在第一控制点与第二控制点之间的插值比例;
确定第一控制点与第二控制点之间的距离,得到控制点距离;
基于第一控制点、第二控制点、该当前运动距离在第一控制点与第二控制点之间的插值比例、控制点距离、第一方向向量和第二方向向量,通过如下的指定样条曲线插值模型,确定目标对象当前运动的运动轨迹;
P(U)=Pi-1+λEU+(3Pi-3Pi-1-λF-2λE)U²+(-2Pi+2Pi-1+λF+λE)U³
其中,P(U)为目标对象的运动轨迹,U为该当前运动距离在第一控制点与第二控制点之间的插值比例,Pi-1为第一控制点,Pi为第二控制点,λ为控制点距离,E为第一方向向量,F为第二方向向量。
可选地,该方法还可以包括:
当第一方向向量与第二方向向量相等时,该指定样条曲线插值模型为:P(U)=Pi-1+λVU,目标对象在运动轨道上的运动轨迹为直线。
根据本发明实施例所提供的技术方案,当目标对象的实际运动轨迹为直线时,第一方向向量和第二方向向量相同,指定样条曲线插值模型中的多次变量会抵消,所生成的目标对象的实际运动轨迹是直线,而并非曲线,进而使目标对象的实际运动轨迹与理论运动轨迹相同,提高了确定目标对象运动轨迹的准确率。
本领域普通技术人员可以理解实现上述实施例的全部或部分步骤可以通过硬件来完成,也可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,上述提到的存储介质可以是只读存储器,磁盘或光盘等。
以上所述仅为本发明的较佳实施例,并不用以限制本发明,凡在本发明的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本发明的保护范围之内。

Claims (13)

  1. 一种目标对象运动轨迹确定方法,包括:
    基于目标对象在运动轨道上的当前位置,获取所述运动轨道上的第一控制点和第二控制点,所述第一控制点和所述第二控制点为相邻控制点;
    获取第一方向向量和第二方向向量,所述第一方向向量为所述第一控制点处的单位方向向量,所述第二方向向量为所述第二控制点处的单位方向向量;以及
    基于所述第一控制点、所述第二控制点、所述第一方向向量和所述第二方向向量,通过指定样条曲线插值模型,确定所述目标对象在所述运动轨道上的运动轨迹。
  2. 如权利要求1所述的方法,所述基于所述第一控制点、所述第二控制点、所述第一方向向量和所述第二方向向量,通过指定样条曲线插值模型,确定所述目标对象在所述运动轨道上的运动轨迹之前,还包括:
    获取训练样条曲线插值模型;
    基于第一训练控制点、第二训练控制点、第三训练控制点和第四训练控制点,确定训练距离、第一训练向量和第二训练向量,所述第一训练控制点、所述第二训练控制点、所述第三训练控制点和所述第四训练控制点为训练轨道上依次排列的控制点;以及
    基于所述第二训练控制点、所述第三训练控制点、所述训练距离、所述第一训练向量、所述第二训练向量和所述待训练样条曲线插值模型,确定所述指定样条曲线插值模型。
  3. 如权利要求2所述的方法,所述基于第一训练控制点、第二训练控制点、第三训练控制点和第四训练控制点,确定训练距离、第一训练向量和第二训练向量,包括:
    将所述第二训练控制点与所述第三训练控制点之间的距离,确定为所述训练距离;
    将所述第一训练控制点与所述第三训练控制点之间连线的单位方向向量，确定为所述第一训练向量；以及
    将所述第二训练控制点与所述第四训练控制点之间连线的单位方向向量，确定为所述第二训练向量。
  4. 如权利要求2所述的方法,所述基于所述第二训练控制点、所述第三训练控制点、所述训练距离、所述第一训练向量、所述第二训练向量和所述待训练样条曲线插值模型,确定所述指定样条曲线插值模型,包括:
    基于所述第二训练控制点、所述第三训练控制点、所述训练距离、所述第一训练向量和所述第二训练向量,确定所述待训练样条曲线插值模型的参数;以及
    基于所述待训练样条曲线插值模型的参数和所述训练样条曲线插值模型,确定所述指定样条曲线插值模型。
  5. 如权利要求1-4任一权利要求所述的方法,所述基于所述第一控制点、所述第二控制点、所述第一方向向量和所述第二方向向量,通过指定样条曲线插值模型,确定所述目标对象在所述运动轨道上的运动轨迹,包括:
    基于所述第一控制点和所述第二控制点,确定所述目标物体的当前位置在所述第一控制点与所述第二控制点之间的插值比例;
    确定所述第一控制点与所述第二控制点之间的距离,得到控制点距离;以及
    基于所述第一控制点、所述第二控制点、所述插值比例、所述控制点距离、所述第一方向向量和所述第二方向向量,通过如下的指定样条曲线插值模型,确定所述目标对象当前运动的运动轨迹;
    P(U)=Pi-1+λEU+(3Pi-3Pi-1-λF-2λE)U²+(-2Pi+2Pi-1+λF+λE)U³
    其中,P(U)为所述目标对象的运动轨迹,U为所述插值比例,Pi-1为所述第一控制点,Pi为所述第二控制点,λ为所述控制点距离,E为所述第一方向向量,F为所述第二方向向量。
  6. 如权利要求1或5所述的方法,还包括:
    当所述第一方向向量与所述第二方向向量相等时,所述指定样条曲线插值模型为:P(U)=Pi-1+λVU,所述目标对象在所述运动轨道上的运动轨迹为直线,其中,V为所述第一方向向量或者所述第二方向向量。
  7. 一种目标对象运动轨迹确定装置,包括:一个或多个处理器和存储有操作指令的存储介质,当运行所述存储介质中的操作指令时,所述处理器执行如下步骤:
    基于目标对象在运动轨道上的当前位置,获取所述运动轨道上的第一控制点和第二控制点,所述第一控制点和所述第二控制点为相邻控制点;
    获取第一方向向量和第二方向向量,所述第一方向向量为所述第一控制点处的单位方向向量,所述第二方向向量为所述第二控制点处的单位方向向量;以及
    基于所述第一控制点、所述第二控制点、所述第一方向向量和所述第二方向向量,通过指定样条曲线插值模型,确定所述目标对象在所述运动轨道上的运动轨迹。
  8. 如权利要求7所述的装置,所述处理器还执行:
    获取训练样条曲线插值模型；
    基于第一训练控制点、第二训练控制点、第三训练控制点和第四训练控制点,确定训练距离、第一训练向量和第二训练向量,所述第一训练控制点、所述第二训练控制点、所述第三训练控制点和所述第四训练控制点为训练轨道上依次排列的控制点;以及
    基于所述第二训练控制点、所述第三训练控制点、所述训练距离、所述第一训练向量、所述第二训练向量和所述待训练样条曲线插值模型,确定所述指定样条曲线插值模型。
  9. 如权利要求8所述的装置,当所述处理器执行基于第一训练控制点、第二训练控制点、第三训练控制点和第四训练控制点,确定训练距离、第一训练向量和第二训练向量时,所述处理器执行:
    将所述第二训练控制点与所述第三训练控制点之间的距离,确定为所述训练距离;
    将所述第一训练控制点与所述第三训练控制点之间连线的单位方向向量,确定为所述第一训练向量;以及
    将所述第二训练控制点与所述第四训练控制点之间连线的单位方向向量，确定为所述第二训练向量。
  10. 如权利要求8所述的装置,当所述处理器执行基于所述第二训练控制点、所述第三训练控制点、所述训练距离、所述第一训练向量、所述第二训练向量和所述待训练样条曲线插值模型,确定所述指定样条曲线插值模型时,所述处理器执行:
    基于所述第二训练控制点、所述第三训练控制点、所述训练距离、所述第一训练向量和所述第二训练向量,确定所述待训练样条曲线插值模型的参数;以及
    基于所述待训练样条曲线插值模型的参数和所述训练样条曲线插值模型,确定所述指定样条曲线插值模型。
  11. 如权利要求7-10任一权利要求所述的装置,所述处理器执行基于所述第一控制点、所述第二控制点、所述第一方向向量和所述第二方向向量,通过指定样条曲线插值模型,确定所述目标对象在所述运动轨道上的运动轨迹时,所述处理器执行:
    基于所述第一控制点和所述第二控制点,确定所述目标物体的当前位置在所述第一控制点与所述第二控制点之间的插值比例;
    确定所述第一控制点与所述第二控制点之间的距离,得到控制点距离;以及
    基于所述第一控制点、所述第二控制点、所述插值比例、所述控制点距离、所述第一方向向量和所述第二方向向量,通过如下的指定样条曲线插值模型,确定所述目标对象当前运动的运动轨迹;
    P(U)=Pi-1+λEU+(3Pi-3Pi-1-λF-2λE)U²+(-2Pi+2Pi-1+λF+λE)U³
    其中,P(U)为所述目标对象的运动轨迹,U为所述插值比例,Pi-1为所述第一控制点,Pi为所述第二控制点,λ为所述控制点距离,E为所述第一方向向量,F为所述第二方向向量。
  12. 如权利要求7或11所述的装置,当所述第一方向向量与所述第二方向向量相等时,所述指定样条曲线插值模型为:P(U)=Pi-1+λVU,所述目标对象在所述运动轨道上的运动轨迹为直线,其中,V为所述第一方向向量或者所述第二方向向量。
  13. 一种非瞬时性的计算机可读存储介质,其上存储有计算机可执行指令,当计算机中运行这些可执行指令时,执行如下步骤:
    基于目标对象在运动轨道上的当前位置，获取所述运动轨道上的第一控制点和第二控制点，所述第一控制点和所述第二控制点为相邻控制点；
    获取第一方向向量和第二方向向量,所述第一方向向量为所述第一控制点处的单位方向向量,所述第二方向向量为所述第二控制点处的单位方向向量;以及
    基于所述第一控制点、所述第二控制点、所述第一方向向量和所述第二方向向量,通过指定样条曲线插值模型,确定所述目标对象在所述运动轨道上的运动轨迹。
PCT/CN2016/081694 2015-06-24 2016-05-11 目标对象运动轨迹确定方法、装置以及存储介质 WO2016206491A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
KR1020177016338A KR101975689B1 (ko) 2015-06-24 2016-05-11 타겟 대상의 모션 궤적을 결정하는 방법 및 디바이스, 및 저장 매체
JP2017538656A JP6735760B2 (ja) 2015-06-24 2016-05-11 ターゲット対象の動き軌道を決定するための方法およびデバイス、ならびに記憶媒体
US15/624,245 US10354393B2 (en) 2015-06-24 2017-06-15 Method and device for determining motion trajectory of target subject, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510354868.6A CN105046059B (zh) 2015-06-24 2015-06-24 游戏运动轨迹确定方法及装置
CN201510354868.6 2015-06-24

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/624,245 Continuation US10354393B2 (en) 2015-06-24 2017-06-15 Method and device for determining motion trajectory of target subject, and storage medium

Publications (1)

Publication Number Publication Date
WO2016206491A1 true WO2016206491A1 (zh) 2016-12-29

Family

ID=54452599

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/081694 WO2016206491A1 (zh) 2015-06-24 2016-05-11 目标对象运动轨迹确定方法、装置以及存储介质

Country Status (5)

Country Link
US (1) US10354393B2 (zh)
JP (1) JP6735760B2 (zh)
KR (1) KR101975689B1 (zh)
CN (1) CN105046059B (zh)
WO (1) WO2016206491A1 (zh)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105046059B (zh) * 2015-06-24 2017-09-29 深圳市腾讯计算机系统有限公司 游戏运动轨迹确定方法及装置
JP6429217B2 (ja) * 2015-07-22 2018-11-28 本田技研工業株式会社 経路生成装置、経路生成方法、および経路生成プログラム
CN107019915B (zh) * 2016-02-01 2018-09-07 腾讯科技(深圳)有限公司 一种确定移动轨迹的方法、用户设备及系统
CN106055245A (zh) * 2016-05-20 2016-10-26 小天才科技有限公司 一种计算游戏物体运动轨迹的方法及装置
CN108269297B (zh) * 2017-12-27 2021-06-01 福建省天奕网络科技有限公司 一种在三维场景中编排角色运动轨迹的方法及终端
CN109126120B (zh) * 2018-08-17 2022-06-03 Oppo广东移动通信有限公司 马达控制方法及相关产品
CN111311745B (zh) * 2018-12-11 2023-06-13 网易(杭州)网络有限公司 一种模型的放置方法和装置
SE544061C2 (en) * 2019-07-05 2021-11-30 Climeon Ab Method and controller for dynamically determining a system curve in a heat power system
CN112802067B (zh) * 2021-01-26 2024-01-26 深圳市普汇智联科技有限公司 一种基于图网络的多目标跟踪方法及系统
CN113626118B (zh) * 2021-07-30 2023-07-25 中汽创智科技有限公司 能耗实时显示方法、装置及设备
CN115131471B (zh) * 2022-08-05 2024-08-02 北京字跳网络技术有限公司 基于图像的动画生成方法、装置、设备及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080262721A1 (en) * 2007-04-17 2008-10-23 Hitachi, Ltd. Map generation system and map generation method by using GPS tracks
CN102794767A (zh) * 2012-08-31 2012-11-28 江南大学 视觉引导的机器人关节空间b样条轨迹规划方法
CN103793933A (zh) * 2012-11-02 2014-05-14 同济大学 虚拟人体动画的运动路径生成方法
CN103902086A (zh) * 2012-12-28 2014-07-02 北京汇冠新技术股份有限公司 一种基于曲线拟合的触摸轨迹平滑方法及系统
CN105046059A (zh) * 2015-06-24 2015-11-11 深圳市腾讯计算机系统有限公司 游戏运动轨迹确定方法及装置

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6115051A (en) 1996-08-07 2000-09-05 Adobe Systems Incorporated Arc-length reparameterization
US7805442B1 (en) * 2000-12-05 2010-09-28 Navteq North America, Llc Method and system for representation of geographical features in a computer-based system
US7477988B2 (en) 2006-05-16 2009-01-13 Navteq North America, Llc Dual road geometry representation for position and curvature-heading
JP5446136B2 (ja) * 2008-06-05 2014-03-19 日本電気株式会社 走行支援システムおよび走行路曲線生成方法
CN101822545B (zh) * 2010-05-11 2011-05-25 河南大学 一种数字减影血管造影运动伪影消除方法及其系统
CN102467587B (zh) * 2010-11-01 2014-01-29 财团法人工业技术研究院 水冷机动态特性模型建立方法、水冷机监控方法和装置
CN102201122B (zh) * 2011-05-16 2014-04-23 大连大学 一种运动捕捉的数据降噪方法、系统及运动捕捉系统
US9727987B2 (en) * 2014-05-12 2017-08-08 Adobe Systems Incorporated Blending techniques for curve fitting

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080262721A1 (en) * 2007-04-17 2008-10-23 Hitachi, Ltd. Map generation system and map generation method by using GPS tracks
CN102794767A (zh) * 2012-08-31 2012-11-28 江南大学 视觉引导的机器人关节空间b样条轨迹规划方法
CN103793933A (zh) * 2012-11-02 2014-05-14 同济大学 虚拟人体动画的运动路径生成方法
CN103902086A (zh) * 2012-12-28 2014-07-02 北京汇冠新技术股份有限公司 一种基于曲线拟合的触摸轨迹平滑方法及系统
CN105046059A (zh) * 2015-06-24 2015-11-11 深圳市腾讯计算机系统有限公司 游戏运动轨迹确定方法及装置

Also Published As

Publication number Publication date
KR101975689B1 (ko) 2019-05-07
JP6735760B2 (ja) 2020-08-05
US10354393B2 (en) 2019-07-16
JP2018506118A (ja) 2018-03-01
US20170287142A1 (en) 2017-10-05
CN105046059B (zh) 2017-09-29
KR20170086572A (ko) 2017-07-26
CN105046059A (zh) 2015-11-11

Similar Documents

Publication Publication Date Title
WO2016206491A1 (zh) 目标对象运动轨迹确定方法、装置以及存储介质
JP6467526B2 (ja) 通信メッセージ送信方法及びウェアラブル・デバイス
WO2016169465A1 (zh) 一种显示弹幕信息的方法、装置和系统
WO2016197758A1 (zh) 信息推荐系统、方法及装置
WO2017041664A1 (zh) 一种征信评分确定方法、装置及存储介质
WO2016107501A1 (zh) 智能设备控制方法及装置
WO2017125027A1 (zh) 一种进行信息展示的方法和装置、计算机存储介质
CN104967896A (zh) 一种显示弹幕评论信息的方法和装置
US10845981B2 (en) Operation control method, device and storage medium
CN106487984B (zh) 一种调整音量的方法和装置
WO2014000656A1 (zh) 刷新页面的方法、装置及终端
CN108984066B (zh) 一种应用程序图标显示方法及移动终端
TW201515682A (zh) 一種數據獲取的方法及終端
CN110874128B (zh) 可视化数据处理方法和电子设备
CN105022552A (zh) 一种显示消息列表的方法和装置
US20190034078A1 (en) Method for Sliding Response Acceleration and Related Products
JP2017509051A (ja) ストリーミングメディアデータに関する統計を収集するための方法およびシステム、ならびに関連する装置
WO2017128986A1 (zh) 多媒体菜单项的选择方法、装置及存储介质
WO2015180596A1 (en) Method for superposing location information on collage, terminal and server
CN111651030B (zh) 传感器检测方法、装置、存储介质及移动终端
US20160119695A1 (en) Method, apparatus, and system for sending and playing multimedia information
WO2018219118A1 (zh) 界面显示方法及相关产品
CN107193551B (zh) 一种生成图像帧的方法和装置
CN105159655B (zh) 行为事件的播放方法和装置
WO2013152656A1 (zh) 一种绘制滑动轨迹的方法及移动终端

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16813607

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20177016338

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2017538656

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC, EPO FORM 1205A DATED 30.05.18

122 Ep: pct application non-entry in european phase

Ref document number: 16813607

Country of ref document: EP

Kind code of ref document: A1