WO2020114154A1 - Animation control method and apparatus, storage medium, and electronic apparatus - Google Patents

Animation control method and apparatus, storage medium, and electronic apparatus

Info

Publication number
WO2020114154A1
WO2020114154A1 · PCT/CN2019/114251 · CN2019114251W
Authority
WO
WIPO (PCT)
Prior art keywords
target
rotation
adjusted
matrix
angle
Prior art date
Application number
PCT/CN2019/114251
Other languages
English (en)
French (fr)
Inventor
李静翔
丁晓骏
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Priority to EP19893340.0A (EP3832604A4)
Publication of WO2020114154A1
Priority to US17/193,525 (US11783523B2)

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation
    • G06T13/40 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35 Details of game servers
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/56 Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T15/20 Perspective computation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4007 Scaling of whole images or parts thereof, e.g. expanding or contracting based on interpolation, e.g. bilinear interpolation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2213/00 Indexing scheme for animation
    • G06T2213/12 Rule based animation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20 Indexing scheme for editing of 3D models
    • G06T2219/2016 Rotation, translation, scaling

Definitions

  • This application relates to the field of animation, specifically, to the control of animation.
  • Most of the animations in a game are created by artists in Digital Content Creation (DCC) software (such as Maya or 3ds Max) and then imported into the game engine, where a state machine switches between the different animations.
  • The advantage of this production method is that artists can exploit the strengths of the software to make realistic animations. For a scene in which the eyeballs must always face the camera, however, the artist has to animate the key orientations in advance, and the animation is interpolated between those key orientations at run time.
  • To support real-time interaction, the animation also needs to give timely feedback to user input. Inverse kinematics (IK) animation, in which child bone nodes drive the movement of parent bone nodes, is therefore also widely used in game engines.
  • IK animation works backward from the position of the end bone to the movement of the other bones. For example, when reaching for a teacup, the location of the teacup is the target, and the IK solver computes the animation from the fingers up through the arm.
  • The advantage of IK animation is that it is computed in real time without artist involvement. For a scene in which the eyeballs always face the camera, it suffices to specify the eyeball direction; the IK system automatically computes the orientation of the eyeballs, head, and neck.
  • the embodiments of the present application provide an animation control method, device, storage medium, and electronic device, so as to at least solve the technical problem of inflexible animation control.
  • The embodiments provide an animation control method applied to an electronic device, including: acquiring a first position of a virtual camera in an animation and a second position of a target to be adjusted in the animation; determining a rotation angle of the target to be adjusted according to the first position, the second position, and coordinate information of the target object to which the target to be adjusted belongs; adjusting the rotation angle with a rotation coefficient to obtain a target angle, the rotation coefficient being used to adjust the ratio between the rotation angles of the target to be adjusted and of an associated target,
  • the associated target being a target that has a linkage relationship with the target to be adjusted; and controlling the target to be adjusted to rotate by the target angle so that, after rotation, the target to be adjusted faces the virtual camera.
  • The embodiments also provide an animation control device applied to an electronic device, including: an acquiring unit configured to acquire a first position of a virtual camera in an animation and a second position of a target to be adjusted in the animation; a determination unit configured to determine the rotation angle of the target to be adjusted according to the first position, the second position, and the coordinate information of the target object to which the target to be adjusted belongs; and an adjustment unit configured to adjust the rotation angle with the rotation coefficient to obtain the target angle, the rotation coefficient being used to adjust the ratio between the rotation angles of the target to be adjusted and of the associated target.
  • The associated target is a target that has a linkage relationship with the target to be adjusted; the control unit is configured to control the target to be adjusted to rotate by the target angle, so that the rotated target to be adjusted faces the virtual camera.
  • a storage medium in which a computer program is stored, wherein the computer program is set to execute the above method when it is run.
  • an electronic device including a memory and a processor, a computer program is stored in the memory, and the processor is configured to execute the foregoing method through the computer program.
  • a computer program product including instructions, which, when run on a computer, causes the computer to execute the above method.
  • In the embodiments of the present application, the matrix expression of the coordinates of the target to be adjusted is determined from the positions of the target to be adjusted and the virtual camera, the rotation angle of the target to be adjusted is determined from those coordinates, and the rotation angle can be adjusted by the input rotation coefficient to obtain the target angle; the target to be adjusted is then controlled to rotate by the target angle. This makes the rotation of the target to be adjusted more flexible and the exhibited effect more natural, solving the prior-art technical problem of inflexibility when adjusting the rotation angle.
  • FIG. 1 is a schematic diagram of an optional hardware environment according to an embodiment of the present application.
  • FIG. 2 is a flowchart of an optional animation control method according to an embodiment of the present application.
  • FIG. 3 is a schematic diagram of an optional vector according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of an optional rotation effect according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of an optional application of a rotation angle according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an optional animation control device according to an embodiment of the present application.
  • FIG. 7 is a schematic diagram of an optional electronic device according to an embodiment of the present application.
  • a method for controlling animation is provided.
  • the technical solution of the present application is applied to a local application (such as a stand-alone game)
  • the above animation control method can be applied to the hardware environment composed of the terminal 101 shown in FIG. 1,
  • the terminal 101 adjusts the orientation of the target to be adjusted by executing the animation control method of the present application.
  • The above hardware environment may further include a server 102 that provides a business service (such as a game service) for the terminal 101. The server 102 executes the animation control method of the present application and provides the execution result, as part of the service, to the terminal 101 to adjust the orientation of the target to be adjusted on the terminal.
  • the terminal 101 can be connected to the server 102 through a network.
  • The network includes, but is not limited to, a wide area network, a metropolitan area network, or a local area network, and the terminal 101 may be a mobile phone, a PC, a notebook, or a tablet computer.
  • The technical solution of the present application can be applied to social scenes that use technologies such as augmented reality (AR) and virtual reality (VR). The social scene can be provided by the terminal 101 described above, with the server 102 synchronizing the social behaviors in the scene, including those of the target object to which the target to be adjusted belongs.
  • the target object may be the virtual character corresponding to the current user in the social scene, the virtual friend of the current user in the social scene, the virtual passerby in the social scene, the virtual pet of the current user in the social scene, etc.
  • The server can execute the technical solution of this application to adjust the orientation of the target to be adjusted on the target object, for example adjusting its eyes to always face the virtual camera, and send the adjustment result to each terminal, so that every terminal synchronously renders the target object's eyes as always facing the screen (the picture played on the screen being the picture rendered from the perspective of the virtual camera).
  • The technical solution of this application can also be applied to game scenarios, such as Multiplayer Online Battle Arena (MOBA) games, first-person shooter (FPS) games, and third-person shooter (TPS) games.
  • The game application can be installed on the above terminal 101 to form a game client; the player participates in the game scene through the client, and the game scene includes the target object to which the target to be adjusted belongs.
  • the target object may be a game character controlled by the current user in the game scene, a teammate of the game character, an enemy of the game character, etc.
  • The target to be adjusted may be an organ or a part of the body structure of the character, such as the eyes.
  • When the player participates in the game in the game scene and the game is a stand-alone game or a multi-player locally interconnected game, one of the terminals can execute the technical solution of this application to adjust the orientation of the target to be adjusted on the target object and synchronize the execution result to the other terminals.
  • the server 102 can execute the technical solution of the present application to adjust the orientation of the target to be adjusted on the target object, and synchronize the execution result to the terminal participating in the game.
  • The following description takes the terminal 101 performing the animation control method of the embodiment of the present application as an example.
  • The animation control method of the embodiment of the present application may also be executed by a client installed on the terminal 101 (such as a game application client); for example, the animation control method of the present application can be embedded into the game engine of the game application as part of the engine's own logic, or embedded into the game logic of the game application as part of that game logic.
  • FIG. 2 is a flowchart of an optional animation control method according to an embodiment of the present application. As shown in FIG. 2, the animation control method includes:
  • the terminal acquires the first position of the virtual camera in the animation and the second position of the target to be adjusted in the animation.
  • the target to be adjusted can be a part of the object.
  • the object may be a human character, and the target to be adjusted may be an organ of the human character or a part of a body structure, such as eyes.
  • the animation may be a game animation, and the target to be adjusted may be the eyes of the game character.
  • the first position is the position of the virtual camera in the animation
  • the second position is the position of the target to be adjusted in the animation
  • The game scene and each character in it can be rendered by the game engine, from which the first position and the second position can be acquired.
  • Both the first position and the second position can be represented by three-dimensional coordinates, and the three-dimensional coordinates can be represented by a matrix.
  • the terminal determines the rotation angle of the target to be adjusted according to the first position, the second position, and the coordinate information of the target object to which the target to be adjusted belongs.
  • Acquire the vector from the second position to the first position (a vector in the three-dimensional coordinate system), as shown in FIG. 3. Multiply this vector by the matrix [0 0 1] to obtain the x-axis coordinate of the target to be adjusted; multiply it by the matrix [0 1 0] to obtain the y-axis coordinate; and multiply it by the matrix [1 0 0] to obtain the z-axis coordinate.
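  • The axis-component computation above can be sketched as a dot product with the basis row vectors; the positions and names below are illustrative, not taken from the patent:

```python
# Hypothetical positions (illustrative values, not from the patent).
camera_pos = (4.0, 2.0, 1.0)   # first position: the virtual camera
target_pos = (1.0, 0.0, 1.0)   # second position: the target to be adjusted

# Vector from the second position (target) to the first position (camera).
v = tuple(c - t for c, t in zip(camera_pos, target_pos))

def dot(a, b):
    """Dot product of two 3-vectors."""
    return sum(x * y for x, y in zip(a, b))

# Multiplying the vector by each basis row vector picks out one coordinate,
# as in the text: [0 0 1] -> x, [0 1 0] -> y, [1 0 0] -> z.
x_coord = dot(v, (0, 0, 1))
y_coord = dot(v, (0, 1, 0))
z_coord = dot(v, (1, 0, 0))
print(x_coord, y_coord, z_coord)  # → 0.0 2.0 3.0
```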
  • This yields the rotation matrix M r (i.e., the first matrix). To cancel out the rotation of the character under the player's control during the game, and the character's initial state in which it is not rotationally aligned with the X axis, the rotation matrix M t of the character when the player controls it to rotate (i.e., the second matrix) and the rotation matrix M i of the character in the initial state (i.e., the third matrix) are acquired. The rotation matrices M r , M t and M i are cascaded together to obtain the target matrix M of the target to be adjusted:
  • the target object to which the target to be adjusted belongs is a game character, and the coordinate information of the target object includes a rotation matrix M t and a rotation matrix M i .
  • the rotation matrix M t and the rotation matrix M i can be obtained by a game engine.
  • The above M i , M t and M r can each be a matrix of three rows and three columns. The rotation matrix M t represents the rotation of the game character under the player's control corresponding to the control operation (that is, from the state before the control to the state after it), and both M t and the initial-state rotation matrix M i can be obtained through the application program interface (API) provided by the game engine:
  • i 11 to i 33 are constants
  • t 11 to t 33 are constants
  • r 11 to r 33 are constants calculated according to the vector from the second position to the first position.
  • The elements R 11 to R 33 of the target matrix can be calculated directly from the values of the elements in M i , M t and M r .
  • Determining the rotation angle of the target to be adjusted according to the first position, the second position, and the coordinate information of the target object to which the target to be adjusted belongs includes: acquiring the first matrix of the target to be adjusted according to the vector from the second position to the first position; acquiring the rotation matrix of the target object as the second matrix, where the second matrix is the matrix when the target object is controlled to rotate in a predetermined scene; acquiring the rotation matrix of the target object in the initial state as the third matrix; cascading the first matrix, the second matrix and the third matrix to obtain the target matrix; and converting the target matrix into the rotation angle.
  • the first matrix is a rotation matrix M r
  • the second matrix is a rotation matrix M t
  • the third matrix is a rotation matrix M i .
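  • The cascade of the three matrices into the target matrix M can be sketched as a plain 3×3 matrix product; the matrix values below are placeholders (in the method, M r comes from the camera-direction vector and M t , M i from the game engine):

```python
def matmul(a, b):
    """Multiply two 3x3 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Placeholder matrices: identity for M_t and M_i, a 90-degree rotation
# about Z for M_r (purely illustrative values).
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
M_r = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
M_t = I
M_i = I

# Cascade the first, second, and third matrices to obtain the target matrix.
M = matmul(matmul(M_r, M_t), M_i)
print(M == M_r)  # → True (both other factors are the identity here)
```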
  • ⁇ x atan2(R 32 , R 33 );
  • ⁇ z atan2(R 21 , R 11 ).
  • the rotation angle may include an angle ⁇ x relative to the X axis, an angle ⁇ y relative to the Y axis, and an angle ⁇ z relative to the Z axis.
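  • The conversion from the target matrix to the rotation angles can be sketched with the atan2 formulas quoted above; the text only gives θ x and θ z , so the θ y formula below is the standard companion for this decomposition and is an assumption:

```python
import math

def matrix_to_euler(R):
    """Extract (theta_x, theta_y, theta_z) from a 3x3 rotation matrix."""
    theta_x = math.atan2(R[2][1], R[2][2])                        # atan2(R32, R33)
    theta_y = math.atan2(-R[2][0], math.hypot(R[2][1], R[2][2]))  # assumed formula
    theta_z = math.atan2(R[1][0], R[0][0])                        # atan2(R21, R11)
    return theta_x, theta_y, theta_z

# Sanity check: a 30-degree rotation about Z decomposes to (0, 0, 30 degrees).
a = math.radians(30)
Rz = [[math.cos(a), -math.sin(a), 0.0],
      [math.sin(a),  math.cos(a), 0.0],
      [0.0,          0.0,         1.0]]
tx, ty, tz = matrix_to_euler(Rz)
print(round(math.degrees(tz), 6))  # → 30.0
```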
  • the terminal adjusts the rotation angle using the rotation coefficient to obtain the target angle.
  • After the target to be adjusted rotates by the target angle, it will face the virtual camera: the target angle represents the angular difference between the orientation before adjustment and the orientation when facing the virtual camera, so rotating the target to be adjusted by the target angle turns it toward the virtual camera. The rotation coefficient is used to adjust the ratio between the rotation angles of the target to be adjusted and of the associated target.
  • The associated target is a target that has a linkage relationship with the target to be adjusted. In an optional manner, the rotation coefficient is preset by the animation artists.
  • Rotating one part of a character may cause other parts to move in linkage, and rotating the related parts at the same time makes the rotation effect more natural. Therefore, when rotating the target to be adjusted, the rotation angles of the other associated targets should also be considered; that is, a ratio of rotation angles (the aforementioned rotation coefficient) is set for the target to be adjusted and the associated targets.
  • the target to be adjusted is the character's eyes
  • the associated targets include the character's head and neck.
  • the rotation ratio of the eyes, head, and neck is 3:2:1.
  • Substitute the values of the elements in M i , M t and M r to calculate M. Then, when the target angle of the eyes is determined to be a 30° decrease of the angle from the X axis, the head rotates (that is, its angle from the X axis decreases) by 20°, and the neck rotates by 10°.
  • For example, the rotation coefficient of the target to be rotated may be 3, with a rotation ratio of 3:2 between the target and the head and 3:1 between the target and the neck.
  • The rotation coefficient can be set according to the position of the target to be adjusted in the object, so that when the target to be adjusted rotates by the target angle to face the virtual camera, the object's movement is more natural. As shown in FIG. 4, when the eyes rotate 30°, the head rotates 20° and the neck rotates 10°.
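  • The preset ratio can be applied as a simple proportional scaling; the function and dictionary names below are illustrative:

```python
# Artist-preset ratio 3:2:1 for eyes : head : neck, as in the example above.
ratios = {"eye": 3, "head": 2, "neck": 1}

def distribute(target_angle_deg, ratios, driver="eye"):
    """Scale each part's rotation by ratios[part] / ratios[driver]."""
    base = ratios[driver]
    return {part: target_angle_deg * r / base for part, r in ratios.items()}

angles = distribute(30.0, ratios)
print(angles)  # → {'eye': 30.0, 'head': 20.0, 'neck': 10.0}
```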
  • the terminal controls the target to be adjusted to rotate according to the target angle, so that the rotated target to be adjusted faces the virtual camera.
  • In the embodiments of the present application, the matrix expression of the coordinates of the target to be adjusted is determined from the positions of the target to be adjusted and the virtual camera, the rotation angle of the target to be adjusted is determined from those coordinates, and the rotation angle can be adjusted by the input rotation coefficient to obtain the target angle; the target to be adjusted is then controlled to rotate by the target angle. This makes the rotation of the target to be adjusted more flexible and the exhibited effect more natural, solving the prior-art technical problem of inflexibility when adjusting the rotation angle.
  • Controlling the target to be adjusted to rotate by the target angle includes: obtaining the current rotation matrix of the target to be adjusted (like the rotation matrices M t and M i , it can be obtained through the API provided by the game engine). For the game character, the rotation consists of two parts: one is the rotation of the entire character as controlled by the player, represented at the current time by the current rotation matrix; the other is the rotation that keeps the target to be adjusted facing the virtual camera, expressed by the target matrix or by the target angle converted from it. The superposition of the two is the final rotation. The target angle is applied to the current rotation matrix to obtain the target matrix, and the rotation angle represented by the target matrix is the angle at which the target to be adjusted faces the virtual camera. Controlling the target to be adjusted to rotate to the target matrix therefore keeps it facing the virtual camera.
  • The current rotation matrix, which expresses the spatial orientation of the target to be adjusted before the rotation, may be denoted M e :
  • e 11 to e 33 are constants.
  • The target matrix M a is obtained by applying the target angles [θ x , θ y , θ z ] to M e .
  • the target to be adjusted after the rotation is directed to the virtual camera.
  • The need may refer to pointing the target to be adjusted at the virtual camera at a certain moment in the game, or to keeping the target to be adjusted facing the virtual camera throughout the game; if the target to be adjusted stops facing the virtual camera at some point, it is adjusted again according to the above method of the present application. The adjusted effect is shown in FIG. 4.
  • The animation control method provided by the embodiment of the present application further includes: setting the time required for the target to be adjusted to rotate as the rotation time; and linearly interpolating the rotation angle according to the rotation time to obtain the interpolated rotation angle, so that the interpolated rotation angle is adjusted with the rotation coefficient to obtain the target angle, wherein the interpolated rotation angle is multiplied by the rotation coefficient to obtain the target angle.
  • The embodiment of the present application does not limit when the rotation time is acquired; it only needs to be available before the rotation angle is linearly interpolated according to it.
  • The rotation angle is linearly interpolated over time: the time required for the rotation is set, and after the interpolated rotation angle is multiplied by the rotation coefficient, the time the rotation takes can be adjusted. For example, for a rotation angle of 30°, if the rotation time is 1 second, the target to be adjusted is controlled to rotate 30° within 1 second; if the rotation time is 5 seconds, it is controlled to rotate 30° within 5 seconds.
  • The linear interpolation does not change the angle of the target to be adjusted, only the time the rotation takes; setting the rotation time to 5 seconds shows the effect of the target to be adjusted turning slowly.
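  • The linear interpolation over the rotation time can be sketched as follows, assuming a fixed update step (the step size and helper name are illustrative):

```python
def rotation_increments(total_angle_deg, rotation_time_s, dt_s):
    """Split total_angle_deg into equal per-update increments over
    rotation_time_s, with one update every dt_s seconds."""
    steps = round(rotation_time_s / dt_s)
    return [total_angle_deg / steps] * steps

# 30 degrees in 1 second at 10 updates per second -> 3 degrees per update;
# 30 degrees in 5 seconds at the same rate -> 0.6 degrees per update.
fast = rotation_increments(30.0, 1.0, 0.1)
slow = rotation_increments(30.0, 5.0, 0.1)
print(len(fast), fast[0], len(slow), slow[0])  # → 10 3.0 50 0.6
```

Note that the interpolation changes only how the 30° is spread over time, not the final angle.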
  • Controlling the target to be adjusted to rotate by the target angle includes: obtaining the rotation coefficient of at least one associated target; applying each rotation coefficient to the corresponding associated target to obtain at least one rotation angle; and controlling the at least one associated target and the target to be adjusted to rotate step by step according to the target angle and the at least one rotation angle.
  • the rotation angle of the associated target is determined.
  • the target to be adjusted is the character's eyes
  • the associated targets include the character's head and neck
  • the rotation ratio of the eyes, head and neck is 3:2:1
  • When the target angle of the eyes is determined to be 30°, the head rotates 20° and the neck rotates 10°.
  • bone A represents the neck
  • bone B represents the head
  • bone C represents the eye
  • the bone hierarchy is bone A to bone B to bone C.
  • The rotation angle of bone A is calculated first and then adjusted according to the artists' input; that is, the rotation angle of bone A is adjusted by the input rotation coefficient and superimposed on the existing rotation (the current rotation angle of bone A) to complete the rotation of bone A.
  • The rotation angle of bone B is then calculated and likewise adjusted by the input rotation coefficient and superimposed on the current rotation angle of bone B to complete the rotation of bone B.
  • FIG. 5 shows the rotations of bone A, bone B, and bone C calculated separately; in this embodiment, however, once the rotation angle of bone C is calculated, the rotation angles of bone A and bone B can be obtained directly from the rotation coefficients, completing the step-by-step rotation from bone A to bone C.
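  • The step-by-step rotation down the chain can be sketched as follows, assuming each bone's share follows the 3:2:1 coefficients and a zero starting angle (names and values are illustrative):

```python
# Bone chain A (neck) -> B (head) -> C (eye), each with its rotation ratio.
chain = [("bone_A_neck", 1), ("bone_B_head", 2), ("bone_C_eye", 3)]
current_angles = {"bone_A_neck": 0.0, "bone_B_head": 0.0, "bone_C_eye": 0.0}

eye_target = 30.0  # computed target angle for the end bone (the eye)
eye_ratio = 3      # the eye's rotation coefficient

# Rotate step by step from bone A to bone C: derive each bone's angle from
# the coefficients and superimpose it on the bone's existing rotation.
for name, ratio in chain:
    current_angles[name] += eye_target * ratio / eye_ratio

print(current_angles)  # → {'bone_A_neck': 10.0, 'bone_B_head': 20.0, 'bone_C_eye': 30.0}
```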
  • the final rotation result can be seen in Figure 4.
  • The angle between the character's eyes (equivalent to bone C) and the X axis is reduced by 30°, and the angles between the head and the neck (equivalent to bone B and bone A) and the X axis are reduced by 20° and 10°, respectively.
  • the effect of the eye-tracking camera can be generated in real time.
  • The animation is smooth and natural, with no visible artificial traces, and the animation of other parts of the body is not affected; for example, the character can blink while looking at you.
  • the technical solution of the present application is equivalent to providing a solution for adjusting a part of a target object (that is, a target to be adjusted) in a virtual scene toward a virtual camera.
  • The virtual scene may be the aforementioned game scene, social scene, and so on.
  • The method according to the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, but in many cases the former is the better implementation.
  • The technical solution of the present application can essentially be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) that includes several instructions to enable a terminal device (which may be a mobile phone, computer, server, or network device, etc.) to execute the methods described in the embodiments of the present application.
  • FIG. 6 is a schematic diagram of an optional animation control device according to an embodiment of the present application. As shown in FIG. 6, the device includes:
  • the obtaining unit 602 is configured to obtain the first position of the virtual camera in the animation and the second position of the target to be adjusted in the animation;
  • the determining unit 604 is configured to determine the rotation angle of the target to be adjusted according to the first position, the second position, and the coordinate information of the target object to which the target to be adjusted belongs;
  • the adjusting unit 606 is used to adjust the rotation angle by using the rotation coefficient to obtain the target angle.
  • The rotation coefficient is used to adjust the ratio between the rotation angles of the target to be adjusted and of the associated target.
  • the related target is a target that has a linkage relationship with the target to be adjusted.
  • the control unit 608 is configured to control the target to be adjusted to rotate according to the target angle, so that the rotated target to be adjusted faces the virtual camera.
  • In the embodiments of the present application, the matrix expression of the coordinates of the target to be adjusted is determined from the positions of the target to be adjusted and the virtual camera, the rotation angle of the target to be adjusted is determined from those coordinates, and the rotation angle can be adjusted by the input rotation coefficient to obtain the target angle; the target to be adjusted is then controlled to rotate by the target angle. This makes the rotation of the target to be adjusted more flexible and the exhibited effect more natural, solving the prior-art technical problem of inflexibility when adjusting the rotation angle.
  • the determining unit 604 includes:
  • a first obtaining module, configured to obtain the first matrix of the target to be adjusted according to the vector from the second position to the first position;
  • a second obtaining module, configured to obtain a rotation matrix of the target object as a second matrix, wherein the second matrix is the matrix when the target object is controlled to rotate in a predetermined scene;
  • a third obtaining module, configured to obtain a rotation matrix of the target object in an initial state, to obtain a third matrix;
  • a cascading unit, configured to cascade the first matrix, the second matrix, and the third matrix to obtain a target matrix; and
  • a conversion unit, configured to convert the target matrix into the rotation angle.
  • the control unit 608 includes: a fourth obtaining module, configured to obtain the current rotation matrix of the target to be adjusted; a first application module, configured to apply the target angle to the current rotation matrix to obtain the target matrix, wherein the rotation angle represented by the target matrix is the angle at which the target to be adjusted faces the virtual camera; and a rotation module, configured to control the target to be adjusted to rotate to the target matrix.
  • the device further includes: a setting unit, configured to set the time required for the target to be adjusted to rotate, to obtain the rotation time; and an interpolation unit, configured to linearly interpolate the rotation angle according to the rotation time, to obtain the interpolated rotation angle, wherein the interpolated rotation angle multiplied by the rotation coefficient gives the target angle; the adjusting unit 606 is specifically configured to adjust the interpolated rotation angle by the rotation coefficient to obtain the target angle.
  • the control unit 608 includes: a fifth obtaining module, configured to obtain the rotation coefficient of at least one associated target, to obtain at least one rotation coefficient; a second application module, configured to apply the at least one rotation coefficient to the corresponding associated target among the at least one associated target, to obtain at least one rotation angle; and a control module, configured to control the at least one associated target and the target to be adjusted to rotate level by level according to the target angle and the at least one rotation angle.
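As a rough sketch, the units above can be read as a single pipeline. The function name, the 2-D camera/target representation, and the way the coefficient is applied are illustrative assumptions for exposition, not details from the patent:

```python
import math

def control_animation(camera_pos, target_pos, rotation_coefficient):
    """Pipeline of units 602-608: obtain the two positions, derive a
    rotation angle from the target-to-camera vector, scale it by the
    rotation coefficient, and return the angle the target rotates by."""
    # Determining unit: angle of the vector from the target to the camera
    dx = camera_pos[0] - target_pos[0]
    dy = camera_pos[1] - target_pos[1]
    rotation_angle = math.atan2(dy, dx)
    # Adjusting unit: scale by the rotation coefficient
    target_angle = rotation_angle * rotation_coefficient
    # Control unit would now rotate the target by target_angle
    return target_angle

# Camera directly "above" the target on the y axis, coefficient 1.
angle = control_animation((0.0, 1.0), (0.0, 0.0), 1.0)
```

In the actual method the determining step works with full 3x3 rotation matrices rather than a single planar angle; this sketch only shows the data flow between the units.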
  • an electronic device for implementing the above animation control method.
  • the electronic device includes a memory and a processor; a computer program is stored in the memory, and the processor is configured to perform the steps in any one of the foregoing method embodiments through the computer program.
  • the electronic device may be a terminal or a server, which is not specifically limited in this embodiment of the present application.
  • FIG. 7 is a schematic diagram of an optional electronic device according to an embodiment of the present application.
  • the electronic device may include: one or more (only one is shown in FIG. 7) processor 701, at least one communication bus 702, user interface 703, at least one transmission device 704, and memory 705.
  • the communication bus 702 is used to implement connection communication between these components.
  • the user interface 703 may include a display 706 and a keyboard 707.
  • the transmission device 704 may optionally include a standard wired interface and a wireless interface.
  • the above-mentioned electronic device may be located in at least one network device among multiple network devices of the computer network.
  • the foregoing processor may be configured to perform the following steps through a computer program: obtaining a first position of a virtual camera in an animation and a second position of a target to be adjusted in the animation; determining a rotation angle of the target to be adjusted according to the first position, the second position, and coordinate information of the target object to which the target belongs; adjusting the rotation angle by an input rotation coefficient to obtain a target angle; and controlling the target to be adjusted to rotate according to the target angle.
  • the structure shown in FIG. 7 is only illustrative; the electronic device may also be a terminal device such as a smartphone (for example, an Android or iOS phone), a tablet computer, a palmtop computer, a mobile Internet device (Mobile Internet Device, MID), or a PAD.
  • FIG. 7 does not limit the structure of the above electronic device.
  • the electronic device may further include more or fewer components than those shown in FIG. 7 (such as a network interface, a display device, etc.), or have a configuration different from that shown in FIG. 7.
  • the memory 705 may be used to store software programs and modules, such as the program instructions/modules corresponding to the animation control method and apparatus in the embodiments of the present application.
  • the processor 701 executes various functional applications and data processing by running the software programs and modules stored in the memory 705, that is, implements any embodiment of the animation control method described above.
  • the memory 705 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • the memory 705 may further include memories remotely set with respect to the processor 701, and these remote memories may be connected to the terminal through a network. Examples of the above network include but are not limited to the Internet, intranet, local area network, mobile communication network, and combinations thereof.
  • the transmission device 704 described above is used to receive or send data via a network.
  • Specific examples of the aforementioned network may include a wired network and a wireless network.
  • the transmission device 704 includes a network adapter (Network Interface Controller, NIC), which can be connected to other network devices and routers through a network cable to communicate with the Internet or a local area network.
  • the transmission device 704 is a radio frequency (Radio Frequency, RF) module, which is used to communicate with the Internet in a wireless manner.
  • the memory 705 is used to store the rotation matrix of the target to be adjusted.
  • In this embodiment, a matrix expression of the coordinates of the target to be adjusted is determined from the positions of the target and the virtual camera, and the rotation angle of the target is determined from those coordinates. The rotation angle can be adjusted according to the rotation coefficient to obtain the target angle, and the target is then controlled to rotate according to the target angle. This makes the rotation of the target to be adjusted more flexible and the resulting effect more natural, solving the technical problem in the prior art that adjusting the rotation angle is inflexible.
  • An embodiment of the present application further provides a storage medium in which a computer program is stored, wherein the computer program is configured to perform, when run, any implementation of the animation control method described above.
  • the above storage medium may be configured to store a computer program for performing the following steps: obtaining a first position of a virtual camera in an animation and a second position of a target to be adjusted in the animation; determining a rotation angle of the target to be adjusted according to the first position, the second position, and coordinate information of the target object to which the target belongs; adjusting the rotation angle by an input rotation coefficient to obtain a target angle; and controlling the target to be adjusted to rotate according to the target angle.
  • the storage medium is also configured to store a computer program for performing the steps included in the animation control method in the above embodiments, which are not repeated here.
  • the storage medium may include: a flash disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc.
  • an embodiment of the present application also provides a computer program product including instructions, which when run on a computer, causes the computer to execute the method provided by the foregoing embodiment.
  • If the integrated unit in the above embodiments is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in the computer-readable storage medium.
  • Based on this understanding, the technical solution of the present application, in essence or the part contributing to the existing technology, or all or part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions to enable one or more computer devices (which may be personal computers, servers, network devices, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application.
  • the disclosed client may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division into units is only a logical function division; in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, units or modules, and may be in electrical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above integrated unit may be implemented in the form of hardware or software functional unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Architecture (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An animation control method and apparatus, a storage medium, and an electronic device. The method includes: obtaining a first position of a virtual camera in an animation and a second position of a target to be adjusted in the animation (202); determining a rotation angle of the target to be adjusted according to the first position, the second position, and coordinate information of the target object to which the target belongs (204); adjusting the rotation angle by a rotation coefficient to obtain a target angle (206), where after the target rotates by the target angle it faces the virtual camera, the rotation coefficient is used to adjust the ratio between the rotation angles of the target and an associated target, and the associated target is a target that has a linkage relationship with the target to be adjusted; and controlling the target to rotate according to the target angle (208). The method solves the technical problem that animation control is inflexible.

Description

Animation control method and apparatus, storage medium, and electronic device
This application claims priority to Chinese Patent Application No. 201811481911.5, filed with the China National Intellectual Property Administration on December 5, 2018 and entitled "Animation control method and apparatus, storage medium, and electronic device", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of animation, and in particular to animation control.
Background
Most in-game animation is produced by artists in digital content creation (Digital Content Creation, DCC) software (for example, Maya or 3dsMax) and then imported into a game engine, where a state machine switches between the different animations. The advantage of this workflow is that artists can exploit the strengths of the software to produce lifelike animation. For a scene in which the eyeballs always face the camera, artists need to author animations for the key orientations, and at runtime the animation is interpolated between these key orientations.
To support real-time interaction, animation must also respond promptly to user input. Inverse kinematics (Inverse Kinematics, IK) animation, in which child bone nodes drive the motion of parent bone nodes, is therefore also widely used in game engines. IK animation derives the motion of the other bones backwards from the position of the end bone. For example, when reaching for a teacup, the position of the cup is the goal, and IK solves the animation from the fingers up to the arm. The advantage of IK animation is that it is computed in real time and requires no artist involvement. For the scene in which the eyeballs always face the camera, once the orientation of the eyeballs is specified, the IK system automatically computes the orientations of the eyeballs, head, and neck.
Summary
The embodiments of this application provide an animation control method and apparatus, a storage medium, and an electronic device, to solve at least the technical problem that animation control is inflexible.
According to one aspect of the embodiments of this application, an animation control method applied to an electronic device is provided, including: obtaining a first position of a virtual camera in an animation and a second position of a target to be adjusted in the animation; determining a rotation angle of the target to be adjusted according to the first position, the second position, and coordinate information of a target object to which the target belongs; adjusting the rotation angle by a rotation coefficient to obtain a target angle, where the rotation coefficient is used to adjust the ratio between the rotation angles of the target to be adjusted and an associated target, and the associated target is a target that has a linkage relationship with the target to be adjusted; and controlling the target to be adjusted to rotate according to the target angle, so that the rotated target faces the virtual camera.
According to another aspect of the embodiments of this application, an animation control apparatus applied to an electronic device is further provided, including: an obtaining unit, configured to obtain a first position of a virtual camera in an animation and a second position of a target to be adjusted in the animation; a determining unit, configured to determine a rotation angle of the target according to the first position, the second position, and coordinate information of the target object to which the target belongs; an adjusting unit, configured to adjust the rotation angle by a rotation coefficient to obtain a target angle, where the rotation coefficient is used to adjust the ratio between the rotation angles of the target and an associated target, and the associated target has a linkage relationship with the target; and a control unit, configured to control the target to rotate according to the target angle, so that the rotated target faces the virtual camera.
According to an aspect of the embodiments of this application, a storage medium is further provided, storing a computer program, where the computer program is configured to perform the foregoing method when run.
According to an aspect of the embodiments of this application, an electronic device is further provided, including a memory and a processor, where a computer program is stored in the memory and the processor is configured to perform the foregoing method through the computer program.
According to an aspect of the embodiments of this application, a computer program product including instructions is further provided, which, when run on a computer, causes the computer to perform the foregoing method.
In this embodiment, a matrix expression of the coordinates of the target to be adjusted is determined from the positions of the target and the virtual camera, and the rotation angle of the target is determined from those coordinates. The rotation angle can be adjusted by the input rotation coefficient to obtain the target angle, and the target is then controlled to rotate according to the target angle. This makes the rotation of the target more flexible and the resulting effect more natural, solving the technical problem in the prior art that adjusting the rotation angle is inflexible.
Brief Description of the Drawings
The accompanying drawings described here are provided for a further understanding of this application and constitute a part of it; the illustrative embodiments of this application and their descriptions are used to explain this application and do not constitute an improper limitation on it. In the drawings:
FIG. 1 is a schematic diagram of an optional hardware environment according to an embodiment of this application;
FIG. 2 is a flowchart of an optional animation control method according to an embodiment of this application;
FIG. 3 is a schematic diagram of an optional vector according to an embodiment of this application;
FIG. 4 is a schematic diagram of an optional rotation effect according to an embodiment of this application;
FIG. 5 is a schematic diagram of an optional application of rotation angles according to an embodiment of this application;
FIG. 6 is a schematic diagram of an optional animation control apparatus according to an embodiment of this application;
FIG. 7 is a schematic diagram of an optional electronic device according to an embodiment of this application.
Detailed Description
To help those skilled in the art better understand the solutions of this application, the technical solutions in the embodiments of this application are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of this application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative effort shall fall within the protection scope of this application.
It should be noted that the terms "first", "second", and so on in the specification, claims, and drawings of this application are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. Data used in this way is interchangeable where appropriate, so that the embodiments described here can be implemented in orders other than those illustrated or described. Moreover, the terms "include" and "have" and any variants of them are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to the steps or units expressly listed, but may include other steps or units that are not expressly listed or that are inherent to the process, method, product, or device.
According to one aspect of the embodiments of this application, an animation control method is provided. In this embodiment, when the technical solution of this application is applied to a local application (such as a standalone game), the animation control method can be applied to the hardware environment formed by the terminal 101 shown in FIG. 1, and the terminal 101 adjusts the orientation of the target to be adjusted by performing the animation control method of this application.
Optionally, when the technical solution of this application is applied to a networked application (such as an online game), the hardware environment may further include a server 102 that provides a business service (such as a game service) for the terminal 101. The server 102 performs the animation control method of this application and provides the execution result to the terminal 101 as part of the service, to adjust the orientation of the target to be adjusted on the terminal. The terminal 101 may connect to the server 102 through a network, including but not limited to a wide area network, a metropolitan area network, or a local area network, and the terminal 101 may be a mobile phone terminal, a PC terminal, a notebook terminal, or a tablet terminal.
The scenarios to which the technical solution of this application applies include but are not limited to the following:
The technical solution of this application can be applied in social scenes that use technologies such as augmented reality (AR) and virtual reality (VR). The social scene can be provided by the terminal 101, with the server 102 synchronizing the social behavior in the scene. The social scene includes the target object to which the target to be adjusted belongs; the target object may be the virtual character corresponding to the current user in the scene, a virtual friend of the current user, a virtual passerby in the scene, a virtual pet of the current user, and so on. While the target object moves in the social scene, the server can perform the technical solution of this application to adjust the orientation of the target to be adjusted on the target object, for example, to keep its eyes always facing the virtual camera, and deliver the adjustment result to each terminal, so that each terminal renders, in synchronization, a picture in which the eyes of the target object always face the screen (the picture played on the screen is the picture rendered from the perspective of the virtual camera).
The technical solution of this application can also be applied in game scenes, such as multiplayer online battle arena (Multiplayer Online Battle Arena, MOBA) games, first-person shooter (First-person shooter, FPS) games, and third-person shooter (Third-person shooter, TPS) games. The game application can be installed on the terminal 101 to form a game client, through which a player participates in the game. The game scene includes the target object to which the target to be adjusted belongs; the target object may be the game character controlled by the current user, a teammate of that character, an enemy of that character, and so on, and the target to be adjusted may be an organ of the character or a part of its body structure, such as the eyes. When a player participates in the game, if the game is a standalone game or a multiplayer locally connected game, one of the terminals can perform the technical solution of this application to adjust the orientation of the target to be adjusted on the target object and synchronize the execution result to the other terminals; if the game is a networked game, the server 102 can perform the technical solution of this application to adjust the orientation of the target to be adjusted and synchronize the execution result to the terminals participating in the game.
The following takes the case in which the terminal 101 performs the animation control method of the embodiments of this application as an example. The method may also be performed by a client installed on the terminal (such as the client of a game application), for example, by embedding the animation control method of this application in the game engine of the game application as part of the engine's own logic, or by embedding it in the game logic of the game application as part of the game logic. FIG. 2 is a flowchart of an optional animation control method according to an embodiment of this application. As shown in FIG. 2, the animation control method includes:
S202: The terminal obtains a first position of a virtual camera in the animation and a second position of a target to be adjusted in the animation.
Multiple objects can be represented in the animation, and the target to be adjusted can be part of an object. The object may be a character, and the target to be adjusted may be an organ of the character or a part of its body structure, such as the eyes. The animation may be a game animation, and the target to be adjusted may be the eyes of a game character. The first position is the position of the virtual camera in the animation, and the second position is the position of the target to be adjusted in the animation. The game scene and the characters in it can be rendered by the game engine. When obtaining the first position and the second position, the second position of the rendered target to be adjusted can be obtained from the game engine, and the first position of the virtual camera can be obtained using the game engine's configuration file (which configures the state and movement logic of each virtual camera). Both positions can be expressed as three-dimensional coordinates, and the three-dimensional coordinates can be expressed as matrices.
S204: The terminal determines the rotation angle of the target to be adjusted according to the first position, the second position, and the coordinate information of the target object to which the target belongs.
Obtain the vector from the second position to the first position (a vector in the three-dimensional coordinate system), as shown in FIG. 3. Multiplying this vector by the matrix [0 0 1] gives the x-axis coordinate of the target to be adjusted; multiplying it by [0 1 0] gives the y-axis coordinate; multiplying it by [1 0 0] gives the z-axis coordinate. This yields the matrix expression of the target's coordinates in three-dimensional space, the rotation matrix M_r (the first matrix). To cancel out the rotation of the character controlled by the player during the game, and the fact that in the initial state the character is not aligned with the X axis, obtain the rotation matrix M_t of the character when the player controls it to rotate (the second matrix) and the rotation matrix M_i of the character in the initial state (the third matrix), and cascade the rotation matrices M_r, M_t, and M_i to obtain the target matrix M of the target to be adjusted:
M = M_r · M_t · M_i
where the target object to which the target to be adjusted belongs is the game character, and the coordinate information of the target object includes the rotation matrices M_t and M_i, both of which can be obtained through the game engine.
Optionally, M_i, M_t, and M_r can all be three-by-three matrices. The rotation matrix M_t represents the rotation of the game character itself corresponding to the player's control operation (that is, the transformation matrix from the state before the control operation to the state after it); M_t and the initial-state rotation matrix M_i can be obtained through the application programming interface (API) provided by the game engine:
M_i = | i11 i12 i13 |
      | i21 i22 i23 |
      | i31 i32 i33 |
where i11 to i33 are constants;
M_t = | t11 t12 t13 |
      | t21 t22 t23 |
      | t31 t32 t33 |
where t11 to t33 are constants;
M_r = | r11 r12 r13 |
      | r21 r22 r23 |
      | r31 r32 r33 |
where r11 to r33 are values computed from the vector from the second position to the first position.
M = | R11 R12 R13 |
    | R21 R22 R23 |
    | R31 R32 R33 |
The values R11 to R33 can be computed directly from the element values of M_i, M_t, and M_r.
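A minimal sketch of the cascade step, assuming (as one reading of "cascade" in the listed order) that cascading the three rotation matrices means multiplying them; the exact multiplication order used by the engine is not spelled out in the text:

```python
import numpy as np

def cascade(m_r, m_t, m_i):
    """Cascade the look-at matrix M_r with the player-control matrix M_t
    and the initial-state matrix M_i to obtain the target matrix M."""
    return m_r @ m_t @ m_i

# With M_t and M_i equal to the identity (no player rotation, character
# already aligned with the X axis), the target matrix is simply M_r.
m_r = np.array([[0.0, -1.0, 0.0],
                [1.0,  0.0, 0.0],
                [0.0,  0.0, 1.0]])  # a 90-degree rotation about the Z axis
identity = np.eye(3)
m = cascade(m_r, identity, identity)
```

Each factor is a 3x3 rotation matrix obtained as described above, so the product is again a 3x3 rotation matrix that can be converted to Euler angles in the next step.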
That is, determining the rotation angle of the target to be adjusted according to the first position, the second position, and the coordinate information of the target object includes: obtaining the first matrix of the target according to the vector from the second position to the first position; obtaining the rotation matrix of the target object as the second matrix, where the second matrix is the matrix when the target object is controlled to rotate in a predetermined scene; obtaining the rotation matrix of the target object in the initial state as the third matrix; cascading the first matrix, the second matrix, and the third matrix to obtain the target matrix; and converting the target matrix into the rotation angle.
Here, the first matrix is the rotation matrix M_r, the second matrix is the rotation matrix M_t, and the third matrix is the rotation matrix M_i. After the target matrix is obtained, because the target matrix does not directly show what the rotation angle is, to make it easier to adjust the rotation angle of the target, the target matrix can be converted into the rotation angle using the following formulas:
ω_x = atan2(R32, R33);
ω_y = atan2(−R31, sqrt(R32² + R33²));
ω_z = atan2(R21, R11).
The rotation angle may include the angle ω_x about the X axis, the angle ω_y about the Y axis, and the angle ω_z about the Z axis.
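The three formulas above can be sketched directly in Python. `R` is assumed to be a proper 3x3 rotation matrix, and the middle formula follows the standard X-Y-Z Euler-angle extraction that matches the first and third formulas:

```python
import math

def matrix_to_euler(R):
    """Convert a 3x3 rotation matrix (row-major nested lists) into the
    Euler angles (wx, wy, wz) used in the text, in radians."""
    wx = math.atan2(R[2][1], R[2][2])              # atan2(R32, R33)
    wy = math.atan2(-R[2][0],
                    math.hypot(R[2][1], R[2][2]))  # atan2(-R31, sqrt(R32^2 + R33^2))
    wz = math.atan2(R[1][0], R[0][0])              # atan2(R21, R11)
    return wx, wy, wz

# A 30-degree rotation about the Z axis should give wz = 30 degrees
# and wx = wy = 0.
a = math.radians(30)
Rz = [[math.cos(a), -math.sin(a), 0.0],
      [math.sin(a),  math.cos(a), 0.0],
      [0.0,          0.0,         1.0]]
wx, wy, wz = matrix_to_euler(Rz)
```

Using `math.hypot` for the square root of the sum of squares avoids overflow and keeps the result stable near the gimbal-lock configuration (wy close to ±90 degrees).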
S206: The terminal adjusts the rotation angle by the rotation coefficient to obtain the target angle.
After the target to be adjusted rotates by the target angle, it faces the virtual camera. Because the target angle represents exactly the angular difference between the orientation before adjustment and the orientation facing the virtual camera, the target adjusted by the target angle faces the virtual camera. The rotation coefficient is used to adjust the ratio between the rotation angles of the target to be adjusted and the associated target, where the associated target is a target that has a linkage relationship with the target to be adjusted. In one optional manner, the rotation coefficient is preset by the animation artist.
For a character, rotating one part may cause linked movement of other parts; alternatively, the rotation looks natural only if related parts are rotated at the same time. Therefore, when rotating the target to be adjusted, the rotation angles of the other associated targets must also be considered; that is, a ratio of rotation angles (the aforementioned rotation coefficients) is set for the target to be adjusted and the associated targets.
For example, the target to be adjusted is the character's eyes, the associated targets include the character's head and neck, and the rotation ratio of eyes:head:neck is 3:2:1. Substituting the element values of M_i, M_t, and M_r gives M; then, when the target angle of the eyes is determined to be a 30° decrease in the angle to the X axis, the head rotates (that is, its angle to the X axis decreases) by 20° and the neck rotates (that is, its angle to the X axis decreases) by 10°. In this example, the rotation coefficient of the target to be rotated can be 3; its ratio to the head is 3:2 and its ratio to the neck is 3:1. The rotation coefficients here can be set according to the position of the target to be adjusted within the object, so that after the target rotates by the target angle it faces the virtual camera and the object's motion looks natural; as shown in FIG. 4, the eyes rotate by 30°, the head by 20°, and the neck by 10°.
It should be noted that rotation about the Y axis and the Z axis is similar to rotation about the X axis.
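The eyes/head/neck example can be sketched as follows. The helper and the way the ratio is normalized against the coefficient of the target to be adjusted are illustrative assumptions, not details from the patent:

```python
def distribute_angles(target_angle, coefficients):
    """Scale the target angle of the part to be adjusted down to each
    associated part using the rotation-coefficient ratio, assuming the
    first coefficient belongs to the target to be adjusted."""
    base = coefficients[0]
    return [target_angle * c / base for c in coefficients]

# Eyes rotate 30 degrees; head and neck follow in the ratio 3:2:1.
eye, head, neck = distribute_angles(30.0, [3, 2, 1])
```

With a target angle of 30 degrees and coefficients 3:2:1, the eyes, head, and neck receive 30, 20, and 10 degrees respectively, matching the worked example in the text.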
S208: The terminal controls the target to be adjusted to rotate according to the target angle, so that the rotated target faces the virtual camera.
In this embodiment, a matrix expression of the coordinates of the target to be adjusted is determined from the positions of the target and the virtual camera, and the rotation angle of the target is determined from those coordinates. The rotation angle can be adjusted by the input rotation coefficient to obtain the target angle, and the target is then controlled to rotate according to the target angle. This makes the rotation of the target more flexible and the resulting effect more natural, solving the technical problem in the prior art that adjusting the rotation angle is inflexible.
Optionally, controlling the target to be adjusted to rotate according to the target angle includes: obtaining the current rotation matrix of the target to be adjusted (similar to the rotation matrices M_t and M_i, it can be obtained through the API provided by the game engine). For a game character, its rotation consists of two parts: one is the rotation of the whole character under the control of the player and the like, which the current rotation matrix represents at the current moment; the other is the rotation made so that the target to be adjusted always faces the virtual camera, represented by the target matrix or the target angle converted from it. The superposition of the two amounts to the final rotation. Applying the target angle to the current rotation matrix yields the target matrix, where the rotation angle represented by the target matrix is the angle at which the target faces the virtual camera. At this point, controlling the target to rotate to the target matrix keeps it always facing the virtual camera.
As an optional example, the current rotation matrix represents the spatial position of the target to be adjusted before rotation and can be denoted M_e:
M_e = | e11 e12 e13 |
      | e21 e22 e23 |
      | e31 e32 e33 |
where e11 to e33 are constants.
M_a = [ω_x, ω_y, ω_z].
Multiply the current rotation matrix by the matrix M_a representing the target angle, so that the rotation of the target to be adjusted is completed based on the product of M_e and M_a. When the eyes need to face the virtual camera, the rotated target faces the virtual camera. The "need" here may mean that at some moment in the game the target should face the virtual camera, or that the target should face the virtual camera throughout the game; if at any point the target is not facing the virtual camera, it is adjusted by the method above, with the adjusted result shown in FIG. 4.
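A sketch of applying the target angle on top of the current rotation: the Euler angles are first turned back into a rotation matrix, which is then multiplied with M_e. The multiplication order and the X-Y-Z axis convention are assumptions; the text only states that the two rotations are superimposed through a product:

```python
import numpy as np

def euler_to_matrix(wx, wy, wz):
    """Build a rotation matrix from Euler angles (X-Y-Z convention assumed)."""
    cx, sx = np.cos(wx), np.sin(wx)
    cy, sy = np.cos(wy), np.sin(wy)
    cz, sz = np.cos(wz), np.sin(wz)
    rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return rz @ ry @ rx

def apply_target_angle(m_e, target_angle):
    """Superimpose the target angle (wx, wy, wz) on the current rotation M_e."""
    return euler_to_matrix(*target_angle) @ m_e

# If the target is currently unrotated, the result is just the new rotation:
# a 90-degree rotation about the Z axis.
m = apply_target_angle(np.eye(3), (0.0, 0.0, np.pi / 2))
```

In a real engine the current rotation matrix would come from the engine API rather than being the identity; the identity case simply makes the composed result easy to check.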
Optionally, the animation control method provided in the embodiments of this application further includes: setting the time required for the target to be adjusted to rotate, as the rotation time; and linearly interpolating the rotation angle according to the rotation time, to obtain the interpolated rotation angle, so that the interpolated rotation angle can be adjusted by the rotation coefficient to obtain the target angle, where the interpolated rotation angle multiplied by the rotation coefficient gives the target angle.
It should be noted that the embodiments of this application do not limit when the rotation time is obtained; it only needs to be available before the linear interpolation of the rotation angle according to the rotation time is performed.
To make the rotation look natural rather than abrupt, and to prevent the target from snapping to the target angle instantly, before the rotation coefficient is applied to adjust the rotation angle, the rotation angle is linearly interpolated over time; that is, the time required for the rotation is set, and the interpolated rotation angle multiplied by the rotation coefficient adjusts the time the rotation takes. For example, with a rotation angle of 30°, if the rotation time is 1 second, the target is controlled to rotate 30° in 1 second; if the rotation time is 5 seconds, the target is controlled to rotate 30° within 5 seconds. Linear interpolation does not change the angle through which the target rotates, only the time the rotation takes. Setting the rotation time to 5 seconds produces the effect of the target rotating slowly.
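The linear interpolation over the rotation time can be sketched as a per-update helper. The exact formula and the clamping at the endpoints are illustrative assumptions; the text only requires that the full angle is reached at the end of the rotation time rather than instantly:

```python
def interpolated_angle(total_angle, rotation_time, elapsed):
    """Linearly interpolate the rotation angle over the rotation time:
    the full angle is reached at t = rotation_time, not instantly."""
    t = min(max(elapsed / rotation_time, 0.0), 1.0)  # clamp to [0, 1]
    return total_angle * t

# Rotating 30 degrees over 5 seconds: halfway through, 15 degrees.
angle = interpolated_angle(30.0, 5.0, 2.5)
```

Called once per frame with the elapsed time, this yields the 1-second and 5-second behaviors described above while leaving the total angle unchanged.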
Optionally, controlling the target to be adjusted to rotate according to the target angle includes: obtaining the rotation coefficient of at least one associated target, as at least one rotation coefficient; applying the at least one rotation coefficient to the corresponding associated target among the at least one associated target, to obtain at least one rotation angle; and controlling the at least one associated target and the target to be adjusted to rotate level by level according to the target angle and the at least one rotation angle.
Once the rotation coefficient of an associated target is determined, its rotation angle is determined. For example, the target to be adjusted is the character's eyes, the associated targets include the character's head and neck, and the rotation ratio of eyes:head:neck is 3:2:1; then, when the target angle of the eyes is determined to be 30°, the head rotates 20° and the neck rotates 10°. After the rotation angle of each part is determined, the rotation proceeds level by level along the bone hierarchy, that is, in the order from the neck to the head to the eyes.
As shown in FIG. 5, bone A represents the neck, bone B the head, and bone C the eyes, and the bone hierarchy runs from bone A to bone B to bone C. First compute the rotation angle of bone A, then adjust the rotation according to the artist's input, that is, adjust bone A's rotation angle by the input rotation coefficient and superimpose it on the existing rotation (bone A's current rotation angle), completing bone A's rotation. Then do the same for bone B and for bone C: compute the bone's rotation angle, adjust it by the input rotation coefficient, and superimpose it on that bone's current rotation angle, completing the bone's rotation.
Although FIG. 5 shows the rotations of bones A, B, and C being computed separately, in this embodiment, after the rotation angle of bone C is computed, the rotation angles of bones A and B can be obtained directly from the rotation coefficients, and the level-by-level rotation from bone A to bone C is then completed. The final result can be seen in FIG. 4, where the angle between the character's eyes (bone C) and the X axis decreases by 30°, and the angles between the head and neck (bones B and A) and the X axis decrease by 20° and 10° respectively.
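The level-by-level rotation from bone A to bone C can be sketched as follows. The bone names, the additive superposition on each bone's current angle, and the assumption that the target to be adjusted carries the largest coefficient are illustrative, not details from the patent:

```python
def rotate_hierarchy(current_angles, target_angle, coefficients):
    """Rotate bones level by level (neck -> head -> eyes), superimposing
    each bone's coefficient-scaled share on its current angle; the target
    to be adjusted is assumed to have the largest coefficient."""
    base = max(coefficients.values())
    result = {}
    for bone, current in current_angles.items():
        share = target_angle * coefficients[bone] / base
        result[bone] = current + share  # superimpose on the existing rotation
    return result

# Eyes should end 30 degrees further on; head and neck follow in ratio 3:2:1.
angles = rotate_hierarchy({"neck": 0.0, "head": 0.0, "eyes": 0.0},
                          30.0, {"neck": 1, "head": 2, "eyes": 3})
```

Starting from zeroed bones, this reproduces the FIG. 4 result: the neck gains 10 degrees, the head 20, and the eyes 30.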
This embodiment can produce the effect of the eyeballs tracking the camera in real time. The animation is smooth and natural, with no visible artificial traces, and it does not affect the animation of other parts of the body; for example, the character can blink while looking at you. As described above, the technical solution of this application amounts to a scheme for adjusting a certain part of a target object in a virtual scene (the target to be adjusted) to face the virtual camera, where the virtual scene may be the aforementioned game scene, social scene, and so on.
It should be noted that, for brevity of description, the foregoing method embodiments are all expressed as series of action combinations, but those skilled in the art should know that this application is not limited by the described order of actions, because according to this application some steps may be performed in other orders or simultaneously. In addition, those skilled in the art should also know that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily required by this application.
From the description of the foregoing implementations, those skilled in the art can clearly understand that the method according to the foregoing embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of this application, in essence or the part contributing to the prior art, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, magnetic disk, or optical disc) and includes several instructions to enable a terminal device (which may be a mobile phone, computer, server, network device, or the like) to perform the methods described in the embodiments of this application.
According to another aspect of the embodiments of this application, an animation control apparatus for implementing the foregoing animation control method is further provided. FIG. 6 is a schematic diagram of an optional animation control apparatus according to an embodiment of this application. As shown in FIG. 6, the apparatus includes:
an obtaining unit 602, configured to obtain a first position of a virtual camera in an animation and a second position of a target to be adjusted in the animation;
a determining unit 604, configured to determine a rotation angle of the target to be adjusted according to the first position, the second position, and coordinate information of the target object to which the target belongs;
an adjusting unit 606, configured to adjust the rotation angle by a rotation coefficient to obtain a target angle, where the rotation coefficient is used to adjust the ratio between the rotation angles of the target to be adjusted and an associated target, and the associated target is a target that has a linkage relationship with the target to be adjusted; and
a control unit 608, configured to control the target to be adjusted to rotate according to the target angle, so that the rotated target faces the virtual camera.
In this embodiment, a matrix expression of the coordinates of the target to be adjusted is determined from the positions of the target and the virtual camera, and the rotation angle of the target is determined from those coordinates. The rotation angle can be adjusted by the input rotation coefficient to obtain the target angle, and the target is then controlled to rotate according to the target angle. This makes the rotation of the target more flexible and the resulting effect more natural, solving the technical problem in the prior art that adjusting the rotation angle is inflexible.
Optionally, the determining unit 604 includes:
a first obtaining module, configured to obtain the first matrix of the target to be adjusted according to the vector from the second position to the first position;
a second obtaining module, configured to obtain the rotation matrix of the target object as the second matrix, where the second matrix is the matrix when the target object is controlled to rotate in a predetermined scene;
a third obtaining module, configured to obtain the rotation matrix of the target object in the initial state, to obtain the third matrix;
a cascading unit, configured to cascade the first matrix, the second matrix, and the third matrix to obtain the target matrix; and
a conversion unit, configured to convert the target matrix into the rotation angle.
Optionally, the control unit 608 includes: a fourth obtaining module, configured to obtain the current rotation matrix of the target to be adjusted; a first application module, configured to apply the target angle to the current rotation matrix to obtain the target matrix, where the rotation angle represented by the target matrix is the angle at which the target faces the virtual camera; and a rotation module, configured to control the target to rotate to the target matrix.
Optionally, the apparatus further includes: a setting unit, configured to set the time required for the target to be adjusted to rotate, to obtain the rotation time; and an interpolation unit, configured to linearly interpolate the rotation angle according to the rotation time, to obtain the interpolated rotation angle, where the interpolated rotation angle multiplied by the rotation coefficient gives the target angle; the adjusting unit 606 is specifically configured to adjust the interpolated rotation angle by the rotation coefficient to obtain the target angle.
Optionally, the control unit 608 includes: a fifth obtaining module, configured to obtain the rotation coefficient of at least one associated target, to obtain at least one rotation coefficient; a second application module, configured to apply the at least one rotation coefficient to the corresponding associated target among the at least one associated target, to obtain at least one rotation angle; and a control module, configured to control the at least one associated target and the target to be adjusted to rotate level by level according to the target angle and the at least one rotation angle.
It should be noted that, for the technical details of the animation control apparatus provided in the embodiments of this application, refer to the foregoing method embodiments; for brevity, they are not repeated here.
According to yet another aspect of the embodiments of this application, an electronic device for implementing the foregoing animation control method is further provided. As shown in FIG. 7, the electronic device includes a memory and a processor; a computer program is stored in the memory, and the processor is configured to perform the steps in any one of the foregoing method embodiments through the computer program.
It should be noted that the electronic device may be a terminal or a server, which is not specifically limited in the embodiments of this application.
Optionally, FIG. 7 is a schematic diagram of an optional electronic device according to an embodiment of this application. As shown in FIG. 7, the electronic device may include: one or more processors 701 (only one is shown in FIG. 7), at least one communication bus 702, a user interface 703, at least one transmission device 704, and a memory 705. The communication bus 702 is used to implement connection and communication between these components. The user interface 703 may include a display 706 and a keyboard 707. The transmission device 704 may optionally include a standard wired interface and a wireless interface.
Optionally, in this embodiment, the electronic device may be located in at least one of multiple network devices of a computer network.
Optionally, in this embodiment, the processor may be configured to perform the following steps through the computer program:
obtaining a first position of a virtual camera in an animation and a second position of a target to be adjusted in the animation;
determining a rotation angle of the target to be adjusted according to the first position, the second position, and coordinate information of the target object to which the target belongs;
adjusting the rotation angle by an input rotation coefficient to obtain a target angle; and
controlling the target to be adjusted to rotate according to the target angle.
Optionally, a person of ordinary skill in the art can understand that the structure shown in FIG. 7 is only illustrative; the electronic device may also be a terminal device such as a smartphone (for example, an Android or iOS phone), a tablet computer, a palmtop computer, a mobile Internet device (Mobile Internet Device, MID), or a PAD. FIG. 7 does not limit the structure of the electronic device. For example, the electronic device may further include more or fewer components than shown in FIG. 7 (such as a network interface or display device), or have a configuration different from that shown in FIG. 7.
The memory 705 can be used to store software programs and modules, such as the program instructions/modules corresponding to the animation control method and apparatus in the embodiments of this application. By running the software programs and modules stored in the memory 705, the processor 701 executes various functional applications and data processing, that is, implements any embodiment of the animation control method described above. The memory 705 may include a high-speed random access memory, and may further include a non-volatile memory, such as one or more magnetic storage devices, flash memories, or other non-volatile solid-state memories. In some examples, the memory 705 may further include memories remotely located relative to the processor 701, and these remote memories may be connected to the terminal through a network. Examples of the network include but are not limited to the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The transmission device 704 is used to receive or send data via a network. Specific examples of the network may include wired networks and wireless networks. In one example, the transmission device 704 includes a network interface controller (Network Interface Controller, NIC), which can be connected to other network devices and routers through a network cable to communicate with the Internet or a local area network. In one example, the transmission device 704 is a radio frequency (Radio Frequency, RF) module, which is used to communicate with the Internet wirelessly.
Specifically, the memory 705 is used to store the rotation matrix of the target to be adjusted.
In this embodiment, a matrix expression of the coordinates of the target to be adjusted is determined from the positions of the target and the virtual camera, and the rotation angle of the target is determined from those coordinates. The rotation angle can be adjusted according to the rotation coefficient to obtain the target angle, and the target is then controlled to rotate according to the target angle. This makes the rotation of the target more flexible and the resulting effect more natural, solving the technical problem in the prior art that adjusting the rotation angle is inflexible.
An embodiment of this application further provides a storage medium storing a computer program, where the computer program is configured to perform, when run, any implementation of the animation control method described above.
Optionally, in this embodiment, the storage medium may be configured to store a computer program for performing the following steps:
obtaining a first position of a virtual camera in an animation and a second position of a target to be adjusted in the animation;
determining a rotation angle of the target to be adjusted according to the first position, the second position, and coordinate information of the target object to which the target belongs;
adjusting the rotation angle by an input rotation coefficient to obtain a target angle; and
controlling the target to be adjusted to rotate according to the target angle.
Optionally, the storage medium is further configured to store a computer program for performing the steps included in the animation control method in the foregoing embodiments, which are not repeated here.
Optionally, in this embodiment, a person of ordinary skill in the art can understand that all or some of the steps of the methods of the foregoing embodiments can be completed by a program instructing hardware related to a terminal device. The program can be stored in a computer-readable storage medium, and the storage medium may include: a flash disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, an optical disc, and the like.
Optionally, an embodiment of this application further provides a computer program product including instructions which, when run on a computer, cause the computer to perform the method provided in the foregoing embodiments.
If the integrated unit in the foregoing embodiments is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in the foregoing computer-readable storage medium. Based on this understanding, the technical solution of this application, in essence or the part contributing to the prior art, or all or part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions to enable one or more computer devices (which may be personal computers, servers, network devices, or the like) to perform all or some of the steps of the methods described in the embodiments of this application.
In the foregoing embodiments of this application, the description of each embodiment has its own emphasis; for a part not described in detail in one embodiment, refer to the related descriptions of other embodiments.
In the several embodiments provided in this application, it should be understood that the disclosed client can be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division into units is merely a logical functional division, and in actual implementation there may be other divisions: multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, or each unit may physically exist alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The above are only preferred implementations of this application. It should be noted that a person of ordinary skill in the art can make several improvements and refinements without departing from the principles of this application, and these improvements and refinements shall also be regarded as falling within the protection scope of this application.

Claims (13)

  1. An animation control method, applied to an electronic device, comprising:
    obtaining a first position of a virtual camera in an animation and a second position of a target to be adjusted in the animation;
    determining a rotation angle of the target to be adjusted according to the first position, the second position, and coordinate information of a target object to which the target to be adjusted belongs;
    adjusting the rotation angle by a rotation coefficient to obtain a target angle, the rotation coefficient being used to adjust the ratio between the rotation angles of the target to be adjusted and an associated target, the associated target being a target that has a linkage relationship with the target to be adjusted; and
    controlling the target to be adjusted to rotate according to the target angle, so that the rotated target to be adjusted faces the virtual camera.
  2. The method according to claim 1, wherein the determining the rotation angle of the target to be adjusted according to the first position, the second position, and the coordinate information of the target object to which the target to be adjusted belongs comprises:
    obtaining a first matrix of the target to be adjusted according to a vector from the second position to the first position;
    obtaining a rotation matrix of the target object as a second matrix, wherein the second matrix is the matrix when the target object is controlled to rotate in a predetermined scene;
    obtaining a rotation matrix of the target object in an initial state as a third matrix;
    cascading the first matrix, the second matrix, and the third matrix to obtain a target matrix; and
    converting the target matrix into the rotation angle.
  3. The method according to claim 1, wherein the controlling the target to be adjusted to rotate according to the target angle comprises:
    obtaining a current rotation matrix of the target to be adjusted;
    applying the target angle to the current rotation matrix to obtain a target matrix, wherein the rotation angle represented by the target matrix is the angle at which the target to be adjusted faces the virtual camera; and
    controlling the target to be adjusted to rotate to the target matrix.
  4. The method according to claim 1, further comprising:
    setting a time required for the target to be adjusted to rotate, as a rotation time; and
    linearly interpolating the rotation angle according to the rotation time, to obtain an interpolated rotation angle, wherein the interpolated rotation angle multiplied by the rotation coefficient gives the target angle;
    wherein the adjusting the rotation angle by a rotation coefficient to obtain a target angle comprises:
    adjusting the interpolated rotation angle by the rotation coefficient to obtain the target angle.
  5. The method according to claim 1, wherein the controlling the target to be adjusted to rotate according to the target angle comprises:
    obtaining a rotation coefficient of at least one associated target, as at least one rotation coefficient;
    applying the at least one rotation coefficient to the corresponding associated target among the at least one associated target, to obtain at least one rotation angle; and
    controlling the at least one associated target and the target to be adjusted to rotate level by level according to the target angle and the at least one rotation angle.
  6. An animation control apparatus, comprising:
    an obtaining unit, configured to obtain a first position of a virtual camera in an animation and a second position of a target to be adjusted in the animation;
    a determining unit, configured to determine a rotation angle of the target to be adjusted according to the first position, the second position, and coordinate information of a target object to which the target to be adjusted belongs;
    an adjusting unit, configured to adjust the rotation angle by a rotation coefficient to obtain a target angle, the rotation coefficient being used to adjust the ratio between the rotation angles of the target to be adjusted and an associated target, the associated target being a target that has a linkage relationship with the target to be adjusted; and
    a control unit, configured to control the target to be adjusted to rotate according to the target angle, so that the rotated target to be adjusted faces the virtual camera.
  7. The apparatus according to claim 6, wherein the determining unit comprises:
    a first obtaining module, configured to obtain a first matrix of the target to be adjusted according to a vector from the second position to the first position;
    a second obtaining module, configured to obtain a rotation matrix of the target object as a second matrix, wherein the second matrix is the matrix when the target object is controlled to rotate in a predetermined scene;
    a third obtaining module, configured to obtain a rotation matrix of the target object in an initial state, to obtain a third matrix;
    a cascading unit, configured to cascade the first matrix, the second matrix, and the third matrix to obtain a target matrix; and
    a conversion unit, configured to convert the target matrix into the rotation angle.
  8. The apparatus according to claim 6, wherein the control unit comprises:
    a fourth obtaining module, configured to obtain a current rotation matrix of the target to be adjusted;
    a first application module, configured to apply the target angle to the current rotation matrix to obtain a target matrix, wherein the rotation angle represented by the target matrix is the angle at which the target to be adjusted faces the virtual camera; and
    a rotation module, configured to control the target to be adjusted to rotate to the target matrix.
  9. The apparatus according to claim 6, further comprising:
    a setting unit, configured to set a time required for the target to be adjusted to rotate, as a rotation time; and
    an interpolation unit, configured to linearly interpolate the rotation angle according to the rotation time, to obtain an interpolated rotation angle, wherein the interpolated rotation angle multiplied by the rotation coefficient gives the target angle;
    wherein the adjusting unit is specifically configured to adjust the interpolated rotation angle by the rotation coefficient to obtain the target angle.
  10. The apparatus according to claim 6, wherein the control unit comprises:
    a fifth obtaining module, configured to obtain a rotation coefficient of at least one associated target, as at least one rotation coefficient;
    a second application module, configured to apply the at least one rotation coefficient to the corresponding associated target among the at least one associated target, to obtain at least one rotation angle; and
    a control module, configured to control the at least one associated target and the target to be adjusted to rotate level by level according to the target angle and the at least one rotation angle.
  11. A storage medium, storing a computer program, wherein the computer program is configured to perform, when run, the animation control method according to any one of claims 1 to 5.
  12. An electronic device, comprising a memory and a processor, wherein a computer program is stored in the memory, and the processor is configured to perform, through the computer program, the animation control method according to any one of claims 1 to 5.
  13. A computer program product comprising instructions which, when run on a computer, cause the computer to perform the animation control method according to any one of claims 1 to 5.
PCT/CN2019/114251 2018-12-05 2019-10-30 动画的控制方法、装置、存储介质和电子装置 WO2020114154A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19893340.0A EP3832604A4 (en) 2018-12-05 2019-10-30 ANIMATION CONTROL PROCESS AND DEVICE, STORAGE MEDIA AND ELECTRONIC DEVICE
US17/193,525 US11783523B2 (en) 2018-12-05 2021-03-05 Animation control method and apparatus, storage medium, and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811481911.5 2018-12-05
CN201811481911.5A CN110163938B (zh) 2018-12-05 2018-12-05 动画的控制方法、装置、存储介质和电子装置

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/193,525 Continuation US11783523B2 (en) 2018-12-05 2021-03-05 Animation control method and apparatus, storage medium, and electronic device

Publications (1)

Publication Number Publication Date
WO2020114154A1 true WO2020114154A1 (zh) 2020-06-11

Family

ID=67645246

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/114251 WO2020114154A1 (zh) 2018-12-05 2019-10-30 动画的控制方法、装置、存储介质和电子装置

Country Status (4)

Country Link
US (1) US11783523B2 (zh)
EP (1) EP3832604A4 (zh)
CN (1) CN110163938B (zh)
WO (1) WO2020114154A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113198179A (zh) * 2021-05-10 2021-08-03 网易(杭州)网络有限公司 虚拟对象的转向控制方法及装置、存储介质、电子设备

Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
CN110163938B (zh) * 2018-12-05 2023-04-25 腾讯科技(深圳)有限公司 动画的控制方法、装置、存储介质和电子装置
CN111632372A (zh) * 2020-06-03 2020-09-08 深圳市瑞立视多媒体科技有限公司 虚拟对象的控制方法、装置、设备及存储介质
CN112156463B (zh) * 2020-10-22 2023-04-07 腾讯科技(深圳)有限公司 角色展示方法、装置、设备及介质
CN113730905A (zh) * 2021-09-03 2021-12-03 北京房江湖科技有限公司 一种在虚拟空间中实现自由游走的方法及其装置

Citations (6)

Publication number Priority date Publication date Assignee Title
CN105488834A (zh) * 2015-12-01 2016-04-13 网易(杭州)网络有限公司 角色面部朝向的调整方法及装置
CN106042005A (zh) * 2016-06-01 2016-10-26 山东科技大学 仿生眼定位追踪系统及其工作方法
CN106981099A (zh) * 2017-03-27 2017-07-25 厦门幻世网络科技有限公司 用于操作三维动画角色的方法和装置
CN108126343A (zh) * 2017-12-20 2018-06-08 网易(杭州)网络有限公司 游戏角色的视线调整方法、装置、处理器和终端
JP2018099198A (ja) * 2016-12-19 2018-06-28 株式会社三共 遊技機
CN110163938A (zh) * 2018-12-05 2019-08-23 腾讯科技(深圳)有限公司 动画的控制方法、装置、存储介质和电子装置

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
JP5300777B2 (ja) * 2010-03-31 2013-09-25 株式会社バンダイナムコゲームス プログラム及び画像生成システム
US9791897B2 (en) * 2012-06-29 2017-10-17 Monkeymedia, Inc. Handheld display device for navigating a virtual environment
CN102982557B (zh) * 2012-11-06 2015-03-25 桂林电子科技大学 基于深度相机的空间手势姿态指令处理方法
KR102084253B1 (ko) * 2013-11-20 2020-03-03 한국전자통신연구원 복원조각과 볼륨형 표면을 이용하는 카메라 트래킹 장치 및 방법
JP6539253B2 (ja) * 2016-12-06 2019-07-03 キヤノン株式会社 情報処理装置、その制御方法、およびプログラム
CN107463256A (zh) 2017-08-01 2017-12-12 网易(杭州)网络有限公司 基于虚拟现实的用户朝向控制方法与装置
CN107517372B (zh) * 2017-08-17 2022-07-26 腾讯科技(深圳)有限公司 一种vr内容拍摄方法、相关设备及系统
EP3675740B1 (en) * 2017-08-28 2022-06-15 Shanghai United Imaging Healthcare Co., Ltd. Systems and methods for determining rotation angles
CN108635857B (zh) * 2018-05-18 2022-04-22 腾讯科技(深圳)有限公司 界面显示方法、装置、电子装置及计算机可读存储介质

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
CN105488834A (zh) * 2015-12-01 2016-04-13 网易(杭州)网络有限公司 角色面部朝向的调整方法及装置
CN106042005A (zh) * 2016-06-01 2016-10-26 山东科技大学 仿生眼定位追踪系统及其工作方法
JP2018099198A (ja) * 2016-12-19 2018-06-28 株式会社三共 遊技機
CN106981099A (zh) * 2017-03-27 2017-07-25 厦门幻世网络科技有限公司 用于操作三维动画角色的方法和装置
CN108126343A (zh) * 2017-12-20 2018-06-08 网易(杭州)网络有限公司 游戏角色的视线调整方法、装置、处理器和终端
CN110163938A (zh) * 2018-12-05 2019-08-23 腾讯科技(深圳)有限公司 动画的控制方法、装置、存储介质和电子装置

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN113198179A (zh) * 2021-05-10 2021-08-03 网易(杭州)网络有限公司 虚拟对象的转向控制方法及装置、存储介质、电子设备
CN113198179B (zh) * 2021-05-10 2024-06-04 网易(杭州)网络有限公司 虚拟对象的转向控制方法及装置、存储介质、电子设备

Also Published As

Publication number Publication date
CN110163938B (zh) 2023-04-25
EP3832604A1 (en) 2021-06-09
EP3832604A4 (en) 2021-11-17
US20210192821A1 (en) 2021-06-24
US11783523B2 (en) 2023-10-10
CN110163938A (zh) 2019-08-23

Similar Documents

Publication Publication Date Title
WO2020114154A1 (zh) 动画的控制方法、装置、存储介质和电子装置
WO2022021686A1 (zh) 虚拟对象的控制方法及装置、存储介质、电子装置
EP3760287B1 (en) Method and device for generating video frames
US11238667B2 (en) Modification of animated characters
US11645805B2 (en) Animated faces using texture manipulation
JP2024003191A (ja) ゲームシステム、ゲーム装置及びプログラム
US20230347247A1 (en) Virtual character control method and apparatus, storage medium, and electronic device
CN115526967A (zh) 虚拟模型的动画生成方法、装置、计算机设备及存储介质
US11978152B2 (en) Computer-assisted graphical development tools
US20220118358A1 (en) Computer-readable recording medium, and image generation system
US20230124297A1 (en) Hidden surface removal for layered clothing for an avatar body
Baričević et al. QAVE–a Gamebased Immersive Virtual Reality System
WO2024054580A1 (en) Computer-assisted graphical development tools
Torrao et al. Palco: A multisensor realtime 3D cartoon production system
CN114159776A (zh) 一种vr项目中过场动画播放方法及装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19893340

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019893340

Country of ref document: EP

Effective date: 20210305

NENP Non-entry into the national phase

Ref country code: DE