US11962897B2 - Camera movement control method and apparatus, device, and storage medium - Google Patents


Info

Publication number
US11962897B2
Authority
US
United States
Prior art keywords
target
parameter
scaling
speed
time slice
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US17/675,293
Other languages
English (en)
Other versions
US20220174206A1 (en)
Inventor
Qi Liu
Guangzhou Zhai
Weihua CUI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Assigned to TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED reassignment TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CUI, Weihua, LIU, QI, ZHAI, Guangzhou
Publication of US20220174206A1 publication Critical patent/US20220174206A1/en
Application granted granted Critical
Publication of US11962897B2 publication Critical patent/US11962897B2/en


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525: Changing parameters of virtual cameras
    • A63F13/5258: Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/695: Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/66: Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661: Transmitting camera control signals through networks, e.g. control via the Internet
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47: End-user applications
    • H04N21/478: Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4781: Games
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/62: Control of parameters via user interfaces
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/67: Focus control based on electronic image sensor signals
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30: Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35: Details of game servers

Definitions

  • Embodiments of this disclosure relate to the field of computer technologies including a camera movement control method and apparatus, a device, and a storage medium.
  • the game client can control a directed camera to move so as to focus the directed camera on an exciting game scene, thereby improving the user's experience of watching the game video.
  • Embodiments of this disclosure provide a camera operation (e.g., movement and scaling) control method and apparatus, a device, and a storage medium, which are used to improve a camera operation (e.g., movement and scaling) control effect.
  • Some aspects of the disclosure provide a method for camera operation control in an electronic device.
  • the method includes obtaining, based on a plurality of frames associated with a target time slice, a first target parameter to be met by a camera of a virtual camera system in the target time slice. Then, the method includes determining a target operation speed of the camera in the target time slice at least partially based on the first target parameter and a time-speed change magnification curve, and controlling the camera to operate based on the target operation speed.
  • the processing circuitry is configured to obtain, based on a plurality of frames associated with a target time slice, a first target parameter to be met by a camera of a virtual camera system in the target time slice. Further, the processing circuitry is configured to determine a target operation speed of the camera in the target time slice at least partially based on the first target parameter and a time-speed change magnification curve, and control the camera to operate based on the target operation speed.
  • Some aspects of the disclosure provide a non-transitory computer-readable medium storing instructions which when executed by a computer cause the computer to perform the method for camera operation control.
  • a first target parameter that needs to be met by a camera in a target time slice is obtained based on data of a plurality of target frames, so that the reliability of the first target parameter is relatively high, which helps improve the stability of camera operation (e.g., movement and scaling).
  • the camera operation is controlled based on a time-speed change magnification curve, which helps improve the operation (e.g., movement and scaling) continuity of the camera between adjacent time slices, so that the stability of the operation (e.g., movement and scaling) process of the camera is relatively high, and the camera operation (e.g., movement and scaling) control effect is relatively good.
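The per-time-slice control flow summarized above can be sketched in simplified form. The 1-D `Camera` class, the ramp-shaped `magnification` curve, and the step count below are illustrative assumptions, not details given by the patent.

```python
from dataclasses import dataclass

@dataclass
class Camera:
    # 1-D stand-in for a camera parameter (a position or scaling value)
    position: float

    def step(self, speed: float, dt: float) -> None:
        self.position += speed * dt

def magnification(t: float) -> float:
    # Assumed time-speed change magnification curve: a linear ramp over
    # normalized slice time, so motion starts gently at a slice boundary,
    # which helps continuity between adjacent time slices.
    return min(max(t, 0.0), 1.0)

def run_slice(camera: Camera, target: float, steps: int = 10) -> None:
    # Move the camera toward the slice's first target parameter, modulating
    # a base speed by the magnification curve sampled at normalized time t.
    dt = 1.0 / steps
    base_speed = target - camera.position  # change required over one slice
    for i in range(steps):
        t = (i + 1) / steps
        camera.step(base_speed * magnification(t), dt)
```

With the ramp curve above, the camera covers only part of the required change within one slice; the remainder is absorbed when the next slice recomputes its target, which is one way such a curve can trade exactness within a slice for smoothness across slices.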
  • FIG. 1 is a schematic diagram of an implementation environment of a camera operation (e.g., movement and scaling) control method according to an embodiment of this disclosure.
  • FIG. 2 is a flowchart of a camera operation (e.g., movement and scaling) control method according to an embodiment of this disclosure.
  • FIG. 3 is a schematic diagram of a movement process of an interaction object according to an embodiment of this disclosure.
  • FIG. 4 is a schematic diagram of position points indicated by a center position parameter, a sampling position parameter, and an assembly position parameter according to an embodiment of this disclosure.
  • FIG. 5 is a schematic diagram of a distance-scaling change magnification curve according to an embodiment of this disclosure.
  • FIG. 6 is a schematic diagram of a time-speed change magnification curve according to an embodiment of this disclosure.
  • FIG. 7 is a schematic diagram of an angle-steering mixed coefficient curve according to an embodiment of this disclosure.
  • FIG. 8 is a schematic diagram of a time-movement speed curve according to an embodiment of this disclosure.
  • FIG. 9 is a schematic diagram of a camera operation (e.g., movement and scaling) control process according to an embodiment of this disclosure.
  • FIG. 10 is a schematic diagram of a camera operation (e.g., movement and scaling) control apparatus according to an embodiment of this disclosure.
  • FIG. 11 is a schematic diagram of a camera operation (e.g., movement and scaling) control apparatus according to an embodiment of this disclosure.
  • FIG. 12 is a schematic structural diagram of a terminal according to an embodiment of this disclosure.
  • FIG. 13 is a schematic structural diagram of a computer device according to an embodiment of this disclosure.
  • a video game can include a virtual camera system that provides various modes for respective users to select to view the game video.
  • the virtual camera system can include a virtual camera (also referred to as camera) or a set of virtual cameras for providing a view or views of a virtual scene.
  • a camera in the virtual camera system can be controlled to simulate the operations of a real camera in order to generate views of the virtual scene from various positions, angles, scales, and the like.
  • a directed camera can automatically move along with the game pace and display exciting teamfights or other important moments to the user without requiring the user to perform any operation.
  • the game client can control the directed camera to move so as to focus the directed camera on an exciting game scene, thereby improving the user's experience of watching the game video.
  • a camera in the virtual camera system can be referred to as a lens in the virtual camera system, and camera control can be referred to as lens control.
  • FIG. 1 is a schematic diagram of an implementation environment of a camera operation (e.g., movement and scaling) control method according to an embodiment of this disclosure.
  • the implementation environment includes a terminal 11 and a server 12 .
  • a game client that can provide a directed mode is installed in the terminal 11 , and under a directed mode, the game client in the terminal 11 can apply the method provided in the embodiments of this disclosure to control a camera to move.
  • the camera in this embodiment of this disclosure is a directed camera under the directed mode, and a scene focused by the directed camera is a scene that the user sees under the directed mode.
  • the server 12 refers to a backend server of the game client installed in the terminal 11 , and the server 12 can provide data support to the game client installed in the terminal 11 .
  • the terminal 11 is a smart device such as a mobile phone, a tablet computer, or a personal computer.
  • the server 12 is a server, or a server cluster including a plurality of servers, or a cloud computing service center.
  • the terminal 11 and the server 12 establish a communication connection through a wired or wireless network.
  • terminal 11 and server 12 are only examples, and other related or potential terminals or servers that are applicable to this disclosure are also to be included in the protection scope of this disclosure, and are included herein by reference.
  • an embodiment of this disclosure provides a camera operation (e.g., movement and scaling) control method by using an example in which the method is applicable to a game client in a terminal.
  • the method provided in this embodiment of this disclosure includes the following steps:
  • step 201 the terminal obtains, based on data of a plurality of target frames corresponding to a target time slice, a first target parameter that needs to be met by a camera in the target time slice.
  • the target time slice refers to a time slice on which a camera operation (e.g., movement and scaling) control operation needs to be performed currently.
  • the terminal divides the game video into a plurality of time slices according to a first reference time interval, and then sequentially performs a camera operation (e.g., movement and scaling) control operation in each time slice according to the method provided in this embodiment of this disclosure.
  • the first reference time interval is set according to experience or freely adjusted according to application scenarios, which is not limited in the embodiments of this disclosure. For example, the first reference time interval is set to 1 second. In this case, a time interval corresponding to the target time slice is also 1 second.
  • the plurality of target frames corresponding to the target time slice include a game video frame corresponding to a start timestamp of the target time slice and a reference number of game video frames located before the game video frame corresponding to the start timestamp.
  • the reference number is set according to experience or freely adjusted according to application scenarios, which is not limited in the embodiments of this disclosure. In some embodiments, the reference number is set to 10. In this case, the terminal can obtain the first target parameter that needs to be met by the camera in the target time slice based on 11 target frames corresponding to the target time slice.
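As a sketch, selecting the target frames for a time slice can look like the following; representing the ordered game video frames as a list and the start timestamp as an index into it is an assumption for illustration.

```python
def target_frames(frames, start_index, reference_number=10):
    # Select the game video frame at the slice's start timestamp
    # (frames[start_index]) plus the reference_number frames immediately
    # before it, i.e. 11 frames in total when reference_number is 10.
    # frames is assumed to be ordered by timestamp.
    first = max(0, start_index - reference_number)
    return frames[first:start_index + 1]
```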
  • the data of each target frame refers to original data in the frame, which includes, but is not limited to, position data of an interaction object in the target frame, timestamp data of the target frame, and the like.
  • the interaction object in the target frame refers to a game character model participating in a game process included in the target frame.
  • the first target parameter refers to a parameter that needs to be met by the camera at an end moment of the target time slice.
  • the first target parameter includes at least one of a first position parameter or a first scaling parameter.
  • the first position parameter is used for indicating a position that needs to be focused by the camera at the end moment of the target time slice
  • the first scaling parameter is used for indicating a scaling value that needs to be reached by the camera at the end moment of the target time slice.
  • the terminal can represent the first position parameter in the form of position coordinates and represent the first scaling parameter in the form of a value.
  • a value between 0 and 1 may be used to represent that a focus range of the camera is zoomed out, 1 may be used to represent that the focus range of the camera is unchanged, and a value greater than 1 may be used to represent that the focus range of the camera is zoomed in.
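A small helper, assuming the numeric convention just described, makes the interpretation of the scaling value explicit:

```python
def focus_range_effect(scaling_value):
    # Values in (0, 1) zoom the camera's focus range out, exactly 1
    # leaves it unchanged, and values greater than 1 zoom it in.
    if scaling_value == 1:
        return "unchanged"
    return "zoomed in" if scaling_value > 1 else "zoomed out"
```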
  • the terminal obtains the first target parameter based on the data of the plurality of target frames, so that harmful impact of abnormal data of one video frame on the first target parameter may be reduced, and the reliability of the obtained first target parameter is relatively high, thereby improving the stability of a subsequent camera operation (e.g., movement and scaling) control process.
  • a process that the terminal obtains, based on data of a plurality of target frames corresponding to a target time slice, a first target parameter that needs to be met by a camera in the target time slice includes the following steps 2011 to 2013 :
  • step 2011 the terminal performs sampling processing on the data of the target frames corresponding to the target time slice, to obtain a sampling parameter corresponding to each target frame.
  • the sampling parameter corresponding to each target frame refers to a parameter that is determined based on the target frame and needs to be met by the camera in the target time slice.
  • the sampling parameter corresponding to each target frame includes at least one of a sampling position parameter or a sampling scaling parameter corresponding to the target frame.
  • a process that the terminal performs sampling processing on the data of the target frames corresponding to the target time slice, to obtain a sampling parameter corresponding to each target frame includes step A and step B:
  • step A the terminal obtains a center position parameter of interaction objects in each target frame based on the data of the target frames.
  • the center position parameter of the interaction objects in each target frame is used for indicating a geometric center of a position of the interaction objects in each target frame.
  • An implementation process of this step is that: the terminal extracts position data of the interaction objects in each target frame from the data of the target frames. The terminal obtains the center position parameter of the interaction objects in each target frame based on the position data of the interaction objects.
  • a process that the terminal obtains the center position parameter of the interaction objects in each target frame based on the position data of the interaction objects is that: the terminal uses an average value of the position data of the interaction objects as the center position parameter of the interaction objects in each target frame.
  • the terminal can implement the foregoing process based on Formula 1.
  • P_i1 = Sum(P_c) / N(p) (Formula 1)
  • P_i1 represents the center position parameter of the interaction objects in each target frame
  • Sum(P_c) represents the sum of the position data of the interaction objects in each target frame
  • N(p) represents the quantity of interaction objects in each target frame.
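Formula 1 amounts to averaging the interaction objects' positions; a minimal sketch, assuming positions are represented as (x, y) tuples:

```python
def center_position(positions):
    # Formula 1: the center position parameter is the average of the
    # interaction objects' position data in a target frame, i.e. their
    # geometric center. positions: list of (x, y) tuples.
    n = len(positions)
    return (sum(p[0] for p in positions) / n,
            sum(p[1] for p in positions) / n)
```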
  • step B the terminal determines the sampling position parameter corresponding to each target frame based on the center position parameter of the interaction objects in each target frame and an assembly position parameter of the interaction objects.
  • the assembly position parameter of the interaction objects refers to a parameter of a final assembly point that needs to be reached by all interaction objects in the game video.
  • the terminal can obtain the assembly position parameter from the backend server of the game client.
  • the center position parameters of the interaction objects may differ from one target frame to another, but the final assembly position of the interaction objects is the same, so the assembly position parameters of the interaction objects are the same for all target frames.
  • a movement process of the interaction object is that: the interaction object first moves from a start position point 301 to a middle assembly point 302 , then the interaction object moves from the middle assembly point 302 to a final assembly point 303 by moving from a top lane to a bottom lane.
  • the backend server of the game client can obtain a parameter of the final assembly point by analyzing the entire game video, and feed the parameter of the final assembly point back to the terminal. Therefore, the terminal obtains the assembly position parameter of the interaction object.
  • a process that the terminal determines the sampling position parameter corresponding to each target frame based on the center position parameter of the interaction objects in each target frame and an assembly position parameter of the interaction objects is that: the terminal determines the sampling position parameter corresponding to each target frame based on the center position parameter of the interaction objects in each target frame and a first weight value, and the assembly position parameter of the interaction objects and a second weight value.
  • the terminal respectively obtains a product of the center position parameter of the interaction objects in each target frame and the first weight value and a product of the assembly position parameter of the interaction objects and the second weight value, and uses a sum of the two products as the sampling position parameter corresponding to each target frame.
  • the terminal can implement the foregoing process based on Formula 2.
  • P_i = P_i1 × d_1 + P_i2 × d_2 (Formula 2), where P_i represents the sampling position parameter corresponding to each target frame, P_i1 represents the center position parameter of the interaction objects in the target frame, P_i2 represents the assembly position parameter of the interaction objects, and d_1 and d_2 respectively represent the first weight value and the second weight value.
  • the first weight value and the second weight value are set according to experience or freely adjusted according to application scenarios, which is not limited in the embodiments of this disclosure.
  • for example, the first weight value may be set to 0.5, and the second weight value may also be set to 0.5. This setting enables the user to view both the performance of each interaction object at its current position and the performance at the final assembly point.
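Formula 2's weighted blend of the center position and the assembly position can be sketched as follows, with the equal 0.5 weights from the example used as defaults; the (x, y) tuple representation is an assumption.

```python
def sampling_position(center, assembly, d1=0.5, d2=0.5):
    # Formula 2: blend the frame's center position parameter with the
    # final assembly position parameter using the first and second
    # weight values d1 and d2.
    return tuple(c * d1 + a * d2 for c, a in zip(center, assembly))
```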
  • the center position parameter, the sampling position parameter, and the assembly position parameter respectively indicate a position point.
  • the position points indicated by the center position parameter, the sampling position parameter, and the assembly position parameter may be as shown in FIG. 4 .
  • a blue party and a red party are two teams rival to each other in the game video, and the terminal can determine a position point 401 indicated by a center position parameter of each interaction object of the red party and each interaction object of the blue party, a position point 403 indicated by the assembly position parameter, and a position point 402 indicated by the sampling position parameter corresponding to the target frame.
  • the terminal determines the sampling position parameter corresponding to each target frame by taking the center position parameter corresponding to each target frame and the assembly position parameter into comprehensive consideration, so that in a camera operation (e.g., movement and scaling) control process, the camera may be caused to meet the first position parameter and the first scaling parameter simultaneously, thereby reducing visual abruptness.
  • a process that the terminal performs sampling processing on the data of the target frames corresponding to the target time slice, to obtain a sampling parameter corresponding to each target frame includes step a to step c:
  • step a the terminal obtains a distance parameter corresponding to each target frame based on the data of the target frames.
  • an implementation process of this step is that: the terminal respectively obtains, based on the data of the target frames, a distance between the position data of each interaction object and the sampling position parameter corresponding to each target frame and a distance between the sampling position parameter corresponding to each target frame and the assembly position parameter.
  • the terminal uses a maximum distance of the foregoing distances as the distance parameter corresponding to each target frame.
  • the terminal can obtain the distance parameter corresponding to each target frame by using Formula 3.
  • L = max{ |P_c - P_i|, |P_i - P_i2| } (Formula 3)
  • L represents the distance parameter corresponding to each target frame, with the maximum taken over all interaction objects
  • P_c represents the position data of each interaction object in each target frame
  • P_i represents the sampling position parameter corresponding to each target frame
  • P_i2 represents the assembly position parameter.
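Formula 3 takes the maximum over the object-to-sampling-point distances together with the sampling-point-to-assembly-point distance; a 2-D sketch, assuming (x, y) tuple positions:

```python
import math

def distance_parameter(object_positions, sampling_pos, assembly_pos):
    # Formula 3: L is the maximum of (a) the distance from each
    # interaction object's position to the sampling position and
    # (b) the distance from the sampling position to the assembly position.
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    candidates = [dist(p, sampling_pos) for p in object_positions]
    candidates.append(dist(sampling_pos, assembly_pos))
    return max(candidates)
```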
  • step b the terminal determines a scaling change magnification corresponding to the distance parameter based on a distance-scaling change magnification curve.
  • the distance parameter corresponding to each game video frame in the entire game video decreases from L_max to L_min, and the scaling parameter increases from S_min to S_max.
  • L_max refers to the maximum value among the distance parameters corresponding to the game video frames in the entire game video
  • L_min refers to the minimum value among the distance parameters corresponding to the game video frames in the entire game video
  • S_min refers to the minimum scaling value of the camera in the entire game video and corresponds to L_max
  • S_max refers to the maximum scaling value of the camera in the entire game video and corresponds to L_min.
  • the value of the distance parameter corresponding to each game video frame does not change linearly between L_max and L_min, but is relatively concentrated on the L_min side. Therefore, the change from S_min to S_max should be slow in the former part and fast in the latter part. Accordingly, the slope of the distance-scaling change magnification curve used in this embodiment of this disclosure has an increasing tendency, that is, the slope is small in the former part and large in the latter part.
  • this distance-scaling change magnification curve can enlarge the mapping range and improve the visual feel of camera operation (e.g., movement and scaling) when the value of L falls within a relatively concentrated range.
  • L_max, L_min, S_min, and S_max may be determined by the backend server of the game client and fed back to the game client, or may be read by the game client from a configuration file recording L_max, L_min, S_min, and S_max.
  • the distance-scaling change magnification curve is as shown by 501 in FIG. 5 .
  • a horizontal coordinate represents a normalized distance
  • a longitudinal coordinate represents a scaling change magnification.
  • the slope of the distance-scaling change magnification curve 501 shown in FIG. 5 has an increasing tendency. As the value of the horizontal coordinate changes from 0 to 1, the value of the longitudinal coordinate also changes from 0 to 1.
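The patent does not give a formula for curve 501; a quadratic ease-in is one minimal curve matching the description, mapping [0, 1] to [0, 1] with a slope that grows from small to large. This specific choice is an assumption for illustration.

```python
def scaling_change_magnification(normalized_distance):
    # Assumed distance-scaling change magnification curve: a quadratic
    # ease-in. Its slope (2t) is small near 0 and large near 1, matching
    # the increasing-slope tendency described for curve 501.
    t = min(max(normalized_distance, 0.0), 1.0)
    return t * t
```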
  • a process that the terminal determines a scaling change magnification corresponding to the distance parameter based on a distance-scaling change magnification curve is that: the terminal performs normalization processing on the distance parameter, to obtain a normalized distance parameter. The terminal determines a scaling change magnification corresponding to the normalized distance parameter based on the distance-scaling change magnification curve.
  • the terminal can obtain the normalized distance parameter by using Formula 4.
  • L′ = (L_max - L) / (L_max - L_min) (Formula 4)
  • L′ represents the normalized distance parameter
  • L represents the distance parameter before the normalization processing
  • L_max represents the maximum value among the distance parameters corresponding to the game video frames in the entire game video
  • L_min represents the minimum value among the distance parameters corresponding to the game video frames in the entire game video.
  • step c the terminal determines the sampling scaling parameter corresponding to each target frame based on the scaling change magnification.
  • the terminal can determine the sampling scaling parameter corresponding to each target frame by using Formula 5.
  • S_i = (S_max - S_min) × r + S_min (Formula 5)
  • S_i represents the sampling scaling parameter corresponding to each target frame
  • r represents the scaling change magnification
  • S_min represents the minimum scaling value of the camera in the entire game video
  • S_max represents the maximum scaling value of the camera in the entire game video.
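Formulas 4 and 5 together map a raw distance parameter to a sampling scaling parameter; a sketch that takes the change magnification curve as a callable, so any curve shape can be plugged in:

```python
def sampling_scaling(L, L_max, L_min, S_max, S_min, curve):
    # Formula 4: normalize the distance parameter into [0, 1].
    L_norm = (L_max - L) / (L_max - L_min)
    # Look up the scaling change magnification r on the curve.
    r = curve(L_norm)
    # Formula 5: map r back into the camera's scaling range.
    return (S_max - S_min) * r + S_min
```

With the identity curve, L = L_max yields S_min and L = L_min yields S_max, matching the stated correspondences.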
  • the terminal may obtain the sampling parameter corresponding to each target frame according to the foregoing step A and step B.
  • the terminal may obtain the sampling parameter corresponding to each target frame according to the foregoing step a to step c.
  • the terminal may obtain the sampling parameter corresponding to each target frame according to the foregoing step A and step B and step a to step c.
  • the terminal performs sampling processing on the data of each target frame respectively based on step 2011 , to obtain the sampling parameter corresponding to each target frame, and then performs step 2012 .
  • step 2012 the terminal sets a weight value for each target frame according to a distance between a timestamp and a start timestamp of the target time slice.
  • a process that the terminal sets a weight value for each target frame according to a distance between a timestamp and the start timestamp of the target time slice is that: the terminal assigns weight values from small to large to the target frames ordered from the farthest to the nearest distance between their timestamps and the start timestamp of the target time slice (equivalently, from large to small when the frames are ordered from nearest to farthest).
  • the distribution of the weight values corresponding to the target frames meets a Gaussian distribution.
  • for example, the terminal sets weight values of 1/55, 4/55, 9/55, 16/55, and 25/55 for five target frames, ordered from the farthest to the nearest distance between their timestamps and the start timestamp of the target time slice.
  • step 2013 the terminal determines the first target parameter that needs to be met by the camera in the target time slice based on the sampling parameter corresponding to each target frame and the weight value corresponding to each target frame.
  • a process that the terminal determines the first target parameter that needs to be met by the camera in the target time slice based on the sampling parameter corresponding to each target frame and the weight value corresponding to each target frame is that: the terminal obtains a product of the sampling parameter corresponding to each target frame and the weight value corresponding to each target frame respectively, and then uses a sum of the products as the first target parameter that needs to be met by the camera in the target time slice.
  • the first target parameter includes the first position parameter.
  • the sampling parameter corresponding to each target frame includes the sampling scaling parameter, the first target parameter includes the first scaling parameter.
  • the sampling parameter corresponding to each target frame includes the sampling position parameter and the sampling scaling parameter, the first target parameter includes the first position parameter and the first scaling parameter.
  • the terminal can obtain the first position parameter in the first target parameter by using Formula 6.
  • P_n = P_1 × 1/55 + P_2 × 4/55 + P_3 × 9/55 + P_4 × 16/55 + P_5 × 25/55 (Formula 6)
  • P_n represents the first position parameter in the first target parameter
  • P_1, P_2, P_3, P_4, and P_5 respectively represent the sampling position parameters corresponding to the 5 target frames, ordered from the farthest to the nearest distance between their timestamps and the start timestamp of the target time slice.
  • the terminal can obtain the first scaling parameter in the first target parameter by using Formula 7.
  • S n S 1 ⁇ 1/55+ S 2 ⁇ 4/55+ S 3 ⁇ 9/55+ S 4 ⁇ 16/55+ S 5 ⁇ 25/55 (Formula 7)
  • S n represents the first scaling parameter in the first target parameter
  • S 1 , S 2 , S 3 , S 4 and S 5 respectively represent the sampling scaling parameters corresponding to 5 target frames arranged from farthest to nearest by the distance between the timestamp and the start timestamp of the target time slice.
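The weighted sums in Formulas 6 and 7 can be sketched in Python as follows (a minimal illustration; the function names are ours, and the weights follow the 1/55 … 25/55 example above):

```python
# Weights for 5 target frames, ordered from farthest to nearest
# to the start timestamp of the target time slice (Formulas 6 and 7).
WEIGHTS = [1/55, 4/55, 9/55, 16/55, 25/55]

def first_target_parameter(samples):
    """Weighted sum of per-frame sampling parameters (P_n or S_n)."""
    assert len(samples) == len(WEIGHTS)
    return sum(s * w for s, w in zip(samples, WEIGHTS))

def first_position_parameter(positions):
    """Apply the weighted sum per axis when positions are 2D coordinates."""
    xs = [p[0] for p in positions]
    ys = [p[1] for p in positions]
    return (first_target_parameter(xs), first_target_parameter(ys))
```

Because the weights sum to 1, identical samples are left unchanged, matching the intent of a weighted average biased toward the frames nearest the start timestamp.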
  • the terminal may further perform validity verification on the first target parameter based on a parameter change threshold.
  • a process that the terminal performs validity verification on the first target parameter based on a parameter change threshold is: obtaining a change parameter of the camera in the target time slice; and performing validity verification on the first target parameter based on a comparison result of the change parameter and the parameter change threshold.
  • a process that the terminal obtains a change parameter of the camera in the target time slice is that: the terminal obtains the change parameter of the camera in the target time slice based on a second target parameter that has been met by the camera in the target time slice and the first target parameter.
  • the second target parameter that has been met in the target time slice refers to a parameter that is met by the camera at a start moment of the target time slice, and the second target parameter includes at least one of a second position parameter or a second scaling parameter.
  • the change parameter of the camera in the target time slice includes a position change parameter
  • the terminal can represent the position change parameter by using a distance between the first position parameter and the second position parameter.
  • the change parameter of the camera in the target time slice includes a scaling change parameter
  • the terminal can represent the scaling change parameter by using an absolute value of a difference between the first scaling parameter and the second scaling parameter.
  • a process that the terminal performs validity verification on the first target parameter based on a comparison result of the change parameter and the parameter change threshold is that: when the change parameter is lower than the parameter change threshold, the terminal determines that the validity verification on the first target parameter fails; and when the change parameter is not lower than the parameter change threshold, the terminal determines that the validity verification on the first target parameter succeeds.
  • the change parameter of the camera in the target time slice includes a position change parameter and a scaling change parameter.
  • a process that the terminal performs validity verification on the first target parameter based on a comparison result of the change parameter and the parameter change threshold is that: the terminal performs validity verification on the first position parameter based on a comparison result of the position change parameter and a first parameter change threshold; and the terminal performs validity verification on the first scaling parameter based on a comparison result of the scaling change parameter and a second parameter change threshold.
  • the first parameter change threshold and the second parameter change threshold may be the same or may be different, which is not limited in the embodiments of this disclosure.
  • the terminal may represent the position change parameter of the camera in the target time slice by using a Euclidean distance (for example, 1) between the two position coordinates, and represent the scaling change parameter of the camera in the target time slice by using an absolute value (for example, 8) of a difference between the two scaling parameters.
  • the first parameter change threshold is set to 2
  • the second parameter change threshold is set to 3
  • because the position change parameter (1) is lower than the first parameter change threshold (2), the validity verification on the first position parameter fails; and because the scaling change parameter (8) is not lower than the second parameter change threshold (3), the validity verification on the first scaling parameter succeeds.
  • the first target parameter on which validity verification succeeds only includes the first scaling parameter.
  • the terminal can significantly reduce camera jitter caused by tiny translations and tiny scalings by performing validity verification on the first target parameter.
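The validity verification described above can be sketched as follows (the thresholds 2 and 3 and the function name follow the example in the text; `math.dist` computes the Euclidean distance):

```python
import math

def verify_validity(first_pos, second_pos, first_scale, second_scale,
                    pos_threshold=2.0, scale_threshold=3.0):
    """A change below its threshold is treated as jitter and fails
    validity verification; only passing parts are kept."""
    pos_change = math.dist(first_pos, second_pos)    # position change parameter
    scale_change = abs(first_scale - second_scale)   # scaling change parameter
    return {
        "position": pos_change >= pos_threshold,
        "scaling": scale_change >= scale_threshold,
    }

# The example from the text: distance 1 < 2 fails, |difference| 8 >= 3 passes,
# so the verified first target parameter only includes the scaling parameter.
result = verify_validity((0.0, 0.0), (1.0, 0.0), 10.0, 2.0)
```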
  • a trigger frequency may be set, and the step of obtaining the first target parameter may be triggered to be executed according to the trigger frequency.
  • the trigger frequency may be set according to experience or freely adjusted according to application scenarios, which is not limited in the embodiments of this disclosure.
  • a trigger moment determined according to the trigger frequency may be a moment corresponding to a short time period before each time slice, for example, a moment corresponding to 0.2 seconds before each time slice, to ensure that a camera control operation (e.g., movement and scaling) can be performed in time when the target time slice is reached.
  • the terminal determines a target operation (e.g., movement and scaling) speed of the camera in the target time slice based on the first target parameter, an initial operation (e.g., movement and scaling) speed of the camera in the target time slice, a time interval corresponding to the target time slice, and a time-speed change magnification curve.
  • the target operation (e.g., movement and scaling) speed includes at least one of a target translation speed or a target scaling speed.
  • the initial operation (e.g., movement and scaling) speed of the camera in the target time slice refers to an operation (e.g., movement and scaling) speed of the camera at a start moment of the target time slice, and the initial operation (e.g., movement and scaling) speed includes at least one of an initial translation speed or an initial scaling speed.
  • the initial operation (e.g., movement and scaling) speed of the camera in the target time slice is also an operation (e.g., movement and scaling) speed of the camera at an end moment of a previous time slice of the target time slice.
  • the initial operation (e.g., movement and scaling) speed of the camera in the target time slice may be obtained.
  • the time interval corresponding to the target time slice refers to a duration from the start timestamp to an end timestamp of the target time slice.
  • the target operation (e.g., movement and scaling) speed of the camera in the target time slice refers to an operation (e.g., movement and scaling) speed of the camera at an end moment of the target time slice. That is, the target operation (e.g., movement and scaling) speed refers to an operation (e.g., movement and scaling) speed when the camera meets the first target parameter.
  • the time-speed change magnification curve is set by a game developer or freely adjusted according to application scenarios, which is not limited in the embodiments of this disclosure.
  • the time-speed change magnification curve is a Bezier curve.
  • the Bezier curve is a smooth curve, and setting the time-speed change magnification curve as a Bezier curve helps improve the smoothness and stability during a camera operation (e.g., movement and scaling) process.
  • the time-speed change magnification curve is as shown by 601 in FIG. 6 .
  • a horizontal coordinate represents a normalized time
  • a longitudinal coordinate represents a speed change magnification.
  • the time-speed change magnification curve 601 shown in FIG. 6 is a Bezier curve. As a value of the horizontal coordinate changes from 0 to 1, a value of the longitudinal coordinate also changes from 0 to 1.
  • the terminal determines the target operation (e.g., movement and scaling) speed of the camera in the target time slice based on the first target parameter on which validity verification succeeds, the initial operation (e.g., movement and scaling) speed of the camera in the target time slice, the time interval corresponding to the target time slice, and the time-speed change magnification curve.
  • a process that the terminal determines a target operation (e.g., movement and scaling) speed of the camera in the target time slice based on the first target parameter, an initial operation (e.g., movement and scaling) speed of the camera in the target time slice, a time interval corresponding to the target time slice, and a time-speed change magnification curve includes step 2021 to step 2023 :
  • step 2021 the terminal obtains a change parameter of the camera in the target time slice based on a second target parameter that has been met by the camera in the target time slice and the first target parameter.
  • the second target parameter that has been met by the camera in the target time slice refers to a parameter that is met by the camera at a start moment of the target time slice, and the second target parameter includes at least one of a second position parameter or a second scaling parameter.
  • the second target parameter that has been met by the camera in the target time slice may be obtained.
  • when the first target parameter includes the first position parameter, the change parameter of the camera in the target time slice includes a position change parameter, and the position change parameter may be represented by using a distance between the first position parameter and the second position parameter; and when the first target parameter includes the first scaling parameter, the change parameter of the camera in the target time slice includes a scaling change parameter, and the scaling change parameter may be represented by using an absolute value of a difference between the first scaling parameter and the second scaling parameter.
  • the terminal may represent the position change parameter of the camera in the target time slice by using a Euclidean distance (for example, 1) between the two position coordinates, and represent the scaling change parameter of the camera in the target time slice by using an absolute value (for example, 8) of a difference between the two scaling parameters.
  • step 2022 the terminal obtains a point value corresponding to the time-speed change magnification curve.
  • a point value of the time-speed change magnification curve with respect to time is calculated.
  • a time range corresponding to the time-speed change magnification curve is [0, 1].
  • ⁇ V(t) is used to represent the time-speed change magnification curve, and the point value corresponding to the time-speed change magnification curve is obtained according to ⁇ 0 1 ⁇ V(t)dt through calculation.
  • the terminal determines the target operation (e.g., movement and scaling) speed of the camera in the target time slice based on the change parameter, the initial operation (e.g., movement and scaling) speed, the time interval corresponding to the target time slice, and the point value corresponding to the time-speed change magnification curve.
  • the terminal can determine the target operation (e.g., movement and scaling) speed of the camera in the target time slice according to Formula 8.
  • ⁇ M [V 1 ⁇ t +( V 2 ⁇ V 1 ) ⁇ t ⁇ 0 1 ⁇ V ( t ) dt] (Formula 8)
  • ⁇ M represents the change parameter, which may be a translation change parameter or may be a scaling change parameter
  • V 1 represents the smaller of the initial operation (e.g., movement and scaling) speed and the target operation (e.g., movement and scaling) speed, which may be the smaller of the initial translation speed and the target translation speed, or the smaller of the initial scaling speed and the target scaling speed
  • V 2 represents the larger of the initial operation (e.g., movement and scaling) speed and the target operation (e.g., movement and scaling) speed, which may be the larger of the initial translation speed and the target translation speed, or the larger of the initial scaling speed and the target scaling speed
  • ⁇ t represents the time interval corresponding to the target time slice
  • ⁇ 0 1 ⁇ V(t)dt represents the point value corresponding to the time-speed change magnification curve.
  • when the product of the initial operation (e.g., movement and scaling) speed and the time interval is not less than the change parameter, V 1 represents the target operation (e.g., movement and scaling) speed and V 2 represents the initial operation (e.g., movement and scaling) speed.
  • when the product of the initial operation (e.g., movement and scaling) speed and the time interval is less than the change parameter, it indicates that the initial operation (e.g., movement and scaling) speed needs to be increased, that is, the initial operation (e.g., movement and scaling) speed is less than the target operation (e.g., movement and scaling) speed. In this case, V 1 represents the initial operation (e.g., movement and scaling) speed and V 2 represents the target operation (e.g., movement and scaling) speed.
  • V 2 is calculated according to Formula 8, so that the target operation (e.g., movement and scaling) speed may be obtained.
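Rearranging Formula 8 yields the target speed. The sketch below assumes the speed is being increased, so V 1 is the initial speed and the unknown V 2 is the target speed; the function name is illustrative:

```python
def target_speed(delta_m, v_initial, delta_t, integral):
    """Solve Formula 8, delta_m = v1*dt + (v2 - v1)*dt*integral, for v2,
    assuming v1 is the (smaller) initial speed."""
    return v_initial + (delta_m / delta_t - v_initial) / integral

# With a point value of 0.5, covering a change of 1.5 units in 1 second
# starting at speed 1 requires accelerating to a target speed of 2.
v2 = target_speed(1.5, 1.0, 1.0, 0.5)
```

Substituting back into Formula 8 confirms the result: 1 × 1 + (2 − 1) × 1 × 0.5 = 1.5.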
  • a process that the terminal determines a target operation (e.g., movement and scaling) speed of the camera in the target time slice includes that: the terminal determines, according to Formula 8, the target translation speed of the camera in the target time slice based on the translation change parameter, the initial translation speed, the time interval corresponding to the target time slice, and a point value corresponding to a time-translation speed change magnification curve; and the terminal determines, according to Formula 8, the target scaling speed of the camera in the target time slice based on the scaling change parameter, the initial scaling speed, the time interval corresponding to the target time slice, and a point value corresponding to a time-scaling speed change magnification curve.
  • the time-translation speed change magnification curve and the time-scaling speed change magnification curve may be the same curve or may be two different curves, which is not limited in the embodiments of this disclosure.
  • the terminal can introduce a steering mixed coefficient in the process of determining the target operation (e.g., movement and scaling) speed of the camera in the target time slice, to reduce discomfort caused by direction changes (in the translation direction and the scaling direction), thereby improving the stability of camera operation (e.g., movement and scaling).
  • This process may include the following step 1 to step 4 :
  • step 1 the terminal obtains a steering angle corresponding to the target time slice based on a first operation (e.g., movement and scaling) direction and a second operation (e.g., movement and scaling) direction.
  • the first operation (e.g., movement and scaling) direction refers to an operation (e.g., movement and scaling) direction of the camera at the start moment of the target time slice, and the first operation (e.g., movement and scaling) direction includes at least one of a first translation direction or a first scaling direction.
  • the second operation (e.g., movement and scaling) direction refers to an operation (e.g., movement and scaling) direction of the camera at the end moment of the target time slice, and the second operation (e.g., movement and scaling) direction includes at least one of a second translation direction or a second scaling direction.
  • the steering angle corresponding to the target time slice includes at least one of a translation steering angle or a scaling steering angle.
  • a process that the terminal obtains a steering angle corresponding to the target time slice includes: the terminal determines the translation steering angle corresponding to the target time slice based on an angle between the second translation direction and the first translation direction. That is, the terminal uses a degree corresponding to the angle between the second translation direction and the first translation direction as the translation steering angle corresponding to the target time slice.
  • the first translation direction is determined according to a position parameter at a start moment and a position parameter at an end moment of a previous time slice of the target time slice
  • the second translation direction is determined according to the second position parameter in the second target parameter and the first position parameter in the first target parameter.
  • a process that the terminal obtains a steering angle corresponding to the target time slice includes: the terminal determines the scaling steering angle corresponding to the target time slice based on a comparison result of the second scaling direction and the first scaling direction.
  • the first scaling direction is determined according to a scaling parameter at a start moment and a scaling parameter at an end moment of a previous time slice of the target time slice
  • the second scaling direction is determined according to the second scaling parameter in the second target parameter and the first scaling parameter in the first target parameter.
  • a process that the terminal determines the scaling steering angle corresponding to the target time slice based on a comparison result of the second scaling direction and the first scaling direction is that: when the second scaling direction and the first scaling direction are consistent, the terminal uses a first angle as the scaling steering angle corresponding to the target time slice; and when the second scaling direction and the first scaling direction are inconsistent, the terminal uses a second angle as the scaling steering angle.
  • the first angle and the second angle may be set according to experience. In some embodiments, the first angle is set to 0 degrees, and the second angle is set to 180 degrees.
  • the scaling direction includes a zooming in direction and a zooming out direction
  • the second scaling direction and the first scaling direction are consistent when both are a zooming in direction or both are a zooming out direction.
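Step 1 can be sketched as follows: the translation steering angle is the angle between the two 2D translation direction vectors, and the scaling steering angle follows the 0-degree/180-degree rule above (function names are illustrative):

```python
import math

def translation_steering_angle(dir1, dir2):
    """Angle in degrees between two 2D translation direction vectors."""
    dot = dir1[0] * dir2[0] + dir1[1] * dir2[1]
    norm = math.hypot(*dir1) * math.hypot(*dir2)
    cos_a = max(-1.0, min(1.0, dot / norm))  # clamp for float safety
    return math.degrees(math.acos(cos_a))

def scaling_steering_angle(first_zoom_in, second_zoom_in):
    """0 degrees when both scaling directions are consistent, else 180."""
    return 0.0 if first_zoom_in == second_zoom_in else 180.0
```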
  • step 2 the terminal determines a steering mixed coefficient corresponding to the steering angle.
  • a manner in which the terminal determines the steering mixed coefficient corresponding to the steering angle includes, but is not limited to, the following two types:
  • Manner 1 The terminal determines the steering mixed coefficient corresponding to the steering angle based on an angle-steering mixed coefficient curve.
  • the angle-steering mixed coefficient curve is as shown by 701 in FIG. 7 .
  • a horizontal coordinate represents an angle
  • a longitudinal coordinate represents a steering mixed coefficient.
  • the terminal can determine the steering mixed coefficient corresponding to the steering angle according to the angle-steering mixed coefficient curve 701 . As the angle changes from 0 degrees to 180 degrees, the steering mixed coefficient decreases from 1 to 0.
  • Manner 2 The terminal determines the steering mixed coefficient corresponding to the steering angle based on a correspondence between angles and steering mixed coefficients.
  • the correspondence between angles and steering mixed coefficients is set by a game developer, which is not limited in the embodiments of this disclosure.
  • the correspondence between angles and steering mixed coefficients is represented in the form of a table. After the steering angle is determined, the terminal can query the steering mixed coefficient corresponding to the steering angle in a correspondence table of angles and steering mixed coefficients.
  • the terminal needs to respectively determine a translation steering mixed coefficient corresponding to the translation steering angle and a scaling steering mixed coefficient corresponding to the scaling steering angle according to the foregoing manner 1 or manner 2.
  • step 3 the terminal updates the initial operation (e.g., movement and scaling) speed based on the steering mixed coefficient, to obtain an updated initial operation (e.g., movement and scaling) speed.
  • a manner in which the terminal updates the initial operation (e.g., movement and scaling) speed based on the steering mixed coefficient is that: the terminal uses a product of the initial operation (e.g., movement and scaling) speed before update and the steering mixed coefficient as the updated initial operation (e.g., movement and scaling) speed.
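Steps 2 and 3 together can be sketched with a linear stand-in for the angle-steering mixed coefficient curve of FIG. 7, which decreases from 1 at 0 degrees to 0 at 180 degrees (the actual curve shape is developer-defined):

```python
def steering_mixed_coefficient(angle_deg):
    """Linear stand-in for the curve in FIG. 7: 1 at 0 degrees, 0 at 180."""
    return max(0.0, 1.0 - angle_deg / 180.0)

def update_initial_speed(initial_speed, angle_deg):
    # Step 3: the updated initial speed is the product of the pre-update
    # initial speed and the steering mixed coefficient, so a sharp turn
    # restarts the camera from a lower speed.
    return initial_speed * steering_mixed_coefficient(angle_deg)
```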
  • step 4 the terminal determines the target operation (e.g., movement and scaling) speed of the camera in the target time slice based on the first target parameter, the updated initial operation (e.g., movement and scaling) speed, the time interval corresponding to the target time slice, and the time-speed change magnification curve.
  • For an implementation process of step 4 , reference may be made to step 2021 to step 2023 , and details are not described herein again.
  • the terminal can update the target operation (e.g., movement and scaling) speed based on a correction coefficient, to obtain an updated target operation (e.g., movement and scaling) speed.
  • V 2 (n) represents the updated target operation (e.g., movement and scaling) speed
  • V 1 (n) represents the target operation (e.g., movement and scaling) speed before update
  • R represents the correction coefficient.
  • the correction coefficient may be set according to experience or freely adjusted according to application scenarios, which is not limited in the embodiments of this disclosure.
  • the terminal can update the target translation speed based on a first correction coefficient to obtain an updated target translation speed; and the terminal updates the target scaling speed based on a second correction coefficient to obtain an updated target scaling speed.
  • the first correction coefficient and the second correction coefficient may be the same or may be different, which is not limited in the embodiments of this disclosure.
  • step 203 the terminal controls the camera to move based on the time-speed change magnification curve, the initial operation (e.g., movement and scaling) speed, and the target operation (e.g., movement and scaling) speed, to meet the first target parameter.
  • the target operation (e.g., movement and scaling) speed in this step refers to the updated target operation (e.g., movement and scaling) speed.
  • the initial operation (e.g., movement and scaling) speed in this step refers to the updated initial operation (e.g., movement and scaling) speed.
  • a process that the terminal controls the camera to move based on the time-speed change magnification curve, the initial operation (e.g., movement and scaling) speed, and the target operation (e.g., movement and scaling) speed includes step 2031 to step 2034 :
  • step 2031 the terminal divides a process of controlling the camera to move into a reference number of subprocesses.
  • in the time interval corresponding to the target time slice, the terminal can divide, according to a second reference time interval, the process of controlling the camera to move into a reference number of consecutive subprocesses, so that the camera meets the first target parameter by being controlled to move step by step.
  • the reference number may be determined according to the time interval corresponding to the target time slice and the second reference time interval.
  • the terminal uses a ratio of the time interval corresponding to the target time slice to the second reference time interval as the reference number.
  • the second reference time interval is determined according to the time interval corresponding to the target time slice or is set according to experience, which is not limited in the embodiments of this disclosure. In some embodiments, when the time interval corresponding to the target time slice is 1 second, the second reference time interval is set to 0.02 seconds. In this case, the reference number is 50. A time interval corresponding to each subprocess obtained by the terminal according to this division manner is 0.02 seconds.
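The division in step 2031 can be sketched as follows (the 1-second slice and 0.02-second reference interval follow the example above; the function name is illustrative):

```python
def split_into_subprocesses(slice_interval, reference_interval=0.02):
    """Divide the control process into equal consecutive subprocesses;
    the reference number is the ratio of the two intervals."""
    count = round(slice_interval / reference_interval)
    return count, reference_interval

count, sub_dt = split_into_subprocesses(1.0)  # 50 subprocesses of 0.02 s each
```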
  • the terminal can further obtain a sub-operation (e.g., movement and scaling) speed corresponding to each subprocess based on step 2032 and step 2033 .
  • step 2032 the terminal determines a speed change magnification corresponding to any subprocess based on a time parameter corresponding to the any subprocess and the time-speed change magnification curve.
  • the time in the time-speed change magnification curve is normalized, and a time range corresponding to the time-speed change magnification curve is [0, 1].
  • the time parameter corresponding to the any subprocess needs to be obtained first.
  • a process that the terminal obtains the time parameter corresponding to the any subprocess includes the two following steps:
  • step 1 the terminal uses a ratio of a target time interval to the time interval corresponding to the target time slice as a target ratio.
  • step 2 when the initial operation (e.g., movement and scaling) speed is less than the target operation (e.g., movement and scaling) speed, the terminal uses the target ratio as the time parameter corresponding to the any subprocess; and when the initial operation (e.g., movement and scaling) speed is not less than the target operation (e.g., movement and scaling) speed, the terminal uses a difference between 1 and the target ratio as the time parameter corresponding to the any subprocess.
  • the terminal can determine a speed change magnification corresponding to the time parameter based on the time-speed change magnification curve, and uses the speed change magnification as the speed change magnification corresponding to the any subprocess.
  • the speed change magnification includes at least one of a translation speed change magnification or a scaling speed change magnification.
  • the terminal determines a translation speed change magnification corresponding to the time parameter based on a time-translation speed change magnification curve, and uses the translation speed change magnification as the translation speed change magnification corresponding to the any subprocess; and in a process that the terminal obtains a scaling speed change magnification corresponding to the any subprocess, the terminal determines a scaling speed change magnification corresponding to the time parameter based on a time-scaling speed change magnification curve, and uses the scaling speed change magnification as the scaling speed change magnification corresponding to the any subprocess.
  • step 2033 the terminal determines a sub-operation (e.g., movement and scaling) speed corresponding to the any subprocess based on the initial operation (e.g., movement and scaling) speed, the target operation (e.g., movement and scaling) speed, and the speed change magnification corresponding to the any subprocess.
  • V c represents the sub-operation (e.g., movement and scaling) speed corresponding to the any subprocess
  • V 1 represents a relatively small operation (e.g., movement and scaling) speed of the initial operation (e.g., movement and scaling) speed and the target operation (e.g., movement and scaling) speed
  • V 2 represents a relatively large operation (e.g., movement and scaling) speed of the initial operation (e.g., movement and scaling) speed and the target operation (e.g., movement and scaling) speed
  • T represents the time parameter corresponding to the any subprocess determined in step 2032
  • ⁇ V(T) represents the speed change magnification corresponding to the any subprocess.
  • the sub-operation (e.g., movement and scaling) speed includes at least one of a sub-translation speed or a sub-scaling speed.
  • V c represents the sub-translation speed corresponding to the any subprocess
  • V 1 represents a relatively small translation speed of the initial translation speed and the target translation speed
  • V 2 represents a relatively large translation speed of the initial translation speed and the target translation speed
  • ⁇ V(T) represents the translation speed change magnification corresponding to the any subprocess determined based on the time-translation speed change magnification curve.
  • V c represents the sub-scaling speed corresponding to the any subprocess
  • V 1 represents a relatively small scaling speed of the initial scaling speed and the target scaling speed
  • V 2 represents a relatively large scaling speed of the initial scaling speed and the target scaling speed
  • ⁇ V(T) represents the scaling speed change magnification corresponding to the any subprocess determined based on the time-scaling speed change magnification curve.
  • step 2034 the terminal controls the camera to move according to the sub-operation (e.g., movement and scaling) speed corresponding to the any subprocess in a time interval corresponding to the any subprocess.
  • in the time interval corresponding to the any subprocess, the operation (e.g., movement and scaling) speed of the camera remains at the sub-operation (e.g., movement and scaling) speed corresponding to the any subprocess.
  • the time interval corresponding to the any subprocess is 0.02 seconds
  • the sub-operation (e.g., movement and scaling) speed corresponding to the any subprocess includes a sub-translation speed and a sub-scaling speed, where the sub-translation speed is 1 meter per second, and the sub-scaling speed is 0.5 per second.
  • the camera is controlled to perform scaling at a speed of 0.5 per second while the camera is controlled to perform translation at a speed of 1 meter per second.
  • the terminal may continue to determine a sub-operation (e.g., movement and scaling) speed of a next subprocess, to control the camera to move according to the sub-operation (e.g., movement and scaling) speed corresponding to the next subprocess after controlling the camera to move according to the sub-operation (e.g., movement and scaling) speed corresponding to the any subprocess, and the rest may be deduced by analogy until a sub-operation (e.g., movement and scaling) speed corresponding to the last subprocess in the target time slice is determined.
  • the terminal then controls the camera to move according to the sub-operation (e.g., movement and scaling) speed corresponding to the last subprocess, to cause the camera to meet the first target parameter, so as to complete the camera operation (e.g., movement and scaling) control process in the target time slice.
  • the terminal then continues to perform a camera operation (e.g., movement and scaling) control operation in a next time slice based on the foregoing step 201 to step 203 .
  • the operation (e.g., movement and scaling) continuity of the camera between adjacent time slices is relatively good, and a transition of the camera between adjacent time slices may be smooth, thereby achieving smooth operation (e.g., movement and scaling) and reducing discomfort of the user.
  • a time-operation (e.g., movement and scaling) speed curve of camera operation (e.g., movement and scaling) among 3 adjacent time slices may be as shown by 801 in FIG. 8 , and the curve 801 in FIG. 8 has good continuity at each junction, which indicates that the smoothness of the camera operation (e.g., movement and scaling) process is relatively good.
  • the process in which the terminal controls the camera to move may be as shown by 901 to 904 in FIG. 9 :
  • temporary parameters are prepared. Before a first target parameter is obtained, temporary parameters for use in subsequent processes are prepared, which include, but are not limited to, a trigger frequency, a weight value of each target frame, a parameter change threshold, a time-speed change magnification curve, a distance-scaling change magnification curve, and an angle-steering mixed coefficient curve.
  • a first target parameter is obtained.
  • the terminal obtains a sampling position parameter and a sampling scaling parameter corresponding to each target frame, where the sampling scaling parameter is determined according to the distance-scaling change magnification curve.
  • the terminal determines the first target parameter according to the sampling parameter and the weight value corresponding to each target frame.
  • the terminal performs validity verification on the first target parameter according to the parameter change threshold, to achieve an effect of filtering the first target parameter.
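The validity verification can be sketched as a simple threshold filter. The disclosure only states that verification filters the first target parameter against a parameter change threshold; the assumption here that changes smaller than the threshold are discarded as jitter is illustrative:

```python
def passes_validity_verification(candidate, current, change_threshold):
    """Return True when the candidate first target parameter differs from
    the currently met parameter by at least the change threshold; smaller
    changes are filtered out (assumed here to be jitter suppression)."""
    return abs(candidate - current) >= change_threshold
```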
  • the terminal determines a target operation (e.g., movement and scaling) speed.
  • the terminal determines a steering mixed coefficient according to the angle-steering mixed coefficient curve.
  • the terminal updates an initial operation (e.g., movement and scaling) speed according to the steering mixed coefficient.
  • the terminal determines the target operation (e.g., movement and scaling) speed according to the first target parameter, a time interval, the time-speed change magnification curve, and the updated initial operation (e.g., movement and scaling) speed.
  • the terminal controls the camera to move.
  • the terminal determines a sub-operation (e.g., movement and scaling) speed corresponding to each subprocess according to the target operation (e.g., movement and scaling) speed, the updated initial operation (e.g., movement and scaling) speed, and the time-speed change magnification curve.
  • the terminal controls the camera to move according to the sub-operation (e.g., movement and scaling) speed corresponding to each subprocess, to cause the camera to meet the first target parameter.
  • the terminal obtains the first target parameter that needs to be met by the camera in the target time slice based on the data of the plurality of target frames, so that the reliability of the first target parameter is relatively high, which helps improve the stability of camera operation (e.g., movement and scaling).
  • the camera operation is controlled based on a time-speed change magnification curve, which helps improve the operation (e.g., movement and scaling) continuity of the camera between adjacent time slices, so that the stability of the operation (e.g., movement and scaling) process of the camera is relatively high, and the camera operation (e.g., movement and scaling) control effect is relatively good.
  • the process for camera control in FIG. 9 can be implemented in any suitable electronic device, such as a terminal (device), a server (device), and the like.
  • an embodiment of this disclosure provides a camera operation (e.g., movement and scaling) control apparatus that includes an obtaining module 1001 , a determining module 1002 , and a control module 1003 .
  • the obtaining module 1001 is configured to obtain, based on data of a plurality of target frames corresponding to a target time slice, a first target parameter that needs to be met by a camera in the target time slice.
  • the first target parameter includes at least one of a first position parameter or a first scaling parameter.
  • the determining module 1002 is configured to determine a target operation (e.g., movement and scaling) speed of the camera in the target time slice based on the first target parameter, an initial operation (e.g., movement and scaling) speed of the camera in the target time slice, a time interval corresponding to the target time slice, and a time-speed change magnification curve.
  • the target operation (e.g., movement and scaling) speed includes at least one of a target translation speed or a target scaling speed.
  • the control module 1003 is configured to control the camera to move based on the time-speed change magnification curve, the initial operation (e.g., movement and scaling) speed, and the target operation (e.g., movement and scaling) speed.
  • the obtaining module 1001 is configured to perform sampling processing on the data of the target frames corresponding to the target time slice, to obtain a sampling parameter corresponding to each target frame; set a weight value for each target frame according to a distance between a timestamp of the target frame and a start timestamp of the target time slice; and determine the first target parameter that needs to be met by the camera in the target time slice based on the sampling parameter corresponding to each target frame and the weight value corresponding to each target frame.
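The weighting step above can be sketched as follows. The disclosure says only that each weight depends on the distance between the frame's timestamp and the start timestamp of the target time slice; the inverse-distance, normalized scheme used here is an illustrative assumption:

```python
def first_target_parameter(sampling_params, timestamps, slice_start):
    """Combine per-frame sampling parameters (position or scaling) into the
    first target parameter as a weighted average, weighting each frame by
    how close its timestamp is to the start of the target time slice."""
    # Inverse-distance weights, normalized to sum to 1 (an assumption).
    raw = [1.0 / (1.0 + abs(ts - slice_start)) for ts in timestamps]
    total = sum(raw)
    return sum(p * (w / total) for p, w in zip(sampling_params, raw))
```

With equal timestamp distances the result reduces to the plain average; a frame much farther from the slice start contributes correspondingly less.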
  • the sampling parameter corresponding to each target frame includes a sampling position parameter
  • the obtaining module 1001 is further configured to obtain a center position parameter of an interaction object in each target frame based on the data of the target frames; and determine the sampling position parameter corresponding to each target frame based on the center position parameter of the interaction object in each target frame and an assembly position parameter of the interaction object.
  • the sampling parameter corresponding to each target frame includes a sampling scaling parameter
  • the obtaining module 1001 is further configured to obtain a distance parameter corresponding to each target frame based on the data of the target frames; determine a scaling change magnification corresponding to the distance parameter based on a distance-scaling change magnification curve; and determine the sampling scaling parameter corresponding to each target frame based on the scaling change magnification.
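The distance-scaling change magnification curve is not given a concrete form in the disclosure; one simple realization is a sorted point list with piecewise-linear interpolation, clamped at both ends. The curve values below are hypothetical:

```python
import bisect

def curve_value(curve, x):
    """Piecewise-linear lookup on a curve given as sorted (x, y) points,
    clamped outside the defined range."""
    xs = [p[0] for p in curve]
    ys = [p[1] for p in curve]
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_right(xs, x)
    x0, y0, x1, y1 = xs[i - 1], ys[i - 1], xs[i], ys[i]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# Hypothetical distance-scaling change magnification curve: larger
# distances between interaction objects map to larger scaling changes.
DISTANCE_SCALING_CURVE = [(0.0, 1.0), (10.0, 1.5), (20.0, 2.0)]

def sampling_scaling_parameter(base_scaling, distance):
    """Scale a base scaling parameter by the magnification looked up for
    the frame's distance parameter."""
    return base_scaling * curve_value(DISTANCE_SCALING_CURVE, distance)
```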
  • the determining module 1002 is configured to obtain a steering angle corresponding to the target time slice based on a first operation (e.g., movement and scaling) direction and a second operation (e.g., movement and scaling) direction; determine a steering mixed coefficient corresponding to the steering angle; update the initial operation (e.g., movement and scaling) speed based on the steering mixed coefficient, to obtain an updated initial operation (e.g., movement and scaling) speed; and determine the target operation (e.g., movement and scaling) speed of the camera in the target time slice based on the first target parameter, the updated initial operation (e.g., movement and scaling) speed, the time interval corresponding to the target time slice, and the time-speed change magnification curve.
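The steering update can be sketched for 2D direction vectors. The linear angle-steering mixed coefficient curve below (full carry-over when the direction is unchanged, decaying to zero on a full reversal) is an illustrative assumption, not a curve from the disclosure:

```python
import math

def steering_angle_deg(prev_dir, curr_dir):
    """Angle in degrees between the first (previous) and second (current)
    operation directions, given as 2D vectors."""
    dot = prev_dir[0] * curr_dir[0] + prev_dir[1] * curr_dir[1]
    norm = math.hypot(*prev_dir) * math.hypot(*curr_dir)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def steering_mixed_coefficient(angle_deg):
    """Hypothetical angle-steering mixed coefficient curve: 1 at 0 degrees,
    0 at 180 degrees."""
    return 1.0 - angle_deg / 180.0

def updated_initial_speed(initial_speed, prev_dir, curr_dir):
    """Damp the initial operation speed by the steering mixed coefficient."""
    return initial_speed * steering_mixed_coefficient(
        steering_angle_deg(prev_dir, curr_dir))
```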
  • the determining module 1002 is configured to obtain a change parameter of the camera in the target time slice based on a second target parameter that has been met by the camera in the target time slice and the first target parameter, where the second target parameter includes at least one of a second position parameter or a second scaling parameter; obtain a point value corresponding to the time-speed change magnification curve; and determine the target operation (e.g., movement and scaling) speed of the camera in the target time slice based on the change parameter, the initial operation (e.g., movement and scaling) speed, the time interval corresponding to the target time slice, and the point value corresponding to the time-speed change magnification curve.
  • the control module 1003 is configured to divide a process of controlling the camera to move into a reference number of subprocesses; determine a speed change magnification corresponding to each subprocess based on a time parameter corresponding to the subprocess and the time-speed change magnification curve; determine a sub-operation (e.g., movement and scaling) speed corresponding to the subprocess based on the initial operation (e.g., movement and scaling) speed, the target operation (e.g., movement and scaling) speed, and the speed change magnification corresponding to the subprocess; and control the camera to move according to the sub-operation (e.g., movement and scaling) speed corresponding to the subprocess in a time interval corresponding to the subprocess.
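The per-subprocess interpolation can be sketched as below, with the time-speed change magnification curve passed in as a callable returning a value in [0, 1] (a linear curve is used in the usage note purely for illustration):

```python
def subprocess_speeds(initial_speed, target_speed, num_subprocesses,
                      magnification):
    """For the k-th subprocess, look up the speed change magnification m at
    its normalized time on the time-speed change magnification curve, then
    interpolate: sub_speed = initial + (target - initial) * m."""
    speeds = []
    for k in range(1, num_subprocesses + 1):
        m = magnification(k / num_subprocesses)
        speeds.append(initial_speed + (target_speed - initial_speed) * m)
    return speeds
```

With a linear curve, an initial speed of 0 and a target speed of 2 over 4 subprocesses, the sub-speeds step through 0.5, 1.0, 1.5, 2.0, ending exactly at the target speed so that the first target parameter is met at the slice boundary.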
  • the apparatus further includes a verification module 1004 .
  • the verification module 1004 is configured to perform validity verification on the first target parameter based on a parameter change threshold, and the determining module 1002 is further configured to determine the target operation (e.g., movement and scaling) speed of the camera in the target time slice based on the first target parameter passing the validity verification, the initial operation (e.g., movement and scaling) speed of the camera in the target time slice, the time interval corresponding to the target time slice, and the time-speed change magnification curve.
  • the apparatus further includes an update module 1005 .
  • the update module 1005 is configured to update the target operation (e.g., movement and scaling) speed based on a correction coefficient, to obtain an updated target operation (e.g., movement and scaling) speed, and the control module 1003 is further configured to control the camera to move based on the time-speed change magnification curve, the initial operation (e.g., movement and scaling) speed, and the updated target operation (e.g., movement and scaling) speed, to cause the camera to meet the first target parameter.
  • the time-speed change magnification curve is a Bezier curve.
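A one-dimensional cubic Bezier polynomial is one way to realize such a curve; the two control values 0.1 and 0.9 below are illustrative assumptions chosen to give an ease-in/ease-out shape, not values from the disclosure:

```python
def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a one-dimensional cubic Bezier polynomial at t in [0, 1]."""
    u = 1.0 - t
    return u**3 * p0 + 3 * u**2 * t * p1 + 3 * u * t**2 * p2 + t**3 * p3

def time_speed_magnification(t):
    """A time-speed change magnification curve rising smoothly from 0 to 1,
    so consecutive time slices join without a sudden speed jump."""
    return cubic_bezier(0.0, 0.1, 0.9, 1.0, t)
```

Because the curve starts at 0 and ends at 1, the camera leaves a slice at exactly the target speed, which becomes the initial speed of the next slice; this is what produces the smooth junctions shown by curve 801 in FIG. 8.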
  • the first target parameter that needs to be met by the camera in the target time slice is obtained based on the data of the plurality of target frames, so that the reliability of the first target parameter is relatively high, which helps improve the stability of camera operation (e.g., movement and scaling).
  • the camera operation is controlled based on a time-speed change magnification curve, which helps improve the operation (e.g., movement and scaling) continuity of the camera between adjacent time slices, so that the stability of the operation (e.g., movement and scaling) process of the camera is relatively high, and the camera operation (e.g., movement and scaling) control effect is relatively good.
  • the division of the foregoing functional modules is merely an example for description.
  • the functions may be assigned to and completed by different functional modules according to the requirements, that is, the internal structure of the device is divided into different functional modules, to implement all or some of the functions described above.
  • the apparatus and method embodiments provided in the foregoing embodiments belong to the same concept. For the specific implementation process, reference may be made to the method embodiments, and details are not described herein again.
  • the embodiments of this disclosure may be implemented in a terminal and/or implemented in a server, and a structure of the terminal is described below.
  • FIG. 12 is a schematic structural diagram of a terminal according to an embodiment of this disclosure, and a game client is installed in the terminal.
  • the terminal may be a smartphone, a tablet computer, a notebook computer, or a desktop computer.
  • the terminal may also be referred to as user equipment, a portable terminal, a laptop terminal, or a desktop terminal, among other names.
  • the terminal includes a processor 1201 and a memory 1202 .
  • the processor 1201 may include one or more processing cores, for example, a 4-core processor or an 8-core processor.
  • the processor 1201 may be implemented by using at least one hardware form of a digital signal processor (DSP), a field-programmable gate array (FPGA), and a programmable logic array (PLA).
  • the processor 1201 may also include a main processor and a coprocessor.
  • the main processor is a processor configured to process data in an awake state, and is also referred to as a central processing unit (CPU).
  • the coprocessor is a low power consumption processor configured to process the data in a standby state.
  • the processor 1201 may be integrated with a graphics processing unit (GPU).
  • the GPU is configured to render and draw content that needs to be displayed on a display.
  • the processor 1201 may further include an artificial intelligence (AI) processor.
  • the AI processor is configured to process computing operations related to machine learning.
  • the memory 1202 may include one or more computer-readable storage media.
  • the computer-readable storage medium may be non-transient.
  • the memory 1202 may further include a high-speed random access memory and a non-volatile memory, for example, one or more disk storage devices or flash storage devices.
  • the non-transient computer-readable storage medium in the memory 1202 is configured to store at least one instruction, and the at least one instruction is configured to be executed by the processor 1201 to perform the following steps:
  • the processor is configured to perform the following steps:
  • the sampling parameter corresponding to each target frame includes a sampling position parameter
  • the processor is configured to perform the following steps:
  • the sampling parameter corresponding to each target frame includes a sampling scaling parameter
  • the processor is configured to perform the following steps:
  • the processor is configured to perform the following steps:
  • the processor is configured to perform the following steps:
  • the processor is configured to perform the following steps:
  • the processor is further configured to perform the following steps:
  • the processor is further configured to perform the following steps:
  • the time-speed change magnification curve is a Bezier curve.
  • the terminal may include a peripheral device interface 1203 and at least one peripheral device.
  • the processor 1201 , the memory 1202 , and the peripheral device interface 1203 may be connected by using a bus or a signal cable.
  • Each peripheral device may be connected to the peripheral device interface 1203 by using a bus, a signal cable, or a circuit board.
  • the peripheral device includes: at least one of a radio frequency (RF) circuit 1204 , a touch display screen 1205 , a camera component 1206 , an audio circuit 1207 , a positioning component 1208 , and a power supply 1209 .
  • the peripheral device interface 1203 may be configured to connect the at least one peripheral device related to input/output (I/O) to the processor 1201 and the memory 1202 .
  • the processor 1201 , the memory 1202 , and the peripheral device interface 1203 are integrated on a same chip or circuit board.
  • alternatively, any one or two of the processor 1201 , the memory 1202 , and the peripheral device interface 1203 may be implemented on a separate chip or circuit board. This is not limited in this embodiment.
  • the RF circuit 1204 is configured to receive and transmit an RF signal, also referred to as an electromagnetic signal.
  • the RF circuit 1204 communicates with a communication network and other communication devices through the electromagnetic signal.
  • the RF circuit 1204 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal.
  • the RF circuit 1204 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chip set, a subscriber identity module card, and the like.
  • the RF circuit 1204 may communicate with another terminal by using at least one wireless communication protocol.
  • the wireless communication protocol includes, but is not limited to, a metropolitan area network, different generations of mobile communication networks (2G, 3G, 4G, and 5G), a wireless local area network, and/or a Wi-Fi network.
  • the RF circuit 1204 may further include a circuit related to near field communication (NFC), which is not limited in this disclosure.
  • the display screen 1205 is configured to display a user interface (UI).
  • the UI may include a graph, text, an icon, a video, and any combination thereof.
  • the display screen 1205 is further capable of collecting touch signals on or above a surface of the display screen 1205 .
  • the touch signal may be inputted to the processor 1201 as a control signal for processing.
  • the display screen 1205 may be further configured to provide a virtual button and/or a virtual keyboard that are/is also referred to as a soft button and/or a soft keyboard.
  • there may be one display screen 1205 disposed on a front panel of the terminal.
  • the display screen 1205 may be a flexible display screen disposed on a curved surface or a folded surface of the terminal. The display screen 1205 may even be set in a non-rectangular irregular pattern, namely, a special-shaped screen.
  • the display screen 1205 may be prepared by using materials such as a liquid-crystal display (LCD) or an organic light-emitting diode (OLED).
  • the camera component 1206 is configured to collect images or videos.
  • the camera component 1206 includes a front-facing camera and a rear-facing camera.
  • the front-facing camera is disposed on the front panel of the terminal
  • the rear-facing camera is disposed on a back surface of the terminal.
  • there are at least two rear-facing cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so as to achieve background blur through fusion of the main camera and the depth-of-field camera, panoramic photographing and virtual reality (VR) photographing through fusion of the main camera and the wide-angle camera, or other fusion photographing functions.
  • the camera component 1206 may further include a flash.
  • the flash may be a monochrome temperature flash, or may be a double color temperature flash.
  • the double color temperature flash refers to a combination of a warm light flash and a cold light flash, and may be used for light compensation under different color temperatures.
  • the audio circuit 1207 may include a microphone and a speaker.
  • the microphone is configured to acquire sound waves of a user and an environment, and convert the sound waves into an electrical signal to input to the processor 1201 for processing, or input to the RF circuit 1204 for implementing voice communication.
  • the microphone may further be an array microphone or an omni-directional acquisition type microphone.
  • the speaker is configured to convert electric signals from the processor 1201 or the RF circuit 1204 into sound waves.
  • the speaker may be a conventional film speaker, or may be a piezoelectric ceramic speaker.
  • when the speaker is a piezoelectric ceramic speaker, it can convert an electric signal not only into sound waves audible to human beings, but also into sound waves inaudible to human beings, for ranging and other purposes.
  • the audio circuit 1207 may further include an earphone jack.
  • the positioning component 1208 is configured to position a current geographic location of the terminal, to implement a navigation or a location based service (LBS).
  • the positioning component 1208 may be a positioning component based on the Global Positioning System (GPS) of the United States, the BeiDou system of China, the GLONASS System of Russia, or the GALILEO System of the European Union.
  • the power supply 1209 is configured to supply power to components in the terminal.
  • the power supply 1209 may be an alternating current, a direct current, a primary battery, or a rechargeable battery.
  • the rechargeable battery may support wired charging or wireless charging.
  • the rechargeable battery may be further configured to support a fast charging technology.
  • the terminal may further include one or more sensors 1210 .
  • the one or more sensors 1210 include, but are not limited to: an acceleration sensor 1211 , a gyroscope sensor 1212 , a pressure sensor 1213 , a fingerprint sensor 1214 , an optical sensor 1215 , and a proximity sensor 1216 .
  • the acceleration sensor 1211 can detect a magnitude of acceleration on three coordinate axes of a coordinate system established based on the terminal.
  • the acceleration sensor 1211 can be configured to detect components of gravity acceleration on three coordinate axes.
  • the processor 1201 may control, according to a gravity acceleration signal acquired by the acceleration sensor 1211 , the touch display screen 1205 to display the UI in a landscape view or a portrait view.
  • the acceleration sensor 1211 may be further configured to acquire motion data of a game or a user.
  • the gyroscope sensor 1212 may detect a body direction and a rotation angle of the terminal, and the gyroscope sensor 1212 may work with the acceleration sensor 1211 to acquire a 3D action performed by the user on the terminal.
  • the processor 1201 may implement the following functions according to data acquired by the gyroscope sensor 1212 : motion sensing (for example, the UI is changed according to a tilt operation of the user), image stabilization during shooting, game control, and inertial navigation.
  • the pressure sensor 1213 may be disposed at a side frame of the terminal and/or a lower layer of the display screen 1205 .
  • when the pressure sensor 1213 is disposed at the side frame, a holding signal of the user on the terminal can be detected, and the processor 1201 performs left/right hand recognition or a quick operation according to the holding signal acquired by the pressure sensor 1213 .
  • when the pressure sensor 1213 is disposed at the lower layer of the touch display screen 1205 , the processor 1201 controls, according to a pressure operation of the user on the touch display screen 1205 , an operable control on the UI.
  • the operable control includes at least one of a button control, a scroll-bar control, an icon control, and a menu control.
  • the fingerprint sensor 1214 is configured to acquire a user's fingerprint, and the user's identity is identified according to the acquired fingerprint by the processor 1201 or by the fingerprint sensor 1214 itself. When the user's identity is identified as a trusted identity, the processor 1201 authorizes the user to perform related sensitive operations.
  • the sensitive operations include: unlocking a screen, viewing encrypted information, downloading software, paying, changing a setting, and the like.
  • the fingerprint sensor 1214 may be disposed on a front surface, a back surface, or a side surface of the terminal. When a physical button or a vendor logo is disposed on the terminal, the fingerprint sensor 1214 may be integrated with the physical button or the vendor logo.
  • the optical sensor 1215 is configured to acquire ambient light intensity.
  • the processor 1201 may control the display brightness of the touch display screen 1205 according to the ambient light intensity acquired by the optical sensor 1215 . Specifically, when the ambient light intensity is relatively high, the display brightness of the touch display screen 1205 is increased. When the ambient light intensity is relatively low, the display brightness of the touch display screen 1205 is decreased.
  • the processor 1201 may further dynamically adjust a camera parameter of the camera component 1206 according to the ambient light intensity acquired by the optical sensor 1215 .
  • the proximity sensor 1216 is also referred to as a distance sensor and is generally disposed at the front panel of the terminal.
  • the proximity sensor 1216 is configured to acquire a distance between the user and the front face of the terminal.
  • when the proximity sensor 1216 detects that the distance between the user and the front face of the terminal gradually decreases, the touch display screen 1205 is controlled by the processor 1201 to switch from a screen-on state to a screen-off state.
  • when the proximity sensor 1216 detects that the distance gradually increases, the touch display screen 1205 is controlled by the processor 1201 to switch from the screen-off state to the screen-on state.
  • FIG. 12 constitutes no limitation on the terminal.
  • the terminal may include more or fewer components than those shown in the drawings, some components may be combined, and a different component deployment may be used.
  • a server is further provided.
  • the server includes a processor 1301 and a memory 1302 , and the memory 1302 stores at least one piece of program code.
  • the at least one piece of program code is loaded and executed by one or more processors 1301 , to implement any camera operation (e.g., movement and scaling) control method described above.
  • a computer-readable storage medium is further provided, storing at least one piece of program code, the at least one piece of program code being loaded and executed by a processor of a computer device to perform the following steps:
  • the processor is configured to perform the following steps:
  • the sampling parameter corresponding to each target frame includes a sampling position parameter
  • the processor is configured to perform the following steps:
  • the sampling parameter corresponding to each target frame includes a sampling scaling parameter
  • the processor is configured to perform the following steps:
  • the processor is configured to perform the following steps:
  • the processor is configured to perform the following steps:
  • the processor is configured to perform the following steps:
  • the processor is further configured to perform the following steps:
  • the processor is further configured to perform the following steps:
  • the time-speed change magnification curve is a Bezier curve.
  • the computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
  • modules, submodules, and/or units in the present disclosure can be implemented by processing circuitry, software, or a combination thereof, for example.
  • the term module (and other similar terms such as unit, submodule, etc.) in this disclosure may refer to a software module, a hardware module, or a combination thereof.
  • a software module (e.g., a computer program) may be developed using a computer programming language.
  • a hardware module may be implemented using processing circuitry and/or memory.
  • Each module can be implemented using one or more processors (or processors and memory).
  • each module can be part of an overall module that includes the functionalities of the module.
  • “Plurality of” mentioned in this specification means two or more. “And/or” describes an association relationship for describing associated objects and represents that three relationships may exist. For example, A and/or B may represent the following three cases: Only A exists, both A and B exist, and only B exists. The character “/” in this specification generally indicates an “or” relationship between the associated objects.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Lens Barrels (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
US17/675,293 2020-01-17 2022-02-18 Camera movement control method and apparatus, device, and storage medium Active 2041-06-29 US11962897B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN202010055703.XA CN111246095B (zh) 2020-01-17 2020-01-17 Method, apparatus, device, and storage medium for controlling lens movement
CN202010055703.X 2020-01-17
PCT/CN2020/126463 WO2021143296A1 (zh) 2020-01-17 2020-11-04 Method, apparatus, device, and storage medium for controlling lens movement

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/126463 Continuation WO2021143296A1 (zh) 2020-01-17 2020-11-04 Method, apparatus, device, and storage medium for controlling lens movement

Publications (2)

Publication Number Publication Date
US20220174206A1 US20220174206A1 (en) 2022-06-02
US11962897B2 true US11962897B2 (en) 2024-04-16

Family

ID=70872728

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/675,293 Active 2041-06-29 US11962897B2 (en) 2020-01-17 2022-02-18 Camera movement control method and apparatus, device, and storage medium

Country Status (5)

Country Link
US (1) US11962897B2 (zh)
EP (1) EP4000700A4 (zh)
JP (1) JP7487293B2 (zh)
CN (1) CN111246095B (zh)
WO (1) WO2021143296A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111246095B (zh) 2020-01-17 2021-04-27 Tencent Technology (Shenzhen) Company Limited Method, apparatus, device, and storage medium for controlling lens movement
CN113962138B (zh) * 2020-07-21 2023-11-03 Tencent Technology (Shenzhen) Company Limited Parameter value determination method, apparatus, device, and storage medium for a mobile platform
CN112399086B (zh) * 2020-12-08 2022-04-29 Zhejiang Dahua Technology Co., Ltd. Motion control method and apparatus, storage medium, and electronic apparatus
CN114140329B (zh) * 2021-12-13 2023-03-28 广东欧谱曼迪科技有限公司 Endoscope image zooming method, system, and execution apparatus
CN114371806B (zh) * 2022-03-22 2022-08-26 广州三七极创网络科技有限公司 Virtual camera lens parameter processing and updating method, apparatus, device, and medium
CN114676389B (zh) * 2022-04-02 2023-06-27 深圳市大族机器人有限公司 Motor control method and apparatus, computer device, and storage medium

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1454663A1 (en) 2003-03-05 2004-09-08 Square Enix Co., Ltd. Virtual camera control method in three-dimensional video game
EP2030661A2 (en) 2007-08-30 2009-03-04 Kabushiki Kaisha Square Enix (also trading as Square Enix Co., Ltd.) Image generating apparatus, method of generating image, program, and recording medium
US20130244782A1 (en) 2011-01-31 2013-09-19 Microsoft Corporation Real-time camera tracking using depth maps
US8576235B1 (en) 2009-07-13 2013-11-05 Disney Enterprises, Inc. Visibility transition planning for dynamic camera control
US20140243082A1 (en) 2012-04-26 2014-08-28 Riot Games, Inc. Systems and methods that enable a spectator's experience for online active games
US20150156422A1 (en) * 2013-11-29 2015-06-04 Avigilon Corporation Camera system control for correcting bore-sight offset
CN104793643A (zh) 2015-04-13 2015-07-22 Three-dimensional stop-motion animation shooting system and control method
WO2015164155A1 (en) 2014-04-24 2015-10-29 Microsoft Technology Licensing, Llc Artist-directed volumetric dynamic virtual cameras
CN105597311A (zh) 2015-12-25 2016-05-25 Camera control method and apparatus in a 3D game
CN106502395A (zh) 2016-10-18 2017-03-15 Method and apparatus for avoiding user dizziness in a virtual reality application
CN106582012A (zh) 2016-12-07 2017-04-26 Climbing operation processing method and apparatus in a VR scene
CN106730852A (zh) 2016-12-20 2017-05-31 Control method and control apparatus for a virtual lens in a game system
CN106780874A (zh) 2016-12-05 2017-05-31 Intelligent access control system
CN107422848A (zh) 2017-06-16 2017-12-01 Method and system for testing an acceleration value of a virtual character
US10154228B1 (en) * 2015-12-23 2018-12-11 Amazon Technologies, Inc. Smoothing video panning
CN109550246A (zh) 2017-09-25 2019-04-02 Control method and apparatus for a game client, storage medium, and electronic apparatus
CN109621415A (zh) 2018-12-26 2019-04-16 Display control method and apparatus in a 3D game, and computer storage medium
CN110170167A (zh) 2019-05-28 2019-08-27 Picture display method, apparatus, device, and medium
CN111246095A (zh) 2020-01-17 2020-06-05 Method, apparatus, device, and storage medium for controlling lens movement

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3428562B2 (ja) * 2000-04-25 2003-07-22 株式会社スクウェア Method for processing object movement, recording medium, and game device
CN100487636C (zh) * 2006-06-09 2009-05-13 中国科学院自动化研究所 Game control system and method based on stereo vision
JP5143883B2 (ja) 2010-11-12 2013-02-13 株式会社コナミデジタルエンタテインメント Image processing apparatus, image processing program, and image processing method
CN102750720A (zh) * 2011-04-20 2012-10-24 鸿富锦精密工业(深圳)有限公司 Three-dimensional effect simulation system and method
CN102915553B (zh) * 2012-09-12 2016-02-10 珠海金山网络游戏科技有限公司 3D game video shooting system and method
JP6598522B2 (ja) 2015-06-12 2019-10-30 任天堂株式会社 Information processing apparatus, information processing system, information processing method, and information processing program
CN106780674B (zh) * 2016-11-28 2020-08-25 网易(杭州)网络有限公司 Camera movement method and apparatus
JP2017142783A (ja) 2017-01-04 2017-08-17 株式会社コロプラ Method and program for adjusting a field-of-view area in a virtual space

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1454663A1 (en) 2003-03-05 2004-09-08 Square Enix Co., Ltd. Virtual camera control method in three-dimensional video game
EP2030661A2 (en) 2007-08-30 2009-03-04 Kabushiki Kaisha Square Enix (also trading as Square Enix Co., Ltd.) Image generating apparatus, method of generating image, program, and recording medium
US8576235B1 (en) 2009-07-13 2013-11-05 Disney Enterprises, Inc. Visibility transition planning for dynamic camera control
US20130244782A1 (en) 2011-01-31 2013-09-19 Microsoft Corporation Real-time camera tracking using depth maps
US20140243082A1 (en) 2012-04-26 2014-08-28 Riot Games, Inc. Systems and methods that enable a spectator's experience for online active games
US20150156422A1 (en) * 2013-11-29 2015-06-04 Avigilon Corporation Camera system control for correcting bore-sight offset
WO2015164155A1 (en) 2014-04-24 2015-10-29 Microsoft Technology Licensing, Llc Artist-directed volumetric dynamic virtual cameras
CN104793643A (zh) 2015-04-13 2015-07-22 北京迪生动画科技有限公司 Three-dimensional stop-motion animation shooting system and control method
US10154228B1 (en) * 2015-12-23 2018-12-11 Amazon Technologies, Inc. Smoothing video panning
CN105597311A (zh) 2015-12-25 2016-05-25 网易(杭州)网络有限公司 Camera control method and apparatus in a 3D game
CN106502395A (zh) 2016-10-18 2017-03-15 深圳市火花幻境虚拟现实技术有限公司 Method and apparatus for preventing user dizziness in virtual reality applications
CN106780874A (zh) 2016-12-05 2017-05-31 苏州维盟韵联网络科技有限公司 Intelligent access control system
CN106582012A (zh) 2016-12-07 2017-04-26 腾讯科技(深圳)有限公司 Climbing operation processing method and apparatus in a VR scene
CN106730852A (zh) 2016-12-20 2017-05-31 网易(杭州)网络有限公司 Control method and control apparatus for a virtual camera in a game system
CN107422848A (zh) 2017-06-16 2017-12-01 福建天晴数码有限公司 Method and system for testing the acceleration value of a virtual character
CN109550246A (zh) 2017-09-25 2019-04-02 腾讯科技(深圳)有限公司 Control method and apparatus for a game client, storage medium, and electronic device
CN109621415A (zh) 2018-12-26 2019-04-16 网易(杭州)网络有限公司 Display control method and apparatus in a 3D game, and computer storage medium
CN110170167A (zh) 2019-05-28 2019-08-27 上海米哈游网络科技股份有限公司 Picture display method, apparatus, device, and medium
CN111246095A (zh) 2020-01-17 2020-06-05 腾讯科技(深圳)有限公司 Method, apparatus, device, and storage medium for controlling camera movement

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Chinese Office Action dated Nov. 24, 2020 in Chinese Application No. 202010055703.X, 6 pgs.
International Search Report and Written Opinion dated Jan. 25, 2021 in International Application No. PCT/CN2020/126463 with English translation, 11 pgs.
Supplementary European Office Action dated Oct. 14, 2022 in Application No. 20914246.2, 10 pgs.

Also Published As

Publication number Publication date
EP4000700A4 (en) 2022-11-16
EP4000700A1 (en) 2022-05-25
WO2021143296A1 (zh) 2021-07-22
JP2022550126A (ja) 2022-11-30
US20220174206A1 (en) 2022-06-02
JP7487293B2 (ja) 2024-05-20
CN111246095A (zh) 2020-06-05
CN111246095B (zh) 2021-04-27

Similar Documents

Publication Publication Date Title
US11962897B2 (en) Camera movement control method and apparatus, device, and storage medium
US11517099B2 (en) Method for processing images, electronic device, and storage medium
US11640235B2 (en) Additional object display method and apparatus, computer device, and storage medium
US11436779B2 (en) Image processing method, electronic device, and storage medium
CN110830811B (zh) Live streaming interaction method, apparatus, system, terminal, and storage medium
CN109600678B (zh) Information display method, apparatus, and system, server, terminal, and storage medium
US11790612B2 (en) Information display method and device, terminal, and storage medium
CN110022363B (zh) Method, apparatus, device, and storage medium for correcting the motion state of a virtual object
WO2021043121A1 (zh) Image face-swapping method, apparatus, system, device, and storage medium
CN111028144B (zh) Video face-swapping method, apparatus, and storage medium
WO2021073293A1 (zh) Animation file generation method, apparatus, and storage medium
CN112581358A (zh) Image processing model training method, and image processing method and apparatus
CN111083513B (zh) Live streaming picture processing method, apparatus, terminal, and computer-readable storage medium
CN110796083B (zh) Image display method, apparatus, terminal, and storage medium
WO2021218926A1 (zh) Image display method and apparatus, and computer device
CN112967261B (zh) Image fusion method, apparatus, device, and storage medium
WO2021258608A1 (zh) Bandwidth determination method, apparatus, terminal, and storage medium
CN110708582B (zh) Synchronized playback method, apparatus, electronic device, and medium
WO2019128430A1 (zh) Bandwidth determination method, apparatus, device, and storage medium
CN113065457B (zh) Face detection point processing method, apparatus, computer device, and storage medium
US20220405879A1 (en) Method for processing images and electronic device
CN114115660B (zh) Media resource processing method, apparatus, terminal, and storage medium
CN110660031B (zh) Image sharpening method, apparatus, and storage medium
CN110365903B (zh) Video-based object processing method, apparatus, device, and readable storage medium
CN113139919A (zh) Special effect display method, apparatus, computer device, and storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED, CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIU, QI;ZHAI, GUANGZHOU;CUI, WEIHUA;REEL/FRAME:059289/0280

Effective date: 20220215

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE