WO2021143296A1 - Method, apparatus, device, and storage medium for controlling lens movement - Google Patents

Method, apparatus, device, and storage medium for controlling lens movement

Info

Publication number
WO2021143296A1
WO2021143296A1 · PCT/CN2020/126463 · CN2020126463W
Authority
WO
WIPO (PCT)
Prior art keywords
target
parameter
speed
time slice
lens
Prior art date
Application number
PCT/CN2020/126463
Other languages
English (en)
French (fr)
Inventor
刘琦 (Liu Qi)
翟光洲 (Zhai Guangzhou)
崔卫华 (Cui Weihua)
Original Assignee
腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Company Limited)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited (腾讯科技(深圳)有限公司)
Priority to EP20914246.2A priority Critical patent/EP4000700A4/en
Priority to JP2022519447A priority patent/JP7487293B2/ja
Publication of WO2021143296A1 publication Critical patent/WO2021143296A1/zh
Priority to US17/675,293 priority patent/US11962897B2/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A63F13/5258 Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/695 Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N23/661 Transmitting camera control signals through networks, e.g. control via the Internet
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/478 Supplemental services, e.g. displaying phone caller identification, shopping application
    • H04N21/4781 Games
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35 Details of game servers

Definitions

  • the embodiments of the present application relate to the field of computer technology, and in particular to a method, apparatus, device, and storage medium for controlling lens movement.
  • the game client can control the movement of the guide lens and focus the guide lens on the exciting game scene, so as to improve the user's game video viewing experience.
  • the embodiments of the present application provide a method, apparatus, device, and storage medium for controlling lens movement, which can improve the effect of controlling lens movement.
  • the technical solution is as follows:
  • an embodiment of the present application provides a method for controlling lens movement, which is applied to a computer device, and the method includes:
  • based on data of multiple target frames corresponding to a target time slice, the first target parameter that the lens needs to meet in the target time slice is acquired, where the first target parameter includes at least one of a first position parameter and a first zoom parameter;
  • based on the first target parameter, the initial movement speed of the lens in the target time slice, the time interval corresponding to the target time slice, and a time-speed change magnification curve, the target motion speed of the lens in the target time slice is determined, where the target motion speed includes at least one of a target translation speed and a target zoom speed;
  • the movement of the lens is controlled based on the time-speed change magnification curve, the initial movement speed, and the target motion speed, so that the lens satisfies the first target parameter.
  • in another aspect, an apparatus for controlling lens movement is provided, and the apparatus includes:
  • the acquiring module is configured to acquire, based on the data of multiple target frames corresponding to the target time slice, the first target parameter that the lens needs to meet in the target time slice, where the first target parameter includes at least one of a first position parameter and a first zoom parameter;
  • the determining module is configured to determine the target motion speed of the lens in the target time slice based on the first target parameter, the initial movement speed of the lens in the target time slice, the time interval corresponding to the target time slice, and the time-speed change magnification curve, where the target motion speed includes at least one of a target translation speed and a target zoom speed;
  • the control module is configured to control the movement of the lens to satisfy the first target parameter based on the time-speed change magnification curve, the initial movement speed and the target movement speed.
  • in another aspect, a computer device is provided, including a processor and a memory, where at least one piece of program code is stored in the memory, and the at least one piece of program code is loaded and executed by the processor to implement the following steps:
  • based on data of multiple target frames corresponding to a target time slice, the first target parameter that the lens needs to meet in the target time slice is acquired, where the first target parameter includes at least one of a first position parameter and a first zoom parameter;
  • based on the first target parameter, the initial movement speed of the lens in the target time slice, the time interval corresponding to the target time slice, and a time-speed change magnification curve, the target motion speed of the lens in the target time slice is determined, where the target motion speed includes at least one of a target translation speed and a target zoom speed;
  • the movement of the lens is controlled based on the time-speed change magnification curve, the initial movement speed, and the target motion speed, so that the lens satisfies the first target parameter.
  • in another aspect, a computer-readable storage medium is also provided, in which at least one piece of program code is stored, and the at least one piece of program code is loaded and executed by a processor to implement the following steps:
  • based on data of multiple target frames corresponding to a target time slice, the first target parameter that the lens needs to meet in the target time slice is acquired, where the first target parameter includes at least one of a first position parameter and a first zoom parameter;
  • based on the first target parameter, the initial movement speed of the lens in the target time slice, the time interval corresponding to the target time slice, and a time-speed change magnification curve, the target motion speed of the lens in the target time slice is determined, where the target motion speed includes at least one of a target translation speed and a target zoom speed;
  • the movement of the lens is controlled based on the time-speed change magnification curve, the initial movement speed, and the target motion speed, so that the lens satisfies the first target parameter.
  • the first target parameter that the lens needs to meet in the target time slice is acquired based on the data of multiple target frames.
  • the reliability of the first target parameter is high, which is beneficial to improving the stability of the lens movement.
  • controlling the movement of the lens based on the time-speed change magnification curve helps improve the continuity of the lens movement between adjacent time slices, makes the lens movement process more stable, and yields a better lens-movement control effect.
  • FIG. 1 is a schematic diagram of an implementation environment of a method for controlling lens movement provided by an embodiment of the present application;
  • FIG. 2 is a flowchart of a method for controlling lens movement provided by an embodiment of the present application;
  • FIG. 3 is a schematic diagram of a movement process of an interactive object provided by an embodiment of the present application;
  • FIG. 4 is a schematic diagram of position points indicated by a center position parameter, a sampling position parameter, and an assembly position parameter provided by an embodiment of the present application;
  • FIG. 5 is a schematic diagram of a distance-zoom change magnification curve provided by an embodiment of the present application;
  • FIG. 6 is a schematic diagram of a time-speed change magnification curve provided by an embodiment of the present application;
  • FIG. 7 is a schematic diagram of an angle-steering mixing coefficient curve provided by an embodiment of the present application;
  • FIG. 8 is a schematic diagram of a time-motion speed curve provided by an embodiment of the present application;
  • FIG. 9 is a schematic diagram of a process of controlling lens movement provided by an embodiment of the present application;
  • FIG. 10 is a schematic diagram of an apparatus for controlling lens movement provided by an embodiment of the present application;
  • FIG. 11 is a schematic diagram of an apparatus for controlling lens movement provided by an embodiment of the present application;
  • FIG. 12 is a schematic structural diagram of a terminal provided by an embodiment of the present application;
  • FIG. 13 is a schematic structural diagram of a computer device provided by an embodiment of the present application.
  • the guide lens can automatically move in accordance with the rhythm of the game, and shows the user a wonderful team battle or some important moments without the user having to do any operation.
  • the game client controls the movement of the guide lens to focus the guide lens on the exciting game scene, so as to improve the user's game video viewing experience.
  • FIG. 1 shows a schematic diagram of an implementation environment of the method for controlling lens movement provided by an embodiment of the present application.
  • the implementation environment includes: a terminal 11 and a server 12.
  • the terminal 11 is installed with a game client capable of providing a directing mode.
  • the game client in the terminal 11 can apply the method provided in the embodiment of the present application to control the movement of the lens.
  • the lens in the embodiment of the present application is a guide lens in the guide mode, and the scene focused by the guide lens is the scene seen by the user in the guide mode.
  • the server 12 refers to a background server of the game client installed in the terminal 11, and the server 12 can provide data support for the game client installed in the terminal 11.
  • the terminal 11 is a smart device such as a mobile phone, a tablet computer, and a personal computer.
  • the server 12 is a server, or a server cluster composed of multiple servers, or a cloud computing service center.
  • the terminal 11 and the server 12 establish a communication connection through a wired or wireless network.
  • the terminal 11 and the server 12 are only examples; other existing or potential future terminals or servers applicable to this application should also fall within the scope of protection of this application, and are hereby incorporated by reference.
  • an embodiment of the present application provides a method for controlling lens movement, taking the method applied to a game client in a terminal as an example.
  • the method provided by the embodiment of the present application includes the following steps:
  • step 201 the terminal obtains the first target parameter that the shot needs to satisfy in the target time slice based on the data of the multiple target frames corresponding to the target time slice.
  • the target time slice refers to a time slice that currently needs to perform a lens motion control operation.
  • the terminal divides the game video into multiple time slices according to a first reference time interval, and then, based on the method provided in this embodiment of the present application, performs the lens motion control operation in each time slice in turn.
  • the first reference time interval is set according to experience or freely adjusted according to application scenarios, which is not limited in the embodiment of the present application. Exemplarily, the first reference time interval is set to 1 second, and at this time, the time interval corresponding to the target time slice is also 1 second.
  • the multiple target frames corresponding to the target time slice include the game video frame corresponding to the start timestamp of the target time slice, and a reference number of game video frames before that frame.
  • the reference number is set according to experience, or freely adjusted according to the application scenario, which is not limited in the embodiment of the present application. In some embodiments, the reference number is set to 10. In this case, the terminal can obtain the first target parameter that the lens needs to satisfy in the target time slice based on the 11 target frames corresponding to the target time slice.
  • the data of each target frame refers to the original data in the frame, including but not limited to the position data of the interactive object in the target frame, the time stamp data of the target frame, and the like.
  • the interactive object in the target frame refers to the game role model included in the target frame that participates in the game process.
  • the first target parameter refers to a parameter that the shot needs to meet at the end of the target time slice.
  • the first target parameter includes at least one of a first position parameter and a first zoom parameter.
  • the first position parameter is used to indicate the position where the lens needs to be focused at the end of the target time slice
  • the first zoom parameter is used to indicate the zoom value that the lens needs to reach at the end of the target time slice.
  • the terminal can express the first position parameter in the form of position coordinates, and express the first zoom parameter in the form of numerical values.
  • a numerical value between 0 and 1 can be used to indicate that the focus range of the lens is reduced, and 1 is used to indicate that the focus range of the lens remains unchanged.
  • a value greater than 1 indicates that the focus range of the lens is enlarged.
  • the terminal obtains the first target parameter based on the data of multiple target frames, which can reduce the adverse effect of abnormal data in any single video frame on the first target parameter and make the obtained first target parameter more reliable, thereby improving the stability of the subsequent lens-movement control process.
  • the process of the terminal acquiring the first target parameter that the shot needs to satisfy in the target time slice includes the following steps 2011 to 2013:
  • Step 2011 The terminal performs sampling processing on the data of each target frame corresponding to the target time slice to obtain sampling parameters corresponding to each target frame.
  • the sampling parameter corresponding to each target frame refers to a parameter that needs to be satisfied in the target time slice of the shot determined based on the target frame.
  • the sampling parameter corresponding to each target frame includes at least one of a sampling position parameter and a sampling scaling parameter corresponding to the target frame.
  • the terminal when the sampling parameters corresponding to each target frame include sampling position parameters, the terminal performs sampling processing on the data of each target frame corresponding to the target time slice, and the process of obtaining the sampling parameters corresponding to each target frame includes Step A and Step B:
  • Step A The terminal obtains the center position parameter of the interactive object in each target frame based on the data of each target frame.
  • the center position parameter of the interactive objects in each target frame is used to indicate the geometric center of the positions of the interactive objects in that frame.
  • the realization process of this step is: the terminal extracts the position data of each interactive object in each target frame from the data of each target frame. The terminal obtains the center position parameter of the interactive object in each target frame based on the position data of each interactive object.
  • the terminal obtains the center position parameter of the interactive objects in each target frame based on the position data of each interactive object as follows: the terminal uses the average value of the position data of the interactive objects as the center position parameter of the interactive objects in that frame.
  • the terminal can implement the above process based on Formula 1: P_i1 = Sum(P_C) / N(P).
  • P_i1 represents the center position parameter of the interactive objects in each target frame; Sum(P_C) represents the sum of the position data of the interactive objects in each target frame; N(P) represents the number of interactive objects in each target frame.
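  • Formula 1 amounts to averaging the interactive objects' position data; the following sketch shows the computation for 2D coordinates (function and variable names are illustrative, not from the patent):

```python
def center_position(positions):
    """Formula 1: P_i1 = Sum(P_C) / N(P), the geometric center of the
    interactive objects' position data in one target frame."""
    n = len(positions)                 # N(P): number of interactive objects
    sx = sum(x for x, _ in positions)  # Sum(P_C), x component
    sy = sum(y for _, y in positions)  # Sum(P_C), y component
    return (sx / n, sy / n)            # P_i1
```

For example, five character models at (0, 0), (2, 0), (2, 2), (0, 2), and (1, 1) yield the center (1.0, 1.0).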
  • Step B The terminal determines the sampling position parameter corresponding to each target frame based on the center position parameter of the interactive objects in each target frame and the assembly position parameter of the interactive objects.
  • the assembly position parameter of the interactive objects refers to the parameter of the final assembly point that all interactive objects in the game video need to reach. The terminal can obtain the assembly position parameter from the backend server of the game client. It should be noted that the center position parameters of the interactive objects may differ between target frames, but since the final assembly position of the interactive objects is the same, the assembly position parameter of the interactive objects is the same for all target frames.
  • the movement process of the interactive object is: the interactive object first moves from the starting position point 301 to the middle assembly point 302, and then moves from the upper lane to the lower lane, from the middle assembly point 302 to the final assembly point 303.
  • the backend server of the game client can obtain the parameter of the final assembly point by analyzing the entire game video, and feed the parameter back to the terminal. In this way, the terminal obtains the assembly position parameter of the interactive objects.
  • the terminal determines the sampling position parameter corresponding to each target frame based on the center position parameter of the interactive objects in each target frame and a first weight value, as well as the assembly position parameter of the interactive objects and a second weight value. In some embodiments, the terminal obtains the product of the center position parameter of the interactive objects in each target frame and the first weight value, and the product of the assembly position parameter of the interactive objects and the second weight value, and uses the sum of the two products as the sampling position parameter corresponding to each target frame. The terminal can use Formula 2 to implement the above process: P_i = P_i1 × d_1 + P_i2 × d_2.
  • P_i represents the sampling position parameter corresponding to each target frame; P_i1 represents the center position parameter of the interactive objects in each target frame; d_1 represents the first weight value; P_i2 represents the assembly position parameter of the interactive objects; d_2 represents the second weight value.
  • the first weight value and the second weight value are set according to experience, or freely adjusted according to application scenarios, which is not limited in the embodiment of the present application.
  • in some embodiments, the first weight value is set to 0.5, and the second weight value can also be set to 0.5. This setting allows the user to watch both the performance of each interactive object at its current position and the action at the final assembly point.
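  • A minimal sketch of Formula 2 for 2D coordinates, assuming the 0.5/0.5 weights mentioned above as defaults (names are illustrative):

```python
def sampling_position(center, assembly, d1=0.5, d2=0.5):
    """Formula 2: P_i = P_i1 * d1 + P_i2 * d2, blending one frame's center
    position parameter with the assembly position parameter."""
    return (center[0] * d1 + assembly[0] * d2,
            center[1] * d1 + assembly[1] * d2)
```

With equal weights, the sampled point is the midpoint between the current center of the interactive objects and the final assembly point, which matches the stated design goal.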
  • the center position parameter, the sampling position parameter, and the assembly position parameter each indicate a position point.
  • the position points indicated by the center position parameter, the sampling position parameter, and the assembly position parameter may be as shown in FIG. 4.
  • the blue side and the red side are the two opponents in the game video.
  • the terminal can determine the position point 402 indicated by the sampling position parameter corresponding to the target frame based on the position point 401 indicated by the center position parameter of the interactive objects of the red side and the blue side, and the position point 403 indicated by the assembly position parameter.
  • the terminal determines the sampling position parameter corresponding to each target frame by comprehensively considering the center position parameter and the assembly position parameter corresponding to each target frame, which can reduce the visual awkwardness when the lens moves to meet the first position parameter and the first zoom parameter.
  • the terminal when the sampling parameters corresponding to each target frame include sampling scaling parameters, the terminal performs sampling processing on the data of each target frame corresponding to the target time slice, and the process of obtaining the sampling parameters corresponding to each target frame includes Step a to step c:
  • Step a The terminal obtains the distance parameter corresponding to each target frame based on the data of each target frame.
  • the implementation process of this step is: based on the data of each target frame, the terminal obtains the distance between the position data of each interactive object and the sampling position parameter corresponding to the target frame, as well as the distance between the sampling position parameter corresponding to the target frame and the assembly position parameter. The terminal uses the maximum of these distances as the distance parameter corresponding to the target frame.
  • the terminal can use Formula 3 to obtain the distance parameter corresponding to each target frame: L = Max(Dis(P_c, P_i), Dis(P_i, P_i2)).
  • L represents the distance parameter corresponding to each target frame; P_c represents the position data of each interactive object in each target frame; P_i represents the sampling position parameter corresponding to each target frame; P_i2 represents the assembly position parameter.
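  • The step above can be sketched as taking the maximum over all object-to-sampling-point distances plus the sampling-point-to-assembly-point distance. This excerpt does not name a distance metric, so Euclidean distance is assumed:

```python
import math

def distance_parameter(object_positions, p_i, p_i2):
    """Formula 3 sketch: L = Max(Dis(P_c, P_i), Dis(P_i, P_i2)),
    taken over every interactive object position P_c in the frame."""
    dists = [math.dist(p_c, p_i) for p_c in object_positions]  # Dis(P_c, P_i)
    dists.append(math.dist(p_i, p_i2))                         # Dis(P_i, P_i2)
    return max(dists)
```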
  • Step b The terminal determines the zoom change magnification corresponding to the distance parameter based on the distance-zoom change magnification curve.
  • L_max refers to the maximum value of the distance parameters corresponding to the game video frames in the entire game video; L_min refers to the minimum value of those distance parameters; S_min refers to the minimum zoom value of the lens in the entire game video, which corresponds to L_max; S_max refers to the maximum zoom value of the lens in the entire game video, which corresponds to L_min.
  • the value of the distance parameter L corresponding to each game video frame does not vary linearly between L_min and L_max, but is relatively concentrated on the L_min side. Therefore, the change from S_min to S_max should also be slow at first and fast later. Accordingly, the slope of the distance-zoom change magnification curve used in the embodiments of the present application has an increasing trend, that is, the slope is first small and then large.
  • This kind of distance-zoom change magnification curve can increase the range of the mapping interval within the range where the value of L is relatively concentrated, and optimize the visual experience of lens movement.
  • L_max, L_min, S_min, and S_max may be determined by the backend server of the game client and fed back to the game client, or may be read by the game client from a configuration file in which L_max, L_min, S_min, and S_max are recorded.
  • the distance-zoom change magnification curve is shown as 501 in FIG. 5.
  • the abscissa in FIG. 5 represents the normalized distance
  • the ordinate represents the zoom change magnification.
  • the slope of the distance-zoom change rate curve 501 shown in FIG. 5 has an increasing trend. As the value of the abscissa changes from 0 to 1, the value of the ordinate also changes from 0 to 1.
  • the process for the terminal to determine the zoom change magnification corresponding to the distance parameter based on the distance-zoom change magnification curve is: the terminal normalizes the distance parameter to obtain the normalized distance parameter. Based on the distance-zoom change magnification curve, the terminal determines the zoom change magnification corresponding to the normalized distance parameter.
  • the terminal can use Formula 4 to obtain the normalized distance parameter: L′ = (L − L_min) / (L_max − L_min).
  • L′ represents the distance parameter after normalization processing; L represents the distance parameter before normalization processing; L_max represents the maximum value of the distance parameters corresponding to the game video frames in the entire game video; L_min represents the minimum value of the distance parameters corresponding to the game video frames in the entire game video.
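  • Formula 4 is a standard min-max normalization. The curve itself is only described qualitatively (increasing slope, mapping [0, 1] onto [0, 1] as in FIG. 5), so the quadratic below is a hypothetical stand-in for the configured curve, not the patent's actual curve:

```python
def normalize_distance(L, L_min, L_max):
    """Formula 4: L' = (L - L_min) / (L_max - L_min)."""
    return (L - L_min) / (L_max - L_min)

def zoom_change_magnification(l_norm):
    """Hypothetical distance-zoom change magnification curve: the slope
    grows with l_norm ('first small and then large') and the endpoints
    map 0 -> 0 and 1 -> 1."""
    return l_norm ** 2
```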
  • Step c The terminal determines the sampling zoom parameter corresponding to each target frame based on the zoom change magnification.
  • the terminal can use Formula 5 to determine the sampling zoom parameter corresponding to each target frame.
  • S_i represents the sampling zoom parameter corresponding to each target frame; r represents the zoom change magnification; S_min represents the minimum zoom value of the lens in the entire game video; S_max represents the maximum zoom value of the lens in the entire game video.
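  • Formula 5 itself is not reproduced in this excerpt. A reading consistent with the correspondences stated above (L_min pairs with S_max, L_max pairs with S_min) is a linear interpolation driven by the magnification r; the sketch below is that assumption, not a confirmed form of the formula:

```python
def sampling_zoom(r, s_min, s_max):
    """Assumed Formula 5: S_i = S_max - r * (S_max - S_min).
    r = 0 (objects tightly grouped, L = L_min) gives S_max;
    r = 1 (objects spread out, L = L_max) gives S_min."""
    return s_max - r * (s_max - s_min)
```

The endpoint behavior matches the description: a tightly grouped scene is shown at maximum zoom, and a fully spread-out scene at minimum zoom.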
  • the terminal can obtain the sampling parameter corresponding to each target frame according to the above steps A and B.
  • the terminal can obtain the sampling parameter corresponding to each target frame according to the above steps a to c.
  • the terminal can obtain the sampling parameters corresponding to each target frame according to the above steps A and B, and steps a to c.
  • the terminal separately performs sampling processing on the data of each target frame based on the above step 2011 to obtain sampling parameters corresponding to each target frame, and then executes step 2012.
  • Step 2012 The terminal sets a weight value for each target frame according to the distance between the timestamp and the start timestamp of the target time slice.
  • the terminal sets the weight value for each target frame according to the distance between the frame's timestamp and the start timestamp of the target time slice as follows: in order of that distance from farthest to nearest, the terminal sets weight values from small to large; equivalently, in order of that distance from nearest to farthest, the terminal sets weight values from large to small.
  • the distribution of the weight value corresponding to each target frame conforms to the Gaussian distribution.
  • in some embodiments, in the order of the distance between the timestamp and the start timestamp of the target time slice from farthest to nearest, the terminal sets weight values of 1/55, 4/55, 9/55, 16/55, and 25/55 for the five target frames.
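  • The 1/55 ... 25/55 sequence is the squares 1, 4, 9, 16, 25 normalized by their sum (55). The sketch below generalizes this to n frames; the generalization is an assumption, since the text only gives the five-frame example:

```python
def frame_weights(n):
    """Squared-index weights, farthest frame first, normalized to sum to 1.
    For n = 5 this reproduces 1/55, 4/55, 9/55, 16/55, 25/55."""
    squares = [(i + 1) ** 2 for i in range(n)]
    total = sum(squares)
    return [s / total for s in squares]
```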
  • Step 2013 The terminal determines the first target parameter that the shot needs to meet in the target time slice based on the sampling parameter corresponding to each target frame and the weight value corresponding to each target frame.
  • the process for the terminal to determine the first target parameter that the lens needs to meet in the target time slice based on the sampling parameter and the weight value corresponding to each target frame is: the terminal obtains the product of the sampling parameter corresponding to each target frame and the weight value corresponding to that target frame, and uses the sum of the products as the first target parameter that the lens needs to meet in the target time slice.
  • when the sampling parameter corresponding to each target frame includes the sampling position parameter, the first target parameter includes the first position parameter.
  • when the sampling parameter corresponding to each target frame includes the sampling zoom parameter, the first target parameter includes the first zoom parameter.
  • when the sampling parameter corresponding to each target frame includes the sampling position parameter and the sampling zoom parameter, the first target parameter includes the first position parameter and the first zoom parameter.
  • in some embodiments, in the order of the distance between the timestamp and the start timestamp of the target time slice from farthest to nearest, the terminal sets weight values of 1/55, 4/55, 9/55, 16/55, and 25/55 for the five target frames. The terminal can then use Formula 6 to obtain the first position parameter in the first target parameter.
  • Formula 6: P_n = (1/55)P_1 + (4/55)P_2 + (9/55)P_3 + (16/55)P_4 + (25/55)P_5. P_n represents the first position parameter in the first target parameter; P_1 through P_5 represent the sampling position parameters corresponding to the five target frames arranged in order of the distance between the timestamp and the start timestamp of the target time slice from farthest to nearest.
  • the terminal can use Formula 7 to obtain the first zoom parameter in the first target parameter: S_n = (1/55)S_1 + (4/55)S_2 + (9/55)S_3 + (16/55)S_4 + (25/55)S_5. S_n represents the first zoom parameter in the first target parameter; S_1 through S_5 represent the sampling zoom parameters corresponding to the five target frames arranged in the same order.
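  • Formulas 6 and 7 are the same weighted sum applied to the sampling position parameters and the sampling zoom parameters respectively; a sketch with illustrative names:

```python
def first_position_parameter(sample_positions, weights):
    """Formula 6: P_n = sum_k w_k * P_k, with positions as (x, y) tuples
    ordered farthest-to-nearest from the start timestamp."""
    x = sum(w * p[0] for w, p in zip(weights, sample_positions))
    y = sum(w * p[1] for w, p in zip(weights, sample_positions))
    return (x, y)

def first_zoom_parameter(sample_zooms, weights):
    """Formula 7: S_n = sum_k w_k * S_k, same frame ordering."""
    return sum(w * s for w, s in zip(weights, sample_zooms))
```

Because the weights sum to 1, identical per-frame samples pass through unchanged, and noisy early frames contribute less than the frame nearest the start timestamp.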
  • the terminal may further verify the validity of the first target parameter based on the parameter change threshold.
  • the process for the terminal to verify the validity of the first target parameter based on the parameter change threshold is: acquiring the change parameter of the shot in the target time slice, and verifying the validity of the first target parameter based on the comparison result of the change parameter and the parameter change threshold.
  • the process for the terminal to acquire the change parameters of the lens in the target time slice is as follows: the terminal acquires the change parameters of the lens in the target time slice based on the second target parameter that the lens has met in the target time slice and the first target parameter.
  • the second target parameter that the lens has met in the target time slice refers to a parameter that the lens meets at the start time of the target time slice, and the second target parameter includes at least one of the second position parameter and the second zoom parameter.
  • the change parameter of the lens in the target time slice includes the position change parameter
  • the terminal can use the distance between the first position parameter and the second position parameter to represent the position change parameter.
  • the change parameter of the lens in the target time slice includes the zoom change parameter
  • the terminal can use the absolute value of the difference between the first zoom parameter and the second zoom parameter to represent the zoom change parameter.
  • the terminal verifies the validity of the first target parameter as follows: when the change parameter is lower than the parameter change threshold, the terminal determines that the validity verification of the first target parameter fails; when the change parameter is not lower than the parameter change threshold, the terminal determines that the validity verification of the first target parameter passes.
  • the change parameter of the lens in the target time slice includes the position change parameter and the zoom change parameter.
  • the terminal verifies the validity of the first target parameter as follows: the terminal verifies the validity of the first position parameter based on the comparison result of the position change parameter and the first parameter change threshold, and verifies the validity of the first zoom parameter based on the comparison result of the zoom change parameter and the second parameter change threshold.
  • the first parameter change threshold and the second parameter change threshold may be the same or different, which is not limited in the embodiment of the present application.
  • the terminal can discard the parameters that fail the validity verification, and use the remaining parameters as the first target parameters that pass the validity verification.
  • the terminal performs step 202 based on the first target parameters that pass the validity verification.
  • the change parameter of the lens in the target time slice includes the position change parameter and the zoom change parameter.
  • the position coordinates of the first position parameter are (1,1)
  • the position coordinates of the second position parameter are (0,1)
  • the first scaling parameter is 10
  • the second scaling parameter is 2.
  • the terminal can use the Euclidean distance 1 between the two position coordinates to represent the position change parameter of the lens in the target time slice, and use the absolute value of the difference between the two zoom parameters to represent the zoom change parameter of the lens in the target time slice.
  • in this example, the validity verification of the first position parameter fails, and the validity verification of the first scaling parameter passes. At this time, only the first scaling parameter is included in the first target parameter that passes the validity verification.
  • the terminal can significantly reduce the lens shake phenomenon caused by small translation and small zoom by verifying the validity of the first target parameter.
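A minimal sketch of this validity verification, assuming Euclidean distance for the position change parameter and absolute difference for the zoom change parameter, as in the text. The threshold values are invented for illustration; the patent leaves them unspecified.

```python
import math

def validate_first_target(first_pos, second_pos, first_zoom, second_zoom,
                          pos_threshold=2.0, zoom_threshold=5.0):
    """Keep only the components of the first target parameter whose change
    over the time slice is not lower than the change threshold; small
    translations and zooms are discarded to suppress lens shake."""
    valid = {}
    if math.dist(first_pos, second_pos) >= pos_threshold:  # position change
        valid["position"] = first_pos
    if abs(first_zoom - second_zoom) >= zoom_threshold:    # zoom change
        valid["zoom"] = first_zoom
    return valid

# The example from the text: position change 1, zoom change 8. With these
# (invented) thresholds the position fails validation and the zoom passes.
result = validate_first_target((1, 1), (0, 1), 10, 2)
```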
  • the terminal may set the trigger frequency, and according to the trigger frequency, trigger the execution of the step of acquiring the first target parameter.
  • the trigger frequency can be set according to experience, or can be flexibly adjusted according to application scenarios, which is not limited in the embodiment of the present application.
  • the trigger moment determined according to the trigger frequency can be the moment corresponding to a short period of time before each time slice, for example, the moment corresponding to the 0.2 seconds before each time slice, so as to ensure that the operation of controlling the movement of the lens is performed in time once the target time slice is reached.
  • step 202 the terminal determines the target movement speed of the lens in the target time slice based on the first target parameter, the initial movement speed of the lens in the target time slice, the time interval corresponding to the target time slice, and the time-speed change magnification curve.
  • the target movement speed includes at least one of the target translation speed and the target zoom speed.
  • the initial movement speed of the lens in the target time slice refers to the movement speed of the lens at the beginning of the target time slice, and the initial movement speed includes at least one of the initial translation speed and the initial zoom speed.
  • the initial motion speed of the lens in the target time slice is also the motion speed of the lens at the end of the previous time slice of the target time slice.
  • the time interval corresponding to the target time slice refers to the length of time between the start timestamp and the end timestamp of the target time slice.
  • the target motion speed of the lens in the target time slice refers to the motion speed of the lens at the end of the target time slice. In other words, the target motion speed refers to the motion speed of the lens when the first target parameter is satisfied.
  • the time-speed change rate curve is set by the game developer, or flexibly adjusted according to the application scenario, which is not limited in the embodiment of the present application.
  • the time-speed change rate curve is a Bezier curve.
  • the Bezier curve is a smooth curve. Setting the time-speed change curve to the Bezier curve helps to improve the smoothness and stability of the lens movement.
  • the time-speed change magnification curve is shown as 601 in FIG. 6, the abscissa in FIG. 6 represents the time after the normalization process, and the ordinate represents the speed change magnification.
  • the time-speed change rate curve 601 shown in FIG. 6 is a Bezier curve. As the value of the abscissa changes from 0 to 1, the value of the ordinate also changes from 0 to 1.
  • the terminal has verified the validity of the first target parameter.
  • the terminal determines the target motion speed of the lens in the target time slice based on the first target parameter that passes the validity verification, the initial movement speed of the lens in the target time slice, the time interval corresponding to the target time slice, and the time-speed change magnification curve.
  • the process for the terminal to determine the target movement speed of the lens in the target time slice based on the first target parameter, the initial movement speed of the lens in the target time slice, the time interval corresponding to the target time slice, and the time-speed change magnification curve includes steps 2021 to 2023:
  • Step 2021 The terminal acquires the change parameters of the shot in the target time slice based on the second target parameter and the first target parameter that the shot has met in the target time slice.
  • the second target parameter that the lens has met in the target time slice refers to the parameter that the lens meets at the beginning of the target time slice, and the second target parameter includes the second position parameter and the second zoom parameter. At least one of them.
  • the change parameter of the lens in the target time slice includes the position change parameter.
  • the position change parameter may be the distance between the first position parameter and the second position parameter.
  • the change parameter of the lens in the target time slice includes the zoom change parameter, and the zoom change parameter can be the absolute value of the difference between the first zoom parameter and the second zoom parameter.
  • the terminal can use the Euclidean distance between the two position coordinates to represent the position change parameter of the lens in the target time slice, and use the absolute value of the difference between the two zoom parameters to represent the zoom change parameter of the lens in the target time slice.
  • Step 2022 The terminal obtains the integral value corresponding to the time-speed change rate curve.
  • the time range corresponding to the time-speed change rate curve is [0,1].
  • Use φ V (t) to represent the time-speed change rate curve; the integral value corresponding to the time-speed change rate curve is calculated as the integral of φ V (t) over the time range [0,1].
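Since the patent does not give the Bezier curve's control points, the following sketch numerically evaluates this integral for one hypothetical cubic Bezier ease curve running from (0,0) to (1,1); the control points are assumptions.

```python
# Hypothetical control points of a cubic Bezier time-speed change-rate curve.
P = [(0.0, 0.0), (0.4, 0.0), (0.6, 1.0), (1.0, 1.0)]

def bezier(u):
    """Point (x, y) on the cubic Bezier at parameter u in [0, 1]."""
    c = [(1 - u) ** 3, 3 * u * (1 - u) ** 2, 3 * u ** 2 * (1 - u), u ** 3]
    return (sum(ci * p[0] for ci, p in zip(c, P)),
            sum(ci * p[1] for ci, p in zip(c, P)))

def curve_integral(steps=10_000):
    """Trapezoidal estimate of the integral of the speed change magnification
    (the curve's y value) over the normalized time range [0, 1]."""
    pts = [bezier(i / steps) for i in range(steps + 1)]
    return sum((y0 + y1) / 2 * (x1 - x0)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:]))
```

The chosen curve is point-symmetric about (0.5, 0.5), so its integral is 0.5; other control points would give other values.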
  • Step 2023 The terminal determines the target motion speed of the lens in the target time slice based on the change parameter, the initial motion speed, the time interval corresponding to the target time slice, and the integral value corresponding to the time-speed change magnification curve.
  • the terminal can determine the target movement speed of the lens in the target time slice according to Formula 8: ΔM = [V 1 + (V 2 - V 1 ) · I] · Δt (Formula 8), where I denotes the integral value corresponding to the time-speed change magnification curve.
  • ΔM represents the change parameter, which can refer to the translation change parameter or the zoom change parameter
  • V 1 represents the smaller of the initial movement speed and the target movement speed; it can refer to the smaller of the initial translation speed and the target translation speed, or the smaller of the initial zoom speed and the target zoom speed
  • V 2 represents the larger of the initial movement speed and the target movement speed; it can refer to the larger of the initial translation speed and the target translation speed, or the larger of the initial zoom speed and the target zoom speed
  • Δt represents the time interval corresponding to the target time slice
  • when the initial movement speed is greater than the target movement speed, V 1 represents the target movement speed and V 2 represents the initial movement speed; when the initial movement speed is not greater than the target movement speed, V 1 represents the initial movement speed and V 2 represents the target movement speed
  • the process of the terminal determining the target motion speed of the lens in the target time slice includes: the terminal determines the target translation speed of the lens in the target time slice according to the above Formula 8, based on the translation change parameter, the initial translation speed, the time interval corresponding to the target time slice, and the integral value corresponding to the time-translation speed change magnification curve.
  • the terminal determines the target zoom speed of the lens in the target time slice based on the zoom change parameter, the initial zoom speed, the time interval corresponding to the target time slice, and the integral value corresponding to the time-zoom speed change magnification curve according to the above formula 8.
  • the time-translation speed change rate curve and the time-zoom speed change rate curve may be the same curve or two different curves, which is not limited in the embodiment of the present application.
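Since Formula 8 itself is not reproduced in this text, the sketch below assumes it states that the change parameter equals the time integral of the interpolated speed, ΔM = [V1 + (V2 − V1)·I]·Δt, with I the integral value of the curve; inverting that relation then yields the target speed. This inversion is an illustration, not the patent's stated procedure.

```python
def target_speed(delta_m, v_init, delta_t, integral=0.5):
    """Solve dM = [V1 + (V2 - V1) * I] * dt for the target speed, where
    V1/V2 are the smaller/larger of the initial and target speeds and
    I is the integral value of the time-speed change magnification curve."""
    avg = delta_m / delta_t           # required average speed over the slice
    if avg >= v_init:                 # lens speeds up: V1 = v_init, V2 = target
        return v_init + (avg - v_init) / integral
    # lens slows down: V1 = target, V2 = v_init
    return (avg - v_init * integral) / (1 - integral)
```

For example, with I = 0.5, an initial speed of 0, and a required change of 1 over a 1-second slice, the target speed comes out as 2, whose curve-weighted average with the initial speed is indeed 1.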
  • the terminal can increase the influence of the steering mixing coefficient in the process of determining the target motion speed of the lens in the target time slice, so as to reduce the sense of discomfort caused by changing the direction (translation direction and zoom direction) , Improve the stability of the lens movement.
  • the process can include the following steps 1 to 4:
  • Step 1 The terminal obtains the steering angle corresponding to the target time slice based on the first movement direction and the second movement direction.
  • the first movement direction refers to the movement direction of the lens at the start moment of the target time slice, and the first movement direction includes at least one of the first translation direction and the first zoom direction.
  • the second movement direction refers to the movement direction of the lens at the end moment of the target time slice, and the second movement direction includes at least one of the second translation direction and the second zoom direction.
  • the steering angle corresponding to the target time slice includes at least one of a pan steering angle and a zoom steering angle.
  • the process of the terminal acquiring the steering angle corresponding to the target time slice includes: the terminal determines the translation steering angle corresponding to the target time slice based on the angle between the second translation direction and the first translation direction. That is, the terminal uses the degree corresponding to the included angle between the second translation direction and the first translation direction as the translation steering angle corresponding to the target time slice.
  • the first translation direction is determined according to the position parameters at the start time and the end time of the previous time slice of the target time slice
  • the second translation direction is determined according to the second position parameter in the second target parameter and the first position parameter in the first target parameter.
  • the process of the terminal acquiring the steering angle corresponding to the target time slice includes: the terminal based on the comparison result of the second zoom direction and the first zoom direction To determine the zoom steering angle corresponding to the target time slice.
  • the first zoom direction is determined according to the zoom parameter at the start time and the zoom parameter at the end time of the previous time slice of the target time slice
  • the second zoom direction is determined according to the second zoom parameter in the second target parameter and the first zoom parameter in the first target parameter.
  • the process of the terminal determining the zoom steering angle corresponding to the target time slice based on the comparison result of the second zoom direction and the first zoom direction is: when the second zoom direction is consistent with the first zoom direction, the terminal uses the first angle as the zoom steering angle corresponding to the target time slice; when the second zoom direction is inconsistent with the first zoom direction, the terminal uses the second angle as the zoom steering angle.
  • the first angle and the second angle can be set based on experience. In some embodiments, the first angle is set to 0 degrees, and the second angle is set to 180 degrees. It should be noted that the zooming direction includes zooming in and zooming out.
  • the second zooming direction being consistent with the first zooming direction means that the second zooming direction and the first zooming direction are both zooming in, or the second zooming direction and the first zooming direction are both zooming out.
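Step 1 can be sketched as below. The 0°/180° values for the zoom steering angle follow the text; representing the translation directions as 2D vectors is an assumption for illustration.

```python
import math

def pan_steering_angle(dir1, dir2):
    """Degrees between the first and second translation directions,
    given as 2D direction vectors."""
    dot = dir1[0] * dir2[0] + dir1[1] * dir2[1]
    cos_a = dot / (math.hypot(*dir1) * math.hypot(*dir2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

def zoom_steering_angle(first_dir, second_dir):
    """First angle (0 degrees) when both zoom directions agree (both zoom in
    or both zoom out), second angle (180 degrees) otherwise."""
    return 0.0 if first_dir == second_dir else 180.0
```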
  • Step 2 The terminal determines the steering mixing coefficient corresponding to the steering angle.
  • the manner in which the terminal determines the steering mixing coefficient corresponding to the steering angle includes but is not limited to the following two:
  • Method 1 The terminal determines the steering mixing coefficient corresponding to the steering angle based on the angle-steering mixing coefficient curve.
  • the angle-steering mixing coefficient curve is shown as 701 in FIG. 7.
  • the abscissa represents the angle
  • the ordinate represents the steering mixing coefficient.
  • the terminal can determine the steering mixing coefficient corresponding to the steering angle according to the angle-steering mixing coefficient curve 701. As the angle changes from 0 degrees to 180 degrees, the steering mixing coefficient drops from 1 to 0.
  • Manner 2 The terminal determines the steering mixing coefficient corresponding to the steering angle based on the corresponding relationship between the angle and the steering mixing coefficient.
  • the corresponding relationship between the angle and the steering mixing coefficient is set by the game developer, which is not limited in the embodiment of the present application. In some embodiments, the corresponding relationship between the angle and the steering mixing coefficient is expressed in the form of a table. After determining the steering angle, the terminal can query the steering mixing coefficient corresponding to the steering angle in the correspondence table of the angle and steering mixing coefficient.
  • the terminal needs to determine, according to the above Manner 1 or Manner 2, the pan steering mixing coefficient corresponding to the pan steering angle and the zoom steering mixing coefficient corresponding to the zoom steering angle.
  • Step 3 The terminal updates the initial movement speed based on the steering mixing coefficient to obtain the updated initial movement speed.
  • the method for the terminal to update the initial motion speed based on the steering mixing coefficient is: the terminal uses the product of the initial motion speed before the update and the steering mixing coefficient as the updated initial motion speed.
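Steps 2 and 3 can be sketched with a linear stand-in for the angle-steering mixing coefficient curve 701 (falling from 1 at 0° to 0 at 180°); the actual curve shape is left to the developer, so the linear form here is an assumption.

```python
def steering_mixing_coefficient(angle_deg):
    """Linear stand-in for curve 701: 1 at 0 degrees, 0 at 180 degrees."""
    return max(0.0, min(1.0, 1.0 - angle_deg / 180.0))

def update_initial_speed(v_init, angle_deg):
    """Step 3: the updated initial speed is the product of the initial
    speed before the update and the steering mixing coefficient."""
    return v_init * steering_mixing_coefficient(angle_deg)
```

A sharp turn (large steering angle) thus damps the carried-over speed, which is what reduces the sense of discomfort when the lens changes direction.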
  • Step 4 The terminal determines the target motion speed of the lens in the target time slice based on the first target parameter, the updated initial motion speed, the time interval corresponding to the target time slice, and the time-speed change magnification curve.
  • For the implementation process of this Step 4, refer to steps 2021 to 2023, which will not be repeated here.
  • the terminal can update the target motion speed based on the correction coefficient to obtain the updated target motion speed.
  • the terminal can implement the above process based on the following formula 9:
  • V 2 (n) = V 1 (n) · R (Formula 9)
  • V 2 (n) represents the target motion speed after update
  • V 1 (n) represents the target motion speed before update
  • R represents the correction coefficient.
  • the correction coefficient can be set according to experience, or can be flexibly adjusted according to application scenarios, which is not limited in the embodiment of the present application.
  • the terminal can update the target translation speed based on the first correction coefficient to obtain the updated target translation speed.
  • the terminal updates the target zoom speed based on the second correction coefficient, and obtains the updated target zoom speed.
  • the first correction coefficient and the second correction coefficient may be the same or different, which is not limited in the embodiment of the present application.
  • step 203 the terminal controls the movement of the lens based on the time-speed change magnification curve, the initial movement speed, and the target movement speed to satisfy the first target parameter.
  • the target motion speed in this step refers to the updated target motion speed.
  • the initial movement speed in this step refers to the updated initial movement speed.
  • the process of controlling the lens movement of the terminal based on the time-speed change magnification curve, the initial movement speed and the target movement speed includes steps 2031 to 2034:
  • Step 2031 The terminal divides the process of controlling the movement of the lens into a reference number of sub-processes.
  • within the time interval corresponding to the target time slice, the terminal can divide the process of controlling the lens movement into a reference number of consecutive sub-processes according to the second reference time interval, so as to control the lens movement step by step until the lens satisfies the first target parameter.
  • the reference quantity may be determined according to the time interval corresponding to the target time slice and the second reference time interval. In some embodiments, the terminal uses the ratio of the time interval corresponding to the target time slice to the second reference time interval as the reference quantity.
  • the second reference time interval is determined according to the time interval corresponding to the target time slice, or is set according to experience, which is not limited in the embodiment of the present application. In some embodiments, when the time interval corresponding to the target time slice is 1 second, the second reference time interval is set to 0.02 seconds. At this time, the reference number is 50. The time interval corresponding to each sub-process obtained by the terminal according to this division method is 0.02 seconds.
  • after the terminal divides the lens motion process into a reference number of sub-processes, it can also obtain the sub-motion speed corresponding to each sub-process based on the following steps 2032 and 2033.
  • Step 2032 The terminal determines the speed change magnification corresponding to any sub-process based on the time parameter and the time-speed change magnification curve corresponding to any sub-process.
  • the time in the time-speed change rate curve is the time after the normalization process, and the time range corresponding to the time-speed change rate curve is [0,1]. Before determining the speed change magnification corresponding to any sub-process, it is necessary to obtain the time parameter corresponding to any sub-process.
  • the process for the terminal to obtain the time parameter corresponding to any sub-process includes the following two steps:
  • Step 1 The terminal uses the ratio of the target time interval and the time interval corresponding to the target time slice as the target ratio.
  • the target time interval refers to the sum of the time intervals corresponding to each sub-process before any one of the sub-processes in the target time slice.
  • the time interval corresponding to the target time slice is 1 second
  • the time interval corresponding to each sub-process is 0.02 seconds
  • any sub-process is the sixth sub-process in the target time slice.
  • the target time interval refers to the sum of the time intervals corresponding to the five sub-processes before any one of the sub-processes in the target time slice, and the target time interval is 0.1 second.
  • Step 2 When the initial motion speed is less than the target motion speed, the terminal uses the target ratio as the time parameter corresponding to any sub-process; when the initial motion speed is not less than the target motion speed, the terminal uses the difference between 1 and the target ratio as the time parameter corresponding to any sub-process.
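Steps 1 and 2 above can be sketched as follows; the function name is illustrative, and the example values mirror the 0.02-second / 1-second case used in the text.

```python
def subprocess_time_parameter(index, step_interval, slice_interval,
                              v_init, v_target):
    """Step 1: target ratio = time elapsed before this sub-process divided
    by the slice interval. Step 2: use the ratio directly when the lens
    speeds up, and 1 - ratio when it does not."""
    target_ratio = (index * step_interval) / slice_interval
    return target_ratio if v_init < v_target else 1.0 - target_ratio
```

For the sixth sub-process of a 1-second slice (five 0.02-second sub-processes completed, so 0.1 second elapsed), the target ratio is 0.1.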
  • After determining the time parameter corresponding to any sub-process based on the above Steps 1 and 2, the terminal can determine the speed change magnification corresponding to the time parameter based on the time-speed change magnification curve, and use that speed change magnification as the speed change magnification corresponding to the sub-process.
  • the speed change magnification includes at least one of the translation speed change magnification and the zoom speed change magnification.
  • when the terminal obtains the translation speed change magnification corresponding to any sub-process, the terminal determines the translation speed change magnification corresponding to the time parameter based on the time-translation speed change magnification curve, and uses it as the translation speed change magnification corresponding to the sub-process; when the terminal obtains the zoom speed change magnification corresponding to any sub-process, the terminal determines the zoom speed change magnification corresponding to the time parameter based on the time-zoom speed change magnification curve, and uses it as the zoom speed change magnification corresponding to the sub-process.
  • Step 2033 The terminal obtains the sub-motion speed corresponding to any sub-process based on the initial motion speed, the target motion speed, and the speed change magnification corresponding to any sub-process.
  • the terminal can obtain the sub-motion speed corresponding to any sub-process based on formula 10:
  • V C = V 1 + (V 2 - V 1 ) · φ V (T) (Formula 10)
  • V C represents the sub-motion speed corresponding to any sub-process
  • V 1 represents the smaller of the initial motion speed and the target motion speed
  • V 2 represents the larger of the initial motion speed and the target motion speed
  • T represents the time parameter corresponding to any sub-process determined in step 2032
  • φ V (T) represents the speed change magnification corresponding to any sub-process.
  • the sub-motion speed includes at least one of a sub-translation speed and a sub-zoom speed.
  • V C represents the sub-translation speed corresponding to any sub-process
  • V 1 represents the lower one of the initial translation speed and the target translation speed
  • V 2 represents the larger of the initial translation speed and the target translation speed
  • φ V (T) represents the translation speed change magnification corresponding to any sub-process, determined based on the time-translation speed change magnification curve.
  • V C represents the sub-zoom speed corresponding to any sub-process
  • V 1 represents the smaller zoom speed of the initial zoom speed and the target zoom speed
  • V 2 represents the larger zoom speed of the initial zoom speed and the target zoom speed
  • φ V (T) represents the zoom speed change magnification corresponding to any sub-process, determined based on the time-zoom speed change magnification curve.
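Formula 10 can be sketched directly; the rate argument is the speed change magnification read off the time-speed change magnification curve at the sub-process's time parameter.

```python
def sub_motion_speed(v_init, v_target, rate):
    """Formula 10: V_C = V1 + (V2 - V1) * rate, where V1/V2 are the
    smaller/larger of the initial and target speeds and rate is the speed
    change magnification for this sub-process, in [0, 1]."""
    v1, v2 = min(v_init, v_target), max(v_init, v_target)
    return v1 + (v2 - v1) * rate
```

Because the time parameter is mirrored when the lens slows down (Step 2 of step 2032), the same interpolation covers both accelerating and decelerating slices.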
  • Step 2034 In the time interval corresponding to any sub-process, the terminal controls the lens movement according to the sub-movement speed corresponding to any sub-process.
  • during the time interval corresponding to any sub-process, the speed of the lens movement remains at the sub-movement speed corresponding to that sub-process.
  • the time interval corresponding to any sub-process is 0.02 seconds
  • the sub-motion speed corresponding to any sub-process includes a sub-translation speed and a sub-zoom speed, where the sub-translation speed is 1 m/sec and the sub-zoom speed is 0.5/sec. Then, within the 0.02 seconds corresponding to the sub-process, the terminal controls the lens to pan at a speed of 1 m/sec while controlling the lens to zoom at a speed of 0.5/sec.
  • after controlling the lens movement according to the sub-motion speed corresponding to any sub-process, the terminal can continue to determine the sub-motion speed corresponding to the next sub-process and control the lens movement accordingly, and so on, until the sub-motion speed corresponding to the last sub-process in the target time slice is determined; the terminal then controls the movement of the lens according to the sub-motion speed corresponding to the last sub-process so that the lens meets the first target parameter, completing the process of controlling the movement of the lens in the target time slice. Then, based on the above steps 201 to 203, the terminal continues to perform the operation of controlling the movement of the lens in the next time slice.
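Putting steps 2031 to 2034 together, one time slice can be stepped through as in this sketch; the smoothstep ease function stands in for the time-speed change magnification curve, and all names and values are illustrative.

```python
def ease(t):
    """Smoothstep stand-in for the time-speed change magnification curve:
    rises smoothly from 0 at t = 0 to 1 at t = 1."""
    return 3 * t ** 2 - 2 * t ** 3

def run_time_slice(v_init, v_target, slice_interval=1.0, step=0.02):
    """Divide the slice into sub-processes (step 2031), compute each
    sub-process's time parameter and sub-motion speed (steps 2032-2033),
    hold that speed for one step (step 2034), and return the total
    distance moved over the slice."""
    n = round(slice_interval / step)      # reference number of sub-processes
    v1, v2 = min(v_init, v_target), max(v_init, v_target)
    moved = 0.0
    for i in range(n):
        t = (i * step) / slice_interval   # target ratio for this sub-process
        if v_init >= v_target:
            t = 1.0 - t                   # mirrored when slowing down
        v_c = v1 + (v2 - v1) * ease(t)    # Formula 10
        moved += v_c * step               # speed held constant for one step
    return moved
```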
  • the continuity of the movement of the lens between adjacent time slices is good, and the lens can transition smoothly between adjacent time slices to achieve a smooth motion effect and reduce the user's sense of discomfort.
  • the time-motion speed curve of the lens movement in 3 adjacent time slices can be as shown by curve 801 in FIG. 8; curve 801 has good continuity at each junction point, indicating that the smoothness of the lens movement process is good.
  • the terminal acquires the sampling position parameter and the sampling scaling parameter corresponding to each target frame, where the sampling scaling parameter is obtained according to the distance-zoom change magnification curve.
  • the terminal determines the first target parameter according to the sampling parameter and the weight value corresponding to each target frame.
  • the terminal verifies the validity of the first target parameter according to the parameter change threshold, so as to achieve the effect of filtering the first target parameter.
  • the terminal determines the target motion speed: the terminal determines the steering mixing coefficient according to the angle-steering mixing coefficient curve. The terminal updates the initial movement speed according to the steering mixing coefficient. The terminal determines the target motion speed according to the first target parameter, the time interval, the time-speed change rate curve, and the updated initial motion speed.
  • the terminal controls the movement of the lens: the terminal determines the sub-movement speed corresponding to each sub-process according to the target movement speed, the updated initial movement speed, and the time-speed change magnification curve. In the time interval corresponding to each sub-process, the terminal controls the movement of the lens according to the sub-movement speed corresponding to each sub-process, so that the lens meets the first target parameter.
  • the terminal obtains the first target parameter that the lens needs to satisfy in the target time slice based on the data of multiple target frames.
  • the reliability of the first target parameter is high, which is beneficial to improving the stability of the lens movement.
  • controlling the movement of the lens based on the time-speed change magnification curve is beneficial to improve the movement continuity of the lens between adjacent time slices, the stability of the lens movement process is higher, and the effect of controlling the lens movement is better.
  • an embodiment of the present application provides a device for controlling movement of a lens, and the device includes:
  • the acquiring module 1001 is configured to acquire a first target parameter that the lens needs to meet in the target time slice based on the data of multiple target frames corresponding to the target time slice, the first target parameter including at least one of a first position parameter and a first zoom parameter;
  • the determining module 1002 is used to determine the target movement speed of the lens in the target time slice based on the first target parameter, the initial movement speed of the lens in the target time slice, the time interval corresponding to the target time slice, and the time-speed change magnification curve, the target movement speed including at least one of the target translation speed and the target zoom speed;
  • the control module 1003 is used to control the movement of the lens based on the time-speed change magnification curve, the initial movement speed and the target movement speed.
  • the acquisition module 1001 is configured to sample the data of each target frame corresponding to the target time slice to obtain the sampling parameters corresponding to each target frame; set a weight value for each target frame according to the distance between its timestamp and the start timestamp of the target time slice; and determine the first target parameter that the shot needs to meet in the target time slice based on the sampling parameter corresponding to each target frame and the weight value corresponding to each target frame.
  • the sampling parameters corresponding to each target frame include sampling position parameters.
  • the acquisition module 1001 is further configured to obtain, based on the data of each target frame, the center position parameter of the interactive objects in each target frame; and determine, based on the center position parameter of the interactive objects and the collective position parameter of the interactive objects, the sampling position parameter corresponding to each target frame.
  • the sampling parameters corresponding to each target frame include sampling zoom parameters.
  • the acquisition module 1001 is further configured to obtain, based on the data of each target frame, the distance parameter corresponding to each target frame; determine, based on the distance-zoom change magnification curve, the zoom change magnification corresponding to the distance parameter; and determine, based on the zoom change magnification, the sampling zoom parameter corresponding to each target frame.
  • the determining module 1002 is configured to obtain the steering angle corresponding to the target time slice based on the first motion direction and the second motion direction; determine the steering mixing coefficient corresponding to the steering angle; update the initial motion speed based on the steering mixing coefficient to obtain the updated initial motion speed; and determine the target motion speed of the lens in the target time slice based on the first target parameter, the updated initial motion speed, the time interval corresponding to the target time slice, and the time-speed change magnification curve.
  • the determining module 1002 is configured to obtain a change parameter of the lens in the target time slice based on the first target parameter and a second target parameter that the lens has already met in the target time slice, where the second target parameter includes at least one of a second position parameter and a second zoom parameter; obtain the integral value corresponding to the time-speed change magnification curve; and determine the target movement speed of the lens in the target time slice based on the change parameter, the initial motion speed, the time interval corresponding to the target time slice, and the integral value corresponding to the time-speed change magnification curve.
  • the control module 1003 is configured to divide the process of controlling the lens motion into a reference number of sub-processes; determine, based on the time parameter corresponding to any sub-process and the time-speed change magnification curve, the speed change magnification corresponding to that sub-process; determine, based on the initial motion speed, the target motion speed, and the speed change magnification corresponding to that sub-process, the sub-motion speed corresponding to that sub-process; and control the lens movement, within the time interval corresponding to that sub-process, according to the sub-motion speed corresponding to that sub-process.
  • the device further includes:
  • the verification module 1004 is configured to verify the validity of the first target parameter based on the parameter change threshold;
  • the determining module 1002 is further configured to determine the target movement speed of the lens in the target time slice based on the first target parameter that has passed the validity verification, the initial movement speed of the lens in the target time slice, the time interval corresponding to the target time slice, and the time-speed change magnification curve.
  • the device further includes:
  • the update module 1005 is used to update the target motion speed based on the correction coefficient to obtain the updated target motion speed;
  • the control module 1003 is also used to control the movement of the lens to meet the first target parameter based on the time-speed change magnification curve, the initial movement speed and the updated target movement speed.
  • the time-speed change rate curve is a Bezier curve.
  • the first target parameter that the shot needs to meet in the target time slice is acquired based on the data of multiple target frames, so the first target parameter is highly reliable, which helps improve the stability of the shot movement.
  • controlling the movement of the lens based on the time-speed change magnification curve helps improve the continuity of the lens movement between adjacent time slices, so that the lens movement process is more stable and the effect of controlling the lens movement is better.
  • the computer equipment provided in the embodiments of the present application may be implemented as a terminal or a server.
  • the structure of the terminal is described below.
  • FIG. 12 is a schematic structural diagram of a terminal provided by an embodiment of the present application, and the terminal is installed with a game client.
  • the terminal can be: a smart phone, a tablet computer, a notebook computer or a desktop computer.
  • the terminal may also be called user equipment, portable terminal, laptop terminal, desktop terminal and other names.
  • the terminal includes a processor 1201 and a memory 1202.
  • the processor 1201 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
  • the processor 1201 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 1201 may also include a main processor and a coprocessor.
  • the main processor is a processor used to process data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor used to process data in the standby state.
  • the processor 1201 may be integrated with a GPU (Graphics Processing Unit), and the GPU is used for rendering and drawing the content that needs to be displayed on the display screen.
  • the processor 1201 may further include an AI (Artificial Intelligence) processor, and the AI processor is used to process computing operations related to machine learning.
  • the memory 1202 may include one or more computer-readable storage media, which may be non-transitory.
  • the memory 1202 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
  • the non-transitory computer-readable storage medium in the memory 1202 is used to store at least one instruction, and the at least one instruction is used by the processor 1201 to perform the following steps:
  • Based on the data of multiple target frames corresponding to the target time slice, the first target parameter that the shot needs to meet in the target time slice is acquired, where the first target parameter includes at least one of the first position parameter and the first zoom parameter.
  • Based on the first target parameter, the initial motion speed of the lens in the target time slice, the time interval corresponding to the target time slice, and the time-speed change magnification curve, the target motion speed of the lens in the target time slice is determined, where the target motion speed includes at least one of the target panning speed and the target zooming speed.
  • Based on the time-speed change magnification curve, the initial motion speed, and the target motion speed, the lens movement is controlled.
  • the processor is used to perform the following steps:
  • Sampling processing is performed on the data of each target frame corresponding to the target time slice, and the sampling parameter corresponding to each target frame is obtained.
  • the weight value is set for each target frame according to the distance between the timestamp and the start timestamp of the target time slice.
  • the first target parameter that the shot needs to meet in the target time slice is determined.
  • sampling parameters corresponding to each target frame include sampling position parameters
  • the processor is configured to perform the following steps:
  • the center position parameter of the interactive object in each target frame is obtained.
  • the sampling position parameter corresponding to each target frame is determined.
  • sampling parameters corresponding to each target frame include sampling scaling parameters
  • the processor is configured to perform the following steps:
  • the distance parameter corresponding to each target frame is obtained.
  • the zoom change magnification corresponding to the distance parameter is determined.
  • the sampling zoom parameter corresponding to each target frame is determined.
  • the processor is used to perform the following steps:
  • the steering angle corresponding to the target time slice is obtained.
  • the initial movement speed is updated based on the steering mixing coefficient, and the updated initial movement speed is obtained.
  • the target motion speed of the lens in the target time slice is determined.
  • the processor is used to perform the following steps:
  • the change parameter of the lens in the target time slice is acquired.
  • the second target parameter includes at least one of the second position parameter and the second zoom parameter.
  • the target motion speed of the lens in the target time slice is determined.
  • the processor is used to perform the following steps:
  • the process of controlling the movement of the lens is divided into a reference number of sub-processes.
  • the speed change rate corresponding to any sub-process is determined.
  • the sub-movement speed corresponding to any sub-process is determined.
  • the lens movement is controlled according to the sub-motion speed corresponding to any sub-process.
  • the processor is further configured to perform the following steps:
  • the validity of the first target parameter is verified.
  • the target motion speed of the lens in the target time slice is determined based on the first target parameter that has passed the validity verification, the initial motion speed of the lens in the target time slice, the time interval corresponding to the target time slice, and the time-speed change magnification curve.
  • the processor is further configured to perform the following steps:
  • the target motion speed is updated, and the updated target motion speed is obtained.
  • the lens movement is controlled to meet the first target parameter, including:
  • based on the time-speed change magnification curve, the initial motion speed, and the updated target motion speed, the lens motion is controlled to meet the first target parameter.
  • the time-speed change rate curve is a Bezier curve.
  • the terminal may optionally further include: a peripheral device interface 1203 and at least one peripheral device.
  • the processor 1201, the memory 1202, and the peripheral device interface 1203 may be connected by a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 1203 through a bus, a signal line, or a circuit board.
  • the peripheral device includes: at least one of a radio frequency circuit 1204, a touch display screen 1205, a camera component 1206, an audio circuit 1207, a positioning component 1208, and a power supply 1209.
  • the peripheral device interface 1203 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1201 and the memory 1202.
  • in some embodiments, the processor 1201, the memory 1202, and the peripheral device interface 1203 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1201, the memory 1202, and the peripheral device interface 1203 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • the radio frequency circuit 1204 is used for receiving and transmitting RF (Radio Frequency, radio frequency) signals, also called electromagnetic signals.
  • the radio frequency circuit 1204 communicates with a communication network and other communication devices through electromagnetic signals.
  • the radio frequency circuit 1204 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
  • the radio frequency circuit 1204 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a user identity module card, and so on.
  • the radio frequency circuit 1204 can communicate with other terminals through at least one wireless communication protocol.
  • the wireless communication protocol includes, but is not limited to: metropolitan area networks, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity, wireless fidelity) networks.
  • the radio frequency circuit 1204 may also include a circuit related to NFC (Near Field Communication), which is not limited in this application.
  • the display screen 1205 is used to display a UI (User Interface, user interface).
  • the UI can include graphics, text, icons, videos, and any combination thereof.
  • the display screen 1205 also has the ability to collect touch signals on or above the surface of the display screen 1205.
  • the touch signal may be input to the processor 1201 as a control signal for processing.
  • the display screen 1205 may also be used to provide virtual buttons and/or virtual keyboards, also called soft buttons and/or soft keyboards.
  • the display screen 1205 may be a flexible display screen arranged on a curved surface or a folding surface of the terminal. Furthermore, the display screen 1205 may also be set in a non-rectangular irregular pattern, that is, a special-shaped screen.
  • the display screen 1205 may be made of materials such as LCD (Liquid Crystal Display) and OLED (Organic Light-Emitting Diode).
  • the camera assembly 1206 is used to capture images or videos.
  • the camera assembly 1206 includes a front camera and a rear camera.
  • the front camera is set on the front panel of the terminal, and the rear camera is set on the back of the terminal.
  • the camera assembly 1206 may also include a flash.
  • the flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash refers to a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
  • the audio circuit 1207 may include a microphone and a speaker.
  • the microphone is used to collect sound waves of the user and the environment, and convert the sound waves into electrical signals and input them to the processor 1201 for processing, or input to the radio frequency circuit 1204 to implement voice communication.
  • the microphone can also be an array microphone or an omnidirectional collection microphone.
  • the speaker is used to convert the electrical signal from the processor 1201 or the radio frequency circuit 1204 into sound waves.
  • the speaker can be a traditional thin-film speaker or a piezoelectric ceramic speaker.
  • When the speaker is a piezoelectric ceramic speaker, it can not only convert an electrical signal into sound waves audible to humans, but also convert an electrical signal into sound waves inaudible to humans for purposes such as distance measurement.
  • the audio circuit 1207 may also include a headphone jack.
  • the positioning component 1208 is used to locate the current geographic location of the terminal to implement navigation or LBS (Location Based Service, location-based service).
  • the positioning component 1208 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
  • the power supply 1209 is used to supply power to various components in the terminal.
  • the power source 1209 may be alternating current, direct current, disposable batteries, or rechargeable batteries.
  • the rechargeable battery may support wired charging or wireless charging.
  • the rechargeable battery can also be used to support fast charging technology.
  • the terminal further includes one or more sensors 1210.
  • the one or more sensors 1210 include, but are not limited to: an acceleration sensor 1211, a gyroscope sensor 1212, a pressure sensor 1213, a fingerprint sensor 1214, an optical sensor 1215, and a proximity sensor 1216.
  • the acceleration sensor 1211 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established by the terminal. For example, the acceleration sensor 1211 can be used to detect the components of gravitational acceleration on three coordinate axes.
  • the processor 1201 may control the touch screen 1205 to display the user interface in a horizontal view or a vertical view according to the gravity acceleration signal collected by the acceleration sensor 1211.
  • the acceleration sensor 1211 may also be used for the collection of game or user motion data.
  • the gyroscope sensor 1212 can detect the body direction and rotation angle of the terminal, and the gyroscope sensor 1212 can cooperate with the acceleration sensor 1211 to collect the user's 3D actions on the terminal. Based on the data collected by the gyroscope sensor 1212, the processor 1201 can implement the following functions: motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
  • the pressure sensor 1213 may be disposed on the side frame of the terminal and/or the lower layer of the touch screen 1205.
  • the processor 1201 performs left and right hand recognition or quick operation according to the holding signal collected by the pressure sensor 1213.
  • the processor 1201 controls the operability controls on the UI interface according to the user's pressure operation on the touch display screen 1205.
  • the operability controls include at least one of button controls, scroll bar controls, icon controls, and menu controls.
  • the fingerprint sensor 1214 is used to collect the user's fingerprint.
  • the processor 1201 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 1214, or the fingerprint sensor 1214 identifies the user's identity according to the collected fingerprint.
  • the processor 1201 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings.
  • the fingerprint sensor 1214 may be provided on the front, back, or side of the terminal. When a physical button or a manufacturer logo is provided on the terminal, the fingerprint sensor 1214 can be integrated with the physical button or the manufacturer logo.
  • the optical sensor 1215 is used to collect the ambient light intensity.
  • the processor 1201 may control the display brightness of the touch display screen 1205 according to the ambient light intensity collected by the optical sensor 1215. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1205 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 1205 is decreased.
  • the processor 1201 may also dynamically adjust the shooting parameters of the camera assembly 1206 according to the ambient light intensity collected by the optical sensor 1215.
  • the proximity sensor 1216, also called a distance sensor, is usually set on the front panel of the terminal.
  • the proximity sensor 1216 is used to collect the distance between the user and the front of the terminal.
  • When the proximity sensor 1216 detects that the distance between the user and the front of the terminal gradually decreases, the processor 1201 controls the touch display screen 1205 to switch from the screen-on state to the screen-off state; when the proximity sensor 1216 detects that the distance between the user and the front of the terminal gradually increases, the processor 1201 controls the touch display screen 1205 to switch from the screen-off state to the screen-on state.
  • the structure shown in FIG. 12 does not constitute a limitation on the terminal, and the terminal may include more or fewer components than shown in the figure, or combine certain components, or adopt a different component arrangement.
  • a server is also provided.
  • the server includes a processor 1301 and a memory 1302, and the memory 1302 stores at least one program code.
  • the at least one piece of program code is loaded and executed by one or more processors 1301, so as to implement any of the aforementioned methods for controlling lens movement.
  • a computer-readable storage medium stores at least one piece of program code, and the at least one piece of program code is loaded and executed by a processor of a computer device to perform the following steps:
  • the first target parameter that the shot needs to meet in the target time slice is acquired, where the first target parameter includes at least one of the first position parameter and the first zoom parameter.
  • Based on the first target parameter, the initial motion speed of the lens in the target time slice, the time interval corresponding to the target time slice, and the time-speed change magnification curve, the target motion speed of the lens in the target time slice is determined, where the target motion speed includes at least one of the target panning speed and the target zooming speed.
  • Based on the time-speed change magnification curve, the initial motion speed, and the target motion speed, the lens movement is controlled.
  • the processor is used to perform the following steps:
  • Sampling processing is performed on the data of each target frame corresponding to the target time slice, and the sampling parameter corresponding to each target frame is obtained.
  • the weight value is set for each target frame according to the distance between the timestamp and the start timestamp of the target time slice.
  • the first target parameter that the shot needs to meet in the target time slice is determined.
  • sampling parameters corresponding to each target frame include sampling position parameters
  • the processor is configured to perform the following steps:
  • the center position parameter of the interactive object in each target frame is obtained.
  • the sampling position parameter corresponding to each target frame is determined.
  • sampling parameters corresponding to each target frame include sampling scaling parameters
  • the processor is configured to perform the following steps:
  • the distance parameter corresponding to each target frame is obtained.
  • the zoom change magnification corresponding to the distance parameter is determined.
  • the sampling zoom parameter corresponding to each target frame is determined.
  • the processor is used to perform the following steps:
  • the steering angle corresponding to the target time slice is obtained.
  • the initial movement speed is updated based on the steering mixing coefficient, and the updated initial movement speed is obtained.
  • the target motion speed of the lens in the target time slice is determined.
  • the processor is used to perform the following steps:
  • the change parameter of the lens in the target time slice is acquired.
  • the second target parameter includes at least one of the second position parameter and the second zoom parameter.
  • the target motion speed of the lens in the target time slice is determined.
  • the processor is used to perform the following steps:
  • the process of controlling the movement of the lens is divided into a reference number of sub-processes.
  • the speed change rate corresponding to any sub-process is determined.
  • the sub-movement speed corresponding to any sub-process is determined.
  • the lens movement is controlled according to the sub-motion speed corresponding to any sub-process.
  • the processor is further configured to perform the following steps:
  • the validity of the first target parameter is verified.
  • the target motion speed of the lens in the target time slice is determined based on the first target parameter that has passed the validity verification, the initial motion speed of the lens in the target time slice, the time interval corresponding to the target time slice, and the time-speed change magnification curve.
  • the processor is further configured to perform the following steps:
  • the target motion speed is updated, and the updated target motion speed is obtained.
  • the lens movement is controlled to meet the first target parameter, including:
  • based on the time-speed change magnification curve, the initial motion speed, and the updated target motion speed, the lens motion is controlled to meet the first target parameter.
  • the time-speed change rate curve is a Bezier curve.
  • the above-mentioned computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
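The embodiments above repeatedly reference a time-speed change magnification curve (a Bezier curve, per one embodiment) and a control process divided into a reference number of sub-processes whose speeds are read from that curve. As an illustrative sketch only, assuming a cubic Bezier ease curve with hypothetical control points (the embodiments do not fix concrete values), the per-sub-process speed blending could look like:

```python
def bezier_rate(t, p1=0.42, p2=0.58):
    """Cubic Bezier ease curve mapping normalized time t in [0, 1] to a
    speed change magnification in [0, 1] (endpoints fixed at 0 and 1).
    p1 and p2 are illustrative control values, not values specified by
    the embodiments."""
    u = 1.0 - t
    return 3 * u * u * t * p1 + 3 * u * t * t * p2 + t * t * t

def sub_process_speeds(v_init, v_target, n_sub):
    """Blend from the initial motion speed to the target motion speed
    across a reference number of sub-processes, using the curve value
    for each sub-process as the blend factor."""
    speeds = []
    for k in range(1, n_sub + 1):
        r = bezier_rate(k / n_sub)  # magnification for this sub-process
        speeds.append(v_init + (v_target - v_init) * r)
    return speeds
```

Because the magnification rises smoothly from 0 to 1, the lens speed changes continuously across the time slice instead of jumping to the target speed, which matches the stated aim of motion continuity between adjacent time slices.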

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Lens Barrels (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

This application discloses a method, apparatus, device, and storage medium for controlling lens movement, belonging to the field of computer technology. The method includes: acquiring, based on data of multiple target frames corresponding to a target time slice, a first target parameter that the lens needs to meet in the target time slice; determining a target movement speed of the lens in the target time slice based on the first target parameter, an initial movement speed of the lens in the target time slice, a time interval corresponding to the target time slice, and a time-speed change magnification curve; and controlling the lens movement based on the time-speed change magnification curve, the initial movement speed, and the target movement speed. In the above process, the first target parameter acquired based on the data of multiple target frames is highly reliable, which helps improve the stability of the lens movement, and controlling the lens movement based on the time-speed change magnification curve helps improve the continuity of the lens movement between adjacent time slices.

Description

Method, apparatus, device, and storage medium for controlling lens movement

This application claims priority to Chinese Patent Application No. 202010055703.X, entitled "Method, apparatus, device, and storage medium for controlling lens movement" and filed on January 17, 2020, the entire contents of which are incorporated herein by reference.

Technical Field

The embodiments of this application relate to the field of computer technology, and in particular to a method, apparatus, device, and storage medium for controlling lens movement.

Background

With the rapid development of game technology, more and more game clients provide users with a director mode. While a user watches game video in director mode, the game client can control the movement of the director lens and focus it on exciting game scenes, so as to improve the user's experience of watching the game video.
Summary

The embodiments of this application provide a method, apparatus, device, and storage medium for controlling lens movement, which can be used to improve the effect of controlling lens movement. The technical solutions are as follows:

In one aspect, an embodiment of this application provides a method for controlling lens movement, applied to a computer device, the method including:

acquiring, based on data of multiple target frames corresponding to a target time slice, a first target parameter that the lens needs to meet in the target time slice, the first target parameter including at least one of a first position parameter and a first zoom parameter;

determining a target movement speed of the lens in the target time slice based on the first target parameter, an initial movement speed of the lens in the target time slice, a time interval corresponding to the target time slice, and a time-speed change magnification curve, the target movement speed including at least one of a target translation speed and a target zoom speed; and

controlling the lens movement based on the time-speed change magnification curve, the initial movement speed, and the target movement speed.

In another aspect, an apparatus for controlling lens movement is provided, the apparatus including:

an acquiring module, configured to acquire, based on data of multiple target frames corresponding to a target time slice, a first target parameter that the lens needs to meet in the target time slice, the first target parameter including at least one of a first position parameter and a first zoom parameter;

a determining module, configured to determine a target movement speed of the lens in the target time slice based on the first target parameter, an initial movement speed of the lens in the target time slice, a time interval corresponding to the target time slice, and a time-speed change magnification curve, the target movement speed including at least one of a target translation speed and a target zoom speed; and

a control module, configured to control the lens movement based on the time-speed change magnification curve, the initial movement speed, and the target movement speed, so as to meet the first target parameter.
In another aspect, a computer device is provided, the computer device including a processor and a memory, the memory storing at least one piece of program code, the at least one piece of program code being loaded and executed by the processor to perform the following steps:

acquiring, based on data of multiple target frames corresponding to a target time slice, a first target parameter that the lens needs to meet in the target time slice, the first target parameter including at least one of a first position parameter and a first zoom parameter;

determining a target movement speed of the lens in the target time slice based on the first target parameter, an initial movement speed of the lens in the target time slice, a time interval corresponding to the target time slice, and a time-speed change magnification curve, the target movement speed including at least one of a target translation speed and a target zoom speed; and

controlling the lens movement based on the time-speed change magnification curve, the initial movement speed, and the target movement speed.

In another aspect, a computer-readable storage medium is further provided, the computer-readable storage medium storing at least one piece of program code, the at least one piece of program code being loaded by a processor to perform the following steps:

acquiring, based on data of multiple target frames corresponding to a target time slice, a first target parameter that the lens needs to meet in the target time slice, the first target parameter including at least one of a first position parameter and a first zoom parameter;

determining a target movement speed of the lens in the target time slice based on the first target parameter, an initial movement speed of the lens in the target time slice, a time interval corresponding to the target time slice, and a time-speed change magnification curve, the target movement speed including at least one of a target translation speed and a target zoom speed; and

controlling the lens movement based on the time-speed change magnification curve, the initial movement speed, and the target movement speed.

The technical solutions provided in the embodiments of this application bring at least the following beneficial effects:

The first target parameter that the lens needs to meet in the target time slice is acquired based on the data of multiple target frames, so the first target parameter is highly reliable, which helps improve the stability of the lens movement. In addition, controlling the lens movement based on the time-speed change magnification curve helps improve the continuity of the lens movement between adjacent time slices; the lens movement process is more stable, and the effect of controlling the lens movement is better.
Brief Description of the Drawings

To describe the technical solutions in the embodiments of this application more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Apparently, the accompanying drawings in the following description show merely some embodiments of this application, and a person of ordinary skill in the art may derive other drawings from these accompanying drawings without creative effort.

FIG. 1 is a schematic diagram of an implementation environment of a method for controlling lens movement according to an embodiment of this application;

FIG. 2 is a flowchart of a method for controlling lens movement according to an embodiment of this application;

FIG. 3 is a schematic diagram of a movement process of interactive objects according to an embodiment of this application;

FIG. 4 is a schematic diagram of position points indicated by a center position parameter, a sampling position parameter, and a collective position parameter according to an embodiment of this application;

FIG. 5 is a schematic diagram of a distance-zoom change magnification curve according to an embodiment of this application;

FIG. 6 is a schematic diagram of a time-speed change magnification curve according to an embodiment of this application;

FIG. 7 is a schematic diagram of an angle-steering mixing coefficient curve according to an embodiment of this application;

FIG. 8 is a schematic diagram of a time-movement speed curve according to an embodiment of this application;

FIG. 9 is a schematic diagram of a process of controlling lens movement according to an embodiment of this application;

FIG. 10 is a schematic diagram of an apparatus for controlling lens movement according to an embodiment of this application;

FIG. 11 is a schematic diagram of an apparatus for controlling lens movement according to an embodiment of this application;

FIG. 12 is a schematic structural diagram of a terminal according to an embodiment of this application;

FIG. 13 is a schematic structural diagram of a computer device according to an embodiment of this application.
Detailed Description

To make the objectives, technical solutions, and advantages of this application clearer, the implementations of this application are further described in detail below with reference to the accompanying drawings.

It should be noted that the terms "first", "second", and the like in the specification, claims, and accompanying drawings of this application are used to distinguish between similar objects, and are not necessarily used to describe a particular order or sequence. It should be understood that data used in this way is interchangeable where appropriate, so that the embodiments of this application described herein can be implemented in an order other than those illustrated or described herein. The implementations described in the following exemplary embodiments do not represent all implementations consistent with this application; on the contrary, they are merely examples of apparatuses and methods consistent with some aspects of this application as detailed in the appended claims.

With the rapid development of game technology, more and more game clients provide users with a director mode. When a user selects the director mode, the director lens can automatically follow the rhythm of the game and move, showing the user exciting team battles or important moments without requiring any operation from the user. In other words, while the user watches game video in director mode, the game client controls the movement of the director lens and focuses it on exciting game scenes, so as to improve the user's experience of watching the game video.

In this regard, an embodiment of this application provides a method for controlling lens movement. Referring to FIG. 1, it shows a schematic diagram of an implementation environment of the method for controlling lens movement provided by an embodiment of this application. The implementation environment includes a terminal 11 and a server 12.

The terminal 11 is installed with a game client capable of providing a director mode. In the director mode, the game client in the terminal 11 can control the lens movement by applying the method provided in the embodiments of this application. It should be noted that the lens in the embodiments of this application is the director lens in director mode, and the scene on which the director lens focuses is the scene seen by the user in director mode. In some embodiments, the server 12 refers to the backend server of the game client installed in the terminal 11, and the server 12 can provide data support for the game client installed in the terminal 11.

In some embodiments, the terminal 11 is a smart device such as a mobile phone, a tablet computer, or a personal computer. The server 12 is one server, a server cluster composed of multiple servers, or a cloud computing service center. The terminal 11 and the server 12 establish a communication connection through a wired or wireless network.

A person skilled in the art should understand that the terminal 11 and the server 12 are merely examples. Other existing or future terminals or servers, if applicable to this application, shall also fall within the protection scope of this application and are included herein by reference.
Based on the implementation environment shown in FIG. 1, an embodiment of this application provides a method for controlling lens movement, taking application of the method to the game client in the terminal as an example. As shown in FIG. 2, the method provided in this embodiment of this application includes the following steps:

In step 201, the terminal acquires, based on the data of multiple target frames corresponding to a target time slice, a first target parameter that the lens needs to meet in the target time slice.

In some embodiments, the target time slice refers to the time slice in which the operation of controlling the lens movement currently needs to be performed. In the process of directing a game video, the terminal divides the game video into multiple time slices according to a first reference time interval, and then performs the operation of controlling the lens movement in each time slice in turn based on the method provided in the embodiments of this application. In some embodiments, the first reference time interval is set empirically or freely adjusted according to the application scenario, which is not limited in the embodiments of this application. For example, the first reference time interval is set to 1 second; in this case, the time interval corresponding to the target time slice is also 1 second.

In some embodiments, the multiple target frames corresponding to the target time slice include the game video frame corresponding to the start timestamp of the target time slice and a reference number of game video frames preceding the game video frame corresponding to the start timestamp. In some embodiments, the reference number is set empirically or freely adjusted according to the application scenario, which is not limited in the embodiments of this application. In some embodiments, the reference number is set to 10; in this case, the terminal can acquire the first target parameter that the lens needs to meet in the target time slice based on 11 target frames corresponding to the target time slice.

In some embodiments, the data of each target frame refers to the raw data in that frame, including but not limited to the position data of the interactive objects in the target frame, the timestamp data of the target frame, and the like. The interactive objects in a target frame refer to the game character models included in that target frame that participate in the game process.

In some embodiments, the first target parameter refers to the parameter that the lens needs to meet at the end moment of the target time slice. The first target parameter includes at least one of a first position parameter and a first zoom parameter. The first position parameter indicates the position on which the lens needs to focus at the end moment of the target time slice, and the first zoom parameter indicates the zoom value that the lens needs to reach at the end moment of the target time slice. In some embodiments, the terminal can express the first position parameter in the form of position coordinates, and express the first zoom parameter in the form of a numerical value. In some embodiments, when the terminal expresses the first zoom parameter as a numerical value, a value between 0 and 1 (inclusive) indicates that the focus range of the lens shrinks, 1 indicates that the focus range of the lens remains unchanged, and a value greater than 1 indicates that the focus range of the lens enlarges.

In this implementation, the terminal acquires the first target parameter based on the data of multiple target frames, which can reduce the adverse influence of abnormal data in a single video frame on the first target parameter, so that the acquired first target parameter is highly reliable, thereby improving the stability of the subsequent process of controlling the lens movement.
In some embodiments, the process in which the terminal acquires, based on the data of the multiple target frames corresponding to the target time slice, the first target parameter that the lens needs to meet in the target time slice includes the following steps 2011 to 2013:

Step 2011: The terminal performs sampling processing on the data of each target frame corresponding to the target time slice to obtain the sampling parameter corresponding to each target frame.

In some embodiments, the sampling parameter corresponding to each target frame refers to a parameter, determined based on that target frame, that the lens needs to meet in the target time slice. The sampling parameter corresponding to each target frame includes at least one of a sampling position parameter and a sampling zoom parameter corresponding to that target frame.

In some embodiments, in the case where the sampling parameter corresponding to each target frame includes a sampling position parameter, the process in which the terminal performs sampling processing on the data of each target frame corresponding to the target time slice to obtain the sampling parameter corresponding to each target frame includes step A and step B:

Step A: The terminal acquires, based on the data of each target frame, the center position parameter of the interactive objects in each target frame.

In some embodiments, the center position parameter of the interactive objects in each target frame indicates the geometric center of the positions of the interactive objects in that target frame. This step is implemented as follows: the terminal extracts, from the data of each target frame, the position data of each interactive object in that target frame, and then acquires the center position parameter of the interactive objects in each target frame based on the position data of the interactive objects.

In some embodiments, the process in which the terminal acquires the center position parameter of the interactive objects in each target frame based on the position data of the interactive objects is: the terminal takes the average of the position data of the interactive objects as the center position parameter of the interactive objects in the target frame. The terminal can implement the above process based on Formula 1.

P_i1 = Sum(P_c) / N(p)   (Formula 1)

where P_i1 denotes the center position parameter of the interactive objects in the target frame; Sum(P_c) denotes the sum of the position data of the interactive objects in the target frame; and N(p) denotes the number of interactive objects in the target frame.
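Formula 1 can be sketched as follows; the 2D coordinate layout and the function name are illustrative assumptions, not taken from the application:

```python
def center_position(positions):
    """Formula 1: P_i1 = Sum(P_c) / N(p) -- the geometric center of the
    interactive objects' positions in one target frame.

    positions: list of (x, y) tuples, one per interactive object."""
    n = len(positions)
    sx = sum(x for x, _ in positions)  # Sum(P_c), x component
    sy = sum(y for _, y in positions)  # Sum(P_c), y component
    return (sx / n, sy / n)
```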
Step B: The terminal determines the sampling position parameter corresponding to each target frame based on the center position parameter of the interactive objects in the target frame and the collective position parameter of the interactive objects.

In some embodiments, the collective position parameter of the interactive objects refers to the parameter of the final gathering point that all interactive objects in the game video need to reach. The terminal can obtain the collective position parameter from the backend server of the game client. It should be noted that the center position parameter of the interactive objects may differ between different target frames; however, since the position where the interactive objects finally gather is the same, the collective position parameter of the interactive objects is the same for all target frames.

In a battle scene of the game video, as shown in FIG. 3, the movement process of the interactive objects is as follows: the interactive objects first move from a starting position point 301 to an intermediate gathering point 302, and then move from the intermediate gathering point 302 to a final gathering point 303 by moving from the top lane toward the bottom lane. The backend server of the game client can obtain the parameter of the final gathering point by analyzing and processing the entire game video, and feed the parameter of the final gathering point back to the terminal. The terminal thereby obtains the collective position parameter of the interactive objects.

In some embodiments, the process in which the terminal determines the sampling position parameter corresponding to each target frame based on the center position parameter of the interactive objects in the target frame and the collective position parameter of the interactive objects is: the terminal determines the sampling position parameter corresponding to each target frame based on the center position parameter of the interactive objects in the target frame and a first weight value, as well as the collective position parameter of the interactive objects and a second weight value. In some embodiments, the terminal obtains the product of the center position parameter of the interactive objects in each target frame and the first weight value, and the product of the collective position parameter of the interactive objects and the second weight value, and takes the sum of the two products as the sampling position parameter corresponding to that target frame. The terminal can implement the above process using Formula 2.

P_i = P_i1 × d_1 + P_i2 × d_2   (Formula 2)

where P_i denotes the sampling position parameter corresponding to the target frame; P_i1 denotes the center position parameter of the interactive objects in the target frame, and d_1 denotes the first weight value; P_i2 denotes the collective position parameter of the interactive objects, and d_2 denotes the second weight value. In some embodiments, the first weight value and the second weight value are set empirically or freely adjusted according to the application scenario, which is not limited in the embodiments of this application. In some embodiments, when the first weight value is set to 0.5, the second weight value can also be set to 0.5. This setting allows the user to watch both the performance of each interactive object at its current position and the performance at the final gathering point.
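Formula 2 can be sketched as follows, using the example weights d_1 = d_2 = 0.5 given above; the 2D coordinates and names are illustrative assumptions:

```python
def sampling_position(center, collective, d1=0.5, d2=0.5):
    """Formula 2: P_i = P_i1 * d_1 + P_i2 * d_2 -- blend the per-frame
    center position with the collective (final gathering) position.
    The default d1 = d2 = 0.5 mirrors the example weights in the text."""
    return (center[0] * d1 + collective[0] * d2,
            center[1] * d1 + collective[1] * d2)
```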
In some embodiments, if the terminal uses position coordinates to express the center position parameter, the sampling position parameter, and the collective position parameter corresponding to each target frame, each of the center position parameter, the sampling position parameter, and the collective position parameter indicates one position point. The position points indicated by the center position parameter, the sampling position parameter, and the collective position parameter may be as shown in FIG. 4. In FIG. 4, the blue side and the red side are two opposing teams in the game video; the terminal can determine the position point 402 indicated by the sampling position parameter corresponding to the target frame according to the position point 401 indicated by the center position parameter of the interactive objects of the red side and the blue side, and the position point 403 indicated by the collective position parameter.

In this implementation, the terminal determines the sampling position parameter corresponding to each target frame by comprehensively considering the center position parameter and the collective position parameter corresponding to each target frame, which enables the lens to meet both the first position parameter and the first zoom parameter in the process of controlling the lens movement, reducing visual abruptness.
In some embodiments, in the case where the sampling parameter corresponding to each target frame includes a sampling zoom parameter, the process in which the terminal performs sampling processing on the data of each target frame corresponding to the target time slice to obtain the sampling parameter corresponding to each target frame includes steps a to c:

Step a: The terminal acquires, based on the data of each target frame, the distance parameter corresponding to each target frame.

In some embodiments, this step is implemented as follows: based on the data of each target frame, the terminal obtains the distance between the position data of each interactive object and the sampling position parameter corresponding to the target frame, as well as the distance between the sampling position parameter corresponding to the target frame and the collective position parameter. The terminal takes the maximum of the above distances as the distance parameter corresponding to the target frame. The terminal can obtain the distance parameter corresponding to each target frame using Formula 3.

L = max{|P_c − P_i|, |P_i2 − P_i|}   (Formula 3)

where L denotes the distance parameter corresponding to the target frame; P_c denotes the position data of each interactive object in the target frame; P_i denotes the sampling position parameter corresponding to the target frame; and P_i2 denotes the collective position parameter.
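Formula 3 can be sketched as follows; Euclidean distance and 2D coordinates are assumptions made for illustration:

```python
def distance_parameter(object_positions, p_i, p_i2):
    """Formula 3: L = max{|P_c - P_i|, |P_i2 - P_i|} -- the largest
    distance from the sampling position to any interactive object or to
    the collective (gathering) position."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    candidates = [dist(p, p_i) for p in object_positions]  # |P_c - P_i|
    candidates.append(dist(p_i2, p_i))                     # |P_i2 - P_i|
    return max(candidates)
```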
步骤b:终端基于距离-缩放变化倍率曲线,确定与距离参数对应的缩放变化倍率。
在镜头不断运动的过程中,整个游戏视频中各个游戏视频帧对应的距离参数从L max下降 到L min,同时伴随着缩放参数从S min增加到S max。L max是指整个游戏视频中各个游戏视频帧对应的距离参数中的最大值,L min是指整个游戏视频中各个游戏视频帧对应的距离参数中的最小值;S min是指整个游戏视频中镜头的缩放最小值,对应L max;S max是指整个游戏视频中镜头的缩放最大值,对应L min
根据游戏场景本身的特性,在整个游戏视频中,各个游戏视频帧对应的距离参数L的取值不是在L max和L min之间线性变化,而是相对集中在L min侧。所以,从S min到S max的变化,也应该是前面变化慢,后面变化快。因此,本申请实施例中使用的距离-缩放变化倍率曲线的斜率为增大趋势,也就是说,距离-缩放变化倍率曲线的斜率前小后大。此种距离-缩放变化倍率曲线可以在L的取值比较集中的范围内增大映射区间范围,优化镜头运动的视觉感受。需要说明的是,上述L max、L min、S min和S max可以由游戏客户端的后台服务器确定并反馈至游戏客户端,也可以由游戏客户端从记录有L max、L min、S min和S max的配置文件中读取得到。
在一些实施例中,距离-缩放变化倍率曲线如图5中的501所示。图5中的横坐标表示归一化处理后的距离,纵坐标表示缩放变化倍率。图5所示的距离-缩放变化倍率曲线501的斜率为增大趋势。随着横坐标的数值从0变化到1,纵坐标的数值也从0变化到1。
在一些实施例中,终端基于距离-缩放变化倍率曲线,确定与距离参数对应的缩放变化倍率的过程为:终端对距离参数进行归一化处理,得到归一化处理后的距离参数。终端基于距离-缩放变化倍率曲线,确定与归一化处理后的距离参数对应的缩放变化倍率。
在一些实施例中,终端能够采用公式4得到归一化处理后的距离参数。
L′ = (L_max - L) / (L_max - L_min)   (公式4)
其中，L′表示归一化处理后的距离参数；L表示归一化处理前的距离参数，L_max表示整个游戏视频中各个游戏视频帧对应的距离参数中的最大值；L_min表示整个游戏视频中各个游戏视频帧对应的距离参数中的最小值。
步骤c:终端基于缩放变化倍率,确定各个目标帧对应的采样缩放参数。
在一些实施例中,终端能够采用公式5确定各个目标帧对应的采样缩放参数。
S_i = (S_max - S_min) × r + S_min   (公式5)
其中，S_i表示各个目标帧对应的采样缩放参数；r表示缩放变化倍率；S_min表示整个游戏视频中镜头的缩放最小值；S_max表示整个游戏视频中镜头的缩放最大值。
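步骤a至步骤c中的公式4和公式5可以串联为如下示意实现；其中距离-缩放变化倍率曲线以r = x²代替，仅用于体现"斜率前小后大"的特性，并非本申请限定的曲线形状：

```python
def sample_zoom(L, L_max, L_min, S_max, S_min, rate_curve=lambda x: x * x):
    """先按公式4归一化距离 L' = (L_max - L) / (L_max - L_min)，
    再经距离-缩放变化倍率曲线得到缩放变化倍率 r，
    最后按公式5计算采样缩放参数 S_i = (S_max - S_min) × r + S_min。"""
    L_norm = (L_max - L) / (L_max - L_min)
    r = rate_curve(L_norm)
    return (S_max - S_min) * r + S_min
```

当L取L_min时采样缩放参数达到S_max，当L取L_max时为S_min，与上文描述的变化方向一致。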
总之,对于各个目标帧对应的采样参数仅包括采样位置参数的情况,终端根据上述步骤A和步骤B即可获取各个目标帧对应的采样参数。对于各个目标帧对应的采样参数仅包括采样缩放参数的情况,终端根据上述步骤a至步骤c即可获取各个目标帧对应的采样参数。对于各个目标帧对应的采样参数包括采样位置参数和采样缩放参数的情况,终端根据上述步骤A和步骤B,以及步骤a至步骤c能够获取各个目标帧对应的采样参数。
在一些实施例中,终端基于上述步骤2011,分别对每个目标帧的数据进行采样处理,得到各个目标帧对应的采样参数,然后执行步骤2012。
步骤2012：终端按照时间戳与目标时间分片的起始时间戳的距离，为各个目标帧设置权重值。
在一些实施例中,终端按照时间戳与目标时间分片的起始时间戳的距离,为各个目标帧设置权重值的过程为:终端按照时间戳与目标时间分片的起始时间戳的距离从远到近的顺序,为各个目标帧设置从小到大的权重值,或者,终端按照时间戳与目标时间分片的起始时间戳的距离从近到远的顺序,为各个目标帧设置从大到小的权重值。在一些实施例中,各个目标帧对应的权重值的分布符合高斯分布。
在一些实施例中,若目标帧的数量为5,那么终端按照时间戳与目标时间分片的起始时间戳的距离从远到近的顺序,为各个目标帧分别设置1/55、4/55、9/55、16/55和25/55的权重值。
步骤2013:终端基于各个目标帧对应的采样参数和各个目标帧对应的权重值,确定镜头在目标时间分片中需要满足的第一目标参数。
在一些实施例中,终端基于各个目标帧对应的采样参数和各个目标帧对应的权重值,确定镜头在目标时间分片中需要满足的第一目标参数的过程为:终端分别获取每个目标帧对应的采样参数和每个目标帧对应的权重值的乘积,将各个乘积的和作为镜头在目标时间分片中需要满足的第一目标参数。
需要说明的是,对于各个目标帧对应的采样参数包括采样位置参数的情况,第一目标参数包括第一位置参数。对于各个目标帧对应的采样参数包括采样缩放参数的情况,第一目标参数包括第一缩放参数。对于各个目标帧对应的采样参数包括采样位置参数和采样缩放参数的情况,第一目标参数包括第一位置参数和第一缩放参数。
在一些实施例中,若目标帧的数量为5,那么终端按照时间戳与目标时间分片的起始时间戳的距离从远到近的顺序,为各个目标帧分别设置1/55、4/55、9/55、16/55和25/55的权重值。则终端能够采用公式6获取第一目标参数中的第一位置参数。
P_n = P_1×1/55 + P_2×4/55 + P_3×9/55 + P_4×16/55 + P_5×25/55   (公式6)
其中，P_n表示第一目标参数中的第一位置参数；P_1、P_2、P_3、P_4和P_5分别表示按照时间戳与目标时间分片的起始时间戳的距离从远到近的顺序排列的5个目标帧对应的采样位置参数。
在一些实施例中,终端能够采用公式7获取第一目标参数中的第一缩放参数。
S_n = S_1×1/55 + S_2×4/55 + S_3×9/55 + S_4×16/55 + S_5×25/55  (公式7)
其中，S_n表示第一目标参数中的第一缩放参数；S_1、S_2、S_3、S_4和S_5分别表示按照时间戳与目标时间分片的起始时间戳的距离从远到近的顺序排列的5个目标帧对应的采样缩放参数。
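公式6和公式7的加权求和形式相同，可以抽象为如下示意函数（权重取i²/Σi²，5帧时即1/55、4/55、9/55、16/55、25/55；函数名为示意性假设）：

```python
def first_target_param(samples):
    """samples 为按时间戳与起始时间戳的距离从远到近排列的采样参数
    （采样位置参数或采样缩放参数），返回加权和作为第一目标参数。"""
    denom = sum(i * i for i in range(1, len(samples) + 1))
    return sum(s * (i * i) / denom for i, s in enumerate(samples, start=1))
```

距离起始时间戳越近的目标帧权重越大，使第一目标参数更偏向最新的画面状态。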
在一些实施例中,终端在获取镜头在目标时间分片中需要满足的第一目标参数之后,还可以基于参数变化阈值,对第一目标参数进行有效性验证。
在一些实施例中,终端基于参数变化阈值,对第一目标参数进行有效性验证的过程为:获取镜头在目标时间分片中的变化参数;基于变化参数和参数变化阈值的比对结果,对第一目标参数进行有效性验证。
在一些实施例中,终端获取镜头在目标时间分片中的变化参数的过程为:终端基于镜头在目标时间分片中已满足的第二目标参数和第一目标参数,获取镜头在所述目标时间分片中的变化参数。其中,在目标时间分片中已满足的第二目标参数是指镜头在目标时间分片的起始时刻满足的参数,第二目标参数包括第二位置参数和第二缩放参数中的至少一个。
当第一目标参数包括第一位置参数时,镜头在目标时间分片中的变化参数包括位置变化参数,终端能够采用第一位置参数与第二位置参数之间的距离长度表示该位置变化参数。当第一目标参数包括第一缩放参数时,镜头在目标时间分片中的变化参数包括缩放变化参数,终端能够采用第一缩放参数与第二缩放参数的差值的绝对值表示该缩放变化参数。
在一些实施例中,终端基于变化参数和参数变化阈值的比对结果,对第一目标参数进行有效性验证的过程为:当变化参数低于参数变化阈值时,终端确定第一目标参数的有效性验证不通过;当变化参数不低于参数变化阈值时,终端确定第一目标参数的有效性验证通过。
在一些实施例中,对于第一目标参数包括第一位置参数和第一缩放参数的情况,镜头在目标时间分片中的变化参数包括位置变化参数和缩放变化参数。在此种情况下,终端基于变化参数和参数变化阈值的比对结果,对第一目标参数进行有效性验证的过程为:终端基于位置变化参数和第一参数变化阈值的比对结果,对第一位置参数进行有效性验证。终端基于缩放变化参数和第二参数变化阈值的比对结果,对第一缩放参数进行有效性验证。其中,第一参数变化阈值和第二参数变化阈值可以相同,也可以不同,本申请实施例对此不加以限定。经过上述过程后,终端能够将有效性验证不通过的参数丢弃,将剩余的参数作为有效性验证通过的第一目标参数,终端基于有效性验证通过的第一目标参数执行步骤202。
在一些实施例中,若第一目标参数包括第一位置参数和第一缩放参数,第二目标参数包括第二位置参数和第二缩放参数,镜头在目标时间分片中的变化参数包括位置变化参数和缩放变化参数。假设第一位置参数的位置坐标为(1,1),第二位置参数的位置坐标为(0,1),第一缩放参数为10,第二缩放参数为2。则终端能够采用两个位置坐标之间的欧式距离1表示镜头在目标时间分片中的位置变化参数,采用两个缩放参数之间的差值的绝对值8表示镜头在目标时间分片中的缩放变化参数。假设第一参数变化阈值设置为2,第二参数变化阈值设置为3,则第一位置参数的有效性验证不通过,第一缩放参数的有效性验证通过。此时,有效性验证通过的第一目标参数中仅包括第一缩放参数。
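上例中的有效性验证逻辑可以如下示意（阈值取值沿用上例，仅为示意性假设）：

```python
def validate_targets(pos_change, zoom_change, pos_threshold=2.0, zoom_threshold=3.0):
    """变化参数低于对应的参数变化阈值时有效性验证不通过（False），
    不低于阈值时通过（True）；分别返回位置与缩放的验证结果。"""
    return pos_change >= pos_threshold, zoom_change >= zoom_threshold
```

按上例的位置变化参数1与缩放变化参数8，返回(False, True)，即仅第一缩放参数通过验证。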
在这种实施方式下,终端通过对第一目标参数进行有效性验证,能够明显减少因为微小平移和微小缩放带来的镜头抖动现象。
需要说明的是,终端在获取镜头在目标时间分片中需要满足的第一目标参数之前,可以设置触发频率,根据触发频率,触发执行获取第一目标参数的步骤。触发频率可以根据经验设置,也可以根据应用场景灵活调整,本申请实施例对此不加以限定。根据触发频率确定的触发时刻可以为每个时间分片的前一小段时间对应的时刻,例如,每个时间分片的前0.2秒对应的时刻,从而可以保证在到达目标时间分片时,及时执行控制镜头运动的操作。
在步骤202中，终端基于第一目标参数、镜头在目标时间分片中的初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线，确定镜头在目标时间分片中的目标运动速度。
其中,目标运动速度包括目标平移速度和目标缩放速度中的至少一个。
在一些实施例中,镜头在目标时间分片中的初始运动速度是指镜头在目标时间分片的起始时刻的运动速度,初始运动速度包括初始平移速度和初始缩放速度中的至少一个。镜头在目标时间分片中的初始运动速度也是镜头在目标时间分片的前一个时间分片的结束时刻的运动速度。终端在目标时间分片的前一个时间分片中的控制镜头运动的过程结束时,能够得到镜头在目标时间分片中的初始运动速度。
在一些实施例中,目标时间分片对应的时间间隔是指目标时间分片从起始时间戳到结束时间戳之间的时长。镜头在目标时间分片中的目标运动速度是指镜头在目标时间分片的结束时刻的运动速度。也就是说,目标运动速度是指镜头在满足第一目标参数时的运动速度。
在一些实施例中,时间-速度变化倍率曲线由游戏的开发人员设置,或者根据应用场景灵活调整,本申请实施例对此不加以限定。在一些实施例中,时间-速度变化倍率曲线为贝塞尔曲线。贝塞尔曲线为平滑曲线,将时间-速度变化曲线设置为贝塞尔曲线,有利于提高镜头运动过程中的平滑性和稳定性。
在一些实施例中,时间-速度变化倍率曲线如图6中的601所示,图6中的横坐标表示归一化处理后的时间,纵坐标表示速度变化倍率。图6所示的时间-速度变化倍率曲线601为一条贝塞尔曲线。随着横坐标的数值从0变化到1,纵坐标的数值也从0变化到1。
在一些实施例中，在对第一目标参数进行了有效性验证的情况下，本步骤中终端基于有效性验证通过的第一目标参数、镜头在目标时间分片中的初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线，确定镜头在目标时间分片中的目标运动速度。
在一些实施例中,终端基于第一目标参数、镜头在目标时间分片中的初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线,确定镜头在目标时间分片中的目标运动速度的过程包括步骤2021至步骤2023:
步骤2021:终端基于镜头在目标时间分片中已满足的第二目标参数和第一目标参数,获取镜头在目标时间分片中的变化参数。
在一些实施例中,镜头在目标时间分片中已满足的第二目标参数是指镜头在目标时间分片的起始时刻满足的参数,第二目标参数包括第二位置参数和第二缩放参数中的至少一个。在目标时间分片的前一个时间分片中的控制镜头运动的过程结束时,可以得到镜头在目标时间分片中已满足的第二目标参数。
在一些实施例中,当第一目标参数包括第一位置参数时,镜头在目标时间分片中的变化参数包括位置变化参数,该位置变化参数可以用第一位置参数与第二位置参数之间的距离长度表示;当第一目标参数包括第一缩放参数时,镜头在目标时间分片中的变化参数包括缩放变化参数,该缩放变化参数可以用第一缩放参数与第二缩放参数的差值的绝对值表示。
在一些实施例中，假设第一位置参数的位置坐标为(1,1)，第二位置参数的位置坐标为(0,1)，第一缩放参数为10，第二缩放参数为2。则终端能够采用两个位置坐标之间的欧式距离1表示镜头在目标时间分片中的位置变化参数，采用两个缩放参数之间的差值的绝对值8表示镜头在目标时间分片中的缩放变化参数。
步骤2022:终端获取时间-速度变化倍率曲线对应的积分值。
计算时间-速度变化倍率曲线关于时间的积分值。在一些实施例中，如图6所示，对于时间-速度变化倍率曲线中的时间为归一化处理后的情况，时间-速度变化倍率曲线对应的时间范围为[0,1]。用ΔV(t)表示时间-速度变化倍率曲线，则时间-速度变化倍率曲线对应的积分值根据 ∫_0^1 ΔV(t)dt 计算得到。
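对于没有解析表达式的时间-速度变化倍率曲线（如贝塞尔曲线），该积分值可以用数值方法近似，以下为一个梯形法示意（步数1000仅为示意性取值）：

```python
def speed_curve_integral(delta_v, steps=1000):
    """用梯形法近似计算 ∫_0^1 ΔV(t) dt，delta_v 为归一化后的
    时间-速度变化倍率曲线（任意可调用对象）。"""
    h = 1.0 / steps
    total = 0.5 * (delta_v(0.0) + delta_v(1.0))
    total += sum(delta_v(i * h) for i in range(1, steps))
    return total * h
```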
步骤2023:终端基于变化参数、初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线对应的积分值,确定镜头在目标时间分片中的目标运动速度。
在一些实施例中，终端能够根据公式8确定镜头在目标时间分片中的目标运动速度。
ΔM = [V_1 + (V_2 - V_1) × ∫_0^1 ΔV(t)dt] × Δt   (公式8)
其中，ΔM表示变化参数，可以是指平移变化参数，也可以是指缩放变化参数；V_1表示初始运动速度和目标运动速度中较小的运动速度，可以是指初始平移速度和目标平移速度中较小的平移速度，也可以是指初始缩放速度和目标缩放速度中较小的缩放速度；V_2表示初始运动速度和目标运动速度中较大的运动速度，可以是指初始平移速度和目标平移速度中较大的平移速度，也可以是指初始缩放速度和目标缩放速度中较大的缩放速度；Δt表示目标时间分片对应的时间间隔；∫_0^1 ΔV(t)dt表示时间-速度变化倍率曲线对应的积分值。
在一些实施例中,当初始运动速度与时间间隔的乘积不小于变化参数时,说明初始运动速度需要降速或保持不变,也就是说,初始运动速度不小于目标运动速度。此时,V 1表示目标运动速度,V 2表示初始运动速度。根据上述公式8计算出V 1,即可得到目标运动速度。
在一些实施例中,当初始运动速度与时间间隔的乘积小于变化参数时,说明初始运动速度需要升速,也就是说,初始运动速度小于目标运动速度。此时,V 1表示初始运动速度,V 2表示目标运动速度。根据上述公式8计算出V 2,即可得到目标运动速度。
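上述由公式8反解目标运动速度的过程可以如下示意（假设积分值满足0 &lt; integral &lt; 1；函数名为示意性假设）：

```python
def target_speed(delta_m, v_init, delta_t, integral):
    """按公式8 ΔM = [V_1 + (V_2 - V_1) × ∫ΔV(t)dt] × Δt 反解目标运动速度：
    当 v_init × Δt 不小于 ΔM 时为降速或保持不变，目标速度为较小的 V_1；
    否则为升速，目标速度为较大的 V_2。"""
    avg = delta_m / delta_t  # 目标时间分片内需要达到的平均速度
    if v_init * delta_t >= delta_m:
        return (avg - v_init * integral) / (1.0 - integral)  # 解出 V_1
    return (avg - v_init * (1.0 - integral)) / integral      # 解出 V_2
```

例如取integral = 0.5（平均速度恰为两速度的均值）时，初始速度4、变化参数3、时间间隔1秒对应目标速度2.0；初始速度2时对应目标速度4.0。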
在一些实施例中,在目标运动速度包括目标平移速度和目标缩放速度的情况下,终端确定镜头在目标时间分片中的目标运动速度的过程包括:终端基于平移变化参数、初始平移速度、目标时间分片对应的时间间隔和时间-平移速度变化倍率曲线对应的积分值,按照上述公式8,确定镜头在目标时间分片中的目标平移速度。终端基于缩放变化参数、初始缩放速度、目标时间分片对应的时间间隔和时间-缩放速度变化倍率曲线对应的积分值,按照上述公式8,确定镜头在目标时间分片中的目标缩放速度。需要说明的是,时间-平移速度变化倍率曲线和时间-缩放速度变化倍率曲线可以为同一条曲线,也可以为两条不同的曲线,本申请实施例对此不加以限定。
在一些实施例中,终端能够在确定镜头在目标时间分片中的目标运动速度的过程中,增加转向混合系数的影响,以降低因为改变方向(平移方向和缩放方向)带来的不适应感,提高镜头运动的稳定性。该过程可以包括以下步骤1至步骤4:
步骤1:终端基于第一运动方向和第二运动方向,获取目标时间分片对应的转向角度。
在一些实施例中,第一运动方向是指镜头在目标时间分片的起始时刻的运动方向,第一运动方向包括第一平移方向和第一缩放方向中的至少一个。第二运动方向是指镜头在目标时间分片的结束时刻的运动方向,第二运动方向包括第二平移方向和第二缩放方向中的至少一个。目标时间分片对应的转向角度包括平移转向角度和缩放转向角度中的至少一个。
在一些实施例中,当目标时间分片对应的转向角度包括平移转向角度时,终端获取目标时间分片对应的转向角度的过程包括:终端基于第二平移方向和第一平移方向的夹角,确定目标时间分片对应的平移转向角度。也就是说,终端将第二平移方向和第一平移方向的夹角对应的度数作为目标时间分片对应的平移转向角度。在一些实施例中,第一平移方向根据目标时间分片的前一个时间分片的起始时刻的位置参数和结束时刻的位置参数确定,第二平移方向根据第二目标参数中的第二位置参数和第一目标参数中的第一位置参数确定。
在一些实施例中,当目标时间分片对应的转向角度包括缩放转向角度时,终端获取目标时间分片对应的转向角度的过程包括:终端基于第二缩放方向和第一缩放方向的比对结果,确定目标时间分片对应的缩放转向角度。在一些实施例中,第一缩放方向根据目标时间分片的前一个时间分片的起始时刻的缩放参数和结束时刻的缩放参数确定,第二缩放方向根据第二目标参数中的第二缩放参数和第一目标参数中的第一缩放参数确定。
在一些实施例中,终端基于第二缩放方向和第一缩放方向的比对结果,确定目标时间分片对应的缩放转向角度的过程为:当第二缩放方向和第一缩放方向一致时,终端将第一角度作为目标时间分片对应的缩放转向角度;当第二缩放方向和第一缩放方向不一致时,终端将第二角度作为缩放转向角度。第一角度和第二角度可以根据经验设置,在一些实施例中,第一角度被设置为0度,第二角度被设置为180度。需要说明的是,缩放方向包括放大和缩小两种,第二缩放方向与第一缩放方向一致是指第二缩放方向和第一缩放方向均为放大,或者,第二缩放方向和第一缩放方向均为缩小。
步骤2:终端确定与转向角度对应的转向混合系数。
在一些实施例中,终端确定与转向角度对应的转向混合系数的方式包括但不限于以下两种:
方式一:终端基于角度-转向混合系数曲线,确定与转向角度对应的转向混合系数。
在一些实施例中,角度-转向混合系数曲线如图7中的701所示。在图7中,横坐标表示角度,纵坐标表示转向混合系数。在确定转向角度后,终端能够根据该角度-转向混合系数曲线701确定与转向角度对应的转向混合系数。随着角度从0度变化到180度,转向混合系数从1下降到0。
方式二:终端基于角度和转向混合系数的对应关系,确定与转向角度对应的转向混合系数。
在一些实施例中，角度和转向混合系数的对应关系由游戏的开发人员设置，本申请实施例对此不加以限定。在一些实施例中，角度和转向混合系数的对应关系用表格的形式表示。在确定转向角度后，终端能够在角度和转向混合系数的对应关系表中查询与转向角度对应的转向混合系数。
需要说明的是,对于转向角度包括平移转向角度和缩放转向角度的情况,终端需要根据上述方式一或者方式二,分别确定与平移转向角度对应的平移转向混合系数,以及与缩放转向角度对应的缩放转向混合系数。
步骤3:终端基于转向混合系数更新初始运动速度,得到更新后的初始运动速度。
在一些实施例中,终端基于转向混合系数更新初始运动速度的方式为:终端将更新前的初始运动速度与转向混合系数的乘积作为更新后的初始运动速度。
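步骤1至步骤3可以串联为如下示意实现；其中角度-转向混合系数曲线以从0度→1线性下降到180度→0代替（与图7的变化趋势一致，但具体形状为示意性假设）：

```python
def blended_initial_speed(v_init, turn_angle_deg):
    """由转向角度确定转向混合系数，并按步骤3用该系数更新初始运动速度：
    转向越接近180度（反向），更新后的初始运动速度越小。"""
    coeff = 1.0 - turn_angle_deg / 180.0  # 示意性的角度-转向混合系数曲线
    return v_init * coeff
```

例如，初始速度10在转向90度时更新为5.0，在完全反向（180度）时降为0.0，从而削弱反向运动带来的突兀感。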
步骤4:终端基于第一目标参数、更新后的初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线,确定镜头在目标时间分片中的目标运动速度。
该步骤4的实现过程参见步骤2021至步骤2023,此处不再赘述。
在一些实施例中,终端在得到目标运动速度后,能够基于矫正系数更新目标运动速度,得到更新后的目标运动速度。终端能够基于下述公式9实现上述过程:
V_2(n) = V_1(n) × R   (公式9)
其中，V_2(n)表示更新后的目标运动速度，V_1(n)表示更新前的目标运动速度，R表示矫正系数。矫正系数可以根据经验设置，也可以根据应用场景灵活调整，本申请实施例对此不加以限定。示例性地，将矫正系数设置在[0.9,1]的范围内，例如R=0.95。基于此范围内的矫正系数更新目标运动参数，一方面可以减少积分计算带来的偏差，另一方面还可以使得镜头倾向于延后满足第一目标参数。相比于提前满足第一目标参数，延后满足第一目标参数可以提升视觉效果，减少停顿感。
在一些实施例中,对于目标运动速度包括目标平移速度和目标缩放速度的情况,终端能够基于第一矫正系数更新目标平移速度,得到更新后的目标平移速度。终端基于第二矫正系数更新目标缩放速度,得到更新后的目标缩放速度。其中,第一矫正系数与第二矫正系数可以相同,也可以不同,本申请实施例对此不加以限定。
在步骤203中,终端基于时间-速度变化倍率曲线、初始运动速度和目标运动速度,控制镜头运动,以满足第一目标参数。
需要说明的是,在基于矫正系数对目标运动速度进行更新的情况下,此步骤中的目标运动速度是指更新后的目标运动速度。在基于转向混合系数对初始运动速度进行更新的情况下,此步骤中的初始运动速度是指更新后的初始运动速度。
在一些实施例中,终端基于时间-速度变化倍率曲线、初始运动速度和目标运动速度,控制镜头运动的过程包括步骤2031至步骤2034:
步骤2031:终端将控制镜头运动的过程划分为参考数量个子过程。
在一些实施例中，在目标时间分片对应的时间间隔内，终端能够根据第二参考时间间隔，将控制镜头运动的过程划分为参考数量个连续的子过程，从而通过分步控制镜头运动使镜头满足第一目标参数。参考数量可以根据目标时间分片对应的时间间隔和第二参考时间间隔确定。在一些实施例中，终端将目标时间分片对应的时间间隔和第二参考时间间隔的比值作为参考数量。
在一些实施例中，第二参考时间间隔根据目标时间分片对应的时间间隔确定，或者根据经验设置，本申请实施例对此不加以限定。在一些实施例中，当目标时间分片对应的时间间隔为1秒时，第二参考时间间隔被设置为0.02秒。此时，参考数量为50。终端根据此种划分方式得到的每个子过程对应的时间间隔均为0.02秒。
在一些实施例中,终端在将镜头运动的过程划分为参考数量个子过程后,还能够基于下述步骤2032和步骤2033获取每个子过程对应的子运动速度。
步骤2032:终端基于任一子过程对应的时间参数和时间-速度变化倍率曲线,确定任一子过程对应的速度变化倍率。
在一些实施例中,时间-速度变化倍率曲线中的时间为归一化处理后的时间,时间-速度变化倍率曲线对应的时间范围为[0,1]。在确定任一子过程对应的速度变化倍率之前,需要先获取任一子过程对应的时间参数。在一些实施例中,终端获取任一子过程对应的时间参数的过程包括以下两个步骤:
步骤1:终端将目标时间间隔和目标时间分片对应的时间间隔的比值作为目标比值。
其中,目标时间间隔是指目标时间分片中位于该任一子过程之前的各个子过程对应的时间间隔的总和。示例性地,假设目标时间分片对应的时间间隔为1秒,每个子过程对应的时间间隔为0.02秒,该任一子过程为目标时间分片中的第6个子过程。则目标时间间隔是指目标时间分片中位于该任一子过程之前的5个子过程对应的时间间隔的总和,目标时间间隔为0.1秒。在此种情况下,目标比值为0.1/1=0.1。
步骤2:当初始运动速度小于目标运动速度时,终端将目标比值作为该任一子过程对应的时间参数;当初始运动速度不小于目标运动速度时,将1和目标比值的差值作为该任一子过程对应的时间参数。
在基于上述步骤1和步骤2确定任一子过程对应的时间参数后,终端能够基于时间-速度变化倍率曲线,确定与该时间参数对应的速度变化倍率,将该速度变化倍率作为该任一子过程对应的速度变化倍率。
需要说明的是,速度变化倍率包括平移速度变化倍率和缩放速度变化倍率中的至少一个。在终端获取任一子过程对应的平移速度变化倍率的过程中,终端基于时间-平移速度变化倍率曲线,确定与该时间参数对应的平移速度变化倍率,将该平移速度变化倍率作为该任一子过程对应的平移速度变化倍率;在终端获取任一子过程对应的缩放速度变化倍率的过程中,终端基于时间-缩放速度变化倍率曲线,确定与该时间参数对应的缩放速度变化倍率,将该缩放速度变化倍率作为该任一子过程对应的缩放速度变化倍率。
步骤2033:终端基于初始运动速度、目标运动速度和任一子过程对应的速度变化倍率,获取任一子过程对应的子运动速度。
在一些实施例中,终端能够基于公式10获取任一子过程对应的子运动速度:
V_C = V_1 + (V_2 - V_1) × ΔV(T)   (公式10)
其中，V_C表示任一子过程对应的子运动速度，V_1表示初始运动速度和目标运动速度中较小的运动速度；V_2表示初始运动速度和目标运动速度中较大的运动速度；T表示步骤2032中确定的该任一子过程对应的时间参数；ΔV(T)表示任一子过程对应的速度变化倍率。
在一些实施例中，子运动速度包括子平移速度和子缩放速度中的至少一个。终端在基于上述公式10计算子平移速度的过程中，V_C表示任一子过程对应的子平移速度，V_1表示初始平移速度和目标平移速度中较小的平移速度；V_2表示初始平移速度和目标平移速度中较大的平移速度；ΔV(T)表示基于时间-平移速度变化曲线确定的该任一子过程对应的平移速度变化倍率。
在一些实施例中，终端在基于上述公式10计算子缩放速度的过程中，V_C表示任一子过程对应的子缩放速度，V_1表示初始缩放速度和目标缩放速度中较小的缩放速度；V_2表示初始缩放速度和目标缩放速度中较大的缩放速度；ΔV(T)表示基于时间-缩放速度变化曲线确定的该任一子过程对应的缩放速度变化倍率。
步骤2034:终端在任一子过程对应的时间间隔内,按照任一子过程对应的子运动速度控制镜头运动。
需要说明的是,在任一子过程对应的时间间隔内,镜头运动的速度保持该任一子过程对应的子运动速度不变。在一些实施例中,任一子过程对应的时间间隔为0.02秒,该任一子过程对应的子运动速度包括子平移速度和子缩放速度,其中,子平移速度为1米/秒,子缩放速度为0.5/秒。则在该任一子过程对应的0.02秒内,在控制镜头按照1米/秒的速度进行平移的同时,控制镜头按照0.5/秒的速度进行缩放。
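步骤2031至步骤2034的分步控制过程可以如下示意；其中时间-速度变化倍率曲线以恒等曲线ΔV(t)=t代替贝塞尔曲线，仅为示意性假设：

```python
def subprocess_speeds(v_init, v_target, interval, sub_interval, delta_v=lambda t: t):
    """将目标时间分片划分为 interval / sub_interval 个子过程，并按公式10
    V_C = V_1 + (V_2 - V_1) × ΔV(T) 计算每个子过程的子运动速度。
    时间参数 T 按步骤2032确定：升速时取目标比值，降速时取 1 - 目标比值。"""
    count = int(interval / sub_interval)       # 参考数量
    v1, v2 = min(v_init, v_target), max(v_init, v_target)
    speeds = []
    for k in range(count):
        ratio = (k * sub_interval) / interval  # 目标比值
        t = ratio if v_init < v_target else 1.0 - ratio
        speeds.append(v1 + (v2 - v1) * delta_v(t))
    return speeds
```

例如，初始速度0升速到目标速度10、时间间隔1秒、子过程间隔0.25秒时，4个子过程的子运动速度依次为0.0、2.5、5.0、7.5。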
在一些实施例中,终端在确定任一子过程对应的子运动速度之后,即可继续确定下一个子过程对应的子运动速度,以在按照该任一子过程对应的子运动速度控制镜头运动之后,继续按照下一个子过程对应的子运动速度控制镜头运动,以此类推,直至确定目标时间分片中的最后一个子过程对应的子运动速度,然后按照最后一个子过程对应的子运动速度控制镜头运动,使得镜头满足第一目标参数,完成目标时间分片中的控制镜头运动的过程。然后终端基于上述步骤201至步骤203,继续执行控制镜头在下一个时间分片中运动的操作。
在基于本申请实施例提供的方法控制镜头运动的整个过程中,镜头在相邻时间分片之间的运动连续性较好,镜头在相邻时间分片之间能够平缓过渡,达到平滑运动的效果,减少用户的不适应感。示例性地,假设每个时间分片的时间间隔为1秒,在3个相邻的时间分片中镜头运动的时间-运动速度曲线可以如图8中的801所示,图8中的曲线801在各个交接点处具有良好的连续性,说明镜头运动过程的平滑性较好。
综上所述,终端控制镜头运动的过程可以如图9中的901-904所示:
901、准备临时参数:在获取第一目标参数之前,准备供后续过程使用的临时参数,包括但不限于触发频率、各个目标帧的权重值、参数变化阈值、时间-速度变化倍率曲线、距离-缩放变化倍率曲线和角度-转向混合系数曲线等。
902、获取第一目标参数：终端获取各个目标帧对应的采样位置参数和采样缩放参数，其中，采样缩放参数根据距离-缩放变化倍率曲线得到。终端根据各个目标帧对应的采样参数和权重值，确定第一目标参数。终端根据参数变化阈值对第一目标参数进行有效性验证，以达到对第一目标参数进行过滤的效果。
903、终端确定目标运动速度:终端根据角度-转向混合系数曲线,确定转向混合系数。终端根据转向混合系数更新初始运动速度。终端根据第一目标参数、时间间隔、时间-速度变化倍率曲线和更新后的初始运动速度,确定目标运动速度。
904、终端控制镜头运动:终端根据目标运动速度、更新后的初始运动速度和时间-速度变化倍率曲线,确定各个子过程对应的子运动速度。在各个子过程对应的时间间隔中,终端按照各个子过程对应的子运动速度控制镜头运动,使镜头满足第一目标参数。
在本申请实施例中,终端基于多个目标帧的数据获取镜头在目标时间分片中需要满足的第一目标参数,第一目标参数的可靠性较高,有利于提高镜头运动的稳定性。此外,基于时间-速度变化倍率曲线控制镜头运动,有利于提高镜头在相邻时间分片之间的运动连续性,镜头运动过程的稳定性较高,控制镜头运动的效果较好。
参见图10,本申请实施例提供了一种控制镜头运动的装置,该装置包括:
获取模块1001,用于基于目标时间分片对应的多个目标帧的数据,获取镜头在目标时间分片中需要满足的第一目标参数,第一目标参数包括第一位置参数和第一缩放参数中的至少一个;
确定模块1002，用于基于第一目标参数、镜头在目标时间分片中的初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线，确定镜头在目标时间分片中的目标运动速度，目标运动速度包括目标平移速度和目标缩放速度中的至少一个；
控制模块1003,用于基于时间-速度变化倍率曲线、初始运动速度和目标运动速度,控制镜头运动。
在一些实施例中,获取模块1001,用于对目标时间分片对应的各个目标帧的数据进行采样处理,得到各个目标帧对应的采样参数;按照时间戳与目标时间分片的起始时间戳的距离,为各个目标帧设置权重值;基于各个目标帧对应的采样参数和各个目标帧对应的权重值,确定镜头在目标时间分片中需要满足的第一目标参数。
在一些实施例中,各个目标帧对应的采样参数包括采样位置参数,获取模块1001,还用于基于各个目标帧的数据,获取各个目标帧中的互动对象的中心位置参数;基于各个目标帧中的互动对象的中心位置参数和互动对象的集合位置参数,确定各个目标帧对应的采样位置参数。
在一些实施例中,各个目标帧对应的采样参数包括采样缩放参数,获取模块1001,还用于基于各个目标帧的数据,获取各个目标帧对应的距离参数;基于距离-缩放变化倍率曲线,确定与距离参数对应的缩放变化倍率;基于缩放变化倍率,确定各个目标帧对应的采样缩放参数。
在一些实施例中,确定模块1002,用于基于第一运动方向和第二运动方向,获取目标时间分片对应的转向角度;确定与转向角度对应的转向混合系数;基于转向混合系数更新初始运动速度,得到更新后的初始运动速度;基于第一目标参数、更新后的初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线,确定镜头在目标时间分片中的目标运动速度。
在一些实施例中,确定模块1002,用于基于镜头在目标时间分片中已满足的第二目标参数和第一目标参数,获取镜头在目标时间分片中的变化参数,第二目标参数包括第二位置参数和第二缩放参数中的至少一个;获取时间-速度变化倍率曲线对应的积分值;基于变化参数、初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线对应的积分值,确定镜头在目标时间分片中的目标运动速度。
在一些实施例中,控制模块1003,用于将控制镜头运动的过程划分为参考数量个子过程;基于任一子过程对应的时间参数和时间-速度变化倍率曲线,确定任一子过程对应的速度变化倍率;基于初始运动速度、目标运动速度和任一子过程对应的速度变化倍率,确定任一子过程对应的子运动速度;在任一子过程对应的时间间隔内,按照任一子过程对应的子运动速度控制镜头运动。
在一些实施例中,参见图11,该装置还包括:
验证模块1004,用于基于参数变化阈值,对第一目标参数进行有效性验证;
确定模块1002,还用于基于有效性验证通过的第一目标参数、镜头在目标时间分片中的初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线,确定镜头在目标时间分片中的目标运动速度。
在一些实施例中,参见图11,该装置还包括:
更新模块1005,用于基于矫正系数,更新目标运动速度,得到更新后的目标运动速度;
控制模块1003,还用于基于时间-速度变化倍率曲线、初始运动速度和更新后的目标运动速度,控制镜头运动,以满足第一目标参数。
在一些实施例中,时间-速度变化倍率曲线为贝塞尔曲线。
在本申请实施例中,基于多个目标帧的数据获取镜头在目标时间分片中需要满足的第一目标参数,第一目标参数的可靠性较高,有利于提高镜头运动的稳定性。此外,基于时间-速度变化倍率曲线控制镜头运动,有利于提高镜头在相邻时间分片之间的运动连续性,镜头运动过程的稳定性较高,控制镜头运动的效果较好。
需要说明的是,上述实施例提供的装置在实现其功能时,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将设备的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。另外,上述实施例提供的装置与方法实施例属于同一构思,其具体实现过程详见方法实施例,这里不再赘述。
本申请实施例提供的计算机设备既可以实现为终端,也可以实现为服务器,下面对终端的结构进行说明。
图12是本申请实施例提供的一种终端的结构示意图,该终端安装有游戏客户端。该终端可以是:智能手机、平板电脑、笔记本电脑或台式电脑。终端还可能被称为用户设备、便携式终端、膝上型终端、台式终端等其他名称。
通常,终端包括有:处理器1201和存储器1202。
处理器1201可以包括一个或多个处理核心,比如4核心处理器、8核心处理器等。处理器1201可以采用DSP(Digital Signal Processing,数字信号处理)、FPGA(Field-Programmable Gate Array,现场可编程门阵列)、PLA(Programmable Logic Array,可编程逻辑阵列)中的至少一种硬件形式来实现。处理器1201也可以包括主处理器和协处理器,主处理器是用于对在唤醒状态下的数据进行处理的处理器,也称CPU(Central Processing Unit,中央处理器);协处理器是用于对在待机状态下的数据进行处理的低功耗处理器。在一些实施例中,处理器1201可以集成有GPU(Graphics Processing Unit,图像处理器),GPU用于负责显示屏所需要显示的内容的渲染和绘制。一些实施例中,处理器1201还可以包括AI(Artificial Intelligence,人工智能)处理器,该AI处理器用于处理有关机器学习的计算操作。
存储器1202可以包括一个或多个计算机可读存储介质，该计算机可读存储介质可以是非暂态的。存储器1202还可包括高速随机存取存储器，以及非易失性存储器，比如一个或多个磁盘存储设备、闪存存储设备。在一些实施例中，存储器1202中的非暂态的计算机可读存储介质用于存储至少一个指令，该至少一个指令用于被处理器1201加载并执行，以实现下述步骤：
基于目标时间分片对应的多个目标帧的数据,获取镜头在目标时间分片中需要满足的第一目标参数,第一目标参数包括第一位置参数和第一缩放参数中的至少一个。
基于第一目标参数、镜头在目标时间分片中的初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线,确定镜头在目标时间分片中的目标运动速度,目标运动速度包括目标平移速度和目标缩放速度中的至少一个。
基于时间-速度变化倍率曲线、初始运动速度和目标运动速度,控制镜头运动。
在一些实施例中,处理器用于执行以下步骤:
对目标时间分片对应的各个目标帧的数据进行采样处理,得到各个目标帧对应的采样参数。
按照时间戳与目标时间分片的起始时间戳的距离,为各个目标帧设置权重值。
基于各个目标帧对应的采样参数和各个目标帧对应的权重值,确定镜头在目标时间分片中需要满足的第一目标参数。
在一些实施例中,各个目标帧对应的采样参数包括采样位置参数,处理器用于执行以下步骤:
基于各个目标帧的数据,获取各个目标帧中的互动对象的中心位置参数。
基于各个目标帧中的互动对象的中心位置参数和互动对象的集合位置参数，确定各个目标帧对应的采样位置参数。
在一些实施例中,各个目标帧对应的采样参数包括采样缩放参数,处理器用于执行以下步骤:
基于各个目标帧的数据,获取各个目标帧对应的距离参数。
基于距离-缩放变化倍率曲线,确定与距离参数对应的缩放变化倍率。
基于缩放变化倍率,确定各个目标帧对应的采样缩放参数。
在一些实施例中,处理器用于执行以下步骤:
基于第一运动方向和第二运动方向,获取目标时间分片对应的转向角度。
确定与转向角度对应的转向混合系数。
基于转向混合系数更新初始运动速度,得到更新后的初始运动速度。
基于第一目标参数、更新后的初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线,确定镜头在目标时间分片中的目标运动速度。
在一些实施例中,处理器用于执行以下步骤:
基于镜头在目标时间分片中已满足的第二目标参数和第一目标参数,获取镜头在目标时间分片中的变化参数,第二目标参数包括第二位置参数和第二缩放参数中的至少一个。
获取时间-速度变化倍率曲线对应的积分值。
基于变化参数、初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线对应的积分值,确定镜头在目标时间分片中的目标运动速度。
在一些实施例中,处理器用于执行以下步骤:
将控制镜头运动的过程划分为参考数量个子过程。
基于任一子过程对应的时间参数和时间-速度变化倍率曲线,确定任一子过程对应的速度变化倍率。
基于初始运动速度、目标运动速度和任一子过程对应的速度变化倍率,确定任一子过程对应的子运动速度。
在任一子过程对应的时间间隔内,按照任一子过程对应的子运动速度控制镜头运动。
在一些实施例中,处理器还用于执行以下步骤:
基于参数变化阈值,对第一目标参数进行有效性验证。
基于第一目标参数、镜头在目标时间分片中的初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线,确定镜头在目标时间分片中的目标运动速度,包括:
基于有效性验证通过的第一目标参数、镜头在目标时间分片中的初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线,确定镜头在目标时间分片中的目标运动速度。
在一些实施例中,处理器还用于执行以下步骤:
基于矫正系数,更新目标运动速度,得到更新后的目标运动速度。
基于时间-速度变化倍率曲线、初始运动速度和目标运动速度，控制镜头运动，以满足第一目标参数，包括：
基于时间-速度变化倍率曲线、初始运动速度和更新后的目标运动速度,控制镜头运动,以满足第一目标参数。
在一些实施例中,时间-速度变化倍率曲线为贝塞尔曲线。
在一些实施例中,终端还可选包括有:外围设备接口1203和至少一个外围设备。处理器1201、存储器1202和外围设备接口1203之间可以通过总线或信号线相连。各个外围设备可以通过总线、信号线或电路板与外围设备接口1203相连。具体地,外围设备包括:射频电路1204、触摸显示屏1205、摄像头组件1206、音频电路1207、定位组件1208和电源1209中的至少一种。
外围设备接口1203可被用于将I/O(Input/Output,输入/输出)相关的至少一个外围设备连接到处理器1201和存储器1202。在一些实施例中,处理器1201、存储器1202和外围设备接口1203被集成在同一芯片或电路板上;在一些其他实施例中,处理器1201、存储器1202和外围设备接口1203中的任意一个或两个可以在单独的芯片或电路板上实现,本实施例对此不加以限定。
射频电路1204用于接收和发射RF(Radio Frequency,射频)信号,也称电磁信号。射频电路1204通过电磁信号与通信网络以及其他通信设备进行通信。射频电路1204将电信号转换为电磁信号进行发送,或者,将接收到的电磁信号转换为电信号。可选地,射频电路1204包括:天线系统、RF收发器、一个或多个放大器、调谐器、振荡器、数字信号处理器、编解码芯片组、用户身份模块卡等等。射频电路1204可以通过至少一种无线通信协议来与其它终端进行通信。该无线通信协议包括但不限于:城域网、各代移动通信网络(2G、3G、4G及5G)、无线局域网和/或WiFi(Wireless Fidelity,无线保真)网络。在一些实施例中,射频电路1204还可以包括NFC(Near Field Communication,近距离无线通信)有关的电路,本申请对此不加以限定。
显示屏1205用于显示UI(User Interface,用户界面)。该UI可以包括图形、文本、图标、视频及其它们的任意组合。当显示屏1205是触摸显示屏时,显示屏1205还具有采集在显示屏1205的表面或表面上方的触摸信号的能力。该触摸信号可以作为控制信号输入至处理器1201进行处理。此时,显示屏1205还可以用于提供虚拟按钮和/或虚拟键盘,也称软按钮和/或软键盘。在一些实施例中,显示屏1205可以为一个,设置在终端的前面板;在另一些实施例中,显示屏1205可以为至少两个,分别设置在终端的不同表面或呈折叠设计;在再一些实施例中,显示屏1205可以是柔性显示屏,设置在终端的弯曲表面上或折叠面上。甚至,显示屏1205还可以设置成非矩形的不规则图形,也即异形屏。显示屏1205可以采用LCD(Liquid Crystal Display,液晶显示屏)、OLED(Organic Light-Emitting Diode,有机发光二极管)等材质制备。
摄像头组件1206用于采集图像或视频。可选地，摄像头组件1206包括前置摄像头和后置摄像头。通常，前置摄像头设置在终端的前面板，后置摄像头设置在终端的背面。在一些实施例中，后置摄像头为至少两个，分别为主摄像头、景深摄像头、广角摄像头、长焦摄像头中的任意一种，以实现主摄像头和景深摄像头融合实现背景虚化功能、主摄像头和广角摄像头融合实现全景拍摄以及VR(Virtual Reality,虚拟现实)拍摄功能或者其它融合拍摄功能。在一些实施例中，摄像头组件1206还可以包括闪光灯。闪光灯可以是单色温闪光灯，也可以是双色温闪光灯。双色温闪光灯是指暖光闪光灯和冷光闪光灯的组合，可以用于不同色温下的光线补偿。
音频电路1207可以包括麦克风和扬声器。麦克风用于采集用户及环境的声波,并将声波转换为电信号输入至处理器1201进行处理,或者输入至射频电路1204以实现语音通信。出于立体声采集或降噪的目的,麦克风可以为多个,分别设置在终端的不同部位。麦克风还可以是阵列麦克风或全向采集型麦克风。扬声器则用于将来自处理器1201或射频电路1204的电信号转换为声波。扬声器可以是传统的薄膜扬声器,也可以是压电陶瓷扬声器。当扬声器是压电陶瓷扬声器时,不仅可以将电信号转换为人类可听见的声波,也可以将电信号转换为人类听不见的声波以进行测距等用途。在一些实施例中,音频电路1207还可以包括耳机插孔。
定位组件1208用于定位终端的当前地理位置,以实现导航或LBS(Location Based Service,基于位置的服务)。定位组件1208可以是基于美国的GPS(Global Positioning System,全球定位系统)、中国的北斗系统、俄罗斯的格雷纳斯系统或欧盟的伽利略系统的定位组件。
电源1209用于为终端中的各个组件进行供电。电源1209可以是交流电、直流电、一次性电池或可充电电池。当电源1209包括可充电电池时,该可充电电池可以支持有线充电或无线充电。该可充电电池还可以用于支持快充技术。
在一些实施例中,终端还包括有一个或多个传感器1210。该一个或多个传感器1210包括但不限于:加速度传感器1211、陀螺仪传感器1212、压力传感器1213、指纹传感器1214、光学传感器1215以及接近传感器1216。
加速度传感器1211可以检测以终端建立的坐标系的三个坐标轴上的加速度大小。比如,加速度传感器1211可以用于检测重力加速度在三个坐标轴上的分量。处理器1201可以根据加速度传感器1211采集的重力加速度信号,控制触摸显示屏1205以横向视图或纵向视图进行用户界面的显示。加速度传感器1211还可以用于游戏或者用户的运动数据的采集。
陀螺仪传感器1212可以检测终端的机体方向及转动角度,陀螺仪传感器1212可以与加速度传感器1211协同采集用户对终端的3D动作。处理器1201根据陀螺仪传感器1212采集的数据,可以实现如下功能:动作感应(比如根据用户的倾斜操作来改变UI)、拍摄时的图像稳定、游戏控制以及惯性导航。
压力传感器1213可以设置在终端的侧边框和/或触摸显示屏1205的下层。当压力传感器1213设置在终端的侧边框时，可以检测用户对终端的握持信号，由处理器1201根据压力传感器1213采集的握持信号进行左右手识别或快捷操作。当压力传感器1213设置在触摸显示屏1205的下层时，由处理器1201根据用户对触摸显示屏1205的压力操作，实现对UI界面上的可操作性控件进行控制。可操作性控件包括按钮控件、滚动条控件、图标控件、菜单控件中的至少一种。
指纹传感器1214用于采集用户的指纹,由处理器1201根据指纹传感器1214采集到的指纹识别用户的身份,或者,由指纹传感器1214根据采集到的指纹识别用户的身份。在识别出用户的身份为可信身份时,由处理器1201授权该用户执行相关的敏感操作,该敏感操作包括解锁屏幕、查看加密信息、下载软件、支付及更改设置等。指纹传感器1214可以被设置在终端的正面、背面或侧面。当终端上设置有物理按键或厂商Logo时,指纹传感器1214可以与物理按键或厂商Logo集成在一起。
光学传感器1215用于采集环境光强度。在一个实施例中,处理器1201可以根据光学传感器1215采集的环境光强度,控制触摸显示屏1205的显示亮度。具体地,当环境光强度较高时,调高触摸显示屏1205的显示亮度;当环境光强度较低时,调低触摸显示屏1205的显示亮度。在另一个实施例中,处理器1201还可以根据光学传感器1215采集的环境光强度,动态调整摄像头组件1206的拍摄参数。
接近传感器1216,也称距离传感器,通常设置在终端的前面板。接近传感器1216用于采集用户与终端的正面之间的距离。在一个实施例中,当接近传感器1216检测到用户与终端的正面之间的距离逐渐变小时,由处理器1201控制触摸显示屏1205从亮屏状态切换为息屏状态;当接近传感器1216检测到用户与终端的正面之间的距离逐渐变大时,由处理器1201控制触摸显示屏1205从息屏状态切换为亮屏状态。
本领域技术人员可以理解,图12中示出的结构并不构成对终端的限定,可以包括比图示更多或更少的组件,或者组合某些组件,或者采用不同的组件布置。
在示例性实施例中,还提供了一种服务器,参见图13,该服务器包括处理器1301和存储器1302,该存储器1302中存储有至少一条程序代码。该至少一条程序代码由一个或者一个以上处理器1301加载并执行,以实现上述任一种控制镜头运动的方法。
在示例性实施例中,还提供了一种计算机可读存储介质,该计算机可读存储介质中存储有至少一条程序代码,该至少一条程序代码由计算机设备的处理器加载并执行以下步骤:
基于目标时间分片对应的多个目标帧的数据,获取镜头在目标时间分片中需要满足的第一目标参数,第一目标参数包括第一位置参数和第一缩放参数中的至少一个。
基于第一目标参数、镜头在目标时间分片中的初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线,确定镜头在目标时间分片中的目标运动速度,目标运动速度包括目标平移速度和目标缩放速度中的至少一个。
基于时间-速度变化倍率曲线、初始运动速度和目标运动速度,控制镜头运动。
在一些实施例中,处理器用于执行以下步骤:
对目标时间分片对应的各个目标帧的数据进行采样处理,得到各个目标帧对应的采样参数。
按照时间戳与目标时间分片的起始时间戳的距离,为各个目标帧设置权重值。
基于各个目标帧对应的采样参数和各个目标帧对应的权重值,确定镜头在目标时间分片中需要满足的第一目标参数。
在一些实施例中,各个目标帧对应的采样参数包括采样位置参数,处理器用于执行以下步骤:
基于各个目标帧的数据,获取各个目标帧中的互动对象的中心位置参数。
基于各个目标帧中的互动对象的中心位置参数和互动对象的集合位置参数,确定各个目标帧对应的采样位置参数。
在一些实施例中,各个目标帧对应的采样参数包括采样缩放参数,处理器用于执行以下步骤:
基于各个目标帧的数据,获取各个目标帧对应的距离参数。
基于距离-缩放变化倍率曲线,确定与距离参数对应的缩放变化倍率。
基于缩放变化倍率,确定各个目标帧对应的采样缩放参数。
在一些实施例中,处理器用于执行以下步骤:
基于第一运动方向和第二运动方向,获取目标时间分片对应的转向角度。
确定与转向角度对应的转向混合系数。
基于转向混合系数更新初始运动速度,得到更新后的初始运动速度。
基于第一目标参数、更新后的初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线,确定镜头在目标时间分片中的目标运动速度。
在一些实施例中,处理器用于执行以下步骤:
基于镜头在目标时间分片中已满足的第二目标参数和第一目标参数,获取镜头在目标时间分片中的变化参数,第二目标参数包括第二位置参数和第二缩放参数中的至少一个。
获取时间-速度变化倍率曲线对应的积分值。
基于变化参数、初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线对应的积分值,确定镜头在目标时间分片中的目标运动速度。
在一些实施例中,处理器用于执行以下步骤:
将控制镜头运动的过程划分为参考数量个子过程。
基于任一子过程对应的时间参数和时间-速度变化倍率曲线,确定任一子过程对应的速度变化倍率。
基于初始运动速度、目标运动速度和任一子过程对应的速度变化倍率,确定任一子过程对应的子运动速度。
在任一子过程对应的时间间隔内,按照任一子过程对应的子运动速度控制镜头运动。
在一些实施例中,处理器还用于执行以下步骤:
基于参数变化阈值,对第一目标参数进行有效性验证。
基于第一目标参数、镜头在目标时间分片中的初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线，确定镜头在目标时间分片中的目标运动速度，包括：
基于有效性验证通过的第一目标参数、镜头在目标时间分片中的初始运动速度、目标时间分片对应的时间间隔和时间-速度变化倍率曲线,确定镜头在目标时间分片中的目标运动速度。
在一些实施例中,处理器还用于执行以下步骤:
基于矫正系数,更新目标运动速度,得到更新后的目标运动速度。
基于时间-速度变化倍率曲线、初始运动速度和目标运动速度,控制镜头运动,以满足第一目标参数,包括:
基于时间-速度变化倍率曲线、初始运动速度和更新后的目标运动速度,控制镜头运动,以满足第一目标参数。
在一些实施例中,时间-速度变化倍率曲线为贝塞尔曲线。
在一些实施例中,上述计算机可读存储介质可以是只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、只读光盘(Compact Disc Read-Only Memory,CD-ROM)、磁带、软盘和光数据存储设备等。
应当理解的是,在本文中提及的“多个”是指两个或两个以上。“和/或”,描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。字符“/”一般表示前后关联对象是一种“或”的关系。
以上所述仅为本申请的示例性实施例,并不用以限制本申请,凡在本申请的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本申请的保护范围之内。

Claims (15)

  1. 一种控制镜头运动的方法,应用于计算机设备,所述方法包括:
    基于目标时间分片对应的多个目标帧的数据,获取镜头在所述目标时间分片中需要满足的第一目标参数,所述第一目标参数包括第一位置参数和第一缩放参数中的至少一个;
    基于所述第一目标参数、所述镜头在所述目标时间分片中的初始运动速度、所述目标时间分片对应的时间间隔和时间-速度变化倍率曲线,确定所述镜头在所述目标时间分片中的目标运动速度,所述目标运动速度包括目标平移速度和目标缩放速度中的至少一个;
    基于所述时间-速度变化倍率曲线、所述初始运动速度和所述目标运动速度,控制所述镜头运动。
  2. 根据权利要求1所述的方法,其中,所述基于目标时间分片对应的多个目标帧的数据,获取镜头在所述目标时间分片中需要满足的第一目标参数,包括:
    对所述目标时间分片对应的各个目标帧的数据进行采样处理,得到所述各个目标帧对应的采样参数;
    按照时间戳与所述目标时间分片的起始时间戳的距离,为所述各个目标帧设置权重值;
    基于所述各个目标帧对应的采样参数和所述各个目标帧对应的权重值,确定所述镜头在所述目标时间分片中需要满足的第一目标参数。
  3. 根据权利要求2所述的方法,其中,所述各个目标帧对应的采样参数包括采样位置参数,所述对所述目标时间分片对应的各个目标帧的数据进行采样处理,得到所述各个目标帧对应的采样参数,包括:
    基于所述各个目标帧的数据,获取所述各个目标帧中的互动对象的中心位置参数;
    基于所述各个目标帧中的互动对象的中心位置参数和所述互动对象的集合位置参数,确定所述各个目标帧对应的采样位置参数。
  4. 根据权利要求2所述的方法,其中,所述各个目标帧对应的采样参数包括采样缩放参数,所述对所述目标时间分片对应的各个目标帧的数据进行采样处理,得到所述各个目标帧对应的采样参数,包括:
    基于所述各个目标帧的数据,获取所述各个目标帧对应的距离参数;
    基于距离-缩放变化倍率曲线,确定与所述距离参数对应的缩放变化倍率;
    基于所述缩放变化倍率,确定所述各个目标帧对应的采样缩放参数。
  5. 根据权利要求1所述的方法，其中，所述基于所述第一目标参数、所述镜头在所述目标时间分片中的初始运动速度、所述目标时间分片对应的时间间隔和时间-速度变化倍率曲线，确定所述镜头在所述目标时间分片中的目标运动速度，包括：
    基于第一运动方向和第二运动方向,获取所述目标时间分片对应的转向角度;
    确定与所述转向角度对应的转向混合系数;
    基于所述转向混合系数更新所述初始运动速度,得到更新后的初始运动速度;
    基于所述第一目标参数、所述更新后的初始运动速度、所述目标时间分片对应的时间间隔和所述时间-速度变化倍率曲线,确定所述镜头在所述目标时间分片中的目标运动速度。
  6. 根据权利要求1所述的方法,其中,所述基于所述第一目标参数、所述镜头在所述目标时间分片中的初始运动速度、所述目标时间分片对应的时间间隔和时间-速度变化倍率曲线,确定所述镜头在所述目标时间分片中的目标运动速度,包括:
    基于所述镜头在所述目标时间分片中已满足的第二目标参数和所述第一目标参数,获取所述镜头在所述目标时间分片中的变化参数,所述第二目标参数包括第二位置参数和第二缩放参数中的至少一个;
    获取所述时间-速度变化倍率曲线对应的积分值;
    基于所述变化参数、所述初始运动速度、所述目标时间分片对应的时间间隔和所述时间-速度变化倍率曲线对应的积分值,确定所述镜头在所述目标时间分片中的目标运动速度。
  7. 根据权利要求1所述的方法,其中,所述基于所述时间-速度变化倍率曲线、所述初始运动速度和所述目标运动速度,控制所述镜头运动,以满足所述第一目标参数,包括:
    将控制所述镜头运动的过程划分为参考数量个子过程;
    基于任一子过程对应的时间参数和所述时间-速度变化倍率曲线,确定所述任一子过程对应的速度变化倍率;
    基于所述初始运动速度、所述目标运动速度和所述任一子过程对应的速度变化倍率,确定所述任一子过程对应的子运动速度;
    在所述任一子过程对应的时间间隔内,按照所述任一子过程对应的子运动速度控制所述镜头运动。
  8. 根据权利要求1所述的方法,其中,所述获取镜头在所述目标时间分片中需要满足的第一目标参数之后,所述方法还包括:
    基于参数变化阈值,对所述第一目标参数进行有效性验证;
    所述基于所述第一目标参数、所述镜头在所述目标时间分片中的初始运动速度、所述目标时间分片对应的时间间隔和时间-速度变化倍率曲线,确定所述镜头在所述目标时间分片中的目标运动速度,包括:
    基于有效性验证通过的第一目标参数、所述镜头在所述目标时间分片中的初始运动速度、所述目标时间分片对应的时间间隔和时间-速度变化倍率曲线，确定所述镜头在所述目标时间分片中的目标运动速度。
  9. 根据权利要求1所述的方法,其中,所述确定所述镜头在所述目标时间分片中的目标运动速度之后,所述方法还包括:
    基于矫正系数,更新所述目标运动速度,得到更新后的目标运动速度;
    所述基于所述时间-速度变化倍率曲线、所述初始运动速度和所述目标运动速度,控制所述镜头运动,以满足所述第一目标参数,包括:
    基于所述时间-速度变化倍率曲线、所述初始运动速度和所述更新后的目标运动速度,控制所述镜头运动,以满足所述第一目标参数。
  10. 根据权利要求1-9任一所述的方法,其中,所述时间-速度变化倍率曲线为贝塞尔曲线。
  11. 一种控制镜头运动的装置,所述装置包括:
    获取模块,用于基于目标时间分片对应的多个目标帧的数据,获取镜头在所述目标时间分片中需要满足的第一目标参数,所述第一目标参数包括第一位置参数和第一缩放参数中的至少一个;
    确定模块,用于基于所述第一目标参数、所述镜头在所述目标时间分片中的初始运动速度、所述目标时间分片对应的时间间隔和时间-速度变化倍率曲线,确定所述镜头在所述目标时间分片中的目标运动速度,所述目标运动速度包括目标平移速度和目标缩放速度中的至少一个;
    控制模块,用于基于所述时间-速度变化倍率曲线、所述初始运动速度和所述目标运动速度,控制所述镜头运动。
  12. 根据权利要求11所述的装置,其中,所述获取模块,用于对所述目标时间分片对应的各个目标帧的数据进行采样处理,得到所述各个目标帧对应的采样参数;按照时间戳与所述目标时间分片的起始时间戳的距离,为所述各个目标帧设置权重值;基于所述各个目标帧对应的采样参数和所述各个目标帧对应的权重值,确定所述镜头在所述目标时间分片中需要满足的第一目标参数。
  13. 根据权利要求12所述的装置,其中,各个目标帧对应的采样参数包括采样位置参数,所述获取模块,还用于基于各个目标帧的数据,获取所述各个目标帧中的互动对象的中心位置参数;基于所述各个目标帧中的互动对象的中心位置参数和所述互动对象的集合位置参数,确定所述各个目标帧对应的采样位置参数。
  14. 一种计算机设备,所述计算机设备包括处理器和存储器,所述存储器中存储有至少一条程序代码,所述至少一条程序代码由所述处理器加载并执行,以实现如权利要求1至10任一所述的控制镜头运动的方法。
  15. 一种计算机可读存储介质,所述计算机可读存储介质中存储有至少一条程序代码,所述至少一条程序代码由处理器加载并执行,以实现如权利要求1至10任一所述的控制镜头运动的方法。
PCT/CN2020/126463 2020-01-17 2020-11-04 控制镜头运动的方法、装置、设备及存储介质 WO2021143296A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP20914246.2A EP4000700A4 (en) 2020-01-17 2020-11-04 CAMERA RECORDING MOTION CONTROL METHOD, DEVICE, DEVICE AND STORAGE MEDIA
JP2022519447A JP7487293B2 (ja) 2020-01-17 2020-11-04 仮想カメラの動きを制御する方法及び装置並びコンピュータ装置及びプログラム
US17/675,293 US11962897B2 (en) 2020-01-17 2022-02-18 Camera movement control method and apparatus, device, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010055703.XA CN111246095B (zh) 2020-01-17 2020-01-17 控制镜头运动的方法、装置、设备及存储介质
CN202010055703.X 2020-01-17

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/675,293 Continuation US11962897B2 (en) 2020-01-17 2022-02-18 Camera movement control method and apparatus, device, and storage medium

Publications (1)

Publication Number Publication Date
WO2021143296A1 true WO2021143296A1 (zh) 2021-07-22

Family

ID=70872728

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/126463 WO2021143296A1 (zh) 2020-01-17 2020-11-04 控制镜头运动的方法、装置、设备及存储介质

Country Status (5)

Country Link
US (1) US11962897B2 (zh)
EP (1) EP4000700A4 (zh)
JP (1) JP7487293B2 (zh)
CN (1) CN111246095B (zh)
WO (1) WO2021143296A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114371806A (zh) * 2022-03-22 2022-04-19 广州三七极创网络科技有限公司 虚拟相机镜头参数处理、更新方法、装置、设备及介质

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111246095B (zh) * 2020-01-17 2021-04-27 腾讯科技(深圳)有限公司 控制镜头运动的方法、装置、设备及存储介质
CN113962138B (zh) * 2020-07-21 2023-11-03 腾讯科技(深圳)有限公司 移动平台的参数值确定方法、装置、设备及存储介质
CN112399086B (zh) * 2020-12-08 2022-04-29 浙江大华技术股份有限公司 一种运动控制方法、装置、存储介质及电子装置
CN114140329B (zh) * 2021-12-13 2023-03-28 广东欧谱曼迪科技有限公司 一种内窥镜图像缩放方法、系统及执行装置
CN114676389B (zh) * 2022-04-02 2023-06-27 深圳市大族机器人有限公司 电机控制方法、装置、计算机设备和存储介质

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140243082A1 (en) * 2012-04-26 2014-08-28 Riot Games, Inc. Systems and methods that enable a spectator's experience for online active games
CN104793643A (zh) * 2015-04-13 2015-07-22 北京迪生动画科技有限公司 一种三维定格动画拍摄系统及控制方法
CN106582012A (zh) * 2016-12-07 2017-04-26 腾讯科技(深圳)有限公司 一种vr场景下的攀爬操作处理方法和装置
CN106730852A (zh) * 2016-12-20 2017-05-31 网易(杭州)网络有限公司 游戏系统中虚拟镜头的控制方法及控制装置
CN106780674A (zh) * 2016-11-28 2017-05-31 网易(杭州)网络有限公司 镜头移动方法和装置
CN109550246A (zh) * 2017-09-25 2019-04-02 腾讯科技(深圳)有限公司 游戏客户端的控制方法、装置、存储介质和电子装置
CN110170167A (zh) * 2019-05-28 2019-08-27 上海米哈游网络科技股份有限公司 一种画面显示方法、装置、设备及介质
CN111246095A (zh) * 2020-01-17 2020-06-05 腾讯科技(深圳)有限公司 控制镜头运动的方法、装置、设备及存储介质

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3428562B2 (ja) * 2000-04-25 2003-07-22 株式会社スクウェア オブジェクトの動きを処理する方法および記録媒体、並びに、ゲーム装置
JP3696216B2 (ja) * 2003-03-05 2005-09-14 株式会社スクウェア・エニックス 3次元ビデオゲーム装置、3次元ビデオゲームにおける仮想カメラの制御方法、並びにプログラム及び記録媒体
CN100487636C (zh) * 2006-06-09 2009-05-13 中国科学院自动化研究所 基于立体视觉的游戏控制系统及方法
JP4489800B2 (ja) * 2007-08-30 2010-06-23 株式会社スクウェア・エニックス 画像生成装置及び方法、並びにプログラム及び記録媒体
US8576235B1 (en) * 2009-07-13 2013-11-05 Disney Enterprises, Inc. Visibility transition planning for dynamic camera control
JP5143883B2 (ja) * 2010-11-12 2013-02-13 株式会社コナミデジタルエンタテインメント 画像処理装置、画像処理プログラム、及び画像処理方法
US8401242B2 (en) * 2011-01-31 2013-03-19 Microsoft Corporation Real-time camera tracking using depth maps
CN102750720A (zh) * 2011-04-20 2012-10-24 鸿富锦精密工业(深圳)有限公司 三维效果模拟系统及方法
CN102915553B (zh) * 2012-09-12 2016-02-10 珠海金山网络游戏科技有限公司 一种3d游戏视频拍摄系统及其方法
AU2013270593B2 (en) * 2013-11-29 2019-01-24 Motorola Solutions, Inc. Camera system control for correcting bore-sight offset
US9616339B2 (en) * 2014-04-24 2017-04-11 Microsoft Technology Licensing, Llc Artist-directed volumetric dynamic virtual cameras
JP6598522B2 (ja) * 2015-06-12 2019-10-30 任天堂株式会社 情報処理装置、情報処理システム、情報処理方法、及び情報処理プログラム
US10154228B1 (en) * 2015-12-23 2018-12-11 Amazon Technologies, Inc. Smoothing video panning
CN105597311B (zh) * 2015-12-25 2019-07-12 网易(杭州)网络有限公司 3d游戏中的相机控制方法和装置
CN106502395A (zh) * 2016-10-18 2017-03-15 深圳市火花幻境虚拟现实技术有限公司 一种在虚拟现实应用中避免用户眩晕的方法及装置
CN106780874B (zh) 2016-12-05 2019-01-18 上海晨晓电子科技有限公司 一种智能门禁系统
JP2017142783A (ja) * 2017-01-04 2017-08-17 株式会社コロプラ 仮想空間における視界領域調整方法、およびプログラム
CN107422848B (zh) * 2017-06-16 2020-04-21 福建天晴数码有限公司 一种测试虚拟角色加速度值的方法及系统
CN109621415A (zh) * 2018-12-26 2019-04-16 网易(杭州)网络有限公司 3d游戏中的显示控制方法及装置、计算机存储介质


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4000700A4


Also Published As

Publication number Publication date
EP4000700A1 (en) 2022-05-25
US20220174206A1 (en) 2022-06-02
EP4000700A4 (en) 2022-11-16
JP7487293B2 (ja) 2024-05-20
CN111246095A (zh) 2020-06-05
US11962897B2 (en) 2024-04-16
JP2022550126A (ja) 2022-11-30
CN111246095B (zh) 2021-04-27

Similar Documents

Publication Publication Date Title
WO2021143296A1 (zh) 控制镜头运动的方法、装置、设备及存储介质
US11288807B2 (en) Method, electronic device and storage medium for segmenting image
WO2021008456A1 (zh) 图像处理方法、装置、电子设备及存储介质
US11517099B2 (en) Method for processing images, electronic device, and storage medium
WO2020221012A1 (zh) 图像特征点的运动信息确定方法、任务执行方法和设备
CN107982918B (zh) 游戏对局结果的展示方法、装置及终端
CN112533017B (zh) 直播方法、装置、终端及存储介质
WO2020249025A1 (zh) 身份信息的确定方法、装置及存储介质
WO2022134632A1 (zh) 作品处理方法及装置
WO2021043121A1 (zh) 一种图像换脸的方法、装置、系统、设备和存储介质
CN112738607B (zh) 播放方法、装置、设备及存储介质
CN110139143B (zh) 虚拟物品显示方法、装置、计算机设备以及存储介质
WO2020108041A1 (zh) 耳部关键点检测方法、装置及存储介质
CN111385525B (zh) 视频监控方法、装置、终端及系统
CN111818358A (zh) 音频文件的播放方法、装置、终端及存储介质
CN110796083A (zh) 图像显示方法、装置、终端及存储介质
CN112269559A (zh) 音量调整方法、装置、电子设备及存储介质
CN110152309B (zh) 语音通信方法、装置、电子设备及存储介质
CN112616082A (zh) 视频预览方法、装置、终端及存储介质
CN112257594A (zh) 多媒体数据的显示方法、装置、计算机设备及存储介质
CN110134902B (zh) 资料信息生成方法、装置及存储介质
WO2021218926A1 (zh) 图像显示方法、装置和计算机设备
WO2021258608A1 (zh) 带宽确定方法、装置、终端及存储介质
CN110660031B (zh) 图像锐化方法及装置、存储介质
CN110942426A (zh) 图像处理的方法、装置、计算机设备和存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20914246

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020914246

Country of ref document: EP

Effective date: 20220215

ENP Entry into the national phase

Ref document number: 2022519447

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE