CN116758190A - Control method, system, equipment and medium for virtual camera - Google Patents

Control method, system, equipment and medium for virtual camera

Info

Publication number
CN116758190A
CN116758190A (application CN202310806478.2A)
Authority
CN
China
Prior art keywords
virtual camera
target
determining
information
target displacement
Prior art date
Legal status: Pending
Application number
CN202310806478.2A
Other languages
Chinese (zh)
Inventor
蔡京珂
高巧展
张琳晗
Current Assignee
Beijing IQIYI Science and Technology Co Ltd
Original Assignee
Beijing IQIYI Science and Technology Co Ltd
Priority date: 2023-06-30
Filing date: 2023-06-30
Publication date: 2023-09-15
Application filed by Beijing IQIYI Science and Technology Co Ltd
Priority to CN202310806478.2A
Publication of CN116758190A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00: Animation
    • G06T 13/20: 3D [Three Dimensional] animation
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/10: Geometric effects
    • G06T 15/20: Perspective computation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiments of the present application provide a control method, system, device, and medium for a virtual camera, aiming to simplify the control of the virtual camera and improve the efficiency of animation production. The method comprises the following steps: acquiring first motion information of a rocker control and second motion information of a gyroscope; determining target displacement information of a virtual camera according to the first motion information, and determining target rotation information of the virtual camera according to the second motion information; and controlling the virtual camera to move in a virtual scene according to the target displacement information and the target rotation information.

Description

Control method, system, equipment and medium for virtual camera
Technical Field
The present application relates to the field of virtual film and television production technologies, and in particular, to a control method, system, device, and medium for a virtual camera.
Background
During animation production, the motion of the virtual camera needs to be controlled to record the changes of the picture in the virtual scene. After the operating parameters of the virtual camera are set with the keyboard and mouse, the virtual camera is controlled to move continuously in the virtual scene based on those parameters, so that a continuous animation is obtained.
This post-production approach, in which virtual camera motion is controlled by setting keyframes (dotting) with the keyboard and mouse, does not match a camera operator's habits and is cumbersome to operate, which reduces the efficiency of animation production.
Disclosure of Invention
In view of the foregoing, embodiments of the present application provide a method, system, apparatus, and medium for controlling a virtual camera, so as to overcome or at least partially solve the foregoing problems.
In a first aspect of an embodiment of the present application, there is provided a method for controlling a virtual camera, including:
acquiring first motion information of a rocker control and second motion information of a gyroscope;
determining target displacement information of a virtual camera according to the first motion information, and determining target rotation information of the virtual camera according to the second motion information;
and controlling the virtual camera to move in a virtual scene according to the target displacement information and the target rotation information.
In a second aspect of the embodiments of the present application, there is provided a control system for a virtual camera, the system including a mobile device and a terminal device, the mobile device being communicatively connected to the terminal device, the mobile device including a gyroscope and a screen, the screen being configured to display a rocker control and pictures captured by the virtual camera, wherein:
the mobile equipment is used for acquiring first motion information of the rocker control and second motion information of the gyroscope and sending the first motion information and the second motion information to the terminal equipment;
the terminal equipment is used for determining target displacement information of the virtual camera according to the first motion information, determining target rotation information of the virtual camera according to the second motion information, controlling the virtual camera to move in a virtual scene according to the target displacement information and the target rotation information, and sending pictures shot by the virtual camera in the moving process to the mobile equipment;
the mobile device is also used for displaying pictures shot by the virtual camera in the motion process.
In a third aspect of the embodiment of the present application, an electronic device is provided, which includes a memory, a processor, and a computer program stored on the memory, where the processor executes the computer program to implement the method for controlling a virtual camera according to the first aspect disclosed in the embodiment of the present application.
In a fourth aspect of embodiments of the present application, there is provided a computer-readable storage medium having stored thereon a computer program/instruction which, when executed by a processor, implements a method of controlling a virtual camera according to the first aspect of embodiments of the present application.
The embodiment of the application has the following advantages:
In this embodiment, the motion of the virtual camera in the virtual scene is controlled jointly by coupling the rocker control and the gyroscope, so that by operating the two the user can simulate the position movement, lens rotation, and other motions of a camera in a real environment and shoot pictures in real time. The post-production mode of keyframing with the keyboard and mouse is thus converted into an on-set mode in which the rocker control and the gyroscope jointly drive the virtual camera's moves in real time, which better matches a photographer's operating habits, is simpler to operate, and greatly improves the efficiency of animation production.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that other drawings may be obtained from them without inventive effort by a person skilled in the art.
FIG. 1 is a flow chart of steps of a method for controlling a virtual camera according to an embodiment of the present application;
FIG. 2 is a flow chart of the steps of a method for controlling the displacement motion of a virtual camera according to an embodiment of the present application;
FIG. 3 is a schematic illustration of a rocker control in an embodiment of the present application;
FIG. 4 is a schematic diagram of a frame shot by a virtual camera before movement according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a frame shot by a virtual camera after a position is moved in an embodiment of the present application;
FIG. 6 is a flow chart of the steps of a method for controlling rotational motion of a virtual camera in accordance with an embodiment of the present application;
FIG. 7 is a schematic diagram of a frame shot by a virtual camera after rotation according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a picture taken by a virtual camera after performing a multi-dimensional motion according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a preview animation process according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a control system of a virtual camera according to an embodiment of the present application;
fig. 11 is a schematic diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order that the above-recited objects, features and advantages of the present application will become more readily apparent, a more particular description of the application will be rendered by reference to the appended drawings and appended detailed description.
In the production of an animation, particularly a previsualization (preview) animation, it is necessary to control the motion (such as rotation and displacement) of a virtual camera to record the changes of the picture in a virtual scene. In related software, the virtual camera is usually moved by using the keyboard and mouse to set its operating parameters at different moments, after which interpolation turns those parameters into a continuous animation. This workflow of controlling virtual camera motion by keyboard-and-mouse keyframing is not only skewed toward post-production; it is also inconvenient to operate and inefficient, it does not match a photographer's operating habits, and it makes it difficult for the main creators of a production to participate directly in the previsualization stage.
Based on the above analysis, and aiming at the problems of the keyboard-and-mouse keyframing approach (complex operation, mismatch with a photographer's operating habits, and so on), the embodiments of the present application provide a scheme that controls camera movement by combining a rocker (joystick) with motion sensing: the movement of the virtual camera is split into several dimensions, and a rocker control and a gyroscope are coupled to control the motion of the virtual camera in a multidimensional manner. The animation production process can thus shift from post-production to the earlier, on-set stage and simulate real shooting conditions, which makes it easier for the main creators of a film or series to get hands-on, greatly improves the efficiency of animation production, and reduces costs in the film and television production process.
Referring to fig. 1, which shows a flowchart of the steps of a virtual camera control method according to an embodiment of the present application, the method includes the following steps:
step S11: and acquiring first motion information of the rocker control and second motion information of the gyroscope.
In a specific implementation, the first motion information and the second motion information are collected by a mobile device (such as a mobile phone or a tablet) that comprises a gyroscope and a screen, the screen being used to display the rocker control. The gyroscope and the screen can be arranged in the same physical device or in different physical devices.
When a user drags the intermediate component (i.e., the rocker body) of the rocker control displayed on the screen, a sensor associated with the screen (e.g., a touch sensor or a visual sensor) collects information about the drag operation, such as the distance the user drags the intermediate component (the dragging distance), the direction in which it is dragged (the dragging direction), or the speed at which it is dragged (the dragging speed), and generates the first motion information.
When a user changes the attitude of the device in which the gyroscope is located (for example, by tilting the device), the gyroscope collects information such as the rotation angle of the device and generates the second motion information. The second motion information thus lets the user control the virtual camera by motion sensing, i.e., by rotating the gyroscope through changes in posture.
Step S12: and determining target displacement information of the virtual camera according to the first motion information, and determining target rotation information of the virtual camera according to the second motion information.
In a specific implementation, the mobile device may determine, according to the first motion information, target displacement information for controlling the position movement of the virtual camera, and determine, according to the second motion information, target rotation information for controlling the rotation of the virtual camera. Alternatively, the mobile device may send the first motion information and the second motion information to the terminal device, and the terminal device determines the target displacement information and the target rotation information, which reduces the computational load on the mobile device. Here the terminal device is the device, such as a desktop or laptop computer, on which the virtual scene and the virtual camera are built.
Step S13: and controlling the virtual camera to move in a virtual scene according to the target displacement information and the target rotation information.
In a specific implementation, after receiving or generating the target displacement information and the target rotation information, the terminal device can control the virtual camera to perform the corresponding displacement and rotation motions in the virtual scene, and record the pictures shot by the virtual camera during the motion so that a preview animation can be generated later. The terminal device can also send the pictures shot by the virtual camera to the mobile device for display, so that the user can adjust the shooting position and shooting angle of the virtual camera in real time according to what the camera sees.
It can be understood that by splitting the motion of a camera in a real scene into the two dimensions of rotation and displacement, the rocker control drives the virtual camera to reproduce the displacement of a real camera, while the gyroscope drives it to reproduce the rotation. Through the coupling of the two, a user can simulate the camera moves of a real scene simply by operating the rocker control and the gyroscope, which provides the technical basis for users to participate directly in the preview stage (i.e., to control the motion of the virtual camera and shoot in real time). The terminal device records the pictures shot by the virtual camera under the user's real-time control to generate the animation, so the animation production flow moves from post-production to the earlier stage, film and television creators can operate it more easily, the operating threshold drops sharply, and the efficiency of producing preview animation improves.
By adopting the technical solution of the embodiments of the present application, the motion of the virtual camera in the virtual scene is controlled jointly by coupling the rocker control and the gyroscope, so that by operating the two the user can simulate the position movement, lens rotation, and other motions of a camera in a real environment and shoot pictures in real time. The post-production mode of keyframing with the keyboard and mouse is thus converted into an on-set mode in which the rocker control and the gyroscope jointly drive the virtual camera's moves in real time, which better matches a photographer's operating habits, is simpler to operate, and greatly improves the efficiency of animation production.
Referring to fig. 2, which shows a flowchart of the steps of a displacement motion control method for a virtual camera according to an embodiment of the present application, the method includes the following steps:
step S21: and acquiring first motion information of the rocker control.
The first motion information comprises a dragging direction and a dragging distance of the rocker control.
In a specific implementation, the mobile device may take the original position of the intermediate component of the rocker control as the position origin of a coordinate system, so that the dragging direction and dragging distance of the rocker control are determined from the position of the intermediate component in that coordinate system after it is dragged.
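As an illustration only, a minimal sketch of this computation in Python (assuming a unit-radius rocker area and a touch position reported in the same screen coordinates as the origin; all names are hypothetical, not part of this application):

```python
import math

ROCKER_RADIUS = 1.0  # the intermediate component can move inside a unit circle

def first_motion_info(touch_x, touch_y, origin_x, origin_y):
    """Derive the dragging direction and dragging distance of the rocker
    control from the touch position, relative to the position origin."""
    dx, dy = touch_x - origin_x, touch_y - origin_y
    distance = math.hypot(dx, dy)
    if distance > ROCKER_RADIUS:
        # keep the intermediate component inside the circular area
        dx, dy = dx / distance * ROCKER_RADIUS, dy / distance * ROCKER_RADIUS
        distance = ROCKER_RADIUS
    direction = (dx / distance, dy / distance) if distance > 0 else (0.0, 0.0)
    return {"direction": direction, "distance": distance}
```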
Step S22: and determining the target displacement speed of the virtual camera according to the dragging distance.
In a specific implementation, the dragging distance can be set to correlate positively with the displacement speed of the virtual camera: when the intermediate component is not dragged (i.e., it is at its original position) the displacement speed is 0, and the farther the intermediate component is dragged from its original position, the greater the displacement speed the virtual camera needs to reach. Different displacement speeds can also be set flexibly for different dragging distances according to the user's operating habits; for example, the user may set the displacement speed of the virtual camera to be the maximum when the intermediate component is dragged three units of distance from the position origin, set it to be the minimum (e.g., 0) when the intermediate component is dragged to the farthest distance from the position origin, and so on.
In an alternative embodiment, the target displacement speed of the virtual camera may be determined according to the first mapping relation and the dragging distance.
The first mapping relation is used for representing the mapping relation between the dragging distance and the displacement speed.
In a specific implementation, a linear or nonlinear mapping between the dragging distance and the displacement speed can be set according to a linear relationship, a specific function, or the like. When determining the target displacement speed of the virtual camera, the displacement speed can be determined directly from the distance between the dragged intermediate component and the position origin, or the displacement speeds of the virtual camera on different axes can be determined from the distances between the dragged intermediate component and the position origin along the different axes of the coordinate system.
Alternatively, the first mapping relation may be determined according to an EXP curve. An EXP curve is generally used to describe a nonlinear mapping between a rocker's current physical output and its current logical output; in a specific implementation, the physical output and the logical output can be replaced by the dragging distance and the displacement speed respectively, so that the first mapping relation is determined based on the EXP curve.
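By way of example, such a first mapping relation could be realized as an expo-style blend of a linear and a cubic term; the blend factor below is an assumed tuning parameter, not a value prescribed by this application:

```python
def target_displacement_speed(dragging_distance, max_speed, expo=0.6):
    """First mapping relation: map a dragging distance in [0, 1] to a
    displacement speed through an EXP-style nonlinear curve.
    expo = 0 gives a purely linear mapping, expo = 1 a purely cubic one."""
    x = max(0.0, min(1.0, dragging_distance))
    shaped = (1.0 - expo) * x + expo * x ** 3  # gentle near the origin, fast at the rim
    return shaped * max_speed

# Dragged halfway out: target_displacement_speed(0.5, 3.0) is about 0.825,
# well below half of max_speed, giving fine control near the origin.
```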
Taking the rocker control shown in fig. 3 as an example, the intermediate component of the rocker control can be dragged freely within a circular area of radius 1 in a two-dimensional coordinate system. The target displacement speed of the virtual camera then comprises displacement speeds corresponding to the two coordinate axes: the vertical Y axis corresponds to the displacement speed of the virtual camera in a first direction in the virtual scene, and the horizontal X axis corresponds to its displacement speed in a second direction. When the intermediate component is dragged to position point A, its distance from the position origin along each of the two coordinate axes is √2/2, and the target displacement speed of the virtual camera in both the first and the second direction may then be √2/2 of the highest speed. When the intermediate component is dragged to position point B, its distance from the position origin is 1 in the Y-axis direction and 0 in the X-axis direction; the target displacement speed of the virtual camera in the first direction is then the highest speed, and in the second direction it is 0.
Step S23: and determining the target displacement direction of the virtual camera according to the dragging direction.
In a specific implementation, a mapping can be established between the coordinate system of the rocker control and the coordinate system of the virtual scene, so that the direction in which the intermediate component is dragged in the rocker control maps to the target displacement direction of the virtual camera in the virtual scene; for example, if the intermediate component is dragged forwards along the Y-axis direction, the virtual camera correspondingly moves forwards along the Y-axis direction in the virtual scene.
As a possible implementation, the rocker control comprises at least a rocker control for controlling the virtual camera to move in a first axial direction, and the target displacement direction of that movement in the first axial direction is determined according to the dragging direction.
In particular implementations, movement of the virtual camera in the virtual scene may be split into movement in different axial directions, and movement in one or more axial directions (i.e., the first axial direction) may be controlled by a rocker control, while movement in the remaining axial directions is controlled by other means or left unchanged.
As shown in fig. 3, the rocker control is configured to control the virtual camera's position movement in the X-axis and Y-axis directions simultaneously. When the intermediate component is dragged to position point B, the target displacement direction of the virtual camera is forwards along the Y axis, i.e., the advancing direction; when the intermediate component is dragged to position point A, the target displacement direction is forwards along the Y axis and leftwards along the X axis, i.e., a combination of forward and leftward movement (front-left).
Optionally, the rocker control may include at least one of: a first rocker control for controlling the virtual camera to move in the Z-axis direction; and a second rocker control for controlling the virtual camera to move in the X-axis and/or Y-axis direction.
It will be appreciated that controlling the position movement of the virtual camera in the X-axis and Y-axis directions through the second rocker control actually means controlling the virtual camera to move freely in the plane defined by the X axis and the Y axis.
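Combining the two controls, a sketch of how the dragging input could map to a per-axis target displacement (reusing target_displacement_speed from the sketch above; the axis assignment follows the split just described, and all names remain illustrative):

```python
def target_displacement(first_rocker, second_rocker, max_speed):
    """Merge the first rocker control (Z axis) and the second rocker control
    (free movement in the X/Y plane) into one per-axis displacement command.
    Each rocker supplies {"direction": (x, y), "distance": d} as above."""
    xy_speed = target_displacement_speed(second_rocker["distance"], max_speed)
    z_speed = target_displacement_speed(first_rocker["distance"], max_speed)
    vx = second_rocker["direction"][0] * xy_speed  # left/right in the virtual scene
    vy = second_rocker["direction"][1] * xy_speed  # forwards/backwards
    vz = first_rocker["direction"][1] * z_speed    # vertical drag drives the Z axis
    return (vx, vy, vz)
```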
Step S24: and determining the target displacement information according to the target displacement speed and the target displacement direction.
In a specific implementation, the target displacement speed and the target displacement direction may be used directly as the target displacement information. Alternatively, the speed change required at each moment, while the virtual camera transitions from its current displacement speed in the current displacement direction to the target displacement speed in the target displacement direction, may be determined first; the target displacement direction and target speed the virtual camera needs to reach at each moment are then determined from those per-moment speed changes, and the per-moment directions and speeds are taken as the target displacement information.
Specifically, according to smoothing information preset by the user (which can describe a mapping between a change amount, such as a speed or rotation-angle change, and time; for example, during a speed change the speed change at a first moment needs to be 3 and at a second moment needs to be 1) and the target displacement speed, the displacement speed the virtual camera needs to reach at each moment before arriving at the target displacement speed is determined, and the per-moment displacement speeds are added to the target displacement information. In this way the acceleration or deceleration of a camera in a real scene is simulated, meeting film and television production requirements.
For example, if the displacement speed of the virtual camera at a first moment is 0 and the target displacement speed is 3, then with the preset smoothing information the virtual camera may need to reach a displacement speed of 1 at a second moment and 3 at a third moment; the speeds required at the second and third moments are added to the target displacement information, so the displacement speed of the virtual camera does not jump straight to the target at the second moment. The speed-change process can also be combined with the target displacement direction: if the virtual camera is rising along the Z axis at speed 6 at the first moment and the user wants it to descend along the Z axis at speed 3, it can be determined from the smoothing information that the rising speed first decreases gradually to 0 at the preset rate of change and the descending speed then increases gradually from 0 to 3. The information describing this speed-change process is added to the target displacement information, so that the terminal device can control the virtual camera according to it and realistically simulate the motion of a camera in a real scene.
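A minimal sketch of such smoothing, with a fixed per-moment rate limit standing in for the user-preset smoothing information (the limit value is an assumption): the speed approaches its target step by step, and a reversal such as the rising-at-6, descend-at-3 example first decays through 0:

```python
import math

def step_speed(current, target, max_change=1.0):
    """Displacement speed the virtual camera needs to reach at the next
    moment, moving toward the target speed by at most max_change."""
    delta = target - current
    if abs(delta) <= max_change:
        return target
    return current + math.copysign(max_change, delta)

# Rising along the Z axis at speed 6; the user now wants to descend at speed 3.
speed, speeds_per_moment = 6.0, []
while speed != -3.0:
    speed = step_speed(speed, -3.0)
    speeds_per_moment.append(speed)
# speeds_per_moment == [5.0, 4.0, 3.0, 2.0, 1.0, 0.0, -1.0, -2.0, -3.0]
```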
Step S25: and controlling the virtual camera to perform displacement movement in the virtual scene according to the target displacement information.
The mobile device sends the target displacement information, or the related information used to determine it, to the terminal device, so that the terminal device obtains the target displacement information and adjusts the displacement-related parameters of the virtual camera (such as its displacement direction, displacement speed, and position) accordingly. The mobile device thereby controls the virtual camera through the terminal device, and the virtual camera moves in step with the user's operation of the rocker control.
Fig. 4 shows a picture taken by the virtual camera before any movement (position movement or rotation). The user may then drag the intermediate component of the rocker control downwards; the terminal device correspondingly controls the virtual camera to move downwards along the Z axis, and the picture taken by the virtual camera after the position movement is obtained, as shown in fig. 5.
As can be seen from fig. 4 and fig. 5, the rocker control in the embodiments of the present application is mainly used to control the position movement of the virtual camera, so that the camera can flexibly approach or move away from a target, while the rotation angle of its lens remains unchanged.
To change the rotation angle of the virtual camera's lens, refer to fig. 6, which shows a flowchart of the steps of a rotational motion control method of a virtual camera according to an embodiment of the present application. As shown in fig. 6, the method includes the following steps:
step S31: second motion information of the gyroscope is acquired.
Wherein the second motion information includes a rotation parameter of the gyroscope in a second axis.
In a specific implementation, the mobile device may provide gyroscopes for different axial directions for the user to choose from; these may be distinct physical gyroscopes or different virtual gyroscopes integrated in the same physical gyroscope. When the user changes the attitude of the device in which the gyroscope is located, the gyroscope for each axial direction collects rotation parameters of the device (i.e., of the gyroscope) in the corresponding axial direction (i.e., the second axial direction), such as the rotation angle and the rotational angular velocity.
As a possible implementation, the gyroscopes for different axial directions may be used to control the rotational movement of the virtual camera about different axes. Specifically, the gyroscope in the mobile device may include at least one of the following: a first gyroscope for controlling rotation of the virtual camera about the pitch (Tilt) axis, i.e., about a horizontal axis, so as to tilt the lens of the virtual camera upwards or downwards and shoot targets at different heights; a second gyroscope for controlling rotation of the virtual camera about the heading (Pan) axis, i.e., about an axis perpendicular to the ground, so as to pan the lens horizontally and shoot the virtual scene in different directions; and a third gyroscope for controlling rotation of the virtual camera about the roll (Roll) axis, i.e., about the line-of-sight axis, so as to roll the lens about its own axis (similar to the rolling of an aircraft's wings) and shoot pictures with a rolling effect.
It is understood that the first gyroscope, the second gyroscope, and the third gyroscope may be distinct physical gyroscopes, or different virtual gyroscopes integrated in the same physical gyroscope.
Step S32: and determining target rotation parameters of the virtual camera in a third axial direction according to the second mapping relation and the rotation parameters of the gyroscope in the second axial direction.
The second mapping relationship is used for representing a mapping relationship between a rotation parameter in the second axial direction and a rotation parameter in the third axial direction (such as the target rotation parameter).
In a specific implementation, the mobile device may determine the target rotation parameter of the virtual camera in the third axial direction according to the second mapping relation and the rotation parameter of the gyroscope in the second axial direction. For example, the mobile device may directly take the rotational angular velocity of the gyroscope about the pitch axis (i.e., the second axial direction) as the rotational angular velocity of the virtual camera about the pitch axis (i.e., the third axial direction) and send it to the terminal device. Alternatively, the terminal device may determine the rotation parameter of the virtual camera in the third axial direction from the rotation parameter of the gyroscope in the second axial direction sent by the mobile device.
As a possible implementation, the second mapping relation can specify which axis of the gyroscope (the second axial direction) corresponds to which axis of the virtual camera (the third axial direction); for example, the pitch-axis direction of the gyroscope can be mapped to the heading-axis direction of the virtual camera, so that the user rotates the gyroscope about the pitch axis to make the virtual camera rotate about the heading axis. The second mapping relation can also specify the mapping ratio of the rotation parameter: at a ratio of 1:1, rotating the gyroscope 5 degrees rotates the virtual camera 5 degrees, while at 1:2, rotating the gyroscope 5 degrees rotates the virtual camera 10 degrees. A minimal sketch of such a mapping follows.
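The sketch writes the second mapping relation as an axis remap plus a scaling ratio; the concrete table below (e.g., the gyroscope's pitch axis driving the camera's heading axis at 1:2) is a hypothetical configuration, not one prescribed by this application:

```python
# Hypothetical second mapping relation: gyroscope axis -> (camera axis, ratio).
SECOND_MAPPING = {
    "tilt": ("pan", 2.0),   # pitch-axis rotation drives the heading axis at 1:2
    "pan":  ("pan", 1.0),
    "roll": ("roll", 1.0),
}

def target_rotation_parameters(gyro_rotation):
    """Map rotation parameters of the gyroscope in the second axial
    direction to target rotation parameters of the virtual camera in
    the third axial direction."""
    camera_rotation = {}
    for gyro_axis, angle in gyro_rotation.items():
        camera_axis, ratio = SECOND_MAPPING[gyro_axis]
        camera_rotation[camera_axis] = camera_rotation.get(camera_axis, 0.0) + angle * ratio
    return camera_rotation

# Rotating the gyroscope 5 degrees about its pitch (tilt) axis turns the
# virtual camera 10 degrees about its heading (pan) axis under this table.
assert target_rotation_parameters({"tilt": 5.0}) == {"pan": 10.0}
```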
Step S33: Determining the target rotation information according to the target rotation parameters of the virtual camera in the third axial direction.
In a specific implementation, the terminal device or the mobile device may determine the rotation angle the virtual camera needs to reach in the third axial direction (i.e., the target rotation information) from rotation parameters such as the camera's rotational angular velocity or rotation speed in the third axial direction. The rotation parameters in the third axial direction may cover one or more axes, i.e., the user may control the camera through the gyroscope to rotate about one or more axes.
Step S34: and controlling the virtual camera to perform rotary motion in a virtual scene according to the target rotary information.
In a specific implementation, the target rotation parameter (such as a rotation angle, a rotational angular velocity, or the duration of the required rotation) may be used directly as the target rotation information, and the corresponding rotation parameters are set on the virtual camera.
As a possible implementation, taking the target rotation parameter as a target rotation angle as an example, the target rotation angle can be used directly as the target rotation information, and the virtual camera then switches instantaneously from its current rotation angle in the third axial direction to the target rotation angle. Alternatively, according to smoothing information set by the user for rotation, the angle change required at each moment, while the virtual camera rotates in the third axial direction from the current rotation angle to the target rotation angle, is determined; the rotation angle the camera needs to reach at each moment is then derived from those per-moment angle changes and taken as the target rotation information, and the camera rotates step by step from the current rotation angle to the final target rotation angle desired by the user, so as to simulate the rotation of a real camera.
After the terminal device obtains the target rotation information, it adjusts parameters such as the rotation angle of the virtual camera, so that the camera rotates in step with the user's operation of the gyroscope. For example, the user can rotate the mobile device in the positive direction of the Tilt axis; the terminal device correspondingly controls the virtual camera to tilt upwards, and the picture taken by the camera after the rotation is obtained, as shown in fig. 7.
As can be seen from fig. 4 and fig. 7, the gyroscope in the embodiments of the present application is mainly used to control the rotation of the virtual camera, so that the camera can shoot the target from different lens angles, while its position remains unchanged.
As shown in fig. 8, the user drags the intermediate component of the rocker control downwards while rotating the mobile device in the positive direction of the Tilt axis; that is, through the coupling of the rocker control and the gyroscope, the virtual camera is controlled to move downwards along the Z axis and tilt upwards at the same time, and the picture taken by the camera after this multi-dimensional movement is obtained. As fig. 4 and fig. 8 show, the virtual camera control scheme provided by the embodiments of the present application achieves a good camera-move effect.
It will be understood that, as shown in fig. 9, after the terminal device creates the virtual scene and the virtual camera in three-dimensional software, the mobile device can connect to the virtual camera over the local area network through an IP address and a port number. Once recording starts, the mobile device performs the above steps S11 to S13 and their specific implementations, controlling the displacement and rotation of the virtual camera through the rocker control and the gyroscope and receiving in real time the pictures shot by the virtual camera returned by the terminal device. The terminal device meanwhile records the changes in the virtual camera's displacement and orientation data in real time, exports the camera's change data when recording ends, and generates a continuous preview animation.
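The application does not fix the transport between the mobile device and the terminal device; purely as an assumption, a recording-loop frame could be shipped as JSON over UDP to the terminal device's IP address and port like this (address, port, and field names are all hypothetical):

```python
import json
import socket

TERMINAL_ADDR = ("192.168.1.20", 9000)  # assumed IP address and port of the terminal device

def send_motion_frame(sock, first_motion_info, second_motion_info):
    """One frame of steps S11 to S13 as seen from the mobile device: ship
    the rocker control and gyroscope readings; the terminal device moves
    the virtual camera accordingly and streams back the pictures it shoots."""
    frame = {"rocker": first_motion_info, "gyro": second_motion_info}
    sock.sendto(json.dumps(frame).encode("utf-8"), TERMINAL_ADDR)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_motion_frame(
    sock,
    {"direction": (0.0, -1.0), "distance": 0.5},  # dragging the rocker downwards
    {"tilt": 2.0},                                # tilting the device upwards
)
```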
On the basis of the above embodiments, the virtual camera control scheme provided by the embodiments of the present application streamlines the production flow of preview animation, moving it from post-production to the earlier stage, so that film and television creators can operate it more easily and the operating threshold drops sharply, which benefits the development of the previsualization industry. By simulating camera movement through the combination of a rocker and motion sensing, the scheme is simpler to use, improves the efficiency of producing preview animation, and cuts the extra costs incurred by the traditional workflow.
It should be noted that, for simplicity of description, the method embodiments are shown as a series of acts, but it should be understood by those skilled in the art that the embodiments are not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred embodiments, and that the acts are not necessarily required by the embodiments of the application.
Fig. 10 is a schematic structural diagram of a control system of a virtual camera according to an embodiment of the present application. The system includes a mobile device and a terminal device, the mobile device being communicatively connected to the terminal device; the mobile device includes a gyroscope and a screen, the screen being used to display a rocker control and the pictures shot by the virtual camera, wherein:
the mobile equipment is used for acquiring first motion information of the rocker control and second motion information of the gyroscope and sending the first motion information and the second motion information to the terminal equipment;
the terminal equipment is used for determining target displacement information of the virtual camera according to the first motion information, determining target rotation information of the virtual camera according to the second motion information, controlling the virtual camera to move in a virtual scene according to the target displacement information and the target rotation information, and sending pictures shot by the virtual camera in the moving process to the mobile equipment;
the mobile device is also used for displaying pictures shot by the virtual camera in the motion process.
By adopting the technical solution of the embodiments of the present application, the motion of the virtual camera in the virtual scene is controlled jointly by coupling the rocker control and the gyroscope, so that by operating the two the user can simulate the position movement, lens rotation, and other motions of a camera in a real environment and shoot pictures in real time. The post-production mode of keyframing with the keyboard and mouse is thus converted into an on-set mode in which the rocker control and the gyroscope jointly drive the virtual camera's moves in real time, which better matches a photographer's operating habits, is simpler to operate, and greatly improves the efficiency of animation production.
It should be noted that, the system embodiment is similar to the method embodiment, so the description is simpler, and the relevant places refer to the method embodiment.
The embodiments of the present application further provide an electronic device; referring to fig. 11, fig. 11 is a schematic diagram of an electronic device according to an embodiment of the present application. As shown in fig. 11, the electronic device 100 includes a memory 110 and a processor 120 connected by a bus; a computer program is stored in the memory 110 and can run on the processor 120, thereby implementing the steps of the virtual camera control method disclosed in the embodiments of the present application.
The embodiment of the application also provides a computer readable storage medium, on which a computer program/instruction is stored, which when executed by a processor, implements the control method of the virtual camera as disclosed in the embodiment of the application.
In this specification, each embodiment is described in a progressive manner, with each embodiment focusing on its differences from the other embodiments; for the identical or similar parts, the embodiments may be referred to one another.
It will be apparent to those skilled in the art that embodiments of the present application may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the application may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus, electronic devices, and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the scope of the embodiments of the application.
Finally, it is further noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article or terminal device comprising the element.
The control method, system, device, and medium of the virtual camera provided by the present application have been described in detail above. Specific examples are used herein to illustrate the principles and embodiments of the present application, and the above description of the embodiments is only intended to help understand the method and its core idea. At the same time, a person skilled in the art may, following the idea of the present application, make changes to the specific embodiments and the scope of application; in view of the above, the contents of this specification should not be construed as limiting the present application.

Claims (10)

1. A method of controlling a virtual camera, the method comprising:
acquiring first motion information of a rocker control and second motion information of a gyroscope;
determining target displacement information of a virtual camera according to the first motion information, and determining target rotation information of the virtual camera according to the second motion information;
and controlling the virtual camera to move in a virtual scene according to the target displacement information and the target rotation information.
2. The method of claim 1, wherein the first motion information includes a drag direction and a drag distance in which a rocker body in the rocker control is dragged; the determining target displacement information of the virtual camera according to the first motion information comprises the following steps:
determining the target displacement speed of the virtual camera according to the dragging distance;
determining a target displacement direction of the virtual camera according to the dragging direction;
and determining the target displacement information according to the target displacement speed and the target displacement direction.
3. The method of claim 2, wherein determining the target displacement speed of the virtual camera based on the drag distance comprises:
and determining the target displacement speed of the virtual camera according to a first mapping relation and the dragging distance, wherein the first mapping relation is used for representing the mapping relation between the dragging distance and the displacement speed.
4. The method of claim 2, wherein the rocker control comprises at least a rocker control for controlling a position movement of the virtual camera in a first axial direction; the determining the target displacement direction of the virtual camera according to the dragging direction comprises the following steps:
and determining the target displacement direction of the position movement of the virtual camera in the first axial direction according to the dragging direction.
5. The method of claim 2, wherein said determining said target displacement information based on said target displacement velocity and said target displacement direction comprises:
determining a speed variation corresponding to each moment during the period that the virtual camera is converted from the current displacement speed in the current displacement direction to the target displacement speed in the target displacement direction;
determining a target displacement direction and a target speed which are required to be achieved by the virtual camera at each moment according to the speed variation corresponding to each moment;
and determining the target displacement direction and the target speed which are required to be achieved at each moment as the target displacement information.
6. The method of claim 1, wherein the second motion information includes a rotation parameter of the gyroscope in a second axis; the determining target rotation information of the virtual camera according to the second motion information includes:
determining a target rotation parameter of the virtual camera in a third axial direction according to a second mapping relation and a rotation parameter of the gyroscope in the second axial direction, wherein the second mapping relation is used for representing the mapping relation between the rotation parameter in the second axial direction and the rotation parameter in the third axial direction;
and determining the target rotation information according to the target rotation parameters of the virtual camera in the third axial direction.
7. The method of claim 6, wherein the target rotation parameter comprises a target rotation angle; the determining the target rotation information according to the target rotation parameters of the virtual camera in the third axial direction comprises the following steps:
determining an angle change amount corresponding to each moment during the period that the virtual camera rotates from the current rotation angle to the target rotation angle in the third axial direction;
determining a target rotation angle which is required to be reached by the virtual camera at each moment according to the angle change quantity corresponding to each moment;
and determining the target rotation angle required to be reached at each moment as the target rotation information.
8. A control system of a virtual camera, the system comprising a mobile device and a terminal device, the mobile device being in communication connection with the terminal device, the mobile device comprising a gyroscope and a screen, the screen being configured to display a rocker control and pictures taken by the virtual camera, wherein:
the mobile equipment is used for acquiring first motion information of the rocker control and second motion information of the gyroscope and sending the first motion information and the second motion information to the terminal equipment;
the terminal equipment is used for determining target displacement information of the virtual camera according to the first motion information, determining target rotation information of the virtual camera according to the second motion information, controlling the virtual camera to move in a virtual scene according to the target displacement information and the target rotation information, and sending pictures shot by the virtual camera in the moving process to the mobile equipment;
the mobile device is also used for displaying pictures shot by the virtual camera in the motion process.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory, characterized in that the processor executes the computer program to implement the method of controlling a virtual camera according to any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program/instruction is stored, characterized in that the computer program/instruction, when executed by a processor, implements the control method of a virtual camera as claimed in any one of claims 1 to 7.
CN202310806478.2A, filed 2023-06-30 (priority date 2023-06-30): Control method, system, equipment and medium for virtual camera. Status: Pending. Publication: CN116758190A (en).

Priority Applications (1)

CN202310806478.2A (priority date 2023-06-30, filing date 2023-06-30): CN116758190A (en) Control method, system, equipment and medium for virtual camera

Applications Claiming Priority (1)

CN202310806478.2A (priority date 2023-06-30, filing date 2023-06-30): CN116758190A (en) Control method, system, equipment and medium for virtual camera

Publications (1)

Publication number: CN116758190A; publication date: 2023-09-15.

Family

ID=87958872

Family Applications (1)

CN202310806478.2A (Pending): Control method, system, equipment and medium for virtual camera

Country Status (1)

CN: CN116758190A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination