CN117078805B - Method and device for generating visual animation - Google Patents

Method and device for generating visual animation

Info

Publication number
CN117078805B
CN117078805B (application CN202311351348.0A)
Authority
CN
China
Prior art keywords
view
camera
time point
view direction
parameters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311351348.0A
Other languages
Chinese (zh)
Other versions
CN117078805A (en)
Inventor
李蓓蓓
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing CHL Robotics Co ltd
Original Assignee
Beijing CHL Robotics Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing CHL Robotics Co ltd filed Critical Beijing CHL Robotics Co ltd
Priority to CN202311351348.0A
Priority to CN202311578771.4A (divisional, published as CN117409117A)
Publication of CN117078805A
Application granted
Publication of CN117078805B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G06F30/20 Design optimisation, verification or simulation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Abstract

The present disclosure provides a method and an apparatus for generating a view-direction animation. The method includes: acquiring a view switching time point selected by a user in a simulation animation, wherein the simulation animation is an existing animation of a simulated production line in production line simulation, and the view switching time point is a time point at which the view direction is switched; reading, based on the view switching time point, the parameters of the camera in the current scene after the user adjusts the camera, to obtain view-direction information, wherein the view-direction information includes view-direction camera parameters; adding the view direction to the simulation animation according to the view-direction information; and updating the simulation animation with the added view directions frame by frame, and setting the view-direction camera parameters to the camera in the current scene to obtain the view-direction animation. With this method, the user only needs to select the time point at which the view direction is switched and adjust the camera, and the parameters of the camera in the current scene are read automatically; the operation is simple, the user does not need to set camera parameters manually, and the problems of the cumbersome adding process and inconvenient operation of view-direction animations in the related art are solved.

Description

Method and device for generating visual animation
Technical Field
The present disclosure relates to the technical field of view-direction animation, and in particular to a method and an apparatus for generating a view-direction animation.
Background
In production line simulation software, in order to observe the motion of various mechanisms at different stations during simulation, or to observe the machining of a product from different angles at the same station, a view-direction animation usually needs to be added to the current scene. In existing production line simulation software, the process of adding a view-direction animation is complicated: the user must manually set camera parameters for every view direction, the operation of adding a view-direction animation is cumbersome, and the user interaction is unfriendly.
No effective technical solution has yet been proposed for the cumbersome adding process and inconvenient operation of view-direction animations in the related art.
Disclosure of Invention
The main purpose of the present disclosure is to provide a method and an apparatus for generating a view-direction animation, so as to solve the problems of the cumbersome adding process and inconvenient operation of view-direction animations in the related art.
To achieve the above object, a first aspect of the present disclosure provides a method for generating a view animation, including:
acquiring a view direction switching time point selected by a user in a simulation animation, wherein the simulation animation is an existing animation of a simulation production line in production line simulation, and the view direction switching time point is a time point for switching view directions;
reading parameters of a camera in a current scene after a user adjusts the camera based on the view direction switching time point to obtain view direction information, wherein the view direction information comprises view direction camera parameters;
adding the view direction into the simulation animation according to the view direction information; and
updating the simulation animation with the added view directions frame by frame, and setting the view-direction camera parameters to the camera in the current scene to obtain the view-direction animation.
Optionally, the view-direction information further includes the view switching time point, a switching-animation option and a view-direction stay time point, wherein the view-direction stay time point is the time point until which the view direction is held;
based on the view direction switching time point, reading parameters of the camera in the current scene after the user adjusts the camera to obtain view direction information, including:
determining a view stay time point according to the simulation animation time, the view switching time point and the preset view switching time between views;
after the user adjusts the position or the visual angle of the camera, the parameters of the camera in the current scene are read to obtain the parameters of the visual direction camera.
Further, determining the view stay time point according to the simulation animation time, the view switching time point and the preset inter-view switching time, including:
judging whether the current view direction is the first view direction or not;
if it is the first view direction, determining the view-direction stay time point of the first view direction according to the following formula:
T_1^stay = T;
if it is not the first view direction, updating the view-direction stay time point of the previous view direction according to the following formula:
T_(j-1)^stay = T_j - Δt,
wherein T is the simulation animation duration, Δt is the inter-view switching duration, T_j is the view switching time point of the current view direction, and T_(j-1) is the view switching time point of the previous view direction.
Optionally, adding the view direction to the simulated animation according to the view direction information includes:
adding all the view information into a view information list of the simulation animation according to the sequence of the view switching time points in each view information;
when adding the view information of the current view, adding the view information of the current view into a view information list according to a time sequence, so that the added view information list is arranged according to the time sequence.
Optionally, updating the simulation animation with the added view directions frame by frame and setting the view-direction camera parameters to the camera in the current scene to obtain the view-direction animation includes:
when updating the i-th frame of animation during simulation, determining the time point t_i corresponding to the i-th frame of animation;
searching the view-direction information list for the view switching time points adjacent to the time point t_i;
judging the magnitude relation between the time point t_i and the adjacent view switching time points to obtain a first judgment result, and determining the view-direction camera parameters according to the first judgment result, wherein the view-direction camera parameters include the rotation information, position information, view cone height and focal length of the camera in the view direction; and
synchronously setting the view-direction camera parameters to the camera in the current scene according to the type of the camera in the current scene.
Further, the view switching time points adjacent to the time point t_i include the view switching time point T_j of the j-th view direction and the view switching time point T_(j+1) of the (j+1)-th view direction, wherein T_j ≤ t_i ≤ T_(j+1).
Judging the magnitude relation between the time point t_i and the adjacent view switching time points to obtain a first judgment result, and determining the view-direction camera parameters according to the first judgment result, includes:
if t_i ≤ T_j, reading the view-direction information V_j of the j-th view direction to obtain the view-direction camera parameters of the j-th view direction;
if t_i ≥ T_(j+1), reading the view-direction information V_(j+1) of the (j+1)-th view direction to obtain the view-direction camera parameters of the (j+1)-th view direction;
if T_j < t_i < T_(j+1), reading the view-direction stay time point T_j^stay from the view-direction information V_j of the j-th view direction, judging the magnitude relation between the time point t_i and the view-direction stay time point T_j^stay of the j-th view direction to obtain a second judgment result, and determining the view-direction camera parameters according to the second judgment result.
Further, judging the magnitude relation between the time point t_i and the view-direction stay time point T_j^stay of the j-th view direction to obtain a second judgment result, and determining the view-direction camera parameters according to the second judgment result, includes:
if t_i ≤ T_j^stay, reading the view-direction information V_j of the j-th view direction to obtain the view-direction camera parameters of the j-th view direction;
if t_i ≥ T_(j+1), reading the view-direction information V_(j+1) of the (j+1)-th view direction to obtain the view-direction camera parameters of the (j+1)-th view direction;
if T_j^stay < t_i < T_(j+1), interpolating between the view-direction camera parameters of the j-th view direction and the view-direction camera parameters of the (j+1)-th view direction to obtain the parameters of the camera at the time point t_i.
Further, the parameters of the camera include rotation information, position information, view cone height, and focal length of the camera;
according to the firstView-direction camera parameters of the individual view-directions and +.>Interpolation of the view camera parameters of the individual views is carried out to obtain the time point +.>Parameters of the time of day camera, including:
determining interpolation parameters according to the following formula
For the firstRotation information and +.>Performing spherical linear interpolation on rotation information of cameras in each view direction to obtain time point +.>Rotation information of the time camera->
According to the firstIndividual view direction and->Rotation information, position information and focal length of the cameras in the respective directions, respectively determining +.>Focus of camera in individual view directions +.>And->Focus of camera in individual view directions +.>And according to the following formula for +.>Focus of camera in individual view directions +.>And->Focus of camera in individual view directions +.>Performing linear interpolation to obtain a time pointMoment camera focus +.>
According to the time pointRotation information of the time camera->Focus->And->Focal length of camera in individual view direction +.>Determining the time point +.>Position information of the time camera->
Determining the time point according to the following formulaCone height of time camera>
Determining the time point according to the following formulaFocal length +.>
Wherein,is->Cone height of camera in individual view direction, < +.>Is->Cone height of camera in individual view direction, < +.>Is->Focal length of camera in the individual view direction, +.>Is->Focal length of the camera in each view direction.
Optionally, the types of cameras in the current scene include orthographic cameras and perspective cameras;
synchronously setting the view-direction camera parameters to the camera in the current scene according to the type of the camera in the current scene includes:
synchronously setting the rotation information and the position information in the view-direction camera parameters to the camera in the current scene;
if the type of the camera in the current scene is an orthographic camera, synchronously setting the view cone height in the view-direction camera parameters to the camera in the current scene;
if the type of the camera in the current scene is a perspective camera, synchronously setting the focal length in the view-direction camera parameters to the camera in the current scene.
A second aspect of the present disclosure provides a generating apparatus of a view animation, including:
the system comprises an acquisition unit, a display unit and a display unit, wherein the acquisition unit is used for acquiring a view direction switching time point selected by a user in a simulation animation, wherein the simulation animation is an existing animation of a simulation production line in production line simulation, and the view direction switching time point is a time point for switching view directions;
the reading unit is used for reading parameters of the camera in the current scene after the user adjusts the camera based on the view direction switching time point to obtain view direction information, wherein the view direction information comprises view direction camera parameters;
the adding unit is used for adding the view direction into the simulation animation according to the view direction information; and
and the setting unit is used for updating the simulation animation added with the view from frame to frame, and setting the parameters of the view camera to the camera in the current scene to obtain the view animation.
A third aspect of the present disclosure provides a computer-readable storage medium storing computer instructions for causing a computer to execute the method of generating a view animation provided in any one of the first aspects.
A fourth aspect of the present disclosure provides an electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores a computer program executable by the at least one processor, the computer program being executable by the at least one processor to cause the at least one processor to perform the method of generating a visual animation provided in any one of the first aspects.
In the method for generating a view-direction animation provided by the embodiments of the present disclosure, the user only needs to select the time point at which the view direction is switched and adjust the camera, and the parameters of the camera in the current scene are read automatically; the operation is simple, the user does not need to set camera parameters manually, and the problems of the cumbersome adding process and inconvenient operation of view-direction animations in the related art are solved.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the prior art, the drawings required in the detailed description or the description of the prior art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present disclosure, and that other drawings can be obtained from them by those of ordinary skill in the art without inventive effort.
Fig. 1 is a flowchart of a method for generating a view animation according to an embodiment of the present disclosure;
FIG. 2 is a block diagram of a generating device for a visual animation provided by an embodiment of the present disclosure;
fig. 3 is a block diagram of an electronic device provided in an embodiment of the present disclosure.
Detailed Description
In order that those skilled in the art will better understand the present disclosure, the technical solutions in the embodiments of the present disclosure are described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present disclosure. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present disclosure without inventive effort shall fall within the scope of the present disclosure.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the foregoing figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate in order to describe the embodiments of the disclosure herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that, without conflict, the embodiments of the present disclosure and features of the embodiments may be combined with each other. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
In production line simulation software, in order to observe the motion of various mechanisms at different stations during simulation, or to observe the machining of a product from different angles at the same station, a view-direction animation usually needs to be added to the current scene. In existing production line simulation software, the process of adding a view-direction animation is complicated: the user must set each view direction manually, the operation of adding a view-direction animation is cumbersome, and the user interaction is unfriendly; in addition, the simulation animation and the view-direction switching cannot run in parallel, so the simulation process cannot be viewed dynamically.
In order to solve the above-mentioned problems, an embodiment of the present disclosure provides a method for generating a view animation, as shown in fig. 1, including the following steps S101 to S104:
step S101: acquiring a view direction switching time point selected by a user in a simulation animation, wherein the simulation animation is an existing animation of a simulation production line in production line simulation, and the view direction switching time point is a time point for switching view directions; in the prior art, when a workstation simulates or a production line simulates, the working state of each workshop, assembly line, station or machine can be simulated to obtain a simulation animation; based on the existing simulation animation, a user can find a time point required to switch the visual direction by dragging the simulation progress bar, and adds the visual direction animation.
Step S102: reading parameters of a camera in a current scene after a user adjusts the camera based on the view direction switching time point to obtain view direction information, wherein the view direction information comprises view direction camera parameters;
after the user adjusts the camera at the view-direction switching time point, the parameters of the camera in the current scene can be automatically read out to serve as the view-direction camera parameters.
In an optional embodiment of the disclosure, the view-direction information further includes the view switching time point, a switching-animation option and a view-direction stay time point, where the view-direction stay time point is the time point until which the view direction is held. The camera may also be configured without a switching animation: when the switching animation is enabled, the camera switches gradually between view directions over a preset inter-view switching duration; when it is disabled, the camera switches between view directions directly, without the gradual transition.
Wherein, step S102 includes:
determining the view-direction stay time point according to the simulation animation duration, the view switching time point and a preset inter-view switching duration; the simulation animation duration is the total duration of the simulation animation, and the inter-view switching duration is the duration of the gradual switching animation between two adjacent view directions; for example, the duration between the current view switching time point and the previous view-direction stay time point can be preset to 5 s, so that the view-direction stay time point is determined automatically, and the user may change the inter-view switching duration or the view-direction stay time point;
after the user adjusts the position or the viewing angle of the camera, reading the parameters of the camera in the current scene to obtain the view-direction camera parameters. The user can adjust the position of the camera by moving the mouse, or adjust the viewing angle of the camera by rotating or clicking the left and right mouse buttons; once the position and/or viewing angle of the camera has been adjusted at the view switching time point, the parameters of the camera in the current scene are read automatically to obtain the view-direction camera parameters.
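As an illustration of how the view-direction information of steps S101 and S102 might be captured, the sketch below defines a ViewInfo record and reads the current-scene camera after the user has adjusted it; the scene.active_camera accessor and the camera attribute names are assumptions for illustration, not part of the disclosure. Later sketches in this description reuse these names.

    from dataclasses import dataclass

    @dataclass
    class ViewInfo:
        """View-direction information recorded for one view direction (field names assumed)."""
        switch_time: float        # view switching time point T_j, in seconds
        rotation: tuple           # camera rotation as a quaternion (w, x, y, z)
        position: tuple           # camera position (x, y, z)
        frustum_height: float     # view cone height (used by an orthographic camera)
        focal_length: float       # focal length / focus distance (used by a perspective camera)
        use_transition: bool = True   # whether a gradual switching animation is used
        stay_time: float = 0.0        # view-direction stay time point T_j^stay, filled in later

    def capture_view_info(scene, switch_time, use_transition=True):
        """Step S102: read the current-scene camera after the user has adjusted it."""
        cam = scene.active_camera     # assumed accessor for the camera of the current scene
        return ViewInfo(
            switch_time=switch_time,
            rotation=cam.rotation_quaternion,
            position=cam.position,
            frustum_height=cam.frustum_height,
            focal_length=cam.focal_length,
            use_transition=use_transition,
        )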
In a preferred embodiment of the present disclosure, determining a view stay time point according to a simulated animation time period, a view switching time point, and a preset inter-view switching time period includes:
judging whether the current view direction is the first view direction or not;
if it is the first view direction, determining the view-direction stay time point of the first view direction according to the following formula:
T_1^stay = T;
if it is not the first view direction, updating the view-direction stay time point of the previous view direction according to the following formula:
T_(j-1)^stay = T_j - Δt,
wherein T is the simulation animation duration, Δt is the inter-view switching duration, T_j is the view switching time point of the current view direction, and T_(j-1) is the view switching time point of the previous view direction. In chronological order, the previous view direction is the view direction immediately before the current view direction.
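A minimal sketch of this stay-time bookkeeping, assuming view directions are added in chronological order and using the reconstructed formulas above (T_1^stay = T; T_(j-1)^stay = T_j - Δt); the function name and the default switching duration of 5 s are illustrative only:

    def update_stay_times(view_list, new_view, sim_duration, switch_duration=5.0):
        """Stay-time bookkeeping when a new view direction is added in chronological order.

        sim_duration is T and switch_duration is the inter-view switching duration
        (5 s in the example above). The two formulas used here are reconstructions,
        not quotations from the disclosure.
        """
        new_view.stay_time = sim_duration            # the newest view is held until the end
        if view_list:                                # not the first view direction
            prev = view_list[-1]
            # The previous view now stops early enough to leave a transition
            # window of length switch_duration before the new switching point.
            prev.stay_time = new_view.switch_time - switch_duration
        return new_view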
Step S103: adding the view direction into the simulation animation according to the view direction information; and adding the view direction into the simulation animation through the view direction information so as to obtain the view direction animation later.
In an alternative embodiment of the present disclosure, step S103 includes:
adding all the view-direction information to a view-direction information list of the simulation animation in the order of the view switching time points in each piece of view-direction information; by recording the view-direction information of each view direction in the view-direction information list L, each view direction can conveniently be added to the simulation animation;
when adding the view-direction information of the current view direction, inserting it into the view-direction information list in chronological order, so that the resulting view-direction information list remains arranged in chronological order. The view-direction information V_j of the current view direction includes the rotation information q_j, position information p_j, view cone height h_j and focal length f_j of the camera in the current view direction; V_j is inserted into the view-direction information list L in chronological order, which guarantees that the list L stays ordered by time.
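The chronological insertion into the view-direction information list L can be done with a standard sorted insert; a minimal sketch using the ViewInfo record introduced earlier:

    import bisect

    def add_view_info(view_list, view_info):
        """Insert view_info so that view_list stays ordered by switch_time."""
        keys = [v.switch_time for v in view_list]
        idx = bisect.bisect_right(keys, view_info.switch_time)
        view_list.insert(idx, view_info)
        return view_list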
Step S104: updating the simulation animation with the added view directions frame by frame, and setting the view-direction camera parameters to the camera in the current scene to obtain the view-direction animation.
In the simulation process, when the simulation animation with the added view is updated frame by frame, the view camera parameters of each view are set to the cameras in the current scene, no new camera is required to be additionally added to specially generate the view animation, and the view camera parameters are directly set to the cameras in the current scene.
In an alternative embodiment of the present disclosure, after adding all view information to the view information list in time sequence, step S104 includes:
when updating the i-th frame of animation during simulation, determining the time point t_i corresponding to the i-th frame of animation;
searching the view-direction information list for the view switching time points adjacent to the time point t_i; that is, for a given time point t_i, finding in the view-direction information list L the view switching time points adjacent to t_i;
judging the magnitude relation between the time point t_i and the adjacent view switching time points to obtain a first judgment result, and determining the view-direction camera parameters according to the first judgment result, wherein the view-direction camera parameters include the rotation information, position information, view cone height and focal length of the camera in the view direction; the magnitude relation between the time point t_i and the adjacent view switching time points is the chronological position of t_i relative to those switching time points within the simulation duration.
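Locating the switching time points adjacent to t_i in the time-ordered list is a bracketing search; a sketch that returns None at either end of the list, matching the edge cases described in the next paragraph:

    import bisect

    def adjacent_views(view_list, t):
        """Return the pair of views whose switching time points bracket time t.

        The first element is the latest view with switch_time <= t (None if t lies
        before the first switching point); the second is the earliest view with
        switch_time > t (None if t lies after the last switching point).
        """
        keys = [v.switch_time for v in view_list]
        idx = bisect.bisect_right(keys, t)
        prev_view = view_list[idx - 1] if idx > 0 else None
        next_view = view_list[idx] if idx < len(view_list) else None
        return prev_view, next_view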
In a preferred embodiment of the present disclosure, the view switching time points adjacent to the time point t_i include the view switching time point T_j of the j-th view direction and the view switching time point T_(j+1) of the (j+1)-th view direction, wherein T_j ≤ t_i ≤ T_(j+1); T_j is the view switching time point recorded in the view-direction information V_j of the j-th view direction, and T_(j+1) is the view switching time point recorded in the view-direction information V_(j+1) of the (j+1)-th view direction. When the time point t_i is before the view switching time point T_1 of the first view direction, T_(j+1) is T_1 and there is no T_j; when the time point t_i is after the view switching time point of the last view direction, there is no T_(j+1).
Wherein, judging the magnitude relation between the time point t_i and the adjacent view switching time points to obtain a first judgment result, and determining the view-direction camera parameters according to the first judgment result, includes:
if t_i ≤ T_j, reading the view-direction information V_j of the j-th view direction to obtain the view-direction camera parameters of the j-th view direction;
if t_i ≥ T_(j+1), reading the view-direction information V_(j+1) of the (j+1)-th view direction to obtain the view-direction camera parameters of the (j+1)-th view direction;
if T_j < t_i < T_(j+1), reading the view-direction stay time point T_j^stay from the view-direction information V_j of the j-th view direction, judging the magnitude relation between the time point t_i and the view-direction stay time point T_j^stay of the j-th view direction to obtain a second judgment result, and determining the view-direction camera parameters according to the second judgment result.
In a preferred embodiment of the present disclosure, judging the magnitude relation between the time point t_i and the view-direction stay time point T_j^stay of the j-th view direction to obtain a second judgment result, and determining the view-direction camera parameters according to the second judgment result, includes:
if t_i ≤ T_j^stay, reading the view-direction information V_j of the j-th view direction to obtain the view-direction camera parameters of the j-th view direction;
if t_i ≥ T_(j+1), reading the view-direction information V_(j+1) of the (j+1)-th view direction to obtain the view-direction camera parameters of the (j+1)-th view direction;
if T_j^stay < t_i < T_(j+1), interpolating between the view-direction camera parameters of the j-th view direction and the view-direction camera parameters of the (j+1)-th view direction to obtain the parameters of the camera at the time point t_i. A sketch of this two-stage selection is given below.
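Combining the first and second judgments, a per-frame selection routine could look like the following sketch; the exact boundary comparisons follow the reconstruction above rather than a verbatim reading of the disclosure, and the helpers build on the earlier sketches (ViewInfo, adjacent_views) and on interpolate_params, which is shown after the interpolation formulas below.

    def camera_params(v):
        """Camera parameters recorded in one view direction."""
        return v.rotation, v.position, v.frustum_height, v.focal_length

    def select_camera_params(view_list, t):
        """First and second judgment for frame time t, using adjacent_views above."""
        if not view_list:
            return None
        prev_view, next_view = adjacent_views(view_list, t)
        if prev_view is None:              # t lies before the first switching point
            return camera_params(next_view)
        if next_view is None:              # t lies after the last switching point
            return camera_params(prev_view)
        if t <= prev_view.stay_time:       # still holding the earlier view direction
            return camera_params(prev_view)
        if t >= next_view.switch_time:     # exactly at (or past) the next switching point
            return camera_params(next_view)
        # Transition window between the stay time point and the next switching point.
        return interpolate_params(prev_view, next_view, t)  # defined in the interpolation sketch below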
In a preferred embodiment of the present disclosure, the parameters of the camera include rotation information, position information, view cone height, and focal length of the camera;
wherein, interpolating between the view-direction camera parameters of the j-th view direction and the view-direction camera parameters of the (j+1)-th view direction to obtain the parameters of the camera at the time point t_i includes:
determining an interpolation parameter α according to the following formula:
α = (t_i - T_j^stay) / (T_(j+1) - T_j^stay);
performing spherical linear interpolation on the rotation information q_j of the camera in the j-th view direction and the rotation information q_(j+1) of the camera in the (j+1)-th view direction to obtain the rotation information q(t_i) of the camera at the time point t_i, i.e. q(t_i) = slerp(q_j, q_(j+1), α); the spherical linear interpolation is the spherical linear interpolation of quaternions;
determining, from the rotation information, position information and focal length of the cameras in the j-th and (j+1)-th view directions, the focus F_j of the camera in the j-th view direction and the focus F_(j+1) of the camera in the (j+1)-th view direction respectively, and performing linear interpolation on F_j and F_(j+1) according to the following formula to obtain the camera focus F(t_i) at the time point t_i:
F(t_i) = (1 - α)·F_j + α·F_(j+1);
determining the position information p(t_i) of the camera at the time point t_i according to the rotation information q(t_i) of the camera at the time point t_i, the focus F(t_i) and the camera focal length;
determining the view cone height h(t_i) of the camera at the time point t_i according to the following formula:
h(t_i) = (1 - α)·h_j + α·h_(j+1);
determining the focal length f(t_i) of the camera at the time point t_i according to the following formula:
f(t_i) = (1 - α)·f_j + α·f_(j+1);
wherein h_j is the view cone height of the camera in the j-th view direction, h_(j+1) is the view cone height of the camera in the (j+1)-th view direction, f_j is the focal length of the camera in the j-th view direction, and f_(j+1) is the focal length of the camera in the (j+1)-th view direction.
In combination with the above: if t_i ≤ T_j or t_i ≤ T_j^stay, the view-direction camera parameters of the j-th view direction are read from the view-direction information V_j of the j-th view direction; if t_i ≥ T_(j+1) (in either judgment), the view-direction camera parameters of the (j+1)-th view direction are read from the view-direction information V_(j+1) of the (j+1)-th view direction; if T_j^stay < t_i < T_(j+1), the parameters of the camera at the time point t_i are obtained by interpolating between the view-direction camera parameters of the j-th view direction and those of the (j+1)-th view direction.
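The interpolation just described can be realized as in the sketch below, under stated assumptions: quaternions are stored as (w, x, y, z), the camera looks along its local -Z axis, each view's focus is taken as its position advanced by its focal length along its own viewing direction, and the interpolated focal length is used when recovering the position from the interpolated focus. None of these conventions are fixed by the disclosure; they only make the sketch concrete.

    import math

    def slerp(q0, q1, alpha):
        """Spherical linear interpolation of two unit quaternions (w, x, y, z)."""
        dot = sum(a * b for a, b in zip(q0, q1))
        if dot < 0.0:                      # take the shorter arc
            q1, dot = tuple(-c for c in q1), -dot
        if dot > 0.9995:                   # nearly parallel: fall back to normalized lerp
            q = tuple((1 - alpha) * a + alpha * b for a, b in zip(q0, q1))
            n = math.sqrt(sum(c * c for c in q))
            return tuple(c / n for c in q)
        theta = math.acos(dot)
        s0 = math.sin((1 - alpha) * theta) / math.sin(theta)
        s1 = math.sin(alpha * theta) / math.sin(theta)
        return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

    def forward_vector(q):
        """Camera viewing direction for quaternion (w, x, y, z), assuming -Z forward."""
        w, x, y, z = q
        return (-(2 * (x * z + w * y)),
                -(2 * (y * z - w * x)),
                -(1 - 2 * (x * x + y * y)))

    def interpolate_params(v0, v1, t):
        """Blend camera parameters between view v0 and view v1 for frame time t."""
        alpha = (t - v0.stay_time) / (v1.switch_time - v0.stay_time)
        rot = slerp(v0.rotation, v1.rotation, alpha)
        # Focus of each view: its position advanced along its own viewing direction
        # by its focal length, then linearly interpolated.
        f0 = tuple(p + v0.focal_length * d
                   for p, d in zip(v0.position, forward_vector(v0.rotation)))
        f1 = tuple(p + v1.focal_length * d
                   for p, d in zip(v1.position, forward_vector(v1.rotation)))
        focus = tuple((1 - alpha) * a + alpha * b for a, b in zip(f0, f1))
        frustum_height = (1 - alpha) * v0.frustum_height + alpha * v1.frustum_height
        focal_length = (1 - alpha) * v0.focal_length + alpha * v1.focal_length
        # Position: step back from the interpolated focus along the new viewing direction.
        pos = tuple(c - focal_length * d for c, d in zip(focus, forward_vector(rot)))
        return rot, pos, frustum_height, focal_length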
And synchronously setting the view direction camera parameters to the cameras in the current scene according to the types of the cameras in the current scene. Based on the type of the camera in the current scene, the parameters of the camera in the current scene are synchronously set or synchronously modified according to the parameters of the camera in the view direction recorded in the view direction, the simulation animation and the view direction animation are operated in parallel, and the view direction animation can be played while the simulation animation is played, so that the dynamic view simulation process is realized.
In an optional embodiment of the present disclosure, the types of cameras in the current scene include orthographic cameras and perspective cameras. To facilitate reading and setting the camera parameters, the view-direction camera parameters of each view direction obtained in step S102 are read under the same type of camera; that is, the current-scene camera that is read when the view-direction camera parameters are obtained in step S102 and the current-scene camera on which the view-direction camera parameters are set in step S104 are of the same type, either both orthographic cameras or both perspective cameras;
if the camera in the current scene at reading time and the camera in the current scene at setting time are not of the same type, the view-direction camera parameters need to be converted, so that after the converted view-direction camera parameters are set on the other type of camera, the camera still reaches the position and viewing angle to which the user adjusted it.
According to the type of the camera in the current scene, synchronously setting the parameters of the view camera to the camera in the current scene, including:
synchronously setting the rotation information and the position information in the view-direction camera parameters to the camera in the current scene: if t_i ≤ T_j or t_i ≤ T_j^stay, the rotation information q_j and position information p_j in the view-direction camera parameters of the j-th view direction are synchronously set to the camera in the current scene; if t_i ≥ T_(j+1), the rotation information q_(j+1) and position information p_(j+1) in the view-direction camera parameters of the (j+1)-th view direction are synchronously set to the camera in the current scene; if T_j^stay < t_i < T_(j+1), the rotation information q(t_i) and position information p(t_i) obtained by interpolating between the view-direction camera parameters of the j-th and (j+1)-th view directions are synchronously set to the camera in the current scene;
if the type of the camera in the current scene is an orthographic camera, synchronously setting the view cone height in the view-direction camera parameters to the camera in the current scene;
if the type of the camera in the current scene is a perspective camera, synchronously setting the focal length in the view-direction camera parameters to the camera in the current scene.
By synchronously setting the view-direction camera parameters to the camera in the current scene, the camera in the current scene can be synchronously set or modified during simulation according to the recorded view-direction camera parameters to obtain the view-direction animation; the existing simulation animation and the added view-direction animation run in parallel, and the view-direction animation is played while the simulation animation is played, so that the simulation process can be viewed dynamically.
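Applying the selected or interpolated parameters to the current-scene camera then depends only on the camera type; a minimal sketch, with the projection flag and attribute names assumed for illustration:

    def apply_to_scene_camera(cam, rotation, position, frustum_height, focal_length):
        """Synchronously set the view-direction camera parameters on the current-scene camera."""
        cam.rotation_quaternion = rotation
        cam.position = position
        if cam.projection == "orthographic":      # assumed type flag
            cam.frustum_height = frustum_height   # the view cone height drives an orthographic camera
        else:                                     # perspective camera
            cam.focal_length = focal_length

In the frame-by-frame update of step S104 this could be invoked as apply_to_scene_camera(scene.active_camera, *select_camera_params(view_list, t_i)), again with the scene accessor assumed.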
From the above description, it can be seen that the present disclosure achieves the following technical effects:
the method and the device for controlling the camera in the current scene operate the camera in the current scene without adding a new camera, directly record parameters of the current camera, and are convenient to operate;
with the present disclosure, the user only needs to select the time point at which the view direction is switched and adjust the camera, and the parameters of the camera in the current scene are read automatically; the operation is simple, the user does not need to set camera parameters manually, and the problems of the cumbersome adding process and inconvenient operation of view-direction animations in the related art are solved;
with the present disclosure, the parameters of the camera in the current scene are synchronously set or modified according to the view-direction camera parameters recorded in each view direction; the simulation animation and the view-direction animation run in parallel, and the view-direction animation can be played while the simulation animation is played, so that the simulation process can be viewed dynamically.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer executable instructions, and that although a logical order is illustrated in the flowcharts, in some cases the steps illustrated or described may be performed in an order other than that illustrated herein.
The embodiment of the present disclosure further provides a device for generating a visual animation for implementing the method for generating a visual animation, as shown in fig. 2, where the device 20 for generating a visual animation includes:
an obtaining unit 21, configured to obtain a view direction switching time point selected by a user in a simulation animation, where the simulation animation is an existing animation of a simulated production line in a production line simulation, and the view direction switching time point is a time point for switching view directions;
a reading unit 22, configured to read parameters of the camera in the current scene after the user adjusts the camera based on the view direction switching time point, to obtain view direction information, where the view direction information includes view direction camera parameters;
an adding unit 23 for adding the view direction to the simulation animation according to the view direction information; and
and the setting unit 24 is used for updating the simulation animation added with the view from frame to frame, and setting the parameters of the view camera to the cameras in the current scene to obtain the view animation.
The specific manner in which the units of the above embodiments of the apparatus perform their operations has been described in detail in relation to the embodiments of the method and is not described in detail here.
The disclosed embodiments also provide an electronic device, as shown in fig. 3, which includes one or more processors 31 and a memory 32, and in fig. 3, one processor 31 is taken as an example.
The controller may further include: an input device 33 and an output device 34.
The processor 31, the memory 32, the input device 33 and the output device 34 may be connected by a bus or otherwise, in fig. 3 by way of example.
The processor 31 may be a central processing unit (CPU); the processor 31 may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or a combination of the above types of chips; the general-purpose processor may be a microprocessor or any conventional processor.
The memory 32 serves as a non-transitory computer readable storage medium that may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as program instructions/modules corresponding to the control methods in embodiments of the present disclosure. The processor 31 executes various functional applications of the server and data processing, that is, implements the method of generating a view-oriented animation of the above-described method embodiment, by running non-transitory software programs, instructions, and modules stored in the memory 32.
The memory 32 may include a storage program area that may store an operating system, at least one application program required for functions, and a storage data area; the storage data area may store data created according to the use of a processing device operated by the server, or the like. In addition, the memory 32 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, memory 32 may optionally include memory located remotely from processor 31, which may be connected to a network connection device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input means 33 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the processing means of the server. The output device 34 may include a display device such as a display screen.
One or more modules are stored in the memory 32 that, when executed by the one or more processors 31, perform the method shown in fig. 1.
Those skilled in the art will appreciate that all or part of the above-described embodiment methods may be implemented by a computer program instructing the relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, may include the embodiments of the above-described method for generating a view-direction animation. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk drive (HDD) or a solid-state drive (SSD); the storage medium may also include a combination of the above types of memory.
Although embodiments of the present disclosure have been described with reference to the accompanying drawings, various modifications and variations may be made by those skilled in the art without departing from the spirit and scope of the disclosure, and such modifications and variations fall within the scope as defined by the appended claims.

Claims (5)

1. A method for generating a visual animation, comprising:
acquiring a view direction switching time point selected by a user in a simulation animation, wherein the simulation animation is an existing animation of a simulation production line in production line simulation, and the view direction switching time point is a time point for switching view directions;
reading parameters of a camera in the current scene after the user adjusts the camera based on the view direction switching time point to obtain view direction information, wherein the view direction information comprises view direction camera parameters;
adding the view direction into the simulation animation according to the view direction information; and
updating the simulation animation with the added view directions frame by frame, and setting the view-direction camera parameters to the camera in the current scene to obtain the view-direction animation;
wherein the adding the view direction to the simulation animation according to the view direction information comprises the following steps:
adding all the view information into a view information list of the simulation animation according to the sequence of the view switching time points in each view information;
when adding the view direction information of the current view direction, adding the view direction information of the current view direction into the view direction information list according to a time sequence, so that the added view direction information list is arranged according to the time sequence;
the step of updating frame by frame to add the simulation animation after the view direction, setting the view direction camera parameter to a camera in the current scene to obtain the view direction animation, comprising the following steps:
updating the first in the simulation processWhen frame animation, determine +.>Time point corresponding to frame animation->
Searching and time point from view direction information listAdjacent view direction switchingA time point;
determining a time pointAnd>obtaining a first judgment result according to the size relation between adjacent view-direction switching time points, and determining view-direction camera parameters according to the first judgment result, wherein the view-direction camera parameters comprise rotation information, position information, view cone height and focal length of a view-direction middle camera;
synchronously setting the parameters of the view camera to the cameras in the current scene according to the types of the cameras in the current scene;
wherein the view switching time points adjacent to the time point t_i comprise the view switching time point T_j of the j-th view direction and the view switching time point T_(j+1) of the (j+1)-th view direction, wherein T_j ≤ t_i ≤ T_(j+1);
the judging the magnitude relation between the time point t_i and the adjacent view switching time points to obtain a first judgment result, and determining the view-direction camera parameters according to the first judgment result, comprises:
if t_i ≤ T_j, reading the view-direction information V_j of the j-th view direction to obtain the view-direction camera parameters of the j-th view direction;
if t_i ≥ T_(j+1), reading the view-direction information V_(j+1) of the (j+1)-th view direction to obtain the view-direction camera parameters of the (j+1)-th view direction;
if T_j < t_i < T_(j+1), reading the view-direction stay time point T_j^stay from the view-direction information V_j of the j-th view direction, judging the magnitude relation between the time point t_i and the view-direction stay time point T_j^stay of the j-th view direction to obtain a second judgment result, and determining the view-direction camera parameters according to the second judgment result;
wherein the judging the magnitude relation between the time point t_i and the view-direction stay time point T_j^stay of the j-th view direction to obtain a second judgment result, and determining the view-direction camera parameters according to the second judgment result, comprises:
if t_i ≤ T_j^stay, reading the view-direction information V_j of the j-th view direction to obtain the view-direction camera parameters of the j-th view direction;
if t_i ≥ T_(j+1), reading the view-direction information V_(j+1) of the (j+1)-th view direction to obtain the view-direction camera parameters of the (j+1)-th view direction;
if T_j^stay < t_i < T_(j+1), interpolating between the view-direction camera parameters of the j-th view direction and the view-direction camera parameters of the (j+1)-th view direction to obtain the parameters of the camera at the time point t_i;
the parameters of the camera comprise rotation information, position information, view cone height and focal length of the camera;
the interpolating between the view-direction camera parameters of the j-th view direction and the view-direction camera parameters of the (j+1)-th view direction to obtain the parameters of the camera at the time point t_i comprises:
determining an interpolation parameter α according to the following formula:
α = (t_i - T_j^stay) / (T_(j+1) - T_j^stay);
performing spherical linear interpolation on the rotation information q_j of the camera in the j-th view direction and the rotation information q_(j+1) of the camera in the (j+1)-th view direction to obtain the rotation information q(t_i) of the camera at the time point t_i:
q(t_i) = slerp(q_j, q_(j+1), α);
determining, from the rotation information, position information and focal length of the cameras in the j-th and (j+1)-th view directions, the focus F_j of the camera in the j-th view direction and the focus F_(j+1) of the camera in the (j+1)-th view direction respectively, and performing linear interpolation on F_j and F_(j+1) according to the following formula to obtain the camera focus F(t_i) at the time point t_i:
F(t_i) = (1 - α)·F_j + α·F_(j+1);
determining the position information p(t_i) of the camera at the time point t_i according to the rotation information q(t_i) of the camera at the time point t_i, the focus F(t_i) and the camera focal length;
determining the view cone height h(t_i) of the camera at the time point t_i according to the following formula:
h(t_i) = (1 - α)·h_j + α·h_(j+1);
determining the focal length f(t_i) of the camera at the time point t_i according to the following formula:
f(t_i) = (1 - α)·f_j + α·f_(j+1);
wherein h_j is the view cone height of the camera in the j-th view direction, h_(j+1) is the view cone height of the camera in the (j+1)-th view direction, f_j is the focal length of the camera in the j-th view direction, and f_(j+1) is the focal length of the camera in the (j+1)-th view direction.
2. The method according to claim 1, wherein:
the view direction information further comprises a view direction switching time point, a switching animation and a view direction stay time point, wherein the view direction stay time point is a time point when the view direction stays;
based on the view direction switching time point, reading parameters of a camera in a current scene after a user adjusts the camera to obtain view direction information, including:
determining the view stay time point according to the simulation animation time, the view switching time point and the preset inter-view switching time;
and after the user adjusts the position or the visual angle of the camera, reading the parameters of the camera in the current scene to obtain the parameters of the visual direction camera.
3. The method of claim 2, wherein the determining the view dwell time point according to the simulated animation time period, the view switch time point, and a preset inter-view switch time period comprises:
judging whether the current view direction is the first view direction or not;
if it is the first view direction, determining the view-direction stay time point of the first view direction according to the following formula:
T_1^stay = T;
if it is not the first view direction, updating the view-direction stay time point of the previous view direction according to the following formula:
T_(j-1)^stay = T_j - Δt,
wherein T is the simulation animation duration, Δt is the inter-view switching duration, T_j is the view switching time point of the current view direction, and T_(j-1) is the view switching time point of the previous view direction.
4. The method according to claim 1, wherein:
the types of cameras in the current scene comprise orthographic cameras and perspective cameras;
the step of synchronously setting the view direction camera parameters to the cameras in the current scene according to the types of the cameras in the current scene comprises the following steps:
synchronously setting the rotation information and the position information in the view camera parameters to a camera in the current scene;
if the type of the camera in the current scene is an orthographic camera, synchronously setting the view cone height in the view-direction camera parameters to the camera in the current scene;
and if the type of the camera in the current scene is a perspective camera, synchronously setting the focal length in the view direction camera parameters to the camera in the current scene.
5. A view-direction animation generating apparatus, comprising:
the system comprises an acquisition unit, a display unit and a display unit, wherein the acquisition unit is used for acquiring a view direction switching time point selected by a user in a simulation animation, wherein the simulation animation is an existing animation of a simulation production line in production line simulation, and the view direction switching time point is a time point for switching view directions;
the reading unit is used for reading parameters of the camera in the current scene after the user adjusts the camera based on the view direction switching time point to obtain view direction information, wherein the view direction information comprises view direction camera parameters;
the adding unit is used for adding the view direction into the simulation animation according to the view direction information; and
the setting unit is used for updating the simulation animation with the added view directions frame by frame, and setting the view-direction camera parameters to the camera in the current scene to obtain the view-direction animation;
wherein the adding the view direction to the simulation animation according to the view direction information comprises the following steps:
adding all the view information into a view information list of the simulation animation according to the sequence of the view switching time points in each view information;
when adding the view direction information of the current view direction, adding the view direction information of the current view direction into the view direction information list according to a time sequence, so that the added view direction information list is arranged according to the time sequence;
the step of updating frame by frame to add the simulation animation after the view direction, setting the view direction camera parameter to a camera in the current scene to obtain the view direction animation, comprising the following steps:
updating the first in the simulation processWhen frame animation, determine +.>Time point corresponding to frame animation->
Searching and time point from view direction information listAdjacent view-direction switching time points;
determining a time pointAnd>obtaining a first judgment result according to the size relation between adjacent view-direction switching time points, and determining view-direction camera parameters according to the first judgment result, wherein the view-direction camera parameters comprise rotation information, position information, view cone height and focal length of a view-direction middle camera;
synchronously setting the parameters of the view camera to the cameras in the current scene according to the types of the cameras in the current scene;
wherein the view switching time points adjacent to the time point t_i comprise the view switching time point T_j of the j-th view direction and the view switching time point T_(j+1) of the (j+1)-th view direction, wherein T_j ≤ t_i ≤ T_(j+1);
the judging the magnitude relation between the time point t_i and the adjacent view switching time points to obtain a first judgment result, and determining the view-direction camera parameters according to the first judgment result, comprises:
if t_i ≤ T_j, reading the view-direction information V_j of the j-th view direction to obtain the view-direction camera parameters of the j-th view direction;
if t_i ≥ T_(j+1), reading the view-direction information V_(j+1) of the (j+1)-th view direction to obtain the view-direction camera parameters of the (j+1)-th view direction;
if T_j < t_i < T_(j+1), reading the view-direction stay time point T_j^stay from the view-direction information V_j of the j-th view direction, judging the magnitude relation between the time point t_i and the view-direction stay time point T_j^stay of the j-th view direction to obtain a second judgment result, and determining the view-direction camera parameters according to the second judgment result;
wherein the judging the magnitude relation between the time point t_i and the view-direction stay time point T_j^stay of the j-th view direction to obtain a second judgment result, and determining the view-direction camera parameters according to the second judgment result, comprises:
if t_i ≤ T_j^stay, reading the view-direction information V_j of the j-th view direction to obtain the view-direction camera parameters of the j-th view direction;
if t_i ≥ T_(j+1), reading the view-direction information V_(j+1) of the (j+1)-th view direction to obtain the view-direction camera parameters of the (j+1)-th view direction;
if T_j^stay < t_i < T_(j+1), interpolating between the view-direction camera parameters of the j-th view direction and the view-direction camera parameters of the (j+1)-th view direction to obtain the parameters of the camera at the time point t_i;
the parameters of the camera comprise the rotation information, position information, view-cone height and focal length of the camera;
the interpolating according to the view-direction camera parameters of the n-th view direction and the view-direction camera parameters of the (n+1)-th view direction to obtain the parameters of the camera at the time point t_i comprises the following steps:
determining an interpolation parameter k according to the following formula:
k = (t_i - T_n^s) / (T_{n+1} - T_n^s);
performing spherical linear interpolation between the rotation information of the camera in the n-th view direction and the rotation information of the camera in the (n+1)-th view direction to obtain the rotation information q_t of the camera at the time point t_i;
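Spherical linear interpolation of the rotation information is the standard quaternion slerp; a generic version is sketched below, assuming unit quaternions stored as (w, x, y, z).

```python
import math

def slerp(q0, q1, k):
    """Spherical linear interpolation between unit quaternions q0 and q1, 0 <= k <= 1."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                          # flip one quaternion to take the shorter arc
        q1 = [-c for c in q1]
        dot = -dot
    if dot > 0.9995:                       # nearly identical: linear blend, then renormalise
        out = [(1 - k) * a + k * b for a, b in zip(q0, q1)]
        norm = math.sqrt(sum(c * c for c in out))
        return [c / norm for c in out]
    theta = math.acos(dot)                 # angle between the two rotations
    s0 = math.sin((1 - k) * theta) / math.sin(theta)
    s1 = math.sin(k * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(q0, q1)]
```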
determining, according to the rotation information, position information and focal length of the camera in the n-th view direction and of the camera in the (n+1)-th view direction, the focus F_n of the camera in the n-th view direction and the focus F_{n+1} of the camera in the (n+1)-th view direction, respectively, and performing linear interpolation between the focus F_n and the focus F_{n+1} according to the following formula to obtain the focus F_t of the camera at the time point t_i:
F_t = (1 - k) · F_n + k · F_{n+1};
determining the position information P_t of the camera at the time point t_i according to the rotation information q_t of the camera at the time point t_i, the focus F_t and the focal length of the camera in the corresponding view direction;
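The surviving text does not spell out how the focus and position are related, so the sketch below assumes the common convention that the focus lies one focal length in front of the camera along its viewing direction: each view's focus is derived from its position, rotation and focal length, and the interpolated position is then recovered by stepping back from the interpolated focus. The -Z forward axis is likewise an assumption.

```python
def forward_vector(q):
    """Viewing direction of a camera with rotation quaternion q = (w, x, y, z),
    assuming the camera looks along its local -Z axis."""
    w, x, y, z = q
    return (-2 * (x * z + w * y),
            -2 * (y * z - w * x),
            -(1 - 2 * (x * x + y * y)))

def focus_of(position, q, focal_length):
    """Focus point assumed to lie one focal length in front of the camera."""
    d = forward_vector(q)
    return tuple(p + focal_length * c for p, c in zip(position, d))

def position_from_focus(focus, q, focal_length):
    """Recover the camera position by stepping back from the focus along the view direction."""
    d = forward_vector(q)
    return tuple(f - focal_length * c for f, c in zip(focus, d))
```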
determining the view-cone height h_t of the camera at the time point t_i according to the following formula:
h_t = (1 - k) · h_n + k · h_{n+1};
determining the focal length f_t of the camera at the time point t_i according to the following formula:
f_t = (1 - k) · f_n + k · f_{n+1};
wherein h_n is the view-cone height of the camera in the n-th view direction, h_{n+1} is the view-cone height of the camera in the (n+1)-th view direction, f_n is the focal length of the camera in the n-th view direction, and f_{n+1} is the focal length of the camera in the (n+1)-th view direction.
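Assuming the linear blends reconstructed above for the view-cone height and focal length, the per-frame parameters can be assembled as in the sketch below (this is the interpolate_params referenced earlier). The original text leaves ambiguous which focal length the position step uses; this sketch uses the interpolated one.

```python
def lerp(a, b, k):
    return (1 - k) * a + k * b

def interpolate_params(pn, pm, k):
    """Blend the camera parameters of the n-th and (n+1)-th view directions at parameter k.
    The dictionary keys are illustrative: rotation, position, cone_height, focal_length."""
    q_t = slerp(pn["rotation"], pm["rotation"], k)
    focus_n = focus_of(pn["position"], pn["rotation"], pn["focal_length"])
    focus_m = focus_of(pm["position"], pm["rotation"], pm["focal_length"])
    focus_t = tuple(lerp(a, b, k) for a, b in zip(focus_n, focus_m))      # F_t
    f_t = lerp(pn["focal_length"], pm["focal_length"], k)                 # assumed linear blend
    h_t = lerp(pn["cone_height"], pm["cone_height"], k)                   # assumed linear blend
    p_t = position_from_focus(focus_t, q_t, f_t)                          # P_t
    return {"rotation": q_t, "position": p_t,
            "cone_height": h_t, "focal_length": f_t}
```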
CN202311351348.0A 2023-10-18 2023-10-18 Method and device for generating visual animation Active CN117078805B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202311351348.0A CN117078805B (en) 2023-10-18 2023-10-18 Method and device for generating visual animation
CN202311578771.4A CN117409117A (en) 2023-10-18 2023-10-18 Method and device for generating visual animation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311351348.0A CN117078805B (en) 2023-10-18 2023-10-18 Method and device for generating visual animation

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202311578771.4A Division CN117409117A (en) 2023-10-18 2023-10-18 Method and device for generating visual animation

Publications (2)

Publication Number Publication Date
CN117078805A CN117078805A (en) 2023-11-17
CN117078805B true CN117078805B (en) 2023-12-15

Family

ID=88715784

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202311351348.0A Active CN117078805B (en) 2023-10-18 2023-10-18 Method and device for generating visual animation
CN202311578771.4A Pending CN117409117A (en) 2023-10-18 2023-10-18 Method and device for generating visual animation

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202311578771.4A Pending CN117409117A (en) 2023-10-18 2023-10-18 Method and device for generating visual animation

Country Status (1)

Country Link
CN (2) CN117078805B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6064393A (en) * 1995-08-04 2000-05-16 Microsoft Corporation Method for measuring the fidelity of warped image layer approximations in a real-time graphics rendering pipeline
JP2013134596A (en) * 2011-12-26 2013-07-08 Brother Ind Ltd Information processing device, frame image changeover method, and program
CN103314570A (en) * 2010-11-12 2013-09-18 三星电子株式会社 Method and apparatus for video stabilization by compensating for view direction of camera
US8674998B1 (en) * 2008-08-29 2014-03-18 Lucasfilm Entertainment Company Ltd. Snapshot keyframing
CN105142017A (en) * 2015-08-12 2015-12-09 北京金山安全软件有限公司 Picture switching method and picture switching device during picture video playing
CN111298433A (en) * 2020-02-10 2020-06-19 腾讯科技(深圳)有限公司 Animation video processing method and device, electronic equipment and storage medium
CN112740710A (en) * 2018-07-27 2021-04-30 瑞典爱立信有限公司 System and method for inserting advertising content in 360 degree immersive video
CN112789854A (en) * 2018-10-01 2021-05-11 瑞典爱立信有限公司 System and method for providing quality control in 360 ° immersive video during pauses

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10089759B2 (en) * 2016-02-01 2018-10-02 Adobe Systems Incorporated Homography-assisted perspective drawing

Also Published As

Publication number Publication date
CN117409117A (en) 2024-01-16
CN117078805A (en) 2023-11-17

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant