CN112827172B - Shooting method, shooting device, electronic equipment and storage medium - Google Patents
- Publication number: CN112827172B (application number CN202011626634.XA)
- Authority: CN (China)
- Prior art keywords: shooting, animation data, historical, target, current
- Prior art date
- Legal status: Active (an assumption by Google Patents, not a legal conclusion)
Classifications
- A63F13/525—Video games: controlling the output signals based on the game progress, involving aspects of the displayed game scene; changing parameters of virtual cameras
- A63F13/79—Video games: game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
- A63F13/85—Video games: providing additional services to players
- H04N23/62—Cameras or camera modules comprising electronic image sensors: control of parameters via user interfaces
Abstract
Embodiments of the invention disclose a shooting method, a shooting apparatus, an electronic device, and a storage medium. The shooting method comprises the following steps: acquiring current animation data corresponding to a current user at the current time; determining a target shooting parameter corresponding to the current animation data based on historical shooting animation data and the historical shooting parameters corresponding to that data; and shooting at least one target object in the current animation data based on the target shooting parameters. In this technical scheme, the target shooting parameters used to shoot the current animation data are determined from the historical shooting parameters of historical shooting animation data, and a target image including the target object is shot based on those parameters. The current animation data can therefore be shot efficiently, pictures likely to interest the user can be shot automatically, the user's highlight moments can be recorded in time, personalized user requirements can be met, and user experience is improved.
Description
Technical Field
Embodiments of the invention relate to the technical field of game development, and in particular to a shooting method, a shooting apparatus, an electronic device, and a storage medium.
Background
To record animations such as story lines and player interactions in online games, existing games provide functions for video recording or photo shooting of the animation. When a game player wants to record an animation or a particular frame, a shooting button can be triggered to do so.
Currently, if a game player needs to shoot a game picture during play, the shooting is usually performed manually by the player. With such manual screen capture, various key pictures cannot be obtained in time because the player captures the screen too late or forgets to capture it at all, such as an interaction picture with a Non-Player Character (NPC), a fight picture with a boss, or a special-effect picture when releasing a skill in combat; pictures that appear only momentarily in the game are difficult for the player to obtain. Meanwhile, manual screen capture may also be mistimed because of network delay or stuttering of the player's device, causing a suitable game picture to be missed.
Disclosure of Invention
Embodiments of the invention provide a shooting method, a shooting apparatus, an electronic device, and a storage medium, so as to realize automatic shooting of current animation data.
In a first aspect, an embodiment of the present invention provides a photographing method, including:
acquiring current animation data corresponding to a current user at a current time;
determining a target shooting parameter corresponding to the current animation data based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data;
and shooting at least one target object in the current animation data based on the target shooting parameters.
In a second aspect, an embodiment of the present invention further provides a photographing apparatus, including:
the animation data acquisition module is used for acquiring current animation data corresponding to a current user at the current time;
a shooting parameter determining module, configured to determine a target shooting parameter corresponding to the current animation data based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data;
and the shooting module is used for shooting at least one target object in the current animation data based on the target shooting parameters.
In a third aspect, an embodiment of the present invention further provides an electronic device, including:
one or more processors;
a storage means for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the photographing method according to any one of the embodiments of the present invention.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor, implements a photographing method according to any of the embodiments of the present invention.
In the technical scheme of the embodiments of the invention, by determining historical shooting animation data and the historical shooting parameters corresponding to it, the target shooting parameters for shooting the current animation data at the current moment can be determined, and a target image including the target object can then be shot based on those parameters. This solves the technical problem in the prior art that corresponding pictures cannot be shot automatically, which results in poor user experience, and achieves the technical effect of shooting corresponding pictures automatically. When corresponding animation data is to be shot, the shooting parameters matched to that animation data are retrieved and used, which improves the degree of matching between the shot image and the user, improves the shooting effect and shooting flexibility, and greatly improves user experience.
Drawings
In order to more clearly illustrate the technical solutions of the exemplary embodiments of the present invention, the drawings needed to describe the embodiments are briefly introduced below. Obviously, the drawings described below cover only some of the embodiments of the invention, not all of them, and a person skilled in the art can derive other drawings from these drawings without inventive effort.
Fig. 1 is a flowchart of a photographing method according to an embodiment of the present invention;
Fig. 2 is a flowchart of a photographing method according to a second embodiment of the present invention;
fig. 3 is a flowchart of a photographing method according to a third embodiment of the present invention;
Fig. 4 is a schematic structural diagram of a photographing device according to a third embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
Before discussing the exemplary embodiments in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart depicts operations (or steps) as a sequential process, many of the operations can be performed in parallel or concurrently, and the order of the operations may be rearranged. A process may be terminated when its operations are completed, but may also have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
Before introducing the embodiments of the invention, an application scene is introduced. The shooting method of the embodiments of the invention is suitable for scenes in which animation data in multimedia resources is shot automatically, and in particular for automatically shooting animation data in game animation. For ease of understanding, the shooting method is described in the embodiments of the invention by taking a game scene as the example application scene. Animation data in a game animation may include environments, characters, monsters, and the like.
Example One
Fig. 1 is a flowchart of a photographing method according to an embodiment of the present invention. The method may be performed by a photographing apparatus, and the apparatus may be configured in a terminal or a server; the terminal and the server may perform the photographing method of the embodiment independently or cooperatively.
As shown in fig. 1, the photographing method in this embodiment may specifically include:
s110, acquiring current animation data corresponding to a current user at a current time.
It should be noted that an animation is often composed of many frames of images. An animation usually has a theme and a story line, and as the story line progresses, the content of each frame changes. During playback, the user continuously sees new animation data, and the playing progress of the same animation can differ between users. Therefore, in the embodiment of the invention, the current animation data corresponding to the current user at the current time is acquired.
The current animation data may be the data corresponding to the animation frame currently being played. Taking a game animation as an example, the current animation data may be the data of a scene picture containing one or more objects such as a person, a monster, weather, a tree, or a building. In an animation, each of these objects is generally realized through an information group corresponding to that object. For example, the information group corresponding to each object in the current animation data may be rendered based on a preset viewing angle of the game player to obtain the current animation frame. That is, the current animation data is determined from the information groups of the objects contained in the current animation picture.
Specifically, each video frame in the game process can be acquired, the target video frame corresponding to the terminal device at the current moment can be determined, the scene picture data contained in the target video frame, such as characters, monsters, and weather, can be determined, and the determined data can be used as the current animation data.
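As a minimal illustrative sketch of step S110, the "current animation data" described above can be modeled as the collection of per-object information groups visible in the current frame. All names here (AnimObject, get_current_animation_data) are hypothetical and for illustration only; they are not taken from the patent.

```python
# Hypothetical sketch: represent current animation data as the set of object
# information groups contained in the currently played frame.
from dataclasses import dataclass, field

@dataclass
class AnimObject:
    name: str          # e.g. "player", "monster", "church"
    obj_type: str      # e.g. "character", "monster", "building", "weather"
    position: tuple    # world-space coordinates of the object
    extra: dict = field(default_factory=dict)  # other per-object attributes

def get_current_animation_data(frame_objects):
    """Collect the information group of every object in the current frame."""
    return {obj.name: obj for obj in frame_objects}

frame = [
    AnimObject("player", "character", (0.0, 0.0, 0.0)),
    AnimObject("monster", "monster", (3.0, 0.0, 1.0)),
    AnimObject("church", "building", (10.0, 0.0, -5.0)),
]
current = get_current_animation_data(frame)
print(sorted(current))  # ['church', 'monster', 'player']
```

A real engine would fill these information groups from the renderer; the point is only that the frame's scene-picture data becomes a queryable structure.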
S120, determining target shooting parameters corresponding to the current animation data based on the historical shooting animation data and the historical shooting parameters corresponding to the historical shooting animation data.
Historical shooting animation data can be understood as animation data shot before the current moment, that is, data captured when the current terminal or another terminal previously triggered shooting. In other words, historical shooting animation data is animation data that has already been shot, and it may or may not include the current animation data. For example, the historical shooting animation data may include a historical user's shooting data of a preset animation event, while the current animation data may be video-frame data of the current user playing that preset animation event; the current animation data may also be video-frame data other than the historical shooting animation data. The content contained in the historical animation data may be the same as or different from the content of the current animation data explained above; the difference arises because different users have different preferences for pictures, so the specific content of the historical shooting animation data differs accordingly. A historical user is a user for whom animation history shooting data exists, and may or may not include the current user; the number of historical users may be one or more. Historical shooting parameters can be understood as the camera shooting parameters, camera attribute parameters, and/or scene parameters used when the historical animation data was shot.
The camera shooting parameters are the parameters used when shooting the historical animation data and may include, but are not limited to, at least one of the following: the shooting angle of the historical shooting animation data, the shooting position of the camera at the time of shooting, the wide-angle parameter of the camera at the time of shooting, and the like. The camera attribute parameters are the specific parameters used during shooting, for example shutter, aperture, field of view, exposure, and whether the flash is on, and may also include the shooting light and/or shooting angle corresponding to the current user. The scene parameters may be the light-source parameters of the current scene, such as the number of light sources and their intensity. The target shooting parameters can be understood as the camera shooting parameters, camera attribute parameters, and scene parameters used to shoot the current animation data. The target shooting parameters are the shooting parameters corresponding to the current animation data and are determined from the historical shooting parameters corresponding to the historical shooting animation data.
It should be noted that, for each piece of animation data, the shooting parameter may be one or more of a specific angle at which a certain picture is shot, a specific position at which a camera is placed, and a shooting wide angle.
In this embodiment, determining the target shooting parameters corresponding to the current animation data from the historical shooting animation data may proceed as follows: determine, among the historical shooting animation data, several pieces of historical animation data that are strongly associated with the current animation data, and retrieve the historical shooting parameters associated with those pieces; all of the retrieved historical shooting parameters serve as target shooting parameters of the current animation data, so that a target image including the target shooting object can be shot based on the determined parameters.
For example, a correspondence between the historical photographing animation data and the historical photographing parameters may be established, after the current animation data is acquired, a similarity between the current animation data and each of the historical photographing animation data may be determined, at least one of the historical photographing animation data associated with the current animation data may be determined according to the similarity, and the historical photographing parameters corresponding to the at least one of the historical photographing animation data may be retrieved according to the correspondence and used as the target photographing parameters.
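The retrieval step of S120 can be sketched as follows. This is an assumption about one possible implementation, not the patent's prescribed formula: animation data is reduced to a feature vector, cosine similarity selects sufficiently similar historical entries, and their stored parameters are reused. The feature vectors and parameter dictionaries below are illustrative.

```python
# Illustrative sketch of S120: reuse the shooting parameters of historical
# animation data whose features are similar enough to the current frame.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def target_parameters(current_features, history, threshold=0.9):
    """history: list of (feature_vector, shooting_params) pairs.
    Returns every parameter set whose animation data resembles the current one."""
    return [params for feats, params in history
            if cosine_similarity(current_features, feats) >= threshold]

history = [
    ((1.0, 0.0, 0.5), {"angle": 30, "aperture": 2.8}),  # battle-like frame
    ((0.0, 1.0, 0.1), {"angle": 75, "aperture": 8.0}),  # scenery frame
]
params = target_parameters((0.9, 0.1, 0.5), history)
print(params)  # only the battle-like parameter set matches
```

Because every matching historical entry contributes its parameters, the result may contain several parameter groups, which is consistent with shooting one target image per group as described later.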
S130, shooting at least one target object in the current animation data based on the target shooting parameters.
Before at least one target object is shot, shooting of at least one target object in the current animation data can be triggered based on the current user's animation history shooting data. This includes: determining, based on the animation history shooting data of the current user, whether the current animation data is target shooting animation data; and if so, triggering shooting of at least one target object in the current animation data based on the animation history shooting data of the current user.
For example, whether the current animation data is target shooting animation data may be determined based on a picture type of history shooting animation data among animation history shooting data of the current user.
Specifically, determining whether the current animation data is target shooting animation data based on the picture type of historical shooting animation data in the current user's animation history shooting data may include: determining the historical shot picture type of the historical shooting animation data in the current user's animation history shooting data; determining the current picture type of the current animation data; and if the historical shot picture type is the same as the current picture type, determining that the current animation data is target shooting animation data.
The picture type may be determined based on the scene type of the current animation data, for example a combat scene, a reload scene, or an upgrade scene. The picture type may also be determined based on the picture style, for example a fierce style, a beautiful style, or a refreshing style. The picture type may also be determined based on picture color: for example, the picture type may be determined from the number of colors in the picture and a preset color-count threshold, classifying the picture as colorful or concise; or the dominant tone of the picture may be determined from whether a color exceeds a predetermined proportion of the picture, and the picture type determined accordingly. It should be noted that the picture type may be determined in many ways, and the specific division may be set according to actual requirements; it is not specifically limited here.
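The color-based picture-type rule and the type-matching trigger above can be sketched together. This is a hedged toy version: real frames would be pixel arrays, and the dominance threshold, the type labels, and both function names are assumptions for illustration.

```python
# Hypothetical sketch of the colour-proportion rule: if one colour occupies
# more than a preset proportion of the frame, classify the picture by that
# dominant colour; a shot is triggered when the current picture type matches
# a type the user has shot before.
from collections import Counter

def picture_type(pixel_colors, dominance=0.5):
    """Return 'dominant:<color>' if one colour exceeds the threshold,
    otherwise 'gorgeous' (many colours, none dominant)."""
    counts = Counter(pixel_colors)
    color, n = counts.most_common(1)[0]
    return f"dominant:{color}" if n / len(pixel_colors) >= dominance else "gorgeous"

def should_trigger(current_pixels, historical_types):
    """Trigger when the current picture type appears in the history."""
    return picture_type(current_pixels) in historical_types

history_types = {"dominant:red"}             # e.g. past battle screenshots
battle_frame = ["red"] * 7 + ["black"] * 3   # mostly red pixels
print(should_trigger(battle_frame, history_types))  # True
```

Any of the other divisions mentioned above (scene type, picture style) would slot into `picture_type` in the same way; only the classification function changes.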
Alternatively, this operation may also be implemented by means of artificial intelligence. For example, a preset machine learning model may be trained based on the animation history shooting data of the current user to obtain an animation shooting prediction model, and shooting of at least one target object in the current animation data may be triggered based on the prediction result of the animation shooting prediction model for the current animation data.
Optionally, triggering shooting of at least one target object in the current animation data based on the animation history shooting data of the current user includes: determining whether the current animation data is target shooting animation data based on historical shooting data, within the animation history shooting data of at least one historical user, that corresponds to the at least one historical user and to the current animation data; and if so, triggering shooting of at least one target object in the current animation data based on the animation history shooting data of the current user.
It will be appreciated that historical shooting data that corresponds to at least one historical user and to the current animation data, for example historical animation data recorded when a historical user was at the same game level as the current user, tends to be similar to the current animation data. In a game animation, the animation data includes game characters, game scenes, and the like. A game scene will typically include fixed objects that do not change from one game player to another; this data can be treated as static data, i.e., the inherent object data in the animation data. Examples are inanimate objects such as buildings, plants, and small items in a game scene, or atmosphere data in the environment such as time, weather, wind conditions, and tide.
Optionally, the historical static data in the historical shooting animation data of at least one historical user and the current static data of the current animation data of the current user are acquired respectively, and the historical shooting data corresponding to the at least one historical user and to the current animation data is determined based on the similarity between the current static data and the historical static data. Specifically, if the similarity between the current static data and the historical static data is greater than a preset static-similarity threshold, the historical shooting animation data is determined to be historical shooting data corresponding to the at least one historical user and to the current animation data.
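One simple way to realize the static-data comparison just described (an assumption, since the patent does not fix a similarity measure) is to treat a scene's fixed objects as a set and compare sets with Jaccard similarity against the preset threshold:

```python
# Minimal sketch of the static-similarity test: scenes whose fixed objects
# overlap enough are considered to correspond to the same animation data.
def jaccard(a, b):
    """Set overlap: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def matches_current_scene(current_static, historical_static, threshold=0.6):
    return jaccard(current_static, historical_static) >= threshold

current_static = {"church", "grass", "mountain", "rain"}
same_level     = {"church", "grass", "mountain", "wind"}  # similar fixed scene
other_level    = {"arena", "sand"}                        # unrelated scene
print(matches_current_scene(current_static, same_level))   # True  (3/5 = 0.6)
print(matches_current_scene(current_static, other_level))  # False (0/6)
```

Atmosphere data such as weather or time of day could be folded into the same sets, since the text treats it as static data as well.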
Of course, the historical shooting data corresponding to the at least one historical user and to the current animation data may also be determined based on the historical shooting animation data of the at least one historical user, the picture type of the current animation data of the current user, and the like. For the specific implementation, refer to the explanation of the picture type above; it is not repeated here.
As previously described, the current animation data may include one or more objects such as a person, a monster, weather, a tree, or a building. The target object can be understood as a key object to be shot among the objects included in the current animation data. The number of target objects may be one, two, or more. Target objects include, but are not limited to, player-controlled characters, game monsters, game NPCs, and scene buildings. It should be noted that player-controlled characters include, but are not limited to, persons and animals; scene buildings include, but are not limited to, natural scenery such as mountains, sky, and grass, and actual buildings such as churches and arenas.
In one embodiment, the target object may be determined based on the picture type of the current animation data. An association between each picture type and its corresponding target objects may be stored in advance. After the picture type of the current animation data is identified, the target objects corresponding to the current animation data may be determined based on the pre-stored association. For example, when the current animation data is a battle scene, the at least two target objects to be shot include at least one player character and the game monster fighting each character, or include the player characters fighting each other. That is, at least two target objects corresponding to the current animation data are determined by identifying the picture type of the target player's current animation data.
In another embodiment, at least one of the target objects is the character controlled by the target player, and the remaining target objects may be the interactive objects of that character in the current animation data. Specifically, the target player's character in the current animation data can be detected in real time, so that when the character engages in interactive behavior, such as combat, each object interacting with it is determined to be one of the other target objects to be shot.
In another embodiment, the target object may also be determined based on the attribute information of each object. The attribute information of each object includes, but is not limited to, the object type, such as player character, monster, NPC, or building. Specifically, when the attribute information of the objects in the current picture data is detected to contain a preset object type, the objects that match the preset object type are selected from among the objects as target objects. The preset attribute information may be, for example, player character + monster or player character + building.
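The attribute-based rule in the last embodiment amounts to a filter over the frame's objects. The sketch below is illustrative; the object names and the preset combination (character + monster) are examples only, matching the presets mentioned above:

```python
# Illustrative filter for the attribute-based target-object rule: keep the
# objects whose attribute information matches a preset object-type set.
def select_targets(objects, preset_types):
    """objects: list of (name, obj_type) pairs; keep objects of a preset type."""
    return [name for name, obj_type in objects if obj_type in preset_types]

scene = [
    ("hero",     "character"),
    ("slime",    "monster"),
    ("church",   "building"),
    ("raindrop", "weather"),
]
targets = select_targets(scene, preset_types={"character", "monster"})
print(targets)  # ['hero', 'slime']
```

Swapping the preset to `{"character", "building"}` would realize the player character + building combination in the same way.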
Specifically, after the target object is determined, the target object may be photographed based on the target photographing parameter determined in S120, so as to obtain the target image.
If the determined target shooting parameters include multiple groups, the at least one target shooting object may be shot based on each group of shooting parameters; the number of target images obtained is then the same as the number of determined target shooting parameter groups.
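The multi-group case can be sketched as a loop producing one image per parameter group. `capture` below is a stand-in for the real render-and-shoot call, which the patent does not specify:

```python
# Sketch of the multi-group case: one target image per shooting-parameter
# group, so the image count equals the parameter-group count.
def capture(target_objects, params):
    # Placeholder for rendering the frame with the given camera parameters.
    return {"objects": list(target_objects), "params": params}

def shoot_all(target_objects, parameter_groups):
    return [capture(target_objects, p) for p in parameter_groups]

groups = [{"angle": 30}, {"angle": 60}, {"angle": 90}]
images = shoot_all(["hero", "slime"], groups)
print(len(images) == len(groups))  # True
```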
Optionally, after triggering shooting of at least one target object in the current animation data, the method further comprises: and shooting at least one target object in the current animation data. The shooting of the at least one target object in the current animation data may take a picture of the at least one target object in the current animation data, or record the at least one target object in the current animation data.
It will be appreciated that the target object is the main shooting object and is not a limitation of shooting content. When at least one target object in the current animation data is shot, the shot data can comprise objects except the target object.
In the technical scheme of this embodiment, by determining historical shooting animation data and the historical shooting parameters corresponding to it, the target shooting parameters for shooting the current animation data at the current moment can be determined, and a target image including the target object can then be shot based on those parameters. This solves the technical problem in the prior art that corresponding pictures cannot be shot automatically, which results in poor user experience, and achieves the technical effect of shooting corresponding pictures automatically. When corresponding animation data is to be shot, the shooting parameters matched to that animation data are retrieved and used, which improves the degree of matching between the shot image and the user, improves the shooting effect and shooting flexibility, and greatly improves user experience.
Example Two
Fig. 2 is a flowchart of a photographing method according to a second embodiment of the present invention. This method further refines the photographing method on the basis of the above optional technical solutions. Optionally, determining the target shooting parameters corresponding to the current animation data based on the historical shooting animation data and the historical shooting parameters corresponding to it includes: training an original machine learning model based on the historical shooting animation data and the corresponding historical shooting parameters to obtain a shooting parameter prediction model; and determining the target shooting parameters corresponding to the current animation data based on the shooting parameter prediction model. Explanations of terms that are the same as or correspond to those of the above embodiments are not repeated here.
As shown in fig. 2, the photographing method in this embodiment may specifically include:
s210, acquiring current animation data corresponding to a current user at a current time.
S220, training an original machine learning model based on the historical shooting animation data and the historical shooting parameters corresponding to the historical shooting animation data to obtain a shooting parameter prediction model.
The shooting parameter prediction model is obtained by training an original machine learning model after sample data has been collected; it predicts the shooting parameters to be used for shooting a corresponding picture. The original machine learning model may be a deep learning model or a reinforcement learning model. To improve the accuracy of the shooting parameter prediction model, as much training sample data as possible should be collected. The training sample data includes historical shooting animation data and the historical shooting parameters corresponding to it: the historical shooting animation data serves as the input of the original machine learning model, and the corresponding historical shooting parameters serve as its expected output.
Training the shooting parameter prediction model may proceed as follows: for each piece of training sample data, the historical shooting animation in the current sample is input into the original machine learning model, which outputs a corresponding actual result; the model is then corrected according to the loss computed from the actual output and the historical shooting parameters in the current sample. With convergence of the loss function as the training target, the original machine learning model is trained to obtain the shooting parameter prediction model.
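The training loop above can be illustrated with a deliberately simplified model. This is an assumption about one possible realization, not the patent's actual architecture: a linear map from an animation-data feature to a shooting parameter, fitted by gradient descent on a squared-error loss until it converges.

```python
# Highly simplified training sketch: fit a linear model mapping an animation
# feature to a shooting parameter, mirroring "train until the loss converges".
def train(samples, lr=0.01, epochs=2000):
    """samples: list of (feature, historical_shooting_parameter) pairs."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in samples:
            pred = w * x + b
            err = pred - y       # derivative of 0.5*(pred - y)**2 w.r.t. pred
            w -= lr * err * x    # gradient step on the weight
            b -= lr * err        # gradient step on the bias
    return w, b

# Example history: wider scenes were historically shot with larger angles.
samples = [(1.0, 30.0), (2.0, 50.0), (3.0, 70.0)]
w, b = train(samples)
predict = lambda x: w * x + b    # the "shooting parameter prediction model"
```

At prediction time (S230), feeding the current animation data's feature into `predict` plays the role of inputting the current animation data into the trained model and reading off the target shooting parameters.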
It should be noted that the model parameters in the original machine learning model may be set to default parameters, and the specific contents of the shooting parameters are described in the first embodiment and are not explained again here.
S230, determining target shooting parameters corresponding to the current animation data based on the shooting parameter prediction model.
In this embodiment, once training of the shooting parameter prediction model is completed, the current animation data may be input into the model, which outputs shooting parameters matched with the current animation data; the obtained shooting parameters serve as the target shooting parameters.
S240, shooting at least one target object in the current animation data based on the target shooting parameters.
According to the above technical scheme, the original machine learning model is trained on the historical shooting animation data and their corresponding shooting parameters to obtain the shooting parameter prediction model, so that the shooting parameters for the current user's current animation data can be predicted quickly and effectively. As the historical shooting animation data accumulate, the model can be continuously optimized, allowing the shooting parameters of the current animation data to be determined accurately and effectively while also improving the user experience.
Example III
Fig. 3 is a flowchart of a photographing method according to a third embodiment of the present invention. This embodiment further refines the above alternative technical solutions. Optionally, determining the target shooting parameters corresponding to the current animation data based on the historical shooting animation data and the historical shooting parameters corresponding thereto includes: determining at least one group of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data; and determining the target shooting parameters based on the historical shooting parameters corresponding to the at least one group of target historical shooting animation data. Explanations of terms identical or corresponding to those in the above embodiments are not repeated herein.
S310, acquiring current animation data corresponding to a current user at a current time.
S320, determining at least one group of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data.
The historical shooting animation data may be frame images containing multiple elements, and the current animation data may be the current frame image corresponding to the current picture. An image similarity algorithm may be employed to determine the similarity between the current frame image and each historical frame image, and at least one group of historical frame images associated with the current frame image may be selected from the historical frame images based on that similarity. That is, the target historical shooting animation data are the shooting data, screened out of the historical shooting animation data, that match the current animation data.
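The similarity-based screening just described can be sketched as follows, assuming each frame has already been reduced to a numeric feature vector; cosine similarity stands in for whatever image similarity algorithm is actually used, and the function name and the top-k cutoff are illustrative assumptions.

```python
import numpy as np

def select_target_history(current_frame, history_frames, top_k=3):
    """Rank historical frames by cosine similarity to the current frame
    and return the indices of the top_k best matches, i.e. the candidate
    target historical shooting animation data."""
    cur = np.asarray(current_frame, dtype=float).ravel()
    sims = []
    for frame in history_frames:
        h = np.asarray(frame, dtype=float).ravel()
        denom = np.linalg.norm(cur) * np.linalg.norm(h)
        sims.append(float(cur @ h / denom) if denom else 0.0)
    order = np.argsort(sims)[::-1]        # most similar first
    return [int(i) for i in order[:top_k]]
```

The returned indices can then be used to retrieve the corresponding historical shooting parameters.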
In this embodiment, the target historical shooting animation data may also be determined as follows: after the historical shooting animation data are acquired, they may be classified and stored separately according to picture type. After the target picture type of the current animation data is determined, multiple pieces of historical shooting animation data matching the current animation data may be selected from the database corresponding to the target picture type as the target historical shooting animation data. Alternatively, the target historical shooting animation data may be determined in combination with the similarity.
Based on the above technical solution, in order to determine the group of target historical shooting animation data matched with the current animation data quickly and accurately, one or more of the following manners may be adopted.
In one embodiment, at least one set of target historical shot animation data corresponding to current animation data in the historical shot animation data is determined based on scene information of the historical shot animation data and the current animation data.
The historical shooting animation data may be animation data shot at a certain level, and different levels have different scene identifiers, so the scene identifier may serve as the scene information. Different users may prefer different video frames for the same level, so the same level, that is, the same scene identifier, may be associated with multiple pieces of historical shooting animation data. A single level may also include multiple scenes; for example, level 1 may contain three scenes in which the target object performs tasks in a snow scene, a rain scene, and a dark scene, with scene identifiers 1-1, 1-2, and 1-3. The scene information may also be determined according to the subjects included in the frame: when the subjects included in two frames are identical, their scene identifiers are identical. For example, if the scene information includes trees, mountains, and streams, only the frames (historical shooting animation data) containing all of those elements are marked with that scene identifier.
It should be noted that any one of the above methods may be adopted to determine the scene identifier corresponding to each piece of animation data; what matters is that the scene identifiers are determined according to a consistent principle.
Specifically, when each piece of historical shooting animation data is stored, its scene identifier may be determined according to the content it includes, and the data may be marked accordingly. Alternatively, the scene identifier may be determined according to the level to which each piece of historical shooting animation data corresponds, or according to the different sub-scenes within the same level. After the scene identifiers are determined, the target scene identifier corresponding to the current animation data can be determined according to the same principle. The historical scene identifiers matching the target scene identifier are then found, and the corresponding historical shooting animation data are retrieved; the data retrieved at this point are the target historical shooting animation data.
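The scene-identifier-based storage and retrieval described above can be sketched as follows. The class and method names are hypothetical, and the "1-1" style identifiers simply follow the level/sub-scene convention from the example above.

```python
from collections import defaultdict

class HistoryStore:
    """Store historical shooting animation data keyed by scene identifier
    (e.g. "1-1" for level 1, snow scene), so that the records matching the
    current animation data's target scene identifier can be retrieved."""

    def __init__(self):
        self._by_scene = defaultdict(list)

    def add(self, scene_id, animation_data):
        # Mark the record with its scene identifier at storage time.
        self._by_scene[scene_id].append(animation_data)

    def target_history(self, scene_id):
        """Return every stored record whose scene identifier matches."""
        return list(self._by_scene.get(scene_id, []))
```

A lookup with the target scene identifier of the current animation data yields the candidate target historical shooting animation data.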
Alternatively, if there are historical shooting animation data whose scene information is identical to that of the current animation data, the proportion of such data in the total historical shooting animation data may be determined, and the historical shooting animation data corresponding to the current animation data may be determined based on that proportion.
In another embodiment, the determining at least one set of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data includes: acquiring historical user operation information corresponding to the at least one group of target historical shooting animation data; and determining at least one group of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data based on the historical user operation information.
The historical user operation information corresponding to the historical shooting animation data can be understood as the operations that historical users performed on those data. The historical user operation information may include one, two, or more items, and may include, but is not limited to, at least one of the following: clipping, beautifying, deleting, saving, sharing, and/or manually retaking. In the embodiment of the invention, the historical preference of a historical user for the historical shooting animation data can be determined from this operation information. The advantage of using the operation information is that the historical shooting animation data associated with the same scene information may include multiple pieces, and the operation information reveals which of those pieces the user was actually interested in.
Illustratively, sharing and saving a piece of historical shooting animation data indicates a higher historical shooting preference than merely saving it; saving it indicates a higher preference than beautifying it; beautifying it indicates a higher preference than deleting it and manually re-shooting; and deleting it and manually re-shooting indicates a higher preference than merely deleting it. After the historical shooting preference of each piece of historical shooting animation data is determined, a preset number of target historical shooting animation data can be selected according to those preferences, and the corresponding historical shooting parameters can be retrieved.
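The preference ordering described above can be sketched as follows. The numeric scores, operation names, and function names are illustrative assumptions that merely preserve the stated ordering (share and save strongest, mere deletion weakest), not values from the patent.

```python
# Hypothetical scores reflecting the ordering described above.
OPERATION_SCORES = {
    "share": 4, "save": 3, "beautify": 2, "retake": 1, "delete": 0,
}

def shooting_preference(operations):
    """Score one piece of historical shooting animation data from the set of
    operations its user performed on it, taking the strongest signal."""
    return max((OPERATION_SCORES.get(op, 0) for op in operations), default=0)

def rank_by_preference(records):
    """records: list of (animation_data, operations) pairs.
    Return them ordered from highest to lowest historical shooting preference."""
    return sorted(records, key=lambda r: shooting_preference(r[1]), reverse=True)
```

A preset number of the top-ranked records can then be taken as the target historical shooting animation data.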
That is, after the historical shooting preference for each piece of historical shooting animation data is determined in the above manner, the preference may be bound to the corresponding data. When the target historical shooting animation data corresponding to the current animation data are determined, the candidates whose scene identifiers match the target scene identifier of the current animation data can first be screened out, and then the historical shooting animation data whose historical shooting preference is higher than a preset preference threshold can be selected from them as the target historical shooting animation data.
In other words, the historical shooting preference corresponding to each piece of historical shooting animation data is first determined from the users' historical operation information; the target historical shooting animation data may then be determined by first screening out part of the historical shooting animation data to be processed in combination with the scene identifier, and then selecting the target historical shooting animation data according to their historical shooting preferences.
The preference levels above, and the specific user operations on which they are based, are merely an exemplary description of how the historical shooting preference may be determined, and are not limiting.
For example, the historical shooting preference for a piece of historical shooting animation data may also be determined from the number of times it has been shared, and so on.
S330, determining target shooting parameters based on the historical shooting parameters corresponding to the at least one set of target historical shooting animation data.
In this embodiment, the target shooting parameters are determined based on the historical shooting parameters corresponding to the at least one group of target historical shooting animation data, and at least two implementations may be adopted.
In the first implementation, when multiple groups of target historical shooting animation data are determined, each of the corresponding groups of historical shooting parameters may be used as a set of target shooting parameters, and multiple target images including the target object may be shot based on them.
That is, the history shooting parameters corresponding to the at least one set of target history shooting animation data are respectively taken as target shooting parameters.
The second implementation may be: determining the target shooting parameters based on the historical shooting parameters corresponding to the at least one group of target historical shooting animation data includes: if there are two or more groups of historical shooting parameters corresponding to the target historical shooting animation data, determining the target shooting parameters based on the shooting time information corresponding to the historical shooting parameters.
Specifically, the determined target historical shooting animation data may include multiple groups, and the target shooting parameters may be determined according to the shooting moment of each piece. For example, the historical shooting parameters closest to the current moment may be used as the target shooting parameters: since the user's appreciation level and appreciation ability are continuously improving, historical shooting parameters far from the current moment may no longer match the user's current appreciation level, whereas those closer to the current moment match the user better and are therefore preferred.
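Selecting the historical shooting parameters closest to the current moment can be sketched as follows; the timestamp representation and the function name are assumptions made for illustration.

```python
def latest_shooting_parameters(candidates):
    """candidates: list of (timestamp, shooting_parameters) pairs drawn from
    the target historical shooting animation data. Return the shooting
    parameters whose shooting moment is closest to the current moment,
    i.e. the most recently recorded group."""
    if not candidates:
        raise ValueError("no historical shooting parameters available")
    return max(candidates, key=lambda c: c[0])[1]
```

The returned group is then used as the target shooting parameters for S340.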
S340, shooting at least one target object in the current animation data based on the target shooting parameters.
According to the above technical scheme, the target shooting parameters for shooting the current animation data at the current moment are determined from the historical shooting animation data and their corresponding historical shooting parameters, and a target image including the target object is then shot based on those parameters. This solves the technical problem in the prior art that corresponding pictures cannot be shot automatically, which led to poor user experience. It achieves automatic shooting of the corresponding pictures, allows the matching shooting parameters to be retrieved when the corresponding animation data are shot, improves the match between the shot image and the user, improves the shooting effect and flexibility, and greatly improves the user experience.
Example IV
Fig. 4 is a schematic structural diagram of a photographing device according to a fourth embodiment of the present invention, which can be used to execute the photographing method according to any of the embodiments of the present invention, and the device can be implemented by software and/or hardware.
The photographing device of the embodiment of the invention can comprise: an animation data acquisition module 410, a photographing parameter determination module 420, and a photographing module 430.
The animation data acquisition module 410 is configured to acquire the current animation data corresponding to the current user at the current time; the shooting parameter determination module 420 is configured to determine the target shooting parameters corresponding to the current animation data based on historical shooting animation data and the historical shooting parameters corresponding thereto; and the shooting module 430 is configured to shoot at least one target object in the current animation data based on the target shooting parameters. According to this technical scheme, shooting of the current animation data is triggered based on the current user's historical shooting animation data, so that the current user is analyzed in a personalized way. Shooting of the current animation data can thus be triggered automatically and performed efficiently, pictures the user may be interested in are shot automatically, the user's highlight moments are recorded in time, the user's personalized requirements are met, and the user experience is improved.
On the basis of the above technical solutions, the shooting parameter determining module includes:
the shooting parameter prediction model determining unit is used for training an original machine learning model based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data to obtain a shooting parameter prediction model; and the target shooting parameter determining unit is used for determining target shooting parameters corresponding to the current animation data based on the shooting parameter prediction model.
On the basis of the above technical solution, the shooting parameter determining module further includes:
a shooting animation data determining unit, configured to determine at least one group of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data; and a target shooting parameter determining unit configured to determine target shooting parameters based on the history shooting parameters corresponding to the at least one set of target history shooting animation data.
On the basis of the above technical solutions, the shooting animation data determining unit is further configured to: and determining at least one group of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data based on the scene information of the historical shooting animation data and the current animation data.
On the basis of the above technical solutions, the shooting animation data determining unit is further configured to: acquiring historical user operation information corresponding to the at least one group of target historical shooting animation data; and determining at least one group of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data based on the historical user operation information.
On the basis of the above technical solutions, the target shooting parameter determining unit is further configured to: if there are two or more groups of historical shooting parameters corresponding to the target historical shooting animation data, determine the target shooting parameters based on the shooting time information corresponding to the historical shooting parameters.
On the basis of the above technical solutions, the target shooting parameter determining unit is further configured to: determine the historical shooting parameters corresponding to the at least one group of target historical shooting animation data, respectively, as target shooting parameters.
The shooting device provided by the embodiment of the invention can execute the shooting method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
It should be noted that the units and modules included in the photographing apparatus are divided only according to functional logic and are not limited to the above division, so long as the corresponding functions can be implemented; in addition, the specific names of the functional units are only for distinguishing them from each other and are not used to limit the protection scope of the embodiments of the present invention.
Example five
Fig. 5 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention. Fig. 5 illustrates a block diagram of an exemplary electronic device 12 suitable for use in implementing embodiments of the present invention. The electronic device 12 shown in fig. 5 is merely an example and should not be construed as limiting the functionality and scope of use of embodiments of the present invention.
As shown in fig. 5, the electronic device 12 is in the form of a general-purpose computing device. The components of the electronic device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that connects the various system components (including the system memory 28 and the processing unit 16).
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. The electronic device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from or write to non-removable, nonvolatile magnetic media (not shown in FIG. 5, commonly referred to as a "hard disk drive"). Although not shown in fig. 5, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable non-volatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In such cases, each drive may be coupled to bus 18 through one or more data medium interfaces. The system memory 28 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of the embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored in, for example, system memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Program modules 42 generally perform the functions and/or methods of the embodiments described herein.
The electronic device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), one or more devices that enable a user to interact with the electronic device 12, and/or any devices (e.g., network card, modem, etc.) that enable the electronic device 12 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 22. Also, the electronic device 12 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet, through a network adapter 20. As shown, the network adapter 20 communicates with other modules of the electronic device 12 over the bus 18. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with electronic device 12, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processing unit 16 executes various functional applications and data processing by running programs stored in the system memory 28, for example, implementing the photographing method provided by this embodiment.
Example six
A sixth embodiment of the present invention also provides a storage medium containing computer-executable instructions, which when executed by a computer processor, are for performing a photographing method, the method comprising:
acquiring current animation data corresponding to a current user at a current time;
determining a target shooting parameter corresponding to the current animation data based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data;
and shooting at least one target object in the current animation data based on the target shooting parameters.
The computer storage media of embodiments of the invention may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for embodiments of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, smalltalk, C ++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
Note that the above is only a preferred embodiment of the present invention and the technical principle applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the invention, which is set forth in the following claims.
Claims (8)
1. A photographing method, comprising:
acquiring current animation data corresponding to a current user at a current time;
determining a target shooting parameter corresponding to the current animation data based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data;
the determining, based on the historical shooting animation data and the historical shooting parameters corresponding to the historical shooting animation data, the target shooting parameters corresponding to the current animation data includes:
determining at least one group of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data;
Determining target shooting parameters based on historical shooting parameters corresponding to the at least one set of target historical shooting animation data;
the determining the target shooting parameters based on the historical shooting parameters corresponding to the at least one set of target historical shooting animation data includes:
if the historical shooting parameters corresponding to the target historical shooting animation data are two groups or more, determining the target shooting parameters based on shooting time information corresponding to the historical shooting parameters;
and shooting at least one target object in the current animation data based on the target shooting parameters.
2. The method of claim 1, wherein the determining the target shooting parameter corresponding to the current animation data based on the historical shooting animation data and the historical shooting parameter corresponding to the historical shooting animation data comprises:
training an original machine learning model based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data to obtain a shooting parameter prediction model;
and determining target shooting parameters corresponding to the current animation data based on the shooting parameter prediction model.
3. The method of claim 1, wherein determining at least one set of target historical shot animation data in the historical shot animation data corresponding to the current animation data comprises:
and determining at least one group of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data based on the scene information of the historical shooting animation data and the current animation data.
4. The method of claim 1, wherein determining at least one set of target historical shot animation data in the historical shot animation data corresponding to the current animation data comprises:
acquiring historical user operation information corresponding to the at least one group of target historical shooting animation data;
and determining at least one group of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data based on the historical user operation information.
5. The method of claim 1, wherein determining the target shooting parameters based on the historical shooting parameters corresponding to the at least one set of target historical shooting animation data comprises:
determining the historical shooting parameters corresponding to the at least one set of target historical shooting animation data, respectively, as the target shooting parameters.
6. A shooting apparatus, comprising:
an animation data acquisition module configured to acquire current animation data corresponding to a current user at a current time;
a shooting parameter determination module configured to determine target shooting parameters corresponding to the current animation data based on historical shooting animation data and historical shooting parameters corresponding to the historical shooting animation data;
wherein the shooting parameter determination module further comprises:
a shooting animation data determination unit configured to determine at least one set of target historical shooting animation data corresponding to the current animation data in the historical shooting animation data; and
a target shooting parameter determination unit configured to determine the target shooting parameters based on historical shooting parameters corresponding to the at least one set of target historical shooting animation data;
the target shooting parameter determination unit being further configured to: if there are two or more sets of historical shooting parameters corresponding to the target historical shooting animation data, determine the target shooting parameters based on shooting time information corresponding to the historical shooting parameters;
and a shooting module configured to shoot at least one target object in the current animation data based on the target shooting parameters.
7. An electronic device, comprising:
one or more processors; and
a storage device for storing one or more programs,
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the shooting method of any one of claims 1-5.
8. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the shooting method of any one of claims 1-5.
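A minimal sketch of the tie-breaking rule in claim 1: when two or more sets of historical shooting parameters match the target historical shooting animation data, shooting time information decides — here by preferring the most recent set. Field names (`position`, `fov`, `shot_time`) are illustrative assumptions; the patent does not fix them.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ShootingParams:
    # Hypothetical in-game camera parameters; the claim only requires that
    # each historical parameter set carries shooting time information.
    position: tuple
    fov: float
    shot_time: datetime  # shooting time information for this parameter set

def pick_target_params(candidates: list) -> ShootingParams:
    """With two or more matching historical parameter sets, use shooting
    time information to choose: prefer the most recently used set."""
    if not candidates:
        raise ValueError("no matching historical shooting parameters")
    return max(candidates, key=lambda p: p.shot_time)

chosen = pick_target_params([
    ShootingParams((0, 0, 5), 60.0, datetime(2020, 6, 1)),
    ShootingParams((1, 2, 4), 45.0, datetime(2020, 12, 1)),
])
print(chosen.fov)  # 45.0: the more recent parameter set wins
```

Preferring the most recent set is one plausible reading of "based on shooting time information"; the claim would equally cover other time-based policies.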
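Claim 2 replaces direct lookup with a learned predictor trained on historical shooting animation data and its shooting parameters. As a stand-in for whatever model the claim covers, the sketch below uses a 1-nearest-neighbour lookup over (feature, parameters) pairs; the feature encoding is an assumption, not part of the patent.

```python
import math

def train_shooting_param_model(history):
    """history: list of (feature_vector, shooting_params) pairs derived
    from historical shooting animation data and its shooting parameters.
    Returns a predict() closure: a 1-nearest-neighbour stand-in for the
    claimed shooting parameter prediction model."""
    if not history:
        raise ValueError("no training data")
    def predict(features):
        # Return the parameters of the closest historical feature vector.
        _, params = min(history, key=lambda item: math.dist(item[0], features))
        return params
    return predict

# Features might encode scene id, character count, motion speed (all assumed).
model = train_shooting_param_model([
    ((0.0, 1.0), {"fov": 60.0}),
    ((5.0, 5.0), {"fov": 30.0}),
])
print(model((0.2, 1.1)))  # nearest training sample decides: {'fov': 60.0}
```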
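Claims 3 and 4 narrow the historical shooting animation data by scene information and by historical user operation information, respectively. A hedged sketch of both filters follows; the entry fields and the operation log are illustrative assumptions.

```python
from collections import Counter

def match_by_scene(history, current_scene):
    """Claim 3 sketch: keep historical shooting animation data whose scene
    information matches the current animation data's scene."""
    return [entry for entry in history if entry["scene"] == current_scene]

def rank_by_user_operations(history, op_log):
    """Claim 4 sketch: order historical shooting animation data by how often
    the user operated on it (op_log holds entry ids, e.g. replays or saves)."""
    counts = Counter(op_log)
    return sorted(history, key=lambda e: counts[e["id"]], reverse=True)

history = [
    {"id": 1, "scene": "dungeon"},
    {"id": 2, "scene": "town"},
    {"id": 3, "scene": "dungeon"},
]
print(match_by_scene(history, "dungeon"))           # entries 1 and 3
print(rank_by_user_operations(history, [2, 2, 3]))  # entry 2 ranked first
```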
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011626634.XA CN112827172B (en) | 2020-12-31 | 2020-12-31 | Shooting method, shooting device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112827172A CN112827172A (en) | 2021-05-25 |
CN112827172B true CN112827172B (en) | 2023-05-16 |
Family
ID=75924524
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011626634.XA Active CN112827172B (en) | 2020-12-31 | 2020-12-31 | Shooting method, shooting device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112827172B (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113694518B (en) * | 2021-08-27 | 2023-10-24 | 上海米哈游璃月科技有限公司 | Freezing effect processing method and device, storage medium and electronic equipment |
CN113694522B (en) * | 2021-08-27 | 2023-10-24 | 上海米哈游璃月科技有限公司 | Method and device for processing crushing effect, storage medium and electronic equipment |
CN113992845B (en) * | 2021-10-18 | 2023-11-10 | 咪咕视讯科技有限公司 | Image shooting control method and device and computing equipment |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107454331A (en) * | 2017-08-28 | 2017-12-08 | 维沃移动通信有限公司 | The switching method and mobile terminal of a kind of screening-mode |
CN108229369A (en) * | 2017-12-28 | 2018-06-29 | 广东欧珀移动通信有限公司 | Image capturing method, device, storage medium and electronic equipment |
CN109718537A (en) * | 2018-12-29 | 2019-05-07 | 努比亚技术有限公司 | Game video method for recording, mobile terminal and computer readable storage medium |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018086262A1 (en) * | 2016-11-08 | 2018-05-17 | 华为技术有限公司 | Method for acquiring photographing reference data, mobile terminal and server |
CN107820020A (en) * | 2017-12-06 | 2018-03-20 | 广东欧珀移动通信有限公司 | Method of adjustment, device, storage medium and the mobile terminal of acquisition parameters |
CN108600669A (en) * | 2018-03-30 | 2018-09-28 | 努比亚技术有限公司 | Game video method for recording, mobile terminal and computer readable storage medium |
CN109951664B (en) * | 2019-03-31 | 2023-05-02 | 联想(北京)有限公司 | Recording method and device |
2020-12-31: Application CN202011626634.XA filed in China; granted as patent CN112827172B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN112827172A (en) | 2021-05-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112827172B (en) | Shooting method, shooting device, electronic equipment and storage medium | |
CN111045777B (en) | Rendering method and device, storage medium and electronic equipment | |
CN112866562B (en) | Picture processing method and device, electronic equipment and storage medium | |
CN112843735B (en) | Game picture shooting method, device, equipment and storage medium | |
CN112422844A (en) | Method, device and equipment for adding special effect in video and readable storage medium | |
CN108421240A (en) | Court barrage system based on AR | |
CN112423143A (en) | Live broadcast message interaction method and device and storage medium | |
CN112843693B (en) | Method and device for shooting image, electronic equipment and storage medium | |
CN112843733A (en) | Method and device for shooting image, electronic equipment and storage medium | |
CN112822555A (en) | Shooting method, shooting device, electronic equipment and storage medium | |
CN112791401B (en) | Shooting method, shooting device, electronic equipment and storage medium | |
CN112843695B (en) | Method and device for shooting image, electronic equipment and storage medium | |
CN116966557A (en) | Game video stream sharing method and device, storage medium and electronic equipment | |
CN112843691B (en) | Method and device for shooting image, electronic equipment and storage medium | |
CN112860360B (en) | Picture shooting method and device, storage medium and electronic equipment | |
CN112774199B (en) | Target scene picture restoration method and device, electronic equipment and storage medium | |
CN112843739B (en) | Shooting method, shooting device, electronic equipment and storage medium | |
CN112843694B (en) | Picture shooting method and device, storage medium and electronic equipment | |
CN112843696A (en) | Shooting method, shooting device, electronic equipment and storage medium | |
CN112791402A (en) | Shooting method, shooting device, electronic equipment and storage medium | |
CN114125552A (en) | Video data generation method and device, storage medium and electronic device | |
CN112861612A (en) | Method and device for shooting image, electronic equipment and storage medium | |
CN112843678B (en) | Method and device for shooting image, electronic equipment and storage medium | |
CN112860372B (en) | Method and device for shooting image, electronic equipment and storage medium | |
CN112843736A (en) | Method and device for shooting image, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||