CN112791402A - Shooting method, shooting device, electronic equipment and storage medium - Google Patents

Shooting method, shooting device, electronic equipment and storage medium

Info

Publication number
CN112791402A
Authority
CN
China
Prior art keywords
shooting
current
historical
data
animation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011626966.8A
Other languages
Chinese (zh)
Inventor
胡婷婷
赵男
包炎
刘超
施一东
李鑫培
师锐
董一夫
Current Assignee
Shanghai Mihoyo Tianming Technology Co Ltd
Original Assignee
Shanghai Mihoyo Tianming Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Mihoyo Tianming Technology Co Ltd
Priority to CN202011626966.8A
Publication of CN112791402A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/70 Game security or game management aspects
    • A63F13/79 Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories

Abstract

The embodiments of the invention disclose a shooting method, a shooting device, electronic equipment and a storage medium, where the method includes: acquiring current animation data corresponding to a current user at a current moment; and triggering the shooting of at least one target object in the current animation data based on animation historical shooting data of at least one historical user. According to this technical scheme, shooting of the current animation data is triggered based on the animation historical shooting data of historical users; that is, whether the current animation data needs to be shot is determined from the animation historical shooting data. Shooting of the current animation data can thus be triggered automatically, game frames can be captured efficiently, frames that the current user is likely to be interested in can be shot automatically, and the current user's highlight moments can be recorded in time, meeting the user's personalized needs and improving the user experience.

Description

Shooting method, shooting device, electronic equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of game development, in particular to a shooting method, a shooting device, electronic equipment and a storage medium.
Background
In order to record animations such as storylines and player interactions in an online game, conventional games provide a function for recording animation or taking screenshots. When a game player wants to record an animation or a particular frame, the player can trigger a shooting button to record it.
At present, if a game player needs to capture a game frame during play, the frame is usually captured through the player's manual operation. With such manual screenshots, the player's screenshot operation may come too late, or the player may forget to take one, so various key frames cannot be obtained in time: for example, an interaction frame with a Non-Player Character (NPC), a battle frame against a boss, or a special-effect frame when a skill is released in battle. It is therefore difficult for the player to capture frames that are fleeting during the game. Manual screenshots may also be mistimed because of network delay or device stutter, missing the right game frame.
Disclosure of Invention
The embodiment of the invention provides a shooting method, a shooting device, electronic equipment and a storage medium, and aims to realize automatic shooting of current animation data.
In a first aspect, an embodiment of the present invention provides a shooting method, where the method includes:
acquiring current animation data corresponding to a current user at a current moment;
and triggering the shooting of at least one target object in the current animation data based on animation historical shooting data of at least one historical user.
In a second aspect, an embodiment of the present invention further provides a shooting apparatus, where the shooting apparatus includes:
the animation acquisition module is used for acquiring current animation data corresponding to a current user at the current moment;
and the shooting triggering module is used for triggering the shooting of at least one target object in the current animation data based on the animation historical shooting data of at least one historical user.
In a third aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the photographing method according to any of the embodiments of the present invention.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the shooting method according to any one of the embodiments of the present invention.
According to the technical scheme of the embodiments of the present invention, shooting of the current animation data is triggered based on the animation historical shooting data of historical users; that is, whether the current animation data needs to be shot is determined from the animation historical shooting data. Shooting of the current animation data can thus be triggered automatically, game frames can be captured efficiently, frames that the current user is likely to be interested in can be shot automatically, and the current user's highlight moments can be recorded in time, meeting the user's personalized needs and achieving the technical effect of improving the user experience.
Drawings
In order to more clearly illustrate the technical solutions of the exemplary embodiments of the present invention, a brief description is given below of the drawings used in describing the embodiments. It should be clear that the described figures are only views of some of the embodiments of the invention to be described, not all, and that for a person skilled in the art, other figures can be derived from these figures without inventive effort.
Fig. 1 is a schematic flowchart of a shooting method according to an embodiment of the present invention;
Fig. 2 is a schematic flowchart of a shooting method according to a second embodiment of the present invention;
Fig. 3 is a schematic flowchart of a shooting method according to a third embodiment of the present invention;
fig. 4 is a schematic flowchart of a shooting method according to a fourth embodiment of the present invention;
fig. 5 is a schematic structural diagram of a shooting device according to a fifth embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to a sixth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently, or simultaneously. In addition, the order of the operations may be rearranged. A process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
Before the embodiments of the present invention are described, an application scenario is described. The shooting method provided by the embodiment of the invention can be suitable for scenes for automatically shooting the animation data in the multimedia resources, and is particularly suitable for automatically shooting the animation data in the game animation. For convenience of understanding, in the embodiment of the present invention, an application scene is taken as an example of a game scene, and a shooting method is described. Animation data in a game animation may include environments, characters, monsters, and the like.
Example one
Fig. 1 is a flowchart of a shooting method according to an embodiment of the present invention, where the present embodiment is applicable to a case where current animation data is automatically shot based on animation historical shooting data, and the method may be executed by a shooting device, where the shooting device may be configured in a terminal or a server, and the terminal and the server independently execute or cooperate to execute the shooting method according to the embodiment of the present invention.
As shown in fig. 1, the shooting method in this embodiment may specifically include:
and S110, acquiring current animation data corresponding to the current user at the current moment.
Note that an animation is typically composed of many frames of images. An animation usually has a theme and a storyline, and as the storyline progresses, the content of each frame changes, so during playback the user continuously sees new animation data. Moreover, different users may be at different playback positions in the same animation. Therefore, in the embodiment of the present invention, the current animation data corresponding to the current user at the current moment is obtained.
The current animation data may be data corresponding to an animation picture currently played. Taking a game animation as an example, the current animation data may be data of a scene picture including one or more objects of a character, a monster, weather, a tree, or a building. In animation, these objects generally need to be realized through the corresponding information groups of the objects. For example, the information group corresponding to each object in the current animation data may be rendered based on a preset game player viewing angle to obtain a current animation picture. That is, the current animation data is determined based on the information groups of the respective objects included in the current animation picture.
And S120, triggering the shooting of at least one target object in the current animation data based on the animation historical shooting data of at least one historical user.
Specifically, it may be determined whether to photograph the current animation data based on animation history photographing data of the historical user, and if so, the photographing of at least one target object in the current animation data may be triggered. In other words, animation history photographing data of the history user may be analyzed to determine whether current animation data is to be photographed.
The animation historical shooting data may include, but is not limited to, at least one of the following: historical shooting animation data, historical shooting parameters, and user operation information corresponding to the historical shooting animation data. User operation information can be understood as the operation behavior a user performs on historical shooting animation data. Illustratively, the user operation information may include, but is not limited to, at least one of the following operations on the historical shooting animation data: deleting, saving, sharing, and manually re-shooting.
Note that historical shooting animation data can be understood as animation data that has been shot before, and may or may not correspond to the current animation data. For example, if the historical shooting animation data includes historical users' shots of a preset animation event, the current animation data may be a frame of the current user playing that same preset animation event; alternatively, the current animation data may be frame data other than the historical shooting animation data.
For example, the shooting of the at least one target object in the current animation data is triggered based on animation historical shooting data of at least one historical user, and specifically, the shooting of the at least one target object in the current animation data may be triggered based on historical shooting data corresponding to the current animation data in the animation historical shooting data of the at least one historical user.
The historical user may be understood as a user with animation historical shooting data, and may or may not include the current user. The number of the historical users can be one or more.
Alternatively, when the shooting of at least one target object in the current animation data is triggered based on the historical shooting data corresponding to the current animation data in the animation historical shooting data of two or more historical users, the historical shooting data corresponding to the current animation data can first be obtained separately for each historical user, and the per-user historical shooting data can then be summed, or weighted and summed, to determine whether to trigger the shooting of at least one target object in the current animation data. For example, triggering the shooting of the at least one target object in the current animation data based on the historical shooting data corresponding to the current animation data in the animation historical shooting data of the at least one historical user may further include: counting, based on preset trigger parameters, the historical shooting data corresponding to the current animation data in the animation historical shooting data of at least one historical user, and determining whether to trigger the shooting of at least one target object in the current animation data according to the counting result.
The historical shooting data comprises historical shooting animation data and user operation information corresponding to the historical shooting animation data; the preset trigger parameters may include user operation information of the history photographed animation data corresponding to the current animation data and the number of operations corresponding to each item of user operation information.
For example, user operation information of historical captured animation data corresponding to current animation data in animation historical captured data of at least one historical user and the operation times corresponding to each item of user operation information may be counted, and then whether capturing of at least one target object in the current animation data is triggered or not may be determined according to a result of the counting. For example, when the operation times corresponding to at least one item of user operation information is greater than a preset time threshold, the shooting of at least one target object in the current animation data may be triggered. It can be understood that the preset time thresholds corresponding to different user operation information may be the same or different, and the specific numerical value may be set according to actual requirements, which is not specifically limited herein.
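The counting-and-threshold trigger described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the record format, operation names, and thresholds are all assumptions for the example.

```python
from collections import Counter

def should_trigger_shot(history_records, thresholds):
    """Decide whether to trigger a shot of the current animation data.

    history_records: list of (animation_id, operation) tuples, already
        filtered to the historical shooting data matching the current frame.
    thresholds: dict mapping operation name -> preset count threshold.
    """
    counts = Counter(op for _, op in history_records)
    # Trigger when the count of any single operation type exceeds its
    # preset threshold (thresholds may differ per operation type).
    return any(counts[op] > limit for op, limit in thresholds.items())

records = [("boss_fight", "save"), ("boss_fight", "share"),
           ("boss_fight", "save"), ("boss_fight", "save")]
thresholds = {"save": 2, "share": 5}
should_trigger_shot(records, thresholds)  # True: "save" occurred 3 > 2 times
```

A weighted variant, as mentioned above, would multiply each operation's count by a per-operation weight before comparing against a single combined threshold.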
As previously described, the current animation data may include one or more objects such as characters, monsters, weather, trees, or buildings. The target object may be understood as a key object to be photographed among the respective objects contained in the current animation data. The number of target objects may be one, two, or more than two. Target objects include, but are not limited to, player-manipulated characters, game monsters, game NPCs, scene buildings. It is noted that player-manipulated characters include, but are not limited to, characters and animals; scene buildings include, but are not limited to, natural scenes such as mountains, sky, grass, etc., and real buildings such as churches, arenas, etc.
In one embodiment, at least two target objects to be photographed may be determined based on the picture type of the current animation data. The association between each picture type and its corresponding target objects to be photographed may be stored in advance. After the picture type of the current animation data is identified, the target objects to be photographed can be determined from the pre-stored associations. For example, when the current animation data is a battle scene, the corresponding at least two target objects include at least one player-controlled character and the game monster battling each controlled character, or player-controlled characters battling each other. That is, the at least two target objects to be photographed are determined by recognizing the picture type of the target player's current animation data.
In another embodiment, the at least two target objects to be photographed include at least the target player-controlled character. The remaining target objects may be the interactive objects of the target player-controlled character in the current animation data. Specifically, the target player-controlled character can be detected in real time, so that when the character engages in an interactive behavior, such as a battle, each of its interactive objects is determined as one of the remaining target objects to be photographed.
In another embodiment, at least two target objects to be photographed may be further determined based on the attribute information of each object. The attribute information of each object includes, but is not limited to, object types such as player characters, monsters, NPCs, buildings, and the like. Specifically, when it is detected that the attribute information of each object in the current picture data includes a preset object type, an object conforming to the preset object type is selected from the objects to serve as at least two target objects to be photographed. The preset attribute information may be player character + monster or player character + building, etc.
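The attribute-based selection in the last embodiment amounts to filtering the scene's objects against a preset type combination. A hedged sketch, with object representation and type names assumed for illustration:

```python
# Preset object-type combination, e.g. "player character + monster";
# the type strings here are illustrative, not from the patent.
PRESET_TYPES = {"player_character", "monster"}

def select_target_objects(scene_objects, preset_types=PRESET_TYPES):
    """Return the objects whose attribute information matches a preset
    object type; these become the target objects to be photographed.

    scene_objects: list of dicts with at least 'id' and 'type' fields.
    """
    return [obj for obj in scene_objects if obj["type"] in preset_types]

scene = [
    {"id": "hero", "type": "player_character"},
    {"id": "slime", "type": "monster"},
    {"id": "church", "type": "building"},
]
[o["id"] for o in select_target_objects(scene)]  # ['hero', 'slime']
```

The picture-type embodiment would work the same way, except the preset set would be looked up from a stored picture-type-to-target-objects mapping first.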
Optionally, after triggering the shooting of at least one target object in the current animation data, the method further includes: shooting the at least one target object in the current animation data. The shooting may be taking a picture of the at least one target object in the current animation data, or recording a video of the at least one target object in the current animation data.
It is to be understood that the target object is a main subject, and is not a limitation on the contents of photographing. When at least one target object in the current animation data is shot, objects except the target object can be included in the shot data.
According to the technical scheme of this embodiment, shooting of the current animation data is triggered based on the animation historical shooting data of historical users; that is, whether the current animation data needs to be shot is determined from the animation historical shooting data. Shooting of the current animation data can thus be triggered automatically, game frames can be captured efficiently, frames that the current user is likely to be interested in can be shot automatically, and the current user's highlight moments can be recorded in time, meeting the user's personalized needs and achieving the technical effect of improving the user experience.
Optionally, after the triggering of shooting of at least one target object in the current animation data, the method further includes: determining current shooting parameters based on historical shooting parameters in user historical shooting data of the current user; and shooting at least one target object in the current animation data based on the current shooting parameters.
For example, to determine the current shooting parameters based on the historical shooting parameters in the user historical shooting data of the current user, the historical shooting parameters used in the current user's shot closest in time to the current moment may be obtained and used as the current shooting parameters.
Optionally, the current shooting parameters are determined based on the historical shooting parameters in the user historical shooting data of the current user, and the historical shooting parameters with the highest use frequency in the historical shooting parameters in the animation historical shooting data of the current user may be acquired as the current shooting parameters.
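The two alternatives above (most recent versus most frequently used historical parameters) can be sketched as follows. The record fields and parameter representation are assumptions for illustration only:

```python
from collections import Counter

def most_recent_params(history):
    """Return the shooting parameters of the shot closest to the current
    moment (largest timestamp)."""
    return max(history, key=lambda r: r["timestamp"])["params"]

def most_frequent_params(history):
    """Return the most frequently used shooting parameters.
    Parameters must be hashable (here, tuples) to be counted."""
    counts = Counter(r["params"] for r in history)
    return counts.most_common(1)[0][0]

history = [
    {"timestamp": 1, "params": ("wide", "f2.8")},
    {"timestamp": 2, "params": ("tele", "f4.0")},
    {"timestamp": 3, "params": ("wide", "f2.8")},
]
most_recent_params(history)    # ('wide', 'f2.8'), from timestamp 3
most_frequent_params(history)  # ('wide', 'f2.8'), used twice
```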
Optionally, determining the current shooting parameters based on the historical shooting parameters in the animation historical shooting data includes: determining at least one group of target historical shooting animation data corresponding to the current animation data in the animation historical shooting data; determining current photographing parameters based on the history photographing parameters corresponding to the at least one set of target history photographing animation data.
For example, at least one set of target history photographed animation data corresponding to the current animation data among animation history photographed data may be determined according to a picture type of the history photographed animation data and a picture type of the current animation data. Specifically, the history captured animation data that coincides with the picture type of the current animation data among the history captured animation data may be taken as the target history captured animation data.
For example, at least one set of target historical captured animation data corresponding to the current animation data in the animation historical captured data may be determined according to the similarity between the historical captured animation data and the current animation data in the animation historical captured data. Specifically, if the degree of similarity of the history captured animation data to the current animation data exceeds a preset similarity threshold, the history captured animation data is determined as the target history captured animation data.
Optionally, determining the current shooting parameters based on the historical shooting parameters corresponding to the at least one set of target historical shooting animation data includes: acquiring the historical shooting parameters corresponding to each set of target historical shooting animation data, and taking each set of historical shooting parameters as a set of current shooting parameters; or determining target historical shooting parameters from the historical shooting parameters corresponding to the at least one set of target historical shooting animation data, and taking the target historical shooting parameters as the current shooting parameters.
For example, determining the target historical shooting parameters from the historical shooting parameters corresponding to the at least one set of target historical shooting animation data may include: determining the picture similarity between the target historical shooting animation data and the current animation data, and determining reference animation shooting data from the at least one set of target historical shooting animation data based on the picture similarity; and taking the historical shooting parameters corresponding to the reference animation shooting data as the target historical shooting parameters. For example, the target historical shooting animation data with the highest picture similarity to the current animation data may be used as the reference animation shooting data.
For example, the determining of the picture similarity of the target history captured animation data and the current animation data may specifically be calculating the picture similarity of the target history captured animation data and the current animation data based on a preset similarity parameter. Wherein, the preset similarity parameter may include, but is not limited to, at least one of the following parameters: picture color, number of picture objects and/or picture type, etc.
The number of the preset similarity parameters may be one, two or more. When the picture similarity of the target historical shooting animation data and the current animation data is calculated based on two or more preset similarity parameters, the picture similarity of the target historical shooting animation data and the current animation data can be calculated through each similarity parameter, and then the picture similarity of the target historical shooting animation data and the current animation data is obtained through a summing or weighted summing mode.
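The weighted-sum combination described above can be written out directly. The per-parameter similarity scores would come from upstream comparisons (color histograms, object counts, picture-type matching); here they, and the weights, are assumed inputs:

```python
def picture_similarity(per_param_scores, weights):
    """Combine per-parameter similarities into one picture similarity.

    per_param_scores / weights: dicts keyed by preset similarity
    parameter, e.g. 'color', 'object_count', 'picture_type';
    scores are assumed normalized to [0, 1].
    """
    total_weight = sum(weights.values())
    # Weighted sum, normalized so the result stays in [0, 1].
    return sum(per_param_scores[p] * w for p, w in weights.items()) / total_weight

scores = {"color": 0.9, "object_count": 0.5, "picture_type": 1.0}
weights = {"color": 0.5, "object_count": 0.2, "picture_type": 0.3}
picture_similarity(scores, weights)  # ≈ 0.85
```

Setting all weights equal reduces this to the plain (averaged) sum also mentioned above; the candidate whose score exceeds the preset similarity threshold is then kept as target historical shooting animation data.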
The advantage of this arrangement is that shooting can be based on the current user's personal shooting habits, better matching the user's personalized needs, so that the newly shot data is more consistent with the current user's historical shooting data, improving the user experience.
Optionally, after the triggering of shooting of at least one target object in the current animation data, the method further includes: determining current shooting parameters based on historical shooting parameters corresponding to the current animation data in animation historical shooting data of at least one historical user; and shooting at least one target object in the current animation data based on the current shooting parameters.
Similarly, determining the current photographing parameters based on the historical photographing parameters corresponding to the current animation data in the animation historical photographing data of the at least one historical user may include: determining at least one group of target historical shooting animation data corresponding to the current animation data in the animation historical shooting data of the at least one historical user; determining current photographing parameters based on the history photographing parameters corresponding to the at least one set of target history photographing animation data.
It can be understood that the specific implementation manner of determining at least one set of target historical captured animation data corresponding to the current animation data in the animation historical captured data of the at least one historical user may adopt an implementation manner of "determining current capturing parameters based on historical capturing parameters in the user historical captured data of the current user", and details are not repeated herein.
Historical shooting parameters can be understood as the camera shooting parameters, camera attribute parameters and/or scene parameters used when shooting the historical animation data. Illustratively, the camera shooting parameters are the parameters of the camera when the historical shooting animation data was shot, and may include, but are not limited to, at least one of the following: the shooting angle, the shooting position of the camera, and the wide-angle parameter of the camera. The camera attribute parameters are the parameters used during the shooting process, that is, the specific capture settings, for example, shutter, aperture, angle of view, exposure, and whether the flash is on; they may also include the shooting light and/or shooting angle corresponding to the current user. The scene parameters may be the light source parameters in the current scene, e.g., the number of light sources and the intensity of the light sources. Correspondingly, the current shooting parameters can be understood as the camera shooting parameters, camera attribute parameters and scene parameters used for shooting the current animation data.
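One way to picture the three parameter groups named above is as a single record; this is only an illustrative data layout, and every field name here is an assumption, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class ShootingParams:
    # Camera shooting parameters (how the virtual camera is placed)
    angle_deg: float          # shooting angle
    position: tuple           # camera position in the scene
    wide_angle: float         # wide-angle / field-of-view parameter
    # Camera attribute parameters (capture settings)
    shutter: float
    aperture: float
    exposure: float
    flash_on: bool
    # Scene parameters (lighting in the current scene)
    light_source_count: int
    light_intensity: float

# Example record; values are arbitrary placeholders.
params = ShootingParams(45.0, (0.0, 1.5, -3.0), 90.0,
                        1 / 120, 2.8, 0.0, False, 2, 0.8)
params.aperture  # 2.8
```

A "historical shooting parameter" in the text above would then be one such record stored alongside the shot it produced, and "determining current shooting parameters" amounts to choosing or combining such records.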
The advantage of this arrangement is that the current shooting parameters can be determined from the historical users' shooting data for the current animation data. Determining the current shooting parameters from such aggregate data can effectively select the animation frames to be shot, provide the user with more shooting choices, and determine the current shooting parameters across the dimensions of time and space, thereby guaranteeing the shooting effect, improving shooting flexibility, and greatly improving the user experience.
Example two
Fig. 2 is a schematic flow chart of a shooting method according to a second embodiment of the present invention, where the present embodiment is further refined on the basis of the foregoing optional technical solutions, and optionally, triggering shooting of at least one target object in current animation data based on animation historical shooting data of at least one historical user includes: determining whether the current animation data is target shooting animation data or not based on historical shooting data corresponding to the current animation data in animation historical shooting data of at least one historical user; and if so, triggering the shooting of at least one target object in the current animation data. The same or corresponding terms as those in the above embodiments are not explained in detail herein.
As shown in fig. 2, the shooting method in this embodiment may specifically include:
S210, acquiring current animation data corresponding to the current user at the current moment.
S220, determining whether the current animation data is target shooting animation data or not based on historical shooting data corresponding to the current animation data in animation historical shooting data of at least one historical user.
The historical shooting data corresponding to the current animation data in the animation historical shooting data of at least one historical user can be understood as the historical shooting data generated for the historical animation data that corresponds to the current animation data. In other words, it is the shooting data generated when a historical user played the same animation data in his or her own game animation. Because game players differ, and the game characters they control differ, the real-time animation data generated during play differs even for the same preset animation frame.
It can be understood that the historical shooting data corresponding to the current animation data, for example, the historical animation data generated when a historical user was at the same game stage as the current user, tends to have a high similarity to the current animation data. In a game animation, the animation data includes a game character, a game scene, and the like. The game scene usually includes fixed objects that do not change with the game player; such data can be treated as static data, i.e., the inherent object data in the animation data. For example, inanimate objects such as buildings, plants and small items in the game scene, or atmosphere data such as the time, weather, wind and tide of the environment, may be used as static data.
Optionally, the historical static data of the historical shot animation data in the animation historical shooting data of at least one historical user and the current static data of the current animation data corresponding to the current user are respectively obtained, and the historical shooting data corresponding to the current animation data is determined based on the similarity between the current static data and the historical static data. Specifically, if the similarity between the current static data and the historical static data is greater than a preset static similarity threshold, the historical shot animation data may be determined as the historical shooting data corresponding to the current animation data.
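The similarity-threshold matching just described can be sketched as follows. This is a hedged illustration only: the similarity measure (here a set-overlap ratio over static elements), the threshold value, and the function names are assumptions, since the embodiment does not fix a particular similarity metric:

```python
def static_similarity(current_static, historical_static):
    """Fraction of static elements (buildings, weather, time of day, ...)
    shared between the current frame and a historical frame (Jaccard ratio)."""
    current, historical = set(current_static), set(historical_static)
    if not current and not historical:
        return 1.0
    return len(current & historical) / len(current | historical)


def matching_historical_shots(current_static, history, threshold=0.8):
    """Return the historical shot records whose static data exceeds the
    preset static similarity threshold for the current animation data."""
    return [record for record in history
            if static_similarity(current_static, record["static"]) > threshold]
```

A record whose static data fully matches the current scene scores 1.0 and is kept; unrelated scenes score near 0.0 and are filtered out.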
Of course, the history shooting data corresponding to the at least one history user and corresponding to the current animation data may be determined based on the history shooting animation data in the animation history shooting data of the at least one history user, the picture type of the current animation data of the current user, and the like. For a specific implementation manner, reference may be made to the explanation of the picture type in the embodiment of the present invention, which is not described herein again.
S230, if so, triggering shooting of at least one target object in the current animation data.
According to this technical solution, whether to shoot the current animation data of the current user is determined based on the historical shooting data corresponding to the current animation data among the historical shooting data of historical users. The animation data to be shot can thus be determined with the help of other users' shooting data, and automatic shooting of animation data can be effectively achieved from the big data of historical users, so that the current user can shoot the current animation data in time even when it is played for the first time. This avoids the drawbacks of cumbersome manual operation and missed highlight moments, and further improves the user experience.
EXAMPLE III
Fig. 3 is a schematic flow chart of a shooting method according to a third embodiment of the present invention, where this embodiment further refines the optional technical solutions, and optionally determines whether the current animation data is target shooting animation data based on historical shooting data corresponding to the current animation data in animation historical shooting data of at least one historical user, where the determining includes: determining the current shooting preference of the current animation data based on historical shooting data corresponding to the current animation data in animation historical shooting data of at least one historical user; and if the current shooting preference degree reaches a preset preference degree threshold value, determining the current animation data as target shooting animation data.
As shown in fig. 3, the shooting method in this embodiment may specifically include:
S310, acquiring current animation data corresponding to the current user at the current moment.
S320, determining the current shooting preference of the current animation data based on the historical shooting data corresponding to the current animation data in the animation historical shooting data of at least one historical user.
Optionally, historical shooting animation data corresponding to the current animation data in animation historical shooting data of at least one historical user is determined, and the current shooting preference of the current user on the current animation data is determined based on the historical shooting animation data. For example, scene information of historical photographing animation data and current animation data may be respectively determined, and a current photographing preference of the current user for the current animation data may be determined based on the scene information.
Optionally, determining the current shooting preference of the current user for the current animation data based on the scene information includes: determining whether historical shooting animation data consistent with scene information of current animation data exists in the historical shooting animation data, and determining the current shooting preference of a current user on the current animation data based on the determination result. For example, if the current shooting preference degree exists, the current shooting preference degree is determined to be 1; and if not, determining that the current shooting preference degree is 0.
Optionally, determining the current shooting preference of the current user for the current animation data based on the determination result includes: if historical shot animation data consistent with the scene information of the current animation data exists in the historical shot animation data, determining the proportion of such data in the total historical shot animation data, and determining the current shooting preference of the current animation data based on the proportion. For example, the higher the proportion, the higher the current shooting preference; conversely, the lower the proportion, the lower the current shooting preference.
Alternatively, if there is history photographed animation data that coincides with the scene information of the current animation data among the history photographed animation data, a history photographing time of the history photographed animation data that coincides with the scene information of the current animation data is determined, and a current photographing preference of the current animation data is determined based on the history photographing time. For example, the current shooting preference may be higher as the history shooting time is closer to the current time, whereas the current shooting preference may be lower as the history shooting time is farther from the current time.
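The two optional preference signals above — the proportion of matching historical shots, and the recency of the historical shooting time — can be sketched as simple scoring functions. These are assumptions for illustration: the embodiment does not specify the functional form (the exponential recency decay and `half_life` parameter are this sketch's choices):

```python
def preference_from_ratio(matching_count, total_count):
    # The larger the share of historical shot data whose scene information
    # matches the current animation data, the higher the current preference.
    return matching_count / total_count if total_count else 0.0


def preference_from_recency(shot_time, current_time, half_life=3600.0):
    # The closer the historical shooting time is to the current time, the
    # higher the preference; here the score halves every `half_life` seconds.
    age = max(current_time - shot_time, 0.0)
    return 0.5 ** (age / half_life)
```

Either score (or a combination) can then be compared against the preset preference threshold of step S330.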
In addition, the current shooting preference of the current user for the current animation data can be determined based on the historical shooting animation data, the picture type of the current animation data, the story line and/or the attribute information of the target object in the picture, and the like. Optionally, determining historical user operation information corresponding to the current animation data in animation historical shooting data of at least one historical user; and determining the historical shooting preference of the current animation data based on the historical user operation information.
The historical user operation information corresponding to the historical captured animation data may be understood as operation information of the historical user on the historical captured animation data. The historical user operation information may include one item, two items or more items. Specifically, the historical user operation information may include, but is not limited to, at least one of the following operations: clipping, beautifying, deleting, saving, sharing, and/or manually re-shooting, etc. In the embodiment of the invention, the historical preference information of the historical user on the historical shooting animation data can be determined through the historical user operation information of the historical user.
Illustratively, the historical shooting preference for historical shot animation data that the historical user both shared and saved is higher than that for data the historical user only saved; the preference for data the historical user only saved is higher than that for data the historical user beautified; the preference for data the historical user beautified is higher than that for data the historical user deleted and manually re-shot; and the preference for data the historical user deleted and manually re-shot is higher than that for data the historical user only deleted.
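The operation ranking above (share+save > save > beautify > delete+re-shoot > delete) could be encoded as an ordinal score table. The numeric values below are arbitrary placeholders that only preserve the stated ordering; both the scores and the operation labels are assumptions of this sketch:

```python
# Hypothetical scores; only their relative order reflects the text.
OPERATION_SCORES = {
    ("share", "save"): 1.0,
    ("save",): 0.8,
    ("beautify",): 0.6,
    ("delete", "reshoot"): 0.4,
    ("delete",): 0.2,
}


def historical_preference(operations):
    """Map the historical user's operations on a piece of historical shot
    animation data to a preference score, taking the best-matching pattern."""
    ops = set(operations)
    best = 0.0
    for pattern, score in OPERATION_SCORES.items():
        if set(pattern) <= ops:  # all operations of the pattern were performed
            best = max(best, score)
    return best
```

Such scores can serve directly as the historical shooting preference labels used later to train the preference evaluation model.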
It should be noted that the above level of the preference and the specific user operation information depended on in the setting are only an exemplary illustration of the determination method of the history shooting preference, and are not limited.
For example, the history shooting preference for the history shooting animation data may also be determined by the number of times of sharing the history shooting animation data, or the like.
S330, if the current shooting preference degree reaches a preset preference degree threshold value, determining the current animation data as target shooting animation data.
It is understood that the preset preference threshold may be set according to actual requirements, and specific values thereof are not specifically limited herein.
S340, triggering shooting of at least one target object in the current animation data.
According to this technical solution, the current shooting preference of the current user for the current animation data is determined by analyzing the historical shooting data corresponding to the current animation data in the animation historical shooting data of historical users, that is, the degree to which the current user wants to shoot the current animation data is obtained. If the current shooting preference reaches the preset preference threshold, it can be determined that the current user wants to shoot the current animation data, and shooting of at least one target object in the current animation data is triggered. In this way, user preference can be analyzed from the animation historical shooting data, a personalized shooting trigger mechanism is generated for the current user, and the user experience is further improved.
Example four
Fig. 4 is a schematic flow chart of a shooting method according to a fourth embodiment of the present invention, where this embodiment further refines on the basis of the foregoing optional technical solutions, and optionally, the determining a current shooting preference degree of the current animation data based on historical shooting data corresponding to at least one historical user and corresponding to the current animation data includes: training an original machine learning model based on historical shooting data corresponding to the current animation data in animation historical shooting data of at least one historical user to obtain a preference degree evaluation model, wherein the animation historical shooting data comprises historical shooting animation data and historical shooting preference degrees corresponding to the historical shooting animation data; and determining the current shooting preference of the current user to the current animation data based on the trained preference evaluation model.
As shown in fig. 4, the shooting method in this embodiment may specifically include:
S410, acquiring current animation data corresponding to the current user at the current moment.
And S420, training the original machine learning model based on historical shooting data corresponding to the current animation data in the animation historical shooting data of at least one historical user to obtain a preference evaluation model.
The animation historical shooting data comprises historical shooting animation data and historical shooting preference degrees corresponding to the historical shooting animation data.
Specifically, historical shooting animation data of a historical user and a historical shooting preference corresponding to the historical shooting animation data may be input into a pre-established original machine learning model, and a predicted shooting preference corresponding to the historical shooting animation data may be output; and adjusting an original machine learning model based on the predicted shooting preference and the historical shooting preference corresponding to the historical shooting animation data to obtain a preference evaluation model.
Specifically, adjusting the original machine learning model based on the predicted shooting preference and the historical shooting preference corresponding to the historical shot animation data may include constructing a loss function from the predicted shooting preference and the historical shooting preference, and adjusting the model parameters of the original machine learning model so that the loss function converges. When the loss function converges, the trained original machine learning model is used as the preference evaluation model. The model parameters may be the weights of the feature data of the historical shot animation data.
It can be understood that the original machine learning model needs to be iteratively trained through historical shooting animation data of a plurality of historical users, so as to obtain a preference evaluation model. Optionally, after the training is completed, the trained preference degree evaluation model may be further verified based on verification sample data in the historical shooting animation data, and it is determined whether the preference degree evaluation model needs to continue training based on a verification result.
Illustratively, the raw machine learning model may comprise a neural network model. Further, the neural network model may include a deep learning model.
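The training procedure of S420 — predict, compare with the historical preference label, and adjust the feature weights until the loss converges — can be sketched with a deliberately minimal linear model. This stands in for the neural network the embodiment allows; the gradient-descent details, learning rate, and function names are all assumptions of this sketch:

```python
def train_preference_model(samples, lr=0.1, epochs=500):
    """Fit the weights of a linear preference model by gradient descent on a
    squared-error loss. `samples` is a list of
    (feature_vector, historical_shooting_preference) pairs."""
    n_features = len(samples[0][0])
    weights = [0.0] * n_features
    for _ in range(epochs):
        for features, target in samples:
            predicted = sum(w * x for w, x in zip(weights, features))
            error = predicted - target  # gradient of 0.5 * error**2 w.r.t. prediction
            for i, x in enumerate(features):
                weights[i] -= lr * error * x
    return weights


def predict_preference(weights, features):
    # S430: evaluate the trained model on the current animation data's features.
    return sum(w * x for w, x in zip(weights, features))
```

After training, `predict_preference` plays the role of the preference evaluation model in step S430, and its output is compared against the preset threshold in step S440.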
S430, determining the current shooting preference of the current user for the current animation data based on the trained preference evaluation model.
Specifically, the current animation data is input to the preference evaluation model after training is completed, and the current shooting preference of the current animation data is obtained.
S440, if the current shooting preference degree reaches a preset preference degree threshold value, determining the current animation data as target shooting animation data.
S450, triggering shooting of at least one target object in the current animation data.
In the technical solution of this embodiment, the original machine learning model is trained with the historical shooting data corresponding to the current animation data in the animation historical shooting data of historical users to obtain a preference evaluation model, and the preference evaluation model then evaluates the current shooting preference of the current user for the current animation data. Through the automatic decision of the machine learning model, the shooting preferences reflected in the animation historical shooting data can be fully analyzed, and the current shooting preference of the current animation data can be evaluated quickly and effectively. Moreover, as the animation historical shooting data grows, the model can be continuously and automatically optimized; compared with a fixed statistical scheme, machine learning can evaluate the preference of the current user in a more flexible and dynamic manner, which improves the efficiency of data processing and further enhances the user experience.
EXAMPLE five
Fig. 5 is a schematic structural diagram of a shooting apparatus according to a fifth embodiment of the present invention, which can be used to execute the shooting method according to any embodiment of the present invention, and the apparatus can be implemented by software and/or hardware. The photographing apparatus of an embodiment of the present invention may include: an animation acquisition module 510 and a photographing trigger module 520.
The animation obtaining module 510 is configured to obtain current animation data corresponding to a current user at a current time; and a shooting triggering module 520, configured to trigger shooting of at least one target object in the current animation data based on animation historical shooting data of at least one historical user.
According to the technical solution of the embodiment of the present invention, shooting of the current animation data is triggered based on the animation historical shooting data of historical users, that is, whether the current animation data needs to be shot is determined from the animation historical shooting data. Shooting of the current animation data can thus be triggered automatically, efficient shooting of game pictures can be achieved, pictures that may interest the current user can be shot automatically, and the highlight moments of the current user can be recorded in time, which satisfies the personalized needs of the user and improves the user experience.
On the basis of the technical solutions of the embodiments of the present invention, the shooting trigger module 520 includes:
a shot animation determining unit for determining whether the current animation data is target shot animation data based on historical shot data corresponding to the current animation data in animation historical shot data of at least one historical user;
and the shooting triggering unit is used for triggering shooting of at least one target object in the current animation data if the determination result is positive.
On the basis of each technical solution of the embodiment of the present invention, the shooting animation determining unit may include:
the preference degree determining subunit is used for determining the current shooting preference degree of the current animation data based on historical shooting data corresponding to the current animation data in animation historical shooting data of at least one historical user;
and the shooting animation determining subunit is used for determining the current animation data as the target shooting animation data if the current shooting preference reaches a preset preference threshold.
On the basis of each technical scheme of the embodiment of the invention, the preference degree determining subunit can be used for:
training an original machine learning model based on historical shooting data corresponding to the current animation data in animation historical shooting data of at least one historical user to obtain a preference degree evaluation model, wherein the animation historical shooting data comprises historical shooting animation data and historical shooting preference degrees corresponding to the historical shooting animation data;
and determining the current shooting preference of the current user to the current animation data based on the trained preference evaluation model.
On the basis of the technical solutions of the embodiments of the present invention, the photographing apparatus may further include: an operation information determination module and a historical preference determination module.
The operation information determining module is used for determining historical user operation information corresponding to the current animation data in the animation historical shooting data of at least one historical user; and the historical preference determining module is used for determining the historical shooting preference of the current animation data based on the historical user operation information.
On the basis of the technical solutions of the embodiments of the present invention, the photographing apparatus may further include: the device comprises a first shooting parameter determining module and a first shooting module.
The first shooting parameter determining module is used for determining current shooting parameters based on historical shooting parameters in the user historical shooting data of the current user after shooting of at least one target object in the current animation data is triggered; and the first shooting module is used for shooting at least one target object in the current animation data based on the current shooting parameters.
On the basis of the technical solutions of the embodiments of the present invention, the photographing apparatus may further include: the second shooting parameter determining module and the second shooting module.
The second shooting parameter determining module is used for determining current shooting parameters based on historical shooting parameters corresponding to the current animation data in the animation historical shooting data of at least one historical user after shooting of at least one target object in the current animation data is triggered; and the second shooting module is used for shooting at least one target object in the current animation data based on the current shooting parameters.
The shooting device provided by the embodiment of the invention can execute the shooting method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
It should be noted that, the units and modules included in the above-mentioned shooting device are merely divided according to functional logic, but are not limited to the above-mentioned division as long as the corresponding functions can be realized; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the embodiment of the invention.
EXAMPLE six
Fig. 6 is a schematic structural diagram of an electronic device according to a sixth embodiment of the present invention. FIG. 6 illustrates a block diagram of an exemplary electronic device 12 suitable for use in implementing embodiments of the present invention. The electronic device 12 shown in fig. 6 is only an example and should not bring any limitation to the function and the scope of use of the embodiment of the present invention.
As shown in FIG. 6, electronic device 12 is embodied in the form of a general purpose computing device. The components of electronic device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, enhanced ISA bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Electronic device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by electronic device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM)30 and/or cache memory 32. The electronic device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 6, and commonly referred to as a "hard drive"). Although not shown in FIG. 6, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. System memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in system memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Electronic device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with electronic device 12, and/or with any devices (e.g., network card, modem, etc.) that enable electronic device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, the electronic device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via the network adapter 20. As shown, the network adapter 20 communicates with other modules of the electronic device 12 via the bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with electronic device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing by executing programs stored in the system memory 28, for example, to implement a photographing method provided by the present embodiment.
EXAMPLE seven
An embodiment of the present invention further provides a storage medium containing computer-executable instructions, which when executed by a computer processor, perform a photographing method, including:
acquiring current animation data corresponding to a current user at a current moment;
and triggering the shooting of at least one target object in the current animation data based on animation historical shooting data of at least one historical user.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for embodiments of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing describes only preferred embodiments of the present invention and the technical principles employed. Those skilled in the art will appreciate that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements, and substitutions may be made without departing from the scope of the invention. Therefore, although the present invention has been described in some detail through the above embodiments, it is not limited to them and may include other equivalent embodiments without departing from its concept; the scope of the present invention is defined by the appended claims.

Claims (10)

1. A shooting method, characterized by comprising:
acquiring current animation data corresponding to a current user at a current moment;
and triggering the shooting of at least one target object in the current animation data based on animation historical shooting data of at least one historical user.
2. The method of claim 1, wherein the triggering the shooting of at least one target object in the current animation data based on animation historical shooting data of at least one historical user comprises:
determining whether the current animation data is target shooting animation data or not based on historical shooting data corresponding to the current animation data in animation historical shooting data of at least one historical user;
and if so, triggering the shooting of at least one target object in the current animation data.
3. The method of claim 2, wherein the determining whether the current animation data is target shooting animation data based on historical shooting data corresponding to the current animation data in animation historical shooting data of at least one historical user comprises:
determining the current shooting preference of the current animation data based on historical shooting data corresponding to the current animation data in animation historical shooting data of at least one historical user;
and if the current shooting preference degree reaches a preset preference degree threshold value, determining the current animation data as target shooting animation data.
4. The method of claim 3, wherein the determining the current shooting preference of the current animation data based on historical shooting data corresponding to the current animation data in animation historical shooting data of at least one historical user comprises:
training an original machine learning model based on historical shooting data corresponding to the current animation data in animation historical shooting data of at least one historical user to obtain a preference degree evaluation model, wherein the animation historical shooting data comprises historical shooting animation data and historical shooting preference degrees corresponding to the historical shooting animation data;
and determining the current shooting preference of the current user to the current animation data based on the trained preference evaluation model.
5. The method of claim 3, further comprising:
determining historical user operation information corresponding to the current animation data in animation historical shooting data of at least one historical user;
and determining the historical shooting preference of the current animation data based on the historical user operation information.
6. The method of claim 1, further comprising, after the triggering the shooting of at least one target object in the current animation data:
determining current shooting parameters based on historical shooting parameters in user historical shooting data of the current user;
and shooting at least one target object in the current animation data based on the current shooting parameters.
7. The method of claim 1, further comprising, after the triggering the shooting of at least one target object in the current animation data:
determining current shooting parameters based on historical shooting parameters corresponding to the current animation data in animation historical shooting data of at least one historical user;
and shooting at least one target object in the current animation data based on the current shooting parameters.
8. A shooting apparatus, characterized by comprising:
the animation acquisition module is used for acquiring current animation data corresponding to a current user at the current moment;
and the shooting triggering module is used for triggering the shooting of at least one target object in the current animation data based on the animation historical shooting data of at least one historical user.
9. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the shooting method of any one of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the shooting method according to any one of claims 1-7.
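Taken together, claims 1-7 describe a pipeline: train a preference evaluation model on historical users' shooting data, score the current animation data against a preset preference threshold, trigger a capture when the threshold is reached, and reuse historical shooting parameters for the capture itself. The following is a minimal Python sketch of that flow; the dictionary-based averaging "model", all identifiers, the record layout, and the 0.6 threshold are illustrative assumptions, not the patent's actual implementation (claim 4's model would be a trained machine learning model):

```python
# Illustrative sketch of the claimed capture-triggering flow (claims 1-7).
# All names, thresholds, and the averaging "model" are assumptions.

PREFERENCE_THRESHOLD = 0.6  # stand-in for the preset preference threshold (claim 3)

def train_preference_model(history):
    """Stand-in for claim 4's trained model: mean historical shooting
    preference per animation identifier."""
    totals = {}
    for record in history:
        key = record["animation_id"]
        s, n = totals.get(key, (0.0, 0))
        totals[key] = (s + record["preference"], n + 1)
    return {k: s / n for k, (s, n) in totals.items()}

def should_capture(model, animation_id, threshold=PREFERENCE_THRESHOLD):
    """Claims 2-3: the current animation data is target shooting animation
    data when its preference reaches the preset threshold."""
    return model.get(animation_id, 0.0) >= threshold

def capture_parameters(history, animation_id, default=None):
    """Claims 6-7: derive current shooting parameters from the most recent
    historical shooting parameters recorded for this animation."""
    for record in reversed(history):
        if record["animation_id"] == animation_id and "params" in record:
            return record["params"]
    return default or {"angle": 0, "zoom": 1.0}

# Hypothetical historical shooting data from prior users.
history = [
    {"animation_id": "victory_pose", "preference": 0.9,
     "params": {"angle": 30, "zoom": 1.5}},
    {"animation_id": "victory_pose", "preference": 0.7},
    {"animation_id": "idle", "preference": 0.1},
]
model = train_preference_model(history)
print(should_capture(model, "victory_pose"))   # True  (mean 0.8 >= 0.6)
print(should_capture(model, "idle"))           # False (mean 0.1 <  0.6)
print(capture_parameters(history, "victory_pose"))
```

In this sketch the capture decision depends only on aggregated historical preference for the current animation, which mirrors the claims' distinction between deciding *whether* to shoot (claims 1-5) and *how* to shoot (claims 6-7).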
CN202011626966.8A 2020-12-31 2020-12-31 Shooting method, shooting device, electronic equipment and storage medium Pending CN112791402A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011626966.8A CN112791402A (en) 2020-12-31 2020-12-31 Shooting method, shooting device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112791402A 2021-05-14

Family

ID=75807836

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011626966.8A Pending CN112791402A (en) 2020-12-31 2020-12-31 Shooting method, shooting device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112791402A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070283265A1 (en) * 2006-05-16 2007-12-06 Portano Michael D Interactive gaming system with animated, real-time characters
US20080227516A1 (en) * 2007-03-15 2008-09-18 Boris Itskov Poker video game terminal
CN105260116A (en) * 2015-09-23 2016-01-20 网易(杭州)网络有限公司 Method for capturing image in game in real time
CN106843897A (en) * 2017-02-09 2017-06-13 腾讯科技(深圳)有限公司 A kind of method and apparatus for intercepting game picture
CN109976634A (en) * 2019-03-18 2019-07-05 北京智明星通科技股份有限公司 A kind of game APP screenshot method and equipment

Similar Documents

Publication Publication Date Title
CN112827172B (en) Shooting method, shooting device, electronic equipment and storage medium
CN106375674A (en) Method and apparatus for finding and using video portions that are relevant to adjacent still images
CN112843735B (en) Game picture shooting method, device, equipment and storage medium
CN112423143A (en) Live broadcast message interaction method and device and storage medium
CN108421240A (en) Court barrage system based on AR
CN109939439B (en) Virtual character blocking detection method, model training method, device and equipment
CN112843693B (en) Method and device for shooting image, electronic equipment and storage medium
CN112822555A (en) Shooting method, shooting device, electronic equipment and storage medium
CN112791402A (en) Shooting method, shooting device, electronic equipment and storage medium
CN112791401B (en) Shooting method, shooting device, electronic equipment and storage medium
CN112866562B (en) Picture processing method and device, electronic equipment and storage medium
CN116966557A (en) Game video stream sharing method and device, storage medium and electronic equipment
CN112860360B (en) Picture shooting method and device, storage medium and electronic equipment
CN112843695B (en) Method and device for shooting image, electronic equipment and storage medium
CN112843691B (en) Method and device for shooting image, electronic equipment and storage medium
CN112839171B (en) Picture shooting method and device, storage medium and electronic equipment
CN112843733A (en) Method and device for shooting image, electronic equipment and storage medium
CN114125552A (en) Video data generation method and device, storage medium and electronic device
CN113934766A (en) Go fixed-type playing method and device, electronic equipment and storage medium
CN112843739A (en) Shooting method, shooting device, electronic equipment and storage medium
CN113497946A (en) Video processing method and device, electronic equipment and storage medium
CN112843696A (en) Shooting method, shooting device, electronic equipment and storage medium
CN112843732A (en) Method and device for shooting image, electronic equipment and storage medium
CN112843678B (en) Method and device for shooting image, electronic equipment and storage medium
CN112860372B (en) Method and device for shooting image, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination