CN112839170A - Shooting method, shooting device, electronic equipment and storage medium - Google Patents


Info

Publication number: CN112839170A
Application number: CN202011625298.7A
Authority: CN (China)
Prior art keywords: target, orientation, target objects, target object, shooting
Legal status: Granted; Active
Other languages: Chinese (zh)
Other versions: CN112839170B
Inventors: 胡婷婷, 赵男, 包炎, 刘超, 施一东, 李鑫培, 师锐, 董一夫
Current Assignee: Shanghai Mihoyo Tianming Technology Co Ltd
Original Assignee: Shanghai Mihoyo Tianming Technology Co Ltd
Application filed by Shanghai Mihoyo Tianming Technology Co Ltd
Priority to CN202011625298.7A
Publication of CN112839170A
Application granted; publication of CN112839170B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/61: Control of cameras or camera modules based on recognised objects
    • H04N 23/611: Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/85: Providing additional services to players

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention discloses a shooting method, a shooting device, electronic equipment and a storage medium. The method comprises the following steps: determining at least two target objects to be shot, so that the objects to be captured in the game picture are obtained; determining the orientation information of the target parts of the at least two target objects, so that the orientation of each target object is detected; and triggering shooting of the at least two target objects according to the orientation information. Shooting of the game picture is thus triggered based on the orientation information of each target object, so that the game picture is shot efficiently, key game pictures are shot accurately, a user's highlight moments are recorded in time, and the user experience is improved.

Description

Shooting method, shooting device, electronic equipment and storage medium
Technical Field
The embodiment of the invention relates to the technical field of computers, in particular to a shooting method, a shooting device, electronic equipment and a storage medium.
Background
At present, game screenshots are usually taken manually by the player. With such a manual screenshot approach, the player may operate too late or simply forget to take the screenshot, so various key pictures cannot be obtained in time, for example an interaction picture with a Non-Player Character (NPC), a battle picture against a BOSS, or a special-effect picture of a skill being released during battle; it is therefore difficult for the player to capture pictures that are fleeting during the game. In addition, network delay or stuttering of the player's device may make the timing of a manual screenshot inaccurate, so that a suitable game picture is missed.
Disclosure of Invention
The embodiment of the invention provides a shooting method, a shooting device, electronic equipment and a storage medium, which trigger the shooting of a game picture based on the orientation information of each target object, so that the game picture is shot efficiently, key game pictures are shot accurately, and the user experience is improved.
In a first aspect, an embodiment of the present invention provides a shooting method, where the method includes:
determining at least two target objects to be photographed;
determining orientation information of target parts of the at least two target objects;
and triggering shooting of the at least two target objects according to the orientation information.
In a second aspect, an embodiment of the present invention further provides a shooting apparatus, where the shooting apparatus includes:
the object determining module is used for determining at least two target objects to be shot;
an orientation determination module for determining orientation information of target parts of the at least two target objects;
and the shooting module is used for triggering the shooting of the at least two target objects according to the orientation information.
In a third aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes:
one or more processors;
a storage device for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the photographing method provided by any embodiment of the present invention.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the shooting method provided in any embodiment of the present invention.
The embodiment of the invention has the following advantages or beneficial effects:
determining at least two target objects to be shot, so that the objects to be captured in the game picture are obtained; determining the orientation information of the target parts of the at least two target objects, so that the orientation of each target object is detected; and triggering shooting of the at least two target objects according to the orientation information, so that shooting of the game picture is triggered based on the orientation information of each target object, the game picture is shot efficiently, key game pictures are shot accurately, a user's highlight moments are recorded in time, and the user experience is improved.
Drawings
In order to more clearly illustrate the technical solutions of the exemplary embodiments of the present invention, a brief description is given below of the drawings used in describing the embodiments. It should be clear that the described figures are only views of some of the embodiments of the invention to be described, not all, and that for a person skilled in the art, other figures can be derived from these figures without inventive effort.
Fig. 1 is a schematic flowchart of a shooting method according to an embodiment of the present invention;
FIG. 2 is a schematic view of an orientation angle provided in accordance with an embodiment of the present invention;
fig. 3 is a schematic flowchart of a shooting method according to a second embodiment of the present invention;
FIG. 4 is a schematic view of an orientation mark line provided in the second embodiment of the present invention;
fig. 5 is a schematic diagram illustrating a distance between the orientation mark lines of the second target object and the first target object according to the second embodiment of the present invention;
FIG. 6 is a schematic view of parallel orientation mark lines provided in a second embodiment of the present invention;
fig. 7 is a schematic flowchart of a shooting method according to a third embodiment of the present invention;
fig. 8 is a schematic flowchart of a shooting method according to a fourth embodiment of the present invention;
fig. 9 is a schematic structural diagram of a shooting device according to a fifth embodiment of the present invention;
fig. 10 is a schematic structural diagram of an electronic device according to a sixth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a shooting method according to an embodiment of the present invention. The method is applicable to shooting two or more target objects in a multimedia resource, and in particular to shooting a game picture in real time, for example triggering a shot according to the orientation information of each object in the game picture. For ease of understanding, the shooting method provided by the embodiment of the invention is explained by taking a game scene as an example.
Referring to fig. 1, the shooting method provided in this embodiment specifically includes the following steps:
and S110, determining at least two target objects to be shot.
The target objects may be any of the objects contained in the current game scene. A target object to be shot is a key object to be shot among the objects contained in the current game scene, including but not limited to a player-controlled character, a game monster, a game NPC, a scene building, war supplies, and a skill-release special effect. It is noted that player-controlled characters include, but are not limited to, human characters and animals; scene buildings include, but are not limited to, natural scenery such as mountains, sky and grass, and man-made structures such as churches and arenas.
Specifically, the current game scene may be understood as the game picture displayed on the client of the target player. For example, the current game scene may be of such types as an interaction scene between multiple players, a battle scene of a player, a conversation scene between a player and an NPC, a scene of a player enjoying scenery, and/or a scene of a player picking up war supplies. A battle scene of a player may be a scene in which at least one player-controlled character fights a game monster, or a scene in which several player-controlled characters fight each other.
It should be noted that the current game scene may contain objects other than the target objects to be photographed; therefore, at least two target objects to be photographed need to be determined from the current game scene so that they can be photographed.
In one embodiment, the at least two target objects to be photographed may be determined based on the scene type of the current game scene. The association between each scene type and the corresponding target objects to be photographed is stored in advance. After the scene type of the current game scene is identified, the target objects to be photographed that correspond to the current game scene can be determined from the pre-stored associations. Illustratively, when the current game scene is of the player-battle scene type, the corresponding at least two target objects to be photographed include at least one player-controlled character and the game monsters battling each controlled character, or the player-controlled characters battling one another. When the current game scene is of the multi-player interaction scene type, the corresponding at least two target objects to be photographed include at least two interacting player-controlled characters. When the current game scene is of the type in which a player picks up war supplies, the corresponding at least two target objects to be photographed include a player-controlled character and the war supplies. In other words, the at least two target objects to be photographed that correspond to the current game scene can be determined by detecting the type of the target player's current game scene.
In another embodiment, the at least two target objects to be photographed include at least the character controlled by the target player. The remaining target objects to be photographed may be the objects that the target player-controlled character interacts with in the current game scene. Such interaction objects may be other player-controlled characters, game NPCs, game monsters, scene buildings, war supplies, skill-release special effects and the like. Specifically, the target player-controlled character in the current game scene can be monitored in real time, so that when it performs an interaction, each of its interaction objects is determined as one of the remaining target objects to be shot.
In another embodiment, at least two target objects to be photographed may be determined based on the attribute information of the target objects.
The attribute information of a target object includes, but is not limited to, player character, monster, NPC, building, specific item, and skill. Specifically, when the attribute information of the target objects is detected to include preset attribute information, the target objects matching the preset attribute information are selected as the at least two target objects to be photographed. The preset attribute information may be player character + monster, player character + NPC, player character + building, and so on.
In these alternative embodiments, determining the at least two target objects to be photographed based on their attribute information means that the target objects are selected according to their attributes, so that the target objects to be photographed are determined accurately.
S120, determining the orientation information of the target parts of the at least two target objects.
The target part in this embodiment may be a preset part of the target object, typically a hand, the face, a foot, the back, or the like. Optionally, the target part of the target object is determined based on feature points of the target object. Specifically, the target part may be the part where a feature point of the target object is located, and the feature point may be the root node, the center of gravity, the center of mass, the point closest to or farthest from another target object, or the like. In other words, the target part may be a preset part such as the one containing the root node, center of gravity or center of mass, or the part containing the detected point that is closest to or farthest from another target object.
Alternatively, the target part may be determined based on the scene type of the current game scene to which the target objects to be photographed belong. For example, if the scene type of the current game scene is a player battle scene, then considering that most fights between a player-controlled character and a game monster are triggered face to face, the target parts corresponding to the battle scene type may all be faces. The target part of each target object to be photographed may be the same or different; for example, the target part of target object A may be the face while the target part of target object B is a foot.
Considering that the current game picture of the battle scene type may also be a picture of the player releasing a skill on a game monster, the target parts corresponding to the battle scene type may also include the part of the player-controlled character that releases the skill, such as a hand or the mouth, and the face of the game monster; that is, the orientation information of the player-controlled character is the orientation information of its hand, and the orientation information of the game monster is the orientation information of its face.
In addition, the current game picture of the battle scene type may also be a cooperative battle picture of several player-controlled characters, for example several player-controlled characters fighting a game monster back to back. In this case, the target parts corresponding to the battle scene type may include the backs of the player-controlled characters and the face of the game monster; that is, the orientation information of each player-controlled character is the orientation information of its back, and the orientation information of the game monster is the orientation information of its face.
In another embodiment, the target part may be determined based on the point at which each target object is closest to the other target objects. For example, if a player-controlled character runs towards a building, the target part of that character is a foot; when a player-controlled character points a weapon at an opponent or a monster, the target part of that character may be the part with which it controls the weapon, such as a hand or a foot.
The orientation information is used to characterize the direction in which the target part is oriented. Optionally, the orientation information of different target parts may be the same or different.
Specifically, determining the orientation information of the target parts of the at least two target objects may include at least one of the following ways:
determining the orientation information of the target part based on a preset marker surface of the target part;
determining the orientation information of the target part based on the extending direction of the target part;
determining the orientation information of the target part based on the motion information of the target part.
Specifically, determining the orientation information of the target part based on a preset marker surface of the target part may be done by drawing, from a preset feature point of the target part, a perpendicular to the preset marker surface, and determining the orientation information of the target part from the feature region of the target part and that perpendicular. It is to be understood that the orientation information of this surface may simply be referred to as the orientation information. It should be noted that the marker surface of the target part may be preset according to actual requirements, and different target parts may use different marker surfaces; the way the marker surface is selected is not limited here.
Taking the head or the face as the target part as an example, the preset feature point may be the center of the brow, the midpoint of the line connecting the two inner corners of the eyes, or the center of the face region. The feature region of the face may be the region where the facial features are located, and the preset marker surface of the face may be the plane corresponding to that feature region.
Optionally, determining the orientation information of the target part based on its extending direction may be done by determining the orientation information from the extending direction of the target end of the target part. The target end may be a free end that is not connected to other parts, such as a fingertip, the tip of a toe, or the crown of the head. The orientation information of the target part may then be the direction in which the fingertip, toe tip, or top of the head points.
Alternatively, determining the orientation information of the target part based on the motion information of the target part may be done by determining the traveling direction of the target object from the motion information of the target part and taking that traveling direction as the orientation information of the target part.
Specifically, a corresponding orientation angle may be determined based on the orientation information of each target object. The orientation angle may be the angle between the orientation of the target part of the target object and a reference direction. The reference direction may be the positive or negative direction of a horizontal reference direction or of a vertical reference direction. It will be appreciated that the horizontal and vertical reference directions, and their positive and negative directions, may be determined based on a pre-constructed coordinate system. Illustratively, fig. 2 shows orientation information for a case in which there are two target objects to be shot, the reference direction is the positive horizontal direction, and the target part is the hand of each target object.
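For illustration only (this sketch is not part of the patent disclosure), the orientation angle described above can be computed as the angle between a 2D orientation vector of the target part and the horizontal reference direction; the function name and the use of 2D scene coordinates are assumptions.

```python
import math

def orientation_angle(direction, reference=(1.0, 0.0)):
    """Angle in degrees between a target part's 2D orientation vector
    and a reference direction (default: positive horizontal direction)."""
    dx, dy = direction
    rx, ry = reference
    # Signed angle via atan2 of the cross and dot products, then its magnitude.
    angle = math.degrees(math.atan2(dx * ry - dy * rx, dx * rx + dy * ry))
    return abs(angle)  # value in 0..180

# Example: a hand pointing up-right vs. one pointing left
print(orientation_angle((1.0, 1.0)))   # 45.0
print(orientation_angle((-1.0, 0.0)))  # 180.0
```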
S130, triggering shooting of the at least two target objects according to the orientation information.
Specifically, the orientation angle difference between the target objects may be determined based on the orientation angles of the target objects, and the shooting of at least two target objects may be triggered based on the orientation angle difference.
For example, the orientation angle difference may be compared with a preset angle difference threshold or a preset angle range; if the orientation angle difference does not exceed the preset angle difference threshold or falls within the preset angle range, shooting of the at least two target objects is triggered. The orientation angle difference is the absolute value of the difference between the orientation angles of the respective target objects. The preset angle difference threshold and the preset angle range may be set according to actual requirements; for example, the preset angle difference threshold may be 0°, 30°, 45°, 90°, 180°, or the like, and the preset angle range may be 0° to 30°, 75° to 105°, or the like. The specific value of the preset angle difference threshold and the specific endpoints of the preset angle range are not limited in the present application.
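As a hedged sketch of the angle-difference check (not the patented implementation), the example threshold of 30° and range of 75° to 105° below are taken from the illustrative values above; the function name and signature are assumptions.

```python
def angle_difference_trigger(angle_a, angle_b, max_diff=None, diff_range=None):
    """Return True when the orientation angle difference between two target
    objects does not exceed a preset threshold or falls in a preset range."""
    diff = abs(angle_a - angle_b)
    if max_diff is not None and diff <= max_diff:
        return True
    if diff_range is not None and diff_range[0] <= diff <= diff_range[1]:
        return True
    return False

# Face-to-face interaction with an NPC: small threshold
print(angle_difference_trigger(10.0, 25.0, max_diff=30.0))              # True
# Player vs. monster: wider tolerance expressed as a range
print(angle_difference_trigger(20.0, 110.0, diff_range=(75.0, 105.0)))  # True
```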
In one embodiment, the preset angle difference threshold may be determined based on the attribute information of each target object and/or the scene in which the target objects are located. For example, if the attribute information of the target objects includes a player character and an NPC, the preset angle difference threshold may be set smaller, since interaction between a player character and an NPC is generally face to face. If the attribute information includes a player character and a monster, the preset angle difference threshold may be set larger, since in a battle the player character may release a skill while facing downward and the monster may be turned sideways looking for the player, so the orientation angle difference between them can be large.
In this embodiment, if the orientation angle difference between two target objects does not exceed the preset angle difference threshold or falls within the preset angle range, shooting of only those two target objects may be triggered, or shooting of all target objects may be triggered.
In one embodiment, a preset reference level may be set in advance for each target object. Based on these preset reference levels, when the orientation angle difference between the two target objects with the highest reference levels does not exceed the preset angle difference threshold or falls within the preset angle range, shooting of those two target objects, or of all target objects, may be triggered. The preset reference level may be set according to the importance of the target object, which is not limited in this application.
According to the technical solution of this embodiment, at least two target objects to be shot are determined, so that the objects to be captured in the game picture are obtained; the orientation information of the target parts of the at least two target objects is determined, so that the orientation of each target object is detected; and shooting of the at least two target objects is triggered according to the orientation information, so that shooting of the game picture is triggered based on the orientation information of each target object, the game picture is shot efficiently, key game pictures are shot accurately, a user's highlight moments are recorded in time, and the user experience is improved.
Example two
Fig. 3 is a schematic flowchart of a shooting method according to a second embodiment of the present invention. On the basis of the foregoing embodiments, this embodiment optionally refines the triggering of shooting of the at least two target objects according to the orientation information into: constructing an orientation marker line for each target object based on its orientation information; and triggering shooting of the at least two target objects based on the orientation marker lines. Explanations of terms that are the same as or correspond to those of the above embodiments are omitted here.
Referring to fig. 3, the photographing method provided by the present embodiment includes the steps of:
and S310, determining at least two target objects to be shot.
S320, determining the orientation information of the target parts of at least two target objects.
And S330, constructing orientation marking lines respectively based on the orientation information of each target object.
An orientation marker line may be understood as a ray constructed from the orientation information to indicate the orientation of a target object. The orientation marker line may be a ray whose endpoint is a feature point of the target part of the target object and which extends towards the other target objects, or along the pointing direction or traveling direction of the target object. As mentioned above, the feature point of the target part may be chosen according to actual requirements and is not specifically limited here. Illustratively, fig. 4 shows orientation marker lines constructed based on the orientation information of the target objects.
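As a rough sketch (not taken from the patent), an orientation marker line can be represented as a ray whose endpoint is a feature point of the target part and whose direction follows the orientation information; the `MarkerLine` class and its fields are hypothetical names.

```python
from dataclasses import dataclass
import math

@dataclass
class MarkerLine:
    """Orientation marker line: a ray starting at a feature point of the
    target part and extending along the target object's orientation."""
    origin: tuple     # feature point of the target part, e.g. (x, y)
    direction: tuple  # orientation of the target part (need not be unit length)

    def unit_direction(self):
        dx, dy = self.direction
        norm = math.hypot(dx, dy)
        return (dx / norm, dy / norm)

# Marker line for a character at (2, 3) whose hand points to the right
line_a = MarkerLine(origin=(2.0, 3.0), direction=(1.0, 0.0))
print(line_a.unit_direction())  # (1.0, 0.0)
```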
S340, triggering shooting of the at least two target objects based on the orientation marker lines.
Optionally, the at least two target objects include at least a first target object and a second target object. Wherein the first target object and the second target object may be determined based on a preset reference level of the target object. Specifically, a target object whose preset reference level is higher than the set level threshold may be determined as a first target object, and a target object other than the first target object may be determined as a second target object. It should be noted that the number of the first target objects may be one or more, and correspondingly, the number of the second target objects may also be one or more.
The present embodiment may also determine the first target object and the second target object based on the scene type of the current game scene. For example, if the scene type of the current game scene is a battle scene of a player, the first target object may be a player-controlled character, and the second target object may be a target object other than the player-controlled character, such as a game monster.
Optionally, triggering shooting of the at least two target objects based on the second target object and the orientation marker line of the first target object includes: if the distance between the second target object and the orientation marker line of the first target object is smaller than a preset distance threshold, triggering shooting of the at least two target objects. The distance between the second target object and the orientation marker line of the first target object may be the shortest straight-line distance between them; for example, as shown in fig. 5, a perpendicular may be drawn from any point of the second target object to the orientation marker line of the first target object, and the length d of that perpendicular segment represents the distance between the second target object and the orientation marker line of the first target object.
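A minimal sketch, assuming 2D scene coordinates, of the check on the distance between the second target object and the orientation marker line of the first target object; the function name and the threshold value are illustrative assumptions.

```python
import math

def distance_to_marker_line(point, origin, direction):
    """Shortest distance from a target object's position `point` to an
    orientation marker line (a ray from `origin` along `direction`)."""
    px, py = point[0] - origin[0], point[1] - origin[1]
    dx, dy = direction
    norm = math.hypot(dx, dy)
    dx, dy = dx / norm, dy / norm
    t = px * dx + py * dy          # projection of the point onto the ray
    if t <= 0.0:                   # point lies behind the ray's endpoint
        return math.hypot(px, py)
    return abs(px * dy - py * dx)  # perpendicular distance d as in fig. 5

# Second target at (5, 1); first target's marker line starts at (0, 0) along +x
d = distance_to_marker_line((5.0, 1.0), (0.0, 0.0), (1.0, 0.0))
print(d)         # 1.0
print(d <= 1.5)  # True: below the preset distance threshold, trigger the shot
```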
It should be noted that when at least one first target object and at least one second target object satisfy the trigger condition, only the first and second target objects that satisfy the trigger condition may be shot, or all target objects, including those other than the first and second target objects, may be shot.
In another embodiment, when the orientation marker line of a target object passes through another target object, shooting of that target object and the target object it passes through may be triggered, or shooting of all target objects may be triggered. For example, as shown in fig. 4, the orientation marker line of target object A passes through target object C; at this time, shooting of target objects A and C may be triggered, or shooting of target objects A, B and C may be triggered. Shooting may be triggered when the orientation marker line of a target object passes through any part of another target object, or only when it passes through the target part of another target object.
Optionally, triggering shooting of the at least two target objects based on the orientation marker lines includes: triggering shooting of the at least two target objects based on the relative position of the orientation marker line of the first target object and the orientation marker line of the second target object. In this embodiment, shooting of the at least two target objects may be triggered when the two orientation marker lines are parallel, intersect, or lie on the same straight line.
Optionally, triggering shooting of the at least two target objects based on the relative position of the orientation marker line of the first target object and the orientation marker line of the second target object includes at least one of the following: if the orientation marker line of the first target object is parallel to the orientation marker line of the second target object, triggering shooting of the at least two target objects; if the orientation marker line of the first target object intersects the orientation marker line of the second target object, triggering shooting of the at least two target objects; and if the orientation marker line of the first target object and the orientation marker line of the second target object lie on the same straight line, triggering shooting of the at least two target objects.
Here, the orientation marker line of the first target object being parallel to the orientation marker line of the second target object includes: the direction of the first target object's orientation marker line being the same as, or opposite to, the direction of the second target object's orientation marker line. For example, as shown in fig. 6, the orientation marker lines of the first target object G and the second target object E point in the same direction; they may be, for example, player G and player E when several player-controlled characters interact. The orientation marker lines of the first target object G and the second target object F point in opposite directions; they may be, for example, a player-controlled character and a game monster.
In this embodiment, the orientation marker line of the first target object intersecting the orientation marker line of the second target object means that the two marker lines have a common point, as the orientation marker lines of target object A and target object B intersect in fig. 4. The first target object and the second target object may be, for example, the respective player-controlled characters when several player-controlled characters fight cooperatively.
Specifically, when the orientation marker lines of target objects intersect, shooting of the target objects whose marker lines intersect may be triggered, or shooting of all target objects may be triggered. For example, as shown in fig. 4, the orientation marker lines of target object A and target object B intersect; at this time, shooting of target objects A and B may be triggered, or shooting of target objects A and B together with target object C may be triggered.
Here, the orientation marker line of the first target object and the orientation marker line of the second target object lying on the same straight line includes: the two orientation marker lines overlapping completely or partially; the orientation marker line of the first target object partially overlapping the extension of the orientation marker line of the second target object; or the orientation marker line of the second target object partially overlapping the extension of the orientation marker line of the first target object.
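The parallel, intersecting and same-straight-line cases above can be classified with a short geometric test. The sketch below assumes 2D coordinates and the ray representation of marker lines; the function name, tolerance and the extra 'none' case (rays that never meet) are assumptions for illustration.

```python
def relative_position(o1, d1, o2, d2, eps=1e-9):
    """Classify two orientation marker lines (rays given as origin/direction
    pairs) as 'collinear', 'parallel', 'intersect', or 'none'."""
    cross_dirs = d1[0] * d2[1] - d1[1] * d2[0]
    ox, oy = o2[0] - o1[0], o2[1] - o1[1]
    if abs(cross_dirs) < eps:                       # same or opposite direction
        offset_cross = ox * d1[1] - oy * d1[0]
        return "collinear" if abs(offset_cross) < eps else "parallel"
    # Solve o1 + t*d1 == o2 + s*d2; the rays meet only if t >= 0 and s >= 0.
    t = (ox * d2[1] - oy * d2[0]) / cross_dirs
    s = (ox * d1[1] - oy * d1[0]) / cross_dirs
    return "intersect" if t >= 0 and s >= 0 else "none"

print(relative_position((0, 0), (1, 0), (0, 1), (1, 0)))   # parallel (same direction)
print(relative_position((0, 0), (1, 0), (5, 0), (-1, 0)))  # collinear (face to face)
print(relative_position((0, 0), (1, 1), (4, 0), (-1, 1)))  # intersect at (2, 2)
```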
In this embodiment, shooting of the at least two target objects is triggered when the orientation marker line of the first target object and the orientation marker line of the second target object are parallel, intersect, or lie on the same straight line, so that shooting of the current game picture is triggered accurately, key game pictures are shot accurately, and the user experience is improved.
In the above optional embodiments, shooting of the at least two target objects is triggered based on the position of the second target object relative to the orientation marker line of the first target object and/or based on the relative position of the orientation marker line of the first target object and the orientation marker line of the second target object, so that shooting of the current game picture is triggered accurately, key game pictures are shot accurately, and the user experience is improved.
According to the technical solution of this embodiment, an orientation marker line is constructed for each target object based on its orientation information, and shooting of the at least two target objects is triggered based on the orientation marker lines, so that shooting of the current game picture is triggered accurately, key game pictures are shot accurately, and the user experience is improved.
Example three
Fig. 7 is a schematic flowchart of a shooting method according to a third embodiment of the present invention. On the basis of the foregoing embodiments, this embodiment optionally refines the triggering of shooting of the at least two target objects according to the orientation information into: determining at least one reference object among the at least two target objects; and triggering shooting of the at least two target objects based on the orientation information of the at least one reference object. Explanations of terms that are the same as or correspond to those of the above embodiments are omitted here.
Referring to fig. 7, the photographing method provided by the present embodiment includes the steps of:
and S710, determining at least two target objects to be shot.
S720, determining the orientation information of the target parts of at least two target objects.
S730, determining at least one reference object in the at least two target objects.
Here, the reference object may be understood as the main shooting object among the at least two target objects to be shot. The reference object is determined based on a preset selection rule. The preset selection rule includes various game scene types and the reference object corresponding to each game scene type. For example, the preset selection rule may include: the reference object corresponding to the battle scene type is the target player-controlled character; and the reference object corresponding to the special-effect release scene type is the skill-release special effect. It should be noted that the number of reference objects may be one or more, which is not limited in the present application; one reference object is taken as an example in this embodiment.
For example, at least one reference object among the at least two target objects may be determined based on the attribute information of the target objects; for instance, a dynamic target object may be used as the reference object. Of course, a reference object may also be set in advance; for example, a player-controlled character may be used as the reference object, or a particular monster in a certain game scene may be used as the reference object, and so on.
S740, triggering shooting of the at least two target objects based on the orientation information of the at least one reference object.
Similarly, an orientation marker line may be constructed based on the orientation information of the at least one reference object, and shooting of the reference object and the target objects may be triggered based on that orientation marker line. For example, when the orientation marker line of the reference object passes through the position of a target object, or the distance between the position of a target object and the orientation marker line of the reference object is smaller than a preset distance threshold, shooting of the reference object and the target object is triggered.
In one embodiment, the target object with the highest reference level may be selected based on preset reference levels of the target objects other than the reference object. Shooting of the reference object and that highest-level target object is then triggered only when the orientation marker line of the reference object passes through the position of the highest-level target object, or the distance between them is smaller than the preset distance threshold; the positions of the other target objects relative to the reference object's orientation marker line are not taken into account.
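An illustrative sketch, not part of the disclosure, of the reference-object trigger described above: the non-reference target object with the highest preset reference level is selected and the distance from its position to the reference object's orientation marker line is compared with the threshold. All names, dictionary keys and values are assumptions.

```python
import math

def trigger_by_reference(reference, targets, distance_threshold):
    """Select the target with the highest preset reference level and check
    whether it is close enough to the reference object's marker line."""
    best = max(targets, key=lambda t: t["level"])
    dx, dy = reference["direction"]
    norm = math.hypot(dx, dy)
    dx, dy = dx / norm, dy / norm
    px = best["pos"][0] - reference["pos"][0]
    py = best["pos"][1] - reference["pos"][1]
    t = px * dx + py * dy
    dist = math.hypot(px, py) if t <= 0 else abs(px * dy - py * dx)
    return best["name"], dist <= distance_threshold

player = {"pos": (0.0, 0.0), "direction": (1.0, 0.0)}   # reference object
boss = {"name": "boss", "pos": (6.0, 0.5), "level": 9}
npc = {"name": "npc", "pos": (1.0, 5.0), "level": 2}
print(trigger_by_reference(player, [boss, npc], distance_threshold=1.0))  # ('boss', True)
```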
According to the technical solution of this embodiment, at least one reference object among the at least two target objects is determined, and shooting of the at least two target objects is triggered based on the orientation information of the at least one reference object, so that triggering is based on the main shooting object, a user's highlight moments or special pictures are recorded in time, and the user experience is improved.
Example four
Fig. 8 is a schematic flowchart of a shooting method according to a fourth embodiment of the present invention. On the basis of the foregoing embodiments, this embodiment optionally further includes: calculating the object distance between the at least two target objects. Correspondingly, triggering shooting of the at least two target objects according to the orientation information includes: triggering shooting of the at least two target objects according to the orientation information and the object distance.
Explanations of terms that are the same as or correspond to those of the above embodiments are omitted here.
Referring to fig. 8, the photographing method provided by the present embodiment includes the steps of:
and S810, determining at least two target objects to be shot.
And S820, determining the orientation information of the target parts of at least two target objects.
S830, calculating the object distance between at least two target objects.
The object distance is understood to be the Euclidean distance between the at least two target objects, i.e. the straight-line distance between them. Specifically, the object distance between the at least two target objects may be calculated based on the position information of the target feature points of the at least two target objects. The target feature point may be, but is not limited to, the root node, the center of gravity or the center of mass of the target object, the point closest to or farthest from another target object, and the like.
The position information refers to the position of the target object in the current game scene, and can be understood as the coordinates of the target object in the current game scene. Specifically, the coordinates of the target object in the current game scene can be obtained by abstracting the whole target object into a single coordinate point, or by abstracting a preset part of the target object into a coordinate point.
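A short sketch of the object distance, assuming each target object has been abstracted to a 2D feature point; `object_distance` is a hypothetical name.

```python
import math

def object_distance(feature_point_a, feature_point_b):
    """Euclidean (straight-line) distance between the target feature points
    of two target objects, e.g. their root nodes or centers of gravity."""
    return math.dist(feature_point_a, feature_point_b)

# Player character's root node vs. a monster's root node in scene coordinates
print(object_distance((1.0, 2.0), (4.0, 6.0)))  # 5.0
```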
S840, triggering shooting of the at least two target objects according to the orientation information and the object distance.
In one embodiment, shooting of the at least two target objects may be triggered when the object distance is within a preset distance range and the orientation angle difference between the target objects does not exceed a preset angle difference threshold or falls within a preset angle range.
Illustratively, when the object distance is within the preset distance range, shooting of the at least two target objects may be triggered based on the orientation marker lines of the target objects. For triggering shooting of the at least two target objects based on the orientation marker lines, reference may be made to the methods in the second and third embodiments, and details are not repeated here.
The size of the preset distance range in this embodiment may be preset according to the attribute information of the at least two target objects, so that different combinations of target objects have their own preset distance ranges. For example, if the attribute information of the target objects is player character and player character, player character and NPC, player character and specific item, or player character and skill, the preset distance range may be set smaller; if the attributes of the target objects are player character and monster, or player character and building, the preset distance range may be set larger. In other words, the corresponding preset distance range is determined according to the attribute information of the at least two target objects, it is judged whether the calculated object distance is within that preset distance range, and if so, the system is triggered to perform an automatic shooting operation on the at least two target objects.
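A sketch combining the distance condition and the orientation condition, with attribute-dependent preset distance ranges; the attribute labels, range values and angle threshold below are illustrative assumptions rather than values from the patent.

```python
def combined_trigger(attr_pair, obj_distance, angle_diff,
                     distance_ranges=None, max_angle_diff=30.0):
    """Trigger shooting only when the object distance lies in the preset
    range for this attribute pair AND the orientation angle difference
    does not exceed the preset threshold."""
    if distance_ranges is None:
        distance_ranges = {
            frozenset(("player", "npc")): (0.0, 3.0),      # close, face-to-face
            frozenset(("player", "monster")): (0.0, 15.0), # battles span more space
        }
    low, high = distance_ranges.get(frozenset(attr_pair), (0.0, 10.0))
    return low <= obj_distance <= high and angle_diff <= max_angle_diff

print(combined_trigger(("player", "npc"), obj_distance=2.0, angle_diff=10.0))      # True
print(combined_trigger(("monster", "player"), obj_distance=12.0, angle_diff=25.0)) # True
```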
It should be noted that S820 and S830 in this embodiment may be executed in either order, or simultaneously.
According to the technical solution of this embodiment, the object distance between the target objects is calculated, and shooting of the at least two target objects is triggered according to the orientation information and the object distance, so that shooting of the game picture is triggered based on both the distance and the orientation information and key game pictures are shot accurately and in real time.
Example five
Fig. 9 is a schematic structural diagram of a shooting device according to a fifth embodiment of the present invention. The device is applicable to shooting two or more target objects in a multimedia resource, and in particular to shooting a game picture in real time, for example triggering a shot according to the orientation information of each object in the game picture. The device specifically includes: an object determination module 910, an orientation determination module 920, and a shooting module 930.
An object determination module 910, configured to determine at least two target objects to be photographed;
an orientation determining module 920, configured to determine orientation information of target portions of at least two target objects;
and a shooting module 930 for triggering shooting of at least two target objects according to the orientation information.
In this embodiment, at least two target objects to be shot are determined by the object determination module, so that the objects to be captured in the game picture are obtained; the orientation information of the target parts of the at least two target objects is determined by the orientation determination module, so that the orientation of each target object is detected; and shooting of the at least two target objects is triggered by the shooting module according to the orientation information, so that shooting of the game picture is triggered based on the orientation information of each target object, the game picture is shot efficiently, key game pictures are shot accurately, a user's highlight moments are recorded in time, and the user experience is improved.
Optionally, the shooting module 930 includes a marker line construction unit and a trigger unit. The marker line construction unit is used for constructing an orientation marker line for each target object based on its orientation information; the trigger unit is used for triggering shooting of the at least two target objects based on the orientation marker lines.
Optionally, the at least two target objects include at least a first target object and a second target object, and correspondingly the trigger unit includes a first trigger subunit and a second trigger subunit. The first trigger subunit is used for triggering shooting of the at least two target objects based on the second target object and the orientation marker line of the first target object; the second trigger subunit is used for triggering shooting of the at least two target objects based on the relative position of the orientation marker line of the first target object and the orientation marker line of the second target object.
Optionally, the second trigger subunit includes at least one of the following subunits:
a parallel trigger subunit, used for triggering shooting of the at least two target objects when the orientation marker line of the first target object is parallel to the orientation marker line of the second target object;
an intersection trigger subunit, used for triggering shooting of the at least two target objects when the orientation marker line of the first target object intersects the orientation marker line of the second target object;
and a straight-line trigger subunit, used for triggering shooting of the at least two target objects when the orientation marker line of the first target object and the orientation marker line of the second target object lie on the same straight line.
Optionally, the shooting module 930 further includes a reference object shooting unit for determining at least one reference object among the at least two target objects and triggering shooting of the at least two target objects based on the orientation information of the at least one reference object.
Optionally, the shooting device further includes a distance calculation module for calculating the object distance between the at least two target objects; correspondingly, the shooting module 930 further includes an orientation-and-distance shooting unit for triggering shooting of the at least two target objects according to the orientation information and the object distance.
Optionally, the orientation determination module includes at least one of the following units:
a face orientation determination unit for determining orientation information of faces of at least two of the target objects;
a hand orientation determination unit for determining orientation information of hands of at least two of the target objects;
a foot orientation determining unit for determining orientation information of the feet of at least two target objects;
a back orientation determining unit for determining orientation information of the backs of at least two of the target objects.
The shooting device provided by the embodiment of the invention can execute the shooting method provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
It should be noted that, the units and modules included in the system are merely divided according to functional logic, but are not limited to the above division as long as the corresponding functions can be realized; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the embodiment of the invention.
Example six
Fig. 10 is a schematic structural diagram of an electronic device according to a sixth embodiment of the present invention. FIG. 10 illustrates a block diagram of an exemplary electronic device 12 suitable for use in implementing embodiments of the present invention. The electronic device 12 shown in fig. 10 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present invention. The electronic device 12 is typically a device that performs the function of triggering shooting.
As shown in FIG. 10, electronic device 12 is embodied in the form of a general purpose computing device. The components of electronic device 12 may include, but are not limited to: one or more processors or processing units 16, a memory 28, and a bus 18 that couples the various components (including the memory 28 and the processing unit 16).
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an enhanced ISA bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus.
Electronic device 12 typically includes a variety of computer-readable media. Such media may be any available media that is accessible by electronic device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
Memory 28 may include computer device readable media in the form of volatile Memory, such as Random Access Memory (RAM) 30 and/or cache Memory 32. The electronic device 12 may further include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, the storage device 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 10, and commonly referred to as a "hard drive"). Although not shown in FIG. 10, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a Compact disk-Read Only Memory (CD-ROM), a Digital Video disk (DVD-ROM), or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product 40, with program product 40 having a set of program modules 42 configured to carry out the functions of embodiments of the invention. Program product 40 may be stored, for example, in memory 28, and such program modules 42 include, but are not limited to, one or more application programs, other program modules, and program data, each of which examples or some combination may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Electronic device 12 may also communicate with one or more external devices 14 (e.g., a keyboard, a mouse, a camera and a display), with one or more devices that enable a user to interact with electronic device 12, and/or with any devices (e.g., a network card, a modem, etc.) that enable electronic device 12 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 22. Also, the electronic device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 20. As shown, the network adapter 20 communicates with other modules of the electronic device 12 via the bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with electronic device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, disk array (RAID) systems, tape drives, and data backup storage systems.
The processor 16 executes various functional applications and data processing by executing programs stored in the memory 28, for example, implementing the photographing method provided by the above-described embodiment of the present invention, including:
determining at least two target objects to be photographed;
determining orientation information of target parts of at least two target objects;
and triggering to shoot at least two target objects according to the orientation information.
Of course, those skilled in the art can understand that the processor can also implement the technical solution of the shooting method provided by any embodiment of the present invention.
Example seven
An embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the photographing method provided in any embodiment of the present invention, and the method includes:
determining at least two target objects to be shot;
determining orientation information of target parts of the at least two target objects;
and triggering shooting of the at least two target objects according to the orientation information.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for embodiments of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It should be noted that the foregoing is merely illustrative of the preferred embodiments of the present invention and the technical principles employed. Those skilled in the art will understand that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements, and substitutions can be made without departing from the scope of the invention. Therefore, although the present invention has been described in some detail through the above embodiments, it is not limited to those embodiments and may include other equivalent embodiments without departing from the concept of the invention; the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A shooting method, characterized by comprising:
determining at least two target objects to be shot;
determining orientation information of target parts of the at least two target objects;
and triggering shooting of the at least two target objects according to the orientation information.
2. The method of claim 1, wherein the triggering shooting of the at least two target objects according to the orientation information comprises:
constructing an orientation marking line for each target object based on the orientation information of that target object;
and triggering shooting of the at least two target objects based on the orientation marking lines.
3. The method of claim 2, wherein the at least two target objects comprise at least a first target object and a second target object;
the triggering shooting of the at least two target objects based on the orientation marking lines comprises:
triggering shooting of the at least two target objects based on the orientation marking line of the first target object and the second target object; alternatively,
triggering shooting of the at least two target objects based on the relative positions of the orientation marking line of the first target object and the orientation marking line of the second target object.
4. The method of claim 3, wherein the triggering shooting of the at least two target objects based on the relative positions of the orientation marking line of the first target object and the orientation marking line of the second target object comprises at least one of:
if the orientation marking line of the first target object is parallel to the orientation marking line of the second target object, triggering shooting of the at least two target objects;
if the orientation marking line of the first target object intersects the orientation marking line of the second target object, triggering shooting of the at least two target objects;
and if the orientation marking line of the first target object and the orientation marking line of the second target object lie on the same straight line, triggering shooting of the at least two target objects.
5. The method of claim 1, wherein the triggering shooting of the at least two target objects according to the orientation information comprises:
determining at least one reference object from the at least two target objects;
and triggering shooting of the at least two target objects based on the orientation information of the at least one reference object.
6. The method of claim 1, further comprising:
calculating an object distance between the at least two target objects;
correspondingly, the triggering shooting of the at least two target objects according to the orientation information comprises:
triggering shooting of the at least two target objects according to the orientation information and the object distance.
7. The method of claim 1, wherein the determining orientation information of target parts of the at least two target objects comprises at least one of:
determining orientation information of faces of the at least two target objects;
determining orientation information of hands of the at least two target objects;
determining orientation information of feet of the at least two target objects;
and determining orientation information of backs of the at least two target objects.
8. A shooting device, comprising:
an object determination module, configured to determine at least two target objects to be shot;
an angle determination module, configured to determine orientation information of target parts of the at least two target objects;
and a shooting module, configured to trigger shooting of the at least two target objects according to the orientation information.
9. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the shooting method according to any one of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the shooting method according to any one of claims 1-7.
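For illustration only, the line relationships named in claims 2-4 (parallel, intersecting, or lying on the same straight line) can be checked with elementary 2D geometry. The point-plus-direction representation of an orientation marking line and the tolerance value below are assumptions introduced for this sketch, not part of the claimed subject matter.

# Illustrative geometry for the line relationships named in claims 2-4; the
# point-plus-direction representation and the tolerance are assumptions.
EPS = 1e-6


def _cross(u, v):
    return u[0] * v[1] - u[1] * v[0]


def classify_orientation_lines(p1, d1, p2, d2):
    # Each orientation marking line is given by a point p and a direction vector d.
    if abs(_cross(d1, d2)) < EPS:          # directions are parallel
        offset = (p2[0] - p1[0], p2[1] - p1[1])
        if abs(_cross(d1, offset)) < EPS:  # the second line passes through the first
            return "collinear"
        return "parallel"
    return "intersecting"


def should_trigger(p1, d1, p2, d2, trigger_on=("parallel", "intersecting", "collinear")):
    # Per claim 4, any one (or several) of the three relationships may be chosen
    # as the condition that triggers shooting of the target objects.
    return classify_orientation_lines(p1, d1, p2, d2) in trigger_on

should_trigger shows that, under claim 4, any one of the three relationships may serve as the condition that triggers shooting; which of them a product actually uses is a design choice left open by the claims.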
CN202011625298.7A 2020-12-31 2020-12-31 Shooting method, shooting device, electronic equipment and storage medium Active CN112839170B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011625298.7A CN112839170B (en) 2020-12-31 2020-12-31 Shooting method, shooting device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011625298.7A CN112839170B (en) 2020-12-31 2020-12-31 Shooting method, shooting device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112839170A (en) 2021-05-25
CN112839170B (en) 2022-07-05

Family

ID=75924386

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011625298.7A Active CN112839170B (en) 2020-12-31 2020-12-31 Shooting method, shooting device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112839170B (en)

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110001800A1 (en) * 2009-07-03 2011-01-06 Sony Corporation Image capturing apparatus, image processing method and program
US20130007787A1 (en) * 2011-06-29 2013-01-03 Avaya Inc. System and method for processing media highlights
CN103051830A (en) * 2012-12-31 2013-04-17 北京中科大洋科技发展股份有限公司 System and method for multi-angle real-time rebroadcasting of shot targets
CN104135612A (en) * 2014-07-11 2014-11-05 深圳市中兴移动通信有限公司 A shooting method and a shooting device with an adjustable location of a shot object
WO2017020423A1 (en) * 2015-07-31 2017-02-09 宇龙计算机通信科技(深圳)有限公司 Intelligent camera method and intelligent terminal
CN110024369A (en) * 2016-11-30 2019-07-16 华为技术有限公司 A kind of photographic method, device and terminal device
CN106971181A (en) * 2017-05-27 2017-07-21 上海天马微电子有限公司 A kind of display panel and display device
CN107297074A (en) * 2017-06-30 2017-10-27 努比亚技术有限公司 Game video method for recording, terminal and storage medium
CN107360375A (en) * 2017-08-29 2017-11-17 维沃移动通信有限公司 A kind of image pickup method and mobile terminal
CN107707817A (en) * 2017-09-27 2018-02-16 维沃移动通信有限公司 A kind of video capture method and mobile terminal
CN109344715A (en) * 2018-08-31 2019-02-15 北京达佳互联信息技术有限公司 Intelligent composition control method, device, electronic equipment and storage medium
CN109451234A (en) * 2018-10-23 2019-03-08 长沙创恒机械设备有限公司 Optimize method, equipment and the storage medium of camera function
CN109718537A (en) * 2018-12-29 2019-05-07 努比亚技术有限公司 Game video method for recording, mobile terminal and computer readable storage medium
CN110072055A (en) * 2019-05-07 2019-07-30 中国联合网络通信集团有限公司 Video creating method and system based on artificial intelligence
CN110177206A (en) * 2019-05-27 2019-08-27 努比亚技术有限公司 Image pickup method, mobile terminal and computer readable storage medium
CN110339566A (en) * 2019-05-29 2019-10-18 努比亚技术有限公司 A kind of game Wonderful time recognition methods, terminal and computer readable storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113318446A (en) * 2021-06-30 2021-08-31 北京字跳网络技术有限公司 Interaction method, interaction device, electronic equipment and computer-readable storage medium
CN113318446B (en) * 2021-06-30 2023-11-21 北京字跳网络技术有限公司 Interaction method, interaction device, electronic equipment and computer readable storage medium

Also Published As

Publication number Publication date
CN112839170B (en) 2022-07-05

Similar Documents

Publication Publication Date Title
US20210252398A1 (en) Method and system for directing user attention to a location based game play companion application
KR101748593B1 (en) Capturing views and movements of actors performing within generated scenes
US11351468B2 (en) Generating challenges using a location based game play companion application
US8957858B2 (en) Multi-platform motion-based computer interactions
JP2022527662A (en) Virtual object control methods, devices, equipment and computer programs
US10166477B2 (en) Image processing device, image processing method, and image processing program
CN111045777B (en) Rendering method and device, storage medium and electronic equipment
EP2969078B1 (en) User-generated recordings of skeletal animations
US20230051703A1 (en) Gesture-Based Skill Search
CN112839170B (en) Shooting method, shooting device, electronic equipment and storage medium
WO2017218306A1 (en) Method and system for directing user attention to a location based game play companion application
CN109939439B (en) Virtual character blocking detection method, model training method, device and equipment
CN112843693B (en) Method and device for shooting image, electronic equipment and storage medium
CN112843739B (en) Shooting method, shooting device, electronic equipment and storage medium
CN112843689A (en) Shooting method, shooting device, electronic equipment and storage medium
CN112839171B (en) Picture shooting method and device, storage medium and electronic equipment
CN112791418B (en) Determination method and device of shooting object, electronic equipment and storage medium
CN112860360B (en) Picture shooting method and device, storage medium and electronic equipment
CN112791401A (en) Shooting method, shooting device, electronic equipment and storage medium
CN112843687B (en) Shooting method, shooting device, electronic equipment and storage medium
CN112843713B (en) Method, device, equipment and medium for determining center point of visual field
CN112843732A (en) Method and device for shooting image, electronic equipment and storage medium
CN112807698B (en) Shooting position determining method and device, electronic equipment and storage medium
CN112843715B (en) Shooting visual angle determining method, device, equipment and storage medium
CN112843686A (en) Shooting position determining method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant