CN108924438B - Shooting control method and related product - Google Patents


Publication number
CN108924438B
CN108924438B (application CN201810666390.4A)
Authority
CN
China
Prior art keywords
limb
target
determining
ultrasonic
special effect
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810666390.4A
Other languages
Chinese (zh)
Other versions
CN108924438A (en)
Inventor
米岚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810666390.4A priority Critical patent/CN108924438B/en
Publication of CN108924438A publication Critical patent/CN108924438A/en
Application granted granted Critical
Publication of CN108924438B publication Critical patent/CN108924438B/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2621Cameras specially adapted for the electronic generation of special effects during image pickup, e.g. digital cameras, camcorders, video cameras having integrated special effects capability
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Abstract

The embodiment of the application discloses a shooting control method and a related product, which are applied to electronic equipment comprising an ultrasonic sensor.

Description

Shooting control method and related product
Technical Field
The application relates to the technical field of shooting, in particular to a shooting control method and a related product.
Background
With the widespread use of electronic devices (such as mobile phones and tablet computers), these devices support ever more applications and ever more powerful functions. They are developing toward diversification and personalization and have become indispensable electronic products in users' daily lives.
Nowadays, more and more users shoot videos or images with electronic devices and also broadcast live video. Generally, a user can post-process a shot video or image after shooting is completed, for example by adding special effects. In a live-broadcast scene, however, viewers watch the video in real time, and it is difficult to add special effects to a live video. The problem of how to add special effects to a video or image that is currently being shot therefore urgently needs to be solved.
Disclosure of Invention
The embodiment of the application provides a shooting control method and a related product, which can add special effects to a picture being shot according to limb actions of a user.
In a first aspect, an embodiment of the present application provides a shooting control method, which is applied to an electronic device, where the electronic device includes an ultrasonic sensor, and the method includes:
when the electronic equipment is in a shooting mode, acquiring target limb actions of a user through the ultrasonic sensor;
determining a target special effect according to the target limb action;
and adding the target special effect in the current picture of the display screen of the electronic equipment.
In a second aspect, an embodiment of the present application provides a shooting control apparatus, which is applied to an electronic device including an ultrasonic sensor, and includes:
the acquisition unit is used for acquiring the target limb action of the user through the ultrasonic sensor when the electronic equipment is in a shooting mode;
the determining unit is used for determining a target special effect according to the target limb action;
and the control unit is used for adding the target special effect in the current picture of the display screen of the electronic equipment.
In a third aspect, an embodiment of the present application provides an electronic device, including: an ultrasonic sensor, a processor and a memory; and one or more programs stored in the memory and configured to be executed by the processor, the programs including instructions for some or all of the steps as described in the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium is used to store a computer program, where the computer program is used to make a computer execute some or all of the steps described in the first aspect of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, where the computer program product comprises a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform some or all of the steps as described in the first aspect of embodiments of the present application. The computer program product may be a software installation package.
The embodiment of the application has the following beneficial effects:
it can be seen that the shooting control method and the related product described in the embodiments of the present application are applied to an electronic device including an ultrasonic sensor, when the electronic device is in a shooting mode, the ultrasonic sensor is used to acquire a target body motion of a user, determine a target special effect according to the target body motion, and add the target special effect to a current picture of a display screen of the electronic device, so that the special effect can be added to the picture being shot according to the body motion of the user.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1A is a schematic structural diagram of an example electronic device provided in an embodiment of the present application;
fig. 1B is a schematic flowchart of a shooting control method provided in an embodiment of the present application;
FIG. 1C is a schematic diagram illustrating an embodiment of the present application providing an example of determining a body contour of a user via a plurality of reflection points;
FIG. 1D is a schematic diagram illustrating a demonstration of adding special effects through a target limb movement of a user according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of another shooting control method provided in an embodiment of the present application;
fig. 3 is a schematic flowchart of another shooting control method provided in an embodiment of the present application;
fig. 4 is another schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a shooting control apparatus according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of another electronic device provided in the embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from the given embodiments without creative effort shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The electronic devices involved in the embodiments of the present application may include various handheld devices, vehicle-mounted devices, wearable devices, computing devices or other processing devices connected to a wireless modem with wireless communication functions, as well as various forms of User Equipment (UE), Mobile Stations (MS), terminal equipment (terminal device), and so on. For convenience of description, the above-mentioned devices are collectively referred to as electronic devices.
The following describes embodiments of the present application in detail.
Referring to fig. 1A, fig. 1A is a schematic structural diagram of an electronic device 100 according to an embodiment of the present disclosure. The electronic device 100 includes: a casing 110, a circuit board 120 and an ultrasonic sensor 130 arranged in the casing 110, and a display screen 140 arranged on the casing 110. A processor 121 is provided on the circuit board 120; the ultrasonic sensor 130 is connected to the processor 121, and the processor 121 is connected to the display screen 140. The ultrasonic sensor 130 may include an ultrasonic transmitter and an ultrasonic receiver. The ultrasonic transmitter may include one or more speakers, earphones, or other electro-acoustic conversion units, and the ultrasonic receiver may include one or more microphones or other acoustic-electric conversion units. The ultrasonic transmitter and ultrasonic receiver may be located on the front, back, frame, or other positions of the electronic device; the embodiment of this application places no restriction on this.
Referring to fig. 1B, fig. 1B is a schematic flowchart of a shooting control method according to an embodiment of the present disclosure, where the shooting control method described in this embodiment is applied to the electronic device shown in fig. 1A, the electronic device includes an ultrasonic sensor, and the shooting control method includes:
101. when the electronic equipment is in a shooting mode, the target limb movement is obtained through the ultrasonic sensor.
In the embodiment of the application, the electronic device may preset a special-effect shooting mode and a conventional shooting mode. A special effect is content with a specific visual effect added to the shooting picture; the conventional shooting mode involves no special-effect adding operation, while the special-effect shooting mode adds special effects during shooting. The user can select the special-effect shooting mode, so that the target limb action of the user is automatically obtained through the ultrasonic sensor while the electronic device shoots an image or a video.
The target limb action is an action formed by the movement of one or more limb parts of the user's body, for example a hand gesture, a head movement, or a leg movement; the actions of several limb parts may also be combined into one target limb action.
Optionally, in step 101, acquiring the target limb movement through the ultrasonic sensor may include the following steps:
11. transmitting an ultrasonic signal by an ultrasonic transmitter of the ultrasonic sensor, and receiving an echo signal of the ultrasonic signal by an ultrasonic receiver of the ultrasonic sensor;
12. identifying the limb sub-action of each limb part in a plurality of limb parts according to the ultrasonic signals and the echo signals, to obtain a plurality of limb sub-actions;
13. composing the target limb action from at least one of the plurality of limb sub-actions.
In the embodiment of the application, an ultrasonic signal may be transmitted by the ultrasonic transmitter of the ultrasonic sensor at a preset frequency. When the transmitted ultrasonic signal meets an obstacle, it is reflected, and the reflected echo signal is received by the ultrasonic receiver of the ultrasonic sensor. The obstacle may be a person, an animal, an object, etc. within a range in which the distance from the electronic device is smaller than a preset distance threshold. The embodiment of the application analyzes the scene in which the user is the shot object, i.e., the user is the obstacle: the ultrasonic signal is reflected at each position of the user's body to generate echo signals. According to the time length t between transmitting the ultrasonic signal and receiving the echo signal, the distance between the electronic device and each of a plurality of reflection points on the user's body can be determined, yielding a plurality of distances, one per reflection point. With C the propagation speed of the ultrasonic wave, the distance d is calculated as: d = t × C / 2. A three-dimensional coordinate system can be established in advance, and the coordinates of each reflection point in this coordinate system can be determined from the distances and the propagation direction of the ultrasonic wave. Connecting the reflection points then yields the body contour of the user, as shown in fig. 1C, which is a schematic representation of determining the body contour of the user through the plurality of reflection points. According to the body contour, the limb sub-action of each limb part of the user's body can be determined, one limb part corresponding to one limb sub-action, where a limb part may include at least one of the following: the user's head, face, neck, left hand, right hand, belly, hips, left foot, right foot, left leg, right leg, and the like.
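The time-of-flight computation described above can be sketched as follows. This is a minimal illustration, not code from the patent; the speed-of-sound constant and the function names are assumptions added here:

```python
# Illustrative sketch: each echo delay t gives a reflector distance
# d = t * C / 2 (the wave travels to the reflector and back), and,
# together with the emission direction, a 3-D coordinate for the
# reflection point in a pre-established coordinate system.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed value)

def echo_distance(delay_s: float, c: float = SPEED_OF_SOUND) -> float:
    """Distance to a reflection point from the round-trip echo delay."""
    return delay_s * c / 2.0

def reflection_point(delay_s, direction):
    """Place a reflection point in 3-D space given the echo delay and a
    unit direction vector of the transmitted ultrasonic beam."""
    d = echo_distance(delay_s)
    dx, dy, dz = direction
    return (d * dx, d * dy, d * dz)
```

Connecting many such points (one per echo) approximates the body contour used in the step above.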
In step 13, the target limb action may be composed of multiple limb sub-actions. For example, a target limb action in which the two legs are separated and the two hands are extended to both sides is composed of the limb sub-actions of the left hand, right hand, left leg, and right leg. A target limb action may also be composed of a single limb sub-action, for example a right-hand (or left-hand) up-and-down wave. The target limb action is therefore obtained by combining at least one of the plurality of limb sub-actions.
Optionally, in step 101, acquiring the target limb movement through the ultrasonic sensor may include the following steps:
14. acquiring sound field change information of an ultrasonic sound field generated by the ultrasonic sensor in a preset time period;
15. determining limb movement tracks according to the sound field change information, and generating track images;
16. matching the track image with a plurality of image templates in a preset image template library to obtain a plurality of matching values, wherein each image template in the plurality of image templates corresponds to one limb action;
17. and determining the target limb action corresponding to the image template with the maximum matching value in the plurality of image templates.
In this embodiment of the application, the sound field change information of the ultrasonic sound field within a preset time period may be obtained as follows. First, a plurality of sound wave distribution information at a plurality of time points within the preset time period is acquired. Then, a plurality of differential sound wave distribution information is determined from the preset sound wave distribution information and the acquired distributions, where the preset sound wave distribution information is the distribution when there is no obstacle in the sound field. Finally, the sound field change information is determined from the differential distributions: change sub-information between the two differential distributions corresponding to each pair of adjacent time points is determined in sequence. The change sub-information between any two adjacent times, a first time and a second time, reflects the change in the state of the user's limbs from the first time to the second time, for example a change in limb position or posture. The changes of the user's limb state at the plurality of time points are determined from the plurality of change sub-information, the limb movement track of the user is determined from these changes, and a track image is generated. A plurality of image templates corresponding to a plurality of different gestures can be preset, each image template corresponding to one gesture, and an image template library established. The generated track image is then matched against the image templates in the preset image template library to obtain a matching value for each template, and the gesture corresponding to the image template with the maximum matching value is taken as the target limb action.
For example, the user may draw a heart-shaped trajectory in three-dimensional space with the left hand and the right hand. Through the sound field change information, the positions of the user's two hands at each of the plurality of time points can be determined, the limb movement trajectories of the left hand and the right hand determined in chronological order, and the target limb action of the two hands determined from those trajectories.
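Steps 14-17 can be sketched as a template-matching loop. The patent does not specify the matching metric, so this hypothetical sketch uses a simple pixel-agreement score over equal-size binary images:

```python
# Hypothetical sketch of steps 14-17: match a rasterized trajectory
# image against a preset template library and pick the gesture whose
# template has the highest matching value.

def match_score(traj, template):
    """Fraction of pixels that agree between two equal-size binary
    images (an assumed, deliberately simple matching metric)."""
    flat_t = [p for row in traj for p in row]
    flat_m = [p for row in template for p in row]
    same = sum(1 for a, b in zip(flat_t, flat_m) if a == b)
    return same / len(flat_t)

def best_gesture(traj, template_library):
    """template_library: dict mapping gesture name -> binary template
    image. Returns the gesture with the maximum matching value."""
    return max(template_library,
               key=lambda g: match_score(traj, template_library[g]))
```

A production system would likely use a more robust matcher (e.g. normalized cross-correlation), but the control flow mirrors the steps above.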
102. And determining a target special effect according to the target limb action.
In this embodiment of the application, after the target limb action is obtained, the target special effect corresponding to it may be determined according to a preset mapping relationship between limb actions and special effects. For example, when the electronic device captures an image and the target limb action is the user's left-hand or right-hand finger drawing a heart, the corresponding target special effect can be set to a flickering-heart effect. When the electronic device shoots a video, the target special effect may include at least one of adding animation, still images, music, etc.; for example, when the user's hand waves in the air, the target effect may be set to add fluorescence following the hand.
Optionally, in the step 102, determining the target special effect according to the target limb motion may include the following steps:
21. determining composition information of at least one limb part corresponding to the target limb action;
22. determining a target mapping relation corresponding to the composition information in a preset mapping relation set, wherein each group of mapping relations in the mapping relation set is a mapping relation between a limb action and a special effect;
23. and determining a target special effect corresponding to the target limb movement according to the target mapping relation.
In the embodiment of the present application, the composition information refers to the composition of the at least one limb part constituting the target limb action. Because a target limb action may be composed of the limb sub-actions of one or more limb parts such as the hands, head, belly, hips, legs, and feet, a mapping relationship between limb actions and special effects can be preset for each of a plurality of different composition information. After the composition information of the target limb action is determined, the corresponding target mapping relationship is selected according to that composition information. For example, when the composition information of the target limb action is a hand and a foot, the target mapping relationship preset for the composition information "hand and foot" is selected; this target mapping relationship maps a plurality of limb actions, each composed of hand and foot sub-actions, to a plurality of special effects. Finally, the target special effect corresponding to the target limb action is determined according to the target mapping relationship.
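The two-level lookup of steps 21-23 can be sketched with nested dictionaries. All limb-part, action, and effect names below are invented for illustration; the patent specifies only the structure, not the entries:

```python
# Sketch of steps 21-23: the composition information (which limb parts
# form the action) selects one mapping relationship from the preset
# set, and the action itself then selects the target special effect.

MAPPING_SET = {
    frozenset({"left_hand", "right_hand"}): {
        "draw_heart": "flickering_heart",
    },
    frozenset({"left_hand", "right_hand", "left_leg", "right_leg"}): {
        "jumping_jack": "fluorescence_and_flame",
    },
}

def target_effect(limb_parts, action_name):
    """Return the special effect for an action, or None when no mapping
    relationship is preset for this composition information."""
    mapping = MAPPING_SET.get(frozenset(limb_parts))
    if mapping is None:
        return None
    return mapping.get(action_name)
```

Keying the outer table by a `frozenset` of limb parts makes the lookup order-independent, matching the idea that composition information identifies *which* parts participate.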
Optionally, in the step 21, the target limb motion corresponds to a target limb portion, and the determining the composition information of at least one limb portion corresponding to the target limb motion may include the following steps:
a1, determining the target limb type of the target limb part corresponding to the target limb action;
a2, judging whether the target limb type is a designated type, wherein a symmetrical limb part symmetrical to the target limb part exists in the designated type;
a3, if yes, acquiring a first position of the target limb part in the shooting picture and a second position of the symmetrical limb part in the shooting picture;
a4, determining the relative position relation between the target limb part and the symmetrical limb part according to the first position and the second position, and determining the composition information of the target limb part according to the relative position relation.
In the embodiment of the present application, the specified type refers to the type of a limb part that has a symmetric counterpart, such as the left hand and right hand, left leg and right leg, or left foot and right foot. For symmetric limb parts, different special effects can be set for the same action performed by different parts; for example, a first special effect can correspond to the left hand drawing a heart and a different second special effect to the right hand drawing a heart, so that the special effects a limited number of limb parts can trigger are richer.
In view of the above, in determining the composition information of the at least one limb part corresponding to the target limb action, if the target limb action corresponds to one target limb part and that part belongs to a specified type, the limb part must be distinguished from its symmetric counterpart, for example, distinguishing which of the left hand and the right hand made the target limb action. Since there is a certain relative positional relationship between the limb parts of the user, the target limb part and the symmetric limb part can be distinguished according to their relative positions in the shooting picture. For example, when the user faces the electronic device, the picture is mirrored: the hand appearing on the left side of the shooting picture is the user's right hand, and the hand appearing on the right side is the user's left hand.
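Under the mirror-image assumption discussed above (user facing the camera), steps A3-A4 reduce to comparing the horizontal picture positions of the two detected limbs. A minimal sketch, with the convention that smaller x means further left in the picture:

```python
# Sketch of disambiguating symmetric limb parts (steps A1-A4) for a
# user facing the camera: the limb further left in the picture is the
# user's RIGHT hand, because the picture is mirrored.

def label_symmetric_limbs(pos_a, pos_b):
    """pos_a, pos_b: (x, y) picture positions of two detected hands.
    Returns (right_hand_pos, left_hand_pos)."""
    return (pos_a, pos_b) if pos_a[0] < pos_b[0] else (pos_b, pos_a)
```

If the user faces away from the camera, the assignment would flip; the patent leaves that case to the relative-position relationship in step A4.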
103. And adding the target special effect in the current picture of the display screen of the electronic equipment.
In the embodiment of the application, the target special effect can be added to the current picture of the display screen, making the shot image or video more interesting. Specifically, the special effect can be added in a preset area of the shot picture, and the preset area can be set by the system or by the user.
Alternatively, when a plurality of users are included in the shot picture, the body motions of the plurality of users can be identified, and then the target special effect is determined according to the body motions of the plurality of users.
For example, in a scene where a user broadcasts live video, the electronic device can be controlled by limb actions to add a special effect to the live picture, as shown in fig. 1D, a schematic diagram of adding a special effect through a target limb action of the user according to the present application. As shown, when the target limb action is an action in which both legs are separated and both hands are extended to both sides, a special effect of fluorescence and flame may be added.
Optionally, at least one picture area of at least one limb part corresponding to the target limb action in the display screen can be determined, and then the content of the target special effect is scaled according to the proportion of the total area of the at least one picture area to the total area of the display screen, so that the target special effect is presented along with the limb action.
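The area-based scaling rule above can be sketched in a few lines. This is an assumed reading of the patent text: the effect content is scaled by the ratio of the limb regions' total area to the screen area:

```python
# Sketch of scaling the special-effect content so that it tracks the
# on-screen size of the limb parts performing the action.

def effect_scale(region_areas, screen_area):
    """Scale factor for the special-effect content: the total area of
    the limb-part picture regions as a fraction of the screen area."""
    return sum(region_areas) / screen_area
```

So if the user's hands occupy a larger share of the frame (e.g. closer to the camera), the rendered effect grows proportionally.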
Optionally, in this embodiment of the application, after the current picture with the target special effect added is photographed to obtain an image or a video, the target special effect may be stored in the shot image or video, so that the user can view the target special effect while viewing the image or the video.
Optionally, since the added target special effect is an additional effect in the captured image or video, the electronic device may receive a preset operation control instruction of the user, and then cancel the target special effect according to the preset operation control instruction, so as to restore the original display effect of the image or video, and enable the user to flexibly edit the captured image or video.
Optionally, more than two corresponding special effects may be set for the target limb movement, and any one of the more than two special effects is preset as a default special effect, in the step 102, the default special effect may be determined as the target special effect according to the target limb movement, and after the shooting is completed, a setting instruction of a user may be received, and the added target special effect is replaced, so that diversity of the special effects is realized, and personalized requirements of the user are met.
It can be seen that the shooting control method described in the embodiment of the present application is applied to an electronic device including an ultrasonic sensor, and when the electronic device is in a shooting mode, the ultrasonic sensor is used to acquire a target body motion of a user, determine a target special effect according to the target body motion, and add the target special effect to a current picture of a display screen of the electronic device, so that the special effect can be added to the picture being shot according to the body motion of the user.
Referring to fig. 2, fig. 2 is a schematic flowchart of another photographing control method according to an embodiment of the present disclosure, where the photographing control method described in the embodiment is applied to the electronic device shown in fig. 1A, where the electronic device includes an ultrasonic sensor, and the method includes the following steps:
201. when the electronic equipment is in a shooting mode, the target limb movement is obtained through the ultrasonic sensor.
202. And determining a target special effect according to the target limb action.
The specific implementation process of the steps 201-202 can refer to the corresponding description in the method shown in fig. 1B, and will not be described herein again.
203. And acquiring the movement speed of the user.
In the embodiment of the application, during video shooting the user's movement speed may be relatively high; for example, during dancing the limb parts move quickly, and the shot video picture also changes quickly. In this case, if a special effect were added, it would persist in the corresponding picture only briefly and could hardly present a good visual effect. Therefore, the movement speed of the user can be obtained, where the movement speed of the user refers to the movement speed of the user's limb parts. It can be obtained by acquiring at least one movement distance or movement amplitude, from a first time to a second time, of the at least one limb part corresponding to the user's target limb action, and determining the movement speed of the user according to the maximum of these movement distances or amplitudes.
204. And if the movement speed is less than a preset threshold value, adding the target special effect in a current picture of a display screen of the electronic equipment.
The preset threshold can be set by the user or defaults to a system value. If the movement speed is greater than or equal to the preset threshold, the process is terminated; if the movement speed is less than the preset threshold, the special effect is added in the current picture of the display screen.
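Steps 203-204 amount to a simple gate on the estimated speed. A minimal sketch; the threshold value and units here are illustrative assumptions, since the patent leaves the threshold to the user or a system default:

```python
# Sketch of steps 203-204: estimate the user's movement speed from the
# largest limb displacement over the sampling interval, and add the
# special effect only when the speed is below the preset threshold.

def movement_speed(displacements_m, interval_s):
    """Speed (m/s) from the maximum limb displacement over the interval."""
    return max(displacements_m) / interval_s

def should_add_effect(displacements_m, interval_s, threshold_mps=1.5):
    # threshold_mps is an illustrative default, not a value from the patent
    return movement_speed(displacements_m, interval_s) < threshold_mps
```

Using the maximum displacement matches the text above: the fastest-moving limb part determines whether the effect would remain visible long enough.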
It can be seen that the shooting control method described in this embodiment of the application is applied to an electronic device including an ultrasonic sensor. When the electronic device is in a shooting mode, a target limb action of the user is acquired through the ultrasonic sensor, a target special effect is determined according to the target limb action, the movement speed of the user is acquired, and the target special effect is added to the current picture of the display screen of the electronic device if the movement speed is less than a preset threshold. A special effect can thus be flexibly added to the picture being shot according to the user's limb action.
In accordance with the above, please refer to fig. 3, which is a schematic flowchart of another shooting control method provided in an embodiment of the present application. The shooting control method described in this embodiment is applied to the electronic device shown in fig. 1A; the electronic device includes an ultrasonic sensor, and the ultrasonic sensor is used to generate a sound field. The method may include the following steps:
301. When the electronic device is in a shooting mode, transmit an ultrasonic signal through an ultrasonic transmitter of the ultrasonic sensor, and receive an echo signal of the ultrasonic signal through an ultrasonic receiver of the ultrasonic sensor.
302. Identify the limb sub-movement of each of multiple limb parts according to the ultrasonic signal and the echo signal to obtain multiple limb sub-movements.
303. Compose the target limb action from at least one of the multiple limb sub-movements.
304. Determine a target special effect according to the target limb action.
305. Acquire the movement speed of the user.
306. If the movement speed is less than a preset threshold, add the target special effect to the current picture of the display screen of the electronic device.
For the specific implementation of steps 301-304, reference may be made to fig. 1B; for steps 305-306, reference may be made to the corresponding description of the method shown in fig. 2. Details are not repeated here.
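Steps 301-303 — classifying a sub-movement for each limb part from the transmitted signal and its echo, then composing the sub-movements into one target limb action — might be sketched as below. The time-of-flight classifier, the movement labels, and the composition format are illustrative assumptions; a real implementation would analyse echo timing and Doppler shift per limb part.

```python
def classify_sub_movement(part, emitted, echo):
    """Step 302 (toy version): label one limb part's sub-movement from the
    delay between the emitted ultrasonic signal and its echo. A real
    classifier would use Doppler shift / time-of-flight profiles."""
    delay = echo["arrival"] - emitted["sent"]
    return "raise" if delay < 0.001 else "lower"  # assumed labels

def compose_target_action(sub_movements):
    """Step 303: form the target limb action from at least one sub-movement,
    here simply by joining part:movement pairs in a stable order."""
    return "+".join(f"{part}:{move}"
                    for part, move in sorted(sub_movements.items()))

def recognize_target_action(parts, emitted, echoes):
    """Steps 301-303 end to end for a list of limb parts."""
    sub = {part: classify_sub_movement(part, emitted, echoes[part])
           for part in parts}
    return compose_target_action(sub)
```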
In the shooting control method described in this embodiment of the application, applied to an electronic device including an ultrasonic sensor, when the electronic device is in a shooting mode, an ultrasonic signal is transmitted by an ultrasonic transmitter of the ultrasonic sensor, and an echo signal of the ultrasonic signal is received by an ultrasonic receiver of the ultrasonic sensor; the limb sub-movement of each of multiple limb parts of the user is identified according to the ultrasonic signal and the echo signal to obtain multiple limb sub-movements; a target limb action is composed from at least one of the multiple limb sub-movements; a target special effect is determined according to the target limb action; and the movement speed of the user is acquired, the target special effect being added to the current picture of the display screen of the electronic device if the movement speed is less than a preset threshold. A special effect can thus be flexibly added to the picture being shot according to the user's limb action.
The following describes an apparatus for implementing the above shooting control method, specifically as follows:
in accordance with the above, please refer to fig. 4, which shows an electronic device according to an embodiment of the present application, including: an ultrasonic sensor, a processor, and a memory; and one or more programs stored in the memory and configured to be executed by the processor, the programs including instructions for performing the following steps:
when the electronic equipment is in a shooting mode, acquiring target limb actions through the ultrasonic sensor;
determining a target special effect according to the target limb action;
and adding the target special effect in the current picture of the display screen of the electronic equipment.
In one possible example, in the acquiring of the target limb motion of the user by the ultrasound sensor, the program comprises instructions for performing the steps of:
transmitting an ultrasonic signal by an ultrasonic transmitter of the ultrasonic sensor, and receiving an echo signal of the ultrasonic signal by an ultrasonic receiver of the ultrasonic sensor;
identifying the limb partial motion of each limb part in the multiple limb parts of the user according to the ultrasonic signal and the echo signal to obtain multiple limb partial motions;
composing the target limb movement according to at least one limb sub-movement of the plurality of limb sub-movements.
In one possible example, in the acquiring of the target limb motion of the user by the ultrasound sensor, the program comprises instructions for performing the steps of:
acquiring sound field change information of an ultrasonic sound field generated by the ultrasonic sensor in a preset time period;
determining limb movement tracks according to the sound field change information, and generating track images;
matching the track image with a plurality of image templates in a preset image template library to obtain a plurality of matching values, wherein each image template in the plurality of image templates corresponds to one limb action;
and determining the target limb action corresponding to the image template with the maximum matching value in the plurality of image templates.
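The template-matching step above — comparing the trajectory image against each template in a preset library and keeping the limb action of the template with the maximum matching value — can be sketched as follows. The similarity measure (overlap of flattened binary images) and the template contents are illustrative assumptions.

```python
def match_value(trajectory, template):
    """Matching value between a trajectory image and one template; both are
    equal-length flat sequences of 0/1 pixels (an assumed representation)."""
    overlap = sum(a * b for a, b in zip(trajectory, template))
    total = max(1, sum(template))
    return overlap / total

def best_matching_action(trajectory, template_library):
    """Return the limb action whose template has the maximum matching value.
    template_library: dict mapping limb-action name -> template image."""
    return max(template_library,
               key=lambda action: match_value(trajectory,
                                              template_library[action]))
```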
In one possible example, in the determining a target special effect from the target limb action, the program includes instructions for performing the steps of:
determining composition information of at least one limb part corresponding to the target limb action;
determining a target mapping relation corresponding to the composition information in a preset mapping relation set, wherein each group of mapping relations in the mapping relation set is a mapping relation between a limb action and a special effect;
and determining a target special effect corresponding to the target limb movement according to the target mapping relation.
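The mapping-relation lookup in the three steps above can be sketched as below: the composition information selects one mapping relation from the preset set, and the target limb action is looked up in that relation to obtain the target special effect. All composition keys, action names, and effect names here are invented for illustration and do not come from the patent.

```python
# Preset mapping relation set: composition info -> {limb action -> effect}.
# Each inner dict is one group of mapping relations between limb actions
# and special effects (all entries are illustrative assumptions).
MAPPING_RELATION_SET = {
    "hands_crossed": {"raise_both": "fireworks", "wave": "sparkles"},
    "hands_apart":   {"raise_both": "rainbow",   "wave": "bubbles"},
}

def target_special_effect(composition_info, target_action):
    """Select the target mapping relation by composition info, then map the
    target limb action to its special effect (None if unconfigured)."""
    mapping = MAPPING_RELATION_SET.get(composition_info, {})
    return mapping.get(target_action)
```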
In one possible example, the target limb action corresponds to a target limb part, and in the aspect of determining composition information of at least one limb part corresponding to the target limb action, the program includes instructions for performing the following steps:
determining a target limb type of a target limb part corresponding to the target limb action;
judging whether the target limb type is a designated type, wherein a symmetrical limb part symmetrical to the target limb part exists in the designated type;
if so, acquiring a first position of the target limb part in the shooting picture and a second position of the symmetrical limb part in the shooting picture;
and determining the relative position relationship between the target limb part and the symmetrical limb part according to the first position and the second position, and determining the composition information of the target limb part according to the relative position relationship.
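A hedged sketch of the symmetric-limb branch described above: only limb types that have a mirror counterpart count as the designated type, and for those, the relative positions of the target part and the symmetric part in the shooting picture yield the composition information. The type names and the "crossed vs. apart" decision rule are illustrative assumptions.

```python
# Limb types assumed to have a symmetric counterpart (the "designated type").
DESIGNATED_TYPES = {"hand", "arm", "leg"}

def composition_info(limb_type, first_pos, second_pos):
    """first_pos / second_pos: (x, y) of the target limb part and of its
    symmetric counterpart in the shooting picture."""
    if limb_type not in DESIGNATED_TYPES:
        return None  # no symmetric part; no composition info derived here
    # Relative position relationship: if the target part sits to the right
    # of its mirror counterpart, treat the limbs as crossed (assumed rule).
    return "crossed" if first_pos[0] > second_pos[0] else "apart"
```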
In one possible example, the program further comprises instructions for performing the steps of:
acquiring the movement speed of the user;
and if the movement speed is smaller than a preset threshold value, executing the step of adding the target special effect in the current picture of the display screen of the electronic equipment.
Referring to fig. 5, fig. 5 is a schematic structural diagram of a shooting control apparatus according to an embodiment of the present application. The shooting control apparatus is applied to an electronic device including an ultrasonic sensor, and includes an acquisition unit 501, a determination unit 502, and a control unit 503, wherein,
the acquiring unit 501 is configured to acquire a target limb action through the ultrasonic sensor when the electronic device is in a shooting mode;
the determining unit 502 is configured to determine a target special effect according to the target limb movement;
the control unit 503 is configured to add the target special effect to a current screen of a display screen of the electronic device.
Optionally, in the aspect of acquiring the target limb movement of the user through the ultrasonic sensor, the acquiring unit 501 is specifically configured to:
transmitting an ultrasonic signal by an ultrasonic transmitter of the ultrasonic sensor, and receiving an echo signal of the ultrasonic signal by an ultrasonic receiver of the ultrasonic sensor;
identifying the limb partial motion of each limb part in the multiple limb parts of the user according to the ultrasonic signal and the echo signal to obtain multiple limb partial motions;
composing the target limb movement according to at least one limb sub-movement of the plurality of limb sub-movements.
Optionally, the determining unit 502 is specifically configured to:
acquiring sound field change information of an ultrasonic sound field generated by the ultrasonic sensor in a preset time period;
determining limb movement tracks according to the sound field change information, and generating track images;
matching the track image with a plurality of image templates in a preset image template library to obtain a plurality of matching values, wherein each image template in the plurality of image templates corresponds to one limb action;
and determining the target limb action corresponding to the image template with the maximum matching value in the plurality of image templates.
Optionally, the target limb motion corresponds to a target limb part, and in terms of determining composition information of at least one limb part corresponding to the target limb motion, the determining unit 502 is specifically configured to:
determining a target limb type of a target limb part corresponding to the target limb action;
judging whether the target limb type is a designated type, wherein a symmetrical limb part symmetrical to the target limb part exists in the designated type;
if so, acquiring a first position of the target limb part in the shooting picture and a second position of the symmetrical limb part in the shooting picture;
and determining the relative position relationship between the target limb part and the symmetrical limb part according to the first position and the second position, and determining the composition information of the target limb part according to the relative position relationship.
Optionally, the obtaining unit 501 is further configured to obtain a movement speed of the user;
if the motion speed is smaller than the preset threshold, the step of adding the target special effect to the current frame of the display screen of the electronic device is executed by the control unit 503.
It can be seen that the shooting control apparatus described in this embodiment of the application is applied to an electronic device including an ultrasonic sensor. When the electronic device is in a shooting mode, an ultrasonic signal is transmitted by an ultrasonic transmitter of the ultrasonic sensor, and an echo signal of the ultrasonic signal is received by an ultrasonic receiver of the ultrasonic sensor; the limb sub-movement of each of multiple limb parts of the user is identified according to the ultrasonic signal and the echo signal to obtain multiple limb sub-movements; a target limb action is composed from at least one of the multiple limb sub-movements; a target special effect is determined according to the target limb action; and the movement speed of the user is acquired, the target special effect being added to the current picture of the display screen of the electronic device if the movement speed is less than a preset threshold. A special effect can thus be flexibly added to the picture being shot according to the user's limb action.
It can be understood that the functions of each program module of the shooting control apparatus in this embodiment can be specifically implemented according to the method in the foregoing method embodiment, and the specific implementation process thereof can refer to the related description of the foregoing method embodiment, which is not described herein again.
As shown in fig. 6, for convenience of description, only the portions related to the embodiments of the present application are shown; for specific technical details that are not disclosed, please refer to the method part of the embodiments of the present application. The electronic device may be any terminal device, including a mobile phone, a tablet computer, a PDA (personal digital assistant), a POS (point of sale) terminal, a vehicle-mounted computer, and the like. The following takes a mobile phone as an example:
the electronic device 6000 as shown in fig. 6 includes: at least one processor 6011, a memory 6012, communication interfaces (including SIM interface 6014, audio input interface 6015, serial interface 6016, and other communication interfaces 6017), a signal processing module 6013 (including receiver 6018, transmitter 6019, LOs6020, and signal processor 6021), and input and output modules (including a display screen 6022, speakers 6023, microphone 6024, sensors 6025, etc.). Those skilled in the art will appreciate that the electronic device configuration shown in fig. 6 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The respective constituent components of the electronic apparatus are specifically described below with reference to fig. 6:
the processor 6011 is the control center of the mobile phone. It connects the parts of the whole mobile phone through various interfaces and lines, and performs the functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 6012 and calling data stored in the memory, thereby monitoring the electronic device as a whole. Optionally, the processor may integrate an application processor (e.g., a CPU or GPU) and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor.
The processor 6011 is configured to perform the following steps:
when the electronic equipment is in a shooting mode, acquiring target limb actions through the ultrasonic sensor;
determining a target special effect according to the target limb action;
and adding the target special effect in the current picture of the display screen of the electronic equipment.
The memory 6012 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function, and the like, and the data storage area may store data created according to use of the electronic device, and the like. In addition, the memory may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The communication interface is used for performing communication connection with an external device, and includes a SIM interface 6014, an audio input interface 6015, a serial interface 6016, and another communication interface 6017.
The input/output module 6010 may include a display screen 6022, a speaker 6023, a microphone 6024, a sensor 6025, and the like. The display screen 6022 is configured to detect a movement parameter of a user's finger moving a preset distance on the display screen and to acquire a pressing parameter of the user's finger pressing the display screen. The sensor 6025 may include a light sensor, a motion sensor, an ultrasonic sensor, a brain wave sensor, a camera, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor: the ambient light sensor may adjust the brightness of the touch display screen according to the brightness of ambient light, and the proximity sensor may turn off the touch display screen and/or the backlight when the mobile phone is moved to the ear. The motion sensor may be, for example, an accelerometer, which can detect the magnitude of acceleration in various directions (generally along three axes) and, when stationary, the magnitude and direction of gravity; it can be used for applications that recognize the posture of the electronic device (such as landscape/portrait switching, related games, and magnetometer posture calibration) and for vibration-recognition functions (such as a pedometer or tapping).
The signal processing module 6013 is configured to process signals received by the electronic device from an external device (for example, a base station) and signals to be sent to the external device. For example, the receiver 6018 is configured to receive a signal sent by the external device and transmit the signal to the signal processor 6021, and the transmitter 6019 is configured to transmit the signal output by the signal processor 6021.
In the foregoing embodiments shown in fig. 1B, fig. 2, or fig. 3, the method flows of the steps may be implemented based on the structure of the electronic device.
In the embodiments shown in fig. 4 and 5, the functions of the units may be implemented based on the structure of the mobile phone.
Embodiments of the present application also provide a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, the computer program causing a computer to execute a part or all of the steps of any one of the photographing control methods as set forth in the above method embodiments.
Embodiments of the present application also provide a computer program product including a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to execute some or all of the steps of any one of the shooting control methods as set forth in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software program module.
The integrated units, if implemented in the form of software program modules and sold or used as stand-alone products, may be stored in a computer readable memory. Based on such understanding, the technical solution of the present application may be substantially implemented or a part of or all or part of the technical solution contributing to the prior art may be embodied in the form of a software product stored in a memory, and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method described in the embodiments of the present application. And the aforementioned memory comprises: various media capable of storing program codes, such as a usb disk, a read-only memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and the like.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash disk, ROM, RAM, magnetic or optical disk, and the like.
The embodiments of the present application have been described in detail above to illustrate the principles and implementations of the present application; the above description of the embodiments is only intended to help understand the method and the core idea of the present application. Meanwhile, a person skilled in the art may, according to the idea of the present application, make changes to the specific implementations and the application scope. In summary, the content of this specification should not be construed as limiting the present application.

Claims (10)

1. A shooting control method applied to an electronic apparatus including an ultrasonic sensor, the method comprising:
when the electronic equipment is in a shooting mode, acquiring target limb actions of a user through the ultrasonic sensor;
determining a target special effect according to the target limb action;
adding the target special effect in a current picture of a display screen of the electronic equipment, specifically: and determining at least one picture area of at least one limb part corresponding to the target limb action in the display screen, and scaling the content of the target special effect according to the proportion of the total area of the at least one picture area to the total area of the display screen.
2. The method of claim 1, wherein the acquiring, by the ultrasound sensor, the target limb movement of the user comprises:
transmitting an ultrasonic signal by an ultrasonic transmitter of the ultrasonic sensor, and receiving an echo signal of the ultrasonic signal by an ultrasonic receiver of the ultrasonic sensor;
identifying the limb partial motion of each limb part in the multiple limb parts of the user according to the ultrasonic signal and the echo signal to obtain multiple limb partial motions;
composing the target limb movement according to at least one limb sub-movement of the plurality of limb sub-movements.
3. The method of claim 1, wherein the acquiring, by the ultrasound sensor, the target limb movement of the user comprises:
acquiring sound field change information of an ultrasonic sound field generated by the ultrasonic sensor in a preset time period;
determining limb movement tracks according to the sound field change information, and generating track images;
matching the track image with a plurality of image templates in a preset image template library to obtain a plurality of matching values, wherein each image template in the plurality of image templates corresponds to one limb action;
and determining the target limb action corresponding to the image template with the maximum matching value in the plurality of image templates.
4. The method of any one of claims 1 to 3, wherein determining a target special effect from the target limb movement comprises:
determining composition information of at least one limb part corresponding to the target limb action;
determining a target mapping relation corresponding to the composition information in a preset mapping relation set, wherein each group of mapping relations in the mapping relation set is a mapping relation between a limb action and a special effect;
and determining a target special effect corresponding to the target limb movement according to the target mapping relation.
5. The method of claim 4, wherein the target limb action corresponds to a target limb portion, and the determining the composition information of at least one limb portion corresponding to the target limb action comprises:
determining a target limb type of a target limb part corresponding to the target limb action;
judging whether the target limb type is a designated type, wherein a symmetrical limb part symmetrical to the target limb part exists in the designated type;
if so, acquiring a first position of the target limb part in the shooting picture and a second position of the symmetrical limb part in the shooting picture;
and determining the relative position relationship between the target limb part and the symmetrical limb part according to the first position and the second position, and determining the composition information of the target limb part according to the relative position relationship.
6. The method according to any one of claims 1-3, further comprising:
acquiring the movement speed of the user;
and if the movement speed is smaller than a preset threshold value, executing the step of adding the target special effect in the current picture of the display screen of the electronic equipment.
7. A shooting control apparatus applied to an electronic device including an ultrasonic sensor, the shooting control apparatus comprising:
the acquisition unit is used for acquiring the target limb action of the user through the ultrasonic sensor when the electronic equipment is in a shooting mode;
the determining unit is used for determining a target special effect according to the target limb action;
the control unit is used for adding the target special effect in a current picture of a display screen of the electronic equipment, and specifically comprises the following steps: and determining at least one picture area of at least one limb part corresponding to the target limb action in the display screen, and scaling the content of the target special effect according to the proportion of the total area of the at least one picture area to the total area of the display screen.
8. The shooting control apparatus according to claim 7, wherein, in the acquiring of the target limb movement of the user by the ultrasonic sensor, the acquiring unit is specifically configured to:
transmitting an ultrasonic signal by an ultrasonic transmitter of the ultrasonic sensor, and receiving an echo signal of the ultrasonic signal by an ultrasonic receiver of the ultrasonic sensor;
identifying the limb partial motion of each limb part in the multiple limb parts of the user according to the ultrasonic signal and the echo signal to obtain multiple limb partial motions;
composing the target limb movement according to at least one limb sub-movement of the plurality of limb sub-movements.
9. An electronic device, comprising: an ultrasonic sensor, a processor and a memory; and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for the method of any of claims 1-6.
10. A computer-readable storage medium for storing a computer program, wherein the computer program causes a computer to perform the method according to any one of claims 1-6.
CN201810666390.4A 2018-06-26 2018-06-26 Shooting control method and related product Active CN108924438B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810666390.4A CN108924438B (en) 2018-06-26 2018-06-26 Shooting control method and related product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810666390.4A CN108924438B (en) 2018-06-26 2018-06-26 Shooting control method and related product

Publications (2)

Publication Number Publication Date
CN108924438A CN108924438A (en) 2018-11-30
CN108924438B true CN108924438B (en) 2021-03-02

Family

ID=64421788

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810666390.4A Active CN108924438B (en) 2018-06-26 2018-06-26 Shooting control method and related product

Country Status (1)

Country Link
CN (1) CN108924438B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109547696B (en) * 2018-12-12 2021-07-30 维沃移动通信(杭州)有限公司 Shooting method and terminal equipment
CN110175594B (en) * 2019-05-31 2021-07-30 Oppo广东移动通信有限公司 Vein identification method and related product
CN110336940A (en) * 2019-06-21 2019-10-15 深圳市茄子咔咔娱乐影像科技有限公司 A kind of method and system shooting synthesis special efficacy based on dual camera
CN111282261B (en) * 2020-01-22 2023-08-08 京东方科技集团股份有限公司 Man-machine interaction method and device and somatosensory game equipment
CN111885411A (en) * 2020-08-03 2020-11-03 网易(杭州)网络有限公司 Display control method and device in network live broadcast, electronic equipment and storage medium
CN112637490A (en) * 2020-12-18 2021-04-09 咪咕文化科技有限公司 Video production method and device, electronic equipment and storage medium
CN113923391B (en) * 2021-09-08 2022-10-14 荣耀终端有限公司 Method, apparatus and storage medium for video processing
CN114885164B (en) * 2022-07-12 2022-09-30 深圳比特微电子科技有限公司 Method and device for determining intra-frame prediction mode, electronic equipment and storage medium
CN115278082A (en) * 2022-07-29 2022-11-01 维沃移动通信有限公司 Video shooting method, video shooting device and electronic equipment
CN116503289B (en) * 2023-06-20 2024-01-09 北京天工异彩影视科技有限公司 Visual special effect application processing method and system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101197945A (en) * 2007-12-26 2008-06-11 北京中星微电子有限公司 Method and device for generating special video effect
JP4935647B2 (en) * 2007-11-29 2012-05-23 カシオ計算機株式会社 Composite image output apparatus and composite image output processing program
CN104902212A (en) * 2015-04-30 2015-09-09 努比亚技术有限公司 Video communication method and apparatus
CN105824406A (en) * 2015-11-30 2016-08-03 维沃移动通信有限公司 Photographing method and terminal
CN106657814A (en) * 2017-01-17 2017-05-10 维沃移动通信有限公司 Video recording method and mobile terminal
CN107728782A (en) * 2017-09-21 2018-02-23 广州数娱信息科技有限公司 Interaction method, interactive system, and server


Also Published As

Publication number Publication date
CN108924438A (en) 2018-11-30

Similar Documents

Publication Publication Date Title
CN108924438B (en) Shooting control method and related product
CN109413563B (en) Video sound effect processing method and related product
CN110022363B (en) Method, device and equipment for correcting motion state of virtual object and storage medium
WO2020098462A1 (en) Ar virtual character drawing method and apparatus, mobile terminal and storage medium
JP7121805B2 (en) Virtual item adjustment method and its device, terminal and computer program
CN105450736B (en) Method and device for connecting with virtual reality
CN107707817B (en) video shooting method and mobile terminal
US20140129937A1 (en) Methods, apparatuses and computer program products for manipulating characteristics of audio objects by using directional gestures
CN111445583B (en) Augmented reality processing method and device, storage medium and electronic equipment
CN108989672B (en) Shooting method and mobile terminal
CN111045511B (en) Gesture-based control method and terminal equipment
CN104813642A (en) Methods, apparatuses and computer readable medium for triggering a gesture recognition mode and device pairing and sharing via non-touch gestures
CN108681402A (en) Interaction recognition method, device, storage medium and terminal device
CN104777991A (en) Remote interactive projection system based on mobile phone
CN107730460B (en) Image processing method and mobile terminal
CN108495045A (en) Image capturing method, device, electronic device and storage medium
CN108833779B (en) Shooting control method and related product
CN105653029A (en) Method and system for obtaining immersion feel in virtual reality system as well as intelligent glove
CN111182211A (en) Shooting method, image processing method and electronic equipment
CN109194810B (en) Display control method and related product
US20210217218A1 (en) Systems configured to control digital characters utilizing real-time facial and/or body motion capture and methods of use thereof
JP2022545933A (en) Target user locking method and electronic device
CN109547696B (en) Shooting method and terminal equipment
CN110300275B (en) Video recording and playing method, device, terminal and storage medium
CN111913560A (en) Virtual content display method, device, system, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant