CN108833779B - Shooting control method and related product - Google Patents

Shooting control method and related product

Info

Publication number
CN108833779B
CN108833779B (granted publication of application CN201810621300.XA)
Authority
CN
China
Prior art keywords
target
face
faces
windows
sticker
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810621300.XA
Other languages
Chinese (zh)
Other versions
CN108833779A (en)
Inventor
颜伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810621300.XA priority Critical patent/CN108833779B/en
Publication of CN108833779A publication Critical patent/CN108833779A/en
Application granted granted Critical
Publication of CN108833779B publication Critical patent/CN108833779B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/62: Control of parameters via user interfaces
    • H04N23/64: Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a shooting control method and a related product. When an electronic device is in a shooting mode, a window opening instruction is received and N windows are created on the display screen, N being a positive integer. A camera of the electronic device performs face detection on the shot picture to obtain M pieces of face information for M faces, each face corresponding to one piece of face information, M being a positive integer less than or equal to N. M target windows among the N windows are selected and the M faces are displayed in the M target windows respectively, each face corresponding to one target window. A target sticker corresponding to each of the M faces is determined according to the M pieces of face information to obtain M target stickers, each face corresponding to one target sticker, and the M target stickers are respectively added to the target windows corresponding to their faces among the M target windows. In this way, a sticker can be set for each face according to its expression, meeting the user's personalized sticker requirements.

Description

Shooting control method and related product
Technical Field
The application relates to the technical field of shooting, in particular to a shooting control method and a related product.
Background
With the widespread use of electronic devices (such as mobile phones and tablet computers), electronic devices support ever more applications and increasingly powerful functions. They are developing toward diversification and personalization and have become indispensable electronic products in users' lives.
Nowadays, users increasingly shoot images with electronic devices, and image-shooting applications offer increasingly diverse functions, such as adding special effects and stickers to captured images. However, existing sticker techniques still leave room for improvement in personalization and intelligence. The present application therefore provides a shooting control method for improving the image shooting effect.
Disclosure of Invention
The embodiment of the application provides a shooting control method and a related product, so that different stickers can be added to multiple users in the shot picture when the electronic device shoots.
In a first aspect, an embodiment of the present application provides a shooting control method, which is applied to an electronic device, where the electronic device includes a display screen, and the method includes:
receiving a window opening instruction when the electronic equipment is in a shooting mode, and creating N windows on the display screen, wherein N is a positive integer;
performing face detection on a shot picture through a camera of the electronic device to obtain M pieces of face information for M faces, wherein each face corresponds to one piece of face information, and M is a positive integer less than or equal to N;
selecting M target windows among the N windows, and respectively displaying the M faces in the M target windows, wherein each face corresponds to one target window;
determining a target sticker corresponding to each of the M faces according to the M pieces of face information to obtain M target stickers, wherein each face corresponds to one target sticker;
and respectively adding the M target stickers to the target windows corresponding to their faces among the M target windows.
In a second aspect, an embodiment of the present application provides a shooting control apparatus, which is applied to an electronic device, where the electronic device includes a display screen, and the shooting control apparatus includes:
a creating unit, configured to receive a window opening instruction when the electronic device is in a shooting mode, and create N windows on the display screen, wherein N is a positive integer;
a detection unit, configured to perform face detection on a shot picture through a camera of the electronic device to obtain M pieces of face information for M faces, wherein each face corresponds to one piece of face information, and M is a positive integer less than or equal to N;
a display unit, configured to select M target windows among the N windows and respectively display the M faces in the M target windows, wherein each face corresponds to one target window;
a determining unit, configured to determine a target sticker corresponding to each of the M faces according to the M pieces of face information to obtain M target stickers, wherein each face corresponds to one target sticker;
and an execution unit, configured to respectively add the M target stickers to the target windows corresponding to their faces among the M target windows.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor and a memory; and one or more programs stored in the memory and configured to be executed by the processor, the programs including instructions for some or all of the steps as described in the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium for storing a computer program, the computer program causing a computer to execute some or all of the steps described in the first aspect of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, where the computer program product comprises a non-transitory computer-readable storage medium storing a computer program, the computer program being operable to cause a computer to perform some or all of the steps as described in the first aspect of embodiments of the present application. The computer program product may be a software installation package.
The embodiment of the application has the following beneficial effects:
it can be seen that, in the shooting control method and the related product described in the embodiments of the present application, when the electronic device is in the shooting mode, a window opening instruction is received and N windows are created on the display screen, N being a positive integer. A camera of the electronic device performs face detection on the shot picture to obtain M pieces of face information for M faces, each face corresponding to one piece of face information, M being a positive integer less than or equal to N. M target windows among the N windows are selected and the M faces are displayed in the M target windows respectively, each face corresponding to one target window. A target sticker corresponding to each of the M faces is determined according to the M pieces of face information to obtain M target stickers, each face corresponding to one target sticker, and the M target stickers are respectively added to the target windows corresponding to their faces. Therefore, a sticker can be set for each face according to the expressions of the faces, meeting the user's personalized sticker requirements.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1A is a schematic structural diagram of an example electronic device provided in an embodiment of the present application;
fig. 1B is a schematic flowchart of a shooting control method provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of another shooting control method provided in an embodiment of the present application;
fig. 3 is a schematic flowchart of another shooting control method provided in an embodiment of the present application;
fig. 4 is another schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 5A is a schematic structural diagram of a shooting control apparatus according to an embodiment of the present application;
fig. 5B is a modified structural diagram of the shooting control apparatus described in fig. 5A provided in an embodiment of the present application;
fig. 6 is a schematic structural diagram of another electronic device provided in the embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is obvious that the described embodiments are only a part of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The electronic devices involved in the embodiments of the present application may include various handheld devices, vehicle-mounted devices, wearable devices, computing devices or other processing devices connected to a wireless modem with wireless communication functions, as well as various forms of User Equipment (UE), Mobile Stations (MS), terminal equipment (terminal device), and so on. For convenience of description, the above-mentioned devices are collectively referred to as electronic devices.
The following describes embodiments of the present application in detail.
Referring to fig. 1A, fig. 1A is a schematic structural diagram of an electronic device 100 according to an embodiment of the present disclosure. The electronic device 100 includes: a housing 110, a circuit board 120 disposed in the housing 110, and a camera 130 and a display screen 140 disposed on the housing 110. A processor 121 is disposed on the circuit board 120; the camera 130 is connected to the processor 121, and the processor 121 is connected to the display screen 140.
Referring to fig. 1B, fig. 1B is a schematic flowchart of a shooting control method according to an embodiment of the present disclosure, where the shooting control method described in this embodiment is applied to an electronic device shown in fig. 1A, the electronic device includes a display screen, and the shooting control method includes:
101. and when the electronic equipment is in a shooting mode, receiving a window opening instruction, and creating N windows on the display screen, wherein N is a positive integer.
In the embodiment of the application, when the electronic device is in the shooting mode, it can receive a window opening instruction sent by the user. The user can set the number N of windows through the electronic device; generally, the user sets the number of windows according to the number of faces to be shot. For example, when 3 faces are to be shot, 3 windows can be set through the electronic device, and after receiving the window opening instruction, the electronic device creates 3 windows on the display screen.
Optionally, the electronic device may provide a virtual button on the display screen through which the user can send the window opening instruction, and the shape of the N windows may be set by system default or by the user.
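As an illustration only (the patent does not prescribe any data structures), the window-creation step could be sketched in Python as follows; `Window`, `face_id`, and `create_windows` are hypothetical names:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Window:
    """One preview window on the display screen (hypothetical structure)."""
    index: int
    face_id: Optional[int] = None  # a face is assigned later, in step 103

def create_windows(n: int) -> List[Window]:
    """Create N windows in response to a window opening instruction."""
    if n < 1:
        raise ValueError("N must be a positive integer")
    return [Window(index=i) for i in range(n)]
```

A caller handling the user's instruction would invoke `create_windows(3)` when three faces are to be shot, matching the example above.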
102. And performing face detection on the shot picture through a camera of the electronic device to obtain M pieces of face information for M faces, wherein each face corresponds to one piece of face information, and M is a positive integer less than or equal to N.
In an embodiment of the present application, the face information may include at least one of the following: a face contour, expression information, and position information. The position information is the position of each of the M faces in the shot picture, and the shot picture can be cropped according to the position information of each of the M faces to obtain a face image of each face.
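Cropping the shot picture by position information can be sketched as below; the `(x, y, width, height)` box format and the function name are assumptions for illustration:

```python
def crop_face(frame, box):
    """Crop one face image out of the shot picture using its position
    information, given here as an assumed (x, y, width, height) box.
    `frame` is a row-major 2-D list of pixel values."""
    x, y, w, h = box
    return [row[x:x + w] for row in frame[y:y + h]]
```

Applied once per detected face, this yields the M face images used in the expression-matching step.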
Optionally, in step 102, performing face detection on the shot picture through a camera of the electronic device to obtain M pieces of face information of M faces, which may include the following steps:
21. performing face detection on the shot picture through a camera of the electronic device to obtain M face contours of the M faces;
22. cropping the shot picture according to the M face contours to obtain M face images;
23. acquiring a target expression corresponding to each of the M face images to obtain M target expressions, wherein each face image corresponds to one target expression.
Face detection may be performed on the shot picture to obtain the face contours as follows: facial feature points are extracted from the shot picture to obtain a feature point set for each of the M faces, where each feature point set includes a plurality of feature points for each part of one face. Then, for the feature point set of any face, the feature points are connected to obtain a peripheral contour, which is the face contour of that face.
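Connecting a feature point set into a peripheral contour can be approximated by a convex hull, as in this illustrative sketch (the patent does not specify the connection algorithm; Andrew's monotone-chain hull is one common choice):

```python
def peripheral_contour(points):
    """Approximate a face's peripheral contour as the convex hull of its
    feature points (Andrew's monotone-chain algorithm).
    `points` is a list of (x, y) tuples; returns hull vertices in order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of the cross product (OA x OB); >0 means left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                      # build lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]     # endpoints shared, drop duplicates
```

Interior feature points (eyes, nose) fall inside the hull, leaving only the outer outline of the face.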
A plurality of image templates may be obtained in advance, each image template corresponding to one expression. To obtain the target expression corresponding to each of the M face images, the M face images are respectively matched against the image templates to determine the target expression corresponding to each face image. Specifically, for any face image, the face image is matched against the templates in turn, and the expression corresponding to the successfully matched image template is taken as its target expression.
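The template-matching step might look like the following sketch, where the naive pixel-difference score is an illustrative stand-in for whatever matcher an implementation actually uses:

```python
def match_expression(face_image, templates):
    """Match a face image against pre-stored templates, one per expression,
    and return the expression of the best-matching template.
    `face_image` is a flat list of pixel values; each template is a dict
    with hypothetical keys "expression" and "pixels"."""
    def score(a, b):
        # naive dissimilarity: sum of absolute pixel differences
        return sum(abs(x - y) for x, y in zip(a, b))

    best = min(templates, key=lambda t: score(face_image, t["pixels"]))
    return best["expression"]
```

A production system would normalize the face crop and use a learned classifier instead of raw pixel differences; the control flow, however, is the same.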
Optionally, if M is smaller than N, sending a face detection prompting message, where the face detection prompting message is used to prompt a user to send a face detection instruction to the electronic device;
and receiving the face detection instruction, and repeating the operation of performing face detection on the shot picture through the camera of the electronic device according to the face detection instruction until M equals N.
In the embodiment of the application, if the number M of faces detected by the camera is less than the number N of windows, the faces of the users in the shot picture need to be detected again. A face detection prompt message can be issued, which may include a text or icon prompt displayed on the display screen, or a voice prompt, to prompt the users to correct their shooting posture or adjust the shooting angle, so that the camera can accurately acquire the face information of each face in the shot picture.
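The detect-prompt-retry loop described above could be sketched as follows; `detect`, `prompt`, and the retry cap are illustrative assumptions, not part of the patent:

```python
def detect_until_filled(detect, n, prompt, max_tries=5):
    """Repeat face detection until the number of detected faces M reaches
    the number of windows N, prompting the user between attempts.
    `detect()` returns the current list of detected faces;
    `prompt(msg)` delivers the face detection prompt message."""
    faces = detect()
    tries = 1
    while len(faces) < n and tries < max_tries:
        prompt(f"{n - len(faces)} more face(s) needed; please adjust pose or angle")
        faces = detect()
        tries += 1
    return faces
```

The cap guards against looping forever when fewer than N people are actually in frame, a failure mode the patent text leaves open.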
103. And selecting M target windows from the N windows, and respectively displaying the M faces in the M target windows, wherein each face corresponds to one target window.
In the embodiment of the application, after the M pieces of face information of the M faces in the shot picture are obtained, M target windows among the N windows may be selected. Optionally, the window corresponding to each of the M faces may be determined according to the position information of the M faces, and a Texture may be generated from the picture of each face, where a Texture is a container storing the image data of each face in the shot picture. Specifically, the electronic device may compute the color of each pixel of the image data in the container through a shader to generate a corresponding texture for each of the M faces, and then use the shader to display the M textures corresponding to the M faces in the M windows, where a shader is an editable program used to implement image rendering.
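Pairing faces with target windows by position information might be sketched like this; the left-to-right ordering policy is an assumption, since the patent leaves the selection rule open:

```python
def assign_faces_to_windows(face_infos, n_windows):
    """Pair each detected face with one target window.
    `face_infos` is a list of M face-info dicts with an assumed "x" key
    (horizontal position in the shot picture); faces are assigned to
    windows 0..M-1 in left-to-right order. Requires M <= N."""
    assert len(face_infos) <= n_windows  # M must not exceed N
    order = sorted(range(len(face_infos)), key=lambda i: face_infos[i]["x"])
    return {face_idx: win for win, face_idx in enumerate(order)}
```

The returned mapping (face index to window index) is what a renderer would consult when drawing each face's texture into its window.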
104. And determining a target sticker corresponding to each of the M faces according to the M pieces of face information to obtain M target stickers, wherein each face corresponds to one target sticker.
In the embodiment of the application, the target sticker corresponding to the face information of each of the M faces can be determined. When the face information includes expression information, a correspondence between expressions and stickers can be preset, and the target sticker corresponding to the expression information of each face is determined according to this correspondence, so that stickers are matched to the expressions of different faces. When the face information includes a face contour, a correspondence between contour shapes and stickers can be preset, and the target sticker corresponding to each face contour is determined according to this correspondence, so that stickers are matched to the contour shapes of different faces. When the face information includes face position information, different target stickers are set for faces at different positions, so that the visual effect corresponding to each of the M faces differs and the sticker effect is more interesting; for example, the sticker for the face at the top can be set as a "glasses" sticker, the sticker for the face in the middle as an "ear" sticker, and the sticker for the face at the bottom as a "cartoon beard" sticker.
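The preset correspondences between expressions or positions and stickers could be sketched as plain lookup tables; the table contents below are hypothetical examples, except that "glasses", "ear", and "cartoon beard" come from the position example in the text:

```python
# Hypothetical correspondence tables; the patent only requires that such
# mappings be preset, not what they contain.
EXPRESSION_TO_STICKER = {"smile": "dimple", "wink": "star-eye", "neutral": "glasses"}
POSITION_TO_STICKER = {"top": "glasses", "middle": "ear", "bottom": "cartoon beard"}

def pick_sticker(face_info):
    """Determine the target sticker for one face: prefer expression
    information, fall back to position information."""
    if "expression" in face_info:
        return EXPRESSION_TO_STICKER.get(face_info["expression"], "glasses")
    return POSITION_TO_STICKER[face_info["position"]]
```

Running `pick_sticker` once per face yields the M target stickers of step 104.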
105. And respectively adding the M target stickers to target windows corresponding to the human faces in the M target windows.
In the embodiment of the application, after the target sticker corresponding to each face in the M faces is determined, the M target stickers may be respectively added to the windows corresponding to the faces, specifically, for different target stickers, the display position corresponding to the target sticker may be determined, for example, when the target sticker is a "headwear" sticker, the "headwear" sticker may be displayed in the display area above the head of the face picture.
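Determining the display position for a sticker relative to a face could be sketched as follows; the "headwear above the head" rule comes from the text, while the box format and the default centre placement are assumptions:

```python
def sticker_position(face_box, sticker, sticker_h):
    """Return the (x, y) display position for a sticker relative to a face.
    `face_box` is an assumed (x, y, width, height) box; a 'headwear'
    sticker is placed in the display area above the head, others default
    to the face centre."""
    x, y, w, h = face_box
    if sticker == "headwear":
        return (x, max(0, y - sticker_h))  # clamp at the top of the screen
    return (x + w // 2, y + h // 2)
```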
Optionally, in step 105, adding the M target stickers respectively to the target windows corresponding to their faces among the M target windows may specifically include the following steps:
51. obtaining the regional expression parameters of each of the M faces to obtain M regional expression parameters;
52. determining display parameters of each of the M target stickers according to the M regional expression parameters to obtain M display parameters;
53. adding each of the M target stickers to the target window corresponding to its face among the M target windows according to the M display parameters.
In this embodiment, the regional expression parameters are the expression parameters of a specific region of a face. When taking a photo, a user may make various expressions: with an exaggerated expression, the eyebrows may move up and down, the mouth may open wide, or the eyes may widen; when smiling, a dimple may appear on the cheek; the user may also blink, and so on. Each of these expressions corresponds to one regional expression parameter. Thus, the regional expression parameters of the user's face are acquired and the display parameters of the target sticker are determined according to them; for example, when the user blinks, the regional expression parameters of the eyes can be acquired, and when the user smiles, the regional expression parameters of the dimple region can be acquired. The display parameters may include at least one of the following: a display position and a scaling ratio of the target sticker.
Optionally, in step 52, determining the display parameter of each of the M target stickers according to the M regional expression parameters includes:
determining the scaling ratio of each of the M target stickers according to each of the M regional expression parameters;
and in step 53, adding the M target stickers respectively to the target windows corresponding to their faces among the M target windows may specifically include the following steps:
scaling each of the M target stickers according to its scaling ratio to obtain M scaled target stickers;
and adding each of the M scaled target stickers to the window corresponding to its face.
For each of the M faces, after the target sticker is determined according to the expression information of the face, the regional expression parameter of the face region corresponding to the target sticker may be determined, and then the scaling ratio of the target sticker corresponding to the face may be determined according to that regional expression parameter.
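Scaling a target sticker by a regional expression parameter might be sketched like this; the linear ratio and the baseline value are illustrative choices, not taken from the patent:

```python
def scale_sticker(sticker_size, region_param, baseline=1.0):
    """Scale a target sticker by a regional expression parameter, e.g. a
    wide-open mouth (larger region measurement) yields a larger sticker.
    `sticker_size` is (width, height); `region_param` / `baseline` gives
    the scaling ratio (an assumed linear relationship)."""
    ratio = region_param / baseline
    w, h = sticker_size
    return (round(w * ratio), round(h * ratio))
```

The scaled size is then used when compositing the sticker into the face's target window in step 53.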
In the embodiment of the application, after the M target stickers are added for the M faces, the electronic device can receive a shooting control instruction sent by the user and perform image or video shooting according to it. In this way, different target stickers are matched to multiple users, and the target sticker corresponding to each face in the shot image or video meets the user's personalized requirements.
It can be seen that the shooting control method described in the embodiments of the present application is applied to an electronic device including a display screen. When the electronic device is in the shooting mode, a window opening instruction is received and N windows are created on the display screen, N being a positive integer. Face detection is performed on the shot picture through a camera of the electronic device to obtain M pieces of face information for M faces, each face corresponding to one piece of face information, M being a positive integer less than or equal to N. M target windows among the N windows are selected and the M faces are displayed in the M target windows respectively, each face corresponding to one target window. A target sticker corresponding to each of the M faces is determined according to the M pieces of face information to obtain M target stickers, each face corresponding to one target sticker, and the M target stickers are respectively added to the target windows corresponding to their faces. Therefore, a sticker can be set for each face according to the expressions of the faces, meeting the user's personalized sticker requirements.
Referring to fig. 2, fig. 2 is a schematic flowchart of another shooting control method according to an embodiment of the present disclosure, where the shooting control method described in this embodiment is applied to the electronic device shown in fig. 1A, where the electronic device includes a display screen, and the method includes the following steps:
201. and when the electronic equipment is in a shooting mode, receiving a window opening instruction, and creating N windows on the display screen, wherein N is a positive integer.
202. And performing face detection on the shot picture through a camera of the electronic device to obtain M pieces of face information for M faces, wherein each face corresponds to one piece of face information, and M is a positive integer less than or equal to N.
203. And selecting M target windows from the N windows, and respectively displaying the M faces in the M target windows, wherein each face corresponds to one target window.
204. And determining a target sticker corresponding to each of the M faces according to the M pieces of face information to obtain M target stickers, wherein each face corresponds to one target sticker.
205. And respectively adding the M target stickers to target windows corresponding to the human faces in the M target windows.
For the specific implementation of steps 201 to 205, reference can be made to the corresponding description of the method shown in fig. 1B, which is not repeated here.
206. And if M is less than N, sending a face detection prompt message, wherein the face detection prompt message is used for prompting a user to send a face detection instruction to the electronic equipment.
In the embodiment of the application, if the number M of faces detected by the camera is smaller than the number N of windows, the faces of the users in the shot picture need to be detected again. The electronic device can issue a face detection prompt message to prompt the user to aim the camera at more faces, so that the number M of faces detected again matches the number N of windows.
207. Receiving the face detection instruction, and repeatedly executing steps 202 to 205.
The electronic equipment detects the number of human faces and human face information in the shot picture again according to the human face detection instruction, and then repeatedly displays the detected human faces to the window and matches the stickers.
It can be seen that the shooting control method described in the embodiments of the present application is applied to an electronic device including a display screen. When the electronic device is in the shooting mode, a window opening instruction is received and N windows are created on the display screen, N being a positive integer. Face detection is performed on the shot picture through a camera of the electronic device to obtain M pieces of face information for M faces, each face corresponding to one piece of face information, M being a positive integer less than or equal to N. M target windows among the N windows are selected and the M faces are displayed in the M target windows respectively, each face corresponding to one target window. A target sticker corresponding to each of the M faces is determined according to the M pieces of face information to obtain M target stickers, each face corresponding to one target sticker, and the M target stickers are respectively added to the target windows corresponding to their faces. Therefore, a sticker can be set for each face according to the expressions of the faces, meeting the user's personalized sticker requirements.
In accordance with the above, please refer to fig. 3, which is a schematic flow chart of another embodiment of a shooting control method according to an embodiment of the present application, where the shooting control method described in this embodiment is applied to the electronic device shown in fig. 1A, where the electronic device includes a display screen, and the method includes the following steps:
301. and when the electronic equipment is in a shooting mode, receiving a window opening instruction, and creating N windows on the display screen, wherein N is a positive integer.
302. And performing face detection on the shot picture through a camera of the electronic device to obtain M face contours of the M faces.
303. And cropping the shot picture according to the M face contours to obtain M face images.
304. And acquiring a target expression corresponding to each of the M face images to obtain M target expressions, wherein each face image corresponds to one target expression.
305. And selecting M target windows from the N windows, and respectively displaying the M faces in the M target windows, wherein each face corresponds to one target window.
306. And determining a target paster corresponding to each face in the M faces according to the M target expressions to obtain M target pasters, wherein each face corresponds to one target paster.
307. And respectively adding the M target stickers to target windows corresponding to the human faces in the M target windows.
308. And if M is less than N, sending a face detection prompt message, wherein the face detection prompt message is used for prompting a user to send a face detection instruction to the electronic equipment.
309. And receiving the face detection instruction, and repeatedly executing the steps 302 and 307.
The specific implementation process of steps 301 to 309 can be understood with reference to the corresponding description of the method shown in fig. 1B, and will not be repeated here.
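The control flow of steps 301 to 309 can be sketched in plain Python. This is only an illustrative simulation under stated assumptions: the names `detect_faces`, `shooting_control`, and the expression-to-sticker table are hypothetical and not part of the claimed implementation, which would run against a real camera pipeline and a face-detection library.

```python
# Hedged sketch of steps 301-309: window creation, face detection,
# the M < N prompt/retry loop, and per-face window/sticker assignment.
# All names are illustrative; the patent does not specify an API.

EXPRESSION_TO_STICKER = {"smile": "sun", "neutral": "cloud", "frown": "rain"}

def detect_faces(frame):
    """Stand-in for camera face detection: returns a list of
    (face_id, expression) tuples found in the shot picture."""
    return [(f["id"], f["expression"]) for f in frame]

def shooting_control(frames, n_windows):
    """Retry loop of steps 302/308/309: keep detecting on successive
    shot pictures until M == N faces are found, then assign each face
    a target window and a sticker chosen from its expression."""
    for frame in frames:                       # step 302 (repeated via 309)
        faces = detect_faces(frame)
        m = len(faces)
        if m < n_windows:                      # step 308: prompt and retry
            print("Please adjust pose/angle: %d of %d faces found" % (m, n_windows))
            continue
        # steps 305-307: one target window and one sticker per face
        result = []
        for window_idx, (face_id, expression) in enumerate(faces[:n_windows]):
            sticker = EXPRESSION_TO_STICKER.get(expression, "star")
            result.append({"window": window_idx, "face": face_id,
                           "sticker": sticker})
        return result
    raise RuntimeError("never detected N faces")

# Example: the first frame contains only 1 face, the second contains 2,
# so exactly one prompt/retry cycle occurs before stickers are assigned.
frames = [
    [{"id": "a", "expression": "smile"}],
    [{"id": "a", "expression": "smile"}, {"id": "b", "expression": "frown"}],
]
result = shooting_control(frames, n_windows=2)
```

In this toy run, face "a" (smiling) receives the "sun" sticker in window 0 and face "b" (frowning) the "rain" sticker in window 1.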
The shooting control method described in the embodiment of the present application is applied to an electronic device including a display screen. When the electronic device is in a shooting mode, a window opening instruction is received and N windows are created on the display screen, where N is a positive integer; face detection is performed on a shot picture through a camera of the electronic device to obtain face information of M faces, each face corresponding to one piece of face information, where M is a positive integer less than or equal to N; M target windows are selected from the N windows and the M faces are respectively displayed in the M target windows, each face corresponding to one target window; a target sticker corresponding to each of the M faces is determined according to the M pieces of face information to obtain M target stickers, each face corresponding to one target sticker; and the M target stickers are respectively added to the target windows corresponding to the faces among the M target windows. In this way, a sticker can be set for each face according to the expressions of a plurality of faces.
The following describes a device for implementing the above shooting control method:
In accordance with the above, please refer to fig. 4, which shows an electronic device according to an embodiment of the present application, including: a processor and a memory; and one or more programs stored in the memory and configured to be executed by the processor, the programs including instructions for performing the following steps:
receiving a window opening instruction when the electronic device is in a shooting mode, and creating N windows on the display screen, where N is a positive integer;
performing face detection on a shot picture through a camera of the electronic device to obtain face information of M faces, where each face corresponds to one piece of face information, and M is a positive integer less than or equal to N;
selecting M target windows from the N windows, and respectively displaying the M faces in the M target windows, where each face corresponds to one target window;
determining a target sticker corresponding to each of the M faces according to the M pieces of face information to obtain M target stickers, where each face corresponds to one target sticker;
and respectively adding the M target stickers to the target windows corresponding to the faces among the M target windows.
In one possible example, in the aspect of performing face detection on the shot picture through the camera of the electronic device to obtain the face information of the M faces, the program includes instructions for performing the following steps:
performing face detection on the shot picture through the camera of the electronic device to obtain M face outlines of the M faces;
cropping the shot picture according to the M face outlines to obtain M face images;
and acquiring a target expression corresponding to each of the M face images to obtain M target expressions, where each face image corresponds to one target expression.
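The outline-crop-classify detail above can be sketched as follows. The patent does not prescribe a specific detection or classification algorithm, so face outlines are simplified here to bounding boxes over a 2-D pixel grid, and the expression classifier is a deliberately toy brightness rule; every function name is a hypothetical illustration.

```python
# Hedged sketch: given M face outlines (simplified to bounding boxes),
# crop the shot picture into M face images, then derive one target
# expression per face image. The classifier is a toy stand-in.

def crop(picture, box):
    """Crop a 2-D picture (list of pixel rows) to box = (top, left, bottom, right)."""
    top, left, bottom, right = box
    return [row[left:right] for row in picture[top:bottom]]

def classify_expression(face_image):
    """Toy expression rule for illustration only: bright crops count as 'smile'."""
    pixels = [p for row in face_image for p in row]
    return "smile" if sum(pixels) / len(pixels) > 128 else "neutral"

# An 8x8 "shot picture": bright upper half, dark lower half.
picture = [[200] * 8 for _ in range(4)] + [[10] * 8 for _ in range(4)]
outlines = [(0, 0, 4, 4), (4, 4, 8, 8)]        # M = 2 face outlines
face_images = [crop(picture, b) for b in outlines]          # M face images
expressions = [classify_expression(img) for img in face_images]  # M target expressions
```

Each face image maps to exactly one target expression, matching the one-to-one correspondence stated in the method.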
In one possible example, in the aspect of respectively adding the M target stickers to the target windows corresponding to the faces, the program includes instructions for:
obtaining a regional expression parameter of each of the M faces to obtain M regional expression parameters;
determining a display parameter of each of the M target stickers according to the M regional expression parameters to obtain M display parameters;
and adding each of the M target stickers to the target window corresponding to the face among the M target windows according to the M display parameters.
In one possible example, in the aspect of determining the display parameter of each of the M target stickers according to the M regional expression parameters, the program includes instructions for:
determining a scaling ratio of each of the M target stickers according to each of the M regional expression parameters;
in the aspect of respectively adding the M target stickers to the target windows corresponding to the faces among the M target windows, the program includes instructions for:
scaling each of the M target stickers according to its scaling ratio to obtain M scaled target stickers;
and adding each of the M scaled target stickers to the window corresponding to the face.
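The expression-driven scaling step can be sketched as below. The patent leaves the form of a "regional expression parameter" open, so this sketch assumes a single normalized value (e.g., mouth openness in [0, 1]) and a linear mapping to a scale factor; both the parameter form and the mapping range are assumptions, not claimed behavior.

```python
# Hedged sketch: map a regional expression parameter to a scaling
# ratio, then scale each target sticker's base size before it is
# drawn in the face's target window. The [0.8, 1.5] range is an
# assumed illustration, not specified by the patent.

def scaling_ratio(mouth_openness, lo=0.8, hi=1.5):
    """Linearly map a regional expression parameter in [0, 1] to a scale factor."""
    return lo + (hi - lo) * mouth_openness

def scale_sticker(sticker, ratio):
    """Return a copy of the sticker with its width/height scaled by `ratio`."""
    w, h = sticker["size"]
    return {**sticker, "size": (round(w * ratio), round(h * ratio))}

params = [0.0, 1.0]                               # M = 2 regional parameters
stickers = [{"name": "sun", "size": (100, 100)},  # M = 2 target stickers
            {"name": "rain", "size": (100, 100)}]
scaled = [scale_sticker(s, scaling_ratio(p)) for s, p in zip(stickers, params)]
```

With these assumed values, a closed mouth (parameter 0.0) shrinks the sticker to 80x80 while a wide-open mouth (parameter 1.0) enlarges it to 150x150, so a stronger expression yields a more prominent sticker.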
In one possible example, the program further comprises instructions for performing the steps of:
if M is less than N, sending a face detection prompt message, where the face detection prompt message is used to prompt a user to send a face detection instruction to the electronic device;
and receiving the face detection instruction, and repeatedly executing, according to the face detection instruction, the operation of performing face detection on the shot picture through the camera of the electronic device until M is equal to N.
Referring to fig. 5A, fig. 5A is a schematic structural diagram of a shooting control device according to the present embodiment. The shooting control device is applied to an electronic device including a display screen, and includes a creating unit 501, a detection unit 502, a display unit 503, a determining unit 504, and an execution unit 505, wherein:
The creating unit 501 is configured to receive a window opening instruction when the electronic device is in a shooting mode, and create N windows on the display screen, where N is a positive integer.
The detection unit 502 is configured to perform face detection on a shot picture through a camera of the electronic device to obtain face information of M faces, where each face corresponds to one piece of face information, and M is a positive integer less than or equal to N.
The display unit 503 is configured to select M target windows from the N windows, and respectively display the M faces in the M target windows, where each face corresponds to one target window.
The determining unit 504 is configured to determine, according to the M pieces of face information, a target sticker corresponding to each of the M faces to obtain M target stickers, where each face corresponds to one target sticker.
The execution unit 505 is configured to respectively add the M target stickers to the target windows corresponding to the faces among the M target windows.
Optionally, the detecting unit 502 is specifically configured to:
performing face detection on the shot picture through the camera of the electronic device to obtain M face outlines of the M faces;
cropping the shot picture according to the M face outlines to obtain M face images;
and acquiring a target expression corresponding to each of the M face images to obtain M target expressions, where each face image corresponds to one target expression.
Optionally, the execution unit 505 is specifically configured to:
obtaining a regional expression parameter of each of the M faces to obtain M regional expression parameters;
determining a display parameter of each of the M target stickers according to the M regional expression parameters to obtain M display parameters;
and adding each of the M target stickers to the target window corresponding to the face among the M target windows according to the M display parameters.
Optionally, in the aspect of determining the display parameter of each target sticker of the M target stickers according to the M regional expression parameters, the execution unit 505 is specifically configured to:
determine a scaling ratio of each of the M target stickers according to each of the M regional expression parameters;
in the aspect of respectively adding the M target stickers to the target windows corresponding to the faces among the M target windows, the execution unit 505 is specifically configured to:
scale each of the M target stickers according to its scaling ratio to obtain M scaled target stickers;
and add each of the M scaled target stickers to the window corresponding to the face.
Alternatively, as shown in fig. 5B, fig. 5B is a modified structure of the shooting control device described in fig. 5A; compared with fig. 5A, the device may further include a sending unit 506, wherein:
The sending unit 506 is configured to send a face detection prompt message when M is less than N, where the face detection prompt message is used to prompt a user to send a face detection instruction to the electronic device.
The detection unit 502 then receives the face detection instruction, and repeatedly executes the operation of performing face detection on the shot picture through the camera of the electronic device until M is equal to N.
It can be seen that the shooting control apparatus described in the embodiment of the present application is applied to an electronic device including a display screen. When the electronic device is in a shooting mode, a window opening instruction is received and N windows are created on the display screen, where N is a positive integer; face detection is performed on a shot picture through a camera of the electronic device to obtain face information of M faces, each face corresponding to one piece of face information, where M is a positive integer less than or equal to N; M target windows are selected from the N windows and the M faces are respectively displayed in the M target windows, each face corresponding to one target window; a target sticker corresponding to each of the M faces is determined according to the M pieces of face information to obtain M target stickers, each face corresponding to one target sticker; and the M target stickers are respectively added to the target windows corresponding to the faces among the M target windows. In this way, a sticker can be set for each face according to its expression, meeting the user's personalized sticker requirements.
It can be understood that the functions of each program module of the shooting control apparatus in this embodiment can be specifically implemented according to the method in the foregoing method embodiment, and the specific implementation process thereof can refer to the related description of the foregoing method embodiment, which is not described herein again.
As shown in fig. 6, for convenience of description, only the portions related to the embodiments of the present application are shown; for specific technical details not disclosed, please refer to the method portion of the embodiments of the present application. The electronic device may be any terminal device, including a mobile phone, a tablet computer, a personal digital assistant (PDA), a point of sale (POS) terminal, a vehicle-mounted computer, and the like.
Fig. 6 is a block diagram illustrating a partial structure of an electronic device provided in an embodiment of the present application. As shown in fig. 6, the electronic device 610 may include control circuitry, which may include storage and processing circuitry 620. The storage and processing circuitry 620 may include memory, such as hard disk drive memory, non-volatile memory (e.g., flash memory or other electronically programmable read-only memory used to form a solid state drive), volatile memory (e.g., static or dynamic random access memory), and so on; the embodiments of the present application are not limited thereto. The processing circuitry in the storage and processing circuitry 620 may be used to control the operation of the electronic device 610. The processing circuitry may be implemented based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors, power management units, audio codec chips, application-specific integrated circuits, display driver integrated circuits, and the like.
The storage and processing circuit 620 may be used to run software in the electronic device 610, such as an internet browsing application, a Voice Over Internet Protocol (VOIP) phone call application, an email application, a media playing application, operating system functions, and so forth. Such software may be used to perform control operations such as, for example, camera-based image capture, ambient light measurement based on an ambient light sensor, proximity sensor measurement based on a proximity sensor, information display functionality based on status indicators such as status indicator lights of light emitting diodes, touch event detection based on a touch sensor, functionality associated with displaying information on multiple (e.g., layered) displays, operations associated with performing wireless communication functions, operations associated with collecting and generating audio signals, control operations associated with collecting and processing button press event data, and other functions in the electronic device 610, and the like, without limitation of embodiments of the present application.
The electronic device 610 may also include input-output circuitry 630. The input-output circuitry 630 may be used to enable the electronic device 610 to input and output data, i.e., to allow the electronic device 610 to receive data from external devices and to output data from the electronic device 610 to external devices. The input-output circuitry 630 may further include a sensor 631. The sensor 631 may include an ambient light sensor, a light- and capacitance-based proximity sensor, a touch sensor (e.g., a light-based and/or capacitive touch sensor or an ultrasonic sensor, where the touch sensor may be part of a touch display screen or may be used independently as a touch sensor structure), an acceleration sensor, and other sensors.
Input-output circuitry 630 may also include one or more displays, such as display 632. Display 632 may include one or a combination of liquid crystal displays, organic light emitting diode displays, electronic ink displays, plasma displays, displays using other display technologies. Display 632 may include an array of touch sensors (i.e., display 632 may be a touch display screen). The touch sensor may be a capacitive touch sensor formed by a transparent touch sensor electrode (e.g., an Indium Tin Oxide (ITO) electrode) array, or may be a touch sensor formed using other touch technologies, such as acoustic wave touch, pressure sensitive touch, resistive touch, optical touch, and the like, and the embodiments of the present application are not limited thereto.
The electronic device 610 may also include an audio component 633. The audio component 633 may be used to provide audio input and output functionality for the electronic device 610. The audio component 633 in the electronic device 610 may include speakers, microphones, buzzers, tone generators, and other components for generating and detecting sound.
The communication circuit 634 may be used to provide the electronic device 610 with the ability to communicate with external devices. The communication circuit 634 may include analog and digital input-output interface circuits, and wireless communication circuits based on radio frequency signals and/or optical signals. The wireless communication circuitry in communication circuitry 634 may include radio-frequency transceiver circuitry, power amplifier circuitry, low noise amplifiers, switches, filters, and antennas. For example, the wireless communication circuitry in communications circuitry 634 may include circuitry to support Near Field Communication (NFC) by transmitting and receiving near field coupled electromagnetic signals. For example, the communication circuit 634 may include a near field communication antenna and a near field communication transceiver. The communications circuitry 634 may also include a cellular telephone transceiver and antenna, a wireless local area network transceiver circuitry and antenna, and so forth.
The electronic device 610 may further include a battery, power management circuitry, and other input-output units 635. The input-output unit 635 may include buttons, joysticks, click wheels, scroll wheels, touch pads, keypads, keyboards, cameras, light emitting diodes and other status indicators, and the like.
A user may input commands through the input-output circuitry 630 to control the operation of the electronic device 610, and may receive status information and other outputs from the electronic device 610 through the output data of the input-output circuitry 630.
In the foregoing embodiments shown in fig. 1B, fig. 2, or fig. 3, the method flows of the steps may be implemented based on the structure of the electronic device.
In the embodiments shown in fig. 4, fig. 5A or fig. 5B, the functions of the units may be implemented based on the structure of the electronic device.
Embodiments of the present application also provide a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, the computer program causing a computer to execute a part or all of the steps of any one of the photographing control methods as set forth in the above method embodiments.
Embodiments of the present application also provide a computer program product including a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to execute some or all of the steps of any one of the shooting control methods as set forth in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software program module.
The integrated units, if implemented in the form of software program modules and sold or used as stand-alone products, may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, or the part thereof contributing to the prior art, may be embodied in whole or in part in the form of a software product stored in a memory, the software product including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash disk, ROM, RAM, magnetic or optical disk, and the like.
The embodiments of the present application have been described in detail above, and specific examples are used herein to illustrate the principles and implementations of the present application; the above description of the embodiments is only provided to help understand the method and core concept of the present application. Meanwhile, for those skilled in the art, there may be variations in the specific implementation and application scope according to the idea of the present application. In summary, the content of this specification should not be construed as a limitation to the present application.

Claims (9)

1. A shooting control method, applied to an electronic device comprising a display screen, the method comprising:
receiving a window opening instruction when the electronic device is in a shooting mode, and creating N windows on the display screen, wherein N is a positive integer, and N is a value set by a user according to the number of faces to be shot;
performing face detection on a shot picture through a camera of the electronic device to obtain face information of M faces, wherein each face corresponds to one piece of face information, and M is a positive integer less than or equal to N;
if M is less than N, sending a face detection prompt message, wherein the face detection prompt message is used for prompting a user to send a face detection instruction to the electronic device, and is further used for prompting the user to correct a shooting posture or adjust a shooting angle; receiving the face detection instruction, and repeatedly executing, according to the face detection instruction, the operation of performing face detection on the shot picture through the camera of the electronic device until M is equal to N;
selecting M target windows from the N windows, and respectively displaying the M faces in the M target windows, wherein each face corresponds to one target window;
determining a target sticker corresponding to each of the M faces according to the M pieces of face information to obtain M target stickers, wherein each face corresponds to one target sticker; and
respectively adding the M target stickers to the target windows corresponding to the faces among the M target windows.
2. The method according to claim 1, wherein the performing face detection on the shot picture through the camera of the electronic device to obtain the face information of the M faces comprises:
performing face detection on the shot picture through the camera of the electronic device to obtain M face outlines of the M faces;
cropping the shot picture according to the M face outlines to obtain M face images; and
acquiring a target expression corresponding to each of the M face images to obtain M target expressions, wherein each face image corresponds to one target expression.
3. The method according to claim 1 or 2, wherein the respectively adding the M target stickers to the target windows corresponding to the faces comprises:
obtaining a regional expression parameter of each of the M faces to obtain M regional expression parameters;
determining a display parameter of each of the M target stickers according to the M regional expression parameters to obtain M display parameters; and
adding each of the M target stickers to the target window corresponding to the face among the M target windows according to the M display parameters.
4. The method according to claim 3, wherein the determining the display parameter of each of the M target stickers according to the M regional expression parameters comprises:
determining a scaling ratio of each of the M target stickers according to each of the M regional expression parameters; and
the respectively adding the M target stickers to the target windows corresponding to the faces among the M target windows comprises:
scaling each of the M target stickers according to its scaling ratio to obtain M scaled target stickers; and
adding each of the M scaled target stickers to the window corresponding to the face.
5. A shooting control apparatus, applied to an electronic device comprising a display screen, the shooting control apparatus comprising:
a creating unit, configured to receive a window opening instruction when the electronic device is in a shooting mode, and create N windows on the display screen, wherein N is a positive integer, and N is a value set by a user according to the number of faces to be shot;
a detection unit, configured to perform face detection on a shot picture through a camera of the electronic device to obtain face information of M faces, wherein each face corresponds to one piece of face information, and M is a positive integer less than or equal to N;
a sending unit, configured to send a face detection prompt message when M is less than N, wherein the face detection prompt message is used for prompting a user to send a face detection instruction to the electronic device, and is further used for prompting the user to correct a shooting posture or adjust a shooting angle;
wherein the detection unit is further configured to receive the face detection instruction, and repeatedly execute the operation of performing face detection on the shot picture through the camera of the electronic device until M is equal to N;
a display unit, configured to select M target windows from the N windows, and respectively display the M faces in the M target windows, wherein each face corresponds to one target window;
a determining unit, configured to determine, according to the M pieces of face information, a target sticker corresponding to each of the M faces to obtain M target stickers, wherein each face corresponds to one target sticker; and
an execution unit, configured to respectively add the M target stickers to the target windows corresponding to the faces among the M target windows.
6. The shooting control apparatus according to claim 5, wherein the detection unit is specifically configured to:
perform face detection on the shot picture through the camera of the electronic device to obtain M face outlines of the M faces;
crop the shot picture according to the M face outlines to obtain M face images; and
acquire a target expression corresponding to each of the M face images to obtain M target expressions, wherein each face image corresponds to one target expression.
7. The shooting control apparatus according to claim 5 or 6, wherein the execution unit is specifically configured to:
obtain a regional expression parameter of each of the M faces to obtain M regional expression parameters;
determine a display parameter of each of the M target stickers according to the M regional expression parameters to obtain M display parameters; and
add each of the M target stickers to the target window corresponding to the face among the M target windows according to the M display parameters.
8. An electronic device, comprising: a processor and a memory; and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for the method of any of claims 1-4.
9. A computer-readable storage medium for storing a computer program, wherein the computer program causes a computer to perform the method according to any one of claims 1-4.
CN201810621300.XA 2018-06-15 2018-06-15 Shooting control method and related product Active CN108833779B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810621300.XA CN108833779B (en) 2018-06-15 2018-06-15 Shooting control method and related product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810621300.XA CN108833779B (en) 2018-06-15 2018-06-15 Shooting control method and related product

Publications (2)

Publication Number Publication Date
CN108833779A CN108833779A (en) 2018-11-16
CN108833779B true CN108833779B (en) 2021-05-04

Family

ID=64142269

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810621300.XA Active CN108833779B (en) 2018-06-15 2018-06-15 Shooting control method and related product

Country Status (1)

Country Link
CN (1) CN108833779B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109584177A (en) * 2018-11-26 2019-04-05 北京旷视科技有限公司 Face method of modifying, device, electronic equipment and computer readable storage medium
CN111488759A (en) * 2019-01-25 2020-08-04 北京字节跳动网络技术有限公司 Image processing method and device for animal face
CN112084750B (en) * 2019-06-14 2023-05-23 腾讯数码(天津)有限公司 Label paper processing method and device, electronic equipment and storage medium
CN111260600B (en) * 2020-01-21 2023-08-22 维沃移动通信有限公司 Image processing method, electronic equipment and medium
CN113923355A (en) * 2021-09-30 2022-01-11 上海商汤临港智能科技有限公司 Vehicle, image shooting method, device, equipment and storage medium
CN117079324B (en) * 2023-08-17 2024-03-12 厚德明心(北京)科技有限公司 Face emotion recognition method and device, electronic equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104954678A (en) * 2015-06-15 2015-09-30 Lenovo (Beijing) Co., Ltd. Image processing method, image processing device and electronic equipment
CN105516588A (en) * 2015-12-07 2016-04-20 Xiaomi Technology Co., Ltd. Photographic processing method and device
JP2017017409A (en) * 2015-06-29 2017-01-19 Furyu Corporation Imaging apparatus and imaging method
CN106803069A (en) * 2016-12-29 2017-06-06 Nanjing University of Posts and Telecommunications Crowd happiness-level recognition method based on deep learning

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101247482B (en) * 2007-05-16 2010-06-02 Beijing Superpix Micro Technology Co., Ltd. Method and device for implementing dynamic image processing
US8831379B2 (en) * 2008-04-04 2014-09-09 Microsoft Corporation Cartoon personalization
CN101308570A (en) * 2008-07-11 2008-11-19 Vimicro Corporation Regional effect steering method and apparatus
CN102289339B (en) * 2010-06-21 2013-10-30 Tencent Technology (Shenzhen) Co., Ltd. Method and device for displaying expression information
JP5879536B2 (en) * 2012-01-18 2016-03-08 Panasonic IP Management Co., Ltd. Display device and display method
WO2013126860A1 (en) * 2012-02-24 2013-08-29 Redigi, Inc. A method to give visual representation of a music file or other digital media object using Chernoff faces
CN103778360A (en) * 2012-10-26 2014-05-07 Huawei Technologies Co., Ltd. Face unlocking method and device based on motion analysis
CN104244101A (en) * 2013-06-21 2014-12-24 Samsung Electronics (China) R&D Center Method and device for commenting on multimedia content
CN103413270A (en) * 2013-08-15 2013-11-27 Beijing Xiaomi Technology Co., Ltd. Method and device for image processing and terminal device
WO2015127394A1 (en) * 2014-02-23 2015-08-27 Northeastern University System for beauty, cosmetic, and fashion analysis
CN106412458A (en) * 2015-07-31 2017-02-15 ZTE Corporation Image processing method and apparatus
CN105554429A (en) * 2015-11-19 2016-05-04 Zhangying Information Technology (Shanghai) Co., Ltd. Video conversation display method and video conversation equipment
CN106210545A (en) * 2016-08-22 2016-12-07 Beijing Kingsoft Internet Security Software Co., Ltd. Video shooting method and device and electronic equipment
CN106777329B (en) * 2017-01-11 2019-03-05 Vivo Mobile Communication Co., Ltd. Image information processing method and mobile terminal
CN108022206A (en) * 2017-11-30 2018-05-11 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image processing method and device, electronic equipment and computer-readable storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Shooting techniques and post-production for large group photos; Luo Fei; Journal of Chongqing University of Education; 2014-05-25; full text *

Also Published As

Publication number Publication date
CN108833779A (en) 2018-11-16

Similar Documents

Publication Publication Date Title
CN108833779B (en) Shooting control method and related product
CN106558025B (en) Picture processing method and device
CN108495045B (en) Image capturing method, image capturing apparatus, electronic apparatus, and storage medium
CN107967129B (en) Display control method and related product
CN109283996B (en) Display control method and related product
CN109240577B (en) Screen capturing method and terminal
CN110099219B (en) Panoramic shooting method and related product
CN108307106B (en) Image processing method and device and mobile terminal
CN111399658B (en) Calibration method and device for eyeball fixation point, electronic equipment and storage medium
CN108495049A (en) Shooting control method and related product
CN109407948B (en) Interface display method and mobile terminal
CN113360005B (en) Color cast adjusting method and related product
CN108958587B (en) Split screen processing method and device, storage medium and electronic equipment
CN112703534A (en) Image processing method and related product
CN110198421B (en) Video processing method and related product
CN110221696B (en) Eyeball tracking method and related product
CN109194810B (en) Display control method and related product
CN108920052B (en) Page display control method and related product
CN109257489B (en) Display method and mobile terminal
CN108259756B (en) Image shooting method and mobile terminal
CN108121583B (en) Screen capturing method and related product
CN111556248B (en) Shooting method, shooting device, storage medium and mobile terminal
CN108737657B (en) Antenna control method and related product
CN114077465A (en) UI (user interface) rendering method and device, electronic equipment and storage medium
CN109451336B (en) Video playing method and related product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant