CN114125285A - Shooting method and device - Google Patents


Info

Publication number
CN114125285A
CN114125285A (application CN202111369179.4A)
Authority
CN
China
Prior art keywords
style
user
shooting
information
characteristic information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111369179.4A
Other languages
Chinese (zh)
Inventor
任苑
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202111369179.4A
Publication of CN114125285A
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a shooting method and a shooting device. The method comprises the following steps: while a shooting preview interface is displayed, acquiring feature information of an object shown in the shooting preview interface, the feature information reflecting the user's outfit style; the feature information comprises at least one of user attribute feature information and user outfit attribute feature information, and the user outfit attribute feature information comprises at least one of: upper-body outfit attribute feature information, lower-body outfit attribute feature information and outfit-type feature information; determining target outfit style information of the object according to the feature information and a preset outfit-style recognition model; and displaying a shooting style filter matched with the target outfit style information.

Description

Shooting method and device
Technical Field
The present application belongs to the field of image technologies, and in particular, to a shooting method and device.
Background
In current electronic devices, whether the front or the rear camera is used, the portrait effect is relatively uniform and cannot reflect each person's unique style and temperament. In other words, the "thousands of faces for thousands of people" portrait requirement cannot be met, even though both male and female users hope to take portrait photos with their own unique style and temperament.
Disclosure of Invention
Embodiments of the present application aim to provide a shooting method and a shooting device that solve the problem in the prior art that a portrait photo with a unique, personal style cannot be produced for each user.
In a first aspect, an embodiment of the present application provides a shooting method, where the method includes:
while a shooting preview interface is displayed, acquiring feature information of an object shown in the shooting preview interface, wherein the feature information reflects the user's outfit style; the feature information comprises at least one of user attribute feature information and user outfit attribute feature information; the user outfit attribute feature information comprises at least one of: upper-body outfit attribute feature information, lower-body outfit attribute feature information and outfit-type feature information;
determining target outfit style information of the object according to the feature information and a preset outfit-style recognition model;
and displaying a shooting style filter matched with the target outfit style information.
In a second aspect, an embodiment of the present application provides a shooting device, including:
an acquisition module, configured to acquire, while a shooting preview interface is displayed, feature information of an object shown in the shooting preview interface that reflects the user's outfit style; the feature information comprises at least one of user attribute feature information and user outfit attribute feature information; the user outfit attribute feature information comprises at least one of: upper-body outfit attribute feature information, lower-body outfit attribute feature information and outfit-type feature information;
a determining module, configured to determine target outfit style information of the object according to the feature information and a preset outfit-style recognition model;
and a display module, configured to display a shooting style filter matched with the target outfit style information.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiments of the application, on one hand, target outfit style information of an object is obtained from a preset outfit-style recognition model and from feature information that reflects the outfit style of the object displayed in a shooting preview interface, and a shooting style filter matched with the target outfit style information is then recommended, which improves the accuracy of shooting-style recommendations. On the other hand, because the method starts from the user's outfit attribute feature information, a photographing style better suited to the user's personal image can be obtained, meeting the "thousands of faces for thousands of people" portrait requirement.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a flowchart of a shooting method provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of a shooting device according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device according to another embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second" and the like in the description and claims of the present application are used to distinguish similar elements and do not necessarily describe a particular sequence or chronological order. It should be appreciated that data so labelled may be interchanged where appropriate, so that embodiments of the application can be practised in sequences other than those illustrated or described herein. These terms do not limit quantity; for example, a "first" item may be one or more than one. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the preceding and following objects.
The shooting method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Please refer to fig. 1, which is a flowchart illustrating a photographing method according to an embodiment of the present application. The method can be applied to electronic equipment, and the electronic equipment can be a mobile phone, a tablet computer, a notebook computer and the like. In fig. 1, the photographing method may include the following steps 1100 to 1300, which are described in detail below.
Step 1100, while a shooting preview interface is displayed, acquire feature information of an object shown in the shooting preview interface, the feature information reflecting the user's outfit style.
The shooting preview interface is the interface displayed after the shooting application is opened, and the shooting object is displayed in it.
The feature information reflecting the user's outfit style comprises at least one of user attribute feature information and user outfit attribute feature information.
The user attribute feature information may be the user's gender, for example: the user's gender is female, or the user's gender is male.
The user outfit attribute feature information comprises at least one of upper-body outfit attribute feature information, lower-body outfit attribute feature information and outfit-type feature information.
The upper-body outfit attribute feature information may include the sleeve style, the cut of the upper garment, the material of the upper garment and the type of the upper garment. Sleeve styles include, but are not limited to: long, medium, three-quarter and sleeveless. Cuts of the upper garment include, but are not limited to: long and short. Materials of the upper garment include, but are not limited to: cotton, silk, denim, knit, sequin and leather. Types of the upper garment include, but are not limited to: business, casual, sports and dress.
The lower-body outfit attribute feature information may include the skirt style, the trouser style, the material of the lower garment and the type of the lower garment. Skirt styles include, but are not limited to: short, mid-length and long. Trouser styles include, but are not limited to: shorts, full-length trousers and cropped trousers. Materials of the lower garment include, but are not limited to: cotton, silk, denim, knit, sequin and leather. Types of the lower garment include, but are not limited to: jeans, sport trousers, casual trousers and suit trousers.
The outfit-type feature information includes, but is not limited to: business, casual, sports and dress.
Example 1: the feature information reflecting the user's outfit style contains 6 items, which may be the user's gender, the sleeve style, the cut of the upper garment, the material of the upper garment, the type of the upper garment and the user's outfit type, with values of, respectively, female, long sleeve, long cut, cotton, shirt and business.
Example 2: the feature information contains 9 items, namely the 6 items of Example 1 plus the trouser style, the material of the lower garment and the type of the lower garment, with values of, respectively, female, long sleeve, long cut, cotton, shirt, business, full-length trousers, cotton and suit trousers.
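The two examples above can be sketched as a simple record type. This is a minimal illustration only: the field names and values are assumptions, since the patent describes the categories in prose and does not fix a data layout.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OutfitFeatures:
    """Feature information reflecting a user's outfit style (cf. Examples 1 and 2)."""
    gender: Optional[str] = None           # user attribute feature
    sleeve_style: Optional[str] = None     # long / medium / three-quarter / sleeveless
    top_cut: Optional[str] = None          # long / short
    top_material: Optional[str] = None     # cotton, silk, denim, knit, sequin, leather
    top_type: Optional[str] = None         # business, casual, sports, dress
    outfit_type: Optional[str] = None      # overall outfit-type feature
    trouser_style: Optional[str] = None    # lower-body fields stay None in Example 1
    bottom_material: Optional[str] = None
    bottom_type: Optional[str] = None

# Example 1: six items, upper body only
example1 = OutfitFeatures(gender="female", sleeve_style="long", top_cut="long",
                          top_material="cotton", top_type="shirt",
                          outfit_type="business")

# Example 2 adds the three lower-body items
example2 = OutfitFeatures(gender="female", sleeve_style="long", top_cut="long",
                          top_material="cotton", top_type="shirt",
                          outfit_type="business", trouser_style="full-length",
                          bottom_material="cotton", bottom_type="suit trousers")
```

A record like this is what the recognition step of the method would produce and what the style-recognition model would consume.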
In this embodiment, step 1100 of acquiring, while the shooting preview interface is displayed, the feature information of the object reflecting the user's outfit style may further include the following steps 1110 to 1120:
step 1110, acquiring a first image of the object acquired by the target camera.
The target camera may be a front-facing camera or a rear-facing camera in the electronic device.
It can be understood that when the user takes a selfie, the front camera of the electronic device is usually turned on; otherwise, the rear camera is turned on.
In one example, acquiring the first image of the object in step 1110 may further include: acquiring a first image of the object captured by the front camera while the front camera is on.
In this example, when the front camera is on, usually only the user's upper body is shown in the shooting preview interface, so the front camera may capture only an upper-body image of the object.
In another example, acquiring the first image of the object in step 1110 may further include: acquiring a first image of the object captured by the rear camera while the rear camera is on.
For example, if the user's upper body is shown in the shooting preview interface while the rear camera is on, the rear camera may capture only an upper-body image of the object.
For another example, if the user's lower body is shown in the shooting preview interface while the rear camera is on, the rear camera may capture a lower-body image of the object.
In a further example, acquiring the first image of the object in step 1110 may further include: acquiring first images of the object captured by the target camera at different zoom magnifications.
In this example, when the target camera is the rear camera, both the upper-body and the lower-body outfit can be recognised, so that a more accurate outfit style is recommended to the user. When the rear camera is used, the whole outfit may not be visible because of the shooting distance; the whole-body outfit can then be recognised by changing the zoom magnification of the rear camera. For example, first images of the object shown in the shooting preview interface may be captured at different zoom magnifications, such as 1X and 3X, to help recognise the user's whole-body outfit.
Step 1120, recognise the first image to obtain the feature information of the object reflecting the user's outfit style.
In this embodiment, after the target camera captures a first image of the object shown in the shooting preview interface, the first image may be recognised to obtain the feature information of the object reflecting the user's outfit style.
In this embodiment, before step 1100 of acquiring the feature information of the object reflecting the user's outfit style is executed, the shooting method of the present disclosure may further include steps 2100 to 2200:
at step 2100, a first input to the capture preview interface is received.
The first input may be a touch input on the shooting preview interface, such as a click input, for example a single tap.
Illustratively, shooting functions are displayed in the shooting preview interface; these include, but are not limited to: a video function, a shooting function, a slow-motion function and a portrait function. Here, the first input may be the user tapping the "portrait function".
Step 2200, in response to the first input, controlling the electronic device to enter a portrait shooting mode.
Continuing with the above example, after the user taps the "portrait function", the electronic device may be controlled to enter the portrait shooting mode, and the step of acquiring the feature information of the object reflecting the user's outfit style is executed only once the electronic device has entered the portrait shooting mode.
While the shooting preview interface is displayed, after the feature information of the object reflecting the user's outfit style has been acquired, the method proceeds to:
step 1200, determining target putting style information of the object according to the characteristic information and a preset putting style identification model.
The preset putting-on style recognition model is used for recognizing the putting-on style information of the object. And inputting the preset putting-on style identification model, namely the characteristic information of the object used for reflecting the putting-on style of the user, and outputting the characteristic information, namely the putting-on style information of the object. User engagement style information includes, but is not limited to: simple pure color, elegant perception, sweet and cool, dark, little fresh and youth campus.
In this embodiment, after obtaining the feature information of the object displayed in the shooting preview interface, which is used for reflecting the user's putting style, according to the step 1100, the feature information may be substituted into the preset putting style recognition model, so as to obtain the target putting style information of the object displayed in the shooting preview interface.
After the target outfit style information of the object is determined according to the feature information and the preset outfit-style recognition model, the method proceeds to:
Step 1300, display a shooting style filter matched with the target outfit style information.
In this embodiment, after the target outfit style information of the object shown in the shooting preview interface has been recognised, the shooting style filter matched with it can be recommended and displayed. It is understood that there may be one or more matched shooting style filters; this embodiment is not limited in this respect.
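The matching in step 1300 can be sketched as a lookup from the recognised style to one or more filters. The style names and the filter table below are illustrative assumptions; the patent leaves the style-to-filter mapping unspecified.

```python
# Hypothetical style-to-filter table; one style may map to several filters,
# as the embodiment allows.
STYLE_FILTERS = {
    "business": ["clean-tone", "soft-contrast"],
    "sweet-and-cool": ["pastel", "film-grain"],
    "dark": ["moody", "high-contrast"],
}

def filters_for(style, table=STYLE_FILTERS, default=("natural",)):
    """Return the shooting style filter(s) matched to a target outfit style,
    falling back to a default filter for an unrecognised style."""
    return list(table.get(style, default))
```

When several filters match, the device would display all of them and let the user pick a target filter, as described in steps 3100 to 3200 below.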
According to the embodiments of the application, on one hand, the target outfit style information of the object is obtained from the preset outfit-style recognition model and from the feature information reflecting the outfit style of the object displayed in the shooting preview interface, and the matched shooting style filter is then recommended, improving the accuracy of shooting-style recommendations. On the other hand, because the method starts from the user's outfit attribute feature information, a photographing style better suited to the user's personal image can be obtained, meeting the "thousands of faces for thousands of people" portrait requirement.
In one embodiment, after step 1300 of displaying the shooting style filter matched with the target outfit style information is executed, the shooting method of the embodiments of the present disclosure may further include the following steps 3100 to 3200:
step 3100, receiving a second input to the capture preview interface.
The second input may be a touch input on the shooting preview interface, such as a click input, for example a single tap.
Illustratively, a shooting button is displayed in the shooting preview interface, and the second input may be the user tapping the shooting button.
Step 3200, in response to the second input, photographing the object based on the photographing style filter.
In one example, when there is exactly one matched shooting style filter, the object may be shot directly based on that filter in response to the second input.
Continuing with the above example, after the shooting button is tapped, the object may be shot based on the single matched shooting style filter.
In another example, when there are several matched shooting style filters, the user may first select a target filter from among them; a second input on the shooting preview interface is then received, and in response the object is shot based on the target filter.
According to this embodiment, the object shown in the shooting preview interface can be shot based on the matched shooting style filter, so that a photographing style better suited to the user's personal image is obtained and the "thousands of faces for thousands of people" requirement is met.
In one embodiment, before step 1100 of acquiring the feature information of the object reflecting the user's outfit style is executed, the shooting method of the embodiments of the present disclosure may further include the following steps 4100 to 4200:
Step 4100, acquire user images with accurately labelled outfit styles as training samples.
With these training samples, the preset outfit-style recognition model can be obtained by fitting a mapping function.
Each training sample comprises the feature information of a corresponding object reflecting the user's outfit style, together with the outfit style information of that object.
The greater the number of training samples, generally the more accurate the training result; however, beyond a certain number, the accuracy improves more and more slowly until it levels off. The number of training samples can therefore be chosen by weighing the required accuracy against the cost of data processing.
Step 4200, obtain the preset outfit-style recognition model from the training samples.
According to this embodiment, the preset outfit-style recognition model can be obtained by training it on the training samples.
In this embodiment, the preset outfit-style recognition model may be obtained through various fitting methods based on the feature information and the outfit style information of the objects in the training samples; for example, an arbitrary multiple linear regression model may be used, which is not limited herein.
The multiple linear regression model may be a simple polynomial function representing the preset outfit-style recognition model, in which the coefficients of each order are unknown; by substituting the feature vectors of the training samples and the corresponding user outfit styles into the polynomial, these coefficients can be determined, yielding the preset outfit-style recognition model.
According to this embodiment, the preset outfit-style recognition model can be trained from the feature information reflecting the user's outfit style and the outfit style information of the object in each training sample.
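The training described in steps 4100 to 4200 can be sketched as follows. All feature names, style labels and training data here are illustrative; the patent only says the model is fitted from labelled samples (for example, by multiple linear regression), so a simple one-vs-rest perceptron over one-hot categorical features stands in for that fit.

```python
STYLES = ["business", "casual", "sweet-and-cool"]

def one_hot(sample, vocab):
    """Encode a dict of categorical features as a 0/1 vector over vocab."""
    return [1.0 if sample.get(f) == v else 0.0 for f, v in vocab]

def score(weights, x):
    return sum(w * xi for w, xi in zip(weights, x))

def train(samples, labels, vocab, epochs=20):
    """Fit per-style linear weights from labelled outfit-feature samples."""
    weights = {s: [0.0] * len(vocab) for s in STYLES}
    for _ in range(epochs):
        for sample, label in zip(samples, labels):
            x = one_hot(sample, vocab)
            pred = max(STYLES, key=lambda s: score(weights[s], x))
            if pred != label:  # mistake-driven update
                for i, xi in enumerate(x):
                    weights[label][i] += xi
                    weights[pred][i] -= xi
    return weights

def predict(weights, sample, vocab):
    """Step 1200 at inference time: feature information in, style out."""
    x = one_hot(sample, vocab)
    return max(STYLES, key=lambda s: score(weights[s], x))

# Tiny hypothetical labelled sample set
samples = [
    {"sleeve_style": "long", "top_type": "shirt"},
    {"sleeve_style": "short", "top_type": "t-shirt"},
]
labels = ["business", "casual"]
vocab = sorted({(f, v) for s in samples for f, v in s.items()})
model = train(samples, labels, vocab)
```

A production system would use far more samples and a stronger model, but the shape is the same: categorical outfit features are encoded as a vector, and a learned mapping assigns a style label.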
Next, an example photographing method is shown, comprising the following steps:
Step 5100, display a shooting preview interface.
Step 5200, receive a first input on the shooting preview interface, and in response, control the electronic device to enter a portrait shooting mode and turn on the front camera of the electronic device.
Step 5300, acquire a first image of the object captured by the front camera.
Step 5400, recognise the first image to obtain the feature information of the object reflecting the user's outfit style.
Example 1: the feature information reflecting the user's outfit style contains 6 items, which may be the user's gender, the sleeve style, the cut of the upper garment, the material of the upper garment, the type of the upper garment and the user's outfit type, with values of, respectively, female, long sleeve, long cut, cotton, shirt and business.
Step 5500, obtain the target outfit style information of the object according to the feature information and the preset outfit-style recognition model.
Step 5600, display a shooting style filter matched with the target outfit style information.
Step 5700, receive a second input on the shooting preview interface.
Step 5800, in response to the second input, shoot the object based on the shooting style filter.
According to this example, starting from the user's outfit attribute features, a photographing style better suited to the user's personal image can be obtained, meeting the "thousands of faces for thousands of people" requirement.
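The example flow of steps 5100 to 5800 can be summarised as a short pipeline. Every stage below is a stub: recognise_features stands in for the image recognition of step 5400, recognise_style for the preset outfit-style recognition model of step 5500, and match_filter for step 5600; none of these implementations come from the patent itself.

```python
def recognise_features(image):
    # Stub: a real implementation would run clothing-attribute recognition
    # on the first image captured by the front camera.
    return {"gender": "female", "outfit_type": "business"}

def recognise_style(features):
    # Stub for the preset outfit-style recognition model.
    return "business" if features.get("outfit_type") == "business" else "casual"

def match_filter(style):
    # Hypothetical style-to-filter table.
    table = {"business": "clean-tone", "casual": "warm-film"}
    return table.get(style, "natural")

def portrait_pipeline(image):
    """Steps 5300 to 5600: captured image -> features -> style -> filter."""
    features = recognise_features(image)
    style = recognise_style(features)
    return match_filter(style)
```

The returned filter is what the device would display and then apply when the user taps the shooting button in steps 5700 and 5800.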
In the shooting method provided by the embodiments of the present application, the execution subject may be a shooting device, or a control module in the shooting device for executing the shooting method. The embodiments of the present application describe the provided shooting device by taking the case in which the shooting device executes the shooting method as an example.
Corresponding to the above embodiments and with reference to fig. 2, an embodiment of the present application further provides a shooting apparatus 200; the apparatus 200 includes an acquisition module 201, a determining module 202 and a display module 203.
The acquisition module 201 is configured to acquire, while a shooting preview interface is displayed, feature information of an object shown in the shooting preview interface that reflects the user's outfit style; the feature information comprises at least one of user attribute feature information and user outfit attribute feature information; the user outfit attribute feature information comprises at least one of: upper-body outfit attribute feature information, lower-body outfit attribute feature information and outfit-type feature information.
The determining module 202 is configured to determine target outfit style information of the object according to the feature information and a preset outfit-style recognition model.
The display module 203 is configured to display a shooting style filter matched with the target outfit style information.
In one embodiment, the apparatus 200 further comprises a control module and a first receiving module (neither shown).
The first receiving module is used for receiving a first input of the shooting preview interface;
and the control module is used for responding to the first input and controlling the electronic equipment to enter a portrait shooting mode.
In an embodiment, the obtaining module 201 is specifically configured to: acquiring a first image of the object acquired by a target camera; and identifying the first image, and obtaining the characteristic information of the object used for reflecting the wearing style of the user.
In an embodiment, the obtaining module 201 is specifically configured to: acquire first images of the object captured by the target camera at different zoom magnifications.
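A minimal sketch of this multi-magnification acquisition, under the assumption (not stated in the patent) that digital zoom is simulated by center-cropping and rescaling a single frame — e.g. a wide view for overall outfit type and tighter views for upper- and lower-body detail; all function names are hypothetical:

```python
import numpy as np

def center_zoom(frame: np.ndarray, magnification: float) -> np.ndarray:
    """Simulate digital zoom: center-crop by the magnification factor,
    then upscale back to the original resolution (nearest neighbour)."""
    h, w = frame.shape[:2]
    ch, cw = int(h / magnification), int(w / magnification)
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = frame[top:top + ch, left:left + cw]
    # Nearest-neighbour upscale back to (h, w) via integer-array indexing.
    rows = np.arange(h) * ch // h
    cols = np.arange(w) * cw // w
    return crop[rows][:, cols]

def capture_at_magnifications(frame: np.ndarray, mags=(1.0, 2.0, 5.0)) -> dict:
    """Return one view of the subject per zoom magnification."""
    return {m: center_zoom(frame, m) for m in mags}
```

On a real device the camera pipeline would capture separate frames at each zoom ratio rather than cropping one frame; this sketch only illustrates the data flow.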
In one embodiment, the apparatus 200 further comprises a second receiving module and a photographing module (not shown in the figure).
The second receiving module is configured to receive a second input on the shooting preview interface.
The shooting module is configured to, in response to the second input, shoot the object based on the shooting style filter.
In the embodiments of the application, on the one hand, the target outfit style information of the object is determined according to the preset outfit style recognition model and the feature information, reflecting the user's outfit style, of the object displayed in the shooting preview interface, and a shooting style filter matched with the target outfit style information is then recommended, which improves the accuracy of the shooting style recommendation. On the other hand, because the method starts from the user's outfit attribute feature information, a shooting style better suited to the user's personal image can be obtained, meeting the need for personalized, per-user shooting effects.
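The obtain-determine-display flow above can be illustrated with a minimal, hypothetical sketch. The patent's "preset outfit style recognition model" would in practice be a trained model; the rule table and all names below are invented stand-ins:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OutfitFeatures:
    # At least one of the patent's feature items; None means "not detected".
    user_attribute: Optional[str] = None  # user attribute feature (e.g. age group)
    upper_body: Optional[str] = None      # upper-body outfit attribute (e.g. "blazer")
    lower_body: Optional[str] = None      # lower-body outfit attribute (e.g. "jeans")
    outfit_type: Optional[str] = None     # outfit type (e.g. "business", "casual")

# Toy rule table standing in for the preset outfit style recognition model.
STYLE_TO_FILTER = {
    "business": "clean_tone_filter",
    "casual": "warm_tone_filter",
    "sporty": "high_contrast_filter",
}

def recognize_style(features: OutfitFeatures) -> str:
    """Map feature information to target outfit style information (toy rules)."""
    if features.outfit_type:
        return features.outfit_type
    if features.upper_body == "blazer":
        return "business"
    return "casual"  # fallback style

def recommend_filter(features: OutfitFeatures) -> str:
    """Pick the shooting style filter matched with the target outfit style."""
    style = recognize_style(features)
    return STYLE_TO_FILTER.get(style, "warm_tone_filter")
```

The recommended filter name would then be shown on the shooting preview interface for the user to confirm before shooting.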
The shooting apparatus in the embodiments of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and the non-mobile electronic device may be a server, a network attached storage (NAS), a personal computer (PC), a television (TV), a teller machine, or a self-service machine; the embodiments of the present application are not specifically limited thereto.
The shooting apparatus in the embodiments of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system; the embodiments of the present application are not specifically limited thereto.
The shooting apparatus provided in the embodiments of the present application can implement each process implemented in the foregoing method embodiments; details are not repeated here to avoid repetition.
Corresponding to the foregoing embodiments, optionally, as shown in fig. 3, an embodiment of the present application further provides an electronic device 300, including a processor 301, a memory 302, and a program or instruction stored in the memory 302 and executable on the processor 301. When executed by the processor 301, the program or instruction implements each process of the foregoing shooting method embodiment and can achieve the same technical effect; details are not repeated here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 4 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 400 includes, but is not limited to: a radio frequency unit 401, a network module 402, an audio output unit 403, an input unit 404, a sensor 405, a display unit 406, a user input unit 407, an interface unit 408, a memory 409, and a processor 410.
Those skilled in the art will appreciate that the electronic device 400 may further include a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 410 through a power management system, so that charging, discharging, and power-consumption management are implemented through the power management system. The electronic device structure shown in fig. 4 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than those shown, combine some components, or arrange components differently, and details are not repeated here.
The processor 410 is configured to: in a case that a shooting preview interface is displayed, obtain feature information of an object displayed in the shooting preview interface, where the feature information reflects the user's outfit style, the feature information includes at least one of user attribute feature information and user outfit attribute feature information, and the user outfit attribute feature information includes at least one of: upper-body outfit attribute feature information of the user, lower-body outfit attribute feature information of the user, and outfit type feature information of the user; determine target outfit style information of the object according to the feature information and a preset outfit style recognition model; and display a shooting style filter matched with the target outfit style information.
In one embodiment, the processor 410 is further configured to: receive a first input on the shooting preview interface; and in response to the first input, control the electronic device to enter a portrait shooting mode.
In one embodiment, the processor 410 is further configured to: acquire a first image of the object captured by a target camera; and recognize the first image to obtain the feature information of the object reflecting the user's outfit style.
In one embodiment, the processor 410 is further configured to acquire the first image of the object captured by the target camera at different zoom magnifications.
In one embodiment, the processor 410 is further configured to: receive a second input on the shooting preview interface; and in response to the second input, shoot the object based on the shooting style filter.
In the embodiments of the application, on the one hand, the target outfit style information of the object is determined according to the preset outfit style recognition model and the feature information, reflecting the user's outfit style, of the object displayed in the shooting preview interface, and a shooting style filter matched with the target outfit style information is then recommended, which improves the accuracy of the shooting style recommendation. On the other hand, because the method starts from the user's outfit attribute feature information, a shooting style better suited to the user's personal image can be obtained, meeting the need for personalized, per-user shooting effects.
It should be understood that, in the embodiments of the present application, the input unit 404 may include a graphics processing unit (GPU) 4041 and a microphone 4042; the graphics processor 4041 processes image data of a still picture or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The display unit 406 may include a display panel 4061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 407 includes a touch panel 4071, also referred to as a touch screen, and other input devices 4072. The touch panel 4071 may include two parts: a touch detection device and a touch controller. The other input devices 4072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 409 may be used to store software programs and various data, including but not limited to application programs and an operating system. The processor 410 may integrate an application processor, which mainly handles the operating system, user interfaces, and applications, and a modem processor, which mainly handles wireless communication. It can be appreciated that the modem processor may alternatively not be integrated into the processor 410.
An embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium. When executed by a processor, the program or instruction implements each process of the foregoing shooting method embodiment and can achieve the same technical effect; details are not repeated here to avoid repetition.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
An embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface being coupled to the processor. The processor is configured to run a program or an instruction to implement each process of the foregoing shooting method embodiment and can achieve the same technical effect; details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed; the functions may also be performed in a substantially simultaneous manner or in a reverse order depending on the functions involved. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A shooting method, characterized in that the method comprises:
in a case that a shooting preview interface is displayed, obtaining feature information of an object displayed in the shooting preview interface, wherein the feature information reflects an outfit style of a user; the feature information comprises at least one of user attribute feature information and user outfit attribute feature information; and the user outfit attribute feature information comprises at least one of: upper-body outfit attribute feature information of the user, lower-body outfit attribute feature information of the user, and outfit type feature information of the user;
determining target outfit style information of the object according to the feature information and a preset outfit style recognition model;
and displaying a shooting style filter matched with the target outfit style information.
2. The method according to claim 1, wherein before the obtaining feature information of an object displayed in the shooting preview interface that reflects an outfit style of a user, the method further comprises:
receiving a first input to the shooting preview interface;
and in response to the first input, controlling the electronic device to enter a portrait shooting mode.
3. The method according to claim 1, wherein the obtaining feature information of an object displayed in the shooting preview interface that reflects an outfit style of a user comprises:
acquiring a first image of the object captured by a target camera;
and recognizing the first image to obtain the feature information of the object reflecting the outfit style of the user.
4. The method according to claim 3, wherein the acquiring a first image of the object captured by a target camera comprises:
acquiring the first image of the object captured by the target camera at different zoom magnifications.
5. The method according to claim 1, wherein after the displaying a shooting style filter matched with the target outfit style information, the method further comprises:
receiving a second input to the shooting preview interface;
and in response to the second input, shooting the object based on the shooting style filter.
6. A shooting apparatus, characterized in that the apparatus comprises:
the obtaining module is configured to obtain, in a case that a shooting preview interface is displayed, feature information of an object displayed in the shooting preview interface, wherein the feature information reflects an outfit style of a user; the feature information comprises at least one of user attribute feature information and user outfit attribute feature information; and the user outfit attribute feature information comprises at least one of: upper-body outfit attribute feature information of the user, lower-body outfit attribute feature information of the user, and outfit type feature information of the user;
the determining module is configured to determine target outfit style information of the object according to the feature information and a preset outfit style recognition model;
and the display module is configured to display a shooting style filter matched with the target outfit style information.
7. The apparatus of claim 6, further comprising a control module and a first receiving module:
the first receiving module is configured to receive a first input on the shooting preview interface;
and the control module is configured to, in response to the first input, control the electronic device to enter a portrait shooting mode.
8. The apparatus of claim 6, wherein the obtaining module is specifically configured to:
acquire a first image of the object captured by a target camera;
and recognize the first image to obtain the feature information of the object reflecting the outfit style of the user.
9. The apparatus of claim 8, wherein the obtaining module is specifically configured to:
acquiring a first image of the object acquired by the target camera under different zoom magnifications.
10. The apparatus according to claim 6, further comprising a second receiving module and a shooting module, wherein:
the second receiving module is configured to receive a second input on the shooting preview interface;
and the shooting module is configured to, in response to the second input, shoot the object based on the shooting style filter.
CN202111369179.4A 2021-11-18 2021-11-18 Shooting method and device Pending CN114125285A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111369179.4A CN114125285A (en) 2021-11-18 2021-11-18 Shooting method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111369179.4A CN114125285A (en) 2021-11-18 2021-11-18 Shooting method and device

Publications (1)

Publication Number Publication Date
CN114125285A true CN114125285A (en) 2022-03-01

Family

ID=80397596

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111369179.4A Pending CN114125285A (en) 2021-11-18 2021-11-18 Shooting method and device

Country Status (1)

Country Link
CN (1) CN114125285A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109117779A * 2018-08-06 2019-01-01 百度在线网络技术(北京)有限公司 Outfit recommendation method and apparatus, and electronic device
CN109462727A * 2018-11-23 2019-03-12 维沃移动通信有限公司 Filter adjustment method and mobile terminal
CN110139021A * 2018-02-09 2019-08-16 北京三星通信技术研究有限公司 Auxiliary shooting method and terminal device
CN112163930A * 2020-09-27 2021-01-01 深圳莱尔托特科技有限公司 Outfit matching recommendation method and system
CN112714257A * 2020-12-30 2021-04-27 维沃移动通信(杭州)有限公司 Display control method, display control device, electronic device, and medium
CN112714251A * 2020-12-24 2021-04-27 联想(北京)有限公司 Shooting method and shooting terminal



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination