CN111161133A - Picture processing method and electronic equipment - Google Patents


Info

Publication number
CN111161133A
Authority
CN
China
Prior art keywords
picture
target
filter mode
filter
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911367052.1A
Other languages
Chinese (zh)
Other versions
CN111161133B (en)
Inventor
穆锦云
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201911367052.1A priority Critical patent/CN111161133B/en
Publication of CN111161133A publication Critical patent/CN111161133A/en
Application granted granted Critical
Publication of CN111161133B publication Critical patent/CN111161133B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/04Context-preserving transformations, e.g. by using an importance map

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a picture processing method and an electronic device. The method includes: when a first picture is displayed on a target interface, identifying the clothing of a target object in the first picture; determining a target style type corresponding to the clothing; and determining, according to the target style type, a first filter mode corresponding to the first picture. The invention facilitates the determination of a filter mode and improves the efficiency of that determination.

Description

Picture processing method and electronic equipment
Technical Field
Embodiments of the invention relate to the field of communication technology, and in particular to a picture processing method and an electronic device.
Background
As photo culture becomes increasingly popular, more and more people take photos to commemorate daily life, work meetings, travel, and other activities. An electronic device may provide many filter modes from which the user selects a suitable one to enhance the display effect of a photo.
Currently, the user selects a filter mode by trying the modes one by one, which takes longer the more filter modes there are. Existing electronic devices therefore make it inconvenient for the user to select a filter mode.
Disclosure of Invention
Embodiments of the invention provide a picture processing method and an electronic device, aiming to solve the problem that electronic devices make it inconvenient for a user to select a filter mode.
To solve the above problem, the invention is realized as follows:
In a first aspect, an embodiment of the present invention provides a picture processing method, applied to an electronic device, the method including:
under the condition that a first picture is displayed on a target interface, identifying clothing of a target object in the first picture;
determining a target style type corresponding to the clothing;
and determining a first filter mode corresponding to the first picture according to the target style type.
In a second aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes:
the first identification module is used for identifying the clothing of a target object in a first picture under the condition that the first picture is displayed on a target interface;
the first determining module is used for determining the target style type corresponding to the clothing;
and the second determining module is used for determining a first filter mode corresponding to the first picture according to the target style type.
In a third aspect, an embodiment of the present invention further provides an electronic device, which includes a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the picture processing method described above.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when being executed by a processor, the computer program implements the steps of the picture processing method as described above.
In the embodiment of the invention, under the condition that a first picture is displayed on a target interface, clothes of a target object in the first picture are identified; determining a target style type corresponding to the clothing; and determining a first filter mode corresponding to the first picture according to the target style type. Therefore, the embodiment of the invention can determine the filter mode corresponding to the picture according to the clothes of the target object in the picture, thereby facilitating the determination of the filter mode and improving the determination efficiency of the filter mode.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 is a flowchart of a picture processing method according to an embodiment of the present invention;
FIG. 2 is a second flowchart of a picture processing method according to an embodiment of the present invention;
FIG. 3 is a block diagram of an electronic device according to an embodiment of the present invention;
fig. 4 is a second structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," and the like in this application are used to distinguish between similar elements and do not necessarily describe a particular sequence or chronological order. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, so that a process, method, system, article, or apparatus comprising a list of steps or elements is not necessarily limited to those expressly listed, but may include other steps or elements not expressly listed or inherent to it. Further, "and/or" as used herein means at least one of the connected objects; for example, "A and/or B and/or C" covers seven cases: A alone; B alone; C alone; A and B; B and C; A and C; and A, B, and C together.
Referring to fig. 1, fig. 1 is a structural diagram of a network system to which an embodiment of the present invention is applicable, and as shown in fig. 1, the network system includes a terminal 11 and a network-side device 12, where the terminal 11 and the network-side device 12 can communicate with each other.
The picture processing method of the embodiments of the invention can be applied to an electronic device. The electronic device may be a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device, or the like.
The following describes a picture processing method according to an embodiment of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of a picture processing method according to an embodiment of the present invention. As shown in fig. 1, the picture processing method of the present embodiment may include the following steps:
step 101, identifying the clothing of a target object in a first picture under the condition that the first picture is displayed on a target interface.
In a specific implementation, the target interface may be, but is not limited to, a photo preview interface, or a photo editing interface.
The target object may be, but is not limited to, the image of a person in the picture. When the first picture contains two or more objects, the target object may be, without limitation, any one of the following: an arbitrary object in the first picture; the object occupying the largest proportion of the first picture; or an object selected by the user.
The clothing may include tops, pants, skirts, and the like, and may further include accessories such as shoes and hats.
And 102, determining the target style type corresponding to the clothing.
In this embodiment, the electronic device may store a database in which the clothing pictures of a plurality of style types are collected.
For example, the clothing pictures in the database may cover the following categories: Japanese style, Korean style, European and American style, Chinese style, hip-hop style, sports style, and so on. Further, Japanese style may include substyles such as two-dimensional (anime), Mori (forest), neutral, professional, and Harajuku; Korean style may include fresh, cool, vintage, and co-branded; European and American style may include professional, fashionable, sexy, and vintage; Chinese style may include Hanfu and national-trend wear.
After identifying the clothing, the electronic device may extract the pixel information of the clothing from the first picture and generate a target clothing picture from that pixel information.
The electronic device may then compare the target clothing picture with the clothing pictures in the database to obtain its degree of match with each clothing picture, and determine the style type of the database picture with the highest matching degree as the target style type corresponding to the clothing.
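The matching described here can be sketched as a nearest-neighbour search over feature vectors extracted from the clothing pictures. The feature representation, the `match_style` helper, and the toy database below are illustrative assumptions, not the patent's actual implementation:

```python
import math

def cosine_similarity(a, b):
    """Matching degree between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def match_style(target_features, database):
    """database: list of (style_type, feature_vector) entries.
    Returns the style type whose picture matches best, plus its score."""
    best_style, best_score = None, -1.0
    for style, feats in database:
        score = cosine_similarity(target_features, feats)
        if score > best_score:
            best_style, best_score = style, score
    return best_style, best_score
```

In practice the feature vectors would come from an image-recognition model rather than being hand-built.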
And 103, determining a first filter mode corresponding to the first picture according to the target style type.
In this embodiment, the electronic device may pre-store a correspondence between style types and filter modes. In this correspondence, a style type may map to exactly one filter mode (one-to-one) or to several filter modes (one-to-many); the choice depends on actual requirements, and the embodiments of the invention do not limit it. For example, style type 1 may correspond to only one filter mode, while style type 2 corresponds to three filter modes.
In practical applications, the filter mode corresponding to each style type can be preset. For example, the Korean street style suits a high-contrast, high-saturation filter effect, so it can be mapped to a filter mode whose tones have high contrast and high saturation; the Japanese Mori (forest) style suits a low-contrast, low-saturation effect, so it can be mapped to a filter mode with low contrast and low saturation.
It should be noted that, in this embodiment, when a style type corresponds to two or more filter modes, a filter library containing those filter modes may be provided for that style type.
After the electronic device determines the target style type, the filter mode corresponding to the target style type can be determined by searching the corresponding relationship.
When the number of filter modes corresponding to the target style type is 1, that filter mode is directly determined as the first filter mode.
When the number of filter modes corresponding to the target style type is greater than 1, one of them may be selected as the first filter mode.
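The correspondence and the selection rule in the paragraphs above can be sketched as a simple mapping; the style and filter-mode names are hypothetical:

```python
# Hypothetical correspondence between style types and filter modes.
STYLE_TO_FILTERS = {
    "sports": ["vivid"],                                             # one-to-one
    "korean_street": ["high_contrast", "high_saturation", "retro"],  # one-to-many
}

def first_filter_mode(style_type, chooser=None):
    """Return the first filter mode for a style type: the sole mode when the
    correspondence is one-to-one, otherwise one chosen from the candidates
    (defaulting to the first)."""
    modes = STYLE_TO_FILTERS.get(style_type, [])
    if not modes:
        return None
    if len(modes) == 1:
        return modes[0]
    return (chooser or (lambda ms: ms[0]))(modes)
```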
In this embodiment, after the electronic device determines the first filter mode corresponding to the first picture, in one implementation it may directly perform filter processing on the first picture in the first filter mode to obtain an effect picture; in another implementation, it may display a control corresponding to the first filter mode for the user to select. The choice between these implementations can be made according to actual requirements, and the embodiments of the invention do not limit it.
In the picture processing method of the embodiment, under the condition that a first picture is displayed on a target interface, clothes of a target object in the first picture are identified; determining a target style type corresponding to the clothing; and determining a first filter mode corresponding to the first picture according to the target style type. Therefore, the embodiment of the invention can determine the filter mode corresponding to the picture according to the clothes of the target object in the picture, thereby facilitating the determination of the filter mode and improving the determination efficiency of the filter mode.
In this embodiment, optionally, before determining the first filter mode corresponding to the first picture according to the target style type, the method further includes:
acquiring first parameter information of the garment, wherein the first parameter information comprises at least one of hue information, brightness information and saturation information;
determining a first filter mode corresponding to the first picture according to the target style type specifically includes:
determining a target filter library corresponding to the target style type;
and selecting a first filter mode corresponding to the first picture from the target filter library according to the first parameter information.
In this embodiment, the hue of the clothing may optionally be classified as cool or warm; its lightness as low or high; and its saturation as low or high.
Accordingly, the filter library may include at least one of the following filter modes:
a filter mode for a first target picture, whose hue is cool, lightness is low, and saturation is low;
a filter mode for a second target picture, whose hue is cool, lightness is low, and saturation is high;
a filter mode for a third target picture, whose hue is cool, lightness is high, and saturation is low;
a filter mode for a fourth target picture, whose hue is cool, lightness is high, and saturation is high;
a filter mode for a fifth target picture, whose hue is warm, lightness is low, and saturation is low;
a filter mode for a sixth target picture, whose hue is warm, lightness is low, and saturation is high;
a filter mode for a seventh target picture, whose hue is warm, lightness is high, and saturation is low;
a filter mode for an eighth target picture, whose hue is warm, lightness is high, and saturation is high;
where the style type corresponding to each of the first through eighth target pictures is the style type corresponding to the filter library.
In specific implementation, the electronic device may determine, according to the first parameter information, a target picture corresponding to the first picture; and determining one filter mode in the filter modes corresponding to the target picture in the target filter library as the first filter mode.
The acquired first parameter information indicates the specific form of at least one of the hue, lightness, and saturation of the target object's clothing, from which the relationship between the first picture and the target pictures can be determined. For example, if the clothing has a cool hue, low lightness, and low saturation, the first picture corresponds to the first target picture, and one of the filter modes corresponding to the first target picture is determined as the first filter mode.
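The eight (hue, lightness, saturation) buckets enumerated above can be modelled as keys into the target filter library; the generated mode names below are placeholders:

```python
from itertools import product

# One bucket per combination of hue (cool/warm), lightness (low/high),
# and saturation (low/high) -- eight in total, matching the eight target
# pictures described above. Mode names are placeholders.
FILTER_LIBRARY = {
    (hue, light, sat): [f"{hue}_{light}L_{sat}S"]
    for hue, light, sat in product(("cool", "warm"), ("low", "high"), ("low", "high"))
}

def select_first_filter(library, hue, lightness, saturation):
    """Pick one filter mode from the bucket matching the garment's first
    parameter information (here simply the first candidate)."""
    return library[(hue, lightness, saturation)][0]
```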
In this way, when the number of filter modes corresponding to a style type is greater than 1, the electronic device can further use the first parameter information of the clothing to screen the first filter mode for the first picture out of those filter modes, improving the reliability of the determined filter mode.
In this embodiment, optionally, after determining the first filter mode corresponding to the first picture according to the target style type, the method further includes:
displaying a second picture on the target interface, wherein the second picture is an effect picture obtained by filtering the first picture in a target filter mode;
wherein the target filter mode is: the first filter mode; or, a second filter mode selected by the user.
In a specific implementation, optionally, after determining the first filter mode, the electronic device may include the following embodiments:
in the first embodiment, after determining the first filter mode, the electronic device directly performs filter processing on the first picture by using the first filter mode.
In the first embodiment, after the electronic device performs filter processing on the first picture in the first filter mode, it may further detect whether the user performs a touch operation on a control corresponding to a second filter mode other than the first filter mode.
And if the touch operation of the user on the control corresponding to the second filter mode is detected, the electronic equipment performs filter processing on the first picture by adopting the second filter mode, and in this case, the second picture is an effect picture obtained by performing filter processing on the first picture by adopting the second filter mode.
If the touch operation of the user on the second filter mode is not detected, in this case, the second picture is an effect picture obtained by filtering the first picture by using the first filter mode.
In a second embodiment, after determining the first filter mode, the electronic device may display a control corresponding to the first filter mode, and temporarily does not adopt the first filter mode to perform filter processing on the first picture. In the second embodiment, the first filter mode may be regarded as a filter mode recommended by the electronic device.
If the touch operation of the user on the control corresponding to the first filter mode is detected, the first filter mode can be adopted to perform filter processing on the first picture, and in this case, the second picture is an effect picture obtained by performing filter processing on the first picture by adopting the first filter mode.
If the touch operation of the user on the control corresponding to the second filter mode is detected, the second filter mode can be adopted to perform filter processing on the first picture, and in this case, the second picture is an effect picture obtained by performing filter processing on the first picture by adopting the second filter mode.
It should be understood that the user can switch the filter mode acting on the first picture according to his own needs.
Therefore, by the mode, the electronic equipment can adopt the target filter mode to carry out filter processing on the first picture to obtain the effect picture, and therefore the display effect of the picture can be improved.
In this embodiment, optionally, after displaying the second picture, the method further includes:
identifying color information of the garment;
and adjusting second parameter information of the second picture when, in the color information, the proportion of a target color is greater than a first value, where the second parameter information includes at least one of brightness information and saturation information.
In specific implementations, the color information may include the number of colors in the clothing and the proportion of each color. The electronic device can thus determine from the color information whether the proportion of any target color exceeds the first value, which may be set according to actual needs.
For example, assume the clothing contains two colors, red (80%) and yellow (20%), and the first value is 30%. The proportion of red exceeds the first value, so red is the target color and the electronic device adjusts the second parameter information of the second picture.
In practical applications, the specific expression form of the second parameter information may be set according to actual requirements, and the adjustment direction of the second parameter information may also be set according to actual requirements, which is not limited in the embodiment of the present invention.
For example, in a case where the percentage of the target color in the color information is greater than the first value, the electronic device may increase the saturation of the second picture by 5% and increase the brightness by 5%, thereby enhancing the expressiveness of the target color.
Optionally, the adjustment direction of the second parameter information may be related to the specific expression form of the target color.
Specifically, in the case where the target color belongs to a warm tone, the adjustment direction of the second parameter information may be a positive direction, that is, at least one of the lightness and the saturation of the second picture is increased; in case the target color belongs to a cool hue, the adjustment direction of the second parameter information may be a negative direction, i.e. decreasing at least one of the lightness and the saturation of the second picture.
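A minimal sketch of this adjustment rule, assuming a fixed warm-colour set, a 30% first value, and a 5% step (the 30% and 5% figures come from this document's examples; the function and colour names are otherwise hypothetical):

```python
WARM_COLORS = {"red", "orange", "yellow"}  # assumed warm-tone colour names

def adjust_second_picture(params, color_ratios, first_value=0.30, step=0.05):
    """params: {'lightness': .., 'saturation': ..} with values in [0, 1].
    color_ratios: proportion of each colour in the clothing, e.g. {'red': 0.8}.
    If a colour's proportion exceeds first_value, nudge both parameters up
    (warm tone) or down (cool tone) by `step`, clamped to [0, 1]."""
    target = max(color_ratios, key=color_ratios.get)
    if color_ratios[target] <= first_value:
        return dict(params)  # no dominant colour: leave the picture unchanged
    sign = 1 if target in WARM_COLORS else -1
    clamp = lambda v: min(max(v, 0.0), 1.0)
    return {
        "lightness": clamp(params["lightness"] + sign * step),
        "saturation": clamp(params["saturation"] + sign * step),
    }
```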
Therefore, by the mode, the electronic equipment can further adjust the parameter information of the second picture according to the color information of the garment, so that the display effect of the second picture can be further improved.
In this embodiment, optionally, when the target filter mode is the second filter mode, after the displaying the second picture, the method further includes:
under the condition that a third picture is displayed on the target interface, determining whether the style type corresponding to the clothing of the third picture is the target style type;
and displaying a fourth picture on the target interface under the condition that the style type corresponding to the clothing of the third picture is the target style type, wherein the fourth picture is an effect picture obtained by filtering the third picture in the second filter mode.
In this embodiment, if the user still selects the second filter mode to process the first picture after the first filter mode has been determined, this indicates that the user prefers the second filter mode for pictures containing clothing of the target style type. Therefore, when the style type corresponding to the clothing in the third picture is the target style type, the second filter mode can be applied to the third picture to obtain the effect picture.
Further, the electronic device may count how many times the second filter mode has been applied to pictures containing clothing of the target style type, and apply the second filter mode to the third picture only when this count reaches a preset value and the style type corresponding to the clothing in the third picture is the target style type.
Thus, the fourth picture displayed in this manner better matches the user's preferences.
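The counting behaviour described above might look like the following sketch; the class name and threshold default are assumptions:

```python
from collections import defaultdict

class FilterPreferenceTracker:
    """Records each time the user applies a filter mode to a picture of a given
    style type; once a (style, mode) count reaches `threshold`, that mode is
    reported as the user's preference for later pictures of the same style."""

    def __init__(self, threshold=3):
        self.threshold = threshold
        self.counts = defaultdict(int)

    def record(self, style_type, filter_mode):
        self.counts[(style_type, filter_mode)] += 1

    def preferred_mode(self, style_type):
        best = None
        for (style, mode), n in self.counts.items():
            if style == style_type and n >= self.threshold:
                if best is None or n > self.counts[(style, best)]:
                    best = mode
        return best
```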
It should be noted that, various optional implementations described in the embodiments of the present invention may be implemented in combination with each other or separately without conflict between the implementations, and the embodiments of the present invention are not limited in this respect.
For ease of understanding, reference is made to fig. 2 for illustration as follows:
as shown in fig. 2, the image processing method of the present embodiment may include the steps of:
step 201, preparing a database according to the style classification.
1) First, a large amount of clothing information is collected from the network according to a clothing-style classification scheme and entered into the database, ensuring abundant clothing-picture data under each style category.
2) Then, a large number of filter resources are prepared for the corresponding categories according to the clothing-style classification scheme and a color-tone classification scheme, ensuring that every tonal branch of each style category has a corresponding set of filters to choose from.
In specific implementations, designers can design a large number of filter tones for these clothing styles in combination with tonal styles, and enter the look-up tables (LUTs) used to render the filters into the database.
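Production filter LUTs are usually 3D colour LUTs; as an illustrative stand-in, the sketch below builds a 256-entry per-channel LUT that raises contrast around mid-grey and applies it to RGB pixels:

```python
def make_contrast_lut(gain=1.2):
    """A 256-entry per-channel LUT boosting contrast around mid-grey (128);
    an illustrative stand-in for a designer-authored filter LUT."""
    lut = []
    for v in range(256):
        out = 128 + (v - 128) * gain
        lut.append(int(min(max(out, 0), 255)))  # clamp to valid channel range
    return lut

def apply_lut(pixels, lut):
    """pixels: iterable of (r, g, b) tuples with 0-255 channel values."""
    return [(lut[r], lut[g], lut[b]) for r, g, b in pixels]
```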
Step 202, identifying the clothes style classification of the user.
The user's clothing is first identified with intelligent image recognition, and its information is extracted. The clothing picture is then compared with the large body of collected, classified picture data in the database to find the best-matching database picture, from which the style classification of the clothing is determined and mapped to the corresponding classified filter library. For example, if the user wears a sports suit, the sports style is matched, and the corresponding filter library contains filters suited to shooting sports-style photos.
Step 203, screening out a filter in combination with the overall tone.
The result of step 202 is analyzed a second time: the color, hue, lightness, and saturation of the identified clothing picture are extracted, and the best-matching filter is screened from the filter library according to the color-tone classification scheme. For example, if the clothing has a cool tone with low lightness and low saturation, it is matched with a low-lightness, low-saturation filter of the cool color family.
Step 204, enhancing color expressiveness according to the colors of the clothing.
Based on the clothing picture identified in step 202, it is analyzed whether any single color occupies more than 30% of the picture; if so, the saturation and lightness are each increased by 5% on top of the applied filter, enhancing the expressiveness of that color.
Step 205, continuously recording the filter information selected by the user.
After a filter has been matched for the user, every instance of the user actively selecting a filter for shooting is recorded as part of the user's usage habits, which are then taken into account as a reference when matching filters. This continuously improves the accuracy with which filters are matched to the user.
On one hand, the picture processing method of the embodiments of the invention can intelligently recommend filters according to clothing style, helping the user complete the shooting task efficiently and improving user satisfaction with filter use and shooting. On the other hand, it enriches the extensibility of filters: when designing the filter module, designers need not agonize over which filters to show the user or limit how many filters they design, since most filters are not shown to the user at all and are recommended only when suitable, reducing the user's cognitive cost.
Referring to fig. 3, fig. 3 is a block diagram of an electronic device according to an embodiment of the present invention. As shown in fig. 3, the electronic device 300 includes:
the first identification module 301 is configured to identify clothing of a target object in a first picture under the condition that the first picture is displayed on a target interface;
a first determining module 302, configured to determine a target style type corresponding to the clothing;
a second determining module 303, configured to determine, according to the target style type, a first filter mode corresponding to the first picture.
Optionally, the electronic device 300 further includes:
an obtaining module, configured to obtain first parameter information of the garment before determining, according to the target style type, a first filter mode corresponding to the first picture, where the first parameter information includes at least one of hue information, brightness information, and saturation information;
the second determining module 303 includes:
a determining unit, configured to determine a target filter library corresponding to the target style type;
and a selection unit, configured to select a first filter mode corresponding to the first picture from the target filter library according to the first parameter information.
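The module pipeline above (identify clothing, determine its style type, pick the filter library for that style, then select within the library by hue/brightness/saturation) can be sketched end to end. The style names, library contents, classifier stub, and selection rule are all illustrative assumptions standing in for modules 301–303.

```python
# Hypothetical end-to-end sketch of the pipeline: style type -> target
# filter library -> filter selected by the clothing's parameter information.
STYLE_LIBRARIES = {
    "sporty": ["fresh", "bright"],
    "formal": ["classic", "mono"],
}


def determine_style(clothing: dict) -> str:
    """Stand-in for the style classifier of the first determining module."""
    return "formal" if clothing.get("has_suit") else "sporty"


def select_filter(clothing: dict, params: dict) -> str:
    """Pick a filter from the style's library using parameter information."""
    style = determine_style(clothing)
    library = STYLE_LIBRARIES[style]
    # Illustrative rule: low-brightness clothing gets the subdued option.
    return library[1] if params.get("brightness", 1.0) < 0.5 else library[0]
```

The point of the two-stage lookup is that the style type narrows the search to a small, themed library before parameter information refines the choice within it.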
Optionally, the electronic device 300 further includes:
a first display module, configured to display a second picture after the first filter mode corresponding to the first picture is determined according to the target style type, where the second picture is an effect picture obtained by applying the target filter mode to the first picture;
wherein the target filter mode is: the first filter mode; or, a second filter mode selected by the user.
Optionally, the electronic device 300 further includes:
a second identification module, configured to identify color information of the garment after the second picture is displayed;
an adjusting module, configured to adjust second parameter information of the second picture when the occupancy of the target color is greater than a first value in the color information, where the second parameter information includes at least one of brightness information and saturation information.
Optionally, in a case that the target filter mode is the second filter mode, the electronic device 300 further includes:
a third determining module, configured to determine, after the second picture is displayed, whether a style type corresponding to clothing of a third picture is the target style type when the third picture is displayed on the target interface;
and a second display module, configured to display a fourth picture under the condition that the style type corresponding to the clothing of the third picture is the target style type, where the fourth picture is an effect picture obtained by applying the second filter mode to the third picture.
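The reuse logic above can be sketched briefly: once the user has manually chosen a second filter mode for a given style, later pictures of that same style receive it automatically, and other styles fall back to the recommendation. The data shapes (a style-to-filter mapping and a recommender callback) are illustrative assumptions.

```python
# Sketch of reusing the user's second filter mode for same-style pictures.
from typing import Callable, Dict


def filter_for_new_picture(new_style: str,
                           remembered: Dict[str, str],
                           recommend: Callable[[str], str]) -> str:
    """remembered maps style type -> filter mode the user chose earlier."""
    if new_style in remembered:
        return remembered[new_style]  # reuse the user's earlier choice
    return recommend(new_style)       # otherwise fall back to recommendation
```

This mirrors the third-picture/fourth-picture behavior: a matching style type short-circuits recommendation in favor of the user's remembered choice.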
The electronic device 300 can implement the processes in the method embodiment of the present invention and achieve the same beneficial effects, and is not described herein again to avoid repetition.
Referring to fig. 4, fig. 4 is a schematic diagram of a hardware structure of an electronic device for implementing various embodiments of the present invention. As shown in fig. 4, the electronic device 400 includes, but is not limited to: a radio frequency unit 401, a network module 402, an audio output unit 403, an input unit 404, a sensor 405, a display unit 406, a user input unit 407, an interface unit 408, a memory 409, a processor 410, and a power supply 411. Those skilled in the art will appreciate that the electronic device structure shown in fig. 4 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than shown, combine some components, or arrange components differently. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
Wherein, the processor 410 is configured to:
under the condition that a first picture is displayed on a target interface, identifying clothing of a target object in the first picture;
determining a target style type corresponding to the clothing;
and determining a first filter mode corresponding to the first picture according to the target style type.
Optionally, the processor 410 is further configured to:
acquiring first parameter information of the garment, wherein the first parameter information comprises at least one of hue information, brightness information and saturation information;
determining a target filter library corresponding to the target style type;
and selecting a first filter mode corresponding to the first picture from the target filter library according to the first parameter information.
Optionally, the processor 410 is further configured to:
displaying a second picture on the target interface through a display unit 406, wherein the second picture is an effect picture obtained by filtering the first picture in a target filter mode;
wherein the target filter mode is: the first filter mode; or, a second filter mode selected by the user.
Optionally, the processor 410 is further configured to:
identifying color information of the garment;
and adjusting second parameter information of the second picture when the occupation ratio of the target color is larger than a first value in the color information, wherein the second parameter information comprises at least one of brightness information and saturation information.
Optionally, the processor 410 is further configured to:
under the condition that a third picture is displayed on the target interface, determining whether the style type corresponding to the clothing of the third picture is the target style type;
and under the condition that the style type corresponding to the clothing of the third picture is the target style type, displaying a fourth picture on the target interface through a display unit 406, wherein the fourth picture is an effect picture obtained by filtering the third picture in the second filter mode.
It should be noted that, in this embodiment, the electronic device 400 may implement each process in the method embodiment of the present invention and achieve the same beneficial effects, and for avoiding repetition, details are not described here.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 401 may be used for receiving and sending signals during message transceiving or a call. Specifically, it receives downlink data from a base station and forwards the data to the processor 410 for processing, and it sends uplink data to the base station. Typically, the radio frequency unit 401 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 401 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 402, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 403 may convert audio data received by the radio frequency unit 401 or the network module 402, or stored in the memory 409, into an audio signal and output it as sound. Moreover, the audio output unit 403 may also provide audio output related to a specific function performed by the electronic device 400 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 403 includes a speaker, a buzzer, a receiver, and the like.
The input unit 404 is used to receive audio or video signals. The input unit 404 may include a graphics processing unit (GPU) 4041 and a microphone 4042; the graphics processor 4041 processes picture data of a still picture or video obtained by a picture capturing device (e.g., a camera) in a video capturing mode or a picture capturing mode. The processed picture frame may be displayed on the display unit 406. The picture frame processed by the graphics processor 4041 may be stored in the memory 409 (or other storage medium) or transmitted via the radio frequency unit 401 or the network module 402. The microphone 4042 may receive sound and process it into audio data. In the phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 401 and output.
The electronic device 400 also includes at least one sensor 405, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 4061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 4061 and/or the backlight when the electronic apparatus 400 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 405 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be described in detail herein.
The display unit 406 is used to display information input by the user or information provided to the user. The Display unit 406 may include a Display panel 4061, and the Display panel 4061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 407 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 407 includes a touch panel 4071 and other input devices 4072. Touch panel 4071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near touch panel 4071 using a finger, a stylus, or any suitable object or attachment). The touch panel 4071 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 410, receives a command from the processor 410, and executes the command. In addition, the touch panel 4071 can be implemented by using various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 4071, the user input unit 407 may include other input devices 4072. Specifically, the other input devices 4072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a track ball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 4071 can be overlaid on the display panel 4061, and when the touch panel 4071 detects a touch operation thereon or nearby, the touch operation is transmitted to the processor 410 to determine the type of the touch event, and then the processor 410 provides a corresponding visual output on the display panel 4061 according to the type of the touch event. Although in fig. 4, the touch panel 4071 and the display panel 4061 are two independent components to implement the input and output functions of the electronic device, in some embodiments, the touch panel 4071 and the display panel 4061 may be integrated to implement the input and output functions of the electronic device, and the implementation is not limited herein.
The interface unit 408 is an interface for connecting an external device to the electronic apparatus 400. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 408 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 400 or may be used to transmit data between the electronic apparatus 400 and an external device.
The memory 409 may be used to store software programs as well as various data. The memory 409 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, a picture playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 409 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 410 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by operating or executing software programs and/or modules stored in the memory 409 and calling data stored in the memory 409, thereby performing overall monitoring of the electronic device. Processor 410 may include one or more processing units; preferably, the processor 410 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 410.
The electronic device 400 may further include a power supply 411 (e.g., a battery) for supplying power to various components, and preferably, the power supply 411 may be logically connected to the processor 410 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the electronic device 400 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides an electronic device, including a processor 410, a memory 409, and a computer program stored in the memory 409 and executable on the processor 410. When executed by the processor 410, the computer program implements each process of the above picture processing method embodiment and achieves the same technical effects, which are not described herein again to avoid repetition.
An embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements each process of the above picture processing method embodiment and achieves the same technical effects, which are not described herein again to avoid repetition. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A picture processing method is applied to electronic equipment, and is characterized by comprising the following steps:
under the condition that a first picture is displayed on a target interface, identifying clothing of a target object in the first picture;
determining a target style type corresponding to the clothing;
and determining a first filter mode corresponding to the first picture according to the target style type.
2. The method of claim 1, wherein prior to determining the first filter pattern corresponding to the first picture according to the target style type, the method further comprises:
acquiring first parameter information of the garment, wherein the first parameter information comprises at least one of hue information, brightness information and saturation information;
determining a first filter mode corresponding to the first picture according to the target style type specifically includes:
determining a target filter library corresponding to the target style type;
and selecting a first filter mode corresponding to the first picture from the target filter library according to the first parameter information.
3. The method of claim 1, wherein after determining the first filter pattern corresponding to the first picture according to the target style type, the method further comprises:
displaying a second picture on the target interface, wherein the second picture is an effect picture obtained by filtering the first picture in a target filter mode;
wherein the target filter mode is: the first filter mode; or, a second filter mode selected by the user.
4. The method of claim 3, wherein after the displaying the second picture, the method further comprises:
identifying color information of the garment;
and adjusting second parameter information of the second picture when the occupation ratio of the target color is larger than a first value in the color information, wherein the second parameter information comprises at least one of brightness information and saturation information.
5. The method of claim 3, wherein after the displaying the second picture if the target filter mode is the second filter mode, the method further comprises:
under the condition that a third picture is displayed on the target interface, determining whether the style type corresponding to the clothing of the third picture is the target style type;
and displaying a fourth picture on the target interface under the condition that the style type corresponding to the clothing of the third picture is the target style type, wherein the fourth picture is an effect picture obtained by filtering the third picture in the second filter mode.
6. An electronic device, characterized in that the electronic device comprises:
the first identification module is used for identifying the clothing of a target object in a first picture under the condition that the first picture is displayed on a target interface;
the first determining module is used for determining the target style type corresponding to the clothing;
and the second determining module is used for determining a first filter mode corresponding to the first picture according to the target style type.
7. The electronic device of claim 6, further comprising:
an obtaining module, configured to obtain first parameter information of the garment before determining, according to the target style type, a first filter mode corresponding to the first picture, where the first parameter information includes at least one of hue information, brightness information, and saturation information;
the second determining module includes:
the determining unit is used for determining a target filter library corresponding to the target style type;
and the selection unit is used for selecting a first filter mode corresponding to the first picture from the target filter library according to the first parameter information.
8. The electronic device of claim 6, further comprising:
the first display module is used for displaying a second picture after determining a first filter mode corresponding to the first picture according to the target style type, wherein the second picture is an effect picture obtained by filtering the first picture by adopting the target filter mode;
wherein the target filter mode is: the first filter mode; or, a second filter mode selected by the user.
9. The electronic device of claim 8, further comprising:
the second identification module is used for identifying the color information of the garment after the second picture is displayed;
an adjusting module, configured to adjust second parameter information of the second picture when the occupancy of the target color is greater than a first value in the color information, where the second parameter information includes at least one of brightness information and saturation information.
10. The electronic device according to claim 8, wherein in a case where the target filter mode is the second filter mode, the electronic device further comprises:
a third determining module, configured to determine, after the second picture is displayed, whether a style type corresponding to clothing of a third picture is the target style type when the third picture is displayed on the target interface;
and the second display module is used for displaying a fourth picture under the condition that the style type corresponding to the clothing of the third picture is the target style type, wherein the fourth picture is an effect picture obtained by filtering the third picture by adopting the second filter mode.
CN201911367052.1A 2019-12-26 2019-12-26 Picture processing method and electronic equipment Active CN111161133B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911367052.1A CN111161133B (en) 2019-12-26 2019-12-26 Picture processing method and electronic equipment


Publications (2)

Publication Number Publication Date
CN111161133A true CN111161133A (en) 2020-05-15
CN111161133B CN111161133B (en) 2023-07-04

Family

ID=70558469

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911367052.1A Active CN111161133B (en) 2019-12-26 2019-12-26 Picture processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN111161133B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115249221A (en) * 2022-09-23 2022-10-28 阿里巴巴(中国)有限公司 Image processing method and device and cloud equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130343615A1 (en) * 2012-06-20 2013-12-26 Tong Zhang Identifying a style of clothing based on an ascertained feature
US20160203586A1 (en) * 2015-01-09 2016-07-14 Snapchat, Inc. Object recognition based photo filters
CN106657810A (en) * 2016-09-26 2017-05-10 维沃移动通信有限公司 Filter processing method and device for video image
CN107071299A (en) * 2017-04-28 2017-08-18 珠海市魅族科技有限公司 Information processing method and device, computer installation and storage medium
WO2018127091A1 (en) * 2017-01-09 2018-07-12 腾讯科技(深圳)有限公司 Image processing method and apparatus, relevant device and server
CN109068056A (en) * 2018-08-17 2018-12-21 Oppo广东移动通信有限公司 A kind of electronic equipment and its filter processing method of shooting image, storage medium
CN110084154A (en) * 2019-04-12 2019-08-02 北京字节跳动网络技术有限公司 Render method, apparatus, electronic equipment and the computer readable storage medium of image
CN110298283A (en) * 2019-06-21 2019-10-01 北京百度网讯科技有限公司 Matching process, device, equipment and the storage medium of picture material



Also Published As

Publication number Publication date
CN111161133B (en) 2023-07-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant