CN108615013B - Image processing method and device - Google Patents
- Publication number
- CN108615013B (application CN201810393773.9A)
- Authority
- CN
- China
- Prior art keywords
- target
- target person
- hair style
- characteristic parameters
- proportion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/165—Detection; Localisation; Normalisation using facial parts and geometric relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Geometry (AREA)
- Evolutionary Biology (AREA)
- General Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Processing Or Creating Images (AREA)
- Image Processing (AREA)
Abstract
The present disclosure relates to an image processing method and apparatus. The method comprises: acquiring head feature parameters of a target image, wherein the head feature parameters comprise at least one of facial feature parameters and hair style feature parameters, the facial feature parameters being used to indicate the facial features of a target person in the target image and the hair style feature parameters being used to indicate the hair style features of the target person; and determining target apparel matching the target person according to the head feature parameters. With this technical scheme, apparel that matches the facial features or hair style of the target person can be identified as suitable for the target person without the target person having to try the apparel on, thereby improving the user experience.
Description
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method and apparatus.
Background
With the development of science and technology, the variety and quantity of apparel that people wear in daily life have increased. Generally, when users want to know whether a garment matches them, they need to view an image of themselves wearing the garment to determine whether it suits them.
Disclosure of Invention
To overcome the problems in the related art, embodiments of the present disclosure provide an image processing method and apparatus. The technical scheme is as follows:
according to a first aspect of embodiments of the present disclosure, there is provided an image processing method including:
acquiring head feature parameters of a target image, wherein the head feature parameters comprise at least one of facial feature parameters and hair style feature parameters, the facial feature parameters being used to indicate the facial features of a target person in the target image, and the hair style feature parameters being used to indicate the hair style features of the target person;
and determining target apparel matching the target person according to the head feature parameters.
According to this technical scheme, the head feature parameters of the target image are acquired and target apparel matching the target person is determined from them, so that the target apparel matches the facial features or hair style of the target person. Apparel suitable for the target person can therefore be identified without the target person having to try it on, improving the user experience.
In one embodiment, the method further comprises:
and displaying the target apparel according to the position of the target person.
In one embodiment, determining target apparel matching the target person according to the head feature parameters comprises:
acquiring an apparel color system parameter according to the head feature parameters, wherein the apparel color system parameter is used to indicate the color system to which the color of the target apparel belongs;
and determining the target apparel according to the apparel color system parameter.
In one embodiment, determining target apparel matching the target person according to the head feature parameters comprises:
when the head feature parameters comprise hair style feature parameters, acquiring a hair-style face ratio of the target person according to the hair style feature parameters, wherein the hair-style face ratio is used to indicate the proportion of the target person's face that is covered by the target person's hair;
and determining the target apparel according to the hair-style face ratio of the target person.
In one embodiment, determining target apparel matching the target person according to the head feature parameters comprises:
when the head feature parameters comprise facial feature parameters, acquiring a facial feature proportion of the target person according to the facial feature parameters, wherein the facial feature proportion is used to indicate the proportion of the target person's face occupied by the facial features;
and determining the target apparel according to the facial feature proportion of the target person.
According to a second aspect of an embodiment of the present disclosure, there is provided an image processing apparatus including:
a head feature acquisition module, configured to acquire head feature parameters of a target image, wherein the head feature parameters comprise at least one of facial feature parameters and hair style feature parameters, the facial feature parameters being used to indicate the facial features of a target person in the target image, and the hair style feature parameters being used to indicate the hair style features of the target person;
and an apparel matching module, configured to determine target apparel matching the target person according to the head feature parameters.
In one embodiment, the apparatus further comprises:
an apparel image acquisition module, configured to acquire an image of the target apparel;
and an apparel display module, configured to display the image of the target apparel according to the position of the target person.
In one embodiment, the apparel matching module comprises:
an apparel color system obtaining sub-module, configured to obtain an apparel color system parameter according to the head feature parameters, the apparel color system parameter being used to indicate the color system to which the color of the target apparel belongs;
and a first apparel matching sub-module, configured to determine the target apparel according to the apparel color system parameter.
In one embodiment, the apparel matching module comprises:
a hair-style face ratio obtaining sub-module, configured to, when the head feature parameters include hair style feature parameters, obtain the hair-style face ratio of the target person according to the hair style feature parameters, the hair-style face ratio being used to indicate the proportion of the target person's face that is covered by the target person's hair;
and a second apparel matching sub-module, configured to determine the target apparel according to the hair-style face ratio of the target person.
In one embodiment, the apparel matching module comprises:
a facial feature proportion obtaining sub-module, configured to, when the head feature parameters include facial feature parameters, obtain the facial feature proportion of the target person according to the facial feature parameters, the facial feature proportion being used to indicate the proportion of the target person's face occupied by the facial features;
and a third apparel matching sub-module, configured to determine the target apparel according to the facial feature proportion of the target person.
According to a third aspect of an embodiment of the present disclosure, there is provided an image processing apparatus including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring head feature parameters of a target image, wherein the head feature parameters comprise at least one of facial feature parameters and hair style feature parameters, the facial feature parameters being used to indicate the facial features of a target person in the target image, and the hair style feature parameters being used to indicate the hair style features of the target person;
and determining target apparel matching the target person according to the head feature parameters.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer instructions, characterized in that the instructions, when executed by a processor, implement the steps of the method of any one of the first aspect of the embodiments of the present disclosure.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1a is a flowchart 1 of an image processing method according to an exemplary embodiment;
FIG. 1b is a flowchart 2 of an image processing method according to an exemplary embodiment;
FIG. 1c is a flowchart 3 of an image processing method according to an exemplary embodiment;
FIG. 1d is a flowchart 4 of an image processing method according to an exemplary embodiment;
FIG. 1e is a flowchart 5 of an image processing method according to an exemplary embodiment;
FIG. 2 is an interaction flow diagram illustrating an image processing method according to an exemplary embodiment;
FIG. 3a is a structural diagram 1 of an image processing apparatus according to an exemplary embodiment;
FIG. 3b is a structural diagram 2 of an image processing apparatus according to an exemplary embodiment;
FIG. 3c is a structural diagram 3 of an image processing apparatus according to an exemplary embodiment;
FIG. 3d is a structural diagram 4 of an image processing apparatus according to an exemplary embodiment;
FIG. 3e is a structural diagram 5 of an image processing apparatus according to an exemplary embodiment;
FIG. 4 is a block diagram illustrating an apparatus in accordance with an exemplary embodiment;
FIG. 5 is a block diagram illustrating an apparatus in accordance with an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
With the rapid development of science and technology and the continuous improvement of living standards, the variety and quantity of apparel people wear in daily life have gradually increased in recent years. Generally, when users want to know whether a garment matches them, they need to view an image of themselves wearing it. For example, to judge whether a coat suits them, users can put the coat on and look at themselves in a mirror, or photograph themselves wearing it and review the photo on a terminal. Although this approach can determine which apparel suits the user, the cumbersome process of trying on each garment consumes considerable time and thus impairs the user experience.
An embodiment of the present disclosure provides an image processing method, as shown in fig. 1a, including the following steps 101 to 102:
in step 101, head feature parameters of a target image are acquired.
The head feature parameters comprise at least one of facial feature parameters and hair style feature parameters; the facial feature parameters are used to indicate the facial features of the target person in the target image, and the hair style feature parameters are used to indicate the hair style features of the target person.
For example, the target image may be an image including the head of the target person. It may be acquired in advance, or captured by an image capture device such as a camera, a camera-equipped smartphone, or a camera-equipped tablet computer.
The head feature parameters of the target image may indicate image features of the target person's face, and may include at least one of facial feature parameters and hair style feature parameters. For example, the facial feature parameters may indicate at least one of: the contour shape of the target person's eyes, nose, or mouth; the distance between the eyes and the mouth; the distance between the eyes and the nose; and the contour shape of the face, which may include a standard face, long face, round face, square face, or inverted-triangle face. The hair style feature parameters may indicate at least one of the length, color value, and shape of the target person's hair. The color value of the hair may be an RGB color channel value or a YUV color coding value. Taking RGB as an example, the hair color value may be RGB(252, 224, 203), where RGB(red, green, blue) identifies the values of the red, green, and blue channels; different colors are obtained by adjusting the three channel values, and a specific RGB value represents a specific color. The shape of the hair may include straight hair, wavy hair, curly hair, wool-curl hair, small spiral curls, and the like.
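As a minimal sketch, the head feature parameters described above might be represented as follows; all class and field names are illustrative assumptions, not terms from the patent:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class FacialFeatureParams:
    eye_mouth_distance: float  # distance between the eyes and the mouth
    eye_nose_distance: float   # distance between the eyes and the nose
    face_shape: str            # "standard", "long", "round", "square", or "inverted-triangle"

@dataclass
class HairStyleParams:
    length_cm: float           # hair length
    rgb: Tuple[int, int, int]  # hair color as RGB channel values
    shape: str                 # "straight", "wavy", "curly", ...

@dataclass
class HeadFeatureParams:
    # The patent requires at least one of the two; either may be absent.
    facial: Optional[FacialFeatureParams] = None
    hair: Optional[HairStyleParams] = None

# A head feature parameter carrying only hair style features,
# using the RGB example value from the text.
params = HeadFeatureParams(
    hair=HairStyleParams(length_cm=30.0, rgb=(252, 224, 203), shape="straight"),
)
```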
In step 102, a target apparel matching the target person is determined according to the head feature parameters.
Illustratively, the target apparel may include at least one of clothing, shoes, hats, socks, gloves, scarves, ties, bags, parasols, and jewelry. Determining target apparel matching the target person according to the head feature parameters may involve obtaining an apparel parameter database that indicates a correspondence between at least one apparel identifier and at least one head feature parameter. A target apparel identifier corresponding to the head feature parameters can be obtained from the apparel parameter database, and the target apparel indicated by that identifier can then be determined. Note that a head feature parameter may correspond to one or more apparel identifiers, and one apparel identifier may indicate one or more kinds of apparel.
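The database lookup described above can be sketched as a pair of mappings; the keys, identifiers, and item names below are hypothetical examples, not values from the patent:

```python
# Hypothetical apparel parameter database: maps a head feature key to one or
# more apparel identifiers (one key may correspond to several identifiers).
apparel_db = {
    "round_face_dark_hair": ["hat-01", "scarf-07"],
}

# One apparel identifier may in turn indicate one or more kinds of apparel.
apparel_items = {
    "hat-01": ["wide-brim hat", "beret"],
    "scarf-07": ["silk scarf"],
}

def match_apparel(head_feature_key):
    """Return all apparel items whose identifiers correspond to the key."""
    matched = []
    for apparel_id in apparel_db.get(head_feature_key, []):
        matched.extend(apparel_items.get(apparel_id, []))
    return matched
```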
According to the technical scheme provided by this embodiment of the present disclosure, the head feature parameters of the target image are acquired and target apparel matching the target person is determined from them, ensuring that the target apparel matches the target person's facial features or hair style. Apparel suitable for the target person can therefore be identified without the target person having to try it on, improving the user experience.
In one embodiment, as shown in fig. 1b, the image processing method provided by the embodiment of the present disclosure further includes the following steps 103 to 104:
in step 103, an image of the target apparel is acquired.
In step 104, an image of the target apparel is presented according to the position of the target person.
Illustratively, the image of the target apparel may be obtained by retrieving, from an apparel image database, the apparel image corresponding to the identifier of the target apparel. The apparel image database may be stored in the electronic device in advance, or obtained by the electronic device from a server or another device or system. The image of the target apparel may be displayed according to the position of the target person through a display screen, a virtual-reality head-mounted display, or the like; displaying the image at the target person's position lets the user observe the effect of the target apparel being worn by the target person. For example, when the target apparel is a hat, the image of the hat can be displayed above the target person's head through the virtual-reality head-mounted display, so the user can intuitively see how the hat looks on the target person.
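The positional display in the hat example might compute its overlay point as follows; the bounding-box convention and function name are assumptions for illustration:

```python
def hat_anchor(person_bbox):
    """Given the target person's bounding box (left, top, right, bottom) in
    pixel coordinates, return the point at which a hat image would be
    anchored: horizontally centred at the top of the head."""
    left, top, right, bottom = person_bbox
    return ((left + right) // 2, top)
```

An overlay routine (e.g. alpha-compositing the hat image onto the frame) would then paste the apparel image centred at this anchor point.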
By acquiring the image of the target apparel and displaying it according to the position of the target person, the user can intuitively see the effect of the target person wearing the target apparel, which improves the user experience.
In one embodiment, as shown in fig. 1c, in step 102, determining the target apparel matching the target person according to the head feature parameters may be implemented through steps 1021 to 1022:
in step 1021, the clothing color system parameters are obtained according to the head feature parameters.
The clothing color system parameter is used for indicating a color system to which the color of the target clothing belongs.
In step 1022, a target apparel is determined based on the apparel color system parameters.
For example, the apparel color system parameter corresponding to the head feature parameters may be obtained by searching a color system parameter database, which may be stored in the electronic device in advance or obtained by the electronic device from a server or another device or system. Note that the apparel color system parameter may refer to one or more color systems. Determining the target apparel according to the apparel color system parameter may mean that apparel is determined to be the target apparel when its color belongs to a color system indicated by the parameter.
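The color-system lookup and filtering described above can be sketched as follows; the database contents, keys, and catalogue entries are hypothetical:

```python
# Hypothetical color system parameter database: maps a head feature key to
# the color system(s) a matching garment should belong to (one or more).
color_system_db = {
    "pale_skin_dark_hair": {"warm", "light"},
}

# A small hypothetical apparel catalogue, each item tagged with its color system.
catalogue = [
    {"name": "beige coat", "color_system": "light"},
    {"name": "navy jacket", "color_system": "dark"},
]

def filter_by_color_system(head_feature_key, items):
    """Keep only the items whose color system is indicated by the
    apparel color system parameter for this head feature key."""
    allowed = color_system_db.get(head_feature_key, set())
    return [item for item in items if item["color_system"] in allowed]
```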
By obtaining the apparel color system parameter according to the head feature parameters and determining the target apparel from it, apparel whose color system matches the target person's facial features or hair style, that is, apparel in a color system that suits the target person, can be identified without the target person having to try it on, improving the user experience.
In one embodiment, as shown in fig. 1d, in step 102, determining the target apparel to match with the target person according to the head feature parameters can be implemented through steps 1023 to 1024:
in step 1023, when the head feature parameter includes a hair style feature parameter, a hair style face proportion of the target person is obtained according to the hair style feature parameter.
Wherein the hair style face accounts for a proportion indicating that the face of the target person is blocked by the hair of the target person.
In step 1024, the target apparel is determined based on the hair-style face ratio of the target person.
For example, when the hair-style face ratio of the target person is smaller than a predetermined threshold, the hair may be considered to cover only a small part of the face, and light-colored apparel may be recommended, so the target apparel is determined to be light-colored apparel; conversely, when the hair-style face ratio is greater than or equal to the threshold, the hair may be considered to cover a large part of the face, and dark-colored apparel may be recommended, so the target apparel is determined to be dark-colored apparel.
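This threshold rule can be sketched directly; the threshold value of 0.4 is an illustrative assumption, as the patent does not specify one:

```python
HAIR_RATIO_THRESHOLD = 0.4  # illustrative threshold, not specified in the patent

def recommend_color_depth(hair_face_ratio):
    """Map the fraction of the face covered by hair to a color depth:
    below the threshold -> light-colored apparel; otherwise dark-colored."""
    return "light" if hair_face_ratio < HAIR_RATIO_THRESHOLD else "dark"
```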
In one embodiment, as shown in fig. 1e, in step 102, determining the target apparel matching the target person according to the head feature parameters may be implemented by steps 1025 to 1026:
in step 1025, when the head feature parameters include the facial feature parameters of the five sense organs, the facial proportion of the five sense organs of the target person is obtained according to the facial feature parameters of the five sense organs.
Wherein the facial proportion of the five sense organs is indicative of a proportion of the five sense organs of the target person to the face of the target person.
In step 1026, the target apparel is determined according to the facial feature proportion of the target person.
For example, when the facial feature proportion of the target person is smaller than a predetermined threshold, the facial features may be considered to occupy a small part of the face, and apparel in a muted (dark) color system may be recommended, so the target apparel is determined to be muted-colored apparel; conversely, when the facial feature proportion is greater than or equal to the threshold, the facial features may be considered to occupy a large part of the face, and apparel in a bright color system may be recommended, so the target apparel is determined to be bright-colored apparel.
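Analogously, the facial feature proportion rule might look like this; the threshold value and color labels are illustrative assumptions:

```python
FACE_PROPORTION_THRESHOLD = 0.35  # illustrative threshold, not specified in the patent

def recommend_color_tone(feature_face_ratio):
    """Map the fraction of the face occupied by the facial features to a tone:
    below the threshold -> muted color system; otherwise bright color system."""
    return "muted" if feature_face_ratio < FACE_PROPORTION_THRESHOLD else "bright"
```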
The implementation is described in detail through the following examples.
FIG. 2 is a schematic flow chart diagram illustrating an image processing method according to one exemplary embodiment. As shown in fig. 2, the method comprises the following steps:
in step 201, head feature parameters of a target image are acquired.
The head feature parameters comprise at least one of facial feature parameters and hair style feature parameters; the facial feature parameters are used to indicate the facial features of the target person in the target image, and the hair style feature parameters are used to indicate the hair style features of the target person.
In step 202, when the head feature parameters include hair style feature parameters, the hair-style face ratio of the target person is obtained according to the hair style feature parameters.
The hair-style face ratio indicates the proportion of the target person's face that is covered by the target person's hair.
In step 203, the apparel color system parameter is obtained according to the hair-style face ratio.
The apparel color system parameter is used to indicate the color system to which the color of the target apparel belongs.
In step 204, when the head feature parameters include facial feature parameters, the facial feature proportion of the target person is obtained according to the facial feature parameters.
The facial feature proportion indicates the proportion of the target person's face occupied by the facial features.
In step 205, the apparel color system parameter is obtained according to the facial feature proportion of the target person.
In step 206, the target apparel is determined according to the apparel color system parameter.
In step 207, the target apparel is displayed according to the position of the target person.
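The flow of steps 201 to 207 can be sketched end to end as follows; the thresholds, color labels, and catalogue are all illustrative assumptions rather than values from the patent:

```python
def process(head_params):
    """Sketch of steps 201-207: derive apparel color system(s) from the
    hair-style face ratio and/or the facial feature proportion, then pick
    apparel in those color systems from a hypothetical catalogue."""
    color_systems = set()
    # Steps 202-203: hair-style face ratio -> color system (threshold assumed).
    if head_params.get("hair_face_ratio") is not None:
        color_systems.add("light" if head_params["hair_face_ratio"] < 0.4 else "dark")
    # Steps 204-205: facial feature proportion -> color system (threshold assumed).
    if head_params.get("feature_face_ratio") is not None:
        color_systems.add("muted" if head_params["feature_face_ratio"] < 0.35 else "bright")
    # Step 206: determine target apparel from the color system parameter.
    catalogue = {"light": "beige coat", "dark": "charcoal coat",
                 "muted": "grey sweater", "bright": "red jacket"}
    return [catalogue[c] for c in sorted(color_systems)]
```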
According to the technical scheme provided by this embodiment of the present disclosure, the head feature parameters of the target image are acquired and target apparel matching the target person is determined from them, ensuring that the target apparel matches the target person's facial features or hair style. Apparel suitable for the target person can therefore be identified without the target person having to try it on, improving the user experience.
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods.
Fig. 3a is a block diagram of an image processing apparatus 30 according to an exemplary embodiment. The image processing apparatus 30 may be a server or part of a server, or a terminal or part of a terminal, and may be implemented as part or all of an electronic device in software, hardware, or a combination of the two. As shown in fig. 3a, the image processing apparatus 30 includes:
the head feature acquiring module 301 is configured to acquire head feature parameters of the target image, where the head feature parameters include at least one of facial feature parameters and hair style feature parameters, the facial feature parameters are used to indicate facial features of the target person in the target image, and the hair style feature parameters are used to indicate hair style features of the target person;
and an apparel matching module 302, configured to determine target apparel matching the target person according to the head feature parameters.
In one embodiment, as shown in fig. 3b, the image processing apparatus 30 further includes:
an apparel image acquisition module 303, configured to acquire an image of the target apparel;
and an apparel display module 304, configured to display the image of the target apparel according to the position of the target person.
In one embodiment, as shown in FIG. 3c, apparel matching module 302 includes:
an apparel color system obtaining sub-module 3021, configured to obtain an apparel color system parameter according to the head feature parameters, the apparel color system parameter being used to indicate the color system to which the color of the target apparel belongs;
and a first apparel matching sub-module 3022, configured to determine the target apparel according to the apparel color system parameter.
In one embodiment, as shown in FIG. 3d, apparel matching module 302 includes:
a hair-style face ratio obtaining sub-module 3023, configured to, when the head feature parameters include hair style feature parameters, obtain the hair-style face ratio of the target person according to the hair style feature parameters, the hair-style face ratio being used to indicate the proportion of the target person's face that is covered by the target person's hair;
and a second apparel matching sub-module 3024, configured to determine the target apparel according to the hair-style face ratio of the target person.
In one embodiment, as shown in FIG. 3e, apparel matching module 302 includes:
a facial feature proportion obtaining sub-module 3025, configured to, when the head feature parameters include facial feature parameters, obtain the facial feature proportion of the target person according to the facial feature parameters, the facial feature proportion being used to indicate the proportion of the target person's face occupied by the facial features;
and a third apparel matching sub-module 3026, configured to determine the target apparel according to the facial feature proportion of the target person.
Embodiments of the present disclosure provide an image processing apparatus that acquires the head feature parameters of a target image and determines target apparel matching the target person according to those parameters, ensuring that the target apparel matches the target person's facial features or hair style. Apparel suitable for the target person can therefore be identified without the target person having to try it on, improving the user experience.
Fig. 4 is a block diagram illustrating an image processing apparatus 40 according to an exemplary embodiment, where the image processing apparatus 40 may be a server or a part of a server, or may be a terminal or a part of a terminal, and the image processing apparatus 40 includes:
a processor 401;
a memory 402 for storing instructions executable by the processor 401;
wherein the processor 401 is configured to:
acquiring head characteristic parameters of the target image, wherein the head characteristic parameters comprise at least one of facial feature parameters and hair style characteristic parameters, the facial feature parameters are used for indicating the facial features of the target person in the target image, and the hair style characteristic parameters are used for indicating the hair style characteristics of the target person;
and determining the target clothes matched with the target person according to the head characteristic parameters.
In one embodiment, the processor 401 may be further configured to:
and displaying the target clothes according to the position of the target person.
In one embodiment, the processor 401 may be further configured to:
determining the target clothes matched with the target person according to the head characteristic parameters, wherein the determining comprises:
acquiring a clothing color system parameter according to the head characteristic parameter, wherein the clothing color system parameter is used for indicating a color system to which the color of the target clothing belongs;
and determining the target clothes according to the clothes color system parameters.
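As one way to picture the clothing color system parameter, the sketch below maps an average hair color to a color family from which garment colors would then be chosen. The RGB thresholds and family names are invented for illustration; the disclosure does not specify this mapping.

```python
# Illustrative (assumed) mapping from a head feature -- here, average
# hair color as an (R, G, B) triple -- to a clothing color system.
def clothing_color_system(hair_rgb):
    r, g, b = hair_rgb
    if r > 150 and g < 120:   # reddish, warm hair tones
        return "cool"         # contrast with a cool garment color system
    if max(r, g, b) < 80:     # dark hair
        return "warm"
    return "neutral"

print(clothing_color_system((30, 30, 30)))  # warm
```

The target clothes would then be selected from items whose color belongs to the returned color system.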
In one embodiment, the processor 401 may be further configured to:
determining the target clothes matched with the target person according to the head characteristic parameters, wherein the determining comprises:
when the head characteristic parameters comprise hair style characteristic parameters, acquiring a hair style face proportion of the target person according to the hair style characteristic parameters, wherein the hair style face proportion is used for indicating the proportion of the face of the target person which is shielded by the hair of the target person;
and determining the target clothes according to the hair style face proportion of the target person.
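The hair style face proportion described above could, for instance, be computed as the fraction of face pixels covered by hair, given binary segmentation masks for the face and the hair. The list-of-lists mask representation and the segmentation step itself are assumptions for this sketch; the disclosure does not prescribe them.

```python
# Assumed computation of the "hair style face proportion": overlap of a
# binary hair mask with a binary face mask, divided by the face area.
def hair_face_ratio(face_mask, hair_mask):
    face_px = sum(sum(row) for row in face_mask)
    overlap = sum(
        f and h
        for frow, hrow in zip(face_mask, hair_mask)
        for f, h in zip(frow, hrow)
    )
    return overlap / face_px if face_px else 0.0

face = [[1, 1], [1, 1]]   # 4 face pixels
hair = [[1, 0], [0, 0]]   # 1 of them covered by hair
print(hair_face_ratio(face, hair))  # 0.25
```

A proportion near 0 suggests the face is largely exposed, while a value near 1 indicates heavy hair coverage, which the matching step can then map to suitable apparel.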
In one embodiment, the processor 401 may be further configured to:
determining the target clothes matched with the target person according to the head characteristic parameters, wherein the determining comprises:
when the head characteristic parameters comprise the facial characteristic parameters of the five sense organs, acquiring the facial proportion of the five sense organs of the target person according to the facial characteristic parameters of the five sense organs, wherein the facial proportion of the five sense organs is used for indicating the proportion of the five sense organs of the target person to the face of the target person;
and determining the target clothes according to the facial proportion of the five sense organs of the target person.
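Similarly, the facial proportion of the five sense organs could be approximated as the total area of detected feature regions (eyes, nose, mouth, and so on) divided by the face area. The `(x, y, w, h)` bounding boxes below stand in for landmark-detection output and are a hypothetical simplification.

```python
# Assumed computation of the "facial proportion of the five sense organs":
# summed area of feature bounding boxes over the face bounding-box area.
def feature_face_proportion(face_box, feature_boxes):
    _, _, fw, fh = face_box
    face_area = fw * fh
    feat_area = sum(w * h for _, _, w, h in feature_boxes)
    return feat_area / face_area if face_area else 0.0

face = (0, 0, 100, 120)                              # face bounding box
features = [(20, 30, 20, 10),                        # left eye
            (60, 30, 20, 10),                        # right eye
            (40, 70, 20, 15)]                        # mouth
print(feature_face_proportion(face, features))
```

A larger proportion indicates more prominent facial features relative to the face, which the matching step can translate into, for example, bolder or plainer garment choices.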
Embodiments of the present disclosure provide an image processing apparatus. By obtaining the head characteristic parameters of the target image, the apparatus can determine target apparel matched with the target person according to those parameters, ensuring that the target apparel suits the five sense organs or the hair style of the target person. Apparel suited to the target person can therefore be determined without the target person needing to try it on, which improves the user experience.
Fig. 5 is a block diagram illustrating an apparatus 500 for processing an image according to an exemplary embodiment, where the apparatus 500 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, or the like.
The apparatus 500 may include one or more of the following components: processing component 502, memory 504, power component 506, multimedia component 508, audio component 510, input/output (I/O) interface 512, sensor component 514, and communication component 516.
The processing component 502 generally controls overall operation of the device 500, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 502 may include one or more processors 520 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 502 can include one or more modules that facilitate interaction between the processing component 502 and other components. For example, the processing component 502 can include a multimedia module to facilitate interaction between the multimedia component 508 and the processing component 502.
The memory 504 is configured to store various types of data to support operation at the apparatus 500. Examples of such data include instructions for any application or method operating on the device 500, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 504 may be implemented by any type of volatile or non-volatile memory device, or a combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, or magnetic or optical disks.
The power component 506 provides power to the various components of the device 500. The power component 506 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 500.
The multimedia component 508 includes a screen that provides an output interface between the device 500 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touch, slide, and gesture actions on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 508 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 500 is in an operating mode, such as a shooting mode or a video mode. Each of the front and rear cameras may be a fixed optical lens system or have focus and optical zoom capability.
The audio component 510 is configured to output and/or input audio signals. For example, audio component 510 includes a Microphone (MIC) configured to receive external audio signals when apparatus 500 is in an operating mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 504 or transmitted via the communication component 516. In some embodiments, audio component 510 further includes a speaker for outputting audio signals.
The I/O interface 512 provides an interface between the processing component 502 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 514 includes one or more sensors for providing various aspects of status assessment for the device 500. For example, the sensor assembly 514 may detect an open/closed state of the apparatus 500 and the relative positioning of components, such as the display and keypad of the apparatus 500. The sensor assembly 514 may also detect a change in the position of the apparatus 500 or a component of the apparatus 500, the presence or absence of user contact with the apparatus 500, the orientation or acceleration/deceleration of the apparatus 500, and a change in the temperature of the apparatus 500. The sensor assembly 514 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 514 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 514 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 516 is configured to facilitate communication between the apparatus 500 and other devices in a wired or wireless manner. The device 500 may access a wireless network based on a communication standard, such as walkie-talkie private network, WiFi, 2G, 3G, 4G, or 5G, or a combination thereof. In an exemplary embodiment, the communication component 516 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 516 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 500 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors, or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 504 comprising instructions, executable by the processor 520 of the apparatus 500 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Also provided is a non-transitory computer-readable storage medium in which instructions, when executed by a processor of the apparatus 500, enable the apparatus 500 to perform the above-described image processing method, the method comprising:
acquiring head characteristic parameters of the target image, wherein the head characteristic parameters comprise at least one of facial feature parameters and hair style characteristic parameters, the facial feature parameters are used for indicating the facial features of the target person in the target image, and the hair style characteristic parameters are used for indicating the hair style characteristics of the target person;
and determining the target clothes matched with the target person according to the head characteristic parameters.
In one embodiment, the method further comprises:
and displaying the target clothes according to the position of the target person.
In one embodiment, determining target apparel matching the target person based on the head characteristic parameters comprises:
acquiring a clothing color system parameter according to the head characteristic parameter, wherein the clothing color system parameter is used for indicating a color system to which the color of the target clothing belongs;
and determining the target clothes according to the clothes color system parameters.
In one embodiment, determining target apparel to match a target person based on the head feature parameters includes:
when the head characteristic parameters comprise hair style characteristic parameters, acquiring the hair style face proportion of the target person according to the hair style characteristic parameters, wherein the hair style face proportion is used for indicating the proportion of the face of the target person which is shielded by the hair of the target person;
and determining the target clothes according to the hair style face proportion of the target person.
In one embodiment, determining target apparel to match a target person based on the head feature parameters includes:
when the head characteristic parameters comprise the facial characteristic parameters of the five sense organs, acquiring the facial proportion of the five sense organs of the target person according to the facial characteristic parameters of the five sense organs, wherein the facial proportion of the five sense organs is used for indicating the proportion of the five sense organs of the target person to the face of the target person;
and determining the target clothes according to the facial proportion of the five sense organs of the target person.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
Claims (10)
1. An image processing method, comprising:
acquiring head characteristic parameters of a target image, wherein the head characteristic parameters comprise at least one of facial feature parameters of the five sense organs and hair style characteristic parameters, the facial feature parameters are used for indicating the facial features of a target person in the target image, and the hair style characteristic parameters are used for indicating the hair style characteristics of the target person;
determining the target clothes matched with the target person according to the head characteristic parameters, wherein the method comprises the following steps: when the head characteristic parameters comprise the hair style characteristic parameters, acquiring a hair style face proportion of the target person according to the hair style characteristic parameters, wherein the hair style face proportion is used for indicating the proportion of the face of the target person which is shielded by the hair of the target person; and determining the target clothes according to the hair style face proportion of the target person.
2. The method of image processing according to claim 1, further comprising:
acquiring an image of the target clothes;
and displaying the image of the target clothes according to the position of the target person.
3. The image processing method according to claim 1, wherein the determining a target apparel matching the target person according to the head feature parameters comprises:
acquiring clothing color system parameters according to the head characteristic parameters, wherein the clothing color system parameters are used for indicating a color system to which the color of the target clothing belongs;
and determining the target clothes according to the clothes color system parameters.
4. The method of claim 1, wherein the determining a target apparel matching the target person according to the head feature parameters comprises:
when the head characteristic parameters comprise the facial feature parameters of the five sense organs, acquiring the facial proportion of the five sense organs of the target person according to the facial feature parameters of the five sense organs, wherein the facial proportion of the five sense organs is used for indicating the proportion of the five sense organs of the target person to the face of the target person;
and determining the target clothes according to the facial proportion of the five sense organs of the target person.
5. An image processing apparatus characterized by comprising:
the head characteristic acquisition module is used for acquiring head characteristic parameters of a target image, wherein the head characteristic parameters comprise at least one of facial feature parameters and hair style characteristic parameters, the facial feature parameters are used for indicating the facial features of a target person in the target image, and the hair style characteristic parameters are used for indicating the hair style characteristics of the target person;
the clothing matching module is used for determining target clothing matched with the target character according to the head characteristic parameters;
the dress matches the module, includes:
a hair style face proportion obtaining sub-module, configured to obtain a hair style face proportion of the target person according to the hair style feature parameter when the head feature parameter includes the hair style feature parameter, where the hair style face proportion is used to indicate a proportion of a face of the target person that is blocked by hair of the target person;
and the second clothes matching sub-module is used for determining the target clothes according to the hair style face proportion of the target person.
6. The image processing apparatus according to claim 5, characterized in that the apparatus further comprises:
the clothing image acquisition module is used for acquiring an image of the target clothing;
and the clothing display module is used for displaying the image of the target clothing according to the position of the target person.
7. The image processing apparatus of claim 5, wherein the apparel matching module comprises:
the clothing color system obtaining submodule is used for obtaining clothing color system parameters according to the head characteristic parameters, and the clothing color system parameters are used for indicating a color system to which the color of the target clothing belongs;
and the first clothing matching sub-module is used for determining the target clothing according to the clothing color system parameters.
8. The image processing apparatus of claim 5, wherein the apparel matching module comprises:
a facial proportion of five sense organs obtaining sub-module, configured to obtain a facial proportion of the five sense organs of the target person according to the facial feature parameters of the five sense organs when the head feature parameters include the facial feature parameters, where the facial proportion of the five sense organs is used to indicate the proportion of the five sense organs of the target person to the face of the target person;
and the third clothing matching sub-module is used for determining the target clothing according to the facial proportion of the five sense organs of the target character.
9. An image processing apparatus characterized by comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring head characteristic parameters of a target image, wherein the head characteristic parameters comprise at least one of facial feature parameters of the five sense organs and hair style characteristic parameters, the facial feature parameters are used for indicating the facial features of a target person in the target image, and the hair style characteristic parameters are used for indicating the hair style characteristics of the target person;
determining the target clothes matched with the target person according to the head characteristic parameters, wherein the method comprises the following steps: when the head characteristic parameters comprise the hair style characteristic parameters, acquiring a hair style face proportion of the target person according to the hair style characteristic parameters, wherein the hair style face proportion is used for indicating the proportion of the face of the target person which is shielded by the hair of the target person; and determining the target clothes according to the hair style face proportion of the target person.
10. A computer-readable storage medium having stored thereon computer instructions, which when executed by a processor, perform the steps of the method of any one of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810393773.9A CN108615013B (en) | 2018-04-27 | 2018-04-27 | Image processing method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810393773.9A CN108615013B (en) | 2018-04-27 | 2018-04-27 | Image processing method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108615013A CN108615013A (en) | 2018-10-02 |
CN108615013B true CN108615013B (en) | 2022-08-26 |
Family
ID=63661377
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810393773.9A Active CN108615013B (en) | 2018-04-27 | 2018-04-27 | Image processing method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108615013B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110648382B (en) * | 2019-09-30 | 2023-02-24 | 北京百度网讯科技有限公司 | Image generation method and device |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102426650A (en) * | 2011-09-30 | 2012-04-25 | 宇龙计算机通信科技(深圳)有限公司 | Method and device of character image analysis |
CN103310234A (en) * | 2013-07-03 | 2013-09-18 | 深圳时尚空间网络有限公司 | Matching hairstyle, costume and/or accessory obtaining method based on feature analysis of five sense organs |
CN106600702A (en) * | 2016-11-23 | 2017-04-26 | 中南大学 | Image processing device based on virtual reality |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120158515A1 (en) * | 2010-12-21 | 2012-06-21 | Yahoo! Inc. | Dynamic advertisement serving based on an avatar |
CN106779977B (en) * | 2017-01-16 | 2020-11-27 | 深圳市娜尔思时装有限公司 | Clothing matching method and system based on intelligent mobile terminal |
- 2018-04-27 CN CN201810393773.9A patent/CN108615013B/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102426650A (en) * | 2011-09-30 | 2012-04-25 | 宇龙计算机通信科技(深圳)有限公司 | Method and device of character image analysis |
CN103310234A (en) * | 2013-07-03 | 2013-09-18 | 深圳时尚空间网络有限公司 | Matching hairstyle, costume and/or accessory obtaining method based on feature analysis of five sense organs |
CN106600702A (en) * | 2016-11-23 | 2017-04-26 | 中南大学 | Image processing device based on virtual reality |
Also Published As
Publication number | Publication date |
---|---|
CN108615013A (en) | 2018-10-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109672830B (en) | Image processing method, image processing device, electronic equipment and storage medium | |
CN105357425B (en) | Image capturing method and device | |
JP2016531362A (en) | Skin color adjustment method, skin color adjustment device, program, and recording medium | |
CN109857311A (en) | Generate method, apparatus, terminal and the storage medium of human face three-dimensional model | |
CN107977885B (en) | Virtual fitting method and device | |
US20190095746A1 (en) | Method, device and non-transitory storage medium for processing clothes information | |
CN113194254A (en) | Image shooting method and device, electronic equipment and storage medium | |
CN107705245A (en) | Image processing method and device | |
CN112188091B (en) | Face information identification method and device, electronic equipment and storage medium | |
CN105678266A (en) | Method and device for combining photo albums of human faces | |
CN110580688A (en) | Image processing method and device, electronic equipment and storage medium | |
CN111526287A (en) | Image shooting method, image shooting device, electronic equipment, server, image shooting system and storage medium | |
CN112330570A (en) | Image processing method, image processing device, electronic equipment and storage medium | |
CN107369142A (en) | Image processing method and device | |
CN112434338A (en) | Picture sharing method and device, electronic equipment and storage medium | |
CN107563395B (en) | Method and device for dressing management through intelligent mirror | |
CN112184540A (en) | Image processing method, image processing device, electronic equipment and storage medium | |
CN104902318B (en) | Control method for playing back and terminal device | |
CN112004020B (en) | Image processing method, image processing device, electronic equipment and storage medium | |
CN108615013B (en) | Image processing method and device | |
CN111373409B (en) | Method and terminal for obtaining color value change | |
CN111988522B (en) | Shooting control method and device, electronic equipment and storage medium | |
CN111340690B (en) | Image processing method, device, electronic equipment and storage medium | |
CN111355879B (en) | Image acquisition method and device containing special effect pattern and electronic equipment | |
CN111524160A (en) | Track information acquisition method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||