CN108765352B - Image processing method and electronic device - Google Patents

Image processing method and electronic device

Info

Publication number
CN108765352B
Authority
CN
China
Prior art keywords
image
adjustment
parameter
user
different
Prior art date
Legal status
Active
Application number
CN201810562973.2A
Other languages
Chinese (zh)
Other versions
CN108765352A (en)
Inventor
杨双新
郭轶尊
刘龙飞
曹越
张洋
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201810562973.2A
Publication of CN108765352A
Application granted
Publication of CN108765352B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/90: Dynamic range modification of images or parts thereof
    • G06T5/94: Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168: Feature extraction; Face representation
    • G06V40/171: Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/178: Human faces, e.g. facial parts, sketches or expressions estimating age from face image; using age information for improving recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure provides an image processing method, including: obtaining an image; analyzing the image to obtain characteristics of the user in the image; determining an image adjustment parameter based on the characteristics of the user; obtaining a correction parameter; changing the image adjustment parameter based on the correction parameter; and performing image processing on the area corresponding to the user in the image based on the changed image adjustment parameter. The present disclosure also provides an electronic device.

Description

Image processing method and electronic device
Technical Field
The present disclosure relates to an image processing method and an electronic device.
Background
With the rapid development of electronic technology, electronic devices with various functions are used in an ever wider range of scenarios in life and work. For example, more and more electronic devices provide an image processing function (e.g., a beautification function). However, as users' demands on image processing grow, conventional image adjustment functions cannot meet those demands in many scenarios. It is therefore desirable to provide an optimized image adjustment function to improve the user experience.
Disclosure of Invention
One aspect of the present disclosure provides an image processing method, including: the method comprises the steps of obtaining an image, analyzing the image to obtain the characteristics of a user in the image, determining image adjustment parameters based on the characteristics of the user, obtaining correction parameters, changing the image adjustment parameters based on the correction parameters, and carrying out image processing on an area corresponding to the user in the image based on the changed image adjustment parameters.
Optionally, the obtaining of the correction parameter includes: obtaining the correction parameter based on environmental information at the time the image is obtained.
Optionally, in the method, the correction parameters corresponding to different pieces of environment information are different.
Optionally, the method further includes: obtaining environmental information through a sensing element of the electronic device, where the environmental information is used to represent the environment when the electronic device obtains the image.
Optionally, the environment information includes time information, and changing the image adjustment parameter based on the correction parameter includes: if the time information falls within a first range, determining a first adjustment item in the image adjustment parameters and an adjustment variation amount for the first adjustment item, the first adjustment item being one or more; and if the time information falls within a second range, determining a second adjustment item in the image adjustment parameters and an adjustment variation amount for the second adjustment item, the second adjustment item being one or more. Either the first adjustment item is the same as the second adjustment item and the correction parameters corresponding to different time information are different, or the first adjustment item is different from the second adjustment item and different time information respectively corresponds to adjustment variation amounts determined for the different adjustment items.
Another aspect of the present disclosure provides an electronic device including an image acquisition device, an intelligent analysis engine, a processor, and a display screen. The image acquisition device is configured to acquire an image; the intelligent analysis engine is configured to analyze the image to obtain characteristics of a user in the image, determine an image adjustment parameter based on the characteristics of the user, obtain a correction parameter, and change the image adjustment parameter based on the correction parameter; the processor is configured to perform image processing on the area corresponding to the user in the image based on the changed image adjustment parameter; and the display screen is configured to display the processed image.
Optionally, the obtaining of the correction parameter includes: obtaining the correction parameter based on environmental information at the time the image is obtained.
Optionally, in the electronic device, the correction parameters corresponding to different pieces of environment information are different.
Optionally, the electronic device further includes: a sensing element configured to obtain environment information, where the environment information is used to represent the environment when the electronic device obtains the image.
Optionally, the environment information includes time information, and changing the image adjustment parameter based on the correction parameter includes: if the time information falls within a first range, determining a first adjustment item in the image adjustment parameters and an adjustment variation amount for the first adjustment item, the first adjustment item being one or more; and if the time information falls within a second range, determining a second adjustment item in the image adjustment parameters and an adjustment variation amount for the second adjustment item, the second adjustment item being one or more. Either the first adjustment item is the same as the second adjustment item and the correction parameters corresponding to different time information are different, or the first adjustment item is different from the second adjustment item and different time information respectively corresponds to adjustment variation amounts determined for the different adjustment items.
Another aspect of the present disclosure provides an image processing system including: a first obtaining module, an analyzing module, a determining module, a second obtaining module, a changing module, and a processing module. The first obtaining module is configured to obtain an image; the analyzing module is configured to analyze the image to obtain characteristics of a user in the image; the determining module is configured to determine an image adjustment parameter based on the characteristics of the user; the second obtaining module is configured to obtain a correction parameter; the changing module is configured to change the image adjustment parameter based on the correction parameter; and the processing module is configured to perform image processing on a region corresponding to the user in the image based on the changed image adjustment parameter.
Optionally, the obtaining of the correction parameter includes: obtaining the correction parameter based on environmental information at the time the image is obtained.
Optionally, in the system, the correction parameters corresponding to different pieces of environment information are different.
Optionally, the system further includes: a sensing element configured to obtain environment information, where the environment information is used to represent the environment when the image is obtained.
Optionally, the environment information includes time information, and changing the image adjustment parameter based on the correction parameter includes: if the time information falls within a first range, determining a first adjustment item in the image adjustment parameters and an adjustment variation amount for the first adjustment item, the first adjustment item being one or more; and if the time information falls within a second range, determining a second adjustment item in the image adjustment parameters and an adjustment variation amount for the second adjustment item, the second adjustment item being one or more. Either the first adjustment item is the same as the second adjustment item and the correction parameters corresponding to different time information are different, or the first adjustment item is different from the second adjustment item and different time information respectively corresponds to adjustment variation amounts determined for the different adjustment items.
Another aspect of the disclosure provides a non-volatile storage medium storing computer-executable instructions for implementing the method as described above when executed.
Another aspect of the disclosure provides a computer program comprising computer executable instructions for implementing the method as described above when executed.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
FIGS. 1A to 1B schematically show application scenarios of an image processing method and an electronic device according to an embodiment of the present disclosure;
FIGS. 2A to 2B schematically illustrate a flow chart of an image processing method according to an embodiment of the present disclosure;
FIG. 2C schematically shows a flow chart of an image processing method according to another embodiment of the present disclosure;
FIGS. 3A to 3B schematically illustrate changing the same image adjustment parameter based on time information, according to an embodiment of the disclosure;
FIGS. 4A to 4B schematically illustrate changing different image adjustment parameters based on time information, according to an embodiment of the disclosure;
FIG. 5 schematically shows a block diagram of an electronic device according to an embodiment of the disclosure;
FIG. 6 schematically shows a schematic view of an electronic device according to an embodiment of the disclosure;
FIG. 7 schematically shows a block diagram of an image processing system according to an embodiment of the present disclosure; and
FIG. 8 schematically shows a block diagram of a computer system for image processing according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B, and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). Where a convention analogous to "at least one of A, B, or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase "A or B" should be understood to include the possibility of "A", or "B", or "A and B".
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
Accordingly, the techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable medium having instructions stored thereon for use by or in connection with an instruction execution system. In the context of this disclosure, a computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the instructions. For example, the computer readable medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the computer readable medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
The embodiment of the disclosure provides an image processing method, which includes obtaining an image, analyzing the image to obtain characteristics of a user in the image, determining an image adjustment parameter based on the characteristics of the user, obtaining a correction parameter, changing the image adjustment parameter based on the correction parameter, and performing image processing on an area corresponding to the user in the image based on the changed image adjustment parameter.
With the technical solution of the embodiments of the present disclosure, the acquired image is analyzed to obtain the characteristics of the user in the image, image adjustment parameters are determined according to those characteristics and then corrected according to the correction parameters, and the user area in the image is processed based on the corrected image adjustment parameters. This makes the image processing more intelligent and better meets users' image-processing needs.
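As a rough illustration of this flow, the following Python sketch strings the steps together. Every function name, the dictionary-based parameter representation, and all numeric values are assumptions made for illustration only; they are not the disclosed implementation.

# Illustrative sketch only; names, structures, and values are assumed.

def analyze_user_features(image):
    # Stand-in for the artificial-intelligence analysis (age, gender, etc.).
    return {"age": 45}

def determine_adjustment_parameters(features):
    # Older users get a larger eye-enlargement value (assumed rule).
    return {"eye_size": 50 if features["age"] >= 40 else 30}

def obtain_correction_parameters(environment_info):
    # The correction depends on the environment, not on the image content.
    return {"eye_size": -20 if environment_info.get("period") == "morning" else 10}

def process_user_region(image, params):
    # Stand-in for the pixel-level processing of the region showing the user.
    return image

def process(image, environment_info):
    params = determine_adjustment_parameters(analyze_user_features(image))
    for item, delta in obtain_correction_parameters(environment_info).items():
        params[item] = params.get(item, 0) + delta   # change the parameters
    return process_user_region(image, params)

processed = process(image=object(), environment_info={"period": "morning"})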
Fig. 1A to 1B schematically show application scenarios of an image processing method and an electronic device according to an embodiment of the present disclosure. It should be noted that fig. 1A to 1B are only examples of scenarios in which the embodiments of the present disclosure may be applied to help those skilled in the art understand the technical content of the present disclosure, but do not mean that the embodiments of the present disclosure may not be applied to other devices, systems, environments or scenarios.
As shown in fig. 1A to 1B, the application scene 100 may include, for example, an electronic device 110 and an image 120.
In the disclosed embodiment, the electronic device 110 may include a camera device that may be used to acquire the image 120. For example, the camera device may be a front camera or a rear camera of the electronic device 110, or an external camera device capable of exchanging data with the electronic device. The electronic device 110 may further comprise a display unit, for example a display screen of the electronic device 110, which may be used to display the image 120. The electronic device 110 is not particularly limited; it may be a mobile phone, a computer, a tablet, or the like.
In the embodiment of the present disclosure, the image 120 may be, for example, an image acquired by an image capturing device of the electronic device 110, or may also be an image acquired by another external device and stored in the electronic device 110, and the image 120 may be displayed on a display unit of the electronic device 110, for example. The image 120 includes an image of the user, such as an image of the face of the user.
According to the embodiment of the present disclosure, after the image 120 is acquired, the characteristics of the user, which may include, for example, the age, gender, and race of the user, are obtained by analyzing the image 120. Analysis of the image by an artificial intelligence engine determines the age, gender, race, and so on of the user in the image. The artificial intelligence engine can also determine information such as the face, facial features, and skin color in the image.
In the disclosed embodiments, image adjustment parameters may be determined based on the acquired characteristics of the user. The image adjustment parameters are at least used to adjust the user portion in the image, and may be a single adjustment parameter or a plurality of adjustment parameters applied to the user portion. The image adjustment parameters include, for example, a face size adjustment parameter for adjusting the size of the user's face in the image, an eye size adjustment parameter for adjusting the size of the user's eyes in the image, and parameters affecting the display effect of the user's skin-tone area in the image. Different user characteristics may correspond to different parameter types and parameter values of the image adjustment parameters.
More specifically, when the user is older, the eye size adjustment parameter corresponding to the user may be larger; when the user is younger, the eye size adjustment parameter may be smaller. Here the image adjustment parameter is a single item (i.e., one category), specifically the eye size. It can be understood that an older user's eyes usually look duller and therefore appear smaller in the image; the corresponding eye size adjustment parameter is then larger and is used to enlarge the user's eyes in the image 120 (i.e., to adjust the parameter value of the eye size item), making the eyes in the image look more lively. Conversely, when the user is younger, the eyes are usually already bright and prominent, and the corresponding eye size adjustment parameter may be slightly smaller (i.e., an older user's eyes are enlarged to a greater extent than a younger user's eyes).
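For instance, the age-dependent choice described above could be a simple threshold rule. The thresholds and values below are illustrative assumptions only, not values from the disclosure.

def eye_size_adjustment_for_age(age):
    # Assumed rule: older users' eyes tend to look duller and smaller in the image,
    # so a larger enlargement value is chosen; younger users need less enlargement.
    if age >= 50:
        return 60
    if age >= 30:
        return 50
    return 35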
The embodiment of the present disclosure further includes obtaining a correction parameter. The correction parameter is used to change the image adjustment parameter. The image adjustment parameters in this embodiment are the adjustment parameters that the artificial intelligence engine determines should be applied to the image 120; they are derived from the image content of the image 120 itself. The artificial intelligence engine may be a beauty intelligence engine. The correction parameter obtained in this embodiment, by contrast, is determined not from the image content of the image 120 itself but from the environmental information at the time the electronic device obtains the image 120. For example, when an image is obtained by an image capture device (e.g., a camera) of the electronic device and displayed on its display screen (i.e., the preview mode of a photographing application), a sensor of the electronic device obtains environment information about the environment in which the electronic device is located. Alternatively, when an image is obtained by the image capture device and saved (i.e., a picture is saved by triggering the shutter), the sensor obtains the environment information at that moment. Of course, in another embodiment of the present application, the artificial intelligence engine further integrates an environment analysis engine in addition to the beauty engine described above; that is, the environment analysis engine can determine the environment information at the time the image 120 was generated through its own analysis of the image content, and the correction parameters are then determined based on the environmental information obtained by the environment analysis engine.
For ease of understanding, the environmental information is taken to be time information in the following description; the time information indicates, for example, whether it is morning or evening.
The beauty engine in this embodiment determines that the image adjustment parameter adjusts the eye size item, with an adjustment value of an increase of 50. As shown in fig. 1A, when the time is morning and the image adjustment parameter is the eye size adjustment parameter (for enlarging the user's eyes), the correction parameter may appropriately decrease it, for example by 20 on the basis of 50, so the adjustment value for eye size is determined to be an increase of 30; the eyes do not need to be enlarged as much because in the morning the user is usually refreshed and the eyes are large and bright. Adjusting the image adjustment parameters with the correction parameters in this way makes the beautification of the image 120 more natural and better matched to the natural state of the scene in which the image was obtained. That is, according to the present embodiment, the user's appearance can still be improved (the most natural beautification is shown in the image) while the beautification of the image 120 better fits the user's natural state in the scene at the time of shooting; in this case, the degree to which the user's eyes are enlarged based on the changed eye size adjustment parameter is small. Of course, the image adjustment parameters may also include an adjustment item and value for the face size. When the image adjustment parameter is a face size adjustment parameter (for face thinning), the user's unprocessed face is firmer in the morning and therefore looks smaller, and the correction parameter may appropriately reduce the face size adjustment parameter.
As shown in fig. 1B, when the time is night and the image adjustment parameter is the eye size adjustment parameter (for enlarging the user's eyes), the correction parameter may appropriately increase it, for example by 10 on the basis of 50, so the adjustment value for eye size is determined to be an increase of 60. At night the user is usually tired and the eyes lack liveliness, so in this case the degree to which the user's eyes are enlarged based on the changed eye size adjustment parameter is large. When the image adjustment parameter is a face size adjustment parameter (for face thinning), the unprocessed face of the user tends to be puffier at night and therefore looks larger, and the correction parameter may appropriately increase the face size adjustment parameter.
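The two worked examples above (50 - 20 = 30 in the morning, 50 + 10 = 60 at night) can be restated compactly. The correction table below is an assumption used only to summarize this arithmetic.

BASE_EYE_SIZE_ADJUSTMENT = 50                        # value from the beauty engine
EYE_SIZE_CORRECTION = {"morning": -20, "night": 10}  # assumed correction table

def corrected_eye_size_adjustment(period):
    return BASE_EYE_SIZE_ADJUSTMENT + EYE_SIZE_CORRECTION.get(period, 0)

assert corrected_eye_size_adjustment("morning") == 30  # eyes enlarged only slightly
assert corrected_eye_size_adjustment("night") == 60    # eyes enlarged more strongly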
It is to be understood that the embodiments of the present disclosure do not limit the specific type of the image adjustment parameter, as long as the face of the user in the image can be adjusted. Likewise, the specific type of the environment information is not limited, as long as it can represent the environment in which the image 120 was obtained; the environment information may be time information, light information, weather information, and so on.
Fig. 2A to 2B schematically show a flowchart of an image processing method according to an embodiment of the present disclosure.
As shown in fig. 2A, the method includes operations S210 to S260.
In operation S210, an image is obtained.
In the embodiment of the present disclosure, for example, the image may be acquired by an image capturing device of the electronic device, and the electronic device may also acquire the image from other external devices. When the electronic device acquires an image from other external devices, the image may include, for example, environmental information of the other external devices when acquiring the image. The image may include, for example, an image corresponding to the user, such as an image of the user's face.
In operation S220, the image is analyzed to obtain the characteristics of the user in the image.
According to embodiments of the present disclosure, the characteristics of the user may include, for example, the age, gender, and race of the user. The image may be analyzed by the artificial intelligence engine to determine such information, and the engine may also determine information such as the face, facial features, and skin color in the image. More specifically, the user's face image can be intelligently recognized, and the user's characteristics can be identified by extracting feature points of the face image.
In operation S230, an image adjustment parameter is determined based on the characteristics of the user.
In the embodiment of the present disclosure, the image adjustment parameter can adjust, for example, an area in which the user is located in the image, and more specifically, the image adjustment parameter can be used to adjust, for example, a face image of the user. The image adjustment parameter may be one adjustment parameter or multiple adjustment parameters adjusted according to the area where the user is located.
According to the embodiment of the present disclosure, the image adjustment parameters include, for example, a face size adjustment parameter, a face luminance adjustment parameter, a face chromaticity adjustment parameter, a face blur degree adjustment parameter, an eye size adjustment parameter, an eye luminance adjustment parameter, and the like.
Specifically, for example, face thinning may be applied to the user in the image through the face size adjustment parameter, the brightness of the user's face may be adjusted through the face brightness adjustment parameter, the user's face may be whitened through the face chromaticity adjustment parameter, skin smoothing may be applied to the user's face through the face blurriness adjustment parameter, the user's eyes may be enlarged through the eye size adjustment parameter, the user's eyes may be brightened through the eye brightness adjustment parameter, and so on. That is, the image adjustment parameters adjust the display effect of the user's skin-tone area in the image.
In the embodiment of the present disclosure, the features of the user and the image adjustment parameters may be in a mapping relationship, and the features of different users may correspond to the parameter types or parameter values of different image adjustment parameters.
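Such a mapping could be held, for example, as a lookup keyed on the recognized characteristics. The keys, parameter names, and values below are assumptions chosen only to illustrate that both the parameter types and their values can vary with the user's characteristics.

# Assumed mapping from user characteristics to image adjustment parameters.
FEATURE_TO_PARAMETERS = {
    ("female", "younger"): {"face_chroma": 20, "face_blur": 15, "eye_size": 30},
    ("female", "older"):   {"face_chroma": 25, "face_blur": 25, "eye_size": 50},
    ("male", "younger"):   {"face_blur": 10, "eye_size": 20},
    ("male", "older"):     {"face_blur": 20, "eye_size": 40},
}

def adjustment_parameters(gender, age):
    group = "older" if age >= 40 else "younger"
    # Return a copy so later corrections do not modify the mapping itself.
    return dict(FEATURE_TO_PARAMETERS.get((gender, group), {}))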
In operation S240, a correction parameter is obtained.
According to an embodiment of the present disclosure, the correction parameter may be used to change an image adjustment parameter; a correction parameter corresponds to an image adjustment parameter and represents the variation amount of that image adjustment parameter.
According to the embodiment of the disclosure, obtaining the correction parameter includes: correction parameters are obtained based on environmental information at the time of obtaining the image. Wherein, the correction parameters corresponding to different environment information are different.
In the embodiment of the present disclosure, the environment information may be, for example, environment information when the image is acquired, and the environment information may include, for example, time information, light information, weather information, location information of multiple users in the image, and the like.
The correction parameters corresponding to different environmental information are different. Taking time information as an example, the physiological state of the user differs across time periods, so the corresponding correction parameters can differ across time periods, and the image adjustment parameters are dynamically changed using the different correction parameters.
More specifically, the fact that different environmental information corresponds to different correction parameters may include, but is not limited to, the following cases (a brief sketch follows this list):
First, when the time information differs, the correction parameter for a given image adjustment parameter differs. For example, taking the eye size adjustment parameter as the image adjustment parameter, the value of the correction parameter corresponding to the eye size adjustment parameter differs across time periods (for example, morning, afternoon, and evening).
Second, when the time information differs, the kinds of parameters to be changed among the image adjustment parameters differ. For example, in the morning the parameters to be changed may include the eye size adjustment parameter; in the afternoon they may include the eye size adjustment parameter, the face size adjustment parameter, and the like; and in the evening they may include the face size adjustment parameter, the face chromaticity adjustment parameter, and the like.
Third, when both the time information and the light information differ, the correction parameter for a given image adjustment parameter differs, or the kinds of parameters to be changed differ. For example, the value of the correction parameter corresponding to the eye size adjustment parameter differs between dark light in the morning and bright light in the afternoon. With dark light in the morning, the parameters to be changed may include the eye size adjustment parameter, the face size adjustment parameter, and the like; with bright light in the afternoon, they may include the face brightness adjustment parameter, the face size adjustment parameter, and the like.
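The three cases above could be captured by a lookup keyed on the time period and the lighting. Every key and value here is an assumption, chosen only to show that both the kinds of parameters and their variation amounts may differ with the environment information.

# Assumed correction table: (time period, lighting) -> corrections per adjustment item.
CORRECTIONS = {
    ("morning", "dark"):     {"eye_size": -10, "face_size": 5},
    ("morning", "bright"):   {"eye_size": -20},
    ("afternoon", "bright"): {"face_brightness": -10, "face_size": 10},
    ("evening", "dark"):     {"face_size": 15, "face_chroma": 10},
}

def correction_for(period, lighting):
    # Unknown combinations simply leave the image adjustment parameters unchanged.
    return CORRECTIONS.get((period, lighting), {})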
According to the embodiment of the present disclosure, the environmental information may be obtained, for example, through a sensing element of the electronic device, and the environmental information is used to characterize an environment when the electronic device obtains an image.
For example, when the electronic device acquires an image, light information of the current environment, such as intensity information of light, may be acquired through a sensing unit of the electronic device.
The time information can be obtained by obtaining the system time of the electronic device, and the weather information can be obtained by obtaining information in the weather application of the electronic device.
The position information of the multiple users in the image can be obtained by processing and identifying the acquired image through a processor of the electronic equipment.
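These sources could be combined into a single environment-information record, as in the sketch below. The functions read_light_sensor and read_weather are placeholders for device-specific APIs, and the field names are assumptions.

from datetime import datetime

def gather_environment_info(read_light_sensor, read_weather, face_boxes):
    hour = datetime.now().hour              # time information from the system time
    period = ("morning" if 4 <= hour < 12
              else "afternoon" if 12 <= hour < 20
              else "evening")
    return {
        "period": period,
        "light": read_light_sensor(),       # e.g. ambient light intensity
        "weather": read_weather(),          # e.g. "sunny", "cloudy", "haze"
        "face_positions": face_boxes,       # positions of multiple users in the image
    }

info = gather_environment_info(lambda: 320.0, lambda: "cloudy",
                               face_boxes=[(40, 60, 120, 160)])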
In operation S250, an image adjustment parameter is changed based on the correction parameter.
According to the embodiment of the present disclosure, after the correction parameter is obtained based on the environmental information, the image adjustment parameter may be changed by the correction parameter.
For example, when the image adjustment parameter is the eye size adjustment parameter, the corresponding correction parameter differs across time periods such as the morning, afternoon, and evening. In the morning, the user in the image is usually refreshed and the eyes are large and bright, so the eye size adjustment parameter can be appropriately reduced through the correction parameter. In the afternoon, the user is usually busy and somewhat puffy, so the eye size adjustment parameter can be appropriately increased through the correction parameter. In the evening, the user is usually tired and the eyes lack liveliness, so the eye size adjustment parameter can likewise be appropriately increased through the correction parameter.
Taking the light information as an example, when shooting against the light the user's eyes in the image are not disturbed by the light, so the eye size adjustment parameter is reduced through its corresponding correction parameter, that is, the degree of eye enlargement is reduced, which keeps the user and the environment in the image harmonious; under backlit conditions the face chromaticity adjustment parameter (whitening parameter) may also be reduced through the correction parameter to keep the image harmonious overall. Under dim light, the color noise on the user in the image is high, and the face chromaticity adjustment parameter, the face blurriness adjustment parameter, and the like can be changed through the correction parameters, for example increasing whitening, skin smoothing, blemish removal, and similar parameters for the user in the image.
Taking weather information as an example, when the temperature is low (usually on a cloudy day) and the image brightness is low, the face chromaticity adjustment parameter, such as the whitening parameter, can be reduced through the correction parameter. When the temperature is high (usually on a sunny day) and the image brightness is high, the face chromaticity adjustment parameter can be increased through the correction parameter. In hazy weather, the face chromaticity adjustment parameter, the face blurriness adjustment parameter, and the like may be appropriately increased through the correction parameters, and the eye size adjustment parameter may be appropriately decreased. Changing the image adjustment parameters according to the correction parameters corresponding to different weather information keeps the user and the environment in the image harmonious overall.
Taking the position information of multiple users in the image as an example, when the image contains multiple users, the positional relationship of the users in the image is identified, correction parameters corresponding to the face size adjustment parameters of the different users are obtained according to the distances between their faces, and the face size adjustment parameters of the different users are changed based on those correction parameters. For example, a user at the front or at the sides of the image usually has a larger apparent face, so the face size adjustment parameter can be increased through the correction parameter, for example by increasing the face thinning parameter; a user toward the back of the image usually has a smaller apparent face, so the face size adjustment parameter can be decreased, for example by decreasing the face thinning parameter.
In addition, a first user among the multiple users in the image can be identified; the first user may be, for example, the owner of the electronic device that acquired the image. According to an owner-priority principle, the correction parameter for the owner of the electronic device is obtained, and the owner's image adjustment parameters are changed preferentially based on that correction parameter.
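A per-user sketch of this idea follows. The face descriptors, thresholds, and correction values are assumptions, and the owner is simply handled first in line with the owner-priority principle.

def face_size_corrections(faces, owner_id=None):
    # faces: list of dicts such as {"id": 1, "near_front_or_edge": True}; assumed shape.
    ordered = sorted(faces, key=lambda f: f["id"] != owner_id)  # owner handled first
    corrections = {}
    for face in ordered:
        if face["near_front_or_edge"]:      # closer to the camera: face appears larger
            corrections[face["id"]] = 10    # strengthen face thinning
        else:                               # further back: face already appears small
            corrections[face["id"]] = -10   # weaken face thinning
    return corrections

corrections = face_size_corrections(
    [{"id": 1, "near_front_or_edge": True}, {"id": 2, "near_front_or_edge": False}],
    owner_id=2,
)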
In operation S260, image processing is performed on an area corresponding to the user in the image based on the changed image adjustment parameter.
In the embodiment of the present disclosure, the image adjustment parameter is changed by the correction parameter to obtain a changed image adjustment parameter, and the image may then be processed with the changed image adjustment parameter. For example, a partial region in the image may be processed, where the partial region may be the region corresponding to the user in the image, and more specifically the face image of the user in the image.
As shown in fig. 2B, operation S250 includes operations S251 to S252.
For ease of understanding, the following detailed description will be made by taking time information as an example.
In operation S251, if the time information satisfies the first range, a first adjustment item in the image adjustment parameters and an adjustment variation amount for the first adjustment item are determined, the first adjustment item being one or more.
According to the embodiment of the present disclosure, the first range may include a specific time period, in which the image adjustment parameter to be changed is, for example, a first adjustment item, and the correction parameter corresponding to the first adjustment item is, for example, an adjustment variation amount for the first adjustment item.
More specifically, the first range may be, for example, the period from four o'clock in the morning to twelve o'clock at noon, and the first adjustment items in the first range include, for example, the eye size adjustment parameter, the face size adjustment parameter, and the like.
In operation S252, if the time information satisfies the second range, a second adjustment item in the image adjustment parameters and an adjustment variation amount for the second adjustment item are determined, the second adjustment item being one or more.
According to the embodiment of the present disclosure, the second range may include a specific time period in which the image adjustment parameter that needs to be changed is, for example, a second adjustment item, and the correction parameter corresponding to the second adjustment item is, for example, an adjustment variation amount for the second adjustment item.
More specifically, the second range may be, for example, the period from twelve o'clock at noon to twenty o'clock (8 p.m.), and the second adjustment items in the second range include, for example, the eye size adjustment parameter, the face chromaticity adjustment parameter, and the like.
The first adjustment item is the same as the second adjustment item, and the correction parameters corresponding to different time information are different.
Fig. 3A to 3B schematically show diagrams of changing the same image adjustment parameter based on time information according to an embodiment of the present disclosure.
As shown in fig. 3A to 3B, taking both the first adjustment item and the second adjustment item to be the eye size adjustment parameter as an example, the following describes how the correction parameter corresponding to the eye size adjustment parameter differs with different time information.
As shown in fig. 3A, the left image is the acquired image, the middle image is the image processed by the unchanged image adjustment parameter, and the right image is the image processed by the changed image adjustment parameter after the image adjustment parameter is changed by the correction parameter.
In the first range of the time information, the eye size adjustment parameter may be reduced through its corresponding correction parameter, and the user's eye size in the image is then adjusted with the reduced parameter. For example, the eye size adjustment parameter is an increase of 50; the correction parameter reduces it by 20 on the basis of 50, giving a changed eye size adjustment parameter of 30, and the user's eyes are enlarged using the eye size adjustment parameter of 30. In the first range (morning) the user is usually refreshed and the eyes are already large, so there is no need to enlarge the eyes as much; reducing the eye size adjustment parameter through the correction parameter therefore makes the image processing result more natural.
As shown in fig. 3B, in the second range of the time information, the eye size adjustment parameter may be increased through its corresponding correction parameter, and the user's eye size in the image is then adjusted with the increased parameter. For example, the eye size adjustment parameter is an increase of 50; the correction parameter increases it by 20 on the basis of 50, giving a changed eye size adjustment parameter of 70, and the user's eyes are enlarged using the eye size adjustment parameter of 70. In the second range (afternoon) the user is usually tired and the eyes lack liveliness, so appropriately increasing the eye size adjustment parameter through the correction parameter makes the image processing result more natural.
Alternatively, the first adjustment item is different from the second adjustment item, and different time information respectively corresponds to adjustment variation amounts determined for the different adjustment items.
Fig. 4A to 4B schematically show diagrams of changing different image adjustment parameters based on time information according to an embodiment of the present disclosure.
As shown in fig. 4A to 4B, taking the first adjustment item to be the eye size adjustment parameter and the second adjustment items to be the eye size adjustment parameter and the face size adjustment parameter as an example, the following describes how different time information determines corresponding adjustment variation amounts for different adjustment items.
As shown in fig. 4A, the left image is the acquired image, the middle image is the image processed by the unchanged image adjustment parameter, and the right image is the image processed by the changed image adjustment parameter after the image adjustment parameter is changed by the correction parameter.
In the first range of the time information, the eye size adjustment parameter may be reduced through its corresponding correction parameter, and the user's eye size in the image is adjusted with the reduced parameter. For example, the eye size adjustment parameter is an increase of 50; the correction parameter reduces it by 20 on the basis of 50, giving a changed eye size adjustment parameter of 30, and the user's eyes are enlarged using the eye size adjustment parameter of 30.
As shown in fig. 4B, in the second range of the time information, the eye size adjustment parameter may be increased through its corresponding correction parameter, and the user's eye size in the image is adjusted with the increased parameter. For example, the eye size adjustment parameter is an increase of 50; the correction parameter increases it by 20 on the basis of 50, giving a changed eye size adjustment parameter of 70, and the user's eyes are enlarged using the eye size adjustment parameter of 70.
The face size adjustment parameter may likewise be increased through its corresponding correction parameter, and the user's face size in the image is adjusted with the increased parameter. For example, the face size adjustment parameter reduces the face by 30; the correction parameter increases this by 20 on the basis of 30, giving a changed face size adjustment parameter of 50, and the user's face is slimmed using the face size adjustment parameter of 50. In the second range (afternoon) the user is usually tired and the face tends to be puffy, so the correction parameter may appropriately increase the face size adjustment parameter, making the image processing result more natural.
It is to be understood that the above specific values are examples for facilitating understanding of the embodiments of the present disclosure, and the embodiments of the present disclosure do not limit the specific values of the image adjustment parameters and the correction parameters, and can be specifically set by those skilled in the art according to the practical application.
In the embodiment of the present disclosure, if the time information satisfies the third range, a third adjustment item in the image adjustment parameters and an adjustment change amount for the third adjustment item are determined, the third adjustment item being one or more.
According to the embodiment of the present disclosure, the third range may include a specific time period in which the image adjustment parameter that needs to be changed is, for example, a third adjustment item, and the correction parameter corresponding to the third adjustment item is, for example, an adjustment variation amount for the third adjustment item.
More specifically, the third range is, for example, the period from twenty o'clock (8 p.m.) to four o'clock in the morning, and the third adjustment items in the third range include, for example, the face size adjustment parameter, the face chromaticity adjustment parameter, and the like.
The third adjustment item is the same as the first adjustment item and the second adjustment item, and the correction parameters corresponding to different time information are different.
Alternatively, the third adjustment item is different from the first adjustment item and the second adjustment item, and the correction parameters corresponding to different time information are different.
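Taken together, the three ranges amount to a dispatch on the hour of day. The concrete items and variation amounts in this sketch are assumptions and only mirror the examples given above.

def adjustment_items_for_hour(hour):
    # Returns the adjustment items to change and assumed variation amounts for each.
    if 4 <= hour < 12:    # first range: four in the morning to twelve at noon
        return {"eye_size": -20, "face_size": -10}
    if 12 <= hour < 20:   # second range: twelve at noon to twenty o'clock
        return {"eye_size": 20, "face_chroma": 10}
    return {"face_size": 20, "face_chroma": 15}   # third range: twenty o'clock to four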
According to the image processing method and apparatus of the embodiments of the present disclosure, the characteristics of the user in the image are analyzed, the image adjustment parameters are determined according to the characteristics of the user, the correction parameters are obtained from the environment information acquired when the image is obtained, the image adjustment parameters are changed using the correction parameters, and the area corresponding to the user in the image is processed based on the changed image adjustment parameters, so that the processing result better matches the scene in which the image was obtained.
Fig. 2C schematically shows a flow chart of an image processing method according to another embodiment of the present disclosure.
As shown in fig. 2C, the method includes operations S210, and S270 to S290.
In operation S210, an image is obtained. This operation is the same as or similar to operation S210 described above with reference to fig. 2A, and is not described again here.
In operation S270, environment information for characterizing an environment when the electronic device acquires an image is obtained.
In the embodiment of the present disclosure, the environment information may include, for example, time information, light information, weather information, and location information of a plurality of users in an image, and the like. For example, when the electronic device acquires an image, light information of the current environment, such as intensity information of light, may be acquired through a sensing unit of the electronic device. For example, the time information may be obtained by obtaining a system time of the electronic device, and the weather information may be obtained by obtaining information in a weather application of the electronic device. For example, the position information of multiple users in the image can be obtained by processing and identifying the acquired image by a processor of the electronic device.
In operation S280, image adjustment parameters are determined based on the image and the environmental information by the smart analysis engine.
In the embodiment of the present disclosure, the intelligent analysis engine may include, for example, a beauty intelligence engine. The acquired image and the environment information are input into the intelligent analysis engine, which analyzes and processes them to obtain the image adjustment parameters corresponding to the image.
In operation S290, image processing is performed on an area corresponding to a user in an image based on the image adjustment parameter.
According to the embodiment of the disclosure, processing the image based on the image adjustment parameters may be, for example, performing beautification on the user's face in the image, so that the beautification of the image is more natural and better matches the natural state of the scene in which the image was obtained. That is, the present embodiment makes the beautification of the image more consistent with the user's natural state in the scene at the time of shooting, while still improving the user's appearance (presenting the most natural beautification in the image).
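A minimal sketch of this variant, in which the engine receives the image together with the environment information and directly returns parameters that already reflect the environment, is given below. The function names and values are assumptions.

def smart_analysis_engine(image, environment_info):
    # Operation S280 (assumed logic): combine image content with environment information.
    params = {"eye_size": 50, "face_size": 30}       # assumed values from image content
    if environment_info.get("period") == "morning":
        params["eye_size"] -= 20                     # eyes already lively, enlarge less
    elif environment_info.get("period") == "evening":
        params["eye_size"] += 10                     # eyes tired, enlarge more
        params["face_size"] += 20                    # face puffy, thin more
    return params

def process_image(image, environment_info, apply_to_user_region):
    params = smart_analysis_engine(image, environment_info)
    return apply_to_user_region(image, params)       # operation S290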
Fig. 5 schematically shows a block diagram of an electronic device according to an embodiment of the disclosure.
As shown in fig. 5, electronic device 500 includes an image capture device 510, a smart analysis engine 520, a processor 530, a display screen 540, and a sensing element 550.
The image capturing device 510 may be used to obtain an image, among other things. According to the embodiment of the present disclosure, the image capturing device 510 may perform, for example, operation S210 described above with reference to fig. 2A, which is not described herein again.
The intelligent analysis engine 520 may be used to analyze the image to obtain the characteristics of the user in the image, determine an image adjustment parameter based on the characteristics of the user, obtain a correction parameter, and change the image adjustment parameter based on the correction parameter. According to the embodiment of the present disclosure, the smart analysis engine 520 may perform, for example, the operations S220 to S250 described above with reference to fig. 2A, which are not described herein again.
Wherein obtaining the correction parameter comprises: correction parameters are obtained based on environmental information at the time of obtaining the image.
Wherein, the correction parameters corresponding to different environment information are different.
Wherein the environment information includes time information, and changing the image adjustment parameter based on the correction parameter includes:
and if the time information meets the first range, determining a first adjustment item in the image adjustment parameters and an adjustment variation amount aiming at the first adjustment item, wherein the first adjustment item is one or more.
And if the time information meets the second range, determining a second adjustment item in the image adjustment parameters and an adjustment variation amount aiming at the second adjustment item, wherein the second adjustment item is one or more.
Wherein the first adjustment item is the same as the second adjustment item and the correction parameters corresponding to different time information are different, or the first adjustment item is different from the second adjustment item and different time information respectively corresponds to adjustment variation amounts determined for the different adjustment items.
The processor 530 may be configured to perform image processing on an area corresponding to a user in an image based on the changed image adjustment parameter. According to the embodiment of the present disclosure, the processor 530 may perform, for example, operation S260 described above with reference to fig. 2A, which is not described herein again.
The display screen 540 may be used to display the processed image.
The sensing element 550 can be used to obtain environmental information that characterizes the environment in which the electronic device obtains the image.
Fig. 6 schematically shows a schematic view of an electronic device according to an embodiment of the disclosure.
As shown in fig. 6, the electronic device 600 is, for example, a smartphone, and includes a camera 610 and a display screen 620.
The camera 610 may be used to obtain an image, for example, the camera may include a front camera and a rear camera for capturing an image.
The display screen 620 may be used, for example, to display processed images, or may also be used to display unprocessed images, etc.
Fig. 7 schematically shows a block diagram of an image processing system according to an embodiment of the present disclosure.
As shown in fig. 7, the image processing system 700 includes a first obtaining module 710, an analyzing module 720, a determining module 730, a second obtaining module 740, a changing module 750, and a processing module 760.
The first obtaining module 710 may be configured to obtain an image. According to the embodiment of the present disclosure, the first obtaining module 710 may, for example, perform operation S210 described above with reference to fig. 2A, which is not described herein again.
The analysis module 720 may be used to analyze the image to obtain the characteristics of the user in the image. According to the embodiment of the present disclosure, the analysis module 720 may, for example, perform operation S220 described above with reference to fig. 2A, which is not described herein again.
The determination module 730 may be used to determine image adjustment parameters based on the characteristics of the user. According to the embodiment of the present disclosure, the determining module 730 may, for example, perform the operation S230 described above with reference to fig. 2A, which is not described herein again.
The second obtaining module 740 may be used to obtain the correction parameters. According to the embodiment of the present disclosure, the second obtaining module 740 may perform, for example, the operation S240 described above with reference to fig. 2A, which is not described herein again.
Wherein obtaining the correction parameter comprises: correction parameters are obtained based on environmental information at the time of obtaining the image.
Wherein, the correction parameters corresponding to different environment information are different.
Wherein the environment information includes time information, and changing the image adjustment parameter based on the correction parameter includes:
If the time information falls within a first range, determining a first adjustment item in the image adjustment parameters and an adjustment variation amount for the first adjustment item, wherein there may be one or more first adjustment items.
If the time information falls within a second range, determining a second adjustment item in the image adjustment parameters and an adjustment variation amount for the second adjustment item, wherein there may be one or more second adjustment items.
Wherein the first adjustment item is the same as the second adjustment item and the correction parameters corresponding to different time information are different; or the first adjustment item is different from the second adjustment item, and different time information corresponds to the adjustment variation amounts determined for the different adjustment items.
The environment information can be obtained through the sensing element of the electronic device, and the environment information is used for representing the environment when the electronic device obtains the image.
The changing module 750 may be used to change the image adjustment parameters based on the modification parameters. According to the embodiment of the present disclosure, the changing module 750 may, for example, perform the operation S250 described above with reference to fig. 2A, which is not described herein again.
The processing module 760 may be configured to perform image processing on an area of the image corresponding to the user based on the changed image adjustment parameter. According to the embodiment of the present disclosure, the processing module 760 may perform, for example, operation S260 described above with reference to fig. 2A, which is not described herein again.
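For illustration, the following Python sketch wires the modules of the image processing system 700 into one pipeline. The class and attribute names mirror the module names in the text but are otherwise assumptions; each argument is a callable supplied by the corresponding module, and the per-module logic is left abstract.

# Minimal sketch composing the modules of image processing system 700.
class ImageProcessingSystem:
    def __init__(self, obtaining, analyzing, determining, correcting, changing, processing):
        self.obtaining = obtaining      # first obtaining module 710
        self.analyzing = analyzing      # analysis module 720
        self.determining = determining  # determination module 730
        self.correcting = correcting    # second obtaining module 740
        self.changing = changing        # changing module 750
        self.processing = processing    # processing module 760

    def run(self):
        image = self.obtaining()                     # S210: obtain an image
        features = self.analyzing(image)             # S220: characteristics of the user
        params = self.determining(features)          # S230: image adjustment parameters
        correction = self.correcting(image)          # S240: correction parameter from environment info
        changed = self.changing(params, correction)  # S250: changed adjustment parameters
        return self.processing(image, changed)       # S260: process the user's region

The sketch only expresses the order of operations S210 to S260; as noted below, the modules themselves may be combined, split, or implemented in hardware, firmware, or software.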
Any number of modules, sub-modules, units, sub-units, or at least part of the functionality of any number thereof according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, sub-modules, units, sub-units according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware implementations. Alternatively, one or more of the modules, sub-modules, units, sub-units according to embodiments of the disclosure may be at least partially implemented as a computer program module, which when executed may perform the corresponding functions.
For example, any number of the first obtaining module 710, the analyzing module 720, the determining module 730, the second obtaining module 740, the changing module 750, and the processing module 760 may be combined in one module to be implemented, or any one of them may be split into multiple modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of the other modules and implemented in one module. According to an embodiment of the disclosure, at least one of the first obtaining module 710, the analyzing module 720, the determining module 730, the second obtaining module 740, the changing module 750, and the processing module 760 may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in hardware or firmware in any other reasonable manner of integrating or packaging a circuit, or in any one of three implementations of software, hardware, and firmware, or in any suitable combination of any of them. Alternatively, at least one of the first obtaining module 710, the analyzing module 720, the determining module 730, the second obtaining module 740, the changing module 750, and the processing module 760 may be at least partially implemented as a computer program module that, when executed, may perform a corresponding function.
FIG. 8 schematically shows a block diagram of a computer system for image processing according to an embodiment of the present disclosure. The computer system illustrated in FIG. 8 is only one example and should not impose any limitations on the scope of use or functionality of embodiments of the disclosure.
As shown in fig. 8, a computer system 800 for image processing includes a processor 801 and a computer-readable storage medium 802. The system 800 may perform a method according to an embodiment of the present disclosure.
In particular, the processor 801 may include, for example, a general purpose microprocessor, an instruction set processor and/or related chip set and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 801 may also include onboard memory for caching purposes. The processor 801 may be a single processing unit or a plurality of processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
Computer-readable storage medium 802 may be, for example, any medium that can contain, store, communicate, propagate, or transport the instructions. For example, a readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the readable storage medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
The computer-readable storage medium 802 may include a computer program 803, which computer program 803 may include code/computer-executable instructions that, when executed by the processor 801, cause the processor 801 to perform a method according to an embodiment of the present disclosure, or any variant thereof.
The computer program 803 may be configured with, for example, computer program code comprising computer program modules. For example, in an example embodiment, code in the computer program 803 may include one or more program modules, including, for example, module 803A, module 803B, and so on. It should be noted that the division and number of the modules are not fixed; those skilled in the art may use suitable program modules or combinations of program modules according to the actual situation, so that when these program modules are executed by the processor 801, the processor 801 may perform the method according to the embodiment of the present disclosure or any variation thereof.
According to an embodiment of the present disclosure, at least one of the first obtaining module 710, the analyzing module 720, the determining module 730, the second obtaining module 740, the changing module 750, and the processing module 760 may be implemented as a computer program module as described with reference to fig. 8, which, when executed by the processor 801, may implement the corresponding operations described above.
The present disclosure also provides a computer-readable medium, which may be embodied in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer readable medium carries one or more programs which, when executed, implement:
an image processing method comprising: the method comprises the steps of obtaining an image, analyzing the image to obtain the characteristics of a user in the image, determining image adjustment parameters based on the characteristics of the user, obtaining correction parameters, changing the image adjustment parameters based on the correction parameters, and carrying out image processing on an area corresponding to the user in the image based on the changed image adjustment parameters.
Optionally, the obtaining the correction parameter includes: correction parameters are obtained based on environmental information at the time of obtaining the image.
Optionally, in the method, the correction parameters corresponding to different pieces of environment information are different.
Optionally, the method further includes: the environmental information is obtained through a sensing element of the electronic device and is used for representing the environment when the electronic device obtains the image.
Optionally, the environment information includes time information, and changing the image adjustment parameter based on the correction parameter includes: if the time information falls within a first range, determining a first adjustment item in the image adjustment parameters and an adjustment variation amount for the first adjustment item, wherein there may be one or more first adjustment items; if the time information falls within a second range, determining a second adjustment item in the image adjustment parameters and an adjustment variation amount for the second adjustment item, wherein there may be one or more second adjustment items; wherein the first adjustment item is the same as the second adjustment item and the correction parameters corresponding to different time information are different, or the first adjustment item is different from the second adjustment item and different time information corresponds to the adjustment variation amounts determined for the different adjustment items.
According to embodiments of the present disclosure, a computer readable medium may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, optical fiber cable, radio frequency signals, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that the features recited in the various embodiments and/or claims of the present disclosure can be combined in various ways, even if such combinations are not expressly recited in the present disclosure. In particular, the features recited in the various embodiments and/or claims of the present disclosure may be combined in various ways without departing from the spirit or teaching of the present disclosure. All such combinations fall within the scope of the present disclosure.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.

Claims (10)

1. An image processing method comprising:
obtaining an image;
analyzing the image to obtain characteristics of the user in the image;
determining an image adjustment parameter based on the characteristics of the user;
obtaining a correction parameter for changing the image adjustment parameter based on time information when the image is obtained, wherein the correction parameter corresponding to the time information is related to the physiological state of the user;
changing the image adjustment parameter based on the correction parameter;
and performing image processing on the area corresponding to the user in the image based on the changed image adjustment parameter.
2. The method of claim 1, wherein the obtaining a correction parameter comprises:
and obtaining a correction parameter based on the environmental information when the image is obtained.
3. The method of claim 2, wherein the correction parameters are different for different environmental information.
4. The method of claim 1, further comprising:
and obtaining environmental information through a sensing element of the electronic equipment, wherein the environmental information is used for representing the environment when the electronic equipment obtains the image.
5. The method of claim 4, wherein:
the environment information includes: time information;
the changing the image adjustment parameter based on the correction parameter includes:
if the time information falls within a first range, determining a first adjustment item in the image adjustment parameters and an adjustment variation amount for the first adjustment item, wherein there may be one or more first adjustment items;
if the time information falls within a second range, determining a second adjustment item in the image adjustment parameters and an adjustment variation amount for the second adjustment item, wherein there may be one or more second adjustment items;
the first adjustment item and the second adjustment item are the same, and the correction parameters corresponding to different time information are different; or
the first adjustment item is different from the second adjustment item, and different time information corresponds to the adjustment variation amounts determined for the different adjustment items.
6. An electronic device, comprising:
the image acquisition device is used for acquiring an image;
the intelligent analysis engine is used for analyzing the image to obtain the characteristics of the user in the image; determining an image adjustment parameter based on the characteristics of the user; obtaining a correction parameter for changing the image adjustment parameter based on time information when the image is obtained, wherein the correction parameter corresponding to the time information is related to the physiological state of the user; and changing the image adjustment parameter based on the correction parameter;
a processor for performing image processing on an area corresponding to the user in the image based on the changed image adjustment parameter;
and the display screen is used for displaying the processed image.
7. The electronic device of claim 6, wherein the obtaining correction parameters comprises:
and obtaining a correction parameter based on the environmental information when the image is obtained.
8. The electronic device of claim 7, wherein the correction parameters are different for different environmental information.
9. The electronic device of claim 6, further comprising:
and the sensing element is used for obtaining environment information, and the environment information is used for representing the environment when the electronic equipment obtains the image.
10. The electronic device of claim 9, wherein:
the environment information includes: time information;
the changing the image adjustment parameter based on the correction parameter includes:
if the time information falls within a first range, determining a first adjustment item in the image adjustment parameters and an adjustment variation amount for the first adjustment item, wherein there may be one or more first adjustment items;
if the time information falls within a second range, determining a second adjustment item in the image adjustment parameters and an adjustment variation amount for the second adjustment item, wherein there may be one or more second adjustment items;
the first adjustment item and the second adjustment item are the same, and the correction parameters corresponding to different time information are different; or
the first adjustment item is different from the second adjustment item, and different time information corresponds to the adjustment variation amounts determined for the different adjustment items.
CN201810562973.2A 2018-06-01 2018-06-01 Image processing method and electronic device Active CN108765352B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810562973.2A CN108765352B (en) 2018-06-01 2018-06-01 Image processing method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810562973.2A CN108765352B (en) 2018-06-01 2018-06-01 Image processing method and electronic device

Publications (2)

Publication Number Publication Date
CN108765352A CN108765352A (en) 2018-11-06
CN108765352B true CN108765352B (en) 2021-07-16

Family

ID=64002540

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810562973.2A Active CN108765352B (en) 2018-06-01 2018-06-01 Image processing method and electronic device

Country Status (1)

Country Link
CN (1) CN108765352B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110097622B (en) * 2019-04-23 2022-02-25 北京字节跳动网络技术有限公司 Method and device for rendering image, electronic equipment and computer readable storage medium
CN112446832A (en) * 2019-08-31 2021-03-05 华为技术有限公司 Image processing method and electronic equipment
CN110992500B (en) * 2019-10-12 2023-04-25 平安科技(深圳)有限公司 Attendance checking method and device, storage medium and server
CN112070707B (en) * 2020-11-12 2021-02-23 国科天成科技股份有限公司 True color image intensifier based on micro-lens array

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104270571A (en) * 2014-10-20 2015-01-07 联想(北京)有限公司 Image processing method and electronic equipment
CN107330904A (en) * 2017-06-30 2017-11-07 北京金山安全软件有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN107346544A (en) * 2017-06-30 2017-11-14 联想(北京)有限公司 A kind of image processing method and electronic equipment
CN107798652A (en) * 2017-10-31 2018-03-13 广东欧珀移动通信有限公司 Image processing method, device, readable storage medium storing program for executing and electronic equipment
CN107862653A (en) * 2017-11-30 2018-03-30 广东欧珀移动通信有限公司 Method for displaying image, device, storage medium and electronic equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140198177A1 (en) * 2013-01-15 2014-07-17 International Business Machines Corporation Realtime photo retouching of live video
CN107845088B (en) * 2017-10-25 2020-02-07 苏州比格威医疗科技有限公司 Method for acquiring physiological parameters in retina OCT image based on dynamic constraint graph search

Also Published As

Publication number Publication date
CN108765352A (en) 2018-11-06

Similar Documents

Publication Publication Date Title
CN108765352B (en) Image processing method and electronic device
JP7266672B2 (en) Image processing method, image processing apparatus, and device
CN113129312B (en) Image processing method, device and equipment
US10074165B2 (en) Image composition device, image composition method, and recording medium
KR102149187B1 (en) Electronic device and control method of the same
WO2018019206A1 (en) Systems and methods for changing operation modes of the optical filter of an imaging device
KR20170125604A (en) Electronic apparatus and controlling method thereof
US20170154437A1 (en) Image processing apparatus for performing smoothing on human face area
JP2020537441A (en) Photography method and electronic equipment
JP2022512125A (en) Methods and Electronic Devices for Taking Long Exposure Images
CN112822413B (en) Shooting preview method, shooting preview device, terminal and computer readable storage medium
CN114340102A (en) Lamp strip control method and device, display equipment and system and storage medium
US20190205689A1 (en) Method and device for processing image, electronic device and medium
CN113891008B (en) Exposure intensity adjusting method and related equipment
WO2022011621A1 (en) Face illumination image generation apparatus and method
CN114119413A (en) Image processing method and device, readable medium and mobile terminal
CN112950641A (en) Image processing method and device, computer readable storage medium and electronic device
CN117750190B (en) Image processing method and electronic equipment
RU2791810C2 (en) Method, equipment and device for image processing
CN111131716B (en) Image processing method and electronic device
US11190705B2 (en) Intelligent array of lights for illumination
CN112449103B (en) Image processing method and related equipment
RU2794062C2 (en) Image processing device and method and equipment
CN112073617B (en) Light supplement method, light supplement device, computer readable storage medium and electronic equipment
CN116664630B (en) Image processing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant