CN109379534B - Method, device, terminal and storage medium for processing image - Google Patents

Method, device, terminal and storage medium for processing image

Info

Publication number
CN109379534B
Authority
CN
China
Prior art keywords
image
mobile terminal
light intensity
exposure compensation
compensation value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811347813.2A
Other languages
Chinese (zh)
Other versions
CN109379534A (en)
Inventor
范润启
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Oppo Chongqing Intelligent Technology Co Ltd
Original Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo Chongqing Intelligent Technology Co Ltd filed Critical Oppo Chongqing Intelligent Technology Co Ltd
Priority to CN201811347813.2A priority Critical patent/CN109379534B/en
Publication of CN109379534A publication Critical patent/CN109379534A/en
Application granted granted Critical
Publication of CN109379534B publication Critical patent/CN109379534B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 - Camera processing pipelines; Components thereof
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 - Detection; Localisation; Normalisation
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 - Circuitry for compensating brightness variation in the scene
    • H04N23/71 - Circuitry for evaluating the brightness variation
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 - Circuitry for compensating brightness variation in the scene
    • H04N23/73 - Circuitry for compensating brightness variation in the scene by influencing the exposure time

Abstract

The embodiments of the present application disclose a method, an apparatus, a terminal and a storage medium for processing an image, belonging to the field of computer technology. In the method, when the target camera collects an image, the mobile terminal obtains a first light intensity through a first photosensitive component disposed in the same target panel as the target camera; when the first light intensity is greater than a light intensity threshold, it determines that the target camera is in a backlight state, performs image recognition on the collected image to obtain a first image recognition result, obtains the exposure compensation value corresponding to the first image recognition result, and re-renders the image according to the exposure compensation value. In this way, a backlight scene can be confirmed automatically and adaptive exposure compensation can be applied to the image in the software processing flow, improving the terminal's ability to automatically optimize backlit images.

Description

Method, device, terminal and storage medium for processing image
Technical Field
The embodiment of the application relates to the technical field of computers, in particular to a method, a device, a terminal and a storage medium for processing images.
Background
With the development of miniature camera technology, cameras can be physically embedded in mobile terminals, providing mobile terminals with image-capturing functions.
In some applications, the mobile terminal captures images or records video through a camera. When the user judges that the camera is in a backlight state, the user can manually turn on a flash in the mobile terminal to supplement light on the photographed object.
Disclosure of Invention
The embodiment of the application provides a method, a device, a terminal and a storage medium for processing images. The technical scheme is as follows:
according to an aspect of the present application, there is provided a method for processing an image, the method being applied in a mobile terminal, a target panel of the mobile terminal being provided with a target camera and a first photosensitive component, the target panel being a front panel or a rear panel, the method comprising:
when the mobile terminal acquires an image through the target camera, the mobile terminal acquires first light intensity through the first photosensitive assembly, and the first light intensity is used for indicating the intensity of light irradiating the target camera;
when the first light intensity is larger than a light intensity threshold value, determining that the target camera is in a backlight state;
the mobile terminal carries out image recognition on the image to obtain a first image recognition result;
and the mobile terminal processes the image according to the exposure compensation value corresponding to the first image identification result.
According to another aspect of the present application, there is provided an apparatus for processing an image, the apparatus being applied in a mobile terminal, a target panel of the mobile terminal being provided with a target camera and a first photosensitive component, the target panel being a front panel or a rear panel, the apparatus comprising:
the light intensity acquisition module is used for acquiring first light intensity through the first photosensitive assembly when the mobile terminal acquires an image through the target camera, and the first light intensity is used for indicating the intensity of light irradiating the target camera;
the backlight determination module is used for determining that the target camera is in a backlight state when the first light intensity is greater than a light intensity threshold value;
the image recognition module is used for carrying out image recognition on the image to obtain a first image recognition result;
and the image processing module is used for processing the image according to the exposure compensation value corresponding to the first image identification result.
According to another aspect of the present application, there is provided a terminal comprising a processor and a memory, the memory having stored therein at least one instruction, the instruction being loaded and executed by the processor to implement the method of processing an image as provided in the implementations of the present application.
According to another aspect of the present application, there is provided a computer readable storage medium having stored therein at least one instruction, which is loaded and executed by a processor to implement a method of processing an image as provided in the implementations of the present application.
The beneficial effects brought by the technical scheme provided by the embodiment of the application can include:
the method for processing the image, provided by this embodiment, can enable the mobile terminal to obtain the first light intensity through the first photosensitive component and the target camera arranged in the same target panel when the target camera collects the image, and when the first light intensity is greater than the light intensity threshold, determine that the target camera is in a backlight state, then perform image recognition on the collected image, obtain the first image recognition result, obtain the exposure compensation value corresponding to the first image recognition result, and re-render the image according to the exposure compensation value. Therefore, the backlight scene can be automatically confirmed, the image is subjected to adaptive exposure compensation through the software processing flow, and the effect of automatically optimizing the backlight image by the terminal is improved.
Drawings
In order to describe the technical solutions in the embodiments of the present application more clearly, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic structural diagram of a mobile terminal according to an exemplary embodiment of the present application;
FIG. 2 is a flow chart of a method of processing an image provided by an exemplary embodiment of the present application;
FIG. 3 is a flow chart of a method of processing an image provided by another exemplary embodiment of the present application;
FIG. 4 is a flow chart of a method for processing images provided based on the embodiments of FIGS. 2 and 3;
FIG. 5 is a flow chart of a method for processing an image provided based on the embodiment shown in FIG. 4;
FIG. 6 is a flow chart of another method of processing images provided herein;
fig. 7 is a block diagram of an apparatus for processing an image according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
In the description of the present application, it is to be understood that the terms "first", "second" and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. It should also be noted that, unless otherwise explicitly specified or limited, the terms "connected" and "coupled" are to be interpreted broadly; for example, a connection may be fixed, detachable or integral; it may be mechanical or electrical; and it may be direct or indirect through an intermediate element. The specific meanings of the above terms in the present application can be understood by those of ordinary skill in the art on a case-by-case basis. Further, in the description of the present application, "a plurality" means two or more unless otherwise specified. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
In the description of the present application, it is to be understood that the terms "disposed", "arranged" and "set" do not refer to actions performed by a user, an operator or a robot. For the purposes of this description, "disposed" indicates a mounting or positional relationship; for example, "A is disposed in B" means that A is located in B. "Arrangement" likewise indicates the manner of installation or the positional relationship; this embodiment uses these terms to describe the positional relationships between the components in the mobile terminal. The arrangement of A refers to the way A is installed in the mobile terminal or in a certain component, not to an action of placing A into the mobile terminal or that component.
In order to make the solution shown in the embodiments of the present application easy to understand, several terms appearing in the embodiments of the present application will be described below.
A target panel: refers to the front panel or the rear panel of the mobile terminal. In one possible implementation, the front panel includes a screen, and the rear panel is the panel on the side opposite to the front panel.
Optionally, when the target panel is the front panel, the target camera is a front camera, the first photosensitive component is a front photosensitive component, and the panel opposite to the target panel is the rear panel. In this scenario, one arrangement is that the front camera and the front photosensitive component are respectively disposed in two through holes of the front panel. Another arrangement is that the front camera and the front photosensitive component are integrated in a single through hole of the front panel. When the front camera and the front photosensitive component are disposed in a single through hole of the front panel, the mobile terminal may have only one through hole in the front panel, and at least one of a fingerprint sensor, a thermal sensor and a pressure sensor may further be disposed in this through hole.
Optionally, when the target panel is the rear panel, the target camera is a rear camera, the first photosensitive component is a rear photosensitive component, and the panel opposite to the target panel is the front panel. In this scenario, one arrangement is that the rear camera and the rear photosensitive component are respectively disposed in two through holes of the rear panel. Optionally, when the rear camera includes at least two cameras, the target camera is one of the rear cameras.
The first light intensity: used to indicate the intensity of the light illuminating the target camera. The first light intensity may be expressed in candela (cd). Alternatively, the first light intensity may be expressed in units such as candlelight or candlepower, which is not limited in this embodiment.
The second light intensity: used to indicate the intensity of the light illuminating the panel opposite to the target panel. The second light intensity may likewise be expressed in candela (cd), or alternatively in units such as candlelight or candlepower, which is not limited in this embodiment.
Light intensity threshold: a threshold pre-stored in the mobile terminal or in a server. In one possible approach, the light intensity threshold is a fixed constant. When the mobile terminal needs to use the light intensity threshold, it reads the threshold from a local configuration file or from a server in the cloud.
In another possible approach, the light intensity threshold is a value that varies with environmental information. In one possible approach, the environmental information is at least one of time information, geographical location information, and weather information. The mobile terminal or the server acquires the current environment information and determines the corresponding light intensity threshold value according to the current environment information. In a determination method of the light intensity threshold, the mobile terminal or the server can determine the light intensity threshold through a light intensity threshold determination model, wherein the light intensity threshold determination model is a machine learning model and is trained in advance through environmental information marked with the light intensity threshold. After the mobile terminal or the server inputs the environment information into the light intensity threshold value determination model, the light intensity threshold value determination model outputs a light intensity threshold value corresponding to the environment information. The mobile terminal will acquire the light intensity threshold.
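As an illustration of the two approaches above, the following sketch shows how a terminal might obtain the light intensity threshold either as a fixed constant read from a local configuration file or as the output of a pre-trained model fed with environment information. The function names, the configuration key and the feature encoding are hypothetical and are not specified by this application.

```python
# Illustrative sketch only; file name, key and model interface are assumptions.
import json

def fixed_light_intensity_threshold(config_path="light_threshold.json"):
    # Fixed-constant mode: read the pre-stored threshold from a local configuration file.
    with open(config_path) as f:
        return json.load(f)["light_intensity_threshold"]

def adaptive_light_intensity_threshold(model, hour_of_day, latitude, longitude, weather_code):
    # Environment-dependent mode: a model trained on environment information labelled
    # with light intensity thresholds maps time, location and weather to a threshold.
    # A scikit-learn style predict() interface is assumed here.
    features = [[hour_of_day, latitude, longitude, weather_code]]
    return float(model.predict(features)[0])
```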
A mobile terminal: the mobile terminal mentioned in the present application refers to an electronic terminal with an image-capturing function that can be carried by a user. As a possible implementation, the mobile terminal may be a device such as a mobile phone, a tablet computer, smart glasses, a digital camera, an MP4 player or an MP5 player. It should be noted that this list of mobile terminals is illustrative and does not limit the specific functions of the mobile terminals covered by the present application.
In one configuration of the mobile terminal provided by the present application, the mobile terminal comprises a housing, a main board and a photosensitive component. In this configuration, the mobile terminal collects the light intensity around the mobile terminal through the photosensitive component. In an alternative implementation, the photosensitive component includes a first photosensitive component and a second photosensitive component. For an introduction to the structure of the mobile terminal, please refer to fig. 1.
Please refer to fig. 1, which is a schematic structural diagram of a mobile terminal according to an exemplary embodiment of the present application. In fig. 1, the mobile terminal 100 includes a housing, a main board, a photosensitive member, and a front panel.
In the mobile terminal 100, the housing and the front panel enclose a cavity for housing the electronic components of the mobile terminal. In one implementation, the main board is located in the cavity enclosed by the housing and the front panel, and the electronic components are soldered, glued or plugged onto the main board. These electronic components include a processor 121 and a memory 122; the memory 122 stores at least one instruction, which is loaded and executed by the processor 121 to implement the method for processing an image provided in the method embodiments of the present application.
Processor 121 may include one or more processing cores. The processor 121 connects various parts of the mobile terminal 100 using various interfaces and lines, and performs the various functions of the mobile terminal 100 and processes data by running or executing instructions, programs, code sets or instruction sets stored in the memory 122 and calling data stored in the memory 122. Optionally, the processor 121 may be implemented in at least one hardware form of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA) and Programmable Logic Array (PLA). The processor 121 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs and the like; the GPU is responsible for rendering and drawing the content to be displayed on the display screen; the modem handles wireless communication. It is understood that the modem may also not be integrated into the processor 121 and may instead be implemented by a separate chip.
The memory 122 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). Optionally, the memory 122 includes a non-transitory computer-readable medium. The memory 122 may be used to store instructions, programs, code sets or instruction sets. The memory 122 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function or an image playing function), instructions for implementing the various method embodiments described below, and the like; the data storage area may store the data referred to in the following method embodiments.
Optionally, in a possible implementation manner, a battery is disposed in a cavity enclosed by the housing and the front panel. The battery is electrically connected with the mainboard and provides electric energy for the work of the mainboard.
Optionally, in another possible implementation, when the mobile terminal has a communication function, a side frame of the housing 110 includes a card slot. The card slot is used for accommodating a SIM (Subscriber Identity Module) card and/or a memory card. The side frame of the housing can be at least one of a top side frame, a bottom side frame, a left side frame or a right side frame.
Optionally, in one possible implementation, the cavity enclosed by the housing and the front panel further includes a sub-board in addition to the main board and the battery. As one possible arrangement, the electronic components are arranged in the cavity enclosed by the housing and the front panel, and the battery is disposed in the space between the main board and the sub-board. In some mobile terminals and technical documents, the main board may also be referred to as a motherboard or mother board, and the sub-board may also be referred to as an auxiliary board or daughter board. The arrangement of the main board, the battery and the sub-board in this implementation may be referred to as a three-stage structure.
In a mobile terminal with this three-stage structure, the camera module may be integrated on either the main board or the sub-board, or camera modules may be integrated on both circuit boards. Optionally, the camera module is connected to the main board and/or the sub-board through a flat cable. The camera module comprises at least one camera; for example, the camera module may comprise a rear camera embedded in the rear panel.
In one possible implementation, the camera module on the main board comprises a front camera and a rear camera, and the camera module on the sub-board is used for collecting the fingerprint image pressed by the user on the front panel of the mobile terminal and performing operations related to optical fingerprint recognition.
Optionally, in another possible implementation, the housing and the front panel enclose a cavity, and the cavity includes two components, namely a main board and a battery. In this implementation, the electronic components and modules of the mobile terminal are integrated on the motherboard.
It should be noted that the front panel is a front panel of the mobile terminal, and the front panel includes a screen assembly therein. Optionally, the front panel may further comprise a fingerprint acquisition module.
In the mobile terminal shown in the present application, the rear panel of the housing includes a through hole. The through hole is used to transmit ambient light to the photosensitive component. Because the rear panel also carries the rear camera module, a photosensitive component that collects ambient light through this through hole can determine the intensity of the ambient light around the rear camera relatively accurately. It should be noted that the through hole is a spatial concept: its boundary on the rear panel is a closed figure, and the through hole refers to the column-shaped space enclosed by that boundary in the rear panel.
In this embodiment, in a possible implementation manner, the photosensitive component is located in the through hole. In another possible implementation manner, the photosensitive assembly is located in a cavity defined by the housing and the motherboard. In another possible implementation manner, part of the structure of the photosensitive assembly is located in the through hole, and the other part of the structure is located in a cavity defined by the shell and the motherboard.
Regarding the arrangement of the photosensitive component, when the photosensitive component is located in the cavity enclosed by the housing and the main board, the embodiments of the present application provide the following arrangements.
In one arrangement of the photosensitive component, the photosensitive component is located in a facing area, which is the area in the cavity onto which the through hole is orthographically projected. The facing area may be an area on the main board or an area on the sub-board.
In another arrangement of the photosensitive component, the mobile terminal comprises a light pipe, which is a crystal capable of transmitting light. The light pipe is arranged in a through hole in the rear panel of the housing, one end of the light pipe is connected to the photosensitive component, and the photosensitive component is located in the cavity enclosed by the housing and the main board. The mobile terminal conducts ambient light to the photosensitive component through the light pipe. It should be noted that the photosensitive component includes at least a light sensor. In one possible implementation, the light sensor is also referred to as a brightness sensor.
As an implementation manner, the light sensor includes a light projector, a light receiver, and a receiving sensor, the light projector first receives light transmitted by the light pipe, focuses the light through a lens in the light projector, and transmits the light to the lens of the light receiver, and the light receiver transmits the light passing through the lens of the light receiver to the receiving sensor, so that the light is converted into corresponding electrical signals on the receiving sensor. It should be noted that the light beams with different light intensities correspond to the electrical signals with different signal intensities. Therefore, the light sensor can detect the light intensity of the ambient light.
In one possible implementation, a flash is included in the rear panel of the housing. The flash lamp is used for supplementing light when the rear camera module collects images, or providing illumination for the mobile terminal.
Please refer to fig. 2, which is a flowchart illustrating a method for processing an image according to an exemplary embodiment of the present application. The method of processing an image may be applied to the mobile terminal shown in fig. 1. In fig. 2, a method of processing an image includes:
step 210, when the mobile terminal collects an image through the target camera, the mobile terminal obtains a first light intensity through the first photosensitive assembly, and the first light intensity is used for indicating the intensity of light irradiating the target camera.
In this embodiment of the present application, when the mobile terminal collects an image through the target camera, it performs the operation of obtaining the first light intensity through the first photosensitive component provided by this embodiment.
In an image acquisition scene, when a mobile terminal receives an instruction for starting a target camera, the mobile terminal acquires an image through the target camera and displays the acquired image in a user interface of a photographing application in real time. The image displayed in the user interface of the photographing application in real time can be a preview image, so that a user can conveniently determine the time for acquiring the image according to the preview image. In the scene, the mobile terminal confirms that the mobile terminal is in the scene of acquiring the image through the target camera.
In another image acquisition scene, after the photographing application or another application that invokes the target camera has been started and the target camera is continuously acquiring and storing images, that is, when the target camera is in a video recording state, the mobile terminal confirms that it is in a scene of acquiring images through the target camera.
The mobile terminal obtains the first light intensity through the first photosensitive component. Because the first photosensitive component and the target camera are disposed in the same target panel, the first light intensity can reflect the intensity of the light illuminating the target camera. In one possible arrangement of the first photosensitive component and the target camera in the target panel, the first photosensitive component is disposed within a circle centred on the target camera with a preset distance as its radius, so that the first light intensity detected by the first photosensitive component can reflect the intensity of the light illuminating the target camera.
And step 220, when the first light intensity is greater than the light intensity threshold value, determining that the target camera is in a backlight state.
In this embodiment, when the first light intensity is greater than the light intensity threshold, the mobile terminal determines that the target camera is in a backlight state.
In a possible mode, the mobile terminal determines that the target camera is in a backlight state when the first light intensity is greater than the light intensity threshold value through the processor.
In a possible mode, the mobile terminal determines that the target camera is in a backlight state through the special chip when the first light intensity is greater than the light intensity threshold. The dedicated chip may be a chip integrated with the first photosensitive component.
In step 230, the mobile terminal performs image recognition on the image to obtain a first image recognition result.
In this embodiment of the present application, the mobile terminal recognizes the acquired image. In one possible scenario, the mobile terminal can perform image recognition on the acquired image and determine whether a preset type of scene exists in the image. The preset scene types may be a specified number of types, which can be pre-stored in a configuration file of the terminal or provided by a server in the cloud.
The first image recognition result acquired by the mobile terminal can be used for indicating whether a preset type of scene exists in the image.
And step 240, the mobile terminal processes the image according to the exposure compensation value corresponding to the first image recognition result.
In this embodiment of the present application, the mobile terminal may store the exposure compensation value corresponding to each first image recognition result. When the mobile terminal determines the first image recognition result, it can find the corresponding exposure compensation value according to this mapping relationship and process the image it has collected. In one possible implementation, the first image recognition results and the corresponding exposure compensation values may be as provided in Table 1.

Table 1

    First image recognition result    Human face    Cat       Flower
    Exposure compensation value       1.5 EV        0.5 EV    2 EV

According to Table 1, the mobile terminal can obtain the corresponding exposure compensation value from the first image recognition result and process the image it has collected. It should be noted that the mobile terminal renders the image according to the exposure compensation value to improve the image quality of the image.
In one possible approach, the image being processed is a frame preview image.
In another possible approach, the image being processed is a captured image.
In another possible approach, the processed image is a frame image in a recorded video.
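Putting steps 210 to 240 together, the following is a minimal sketch of the single-sensor flow shown in fig. 2, assuming hypothetical recognizer and renderer objects and using the mapping of Table 1; it is an illustration, not the implementation of this application.

```python
# Hypothetical sketch of the Fig. 2 flow; recognizer and renderer objects are assumed.
EXPOSURE_COMPENSATION_EV = {"face": 1.5, "cat": 0.5, "flower": 2.0}  # per Table 1

def process_frame(image, first_light_intensity, light_intensity_threshold,
                  recognizer, renderer):
    # Step 220: the target camera is considered backlit when the light striking it
    # exceeds the light intensity threshold.
    if first_light_intensity <= light_intensity_threshold:
        return image  # not backlit; leave the frame unchanged
    # Step 230: recognize the preset scene type in the collected image.
    first_result = recognizer.classify(image)      # e.g. "face", "cat", "flower"
    ev = EXPOSURE_COMPENSATION_EV.get(first_result)
    if ev is None:
        return image
    # Step 240: re-render the frame with the corresponding exposure compensation value.
    return renderer.apply_exposure_compensation(image, ev)
```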
In summary, in the method for processing an image provided by this embodiment, when the target camera collects an image, the mobile terminal obtains a first light intensity through a first photosensitive component disposed in the same target panel as the target camera; when the first light intensity is greater than the light intensity threshold, it determines that the target camera is in a backlight state, performs image recognition on the collected image to obtain a first image recognition result, obtains the exposure compensation value corresponding to the first image recognition result, and re-renders the image according to the exposure compensation value. In this way, a backlight scene can be confirmed automatically and adaptive exposure compensation can be applied to the image in the software processing flow, improving the terminal's ability to automatically optimize backlit images.
Based on the solution disclosed in the previous embodiment, the terminal can also determine whether the mobile terminal is currently in a backlight state in another way, please refer to the following embodiments for details.
Please refer to fig. 3, which is a flowchart illustrating a method for processing an image according to another exemplary embodiment of the present application. The method for processing the image can be applied to the terminal shown in fig. 1. In fig. 3, the method of processing an image includes:
step 310, when the mobile terminal collects an image through the target camera, the mobile terminal obtains a first light intensity through the first photosensitive assembly.
The execution procedure of step 310 is the same as that of step 210, and is not described herein again.
And 320, the mobile terminal acquires a second light intensity through the second photosensitive assembly.
A mobile terminal that performs this operation is required to be provided with a second photosensitive component in the panel opposite to the target panel. The second photosensitive component also collects light intensity; this light intensity is the second light intensity and indicates the intensity of the light illuminating the panel opposite to the target panel.
And 330, when the difference value of the first light intensity and the second light intensity is larger than the target threshold value, determining that the target camera is in a backlight state.
In this embodiment of the present application, when the first light intensity exceeds the second light intensity by more than the target threshold, the mobile terminal determines that the target camera is in a backlight state. That is, when the intensity of the light illuminating the target camera is greater than the intensity of the light illuminating the opposite panel, and the difference is greater than the target threshold, the target camera in this scene is determined to be in a backlight state.
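A minimal sketch of the difference test in step 330 is given below; the sensor objects and their read() method are assumptions used only for illustration.

```python
# Hypothetical sketch of step 330; sensor objects and read() are assumed.
def is_backlit(first_sensor, second_sensor, target_threshold):
    first_intensity = first_sensor.read()    # light striking the target camera
    second_intensity = second_sensor.read()  # light striking the opposite panel
    return (first_intensity - second_intensity) > target_threshold
```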
And 340, the mobile terminal performs image recognition on the image to obtain a first image recognition result.
The execution procedure of step 340 is the same as that of step 230, and is not described herein again.
And 350, processing the image by the mobile terminal according to the exposure compensation value corresponding to the first image identification result.
The execution procedure of step 350 is the same as that of step 240, and is not described herein again.
In summary, in this embodiment, when the target camera collects an image, the mobile terminal obtains a first light intensity through the first photosensitive component disposed in the same target panel as the target camera and obtains a second light intensity through the second photosensitive component disposed in the opposite panel; when the difference between the first light intensity and the second light intensity is greater than the target threshold, it determines that the target camera is in a backlight state, then performs image recognition on the collected image to obtain a first image recognition result, obtains the exposure compensation value corresponding to the first image recognition result, and re-renders the image according to the exposure compensation value. In this way, the embodiment provided by the present application confirms that the target camera is in a backlight state only when the first light intensity obtained by the first photosensitive component is stronger than the ambient light intensity represented by the second light intensity, which improves the accuracy of confirming the backlight state and thus the image optimization effect of the mobile terminal.
Based on the method shown in the foregoing embodiment, an embodiment of the present application further provides a method for processing an image, which can select a corresponding method for processing an image according to whether a specific first image recognition result includes a human face. Refer specifically to the following examples.
Please refer to fig. 4, which is a flowchart illustrating a method for processing an image according to the embodiment of fig. 2 and 3. The method for processing the image can be applied to the terminal shown in fig. 1. In fig. 4, the method of processing an image includes:
step 410, when the mobile terminal collects an image through the target camera, the mobile terminal obtains a first light intensity through the first photosensitive assembly.
The execution process of step 410 is the same as that of step 210, and is not described herein again.
After completing step 410, the mobile terminal may perform step 421a, or it may perform step 422a and step 422b. After step 421a or step 422b is completed, the terminal performs step 430.
In step 421a, when the first light intensity is greater than the light intensity threshold, the mobile terminal determines that the target camera is in a backlight state.
The execution procedure of step 421a is the same as that of step 220, and is not described herein again.
In step 422a, the mobile terminal obtains a second light intensity through the second photosensitive component.
The execution procedure of step 422a is the same as that of step 320, and is not described herein again.
And step 422b, when the difference value of the first light intensity and the second light intensity is larger than the target threshold value, determining that the target camera is in a backlight state.
The execution procedure of step 422b is the same as that of step 330, and is not described herein again.
And 430, the mobile terminal performs image recognition on the image to obtain a first image recognition result.
The execution process of step 430 is the same as the execution process of step 230, and is not described herein again.
In step 441a, when the first image recognition result is that the image includes a human face, the mobile terminal obtains a portrait exposure compensation value, where the portrait exposure compensation value is a preset value in the mobile terminal.
In the application, the mobile terminal can execute corresponding operation according to whether the image comprises the face or not. In one implementation, when the first image recognition result indicates that the image includes a human face, the mobile terminal determines a human image area according to the human face.
It should be noted that, in a possible manner, the human image area refers to a minimum effective area containing a human face, and the size of the minimum effective area is autonomously determined by an image recognition process of the mobile terminal.
In another possible mode, the human image area refers to a minimum effective area of a human body corresponding to the human face, and the size of the minimum effective area is autonomously determined by an image recognition process of the mobile terminal.
The mobile terminal will obtain the portrait exposure compensation value. In one possible approach, the portrait exposure compensation value is a constant with a fixed value that can be stored either locally or in the server in the cloud.
In another possible approach, the portrait exposure compensation value is adjusted with the aperture size and/or the focal length of the target camera. The mobile terminal obtains, locally or by instructing the server, the portrait exposure compensation value corresponding to the aperture size and/or the focal length of the target camera.
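As an illustration of this approach, the sketch below selects a portrait exposure compensation value from a local lookup table keyed by aperture and focal length; the table contents and the fallback value are assumptions, not values given in this application.

```python
# Hypothetical lookup; keys (f-number, focal length in mm) and EV values are illustrative.
PORTRAIT_EV_TABLE = {
    (1.8, 26): 1.5,
    (2.4, 52): 1.2,
}

def portrait_exposure_compensation(aperture, focal_length_mm, default_ev=1.5):
    # Fall back to a preset constant when no entry matches (assumption).
    return PORTRAIT_EV_TABLE.get((aperture, focal_length_mm), default_ev)
```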
And step 441b, the mobile terminal performs exposure compensation on the image according to a preset portrait exposure compensation value.
In the application, the mobile terminal carries out exposure compensation on the image according to a preset portrait exposure compensation value.
In one possible scenario, the mobile terminal may perform exposure compensation on the image during the framing of the target camera.
In another possible scenario, the mobile terminal may perform exposure compensation on each frame of image in the recorded video while the target camera is recording video.
In yet another possible scenario, the mobile terminal may perform exposure compensation on an image captured by the target camera.
Alternatively, the portrait exposure compensation value may be a positive number.
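Purely to illustrate what applying an exposure compensation value does to an image, the sketch below models compensation by N EV as a linear gain of 2^N on linear pixel values; this rendering formula is an assumption, since the application does not specify how the re-rendering is performed.

```python
# Assumption: exposure compensation is modelled as a gain of 2**ev on linear pixel values.
import numpy as np

def apply_exposure_compensation(image, ev):
    # image: float array with values in [0, 1]; a positive ev (e.g. the portrait
    # exposure compensation value) brightens the image, a negative ev darkens it.
    return np.clip(image * (2.0 ** ev), 0.0, 1.0)
```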
In step 442a, when the first image recognition result is that the image does not include a human face, the mobile terminal will obtain a preset full-image exposure compensation value.
In the scheme provided by the application, the mobile terminal acquires a preset full-image exposure compensation value when the first image recognition result is that the image does not include a human face. Optionally, the full-image exposure compensation value and the portrait exposure compensation value are not equal.
In one possible scenario, the full-image exposure compensation value may be an overexposure compensation value or an underexposure compensation value, where the overexposure compensation value is a positive number and the underexposure compensation value is a negative number. The mobile terminal can determine the full-image exposure compensation value as one of the overexposure compensation value and the underexposure compensation value according to a scheme preset by the user.
It should be noted that the overexposure compensation value is used to fill light into the foreground object in the image, and the underexposure compensation value is used to make the image features near the light source clearer.
Alternatively, the mobile terminal may automatically select one of the overexposure compensation value and the underexposure compensation value as the full-image exposure compensation value according to the brightness at the focus position. If the brightness at the focus position is the highest brightness in the image, the underexposure compensation value is determined as the full-image exposure compensation value. If the brightness at the focus position is lower than the average brightness of the image, the overexposure compensation value is determined as the full-image exposure compensation value.
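The automatic selection described above can be sketched as follows; the brightness array, the concrete EV values and the behaviour in the remaining case are assumptions for illustration.

```python
# Hypothetical sketch of choosing between overexposure and underexposure compensation
# from the brightness at the focus position; EV values and the default are assumed.
def choose_full_image_ev(brightness_map, focus_brightness,
                         overexposure_ev=1.0, underexposure_ev=-1.0):
    # brightness_map: array of per-pixel brightness values for the whole image.
    if focus_brightness >= brightness_map.max():
        # The focus sits on the brightest part of the image (e.g. the light source):
        # underexpose so that features near the light source stay clear.
        return underexposure_ev
    if focus_brightness < brightness_map.mean():
        # The focus sits on a dim foreground object: overexpose to fill in light.
        return overexposure_ev
    return 0.0  # otherwise leave the exposure unchanged (assumption)
```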
In a possible acquisition mode of the full-image exposure compensation value, the mobile terminal can complete the steps (1), (2) and (3).
And (1) when the first image recognition result is that the image does not include the human face, the mobile terminal acquires a focus area, wherein the focus area is a preset area with the image focus as a reference point.
In the present application, when the first image recognition result is that the image does not include a human face, the mobile terminal acquires a focus area, which is a preset area with the image focus as a reference point. For example, it may be a circular area centred on the image focus with a specified length as its radius, or a square area with a specified side length centred on the image focus.
And (2) carrying out image recognition on the focus area to obtain a second image recognition result.
In the present application, the mobile terminal performs image recognition on the focus area to obtain a second image recognition result.
And (3) when the second image recognition result indicates that the focus area comprises a preset image object, acquiring a preset full-image exposure compensation value, wherein the preset image object is an object of a specified type stored in the mobile terminal in advance.
It should be noted that the preset image object includes: cat, dog, flower, leaves, food, etc.
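Steps (1) to (3) can be sketched as follows; the square focus area, its size, and the recognizer interface are assumptions used only to make the flow concrete.

```python
# Hypothetical sketch of steps (1)-(3); crop size and recognizer.classify() are assumed,
# and the focus is assumed not to lie at the image border.
PRESET_IMAGE_OBJECTS = {"cat", "dog", "flower", "leaves", "food"}

def full_image_ev_for_non_face(image, focus_xy, recognizer, preset_full_image_ev,
                               half_size=128):
    x, y = focus_xy
    # Step (1): a square focus area centred on the image focus.
    focus_area = image[y - half_size:y + half_size, x - half_size:x + half_size]
    # Step (2): perform image recognition on the focus area.
    second_result = recognizer.classify(focus_area)
    # Step (3): only a preset image object triggers the full-image compensation value.
    return preset_full_image_ev if second_result in PRESET_IMAGE_OBJECTS else None
```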
In step 442b, the mobile terminal performs exposure compensation on the image according to the preset full-image exposure compensation value.
And 451, acquiring corresponding image optimization parameters according to the first light intensity.
In an embodiment of the present application, the image optimization parameter comprises at least one of contrast, saturation and sharpness. The mobile terminal acquires the image optimization parameter corresponding to the first light intensity according to the correspondence between first light intensities and image optimization parameters.
And step 452, optimizing the image according to the image optimization parameters.
In an embodiment of the present application, the mobile terminal can adjust at least one of the contrast, saturation and sharpness of the image according to the image optimization parameter. For example, if the mobile terminal obtains the two parameters of contrast and saturation, it adjusts the contrast and saturation of the image accordingly.
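A sketch of the correspondence between the first light intensity and the image optimization parameters is given below; the intensity bands and the parameter values are illustrative assumptions, not values defined in this application.

```python
# Illustrative mapping only; the intensity bands and the parameter gains are assumptions.
def image_optimization_params(first_light_intensity):
    if first_light_intensity > 10000:   # very strong backlight
        return {"contrast": 1.20, "saturation": 1.10, "sharpness": 1.15}
    if first_light_intensity > 3000:    # moderate backlight
        return {"contrast": 1.10, "saturation": 1.05, "sharpness": 1.10}
    return {"contrast": 1.00, "saturation": 1.00, "sharpness": 1.00}
```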
In summary, the method for processing an image provided by this embodiment applies different light-compensation processes depending on whether the image contains a face: the face region in an image that contains a face receives suitable light compensation, while for an image that does not contain a face, the process is chosen according to whether the details around the light source or other foreground objects should be highlighted. This improves the terminal's processing of images in various scenes.
Based on the embodiment shown in fig. 4, the present application further provides a method for processing an image, which is capable of processing the image in regions according to the brightness in each region of the image, and specifically, refer to the following embodiments.
Please refer to fig. 5, which is a flowchart illustrating a method for processing an image according to the embodiment shown in fig. 4. The method for processing the image can be applied to the terminal shown in fig. 1. In fig. 5, the steps provided in this embodiment may be steps 461, 462 and 463 after the execution of step 452 is completed. The steps 461, 462 and 463 are executed as follows:
step 461, dividing the image into sub-brightness regions of at least two levels according to the brightness of the image, wherein the at least two levels are preset in the mobile terminal.
In the embodiment of the application, the mobile terminal can divide the image into the sub-brightness regions of at least two levels according to the brightness of the image, wherein the at least two levels are preset in the mobile terminal. For example, the mobile terminal may determine at least two levels as high brightness and low brightness. The high-luminance sub-luminance region is a region having an average luminance higher than a predetermined value, and the low-luminance sub-luminance region is a region having an average luminance lower than a predetermined value. The mobile terminal can divide the image into a plurality of areas according to a preset algorithm and then classify the sub-brightness areas.
Step 462, obtaining the hue and/or saturation corresponding to the sub-brightness area according to the level of the sub-brightness area.
In the embodiment of the application, the mobile terminal acquires the corresponding hue and/or saturation according to the grade of the sub-brightness area. In this embodiment, the mobile terminal may store the hue and/or saturation corresponding to the level of the sub-brightness area in advance.
In step 463, the sub-luminance regions are adjusted according to the hues and/or saturations corresponding to the sub-luminance regions.
In this embodiment of the present application, the mobile terminal can adjust each sub-brightness region according to the hue and/or saturation corresponding to that region. For example, the mobile terminal obtains a first hue and/or a first saturation for the high-brightness region and adjusts the high-brightness region accordingly; it obtains a second hue and/or a second saturation for the low-brightness region and adjusts the low-brightness region accordingly.
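Steps 461 to 463 can be sketched as a block-based division into two preset levels followed by a per-level saturation adjustment; the block size, the brightness threshold and the adjustment gains are assumptions, and hue adjustment is omitted for brevity.

```python
# Hypothetical sketch of steps 461-463 on an HSV image; thresholds and gains are assumed.
import numpy as np

LEVEL_ADJUSTMENT = {"high": {"saturation": 0.95}, "low": {"saturation": 1.10}}

def adjust_by_brightness(hsv_image, block=32, brightness_threshold=0.5):
    # hsv_image: float HSV array with values in [0, 1] and shape (H, W, 3).
    out = hsv_image.copy()
    height, width, _ = out.shape
    for y in range(0, height, block):
        for x in range(0, width, block):
            region = out[y:y + block, x:x + block]   # a view into out
            # Classify the block by its average brightness (value channel).
            level = "high" if region[..., 2].mean() > brightness_threshold else "low"
            gain = LEVEL_ADJUSTMENT[level]["saturation"]
            region[..., 1] = np.clip(region[..., 1] * gain, 0.0, 1.0)
    return out
```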
In summary, the method for processing an image provided by this embodiment enables the mobile terminal to apply different hue and/or saturation adjustments to regions of different brightness, improving the overall display effect of the image. Combined with the other method embodiments provided in this application, the display effect of each region of different brightness can be improved while improving the display effect of the backlit image.
Based on the method for processing the image provided by any one of the embodiments of fig. 2 to 5, the application can also display the processed image and the unprocessed image at the same time after the image is processed. See the examples below for details.
Please refer to fig. 6, which is a flowchart illustrating another image processing method provided in the present application. In the method for processing an image, the mobile terminal can perform steps 610 and 620 after processing an image captured by the target camera according to the exposure compensation value corresponding to the first image recognition result. The execution process of step 610 and step 620 is as follows:
and step 610, displaying the processed image in a floating window in a screen of the mobile terminal, wherein the floating window is floated on a user interface of the photographing application.
In the embodiment of the application, if the mobile terminal starts the photographing application, the processed image is displayed in the floating window floating on the user interface of the photographing application. It should be noted that the size of the floating window can be freely set.
Step 620, displaying the image which is not processed by the exposure compensation value corresponding to the first image recognition result in the user interface of the photographing application.
In the embodiment of the application, the mobile terminal can also display the images which are not processed by the exposure compensation value corresponding to the first image recognition result at the same time.
In one possible implementation, if the mobile terminal is a folding screen terminal including a first screen and a second screen, the floating window may occupy the entire display area of the first screen, and the unprocessed image may be displayed in the second screen.
In summary, the method for processing an image provided by the present application enables the mobile terminal to display the processed backlit image and the unprocessed backlit image at the same time, which improves the efficiency with which the user obtains information from the image.
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Referring to fig. 7, a block diagram of an apparatus for processing an image according to an exemplary embodiment of the present application is shown. The apparatus for processing an image may be implemented as all or a part of a mobile terminal by software, hardware, or a combination of both. The device includes:
a light intensity obtaining module 710, configured to obtain a first light intensity through the first photosensitive component when the mobile terminal collects an image through the target camera, where the first light intensity is used to indicate intensity of light irradiating the target camera;
a backlight determining module 720, configured to determine that the target camera is in a backlight state when the first light intensity is greater than a light intensity threshold;
the image recognition module 730 is configured to perform image recognition on the image to obtain a first image recognition result;
the image processing module 740 is configured to process the image according to the exposure compensation value corresponding to the first image recognition result.
In an optional embodiment, the mobile terminal is provided with a second photosensitive component in the panel opposite to the target panel, and the apparatus further includes an executing module, where the executing module is configured to enable the mobile terminal to obtain a second light intensity through the second photosensitive component, the second light intensity being used to indicate the intensity of light illuminating the panel opposite to the target panel, and to determine that the target camera is in a backlight state when the difference between the first light intensity and the second light intensity is greater than a target threshold.
In an optional embodiment, the image processing module 740 is configured to, when the first image recognition result is that the image includes a human face, obtain a human image exposure compensation value by the mobile terminal, where the human image exposure compensation value is a preset value in the mobile terminal; and the mobile terminal carries out exposure compensation on the image according to the portrait exposure compensation value.
In an optional embodiment, the image processing module 740 is configured to, when the first image recognition result is that the image does not include a human face, obtain a preset full-image exposure compensation value by the mobile terminal; and the mobile terminal carries out exposure compensation on the image according to the preset full-image exposure compensation value.
In an optional embodiment, the image processing module 740 is configured to, when the first image recognition result is that the image does not include a human face, obtain a focus area by the mobile terminal, where the focus area is a preset area with a focus of the image as a reference point; carrying out image recognition on the focus area to obtain a second image recognition result; and when the second image recognition result indicates that the focus area comprises a preset image object, acquiring a preset full-image exposure compensation value, wherein the preset image object is an object of a specified type stored in the mobile terminal in advance.
In an alternative embodiment, the executing module is configured to obtain a corresponding image optimization parameter according to the first light intensity, where the image optimization parameter includes at least one of contrast, saturation, and sharpness; and optimizing the image according to the image optimization parameters.
In an optional embodiment, the execution module is configured to divide the image into sub-brightness regions of at least two levels according to the brightness of the image, where the at least two levels are preset in the mobile terminal; obtain the hue and/or saturation corresponding to each sub-brightness region according to its level; and adjust each sub-brightness region according to the corresponding hue and/or saturation.
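For illustration, a sketch of splitting an image into brightness levels and adjusting saturation per level. Two levels, the brightness split point, and the per-level saturation gains are assumptions chosen only for the example.

```python
import numpy as np

def adjust_by_brightness_regions(hsv_image, split=128,
                                 dark_sat_gain=1.15, bright_sat_gain=0.95):
    """hsv_image: uint8 HxWx3 array in HSV order (hue, saturation, value)."""
    out = hsv_image.astype(np.float32)
    dark = out[..., 2] < split             # level 1: pixels below the brightness split
    out[..., 1][dark] *= dark_sat_gain     # boost saturation in the darker region
    out[..., 1][~dark] *= bright_sat_gain  # soften saturation in the brighter region
    return np.clip(out, 0, 255).astype(np.uint8)

hsv = np.random.randint(0, 256, size=(8, 8, 3), dtype=np.uint8)
print(adjust_by_brightness_regions(hsv).shape)  # (8, 8, 3)
```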
In an optional embodiment, the execution module is configured to display the processed image in a floating window on the screen of the mobile terminal, where the floating window is superimposed on a user interface of a photographing application; and to display, in the user interface of the photographing application, the image that has not been processed with the exposure compensation value corresponding to the first image recognition result.
Embodiments of the present application also provide a computer-readable medium that stores at least one instruction, where the at least one instruction is loaded and executed by a processor to implement the method for processing an image according to the above embodiments.
It should be noted that the division into the functional modules described above is merely illustrative of how the apparatus for processing an image executes the method for processing an image. In practical applications, the above functions may be distributed to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus for processing an image and the method for processing an image provided by the above embodiments belong to the same concept; their specific implementation processes are detailed in the method embodiments and are not described again here.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium such as a read-only memory, a magnetic disk, or an optical disk.
The above description is only exemplary of the implementation of the present application and is not intended to limit the present application, and any modifications, equivalents, improvements, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A method for processing an image is applied to a mobile terminal, a target panel of the mobile terminal is provided with a target camera and a first photosensitive assembly, the target panel is a front panel or a rear panel, a second photosensitive assembly is arranged in a panel opposite to the target panel, the first photosensitive assembly and/or the second photosensitive assembly are/is located in a cavity defined by a shell and a mainboard, and the first photosensitive assembly and/or the second photosensitive assembly acquire light in the environment through a light guide pipe, and the method comprises the following steps:
when the mobile terminal acquires an image through the target camera, the mobile terminal acquires first light intensity through the first photosensitive assembly, and the first light intensity is used for indicating the intensity of light irradiating the target camera;
the mobile terminal obtains second light intensity through the second photosensitive assembly, and the second light intensity is used for indicating the intensity of light irradiating the opposite side panel of the target panel;
when the difference value of the first light intensity and the second light intensity is larger than a target threshold value, determining that the target camera is in a backlight state;
the mobile terminal carries out image recognition on the image to obtain a first image recognition result;
and the mobile terminal processes the image according to the exposure compensation value corresponding to the first image identification result.
2. The method according to claim 1, wherein the processing of the image by the mobile terminal according to the exposure compensation value corresponding to the first image recognition result comprises:
when the first image recognition result is that the image comprises a face, the mobile terminal acquires a portrait exposure compensation value, wherein the portrait exposure compensation value is a value preset in the mobile terminal;
and the mobile terminal carries out exposure compensation on the image according to the portrait exposure compensation value.
3. The method according to claim 1, wherein the processing of the image by the mobile terminal according to the exposure compensation value corresponding to the first image recognition result comprises:
when the first image recognition result is that the image does not include the face, the mobile terminal acquires a preset full-image exposure compensation value;
and the mobile terminal carries out exposure compensation on the image according to the preset full-image exposure compensation value.
4. The method according to claim 3, wherein when the first image recognition result is that the image does not include a human face, the mobile terminal obtains a preset full-image exposure compensation value, and the method comprises:
when the first image recognition result is that the image does not include the face, the mobile terminal acquires a focus area, wherein the focus area is a preset area with the image focus as a reference point;
carrying out image recognition on the focus area to obtain a second image recognition result;
and when the second image recognition result indicates that the focus area comprises a preset image object, acquiring a preset full-image exposure compensation value, wherein the preset image object is an object of a specified type stored in the mobile terminal in advance.
5. The method of any of claims 1 to 4, further comprising:
acquiring corresponding image optimization parameters according to the first light intensity, wherein the image optimization parameters comprise at least one of contrast, saturation and sharpness;
and optimizing the image according to the image optimization parameters.
6. The method of claim 5, further comprising:
dividing the image into sub-brightness areas of at least two levels according to the brightness of the image, wherein the at least two levels are preset in the mobile terminal;
obtaining the hue and/or saturation corresponding to the sub-brightness area according to the grade of the sub-brightness area;
and adjusting the sub-brightness area according to the hue and/or saturation corresponding to the sub-brightness area.
7. The method according to claim 1, wherein the image is an image pre-captured by the mobile terminal through a rear camera, and after the mobile terminal processes the image according to the exposure compensation value corresponding to the first image recognition result, the method further comprises:
displaying the processed image in a floating window in a screen of the mobile terminal, wherein the floating window is suspended on a user interface of a photographing application;
and displaying the image which is not processed by the exposure compensation value corresponding to the first image recognition result in the user interface of the photographing application.
8. An apparatus for processing an image, the apparatus being applied in a mobile terminal, a target panel of the mobile terminal being provided with a target camera and a first photosensitive component, the target panel being a front panel or a rear panel, and a second photosensitive component being provided in a panel opposite to the target panel, the first photosensitive component and/or the second photosensitive component being located in a cavity surrounded by a housing and a motherboard, the first photosensitive component and/or the second photosensitive component acquiring light in an environment through a light pipe, the apparatus comprising:
the light intensity acquisition module is used for acquiring first light intensity through the first photosensitive assembly when the mobile terminal acquires an image through the target camera, and the first light intensity is used for indicating the intensity of light irradiating the target camera;
the backlight determination module is used for determining that the target camera is in a backlight state when the first light intensity is greater than a light intensity threshold value;
the image recognition module is used for carrying out image recognition on the image to obtain a first image recognition result;
the image processing module is used for processing the image according to the exposure compensation value corresponding to the first image identification result;
the execution module is used for enabling the mobile terminal to obtain second light intensity through the second photosensitive assembly, and the second light intensity is used for indicating the intensity of light irradiating the panel opposite to the target panel; and when the difference value of the first light intensity and the second light intensity is larger than a target threshold value, determining that the target camera is in a backlight state.
9. A terminal, characterized in that it comprises a processor and a memory, in which at least one instruction is stored, which is loaded and executed by the processor to implement a method of processing an image according to any one of claims 1 to 7.
10. A computer-readable storage medium having stored thereon at least one instruction which is loaded and executed by a processor to implement a method of processing an image as claimed in any one of claims 1 to 7.
CN201811347813.2A 2018-11-13 2018-11-13 Method, device, terminal and storage medium for processing image Active CN109379534B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811347813.2A CN109379534B (en) 2018-11-13 2018-11-13 Method, device, terminal and storage medium for processing image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811347813.2A CN109379534B (en) 2018-11-13 2018-11-13 Method, device, terminal and storage medium for processing image

Publications (2)

Publication Number Publication Date
CN109379534A CN109379534A (en) 2019-02-22
CN109379534B (en) 2020-09-22

Family

ID=65384774

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811347813.2A Active CN109379534B (en) 2018-11-13 2018-11-13 Method, device, terminal and storage medium for processing image

Country Status (1)

Country Link
CN (1) CN109379534B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111756987B (en) * 2019-03-28 2023-09-26 上海擎感智能科技有限公司 Control method and device of vehicle-mounted camera and vehicle-mounted image capturing system
CN111340016A (en) * 2020-02-25 2020-06-26 浙江大华技术股份有限公司 Image exposure method and apparatus, storage medium, and electronic apparatus
CN115550556B (en) * 2021-06-25 2023-10-24 荣耀终端有限公司 Exposure intensity adjusting method and related device
CN113392792A (en) * 2021-06-27 2021-09-14 赣州德业电子科技有限公司 Face AI image recognition system for tower crane operation
CN113422908B (en) * 2021-07-01 2023-05-23 联想(北京)有限公司 Data processing method and device
CN113364994B (en) * 2021-08-11 2021-11-12 浙江芯昇电子技术有限公司 Backlight compensation method and backlight compensation circuit

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101448085A (en) * 2008-12-26 2009-06-03 北京中星微电子有限公司 Videography processing method and system supporting face detection
CN102238339A (en) * 2011-06-21 2011-11-09 深圳市先河系统技术有限公司 Method for compensating backlight
CN103561210A (en) * 2013-10-24 2014-02-05 惠州Tcl移动通信有限公司 Method and mobile terminal for photographing according to background light

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4711987B2 (en) * 2007-03-14 2011-06-29 株式会社リコー Imaging apparatus and automatic exposure control method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101448085A (en) * 2008-12-26 2009-06-03 北京中星微电子有限公司 Videography processing method and system supporting face detection
CN102238339A (en) * 2011-06-21 2011-11-09 深圳市先河系统技术有限公司 Method for compensating backlight
CN103561210A (en) * 2013-10-24 2014-02-05 惠州Tcl移动通信有限公司 Method and mobile terminal for photographing according to background light

Also Published As

Publication number Publication date
CN109379534A (en) 2019-02-22

Similar Documents

Publication Publication Date Title
CN109379534B (en) Method, device, terminal and storage medium for processing image
US10924677B2 (en) Electronic device and method for providing notification related to image displayed through display and image stored in memory based on image analysis
CN109348089B (en) Night scene image processing method and device, electronic equipment and storage medium
EP3579544A1 (en) Electronic device for providing quality-customized image and method of controlling the same
EP3611917A1 (en) Imaging control method and apparatus, electronic device, and computer readable storage medium
US11076087B2 (en) Method for processing image based on scene recognition of image and electronic device therefor
CN108288044B (en) Electronic device, face recognition method and related product
CN109218628A (en) Image processing method, device, electronic equipment and storage medium
CN104349033A (en) Self-timer light supplement method, self-timer light supplement device and electronic equipment
CN108965692B (en) Sticker setting method and device
US20210209400A1 (en) Method for providing text translation managing data related to application, and electronic device thereof
CN110830730B (en) Apparatus and method for generating moving image data in electronic device
CN110519485A (en) Image processing method, device, storage medium and electronic equipment
CN113542580B (en) Method and device for removing light spots of glasses and electronic equipment
CN110266954A (en) Image processing method, device, storage medium and electronic equipment
CN110581956A (en) Image processing method, image processing device, storage medium and electronic equipment
CN108810277A (en) It takes pictures method for previewing and device
CN105791661A (en) Shooting system and method
CN107682611B (en) Focusing method and device, computer readable storage medium and electronic equipment
CN117768789A (en) Shooting method and electronic equipment
CN113674303A (en) Image processing method, image processing device, electronic equipment and storage medium
CN113298735A (en) Image processing method, image processing device, electronic equipment and storage medium
CN108898650B (en) Human-shaped material creating method and related device
CN105791689A (en) Image processing method and image processing device
CN110278386A (en) Image processing method, device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant