CN108600631B - Image processing method, image processing device, computer-readable storage medium and electronic equipment - Google Patents

Info

Publication number
CN108600631B
Authority
CN
China
Prior art keywords
image
light effect
processed
effect enhancement
acquiring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810466604.3A
Other languages
Chinese (zh)
Other versions
CN108600631A (en)
Inventor
袁全
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810466604.3A priority Critical patent/CN108600631B/en
Publication of CN108600631A publication Critical patent/CN108600631A/en
Application granted granted Critical
Publication of CN108600631B publication Critical patent/CN108600631B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/90Dynamic range modification of images or parts thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/71Circuitry for evaluating the brightness variation

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present application relates to an image processing method, an image processing apparatus, a computer-readable storage medium, and an electronic device. The method comprises the following steps: acquiring the geographical position where the electronic device is located when the image to be processed is captured, and the corresponding acquisition time; acquiring a light effect enhancement model according to the geographical position and the acquisition time, wherein the light effect enhancement model is a model that simulates light variation; and performing light effect enhancement processing on the image to be processed according to the light effect enhancement model. The image processing method, the image processing apparatus, the computer-readable storage medium, and the electronic device can improve the accuracy of image processing.

Description

Image processing method, image processing device, computer-readable storage medium and electronic equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to an image processing method and apparatus, a computer-readable storage medium, and an electronic device.
Background
An intelligent terminal can capture an image and process it so that the image better matches the user's needs and aesthetic preferences. For example, the intelligent terminal may obtain pictures from a network or capture them directly through a camera. After the intelligent terminal obtains an image, the user can also apply different processing as required, for example beautification, white balance, and brightness adjustment.
Disclosure of Invention
The embodiment of the application provides an image processing method and device, a computer readable storage medium and an electronic device, which can improve the accuracy of image processing.
An image processing method comprising:
acquiring the geographical position where the electronic device is located when the image to be processed is captured, and the corresponding acquisition time;
acquiring a light effect enhancement model according to the geographic position and the acquisition time, wherein the light effect enhancement model is a model for simulating light change;
and carrying out light effect enhancement processing on the image to be processed according to the light effect enhancement model.
An image processing apparatus comprising:
the image acquisition module is used for acquiring the geographical position where the electronic device is located when the image to be processed is captured, and the corresponding acquisition time;
the model acquisition module is used for acquiring a light effect enhancement model according to the geographic position and the acquisition time, wherein the light effect enhancement model is a model for simulating light change;
and the light effect enhancement module is used for carrying out light effect enhancement processing on the image to be processed according to the light effect enhancement model.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring the geographical position where the electronic device is located when the image to be processed is captured, and the corresponding acquisition time;
acquiring a light effect enhancement model according to the geographic position and the acquisition time, wherein the light effect enhancement model is a model for simulating light change;
and carrying out light effect enhancement processing on the image to be processed according to the light effect enhancement model.
An electronic device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the steps of:
acquiring the geographical position where the electronic device is located when the image to be processed is captured, and the corresponding acquisition time;
acquiring a light effect enhancement model according to the geographic position and the acquisition time, wherein the light effect enhancement model is a model for simulating light change;
and carrying out light effect enhancement processing on the image to be processed according to the light effect enhancement model.
With the image processing method, the image processing apparatus, the computer-readable storage medium, and the electronic device described above, the geographical position of the electronic device and the acquisition time can be obtained when the image to be processed is captured, a light effect enhancement model can be acquired according to that position and time, and light effect enhancement processing can be performed on the image accordingly. The electronic device can therefore apply different processing for different locations and times, which improves the accuracy of image processing.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a diagram of an exemplary embodiment of an image processing method;
FIG. 2 is a flow diagram of a method of image processing in one embodiment;
FIG. 3 is a flow chart of an image processing method in another embodiment;
FIG. 4 is a schematic diagram of a light effect enhancement model in one embodiment;
FIG. 5 is a flowchart of an image processing method in yet another embodiment;
FIG. 6 is a flowchart of an image processing method in yet another embodiment;
FIG. 7 is a diagram showing a configuration of an image processing apparatus according to an embodiment;
FIG. 8 is a schematic diagram of an image processing circuit in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
FIG. 1 is a diagram of an application environment of an image processing method in one embodiment. As shown in FIG. 1, the application environment includes an electronic device 104. A camera may be mounted on the electronic device 104, and the to-be-processed image 102 is captured through that camera. When the electronic device 104 captures the to-be-processed image 102, it may obtain the geographical position where the image was captured and the corresponding acquisition time, and then acquire a light effect enhancement model according to that position and time, the light effect enhancement model being a model that simulates light variation. The electronic device 104 can then perform light effect enhancement processing on the to-be-processed image 102 according to the model. The electronic device 104 may be a device that receives user input and outputs a processing result, such as a personal computer, a mobile terminal, a personal digital assistant, or a wearable electronic device.
FIG. 2 is a flow diagram of a method of image processing in one embodiment. As shown in fig. 2, the image processing method includes steps 202 to 206. Wherein:
step 202, acquiring the geographic position of the electronic device when the to-be-processed image is acquired, and acquiring the corresponding acquisition time.
In one embodiment, a camera may be mounted on the electronic device so that images are captured through it. The number of cameras and their mounting positions are not limited herein: for example, one camera may be mounted on the front panel of the electronic device, two cameras may be mounted on the back panel, or cameras may be embedded inside the device and exposed by rotating or sliding. Specifically, a front camera and a rear camera may be mounted on the electronic device; they capture images from different views, the front camera from the front of the device and the rear camera from the back.
An upper-layer application on the electronic device can issue an image acquisition instruction, and when the electronic device detects the instruction it can control the camera to capture the image to be processed. After capturing the image, the electronic device immediately obtains the current geographical position and the acquisition time. Specifically, the geographical position may be obtained by, but is not limited to, GPS (Global Positioning System) or a network address. When the image is captured, the current time can be read from the device clock and recorded as the acquisition time; in the Android system, for example, the system time can be obtained through the System.currentTimeMillis() function.
After the electronic device captures the image to be processed, the geographical position and acquisition time at capture can be stored together with the image. The captured image can then be transmitted between different electronic devices, so that any device receiving it also obtains the position and time at which it was captured and can process the image according to them.
And 204, acquiring a light effect enhancement model according to the geographic position and the acquisition time, wherein the light effect enhancement model is a model for simulating light change.
The scene a user shoots is often complicated; in particular, the light in the shooting scene is complex and changeable, and the user cannot alter the scene itself while shooting, so the desired effect can often only be achieved through post-processing. The image to be processed is an image that requires light effect enhancement, where a light effect is an image enhancement that simulates the effect of a light source. Light diffuses outward from the light source, and its intensity weakens as the distance from the source increases. Light source effects may be of different types, including natural light, stage light, studio light, film light, contour light, and the like.
When a picture is taken, the brightness and direction of the natural light in the environment differ with the location and the shooting time. For example, light at midday may be strong, while light in the evening may be dim; and at the same moment, the direction of the light also differs between latitudes and longitudes. Therefore, when a captured image is processed, it can be processed differently according to the geographical position and acquisition time at which it was captured. For example, the direction and intensity of sunlight can be calculated from the longitude, latitude, and acquisition time, and light effects of corresponding direction and intensity can then be added to the image.
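The sunlight direction mentioned above can be derived from latitude, longitude, and time with standard solar geometry. The following is an illustrative sketch, not the patent's own algorithm: it uses a well-known approximation for the solar declination and the hour angle to estimate the solar elevation angle, and the function name and the use of UTC time are assumptions.

```python
import math
from datetime import datetime

def solar_elevation(lat_deg, lon_deg, when_utc):
    """Approximate solar elevation angle in degrees for a given
    latitude/longitude and UTC time (simplified illustration)."""
    day = when_utc.timetuple().tm_yday
    # Approximate solar declination (degrees) for the day of year.
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day + 10)))
    # Local solar time in hours, ignoring the equation of time.
    solar_time = when_utc.hour + when_utc.minute / 60.0 + lon_deg / 15.0
    hour_angle = math.radians(15.0 * (solar_time - 12.0))
    lat, dec = math.radians(lat_deg), math.radians(decl)
    sin_elev = (math.sin(lat) * math.sin(dec)
                + math.cos(lat) * math.cos(dec) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_elev))

# Near the equator at noon around the equinox the sun is close to overhead.
noon = solar_elevation(0.0, 0.0, datetime(2024, 3, 21, 12, 0))
```

A high elevation would correspond to a light effect added from almost directly above, and a low elevation to an oblique light direction, matching the midday/afternoon examples in the text.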
The light effect enhancement model is a model that simulates light variation, and the image to be processed can be enhanced through it. Specifically, the model can simulate how the direction, intensity, and other properties of light vary, so that light of different directions and intensities is added to the image. Parameters such as the direction and intensity of the natural light at shooting time can be calculated from the geographical position and the acquisition time, a corresponding light effect enhancement model can be constructed from those parameters, and the image can be processed with the resulting model. The image to be processed thus shows different processing effects depending on where and when it was captured.
And step 206, carrying out light effect enhancement processing on the image to be processed according to the light effect enhancement model.
Light effect enhancement processing refers to adding a light effect to an image. Specifically, the image to be processed is a two-dimensional pixel matrix formed by a number of pixels, each pixel having a corresponding pixel value. A light effect enhancement model can therefore be obtained and a light effect enhancement coefficient calculated for each pixel from the model; each pixel in the image to be processed is then enhanced according to its coefficient. The processing can be performed by superimposing the coefficients on the image to be processed or by multiplying the image by them. It should be understood that pixel values generally lie in the range [0, 255], so the pixel values of the processed image must be clamped so as not to exceed 255.
For example, assume the image to be processed is H₀(x, y) and the light effect enhancement model is P(x, y). The image after light effect enhancement by superposition can then be expressed as H(x, y) = (1 + P(x, y)) · H₀(x, y), and the image after enhancement by multiplication as H(x, y) = P(x, y) · H₀(x, y). It should be understood that the light effect enhancement processing may also be implemented in other ways, which are not limited herein.
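The two formulas above can be sketched directly. The function below (names assumed for illustration) applies either the superposition or the multiplication form and clamps the result to the valid pixel range [0, 255]:

```python
import numpy as np

def enhance(h0, p, mode="overlay"):
    """Apply a light effect enhancement model P to an image H0.
    overlay:  H = (1 + P) * H0      multiply: H = P * H0
    The result is clamped to the valid pixel range [0, 255]."""
    h = h0.astype(np.float64)
    h = (1.0 + p) * h if mode == "overlay" else p * h
    return np.clip(h, 0, 255).astype(np.uint8)

# A bright pixel pushed past 255 by the overlay form is clamped to 255.
img = np.full((2, 2), 200, dtype=np.uint8)
coeff = np.full((2, 2), 0.5)
```

The clamp implements the constraint noted above that pixel values after enhancement cannot exceed 255.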
Specifically, when performing the light effect enhancement processing, each colour channel of the image to be processed can be treated differently. Light effect enhancement coefficients for the R, G, and B channels of each pixel can be calculated from the obtained light effect enhancement model, and the enhancement is then applied to the three channels separately according to those coefficients. When the channels are enhanced with different intensities, the resulting images show different light effects. For example, if among the obtained coefficients the one for the R channel is greater than those for the G and B channels, the image obtained after enhancement appears reddish relative to the image to be processed.
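Per-channel processing can be sketched the same way. In the illustration below (function name and overlay form assumed), each of the R, G, B channels receives its own coefficient, so a larger R coefficient produces the reddish cast described above:

```python
import numpy as np

def enhance_rgb(img, p_rgb):
    """Apply a separate overlay-form light effect coefficient to each
    of the R, G, B channels of an HxWx3 image."""
    out = img.astype(np.float64)
    for c in range(3):                     # 0 = R, 1 = G, 2 = B
        out[..., c] *= 1.0 + p_rgb[c]
    return np.clip(out, 0, 255).astype(np.uint8)

# Equal grey input; a larger R coefficient yields a reddish result.
grey = np.full((1, 1, 3), 100, dtype=np.uint8)
warm = enhance_rgb(grey, [0.5, 0.1, 0.1])
```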
It should be understood that when the electronic device captures an image through the camera, it can locate the current position and record the acquisition time, but the captured image need not be enhanced immediately: the device can store the geographical position and acquisition time together with the image to be processed, and perform the light effect enhancement according to them later, when it is needed. For example, device 1 may capture the image to be processed together with the geographical position and acquisition time, and send all three to device 2, which then performs the light effect enhancement according to the received position and time.
With the image processing method provided by this embodiment, the geographical position of the electronic device and the acquisition time can be obtained when the image to be processed is captured, a light effect enhancement model can be acquired according to that position and time, and light effect enhancement processing can be performed on the image accordingly. The electronic device can therefore apply different processing for different locations and times, which improves the accuracy of image processing.
Fig. 3 is a flowchart of an image processing method in another embodiment. As shown in fig. 3, the image processing method includes steps 302 to 308. Wherein:
step 302, obtaining an environment brightness value when the electronic device collects the image to be processed.
The ambient brightness value represents the brightness of the environment in which the electronic device is located: the greater the value, the brighter the environment. The value may be calculated from the captured image, for example by obtaining the grey value of each pixel in the image to be processed, averaging the grey values of all pixels, and deriving the ambient brightness from that average. The value may also be obtained from an ambient light sensor, which is not limited herein: for example, when the electronic device detects a shooting instruction for capturing the image to be processed, the ambient light sensor can be started and the ambient brightness of the current environment read from it.
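The grey-average fallback described above can be sketched in a few lines. The BT.601 luma weights used here are an assumption, since the paragraph does not specify how the grey value is formed:

```python
def ambient_brightness(pixels):
    """Mean grey value of a frame as an ambient-brightness estimate.
    `pixels` is an iterable of (r, g, b) tuples; grey values use the
    ITU-R BT.601 luma weights."""
    greys = [0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in pixels]
    return sum(greys) / len(greys)

# Half pure white, half pure black averages to the mid grey level.
level = ambient_brightness([(255, 255, 255), (0, 0, 0)])
```

A returned value below the first brightness threshold of step 304 would mark the environment as dark.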
Taking the Android system as an example, the SensorManager system service can first be obtained, and the light sensor can then be retrieved from it via sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT). A listener can then be registered for the light sensor to obtain the ambient brightness values it returns.
Step 304, if the environmental brightness value is smaller than the first brightness threshold, acquiring the geographic position where the electronic device is located when acquiring the image to be processed, and acquiring the corresponding acquisition time.
In one embodiment, the brightness of the environment at capture time is judged from the ambient brightness value in order to decide whether a light effect needs to be added to the image to be processed. When the ambient brightness value is smaller than the first brightness threshold, the environment in which the electronic device is located is considered dark, and the image captured in that environment needs light effect enhancement. For example, when shooting outdoors on a cloudy day, the enhancement can be performed according to the geographical position and acquisition time of the electronic device.
And step 306, determining a light effect enhancement direction according to the geographical position and the acquisition time, and determining a light effect enhancement model according to the light effect enhancement direction.
The geographical position of the electronic device may be represented by longitude and latitude. The electronic device can calculate the direction of the ambient natural light from the acquired position and acquisition time, and determine the direction of the added light effect from the direction of the natural light. For example, for a photo taken at noon, sunlight comes from directly above, so the light effect can be added to the image from directly above; for a photo taken in the afternoon, sunlight comes from obliquely above, so the effect can be added from obliquely above. The electronic device may also pre-store a correspondence between geographical position, acquisition time, and light effect enhancement direction, and look up the corresponding direction from the acquired position and time.
Specifically, after calculating the direction of the ambient light from the geographical position and the acquisition time, the electronic device also needs to obtain its own orientation, and determines the light effect enhancement direction from the light direction and the orientation together: at the same location, devices facing different directions receive the light on the subject differently. Whether the device is in a front-lit or back-lit situation can be determined from the ambient light direction and the device orientation, and different light effects can be added accordingly.
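A minimal sketch of the front-lit/back-lit decision, assuming both the sun direction and the camera facing direction are available as compass azimuths; the function name and the 90° threshold are illustrative assumptions, not part of the patent:

```python
def lighting_condition(sun_azimuth_deg, camera_azimuth_deg):
    """Classify the scene as back-lit when the camera points towards the
    sun (angular difference under 90 degrees), otherwise front-lit."""
    diff = abs((sun_azimuth_deg - camera_azimuth_deg + 180.0) % 360.0 - 180.0)
    return "back-lit" if diff < 90.0 else "front-lit"
```

With the sun behind the photographer the subject is front-lit; with the camera pointing near the sun the subject is back-lit.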
It is understood that the factors affecting the light include the light source, the propagation direction, the intensity and color of the light, and the like. The light effect enhancement model can simulate the change curve of light rays, namely light rays with different light source centers, propagation directions, intensities and colors. After the light effect enhancement direction is determined, the light effect enhancement model can be determined according to the light effect enhancement direction, so that light effects in different directions are added according to the light effect enhancement model. The parameters such as light intensity and light color can be determined, and the light effect enhancement model corresponding to the image can be obtained according to the parameters such as light intensity and light color, so that the light effects with different intensities and colors can be added to the image.
Light propagates by diffusing outward in all directions from the light source, so the direction of the light can be changed by moving the position of the source. The electronic device may predefine a light effect enhancement reference model, which may take any point in space as its light source. After the light effect enhancement direction is determined, the light direction can be changed by moving the light source centre of the predefined reference model. Specifically, a light source centre point is determined from the enhancement direction, and the light effect enhancement model is constructed around that point.
In one embodiment, the light effect enhancement model may be constructed from a two-dimensional Gaussian distribution function: the function is first obtained, and the model is then built by taking the light source centre point as the maximum point of the function. It should be understood that the light source centre point may be a pixel in the image to be processed or any point in the space outside it. The two-dimensional Gaussian distribution function is as follows:
P(x, y) = exp(−(x² + y²) / (2d²))
where (x, y) are the two-dimensional coordinates of a pixel in the image to be processed and d is a constant. This function is a two-dimensional Gaussian with its maximum at (0, 0); the light effect enhancement model is obtained by shifting the function so that its maximum point moves to the light source centre point. Assuming the centre pixel is (x₀, y₀), the resulting light effect enhancement model can be expressed as:
P₀(x, y) = exp(−((x − x₀)² + (y − y₀)²) / (2d²))
In the resulting light effect enhancement model the centre pixel (x₀, y₀) is the maximum point; that is, the light effect enhancement coefficient P₀(x, y) is largest at (x₀, y₀). The magnitude of the enhancement coefficients can be adjusted through the constant d.
FIG. 4 is a schematic diagram of a light effect enhancement model in one embodiment. As shown in FIG. 4, the image to be processed has a resolution of 50 × 50, and the centre pixel 402 has coordinates (25, 25). The light effect enhancement coefficient is largest at the centre pixel 402 and decreases for the other pixels as their distance from the centre pixel 402 increases: the farther a pixel is from the centre pixel 402, the smaller its coefficient.
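The shifted Gaussian model of the preceding paragraphs can be sketched as follows (function names assumed for illustration): the coefficient is 1 at the light source centre and decays with distance, matching the behaviour shown in FIG. 4.

```python
import math

def light_effect_model(x0, y0, d):
    """Return a shifted 2-D Gaussian light effect model P0(x, y) whose
    maximum lies at the light source centre (x0, y0); the constant d
    controls how quickly the coefficient decays with distance."""
    def p(x, y):
        return math.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2.0 * d ** 2))
    return p

# 50x50 image with the light source at the centre pixel (25, 25), as in FIG. 4.
p = light_effect_model(25, 25, d=10.0)
```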
And 308, acquiring a light effect enhancement coefficient corresponding to each pixel point in the image to be processed according to the light effect enhancement model, and performing light effect enhancement processing on each pixel point in the image to be processed according to the light effect enhancement coefficient.
In the embodiments provided by this application, the light effect enhancement model simulates the change of light, and the light effect enhancement coefficient of each pixel in the image to be processed can be calculated from it. The coefficient is the parameter used to perform the enhancement on each pixel. In general, the farther from the light source, the more the light is attenuated; correspondingly, the farther a pixel is from the centre pixel of the image, the smaller its coefficient, and each pixel is enhanced according to the coefficient obtained for it.
The image to be processed is a two-dimensional pixel matrix. A coordinate system can be established with the bottom-left pixel point of the image to be processed as the origin, so that each pixel point in the image to be processed can be represented by a two-dimensional coordinate. To obtain the light effect enhancement coefficient of each pixel point, the coordinate corresponding to each pixel point can be substituted directly into the light effect enhancement model. Light effect enhancement processing is then performed on each pixel point in the image to be processed using the obtained light effect enhancement coefficient.
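The per-pixel procedure above can be sketched as follows. The patent does not disclose the concrete formula of the light effect enhancement model, so the Gaussian-style radial falloff, the constant d, the added-light amount of 100, and the function names below are illustrative assumptions only:

```python
import math

def enhancement_coefficient(x, y, xo, yo, d=10.0):
    """Illustrative radial falloff: maximal at the center pixel (xo, yo),
    decreasing as the distance from the center grows; the constant d
    controls how quickly the coefficient falls off."""
    dist_sq = (x - xo) ** 2 + (y - yo) ** 2
    return math.exp(-dist_sq / (2.0 * d * d))

def enhance(image, xo, yo, d=10.0):
    """Apply the coefficient to every pixel of a 2-D brightness matrix;
    here 'enhancement' simply adds light scaled by the coefficient."""
    return [[min(255, v + int(100 * enhancement_coefficient(x, y, xo, yo, d)))
             for x, v in enumerate(row)]
            for y, row in enumerate(image)]

img = [[50] * 50 for _ in range(50)]   # 50x50 image, uniform brightness, as in fig. 4
out = enhance(img, xo=25, yo=25)
print(out[25][25], out[0][0])          # the center gains the most light
```

As in the embodiment of fig. 4, the coefficient is 1 at the central pixel (25, 25) and shrinks toward the corners, so the center pixel brightens far more than the corner pixel.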
In an embodiment, the intensity of the light effect may also be adjusted according to the light effect enhancement model. The step of obtaining a light effect enhancement model may specifically comprise:
and 502, determining the light effect enhancement direction according to the geographical position and the acquisition time.
And step 504, obtaining a light effect intensity factor, wherein the light effect intensity factor is a parameter influencing the light effect enhancement processing intensity.
In the embodiments provided by the present application, the light effect processing intensity of the light effect enhancement model can be adjusted; the light effect enhancement coefficients obtained from the model can be adjusted according to the light effect intensity factor. That is, different values of the light effect intensity factor yield different light effect enhancement coefficients for the corresponding pixel points. Specifically, a light effect intensity factor, which is a parameter that affects the light effect enhancement processing intensity, may be obtained, and the light effect enhancement model may be obtained according to the light effect intensity factor and the light effect enhancement direction. Performing light effect enhancement processing on the image to be processed increases its brightness. If the image to be processed is already relatively bright, light effect enhancement processing would cause serious distortion, so the intensity of the light effect enhancement processing can be adjusted according to the brightness of the image to be processed.
It can be understood that the intensity of sunlight varies with the time at which the image is taken. For example, morning sunlight is softer and noon sunlight is more intense. Therefore, different light effect intensity factors can be obtained for different acquisition times, so that images taken at different times receive light effect enhancement processing of different degrees. Specifically, the electronic device can preset a correspondence between the light effect intensity factor and the acquisition time, and can acquire the corresponding light effect intensity factor according to the acquisition time. The light effect intensity factor can also be calculated from the brightness of the captured image to be processed, so that the light effect processing intensity can be adjusted according to the captured image, preventing the image from being distorted by excessive brightness.
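A minimal sketch of such a preset correspondence between acquisition time and light effect intensity factor follows. The hour boundaries and factor values are invented for illustration and are not disclosed in the patent:

```python
def intensity_factor_for_time(hour):
    """Illustrative preset table mapping the acquisition hour to a light
    effect intensity factor: softer light in the morning and evening,
    strongest around noon. All values are assumptions."""
    table = [
        (range(6, 10), 0.4),    # morning: soft sunlight
        (range(10, 15), 1.0),   # around noon: strongest sunlight
        (range(15, 19), 0.5),   # afternoon and evening
    ]
    for hours, factor in table:
        if hour in hours:
            return factor
    return 0.2                  # night: minimal simulated sunlight

print(intensity_factor_for_time(7), intensity_factor_for_time(12))
```

The lookup mirrors the preset correspondence described above: a noon acquisition time yields a larger factor, and hence stronger light effect enhancement, than a morning one.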
And step 506, acquiring a light effect enhancement model according to the light effect enhancement direction and the light effect intensity factor.
The light source center point can be determined according to the light effect enhancing direction, and then the intensity of the light effect processing can be adjusted according to the light effect intensity factor. Specifically, according to the light source center point and the light effect intensity factor, a light effect enhancement model simulating light rays in different directions and different intensities can be constructed, and light effect enhancement processing is performed on the image to be processed according to the constructed light effect enhancement model.
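Step 506 can be sketched as follows. The patent does not state how the light source center is derived from the enhancement direction, so placing it on the image border along the direction, and the Gaussian falloff scaled by the intensity factor, are illustrative assumptions:

```python
import math

def build_light_effect_model(width, height, direction_deg, strength, d=15.0):
    """Illustrative construction of a light effect enhancement model:
    a hypothetical light source center is pushed toward the image border
    along the given direction, and the intensity factor (strength)
    scales every resulting coefficient."""
    cx, cy = width / 2.0, height / 2.0
    rad = math.radians(direction_deg)
    xo = cx + (width / 2.0) * math.cos(rad)   # assumed center placement
    yo = cy + (height / 2.0) * math.sin(rad)

    def coefficient(x, y):
        dist_sq = (x - xo) ** 2 + (y - yo) ** 2
        return strength * math.exp(-dist_sq / (2.0 * d * d))

    return coefficient

# Light arriving from the right (0 degrees) at 80% intensity.
model = build_light_effect_model(50, 50, direction_deg=0, strength=0.8)
print(round(model(50, 25), 3))   # near the simulated light source
```

Pixels near the simulated light source receive coefficients close to the intensity factor, while pixels on the far side of the image receive much smaller ones, simulating directional light of adjustable strength.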
In the embodiments provided by the present application, a light effect color factor, which affects the color of the added light, can also be obtained. Specifically, when light effects of different colors are added, the light effect enhancement coefficients of the respective color channels in the image to be processed differ. A light effect enhancement model can be acquired according to the light effect color factor, the light effect enhancement coefficient of each color channel in the image to be processed can be obtained through the model, and light effect enhancement processing can be performed on each color channel according to those coefficients. For example, the color of ambient light varies with the time at which the image is shot: sunlight at noon is white while sunlight in the evening is red, so the light effect color factor can be obtained according to the acquisition time.
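The per-channel idea above can be sketched as follows. The (r, g, b) triples standing in for "white noon light" and "red evening light", and the simple multiplicative scaling, are assumptions for illustration only:

```python
def channel_coefficients(base_coeff, color_factor):
    """Illustrative per-channel scaling: the light effect color factor is
    an (r, g, b) triple in [0, 1] that gives each color channel its own
    light effect enhancement coefficient."""
    r, g, b = color_factor
    return (base_coeff * r, base_coeff * g, base_coeff * b)

# Hypothetical color factors for different acquisition times.
WHITE_NOON = (1.0, 1.0, 1.0)     # noon sunlight: all channels enhanced equally
RED_EVENING = (1.0, 0.5, 0.3)    # evening sunlight: red channel dominates

print(channel_coefficients(0.8, RED_EVENING))
```

With the evening factor, the red channel keeps the full enhancement coefficient while green and blue are attenuated, tinting the added light red; the noon factor leaves all three channels equal, i.e. white light.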
The user can also adjust the color, direction and intensity of the light effect according to different requirements, which is not limited herein. For example, a user may adjust the color of the light effect through the electronic device, and may also adjust the intensity of the light effect, thereby adding light effects of different colors and intensities. Specifically, a user can input a trigger instruction through the electronic device, the electronic device obtains one or more of a light effect enhancement direction, a light effect intensity factor and a light effect color factor according to the trigger instruction, and constructs a light effect enhancement model according to the one or more of the light effect enhancement direction, the light effect intensity factor and the light effect color factor.
In an embodiment, when obtaining the light effect intensity factor according to the image, the step of obtaining the light effect intensity factor may specifically include:
step 602, obtaining the brightness value of each pixel point in the image to be processed, and obtaining a reference pixel point from the image to be processed according to the brightness value.
The brightness value represents the brightness of a pixel point in the image to be processed; if the brightness value of a pixel point is too large, the pixel point is over-bright. The higher the brightness value of a pixel point, the greater the influence of the light effect enhancement processing on it. Therefore, when the brightness values of the pixel points are relatively high, the intensity of the light effect enhancement processing needs to be reduced accordingly. Specifically, reference pixel points can be selected according to the brightness values of the pixel points, and the light effect intensity factor can be obtained according to the brightness values of the reference pixel points. The reference pixel points are generally pixel points with larger brightness values: the pixel point with the largest brightness value in the image to be processed may be taken as the reference pixel point, or the pixel points whose brightness values are greater than a second brightness threshold may be taken as reference pixel points.
And step 604, acquiring a light effect intensity factor according to the brightness value corresponding to the reference pixel point.
The reference pixel points are the pixel points with higher brightness in the image to be processed; the higher the brightness value of the reference pixel points, the lower the intensity of the light effect enhancement processing should be, to avoid excessive distortion of the image. Specifically, the higher the brightness value of the reference pixel point, the smaller the light effect intensity factor. The electronic device can preset a correspondence between the brightness value of the reference pixel point and the light effect intensity factor, and the corresponding light effect intensity factor can be obtained according to the brightness value of the reference pixel point. For example, when the pixel point with the maximum brightness value in the image to be processed is taken as the reference pixel point, the corresponding light effect intensity factor can be obtained according to that maximum brightness value. When the pixel points whose brightness values are greater than the second brightness threshold are taken as reference pixel points, the average brightness of the reference pixel points can be calculated, and the corresponding light effect intensity factor obtained according to that average.
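Steps 602 and 604 together can be sketched as below. The concrete second brightness threshold, the fallback to the single brightest pixel, and the linear brightness-to-factor mapping are assumptions, not values from the patent:

```python
def light_effect_intensity_factor(image, second_threshold=200, max_factor=1.0):
    """Illustrative: collect reference pixels whose brightness exceeds a
    second brightness threshold (falling back to the single brightest
    pixel if none qualify), average their brightness, and map a brighter
    reference to a smaller intensity factor so already-bright images are
    not over-enhanced. The linear mapping is an assumption."""
    pixels = [v for row in image for v in row]
    refs = [v for v in pixels if v > second_threshold] or [max(pixels)]
    mean_ref = sum(refs) / len(refs)
    # Brighter reference pixels -> smaller factor, linear in [0, max_factor].
    return max_factor * (1.0 - mean_ref / 255.0)

dark = [[40, 60], [50, 55]]          # dim image: strong enhancement allowed
bright = [[240, 250], [230, 245]]    # bright image: enhancement damped
print(light_effect_intensity_factor(dark) > light_effect_intensity_factor(bright))
```

The dim image yields a large factor (strong light effect), while the bright image yields a factor near zero, matching the distortion-avoidance rationale above.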
The image processing method provided by the embodiment can acquire the geographic position and the acquisition time of the electronic equipment when the electronic equipment acquires the image to be processed, acquire the light effect enhancement model according to the geographic position and the acquisition time, and perform light effect enhancement processing on the image to be processed according to the light effect enhancement model. Therefore, when the light effect enhancement processing is carried out on the electronic equipment, different processing can be carried out according to different positions and moments, and the accuracy of image processing is improved.
It should be understood that although the steps in the flowcharts of figs. 2, 3, 5, and 6 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, the steps are not strictly limited in order and may be performed in other orders. Moreover, at least some of the steps in figs. 2, 3, 5, and 6 may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and which are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Fig. 7 is a schematic structural diagram of an image processing apparatus according to an embodiment. As shown in fig. 7, the image processing apparatus 700 includes an image acquisition module 702, a model acquisition module 704, and a light effect enhancement module 706. Wherein:
the image acquisition module 702 is configured to acquire a geographic location where the electronic device acquires the image to be processed, and a corresponding acquisition time.
A model obtaining module 704, configured to obtain a light effect enhancement model according to the geographic location and the collection time, where the light effect enhancement model is a model that simulates light changes.
And a light effect enhancement module 706, configured to perform light effect enhancement processing on the image to be processed according to the light effect enhancement model.
The image processing device provided by the embodiment can acquire the geographic position and the acquisition time of the electronic equipment when the electronic equipment acquires the image to be processed, acquire the light effect enhancement model according to the geographic position and the acquisition time, and perform light effect enhancement processing on the image to be processed according to the light effect enhancement model. Therefore, when the light effect enhancement processing is carried out on the electronic equipment, different processing can be carried out according to different positions and moments, and the accuracy of image processing is improved.
In one embodiment, the image capturing module 702 is further configured to obtain an ambient brightness value when the electronic device captures the image to be processed; and if the environment brightness value is smaller than the first brightness threshold value, acquiring the geographic position of the electronic equipment when the electronic equipment acquires the image to be processed and the corresponding acquisition time.
In an embodiment, the model obtaining module 704 determines a light effect enhancement direction according to the geographical position and the acquisition time, and determines a light effect enhancement model according to the light effect enhancement direction.
In one embodiment, the model obtaining module 704 determines the light effect enhancing direction according to the geographic location and the collecting time; acquiring a light effect intensity factor, wherein the light effect intensity factor is a parameter influencing the light effect enhancement processing intensity; and acquiring a light effect enhancement model according to the light effect enhancement direction and the light effect intensity factor.
In one embodiment, the model obtaining module 704 obtains a brightness value of each pixel point in the image to be processed, and obtains a reference pixel point from the image to be processed according to the brightness value; and acquiring a light effect intensity factor according to the brightness value corresponding to the reference pixel point.
In an embodiment, the model obtaining module 704 takes a pixel point with the largest corresponding brightness value in the image to be processed as a reference pixel point; or taking the pixel points of which the corresponding brightness values are greater than the second brightness threshold value in the image to be processed as reference pixel points.
In an embodiment, the light effect enhancing module 706 obtains the light effect enhancing coefficient corresponding to each pixel point in the image to be processed according to the light effect enhancing model, and performs the light effect enhancing processing on each pixel point in the image to be processed according to the light effect enhancing coefficient.
The division of the modules in the image processing apparatus is only for illustration, and in other embodiments, the image processing apparatus may be divided into different modules as needed to complete all or part of the functions of the image processing apparatus.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the image processing methods provided by the above-described embodiments.
A computer program product comprising instructions which, when run on a computer, cause the computer to perform the image processing method provided by the above embodiments.
The embodiment of the application also provides the electronic equipment. The electronic device includes therein an Image Processing circuit, which may be implemented using hardware and/or software components, and may include various Processing units defining an ISP (Image Signal Processing) pipeline. FIG. 8 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 8, for convenience of explanation, only aspects of the image processing technology related to the embodiments of the present application are shown.
As shown in fig. 8, the image processing circuit includes an ISP processor 840 and control logic 850. Image data captured by imaging device 810 is first processed by ISP processor 840, and ISP processor 840 analyzes the image data to capture image statistics that may be used to determine and/or control one or more parameters of imaging device 810. Imaging device 810 may include a camera having one or more lenses 812 and an image sensor 814. Image sensor 814 may include an array of color filters (e.g., Bayer filters), and image sensor 814 may acquire light intensity and wavelength information captured with each imaging pixel of image sensor 814 and provide a set of raw image data that may be processed by ISP processor 840. The sensor 820 (e.g., a gyroscope) may provide parameters of the acquired image processing (e.g., anti-shake parameters) to the ISP processor 840 based on the type of sensor 820 interface. The sensor 820 interface may utilize an SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the above.
In addition, the image sensor 814 may also send raw image data to the sensor 820, the sensor 820 may provide raw image data to the ISP processor 840 based on the sensor 820 interface type, or the sensor 820 may store raw image data in the image memory 830.
The ISP processor 840 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and ISP processor 840 may perform one or more image processing operations on the raw image data, collecting statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
ISP processor 840 may also receive image data from image memory 830. For example, the sensor 820 interface sends raw image data to the image memory 830, and the raw image data in the image memory 830 is then provided to the ISP processor 840 for processing. The image Memory 830 may be a portion of a Memory device, a storage device, or a separate dedicated Memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving raw image data from image sensor 814 interface or from sensor 820 interface or from image memory 830, ISP processor 840 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to image memory 830 for additional processing before being displayed. ISP processor 840 may also receive processed data from image memory 830, which is subjected to image data processing in the raw domain and in the RGB and YCbCr color spaces. The processed image data may be output to a display 880 for viewing by a user and/or further Processing by a Graphics Processing Unit (GPU). Further, the output of ISP processor 840 may also be sent to image memory 830 and display 880 may read image data from image memory 830. In one embodiment, image memory 830 may be configured to implement one or more frame buffers. Further, the output of the ISP processor 840 may be transmitted to an encoder/decoder 870 for encoding/decoding the image data. The encoded image data may be saved and decompressed before being displayed on the display 880 device.
The processing of the image data by the ISP processor 840 includes VFE (Video Front End) processing and CPP (Camera Post Processing). VFE processing of the image data may include modifying the contrast or brightness of the image data, modifying digitally recorded lighting status data, performing compensation processing (e.g., white balance, automatic gain control, gamma correction, etc.) on the image data, performing filter processing on the image data, and the like. CPP processing of the image data may include scaling the image and providing a preview frame and a record frame to each path. The CPP may use different codecs to process the preview frame and the record frame. The image data processed by the ISP processor 840 may be sent to the light effect processing module 860 for light effect enhancement processing of the image before being displayed. The light effect Processing module 860 may be a Central Processing Unit (CPU), a GPU, a coprocessor, or the like. The data processed by the light effect processing module 860 may be transmitted to the encoder/decoder 870 to encode/decode the image data. The encoded image data may be saved and decompressed before being displayed on the display 880. The light effect processing module 860 may also be located between the encoder/decoder 870 and the display 880, i.e., the light effect processing module 860 performs light effect enhancement processing on the imaged image. The encoder/decoder 870 may be a CPU, GPU, coprocessor, or the like in the mobile terminal.
The statistics determined by ISP processor 840 may be sent to control logic 850 unit. For example, the statistical data may include image sensor 814 statistical information such as auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 812 shading correction, and the like. Control logic 850 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters of imaging device 810 and ISP processor 840 based on the received statistical data. For example, the control parameters of imaging device 810 may include sensor 820 control parameters (e.g., gain, integration time for exposure control), camera flash control parameters, lens 812 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 812 shading correction parameters.
The image processing method described above can be implemented using the image processing technique of fig. 8.
Any reference to memory, storage, database, or other medium used herein may include non-volatile and/or volatile memory. Suitable non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), SyncLink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the present application. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the concept of the present application, which falls within the scope of protection of the present application. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (12)

1. An image processing method, comprising:
acquiring the geographical position of the electronic equipment when acquiring the image to be processed and the corresponding acquisition time;
determining a light effect enhancement direction according to the geographic position and the acquisition time;
acquiring a light effect intensity factor, wherein the light effect intensity factor is a parameter influencing the light effect enhancement processing intensity;
acquiring a light effect enhancement model according to the light effect enhancement direction and the light effect intensity factor, wherein the light effect enhancement model is a model for simulating light ray change;
and carrying out light effect enhancement processing on the image to be processed according to the light effect enhancement model, wherein the light effect enhancement processing refers to the processing of adding light effect to the image.
2. The method according to claim 1, wherein the obtaining of the geographical location of the electronic device when acquiring the image to be processed and the corresponding acquisition time comprises:
acquiring an environment brightness value when the electronic equipment acquires an image to be processed;
and if the environment brightness value is smaller than the first brightness threshold value, acquiring the geographic position of the electronic equipment when the electronic equipment acquires the image to be processed and the corresponding acquisition time.
3. The method of claim 1, wherein the obtaining a light effect intensity factor comprises:
acquiring the brightness value of each pixel point in an image to be processed, and acquiring a reference pixel point from the image to be processed according to the brightness value;
and acquiring a light effect intensity factor according to the brightness value corresponding to the reference pixel point.
4. The method according to claim 3, wherein the obtaining a reference pixel point from the image to be processed according to the brightness value comprises:
taking the pixel point with the maximum corresponding brightness value in the image to be processed as a reference pixel point; or
And taking the pixel points of which the corresponding brightness values are greater than a second brightness threshold value in the image to be processed as reference pixel points.
5. The method according to any one of claims 1 to 4, wherein the performing light effect enhancement processing on the image to be processed according to the light effect enhancement model comprises:
and acquiring a light effect enhancement coefficient corresponding to each pixel point in the image to be processed according to the light effect enhancement model, and performing light effect enhancement processing on each pixel point in the image to be processed according to the light effect enhancement coefficient.
6. An image processing apparatus characterized by comprising:
the image acquisition module is used for acquiring the geographical position of the electronic equipment when acquiring the image to be processed and the corresponding acquisition time;
the model acquisition module is used for determining the light effect enhancement direction according to the geographic position and the acquisition time; acquiring a light effect intensity factor, wherein the light effect intensity factor is a parameter influencing the light effect enhancement processing intensity; acquiring a light effect enhancement model according to the light effect enhancement direction and the light effect intensity factor, wherein the light effect enhancement model is a model for simulating light ray change;
and the lighting effect enhancement module is used for carrying out lighting effect enhancement processing on the image to be processed according to the lighting effect enhancement model, wherein the lighting effect enhancement processing refers to the processing of adding light effect to the image.
7. The apparatus of claim 6,
the image acquisition module is used for acquiring an environment brightness value when the electronic equipment acquires an image to be processed; and if the environment brightness value is smaller than the first brightness threshold value, acquiring the geographic position of the electronic equipment when the electronic equipment acquires the image to be processed and the corresponding acquisition time.
8. The apparatus of claim 6,
the model acquisition module is used for acquiring the brightness value of each pixel point in the image to be processed and acquiring a reference pixel point from the image to be processed according to the brightness value; and acquiring a light effect intensity factor according to the brightness value corresponding to the reference pixel point.
9. The apparatus of claim 8,
the model obtaining module is further configured to use a pixel point with the largest corresponding brightness value in the image to be processed as a reference pixel point; or taking the pixel points of which the corresponding brightness values are greater than the second brightness threshold value in the image to be processed as reference pixel points.
10. The apparatus according to any one of claims 6 to 9,
the light effect enhancement module is further used for obtaining light effect enhancement coefficients corresponding to all the pixel points in the image to be processed according to the light effect enhancement model, and carrying out light effect enhancement processing on all the pixel points in the image to be processed according to the light effect enhancement coefficients.
11. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1 to 5.
12. An electronic device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the method of any of claims 1-5.
CN201810466604.3A 2018-05-16 2018-05-16 Image processing method, image processing device, computer-readable storage medium and electronic equipment Active CN108600631B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810466604.3A CN108600631B (en) 2018-05-16 2018-05-16 Image processing method, image processing device, computer-readable storage medium and electronic equipment


Publications (2)

Publication Number Publication Date
CN108600631A CN108600631A (en) 2018-09-28
CN108600631B true CN108600631B (en) 2021-03-12

Family

ID=63631332

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810466604.3A Active CN108600631B (en) 2018-05-16 2018-05-16 Image processing method, image processing device, computer-readable storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN108600631B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111178118B (en) * 2018-11-13 2023-07-21 浙江宇视科技有限公司 Image acquisition processing method, device and computer readable storage medium
CN109712177B (en) * 2018-12-25 2021-07-09 Oppo广东移动通信有限公司 Image processing method, image processing device, electronic equipment and computer readable storage medium
CN112511737A (en) * 2020-10-29 2021-03-16 维沃移动通信有限公司 Image processing method and device, electronic equipment and readable storage medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101532881B (en) * 2009-04-03 2010-07-07 合肥工业大学 Single factor atmospheric polarization modeling method based on Rayleigh scattering
KR101066734B1 (en) * 2010-06-21 2011-09-21 중앙대학교 산학협력단 Method and apparatus for texture segmentation based on multi-scale entropy profile
CN102999890B (en) * 2011-09-09 2015-09-30 苏州普达新信息技术有限公司 Based on the image light dynamic changes of strength bearing calibration of environmental factor
CN103632351B (en) * 2013-12-16 2017-01-11 武汉大学 All-weather traffic image enhancement method based on brightness datum drift
CN107689031B (en) * 2016-08-03 2021-05-28 天津慧医谷科技有限公司 Color restoration method based on illumination compensation in tongue picture analysis
CN106557617A (en) * 2016-10-27 2017-04-05 北京航空航天大学 A kind of clear sky fixed-wing solar energy unmanned plane energy production power estimation method
CN107767348B (en) * 2017-09-27 2021-06-08 重庆大学 Single tunnel image rapid enhancement method based on imaging model constraint
CN107682685B (en) * 2017-10-30 2019-03-08 Oppo广东移动通信有限公司 White balancing treatment method and device, electronic device and computer readable storage medium

Also Published As

Publication number Publication date
CN108600631A (en) 2018-09-28

Similar Documents

Publication Publication Date Title
CN107948519B (en) Image processing method, device and equipment
CN108419028B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108012080B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN110445988B (en) Image processing method, image processing device, storage medium and electronic equipment
CN108989700B (en) Imaging control method, imaging control device, electronic device, and computer-readable storage medium
CN108322669B (en) Image acquisition method and apparatus, imaging apparatus, and readable storage medium
US11431915B2 (en) Image acquisition method, electronic device, and non-transitory computer readable storage medium
CN108055452B (en) Image processing method, device and equipment
CN110225248B (en) Image acquisition method and device, electronic equipment and computer readable storage medium
CN109246354B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN108717530B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108734676B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN110213494B (en) Photographing method and device, electronic equipment and computer readable storage medium
CN107704798B (en) Image blurring method and device, computer readable storage medium and computer device
CN107911682B (en) Image white balance processing method, device, storage medium and electronic equipment
CN110349163B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN109242794B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN108616700B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN108322651B (en) Photographing method and device, electronic equipment and computer readable storage medium
WO2020029679A1 (en) Control method and apparatus, imaging device, electronic device and readable storage medium
CN107194901B (en) Image processing method, image processing device, computer equipment and computer readable storage medium
CN108600631B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN112004029B (en) Exposure processing method, exposure processing device, electronic apparatus, and computer-readable storage medium
CN110956679B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN108848306B (en) Image processing method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant