CN108419028B - Image processing method, image processing device, computer-readable storage medium and electronic equipment - Google Patents


Info

Publication number
CN108419028B
Authority
CN
China
Prior art keywords
light effect
pixel point
image
effect enhancement
processed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201810231620.4A
Other languages
Chinese (zh)
Other versions
CN108419028A (en)
Inventor
袁全
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810231620.4A
Publication of CN108419028A
Application granted
Publication of CN108419028B
Status: Expired - Fee Related
Anticipated expiration

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • G06T5/77
    • G06T5/90

Abstract

The application relates to an image processing method, an image processing apparatus, a computer-readable storage medium, and an electronic device. The method comprises the following steps: acquiring a central pixel point, selected according to a trigger instruction, in an image to be processed; obtaining a light effect enhancement model according to the central pixel point, wherein the light effect enhancement model is a model that simulates the change of light intensity with the central pixel point as a light source; calculating a light effect enhancement coefficient of each pixel point in the image to be processed according to the light effect enhancement model; and performing light effect enhancement processing on each pixel point in the image to be processed according to the light effect enhancement coefficient. With the image processing method and apparatus, the computer-readable storage medium, and the electronic device, light effect enhancement processing can be applied to the image to be processed according to the selected central pixel point, so the image can be processed according to the user's requirements and the user stickiness of the electronic device is improved.

Description

Image processing method, image processing device, computer-readable storage medium and electronic equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to an image processing method and apparatus, a computer-readable storage medium, and an electronic device.
Background
An intelligent terminal can generate images in real time through its camera, download images over a network, or import images from an external device. If an acquired image does not meet the user's personalized requirements, the user can also post-process the acquired image according to his or her preference. For example, the overall brightness of the image may be adjusted, noise in the image may be reduced, or a portrait in the image may be beautified.
Disclosure of Invention
The embodiments of the application provide an image processing method and apparatus, a computer-readable storage medium, and an electronic device, which can improve user stickiness.
A method of image processing, the method comprising:
acquiring a central pixel point in the image to be processed selected according to the trigger instruction;
obtaining a light effect enhancement model according to the central pixel point, wherein the light effect enhancement model is a model which simulates light intensity change by taking the central pixel point as a light source;
calculating a light effect enhancement coefficient of each pixel point in the image to be processed according to the light effect enhancement model;
and carrying out light effect enhancement processing on each pixel point in the image to be processed according to the light effect enhancement coefficient.
An image processing apparatus, the apparatus comprising:
the center acquisition module is used for acquiring a center pixel point in the image to be processed selected according to the trigger instruction;
the model obtaining module is used for obtaining a light effect enhancement model according to the central pixel point, and the light effect enhancement model is a model which takes the central pixel point as a light source to simulate the change of light intensity;
the coefficient acquisition module is used for calculating the light effect enhancement coefficient of each pixel point in the image to be processed according to the light effect enhancement model;
and the enhancement processing module is used for carrying out light effect enhancement processing on each pixel point in the image to be processed according to the light effect enhancement coefficient.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring a central pixel point in the image to be processed selected according to the trigger instruction;
obtaining a light effect enhancement model according to the central pixel point, wherein the light effect enhancement model is a model which simulates light intensity change by taking the central pixel point as a light source;
calculating a light effect enhancement coefficient of each pixel point in the image to be processed according to the light effect enhancement model;
and carrying out light effect enhancement processing on each pixel point in the image to be processed according to the light effect enhancement coefficient.
An electronic device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the steps of:
acquiring a central pixel point in the image to be processed selected according to the trigger instruction;
obtaining a light effect enhancement model according to the central pixel point, wherein the light effect enhancement model is a model which simulates light intensity change by taking the central pixel point as a light source;
calculating a light effect enhancement coefficient of each pixel point in the image to be processed according to the light effect enhancement model;
and carrying out light effect enhancement processing on each pixel point in the image to be processed according to the light effect enhancement coefficient.
According to the image processing method, the image processing apparatus, the computer-readable storage medium, and the electronic device, the central pixel point in the image to be processed can be obtained according to the trigger instruction, and the light effect enhancement model is then determined according to the central pixel point. The light effect enhancement coefficient of each pixel point in the image to be processed is calculated according to the light effect enhancement model, and light effect enhancement processing is performed on each pixel point according to the calculated light effect enhancement coefficient. Therefore, any pixel point in the image to be processed can be selected as the central pixel point according to the received trigger instruction, and light effect enhancement processing is performed on the image to be processed around that central pixel point, so the image can be processed according to the user's requirements and the user stickiness of the electronic device is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a diagram of an application environment of an image processing method in one embodiment;
FIG. 2 is a flow diagram of a method of image processing in one embodiment;
FIG. 3 is a flow chart of an image processing method in another embodiment;
FIG. 4 is a schematic diagram of a light effect enhancement model in one embodiment;
FIG. 5 is a flowchart of an image processing method in yet another embodiment;
FIG. 6 is a flowchart of an image processing method in yet another embodiment;
FIG. 7 is a diagram showing a configuration of an image processing apparatus according to an embodiment;
FIG. 8 is a schematic diagram of an image processing circuit in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that the terms "first," "second," and the like used herein may describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client, without departing from the scope of the present application. Both the first client and the second client are clients, but they are not the same client.
FIG. 1 is a diagram of an application environment of an image processing method in one embodiment. As shown in FIG. 1, the application environment includes a user 102 and a terminal 104. The image to be processed may be displayed on the terminal 104, and the user 102 may select any area in the displayed image through a trigger instruction. The trigger instruction may be initiated by a touch operation, a physical key operation, a voice control operation, a shaking operation, or the like. After detecting the trigger instruction, the terminal 104 acquires the light effect center region of the image to be processed selected according to the trigger instruction and determines a central pixel point from the light effect center region; acquires a light effect enhancement model according to the central pixel point; calculates a light effect enhancement coefficient of each pixel point in the image to be processed according to the light effect enhancement model; and performs light effect enhancement processing on each pixel point according to the light effect enhancement coefficient. The terminal 104 is an electronic device located at the outermost periphery of the computer network and mainly used for inputting user information and outputting processing results; it may be, for example, a personal computer, a mobile terminal, a personal digital assistant, or a wearable electronic device. It is understood that in other embodiments provided by the present application, the application environment of the image processing method may include only the terminal 104.
FIG. 2 is a flow diagram of a method of image processing in one embodiment. As shown in fig. 2, the image processing method includes steps 202 to 208. Wherein:
step 202, obtaining a central pixel point in the image to be processed selected according to the trigger instruction.
In one embodiment, the scene is often complicated during shooting; in particular, the light in the shooting scene is complicated and changeable, and the user cannot alter the scene while shooting, so the effect the user wants can often be achieved only by post-processing. The image to be processed is an image that needs light effect enhancement processing, where a light effect is image enhancement that simulates the effect of a light source. Specifically, the light source effect may be that of natural light, stage light, studio light, film light, contour light, or the like. After a light source emits light, the light diffuses around the light source, and its intensity weakens as the distance from the light source increases. The central pixel point is the pixel point corresponding to the center of the light source.
The image to be processed is composed of a plurality of pixel points arranged into a two-dimensional pixel matrix. The resolution of the image to be processed can be expressed by the number of pixel points in the lateral and longitudinal directions, and the position of a pixel point in the image can be expressed by a two-dimensional coordinate. For example, a resolution of 320 × 640 means that the image contains 320 pixel points in each lateral direction and 640 pixel points in each longitudinal direction. If a coordinate system is established with the bottom-left pixel point of the image as the origin, the position of any pixel point in the image can be represented by a two-dimensional coordinate.
The electronic device can acquire the image to be processed and display it, and the user can select the central pixel point from the displayed image. Specifically, the image to be processed may be a preview image generated while the electronic device captures an image, an image already captured by the electronic device, or an image pre-stored in the electronic device, which is not limited herein. The trigger instruction can be a touch operation, a press of a physical key, a voice control operation, a shake of the mobile terminal, or another trigger operation. The touch operation includes a touch click, a touch long press, a touch slide, a multi-point touch operation, and the like, where a touch long press is a touch press that lasts longer than a preset duration.
And 204, acquiring a light effect enhancement model according to the central pixel point, wherein the light effect enhancement model is a model which takes the central pixel point as a light source to simulate the change of light intensity.
The light effect enhancement model is the model used to perform light effect enhancement processing on the image to be processed, and it can simulate the curve of the intensity change of light emitted by a light source. Acquiring the light effect enhancement model according to the central pixel point means obtaining a model that simulates, with the central pixel point as the light source, the light intensity at the position of each pixel point. A light effect enhancement reference model can be stored in the electronic device in advance; it can be a model that takes any reference pixel point in an image as the light source. After the central pixel point is obtained, the displacement of the central pixel point relative to the reference pixel point can be obtained, and the light effect enhancement model corresponding to the central pixel point is obtained by displacing the light effect enhancement reference model accordingly.
For example, a light effect enhancement reference model P(x, y), with the reference pixel point of coordinates (0, 0) as the light source, may be stored in the electronic device in advance. Assume that the selected central pixel point is (x0, y0); the displacement of the central pixel point relative to the reference pixel point is then (-x0, -y0), and the light effect enhancement model corresponding to the central pixel point, obtained by applying this displacement, is P(x - x0, y - y0). In the obtained light effect enhancement model P(x - x0, y - y0), the central pixel point (x0, y0) serves as the light source.
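As an illustration of this shifting step, the following sketch assumes the reference model is available as a function of pixel coordinates and evaluates the shifted model with NumPy; the Gaussian fall-off, the constant d, and the function names are illustrative assumptions, not part of the claimed method.

```python
import numpy as np

def reference_model(x, y, d=200.0):
    # Hypothetical reference light effect model P(x, y) whose light source
    # sits at the reference pixel (0, 0); d controls how quickly the
    # simulated light falls off with distance.
    return np.exp(-(x ** 2 + y ** 2) / (2 * d ** 2))

def shifted_model(x, y, center_x, center_y, d=200.0):
    # Shift the reference model so that the selected central pixel point
    # (center_x, center_y) becomes the simulated light source:
    # P(x - x0, y - y0).
    return reference_model(x - center_x, y - center_y, d)

# Evaluate the shifted model for every pixel of a 480x640 image.
height, width = 480, 640
ys, xs = np.mgrid[0:height, 0:width]          # per-pixel coordinates
coefficients = shifted_model(xs, ys, center_x=320, center_y=240)
print(coefficients[240, 320])                 # 1.0 at the central pixel point
```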
The image to be processed may be an RGB image composed of three RGB channels, or a monochrome image composed of a single channel. If the image to be processed is an RGB image, each pixel point has three corresponding RGB channel values. Different simulated light source effects may produce light of different colors, so the enhancement coefficients for the three RGB channels may differ. For example, sunlight may be yellowish, while stage light may be multicolored. Specifically, light effect enhancement models corresponding to the three RGB channels can be obtained separately according to the central pixel point, and the light effect enhancement coefficients corresponding to the three RGB channels can be calculated separately from those models.
And step 206, calculating the light effect enhancement coefficient of each pixel point in the image to be processed according to the light effect enhancement model.
In the embodiments provided by the application, the light effect enhancement model simulates the change of light intensity with the central pixel point as the light source, so the light effect enhancement coefficient of each pixel point in the image to be processed can be calculated from the model. The light effect enhancement coefficient is the parameter used to perform light effect enhancement processing on each pixel point. Generally, the farther a location is from the light source, the more the light is attenuated. Correspondingly, the farther a pixel point is from the central pixel point in the image to be processed, the smaller its light effect enhancement coefficient, and light effect enhancement processing can be performed on each pixel point according to the obtained coefficient.
And 208, carrying out light effect enhancement processing on each pixel point in the image to be processed according to the light effect enhancement coefficient.
The light effect enhancement processing is processing that enhances the brightness of an image. After the light effect enhancement coefficients are calculated, light effect enhancement processing can be performed on each pixel point in the image to be processed according to these coefficients, specifically by superposing the coefficients onto the image to be processed or by multiplying the image by them. It is understood that the value range of pixel values in an image is generally [0, 255], so the pixel values of the image after light effect enhancement processing cannot exceed 255.
For example, assume that the image to be processed is H0(x, y) and the light effect enhancement model is P(x, y). The image after light effect enhancement by superposition can then be expressed as H(x, y) = (1 + P(x, y)) · H0(x, y), and the image after light effect enhancement by multiplication can be expressed as H(x, y) = P(x, y) · H0(x, y). It is to be understood that the light effect enhancement processing may also be implemented in other ways, which are not limited herein.
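A minimal sketch of these two combination modes, assuming the image is a single-channel uint8 array and the coefficient map comes from a model such as the one above; clipping to 255 reflects the pixel-value range noted in the previous paragraph.

```python
import numpy as np

def enhance_superposition(image, coeff):
    # Superposition: H(x, y) = (1 + P(x, y)) * H0(x, y), clipped to [0, 255].
    # image: HxW (single channel) uint8 array; coeff: HxW coefficient map.
    out = (1.0 + coeff) * image.astype(np.float32)
    return np.clip(out, 0, 255).astype(np.uint8)

def enhance_multiplication(image, coeff):
    # Multiplication: H(x, y) = P(x, y) * H0(x, y). A coefficient below 1
    # would darken the pixel, so this variant assumes P(x, y) >= 1 where
    # the image should brighten.
    out = coeff * image.astype(np.float32)
    return np.clip(out, 0, 255).astype(np.uint8)
```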
Specifically, the light effect enhancement coefficients of the three RGB channels corresponding to each pixel point can be calculated according to the light effect enhancement model, and light effect enhancement processing can then be performed on the three RGB channels of each pixel point according to those coefficients. When enhancement of different intensities is applied to the individual channels, the resulting light effect of the image differs accordingly. For example, if, among the obtained coefficients for the three RGB channels, the coefficient for the R channel is greater than the coefficients for the G and B channels, then after light effect enhancement is performed according to these coefficients, the resulting image appears reddish relative to the image to be processed.
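The per-channel idea could look like the sketch below; the 1.2/1.0/1.0 channel gains that produce the reddish cast are example values, not values taken from the application.

```python
import numpy as np

def enhance_per_channel(image_rgb, coeff, channel_gains=(1.2, 1.0, 1.0)):
    # image_rgb: HxWx3 uint8 image; coeff: HxW light effect coefficient map.
    # Each channel gets its own enhancement strength, so the simulated light
    # can carry a color; here R > G = B, which gives a slightly reddish cast.
    out = image_rgb.astype(np.float32)
    for c, gain in enumerate(channel_gains):
        out[..., c] *= 1.0 + gain * coeff
    return np.clip(out, 0, 255).astype(np.uint8)
```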
The image processing method provided by this embodiment can acquire the light effect center region of the image to be processed according to the trigger instruction, determine the central pixel point from the light effect center region, and determine the light effect enhancement model according to the central pixel point. The light effect enhancement coefficient of each pixel point in the image to be processed is calculated according to the light effect enhancement model, and light effect enhancement processing is performed on each pixel point according to the calculated light effect enhancement coefficient. Therefore, any pixel point in the image to be processed can be selected as the central pixel point according to the received trigger instruction, and light effect enhancement processing is performed on the image to be processed around that central pixel point, so the image can be processed according to the user's requirements and the user stickiness of the electronic device is improved.
Fig. 3 is a flowchart of an image processing method in another embodiment. As shown in fig. 3, the image processing method includes steps 302 to 308. Wherein:
step 302, obtaining a central pixel point in the image to be processed selected according to the trigger instruction.
In one embodiment, the image to be processed is a two-dimensional pixel matrix formed by a plurality of pixel points, and the user can select any one of them as the central pixel point. Specifically, obtaining the central pixel point may include: acquiring the light effect center region of the image to be processed selected according to the trigger instruction, and determining the central pixel point from the light effect center region. The light effect center region is the region where the light source is located when light effect enhancement processing is performed on the image to be processed. It can be understood that, when selecting the central pixel point, the user may be limited by the size of the display screen of the electronic device and unable to select the central pixel point precisely. In that case, the light effect center region containing the central pixel point can be selected first, and the central pixel point then determined from that region. The user can issue a trigger instruction based on the image displayed by the electronic device, and the trigger instruction selects an area in the image to be processed. When the electronic device detects the trigger instruction, it can acquire the light effect center region of the image selected according to the trigger instruction.
For example, during shooting preview the user can touch a position of the image to be processed on the screen of the electronic device with a finger, and the electronic device can take the region touched by the finger as the light effect center region. After the light effect center region is determined, the central pixel point can be determined from it. Specifically, any pixel point in the light effect center region can be used as the central pixel point, or the pixel point at the center of the light effect center region can be used, which is not limited in this embodiment. After the light effect center region selected according to the trigger instruction is obtained, it can also be magnified and displayed, and the central pixel point then determined according to a center selection instruction input by the user.
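One possible way to turn the touched light effect center region into a single central pixel point, as described above, is to take the pixel at the geometric center of the region; the rectangle-based representation of the region below is an assumption made only for illustration.

```python
def center_pixel_from_region(region):
    # region: (left, top, width, height) of the touched light effect center
    # region, in image pixel coordinates; the pixel at its geometric center
    # is returned as the central pixel point.
    left, top, width, height = region
    return left + width // 2, top + height // 2

# e.g. a finger touch mapped to a 40x40 region whose top-left corner is (280, 160)
print(center_pixel_from_region((280, 160, 40, 40)))  # -> (300, 180)
```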
And 304, acquiring a two-dimensional Gaussian distribution function, and constructing a light effect enhancement model by taking the central pixel point as a maximum value point of the two-dimensional Gaussian distribution function.
Specifically, the light effect enhancement model may be constructed according to a two-dimensional gaussian distribution function. First, a two-dimensional gaussian distribution function is obtained as follows:
P(x, y) = exp(-(x² + y²) / (2d²))
wherein (x, y) represents the two-dimensional coordinates of any pixel point in the image to be processed, and d is a constant. This function is a two-dimensional Gaussian distribution function whose maximum lies at (0, 0). Obtaining the light effect enhancement model according to the central pixel point amounts to displacing this two-dimensional Gaussian distribution function so that its maximum point moves to the position of the central pixel point. Assume the central pixel point is (xo, yo); the resulting light effect enhancement model can then be expressed as:
Po(x, y) = exp(-((x - xo)² + (y - yo)²) / (2d²))
In the obtained light effect enhancement model, the central pixel point (xo, yo) is the maximum point; that is, the light effect enhancement coefficient Po(x, y) obtained at the central pixel point (xo, yo) is the largest. The intensity of the light effect enhancement coefficients can be adjusted through the constant d.
Fig. 4 is a schematic diagram of a light effect enhancement model in an embodiment. As shown in fig. 4, the resolution of the to-be-processed image in the light effect enhancement model is 50 × 50, and the coordinate value of the central pixel 402 is (25, 25). It can be seen that the light effect enhancement coefficient corresponding to the central pixel 402 is the largest, the light effect enhancement coefficients corresponding to other pixels in the image to be processed decrease with the increase of the distance from the central pixel 402, and the light effect enhancement coefficients corresponding to pixels farther away from the central pixel 402 are smaller.
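A sketch of constructing this Gaussian light effect enhancement model and evaluating the coefficient of every pixel point, reproducing the 50 × 50 setup of FIG. 4 with the central pixel at (25, 25); the exact exponent form (division by 2d²) and the value d = 10 are assumptions consistent with a standard two-dimensional Gaussian, not values taken from the application.

```python
import numpy as np

def light_effect_coefficients(width, height, center, d=10.0):
    # Two-dimensional Gaussian with its maximum at the central pixel point:
    # Po(x, y) = exp(-((x - xo)^2 + (y - yo)^2) / (2 * d^2)).
    xo, yo = center
    ys, xs = np.mgrid[0:height, 0:width]
    return np.exp(-((xs - xo) ** 2 + (ys - yo) ** 2) / (2 * d ** 2))

coeff = light_effect_coefficients(50, 50, center=(25, 25), d=10.0)
print(coeff[25, 25])   # 1.0: the largest coefficient, at the central pixel point
print(coeff[0, 0])     # much smaller far away from the central pixel point
```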
And step 306, calculating the light effect enhancement coefficient of each pixel point in the image to be processed according to the light effect enhancement model.
It can be understood that the image to be processed is a two-dimensional pixel matrix; a coordinate system can be established with the bottom-left pixel point of the image as the origin, so that each pixel point can be represented by a two-dimensional coordinate. The light effect enhancement coefficient of each pixel point can then be obtained from the light effect enhancement model by substituting the coordinates of the pixel point directly into the model.
And 308, detecting a portrait area in the image to be processed, and performing light effect enhancement processing on each pixel point in the portrait area according to the light effect enhancement coefficient.
Specifically, the user generally pays the most attention to the region where the portrait is located, so when light effect processing is performed on the image to be processed, it may be applied only to the portrait region, while the region outside the portrait region is left unprocessed or weakened. The portrait region is the region where the portrait in the image to be processed is located; it can be extracted according to a face region detected in the image to be processed.
Specifically, the face region of the image to be processed may be obtained through a face detection algorithm; the face detection algorithm may be a detection method based on geometric features, an eigenface detection method, a linear discriminant analysis method, a detection method based on a hidden Markov model, or the like, which is not limited herein. Generally, when an image is acquired by an image acquisition device, a depth map corresponding to the image can be acquired at the same time, and the pixel points in the depth map correspond to the pixel points in the image. A pixel point in the depth map represents the depth information of the corresponding pixel in the image, that is, the distance from the object corresponding to that pixel point to the image acquisition device. For example, the depth information may be obtained through two cameras, and the obtained depth corresponding to a pixel point may be 1 meter, 2 meters, or 3 meters. Acquiring the portrait region may specifically include: acquiring the image to be processed and the corresponding depth information; detecting the face region in the image to be processed, and acquiring the portrait region in the image to be processed according to the face region and the depth information. Generally, the portrait and the face are on the same vertical plane, so the depth information from the portrait to the image acquisition device and the depth information from the face to the image acquisition device fall within the same range. Therefore, after the face region is obtained, the depth information corresponding to the face region can be obtained from the depth map, the depth information corresponding to the portrait region can be derived from it, and the portrait region in the image to be processed can then be obtained according to that depth information. It is understood that the portrait region may also be obtained by other methods, which are not limited in this embodiment; for example, it may be obtained through artificial intelligence, a region growing method, or the like.
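A simplified sketch of the face-plus-depth segmentation described above, assuming a face bounding box from some face detection step and a depth map aligned with the image; the 0.5 m depth tolerance is an illustrative assumption.

```python
import numpy as np

def portrait_mask(depth_map, face_box, tolerance=0.5):
    # depth_map: HxW array of distances (in meters) from each pixel's object
    # to the image acquisition device; face_box: (left, top, width, height)
    # of the face region returned by some face detection algorithm.
    # Pixels whose depth is close to the face depth are treated as portrait,
    # since the portrait and the face lie roughly on the same vertical plane.
    left, top, w, h = face_box
    face_depth = np.median(depth_map[top:top + h, left:left + w])
    return np.abs(depth_map - face_depth) <= tolerance
```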
After the portrait region in the image to be processed is detected, light effect enhancement processing can be performed on the portrait region according to the light effect enhancement coefficients. The other regions outside the portrait region may be left unprocessed or may be weakened. For example, the pixel values of the pixels in the other regions may all be set to 0, the brightness of those pixels may be reduced, or the other regions may be blurred, which is not limited in this embodiment.
In one embodiment, a passer-by may be captured when the image to be processed is acquired, and passers-by and the main portrait may be distinguished according to the area of each portrait. A portrait with a large area is considered the main portrait, while a portrait with a small area is considered a passer-by; when light effect enhancement processing is performed, it can be applied only to the main portrait. The step of processing the portrait region may further comprise:
step 502, obtaining the area of the portrait area, and taking the portrait area with the area larger than the area threshold as the target portrait area.
The portrait region is composed of a plurality of pixel points in the image to be processed, and the more pixel points the portrait region contains, the larger its area. The area of the portrait region can be represented by the number of pixel points it contains, or by the ratio of that number to the total number of pixel points in the image to be processed, which is not limited in this embodiment.
And step 504, carrying out light effect enhancement processing on each pixel point in the target portrait area according to the light effect enhancement coefficient.
The portrait region whose area is larger than the area threshold is taken as the target portrait region, and light effect enhancement processing is performed on the target portrait region, so that portrait regions with smaller areas are weakened and the target portrait region is made more prominent.
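Selecting the target (main) portrait by area could look like the sketch below; representing each portrait as a boolean mask and expressing the area threshold as a fraction of the image size are assumptions made for illustration.

```python
def select_target_portraits(portrait_masks, total_pixels, area_ratio_threshold=0.05):
    # portrait_masks: list of HxW boolean masks, one per detected portrait.
    # A portrait whose pixel count exceeds the threshold fraction of the whole
    # image is kept as a target (main) portrait; smaller portraits are treated
    # as passers-by and excluded from the light effect enhancement.
    threshold = area_ratio_threshold * total_pixels
    return [mask for mask in portrait_masks if mask.sum() > threshold]
```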
In the embodiments provided by this application, the processing intensity of the light effect enhancement model can be adjusted: the light effect enhancement coefficients obtained from the model can be adjusted according to a light effect intensity factor. That is, different values of the light effect intensity factor yield different light effect enhancement coefficients for the pixel points. Specifically, a light effect intensity factor, which is a parameter that affects the intensity of the light effect enhancement processing, may be obtained, and the light effect enhancement model may be obtained according to the light effect intensity factor and the central pixel point.
Performing light effect enhancement processing on the image to be processed increases its brightness. If the image to be processed is already relatively bright, light effect enhancement processing may cause serious distortion. Specifically, the intensity of the light effect enhancement processing may be adjusted according to the brightness of the image to be processed: brightness data of the image to be processed is acquired, and the light effect intensity factor is obtained from the brightness data. The brightness data is data representing how bright the image to be processed is. The light effect intensity factor may be obtained from the average brightness of the image to be processed, from the maximum brightness, or from the average brightness of the portrait region, which is not limited in this embodiment.
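A minimal sketch of deriving a light effect intensity factor from brightness data, here the mean brightness of the image; the inverse-linear mapping is an assumption chosen only to show that a brighter image receives weaker enhancement.

```python
import numpy as np

def intensity_factor_from_brightness(gray_image, max_factor=1.0):
    # gray_image: HxW uint8 luminance of the image to be processed. The
    # brighter the image already is, the smaller the factor, so that light
    # effect enhancement does not over-expose an already bright image.
    mean_brightness = gray_image.astype(np.float32).mean()
    return max_factor * (1.0 - mean_brightness / 255.0)
```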
In one embodiment, the specific step of obtaining the light effect intensity factor according to the brightness data comprises:
step 602, obtaining a reference pixel point according to the luminance data, and obtaining a distance between the reference pixel point and the center pixel point.
And step 604, acquiring a light effect intensity factor according to the interval distance.
Specifically, the pixel point with the maximum brightness value in the image to be processed can be used as the reference pixel point, the spacing distance between the reference pixel point and the central pixel point is obtained, and the light effect intensity factor is obtained according to that spacing distance. It can be understood that the reference pixel point is the brightest point in the image to be processed, and the central pixel point is the pixel point with the largest light effect enhancement coefficient. The farther a pixel point is from the central pixel point, the smaller its light effect enhancement coefficient; so the farther the reference pixel point is from the central pixel point, the smaller its coefficient and the less it is affected by the light effect enhancement processing. Therefore, the light effect intensity factor can be obtained from the spacing distance between the reference pixel point and the central pixel point: the larger the spacing distance, the larger the light effect intensity factor. Specifically, the brightness value of the reference pixel point can also be obtained, and the light effect intensity factor obtained from both the brightness value and the spacing distance; the lower the brightness value of the reference pixel point and the larger the spacing distance, the larger the obtained light effect intensity factor.
In other embodiments provided by the application, pixel points with brightness values larger than a brightness threshold value in the image to be processed can be obtained as reference pixel points, and the spacing distance between each reference pixel point and the central pixel point is obtained; and acquiring a light effect intensity factor according to the acquired minimum value in the interval distances. When the brightness value is greater than the brightness threshold, it can be considered that the pixel point is too bright and is greatly influenced by the light effect enhancement processing. And then, the pixel points with the brightness values larger than the brightness threshold value can be obtained as reference pixel points, and the light effect intensity factor is obtained according to the reference pixel point closest to the central pixel point.
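The distance-based variant could be sketched as below: reference pixel points are those brighter than a threshold, the distance from the nearest one to the central pixel point is measured, and that distance is mapped to an intensity factor; the normalization by the image diagonal and the threshold value 220 are assumptions.

```python
import numpy as np

def intensity_factor_from_distance(gray_image, center, brightness_threshold=220,
                                   max_factor=1.0):
    # Pixels brighter than the threshold are taken as reference pixel points.
    # The closer the nearest reference pixel is to the central pixel point,
    # the more it would be boosted by the light effect, so the factor shrinks.
    height, width = gray_image.shape
    ys, xs = np.nonzero(gray_image > brightness_threshold)
    if xs.size == 0:
        return max_factor                      # no over-bright reference pixels
    cx, cy = center
    distances = np.hypot(xs - cx, ys - cy)
    diagonal = np.hypot(width, height)
    return max_factor * (distances.min() / diagonal)
```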
The image processing method provided by this embodiment can acquire the light effect center region of the image to be processed according to the trigger instruction, determine the central pixel point from the light effect center region, and determine the light effect enhancement model according to the central pixel point. The light effect enhancement coefficient of each pixel point in the image to be processed is calculated according to the light effect enhancement model, and light effect enhancement processing is performed on each pixel point according to the calculated light effect enhancement coefficient. Therefore, any pixel point in the image to be processed can be selected as the central pixel point according to the received trigger instruction, and light effect enhancement processing is performed on the image to be processed around that central pixel point, so the image can be processed according to the user's requirements and the user stickiness of the electronic device is improved.
It should be understood that although the steps in the flowcharts of FIGS. 2, 3, 5, and 6 are shown in sequence as indicated by the arrows, they are not necessarily performed in that sequence. Unless explicitly stated otherwise herein, the order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in FIGS. 2, 3, 5, and 6 may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and their order of execution is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Fig. 7 is a schematic structural diagram of an image processing apparatus according to an embodiment. As shown in fig. 7, the image processing apparatus 700 includes a center acquisition module 702, a model acquisition module 704, a coefficient acquisition module 706, and an enhancement processing module 708. Wherein:
and a center obtaining module 702, configured to obtain a center pixel point in the to-be-processed image selected according to the trigger instruction.
The model obtaining module 704 is configured to obtain a light effect enhancement model according to the central pixel point, where the light effect enhancement model is a model that simulates light intensity changes with the central pixel point as a light source.
A coefficient obtaining module 706, configured to calculate, according to the light effect enhancement model, a light effect enhancement coefficient of each pixel point in the image to be processed.
And the enhancement processing module 708 is configured to perform light effect enhancement processing on each pixel point in the image to be processed according to the light effect enhancement coefficient.
The image processing apparatus provided by this embodiment can acquire the light effect center region of the image to be processed according to the trigger instruction, determine the central pixel point from the light effect center region, and determine the light effect enhancement model according to the central pixel point. The light effect enhancement coefficient of each pixel point in the image to be processed is calculated according to the light effect enhancement model, and light effect enhancement processing is performed on each pixel point according to the calculated light effect enhancement coefficient. Therefore, any pixel point in the image to be processed can be selected as the central pixel point according to the received trigger instruction, and light effect enhancement processing is performed on the image to be processed around that central pixel point, so the image can be processed according to the user's requirements and the user stickiness of the electronic device is improved.
In an embodiment, the model obtaining module 704 is further configured to obtain a light effect intensity factor, and obtain a light effect enhancement model according to the light effect intensity factor and the center pixel point, where the light effect intensity factor is a parameter that affects the light effect enhancement processing intensity.
In one embodiment, the model obtaining module 704 is further configured to obtain brightness data of the image to be processed, and obtain the light effect intensity factor according to the brightness data.
In an embodiment, the model obtaining module 704 is further configured to obtain a reference pixel point according to the luminance data, and obtain a distance between the reference pixel point and the center pixel point; and acquiring a light effect intensity factor according to the interval distance.
In an embodiment, the model obtaining module 704 is further configured to obtain a two-dimensional gaussian distribution function, and construct the light effect enhancement model by using the central pixel point as a maximum point of the two-dimensional gaussian distribution function.
In an embodiment, the enhancement processing module 708 is further configured to detect a portrait area in the image to be processed, and perform light effect enhancement processing on each pixel point in the portrait area according to the light effect enhancement coefficient.
In one embodiment, the enhancement processing module 708 is further configured to obtain a region area of the portrait region, and use the portrait region with the region area greater than an area threshold as the target portrait region; and carrying out light effect enhancement processing on each pixel point in the target portrait area according to the light effect enhancement coefficient.
The division of the modules in the image processing apparatus is only for illustration, and in other embodiments, the image processing apparatus may be divided into different modules as needed to complete all or part of the functions of the image processing apparatus.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the image processing methods provided by the above-described embodiments.
A computer program product comprising instructions which, when run on a computer, cause the computer to perform the image processing method provided by the above embodiments.
The embodiment of the application also provides the electronic equipment. The electronic device includes therein an Image Processing circuit, which may be implemented using hardware and/or software components, and may include various Processing units defining an ISP (Image Signal Processing) pipeline. FIG. 8 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 8, for convenience of explanation, only aspects of the image processing technology related to the embodiments of the present application are shown.
As shown in fig. 8, the image processing circuit includes an ISP processor 840 and control logic 850. Image data captured by imaging device 810 is first processed by ISP processor 840, and ISP processor 840 analyzes the image data to capture image statistics that may be used to determine and/or control one or more parameters of imaging device 810. Imaging device 810 may include a camera having one or more lenses 812 and an image sensor 814. Image sensor 814 may include an array of color filters (e.g., Bayer filters), and image sensor 814 may acquire light intensity and wavelength information captured with each imaging pixel of image sensor 814 and provide a set of raw image data that may be processed by ISP processor 840. The sensor 820 (e.g., a gyroscope) may provide parameters of the acquired image processing (e.g., anti-shake parameters) to the ISP processor 840 based on the type of sensor 820 interface. The sensor 820 interface may utilize an SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the above.
In addition, the image sensor 814 may also send raw image data to the sensor 820, the sensor 820 may provide raw image data to the ISP processor 840 based on the sensor 820 interface type, or the sensor 820 may store raw image data in the image memory 830.
The ISP processor 840 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and ISP processor 840 may perform one or more image processing operations on the raw image data, collecting statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
ISP processor 840 may also receive image data from image memory 830. For example, the sensor 820 interface sends raw image data to the image memory 830, and the raw image data in the image memory 830 is then provided to the ISP processor 840 for processing. The image Memory 830 may be a portion of a Memory device, a storage device, or a separate dedicated Memory within an electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving raw image data from image sensor 814 interface or from sensor 820 interface or from image memory 830, ISP processor 840 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to image memory 830 for additional processing before being displayed. ISP processor 840 may also receive processed data from image memory 830, which is subjected to image data processing in the raw domain and in the RGB and YCbCr color spaces. The processed image data may be output to a display 880 for viewing by a user and/or further Processing by a Graphics Processing Unit (GPU). Further, the output of ISP processor 840 may also be sent to image memory 830 and display 880 may read image data from image memory 830. In one embodiment, image memory 830 may be configured to implement one or more frame buffers. Further, the output of the ISP processor 840 may be transmitted to an encoder/decoder 870 for encoding/decoding the image data. The encoded image data may be saved and decompressed before being displayed on the display 880 device.
The step of the ISP processor 840 processing the image data includes: the image data is subjected to VFE (Video Front End) Processing and CPP (Camera Post Processing). The VFE processing of the image data may include modifying the contrast or brightness of the image data, modifying digitally recorded lighting status data, performing compensation processing (e.g., white balance, automatic gain control, gamma correction, etc.) on the image data, performing filter processing on the image data, etc. CPP processing of image data may include scaling an image, providing a preview frame and a record frame to each path. Among other things, the CPP may use different codecs to process the preview and record frames. The image data processed by the ISP processor 840 may be sent to the light effect processing module 860 for light effect enhancement processing of the image before being displayed. The light effect Processing module 860 may be a Central Processing Unit (CPU), a GPU, a coprocessor, or the like. The data processed by the light effect processing module 860 may be transmitted to the encoder/decoder 870 to encode/decode image data. The encoded image data may be saved and decompressed before being displayed on the display 880 device. Wherein, the light effect processing module 860 may also be located between the encoder/decoder 870 and the display 880, i.e. the light effect enhancing module 860 performs the light effect enhancing processing on the imaged image. The encoder/decoder 870 may be a CPU, GPU, coprocessor, or the like in the mobile terminal.
The statistics determined by ISP processor 840 may be sent to control logic 850 unit. For example, the statistical data may include image sensor 814 statistical information such as auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 812 shading correction, and the like. Control logic 850 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters of imaging device 810 and ISP processor 840 based on the received statistical data. For example, the control parameters of imaging device 810 may include sensor 820 control parameters (e.g., gain, integration time for exposure control), camera flash control parameters, lens 812 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 812 shading correction parameters.
The image processing method described above can be implemented using the image processing technique of fig. 8.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a non-volatile computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the program is executed. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), or the like.
The above-described embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but this should not be construed as limiting the scope of the application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. An image processing method, characterized in that the method comprises:
acquiring a central pixel point in the image to be processed selected according to the trigger instruction;
acquiring brightness data of an image to be processed, acquiring a reference pixel point according to the brightness data, and acquiring a spacing distance between the reference pixel point and the central pixel point;
obtaining a light effect intensity factor according to the interval distance; the brightness value of the reference pixel point is greater than a brightness threshold value, and the light effect intensity factor is positively correlated with the spacing distance;
acquiring a light effect enhancement model according to the light effect intensity factor and the central pixel point, wherein the light effect intensity factor is a parameter influencing the light effect enhancement processing intensity, and the light effect enhancement model is a model simulating the light intensity change by taking the central pixel point as a light source;
calculating a light effect enhancement coefficient of each pixel point in the image to be processed according to the light effect enhancement model;
and carrying out light effect enhancement processing on each pixel point in the image to be processed according to the light effect enhancement coefficient.
2. The method of claim 1, wherein the obtaining a light effect enhancement model further comprises:
and acquiring a two-dimensional Gaussian distribution function, and constructing a light effect enhancement model by taking the central pixel point as a maximum value point of the two-dimensional Gaussian distribution function.
3. The method according to claim 1 or 2, wherein performing light effect enhancement processing on each pixel point in the image to be processed according to the light effect enhancement coefficient comprises:
and detecting a portrait area in the image to be processed, and performing light effect enhancement processing on each pixel point in the portrait area according to the light effect enhancement coefficient.
4. The method according to claim 3, wherein the performing light effect enhancement processing on each pixel point in the portrait area according to the light effect enhancement coefficient comprises:
acquiring the area of the portrait area, and taking the portrait area with the area larger than an area threshold value as a target portrait area;
and carrying out light effect enhancement processing on each pixel point in the target portrait area according to the light effect enhancement coefficient.
5. An image processing apparatus, characterized in that the apparatus comprises:
the center acquisition module is used for acquiring a center pixel point in the image to be processed selected according to the trigger instruction;
the model acquisition module is used for acquiring brightness data of an image to be processed, acquiring a reference pixel point according to the brightness data and acquiring the spacing distance between the reference pixel point and the central pixel point; obtaining a light effect intensity factor according to the interval distance; acquiring a light effect enhancement model according to the light effect intensity factor and the central pixel point; the light effect intensity factor is a parameter influencing light effect enhancement processing intensity, the light effect intensity factor is positively correlated with the interval distance, the brightness value of the reference pixel point is greater than a brightness threshold value, and the light effect enhancement model is a model simulating light intensity change by taking the central pixel point as a light source;
the coefficient acquisition module is used for calculating the light effect enhancement coefficient of each pixel point in the image to be processed according to the light effect enhancement model;
and the enhancement processing module is used for carrying out light effect enhancement processing on each pixel point in the image to be processed according to the light effect enhancement coefficient.
6. The apparatus of claim 5,
the model acquisition module is further used for acquiring a two-dimensional Gaussian distribution function, and constructing the light effect enhancement model by taking the central pixel point as the maximum value point of the two-dimensional Gaussian distribution function.
7. The apparatus of claim 5 or 6,
the enhancement processing module is further used for detecting a portrait area in the image to be processed, and carrying out light effect enhancement processing on each pixel point in the portrait area according to the light effect enhancement coefficient.
8. The apparatus of claim 7,
the enhancement processing module is further used for acquiring the area of the portrait area and taking the portrait area with the area larger than the area threshold value as a target portrait area; and carrying out light effect enhancement processing on each pixel point in the target portrait area according to the light effect enhancement coefficient.
9. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 4.
10. An electronic device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the image processing method of any of claims 1 to 4.
CN201810231620.4A 2018-03-20 2018-03-20 Image processing method, image processing device, computer-readable storage medium and electronic equipment Expired - Fee Related CN108419028B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810231620.4A CN108419028B (en) 2018-03-20 2018-03-20 Image processing method, image processing device, computer-readable storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810231620.4A CN108419028B (en) 2018-03-20 2018-03-20 Image processing method, image processing device, computer-readable storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN108419028A CN108419028A (en) 2018-08-17
CN108419028B true CN108419028B (en) 2020-07-17

Family

ID=63132834

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810231620.4A Expired - Fee Related CN108419028B (en) 2018-03-20 2018-03-20 Image processing method, image processing device, computer-readable storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN108419028B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109040598B (en) * 2018-08-29 2020-08-14 Oppo广东移动通信有限公司 Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN109325905B (en) * 2018-08-29 2023-10-13 Oppo广东移动通信有限公司 Image processing method, image processing device, computer readable storage medium and electronic apparatus
CN109242794B (en) * 2018-08-29 2021-05-11 Oppo广东移动通信有限公司 Image processing method, image processing device, electronic equipment and computer readable storage medium
CN109242793A (en) * 2018-08-29 2019-01-18 Oppo广东移动通信有限公司 Image processing method, device, computer readable storage medium and electronic equipment
CN109191398B (en) * 2018-08-29 2021-08-03 Oppo广东移动通信有限公司 Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN109246354B (en) * 2018-09-07 2020-04-24 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment and computer readable storage medium
CN109447926A (en) * 2018-09-29 2019-03-08 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment, computer readable storage medium
CN109345603B (en) * 2018-09-29 2021-08-31 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment and computer readable storage medium
CN109461186A (en) * 2018-10-15 2019-03-12 Oppo广东移动通信有限公司 Image processing method, device, computer readable storage medium and electronic equipment
CN109447927B (en) * 2018-10-15 2021-01-22 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment and computer readable storage medium
CN111178118B (en) * 2018-11-13 2023-07-21 浙江宇视科技有限公司 Image acquisition processing method, device and computer readable storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015018219A (en) * 2013-06-14 2015-01-29 キヤノン株式会社 Image display device and method for controlling the same

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101308572A (en) * 2008-06-24 2008-11-19 北京中星微电子有限公司 Luminous effect processing method and apparatus
JP2014206566A (en) * 2013-04-10 2014-10-30 株式会社ハートス Image processing light-emission control system, light-emitting display image processing program, and image light-emitting display method
CN104794699A (en) * 2015-05-08 2015-07-22 四川天上友嘉网络科技有限公司 Image rendering method applied to games
CN106709981A (en) * 2016-12-29 2017-05-24 天津瀚海星云数字科技有限公司 Local illumination method and local illumination system used for UI Sprite image in U3D
CN107742274A (en) * 2017-10-31 2018-02-27 广东欧珀移动通信有限公司 Image processing method, device, computer-readable recording medium and electronic equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on collaborative image processing algorithms for multi-light-source image refinement and detail enhancement; Zhang Defa et al.; Journal of Chongqing University of Posts and Telecommunications (Natural Science Edition); 2014-04-30; full text *

Also Published As

Publication number Publication date
CN108419028A (en) 2018-08-17

Similar Documents

Publication Publication Date Title
CN108419028B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
JP7003238B2 (en) Image processing methods, devices, and devices
EP3480783B1 (en) Image-processing method, apparatus and device
CN108537155B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN108734676B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN107451969B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN108111749B (en) Image processing method and device
CN107509031B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN108154514B (en) Image processing method, device and equipment
JP6903816B2 (en) Image processing method and equipment
CN108846807B (en) Light effect processing method and device, terminal and computer-readable storage medium
CN108717530B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN107395991B (en) Image synthesis method, image synthesis device, computer-readable storage medium and computer equipment
CN109242794B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN108616700B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN108012078A (en) Brightness of image processing method, device, storage medium and electronic equipment
CN107704798B (en) Image blurring method and device, computer readable storage medium and computer device
CN109191398B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN107194901B (en) Image processing method, image processing device, computer equipment and computer readable storage medium
CN109325905B (en) Image processing method, image processing device, computer readable storage medium and electronic apparatus
CN107454317B (en) Image processing method, image processing device, computer-readable storage medium and computer equipment
CN108600631B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN107295261B (en) Image defogging method and device, storage medium and mobile terminal
CN109040598B (en) Image processing method, image processing device, computer-readable storage medium and electronic equipment
CN108629329B (en) Image processing method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860

Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860

Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS Corp.,Ltd.

GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200717