CN110619614B - Image processing method, device, computer equipment and storage medium - Google Patents


Info

Publication number
CN110619614B
CN110619614B
Authority
CN
China
Prior art keywords
image
pixel
value
processing
pixel point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911019819.1A
Other languages
Chinese (zh)
Other versions
CN110619614A (en)
Inventor
吴文艺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Kugou Computer Technology Co Ltd
Original Assignee
Guangzhou Kugou Computer Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Kugou Computer Technology Co Ltd
Priority to CN201911019819.1A
Publication of CN110619614A
Application granted
Publication of CN110619614B
Legal status: Active

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T5/70
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20172Image enhancement details
    • G06T2207/20192Edge enhancement; Edge preservation

Abstract

The application discloses an image processing method, an image processing device, computer equipment and a storage medium, and belongs to the technical field of computers. The method comprises the following steps: performing image smoothing processing on a first image to obtain a second image; determining a third image based on the difference of the pixel values of corresponding pixel points in the first image and the second image; amplifying the pixel values greater than a preset threshold among the pixel values of the pixel points in the third image, and reducing the pixel values smaller than the preset threshold, to obtain a fourth image; and performing image synthesis based on the fourth image and the first image to obtain a fifth image. The method can reduce the loss of image edge sharpness during skin-smoothing processing.

Description

Image processing method, device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method and apparatus for image processing, a computer device, and a storage medium.
Background
In daily life, images, and portrait images in particular, are used more and more widely, and as these applications grow, users' requirements for portrait image quality keep rising.
To meet users' requirements for portrait image quality, developers created the skin-smoothing ("buffing") feature. Its function is to compute the final pixel value of each pixel point in the input image as a weighted average of that pixel point's value and the values of the pixel points in its neighborhood. After this operation the values of adjacent pixel points are more similar, which achieves the purpose of beautifying the subject's skin.
In carrying out the present application, the inventors have found that the prior art has at least the following problems:
Because the prior art computes a weighted average over the pixel value of each pixel point in the portrait image and the pixel points in its neighborhood, the values of the edge pixel points in the portrait also become similar, so that the outline of the portrait is unclear.
Disclosure of Invention
The embodiments of the present application provide an image processing method, an image processing device, computer equipment and a storage medium, which can solve the problem that the outline of a portrait is not clear after skin smoothing. The technical solutions are as follows:
in one aspect, there is provided a method of image processing, the method comprising:
performing image smoothing processing on the first image to obtain a second image;
determining a third image based on a difference in pixel values of corresponding pixels in the first image and the second image;
amplifying pixel values larger than a preset threshold value in the pixel values of all pixel points in the third image, and reducing the pixel values smaller than the preset threshold value to obtain a fourth image;
and performing image synthesis based on the fourth image and the first image to obtain a fifth image.
Optionally, before the image smoothing processing is performed on the first image to obtain the second image, the method further includes:
carrying out normalization processing on pixel values of all pixel points in an original image to obtain the first image;
the method further includes, after performing image synthesis based on the fourth image and the first image to obtain a fifth image:
and carrying out inverse processing of the normalization processing on the pixel values of all the pixel points in the fifth image to obtain a sixth image.
Optionally, in the pixel values of each pixel point in the third image, performing amplification processing on the pixel value greater than the preset threshold, and performing reduction processing on the pixel value less than the preset threshold, to obtain a fourth image, where the method includes:
In the pixel values of the pixel points in the third image, amplifying, based on the formula A' = 1 - (1 - A)^2/a, the pixel values greater than a preset threshold, and reducing, based on the formula B' = B^2/a, the pixel values smaller than the preset threshold, to obtain a fourth image, wherein A is the pixel value of a pixel point in the third image before amplification, A' is the pixel value after amplification, B is the pixel value of a pixel point in the third image before reduction, B' is the pixel value after reduction, and a is the preset threshold.
Optionally, the determining the third image based on the difference value of the pixel values of the corresponding pixels in the first image and the second image includes:
and taking the sum of the difference value of the pixel values of the corresponding pixel points in the first image and the second image and the preset value as the pixel value to obtain a third image.
Optionally, the image synthesizing based on the fourth image and the first image to obtain a fifth image includes:
based on the preset number of clusters, carrying out clustering processing on the pixel values of all pixel points of the fourth image to obtain the number of classes and the value of a cluster center corresponding to each class, and adjusting the pixel value in each class to be the value of the cluster center to obtain a seventh image;
and performing image synthesis on the seventh image and the first image to obtain a fifth image.
Optionally, the image synthesizing the seventh image and the first image to obtain a fifth image includes:
determining an adjusting factor corresponding to each pixel point based on the pixel value of each pixel point in the second image;
and based on the adjusting factors corresponding to the pixel points, performing image synthesis on the seventh image and the first image to obtain a fifth image.
Optionally, the determining, based on the pixel values of the pixels in the second image, the adjustment factor corresponding to the pixels includes:
Based on the formula P = n·X^m, determining the adjustment factor corresponding to each pixel point in the second image, wherein X is the pixel value of any pixel point in the second image, P is the adjustment factor corresponding to that pixel point, and n and m are preset values with 0 < n < 1 and 0 < m < 1.
Optionally, the image synthesizing the seventh image and the first image based on the adjustment factors corresponding to the pixel points to obtain a fifth image includes:
And determining the pixel value of each pixel point in a fifth image based on the formula Q + P·Q - P·L to obtain the fifth image, wherein Q and L are the pixel values of the corresponding pixel point in the first image and the seventh image respectively, and P is the adjustment factor of the corresponding pixel point.
In another aspect, there is provided an apparatus for image processing, the apparatus comprising:
the smoothing module is used for carrying out image smoothing processing on the first image to obtain a second image;
a difference module, configured to determine a third image based on a difference between pixel values of corresponding pixel points in the first image and the second image;
the scaling module is used for amplifying the pixel values larger than a preset threshold value in the pixel values of all the pixel points in the third image, and reducing the pixel values smaller than the preset threshold value to obtain a fourth image;
and the synthesis module is used for carrying out image synthesis based on the fourth image and the first image to obtain a fifth image.
Optionally, the apparatus further includes:
the normalization module is used for carrying out normalization processing on pixel values of all pixel points in the original image to obtain the first image;
the apparatus further comprises:
and the inverse normalization module is used for performing inverse processing of the normalization processing on the pixel values of all the pixel points in the fifth image to obtain a sixth image.
Optionally, the scaling module is configured to:
In the pixel values of the pixel points in the third image, amplify, based on the formula A' = 1 - (1 - A)^2/a, the pixel values greater than a preset threshold, and reduce, based on the formula B' = B^2/a, the pixel values smaller than the preset threshold, to obtain a fourth image, wherein A is the pixel value of a pixel point in the third image before amplification, A' is the pixel value after amplification, B is the pixel value of a pixel point in the third image before reduction, B' is the pixel value after reduction, and a is the preset threshold.
Optionally, the difference module is configured to:
and taking the sum of the difference value of the pixel values of the corresponding pixel points in the first image and the second image and the preset value as the pixel value to obtain a third image.
Optionally, the synthesis module is configured to:
based on the preset number of clusters, carrying out clustering processing on the pixel values of all pixel points of the fourth image to obtain the number of classes and the value of a cluster center corresponding to each class, and adjusting the pixel value in each class to be the value of the cluster center to obtain a seventh image;
and performing image synthesis on the seventh image and the first image to obtain a fifth image.
Optionally, the synthesis module is configured to:
determining an adjusting factor corresponding to each pixel point based on the pixel value of each pixel point in the second image;
and based on the adjusting factors corresponding to the pixel points, performing image synthesis on the seventh image and the first image to obtain a fifth image.
Optionally, the synthesis module is configured to:
Based on the formula P = n·X^m, determine the adjustment factor corresponding to each pixel point in the second image, wherein X is the pixel value of any pixel point in the second image, P is the adjustment factor corresponding to that pixel point, and n and m are preset values with 0 < n < 1 and 0 < m < 1.
Optionally, the synthesis module is configured to:
And determine the pixel value of each pixel point in a fifth image based on the formula Q + P·Q - P·L to obtain the fifth image, wherein Q and L are the pixel values of the corresponding pixel point in the first image and the seventh image respectively, and P is the adjustment factor of the corresponding pixel point.
In yet another aspect, a computer device is provided that includes a processor and a memory having at least one instruction stored therein that is loaded and executed by the processor to perform the operations performed by the method of image processing as described above.
In yet another aspect, a computer-readable storage medium having stored therein at least one instruction loaded and executed by a processor to perform the operations performed by the method of image processing as described above is provided.
The technical solutions provided by the embodiments of the present application bring at least the following beneficial effects:
the first image is subjected to smoothing processing to obtain a second image, then the difference value between the first image and the second image is calculated to obtain a third image, then the pixel value of a pixel point in the third image is subjected to amplification or reduction processing to obtain a fourth image, and then a fifth image is obtained based on the fourth image and the first image. In the processing from the third image to the fourth image, the edges are subjected to sharpening processing, so that the influence on the edge sharpness of the images can be reduced in the peeling processing.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic illustration of an implementation environment provided by an embodiment of the present application;
FIG. 2 is a flow chart of a method of image processing provided by an embodiment of the present application;
fig. 3 is a schematic view of an apparatus structure for image processing according to an embodiment of the present application;
fig. 4 is a schematic diagram of a terminal structure of image processing according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The embodiment of the application provides an image processing method which can be realized by a terminal. The terminal may have a display, and at the same time, the terminal may be a mobile phone, a desktop computer, a tablet computer, a notebook computer, etc. The terminal may have an application program, such as an image processing application program, installed therein.
As shown in fig. 1, the terminal is provided with an application program for image processing (hereinafter referred to as the application). A user can import a picture to be processed into the application and then adjust the color tone of the image, blur the background, adjust the brightness, and so on; for portrait images, functions such as face thinning, eye enlargement and skin smoothing can also be provided. In this embodiment, the terminal uses the application to perform skin-smoothing processing on a portrait image; other cases are handled similarly and are not described again.
When using the application, the user imports the portrait image to be processed and, as needed, taps the control for skin-smoothing the portrait. On receiving the skin-smoothing instruction, the application obtains the pixel values of all pixel points in the portrait image, computes new pixel values from them using the processing functions built into the application, and outputs those pixel values; the application then displays the image rendered from them. The image processing method provided by the embodiments of the present application optimizes the image so that the result is smooth and has clear edges.
Fig. 2 is a flowchart of an image processing method according to an embodiment of the present application. Referring to fig. 2, the process includes:
in step 201, an image smoothing process is performed on the first image to obtain a second image.
In an implementation, the user inputs the portrait image to be processed into the application, and the application parses the image to obtain the pixel value of each pixel point. A pixel value may consist of several channels; this embodiment takes three channels as an example, namely R, G and B, representing the three colors red, green and blue. Each channel carries a value in the range 0 to 255, where 0 represents the lowest brightness and 255 the highest. For example, if the pixel value of a certain pixel point is (0, 0, 0), its R, G and B values are all 0, meaning the pixel point has the lowest brightness, that is, black.
After the pixel values of the pixel points in the image are obtained, they are divided into three groups according to R, G and B, and each group is normalized: every pixel value in the 0-to-255 range is divided by 255, yielding pixel values between 0 and 1.
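The normalization described above can be sketched as follows; this is a minimal NumPy illustration, and the sample pixel data is made up for demonstration:

```python
# Normalization sketch: map uint8 pixel values in 0..255 to floats in [0, 1].
# The sample pixel below is illustrative, not from the patent.
import numpy as np

def normalize(image_u8: np.ndarray) -> np.ndarray:
    """Divide every channel value by 255 to obtain values in [0, 1]."""
    return image_u8.astype(np.float64) / 255.0

original = np.array([[[0, 128, 255]]], dtype=np.uint8)  # one RGB pixel
first_image = normalize(original)
```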
After the above operations, the first image, i.e. the normalized image, is obtained. The first image is then smoothed; the smoothing may be a Gaussian filtering computation over the pixel values of the pixel points in the first image, which may proceed as follows:
The three groups of pixel values for R, G and B are computed separately. The computation scans the pixel value of each pixel point in the image with a calculation model, which may be a convolution model: for any pixel point, the model computes a weighted average of the pixel values in a neighborhood of preset size around that pixel point and assigns the weighted average to it, i.e. the pixel value of the pixel point becomes the computed weighted average. This is repeated until, in all three groups, every pixel value has been replaced by the weighted average of its corresponding neighborhood, and the result of the computation is the second image.
All subsequent steps apply the same processing to the three groups of R, G and B pixel values, which is not repeated below.
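As a sketch of the neighborhood weighted average in step 201, the snippet below convolves one channel with a 3×3 Gaussian-like kernel; the patent does not fix the kernel size or weights, so these are assumed for illustration:

```python
# Smoothing sketch: each pixel becomes a weighted average of its 3x3
# neighborhood (edge-padded). Kernel weights are illustrative.
import numpy as np

KERNEL = np.array([[1, 2, 1],
                   [2, 4, 2],
                   [1, 2, 1]], dtype=np.float64) / 16.0  # weights sum to 1

def smooth_channel(channel: np.ndarray) -> np.ndarray:
    padded = np.pad(channel, 1, mode="edge")
    out = np.zeros_like(channel)
    for i in range(channel.shape[0]):
        for j in range(channel.shape[1]):
            out[i, j] = np.sum(padded[i:i + 3, j:j + 3] * KERNEL)
    return out

first = np.full((4, 4), 0.5)   # a flat region is unchanged by smoothing
second = smooth_channel(first)
```

A flat region stays at 0.5, while a value near an edge is pulled toward its neighbors, which is exactly why edges blur.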
Step 202, determining a third image based on a difference value of pixel values of corresponding pixel points in the first image and the second image.
In an implementation, the pixel value of each pixel point in the second image obtained by the smoothing above is first subtracted from the pixel value of the corresponding pixel point in the first image. Then, to prevent negative numbers from appearing in the subtraction result, a preset value is added to each difference; the preset value may be 0.5, the middle of the interval from 0 to 1. The values obtained after adding 0.5 generally fall within the interval from 0 to 1 and are used as the pixel values of the pixel points in the third image.
In the special case where a value is still negative after 0.5 is added, the pixel value is set to 0.
If a value exceeds 1 after 0.5 is added, the pixel value is set to 1.
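The computation of the third image (difference plus 0.5, clamped to [0, 1]) can be sketched as:

```python
import numpy as np

def difference_image(first: np.ndarray, second: np.ndarray) -> np.ndarray:
    """Step 202 sketch: shift the first-minus-second difference by 0.5
    and clamp to [0, 1], so fully smoothed regions sit exactly at 0.5."""
    return np.clip(first - second + 0.5, 0.0, 1.0)

first = np.array([0.9, 0.5, 0.1])
second = np.array([0.5, 0.5, 0.5])   # a fully smoothed region
third = difference_image(first, second)
```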
In step 203, among the pixel values of each pixel point in the third image, the pixel value greater than the preset threshold is amplified, and the pixel value less than the preset threshold is reduced, so as to obtain a fourth image.
In an implementation, after the pixel value of each pixel point in the third image is obtained, it is determined whether each pixel value is greater than a preset threshold, where the preset threshold may be set as follows:
First, after the processing in step 202, the pixel value of a smoothed pixel point is the intermediate value above, i.e. 0.5, while a non-smoothed pixel point, i.e. an edge pixel point, falls into one of two cases:
in the first case, the pixel value of the second image is greater than the pixel value of the first image, that is, since the pixel point is an edge pixel point, the pixel value in the neighborhood of the pixel point is higher than the pixel point of the pixel point, and after the smoothing processing, the finally obtained pixel value is higher than the original pixel value of the pixel point. The final value in this case is less than 0.5.
In the second case, the pixel value in the second image is smaller than the pixel value in the first image: because the pixel point is an edge pixel point, the pixel values in its neighborhood are lower than its own, so after smoothing the resulting pixel value is lower than the pixel point's original value. In this case the final value in the third image is greater than 0.5.
Next, the preset threshold is determined. From the above, 0.5 is both the value of every smoothed pixel point and the critical value separating the two kinds of edge pixel points, so 0.5 can be set as the preset threshold and the following judgments made:
when it is larger than 0.5, the pixel value is brought into formula 1- (1-A) 2 0.5, the formula is a quadratic function, and according to the image of the function, the result obtained by calculation is larger than the previous pixel value, namely, the brightness of a part of edge points is lightened.
When a pixel value is not greater than 0.5, it is substituted into the formula B' = B^2/0.5. This formula is also a quadratic function, and from its graph the computed result is not greater than the previous pixel value: a value of exactly 0.5 maps to itself, so the brightness of the smooth points is unchanged, while the brightness of the other part of the edge points is decreased.
After the above operations, the resulting pixel values are the pixel values of the fourth image.
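The two quadratic mappings above, with the threshold a = 0.5 from the description, can be sketched as:

```python
import numpy as np

def enhance(third: np.ndarray, a: float = 0.5) -> np.ndarray:
    """Step 203 sketch: values above the threshold a are brightened via
    A' = 1 - (1 - A)^2 / a; values at or below a are darkened via
    B' = B^2 / a. A value of exactly a maps to itself."""
    above = third > a
    fourth = np.empty_like(third)
    fourth[above] = 1.0 - (1.0 - third[above]) ** 2 / a
    fourth[~above] = third[~above] ** 2 / a
    return fourth

fourth = enhance(np.array([0.8, 0.5, 0.2]))  # bright edge, smooth, dark edge
```

Here 0.8 maps to 0.92 and 0.2 maps to 0.08, stretching edge contrast, while the smooth value 0.5 is left unchanged.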
And 204, performing image synthesis based on the fourth image and the first image to obtain a fifth image.
In implementation, based on the preset number of clusters, the pixel values of each pixel point in the fourth image are clustered, and the clustering may be fuzzy C-means clustering, and the processing procedure may be as follows:
and inputting the pixel values of all pixel points in the fourth image into a fuzzy C-means clustering processing model, setting iteration times for obtaining an expected result, outputting a central value when the iteration times are up to the iteration times, and also setting a preset threshold value, and outputting the central value when the difference value of the central values calculated in the last two times is smaller than the preset threshold value. After the setting is completed, the number of the central values are input to the fuzzy C-means clustering processing model, the fuzzy C-means clustering processing model generates the number of the central values after processing, the central values are obtained and stored, then the central values are used as input to repeat the step of generating the central values until the preset iteration times are reached, or the difference value between the output central values and the input central values is smaller than a preset threshold value, and the finally obtained central values are output.
The pixel value of every pixel point in each cluster is then set, according to the correspondence, to the center value output by the steps above, and the seventh image is obtained.
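A minimal fuzzy C-means sketch for this clustering step follows; the cluster count, initialization, and fixed iteration limit are illustrative choices, since the patent leaves them configurable:

```python
import numpy as np

def fuzzy_c_means(values, n_clusters=2, m=2.0, iters=50):
    """Minimal fuzzy C-means on scalar pixel values.

    Returns the cluster centers and a hard label per value (the cluster
    with the highest membership). The linspace initialization and fixed
    iteration count are simplifications for illustration."""
    x = np.asarray(values, dtype=np.float64)
    centers = np.linspace(x.min(), x.max(), n_clusters)
    for _ in range(iters):
        dist = np.abs(x[:, None] - centers[None, :]) + 1e-12
        # Membership u[k, i] = 1 / sum_j (d_ki / d_kj)^(2 / (m - 1))
        u = 1.0 / np.sum((dist[:, :, None] / dist[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
        # Center update: mean of the data weighted by u^m
        centers = (u ** m).T @ x / np.sum(u ** m, axis=0)
    return centers, u.argmax(axis=1)

pixels = np.array([0.05, 0.1, 0.12, 0.85, 0.9, 0.95])
centers, labels = fuzzy_c_means(pixels)
quantized = centers[labels]   # each pixel replaced by its cluster center
```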
After obtaining the seventh image, performing image synthesis based on the seventh image and the first image to obtain a fifth image, wherein the processing procedure may be as follows:
First, the adjustment factors are computed based on the second image. A power-function operation is applied to the pixel value of each pixel point in the second image, where the exponent is a preset decimal between 0 and 1; since the input pixel value is also a decimal between 0 and 1, the result of the power operation is larger than the input pixel value. The result of the power function is then multiplied by a decimal set by the technician, giving the adjustment factor of each pixel point.
Then, the fifth image is computed based on the adjustment factors. The pixel value of each pixel point in the second image is multiplied by that point's adjustment factor to obtain the modification values of the second image on the first image, and the pixel value of each pixel point in the seventh image is multiplied by that point's adjustment factor to obtain the modification values of the seventh image on the first image. The pixel value of each pixel point in the first image is then added to the corresponding modification value of the second image, and the corresponding modification value of the seventh image is subtracted, giving the fifth image.
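Putting the adjustment factor and the synthesis together, the sketch below follows the formula given in the claims, Q + P·Q - P·L; the values of n and m are illustrative, since the patent only requires them to lie in (0, 1):

```python
import numpy as np

def synthesize(first, second, seventh, n=0.5, m=0.5):
    """Step 204 sketch: P = n * X^m is computed per pixel from the second
    (smoothed) image X, then the fifth image is Q + P*Q - P*L, where Q and
    L are the first and seventh images. n and m are illustrative values."""
    p = n * np.power(second, m)   # adjustment factor per pixel
    return first + p * first - p * seventh

first = np.array([0.4])
second = np.array([0.25])    # p = 0.5 * sqrt(0.25) = 0.25
seventh = np.array([0.6])
fifth = synthesize(first, second, seventh)  # 0.4 + 0.25*0.4 - 0.25*0.6 = 0.35
```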
After the fifth image is obtained, the application computes the sixth image: the pixel value of each pixel point in the fifth image is multiplied by 0.3 and added to 0.7 times the corresponding pixel value in the first image, and the inverse of the normalization is then applied, i.e. the pixel values in the interval 0 to 1 obtained through the steps above are mapped back to the interval 0 to 255. The sixth image is thus obtained, and the application displays it.
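The final blend and inverse normalization can be sketched as follows; the 0.3/0.7 weights come from the description above:

```python
import numpy as np

def finalize(fifth: np.ndarray, first: np.ndarray) -> np.ndarray:
    """Blend the fifth image with the first (0.3 / 0.7 weights from the
    description), then map [0, 1] back to integer values in 0..255."""
    blended = 0.3 * fifth + 0.7 * first
    return np.clip(np.rint(blended * 255.0), 0, 255).astype(np.uint8)

sixth = finalize(np.array([1.0, 0.0]), np.array([0.0, 1.0]))
```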
The first image is smoothed to obtain a second image; the difference between the first image and the second image is computed to obtain a third image; the pixel values of the pixel points in the third image are then amplified or reduced to obtain a fourth image; and a fifth image is obtained based on the fourth image and the first image. In the processing from the third image to the fourth image, the edges are sharpened, so the loss of edge sharpness caused by skin-smoothing processing can be reduced.
Any combination of the above-mentioned optional solutions may be adopted to form an optional embodiment of the present disclosure, which is not described herein in detail.
Fig. 3 is a schematic diagram of an image processing apparatus according to an embodiment of the present application. Referring to fig. 3, the apparatus includes:
a smoothing module 310, configured to perform image smoothing processing on the first image to obtain a second image;
a difference module 320, configured to determine a third image based on a difference between pixel values of corresponding pixel points in the first image and the second image;
the scaling module 330 is configured to amplify a pixel value greater than a preset threshold value among the pixel values of each pixel point in the third image, and reduce a pixel value less than the preset threshold value to obtain a fourth image;
and a synthesizing module 340, configured to synthesize an image based on the fourth image and the first image, so as to obtain a fifth image.
Optionally, the apparatus further includes:
the normalization module is used for carrying out normalization processing on pixel values of all pixel points in the original image to obtain the first image;
the apparatus further comprises:
and the inverse normalization module is used for performing inverse processing of the normalization processing on the pixel values of all the pixel points in the fifth image to obtain a sixth image.
Optionally, the scaling module 330 is configured to:
In the pixel values of the pixel points in the third image, amplify, based on the formula A' = 1 - (1 - A)^2/a, the pixel values greater than a preset threshold, and reduce, based on the formula B' = B^2/a, the pixel values smaller than the preset threshold, to obtain a fourth image, wherein A is the pixel value of a pixel point in the third image before amplification, A' is the pixel value after amplification, B is the pixel value of a pixel point in the third image before reduction, B' is the pixel value after reduction, and a is the preset threshold.
Optionally, the difference module 320 is configured to:
and taking the sum of the difference value of the pixel values of the corresponding pixel points in the first image and the second image and the preset value as the pixel value to obtain a third image.
Optionally, the synthesizing module 340 is configured to:
based on the preset number of clusters, carrying out clustering processing on the pixel values of all pixel points of the fourth image to obtain the number of classes and the value of a cluster center corresponding to each class, and adjusting the pixel value in each class to be the value of the cluster center to obtain a seventh image;
and performing image synthesis on the seventh image and the first image to obtain a fifth image.
Optionally, the synthesizing module 340 is configured to:
determining an adjusting factor corresponding to each pixel point based on the pixel value of each pixel point in the second image;
and based on the adjusting factors corresponding to the pixel points, performing image synthesis on the seventh image and the first image to obtain a fifth image.
Optionally, the synthesizing module 340 is configured to:
Based on the formula P = n·X^m, determine the adjustment factor corresponding to each pixel point in the second image, wherein X is the pixel value of any pixel point in the second image, P is the adjustment factor corresponding to that pixel point, and n and m are preset values with 0 < n < 1 and 0 < m < 1.
Optionally, the synthesizing module 340 is configured to:
and determining the pixel value of each pixel point in a fifth image based on the formula Q' = Q + P·Q - P·L to obtain the fifth image, wherein Q and L are respectively the pixel values of the corresponding pixel point in the first image and the seventh image, and P is the adjustment factor of the corresponding pixel point.
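Taken together, the adjustment factor and the synthesis formula can be sketched per pixel; this assumes the garbled synthesis formula reads Q' = Q + P·Q - P·L, and the default n, m, and sample values are illustrative only:

```python
def synthesize_pixel(q, l, x, n=0.5, m=0.5):
    """q: pixel of the first image; l: pixel of the seventh image;
    x: pixel of the smoothed second image; n, m: preset values in (0, 1)."""
    p = n * x ** m            # adjustment factor P = n * X^m
    return q + p * q - p * l  # one reading of the patent's Q+P·Q-P·L
```

With q = 0.5, l = 0.2, x = 0.25 this yields P = 0.25 and a synthesized value of 0.575, i.e. the output is pushed away from the detail image in proportion to the local smoothed brightness.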
The method comprises the steps of performing smoothing processing on a first image to obtain a second image, calculating the difference between the first image and the second image to obtain a third image, performing amplification or reduction processing on the pixel values of the pixel points in the third image to obtain a fourth image, and then obtaining a fifth image based on the fourth image and the first image, so that the pixel values of some of the edge pixel points in the image are reduced while the pixel values of others are increased, thereby making the edge contour of the image clear.
The first image is subjected to smoothing processing to obtain a second image, the difference between the first image and the second image is then calculated to obtain a third image, the pixel values of the pixel points in the third image are subjected to amplification or reduction processing to obtain a fourth image, and a fifth image is then obtained based on the fourth image and the first image. Because the edges are sharpened in the processing from the third image to the fourth image, the loss of edge sharpness caused by the skin-smoothing ("peeling") processing can be reduced.
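The full first-to-fifth-image pipeline described above can be sketched on a 1-D list of normalized pixel values. This is a hedged approximation: the 3-tap box blur stands in for whatever smoothing filter an implementation uses, the clustering ("seventh image") step is skipped so the fourth image is combined directly, and every parameter value is illustrative:

```python
def sharpen(pixels, a=0.5, offset=0.5, n=0.5, m=0.5):
    """Edge-preserving sharpening sketch over a 1-D 'image' in [0, 1]."""
    first = pixels  # first image: assumed already normalized
    last = len(first) - 1
    # second image: 3-tap box blur with edge clamping (stand-in smoother)
    second = [(first[max(i - 1, 0)] + first[i] + first[min(i + 1, last)]) / 3
              for i in range(len(first))]
    # third image: difference plus a preset offset
    third = [f - s + offset for f, s in zip(first, second)]
    # fourth image: amplify above threshold a, reduce below it
    fourth = [1 - (1 - t) ** 2 if t > a else t * t / a for t in third]
    # fifth image: Q' = Q + P*(Q - L), with adjustment factor P = n * X^m
    return [q + n * x ** m * (q - l)
            for q, l, x in zip(first, fourth, second)]
```

A flat input passes through unchanged (the difference image is a constant offset, so no edge is amplified), which matches the intent that only edge pixels are adjusted.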
Fig. 4 shows a block diagram of a terminal 400 according to an exemplary embodiment of the present application. The terminal 400 may be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 400 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, etc.
In general, the terminal 400 includes: a processor 401 and a memory 402.
Processor 401 may include one or more processing cores, such as a 4-core or an 8-core processor. The processor 401 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 401 may also include a main processor and a coprocessor. The main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 401 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 401 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 402 may include one or more computer-readable storage media, which may be non-transitory. Memory 402 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 402 is used to store at least one instruction for execution by processor 401 to implement the image processing methods provided by the method embodiments herein.
In some embodiments, the terminal 400 may further optionally include: a peripheral interface 403 and at least one peripheral. The processor 401, memory 402, and peripheral interface 403 may be connected by a bus or signal line. The individual peripheral devices may be connected to the peripheral device interface 403 via buses, signal lines or a circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 404, a touch display 405, a camera 406, audio circuitry 407, a positioning component 408, and a power supply 409.
Peripheral interface 403 may be used to connect at least one Input/Output (I/O) related peripheral to the processor 401 and the memory 402. In some embodiments, the processor 401, the memory 402, and the peripheral interface 403 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 401, the memory 402, and the peripheral interface 403 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 404 is configured to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 404 communicates with a communication network and other communication devices via electromagnetic signals, converting an electrical signal into an electromagnetic signal for transmission, or converting a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 404 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 404 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: metropolitan area networks, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 404 may also include NFC (Near Field Communication) related circuitry, which is not limited in this application.
The display screen 405 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 405 is a touch display screen, the display screen 405 also has the ability to collect touch signals at or above the surface of the display screen 405. The touch signal may be input as a control signal to the processor 401 for processing. At this time, the display screen 405 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 405 may be one, providing a front panel of the terminal 400; in other embodiments, the display 405 may be at least two, and disposed on different surfaces of the terminal 400 or in a folded design; in still other embodiments, the display 405 may be a flexible display disposed on a curved surface or a folded surface of the terminal 400. Even more, the display screen 405 may be arranged in an irregular pattern that is not rectangular, i.e. a shaped screen. The display 405 may be made of LCD (Liquid Crystal Display ), OLED (Organic Light-Emitting Diode) or other materials.
The camera assembly 406 is used to capture images or video. Optionally, camera assembly 406 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting, VR (Virtual Reality) shooting, or other fused shooting functions. In some embodiments, camera assembly 406 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation at different color temperatures.
The audio circuit 407 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and environments, converting the sound waves into electric signals, and inputting the electric signals to the processor 401 for processing, or inputting the electric signals to the radio frequency circuit 404 for realizing voice communication. For the purpose of stereo acquisition or noise reduction, a plurality of microphones may be respectively disposed at different portions of the terminal 400. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is used to convert electrical signals from the processor 401 or the radio frequency circuit 404 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, audio circuit 407 may also include a headphone jack.
The positioning component 408 is used to locate the current geographic location of the terminal 400 to enable navigation or LBS (Location Based Service). The positioning component 408 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 409 is used to power the various components in the terminal 400. The power supply 409 may be an alternating current, a direct current, a disposable battery, or a rechargeable battery. When power supply 409 comprises a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal 400 further includes one or more sensors 410. The one or more sensors 410 include, but are not limited to: acceleration sensor 411, gyroscope sensor 412, pressure sensor 413, fingerprint sensor 414, optical sensor 415, and proximity sensor 416.
The acceleration sensor 411 may detect the magnitudes of accelerations on three coordinate axes of the coordinate system established with the terminal 400. For example, the acceleration sensor 411 may be used to detect components of gravitational acceleration on three coordinate axes. The processor 401 may control the touch display screen 405 to display a user interface in a lateral view or a longitudinal view according to the gravitational acceleration signal acquired by the acceleration sensor 411. The acceleration sensor 411 may also be used for the acquisition of motion data of a game or a user.
The gyro sensor 412 may detect a body direction and a rotation angle of the terminal 400, and the gyro sensor 412 may collect a 3D motion of the user to the terminal 400 in cooperation with the acceleration sensor 411. The processor 401 may implement the following functions according to the data collected by the gyro sensor 412: motion sensing (e.g., changing UI according to a tilting operation by a user), image stabilization at shooting, game control, and inertial navigation.
The pressure sensor 413 may be disposed at a side frame of the terminal 400 and/or at a lower layer of the touch display 405. When the pressure sensor 413 is disposed at a side frame of the terminal 400, a grip signal of the terminal 400 by a user may be detected, and the processor 401 performs a left-right hand recognition or a shortcut operation according to the grip signal collected by the pressure sensor 413. When the pressure sensor 413 is disposed at the lower layer of the touch display screen 405, the processor 401 controls the operability control on the UI interface according to the pressure operation of the user on the touch display screen 405. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 414 is used to collect a user's fingerprint, and the processor 401 identifies the user's identity based on the fingerprint collected by the fingerprint sensor 414, or the fingerprint sensor 414 itself identifies the user's identity based on the collected fingerprint. When the user's identity is recognized as a trusted identity, the processor 401 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 414 may be provided on the front, back, or side of the terminal 400. When a physical key or vendor Logo is provided on the terminal 400, the fingerprint sensor 414 may be integrated with the physical key or vendor Logo.
The optical sensor 415 is used to collect the ambient light intensity. In one embodiment, the processor 401 may control the display brightness of the touch display screen 405 according to the ambient light intensity collected by the optical sensor 415. Specifically, when the intensity of the ambient light is high, the display brightness of the touch display screen 405 is turned up; when the ambient light intensity is low, the display brightness of the touch display screen 405 is turned down. In another embodiment, the processor 401 may also dynamically adjust the shooting parameters of the camera assembly 406 according to the ambient light intensity collected by the optical sensor 415.
A proximity sensor 416, also referred to as a distance sensor, is typically provided on the front panel of the terminal 400. The proximity sensor 416 is used to collect the distance between the user and the front of the terminal 400. In one embodiment, when the proximity sensor 416 detects a gradual decrease in the distance between the user and the front face of the terminal 400, the processor 401 controls the touch display 405 to switch from the bright screen state to the off screen state; when the proximity sensor 416 detects that the distance between the user and the front surface of the terminal 400 gradually increases, the processor 401 controls the touch display screen 405 to switch from the off-screen state to the on-screen state.
Those skilled in the art will appreciate that the structure shown in fig. 4 is not limiting of the terminal 400 and may include more or fewer components than shown, or may combine certain components, or may employ a different arrangement of components.
In an exemplary embodiment, a computer readable storage medium is also provided, such as a memory comprising instructions executable by a processor in a terminal to perform the image processing method of the above embodiment. For example, the computer readable storage medium may be a ROM (Read-Only Memory), a RAM (Random Access Memory), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, etc.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The foregoing description is merely of preferred embodiments of the present application and is not intended to limit the present application; any modification, equivalent replacement, or improvement made within the spirit and principles of the present application shall fall within the protection scope of the present application.

Claims (12)

1. A method of image processing, the method comprising:
carrying out normalization processing on pixel values of all pixel points in an original image to obtain a first image;
performing image smoothing processing on the first image to obtain a second image;
taking, as the pixel value of each pixel point in a third image, the sum of a preset value and the difference between the pixel values of the corresponding pixel points in the first image and the second image, to obtain the third image;
in the pixel values of the pixel points in the third image, amplifying the pixel values greater than a preset threshold based on the formula A' = 1 - (1 - A)^2, and reducing the pixel values smaller than the preset threshold based on the formula B' = B^2/a, to obtain a fourth image, wherein A' is the pixel value of the pixel point in the third image after amplification, A is the pixel value of the pixel point in the third image before amplification, B' is the pixel value of the pixel point in the third image after reduction, B is the pixel value of the pixel point in the third image before reduction, and a is the preset threshold value;
based on the preset number of clusters, carrying out clustering processing on the pixel values of all the pixel points of the fourth image to obtain the number of classes and the numerical value of the cluster center corresponding to each class, and adjusting the pixel value in each class to the numerical value of the cluster center to obtain a seventh image;
determining an adjustment factor corresponding to each pixel point in the second image based on the formula P = nX^m, wherein X is the pixel value of any pixel point in the second image, P is the adjustment factor corresponding to that pixel point, and n and m are preset values with 0 < n < 1 and 0 < m < 1;
and determining the pixel value of each pixel point in a fifth image based on the formula Q' = Q + P·Q - P·L to obtain the fifth image, wherein Q and L are respectively the pixel values of the corresponding pixel point in the first image and the seventh image, and P is the adjustment factor of the corresponding pixel point.
2. The method of claim 1, wherein after the obtaining the fifth image, the method further comprises:
and carrying out inverse processing of the normalization processing on the pixel values of all the pixel points in the fifth image to obtain a sixth image.
3. An apparatus for image processing, wherein the apparatus is configured to implement the method for image processing of claim 1, the apparatus comprising:
the smoothing module is used for carrying out image smoothing processing on the first image to obtain a second image;
a difference module, configured to determine a third image based on a difference between pixel values of corresponding pixel points in the first image and the second image;
the scaling module is used for amplifying the pixel values larger than a preset threshold value in the pixel values of all the pixel points in the third image, and reducing the pixel values smaller than the preset threshold value to obtain a fourth image;
and the synthesis module is used for carrying out image synthesis based on the fourth image and the first image to obtain a fifth image.
4. A device according to claim 3, characterized in that the device further comprises:
and the inverse normalization module is used for performing inverse processing of the normalization processing on the pixel values of all the pixel points in the fifth image to obtain a sixth image.
5. The apparatus of claim 4, wherein the scaling module is configured to:
in the pixel values of the pixel points in the third image, amplifying the pixel values greater than a preset threshold based on the formula A' = 1 - (1 - A)^2, and reducing the pixel values smaller than the preset threshold based on the formula B' = B^2/a, to obtain a fourth image, wherein A' is the pixel value of the pixel point in the third image after amplification, A is the pixel value of the pixel point in the third image before amplification, B' is the pixel value of the pixel point in the third image after reduction, B is the pixel value of the pixel point in the third image before reduction, and a is the preset threshold value.
6. The apparatus of claim 3, wherein the difference module is configured to:
and taking, as the pixel value of each pixel point in a third image, the sum of a preset value and the difference between the pixel values of the corresponding pixel points in the first image and the second image, to obtain the third image.
7. A device according to claim 3, wherein the synthesis module is configured to:
based on the preset number of clusters, carrying out clustering processing on the pixel values of all pixel points of the fourth image to obtain the number of classes and the value of a cluster center corresponding to each class, and adjusting the pixel value in each class to be the value of the cluster center to obtain a seventh image;
and performing image synthesis on the seventh image and the first image to obtain a fifth image.
8. The apparatus of claim 7, wherein the synthesis module is configured to:
determining an adjusting factor corresponding to each pixel point based on the pixel value of each pixel point in the second image;
and based on the adjusting factors corresponding to the pixel points, performing image synthesis on the seventh image and the first image to obtain a fifth image.
9. The apparatus of claim 8, wherein the synthesis module is configured to:
determining an adjustment factor corresponding to each pixel point in the second image based on the formula P = nX^m, wherein X is the pixel value of any pixel point in the second image, P is the adjustment factor corresponding to that pixel point, and n and m are preset values with 0 < n < 1 and 0 < m < 1.
10. The apparatus of claim 8, wherein the synthesis module is configured to:
and determining the pixel value of each pixel point in a fifth image based on the formula Q' = Q + P·Q - P·L to obtain the fifth image, wherein Q and L are respectively the pixel values of the corresponding pixel point in the first image and the seventh image, and P is the adjustment factor of the corresponding pixel point.
11. A computer device comprising a processor and a memory having stored therein at least one instruction that is loaded and executed by the processor to implement the operations performed by the image processing method of claim 1 or claim 2.
12. A computer readable storage medium having stored therein at least one instruction that is loaded and executed by a processor to implement the operations performed by the image processing method of claim 1 or claim 2.
CN201911019819.1A 2019-10-24 2019-10-24 Image processing method, device, computer equipment and storage medium Active CN110619614B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911019819.1A CN110619614B (en) 2019-10-24 2019-10-24 Image processing method, device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110619614A CN110619614A (en) 2019-12-27
CN110619614B true CN110619614B (en) 2023-05-16

Family

ID=68926392

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911019819.1A Active CN110619614B (en) 2019-10-24 2019-10-24 Image processing method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110619614B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112184581B (en) * 2020-09-27 2023-09-05 腾讯科技(深圳)有限公司 Image processing method, device, computer equipment and medium
CN113012185B (en) * 2021-03-26 2023-08-29 影石创新科技股份有限公司 Image processing method, device, computer equipment and storage medium
CN115022526B (en) * 2021-09-29 2023-05-30 荣耀终端有限公司 Full depth image generation method and device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010074416A (en) * 2008-09-17 2010-04-02 Nagasaki Univ Image data processing method, image data processing apparatus, and program
CN102790891A (en) * 2011-05-16 2012-11-21 乐金显示有限公司 Image processing method and stereoscopic image display device using the same

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030039402A1 (en) * 2001-08-24 2003-02-27 Robins David R. Method and apparatus for detection and removal of scanned image scratches and dust
JP4589887B2 (en) * 2006-03-20 2010-12-01 株式会社リコー Image processing device
JP5315158B2 (en) * 2008-09-12 2013-10-16 キヤノン株式会社 Image processing apparatus and image processing method
GB2568039B (en) * 2017-10-30 2020-10-28 Imagination Tech Ltd Systems and methods for processing a stream of data values
CN108133488A (en) * 2017-12-29 2018-06-08 安徽慧视金瞳科技有限公司 A kind of infrared image foreground detection method and equipment
CN108171667A (en) * 2017-12-29 2018-06-15 努比亚技术有限公司 A kind of image processing method, terminal and computer readable storage medium
JP2019180749A (en) * 2018-04-09 2019-10-24 富士通株式会社 Image processing program, image processing apparatus, and image processing method



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant