CN107633490B - Image processing method, device and storage medium - Google Patents
- Publication number: CN107633490B (application CN201710852921.4A)
- Authority
- CN
- China
- Prior art keywords
- image
- processed
- filtering
- processing
- termination
- Prior art date
- Legal status: Active (assumed; not a legal conclusion)
Landscapes
- Image Processing (AREA)
Abstract
The disclosure relates to an image processing method, device, and storage medium. The method includes: acquiring an image to be processed; blurring the image to be processed to obtain a blurred image; and, taking the blurred image as a guide image, performing preset joint filtering on the image to be processed and determining a target image based on the result of the preset joint filtering.
Description
Technical Field
The present disclosure relates to image processing technology, and in particular, to an image processing method, apparatus, and storage medium.
Background
In the related art, after an image is captured by a beautifying camera, skin-smoothing is applied to remove spots and flaws from facial skin and to even out uneven skin tone. However, this approach discards part of the image's detail information and reduces image quality; moreover, if the smoothing strength is too high, the loss of detail is severe and the resulting image is blurry, so the image processing effect is unsatisfactory.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an image processing method, apparatus, and storage medium.
According to a first aspect of an embodiment of the present disclosure, there is provided an image processing method including: acquiring an image to be processed; blurring the image to be processed to obtain a blurred image; and taking the blurred image as a guide image, carrying out preset joint filtering on the image to be processed, and determining a target image based on the result of the preset joint filtering.
Optionally, the determining of the target image based on the result of the preset joint filtering includes: performing a filtering process in a loop until a termination condition is met, and determining the image output after termination as the target image; the filtering process includes: determining the output image of the previous preset joint filtering as an updated guide image, and performing the preset joint filtering on the image to be processed according to the updated guide image.
Optionally, the termination condition includes: the number of times of the loop filtering process reaches the termination number.
Optionally, the method further includes: obtaining a processing strength of the preset joint filtering to be performed on the image to be processed, and determining the termination number corresponding to the processing strength.
According to a second aspect of embodiments of the present disclosure, there is provided an image processing apparatus including an image acquisition module configured to acquire an image to be processed; the first image processing module is configured to blur the image to be processed to obtain a blurred image; the second image processing module is configured to perform preset joint filtering on the image to be processed by taking the blurred image as a guide image, and determine a target image based on the result of the preset joint filtering.
Optionally, the second image processing module is further configured to: perform a filtering process in a loop until a termination condition is met, and determine the image output after termination as the target image; the filtering process includes: determining the output image of the previous preset joint filtering as an updated guide image, and performing the preset joint filtering on the image to be processed according to the updated guide image.
Optionally, the termination condition includes: the number of times of the loop filtering process reaches the termination number.
Optionally, the apparatus further includes a determining module configured to obtain a processing strength of the preset joint filtering to be performed on the image to be processed, and to determine the termination number corresponding to the processing strength.
According to a third aspect of the embodiments of the present disclosure, there is provided an image processing apparatus including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to: acquiring an image to be processed; blurring the image to be processed to obtain a blurred image; and taking the blurred image as a guide image, carrying out preset joint filtering on the image to be processed, and determining a target image based on the result of the preset joint filtering.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the image processing method provided by the first aspect of the present disclosure.
The technical scheme provided by the embodiment of the disclosure can comprise the following beneficial effects: acquiring an image to be processed; blurring the image to be processed to obtain a blurred image; and taking the blurred image as a guide image, carrying out preset joint filtering on the image to be processed, and determining a target image based on the result of the preset joint filtering. In this way, after blurring processing is carried out on the image to be processed, noise and detail level of the image to be processed are reduced, and detail information of the blurred image is recovered through preset combined filtering, so that an image with higher quality is obtained, and the processing effect of the image is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flowchart illustrating a method of image processing according to an exemplary embodiment;
FIG. 2 is a flowchart illustrating another image processing method according to an exemplary embodiment;
FIG. 3 is a block diagram of an image processing apparatus according to an exemplary embodiment;
FIG. 4 is a block diagram of another image processing apparatus shown according to an exemplary embodiment;
fig. 5 is a schematic diagram showing a hardware configuration of an image processing apparatus according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
The method reduces noise and detail level of an image to be processed after blurring the image to be processed, and restores detail information of the blurred image through preset combined filtering, so that an image with higher quality is obtained, and the image processing effect is improved.
Fig. 1 is a flowchart illustrating an image processing method for use in a terminal, which may include a mobile phone, a tablet computer, a notebook, etc., as shown in fig. 1, according to an exemplary embodiment, the image processing method including the following steps.
In step S101, an image to be processed is acquired.
In this step, the image to be processed may be obtained by photographing, downloading, receiving it from another party, and so on, and may be a color image.
In step S102, the image to be processed is blurred to obtain a blurred image.
In this step, considering that the acquired image to be processed contains considerable noise, blurring processing may be applied to it to obtain the blurred image, reducing both the noise and the level of detail of the image to be processed. The blurring may be performed, for example, through a Gaussian blur algorithm.
In step S103, the blurred image is used as a guiding image, the image to be processed is subjected to preset joint filtering, and a target image is determined based on the result of the preset joint filtering.
In one possible implementation, to achieve better detail recovery, the filtering may be performed in a loop several times. In this embodiment, a termination condition may be set to control the number of loop iterations and thereby control the strength of the processing: the filtering process is executed in a loop until the termination condition is met, and the image output after termination is determined to be the target image. The filtering process includes: determining the output image of the previous preset joint filtering as an updated guide image, and performing the preset joint filtering on the image to be processed according to the updated guide image.
The preset joint filtering may also be performed through a preset guided filtering model; the present disclosure does not limit this. The termination condition may include: the number of loop filtering iterations reaching a termination number.
By way of example: the blurred image is taken as the guide image and the preset joint filtering is performed on the image to be processed to obtain a first image, completing the first round of detail recovery; the first image is then taken as the guide image and the preset joint filtering is performed on the image to be processed to obtain a second image, completing the second round; the second image is then taken as the guide image and the preset joint filtering is performed on the image to be processed to obtain a third image, completing the third round; and so on, until the number of loop filtering iterations reaches the termination number.
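The guide-update loop just described can be sketched in a few lines. The helper name and the stand-in filter below are illustrative assumptions, not part of the disclosure; a real pass would be a joint bilateral or guided filter:

```python
import numpy as np

def iterative_joint_filter(image, blurred, joint_filter, termination_number):
    """Repeatedly joint-filter `image`, feeding each output back in as the
    updated guide image, for `termination_number` iterations."""
    guide = blurred  # first pass: the blurred image is the guide image
    for _ in range(termination_number):
        guide = joint_filter(image, guide)  # output becomes the next guide
    return guide

# Toy stand-in filter (averages the data image with its guide) just to show
# the guide-update mechanics.
demo = iterative_joint_filter(
    np.full((4, 4), 8.0),   # "image to be processed"
    np.zeros((4, 4)),       # "blurred image" used as the initial guide
    lambda img, g: 0.5 * (img + g),
    termination_number=3,
)
# Each pass pulls the guide halfway toward the data image: 0 -> 4 -> 6 -> 7
```

With each pass the output moves back toward the original image, mirroring how repeated joint filtering progressively restores detail.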
In this embodiment, the termination number may be set in either of two ways. In the first, a termination number is preset, and the loop is then executed according to that preset number.
Here the termination number may be preset from experience; for example, with the termination number set to 2 or 3, performing the preset joint filtering on the image to be processed according to that number yields a relatively clear image with little noise.
In the second, the processing strength of the preset joint filtering to be performed on the image to be processed is obtained and the termination number corresponding to that strength is determined, so that the subsequent loop is executed according to the termination number matching the strength selected by the user.
Here different processing strengths may be offered for the user to select, each corresponding to a different termination number; once the strength selected by the user is obtained, the corresponding termination number is determined.
By way of example, the processing strength may include heavy, medium, and light processing. A user who wants a stronger processing effect (e.g., a skin-smoothing effect) may select heavy processing; a user who wants a weaker effect may select light processing; and a user who wants an effect in between may select medium processing.
Note that if the termination number is set too high, more noise is restored along with the detail, while if it is set too low, too little image detail is recovered; either extreme worsens the processing effect. With this in mind, the termination number for heavy processing may be set to 2 or 3, for medium processing to 4 or 5, and for light processing to 6 or 7.
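A minimal sketch of the strength-to-termination-number lookup, using the example values above; the table, names, and the choice of one value per strength are hypothetical:

```python
# Example termination numbers from the text: heavy processing -> 2 or 3,
# medium -> 4 or 5, light -> 6 or 7. One value per strength is chosen here.
TERMINATION_NUMBER = {"heavy": 3, "medium": 5, "light": 7}

def termination_number_for(strength: str) -> int:
    """Return the termination number for a user-selected processing strength."""
    try:
        return TERMINATION_NUMBER[strength]
    except KeyError:
        raise ValueError(f"unknown processing strength: {strength!r}")
```

The loop of the filtering process then runs `termination_number_for(strength)` times before the output is taken as the target image.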
According to the method, after blurring processing is carried out on the image to be processed, noise and detail level of the image to be processed are reduced, and detail information of the blurred image is recovered through preset combined filtering, so that an image with higher quality is obtained, and the image processing effect is improved.
Fig. 2 is a flowchart of another image processing method according to an exemplary embodiment. As shown in Fig. 2, the method is used in a terminal, which may include a mobile phone, a tablet computer, a notebook, and the like. In this embodiment the method is described taking, as an example, preset joint filtering performed through a preset joint bilateral filtering model. The method includes the following steps.
in step S201, an image to be processed is acquired.
In this step, the image to be processed may be obtained by photographing, downloading, receiving it from another party, and so on, and may be a color image.
In step S202, a gaussian blur process is performed on the image to be processed to obtain a blurred image.
In this step, the Gaussian blur processing may be performed on the image to be processed according to:

J(p) = (1/K) · Σ_{q∈Ω} f(‖p − q‖) · I(q), with f(x) = exp(−x² / (2σ_s²)),

where J(p) denotes the pixel value at pixel p of the blurred image; p(x, y) and q(x, y) denote the position coordinates of pixels p and q; I(q) denotes the pixel value at pixel q of the image to be processed; p and q are the pixels covered by a filter window Ω, p being the center pixel and q a neighborhood pixel of the center pixel p; the function f is a Gaussian kernel function; σ_s denotes the standard deviation of the Gaussian function; and K denotes the normalization parameter of the image.

The normalization parameter K may be the average value over all pixels in the filter window. When the value of the standard deviation increases, the filter radius of the filter window increases accordingly; for the values of the standard deviation and the filter radius, reference may be made to the related art, which the present disclosure does not limit. In this way, Gaussian blur processing is applied to the whole image to be processed by moving the filter window, yielding the blurred image.
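The formula above is a spatial Gaussian convolution over the filter window. A NumPy sketch, assuming a 2-D single-channel image and taking K as the sum of the kernel weights (a common normalization convention; the text describes K only loosely):

```python
import numpy as np

def gaussian_blur(image, radius=2, sigma_s=1.5):
    """Gaussian blur: J(p) = (1/K) * sum_q exp(-||p-q||^2 / (2*sigma_s^2)) * I(q),
    over a (2*radius+1)^2 filter window Omega centered on p."""
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    kernel = np.exp(-(xx**2 + yy**2) / (2.0 * sigma_s**2))
    kernel /= kernel.sum()  # K: normalizes the window weights to sum to 1
    padded = np.pad(image, radius, mode="edge")  # replicate border pixels
    h, w = image.shape
    out = np.zeros((h, w))
    # Move the filter window over the whole image (vectorized per offset).
    for dy in range(2 * radius + 1):
        for dx in range(2 * radius + 1):
            out += kernel[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return out
```

Increasing `sigma_s` spreads the kernel weights, so in practice `radius` is grown with it, matching the note above that a larger standard deviation implies a larger filter radius.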
In step S203, a processing strength of performing a preset joint filtering on the image to be processed is obtained, and a termination number corresponding to the processing strength is determined.
In this step, different processing strengths may be offered for the user to select, each corresponding to a different termination number; once the strength selected by the user is obtained, the corresponding termination number is determined.
By way of example, the processing strength may include heavy, medium, and light processing. A user who wants a stronger processing effect (e.g., a skin-smoothing effect) may select heavy processing; a user who wants a weaker effect may select light processing; and a user who wants an effect in between may select medium processing.
Note that if the termination number is set too high, more noise is restored along with the detail, while if it is set too low, too little image detail is recovered; either extreme worsens the processing effect. With this in mind, the termination number for heavy processing may be set to 2 or 3, for medium processing to 4 or 5, and for light processing to 6 or 7.
In step S204, the filtering process is performed in a loop until the termination number is reached, and the image output after termination is determined to be the target image.
In this step, the filtering process includes: determining the output image of the previous preset joint filtering as an updated guide image, and performing the preset joint filtering on the image to be processed according to the updated guide image.
The preset joint filtering may be performed through a preset joint bilateral filtering model, whose calculation formula may be expressed as:

J_{t+1}(p) = (1/K) · Σ_{q∈Ω} f(‖p − q‖) · g(J_t(p) − J_t(q)) · I(q), with f(x) = exp(−x² / (2σ_s²)) and g(x) = exp(−x² / (2σ_r²)),

where J_{t+1}(p) denotes the pixel value at pixel p of the output image after one loop iteration of the preset joint bilateral filtering model; p(x, y) and q(x, y) denote the position coordinates of pixels p and q in the guide image; J_t(p) and J_t(q) denote the pixel values of pixels p and q in the guide image; t denotes the number of loop filtering iterations performed, t being a positive integer; I(q) denotes the pixel value at pixel q of the image to be processed; p and q are the pixels covered by a filter window Ω, p being the center pixel and q a neighborhood pixel of the center pixel p; the functions f and g are Gaussian kernel functions; σ_s and σ_r denote the standard deviations of the respective Gaussian kernels; and K denotes the normalization parameter of the image.

When t = 1, the guide image is the blurred image.
In this step, when the guide image and the image to be processed are substituted into the preset joint bilateral filtering model to obtain a first image, the number of loop filtering iterations is determined to be 1; when the first image and the image to be processed are then substituted into the model to obtain a second image, the number of iterations is determined to be 2; and so on, so that the number of loop filtering iterations is counted.
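One pass of the model above can be sketched as follows. This assumes 2-D single-channel arrays; the default parameter values are illustrative assumptions, and K is computed per pixel as the sum of the combined weights:

```python
import numpy as np

def joint_bilateral_step(image, guide, radius=2, sigma_s=1.5, sigma_r=0.1):
    """One loop iteration J_{t+1}(p) of the joint bilateral model:
    spatial Gaussian f on ||p - q||, range Gaussian g on J_t(p) - J_t(q),
    with data values I(q) taken from the image to be processed."""
    h, w = image.shape
    pad_i = np.pad(image, radius, mode="edge")
    pad_g = np.pad(guide, radius, mode="edge")
    num = np.zeros((h, w))
    den = np.zeros((h, w))  # K: per-pixel sum of the combined weights
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            f = np.exp(-(dx * dx + dy * dy) / (2.0 * sigma_s**2))
            g_win = pad_g[radius + dy:radius + dy + h, radius + dx:radius + dx + w]
            i_win = pad_i[radius + dy:radius + dy + h, radius + dx:radius + dx + w]
            g = np.exp(-((guide - g_win) ** 2) / (2.0 * sigma_r**2))
            weight = f * g
            num += weight * i_win
            den += weight
    return num / den
```

Looping `guide = joint_bilateral_step(image, guide)` the required number of times, starting with the blurred image as `guide`, reproduces the iteration counting of this step and the termination check of step S205.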
In step S205, when the number of times of the loop filter processing reaches the termination number, the obtained image is determined as the target image.
It should be noted that, for simplicity of description, the above method embodiments are expressed as series of action combinations. Those skilled in the art will appreciate, however, that the present disclosure is not limited by the described order of actions, since some steps may be performed in another order or simultaneously; for example, step S203 may be performed before step S202 or even before step S201, and the present disclosure does not limit the specific order of execution. Those skilled in the art will also appreciate that the embodiments described herein are preferred embodiments, and the actions and modules involved are not necessarily all required by the present disclosure.
According to the method, after blurring processing is carried out on the image to be processed, noise and detail level of the image to be processed are reduced, and detail information of the blurred image is recovered through the preset combined bilateral filtering model, so that an image with higher quality is obtained, and the processing effect of the image is improved.
Fig. 3 is a block diagram of an image processing apparatus according to an exemplary embodiment. Referring to fig. 3, the apparatus includes an image acquisition module 301, a first image processing module 302, and a second image processing module 303.
An image acquisition module 301 configured to acquire an image to be processed;
a first image processing module 302 configured to blur the image to be processed to obtain a blurred image;
the second image processing module 303 is configured to perform preset joint filtering on the image to be processed by using the blurred image as a guide image, and determine a target image based on a result of the preset joint filtering.
Optionally, the second image processing module 303 is further configured to: perform a filtering process in a loop until a termination condition is met, and determine the image output after termination as the target image; the filtering process includes: determining the output image of the previous preset joint filtering as an updated guide image, and performing the preset joint filtering on the image to be processed according to the updated guide image.
Optionally, the termination condition includes: the number of times of the loop filtering process reaches the termination number.
Optionally, as shown in fig. 4, the apparatus further includes a determining module 304 configured to obtain a processing strength of performing a preset joint filtering on the image to be processed, and determine the termination number corresponding to the processing strength.
The specific manner in which the various modules of the apparatus in the above embodiments perform their operations has been described in detail in the method embodiments and will not be repeated here.
By means of the device, blurring processing is carried out on the image to be processed, noise and detail level of the image to be processed are reduced, detail information of the blurred image is recovered through preset combined filtering, and therefore an image with higher quality is obtained, and processing effect of the image is improved.
The present disclosure also provides a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the image processing method provided by the present disclosure.
Fig. 5 is a block diagram illustrating an apparatus 500 for image processing according to an exemplary embodiment. For example, the apparatus 500 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, or the like.
Referring to fig. 5, an apparatus 500 may include one or more of the following components: a processing component 501, a memory 502, a power component 503, a multimedia component 504, an audio component 505, an input/output (I/O) interface 506, a sensor component 507, and a communication component 508.
The processing component 501 generally controls the overall operation of the apparatus 500, such as operations associated with display, telephone calls, data communications, camera operation, and recording. The processing component 501 may include one or more processors 509 to execute instructions so as to perform all or part of the steps of the image processing methods described above. In addition, the processing component 501 may include one or more modules that facilitate interaction between the processing component 501 and the other components; for example, the processing component 501 may include a multimedia module to facilitate interaction between the multimedia component 504 and the processing component 501.
The memory 502 is configured to store various types of data to support operations at the apparatus 500. Examples of such data include instructions for any application or method operating on the apparatus 500, contact data, phonebook data, messages, pictures, videos, and the like. The memory 502 may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power component 503 provides power to the various components of the device 500. The power components 503 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 500.
The multimedia component 504 includes a screen between the device 500 and the user that provides an output interface. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or sliding action, but also the duration and pressure associated with the touch or sliding operation. In some embodiments, the multimedia component 504 includes a front-facing camera and/or a rear-facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the apparatus 500 is in an operational mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 505 is configured to output and/or input audio signals. For example, the audio component 505 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 500 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 502 or transmitted via the communication component 508. In some embodiments, the audio component 505 further comprises a speaker for outputting audio signals.
The I/O interface 506 provides an interface between the processing component 501 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 507 includes one or more sensors for providing status assessments of various aspects of the apparatus 500. For example, the sensor assembly 507 may detect the on/off state of the apparatus 500 and the relative positioning of components such as its display and keypad; it may also detect a change in position of the apparatus 500 or of one of its components, the presence or absence of user contact with the apparatus 500, the orientation or acceleration/deceleration of the apparatus 500, and a change in its temperature. The sensor assembly 507 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, and may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 507 may further include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 508 is configured to facilitate wired or wireless communication between the apparatus 500 and other devices. The apparatus 500 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 508 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 508 further includes a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra-Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 500 may be implemented by one or more Application Specific Integrated Circuits (ASICs), digital Signal Processors (DSPs), digital Signal Processing Devices (DSPDs), programmable Logic Devices (PLDs), field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for performing the above-described image processing methods.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as a memory 502, comprising instructions executable by the processor 509 of the device 500 to perform the image processing method described above. For example, the non-transitory computer readable storage medium may be ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure following its general principles and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
Claims (4)
1. An image processing method, comprising:
acquiring an image to be processed;
blurring the image to be processed to obtain a blurred image;
taking the blurred image as a guide image, carrying out preset joint filtering on the image to be processed, and determining a target image based on the result of the preset joint filtering;
the blurring the image to be processed to obtain a blurred image comprises: performing blurring processing on the image to be processed through a Gaussian blur algorithm to obtain the blurred image;
the determining the target image based on the result of the preset joint filtering comprises:
cyclically executing filtering processing until a termination condition is met, and determining the image output upon termination as the target image;
the filtering processing comprises: determining the output image of the previous preset joint filtering as an updated guide image, and performing the preset joint filtering on the image to be processed according to the updated guide image;
the termination condition comprises: the number of times the filtering processing has been cyclically executed reaching a termination count;
the termination count is determined by: acquiring a processing intensity of the preset joint filtering on the image to be processed, and determining the termination count corresponding to the processing intensity; wherein the processing intensity includes heavy processing, medium processing, and light processing.
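The pipeline of claim 1 can be sketched in code: Gaussian-blur the input to obtain an initial guide image, then repeatedly joint-filter the original image, each time promoting the previous output to be the new guide, stopping after a termination count derived from the processing intensity. The claims do not pin down the "preset joint filtering" or the intensity-to-count mapping, so the sketch below assumes a guided filter (one common joint-filtering scheme) and a hypothetical `TERMINATION_COUNT` table; both are illustrative assumptions, not the patented implementation.

```python
import numpy as np

def box_filter(img, r):
    # Mean over a (2r+1)x(2r+1) window, with edge padding.
    pad = np.pad(img, r, mode="edge")
    h, w = img.shape
    out = np.zeros((h, w))
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            out += pad[dy:dy + h, dx:dx + w]
    return out / (2 * r + 1) ** 2

def gaussian_blur(img, sigma):
    # Separable Gaussian blur: produces the initial guide image.
    r = max(1, int(3 * sigma))
    x = np.arange(-r, r + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    k /= k.sum()
    h, w = img.shape
    pad = np.pad(img, r, mode="edge")
    tmp = sum(k[i] * pad[:, i:i + w] for i in range(2 * r + 1))   # horizontal pass
    return sum(k[i] * tmp[i:i + h, :] for i in range(2 * r + 1))  # vertical pass

def guided_filter(guide, src, r=4, eps=1e-3):
    # One joint-filtering pass: smooth `src` using structure from `guide`
    # (stand-in for the unspecified "preset joint filtering").
    mean_I = box_filter(guide, r)
    mean_p = box_filter(src, r)
    cov_Ip = box_filter(guide * src, r) - mean_I * mean_p
    var_I = box_filter(guide * guide, r) - mean_I ** 2
    a = cov_Ip / (var_I + eps)
    b = mean_p - a * mean_I
    return box_filter(a, r) * guide + box_filter(b, r)

# Hypothetical intensity -> termination-count mapping; the patent only says
# heavier processing corresponds to more iterations, not these exact values.
TERMINATION_COUNT = {"heavy": 5, "medium": 3, "light": 1}

def process(image, intensity="medium", sigma=2.0):
    guide = gaussian_blur(image, sigma)       # blurred image as initial guide
    out = image
    for _ in range(TERMINATION_COUNT[intensity]):
        out = guided_filter(guide, image)     # always filter the ORIGINAL image
        guide = out                           # previous output becomes new guide
    return out
```

Note the loop re-filters the original image each iteration rather than the previous output; only the guide is updated, which is what keeps detail loss bounded while the smoothing strength grows with the iteration count.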
2. An image processing apparatus, comprising:
an image acquisition module configured to acquire an image to be processed;
the first image processing module is configured to blur the image to be processed to obtain a blurred image;
the second image processing module is configured to perform preset joint filtering on the image to be processed by taking the blurred image as a guide image, and determine a target image based on the result of the preset joint filtering;
the first image processing module is configured to perform blurring processing on the image to be processed through a Gaussian blur algorithm to obtain the blurred image;
the second image processing module is further configured to:
cyclically execute filtering processing until a termination condition is met, and determine the image output upon termination as the target image;
the filtering processing comprises: determining the output image of the previous preset joint filtering as an updated guide image, and performing the preset joint filtering on the image to be processed according to the updated guide image;
the termination condition comprises: the number of times the filtering processing has been cyclically executed reaching a termination count;
the termination count is determined by: acquiring a processing intensity of the preset joint filtering on the image to be processed, and determining the termination count corresponding to the processing intensity; wherein the processing intensity includes heavy processing, medium processing, and light processing.
3. An image processing apparatus, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to: acquiring an image to be processed; blurring the image to be processed to obtain a blurred image; taking the blurred image as a guide image, carrying out preset joint filtering on the image to be processed, and determining a target image based on the result of the preset joint filtering;
the blurring the image to be processed to obtain a blurred image comprises: performing blurring processing on the image to be processed through a Gaussian blur algorithm to obtain the blurred image;
the determining the target image based on the result of the preset joint filtering comprises:
cyclically executing filtering processing until a termination condition is met, and determining the image output upon termination as the target image;
the filtering processing comprises: determining the output image of the previous preset joint filtering as an updated guide image, and performing the preset joint filtering on the image to be processed according to the updated guide image;
the termination condition comprises: the number of times the filtering processing has been cyclically executed reaching a termination count;
the termination count is determined by: acquiring a processing intensity of the preset joint filtering on the image to be processed, and determining the termination count corresponding to the processing intensity; wherein the processing intensity includes heavy processing, medium processing, and light processing.
4. A computer readable storage medium having computer program instructions stored thereon, wherein the program instructions, when executed by a processor, implement the steps of the method according to claim 1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710852921.4A CN107633490B (en) | 2017-09-19 | 2017-09-19 | Image processing method, device and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107633490A CN107633490A (en) | 2018-01-26 |
CN107633490B true CN107633490B (en) | 2023-10-03 |
Family
ID=61103161
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710852921.4A Active CN107633490B (en) | 2017-09-19 | 2017-09-19 | Image processing method, device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107633490B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112669232B (en) * | 2020-12-24 | 2024-08-09 | 浙江大华技术股份有限公司 | Depth image enhancement processing method and device |
CN112862715B (en) * | 2021-02-08 | 2023-06-30 | 天津大学 | Real-time and controllable scale space filtering method |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103310430A (en) * | 2012-03-13 | 2013-09-18 | 三星电子株式会社 | Method and apparatus for deblurring non-uniform motion blur |
CN103927717A (en) * | 2014-03-28 | 2014-07-16 | 上海交通大学 | Depth image recovery method based on improved bilateral filters |
CN104537618A (en) * | 2014-12-24 | 2015-04-22 | 浙江宇视科技有限公司 | Image processing method and device |
CN104766307A (en) * | 2015-03-13 | 2015-07-08 | 青岛海信电器股份有限公司 | Picture processing method and device |
CN105427262A (en) * | 2015-12-15 | 2016-03-23 | 南京信息工程大学 | Image de-noising method based on bidirectional enhanced diffusion filtering |
US9305338B1 (en) * | 2013-12-13 | 2016-04-05 | Pixelworks, Inc. | Image detail enhancement and edge sharpening without overshooting |
CN105512605A (en) * | 2015-11-23 | 2016-04-20 | 小米科技有限责任公司 | Face image processing method and device |
CN106097267A (en) * | 2016-06-08 | 2016-11-09 | 浙江传媒学院 | An image deblurring method based on Fourier transform
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140301651A1 (en) * | 2013-04-07 | 2014-10-09 | Xiaomi Inc. | Method and device for performing spatial filtering process on image |
Non-Patent Citations (3)
Title |
---|
"利用联合双边滤波或引导滤波进行升采样(Upsampling)技术提高一些耗时算法的速度";laviewpbt;《http://www.cnblogs.com/imageshop/p/3677313.html》;20140420;第1-2页 * |
SAR图像各向异性扩散滤波算法;李倩等;国外电子测量技术(02);全文 * |
合成孔径雷达目标图像复原方法仿真研究;代文征等;计算机仿真(09);全文 * |
Also Published As
Publication number | Publication date |
---|---|
CN107633490A (en) | 2018-01-26 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||