CN113538479B - Image edge processing method, device, equipment and storage medium - Google Patents


Info

Publication number
CN113538479B
CN113538479B CN202010310001.1A
Authority
CN
China
Prior art keywords
pixel, original image, edge area, gray value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010310001.1A
Other languages
Chinese (zh)
Other versions
CN113538479A (en)
Inventor
黄中琨
赖健豪
左国云
陈艳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Hansen Software Co.,Ltd.
Original Assignee
Shenzhen Hosonsoft Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Hosonsoft Co Ltd
Priority to CN202010310001.1A
Publication of CN113538479A
Application granted
Publication of CN113538479B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/13 Edge detection
    • G06T5/90
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G06T2207/20 Special algorithmic details
    • G06T2207/20172 Image enhancement details
    • G06T2207/20192 Edge enhancement; Edge preservation

Abstract

The present invention relates to the field of image processing technologies, and in particular to an image edge processing method, device, apparatus, and storage medium. The method comprises: step S1, acquiring an original image; step S2, acquiring the pixels belonging to an edge region of the original image; and step S3, adjusting the color shade of the acquired pixels, the processing of the edge region being complete once every pixel belonging to it has been adjusted. By adjusting the color shade of each pixel in the edge region of the original image, the display effect of the printed image can be changed.

Description

Image edge processing method, device, equipment and storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image edge processing method, device, apparatus, and storage medium.
Background
The edge area of an image is the transition region between the image and its background, and it directly affects how the image is displayed. Processing the image edge area can therefore change the display effect of the image. When the color of the image is very close to that of the background, the edge of the image is unclear; processing the edge so that it differs visibly from the background allows the image to be displayed clearly, changing its display effect. When the color of the image differs greatly from the background, processing the edge so that it sits closer to the background makes the displayed image look more natural, likewise changing its display effect.
However, the prior art lacks a technical scheme for processing the image edge area, so the display effect of an image cannot be changed by processing its edge.
Disclosure of Invention
The embodiment of the invention provides an image edge processing method, an image edge processing device, image edge processing equipment and a storage medium. The image edge processing method, the device, the equipment and the storage medium can change the display effect of the image to a certain extent.
In a first aspect, an embodiment of the present invention provides an image edge processing method, where the method includes:
step S1: acquiring an original image;
step S2: acquiring pixels belonging to an edge region of the original image;
step S3: adjusting the color shade of the acquired pixels; once every pixel belonging to the edge area has been adjusted, the processing of the edge area is complete.
In a second aspect, an embodiment of the present invention further provides an image edge processing apparatus, including:
the first acquisition module is used for acquiring an original image;
a second acquisition module for acquiring pixels belonging to an edge region of the original image;
and the adjusting module, used for adjusting the color shade of the acquired pixels, the processing of the edge area being complete once every pixel belonging to it has been adjusted.
In a third aspect, an embodiment of the present invention provides an image edge processing apparatus, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the image edge processing method described above.
In a fourth aspect, an embodiment of the present invention provides a computer storage medium having stored thereon computer program instructions, wherein the above-described image edge processing method is implemented when the computer program instructions are executed by a processor.
In summary, the image edge processing method, device, equipment and storage medium provided by the embodiments of the present invention can change the display effect of an image by adjusting the color shade degree of each pixel belonging to the image edge region.
Drawings
FIGS. 1-2 show the states of an image at various stages of processing by the image edge processing method, device, equipment and storage medium provided by the invention;
FIG. 3 is a flow chart of an image edge processing method according to an embodiment of the invention;
FIG. 4 is a flow chart of an image edge processing method according to an embodiment of the invention;
FIG. 5 is a flow chart of an image edge processing method according to an embodiment of the invention;
FIG. 6 is a flow chart of an image edge processing method according to an embodiment of the invention;
FIG. 7 is a schematic diagram illustrating connection of an image edge processing apparatus according to an embodiment of the present invention;
FIG. 8 is a schematic diagram illustrating connection of an image edge processing apparatus according to an embodiment of the present invention;
FIG. 9 is a schematic diagram illustrating connection of an image edge processing apparatus according to an embodiment of the present invention;
FIG. 10 is a schematic diagram illustrating connection of an image edge processing apparatus according to an embodiment of the present invention;
fig. 11 is a schematic diagram showing connection of components of an image edge processing apparatus in an embodiment of the present invention.
Detailed Description
Features and exemplary embodiments of various aspects of the present invention are described in detail below. To make the objects, technical solutions, and advantages of the present invention clearer, the invention is described in further detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are intended merely to illustrate the invention, not to limit it. It will be apparent to one skilled in the art that the present invention may be practiced without some of these specific details. The following description of the embodiments is intended only to provide a better understanding of the invention by showing examples of it.
It is noted that relational terms such as "first" and "second" are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," and any variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to it. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises it.
Fig. 1 shows an original image, including the original image pixels 05 and the background pixels 06.
After the original image pixels 05 and the background pixels 06 are acquired, they are processed to obtain each pixel of the original image pixels 05 that belongs to the image edge region.
Processing the original image pixels 05 and the background pixels 06 to obtain each pixel of the original image that belongs to the image edge area includes: for any background pixel 06, acquiring every pixel whose city-block distance from that background pixel is smaller than a first set value; for each acquired pixel, judging whether its gray value equals the gray value of the background pixel 06; if it does not, the pixel belongs to the edge area of the original image.
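As an illustration, this neighbor test can be sketched in Python (a minimal sketch assuming a grayscale image stored as a 2-D list and a single known background gray value; the function name and parameters are ours, not the patent's):

```python
def first_edge_pixels(img, bg_gray, d1):
    """Find pixels of the first edge region.

    A pixel belongs to the first edge region if it lies within
    city-block distance d1 of some background pixel and its gray
    value differs from the background gray value bg_gray.
    """
    h, w = len(img), len(img[0])
    edge = set()
    for y in range(h):
        for x in range(w):
            if img[y][x] != bg_gray:   # only background pixels seed the search
                continue
            # scan offsets with |dx| + |dy| < d1 around the background pixel
            for dy in range(-d1 + 1, d1):
                for dx in range(-d1 + 1, d1):
                    if abs(dx) + abs(dy) >= d1:
                        continue
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and img[ny][nx] != bg_gray:
                        edge.add((nx, ny))
    return edge
```

Scanning outward from background pixels, rather than over the whole image, matches the description above: only pixels near the background are candidates for the first edge area.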
After each pixel belonging to the edge area of the original image is acquired, each pixel is processed, so that the display effect of the image is changed.
Processing each pixel includes: for any pixel belonging to the edge region of the original image, adding to the gray value of the pixel a value Δh1 corresponding to that pixel.
The value corresponding to the pixel is Δh1 = (255 - h1) × f1, where h1 is the gray value of the pixel and f1 is greater than or equal to -1 and less than or equal to 1.
When the background is white and f1 is greater than 0 and less than 1, the gray value of each pixel in the edge area of the original image increases and the color of each such pixel is lightened. Fig. 2 shows the image obtained after the edge area of the original image has been lightened in this way. In fig. 2, the color of the edge-area pixels 052 is lightened, while the center-area pixels 051 and the background pixels 06 are unchanged.
An embodiment of the present invention provides an image edge processing method, as shown in fig. 3, which includes the following steps S1 to S3.
Step S1: acquiring an original image. Step S2: acquiring the pixels belonging to an edge region of the original image. Step S3: adjusting the color shade of the acquired pixels; once every pixel belonging to the edge area has been adjusted, the processing of the edge area is complete.
A pixel is the basic unit of an image: a small area of a single color. The color shade of the image edge region can be adjusted by adjusting the color shade of each pixel belonging to the image edge.
The original image includes objects, text, color blocks, graphics, etc.
The edge region of the original image is the region of the original image adjacent to the background. By adjusting the color shading degree of each pixel in the edge region of the original image, the color shading degree of the edge region of the original image can be changed, thereby improving the display effect of the original image.
In one embodiment, the edge region includes a first edge region adjacent to a background of the original image.
As shown in fig. 4, step S2 includes step S21: for any pixel in the background of the original image, acquiring every surrounding pixel whose city-block distance from that pixel is smaller than a first set value; and step S22: for any acquired pixel, judging whether it is a pixel of the background; if it is not, the pixel belongs to the first edge area and is acquired.
The first edge region is the region of the original image adjacent to the background. The background of the original image is typically a solid color, so its pixels are typically identical, and every pixel of the first edge region differs from them. Because each pixel of the first edge region differs from the background pixels and the region is adjacent to the background, the background pixels can be used to find each pixel of the first edge region, whose color shade can then be adjusted to change the display effect of the image.
On a medium carrying the original image and its background, if the center of one pixel is at (x1, y1) and the center of another is at (x2, y2), the city-block distance between the two pixels is d = |x1 - x2| + |y1 - y2|. The medium here includes paper, plastic sheet, cloth, and so on. The first set value is greater than or equal to the city-block distance between two adjacent pixels of the original image. Adjusting the first set value changes how many surrounding pixels fall within that distance of a given pixel, and therefore the width of the acquired first edge region.
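The distance formula transcribes directly (the function name is illustrative):

```python
def city_distance(p1, p2):
    """City-block (Manhattan) distance d = |x1 - x2| + |y1 - y2|."""
    (x1, y1), (x2, y2) = p1, p2
    return abs(x1 - x2) + abs(y1 - y2)
```

Two horizontally or vertically adjacent pixels are at distance 1, so any first set value of at least 1 satisfies the condition above.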
The pixels in the background of the original image include pixels adjacent to the first edge region and pixels farther from it. If a background pixel is adjacent to the original image, the pixels acquired around it within the first set value include one or more pixels of the original image, which belong to the first edge region.
If a background pixel is far from the first edge area, the pixels acquired around it within the first set value may include no pixel of the original image, in which case no pixel of the first edge area can be acquired from it.
In one embodiment, in step S22, judging for any acquired pixel whether it is a background pixel, and acquiring it as a pixel of the first edge area if it is not, includes: for any acquired pixel, judging whether its gray value equals the gray value of the pixels in the background of the original image; if it does not, the pixel belongs to the first edge area.
The gray value of a pixel characterizes its color shade. When the original image is an 8-bit image, the gray value of each pixel lies between 0 and 255. In the original image, the larger a pixel's gray value, the lighter its color; the smaller the gray value, the deeper the color. In a black-and-white original image, a pixel with gray value 0 is black and a pixel with gray value 255 is white; gray values in between correspond to shades of gray between black and white.
In a color original image, each pixel is composed of the three primary colors R (red), G (green), and B (blue). The gray value of any pixel of the original image therefore comprises an R gray value, a G gray value, and a B gray value. A pixel whose R, G, and B gray values are all 0 is black; a pixel whose R, G, and B gray values are all 255 is white; a pixel whose three channel gray values all lie strictly between 0 and 255 is colored.
When the original image is colored, its background is generally black or white. In that case, judging for any acquired pixel whether its gray value equals the gray value of the background pixels, the pixel belonging to the first edge area if it does not, includes: for any pixel of the original image, calculating the pixel's total gray value from its R, G, and B gray values; judging whether that total gray value equals the gray value of the background pixels; and, if it does not, concluding that the pixel belongs to the first edge area.
Calculating the total gray value of the pixel from its R, G, and B gray values includes: computing the total gray value with the formula Gray = (R + G + B) / 3, where R, G, and B are the pixel's R, G, and B gray values and Gray is the calculated total gray value of the pixel.
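The averaging formula can also be written out directly (an illustrative helper; integer channel values in 0 to 255 are assumed):

```python
def total_gray(r, g, b):
    """Total gray value of a color pixel: Gray = (R + G + B) / 3."""
    return (r + g + b) / 3
```

For example, a pure white pixel (255, 255, 255) has total gray value 255 and a pure black pixel (0, 0, 0) has total gray value 0, matching the black-and-white case above.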
When the original image is colored and its background is a solid color other than black or white, judging for any acquired pixel whether its gray value equals that of the background pixels includes: judging whether the pixel's R gray value, G gray value, and B gray value each equal the corresponding channel gray value of the background pixels; if any one of the three differs from the background, the pixel belongs to the first edge area.
In one embodiment, the edge region includes a second edge region.
As shown in fig. 5, step S2 further includes step S23: for any pixel belonging to the first edge area, acquiring every surrounding pixel whose city-block distance from that pixel is smaller than a second set value; and step S24: for any acquired pixel, judging whether it belongs to the first edge area or the background; if it belongs to neither, the pixel belongs to the second edge area and is acquired.
The second edge area is the area of the original image adjacent to the first edge area but not adjacent to the background. Processing the second edge area so that it differs from the first edge area gives the edge of the original image a gradient effect and improves the overall display effect of the image.
The second set value is greater than or equal to the city-block distance between two adjacent pixels of the original image. Adjusting the second set value changes how many surrounding pixels fall within that distance of a pixel of the first edge region, and therefore the width of the acquired second edge region.
If a pixel of the first edge area is adjacent to the second edge area, the pixels acquired around it within the second set value include pixels of the second edge area, which can then be acquired and processed.
For any pixel of the first edge region, each surrounding pixel acquired within the second set value may be a background pixel, a pixel of the first edge region, or a pixel of the second edge region; if an acquired pixel is neither a pixel of the first edge region nor a background pixel, it belongs to the second edge region.
In one embodiment, in step S24, judging for any acquired pixel whether it belongs to the first edge area or the background, and acquiring it as a pixel of the second edge area if it belongs to neither, includes: for any acquired pixel, judging whether its gray value equals the gray value of the background pixels of the original image or the adjusted gray value of the pixels of the first edge area; if it equals neither, the pixel belongs to the second edge area.
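A sketch of this second-ring test (a minimal illustration with our own names; `first_edge` is assumed to be a set of (x, y) coordinates already identified as the first edge region):

```python
def second_edge_pixels(img, bg_gray, first_edge, d2):
    """Find pixels of the second edge region.

    A pixel belongs to the second edge region if it lies within
    city-block distance d2 of some first-edge pixel but is neither
    a background pixel nor itself in the first edge region.
    """
    h, w = len(img), len(img[0])
    second = set()
    for (x, y) in first_edge:
        # scan offsets with |dx| + |dy| < d2 around the first-edge pixel
        for dy in range(-d2 + 1, d2):
            for dx in range(-d2 + 1, d2):
                if abs(dx) + abs(dy) >= d2:
                    continue
                nx, ny = x + dx, y + dy
                if not (0 <= ny < h and 0 <= nx < w):
                    continue
                if (nx, ny) in first_edge or img[ny][nx] == bg_gray:
                    continue
                second.add((nx, ny))
    return second
```

The exclusion test mirrors the description: a candidate is kept only when it is neither a background pixel nor a first-edge pixel.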
In one embodiment, in step S3, adjusting the color shade of an acquired pixel includes: for any acquired pixel, adding to its gray value a value Δh1 corresponding to that gray value, thereby changing the pixel's color shade.
The gray value of a pixel represents its color shade, so changing the gray value changes the shade. In an 8-bit original image, the gray value of each pixel lies between 0 and 255: the larger the gray value, the closer the pixel is to white; the smaller the gray value, the deeper the pixel's color.
The value Δh1 is greater than -255 and less than 255. When the Δh1 added to a pixel's gray value is negative, the pixel's color is deepened; when it is positive, the pixel's color is lightened.
In one embodiment, in step S3, adjusting the color shade of an acquired pixel includes: before adding the corresponding value Δh1 to the gray value of any pixel of the image edge region, calculating that value as Δh1 = (255 - h1) × f1, where h1 is the gray value of the pixel, f1 is greater than or equal to -1 and less than or equal to 1, and Δh1 is the value corresponding to the gray value of the pixel.
To lighten the color of a pixel, f1 is greater than 0 and less than 1; to deepen it, f1 is greater than -1 and less than 0. The gray value h1 of the pixel satisfies 0 ≤ h1 ≤ 255, and the adjusted gray value of the pixel is h2 = h1 + (255 - h1) × f1. Constraining f1 to between -1 and 1 keeps the calculated gray value h2 between 0 and 255.
For any pixel, f1 = -0.5 deepens the pixel's color by 50%, and f1 = 0.5 lightens it by 50%.
For example, suppose a pixel of the original image has gray value 128. Deepening it by 50% gives h2 = h1 + Δh1 = h1 + (255 - h1) × f1 = 128 + (255 - 128) × (-0.5) ≈ 64. Lightening it by 50% gives h2 = 128 + (255 - 128) × 0.5 ≈ 192.
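A sketch of this adjustment in code (the clamp at the end is our own addition, a safety net for extreme combinations of h1 and f1 that the formula alone does not rule out):

```python
def adjust_gray(h1, f1):
    """Adjusted gray value h2 = h1 + (255 - h1) * f1, with -1 <= f1 <= 1.

    f1 < 0 deepens the pixel's color, f1 > 0 lightens it toward white.
    The result is rounded and clamped to the valid 8-bit range.
    """
    if not -1 <= f1 <= 1:
        raise ValueError("f1 must lie in [-1, 1]")
    h2 = h1 + (255 - h1) * f1
    return max(0, min(255, round(h2)))
```

With h1 = 128 this reproduces the worked example: f1 = -0.5 yields 64 and f1 = 0.5 yields 192 after rounding.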
In one embodiment, in step S3, adjusting the color shade of the acquired pixels to process the edge area includes: setting the gray value of each pixel of the edge area to a third set value, where the third set value is greater than or equal to 0 and less than or equal to 255.
Setting the gray values of the edge area to the third set value makes the edge area of the original image a solid color, highlighting the edge contour of the original image and changing the display effect of the image.
In one embodiment, in step S3, adjusting the color shade of the acquired pixels to process the edge area includes: increasing the gray value of each pixel belonging to the edge region.
In one embodiment, if the gray value of the original image is smaller than that of its background, the background is lighter than the image. When the difference between the gray value of the background and the gray value of the edge area exceeds a fourth set value, the gray value of each pixel of the edge area is increased. This lightens the edge area, produces a natural transition between the image and the background, and reduces ink consumption during printing. The fourth set value is greater than 0.
In one embodiment, if the gray value of the background is smaller than that of the original image, the background is darker than the image. When the difference between the gray value of the edge area and the gray value of the background is smaller than a fifth set value, the gray value of each pixel of the edge area is increased. This raises the contrast between the edge area and the background and improves the display effect of the original image. The fifth set value is greater than 0.
In one embodiment, in step S3, adjusting the color shade of the acquired pixels to process the edge area includes: decreasing the gray value of each pixel belonging to the edge region.
In one embodiment, if the gray value of the original image is smaller than that of its background, the background is lighter than the image. When the difference between the gray value of the background and the gray value of the edge area is smaller than a sixth set value, the gray value of each pixel of the edge area is decreased. This deepens the color of the edge area, increases the contrast between the image and the background, and improves the display effect of the original image. The sixth set value is greater than 0.
In one embodiment, if the gray value of the background is smaller than that of the original image, the background is darker than the image. When the difference between the gray value of each pixel of the edge area and the gray value of the background exceeds a seventh set value, the gray value of each pixel of the edge area is decreased. This gives a natural transition between the edge area and the background and improves the display effect of the original image. The seventh set value is greater than 0.
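The four lighten/darken cases above can be condensed into one hypothetical decision helper (a simplification: the patent uses four distinct set values, which we replace here with a single illustrative contrast threshold):

```python
def edge_action(image_gray, bg_gray, contrast_threshold):
    """Decide whether to lighten or darken the edge pixels.

    Returns 'lighten' or 'darken' based on which side of the
    background the image sits and how large the contrast gap is.
    """
    if image_gray < bg_gray:
        # Background lighter than image: lighten the edge when the
        # gap is already large (soften), darken it when the gap is
        # small (make the edge stand out).
        gap = bg_gray - image_gray
        return 'lighten' if gap > contrast_threshold else 'darken'
    # Background darker than image: darken the edge when the gap is
    # large (soften the transition), lighten it when the gap is small
    # (raise the contrast).
    gap = image_gray - bg_gray
    return 'darken' if gap > contrast_threshold else 'lighten'
```

For instance, a dark image on a white background with a large gap would have its edge lightened to blend naturally, while a near-white image on the same background would have its edge darkened to stay visible.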
In one embodiment, in step S3, adjusting the color shade of the acquired pixels to process the edge area includes: changing the gray value of each pixel belonging to the edge region to 0.
Changing the edge of the original image to 0 gives the image printed from the resulting dot matrix data a black edge area, highlighting the edge and giving the image a clear outline.
In one embodiment, in step S3, adjusting the color shade of the acquired pixels to process the edge area includes: changing the gray value of each pixel belonging to the edge area to the gray value of the background of the original image.
Changing the gray value of the edge area of the original image to the gray value of the background shrinks the original image, which can meet particular use requirements of the image.
In one embodiment, as shown in fig. 6, in step S3, adjusting the color shade of the acquired pixels to process the edge area includes: step S31, adjusting the color shade of the pixels belonging to the first edge region, the processing of the first edge region being complete once every such pixel has been adjusted; and step S32, adjusting the color shade of the pixels belonging to the second edge region, the processing of the second edge region being complete once every such pixel has been adjusted.
In step S31, adjusting the color shading degree of the pixel belonging to the first edge region, and after completing the adjustment of each pixel belonging to the first edge region, implementing the processing of the first edge region, including: for any pixel belonging to the first edge region, the gray value of the pixel is added with a numerical value delta h2 corresponding to the gray value of the pixel, so that the color shade degree of the pixel is changed.
In step S31, the adjustment of the color shading degree of the pixel belonging to the first edge region is performed, and after the adjustment of each pixel belonging to the first edge region is completed, the processing of the first edge region is performed, and the method further includes: before adding a corresponding value Δh2 to the gray value of the pixel belonging to any one of the first edge regions, calculating and obtaining a value Δh2 corresponding to the gray value of the pixel by using a formula Δh2= (255-h 2) ×f2, where h2 is the gray value of the pixel, f2 is equal to or greater than-1 and equal to or less than 1, and Δh2 is the value corresponding to the gray value of the pixel.
In step S32, adjusting the color shade degree of the pixels belonging to the second edge region, and implementing the processing of the second edge region after adjusting each pixel belonging to the second edge region, includes: for any pixel belonging to the second edge region, adding to the gray value of the pixel a value Δh3 corresponding to that gray value, thereby changing the color shade degree of the pixel.
In step S32, the adjusting of the color shade degree of the pixels belonging to the second edge region further includes: before adding the corresponding value Δh3 to the gray value of any pixel belonging to the second edge region, calculating the value Δh3 corresponding to the gray value of that pixel by the formula Δh3 = (255 - h3) × f3, where h3 is the gray value of the pixel, f3 is greater than or equal to -1 and less than or equal to 1, and Δh3 is the value corresponding to the gray value of the pixel.
By making f2 and f3 different, the color gradation change degree of each pixel in the first edge area is different from the color gradation change degree of each pixel in the second edge area, so that the edge of the original image can be made to present a gradual change effect, and the display effect of the original image is improved.
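The two-step gradual-change effect described above can be sketched in code. This is a minimal illustration, assuming gray values are plain integers and that results are rounded and clamped to [0, 255]; the rounding and clamping are assumptions added here, not stated in the text:

```python
def adjust(h, f):
    """Apply delta_h = (255 - h) * f to a gray value h, with -1 <= f <= 1.
    Rounding and clamping to [0, 255] are assumptions added here."""
    return max(0, min(255, round(h + (255 - h) * f)))


def gradient_edges(first_edge_grays, second_edge_grays, f2, f3):
    """Adjust the two edge regions with different factors f2 != f3,
    so the edge of the image shows a gradual (gradient) transition."""
    return ([adjust(h, f2) for h in first_edge_grays],
            [adjust(h, f3) for h in second_edge_grays])
```

With f2 = 0.8 and f3 = 0.4, for example, pixels of gray value 100 in the first edge region fade to 224 while those in the second edge region fade only to 162, producing the stepped transition toward the background.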
In one embodiment, after step S3, further comprising: after finishing the adjustment of each pixel belonging to the edge area, obtaining a second image; and performing ink-jet printing by using the second image.
Using the second image, performing inkjet printing, comprising: acquiring image dot matrix data containing dot data for characterizing the ink output of each nozzle by using the second image; inkjet printing is performed using the image dot matrix data.
An embodiment of the present invention provides an image edge processing apparatus, as shown in fig. 7, which includes a first acquisition module 1, a second acquisition module 2, and an adjustment module 3.
A first acquisition module 1 for acquiring an original image; a second acquisition module 2 for acquiring one or more pixels belonging to an edge area of the original image; and the adjusting module 3 is used for adjusting the color shade degree of the acquired pixels so as to realize the processing of the edge area after each pixel belonging to the edge area is adjusted.
A pixel is a basic unit constituting an image: a small area of a single color. The color shade degree of the image edge region can be adjusted by adjusting the color shade degree of each pixel belonging to the image edge.
The original image includes objects, text, color blocks, graphics, etc.
The edge region of the original image is the region of the original image adjacent to the background. By adjusting the color shading degree of each pixel of the edge area of the original image by the adjustment module 3, the color shading degree of the edge area of the original image can be changed, thereby improving the display effect of the original image.
In one embodiment, the edge region includes a first edge region. As shown in fig. 8, the second acquisition module 2 includes: a first acquisition sub-module 21 and a first determination sub-module 22.
A first obtaining sub-module 21, configured to obtain, for any pixel in the original image background, each pixel with a city distance between the surrounding of the pixel and the pixel being smaller than a first set value; a first judging sub-module 22, configured to judge, for any pixel that has been acquired, whether the pixel is a pixel in the background; if the pixel is not the pixel in the background, the pixel belongs to the first edge area, and the pixel is acquired.
The first edge region is a region of the original image adjacent to the background. The background of the original image is typically solid. The pixels in the original image background are typically identical. Each pixel of the first edge region is different from a pixel in the original image background. Because each pixel of the first edge area is different from the pixels in the background of the original image and the first edge area is adjacent to the background of the original image, each pixel of the first edge area can be acquired by using the pixels in the background of the original image, so that the color shade degree of each acquired pixel of the first edge area can be adjusted, and the display effect of the image is changed.
On a medium containing the original image and the original image background, if the center position coordinates of one pixel are (x1, y1) and the center position coordinates of another pixel are (x2, y2), the city distance between the two pixels is d = |x1 - x2| + |y1 - y2|. The medium here includes paper, plastic sheet, cloth, and the like. The first set value is greater than or equal to the city distance between two adjacent pixels in the original image. By adjusting the size of the first set value, the first obtaining sub-module 21 can adjust the number of acquired pixels whose city distance to a given pixel is smaller than the first set value, thereby adjusting the width of the obtained first edge region.
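The city distance (i.e. the city-block, or Manhattan, distance) follows directly from the formula above. The sketch below also includes a hypothetical helper, `neighbours_within`, enumerating the coordinates a sub-module would examine for a given set value:

```python
def city_distance(p1, p2):
    """City distance d = |x1 - x2| + |y1 - y2| between two pixel centers."""
    (x1, y1), (x2, y2) = p1, p2
    return abs(x1 - x2) + abs(y1 - y2)


def neighbours_within(pixel, set_value):
    """Coordinates around `pixel` whose city distance to it is smaller
    than `set_value` (the pixel itself is excluded)."""
    x, y = pixel
    return [(x + dx, y + dy)
            for dx in range(-set_value + 1, set_value)
            for dy in range(-set_value + 1, set_value)
            if 0 < abs(dx) + abs(dy) < set_value]
```

With a set value of 2, exactly the four 4-connected neighbours are returned; larger set values widen the examined neighbourhood, and hence the acquired edge region.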
The pixels in the original image background include pixels adjacent to the first edge region and pixels farther from the first edge region. If a pixel in the original image background is adjacent to the original image, the pixels acquired by the first obtaining sub-module 21 around that pixel, whose city distance to it is smaller than the first set value, include one or more pixels of the original image. The acquired pixels of the original image belong to the first edge region.
If a pixel in the original image background is far from the first edge region, the pixels acquired by the first obtaining sub-module 21 around that pixel, whose city distance to it is smaller than the first set value, may not include any pixel of the original image, and no pixel belonging to the first edge region can be acquired from that pixel.
In one embodiment, the first determining sub-module 22 determines, for any pixel that has been acquired, whether the pixel is a pixel in the background, and if the pixel is not a pixel in the background, the pixel belongs to the first edge area, and acquires the pixel, including: the first judging submodule 22 judges, for any one of the acquired pixels, whether the gray value of the pixel is equal to the gray value of the pixel in the original image background, and if the gray value of the pixel is not equal to the gray value of the pixel in the original image background, the pixel belongs to the first edge region.
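The first edge region acquisition described in this and the preceding paragraphs can be sketched as follows. This is a minimal sketch, assuming a grayscale image stored as a list of rows of integer gray values, a solid single-valued background, and the "city distance smaller than the first set value" neighbourhood; all names are illustrative:

```python
def first_edge_region(image, background_gray, first_set_value):
    """For every background pixel, examine each surrounding pixel whose
    city distance to it is smaller than the first set value; any such
    pixel whose gray value differs from the background gray value
    belongs to the first edge region.  Returns a set of (x, y) coords."""
    h, w = len(image), len(image[0])
    d = first_set_value
    edge = set()
    for y in range(h):
        for x in range(w):
            if image[y][x] != background_gray:
                continue  # start only from background pixels
            for dy in range(-d + 1, d):
                for dx in range(-d + 1, d):
                    if not (0 < abs(dx) + abs(dy) < d):
                        continue
                    nx, ny = x + dx, y + dy
                    if (0 <= nx < w and 0 <= ny < h
                            and image[ny][nx] != background_gray):
                        edge.add((nx, ny))
    return edge
```

Note that interior pixels of the image are never reached: only pixels within the set-value distance of some background pixel are collected, which is what confines the result to a band along the image boundary.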
The gray value of a pixel is used to characterize the color shade level of the pixel. When the original image is an 8-bit image, the gray value of each pixel of the original image is between 0 and 255. In the original image, for a pixel, the larger the gray value of the pixel, the lighter the color of the pixel. The smaller the gray value of a pixel, the more intense the color of that pixel. When the original image is a black and white original image, if the gray value of a pixel is 0, the pixel is black. If the gray value of a pixel is 255, the pixel is white. When the gray value of a pixel is between 0 and 255, the color of the pixel is gray between black and white.
In a color original image, each pixel is composed of the three primary colors R, G and B, where R is red, G is green and B is blue. The gray value of any pixel in the original image includes: the R gray value, the G gray value and the B gray value forming the pixel. If the R, G and B gray values of a pixel are all 0, the pixel is black; if the R, G and B gray values of a pixel are all 255, the pixel is white; if each of the R, G and B gray values of a pixel is greater than 0 and less than 255, the pixel is colored.
When the original image is color, its background is generally black or white. When the background of the original image is black or white and the original image is color, the first judging sub-module 22 judging, for any acquired pixel, whether the gray value of the pixel is equal to the gray value of the pixels in the original image background includes: the first judging sub-module 22 calculates the total gray value of the pixel from its R, G and B gray values, and judges whether this total gray value is equal to the gray value of the pixels in the original image background; if it is not equal, the pixel belongs to the first edge region.
The first judging sub-module 22 calculating the total gray value of the pixel from its R, G and B gray values includes: calculating the total gray value of the pixel by the formula Gray = (R + G + B) / 3, where R is the R gray value of the pixel, G is the G gray value of the pixel, B is the B gray value of the pixel, and Gray is the calculated total gray value of the pixel.
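The averaging formula translates directly into code. Rounding the result to the nearest integer is an assumption added here; the text gives only the division:

```python
def total_gray(r, g, b):
    """Total gray value of an RGB pixel: Gray = (R + G + B) / 3.
    Rounding to the nearest integer is an assumption added here."""
    return round((r + g + b) / 3)
```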
When the original image is color and the background of the original image is generally a solid color other than black and white, the first judging sub-module 22 judging, for any acquired pixel, whether the gray value of the pixel is equal to the gray value of the pixels in the original image background includes: the first judging sub-module 22 judges whether the R gray value of the pixel equals the R gray value of the background pixels, whether its G gray value equals the G gray value of the background pixels, and whether its B gray value equals the B gray value of the background pixels; if any one of the R, G and B gray values of the pixel differs from that of the background pixels, the pixel belongs to the first edge region.
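The per-channel comparison above amounts to a simple predicate; a minimal sketch, with pixels represented as (R, G, B) tuples:

```python
def differs_from_background(pixel_rgb, background_rgb):
    """True if any one of the R, G, B gray values of the pixel differs
    from the corresponding gray value of the background pixels, in
    which case the pixel belongs to the first edge region."""
    return any(p != b for p, b in zip(pixel_rgb, background_rgb))
```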
In one embodiment, the edge region includes a second edge region.
As shown in fig. 9, the second acquisition module 2 further includes: a second acquisition sub-module 23 and a second determination sub-module 24.
A second obtaining sub-module 23, configured to obtain, for any pixel belonging to the first edge area, each pixel having a city distance between the surrounding of the pixel and the pixel that is smaller than a second set value; a second determining sub-module 24, configured to determine, for any pixel that has been acquired, whether the pixel belongs to the first edge region, and if the pixel does not belong to the first edge region and the pixel is not a pixel in the background, then the pixel belongs to the second edge region, and acquire the pixel.
The second edge area is an area adjacent to the first edge area and not adjacent to the background in the original image. The second edge area is processed, so that the second edge area is different from the first edge area, the edge area of the original image is enabled to show a gradual change effect, and the overall display effect of the image is improved.
The second set value is greater than or equal to the city distance between two adjacent pixels in the original image. By adjusting the size of the second set value, the second obtaining sub-module 23 can adjust the number of acquired pixels whose city distance to a pixel of the first edge region is smaller than the second set value, thereby adjusting the width of the obtained second edge region.
If a pixel in the first edge region is adjacent to the second edge region, the second obtaining sub-module 23 can obtain pixels belonging to the second edge region by obtaining pixels around the pixel and having a city distance between the pixels smaller than the second set value. And further pixels belonging to the second edge region can be processed.
For any pixel in the first edge region, each pixel acquired by the second obtaining sub-module 23 whose city distance to that pixel is smaller than the second set value may be a pixel in the background, a pixel in the first edge region, or a pixel in the second edge region. If the second judging sub-module 24 judges that an acquired pixel is neither a pixel of the first edge region nor a pixel in the background, the pixel belongs to the second edge region.
In one embodiment, the second judging sub-module 24 judging, for any acquired pixel, whether the pixel belongs to the first edge region or is a pixel in the background, and, if the pixel neither belongs to the first edge region nor is a pixel in the background, acquiring the pixel as belonging to the second edge region, includes: the second judging sub-module 24 judges, for any acquired pixel, whether the gray value of the pixel is equal to the gray value of the pixels in the original image background or to the gray value of the pixels in the first edge region; if the gray value of the pixel is equal to neither, the pixel belongs to the second edge region.
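The second edge region acquisition can be sketched along the same lines as the first. A minimal sketch, assuming a grayscale row-list image, a solid background, and a precomputed set of first-edge coordinates; all names are illustrative:

```python
def second_edge_region(image, background_gray, first_edge, second_set_value):
    """Neighbours (city distance < second set value) of first-edge pixels
    that are neither background pixels nor first-edge pixels belong to
    the second edge region.  `first_edge` is a set of (x, y) coords."""
    h, w = len(image), len(image[0])
    d = second_set_value
    second = set()
    for x, y in first_edge:
        for dy in range(-d + 1, d):
            for dx in range(-d + 1, d):
                if not (0 < abs(dx) + abs(dy) < d):
                    continue
                nx, ny = x + dx, y + dy
                if not (0 <= nx < w and 0 <= ny < h):
                    continue
                if (nx, ny) in first_edge or image[ny][nx] == background_gray:
                    continue  # exclude first-edge and background pixels
                second.add((nx, ny))
    return second
```

Iterating this construction (second region from first, a third from the second, and so on) is what would let the edge band grow inward one layer at a time for the gradient effect.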
In one embodiment, the adjusting module 3 is further configured to add, for any one of the acquired pixels, a value Δh1 corresponding to the gray value of the pixel, so as to change the color shade degree of the pixel.
The gray value of a pixel represents its color shade degree, so the adjustment module 3 can change the color shade degree of a pixel by changing its gray value. In an 8-bit original image, the gray value of each pixel is between 0 and 255. The larger the gray value of a pixel, the closer the pixel is to white; the smaller the gray value, the deeper the color of the pixel.
The value Δh1 is greater than -255 and less than 255. When the adjustment module 3 adds a value Δh1 greater than -255 and less than 0 to the gray value of a pixel, the color of the pixel is deepened. When it adds a value Δh1 greater than 0 and less than 255, the color of the pixel is faded.
In one embodiment, the adjustment module 3 is further configured to calculate, before adding the corresponding value Δh1 to the gray value of any pixel belonging to the image edge area, the value Δh1 corresponding to the gray value of the pixel by the formula Δh1 = (255 - h1) × f1, where h1 is the gray value of the pixel, f1 is greater than or equal to -1 and less than or equal to 1, and Δh1 is the value corresponding to the gray value of the pixel.
When the adjustment module 3 fades the color of a pixel, f1 is greater than 0 and less than 1; when it deepens the color of a pixel, f1 is greater than -1 and less than 0. The gray value h1 of the pixel satisfies 0 ≤ h1 ≤ 255. For any pixel acquired by the adjustment module 3, the adjusted gray value is h2 = h1 + (255 - h1) × f1. By constraining f1 to -1 ≤ f1 ≤ 1, the calculated gray value h2 can be kept between 0 and 255.
For any pixel, when the adjustment module 3 deepens the color of the pixel by 50%, f1 = -0.5; when it fades the pixel by 50%, f1 = 0.5.
For example, suppose the gray value of a pixel in the original image is 128. When the adjustment module 3 deepens the pixel by 50%, the adjusted gray value is h2 = h1 + Δh1 = h1 + (255 - h1) × f1 = 128 + (255 - 128) × (-0.5) = 64.5, approximately 64. When the adjustment module 3 fades the pixel by 50%, the adjusted gray value is h2 = 128 + (255 - 128) × 0.5 = 191.5, approximately 192.
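The worked example can be checked in code. A minimal sketch, where rounding to the nearest integer and clamping to [0, 255] are assumptions added here (the text gives only the formula):

```python
def adjust_gray(h1, f1):
    """Adjusted gray value h2 = h1 + delta_h1, delta_h1 = (255 - h1) * f1.
    f1 > 0 fades the pixel toward white; f1 < 0 deepens its color.
    Rounding and clamping to [0, 255] are assumptions added here."""
    h2 = h1 + (255 - h1) * f1
    return max(0, min(255, round(h2)))
```

Deepening a gray value of 128 by 50% (f1 = -0.5) yields about 64, and fading it by 50% yields about 192, matching the example above.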
In one embodiment, the adjusting module 3 is further configured to adjust the gray value of each pixel in the edge area to a third set value, where the third set value is greater than or equal to 0 and less than or equal to 255.
The adjustment module 3 can make the edge area of the original image be solid color by adjusting the gray values of the edge area to the third set value, so that the edge contour of the original image can be highlighted, and the display effect of the image can be changed.
In one embodiment, the adjustment module 3 is further configured to increase the gray value of each pixel belonging to the edge area.
In one embodiment, if the gray value of the original image is smaller than the gray value of the original image background, the background is whiter than the original image. When the difference between the gray value of the background and the gray value of the edge area of the original image is greater than a fourth set value, the adjustment module 3 increases the gray value of each pixel of the edge area. Increasing the gray value of each pixel in the edge area lightens its color, so that the original image transitions naturally into the background, and the ink consumption in the printing process is reduced. The fourth set value is greater than 0.
In one embodiment, if the gray value of the original image background is smaller than the gray value of the original image, the background is darker than the original image. When the difference between the gray value of the edge area of the original image and the gray value of the background is smaller than a fifth set value, the adjustment module 3 increases the gray value of each pixel of the edge area. Increasing the gray value of each pixel of the edge area increases the contrast between the edge area and the background, improving the display effect of the original image. The fifth set value is greater than 0.
In one embodiment, the adjustment module 3 is further configured to reduce the gray value of each pixel belonging to the edge area.
In one embodiment, if the gray value of the original image is smaller than the gray value of the original image background, the background is whiter than the original image. When the difference between the gray value of the background and the gray value of the edge area of the original image is smaller than a sixth set value, the adjustment module 3 reduces the gray value of each pixel of the edge area, deepening the color of the edge area, increasing the contrast between the original image and the background, and improving the display effect of the original image. The sixth set value is greater than 0.
In one embodiment, if the gray value of the original image background is smaller than the gray value of the original image, the background is darker than the original image. When the difference between the gray value of each pixel in the edge area of the original image and the gray value of the background is greater than a seventh set value, the adjustment module 3 reduces the gray value of each pixel of the edge area. Reducing the gray value of each pixel of the edge area makes the transition between the edge area and the background natural, improving the display effect of the original image. The seventh set value is greater than 0.
In one embodiment, the adjustment module 3 is further configured to change the gray value of each pixel belonging to the edge area to 0.
By using the adjustment module 3 to change the gray value of the edge area of the original image to 0, the image printed from the dot matrix data of the second image has a black edge area, so that the edge of the image is highlighted and its outline is clear.
In an embodiment, the adjusting module 3 is further configured to change the gray value of each pixel belonging to the edge area to the gray value of the original image background.
By using the adjustment module 3 to change the gray value of the edge area of the original image into the gray value of the original image background, the original image can be reduced in size, meeting the usage requirements for the original image.
In one embodiment, as shown in fig. 10, the adjustment module 3 includes a first adjustment sub-module 31 and a second adjustment sub-module 32. A first adjustment submodule 31, configured to adjust a color shading degree of pixels belonging to a first edge area, and implement processing of the first edge area after adjustment of each pixel belonging to the first edge area is completed; the second adjusting sub-module 32 is configured to adjust the color shade degree of the pixels belonging to the second edge area, and implement the processing of the second edge area after completing the adjustment of the pixels belonging to the second edge area.
The first adjustment sub-module 31 adjusting the color shade degree of the pixels belonging to the first edge region, and implementing the processing of the first edge region after adjusting each pixel belonging to the first edge region, includes: the first adjustment sub-module 31 adds, to the gray value of any pixel belonging to the first edge region, a value Δh2 corresponding to that gray value, thereby changing the color shade degree of the pixel.
The first adjustment sub-module 31 adjusting the color shade degree of the pixels belonging to the first edge region further includes: before adding the corresponding value Δh2 to the gray value of any pixel belonging to the first edge region, the first adjustment sub-module 31 calculates the value Δh2 corresponding to the gray value of that pixel by the formula Δh2 = (255 - h2) × f2, where h2 is the gray value of the pixel, f2 is greater than or equal to -1 and less than or equal to 1, and Δh2 is the value corresponding to the gray value of the pixel.
The second adjustment sub-module 32 adjusting the color shade degree of the pixels belonging to the second edge region, and implementing the processing of the second edge region after adjusting each pixel belonging to the second edge region, includes: the second adjustment sub-module 32 adds, to the gray value of any pixel belonging to the second edge region, a value Δh3 corresponding to that gray value, thereby changing the color shade degree of the pixel.
The second adjustment sub-module 32 adjusting the color shade degree of the pixels belonging to the second edge region further includes: before adding the corresponding value Δh3 to the gray value of any pixel belonging to the second edge region, the second adjustment sub-module 32 calculates the value Δh3 corresponding to the gray value of that pixel by the formula Δh3 = (255 - h3) × f3, where h3 is the gray value of the pixel, f3 is greater than or equal to -1 and less than or equal to 1, and Δh3 is the value corresponding to the gray value of the pixel.
The adjusting module 3 can make the color gradation degree of each pixel of the first edge area different from the color gradation degree of each pixel of the second edge area by making f2 and f3 different, so that the edge of the original image can be made to present a gradual change effect, and the display effect of the original image is improved.
In one embodiment, the device further comprises a printing module 4, the printing module 4 being electrically connected to the adjustment module 3. A printing module 4, configured to obtain a second image after completing adjustment of each pixel belonging to the edge area; and performing ink-jet printing by using the second image.
The printing module 4 performs inkjet printing using the second image, including: the printing module 4 acquires image dot matrix data containing dot data for representing the ink output of each nozzle by using the second image; inkjet printing is performed using the image dot matrix data.
Referring to fig. 11, corresponding to the image edge processing method of the above embodiments, an embodiment of the present invention further provides an image edge processing device, which mainly includes:
at least one processor 401; the method comprises the steps of,
a memory 402 communicatively coupled to the at least one processor; wherein,
the memory 402 stores instructions executable by the at least one processor 401 to enable the at least one processor 401 to perform the methods described in the above embodiments of the present invention. For a detailed description of the apparatus, please refer to the above embodiment, and the detailed description is omitted herein.
In particular, the processor 401 described above may include a Central Processing Unit (CPU), or an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), or may be configured as one or more integrated circuits implementing embodiments of the present invention.
Memory 402 may include mass storage for data or instructions. By way of example, and not limitation, memory 402 may comprise a hard disk drive (HDD), a floppy disk drive, flash memory, an optical disk, a magneto-optical disk, magnetic tape, a universal serial bus (USB) drive, or a combination of two or more of these. Memory 402 may include removable or non-removable (or fixed) media, where appropriate. Memory 402 may be internal or external to the data processing apparatus, where appropriate. In a particular embodiment, the memory 402 is a non-volatile solid-state memory. In a particular embodiment, the memory 402 includes read-only memory (ROM). The ROM may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically rewritable ROM (EAROM), or flash memory, or a combination of two or more of these, where appropriate.
The processor 401 implements any of the image edge processing methods of the above embodiments by reading and executing computer program instructions stored in the memory 402.
In one example, the image edge processing device may also include a communication interface 403 and a bus 410. As shown in fig. 11, the processor 401, the memory 402, and the communication interface 403 are connected to each other by a bus 410 and perform communication with each other.
The communication interface 403 is mainly used to implement communication between each module, device, unit and/or apparatus in the embodiment of the present invention.
Bus 410 includes hardware, software, or both, coupling the components of the image edge processing device to each other. By way of example, and not limitation, the bus may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a Front Side Bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an InfiniBand interconnect, a Low Pin Count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a Serial Advanced Technology Attachment (SATA) bus, a Video Electronics Standards Association local (VLB) bus, another suitable bus, or a combination of two or more of these. Bus 410 may include one or more buses, where appropriate. Although embodiments of the invention are described and illustrated with a particular bus, the invention contemplates any suitable bus or interconnect.
In addition, in combination with the image edge processing method in the above embodiment, the embodiment of the present invention may be implemented by providing a computer readable storage medium. The computer readable storage medium has stored thereon computer program instructions; the computer program instructions, when executed by a processor, implement any of the image edge processing methods of the above embodiments.
In summary, the image edge processing method, device, equipment and storage medium provided by the embodiments of the present invention acquire each pixel of the original image background and then, by means of a purely computational algorithm based on mathematical modeling, solve the technical problem that the edges of an image cannot be processed in the prior art.
It should be understood that the invention is not limited to the particular arrangements and instrumentality described above and shown in the drawings. For the sake of brevity, a detailed description of known methods is omitted here. In the above embodiments, several specific steps are described and shown as examples. However, the method processes of the present invention are not limited to the specific steps described and shown, and those skilled in the art can make various changes, modifications and additions, or change the order between steps, after appreciating the spirit of the present invention. These are intended to be within the scope of the present invention.

Claims (6)

1. A method of image edge processing, the method comprising:
step S1: acquiring an original image;
step S2: acquiring pixels belonging to an edge area of the original image, wherein the edge area comprises a first edge area, and the first edge area is adjacent to a background of the original image; the step S2 includes:
step S21: for any pixel in the original image background, acquiring each pixel around the pixel whose city distance to the pixel is smaller than a first set value; if the center position coordinates of one pixel are (x1, y1) and the center position coordinates of the other pixel are (x2, y2), the city distance between the two pixels is d = |x1 - x2| + |y1 - y2|;
step S22: judging whether the pixel is the pixel in the background or not according to any acquired pixel; if the pixel is not the pixel in the background, the pixel belongs to the first edge area, and the pixel is acquired;
step S3: adjusting the color shade degree of the acquired pixels, and implementing the processing of the edge area after adjusting each pixel belonging to the edge area, including: for any acquired pixel, adding to the gray value of the pixel a value Δh1 corresponding to that gray value, so as to change the color shade degree of the pixel; before adding the corresponding value Δh1 to the gray value of any pixel belonging to the image edge area, calculating the value Δh1 corresponding to the gray value of the pixel by the formula Δh1 = (255 - h1) × f1, where h1 is the gray value of the pixel, f1 is greater than or equal to -1 and less than or equal to 1, and Δh1 is the value corresponding to the gray value of the pixel.
2. The method of claim 1, wherein the edge area further comprises: a second edge area adjacent to the first edge area and not adjacent to the background of the original image;
the step S2 further includes:
step S23: for any pixel belonging to the first edge area, acquiring each surrounding pixel whose city distance from that pixel is smaller than a second set value;
step S24: for any acquired pixel, judging whether it belongs to the first edge area or is a pixel in the background; if the pixel neither belongs to the first edge area nor is a pixel in the background, the pixel belongs to the second edge area and is acquired.
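Claim 2's second edge area can be obtained by one more pass of the same neighborhood test, this time seeded from the first edge area instead of the background (steps S23-S24). The sketch below is again an illustrative reading, not the patentee's implementation; it assumes boolean masks for the background and the first edge area, and the function name is hypothetical.

```python
import numpy as np

def second_edge_area(background, first_edge, second_set_value=2):
    """Boolean mask of the second edge area: pixels whose city distance to
    some first-edge pixel is smaller than second_set_value, and that are
    neither background nor first-edge pixels (steps S23-S24)."""
    h, w = first_edge.shape
    second = np.zeros_like(first_edge)
    r = second_set_value - 1  # largest offset with city distance < second_set_value
    ys, xs = np.nonzero(first_edge)
    for y, x in zip(ys, xs):          # S23: scan every first-edge pixel
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                if abs(dy) + abs(dx) >= second_set_value:
                    continue
                ny, nx = y + dy, x + dx
                if (0 <= ny < h and 0 <= nx < w
                        and not background[ny, nx]
                        and not first_edge[ny, nx]):
                    second[ny, nx] = True  # S24: neither background nor first edge
    return second
```

Repeating this pattern with further set values would peel off successive rings of the image interior, each of which could then be adjusted with its own factor f.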
3. The method according to any one of claims 1-2, further comprising, after step S3: obtaining a second image after the adjustment of each pixel belonging to the edge area has been completed; and performing ink-jet printing using the second image.
4. An image edge processing apparatus, the apparatus comprising:
the first acquisition module is used for acquiring an original image;
a second acquisition module, configured to acquire pixels belonging to an edge area of the original image, including: for any pixel in the background of the original image, acquiring each surrounding pixel whose city distance from that pixel is smaller than a first set value; if the center position coordinates of one pixel are (x1, y1) and those of the other pixel are (x2, y2), the city distance between the two pixels is d = |x1 - x2| + |y1 - y2|; for any acquired pixel, judging whether it is a pixel in the background; if it is not a pixel in the background, the pixel belongs to a first edge area and is acquired, wherein the edge area comprises the first edge area, and the first edge area is adjacent to the background of the original image;
an adjusting module, configured to adjust the color shade degree of the acquired pixels, the processing of the edge area being implemented once each pixel belonging to the edge area has been adjusted, including: for any one of the acquired pixels, adding a value Δh1 corresponding to the gray value of the pixel, so as to change the color shade degree of the pixel; before the corresponding value Δh1 is added to the gray value of any pixel belonging to the edge area of the image, the value Δh1 corresponding to the gray value of the pixel is calculated using the formula Δh1 = (255 - h1) × f1, where h1 is the gray value of the pixel and f1 satisfies -1 ≤ f1 ≤ 1.
5. An image edge processing apparatus, the apparatus comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-3.
6. A computer storage medium having computer program instructions stored thereon, wherein the computer program instructions, when executed by a processor, implement the method of any one of claims 1-3.
CN202010310001.1A 2020-04-20 2020-04-20 Image edge processing method, device, equipment and storage medium Active CN113538479B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010310001.1A CN113538479B (en) 2020-04-20 2020-04-20 Image edge processing method, device, equipment and storage medium


Publications (2)

Publication Number Publication Date
CN113538479A CN113538479A (en) 2021-10-22
CN113538479B true CN113538479B (en) 2023-07-14

Family

ID=78123537

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010310001.1A Active CN113538479B (en) 2020-04-20 2020-04-20 Image edge processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113538479B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1834582A (en) * 2005-03-15 2006-09-20 欧姆龙株式会社 Image processing method, three-dimensional position measuring method and image processing apparatus
CN103164702A (en) * 2011-12-13 2013-06-19 李卫伟 Extracting method and device of marker central point and image processing system
CN103514595A (en) * 2012-06-28 2014-01-15 中国科学院计算技术研究所 Image salient region detecting method
CN104574312A (en) * 2015-01-06 2015-04-29 深圳市元征软件开发有限公司 Method and device of calculating center of circle for target image
CN106067933A (en) * 2015-04-21 2016-11-02 柯尼卡美能达株式会社 Image processing apparatus and image processing method
CN108022233A (en) * 2016-10-28 2018-05-11 沈阳高精数控智能技术股份有限公司 A kind of edge of work extracting method based on modified Canny operators
CN110163119A (en) * 2019-04-30 2019-08-23 中国地质大学(武汉) A kind of finger vein identification method and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6381183B2 (en) * 2013-07-09 2018-08-29 キヤノン株式会社 Apparatus, method, and program for extending object included in image data




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 518000 a201-a301, building a, Sino German European Industrial Demonstration Park, Hangcheng Avenue, guxing community, Xixiang street, Bao'an District, Shenzhen, Guangdong

Patentee after: Shenzhen Hansen Software Co.,Ltd.

Address before: 1701, 1703, building C6, Hengfeng Industrial City, 739 Zhoushi Road, Hezhou community, Hangcheng street, Bao'an District, Shenzhen, Guangdong 518000

Patentee before: SHENZHEN HOSONSOFT Co.,Ltd.
