CN113409222A - Image processing method, printing-related apparatus, and readable storage medium - Google Patents

Image processing method, printing-related apparatus, and readable storage medium

Info

Publication number
CN113409222A
Authority
CN
China
Prior art keywords
images
image
slice
pixel point
edge pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110734089.4A
Other languages
Chinese (zh)
Inventor
刘鹏
欧阳欣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Anycubic Technology Co Ltd
Original Assignee
Shenzhen Anycubic Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Anycubic Technology Co Ltd filed Critical Shenzhen Anycubic Technology Co Ltd
Priority to CN202110734089.4A priority Critical patent/CN113409222A/en
Publication of CN113409222A publication Critical patent/CN113409222A/en
Pending legal-status Critical Current

Classifications

    • G06T5/94
    • G06T5/73
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images

Abstract

The invention provides an image processing method, a printing-related apparatus, and a readable storage medium, and relates to the technical field of printing. The image processing method comprises the following steps: acquiring N slice images of a model to be printed, wherein N is a positive integer; blurring edge pixel points in the N slice images to obtain N first intermediate images; and performing contrast stretching processing on the N first intermediate images to obtain N second intermediate images, wherein the model to be printed is generated based on the N second intermediate images. In this way, the edge pixel points in the N slice images of the model to be printed can be blurred and contrast-stretched, so that their gray values can be adjusted and the degree to which the resin cures at the positions corresponding to the edge pixel points can be effectively controlled, which effectively reduces the pixel striations of the model to be printed and improves its quality.

Description

Image processing method, printing-related apparatus, and readable storage medium
Technical Field
The present invention relates to the field of printing technologies, and in particular, to an image processing method, a printing related apparatus, and a readable storage medium.
Background
With the development of three-dimensional (3D) printing technology, 3D printing devices, especially photo-curing printing devices, are widely used. During 3D printing with a photo-curing printing apparatus, each slice image must be exposed with an ultraviolet light source, and the resin is cured layer by layer according to the exposure to form the printed model. Because the existing printing approach cures the resin at the positions corresponding to the edge pixel points of each slice image directly based on that slice image, the resulting printed model shows obvious pixel striations.
Disclosure of Invention
The embodiment of the invention provides an image processing method, a printing-related apparatus, and a readable storage medium, to solve the problem that a printing model obtained by the existing printing approach shows obvious pixel striations.
In a first aspect, an embodiment of the present invention provides an image processing method, where the method includes:
acquiring N slice images of a model to be printed, wherein N is a positive integer;
blurring edge pixel points in the N slice images to obtain N first intermediate images;
and performing contrast stretching processing on the N first intermediate images to obtain N second intermediate images, wherein the model to be printed is generated based on the N second intermediate images.
Optionally, the blurring processing of edge pixel points in the N slice images to obtain N first intermediate images includes:
carrying out binarization processing on the N slice images to obtain N binarized images;
acquiring edge pixel points corresponding to each binary image in the N binary images;
and carrying out fuzzy processing on the edge pixel points in the N layer slice images to obtain N first intermediate images.
Optionally, the obtaining of the edge pixel point corresponding to each binarized image in the N binarized images includes:
acquiring a neighborhood pixel value of a first pixel point in a binary image to be judged, wherein the binary image to be judged is any one of the N binary images, the first pixel point is any one of the pixel points in the binary image to be judged, and the neighborhood pixel value is the pixel value of each pixel point adjacent to the first pixel point;
and under the condition that the neighborhood pixel values comprise at least one first pixel value, determining the first pixel point as an edge pixel point corresponding to the binarized image to be judged, wherein the first pixel value is used for representing a pixel value corresponding to black.
Optionally, the blurring processing of the edge pixel points in the N slice images to obtain N first intermediate images includes:
acquiring a fuzzy processing parameter, wherein the fuzzy processing parameter is determined according to a first input, and the first input is an input operation executed by a user aiming at the fuzzy processing parameter;
determining a sliding window participating in the blurring processing based on the blurring processing parameter, wherein the sliding window is used for determining a pixel point set participating in the blurring processing in a first layer slice image every time, and the first layer slice image is any one layer slice image in the N layer slice images;
sequentially acquiring target gray values corresponding to all edge pixel points in the first layer of slice images based on the sliding window;
and determining a first intermediate image corresponding to the first layer of slice image according to the target gray value.
Optionally, the sequentially obtaining target gray values corresponding to edge pixel points in the first layer slice image based on the sliding window includes:
sequentially sliding the sliding window in the first layer slice image according to a preset step length, wherein the preset step length is used for representing the distance of each movement of the sliding window;
when the pixel point corresponding to the target position of the sliding window is a first edge pixel point, performing convolution calculation on a pixel point set corresponding to the current sliding window and a preset convolution kernel to obtain a convolution result, wherein the preset convolution kernel is a matrix of M x M, the value of M is determined according to the fuzzy processing parameter, M is an integer greater than 1, and the first edge pixel point is any edge pixel point in the first layer of slice image;
and determining the target gray value of the first edge pixel point according to the convolution result.
Optionally, before the blurring processing is performed on the edge pixel points in the N slice images to obtain N first intermediate images, the method further includes:
performing anti-aliasing processing on the N slice images;
the blurring processing is performed on the edge pixel points in the N slice images to obtain N first intermediate images, including:
and carrying out fuzzy processing on edge pixel points in the N layers of sliced images after anti-aliasing processing to obtain N first intermediate images.
Optionally, the performing contrast stretching processing on the N first intermediate images to obtain N second intermediate images includes:
acquiring a target gray scale level, wherein the target gray scale level is determined according to a second input, and the second input is an input operation executed by a user aiming at the gray scale level;
determining a gray value corresponding to the target gray scale level based on a preset mapping relation, wherein the preset mapping relation is used for indicating the mapping relation between the gray scale level and the gray value;
and adjusting the gray value of each pixel point in the N first intermediate images according to the gray value corresponding to the target gray level to obtain N second intermediate images.
In a second aspect, an embodiment of the present invention further provides a printing method, where the printing method includes:
acquiring an image to be printed, and printing a model according to the image to be printed; the image to be printed is N second intermediate images generated by the image processing method according to the first aspect.
In a third aspect, an embodiment of the present invention further provides a printing system, including: an image processing apparatus and a printing device;
the image processing apparatus is configured to execute the image processing method according to the first aspect;
and the printing equipment uses the N second intermediate images output by the image processing device and obtains a model to be printed according to the N second intermediate images.
In a fourth aspect, embodiments of the present invention further provide a printing apparatus, including a processor, a memory, and a program or instructions stored on the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the image processing method according to the first aspect.
In a fifth aspect, the embodiment of the present invention further provides a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the image processing method according to the first aspect.
In the embodiment of the invention, N slice images of a model to be printed are obtained, wherein N is a positive integer; edge pixel points in the N slice images are blurred to obtain N first intermediate images; and contrast stretching processing is performed on the N first intermediate images to obtain N second intermediate images, wherein the model to be printed is generated based on the N second intermediate images. In this way, the edge pixel points in the N slice images of the model to be printed are blurred and contrast-stretched, so that their gray values can be adjusted and the degree to which the resin cures at the positions corresponding to the edge pixel points can be effectively controlled, which effectively reduces the pixel striations of the model to be printed and improves its quality.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
FIG. 1 is a flowchart of an image processing method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a 4-neighborhood provided by an embodiment of the present invention;
FIG. 3 is a flow chart of a printing method provided by an embodiment of the invention;
FIG. 4 is a schematic structural diagram of a printing system provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of a printing apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Unless defined otherwise, technical or scientific terms used herein shall have the ordinary meaning as understood by one of ordinary skill in the art to which this invention belongs. The use of "first," "second," and similar terms in the present application does not denote any order, quantity, or importance; rather, these terms are used to distinguish one element from another. Likewise, the use of the terms "a" or "an" and the like does not denote a limitation of quantity, but rather denotes the presence of at least one. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "Upper", "lower", "left", "right", and the like are used merely to indicate relative positional relationships; when the absolute position of the object being described changes, the relative positional relationships change accordingly.
An embodiment of the present invention provides an image processing method, and referring to fig. 1, fig. 1 is a flowchart of the image processing method provided in the embodiment of the present invention. As shown in fig. 1, the method may specifically include the following steps:
step 101, obtaining N slice images of a model to be printed, wherein N is a positive integer.
Specifically, the model to be printed may be any model that needs to be 3D printed, such as a household product, a building model, or an appliance. A slice image is an image obtained by processing a graphics file containing the model to be printed with a preset graphics processing tool. During this processing, the three-dimensional model is cut into slices of a preset thickness, and each slice corresponds to one image, namely a slice image.
It should be noted that, since each slice is very thin, a slice image may be regarded as a two-dimensional image. Each model to be printed may comprise N slice images, where N may be any positive integer, such as 1, 2, 3, 4, and so on. For example, assuming that the model to be printed is a cube whose side length is N pixel points and the model is sliced in the height direction with a thickness of one pixel point, N slice images are finally obtained; superposing these N slice images together reproduces the cube with a side length of N pixel points.
Step 102, performing fuzzy processing on edge pixel points in the N slice images to obtain N first intermediate images.
In this step, edge pixel points in each slice image may be obtained first, and then the edge pixel points in each slice image are blurred to obtain N first intermediate images. The number of first intermediate images is the same as the number of slice images. The blurring may adopt a linear filtering process, such as mean filtering or Gaussian filtering, or a nonlinear filtering process, such as median filtering or bilateral filtering; this embodiment is not particularly limited.
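As a rough illustration only, the filter choices named above could be applied with OpenCV as follows; this is a minimal sketch, not the patent's mandated implementation, and the file name and 3 x 3 kernel sizes are assumptions.

```python
# Sketch of the candidate blurring filters on one grayscale slice image.
import cv2

slice_img = cv2.imread("slice_0001.png", cv2.IMREAD_GRAYSCALE)  # hypothetical slice image file

mean_blur = cv2.blur(slice_img, (3, 3))                     # linear: mean (box) filtering
gaussian_blur = cv2.GaussianBlur(slice_img, (3, 3), 0)      # linear: Gaussian filtering
median_blur = cv2.medianBlur(slice_img, 3)                  # nonlinear: median filtering
bilateral_blur = cv2.bilateralFilter(slice_img, 9, 75, 75)  # nonlinear: bilateral filtering
```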
And 103, performing contrast stretching treatment on the N first intermediate images to obtain N second intermediate images, wherein the model to be printed is generated based on the N second intermediate images.
The contrast stretching processing refers to adjusting the gray value of each pixel point in the N first intermediate images. Specifically, the gray value of every pixel point in a first intermediate image may be raised by the same amount, or different pixel points may be raised by different amounts; this embodiment is not specifically limited. The number of second intermediate images is the same as the number of first intermediate images.
It should be noted that the image processing method may be executed by a printing device, or may be executed by an electronic device independent from the printing device, such as a computer, a notebook, a tablet computer, a mobile terminal, and the like, and the embodiment is not particularly limited.
After the N second intermediate images are acquired, a model to be printed may be generated based on the N second intermediate images.
Specifically, if the image processing method is executed by the printing device, the printing device may directly perform exposure, curing and molding on the N second intermediate images to obtain a model to be printed; if the image processing method is executed by the electronic device, the electronic device may compress the N second intermediate images to generate a slice file of the model to be printed. After the printing device obtains the slice file, the slice file can be decompressed, and exposure, solidification and molding are performed on the basis of the N second intermediate images in the slice file, so that the model to be printed is finally obtained.
In this embodiment, the edge pixel points in the N layer slice images of the model to be printed can be blurred and contrast-stretched, so that the gray values of the edge pixel points in the N layer slice images can be adjusted, and the degree of curing of the resin at the corresponding positions of the edge pixel points can be effectively controlled, thereby effectively reducing the pixel striations of the model to be printed, and improving the quality of the model to be printed.
Further, the step 102 of performing a blurring process on edge pixel points in the N slice images to obtain N first intermediate images may specifically include the following steps:
carrying out binarization processing on the N slice images to obtain N binarized images;
acquiring edge pixel points corresponding to each binary image in the N binary images;
and carrying out fuzzy processing on edge pixel points in the N layer slice images to obtain N first intermediate images.
Specifically, binarization is a process of setting the gray value of each pixel point in an image to either 0 or 255, so that the whole image exhibits an obvious black-and-white effect. For example, when the gray value of a certain pixel point in the image is less than or equal to a preset threshold, the gray value is represented by 0; when the gray value of a certain pixel point in the image is greater than the preset threshold, the gray value is represented by 255. It should be noted that the preset threshold may be set according to actual needs, and this embodiment is not particularly limited.
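As a rough sketch of the thresholding rule just described, the default threshold of 127 below is an assumption, since the embodiment only requires some preset threshold chosen according to actual needs.

```python
import numpy as np

def binarize(slice_img: np.ndarray, threshold: int = 127) -> np.ndarray:
    """Set each pixel to 0 (black) or 255 (white) against a preset threshold.

    The threshold default is assumed; the embodiment leaves it open.
    """
    return np.where(slice_img > threshold, 255, 0).astype(np.uint8)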
The binarized image is a black-and-white image that highlights the outline of the slice image. In the binarized image, the white area represents the image area of the slice that needs to be printed, the black area represents the blank area that does not need to be printed, and the pixel points located where the white area and the black area adjoin are edge pixel points.
After the edge pixel points of the N binarized images are determined, the edge pixel points in each slice image can be blurred to obtain N first intermediate images. The blurring may adopt a linear filtering process, such as mean filtering or Gaussian filtering, or a nonlinear filtering process, such as median filtering or bilateral filtering; this embodiment is not particularly limited.
In this embodiment, binarization processing is performed on the N layer sliced images, and the edge pixel points corresponding to each binarized image are determined, so that blurring processing of the edge pixel points in the N layer sliced images can be realized, and compared with blurring processing of all pixel points in the whole layer sliced image, blurring processing efficiency can be greatly improved.
Further, the step of obtaining the edge pixel point corresponding to each binarized image in the N binarized images may specifically include the following steps:
acquiring a neighborhood pixel value of a first pixel point in a binary image to be judged, wherein the binary image to be judged is any one of N binary images, the first pixel point is any one of the pixel points in the binary image to be judged, and the neighborhood pixel value is the pixel value of each pixel point adjacent to the first pixel point;
and under the condition that the neighborhood pixel value comprises at least one first pixel value, determining the first pixel point as an edge pixel point corresponding to the binary image to be judged, wherein the first pixel value is used for representing the pixel value corresponding to black.
In an embodiment, after the N binarized images are obtained, the neighborhood pixel values of a first pixel point in the binarized image to be judged may be obtained; when the neighborhood pixel values of the first pixel point include at least one first pixel value, the first pixel point is an edge pixel point, and when they do not include the first pixel value, the first pixel point is a non-edge pixel point. The binarized image to be judged is any one of the N binarized images, and the first pixel point is any pixel point in that image. A neighborhood pixel value refers to the pixel value of a pixel point adjacent to the first pixel point, and the first pixel value refers to a pixel value whose gray value is 0, that is, the pixel value displayed as black. As shown in fig. 2, the pixel values of the pixel points in the 4-neighborhood of the first pixel point, i.e., in the four directions a, b, c, and d, may be obtained as the neighborhood pixel values of the first pixel point. When any one of these 4-neighborhood pixel points has a gray value of 0, the first pixel point borders a blank area in at least one direction, so it can be regarded as an edge pixel point.
Of course, as another embodiment, the pixel values of the pixel points in the 8-neighborhood of the first pixel point may also be obtained as its neighborhood pixel values, so as to determine whether the first pixel point is an edge pixel point; this application is not specifically limited.
In this embodiment, the layer slice image may be binarized to obtain neighborhood pixels of each first pixel point in the layer slice image, and thus, whether the first pixel point is an edge pixel point is determined according to pixel values of the neighborhood pixels, so that detection of the edge pixel point in the layer slice image is more accurate.
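A minimal sketch of the 4-neighborhood check described above is given below; restricting the candidates to white (printed-area) pixels follows the boundary description in the preceding paragraphs and is an assumption of this sketch.

```python
import numpy as np

def find_edge_pixels(binary: np.ndarray) -> np.ndarray:
    """Mark as edge pixel points the white pixels with at least one black
    4-neighbor (up, down, left, right)."""
    h, w = binary.shape
    # Pad with black so border pixels are compared against a blank area.
    padded = np.pad(binary, 1, mode="constant", constant_values=0)
    up = padded[0:h, 1:w + 1]
    down = padded[2:h + 2, 1:w + 1]
    left = padded[1:h + 1, 0:w]
    right = padded[1:h + 1, 2:w + 2]
    has_black_neighbor = (up == 0) | (down == 0) | (left == 0) | (right == 0)
    return (binary == 255) & has_black_neighbor
```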
Further, the step of performing fuzzy processing on edge pixel points in the N slice images to obtain N first intermediate images includes:
acquiring fuzzy processing parameters, wherein the fuzzy processing parameters are determined according to first input, and the first input is input operation executed by a user aiming at the fuzzy processing parameters;
determining a sliding window participating in the fuzzy processing based on the fuzzy processing parameters, wherein the sliding window is used for determining a pixel point set participating in the fuzzy processing in the first layer slice image every time, and the first layer slice image is any one layer slice image in the N layer slice images;
sequentially acquiring target gray values corresponding to all edge pixel points in the first layer of slice images based on the sliding window;
and determining a first intermediate image corresponding to the first layer slice image according to the target gray values.
Specifically, the blurring processing parameter may be the number of layers used when blurring the edge pixel points in each slice image, and it may take any value such as 2, 3, 4, or 5. It should be noted that the blurring processing parameter may be preset, or may be determined based on a first input of a user; the present application is not specifically limited. When the blurring processing parameter is determined based on the first input of the user, the user can select the blurring processing parameter through the user interface or a physical button and perform the first input according to the selection, thereby allowing the blurring processing parameter to be set flexibly. The first input here includes, but is not limited to, sliding or clicking on the user interface, pressing a physical key, and the like.
After determining the blurring processing parameter, a sliding window participating in the blurring processing may be determined based on it. For example, when the blurring parameter is 2, the sliding window is a window of 2 pixels by 2 pixels; when the blurring parameter is 3, the sliding window is a window of 3 pixels by 3 pixels; when the blurring parameter is 4, the sliding window is a window of 4 pixels by 4 pixels, and so on. Therefore, when the sliding window slides in the first layer slice image, the set of pixel points participating in the blurring processing after each slide of the window can be determined. On this basis, blurring processing can be performed sequentially on each edge pixel point in the first layer slice image to obtain the target gray value corresponding to each edge pixel point, and thus the first intermediate image corresponding to the first layer slice image.
In this embodiment, the blurring processing parameters can be flexibly set, and a user can select appropriate blurring processing parameters according to actual conditions of the slice image and the printing device to perform blurring processing on the edge pixel points, so as to increase the flexibility of the blurring processing.
Further, the step of sequentially obtaining the target gray-scale values corresponding to the edge pixel points in the first layer slice image based on the sliding window may include the following steps:
sequentially sliding the sliding window in the first layer of slice image according to a preset step length, wherein the preset step length is used for representing the moving distance of the sliding window each time;
when the pixel point corresponding to the target position of the sliding window is the first edge pixel point, performing convolution calculation on a pixel point set corresponding to the current sliding window and a preset convolution kernel to obtain a convolution result, wherein the preset convolution kernel is a matrix of M x M, the value of M is determined according to the fuzzy processing parameter, M is an integer larger than 1, and the first edge pixel point is any edge pixel point in the first layer of slice image;
and determining the target gray value of the first edge pixel point according to the convolution result.
Specifically, the size of the convolution kernel is proportional to the blurring processing parameter. For example, when the blurring processing parameter is 2, the size of the convolution kernel is 2 × 2; when the blurring processing parameter is 3, the size of the convolution kernel is 3 × 3; when the blurring processing parameter is 4, the size of the convolution kernel is 4 × 4, and so on.
When the blurring processing is performed, the sliding window needs to scan the pixel points in the first layer slice image line by line according to a preset step length. When a first edge pixel point is located at the target position of the sliding window, convolution calculation can be performed between the pixel points in the sliding window and the convolution kernel to obtain the target gray value of that first edge pixel point. It should be noted that the preset step length may be any length, such as 1, 2, or 3 pixel lengths; in order to traverse all edge pixel points, a step length of 1 pixel may be selected. The target position refers to an arbitrary fixed position in the sliding window. For example, if the sliding window is a 2 pixel by 2 pixel window, the position corresponding to any one of [0,0], [0,1], [1,0] and [1,1] may be selected as the target position. To achieve a better blurring effect, the middle position of the sliding window may preferably be used as the target position; for example, when the sliding window is a 3 pixel by 3 pixel window, the position corresponding to the [1,1] pixel point may be selected as the target position. Meanwhile, in order to traverse every pixel point of the whole slice image, pixel points with a pixel value of 0 may be padded around the outer edge of the slice image.
After each slide of the window, it is judged whether the pixel point corresponding to the target position is an edge pixel point; if it is an edge pixel point, convolution calculation is performed, and if it is a non-edge pixel point, no convolution calculation is performed and the window slides directly to the next pixel point. This is repeated until all edge pixel points in the slice image have been traversed, at which point the sliding window stops. The first intermediate image is then obtained from the convolution results of the individual convolution calculations.
In this embodiment, each pixel point in the sliding window, that is, the first edge pixel point and other pixel points around the first edge pixel point, may participate in the convolution calculation, and the obtained convolution result includes the gray value characteristics of the first edge pixel point and its neighboring pixel points, so that the blurring processing effect of the first edge pixel point is better.
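The sliding-window step could be sketched as follows, assuming a mean kernel (the embodiment only requires some preset M × M kernel), the window center as the target position, a step of one pixel, and zero padding around the slice image as described above.

```python
import numpy as np

def blur_edges(slice_img: np.ndarray, edge_mask: np.ndarray, m: int = 3) -> np.ndarray:
    """Slide an m x m window over the slice image with a 1-pixel step and,
    only where the target (center) position is an edge pixel point, replace
    that pixel with the convolution of the window contents and the kernel.

    The mean kernel is an assumed choice of the preset M x M convolution kernel.
    """
    kernel = np.full((m, m), 1.0 / (m * m))
    pad = m // 2
    padded = np.pad(slice_img.astype(np.float64), pad, mode="constant", constant_values=0)
    result = slice_img.astype(np.float64)
    h, w = slice_img.shape
    for i in range(h):
        for j in range(w):
            if edge_mask[i, j]:                    # target position is an edge pixel point
                window = padded[i:i + m, j:j + m]  # pixel point set in the current window
                result[i, j] = np.sum(window * kernel)
    return np.clip(result, 0, 255).astype(np.uint8)
```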
Further, before the step 102 of performing the blurring processing on the edge pixel points in the N slice images to obtain N first intermediate images, the method further includes:
performing anti-aliasing treatment on the N layer slice images;
the step 102 of performing a blurring process on edge pixel points in the N slice images to obtain N first intermediate images includes:
and carrying out fuzzy processing on edge pixel points in the N layers of sliced images after anti-aliasing processing to obtain N first intermediate images.
Specifically, anti-aliasing methods include but are not limited to: Super-Sampling Anti-Aliasing (SSAA), Multi-Sampling Anti-Aliasing (MSAA), Coverage Sampling Anti-Aliasing (CSAA), Custom Filter Anti-Aliasing (CFAA), Fast Approximate Anti-Aliasing (FXAA), and the like.
When the anti-aliasing process is performed on the N slice images, the image anti-aliasing process may be performed according to a preset anti-aliasing level parameter, so that the edges of the slice images are relatively smooth. The anti-aliasing level here may be preset or determined based on the input operation of the user, and the present application is not particularly limited. After the anti-aliasing processing is performed on the N layer slice images, the blurring processing may be performed based on edge pixel points in the N layer slice images after the anti-aliasing processing, and a specific blurring processing process is described in detail in the above embodiments and is not described herein again.
In this embodiment, the anti-aliasing processing may be performed on the layer slice image to make the edge of the layer slice image tend to be smooth, and then the blurring processing may be performed on the basis, so as to achieve a better processing effect.
Further, the step 103 of performing contrast stretching processing on the N first intermediate images to obtain N second intermediate images includes:
acquiring a target gray scale level, wherein the target gray scale level is determined according to a second input, and the second input is an input operation executed by a user aiming at the gray scale level;
determining a gray value corresponding to the target gray scale level based on a preset mapping relation, wherein the preset mapping relation is used for indicating the mapping relation between the gray scale level and the gray value;
and adjusting the gray value of each pixel point in the N first intermediate images according to the gray value corresponding to the target gray level to obtain N second intermediate images.
Specifically, the target gray scale level may be understood as a scale parameter for adjusting the gray scale value of the first intermediate image, and the target gray scale level may be any value such as 2, 3, 4, or 5. It should be noted that the target gray scale level may be preset, or may be determined based on a second input of the user, and the application is not limited specifically. When the target gray scale level is determined based on the second input of the user, the user may select the target gray scale level based on the user interface or the physical button and input the second input operation according to the selection result, thereby implementing flexible setting of the target gray scale level. The second input includes, but is not limited to, sliding, clicking, etc. the user interface, or pressing a physical key, etc.
The preset mapping relationship is used for indicating the mapping relationship between different gray scale levels and different gray scale values. For example, assuming that 16 gray levels are required to represent gray levels in the range of 0 to 255, the preset mapping relationship can be as shown in the following table:
Gray scale level:  0   1   2   3   4   5   6   7   8   9  10  11  12  13  14  15
Gray value:       15  31  47  63  79  95 111 127 143 159 175 191 207 223 239 255
Table 1
In this way, the contrast stretching processing can be performed on the N first intermediate images based on the target gray scale level and the preset mapping relation, so as to obtain N second intermediate images. For example, assuming that the target gray scale level selected by the user is 2, 47 may be added to the gray scale value of each pixel point on each first intermediate image to obtain N second intermediate images. In the process, the gray value of the pixel point is actually adjusted, so that the curing degree of the corresponding pixel point can be changed according to the adjustment result of the gray value during printing.
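Under the mapping in Table 1, one possible reading of this example is sketched below. Applying the same offset to every pixel point follows the example above, though the embodiment also allows different pixel points to be adjusted by different amounts; in practice an implementation might restrict the adjustment to non-background pixels.

```python
import numpy as np

# Assumed 16-level mapping from Table 1: level k maps to gray value 16*(k+1) - 1.
GRAY_LEVEL_MAP = {level: 16 * (level + 1) - 1 for level in range(16)}  # 0 -> 15, ..., 2 -> 47, ..., 15 -> 255

def contrast_stretch(first_intermediate: np.ndarray, target_level: int) -> np.ndarray:
    """Raise every pixel's gray value by the value mapped to the target gray
    scale level (level 2 -> +47, as in the example above), clipping at 255."""
    offset = GRAY_LEVEL_MAP[target_level]
    stretched = first_intermediate.astype(np.int32) + offset
    return np.clip(stretched, 0, 255).astype(np.uint8)
```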
It should be noted that, owing to differences between printing apparatuses, the gray-scale distribution energy curves of different printing apparatuses often differ from one another; that is, the range of gray values over which each printing apparatus cures resin is different. Therefore, a user needs to be able to set different target gray scale levels based on the different gray-scale distribution energy curves of each printing apparatus, so that each printing apparatus can achieve a better curing effect and the occurrence of pixel striations is reduced.
In this embodiment, the target gray scale level can be flexibly set by the user, and the user can adjust the target gray scale level according to the current curing effect of the printing device, so as to achieve the optimal printing state of the printing device.
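Combining the sketches above, one possible end-to-end flow over the N slice images (steps 101 to 103, without the optional anti-aliasing step) might look like the following; the default blur parameter and target gray scale level are assumptions.

```python
def process_slices(slice_imgs, blur_param=3, target_level=2):
    """Sketch of steps 101-103 using the helper sketches defined above."""
    second_intermediate_images = []
    for img in slice_imgs:                                  # step 101: the N slice images
        binary = binarize(img)                              # binarization of the slice image
        edges = find_edge_pixels(binary)                    # edge pixel points of the binarized image
        first = blur_edges(img, edges, m=blur_param)        # step 102: blur the edge pixel points
        second = contrast_stretch(first, target_level)      # step 103: contrast stretching
        second_intermediate_images.append(second)
    return second_intermediate_images
```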
Referring to fig. 3, fig. 3 is a flowchart of a printing method according to an embodiment of the present invention. As shown in fig. 3, the printing method includes:
Step 301, acquiring an image to be printed, and printing a model according to the image to be printed; the image to be printed is the N second intermediate images generated by the image processing method described above.
In this embodiment, the printing method is performed by a printing apparatus, and the image processing method is performed by an electronic apparatus independent of the printing apparatus. The electronic device may perform the above steps 101 to 103 to generate N second intermediate images. For a specific implementation process, reference may be made to the embodiment shown in fig. 1, which is not described herein again.
After the electronic device generates the N second intermediate images, the user can manually copy the file containing the N second intermediate images onto the printing device, and also can transmit the file containing the N second intermediate images from the electronic device to the printing device in a wired manner or a wireless manner, so that the printing device can acquire the image to be printed and print the model according to the image to be printed.
In this embodiment, the image processing process and the model printing process are separately realized by different devices, so that the processing pressure of the printing device can be reduced, and meanwhile, the problem that the printing device cannot print or has low printing efficiency in the image processing process can be solved.
In addition, the application also provides a printing system. Referring to fig. 4, fig. 4 is a schematic structural diagram of a printing system according to an embodiment of the present application. As shown in fig. 4, the printing system 400 includes an image processing apparatus 401 and a printing device 402;
an image processing apparatus 401 for executing the above-described image processing method;
the printing device 402 uses the N second intermediate images output by the image processing apparatus 401, and obtains the model to be printed according to the N second intermediate images.
In this printing system, the image processing method described above is executed by the image processing apparatus 401, which may be any electronic device independent of the printing device 402. The user may manually copy the file containing the N second intermediate images to the printing device 402, or the file may be transferred from the electronic device to the printing device 402 in a wired or wireless manner. In this way, the printing device 402 can acquire the N second intermediate images and sequentially expose, cure, and mold them to obtain the model to be printed.
Specifically, the image processing apparatus 401 includes:
the acquisition module is used for acquiring N slice images of the model to be printed, wherein N is a positive integer;
the first processing module is used for carrying out fuzzy processing on edge pixel points in the N layer slice images to obtain N first intermediate images;
and the second processing module is used for carrying out contrast stretching processing on the N first intermediate images to obtain N second intermediate images, wherein the model to be printed is generated based on the N second intermediate images.
Optionally, the first processing module comprises:
the first sub-processing module is used for carrying out binarization processing on the N slice images to obtain N binarization images;
the first obtaining submodule is used for obtaining edge pixel points corresponding to each binary image in the N binary images;
and the second sub-processing module is used for carrying out fuzzy processing on edge pixel points in the N layer slice images to obtain N first intermediate images.
Optionally, the first obtaining sub-module includes:
the first obtaining unit is used for obtaining a neighborhood pixel value of a first pixel point in the binary image to be judged, the binary image to be judged is any one of the N binary images, the first pixel point is any one pixel point in the binary image to be judged, and the neighborhood pixel value is the pixel value of each pixel point adjacent to the first pixel point;
the first determining unit is used for determining a first pixel point as an edge pixel point corresponding to the binary image to be judged under the condition that the neighborhood pixel value comprises at least one first pixel value, and the first pixel value is used for representing a pixel value corresponding to black.
Optionally, the second sub-processing module includes:
the second acquisition unit is used for acquiring fuzzy processing parameters, wherein the fuzzy processing parameters are determined according to first input, and the first input is input operation executed by a user aiming at the fuzzy processing parameters;
the second determining unit is used for determining a sliding window participating in the fuzzy processing based on the fuzzy processing parameter, the sliding window is used for determining a pixel point set participating in the fuzzy processing in the first layer slice image every time, and the first layer slice image is any one layer slice image in the N layer slice images;
the third acquisition unit is used for sequentially acquiring target gray values corresponding to all edge pixel points in the first layer of slice images based on the sliding window;
and the third determining unit is used for determining a first intermediate image corresponding to the first layer slice image according to the target gray value.
Optionally, the third obtaining unit is specifically configured to:
sequentially sliding the sliding window in the first layer of slice image according to a preset step length, wherein the preset step length is used for representing the moving distance of the sliding window each time;
when the pixel point corresponding to the target position of the sliding window is the first edge pixel point, performing convolution calculation on a pixel point set corresponding to the current sliding window and a preset convolution kernel to obtain a convolution result, wherein the preset convolution kernel is a matrix of M x M, the value of M is determined according to the fuzzy processing parameter, M is an integer larger than 1, and the first edge pixel point is any edge pixel point in the first layer of slice image;
and determining the target gray value of the first edge pixel point according to the convolution result.
Optionally, the image processing apparatus 401 further includes:
the third processing module is used for carrying out anti-aliasing processing on the N layer slice images;
the first processing module is further configured to perform fuzzy processing on edge pixel points in the N layer sliced images after the antialiasing processing to obtain N first intermediate images.
Optionally, the second processing module comprises:
the second acquisition submodule is used for acquiring a target gray scale level, the target gray scale level is determined according to second input, and the second input is input operation executed by a user aiming at the gray scale level;
the determining submodule is used for determining a gray value corresponding to the target gray scale level based on a preset mapping relation, and the preset mapping relation is used for indicating the mapping relation between the gray scale level and the gray value;
and the adjusting submodule is used for adjusting the gray value of each pixel point in the N first intermediate images according to the gray value corresponding to the target gray scale level, to obtain N second intermediate images.
The image processing apparatus 401 can implement each process of the method embodiment in fig. 1 in the embodiment of the present invention, and achieve the same beneficial effects, and for avoiding repetition, details are not described here again.
Specifically, the printing apparatus 402 includes:
the printing module is used for acquiring an image to be printed and printing the model according to the image to be printed; the image to be printed is N second intermediate images generated by the image processing method.
Specifically, the printing device 402 can implement the processes of the embodiment of the method in fig. 3 in the embodiment of the present invention, and achieve the same beneficial effects, and for avoiding repetition, the details are not described here again.
Referring to fig. 5, fig. 5 is a schematic structural diagram of a printing apparatus according to an embodiment of the present application. As shown in fig. 5, the printing apparatus 500 includes a processor 501, a memory 502, and a program or instructions stored on the memory 502 and executable on the processor 501, which when executed by the processor 501 implement the steps of the image processing method described above. The printing apparatus 500 can implement the processes of the embodiment of the method in fig. 1 in the embodiment of the present application, and achieve the same beneficial effects, and for avoiding repetition, the details are not described here again.
The embodiment of the present invention further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (11)

1. An image processing method, characterized in that the method comprises:
acquiring N slice images of a model to be printed, wherein N is a positive integer;
blurring edge pixel points in the N slice images to obtain N first intermediate images;
and performing contrast stretching processing on the N first intermediate images to obtain N second intermediate images, wherein the model to be printed is generated based on the N second intermediate images.
2. The image processing method according to claim 1, wherein the blurring processing of the edge pixel points in the N slice images to obtain N first intermediate images comprises:
carrying out binarization processing on the N slice images to obtain N binarized images;
acquiring edge pixel points corresponding to each binary image in the N binary images;
and carrying out fuzzy processing on the edge pixel points in the N layer slice images to obtain N first intermediate images.
3. The image processing method according to claim 2, wherein said obtaining edge pixel points corresponding to each binarized image in the N binarized images comprises:
acquiring a neighborhood pixel value of a first pixel point in a binary image to be judged, wherein the binary image to be judged is any one of the N binary images, the first pixel point is any one of the pixel points in the binary image to be judged, and the neighborhood pixel value is the pixel value of each pixel point adjacent to the first pixel point;
and under the condition that the neighborhood pixel values comprise at least one first pixel value, determining the first pixel point as an edge pixel point corresponding to the binary image to be judged, wherein the first pixel value is used for representing a pixel value corresponding to black.
4. The image processing method according to claim 2, wherein the blurring the edge pixel points in the N slice images to obtain N first intermediate images includes:
acquiring a fuzzy processing parameter, wherein the fuzzy processing parameter is determined according to a first input, and the first input is an input operation executed by a user aiming at the fuzzy processing parameter;
determining a sliding window participating in the blurring processing based on the blurring processing parameter, wherein the sliding window is used for determining a pixel point set participating in the blurring processing in a first layer slice image every time, and the first layer slice image is any one layer slice image in the N layer slice images;
sequentially acquiring target gray values corresponding to all edge pixel points in the first layer of slice images based on the sliding window;
and determining a first intermediate image corresponding to the first layer of slice image according to the target gray value.
5. The image processing method according to claim 4, wherein the sequentially obtaining target gray-scale values corresponding to edge pixel points in the first layer slice image based on the sliding window comprises:
sequentially sliding the sliding window in the first layer slice image according to a preset step length, wherein the preset step length is used for representing the distance of each movement of the sliding window;
when the pixel point corresponding to the target position of the sliding window is a first edge pixel point, performing convolution calculation on a pixel point set corresponding to the current sliding window and a preset convolution kernel to obtain a convolution result, wherein the preset convolution kernel is a matrix of M x M, the value of M is determined according to the fuzzy processing parameter, M is an integer greater than 1, and the first edge pixel point is any edge pixel point in the first layer of slice image;
and determining the target gray value of the first edge pixel point according to the convolution result.
6. The image processing method according to claim 1, wherein before the blurring processing is performed on the edge pixel points in the N slice images to obtain N first intermediate images, the method further comprises:
performing anti-aliasing processing on the N slice images;
the blurring processing is performed on the edge pixel points in the N slice images to obtain N first intermediate images, including:
and carrying out fuzzy processing on edge pixel points in the N layers of sliced images after anti-aliasing processing to obtain N first intermediate images.
7. The image processing method according to claim 1, wherein said performing contrast stretching processing on the N first intermediate images to obtain N second intermediate images comprises:
acquiring a target gray scale level, wherein the target gray scale level is determined according to a second input, and the second input is an input operation executed by a user aiming at the gray scale level;
determining a gray value corresponding to the target gray scale level based on a preset mapping relation, wherein the preset mapping relation is used for indicating the mapping relation between the gray scale level and the gray value;
and adjusting the gray value of each pixel point in the N first intermediate images according to the gray value corresponding to the target gray level to obtain N second intermediate images.
8. A method of printing, the method comprising:
acquiring an image to be printed, and printing a model according to the image to be printed; wherein the image to be printed is N second intermediate images generated by the image processing method according to any one of claims 1 to 7.
9. A printing system, comprising: an image processing apparatus and a printing device;
the image processing apparatus for performing the image processing method according to any one of claims 1 to 7;
and the printing equipment uses the N second intermediate images output by the image processing device and obtains a model to be printed according to the N second intermediate images.
10. A printing apparatus comprising a processor, a memory and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the image processing method as claimed in any one of claims 1 to 7.
11. A readable storage medium, characterized in that it stores thereon a program or instructions which, when executed by a processor, implement the steps of the image processing method according to any one of claims 1 to 7.
CN202110734089.4A 2021-06-30 2021-06-30 Image processing method, printing-related apparatus, and readable storage medium Pending CN113409222A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110734089.4A CN113409222A (en) 2021-06-30 2021-06-30 Image processing method, printing-related apparatus, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110734089.4A CN113409222A (en) 2021-06-30 2021-06-30 Image processing method, printing-related apparatus, and readable storage medium

Publications (1)

Publication Number Publication Date
CN113409222A true CN113409222A (en) 2021-09-17

Family

ID=77680426

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110734089.4A Pending CN113409222A (en) 2021-06-30 2021-06-30 Image processing method, printing-related apparatus, and readable storage medium

Country Status (1)

Country Link
CN (1) CN113409222A (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101244629A (en) * 2006-08-29 2008-08-20 3D系统公司 Method and device for solid objective formation layer by layer
GB201208991D0 (en) * 2012-05-22 2012-07-04 Mcor Technologies Ltd Colour 3-Dimensional printing
WO2016183985A1 (en) * 2015-05-15 2016-11-24 京东方科技集团股份有限公司 3d printing device and imaging system thereof
CN106903877A (en) * 2017-02-18 2017-06-30 无锡金谷三维科技有限公司 A kind of photocuring 3D printing antialiasing optimization method of the LCD based on RGB arrangements
CN107175329A (en) * 2017-04-14 2017-09-19 华南理工大学 A kind of 3D printing successively detects reverse part model and positioning defect apparatus and method
CN109242949A (en) * 2017-07-11 2019-01-18 周武增 A kind of intelligence 3D printing system and method
CN107972266A (en) * 2017-12-15 2018-05-01 博纳云智(天津)科技有限公司 A kind of high accuracy smooth Method of printing of DLP photocurings 3D printer
CN111583157A (en) * 2020-05-13 2020-08-25 杭州睿琪软件有限公司 Image processing method, system and computer readable storage medium
CN111941846A (en) * 2020-08-06 2020-11-17 深圳市纵维立方科技有限公司 Light equalizing method and device for LCD photocuring 3D printer
CN111993666A (en) * 2020-08-14 2020-11-27 广州谦辉信息科技有限公司 Photocuring 3D printing control system with high cost performance
CN112743851A (en) * 2020-12-28 2021-05-04 深圳市创想三维科技有限公司 Photocuring 3D printing method, 3D printer, computer device and medium
CN112650457A (en) * 2020-12-29 2021-04-13 深圳市创想三维科技有限公司 File processing method, device and equipment applied to 3D printing and storage medium

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
张永弟; 岳彦芳; 杨光; 申振华: "Method for improving the reconstruction accuracy of hand bone models from CT images", Journal of Computer-Aided Design & Computer Graphics, no. 10, 15 October 2017 (2017-10-15) *
徐亮; 杨洋洋; 郝鹏; 吴晓南; 陈东华; 王鹏举: "Three-dimensional reconstruction of a complete pelvis model from CT images and 3D printing", Precision Manufacturing & Automation, no. 04, 25 November 2016 (2016-11-25) *
李长春; 杨云; 王崴; 刘晓卫: "Research on the fabrication process of RP creative reliefs based on reverse engineering", Plastics Industry, no. 12, 20 December 2015 (2015-12-20) *
王蕾; 张毅; 严红燕; 王学建; 堵俊: "Study on repairing rabbit skull defects with titanium alloy prostheses fabricated by 3D printing", Anhui Medical and Pharmaceutical Journal, no. 11, 19 October 2018 (2018-10-19) *
魏天骄; 廖国婷; 王柳力; 郭欣; 冯剑桥: "Preliminary study on the marginal fit of crowns fabricated with 3D-printed esthetic templates", Stomatology, no. 03, 20 March 2018 (2018-03-20) *

Similar Documents

Publication Publication Date Title
Cao et al. Contrast enhancement of brightness-distorted images by improved adaptive gamma correction
CN111583157B (en) Image processing method, system and computer readable storage medium
CN107403421B (en) Image defogging method, storage medium and terminal equipment
CN110766639B (en) Image enhancement method and device, mobile equipment and computer readable storage medium
US11188777B2 (en) Image processing method, image processing apparatus, learnt model manufacturing method, and image processing system
CN102473293B (en) Image processing apparatus and image processing method
CN104517110A (en) Binarization method and system of two-dimensional code image
CN111489322B (en) Method and device for adding sky filter to static picture
CN109214996B (en) Image processing method and device
CN110335221B (en) Multi-exposure image fusion method based on unsupervised learning
CN111383181B (en) Image enhancement method and device, storage medium and terminal
CN108074220A (en) A kind of processing method of image, device and television set
US7826678B2 (en) Adaptive image sharpening method
CN110363837B (en) Method and device for processing texture image in game, electronic equipment and storage medium
CN114549383A (en) Image enhancement method, device, equipment and medium based on deep learning
Zheng et al. Windowing decomposition convolutional neural network for image enhancement
CN113409222A (en) Image processing method, printing-related apparatus, and readable storage medium
CN113989127A (en) Image contrast adjusting method, system, equipment and computer storage medium
CN103559690A (en) Method for achieving smoothness of image edges
CN109509237B (en) Filter processing method and device and electronic equipment
CN113421203A (en) Image processing method, printing-related apparatus, and readable storage medium
CN113822809B (en) Dim light enhancement method and system thereof
CN113822937B (en) Image correction method, device, equipment and storage medium
CN113591832A (en) Training method of image processing model, document image processing method and device
CN110009082B (en) Three-dimensional code optimization method, medium, computer device and apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination