CN113172986B - Screen layout generation method, production system, image processing method and device

Screen layout generation method, production system, image processing method and device

Info

Publication number: CN113172986B (application CN202110327794.2A)
Authority: CN (China)
Prior art keywords: image, color, layer, trapping, screen
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN113172986A
Inventors: 任文婷, 孙凯, 杨晓刚, 龚文
Current Assignee: Alibaba China Co Ltd
Original Assignee: Alibaba China Co Ltd
Events: application filed by Alibaba China Co Ltd; priority to CN202110327794.2A; publication of CN113172986A; application granted; publication of CN113172986B

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B41: PRINTING; LINING MACHINES; TYPEWRITERS; STAMPS
    • B41C: PROCESSES FOR THE MANUFACTURE OR REPRODUCTION OF PRINTING SURFACES
    • B41C1/00: Forme preparation
    • B41C1/14: Forme preparation for stencil-printing or silk-screen printing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00: 2D [Two Dimensional] image generation
    • G06T11/40: Filling a planar surface by adding surface attributes, e.g. colour or texture

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Manufacture Or Reproduction Of Printing Formes (AREA)
  • Facsimile Image Signal Circuits (AREA)
  • Color Image Communication Systems (AREA)

Abstract

The application discloses a screen layout generation method, a production system, an image processing method, and a device. The screen layout generation method comprises: acquiring an image and production parameters; performing color separation processing on the image to obtain at least two color layers, where each color layer contains part of the pixel points in the image; performing trapping processing on part of the at least two color layers to add traps on the processed color layers; and generating a screen layout corresponding to each color layer according to the production parameters, the part of the at least two color layers subjected to the trapping processing, and the other part not subjected to the trapping processing. The technical solution provided by the embodiments of the application removes the need, present in the related art, to realize image color layering by manual tracing and matting with design software, simplifies the screen layout generation process, and thereby helps improve the efficiency of screen printing plate generation.

Description

Screen layout generation method, production system, image processing method and device
Technical Field
The present application relates to the field of computer technologies, and in particular, to a screen layout generation method, a production system, an image processing method, and an image processing apparatus.
Background
In the garment printing production stage, when screen printing is used, the colors in a pattern need to be printed in separate layers, and a print designer has to spend considerable time completing the screen layout drawings, including expanding or shrinking the edges of the priming (bottom) layer, separating the colors into layers, and trapping the positions where colors overlap.
For most garment factories, a print order comes with a print image provided by the merchant, and color layering can only be achieved by the print designer manually tracing and matting the image with design software. Moreover, during actual printing the traps may need to be modified again; the whole process is manual and time-consuming.
Disclosure of Invention
In view of the above, the present application provides a screen layout generation method, a production system, and an electronic device that solve, or at least partially solve, the above problems.
To this end, in one embodiment of the present application, a screen layout generation method is provided. The method comprises the following steps:
acquiring an image and production parameters;
performing color separation processing on the image to obtain at least two color layers, where each color layer contains part of the pixel points in the image;
performing trapping processing on part of the at least two color layers to add traps on the processed color layers;
and generating a screen layout corresponding to each color layer according to the production parameters, the part of the at least two color layers subjected to the trapping processing, and the other part of the at least two color layers not subjected to the trapping processing.
In another embodiment of the present application, a production system is provided. The system comprises:
a client device, configured to acquire an image and production parameters; perform color separation processing on the image to obtain at least two color layers, where each color layer contains part of the pixel points in the image; perform trapping processing on part of the at least two color layers to add traps on the processed color layers; and generate a screen layout corresponding to each color layer according to the production parameters, the part of the at least two color layers subjected to the trapping processing, and the other part not subjected to the trapping processing;
and a plate making device, communicatively connected to the client device and configured to receive the screen layout corresponding to each color layer sent by the client device and to process different silk screens according to the received screen layouts to obtain a plurality of screen printing plates for printing.
In yet another embodiment of the present application, another screen layout generation method is provided. The method comprises the following steps:
acquiring an image;
performing layer division on the image to obtain a plurality of layers;
performing trapping processing on part of the plurality of layers to add traps on the processed layers;
and generating a screen layout corresponding to each layer according to the part of the plurality of layers subjected to the trapping processing and the other part of the plurality of layers not subjected to the trapping processing.
In yet another embodiment of the present application, another production system is provided. The production system includes:
a client device, configured to acquire an image; perform layer division on the image to obtain a plurality of layers; perform trapping processing on part of the layers to add traps on the processed layers; and generate a screen layout corresponding to each layer according to the part of the plurality of layers subjected to the trapping processing and the other part not subjected to the trapping processing;
and a plate making device, communicatively connected to the client device and configured to receive the screen layout corresponding to each layer sent by the client device and to process different silk screens according to the received screen layouts to obtain a plurality of screen printing plates for printing.
In yet another embodiment of the present application, an image processing method is provided. The image processing method includes:
acquiring an image input by a user in response to an input operation triggered by the user through an interactive interface;
performing layer division on the image to obtain a plurality of layers;
performing trapping processing on part of the plurality of layers to add traps on the processed layers;
generating a screen layout corresponding to each layer according to the part of the plurality of layers subjected to the trapping processing and the other part not subjected to the trapping processing;
and displaying the screen layout corresponding to each layer.
In yet another embodiment of the present application, an electronic device is provided. The electronic device includes a memory and a processor, wherein:
the memory is configured to store a program;
the processor, coupled to the memory, is configured to execute the program stored in the memory to:
acquire an image and production parameters;
perform color separation processing on the image to obtain at least two color layers, where each color layer contains part of the pixel points in the image;
perform trapping processing on part of the at least two color layers to add traps on the processed color layers;
and generate a screen layout corresponding to each color layer according to the production parameters, the part of the at least two color layers subjected to the trapping processing, and the other part not subjected to the trapping processing.
In yet another embodiment of the present application, an electronic device is provided. The electronic device includes a memory and a processor, wherein:
the memory is configured to store a program;
the processor, coupled to the memory, is configured to execute the program stored in the memory to:
acquire an image;
perform layer division on the image to obtain a plurality of layers;
perform trapping processing on part of the layers to add traps on the processed layers;
and generate a screen layout corresponding to each layer according to the part of the plurality of layers subjected to the trapping processing and the other part not subjected to the trapping processing.
In yet another embodiment of the present application, an electronic device is provided. The electronic device includes a memory, a processor, and a display, wherein:
the memory is configured to store a program;
the processor, coupled to the memory, is configured to execute the program stored in the memory to:
acquire an image input by a user in response to an input operation triggered by the user through an interactive interface;
perform layer division on the image to obtain a plurality of layers;
perform trapping processing on part of the layers to add traps on the processed layers;
generate a screen layout corresponding to each layer according to the part of the plurality of layers subjected to the trapping processing and the other part not subjected to the trapping processing;
and control the display to display the screen layout corresponding to each layer.
In yet another embodiment of the present application, a plate making apparatus is provided. The plate making apparatus includes a memory, a processor, and an execution component, wherein:
the memory is configured to store a program;
the processor, coupled to the memory, is configured to execute the program stored in the memory to:
acquire an image;
perform layer division on the image to obtain a plurality of layers;
perform trapping processing on part of the plurality of layers to add traps on the processed layers;
generate a screen layout corresponding to each layer according to the part of the plurality of layers subjected to the trapping processing and the other part not subjected to the trapping processing;
and control the action of the execution component based on the screen layout corresponding to each layer, so as to process different silk screens respectively and obtain a plurality of screen printing plates for printing.
According to the technical solution provided by the embodiments of the application, color separation processing is performed on the image to obtain layers corresponding to different colors, which removes the need, present in the related art, to realize image color layering by manual tracing and matting with design software, and improves the efficiency of layer generation. In addition, the embodiments automatically perform trapping processing on part of the at least two color layers, with the traps superimposed directly in the processed color layers. Furthermore, the screen layout corresponding to each color layer that meets the production requirements can be generated automatically according to the production parameters, without manual adjustment or setting; the operation is simpler, the screen layout generation process is simplified, and the efficiency of screen printing plate generation is improved.
According to another technical solution provided by the embodiments of the application, layer division is performed on the image to obtain a plurality of layers, without manual layer splitting; trapping processing is then performed on part of the plurality of layers to add traps on the processed layers; and a screen layout corresponding to each layer is then generated. The whole process involves little manual participation, and the screen layout generation process is simplified.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a schematic flow chart of a halftone map generation method according to an embodiment of the present application;
fig. 2 is a schematic diagram of two color layers obtained by color separation of an image according to an embodiment of the present application, where one of the color layers is subjected to a trapping process;
FIG. 3 is a bottom layer of an image according to an embodiment of the present application;
fig. 4 is a schematic flow chart of a screen map generating method according to another embodiment of the present application;
FIG. 5a is a schematic view of an image provided in an embodiment of the present application;
FIG. 5b is a schematic diagram of a plurality of color layers obtained by color separation of the image shown in FIG. 5a and a priming layer obtained based on the outline of the image;
fig. 5c is a schematic diagram of a mesh map corresponding to each layer generated based on each layer shown in fig. 5 b;
fig. 6 is a block diagram of a production system according to an embodiment of the present application;
fig. 7 is a schematic flowchart of a screen map generating method according to another embodiment of the present application;
fig. 8a is a schematic flowchart of a halftone map generating method according to another embodiment of the present application;
FIG. 8b is a diagram illustrating a user interface display after the method of FIG. 8a is employed;
fig. 9 is a block diagram of a halftone map generating apparatus according to an embodiment of the present application;
fig. 10 is a block diagram of a halftone image generation apparatus according to another embodiment of the present application;
fig. 11 is a block diagram illustrating a screen image generating apparatus according to another embodiment of the present application;
fig. 12 is a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
At present, the screen layout manufacturing in the printing industry mainly relates to the garment printing industry and the product outer package printing industry.
In the prior art, the software used to make screen layouts in the garment printing industry mainly has the following problems. One type of software, such as color separation software, needs to extract each color separately and then split it into layers, and has high requirements on pixel resolution; moreover, the traps on each layer must be drawn manually. Another type of software, such as CorelDRaw, requires a new layer to be created after manual matting; although it has an automatic color separation function, the traps on each layer again have to be drawn manually.
The screen layout software commonly used in the product outer-packaging printing industry can separate colors directly on vector graphics, but for bitmaps it can only split the colors into the four primaries C, M, Y, and K. It cannot automatically generate a priming (under-base) layer at an arbitrary scale; to generate one, the edge of the base image has to be obtained from the pattern and then drawn outward (edge expansion) or erased inward (edge shrinking) according to the actual scale. In addition, such software can generate all the traps for the whole image at once by setting trap widths at the different color overlaps, but all the generated traps are stored in a single layer, and the final screen layouts can only be obtained after further color separation and layout drawing. The screen layouts for outer-packaging printing differ essentially from those for garment printing, so the layouts required for garment production cannot be obtained with one click when such software is used to make them.
The present application provides the following embodiments to solve or partially solve the problems of the above-described aspects. In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
In some of the flows described in the specification, claims, and above-described figures of the present application, a number of operations are included that occur in a particular order, which operations may be performed out of order or in parallel as they occur herein. The sequence numbers of the operations, e.g., 101, 102, etc., are merely used to distinguish between the various operations, and the sequence numbers themselves do not represent any order of execution. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel. It should be noted that, the descriptions of "first", "second", etc. in this document are used for distinguishing different messages, devices, modules, etc., and do not represent a sequential order, nor do they limit the types of "first" and "second". In addition, the embodiments described below are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments obtained by a person skilled in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
Fig. 1 illustrates a flow chart of a screen map generation method according to an embodiment of the present application. The method provided by the embodiment of the application is suitable for a client side or plate making equipment. The client may be hardware integrated on the terminal device and having an embedded program, may also be application software installed in the terminal, and may also be tool software embedded in an operating system of the terminal device, which is not limited in this embodiment of the present application. The terminal equipment can comprise any terminal equipment such as a mobile phone, a tablet personal computer, wearable equipment and AR equipment. The plate-making equipment has the function of automatically generating a screen layout and can produce a corresponding screen plate according to a self-generated screen layout. As shown in fig. 1, the method includes:
101. acquiring an image and production parameters;
102. carrying out color separation processing on an image to obtain at least two color layers, wherein one layer contains partial pixel points in the image;
103. performing trapping processing on part of the at least two color layers to add trapping on the processed color layer;
104. and generating a screen layout corresponding to each color layer according to the production parameters, the part of the at least two color layers subjected to the trapping processing, and the other part of the at least two color layers not subjected to the trapping processing.
The technical solution provided by this embodiment can be used in various printing industries, such as the garment printing industry. When the method is applied in the garment printing industry, the production parameters may include, but are not limited to, at least one of a reduction or enlargement size, screen naming information, and the like. In an actual printing process, for example when the substrate is a garment fabric, after the corresponding pattern is printed on the fabric using the screen printing plates produced from the screen layouts, the fabric needs to be dried by high-temperature heating; because of the fabric's own material, fabrics deform inconsistently due to differences in shrinkage after heating, and the screen layout then needs to be adjusted to the actual situation. Therefore, when this embodiment is implemented, the production parameters may be set in advance, for example after the staff have learned the attributes of the printing substrate, or they may be generated automatically by the execution subject of this embodiment based on the category to which the printing substrate belongs, and so on, which is not specifically limited here.
In an implementation solution, the step 101 "acquiring an image and a production parameter" may include:
1011. displaying an interactive interface;
1012. and responding to an input operation triggered by a user through the interactive interface, and acquiring the image and the production parameters input by the user.
In the step 102, the pixels belonging to the same color classification in the image may be clustered based on a clustering algorithm to obtain a color layer corresponding to the color classification. That is, the step 102 "color-separating the image to obtain at least two color layers" may include the following steps:
1021. acquiring pixel information of pixel points in the image;
1022. and clustering the pixels belonging to the same color classification according to the pixel information of the pixels in the image to obtain the at least two color layers.
For example, in specific implementation, the user may input color number information (e.g., lab color number) corresponding to all colors contained in the image through the above-mentioned interactive interface. That is, the step 1012 may specifically be: and responding to an input operation triggered by a user through the interactive interface, and acquiring the image input by the user, color number information corresponding to at least two colors contained in the image and the production parameters.
A Lab color number is the Lab value corresponding to a certain color. The Lab color model is based on human perception of color, and Lab values describe all the colors that a person with normal vision can see. The Lab color model consists of three components: lightness (L) and the two color-related components a and b, where L represents lightness, a represents the red-green axis, and b represents the blue-yellow axis. In addition, there is a one-to-one correspondence between Lab color numbers and the PANTONE color numbers used for printing; the PANTONE color card is an internationally used standard color card, commonly known as Pantone.
Therefore, in a specific implementation, the step 1022 may be implemented by the following steps:
acquiring color number information corresponding to at least two colors contained in the image;
and based on the color number information corresponding to the at least two colors and the pixel information of the pixel points in the image, performing clustering color separation on the pixel points in the image by using a Lab color space similarity algorithm to obtain at least two color layers.
In a specific implementation, the formula corresponding to the Lab color space similarity algorithm is the color difference
ΔE = √((ΔL)² + (Δa)² + (Δb)²)
where ΔE represents the color difference, and ΔL, Δa, and Δb represent the differences between the two colors in the respective components. The color classification of each pixel can be determined by calculating the ΔE value between the pixel and the color number information of each color input by the user. In this embodiment, the pixel information of each pixel of the image includes the Lab value of that pixel; denote the value of pixel 1 as Lab_11. Suppose the image contains 3 colors whose color number information is Lab_01, Lab_02, and Lab_03. The values ΔE_1, ΔE_2, and ΔE_3 of Lab_11 against Lab_01, Lab_02, and Lab_03 respectively can be calculated, and the color number with the smallest of ΔE_1, ΔE_2, and ΔE_3 is taken as the color classification to which pixel 1 belongs.
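As an illustration only, the following is a minimal sketch of this clustering color separation, assuming an RGB input image and user-supplied Lab color numbers; skimage is assumed to be available for the RGB-to-Lab conversion, and the function and variable names are not taken from the patent.

```python
import numpy as np
from skimage.color import rgb2lab  # assumed available for RGB -> Lab conversion

def separate_colors(rgb_image, lab_color_numbers):
    """Assign every pixel to the nearest user-supplied Lab color number
    (CIE76 delta-E) and return one boolean mask per color layer."""
    lab = rgb2lab(rgb_image)                      # H x W x 3 array of L, a, b
    refs = np.asarray(lab_color_numbers, float)   # K x 3 reference Lab values
    # delta-E of every pixel against every reference color: H x W x K
    diff = lab[:, :, None, :] - refs[None, None, :, :]
    delta_e = np.sqrt((diff ** 2).sum(axis=-1))
    nearest = delta_e.argmin(axis=-1)             # index of closest color number
    # one layer (boolean mask) per color classification
    return [nearest == k for k in range(len(refs))]

# Usage (hypothetical values): layers = separate_colors(img, [[53.2, 80.1, 67.2], [87.7, -86.2, 83.2]])
```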
The premise for implementing steps 1021 and 1022 is that the image is a digital image from which the pixel information of each pixel point can be extracted. For a paper design drawing, the user may use a picture shot with a camera, a picture downloaded from the network, or the like; for such a picture, the pixel information of each pixel point cannot be extracted and used directly in this way. For this reason, the method provided by the embodiment may further include:
identifying the image to identify color block areas contained in the image and color number information corresponding to the color block areas;
and generating a color layer corresponding to each color block area based on the identified color block area and the color number information corresponding to each color block area.
In specific implementation, the image may be identified by using an image identification technology, such as a machine learning algorithm (e.g., a neural network model), and not only the color number information corresponding to the colors included in the image but also the outlines of the regions (i.e., color block regions) having the same color are identified.
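Purely as an illustration (the passage above mentions an image recognition technology such as a neural network for this step), here is a plain connected-components stand-in that assumes the colors have already been quantized to a per-pixel index map; scipy is assumed to be available and all names are hypothetical.

```python
import numpy as np
from scipy.ndimage import label

def color_block_regions(quantized_index_map):
    """For each quantized color index, label its connected regions
    (a simple substitute for the recognition of color block areas)."""
    regions = {}
    for color_idx in np.unique(quantized_index_map):
        labelled, count = label(quantized_index_map == color_idx)
        regions[int(color_idx)] = (labelled, count)   # blobs of this color
    return regions
```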
In the above 103, trapping (also called spread-and-choke) is mainly used to compensate for gaps between two adjacent different colors caused by inaccurate printing and misregistration. Therefore, corresponding traps need to be added on some layers to avoid white gaps appearing during printing. The width of the trap can be determined based on the production parameters, such as the enlargement size. In this embodiment, one of two color layers that share a common boundary may be subjected to trapping processing, for example the lighter color layer, and the trap is added in the layer after the trapping processing. Referring to the example shown in fig. 2, the image 4 is decomposed into two color layers, namely color layer 1 and color layer 2. As can be seen from fig. 2, color layer 2 is lighter in color than color layer 1. An outward extension is added to color layer 2 wherever it shares a common border with color layer 1, so that the trap 3 is superimposed on the pattern of color layer 1 (the trap color is substantially the same as color layer 2, but for ease of viewing the trap is darkened and highlighted in fig. 2). The shape of the trap 3 follows the shape of the boundary region where it is located, and the trapping amount (such as the outward spread width) is related to the production parameters, the material characteristics of the substrate, the registration accuracy of the printing system, and the like.
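A minimal sketch of the outward-spread trapping just described, assuming the two layers are given as boolean masks and the trap width has already been converted to whole pixels from the production parameters; the names and the mask-based representation are illustrative assumptions.

```python
from scipy.ndimage import binary_dilation

def add_trap(light_mask, dark_mask, trap_width_px):
    """Spread the lighter color layer outward by trap_width_px pixels, but
    only where it borders the darker layer, so the trap overprints onto the
    dark pattern. Masks are boolean H x W arrays."""
    if trap_width_px <= 0:          # guard: no spread requested
        return light_mask
    spread = binary_dilation(light_mask, iterations=trap_width_px)
    # keep only the newly covered pixels that fall on the dark layer
    trap = spread & dark_mask & ~light_mask
    return light_mask | trap        # trapped light layer with the trap superimposed
```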
In the above step 104, after the part of the at least two color layers that has been subjected to the trapping processing and the other part that has not are obtained, the screen layout corresponding to each color layer can be generated based on the production parameters.
In the process of generating the screen layouts, the production parameters serve to adjust the scale of each layer, for example to the size that meets the actual production requirement. As for the process by which each color layer generates its corresponding screen layout, this embodiment is not specifically limited; the implementation is the same as in the prior art, and reference may be made to the related content there.
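One possible reading of this scaling step, sketched under the assumption that the relevant production parameter is a fabric shrinkage percentage and that simple linear pre-compensation is acceptable; none of this is prescribed by the patent.

```python
from scipy.ndimage import zoom

def scale_layer(layer_mask, shrinkage_percent):
    """Enlarge a layer to pre-compensate fabric shrinkage, e.g. a fabric that
    shrinks 2% after heat drying is printed about 2% larger (assumed rule)."""
    factor = 1.0 / (1.0 - shrinkage_percent / 100.0)
    # order=0 (nearest neighbour) keeps hard layer edges for binary masks
    return zoom(layer_mask.astype(float), factor, order=0) > 0.5
```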
According to the technical solution provided by this embodiment, color separation processing is performed on the image to obtain layers corresponding to different colors, removing the need, present in the related art, to realize image color layering by manual tracing and matting with design software, and improving the efficiency of layer generation. In addition, trapping processing is performed automatically on part of the at least two color layers, with the traps superimposed directly in the processed color layers; and the screen layout corresponding to each color layer that meets the production requirements can be generated automatically according to the production parameters, without manual adjustment or setting, which simplifies the screen layout generation process and improves the efficiency of screen printing plate generation.
The specific implementation of step 103 in this embodiment is further described below. That is, in an implementation solution, the step 103 "performing a trapping process on a part of the at least two color layers" in the above step may include:
1031. determining the layer relationship of the at least two color layers;
1032. determining, from the at least two color layers and according to the layer relationship, a first type of layer that needs trapping and a second type of layer that does not need trapping;
1033. determining the boundary region of the first type of layer that needs to be trapped;
1034. and performing trapping processing on the boundary region in the first type of layer according to the trap size corresponding to the boundary region.
The layer relationship between the at least two color layers may be that a common boundary exists or that no common boundary exists. In a specific implementation, the layer relationship between any two color layers can be determined based on a two-dimensional graphics algorithm; when a common boundary exists between two color layers, the common boundary region is determined and used as the boundary region of the color layer to be trapped.
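A minimal sketch of such a common-boundary test on boolean layer masks, assuming a one-pixel dilation is an acceptable stand-in for the two-dimensional graphics algorithm mentioned above; the names are illustrative.

```python
from scipy.ndimage import binary_dilation

def share_boundary(mask_a, mask_b):
    """Two color layers share a common boundary if dilating one by a single
    pixel makes it overlap the other."""
    return bool((binary_dilation(mask_a) & mask_b).any())

def common_boundary(mask_a, mask_b):
    """Pixels of mask_a that touch mask_b: the boundary region to be trapped."""
    return mask_a & binary_dilation(mask_b)
```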
The principle of trapping is to make the foreground color and the background color overlap each other by spreading (expanding) or choking (shrinking), so that no white edge appears between two different colors. Generally, trapping is needed wherever at least two color blocks with a large color difference adjoin; a trap can also be understood as an extended color band. Adjacent color layers are overprinted using spread-and-choke trapping: the color block in one color layer keeps its size, while the color block in the other color layer changes size. Specifically, spreading enlarges the foreground color object while the background color block keeps its size, so that the outer edge of the foreground color block overprints onto the background color. Choking keeps the foreground color block unchanged and shrinks the knockout (hollow) in the background color block, so that the foreground color block overprints onto the edge of the shrunken knockout. Whether choking or spreading is used in the trapping process depends on the contrast between the foreground color and the background color. Spreading generally follows the principle of spreading the background color rather than the foreground color, spreading the light color rather than the dark color, and spreading the flat tint rather than the solid, because the background color, light color, and flat tint have less visual impact than the foreground color, dark color, and solid; otherwise, one could easily perceive a change in the shape of the object.
In an implementation, the trap between two layers of different colors can be placed in one of the two color layers, for example in the lighter color layer. That is, step 1032, determining from the at least two color layers a first type of layer that needs trapping and a second type of layer that does not, according to the layer relationship, may specifically be:
10321. according to the layer relationship, finding two target layers that share a common boundary from the at least two color layers;
10322. and determining the lighter target layer as the first type of layer that needs trapping, and the darker target layer as the second type of layer that does not need trapping.
More specifically, step 10322 may include the following steps (a sketch follows this list):
obtaining the lightness values corresponding to the two target layers respectively;
determining the target layer with the larger lightness value as the first type of layer to be trapped;
and determining the target layer with the smaller lightness value as the second type of layer that does not need trapping.
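Assuming lightness is taken as the mean L channel of each layer's region in Lab space (one plausible reading of the lightness value above, and assuming both regions are non-empty), a short sketch of this choice; names are illustrative.

```python
from skimage.color import rgb2lab

def choose_trap_layer(image_rgb, mask_a, mask_b):
    """Decide which of two bordering color layers receives the trap: the one
    whose region is lighter (higher mean L). Returns (first_type, second_type)."""
    lightness = rgb2lab(image_rgb)[:, :, 0]
    mean_a = lightness[mask_a].mean()
    mean_b = lightness[mask_b].mean()
    return (mask_a, mask_b) if mean_a >= mean_b else (mask_b, mask_a)
```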
In practical application, after the two color layers are overprinted by trapping, sometimes an additional line that does not exist in the pattern is generated, and for this reason, the following steps are added in this embodiment:
105. obtaining the first type of layer after the trapping processing;
106. and smoothing the trap in the first type of layer.
In this embodiment, the step of smoothing the trapping is to generate a gradient effect of the color of the trapping so as to weaken lines caused by the overprinting. In specific implementation, the color of the trapping can be smoothed by using a corresponding image processing method, wherein the smoothing process is not specifically limited in this embodiment, and may be implemented by referring to related contents in the prior art.
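A feathering sketch of this smoothing, assuming the layer is represented as a float coverage (alpha) map and that a Gaussian blur restricted to the trap region is an acceptable way to produce the gradient; sigma and all names are illustrative.

```python
from scipy.ndimage import gaussian_filter

def smooth_trap(layer_alpha, trap_mask, sigma=1.5):
    """Feather the trap so its color fades out instead of printing as a hard
    extra line: blur the layer's coverage only inside the trap region."""
    blurred = gaussian_filter(layer_alpha, sigma=sigma)
    out = layer_alpha.copy()
    out[trap_mask] = blurred[trap_mask]   # gradient only where the trap was added
    return out
```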
Since the color layers are obtained in this embodiment by clustering pixel colors, the resulting color layers may have jagged, mosaic-like edges as shown in fig. 2. In most cases the pattern designed by the designer has smooth edges, except for designs deliberately made with a mosaic effect. Therefore, this embodiment further adds the following step:
107. performing edge de-mosaicking on the at least two color layers to obtain at least two color layers with smooth edges, and executing the trapping processing step and the screen layout generation step based on the at least two color layers with smooth edges.
This step 107 may precede steps 103 and 104 so that the color layers participating in steps 103 and 104 are both color layers having smooth edges after edge mosaic processing. In specific implementation, the edge mosaic problem existing after bitmap color separation can be improved based on two-dimensional graphics and image processing. Similarly, the specific implementation of the edge mosaic processing in this embodiment is not limited, and all schemes that can perform the edge mosaic processing in the prior art can be applied to this embodiment.
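One way such edge smoothing could be sketched, assuming a morphological clean-up plus a light blur and re-threshold is acceptable; the radius and sigma values are illustrative, not from the patent.

```python
import numpy as np
from scipy.ndimage import binary_closing, binary_opening, gaussian_filter

def smooth_layer_edges(mask, radius=2, blur_sigma=1.0):
    """Reduce the staircase/mosaic edge left by per-pixel color clustering:
    close-then-open removes pixel-scale jaggies, and a light blur plus
    re-threshold rounds the remaining contour."""
    structure = np.ones((2 * radius + 1, 2 * radius + 1), bool)
    cleaned = binary_opening(binary_closing(mask, structure), structure)
    return gaussian_filter(cleaned.astype(float), blur_sigma) > 0.5
```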
Further, the technical solution provided by the embodiment of the present application further includes:
108. generating a priming layer according to the image;
109. and determining a screen layout corresponding to the priming layer based on the priming layer and the production parameters.
In the above 108, "generating a priming image layer according to an image" may specifically include the following steps:
1081. acquiring outline information of a pattern in an image;
1082. adjusting the outline information according to the production parameters to obtain a bottoming layer
In 1081, the outline information of the pattern in the image can be obtained by using an image recognition technology, wherein the outline information at least includes the size and shape of the image.
In 1082, the height and width of the profile information is adjusted based on the production parameters entered by the user. In particular, the profile information is adjusted according to the size of the reduction or enlargement in the production parameters.
In the above 109, after the priming layer is obtained, the screen layout corresponding to the priming layer is generated. In one case, the priming layer is used directly as its screen layout.
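A sketch of building the priming layer from the pattern outline and the reduction/enlargement setting, assuming the outline can be taken as the filled union of the color layers and the adjustment expressed in pixels; all names and the sign convention are assumptions.

```python
import numpy as np
from scipy.ndimage import binary_fill_holes, binary_dilation, binary_erosion

def make_priming_layer(color_layers, spread_px):
    """Under-base layer from the overall pattern outline: union of all color
    layers, holes filled, then expanded (positive spread_px, edge expansion)
    or contracted (negative, edge shrinking) per the production parameters."""
    outline = binary_fill_holes(np.any(np.stack(color_layers), axis=0))
    if spread_px > 0:
        return binary_dilation(outline, iterations=spread_px)
    if spread_px < 0:
        return binary_erosion(outline, iterations=-spread_px)
    return outline
```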
In the actual production process, marking information such as a header, a footer, and screen positioning symbols needs to be added to each screen layout. This marking information is then produced on the screen printing plate corresponding to each screen layout, and the screen printing plates are used in the printing process. Through the marking information on the screen printing plates, the workers can determine the number of screen printing plates, the image names, and the like corresponding to the current print job; the printing equipment can also perform positioning operations through the screen positioning symbols on the screen printing plates. That is, this embodiment further includes the following steps:
110. determining the margins of the screen layout corresponding to the priming layer;
111. acquiring screen naming information;
112. and determining a header for the screen layout corresponding to the priming layer based on the screen naming information.
In the above step 110, the margins of the screen layout corresponding to the priming layer may be determined based on two-dimensional graphics and image processing in the prior art. The margins may include margins in four directions, that is, the distances from the pattern in the screen layout to the upper, lower, left, and right boundaries.
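A minimal margin computation on a boolean pattern mask, assuming the pattern occupies at least one pixel; the names are illustrative.

```python
import numpy as np

def layout_margins(mask):
    """Distances (in pixels) from the pattern to the four edges of the
    screen layout: top, bottom, left, right."""
    rows = np.where(mask.any(axis=1))[0]
    cols = np.where(mask.any(axis=0))[0]
    h, w = mask.shape
    return {"top": int(rows[0]), "bottom": int(h - 1 - rows[-1]),
            "left": int(cols[0]), "right": int(w - 1 - cols[-1])}
```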
In the above 111, the screen naming information may be included in the production parameters input by the user. The user can input by himself through the interactive interface. Screen naming information may include, but is not limited to: the type of the clothes, the serial number of the clothes, the printing position, the color of the layer corresponding to the screen image and the like.
In addition to the header marked on the screen layout corresponding to the priming layer, headers are also marked on the screen layouts corresponding to the at least two color layers. Meanwhile, footers can also be marked on the screen layout corresponding to the priming layer and the screen layouts corresponding to the at least two color layers. That is, the technical solution provided in the embodiment of the present application further includes:
113. determining headers for the screen layouts corresponding to the at least two color layers respectively according to the screen naming information;
114. and marking footers on the screen layouts corresponding to the at least two color layers and the screen layout corresponding to the priming layer.
The headers of the screen layouts corresponding to the color layers may be the same as the header of the screen layout corresponding to the priming layer. The footers can indicate the process order of the screen printing plates corresponding to the screen layouts during printing; for example, the base color of the pattern is printed first with the screen printing plate of the screen layout corresponding to the priming layer, and printing is then performed on the base color with the screen printing plates of the screen layouts corresponding to the color layers. Therefore, the footer of the screen layout corresponding to the priming layer can be marked as 1, and the footers of the screen layouts corresponding to the remaining color layers are numbered from 2 onward according to the printing process order. Specifically, the footer of a screen layout can take a form such as 2/10, meaning the total number of screen layouts is 10 and the current one is the second.
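Tiny helpers illustrating the footer numbering and a header assembled from the screen naming information; the field names and separator are assumptions, not a format defined by the patent.

```python
def footer_label(index, total):
    """Footer in the '2/10' style described above: screen layout number
    `index` out of `total`, with the priming layer numbered 1."""
    return f"{index}/{total}"

def header_label(garment_type, garment_no, position, layer_color):
    """Header assembled from the screen naming information; the fields mirror
    the examples listed in the text (garment type, number, print position,
    layer color)."""
    return f"{garment_type}-{garment_no}-{position}-{layer_color}"

# e.g. header_label("T-shirt", "A1024", "front-chest", "red"), footer_label(2, 10)
```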
After the headers and footers are added, the screen printing plate corresponding to each screen layout can be positioned. However, headers and footers are text, and deviations easily occur when they are used for positioning in practice, so screen positioning symbols can also be marked on the screen layouts. That is, the technical solution provided in the embodiment of the present application further includes:
115. and marking screen positioning symbols, according to the header and footer of each screen layout, on the screen layout corresponding to the priming layer and the screen layouts corresponding to the at least two color layers respectively.
The screen positioning symbols can be used when mounting the screen printing plate corresponding to the screen layout. Referring to FIG. 3, the screen layout 5 of the priming layer has a header 6, a footer 7, and screen positioning symbols 8.
The following describes the flow of the method provided in the embodiments of the present application with reference to specific examples.
Referring to the flowchart shown in fig. 4, the screen map generating method includes:
s1, obtaining an image and production parameters.
And S2, generating a bottoming layer according to the contour information of the image.
S3, carrying out color separation processing on the image to obtain at least two color layers;
s4, performing edge mosaic processing on the at least two color layers to smooth edges;
and S5, performing trapping processing on part of the at least two color layers to add trapping in the corresponding color layers.
And S6, smoothing the trapping added in the color layer to prevent the appearance of unwanted lines in the pattern.
And S7, generating a screen layout corresponding to each layer based on the production parameters.
Here, the layers include: the color layers with traps added, the color layers without traps, and the priming layer.
For example, the user inputs the image shown in FIG. 5a together with the production parameters. The production parameters include, but are not limited to: the reduction or enlargement size (such as 0.1 mm), the screen naming information, the color number information of all colors contained in the image, and the like. A priming layer 11 as shown in fig. 5b is then obtained based on the outline information of the image, and the color separation processing in step S3 yields a plurality of color layers 12, 13, 14, 15, 16, 17, and 18. After the processing of steps S4, S5, and S6, a plurality of screen layouts 21, 22, 23, 24, 25, 26, 27, and 28 as shown in fig. 5c are generated by step S7. Headers, footers, and screen positioning symbols are added to these screen layouts.
Further, the method provided by this embodiment may further include the following steps:
displaying the screen layout corresponding to the priming layer and the screen layout corresponding to each color layer;
and, in response to a virtual trial-print instruction triggered by the user, starting a virtual trial-print module, so that the virtual trial-print module generates a virtual trial-print effect image according to the screen layout corresponding to the priming layer and the screen layout corresponding to each color layer.
After the virtual trial-print module is started, it can generate a corresponding printing animation and display it on the user interface, which is intuitive and convenient for the user to watch. Besides displaying the animated rendering, the virtual trial-print module may also provide error detection to detect errors in the individual screen layouts. After the virtual trial-print effect image is generated, the user can trigger the error detection function to obtain the erroneous screen layouts and the reasons for the errors (such as an outline error in the priming layer, an error in the production parameters, and the like).
The user can check the printing effect based on the virtual trial-print effect image, and can modify the screen layout corresponding to a layer or modify the production parameters when the effect is unsatisfactory. That is, the method provided by this embodiment may further include the following step:
modifying a screen layout and displaying the modified screen layout in response to a modification operation triggered by the user for the screen layout of a layer.
Alternatively, trial-printing equipment, such as a plate making device and a printing device, may be available on site. The user can send the screen layout corresponding to the priming layer and the screen layout corresponding to each color layer to the plate making device, and the plate making device processes different silk screens to obtain a plurality of screen printing plates for printing. The screen printing plates are then placed at the corresponding positions of the printing device so that the patterns of the corresponding layers are printed on the substrate in sequence, and finally the pattern on the substrate is dried through processes such as baking. Based on the trial-print result, the user can adjust the screen layout of each layer and/or the production parameters in time. Once the trial-print result is satisfactory, the screen layout corresponding to the priming layer and the screen layouts corresponding to the color layers can be sent to the equipment of an intelligent processing factory for large-scale production.
In addition, the production parameters (also referred to as process parameters) of fabrics of different materials, such as the shrinkage of the fabric (which relates to the reduction or enlargement size set by the user), can be pre-configured locally as user-selectable options. For example, the corresponding generation parameters can be produced automatically once the user selects the material of the substrate, so that the user does not need to enter them manually.
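A hypothetical, locally pre-configured lookup of this kind; the fabric names and numbers below are made-up placeholders, not values from the patent.

```python
# Hypothetical mapping from fabric material to the production parameters
# derived from it (shrinkage -> reduction/enlargement size).
FABRIC_PRESETS = {
    "cotton":    {"shrinkage_percent": 3.0, "spread_mm": 0.10},
    "polyester": {"shrinkage_percent": 0.5, "spread_mm": 0.05},
    "blend":     {"shrinkage_percent": 2.0, "spread_mm": 0.08},
}

def production_parameters(fabric):
    """Selecting the printed material auto-fills the generation parameters,
    so the user does not have to type them in."""
    return FABRIC_PRESETS[fabric]
```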
Fig. 6 shows a schematic structural diagram of a production system provided in an embodiment of the present application. As shown in fig. 6, the production system includes:
a client device 201 for acquiring images and production parameters; carrying out color separation processing on the image to obtain at least two color layers, wherein one layer contains partial pixel points in the image; performing trapping processing on part of the at least two color layers to add trapping on the processed color layer; generating a screen map corresponding to each color layer according to the production parameters, the part of the at least two color layers which is subjected to the trapping processing and the other part of the at least two color layers which is not subjected to the trapping processing;
and the plate making device 202 is in communication connection with the client device 201, and is configured to receive the halftone maps corresponding to the color layers sent by the client device 201, and process different silk screens respectively according to the received halftone maps corresponding to the color layers to obtain a plurality of screen plates for printing.
Further, the system provided by this embodiment may also include a printing device. The printing device is provided with a first placement position for the at least two screen printing plates and a second placement position for the substrate, and performs the printing operation with the screen printing plates one by one in their printing order, so as to print the pattern of each corresponding layer onto the substrate.
The client device 201 may be any type of device with computing capability, such as a desktop computer, a laptop computer, a smart phone, a smart wearable device, a tablet computer, and so on.
Here, it should be noted that: the client device in the production system provided in the above embodiments may implement the technical solutions described in the above method embodiments, and the specific implementation principle of each module or unit may refer to the corresponding content in the above method embodiments, and is not described herein again.
Fig. 7 is a flowchart illustrating a screen map generating method according to another embodiment of the present application. As shown in fig. 7, the method includes:
301. acquiring an image;
302. performing layer division on the image to obtain a plurality of layers;
303. performing trapping processing on part of the plurality of layers to add traps on the processed layers;
304. and generating a screen layout corresponding to each layer according to the part of the plurality of layers subjected to the trapping processing and the other part of the plurality of layers not subjected to the trapping processing.
The image layer division performed on the image in 302 above may be implemented by the following steps:
3021. determining a bottoming layer based on the contour information of the pattern in the image;
3022. carrying out color separation processing on the image to obtain a plurality of color layers; and one image layer contains part of pixel points in the image.
For the contents of the above steps 3021 and 3022, reference may be made to the corresponding contents in the above, and details are not described here.
Steps 303 and 304, namely performing the trapping processing and generating a screen layout corresponding to each layer according to the part of the layers subjected to the trapping processing and the other part not subjected to it, may specifically include:
3031. obtaining production parameters;
3032. generating a screen layout corresponding to the priming layer according to the production parameters and the priming layer;
3033. performing trapping processing on part of the plurality of color layers to add traps on the processed color layers;
3034. and generating a screen layout corresponding to each color layer according to the production parameters, the part of the color layers subjected to the trapping processing, and the other part of the color layers not subjected to the trapping processing.
The production parameters described above may include, but are not limited to: size of reduction or enlargement, screen name information, color number information of all colors contained in the image, and the like.
Based on the above method embodiment, this embodiment provides another production system. The structure of this production system is the same as that of fig. 6 described above, except for the function of the client device. Specifically:
the client device is configured to acquire an image; perform layer division on the image to obtain a plurality of layers; perform trapping processing on part of the layers to add traps on the processed layers; and generate a screen layout corresponding to each layer according to the part of the plurality of layers subjected to the trapping processing and the other part not subjected to the trapping processing;
and the plate making device, communicatively connected to the client device, is configured to receive the screen layout corresponding to each layer sent by the client device and to process different silk screens according to the received screen layouts to obtain a plurality of screen printing plates for printing.
Similarly, this embodiment may also include a printing device. The printing device is provided with a first placement position for the at least two screen printing plates and a second placement position for the substrate, and performs the printing operation with the screen printing plates one by one in their printing order, so as to print the pattern of each corresponding layer onto the substrate.
The technical solutions provided by the embodiments of the application aim at the following: the staff can directly input the image and the related parameters, and the computing device automatically generates the screen layout of each layer that meets the actual production requirements, which improves the efficiency of screen layout generation, reduces the workload, and improves the production efficiency of the whole printing process. To this end, the present application also provides an embodiment, namely the image processing method whose flow is shown in fig. 8a, with a corresponding interface shown in fig. 8b. As shown in fig. 8a and 8b, the method comprises:
401. responding to an input operation triggered by a user through an interactive interface, and acquiring an image input by the user;
402. dividing the image layer to obtain a plurality of image layers;
403. performing trapping processing on part of the layers to add trapping on the processed layer;
404. generating a network layout corresponding to each layer according to the part of the layers subjected to the trapping processing in the plurality of layers and the other part of the layers not subjected to the trapping processing in the plurality of layers;
405. and displaying the network map corresponding to each layer.
For the above steps 402-404, reference is made to the corresponding contents above, which are not described herein again.
Furthermore, the user may also modify a generated screen layout. That is, the method provided by this embodiment may further include:
406. in response to a modification operation triggered by the user for the screen layout corresponding to a layer, modifying the screen layout corresponding to that layer according to the modification operation.
It should be added here that the technical solutions provided by the above embodiments of the present application are applicable to pattern printing scenarios for various types of materials, such as a garment printing scenario, a product outer-packaging printing scenario, and so on. The printing types that the technical solutions provided by the embodiments of the application can cover include: offset printing, watermarking, foam printing, screen printing, digital printing, and the like.
After an application program implementing the technical solution provided by the embodiments of the present application is installed, the user can open the application program, input an image and the related production parameters (certainly not limited to the parameters shown in fig. 8b) through the interactive interface, and then click the "generation" control to see the generated screen layout corresponding to each layer. In other words, once the user has input the image and the production parameters, the screen layout of each layer required for production can be generated with one click. If a production parameter turns out to be wrong, the user only needs to return to the corresponding page, modify that parameter and generate again with one click, so the layouts can be corrected quickly.
In fact, the technical solution provided by this embodiment can also be used as an extension of an existing computer design tool (such as Photoshop): a module implementing the method functions provided by this embodiment is loaded into the existing computer design tool as an additional module. When needed, the user loads this module into the computer design tool, so that no new application program has to be developed and the design difficulty is greatly reduced.
Fig. 9 is a schematic structural diagram of a screen layout generating apparatus according to an embodiment of the present application. The screen layout generating apparatus includes: an obtaining module 31, a color separation module 32, a trapping module 33, and a generating module 34. The obtaining module 31 is used for obtaining an image and production parameters. The color separation module 32 is configured to perform color separation processing on the image to obtain at least two color layers; each color layer contains part of the pixel points in the image. The trapping module 33 is configured to perform trapping processing on part of the at least two color layers, so as to add a trap on each processed color layer. The generating module 34 is configured to generate a screen layout corresponding to each color layer according to the production parameters, the part of the at least two color layers that has undergone the trapping processing, and the other part of the at least two color layers that has not.
Further, when the color separation module 32 performs color separation processing on the image to obtain at least two color layers, the color separation module is specifically configured to:
acquiring pixel information of pixel points in the image; and clustering the pixels belonging to the same color classification according to the pixel information of the pixels in the image to obtain the at least two color layers.
Further, the color separation module 32 is configured to cluster the pixels belonging to the same color classification according to the pixel information of the pixels in the image, and when obtaining the at least two color layers, specifically:
acquiring color number information corresponding to at least two colors contained in the image; and based on the color number information corresponding to the at least two colors and the pixel information of the pixel points in the image, performing clustering color separation on the pixel points in the image by using a Lab color space similarity algorithm to obtain at least two color layers.
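As a concrete illustration of the clustering color separation described above, the sketch below converts the image to the Lab color space and clusters the pixels with k-means, using the number of colors taken from the color number information as the cluster count. The use of scikit-image and scikit-learn, and the function name separate_colors, are assumptions of this sketch; the application itself only specifies a Lab color space similarity algorithm.

import numpy as np
from skimage.color import rgb2lab
from skimage.util import img_as_float
from sklearn.cluster import KMeans

def separate_colors(image_rgb, n_colors):
    """Cluster the pixels in Lab space and return one boolean layer mask per color."""
    lab = rgb2lab(img_as_float(image_rgb))    # pixel information expressed in Lab space
    labels = KMeans(n_clusters=n_colors, n_init=10, random_state=0).fit_predict(lab.reshape(-1, 3))
    labels = labels.reshape(image_rgb.shape[:2])
    # each color layer contains only the pixel points belonging to its own color class
    return [labels == k for k in range(n_colors)]

Euclidean distance in Lab space is used here as the similarity measure, which is a common approximation of perceptual color difference.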
Further, when acquiring the color number information corresponding to the at least two colors contained in the image, the color separation module 32 is specifically configured to: display an interactive interface; and, in response to an input operation triggered by a user through the interactive interface, acquire the image input by the user, the color number information corresponding to the at least two colors contained in the image, and the production parameters.
Further, when the trapping module 33 performs trapping processing on a part of the at least two color layers, the trapping module is specifically configured to:
determining the layer relation of the at least two color layers; determining a first type of layer needing to be subjected to trapping printing and a second type of layer needing not to be subjected to trapping printing from the at least two color layers according to the layer relation; determining a boundary area of the first type of layer to be subjected to trapping; and performing trapping compensation treatment on the boundary area in the first type of layer according to the trapping size corresponding to the trapping area.
Further, when the trapping module 33 determines a first type of layer that needs to be subjected to trapping and a second type of layer that does not need to be subjected to trapping from the at least two color layers according to the layer relationship, the trapping module is specifically configured to:
finding out, according to the layer relation, two target layers having a common boundary from the at least two color layers; and determining, of the two target layers, the lighter-colored target layer as a first type of layer that needs trapping compensation and the darker-colored target layer as a second type of layer that does not need trapping compensation.
Further, the trapping module 33 is configured to, when determining, of the two target image layers, a light-colored target image layer as a first type of image layer to be subjected to trapping compensation and a dark-colored target image layer as a second type of image layer not to be subjected to trapping compensation, specifically:
obtaining the brightness values corresponding to the two target layers respectively; determining the target layer with the larger brightness value as the first type of layer to be subjected to trapping compensation; and determining the target layer with the smaller brightness value as the second type of layer that does not need trapping compensation.
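A minimal sketch of this trapping decision and compensation is given below, assuming the two target layers are boolean masks whose brightness values are already known; the trap width trap_px and the use of scipy.ndimage are illustrative assumptions rather than values or tools specified by this application.

import numpy as np
from scipy.ndimage import binary_dilation

def add_trap(mask_a, mask_b, brightness_a, brightness_b, trap_px=2):
    """Spread the lighter layer under the darker one along their common boundary."""
    # the target layer with the larger brightness value is the first type (it receives the trap)
    light, dark = (mask_a, mask_b) if brightness_a >= brightness_b else (mask_b, mask_a)
    # boundary area: dark-layer pixels directly adjacent to the light layer
    boundary = binary_dilation(light, iterations=1) & dark
    # trapping compensation: grow the light layer into the dark layer by trap_px pixels
    trap = binary_dilation(boundary, iterations=trap_px) & dark
    return light | trap, dark    # the dark (second type) layer is left unchanged

The darker layer is returned untouched, which matches the rule above that only the first type of layer receives the compensation.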
Further, the obtaining module 31 is further configured to obtain the first type of layer after the trapping processing, and the trapping module 33 is further configured to perform smoothing processing on the added trap in the first type of layer.
Further, the apparatus provided in this embodiment may further include an edge processing module. The edge processing module is used for performing edge mosaic processing on the at least two color layers to obtain the at least two color layers with smooth edges, and executing the trapping processing step and the screen printing plate image generating step based on the at least two color layers with smooth edges.
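Assuming that the edge mosaic processing mentioned above amounts to smoothing jagged layer edges, one simple way to obtain color layers with smooth edges is a morphological closing followed by an opening, as sketched below; the 3x3 structuring element and the function name smooth_edges are assumptions of this sketch.

import numpy as np
from scipy.ndimage import binary_closing, binary_opening

def smooth_edges(layer_mask):
    """Remove single-pixel notches and spurs along the boundary of a boolean layer mask."""
    se = np.ones((3, 3), dtype=bool)
    closed = binary_closing(layer_mask, structure=se)    # fill small notches in the edge
    return binary_opening(closed, structure=se)          # shave off small spurs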
Further, in the apparatus provided in this embodiment, the generating module 34 is further configured to: generate a priming layer according to the image; and determine the screen layout corresponding to the priming layer based on the priming layer and the production parameters.
Still further, when generating the priming layer according to the image, the generating module 34 is specifically configured to: acquire the outline information of the pattern in the image; and adjust the outline information according to the production parameters to obtain the priming layer.
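For illustration, the sketch below derives a priming layer as the filled silhouette of the pattern and then adjusts it by a spread or choke width. Treating every non-background pixel as part of the pattern and expressing the adjustment as a pixel width are assumptions of this sketch, since the application only states that the outline information is adjusted according to the production parameters.

import numpy as np
from scipy.ndimage import binary_fill_holes, binary_dilation, binary_erosion

def priming_layer(image_rgb, background_rgb=(255, 255, 255), spread_px=0):
    """Union of all printed areas with holes filled, optionally spread or choked."""
    pattern = np.any(image_rgb != np.asarray(background_rgb), axis=-1)   # outline of the pattern
    base = binary_fill_holes(pattern)                                    # solid silhouette
    if spread_px > 0:
        base = binary_dilation(base, iterations=spread_px)               # spread (enlarge)
    elif spread_px < 0:
        base = binary_erosion(base, iterations=-spread_px)               # choke (shrink)
    return base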
Further, in the apparatus provided in this embodiment, the generating module 34 is further configured to:
determining the margin of the screen layout corresponding to the priming layer; acquiring screen plate naming information; and determining a header for the screen layout corresponding to the priming layer based on the screen plate naming information.
Further, the generating module 34 is further configured to:
determining headers for the screen layouts corresponding to the at least two color layers respectively according to the screen plate naming information; and marking footers for the screen layouts corresponding to the at least two color layers and the screen layout corresponding to the priming layer.
Further, the generating module 34 is further configured to: according to the headers and footers of the screen layouts, respectively mark screen positioning marks for the screen layout corresponding to the priming layer and the screen layouts corresponding to the at least two color layers.
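The sketch below illustrates one way to add a header, a footer and corner positioning marks to a generated layout using Pillow; the margin size, the cross-shaped mark geometry and the function name annotate_layout are illustrative assumptions and are not taken from this application.

from PIL import Image, ImageDraw

def annotate_layout(layout, header, footer, margin=40, mark=10):
    """Place a layout on a white canvas and draw a header, a footer and four positioning marks."""
    canvas = Image.new("L", (layout.width + 2 * margin, layout.height + 2 * margin), 255)
    canvas.paste(layout, (margin, margin))
    draw = ImageDraw.Draw(canvas)
    draw.text((margin, 5), header, fill=0)                               # page header
    draw.text((margin, canvas.height - margin + 5), footer, fill=0)      # page footer
    m = margin // 2
    corners = [(m, m), (canvas.width - m, m),
               (m, canvas.height - m), (canvas.width - m, canvas.height - m)]
    for cx, cy in corners:
        draw.line((cx - mark, cy, cx + mark, cy), fill=0)                # cross-shaped
        draw.line((cx, cy - mark, cx, cy + mark), fill=0)                # positioning mark
    return canvas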
Further, the production parameters include at least one of: the reduction or enlargement size of the print, the screen plate naming information, and the color number information of all colors in the image.
Here, it should be noted that: the screen image generating device provided in the above embodiment may implement the technical solutions described in the above method embodiments, and the specific implementation principle of each module or unit may refer to the corresponding content in the above method embodiments, and will not be described herein again.
Fig. 10 is a schematic structural diagram of a screen layout generating apparatus according to an embodiment of the present application. As shown in fig. 10, the screen layout generating apparatus includes: an acquiring module 41, a dividing module 42, a trapping module 43, and a generating module 44. The acquiring module 41 is used for acquiring an image. The dividing module 42 is configured to perform layer division on the image to obtain a plurality of layers. The trapping module 43 is configured to perform trapping processing on part of the layers, so as to add a trap on each processed layer. The generating module 44 is configured to generate a screen layout corresponding to each layer according to the part of the layers that has undergone the trapping processing and the other part of the layers that has not.
Further, when performing layer division on the image to obtain a plurality of layers, the dividing module 42 is specifically configured to:
determine a priming layer based on the outline information of the pattern in the image; and perform color separation processing on the image to obtain a plurality of color layers, each color layer containing part of the pixel points in the image.
Further, when generating a screen layout corresponding to each layer according to the part of the layers that has undergone the trapping processing and the other part of the layers that has not, the generating module 44 is specifically configured to:
obtain production parameters; generate a screen layout corresponding to the priming layer according to the production parameters and the priming layer; perform trapping processing on part of the color layers so as to add a trap on each processed color layer; and generate a screen layout corresponding to each color layer according to the production parameters, the color layers that have undergone the trapping processing, and the color layers that have not.
Further, the production parameters include the reduction or enlargement size of the print, the screen plate naming information, and the color number information of all colors contained in the image.
Here, it should be noted that: the screen image generating device provided in the above embodiment may implement the technical solutions described in the above method embodiments, and the specific implementation principle of each module or unit may refer to the corresponding content in the above method embodiments, and will not be described herein again.
Fig. 11 shows a schematic structural diagram of an image processing apparatus according to an embodiment of the present application. As shown in fig. 11, the image processing apparatus includes: an obtaining module 51, a dividing module 52, a trapping module 53, a generating module 54 and a display module 55. The obtaining module 51 is configured to acquire, in response to an input operation triggered by a user through an interactive interface, the image input by the user. The dividing module 52 is configured to perform layer division on the image to obtain a plurality of layers. The trapping module 53 is configured to perform trapping processing on part of the layers, so as to add a trap on each processed layer. The generating module 54 is configured to generate a screen layout corresponding to each layer according to the part of the layers that has undergone the trapping processing and the other part of the layers that has not. The display module 55 is configured to display the screen layout corresponding to each layer.
Further, the apparatus provided in this embodiment may further include a modification module. The modification module is configured to, in response to a modification operation triggered by the user for the screen layout corresponding to a layer, modify the screen layout corresponding to that layer according to the modification operation.
Here, it should be noted that: the image processing apparatus provided in the foregoing embodiment may implement the technical solutions described in the foregoing corresponding method embodiments, and the specific implementation principle of each module or unit may refer to the corresponding content in the foregoing method embodiments, and is not described herein again.
Fig. 12 shows a schematic structural diagram of an electronic device provided in an embodiment of the present application. The electronic device comprises: a memory 61 and a processor 62; wherein:
a memory 61 for storing a program;
a processor 62, coupled to the memory 61, for executing programs stored in the memory 61 for:
acquiring an image and production parameters;
carrying out color separation processing on the image to obtain at least two color layers; one layer contains part of pixel points in the image;
performing trapping processing on part of the at least two color layers to add trapping on the processed color layer;
and generating a network layout corresponding to each color layer according to the production parameters, the part of the at least two color layers which is subjected to the trapping treatment and the other part of the at least two color layers which is not subjected to the trapping treatment.
The memory 61 described above may be configured to store other various data to support operations on the electronic device. Examples of such data include instructions for any application or method operating on the electronic device. The memory 61 may be implemented by any type or combination of volatile or non-volatile storage devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The processor 62 may also implement other functions besides the above functions when executing the program in the memory 61, and specifically refer to the description of the foregoing embodiments.
Further, as shown in fig. 12, the electronic device further includes: communication components 63, a display 64, power components 65, audio components 66, and the like. Only some of the components are schematically shown in fig. 12, which does not mean that the electronic device includes only the components shown in fig. 12.
Another embodiment of the present application provides an electronic device, which has a structure similar to that of fig. 12. Specifically, the electronic device includes: a memory and a processor; wherein:
a memory for storing a program;
a processor, coupled to the memory, for executing the program stored in the memory to:
acquiring an image;
performing layer division on the image to obtain a plurality of layers;
performing trapping processing on part of the layers to add trapping on the processed layer;
and generating a network layout corresponding to each layer according to the part of the layers subjected to the trapping processing in the plurality of layers and the other part of the layers which are not subjected to the trapping processing in the plurality of layers.
The memory may be configured to store other various data to support operations on the electronic device. Examples of such data include instructions for any application or method operating on the electronic device. The memory may be implemented by any type or combination of volatile and non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
When the processor executes the program in the memory, the processor may implement other functions in addition to the above functions, which may be specifically referred to the description of the foregoing embodiments.
The present application also provides an electronic device having a structure similar to that of fig. 12 above. The electronic device comprises a memory, a processor and a display; wherein:
the memory is used for storing programs;
the processor, coupled to the memory, to execute the program stored in the memory to:
responding to an input operation triggered by a user through an interactive interface, and acquiring an image input by the user;
performing layer division on the image to obtain a plurality of layers;
performing trapping processing on part of the layers in the plurality of layers so as to add trapping on the processed layer;
generating a net layout corresponding to each layer according to the part of the layers subjected to the trapping processing in the plurality of layers and the other part of the layers not subjected to the trapping processing in the plurality of layers;
and controlling the display to display the network map corresponding to each layer.
When the processor executes the program in the memory, the processor may implement other functions in addition to the above functions, which may be specifically referred to the description of the foregoing embodiments.
Accordingly, embodiments of the present application further provide a computer-readable storage medium storing a computer program, where the computer program can implement the steps or functions of the screen image generation method and the image processing method provided in the foregoing embodiments when the computer program is executed by a computer.
Further, this embodiment further provides a plate making apparatus. The plate making apparatus includes: a memory, a processor, and an execution component; wherein the memory is used for storing a program;
the processor, coupled with the memory, to execute the program stored in the memory to:
acquiring an image;
performing layer division on the image to obtain a plurality of layers;
performing trapping processing on part of the layers to add trapping on the processed layer;
generating a net layout corresponding to each layer according to the part of the layers subjected to the trapping processing in the plurality of layers and the other part of the layers not subjected to the trapping processing in the plurality of layers;
and controlling the action of the execution assembly based on the screen layout corresponding to each layer so as to respectively process different screens to obtain a plurality of screen printing plates for printing.
When the processor executes the program in the memory, the processor may implement other functions in addition to the above functions, which may be specifically referred to the description of the foregoing embodiments.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and of course can also be implemented by a combination of hardware and software. Based on this understanding, the above technical solutions, or the parts thereof that contribute to the prior art, may be embodied in the form of a computer program product, which may be stored on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process, such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement the information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static Random Access Memory (SRAM), dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), read Only Memory (ROM), electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital Versatile Disks (DVD) or other optical storage, magnetic cassettes, magnetic tape magnetic disk storage or other magnetic storage devices, or any other non-transmission medium, which can be used to store information that can be accessed by a computing device. As defined herein, a computer readable medium does not include a transitory computer readable medium such as a modulated data signal and a carrier wave.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present application.

Claims (13)

1. A screen image generation method is characterized in that the method is suitable for a client or plate making equipment, the client is software installed or embedded on terminal equipment, the plate making equipment has a function of automatically generating a screen image by adopting the following method, and the method comprises the following steps:
acquiring an image and production parameters;
carrying out color separation processing on the image to obtain at least two color layers; one layer contains part of pixel points in the image;
performing trapping processing on part of the at least two color layers to add trapping on the processed color layer;
generating a network layout corresponding to each color layer according to the production parameters, the part of the at least two color layers which is subjected to the trapping treatment and the other part of the at least two color layers which is not subjected to the trapping treatment;
generating a bottoming layer according to the outline information of the pattern in the image;
determining a screen printing plate corresponding to the bottoming layer based on the bottoming layer and the production parameters;
generating a virtual trial printing effect graph according to the screen printing graph corresponding to the bottoming layer and the screen printing graph corresponding to each color layer, and performing error check on each screen printing graph;
and displaying the virtual trial printing effect graph and the error check result corresponding to each screen printing graph.
2. The method of claim 1, wherein the color separating the image to obtain at least two color layers comprises:
acquiring pixel information of pixel points in the image;
and clustering the pixels belonging to the same color classification according to the pixel information of the pixels in the image to obtain the at least two color layers.
3. The method according to claim 2, wherein clustering the pixels belonging to the same color classification according to the pixel information of the pixels in the image to obtain the at least two color layers comprises:
acquiring color number information corresponding to at least two colors contained in the image;
and based on the color number information corresponding to the at least two colors and the pixel information of the pixel points in the image, performing clustering color separation on the pixel points in the image by using a Lab color space similarity algorithm to obtain at least two color layers.
4. The method according to any of claims 1 to 3, wherein the step of performing a trapping process on a part of the at least two color layers comprises:
determining the layer relation of the at least two color layers;
determining, from the at least two color layers according to the layer relation, a first type of layer that needs to be subjected to trapping and a second type of layer that does not need to be subjected to trapping;
determining a boundary area of the first type of layer that needs to be subjected to trapping;
and performing trapping processing on the boundary area in the first type of layer according to the trapping size corresponding to the boundary area.
5. The method according to claim 4, wherein determining a first type of layer to be subjected to trapping and a second type of layer not to be subjected to trapping from the at least two color layers according to the layer relationship comprises:
finding out, from the at least two color layers according to the layer relation, two target layers having a common boundary;
and determining, of the two target layers, the target layer with the lighter color as the first type of layer that needs to be subjected to trapping compensation, and the target layer with the darker color as the second type of layer that does not need to be subjected to trapping compensation.
6. The method of claim 4, further comprising:
obtaining the first type of layer after the trapping processing; smoothing the supplementary trapping in the first type of layer;
performing edge mosaic processing on the at least two color layers to obtain the at least two color layers with smooth edges, and executing the trapping processing step and the screen printing plate image generating step based on the at least two color layers with smooth edges.
7. The method according to any one of claims 1 to 3, wherein generating the bottoming layer according to the outline information of the pattern in the image comprises:
adjusting the outline information according to the production parameters to obtain the bottoming layer.
8. The method of claim 7, further comprising:
determining the margin of the screen image corresponding to the bottoming layer;
acquiring screen plate naming information;
determining a header for the screen image corresponding to the bottoming layer based on the screen plate naming information;
determining headers for the screen images corresponding to the at least two color layers respectively according to the screen plate naming information;
marking footers for the screen images corresponding to the at least two color layers and the screen image corresponding to the bottoming layer;
and according to the headers and the footers, respectively marking screen positioning symbols for the screen image corresponding to the bottoming layer and the screen images corresponding to the at least two color layers.
9. A production system, comprising:
the system comprises a client device, a server and a server, wherein the client device is provided with or embedded with software, and executes the software to acquire images and production parameters; carrying out color separation processing on the image to obtain at least two color layers, wherein one layer contains partial pixel points in the image; performing trapping processing on part of the at least two color layers to add trapping on the processed color layer; generating a screen map corresponding to each color layer according to the production parameters, the part of the at least two color layers which is subjected to the trapping processing and the other part of the at least two color layers which is not subjected to the trapping processing; generating a bottoming layer based on the outline information of the pattern in the image; determining a screen printing plate corresponding to the priming image layer based on the priming image layer and the production parameters; generating a virtual trial printing effect graph according to the screen printing graph corresponding to the bottoming layer and the screen printing graph corresponding to each color layer, and carrying out error check on each screen printing graph; displaying the virtual trial printing effect graph and the error check result corresponding to each screen printing graph;
and the plate making equipment is in communication connection with the client device and is used for receiving the screen map corresponding to each color layer sent by the client device and respectively processing different silk screens according to the received screen map corresponding to each color layer to obtain a plurality of screen printing plates for printing.
10. The system of claim 9, comprising:
and the printing equipment is provided with a first placing position for placing the at least two screen printing plates and a second placing position for placing a printing stock, and is used for executing printing operation by using the screen printing plates in the corresponding sequence one by one according to the printing sequence of the at least two screen printing plates so as to print the pattern of the corresponding image layer on the printing stock.
11. An image processing method is applicable to a client or a platemaking device, wherein the client is software installed or embedded on a terminal device, and the platemaking device has a function of automatically generating a layout by adopting the following method, and the method comprises the following steps:
responding to an input operation triggered by a user through an interactive interface, and acquiring an image input by the user;
carrying out color separation processing on the image to obtain a plurality of color layers; one image layer contains part of pixel points in the image;
performing trapping processing on part of the color layers in the plurality of color layers so as to add trapping on the processed color layers;
obtaining production parameters;
generating a network layout corresponding to each color layer according to the production parameters, the part of the color layers subjected to the trapping processing in the plurality of color layers and the other part of the color layers not subjected to the trapping processing in the plurality of color layers;
determining a bottoming layer based on the outline information of the pattern in the image;
generating a screen printing plate image corresponding to the bottoming layer according to the production parameters and the bottoming layer;
displaying the screen printing plate image corresponding to the bottoming layer and the network layout corresponding to each color layer;
responding to an operation triggered by the user, generating a virtual trial printing effect graph according to the screen printing plate image corresponding to the bottoming layer and the network layout corresponding to each color layer, and carrying out error check on the screen printing plate image and each network layout;
and displaying the virtual trial printing effect graph and the corresponding error check results.
12. An electronic device, characterized in that the electronic device is a terminal device or a plate making device, the electronic device comprising: a memory and a processor; wherein:
the memory is used for storing a software program;
the processor, coupled to the memory, for executing the software program stored in the memory to implement the steps of the method of any one of the preceding claims 1 to 8; or to carry out the steps of the method as claimed in claim 11.
13. A plate making apparatus, comprising: a memory, a processor, and an execution component; wherein:
the memory is used for storing a software program;
the processor, coupled with the memory, to execute the software program stored in the memory to:
acquiring an image and production parameters;
carrying out color separation processing on the image to obtain a plurality of color layers; one layer contains part of pixel points in the image;
performing trapping processing on part of the color layers in the plurality of color layers so as to add trapping on the processed color layers;
generating a network layout corresponding to each color layer according to the production parameters, the part of the plurality of color layers subjected to the trapping treatment and the other part of the plurality of color layers not subjected to the trapping treatment;
determining a bottoming layer based on the contour information of the pattern in the image;
generating a screen printing plate corresponding to the bottoming layer according to the production parameters and the bottoming layer;
generating a virtual trial printing effect graph according to the screen printing graph corresponding to the bottoming layer and the screen printing graph corresponding to each color layer, and performing error check on each screen printing graph;
displaying the virtual trial printing effect graph and the error check result corresponding to each screen printing graph;
and controlling the action of the execution component based on each screen printing graph so as to respectively process different silk screens to obtain a plurality of screen printing plates for printing.
CN202110327794.2A 2021-03-26 2021-03-26 Network layout generation method, production system, image processing method and device Active CN113172986B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110327794.2A CN113172986B (en) 2021-03-26 2021-03-26 Network layout generation method, production system, image processing method and device

Publications (2)

Publication Number and Publication Date:
CN113172986A (en): 2021-07-27
CN113172986B (en): 2022-12-06

Family

ID=76922429





Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
CB02: Change of applicant information
Address after: Room 554, 5/F, building 3, 969 Wenyi West Road, Wuchang Street, Yuhang District, Hangzhou City, Zhejiang Province
Applicant after: Alibaba (China) Co.,Ltd.
Address before: 310052 room 508, 5th floor, building 4, No. 699 Wangshang Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province
Applicant before: Alibaba (China) Co.,Ltd.
GR01: Patent grant