CN113752695B - Image data generating apparatus - Google Patents

Image data generating apparatus

Info

Publication number
CN113752695B
Authority
CN
China
Prior art keywords
image data
layer
information
area
drawing area
Prior art date
Legal status
Active
Application number
CN202110602879.7A
Other languages
Chinese (zh)
Other versions
CN113752695A (en)
Inventor
谷内华菜
Current Assignee
Sumitomo Heavy Industries Ltd
Original Assignee
Sumitomo Heavy Industries Ltd
Priority date
Filing date
Publication date
Application filed by Sumitomo Heavy Industries Ltd filed Critical Sumitomo Heavy Industries Ltd
Publication of CN113752695A
Application granted
Publication of CN113752695B

Classifications

    • G06F 3/1242: Print job management; image or content composition onto a page
    • G06F 3/1241: Print job management; dividing a job according to job requirements, e.g. black/white and colour pages, covers and body of books, tabs
    • G06F 3/125: Print job management; page layout or assigning input pages onto output media, e.g. imposition
    • G06F 3/1253: Print job management; configuration of print job parameters, e.g. using UI at the client
    • B41J 2/01: Ink jet (selective printing mechanisms characterised by bringing liquid or particles selectively into contact with a printing material)
    • B41J 29/393: Devices for controlling or analysing the entire machine; controlling or analysing mechanical parameters involving printing of test patterns

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Record Information Processing For Printing (AREA)
  • Ink Jet (AREA)
  • Processing Or Creating Images (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The present invention provides an image data generating device capable of generating image data that defines a dot pattern in which a plurality of dots are distributed, without converting from image data in a vector format. Data is input through an input device. A processing device lets the user input, from the input device, distribution rule information designating a rule for distributing a plurality of dots on the surface to be drawn. The processing device then generates image data specifying the positions of the dots distributed on the surface to be drawn according to the rule designated by the distribution rule information.

Description

Image data generating apparatus
The present application claims priority based on Japanese Patent Application No. 2020-095683, filed on June 1, 2020. The entire contents of this Japanese application are incorporated by reference into this specification.
Technical Field
The present invention relates to an image data generating apparatus.
Background
Insulating dots (dot spacers) for preventing a short circuit between two conductive films are used in resistive film type touch panels. The insulating dots consist of a plurality of dots regularly distributed in a plane, each made of an insulating material. Patent document 1 discloses a technique of forming insulating dots by ejecting ink onto the insulating-dot formation surface using an inkjet printer. When a desired pattern is drawn using an inkjet printer, raster-format image data is used to control the discharge of ink from the inkjet head. Raster-format image data is typically generated by converting vector-format data created with CAD.
Patent document 1: japanese patent application laid-open No. 2004-139162
When image data in a vector format is converted into image data in a raster format, one dot of the vector-format image data may be allocated to two pixels of the raster-format image data. A dot may also be allocated to a pixel that is offset by one pixel from the pixel to which it should be allocated.
Disclosure of Invention
The present invention provides an image data generating device capable of generating image data defining a dot pattern in which a plurality of dots are distributed without converting from image data in a vector format.
According to an aspect of the present invention, there is provided an image data generating apparatus including:
an input device for inputting data; and
a processing device,
wherein the processing device has the following functions:
letting the user input, from the input device, distribution rule information designating a rule for distributing a plurality of dots on the surface to be drawn; and
generating image data that specifies the positions of the dots distributed on the surface to be drawn according to the rule designated by the distribution rule information.
By inputting the range information of the drawing area and the distribution rule information, image data defining a dot pattern in which a plurality of dots are distributed can be generated without creating vector-format image data.
Drawings
Fig. 1 (a) is a schematic front view of an image data generating device and an ink applying device for drawing using image data generated by the image data generating device according to an embodiment, and fig. 1 (B) is a diagram showing a positional relationship of a movable table, an ink discharge unit, and an image pickup device in a plan view.
Fig. 2 is a plan view showing a pattern formed on the surface of a substrate by ink.
Fig. 3 is a block diagram of an image data generating apparatus according to the present embodiment.
Fig. 4 is a diagram showing a window for inputting printing conditions.
Fig. 5 is a diagram showing a window in which the drawing information file reading tab is activated.
Fig. 6 is a diagram showing a window when the drawing information setting tab is activated.
Fig. 7 is a diagram showing a window when the drawing information editing tab is activated.
Fig. 8 is a diagram showing a window when the drawing information deletion tab is activated.
Fig. 9 (a) is a diagram showing image data of the 1 st layer generated by performing the 1 st step on a two-dimensional plane of a pixel coordinate system, fig. 9 (B) is a diagram showing image data of the 2 nd layer generated by performing the 2 nd step on a two-dimensional plane of a pixel coordinate system, fig. 9 (C) is a diagram showing image data of the 1 st layer generated by performing the 3 rd step on a two-dimensional plane of a pixel coordinate system, and fig. 9 (D) is a diagram showing composite image data generated by performing the 4 th step on a two-dimensional plane of a pixel coordinate system.
Fig. 10 (a) is a diagram showing image data of the 2 nd layer generated by performing the 2 nd step according to another embodiment on a two-dimensional plane of a pixel coordinate system, fig. 10 (B) is a diagram showing image data of the 2 nd layer after the 2 nd drawing region is expanded on a two-dimensional plane of a pixel coordinate system, fig. 10 (C) is a diagram showing image data of the 1 st layer generated by performing the 3 rd step on a two-dimensional plane of a pixel coordinate system, and fig. 10 (D) is a diagram showing synthetic image data generated by performing the 4 th step on a two-dimensional plane of a pixel coordinate system.
Fig. 11 (a) shows image data of the 2 nd layer generated by performing the 2 nd step according to still another embodiment on a two-dimensional plane of a pixel coordinate system, fig. 11 (B) shows image data of the 2 nd layer after the 2 nd drawing region is expanded on a two-dimensional plane of a pixel coordinate system, fig. 11 (C) shows image data of the 1 st layer generated by performing the 3 rd step on a two-dimensional plane of a pixel coordinate system, and fig. 11 (D) shows composite image data generated by performing the 4 th step on a two-dimensional plane of a pixel coordinate system.
Fig. 12 (a) shows image data of the 1 st layer generated by performing the 1 st step according to still another embodiment on a two-dimensional plane of a pixel coordinate system, fig. 12 (B) shows image data of the 2 nd layer generated by performing the 2 nd step and a blank area to be secured on a two-dimensional plane of a pixel coordinate system, fig. 12 (C) shows image data of the 1 st layer generated by performing the 3 rd step on a two-dimensional plane of a pixel coordinate system, and fig. 12 (D) shows composite image data generated by performing the 4 th step on a two-dimensional plane of a pixel coordinate system.
Fig. 13 (a) shows image data of the 2 nd layer generated by performing the 2 nd step according to still another embodiment on a two-dimensional plane of a pixel coordinate system, fig. 13 (B) shows image data of the 2 nd layer after shifting a plurality of points in a translational manner on a two-dimensional plane of a pixel coordinate system, fig. 13 (C) shows image data of the 1 st layer generated by performing the 3 rd step on a two-dimensional plane of a pixel coordinate system, and fig. 13 (D) shows composite image data generated by performing the 4 th step on a two-dimensional plane of a pixel coordinate system.
Fig. 14 (a) is a diagram showing the arrangement of the drawing region of the 1 st layer designated by the image data of the 1 st layer according to still another embodiment, fig. 14 (B) is a diagram showing the arrangement of the drawing region of the 2 nd layer designated by the image data of the 2 nd layer, and fig. 14 (C) is a diagram showing the arrangement of the drawing region of the 1 st layer and the drawing region of the 2 nd layer designated by the composite image data.
Fig. 15 (a) and (B) are diagrams showing windows for inputting drawing information according to still another embodiment.
Fig. 16 (a) is a diagram showing the positional relationship between the drawn surface and two 1 st drawing areas arranged on the 1 st layer, fig. 16 (B) is a diagram showing the 2 nd drawing area on the 2 nd layer, and fig. 16 (C) is a schematic diagram for explaining a process before the data generating unit (fig. 3) executes the 3 rd step.
Fig. 17 is a diagram showing a window for inputting drawing information according to still another embodiment.
Fig. 18 is a diagram showing the synthesized image data on a two-dimensional plane of a pixel coordinate system.
Fig. 19 (a) is a diagram showing the arrangement of the 1 st drawing region arranged on the 1 st layer, fig. 19 (B) is a diagram showing the arrangement of the 2 nd drawing region arranged on the 2 nd layer, fig. 19 (C) is a diagram showing the heights of the drawing regions and dots specified by the synthesized image data, and fig. 19 (D) to (F) are diagrams schematically showing the printing data of the 1 st to 3 rd times on the two-dimensional plane of the pixel coordinate system, respectively.
Fig. 20 (a) is a diagram showing composite image data according to still another embodiment on a two-dimensional plane of a pixel coordinate system, and fig. 20 (B) is a diagram showing composite image data according to a modification of the embodiment of fig. 20 (a) on a two-dimensional plane of a pixel coordinate system.
Fig. 21 is a diagram showing the arrangement of a 1 st drawing region of a 1 st layer and a 2 nd drawing region of a 2 nd layer specified by composite image data generated by an image data generating apparatus according to still another embodiment.
Fig. 22 is a diagram showing the arrangement of drawing areas designated by synthesized image data generated by the image data generating apparatus according to still another embodiment.
In the figure: 10-base, 11-moving mechanism, 11X-X-direction moving mechanism, 11Y-Y-direction moving mechanism, 12-movable table, 13-supporting member, 20-substrate, 21-pass region, 22-dot, 23-drawn surface, 24-reference point of a drawing region, 24A-reference point of the 1 st drawing region, 24B-reference point of the 2 nd drawing region, 25-drawing region, 26A, 26B, 26C, 26D-1 st drawing region of the 1 st layer, 27A, 27B, 27C, 27D, 27E, 27F-2 nd drawing region of the 2 nd layer (inner deep portion), 28-3 rd drawing region of the 3 rd layer, 29A, 29B-peripheral edge portion, 30-ink discharge unit, 31-inkjet head, 32-nozzle, 33-curing light source, 40-image pickup device, 50-control device, 51-storage unit, 52-control unit, 60-image data generation device, 61-processing device, 62-input control unit, 63-data generation unit, 64-output control unit, 65-drawing condition storage unit, 66-image data storage unit, 67-input device, 68-output device, 70-input window, 72-input area, 73-printing condition display area, 74-drawing information list display area, 75-save button, 76-image data production start button, 81-input frame, 82-printing condition application button, 83-file read-in button, 84-input frame, 85-drawing information application button, 86-delete button, 87-input box for inputting the overlapping drawing area ID of the lower layer, 88-input box for inputting dot height, 90-pixel, 91-1 st layer image data, 92-2 nd layer image data, 93-composite image data, 95-blank area, 96x, 96y-area where no dot is arranged, 97-blank area, 101-printing data of the 1 st time, 102-printing data of the 2 nd time, 103-printing data of the 3 rd time.
Detailed Description
An image data generating apparatus according to an embodiment will be described with reference to fig. 1 to 9.
Fig. 1 (a) is a schematic front view of an image data generating device and an ink applying device for drawing using image data generated by the image data generating device according to the present embodiment. A movable table 12 is supported on the base 10 via a moving mechanism 11. An xyz orthogonal coordinate system is defined in which the x-axis and the y-axis are oriented in the horizontal direction and the z-axis is oriented downward in the vertical direction. The control device 50 controls the movement mechanism 11 to move the movable table 12 in both the x-direction and the y-direction. As the moving mechanism 11, for example, an XY stage including an X-direction moving mechanism 11X and a Y-direction moving mechanism 11Y can be used. The X-direction moving mechanism 11X moves the Y-direction moving mechanism 11Y in the X-direction relative to the base 10, and the Y-direction moving mechanism 11Y moves the movable table 12 in the Y-direction relative to the base 10.
A substrate 20 to be coated with ink is held on the upper surface (holding surface) of the movable table 12. The substrate 20 is fixed to the movable table 12 by, for example, a vacuum chuck. The moving mechanism 11 moves the substrate 20 held on the movable stage 12 in a direction parallel to the xy-plane. Above the movable table 12, the ink discharge unit 30 and the imaging device 40 are supported by, for example, a door-shaped support member 13. The ink discharge unit 30 and the imaging device 40 are supported so as to be movable up and down with respect to the base 10. The ink discharge unit 30 has a plurality of nozzles facing the substrate 20. Each nozzle forms a droplet of photocurable (e.g., UV-curable) ink and discharges it toward the surface of the substrate 20. The discharge of the ink is controlled by the control device 50.
The imaging device 40 images the upper surface (ink-coated surface) of the substrate 20. The image pickup device 40 picks up an image of a region of the upper surface of the substrate 20 that is located within the angle of view of the image pickup device 40.
The control device 50 includes a storage unit 51 and a control unit 52. The storage unit 51 stores information (hereinafter referred to as image data) specifying the positions at which ink is to be applied. The image data is composed of a plurality of pixels respectively associated with a plurality of positions on the surface of the substrate 20. Each pixel is associated with information specifying whether or not ink is to be applied, which designates the pixels onto which ink should be dropped. In this specification, the positions on the substrate corresponding to the pixels of the image data are sometimes simply referred to as "pixels".
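As a rough illustrative sketch (not part of the patent disclosure; the array shape and names are our own assumptions), such raster image data can be modeled as a two-dimensional array holding one ink/no-ink flag per pixel:

```python
import numpy as np

# Minimal sketch of raster-format image data: one binary flag per pixel.
# 1 = drop ink on this pixel, 0 = apply no ink.
# Shape is (pixels in y, pixels in x) over the drawn surface.
image_data = np.zeros((1200, 1600), dtype=np.uint8)

# Designate a single pixel for ink application (a candidate isolated dot).
image_data[10, 20] = 1
```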
The control unit 52 controls the moving mechanism 11 and the ink discharge unit 30 based on the image data so that the ink lands at the predetermined positions on the surface of the substrate 20. A dot pattern or a film pattern made of ink is thereby formed on the surface of the substrate 20. In this specification, a dot formed by ink dropped on one pixel that is not continuous with, and is isolated from, the ink dropped on other pixels is referred to as an "isolated dot".
The image data generating device 60 generates image data based on various drawing conditions input by a user. The generated image data is input to the control device 50. The image data is input from the image data generating device 60 to the control device 50 using a removable medium, a communication network such as LAN, short-range wireless communication such as bluetooth (registered trademark), or the like.
Fig. 1 (B) is a diagram showing a positional relationship among the movable table 12, the ink discharge unit 30, and the imaging device 40 in a plan view. The substrate 20 is held on the holding surface of the movable table 12. Above the substrate 20, an ink discharge unit 30 and an imaging device 40 are supported. The ink discharge unit 30 includes an inkjet head 31 and a curing light source 33. A plurality of nozzles 32 are provided on a surface of the inkjet head 31 facing the substrate 20. The plurality of nozzles 32 are arranged at equal intervals in the x direction. The interval is, for example, a size corresponding to a resolution of 600 dpi. The plurality of nozzles 32 need not necessarily be arranged on a straight line parallel to the x-direction, and may be arranged at positions regularly offset from a reference line parallel to the x-direction toward the y-direction. For example, it may be arranged in a staggered configuration.
The curing light sources 33 are disposed on both sides of the inkjet head 31 in the y direction, respectively, and irradiate light for curing the ink applied on the substrate 20 toward the substrate 20. For example, when the ink is an ultraviolet curable ink, the curing light source 33 irradiates ultraviolet rays toward the substrate 20. The curing light source 33 functions as a curing device that cures the ink applied to the substrate 20.
The control device 50 controls the moving mechanism 11 to move the movable table 12 in the x-direction and the y-direction. The control device 50 controls the discharge of ink from the nozzles 32 of the inkjet head 31.
The ink can be applied to the substrate 20 by ejecting the ink from the inkjet head 31 while moving the substrate 20 in the y direction (in other words, while relatively moving the inkjet head 31 with respect to the substrate 20 in the y direction), and the resolution in the x direction can be set to 600dpi, for example. The ink dropped on the substrate 20 is cured by light emitted from the curing light source 33 located on the downstream side in the moving direction of the substrate 20. The operation of dropping ink from the inkjet head 31 to the substrate 20 while moving the substrate 20 in the y direction is referred to as a "scanning operation". The y-direction is referred to as the scan direction.
The inkjet head 31 may be reciprocated at least once in the y direction in one scanning action. At this time, by shifting the inkjet head 31 by 1/2 of the pitch of the nozzles 32 with respect to the substrate 20 in the x-direction during the forward and backward travel, the ink can be dropped onto the substrate 20 at a resolution of 1200 dpi. By setting the offset of the inkjet heads 31 to 1/4 of the pitch of the nozzles 32 and reciprocating the inkjet heads 31 twice, ink can be dropped onto the substrate 20 at 2400 dpi. In this way, when the inkjet head 31 is moved a plurality of times in the negative and positive directions of the y-axis in order to improve the resolution, the plurality of movements are also referred to as a single scanning operation.
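As a numeric sketch of this interleaving (our own illustration, not taken from the patent; the pitch values are the nominal ones discussed here): offsetting the return pass by half the nozzle pitch doubles the effective x resolution.

```python
# Hypothetical illustration of pass interleaving at a nominal 600 dpi pitch.
NOZZLE_PITCH_MM = 25.4 / 600  # nominal nozzle pitch of a 600 dpi head

# x positions covered by three nozzles on the forward pass...
forward = [n * NOZZLE_PITCH_MM for n in range(3)]
# ...and on the return pass, offset by half the nozzle pitch in x.
backward = [x + NOZZLE_PITCH_MM / 2 for x in forward]

columns = sorted(forward + backward)
# Adjacent drop columns are now 25.4/1200 mm apart, i.e. 1200 dpi in x.
assert all(abs(b - a - 25.4 / 1200) < 1e-12 for a, b in zip(columns, columns[1:]))
```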
When the one-time scanning operation is completed, the control device 50 moves the movable table 12 in the x-direction. In other words, the inkjet head 31 is relatively moved in the x-direction with respect to the substrate 20. This action is referred to as "displacement action". The x-direction is referred to as the displacement direction. By repeating the scanning operation and the displacement operation, the ink can be applied to the entire region of the substrate 20. The relative movement amount of the inkjet head 31 with respect to the substrate 20 in the x direction may be set to be substantially equal to the distance between the two nozzles 32 located at both ends in the x direction. When a part of the nozzles 32 near both ends is not used for discharging ink, the relative movement amount may be substantially equal to the distance between the nozzles 32 located at both ends among the nozzles 32 actually used.
By performing the scanning operation a plurality of times without performing the displacement operation, ink can be dropped a plurality of times onto one pixel. That is, the ink can be applied repeatedly. By applying the ink repeatedly, the isolated dots can be made taller and the film thicker.
When the scanning operation is performed in a state where light is irradiated from the curing light source 33 to the substrate 20, the ink discharged from the inkjet head 31 is immediately dropped on the substrate 20 and then is temporarily cured. Specifically, after the ink drops, the ink is temporarily cured in a short time until the drop point of the ink moves into the path of the light irradiated from the curing light source 33. Before the ink is temporarily cured, the ink dropped on the substrate 20 spreads in the in-plane direction of the substrate 20. The extent of this diffusion depends on the extent of lyophilic properties of the surface of the substrate 20. After the ink is temporarily cured, the ink does not spread in the in-plane direction of the substrate 20.
Fig. 2 is a plan view showing an example of a pattern formed on the surface of the substrate 20 by ink. In the present embodiment, an insulating dot (dot spacer) for a resistive film type touch panel is formed. Four touch panels are bonded to one substrate 20. The surface (upper surface) of the substrate 20 to which the ink is applied is referred to as a drawn surface 23. Four drawing areas 25 having a square or rectangular shape are defined on the drawing surface 23 corresponding to the four touch panels, respectively. The four sides of the drawn surface 23 and the four sides of the respective drawing areas 25 are parallel to the x-direction (displacement direction) or the y-direction (scanning direction).
As an example, one vertex of the drawn surface 23 is defined as the origin O of the xy coordinate system. In fig. 2, the upper left vertex is defined as the origin O, the right direction is defined as the positive direction of the x-axis, and the lower direction is defined as the positive direction of the y-axis. In each drawing region 25, a reference point 24 whose relative position to the drawing region 25 is fixed is defined. As the reference point 24, for example, a vertex closest to the origin O among four vertices of each drawing area is used. In fig. 2, the top left vertex of each drawing area 25 becomes the reference point 24. By specifying the position of the reference point 24 in the xy coordinate system, the position of the drawing region 25 on the drawn surface 23 can be specified.
The drop position of the ink discharged from the inkjet head 31 (fig. 1 (B)) is specified in pixel coordinates defined on the drawn surface 23. The pixel having the origin of the xy coordinate system as one vertex corresponds to the origin (0, 0) of the pixel coordinate system.
In the drawing region 25, a plurality of isolated dots 22 are regularly distributed in the x-direction and the y-direction. The distribution rule of the dots 22 differs between the inner deep portion and the peripheral portion of the drawing area 25. The image data generating device 60 (fig. 1 (a)) generates image data specifying the positions of the plurality of dots 22 arranged in each drawing area 25.
Fig. 2 shows an example in which the dot density in the inner deep portion of the drawing area 25 is lower than that in the peripheral portion, but the distribution of the plurality of dots 22 is not limited to the example shown in fig. 2. There may be a case where the dot density in the inner deep portion of the drawing area 25 is higher than the dot density in the peripheral portion, or a case where the plurality of dots 22 are uniformly distributed over the entire area of the drawing area 25.
The area where ink can be applied by one scanning operation is referred to as a pass area 21. The plurality of pass areas 21 are arranged in the x-direction. In fig. 2, the area of one substrate 20 to which ink should be applied is covered by four pass areas 21. The number of pass areas 21 required depends on the size of the substrate 20 in the x-direction and the size of the inkjet head 31.
Fig. 3 is a block diagram of an image data generating apparatus 60 according to the present embodiment.
The image data generating device 60 includes a processing device 61, an input device 67, and an output device 68. The processing device 61 includes an input control unit 62, a data generation unit 63, an output control unit 64, a drawing condition storage unit 65, and an image data storage unit 66.
Various printing conditions and drawing information required for forming the dots 22 on the substrate 20 (fig. 2) are input from the input device 67. As the input device 67, for example, a keyboard, a display device and a pointing device, a communication device, a removable medium reading device, or the like can be used.
The input control unit 62 controls the input device 67 to allow the user to input printing conditions, drawing information, and the like from the input device 67. The input control unit 62 receives the printing conditions and drawing information input from the input device 67, and stores the printing conditions and drawing information in the drawing condition storage unit 65. The printing conditions and drawing information stored in the drawing condition storage unit 65 are supplied to the data generation unit 63 as necessary. The printing conditions include the size, drawing resolution, and conversion coefficient of the drawn surface 23 (fig. 2) of the substrate 20 (fig. 2). The drawing information includes range information specifying the range of each drawing region 25 (fig. 2), distribution rule information specifying the distribution rule of a plurality of points, and the like. The range information includes, for example, information specifying the position, shape, and size of the drawing area 25. The distribution rule information includes, for example, information specifying the pitch of the points 22 in the x-direction and the y-direction.
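Purely as an illustrative sketch of how these inputs could be held in memory (all field names are hypothetical, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class PrintingConditions:
    surface_w_mm: float   # size of the drawn surface in x [mm]
    surface_h_mm: float   # size of the drawn surface in y [mm]
    dpi_x: int            # drawing resolution in x [dpi]
    dpi_y: int            # drawing resolution in y [dpi]
    conv_x: float         # conversion coefficient in x (nominally 25.4)
    conv_y: float         # conversion coefficient in y (nominally 25.4)

@dataclass
class DrawingArea:
    layer: int            # layer identification information (1 = lowest)
    ref_x_mm: float       # reference point x (range information) [mm]
    ref_y_mm: float       # reference point y (range information) [mm]
    width_mm: float       # drawing-area size in x [mm]
    height_mm: float      # drawing-area size in y [mm]
    pitch_x_mm: float     # dot pitch in x (distribution rule information) [mm]
    pitch_y_mm: float     # dot pitch in y (distribution rule information) [mm]
```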
The data generation unit 63 generates raster-format image data from the range information, the distribution rule information, and the like stored in the drawing condition storage unit 65, and stores the generated image data in the image data storage unit 66. The output control section 64 outputs the image data stored in the image data storage section 66 to the output device 68. As the output device 68, for example, a removable medium writing device, a communication device, or the like can be used. The image data output from the output device 68 is read by the control device 50 (fig. 1 (a)) of the ink application device. The output device 68 includes a display screen, and displays image data on a two-dimensional plane of a pixel coordinate system. The output device 68 and the input device 67 may share a part of the components.
Next, a method of inputting printing conditions, drawing information, and the like from the input device 67 (fig. 3) will be described with reference to fig. 4 to 8. A window for inputting various information is displayed on the display screen of the input device 67.
Fig. 4 is a diagram showing a window 70 for inputting printing conditions. Various steps described below with respect to the window 70 are performed by the input control section 62 controlling the input device 67.
An input area 72 for performing "print condition setting", "drawing information file reading", "drawing information setting", "drawing information editing" and "drawing information deletion" is displayed in a tab form in the window 70. Fig. 4 shows a state in which the printing condition setting label is activated. In addition to the input area 72, a print condition display area 73, a drawing information list display area 74, a save button 75, and an image data creation start button 76 are displayed in the window 70.
The printing conditions registered as the basic information for generating the image data are displayed in the printing condition display area 73. A list of the drawing information registered as basic information for generating the image data is displayed in the drawing information list display area 74. When the user presses the save button 75, the currently registered printing conditions and drawing information are stored in the drawing condition storage unit 65 (fig. 3) under an attached file name. The file name can be freely set by the user. When the user presses the image data creation start button 76, the input control unit 62 (fig. 3) issues an instruction to create image data to the data generation unit 63 (fig. 3). Upon receiving the instruction, the data generation unit 63 generates image data based on the printing conditions and the drawing information stored in the drawing condition storage unit 65. The user can press the save button 75 and the image data creation start button 76 by a touch (tap) operation, a mouse click, or the like.
When the print condition setting tab is activated, the input control section 62 displays, in the input area 72, six input boxes 81 for inputting the dimensions of the drawn surface 23 (fig. 2) in the x-direction and the y-direction, the drawing resolutions in the x-direction and the y-direction, and the conversion coefficients in the x-direction and the y-direction. A print condition application button 82 is also displayed.
The dimensions of the drawn surface 23 are specified as lengths in the x-direction and the y-direction, in units of "mm". The drawing resolutions in the x-direction and the y-direction are input by a pull-down menu, in units of "dpi". A pixel coordinate system is defined on the drawn surface 23 (fig. 2) according to the specified resolution.
Next, the conversion coefficient will be described. Since 1 inch = 25.4 mm, the standard conversion coefficient C0 for converting inches into millimeters is 25.4. If a position in the xy coordinate system is specified in millimeters, its pixel coordinates in the pixel coordinate system can be calculated from the xy coordinates of the position, the conversion coefficient C0, and the specified resolution.
For example, the nozzle pitch of an inkjet head having a resolution of 600 dpi is nominally 25.4/600 mm. However, depending on the inkjet head 31, the design value of the pitch of the nozzles 32 may deviate slightly from the pitch corresponding to the resolution. For example, even if the resolution of the inkjet head is 600 dpi in the specification, the design value of the actual nozzle pitch may be 0.04225 (= 25.35/600) mm.
If the design value of the nozzle pitch is denoted Pn, then when an inkjet head having a resolution of 600 dpi is used, the number of pixels allocated to a length Lx [mm] in the x direction of the surface to be drawn is (Lx/C0) × 600 [pixels]. The length Lxc [mm] actually covered by this number of pixels is Lxc = (Lx/C0) × 600 × Pn [mm].
When an inkjet head having a resolution of 600 dpi and a nozzle-pitch design value Pn of 0.04225 mm is used, applying the standard conversion coefficient C0 to convert an x coordinate expressed in millimeters into a pixel coordinate may therefore not yield the correct pixel coordinate. In the present embodiment, instead of the standard conversion coefficient C0, an accurate conversion coefficient based on the design value of the nozzle pitch of the inkjet head actually used is input as the conversion coefficient. For example, when an inkjet head having a resolution of 600 dpi and a nozzle pitch of 0.04225 mm is used, 25.35 (= 0.04225 × 600) may be given as the conversion coefficient. When an x coordinate expressed in millimeters is converted into a pixel coordinate using this conversion coefficient, an accurate pixel coordinate is obtained. The conversion coefficient is input, for example, by a pull-down menu. The menu list preferably includes the conversion coefficient corresponding to the design value of the nozzle pitch of the inkjet head actually used as well as the standard conversion coefficient C0.
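A minimal sketch of this conversion (our own illustration with hypothetical names; the patent specifies only the arithmetic):

```python
def mm_to_pixel(coord_mm: float, dpi: int, conv: float) -> int:
    """Convert a coordinate in millimeters into a pixel index.

    conv is the conversion coefficient: 25.4 for an ideal head, or
    (nozzle-pitch design value) * dpi for the head actually used.
    """
    return round(coord_mm / conv * dpi)

# Ideal 600 dpi head: a 25.4 mm length spans 600 pixels.
assert mm_to_pixel(25.4, 600, 25.4) == 600

# Head whose actual nozzle pitch is 0.04225 mm (= 25.35/600):
# with conv = 25.35, a physical length of 25.35 mm spans 600 pixels.
assert mm_to_pixel(25.35, 600, 25.35) == 600
```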
When the user presses the print condition application button 82, the input control unit 62 (fig. 3) registers the information of the current time input to the input frame 81 as the print condition for generating the image data, and displays the registered print condition in the print condition display area 73.
Fig. 5 is a diagram showing the window 70 when the drawing information file reading tab is activated. The input area 72 displays a list of the file names and update dates and times of the drawing information files stored in the drawing condition storage unit 65 (fig. 3), together with the names of the folders in which they are stored. A file read button 83 is also displayed in the input area 72.
When the user selects one file from the list and presses the file read button 83, the input control unit 62 (fig. 3) reads the designated file from the drawing condition storage unit 65 (fig. 3) and displays the content in the printing condition display area 73 and the drawing information list display area 74.
Fig. 6 is a diagram showing a window 70 when the drawing information setting tab is activated. The input area 72 has seven input boxes 84 in total for inputting layer identification information of a specified layer, positions (reference point positions) of the reference points 24 (fig. 2) in the x-direction and the y-direction, sizes (drawing area sizes) of the drawing area 25 (fig. 2) in the x-direction and the y-direction, and dot pitches in the x-direction and the y-direction. A drawing information application button 85 is also displayed.
The layer identification information is an integer of 1 or more and is input in a pull-down menu manner. The function of the information of the designated layer will be described later with reference to fig. 9 (a) to (D). The position of the reference point 24 (fig. 2) is specified by the x-coordinate and the y-coordinate of the xy-coordinate system having one vertex of the drawn surface 23 (fig. 2) as the origin O. The x-coordinate and the y-coordinate are specified in millimeter units. The size of the drawing area 25 is specified by the length of the side of the drawing area 25 (fig. 2) extending in the x-direction and the length of the side extending in the y-direction. The length of these sides is specified in millimeters. Information including position information of the reference point 24 and length information of sides indicating the size of the drawing area 25 may be referred to as "range information" designating the range of the drawing area 25.
The dot pitch in the x-direction and the dot pitch in the y-direction are defined as the distance between the centers of two adjacent dots in the x-direction and the distance between the centers of two adjacent dots in the y-direction, respectively. Dot pitch is specified in millimeter units. The pitch information specifying the pitch of the dots is sometimes referred to as "distribution rule information" specifying the distribution rule of the plurality of dots.
When the user presses the drawing information application button 85, one drawing area 25 (fig. 2) is newly registered as basic information for generating image data based on the information input to the input box 84 at the current time. When one drawing area 25 is newly registered, the input control unit 62 gives drawing area identification information (drawing area ID) to the newly registered drawing area 25. The drawing area ID is information for distinguishing a plurality of drawing areas 25 in one layer, and is composed of, for example, integers sequentially allocated from 1. The input control unit 62 additionally displays the drawing area ID of the newly registered drawing area 25 in the drawing information list display area 74.
Fig. 7 is a diagram showing the window 70 when the drawing information editing tab is activated. The content displayed in the input area 72 is the same as when the drawing information setting tab is activated (fig. 6). When the user selects one drawing area from the drawing information list display area 74, the input control unit 62 displays the layer information, range information, and distribution rule information of the selected drawing area 25 in the input area 72 so that the user can modify them.
The user can modify the layer information, the range information, and the distribution rule information displayed in the input area 72 by operating the input device 67 (fig. 3). When the user presses the drawing information application button 85 after modifying the information, the input control unit 62 rewrites the drawing information of the selected drawing area 25 with the modified drawing information. When the layer information is modified, the input control unit 62 reassigns drawing area IDs to all currently registered drawing areas 25.
Fig. 8 is a diagram showing the window 70 when the drawing information deletion tab is activated. When the user selects one drawing area 25 from the drawing information list display area 74, the input control unit 62 displays the drawing information of the selected drawing area 25 grayed out in the input area 72, so the user cannot modify it. A delete button 86 is also displayed in the input area 72.
If the user presses the delete button 86, the registration of the currently selected drawing area 25 is erased. Drawing area IDs are then reassigned to the remaining registered drawing areas 25. The input control unit 62 also removes the erased drawing area 25 from the drawing information list display area 74.
Next, the procedure by which the data generating unit 63 (fig. 3) generates image data will be described with reference to (a) to (D) in fig. 9. One drawing area is set on each of two layers. The two layers are referred to as layer 1 and layer 2; a 1 st drawing area 26 is set in layer 1 and a 2 nd drawing area 27 is set in layer 2. Layer 1 lies below layer 2.
The data generating unit 63 executes step 1 of generating the image data 91 of layer 1 based on the drawing information of the layer 1 drawing region 26 of layer 1. The image data 91 is raster-format image data composed of a plurality of pixels 90.
Fig. 9 (a) is a diagram showing pixels 90 arranged inside the 1 st drawing region 26 among a plurality of pixels 90 constituting the 1 st layer image data 91 generated by executing the 1 st step. The image data 91 of layer 1 designates pixels corresponding to the plurality of points 22 distributed in the drawing area 26 of layer 1. In fig. 9 (a), the pixel 90 of the arrangement point 22 is blackened. In the following drawings, the pixels 90 of the arrangement points 22 are also darkened.
In the drawing area 1, a plurality of pixels 90 are arranged in a matrix in the x-direction and the y-direction in the drawing area 26. The pitches (hereinafter, referred to as pixel pitches) of the plurality of pixels 90 in the x-direction and the y-direction are obtained from the drawing resolution and the conversion coefficient of the printing conditions shown in fig. 4. When the drawing resolution in the x direction is 2400dpi and the conversion coefficient is 25.35, the pixel pitch in the x direction is 25.35/2400= 0.0105625mm. When the drawing resolution in the y direction is 2400dpi and the conversion coefficient is 25.4, the pixel pitch in the y direction is 25.4/2400= 0.01058333mm.
The position of the reference point 24A of the 1 st drawing area 26 on the drawn surface 23 (fig. 2) is determined by converting the xy coordinates of the reference point position input in the window 70 shown in fig. 6 into pixel coordinates. The size of the 1 st drawing area 26 in the x-direction and the y-direction can be obtained from the drawing area size inputted into the window 70 shown in fig. 6. The range of the 1 st drawing area 26 can be defined by, for example, the position of the reference point 24A and the position of the pixel diagonally opposite to the reference point 24A. The pixel coordinates of the pixels located diagonally with respect to the reference point 24A can be calculated from the xy coordinates of the reference point position input in the window 70 shown in fig. 6 and the drawing region sizes in the x direction and the y direction expressed in millimeter units.
In step 1, the data generating unit 63 arranges the plurality of points 22 at a predetermined point pitch in the x-direction and the y-direction with the reference point 24A of the 1 st drawing area 26 as a starting point. Hereinafter, the pixel 90 at which the dot 22 is arranged may be simply referred to as "dot 22". The dot pitch is equal to the distance between the centers of the pixels 90 corresponding to the adjacent two dots 22.
Fig. 9 (a) shows an example in which two pixels 90 are arranged between two adjacent points 22 in the x-direction and the y-direction, but when insulating points of a resistive film type touch panel are formed, the number of pixels between the two adjacent points 22 is usually more than two. In addition, since the pixel pitch in the x direction and the pixel pitch in the y direction are not the same in a strict sense, even if the dot pitch in the x direction and the dot pitch in the y direction expressed in millimeter units are the same, the number of pixels between two adjacent dots 22 in the x direction and the number of pixels between two adjacent dots 22 in the y direction are not necessarily the same.
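The following sketch (ours, reusing the hypothetical records from the earlier sketch; none of it is taken verbatim from the patent) shows how the 1 st step could be implemented: convert the reference point, area size, and dot pitch into pixel units, then mark every pitch-th pixel starting from the reference point.

```python
import numpy as np

def generate_layer(area, cond):
    """Sketch of the 1 st step: rasterize one drawing area onto its layer.

    area and cond follow the hypothetical DrawingArea / PrintingConditions
    records sketched earlier; returns a binary raster of the drawn surface.
    """
    w_px = round(cond.surface_w_mm / cond.conv_x * cond.dpi_x)
    h_px = round(cond.surface_h_mm / cond.conv_y * cond.dpi_y)
    layer = np.zeros((h_px, w_px), dtype=np.uint8)

    # Reference point and diagonally opposite corner, in pixel coordinates.
    x0 = round(area.ref_x_mm / cond.conv_x * cond.dpi_x)
    y0 = round(area.ref_y_mm / cond.conv_y * cond.dpi_y)
    x1 = round((area.ref_x_mm + area.width_mm) / cond.conv_x * cond.dpi_x)
    y1 = round((area.ref_y_mm + area.height_mm) / cond.conv_y * cond.dpi_y)

    # Dot pitch expressed in pixels (at least one pixel).
    px = max(1, round(area.pitch_x_mm / cond.conv_x * cond.dpi_x))
    py = max(1, round(area.pitch_y_mm / cond.conv_y * cond.dpi_y))

    # Place dots at the specified pitch, starting at the reference point.
    layer[y0:y1:py, x0:x1:px] = 1
    return layer
```

Note that if the dot pitch equals the pixel pitch, px and py become 1 and every pixel of the area is marked, which corresponds to the film (overcoat) case discussed later.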
After executing step 1, the data generating section 63 executes step 2 of generating the image data 92 of layer 2 from the drawing information of the drawing area 27 of layer 2.
Fig. 9 (B) is a diagram showing pixels 90 disposed inside the 1 st drawing region 26 among a plurality of pixels 90 constituting the 2 nd layer image data 92 generated by executing the 2 nd step. The 2 nd drawing region 27 is included in the 1 st drawing region 26, and is located at the inner deep portion of the 1 st drawing region 26.
As with the 1 st drawing area 26, the position of the reference point 24B, the dimensions in the x-direction and the y-direction, and the dot pitch of the 2 nd drawing area 27 are set by user input while the window 70 shown in fig. 6 is displayed. The dot pitch of the 2 nd drawing area 27 is wider than that of the 1 st drawing area 26. As in the 1 st drawing area 26, the plurality of dots 22 are arranged at the specified dot pitch in the x-direction and the y-direction with the reference point 24B of the 2 nd drawing area 27 as the starting point. The image data 92 of the 2 nd layer designates the positions of the plurality of dots 22 distributed in the 2 nd drawing area 27.
After the 1 st step (fig. 9 (a)) and the 2 nd step (fig. 9 (B)) are executed, the data generating unit 63 executes the 3 rd step described below.
Fig. 9 (C) is a diagram showing the pixels 90 arranged inside the 1 st drawing region 26 among the plurality of pixels 90 constituting the 1 st layer image data 91 after executing the 3 rd step. In the 3 rd step, the data generating unit 63 (fig. 3) deletes, from the image data 91 of the 1 st layer (i.e., the image data of the relatively lower layer), the dots 22 defined inside the 2 nd drawing region 27 of the relatively upper layer. Thus, all pixels 90 located inside the 2 nd drawing area 27 among the plurality of pixels 90 of the image data 91 of the 1 st layer become pixels where no dots 22 are arranged.
After executing the 3 rd step (fig. 9 (C)), the data generating section 63 executes the 4 th step described below.
Fig. 9 (D) is a diagram showing the pixels 90 arranged inside the 1 st drawing region 26 among the plurality of pixels 90 constituting the synthesized image data 93 generated by executing the 4 th step. In the 4 th step, the data generating section 63 generates the composite image data 93, which specifies the positions of the dots 22 specified by either the image data 91 of the 1 st layer after the 3 rd step (fig. 9 (C)) or the image data 92 of the 2 nd layer (fig. 9 (B)). For example, the synthesized image data 93 is generated by assigning "1" to each pixel 90 where a dot 22 is arranged and "0" to each pixel 90 where no dot is arranged, and taking the logical sum (OR), pixel by pixel, of the image data 91 of the 1 st layer and the image data 92 of the 2 nd layer.
The 3 rd step and the 4 th step are equivalent to the processing of overlaying the pixels 90 of the 2 nd drawing area 27 of the image data 92 of the 2 nd layer on the pixels 90 of the 2 nd drawing area 27 of the image data 91 of the 1 st layer.
As a result, as shown in fig. 9 (D), a plurality of dots 22 are arranged inside the 2 nd drawing area 27 at the dot pitch specified by the drawing information of the 2 nd drawing area 27. In the region (hereinafter, referred to as the peripheral edge portion 29) outside the 2 nd drawing region 27 inside the 1 st drawing region 26, a plurality of dots 22 are arranged at a dot pitch specified by drawing information of the 1 st drawing region 26. In fig. 9 (D), the peripheral edge 29 is hatched. The 2 nd drawing region 27 is sometimes referred to as an inner deep portion with respect to the peripheral edge 29.
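A compact sketch of the 3 rd and 4 th steps under the same assumptions as the earlier snippets (region_mask is a binary raster that is 1 inside the 2 nd drawing region):

```python
import numpy as np

def composite(lower, upper, region_mask):
    """Sketch of the 3 rd and 4 th steps.

    lower, upper: binary rasters for the 1 st and 2 nd layers.
    region_mask: 1 inside the 2 nd drawing region (possibly expanded,
    as in the embodiment described later), 0 elsewhere.
    """
    # 3 rd step: delete lower-layer dots inside the upper drawing region.
    lower = lower * (1 - region_mask)
    # 4 th step: pixel-by-pixel logical sum (OR) of the two layers.
    return np.maximum(lower, upper)
```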
Next, the excellent effects of the above-described embodiments will be described.
In the above embodiment, the user can generate the composite image data 93 specifying the positions of the plurality of points 22 by specifying the range of the drawing area and the dot pitch without using CAD or the like to create image data in a vector format (fig. 9 (D)). The data generation unit 63 stores the composite image data 93 in the image data storage unit 66.
The output control section 64 outputs the synthesized image data 93 stored in the image data storage section 66 to the output device 68. The composite image data is provided to the control device 50 of the ink application device.
When the user designates the distribution rule of the plurality of dots 22 arranged on the peripheral edge 29 (fig. 9 (D)), the user does not need to designate the position or shape of the inner periphery of the peripheral edge 29; it suffices to designate the range of the 1 st drawing area 26 (i.e., the outer periphery of the peripheral edge 29), as shown in fig. 9 (a).
In a given drawing area, if the dot pitches in the x-direction and the y-direction are set equal to the pixel pitches in the x-direction and the y-direction, respectively, ink is applied to all pixels in the drawing area. A film of ink can thus be formed. Such a film can be used, for example, as a protective film (overcoat) covering the wiring of a resistive film type touch panel. Therefore, the image data generating device according to the above-described embodiment can generate image data for forming both the insulating dots and the overcoat of a resistive film type touch panel.
The user can input the position or size of the drawing area and the dot pitch in units of millimeters without considering the pixel pitch. This can suppress the occurrence of user input errors, compared with a method in which the user designates pixels 90 in which a plurality of dots 22 are arranged one by one in consideration of the pixel pitch.
Further, the registered drawing information may be modified for each drawing area 25 as shown in fig. 7, or may be deleted for each drawing area 25 as shown in fig. 8. Thus, convenience for the user is improved.
Next, a modification of the above embodiment will be described. In the above-described embodiment, the example was given in which the insulating dots of the resistive film type touch panel were formed using the image data generating device 60 (fig. 1 (a)), but the image data generating device 60 according to the above-described embodiment may also generate image data for forming other dot patterns in which a plurality of dots are regularly arranged. For example, image data for forming pixels of each luminescent color of an organic EL (OLED) panel or a quantum dot display panel may be generated.
In the above embodiment, the positions of the 1 st drawing area 26 and the 2 nd drawing area 27 are specified on the drawn surface 23, and the plurality of dots 22 are arranged in the 1 st drawing area 26 and the 2 nd drawing area 27, but the positions of the 1 st drawing area 26 and the 2 nd drawing area 27 do not necessarily have to be specified. For example, the distribution rule may be specified only for the plurality of dots 22 arranged on the drawn surface 23, without specifying the position of the 1 st drawing area 26 or the 2 nd drawing area 27 on the drawn surface 23. For example, when the plurality of dots 22 are arranged over almost the entire drawn surface 23, only the distribution rule may be specified.
Next, an image data generating apparatus according to another embodiment will be described with reference to (a) to (D) in fig. 10. Hereinafter, description of the configuration common to the embodiment shown in fig. 1 to 9 is omitted.
The data generating unit 63 (fig. 3) generates the image data 91 of the 1 st layer (fig. 9 (a)) and the image data 92 of the 2 nd layer (fig. 9 (B)) in the same steps as the 1 st step (fig. 9 (a)) and the 2 nd step (fig. 9 (B)) of the embodiment shown in fig. 1 to 9.
Fig. 10 (a) is a diagram showing the pixels 90 disposed inside the 1 st drawing region 26 among the plurality of pixels 90 constituting the 2 nd layer image data 92 generated by executing the 2 nd step, and is the same as the diagram shown in fig. 9 (B). Since the plurality of dots 22 are arranged at the specified dot pitch in the x-direction and the y-direction with the reference point 24B of the 2 nd drawing area 27 as the starting point, some of the dots 22 are located on, or in contact with, the two sides (outer peripheral lines) extending from the reference point 24B. In the present embodiment, the 2 nd drawing region 27 is expanded before the 3 rd step (fig. 9 (C)) of the embodiment shown in fig. 1 to 9 is performed. No dots are arranged in the region added by expanding the 2 nd drawing region 27.
Fig. 10 (B) is a diagram showing the pixels 90 arranged inside the 1 st drawing region 26 among the plurality of pixels 90 constituting the image data 92 of the 2 nd layer after the 2 nd drawing region 27 is expanded. The data generating unit 63 moves the outer peripheral line of the 2 nd drawing area 27 to the outside to expand the 2 nd drawing area 27. For example, in order to expand the 2 nd drawing region 27, the upper, lower, left, and right outer peripheral lines of the 2 nd drawing region 27 may be moved in the up, down, left, and right directions. The movement amount of the outer peripheral line is determined by the density of the point 22 (fig. 9 (a)) disposed in the 1 st drawing area 26 or the density of the point 22 (fig. 10 (a)) disposed in the 2 nd drawing area 27.
For example, the outer peripheral line is moved in the x-direction by the number of pixels between two dots 22 adjacent in the x-direction in the 2 nd drawing region 27. Likewise, the outer peripheral line is moved in the y-direction by the number of pixels between two dots 22 adjacent in the y-direction. In the example shown in fig. 10 (B), since three pixels are arranged between two dots 22 adjacent in the x-direction, the outer peripheral line of the 2 nd drawing area 27 is shifted by three pixels in the x-direction. The same applies to the movement in the y-direction.
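A sketch of this expansion rule under the same assumptions (the helper is hypothetical; the patent describes only the rule itself): each outer peripheral line moves outward by the number of pixels lying between two adjacent dots.

```python
import numpy as np

def expanded_region_mask(shape, x0, y0, x1, y1, px, py):
    """Sketch: binary mask of the 2 nd drawing region after expansion.

    (x0, y0)-(x1, y1) is the region in pixel coordinates; px and py are
    the dot pitches in pixels, so px - 1 (resp. py - 1) pixels lie between
    adjacent dots, and each peripheral line moves outward by that amount.
    """
    gap_x, gap_y = px - 1, py - 1  # pixels between two adjacent dots
    mask = np.zeros(shape, dtype=np.uint8)
    mask[max(0, y0 - gap_y):y1 + gap_y, max(0, x0 - gap_x):x1 + gap_x] = 1
    return mask
```

Passing this mask as region_mask to the composite sketch shown earlier reproduces the behavior illustrated in fig. 10 (C) and (D).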
Fig. 10 (C) is a diagram showing the pixels 90 arranged inside the 1st drawing region 26 among the plurality of pixels 90 constituting the 1st layer image data 91 generated by executing the 3rd step. In the 3rd step, the data generating unit 63 performs, on the 1st layer image data 91, a process of deleting the dots 22 located inside the expanded 2nd drawing region 27. Thus, all the pixels 90 of the 1st layer image data 91 located inside the expanded 2nd drawing region 27 become pixels where no dot 22 is arranged.
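The expansion of the 2nd drawing region and the deletion in the 3rd step can be sketched as follows; this is a minimal illustration under the same assumptions as the sketch above (regions as (x0, y0, width, height) tuples, layer images as 2D 0/1 grids), and expand_region and delete_dots_inside are hypothetical names.

```python
def expand_region(region, dx, dy):
    """Move the outer peripheral lines outward by dx pixels in x and dy in y."""
    x0, y0, w, h = region
    return (x0 - dx, y0 - dy, w + 2 * dx, h + 2 * dy)

def delete_dots_inside(layer1_image, region):
    """3rd step: clear every 1st-layer dot inside the (expanded) 2nd region."""
    x0, y0, w, h = region
    rows, cols = len(layer1_image), len(layer1_image[0])
    for y in range(max(y0, 0), min(y0 + h, rows)):      # clamp to the canvas
        for x in range(max(x0, 0), min(x0 + w, cols)):
            layer1_image[y][x] = 0
```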
After the 3 rd step (fig. 10 (C)) is performed, the 4 th step is performed in the same manner as in the embodiment shown in fig. 1 to 9.
Fig. 10 (D) is a diagram showing the pixels 90 arranged inside the 1st drawing region 26 among the plurality of pixels 90 constituting the composite image data 93 generated by executing the 4th step. In the expanded 2nd drawing area 27, a plurality of dots 22 are arranged in the same manner as in the 2nd layer image data 92 shown in fig. 10 (B). In the peripheral edge portion 29, a plurality of dots 22 are arranged in the same manner as in the 1st layer image data 91 shown in fig. 10 (C). In fig. 10 (D), the peripheral edge portion 29 is hatched. No dot is arranged in the region corresponding to the difference between the 2nd drawing region 27 after expansion and the 2nd drawing region 27 before expansion.
Next, the excellent effects of the embodiment shown in (a) to (D) of fig. 10 will be described.
In the embodiment shown in fig. 1 to 9, as shown in fig. 9 (D), the center-to-center distance between a dot 22 disposed in the peripheral edge portion 29 and a dot 22 disposed in the 2nd drawing area 27 may be shorter than both the dot pitch of the 1st drawing area 26 and the dot pitch of the 2nd drawing area 27. For example, in the example shown in fig. 9 (D), dots 22 are adjacent to each other across the outer peripheral lines on the left and upper sides of the 2nd drawing area 27.
In contrast, in the embodiment shown in (a) to (D) of fig. 10, since the 2nd drawing region 27 is expanded and no dot is arranged in the region added by the expansion, the dots 22 arranged in the peripheral edge portion 29 can be prevented from coming too close to the dots 22 arranged in the 2nd drawing region 27 (inner portion). Therefore, a non-arrangement region where no dot is disposed can be ensured at the boundary between the 2nd drawing region 27 (inner portion) and the peripheral edge portion 29.
For example, if the movement amount of the outer peripheral line of the 2nd drawing region 27 in the x direction is set to the number of pixels between two dots 22 adjacent in the x direction in the 2nd drawing region 27, the shortest distance in the x direction between a dot 22 in the 2nd drawing region 27 and a dot 22 arranged in the peripheral edge portion 29 can be made equal to or greater than the dot pitch in the x direction of the 2nd drawing region 27. The same applies in the y direction.
Next, a further embodiment will be described with reference to (a) to (D) of fig. 11. Hereinafter, description of the same configuration as in the embodiment shown in fig. 10 (a) to (D) will be omitted.
Fig. 11 (a) is a diagram showing the pixels 90 disposed inside the 1st drawing region 26 among the plurality of pixels 90 constituting the 2nd layer image data 92 generated by executing the 2nd step, and is the same as the diagram shown in fig. 10 (a). The dot pitch in the x-direction and the y-direction of the 2nd drawing area 27 is four times the pixel pitch; that is, three pixels 90 lie between two adjacent dots 22. As shown in fig. 9 (a), the dot pitch in the x-direction and the y-direction of the 1st drawing region 26 is three times the pixel pitch; that is, two pixels 90 lie between two adjacent dots 22.
Fig. 11 (B) is a diagram showing the pixels 90 arranged inside the 1st drawing region 26 among the plurality of pixels 90 constituting the 2nd layer image data 92 after the 2nd drawing region 27 is expanded. In the present embodiment, the movement amount of the outer peripheral line of the 2nd drawing region 27 (the width by which the 2nd drawing region 27 is expanded) is calculated from the shorter of the dot pitch of the 1st drawing area 26 and the dot pitch of the 2nd drawing area 27. Specifically, the outer peripheral line is moved by the number of pixels 90 between two adjacent dots 22 in the region with the shorter dot pitch.
In the present embodiment, the dot pitch of the 1st drawing area 26 (fig. 9 (a)) is shorter than the dot pitch of the 2nd drawing area 27 (fig. 11 (a)). Therefore, the outer peripheral line is moved by the number of pixels 90 (i.e., two) between two adjacent dots 22 in the 1st drawing area 26.
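As a numeric check of this rule, a short sketch with the example pitches from the text; the variable names are illustrative. The number of pixels between two adjacent dots is the pitch minus one, and the expansion width is taken from the area with the shorter pitch.

```python
pitch_area1 = 3  # dot pitch of the 1st drawing area, in pixels (fig. 9 (a))
pitch_area2 = 4  # dot pitch of the 2nd drawing area, in pixels (fig. 11 (a))

expand_px = min(pitch_area1, pitch_area2) - 1  # pixels between adjacent dots
assert expand_px == 2  # matches the two-pixel movement described above
```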
Fig. 11 (C) is a diagram showing the pixels 90 arranged inside the 1st drawing region 26 among the plurality of pixels 90 constituting the 1st layer image data 91 generated by executing the 3rd step. In the 3rd step, the data generating unit 63 (fig. 3) performs, on the 1st layer image data 91, a process of deleting the dots 22 located inside the expanded 2nd drawing region 27, as in the 3rd step shown in fig. 10 (C).
Fig. 11 (D) is a diagram showing the pixels 90 arranged inside the 1st drawing region 26 among the plurality of pixels 90 constituting the composite image data 93 generated by executing the 4th step. In the expanded 2nd drawing area 27, a plurality of dots 22 are arranged in the same manner as in the 2nd layer image data 92 shown in fig. 11 (B). In fig. 11 (D), the peripheral edge portion 29 is hatched.
Next, the excellent effects of the embodiment shown in (a) to (D) of fig. 11 will be described.
In the present embodiment, compared with the embodiment shown in fig. 10 (a) to (D), the width of the dot non-arrangement region ensured at the boundary between the peripheral edge portion 29 and the 2nd drawing region 27 (inner portion) can be made narrower. This embodiment is therefore particularly effective in applications where an excessively wide dot non-arrangement region is undesirable.
Next, a further embodiment will be described with reference to (a) to (D) of fig. 12. Hereinafter, description of the same configuration as in the embodiment shown in fig. 11 (a) to (D) will be omitted.
Fig. 12 (a) is a diagram showing the pixels 90 arranged inside the 1st drawing region 26 among the plurality of pixels 90 constituting the 1st layer image data 91 generated by executing the 1st step. As in the embodiment shown in fig. 9 (a), the data generating unit 63 (fig. 3) arranges the plurality of dots 22 in the 1st drawing area 26 at the specified dot pitch.
Fig. 12 (B) is a diagram showing pixels 90 disposed inside the 1 st drawing region 26 among a plurality of pixels 90 constituting the 2 nd layer image data 92 generated by executing the 2 nd step.
The data generating unit 63 (fig. 3) arranges the plurality of dots 22 in the 2nd drawing area 27 at the specified dot pitch. If the plurality of dots 22 are arranged at the predetermined dot pitch in the x direction (right direction) with the upper left reference point 24B as the starting point, a region 96x where no dot 22 is arranged may occur between the dot 22 at the right end and the outer peripheral line on the right side of the 2nd drawing region 27. As shown in fig. 12 (B), when five pixels 90 lie between two dots 22 adjacent in the x-direction, the size in the x-direction of the region 96x where no dot 22 is arranged is five pixels or less. Likewise, in the y direction, a region 96y where no dot 22 is arranged may appear between the dot 22 at the lower end and the outer peripheral line on the lower side of the 2nd drawing region 27.
In the present embodiment, after the plurality of dots 22 are arranged in the 2nd drawing area 27, an annular blank area 95 surrounding all the dots 22 arranged in the 2nd drawing area 27 is defined outside the outermost dots 22. In fig. 12 (B), the blank region 95 is hatched. The outermost dots 22 of the 2nd drawing area 27 are in contact with the pixels 90 located on the inner peripheral side of the blank area 95. The width of the blank region 95 is equal to the number of pixels between dots in whichever of the 1st drawing region 26 and the 2nd drawing region 27 has the narrower dot pitch.
In the present embodiment, the dot pitch of the 1st drawing area 26 (fig. 12 (a)) is narrower than the dot pitch of the 2nd drawing area 27, and two pixels lie between dots in the 1st drawing area 26, so the width of the blank area 95 corresponds to two pixels. The data generating unit 63 moves the outer peripheral line of the 2nd drawing area 27 so that it coincides with the outer peripheral line of the blank area 95.
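One way to compute the moved outer peripheral line is to fit it around the outermost dots plus the blank-area width, as in the following minimal sketch; the representation of dots as (x, y) pixel positions and the name fit_region_to_dots are assumptions for illustration.

```python
def fit_region_to_dots(dots, blank_px):
    """Return (x0, y0, width, height): a region whose outer peripheral line
    coincides with the outer line of an annular blank area of width blank_px
    placed just outside the outermost dots."""
    xs = [x for x, _ in dots]
    ys = [y for _, y in dots]
    x0, y0 = min(xs) - blank_px, min(ys) - blank_px
    width = (max(xs) - min(xs) + 1) + 2 * blank_px
    height = (max(ys) - min(ys) + 1) + 2 * blank_px
    return (x0, y0, width, height)
```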
Fig. 12 (C) is a diagram showing the pixels 90 arranged inside the 1st drawing region 26 among the plurality of pixels 90 constituting the 1st layer image data 91 generated by executing the 3rd step. The data generating unit 63 (fig. 3) deletes, from the plurality of dots 22 arranged in the 1st drawing area 26, the dots 22 inside the 2nd drawing area 27 whose outer peripheral line has been moved.
Fig. 12 (D) is a diagram showing the pixels 90 arranged inside the 1st drawing region 26 among the plurality of pixels 90 constituting the composite image data 93 generated by executing the 4th step. As in the embodiment shown in fig. 9 (D), the data generating unit 63 generates composite image data 93 specifying the positions of the dots 22 specified by either the 1st layer image data 91 after the 3rd step (fig. 12 (C)) or the 2nd layer image data 92 (fig. 12 (B)).
Next, the excellent effects of the embodiment shown in (a) to (D) of fig. 12 will be described.
In this embodiment, as shown in fig. 12 (B), the outer peripheral line of the 2nd drawing area 27 is moved so that it coincides with the outer peripheral line of the blank area 95. Therefore, a sufficiently wide non-arrangement region where no dot is disposed can be ensured at the boundary between the peripheral edge portion 29 and the 2nd drawing region 27 (inner portion). In fig. 12 (D), the peripheral edge portion 29 is hatched.
Since the width of the blank region 95 is set according to the dot pitch of whichever of the 1st drawing region 26 and the 2nd drawing region 27 has the narrower pitch, and the blank region 95 is defined so that the outermost dots 22 of the 2nd drawing region 27 are in contact with the pixels 90 on its inner peripheral side, the non-arrangement region does not become wider than necessary even when a region 96x or 96y where no dot 22 is arranged is present, as shown in fig. 12 (B).
Next, a further embodiment will be described with reference to (a) to (D) of fig. 13. Hereinafter, description of configurations common to the embodiment shown in fig. 1 to 9 will be omitted. The 1st layer image data 91 generated by executing the 1st step is the same as the 1st layer image data 91 shown in (a) of fig. 12.
Fig. 13 (a) is a diagram showing the pixels 90 disposed inside the 1st drawing region 26 among the plurality of pixels 90 constituting the 2nd layer image data 92 generated by executing the 2nd step. The data generating unit 63 (fig. 3) arranges the plurality of dots 22 in the 2nd drawing area 27 according to the specified distribution rule.
When the dots 22 are arranged at the predetermined dot pitch in the x direction (right direction) with the reference point 24B of the 2nd drawing region 27 as the starting point, a region 96x where no dot 22 is arranged appears between the dot 22 at the right end and the outer peripheral line on the right side of the 2nd drawing region 27. Likewise, in the y direction, a region 96y where no dot 22 is arranged appears between the dot 22 at the lower end and the outer peripheral line on the lower side of the 2nd drawing region 27.
In the present embodiment, before the 3rd step is executed, the data generating unit 63 translates the plurality of dots 22 arranged in the 2nd drawing area 27 within the 2nd drawing area 27 while keeping the positional relationship between the dots 22 fixed. More specifically, the translation shortens the distance between the geometric center of the 2nd drawing area 27 and the center of gravity of the plurality of dots 22 disposed in the 2nd drawing area 27 compared with before the movement. More preferably, the center of gravity of the plurality of dots 22 disposed in the 2nd drawing region 27 is made to coincide with the geometric center of the 2nd drawing region 27.
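The translation can be sketched as below: every dot is shifted by one common integer offset so that the center of gravity approaches the geometric center of the area. The representation (dots as (x, y) positions, regions as (x0, y0, width, height) tuples) and the name center_dots_in_region are illustrative assumptions.

```python
def center_dots_in_region(dots, region):
    """Translate all dots by one offset, keeping their relative positions,
    so their center of gravity moves to (near) the region's geometric center."""
    x0, y0, w, h = region
    cx, cy = x0 + (w - 1) / 2, y0 + (h - 1) / 2        # geometric center
    gx = sum(x for x, _ in dots) / len(dots)           # center of gravity
    gy = sum(y for _, y in dots) / len(dots)
    dx, dy = round(cx - gx), round(cy - gy)            # stay on the pixel grid
    return [(x + dx, y + dy) for x, y in dots]
```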
Fig. 13 (B) is a diagram showing the pixels 90 disposed inside the 1st drawing region 26 among the plurality of pixels 90 constituting the 2nd layer image data 92 after the plurality of dots 22 have been translated. By translating the plurality of dots 22, a blank area 97 in which no dot is arranged is ensured between the outermost dots 22 among the plurality of dots 22 arranged in the 2nd drawing area 27 and the outer peripheral line of the 2nd drawing area 27. In fig. 13 (B), the blank region 97 is hatched.
Fig. 13 (C) is a diagram showing the pixels 90 arranged inside the 1st drawing region 26 among the plurality of pixels 90 constituting the 1st layer image data 91 generated by executing the 3rd step. The data generating unit 63 performs, on the 1st layer image data 91 (fig. 12 (a)), a process of deleting the dots 22 disposed inside the 2nd drawing area 27.
Fig. 13 (D) is a diagram showing the pixels 90 arranged inside the 1st drawing region 26 among the plurality of pixels 90 constituting the composite image data 93 generated by executing the 4th step. As in the embodiment shown in fig. 9 (D), the data generating unit 63 generates composite image data 93 specifying the positions of the dots 22 specified by either the 1st layer image data 91 after the 3rd step (fig. 13 (C)) or the 2nd layer image data 92 (fig. 13 (B)).
Next, the excellent effects of the embodiment shown in (a) to (D) of fig. 13 will be described.
In the present embodiment, the plurality of dots 22 arranged in the 2nd drawing area 27 are translated, and the composite image data 93 is generated based on the arrangement of the dots 22 after the translation (fig. 13 (D)). Therefore, a certain non-arrangement region can be secured at the boundary between the peripheral edge portion 29 and the 2nd drawing region 27 (inner portion). In fig. 13 (D), the peripheral edge portion 29 is hatched. The non-arrangement region includes the blank region 97 produced by the translation.
Next, a modification of the embodiment shown in (a) to (D) of fig. 13 will be described.
In the example shown in fig. 13 (B), the blank area 97 is produced by translating the plurality of dots 22. However, if, for example, the two dots 22 located at both ends in the x-direction are in contact with the left and right outer peripheral lines of the 2nd drawing region 27 (i.e., the x-coordinate of the center of gravity of the dots 22 already coincides with that of the geometric center of the 2nd drawing region 27), the translation distance in the x-direction becomes zero. In this case, no blank area 97 is produced along the left and right outer peripheral lines of the 2nd drawing area 27 (fig. 13 (B)). The same situation may occur in the y-direction.
In the present modification, after the plurality of dots 22 arranged in the 2nd drawing area 27 are translated, an annular blank area 95 surrounding all the dots 22 arranged in the 2nd drawing area 27 is defined outside the outermost dots 22, as in the embodiment shown in fig. 12 (B). Next, the outer peripheral line of the 2nd drawing area 27 is moved so that it coincides with the outer peripheral line of the blank area 95. In the 3rd step, the data generating unit 63 (fig. 3) deletes, from the plurality of dots 22 arranged in the 1st drawing area 26, the dots 22 inside the 2nd drawing area 27 whose outer peripheral line has been moved.
Next, the excellent effects of this modification will be described. In the present modification, when the width of the blank region 97 (fig. 13 (B)) produced by translating the plurality of dots 22 is insufficient, the blank region 95 is ensured by the same step as that shown in fig. 12 (B). Therefore, a sufficiently wide non-arrangement region can be ensured at the boundary between the peripheral edge portion 29 and the 2nd drawing region 27 (inner portion).
Next, a further embodiment will be described with reference to (a) to (C) of fig. 14. Hereinafter, description of configurations common to the embodiment shown in fig. 1 to 9 will be omitted.
Fig. 14 (a) is a diagram showing the arrangement of the drawing region of layer 1 specified by the image data 91 of layer 1. In the present embodiment, two 1 st drawing areas 26A, 26B are arranged on the drawing target surface 23. The data generating unit 63 (fig. 3) performs the same step as the 1 st step shown in fig. 9 (a) to arrange a plurality of points in the 1 st drawing areas 26A and 26B. Specifically, the data generating unit 63 executes step 1 to generate the image data 91 of layer 1, which designates positions of a plurality of points arranged in the two 1 st drawing regions 26A and 26B. In fig. 14 (a), only the outer peripheral lines of the 1 st drawing areas 26A and 26B are shown, and illustration of a plurality of points is omitted.
Fig. 14 (B) is a diagram showing the arrangement of the drawing region of the 2nd layer specified by the 2nd layer image data 92. One 2nd drawing area 27 is arranged on the surface 23 to be drawn. The data generating section 63 (fig. 3) disposes a plurality of points in the 2nd drawing area 27 by executing the same step as the 2nd step shown in fig. 9 (B). Specifically, the data generating unit 63 executes the 2nd step to generate the 2nd layer image data 92, which specifies the positions of the plurality of points arranged in the 2nd drawing area 27. In fig. 14 (B), only the outer peripheral line of the 2nd drawing area 27 is shown, and illustration of the plurality of points is omitted.
Fig. 14 (C) is a diagram showing the arrangement of the 1 st drawing regions 26A, 26B and the 2 nd drawing region 27 designated by the composite image data 93. The data generating unit 63 generates the composite image data 93 by executing the 3 rd step shown in fig. 9 (C) and the 4 th step shown in fig. 9 (D). The data generating unit 63 extracts a drawing region overlapping with the 2 nd drawing region 27 arranged on the 2 nd layer from the plurality of 1 st drawing regions 26A and 26B arranged on the 1 st layer before executing the 3 rd step. Whether or not the drawing region of layer 1 and the drawing region of layer 2 overlap can be determined from the positions (xy coordinates) of the vertices of the drawing regions.
For example, the data generating unit 63 calculates xy coordinates of four vertices of the 1 st drawing regions 26A and 26B arranged on the 1 st layer. The positions of the four vertices can be obtained from the reference point positions and the drawing area dimensions shown in fig. 6. Then, the xy coordinates of the four vertices disposed in the 2 nd drawing region 27 of the 2 nd layer are also calculated.
When at least one of the four vertices of the 2nd drawing region 27 is located within the range surrounded by the four vertices of either of the 1st drawing regions 26A, 26B of the 1st layer, it is determined that the drawing region of the 1st layer and the 2nd drawing region 27 of the 2nd layer overlap at least partially. For example, as shown in fig. 14 (C), when all four vertices of the 2nd drawing area 27 are located within the 1st drawing area 26A of the 1st layer, the 2nd drawing area 27 is included in the 1st drawing area 26A. When one or two vertices of the 2nd drawing region 27 are located within the range of the 1st drawing region 26A but the other vertices are located outside it, a part of the 1st drawing region 26A overlaps a part of the 2nd drawing region 27.
No vertex of the 2nd drawing area 27 of the 2nd layer lies within the range of the other 1st drawing area 26B. In this case, the 1st drawing area 26B does not overlap any drawing area of the 2nd layer.
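The vertex test described above might be sketched as follows, with regions as (x0, y0, width, height) tuples; the function names are illustrative. As in the text, only the vertices of the 2nd-layer region are tested, so a fully general rectangle-intersection test would also have to check the reverse direction.

```python
def vertices(region):
    x0, y0, w, h = region
    return [(x0, y0), (x0 + w, y0), (x0, y0 + h), (x0 + w, y0 + h)]

def contains(region, point):
    x0, y0, w, h = region
    x, y = point
    return x0 <= x <= x0 + w and y0 <= y <= y0 + h

def overlaps(region_layer1, region_layer2):
    """True if at least one vertex of the 2nd-layer region lies inside the
    1st-layer region (partial overlap or full inclusion)."""
    return any(contains(region_layer1, v) for v in vertices(region_layer2))
```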
Next, the excellent effects of the embodiment shown in (a) to (C) of fig. 14 will be described.
In this embodiment, when the user inputs drawing information of the drawing area of layer 2, there is no need to input information for specifying the drawing area of the lower layer (layer 1) that overlaps with the drawing area in which the drawing information is currently being input. This reduces the burden on the user in the operation of inputting the drawing information.
Next, a further embodiment will be described with reference to fig. 15 to 16. Hereinafter, description of the same configuration as in the embodiment shown in fig. 1 to 9 will be omitted.
Fig. 15 (a) and (B) are diagrams showing a window 70 for inputting drawing information according to the present embodiment. In the present embodiment, the input control section 62 displays an input box 87 for inputting the overlapping drawing area ID of the lower layer in addition to the input box 84 displayed on the window 70 (fig. 6) for inputting drawing information according to the embodiment shown in fig. 1 to 9.
As shown in fig. 15 (a), when drawing information of a drawing area of the 1st layer is input, the input box 87 for the overlapping drawing area ID is left blank. In this case, the input control section 62 (fig. 3) stores the drawing information in the drawing condition storage section 65 by the same procedure as in the embodiment shown in fig. 1 to 9. As shown in fig. 15 (B), when drawing information of a drawing area of the 2nd layer is input, the user leaves the reference point position input box blank and inputs the overlapping drawing area ID of the lower layer. This information can be input by selecting one drawing area ID from the list displayed in the drawing information list display area 74. The input control unit 62 stores the inputted drawing information in the drawing condition storage unit 65.
Fig. 16 (a) is a diagram showing the positional relationship between the drawing surface 23 and the two 1 st drawing regions 26A and 26B of the 1 st layer. The drawing area IDs of the 1 st drawing areas 26A and 26B are "1" and "2", respectively. For example, the 1 st drawing region 26A has a dimension of 70mm in the x direction and 80mm in the y direction.
Fig. 16 (B) is a diagram showing the 2nd drawing region 27 of the 2nd layer. As shown in fig. 15 (B), the reference point position of the 2nd drawing region 27 is not input, so the position of the 2nd drawing region 27 on the drawn surface 23 cannot be determined from the inputted drawing information of the 2nd drawing region 27 alone. However, the size of the 2nd drawing area 27 can be determined. For example, the dimension of the 2nd drawing area 27 is 50mm in the x direction and 60mm in the y direction.
Fig. 16 (C) is a schematic diagram for explaining the processing performed before the data generating section 63 (fig. 3) executes the 3rd step. As shown in fig. 15 (B), the 2nd drawing region 27 overlaps the drawing region whose 1st layer drawing region ID is "1" (i.e., the 1st drawing region 26A). The data generating unit 63 arranges the 2nd drawing region 27 in the center of the 1st drawing region 26A of the lower layer that overlaps it. Specifically, the 2nd drawing region 27 is arranged such that its geometric center coincides with the geometric center of the 1st drawing region 26A. As a result, the width of the upper, lower, left, and right portions of the peripheral edge portion 29 is 10mm. The overlapping drawing area ID of the lower layer shown in (B) of fig. 15 can thus be regarded as information indirectly specifying the position of the 2nd drawing area 27 on the drawn surface 23.
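The centering computation can be illustrated with the dimensions from the text (values in mm); center_in_parent is a hypothetical name, not one used by the patent.

```python
def center_in_parent(parent, child_size):
    """Return the reference point (upper-left) of a child region placed so
    that its geometric center coincides with the parent's geometric center.

    parent is (x0, y0, width, height); child_size is (width, height).
    """
    px, py, pw, ph = parent
    cw, ch = child_size
    return (px + (pw - cw) / 2, py + (ph - ch) / 2)

# 1st drawing region 26A: 70mm x 80mm; 2nd drawing region 27: 50mm x 60mm.
ref = center_in_parent((0, 0, 70, 80), (50, 60))
assert ref == (10.0, 10.0)  # a 10mm peripheral edge on every side
```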
After determining the position of the 2nd drawing area 27 on the drawn surface 23, the data generating unit 63 performs the 3rd and 4th steps shown in (C) and (D) of fig. 9, thereby generating the composite image data 93.
Next, the excellent effects of the embodiment shown in fig. 15 and 16 will be described.
Depending on the application of the plurality of dots 22 formed on the substrate 20 (fig. 2), it may be desirable to dispose the 2nd drawing region 27 (fig. 9 (D)) at the center of the 1st drawing region 26 (fig. 9 (D)). In the present embodiment, the 2nd drawing region 27 is arranged in the center of the 1st drawing region 26A simply by specifying the overlapping drawing region of the lower layer, without specifying the position of the 2nd drawing region 27 on the drawn surface 23. The user does not need to calculate the position of the 2nd drawing region 27 on the drawn surface 23, so an excellent effect of improved operability is obtained.
Next, a further embodiment will be described with reference to fig. 17 to 19. Hereinafter, description of configurations common to the embodiment shown in fig. 1 to 9 will be omitted.
Fig. 17 is a diagram showing a window 70 for inputting drawing information according to the present embodiment. In the present embodiment, the input control section 62 displays an input box 88 for inputting the height of a point in addition to the input box 84 displayed in the window 70 (fig. 6) for inputting drawing information according to the embodiment shown in fig. 1 to 9. The dot height is specified in terms of the number of times the ink is caused to drop onto the same pixel.
Fig. 18 is a diagram showing the pixels 90 in a region including the 1st drawing regions 26A, 26B, 26C, and 26D among the plurality of pixels 90 constituting the composite image data 93. Four 1st drawing areas 26A, 26B, 26C, 26D are arranged on the drawing surface 23. A numerical value specifying the dot height is assigned to each pixel 90. No dot is arranged on a pixel 90 assigned the value zero. In the example shown in fig. 18, the height of the dots disposed in the 1st drawing areas 26A, 26B is "1", the height of the dots disposed in the 1st drawing area 26C is "2", and the height of the dots disposed in the 1st drawing area 26D is "3".
Fig. 19 (a) is a diagram showing the arrangement of the 1 st drawing regions 26A and 26B of the 1 st layer. Two 1 st drawing areas 26A and 26B are arranged on the drawing surface 23. The positions of the plurality of points disposed in the two 1 st drawing regions 26A and 26B are specified by the image data 91 of the 1 st layer. The height of the point disposed in one of the 1 st drawing regions 26A is "2", and the height of the point disposed in the other 1 st drawing region 26B is "3".
Fig. 19 (B) is a diagram showing the arrangement of the 2 nd drawing regions 27A to 27F of the 2 nd layer. Six 2 nd drawing areas 27A to 27F are arranged on the drawing surface 23. Positions of a plurality of points disposed in the six 2 nd drawing regions 27A to 27F are specified by the image data 92 of the 2 nd layer. The height of the points disposed in the 2 nd drawing regions 27A, 27B, 27D, 27E, 27F is "1", and the height of the points disposed in the remaining 2 nd drawing region 27C is "3".
Fig. 19 (C) is a diagram showing the height of the drawing region and the point specified by the composite image data 93. Three 2 nd drawing regions 27A, 27B, 27C of the 2 nd layer are arranged in the 1 st drawing region 26A of the 1 st layer, and three 2 nd drawing regions 27D, 27E, 27F of the 2 nd layer are arranged in the 1 st drawing region 26B of the 1 st layer. The height of the point disposed inside the 1 st drawing area 26A and outside the 2 nd drawing areas 27A, 27B, 27C (i.e., the peripheral edge portion 29A) is equal to the height "2" of the point disposed in the 1 st drawing area 26A specified by the image data 91 of the 1 st layer. Similarly, the height of the point disposed inside the 1 st drawing region 26B and outside the 2 nd drawing regions 27D, 27E, 27F (i.e., the peripheral edge portion 29B) is equal to the height "3" of the point disposed in the 1 st drawing region 26B specified by the 1 st layer image data 91. The height of the points disposed in the 2 nd drawing regions 27A to 27F is equal to the height specified by the image data 92 (fig. 19 (B)) of the 2 nd layer.
The data generating unit 63 (fig. 3) generates printing data from the composite image data 93. In this embodiment, the maximum dot height is "3", so printing is performed three times. The data generating unit 63 generates the 1st, 2nd, and 3rd printing data.
Fig. 19 (D) is a diagram schematically showing the 1st printing data 101 on a two-dimensional plane of the pixel coordinate system. The 1st printing data 101 specifies the positions of the dots to which ink should be applied by the 1st printing. In fig. 19 (D), the areas where the dots to be coated with ink are arranged are hatched. In the 1st printing, ink is applied to the positions of the dots having heights "1", "2", and "3". Therefore, the positions of all the dots disposed in the 1st drawing areas 26A and 26B are ink application targets.
Fig. 19 (E) is a diagram schematically showing the 2nd printing data 102 on a two-dimensional plane of the pixel coordinate system. The 2nd printing data 102 specifies the positions of the dots to which ink should be applied by the 2nd printing. In fig. 19 (E), the areas where the dots to which ink is applied by the 2nd printing are arranged are hatched. In the 2nd printing, ink is applied to the positions of the dots having heights "2" and "3". Therefore, excluding the positions of the dots in the 2nd drawing areas 27A, 27B, 27D, 27E, and 27F (fig. 19 (C)), the positions of the dots disposed in the peripheral edge portions 29A, 29B and in the 2nd drawing area 27C are the ink application targets.
Fig. 19 (F) is a diagram schematically showing the 3rd printing data 103 on a two-dimensional plane of the pixel coordinate system. The 3rd printing data 103 specifies the positions of the dots to which ink should be applied by the 3rd printing. In fig. 19 (F), the areas where the dots to be coated with ink are arranged are hatched. In the 3rd printing, ink is applied to the positions of the dots having height "3". Therefore, excluding the positions of the dots in the peripheral edge portion 29A (fig. 19 (C)), the positions of the dots disposed in the peripheral edge portion 29B and in the 2nd drawing area 27C are the ink application targets.
Next, the excellent effects of the embodiment shown in fig. 17 to 19 will be described.
In the present embodiment, a plurality of dots having mutually different heights can be formed on one drawn surface 23. The user does not need to create the 1st, 2nd, and 3rd printing data individually; it suffices to input the dot height in the drawing information input window 70 shown in fig. 17. When the user inputs the dot height, the data generating unit 63 generates the plurality of printing data. Therefore, the time for producing the image data composed of the 1st, 2nd, and 3rd printing data can be shortened.
Next, a modification of the above embodiment will be described.
In the above embodiment, the maximum dot height is "3", but the maximum dot height may be "2", or "4" or more. In general, when the dot height is determined by an integer of 1 to N, the data generating unit 63 generates N sets of printing data, from the 1st to the Nth. The positions of the dots having a height of n or more are specified by the nth printing data, where n is an integer of 1 or more and N or less.
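This rule can be sketched as follows, assuming the composite image data is held as a grid of integer dot heights (0 meaning no dot); make_print_passes is an illustrative name.

```python
def make_print_passes(height_map, n_max):
    """Return N binary grids; the grid at index n-1 is the nth printing data,
    marking every pixel whose dot height is n or more."""
    return [[[1 if h >= n else 0 for h in row] for row in height_map]
            for n in range(1, n_max + 1)]

# Example: a dot of height 3 receives ink in all three passes,
# a dot of height 1 only in the 1st pass.
heights = [[0, 1, 3],
           [2, 0, 1]]
first, second, third = make_print_passes(heights, 3)
assert second == [[0, 0, 1], [1, 0, 0]]  # only heights 2 and 3 remain
```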
Next, a further embodiment will be described with reference to fig. 20 (a) and (B). Hereinafter, description of configurations common to the embodiment shown in fig. 1 to 9 will be omitted.
Fig. 20 (a) is a diagram showing the pixels 90 disposed inside the 1st drawing region 26 among the plurality of pixels 90 constituting the composite image data 93 according to the present embodiment. In the embodiment shown in fig. 1 to 9, a peripheral edge portion 29 (fig. 9 (D)) surrounds the 2nd drawing area 27 (fig. 9 (D)) of the 2nd layer. In contrast, in the present embodiment, a part of the outer peripheral line of the 2nd drawing region 27 of the 2nd layer coincides with a part of the outer peripheral line of the 1st drawing region 26 of the 1st layer. For example, the reference point 24B of the 2nd drawing region 27 is disposed on the upper outer peripheral line of the 1st drawing region 26. In this way, the region inside the 1st drawing region 26 and outside the 2nd drawing region 27 does not necessarily need to surround the 2nd drawing region 27.
In the present embodiment as well, the data generating section 63 (fig. 3) generates the composite image data 93 by executing the 1st step (fig. 9 (a)) to the 4th step (fig. 9 (D)). In addition, the 2nd drawing region 27 may be expanded before the 3rd step is performed, as in the embodiment shown in fig. 10 (B), fig. 11 (B), or fig. 12 (B). The plurality of dots 22 disposed in the 2nd drawing area 27 may also be translated, as in the embodiment shown in fig. 13 (B).
The various steps of the embodiments shown in fig. 14 (a) to (C) and the embodiments shown in fig. 17 to 19 may be applied to the present embodiment.
Fig. 20 (B) is a diagram showing pixels 90 disposed inside and around the 1 st drawing region 26 among a plurality of pixels 90 constituting the composite image data 93 according to the modification of the present embodiment. In the present modification, a part of the 2 nd drawing region 27 overlaps the 1 st drawing region 26, and the other part of the 2 nd drawing region 27 is located outside the 1 st drawing region 26. In this modification, the various steps of the above embodiment can also be applied. As in the present modification, the 2 nd drawing region 27 does not necessarily need to be included in the 1 st drawing region 26.
Next, a further embodiment will be described with reference to fig. 21. Hereinafter, description of configurations common to the embodiment shown in fig. 1 to 9 will be omitted.
Fig. 21 is a diagram showing the arrangement of the 1 st drawing region 26 of the 1 st layer and the 2 nd drawing region 27 of the 2 nd layer designated by the synthesized image data 93 generated by the image data generating apparatus 60 (fig. 3) according to the present embodiment. In the embodiment shown in fig. 1 to 9, the 1 st drawing area 26 and the 2 nd drawing area 27 are square or rectangular. In contrast, in the present embodiment, the 1 st drawing area 26 and the 2 nd drawing area 27 having shapes other than square or rectangle are arranged on the drawing surface 23. The 1 st drawing area 26 and the 2 nd drawing area 27 are circular, elliptical, or regular hexagonal in shape.
When the 1st drawing area 26 and the 2nd drawing area 27 are circular, the center point may be used as the reference point and the radius as the information specifying the size. When they are elliptical, the center point is used as the reference point, the lengths of the major and minor axes as the information specifying the size, and the angle between the x direction and the major axis as the information specifying the posture on the surface 23 to be drawn. When they are regular hexagons, the center point is used as the reference point, the diagonal length (the diameter of the circumscribed circle) as the information specifying the size, and the angle between the x direction and one side as the information specifying the posture on the drawing surface 23.
As in the present embodiment, the shapes of the 1st drawing area 26 and the 2nd drawing area 27 may be any two-dimensional figure for which a reference point for the position can be defined and whose size and posture on the drawing surface 23 can be specified.
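As one way to picture such parameterizations, the following sketch defines data holders for the three shapes above; the class names and fields are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class CircleArea:
    center: tuple     # reference point (x, y)
    radius: float     # information specifying the size

@dataclass
class EllipseArea:
    center: tuple
    major_axis: float
    minor_axis: float
    angle_deg: float  # posture: angle between the x direction and the major axis

@dataclass
class HexagonArea:
    center: tuple
    diagonal: float   # diameter of the circumscribed circle
    angle_deg: float  # posture: angle between the x direction and one side
```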
Next, a further embodiment will be described with reference to fig. 22. Hereinafter, description of configurations common to the embodiment shown in fig. 1 to 9 will be omitted.
Fig. 22 is a diagram showing the arrangement of drawing areas designated by the synthesized image data 93 generated by the image data generating apparatus 60 (fig. 3) according to the present embodiment. In the embodiment shown in fig. 1 to 9, the drawing region is arranged in two layers, i.e., layer 1 and layer 2. In contrast, in the present embodiment, the drawing regions are arranged in three layers.
A 1st drawing region 26 is arranged on the 1st layer (the lowermost layer), a 2nd drawing region 27 on the 2nd layer (the intermediate layer), and a 3rd drawing region 28 on the 3rd layer (the uppermost layer). The 2nd drawing area 27 is included in the 1st drawing area 26, and the 3rd drawing area 28 is included in the 2nd drawing area 27. In generating the composite image data 93, it suffices to delete, from the plurality of points defined by the image data of each layer, the points located inside a drawing region of a layer above it, as in the 3rd step shown in fig. 9 (C) and the like. In this way, the total number of layers is not limited to two and may be three or more.
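Generalized composition over three or more layers might be sketched as below, assuming layers is a bottom-to-top list of (dots, regions) pairs with rectangular regions; under one reading of the rule, a point survives only if it lies outside every drawing region of every layer above its own. The names compose_layers and inside are illustrative.

```python
def inside(region, point):
    x0, y0, w, h = region          # region is (x0, y0, width, height)
    x, y = point
    return x0 <= x < x0 + w and y0 <= y < y0 + h

def compose_layers(layers):
    """layers: bottom-to-top list of (dots, regions); dots are (x, y) points."""
    kept = []
    for i, (dots, _) in enumerate(layers):
        upper = [r for _, regions in layers[i + 1:] for r in regions]
        kept.extend(p for p in dots if not any(inside(r, p) for r in upper))
    return kept
```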
The embodiments described above are merely examples, and the structures shown in the different embodiments may of course be partially replaced or combined. The same operational effects obtained from the same structures in the plurality of embodiments are not repeated for each embodiment. Moreover, the present invention is not limited to the above-described embodiments; various alterations, modifications, combinations, and the like will be apparent to those skilled in the art.

Claims (10)

1. An image data generation device is characterized by comprising:
an input device for inputting data; and
a processing device,
the processing device has the following functions:
allowing a user to input distribution rule information designating a rule to distribute a plurality of points on a surface to be drawn, the surface to be drawn having at least one drawing area thereon,
generating image data specifying positions of respective points distributed on the depicted surface in accordance with a rule specified by the distribution rule information;
for each drawing area, allowing a user to input layer identification information designating one layer from among a plurality of layers from the input device,
overlapping a plurality of drawing regions of the plurality of layers to generate the image data,
in the case where the drawing region of one layer overlaps with the drawing region of another layer, points of the drawing region of the lower of the two layers are deleted from the overlapping region of the two layers, and points of the drawing region of the upper layer remain.
2. The image data generating apparatus according to claim 1, wherein,
the processing device also has the following functions:
the user inputs range information designating a position and a size of the at least one drawing area arranged on the surface to be drawn from the input device, and arranges the plurality of points designated by the image data inside the drawing area designated by the range information.
3. The image data generating apparatus according to claim 2, wherein,
the shape of the drawing area specified by the range information is square or rectangular, the range information includes position information specifying a position of a reference point fixed with respect to a relative position of the drawing area on the surface to be drawn, and length information specifying a length of a side of the drawing area, and the distribution rule information includes pitch information specifying a pitch of the points.
4. The image data generating apparatus according to claim 2, wherein,
when a plurality of drawing areas are set in one layer, the processing device gives area identification information for identifying the drawing areas to the plurality of drawing areas in the same layer.
5. The image data generating apparatus according to claim 4, wherein,
the processing device has the following functions:
allowing a user to input the layer identification information and the area identification information from the input device,
the range information and the distribution rule information of the drawing area specified from the inputted layer identification information and the area identification information are displayed on the input device,
allowing a user to modify at least a portion of the range information and the distribution rule information.
6. The image data generating apparatus according to claim 4 or 5, wherein,
the processing device has the following functions:
registering the plurality of drawing areas to which the range information and the distribution rule information are input,
the image data is generated from the registered plurality of drawing areas.
7. The image data generating apparatus according to claim 6, wherein,
The processing device has the following functions:
allowing a user to input the layer identification information and the area identification information from the input device,
and erasing registration of the drawing area specified from the inputted layer identification information and the area identification information.
8. The image data generating apparatus according to claim 7, wherein,
when some of the drawing areas are erased, the processing device reassigns the area identification information to the drawing areas that are not erased.
9. The image data generating apparatus according to claim 2, wherein,
the range information of at least one drawing area among the plurality of drawing areas, namely a 1st drawing area, includes, as the information specifying the position, information for identifying the drawing area of another layer that overlaps it,
the processing device matches the center of the 1 st drawing area with the center of the drawing area of the other layer overlapping with the 1 st drawing area.
10. The image data generating apparatus according to claim 1, wherein,
the processing device also has the following functions:
for each drawing area, allowing a user to input, from the input device, height information specifying a height of a point, determined by an integer of 1 or more and N or less,
The image data includes a plurality of printing data of 1 st to nth times,
the n-th printing data specifies the positions of points having a height of n or more.
CN202110602879.7A 2020-06-01 2021-05-31 Image data generating apparatus Active CN113752695B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-095683 2020-06-01
JP2020095683A JP7469146B2 (en) 2020-06-01 2020-06-01 Image data generating device

Publications (2)

Publication Number Publication Date
CN113752695A (en) 2021-12-07
CN113752695B (en) 2023-04-21

Family

ID=78787327

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110602879.7A Active CN113752695B (en) 2020-06-01 2021-05-31 Image data generating apparatus

Country Status (4)

Country Link
JP (1) JP7469146B2 (en)
KR (1) KR20210148922A (en)
CN (1) CN113752695B (en)
TW (1) TWI784547B (en)

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3966034B2 (en) 2002-03-14 2007-08-29 セイコーエプソン株式会社 Discharge pattern data generation method and discharge pattern data generation apparatus
JP2004139162A (en) 2002-10-15 2004-05-13 Fujitsu Component Ltd Manufacturing method of touch panel and touch panel manufactured by the method
JP2004337720A (en) 2003-05-14 2004-12-02 Seiko Epson Corp Liquid drop discharge control method, liquid drop discharge apparatus, liquid drop discharge system and liquid drop discharge control program
JP2008089868A (en) * 2006-09-29 2008-04-17 Fujifilm Corp Method and device for acquiring drawing point data and method and device for drawing
JP4721118B2 (en) * 2006-09-29 2011-07-13 富士フイルム株式会社 Image processing apparatus and method, and image forming apparatus and method
JP2008092191A (en) * 2006-09-29 2008-04-17 Fujifilm Corp Image processing method and device, and image formation method and device
JP2008173804A (en) * 2007-01-17 2008-07-31 Brother Ind Ltd Printer, communication system and printing method
JP4486995B2 (en) * 2007-08-01 2010-06-23 シャープ株式会社 Image processing system
JP5084686B2 (en) * 2008-09-30 2012-11-28 キヤノン株式会社 Image forming apparatus, image forming method, program, and storage medium
JP5268875B2 (en) * 2009-06-23 2013-08-21 キヤノン株式会社 Image forming apparatus and image forming method
JP5585505B2 (en) * 2011-03-17 2014-09-10 セイコーエプソン株式会社 Image supply apparatus, image display system, image supply apparatus control method, image display apparatus, and program
JP5377685B2 (en) * 2012-01-31 2013-12-25 京セラドキュメントソリューションズ株式会社 Image forming apparatus and image forming program
JP6120824B2 (en) * 2014-12-11 2017-04-26 キヤノン株式会社 Image processing apparatus, image processing method, and program
EP3271186A4 (en) 2015-07-09 2018-12-19 Hewlett-Packard Development Company, L.P. Printer configuration
JP6679174B2 (en) * 2016-08-22 2020-04-15 住友重機械工業株式会社 Inkjet device
JP6769365B2 (en) * 2017-03-21 2020-10-14 カシオ計算機株式会社 Drawing device and drawing method
JP7005293B2 (en) * 2017-11-08 2022-01-21 シャープ株式会社 Image processing equipment
JP7206587B2 (en) * 2017-12-08 2023-01-18 カシオ計算機株式会社 Rendering device and rendering method

Also Published As

Publication number Publication date
KR20210148922A (en) 2021-12-08
JP7469146B2 (en) 2024-04-16
JP2021187088A (en) 2021-12-13
TW202147076A (en) 2021-12-16
CN113752695A (en) 2021-12-07
TWI784547B (en) 2022-11-21

Similar Documents

Publication Publication Date Title
CN108351510B (en) Seamless line direct imaging for high resolution electronic patterning
CN113752695B (en) Image data generating apparatus
CN110095935B (en) Film forming method, film forming apparatus, and composite substrate having film formed thereon
CN113752696B (en) Image data generating device and image data generating method
KR101274534B1 (en) Drawing data acquiring method and device, and drawing method and apparatus
CN109927413B (en) Film forming apparatus and film forming method
CN113752694B (en) Printing data generating device and ink applying device control device
CN110737179B (en) Drawing device and drawing method
CN113682052B (en) Ink applying device, control device thereof and ink applying method
KR101111083B1 (en) Multi-layer printing method
TWI763327B (en) Ink coating device, control device for ink coating device, and ink coating method
TWI786859B (en) Distortion correction processing device, drawing method and program
KR101509046B1 (en) Printing method of built-in cad printing electronic system
JP2013163294A (en) Method of generating print data and printing apparatus
JP5799584B2 (en) DATA GENERATION DEVICE, DATA GENERATION METHOD, AND PROGRAM
JP2021079359A (en) Ink coating controller and ink coating method
JP2014100636A (en) Method for manufacturing substrate, and device for manufacturing substrate

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant