WO2018020533A1 - Image processing device, replication system, and replication method - Google Patents


Info

Publication number
WO2018020533A1
Authority
WO
WIPO (PCT)
Prior art keywords
height
information
convex
layer
height distribution
Prior art date
Application number
PCT/JP2016/004920
Other languages
French (fr)
Japanese (ja)
Inventor
島崎 浩昭
田中 義人
美馬 邦啓
Original Assignee
パナソニックIpマネジメント株式会社 (Panasonic IP Management Co., Ltd.)
Priority date
Filing date
Publication date
Priority to JP2016-149934
Application filed by パナソニックIpマネジメント株式会社
Publication of WO2018020533A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B41 - PRINTING; LINING MACHINES; TYPEWRITERS; STAMPS
    • B41J - TYPEWRITERS; SELECTIVE PRINTING MECHANISMS, e.g. INK-JET PRINTERS, THERMAL PRINTERS, i.e. MECHANISMS PRINTING OTHERWISE THAN FROM A FORME; CORRECTION OF TYPOGRAPHICAL ERRORS
    • B41J5/00 - Devices or arrangements for controlling character selection
    • B41J5/30 - Character or syllable selection controlled by recorded information
    • B41J21/00 - Column, tabular, or like printing arrangements; Means for centralising short lines
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical means
    • G01B11/24 - Measuring arrangements characterised by the use of optical means for measuring contours or curvatures

Abstract

This image processing device (20) is provided with a control unit (22) and an output unit (21b). The control unit (22) receives height information obtained by measuring the shape of a protrusion formed on the surface of an object, and differentiates the protrusion into a plurality of layers including at least a first layer and second layer. The control unit (22) generates first height distribution information indicating the surface height distribution of the object on the basis of first height information, from among the height information, that indicates the height of the first layer. The control unit (22) generates second height distribution information indicating the surface height distribution of the object on the basis of second height information, from among the height information, that indicates the height of the second layer. The output unit (21b) outputs the first height distribution information and the second height distribution information.

Description

Image processing apparatus, duplication system, and duplication method

The present disclosure relates to an image processing device that generates data for duplicating an object having a convex portion, a duplication system that duplicates an object having a convex portion, and a duplication method.

Patent Document 1 discloses an image processing apparatus that generates stereoscopic image data by adding height direction information to a planar original image. This image processing apparatus makes it possible to realistically express shadows and textures by adding height information to each region separated based on focus information of original image data.

JP 2016-63522 A

The present disclosure provides an image processing apparatus, a duplication system, and a duplication method effective for duplicating an object having a convex portion.

The image processing device according to the present disclosure includes a control unit and an output unit. The control unit receives height information obtained by measuring the shape of a convex portion formed on the surface of an object, and identifies the convex portion as a plurality of layers including at least a first layer and a second layer. The control unit generates first height distribution information indicating the height distribution of the surface of the object based on first height information, from among the height information, that indicates the height of the first layer. The control unit generates second height distribution information indicating the height distribution of the surface of the object based on second height information, from among the height information, that indicates the height of the second layer. The output unit outputs the first height distribution information and the second height distribution information.

The replication system according to the present disclosure includes an image processing device and a printing device. The image processing device includes a control unit and an output unit. The control unit receives height information obtained by measuring the shape of a convex portion formed on the surface of an object, and identifies the convex portion as a plurality of layers including at least a first layer and a second layer. The control unit generates first height distribution information indicating the height distribution of the surface of the object based on first height information, from among the height information, that indicates the height of the first layer. The control unit generates second height distribution information indicating the height distribution of the surface of the object based on second height information, from among the height information, that indicates the height of the second layer. The output unit outputs the first height distribution information and the second height distribution information. The printing device performs printing based on the first height distribution information and the second height distribution information, and generates a duplicate of the object.

In the replication method according to the present disclosure, height information obtained by measuring the shape of a convex portion formed on the surface of an object is received, and the convex portion is identified as a plurality of layers including at least a first layer and a second layer. In this replication method, first height distribution information indicating the height distribution of the surface of the object is generated based on first height information, from among the height information, that indicates the height of the first layer. Second height distribution information indicating the height distribution of the surface of the object is generated based on second height information, from among the height information, that indicates the height of the second layer. The first height distribution information and the second height distribution information are output. Printing is then performed based on the first height distribution information and the second height distribution information, and a duplicate of the object is generated.

The image processing apparatus, the duplication system, and the duplication method according to the present disclosure are effective for duplicating an object having a convex portion.

FIG. 1 is a block diagram illustrating the configuration of a replication system according to the first embodiment.
FIG. 2 is a diagram for explaining imaging of a painting by the imaging device according to the first embodiment.
FIG. 3 is a side view of the imaging device according to the first embodiment.
FIG. 4A and FIG. 4B are diagrams for explaining the relationship between the illumination angle and shadows at the time of imaging in the first embodiment.
FIG. 5 is a flowchart for explaining the image processing operation according to the first embodiment.
FIG. 6 is a diagram illustrating an example of a cross section of a duplicate image formed by printing by the printing device according to the first embodiment.
FIG. 7 is a diagram for explaining image duplication in the second embodiment.
FIGS. 8A to 8E are diagrams for explaining the generation of height distribution information and the printing processing according to the second embodiment.
FIG. 9 is a diagram for explaining image duplication in the third embodiment.
FIGS. 10A to 10E are diagrams for explaining the generation of height distribution information and the printing processing according to the third embodiment.
FIG. 10F is a diagram for explaining the generation of height distribution information and the printing processing in a comparative example.
FIGS. 11A to 11G are diagrams for explaining the generation of height distribution information and the printing processing according to another embodiment of the third embodiment.
FIG. 12A is a diagram schematically illustrating color image data according to the first embodiment.
FIG. 12B is a diagram illustrating the height distribution information according to the first embodiment as an image.

Hereinafter, embodiments will be described in detail with reference to the drawings as appropriate. However, descriptions that are more detailed than necessary may be omitted. For example, detailed descriptions of matters that are already well known and repeated descriptions of substantially identical configurations may be omitted. This is to prevent the following description from becoming unnecessarily redundant and to facilitate understanding by those skilled in the art. The inventors provide the accompanying drawings and the following description so that those skilled in the art can fully understand the present disclosure, and do not intend them to limit the subject matter described in the claims.

(Embodiment 1)
Embodiment 1 will be described with reference to the drawings. A painting such as an oil painting may include convex portions (thicknesses of paint) formed by overpainting. In the first embodiment, the convex portions of an object are reproduced together with the color of the object (such as a painting). That is, the replication system of Embodiment 1 can generate a replica that reproduces both the unevenness and the color of the object.

1. Configuration
FIG. 1 shows the configuration of the replication system according to the first embodiment. The replication system 100 according to the first embodiment includes an imaging device 10, an image processing device 20, and a printing device 30. The imaging device 10 captures an object (in the first embodiment, a painting) and generates image data. The image processing device 20 processes the image data generated by the imaging device 10 and outputs the image information (height distribution information, color image data, and the like) necessary for reproducing the painting. The printing device 30 performs printing based on the image information output from the image processing device 20 and thereby duplicates the painting. The configurations of the imaging device 10, the image processing device 20, and the printing device 30 are described below.

1-1. Configuration of Imaging Device
The imaging device 10 of Embodiment 1 is a scanner using a line-scan camera. The imaging device 10 includes an input/output unit 11, a control unit 12, a camera 13, a first illumination unit 14, a second illumination unit 15, and a moving device 16.

The input/output unit 11 includes an input unit 11a and a communication unit 11b. The input unit 11a is a keyboard, a mouse, a touch panel, or the like. The communication unit 11b includes an interface circuit for communicating with external devices in conformity with a predetermined communication standard (for example, Local Area Network: LAN, or WiFi). For example, the imaging device 10 receives an instruction to start imaging via the input unit 11a or the communication unit 11b. The imaging device 10 then outputs the image data generated by imaging the painting from the communication unit 11b.

The control unit 12 controls the entire imaging device 10. For example, the control unit 12 controls the moving device 16 based on an imaging start instruction input via the input unit 11a, and moves the camera 13, the first illumination unit 14, and the second illumination unit 15 together. While the camera 13, the first illumination unit 14, and the second illumination unit 15 are moving, the control unit 12 turns on at least one of the first illumination unit 14 and the second illumination unit 15. The control unit 12 then controls the camera 13 so that the camera 13 images the painting while the painting is illuminated. The control unit 12 can be realized by a semiconductor element or the like. The function of the control unit 12 may be configured by hardware alone, or may be realized by combining hardware and software. The control unit 12 can be configured by, for example, a microcomputer, a central processing unit (CPU), a micro-processing unit (MPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), or an application-specific integrated circuit (ASIC).

The camera 13 includes an imaging unit 13a and a memory 13b. The imaging unit 13a includes, for example, a Charge Coupled Device (CCD) line sensor or a Complementary Metal Oxide Semiconductor (CMOS) line sensor. The camera 13 scans and images the painting line by line. The imaging unit 13a generates image data of the painting for each line. The image data includes color information (RGB or CMYK) for each pixel. The image data captured by the imaging unit 13a is stored in the memory 13b. Combining the per-line image data stored in the memory 13b yields two-dimensional image data. The memory 13b can be realized by, for example, a Random Access Memory (RAM), a Dynamic Random Access Memory (DRAM), a ferroelectric memory, a flash memory, a magnetic disk, or a combination of these.
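The per-line assembly described above can be sketched in a few lines. This is a minimal illustration under assumed names (`assemble_scan`, `lines`); the patent only states that combining the stored line data yields two-dimensional image data.

```python
import numpy as np

def assemble_scan(lines):
    """Stack per-line RGB scans (each of shape (width, 3)) into a
    two-dimensional image, assuming `lines` is already in scan order."""
    return np.stack(lines, axis=0)  # shape: (num_lines, width, 3)

# Example: three scanned lines of a 4-pixel-wide image
lines = [np.zeros((4, 3), dtype=np.uint8) for _ in range(3)]
image = assemble_scan(lines)
print(image.shape)  # (3, 4, 3)
```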

The first illumination unit 14 and the second illumination unit 15 are scanning illumination light sources. Specifically, the first illuminating unit 14 and the second illuminating unit 15 are a high color rendering straight tube type fluorescent lamp, a line LED illumination in which high color rendering white light emitting diodes (LEDs) are linearly arranged, or the like. is there. In the first embodiment, the first illumination unit 14 and the second illumination unit 15 are installed on both sides of the camera 13. Furthermore, the 1st illumination part 14 and the 2nd illumination part 15 are arrange | positioned in the symmetrical position with respect to the surface perpendicular | vertical to the main surface of an object (painting). If a picture is captured by the camera 13 in a state where the picture is illuminated by one of the first illumination part 14 and the second illumination part 15, image data including the shadow of the convex part of the picture can be generated. On the other hand, if a picture is taken with the camera 13 in a state where the picture is illuminated by both the first illumination part 14 and the second illumination part 15, image data that does not include the shadow of the convex part can be generated.

The moving device 16 is connected to the camera 13, the first illumination unit 14, and the second illumination unit 15. The moving device 16 moves the camera 13, the first illumination unit 14, and the second illumination unit 15 in the scan direction. As a result, the camera 13 can capture the picture line by line while moving.

1-2. Configuration of Image Processing Device
The image processing device 20 includes an input/output unit 21, a control unit 22, and a memory 23.

The input/output unit 21 includes an input unit 21a and a communication unit 21b. The communication unit 21b functions as both an input unit and an output unit. The input unit 21a is a keyboard, a mouse, a touch panel, or the like. The communication unit 21b includes an interface circuit for communicating with external devices in compliance with a predetermined communication standard (for example, LAN or WiFi). For example, when the user inputs an instruction to capture image data via the input unit 21a, the communication unit 21b outputs a request to capture image data to the imaging device 10. When image data is transmitted from the imaging device 10, the communication unit 21b receives the image data. Further, the communication unit 21b outputs the height distribution information generated by the control unit 22 to the printing device 30. The communication unit 21b also outputs the image data transmitted from the imaging device 10 to the printing device 30 as color image data (color information).

The control unit 22 controls the entire image processing device 20. The control unit 22 calculates the height of a convex portion formed on the surface of the painting from the length of the shadow included in the image data received by the communication unit 21b. Information indicating the calculated height is defined as height information. Height information is generated for all convex portions existing in the entire desired region of the surface of the object. The control unit 22 thereby generates height distribution information indicating the height distribution of the entire desired region of the surface of the object. Specifically, height data that represents the height of the surface of the painting as a numerical value for each pixel is generated as the height distribution information. In this data, for example, the numerical value increases as the height of the convex portion increases. The control unit 22 stores the generated height distribution information in the memory 23. Further, the control unit 22 outputs the generated height distribution information to the printing device 30 via the communication unit 21b. The control unit 22 can be realized by a semiconductor element or the like. The function of the control unit 22 may be configured by hardware alone, or may be realized by combining hardware and software. The control unit 22 can be configured by, for example, a microcomputer, a CPU, an MPU, a DSP, an FPGA, or an ASIC. The memory 23 can be realized by, for example, a RAM, a DRAM, a ROM, a ferroelectric memory, a flash memory, a magnetic disk, or a combination of these.

1-3. Configuration of Printing Device
The printing device 30 generates an image that reproduces the height of the surface of the painting, that is, an image including convex portions, based on the height distribution information and the color image data received from the image processing device 20. The printing device 30 is, for example, a UV inkjet printer that uses ultraviolet-curable (UV) ink. The UV ink is cured by irradiation with ultraviolet (UV) rays. The printing device 30 can perform multilayer printing. The printing device 30 generates an image including convex portions by depositing thicker ink where the numerical height value indicated in the height distribution information is larger.
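The mapping from a per-pixel height value to an ink thickness might be realized as below. This is only an illustrative sketch: the linear scale, the 8-bit height range, and the `max_passes` parameter are assumptions, since the text states only that thicker ink is deposited where the height value is larger.

```python
def ink_passes(height_value, max_height=255, max_passes=8):
    """Map a per-pixel height value to a number of UV-ink layers.

    The linear mapping and the parameter defaults are assumptions for
    illustration, not values from the patent.
    """
    height_value = max(0, min(height_value, max_height))
    return round(height_value * max_passes / max_height)

print(ink_passes(0))    # 0 layers where the surface is flat
print(ink_passes(255))  # 8 layers at the maximum height value
```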

1-4. FIG. 2 shows a state in which the painting 200 is imaged by the imaging device 10. FIG. 3 shows a schematic side view of the imaging device 10. Here, the main surface of the painting 200 is the surface seen when the painting 200 is viewed from the front; that is, the main surface of the painting 200 refers to a plane in which the unevenness is ignored. The main surface of the painting 200 is described as a plane defined by the X axis and the Y axis, and the height of a convex portion is described as the height in the Z-axis direction. The X axis, the Y axis, and the Z axis are orthogonal to one another. In the first embodiment, for convenience of explanation, the positive X-axis direction is the right direction and the negative X-axis direction is the left direction, while the positive Y-axis direction is the downward direction and the negative Y-axis direction is the upward direction.

As shown in FIGS. 2 and 3, the moving device 16 of the imaging device 10 includes one first movable body 16a, four second movable bodies 16d, two first guide rails 16b, two second guide rails 16c, and a frame 16e.

The two first guide rails 16b are arranged in parallel to the Y-axis direction. Each first guide rail 16b is disposed at a position facing the painting 200.

The two second guide rails 16c are arranged in parallel to the X-axis direction. The two second guide rails 16c are arranged one at each end of the painting 200 in the Y-axis direction.

The first movable body 16a moves back and forth in the Y-axis direction along the two first guide rails 16b.

The four second movable bodies 16d are arranged one at each end of each first guide rail 16b. Each second movable body 16d moves forward and backward in the X-axis direction along each second guide rail 16c. The first movable body 16a and the second movable bodies 16d are driven forward and backward by a motor or the like.

The frame 16e is connected to the first movable body 16a. The camera 13, the first illumination unit 14, and the second illumination unit 15 are fixed to the frame 16e. With this configuration, the camera 13, the first illumination unit 14, and the second illumination unit 15 are movable in the X and Y directions. Note that the moving device 16 can also include a third movable body that allows the first illumination unit 14 and the second illumination unit 15 to move up and down in the Z-axis direction. When the camera 13 scans and captures the painting 200, the control unit 12 drives and controls the moving device 16 so that the camera 13, the first illumination unit 14, and the second illumination unit 15 translate together at a constant speed in the scan direction. The scan direction is not limited to the vertical direction of the painting 200 and may be any direction. For example, the scan direction may be vertical, horizontal, or oblique depending on the arrangement or orientation of the painting 200.

In the first embodiment, the illumination light of the first illumination unit 14 and the second illumination unit 15 illuminates the imaged portion 150 (one line) of the painting 200 directly under the camera 13. The angle (illumination angle) θ between the illumination directions 14a and 15a of the first illumination unit 14 and the second illumination unit 15 toward the imaged portion 150 and the main surface of the painting 200 is set to the same constant angle. The illumination angle θ is, for example, 30°. The first illumination unit 14 and the second illumination unit 15 illuminate the imaged portion 150 from above and below it in the Y-axis direction, respectively. Illuminating the imaged portion 150 obliquely from above or below in this way makes it possible to generate shaded image data. Note that the illumination angle θ may be any angle at which shadows appear under illumination; 20° to 45° is particularly suitable.

FIG. 4A and FIG. 4B show the state when the imaging device 10 illuminates a painting 200 having a convex portion 201 on its surface. The convex portion 201 is a thickness of paint. FIG. 4A shows a shadow S1 that is generated when the first illumination unit 14 illuminates the painting 200 obliquely from above. FIG. 4B shows a shadow S2 that is generated when the second illumination unit 15 illuminates the painting 200 obliquely from below.

As shown in FIG. 4A, when the painting 200 is illuminated obliquely from above by the first illumination unit 14, a shadow S1 of the convex portion 201 appears on the lower side of the convex portion 201 (the positive Y-axis direction). The control unit 22 of the image processing device 20 can calculate the length of the shadow S1 from the number of pixels. Further, the control unit 22 can calculate the height H1 of the lower side of the convex portion 201 based on the length of the shadow S1 and the illumination angle θ of the first illumination unit 14 (H1 = length of shadow S1 × tan θ). Similarly, when the painting 200 is illuminated obliquely from below by the second illumination unit 15, a shadow S2 of the convex portion 201 appears on the upper side of the convex portion 201 (the negative Y-axis direction). The control unit 22 can calculate the height H2 of the upper side of the convex portion 201 based on the length (number of pixels) of the shadow S2 and the illumination angle θ of the second illumination unit 15 (H2 = length of shadow S2 × tan θ).
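The height-from-shadow relation H = (shadow length) × tan θ translates directly into code. A minimal sketch: the function name and the `pixel_pitch_mm` conversion factor are illustrative assumptions, while the 30° default matches the example angle given in the text.

```python
import math

def convex_height(shadow_len_px, pixel_pitch_mm, illum_angle_deg=30.0):
    """Height of a convex portion from its shadow: H = L * tan(theta).

    shadow_len_px is the shadow length in pixels; pixel_pitch_mm converts
    pixels to millimetres (an assumed calibration parameter).
    """
    shadow_mm = shadow_len_px * pixel_pitch_mm
    return shadow_mm * math.tan(math.radians(illum_angle_deg))

# A 10-pixel shadow at 0.1 mm/pixel and theta = 30 deg gives about 0.577 mm
print(round(convex_height(10, 0.1), 3))  # 0.577
```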

2. Operation
FIG. 5 shows the process by which the control unit 22 of the image processing device 20 generates height distribution information.

First, as shown in FIG. 3, both the first illumination unit 14 and the second illumination unit 15 of the imaging device 10 simultaneously illuminate the imaged portion 150 of the painting 200 at the illumination angle θ from above and below (the positive and negative Y-axis directions). In this state, the camera 13 of the imaging device 10 scans and captures the painting 200 and generates first image data as color image data (color information). The first image data includes the color information (RGB or CMYK) of each pixel in which the painting 200 is captured. The first image data is image data of a two-dimensional image that does not include the shadow of the convex portion 201. In this way, when both the first illumination unit 14 and the second illumination unit 15 illuminate the painting 200 simultaneously, image data that does not contain the shadow of the convex portion 201 is obtained. FIG. 12A schematically shows the color image data (first image data); that is, FIG. 12A is shooting data obtained by photographing the painting. The painting is an oil painting, and objects that lie in the foreground of the actual landscape are rendered with thicker raised paint. For example, the raised height of the paint is adjusted so that, in order from nearest to farthest, the trees, the clouds in front of the mountains, the mountains, the clouds behind the mountains, and the sky are progressively less raised. On the other hand, the painting is colored as the painter intended, so the brightness of each painted part is unrelated to the actual unevenness of the paint. For example, the clouds are white and the sky is blue; the leaves of the trees are green, with different shades depending on the type of tree. The image processing device 20 inputs the first image data (color image data) generated by the imaging device 10 as described above (S501).

Next, as shown in FIG. 4A, only the first illumination unit 14 illuminates the imaged portion 150 of the painting 200. In this state, the camera 13 scans and captures the painting 200 to generate second image data as shadow image data (shadow information). The second image data is image data of a two-dimensional image including the shadow S1 on the lower side of the convex portion 201. The image processing device 20 inputs the second image data (shadow image data) (S502). The control unit 22 of the image processing device 20 calculates the length (for example, the number of pixels) of the shadow S1 on the lower side of the convex portion 201 based on, for example, the luminance value or color of the pixels (S503). Based on the length of the shadow S1 and the illumination angle θ of the first illumination unit 14, the control unit 22 calculates the height H1 of the lower side of the convex portion 201 in the Y-axis direction (S504).
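Measuring a shadow length from pixel values (step S503) might look as follows. This is a sketch under stated assumptions: the text says only that luminance or color is used, so the fixed threshold of 64 and the longest-dark-run criterion are illustrative choices, not the patent's method.

```python
import numpy as np

def shadow_length(luma_profile, threshold=64):
    """Length (in pixels) of the longest run of dark pixels in a 1-D
    luminance profile taken along the scan axis.

    The threshold value is an assumed parameter for illustration.
    """
    run = 0
    longest = 0
    for v in luma_profile:
        if v < threshold:           # pixel is dark enough to be shadow
            run += 1
            longest = max(longest, run)
        else:                       # bright pixel ends the current run
            run = 0
    return longest

profile = np.array([200, 190, 40, 35, 50, 210, 205])  # one shadow, 3 px
print(shadow_length(profile))  # 3
```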

Next, as shown in FIG. 4B, only the second illumination unit 15 illuminates the imaged portion 150 of the painting 200. In this state, the camera 13 scans and captures the painting 200 to generate third image data as shadow image data (further shadow information). The third image data is image data of a two-dimensional image including the shadow S2 on the upper side of the convex portion 201. The control unit 22 inputs the third image data (shadow image data) (S505). The control unit 22 calculates the length (for example, the number of pixels) of the shadow S2 on the upper side of the convex portion 201 included in the third image based on, for example, the luminance value or color (S506). The control unit 22 calculates the height H2 of the upper side of the convex portion 201 in the Y-axis direction based on the length of the shadow S2 and the illumination angle θ of the second illumination unit 15 (S507).

Based on the height H1 of the convex portion 201 calculated from the second image data and the height H2 of the convex portion 201 calculated from the third image data, the control unit 22 calculates the total height H3 of the convex portion 201. The overall height of the convex portion 201 can be calculated, for example, by interpolating the height H3 between the lower height H1 and the upper height H2 of the convex portion 201. In the first embodiment, the height H3 is used as the height information. Although the height of the convex portion 201 is the height H3 calculated from the height H1 and the height H2 in the first embodiment, the height of the convex portion 201 may instead be either the height H1 or the height H2 alone.
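The interpolation step above can be sketched as follows. The text does not specify the interpolation scheme, so linear interpolation between the two edge heights is one plausible reading; the function name and row count are illustrative.

```python
import numpy as np

def interpolate_height(h1, h2, n_rows):
    """Linearly interpolate from the lower-edge height h1 to the
    upper-edge height h2 across n_rows of the convex portion.

    Linear interpolation is an assumed realisation of the H3 step.
    """
    return np.linspace(h1, h2, n_rows)

profile = interpolate_height(1.0, 2.0, 5)
print(list(profile))  # [1.0, 1.25, 1.5, 1.75, 2.0]
```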

In addition to the first illumination unit 14 and the second illumination unit 15, which are arranged in the vertical direction (Y-axis direction) with respect to the camera 13, the imaging device 10 may further include a third illumination unit and a fourth illumination unit arranged in the lateral direction (X-axis direction) with respect to the camera 13. In this case, the third illumination unit and the fourth illumination unit illuminate the imaged unit 150 from the left and right at the same illumination angle θ. As a result, image data including shadows formed on the left and right sides of the convex portion 201 is obtained. The overall height of the convex portion 201 may then be calculated based on the heights calculated from the vertical and horizontal shadows of the convex portion 201.

In this way, the control unit 22 calculates the heights of all the convex portions 201 included in the painting 200. Then, the control unit 22 calculates the height over the entire image of the painting 200 (all pixels constituting the image) and generates height distribution information indicating the height distribution of the entire image (S508). The height distribution information is, for example, data representing the height of each pixel in the image as a numerical value. When the height distribution information is rendered as an image based on the numerical data, an image such as that shown in FIG. 12B is obtained. In the image of FIG. 12B, unlike the image of FIG. 12A, portions where the swell of the paint is low appear black, and portions where the swell of the paint is high appear white. When the painting includes fine convex portions left by brush strokes, the shadows of those fine convex portions are also reflected in the height distribution information of FIG. 12B.
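Rendering the numerical height data as an image in the manner of FIG. 12B amounts to mapping heights to gray levels. A minimal sketch, assuming 8-bit output and a known maximum height (both assumptions for illustration):

```python
def to_grayscale(heights, max_height_mm):
    """Map per-pixel height values to 8-bit gray levels: low relief maps
    to black (0) and high relief to white (255), as in FIG. 12B."""
    return [round(255 * min(h, max_height_mm) / max_height_mm) for h in heights]
```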

The numerical data of the height distribution information is output from the image processing apparatus 20 to the printing apparatus 30. The color image data is likewise output from the image processing apparatus 20 to the printing apparatus 30 (S509).

FIG. 6 shows an example of a cross section of an image duplicated by printing with the printing apparatus 30. The printing apparatus 30 prints the transparent ink 72 a plurality of times on the base material 71 (paper, cloth, plastic, etc.) based on the height distribution information output from the image processing apparatus 20. The larger the numerical value in the height distribution information, the larger the discharge amount of the transparent ink 72 that is required. In pixels where the discharge amount of the transparent ink 72 is large, a plurality of layers are formed by printing the transparent ink 72 a plurality of times. Since the transparent ink 72 is cured immediately upon exposure to ultraviolet rays, an upper layer can be printed immediately after the lower layers have been printed. That is, in a pixel with many layers and a large discharge amount of the transparent ink 72, the surface rises higher, so that the convex portion 201 can be represented. The printing apparatus 30 then prints the color ink 73 on the upper surface of the transparent ink 72 based on the color image data output from the image processing apparatus 20. In this way, the painting 200 can be duplicated with the convex portion 201 reproduced.
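The rule "the larger the value in the height distribution information, the larger the discharge amount of the transparent ink 72" can be sketched as a pass count, under the purely illustrative assumption that each UV-cured pass deposits a fixed thickness (the 0.05 mm figure is a hypothetical value, not from the embodiment):

```python
import math

def passes_for_height(height_mm: float, layer_thickness_mm: float = 0.05) -> int:
    """Number of transparent-ink printing passes needed to build a pixel
    up to the requested height, assuming each UV-cured pass deposits a
    fixed thickness of ink."""
    if height_mm <= 0:
        return 0
    return math.ceil(height_mm / layer_thickness_mm)
```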

3. Effects, etc.
Conventional reproductions (replicas) of paintings are flat because they were produced by color printing from data captured with a camera, scanner, or the like, and the convex portions included in the paintings were not reproduced. For this reason, conventional reproductions of paintings do not change in appearance with the viewpoint of the viewer or with the lighting applied to them, resulting in a lack of realism. It has also been possible to express a feeling of unevenness using a resin or the like, but this has been expensive because the convex portions had to be created manually.

In contrast, according to the replication system 100 of the present disclosure, the shadow of a convex portion of a painting is photographed by imaging the painting while it is illuminated at a predetermined angle, and the height of the convex portion is calculated from the length of the shadow. Therefore, when replicating a painting, the convex portions (the height of the painted surface) can be reproduced. This makes it possible to generate a reproduction of a painting that is closer to the real thing.

According to the replication system 100 of the present disclosure, it is not necessary to use a dedicated unevenness measuring device to measure the height of the surface of the painting 200. Therefore, a replica with a sense of unevenness can be produced at low cost.

Also, according to the replication system 100 of the present disclosure, it is not necessary to irradiate the painting with a laser to obtain height information. Therefore, a replica with a sense of unevenness can be created without placing a load on the painting.

Also, in the replication system 100 of the present disclosure, the painting 200 is illuminated from each of the upward direction and the downward direction, and the height H1 and the height H2 are calculated from the shadow S1 and the shadow S2 of the convex portion 201. The height H3 of the convex portion 201 is then generated from the height H1 and the height H2. By generating the height information based on the pieces of shadow information obtained when illuminating from a plurality of directions in this way, the convex portion 201 can be reproduced more faithfully. Furthermore, the painting 200 may be illuminated from each of the left direction, the right direction, and diagonal directions, and the height information may be generated based on the shadows of each convex portion 201. The convex portion 201 can thereby be reproduced even more faithfully.

Further, in the replication system 100 of the present disclosure, even when the user does not know the number, positions, and shapes of the convex portions, an object including convex portions can be duplicated based on the image data captured by the imaging device 10.

(Embodiment 2)
As described above, the first embodiment has been described as an example of the technique disclosed in the present application. However, the technology in the present disclosure is not limited to this, and can also be applied to embodiments in which changes, substitutions, additions, omissions, and the like are made as appropriate. It is also possible to combine the components described in the first embodiment to form a new embodiment. Accordingly, the second embodiment is exemplified below.

When forming a convex portion by printing with the printing apparatus 30, the image processing apparatus 20 may instruct the printing apparatus 30 in a plurality of passes. That is, the image processing apparatus 20 may divide the height distribution information into a plurality of pieces of height distribution information and give the printing apparatus 30 an instruction based on each piece. For example, when the width of a convex portion is narrow, it is difficult to form the convex portion by raising the ink to its full height in a single printing pass. It is possible, however, to raise the ink high by printing in multiple passes and stacking the ink layers.

For example, as shown in FIG. 7, when the convex portion 201 is photographed and reproduced with an ink layer, the height information is divided and generated according to the width W of the convex portion 201 in the surface direction of the main surface of the object (the X direction or the Y direction in FIG. 7). In FIG. 7, the height information is divided according to the width W of the convex portion 201 in the Y direction.

Specifically, when printing the convex portion 201 and the convex portion 202, which have different widths as shown in FIG. 8A, the image processing apparatus 20 identifies the wide convex portion 202 as the first layer and the narrow convex portion 201 as the second layer. Based on first height information indicating the height of the wide convex portion 202, first height distribution information as shown in FIG. 8B is generated. Further, based on second height information indicating the height of the narrow convex portion 201, second height distribution information as shown in FIG. 8C is generated. The first height distribution information and the second height distribution information may be generated by dividing one piece of height distribution information, or may be generated separately in advance.
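For a one-dimensional height profile, the identification of wide and narrow convex portions as first and second layers could be sketched as a run-length split; the approach, names, and threshold parameter below are assumptions for illustration only:

```python
def split_by_width(profile, min_wide_px):
    """Split a 1-D height profile into two maps: one for convex portions
    at least min_wide_px wide (first layer) and one for narrower
    portions (second layer). Zero height means no convex portion."""
    wide = [0.0] * len(profile)
    narrow = [0.0] * len(profile)
    i = 0
    while i < len(profile):
        if profile[i] > 0:
            j = i
            while j < len(profile) and profile[j] > 0:  # find end of run
                j += 1
            target = wide if (j - i) >= min_wide_px else narrow
            target[i:j] = profile[i:j]
            i = j
        else:
            i += 1
    return wide, narrow
```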

The image processing apparatus 20 first instructs the printing apparatus 30 to perform printing based on the first height distribution information shown in FIG. 8B, and specifies that printing based on the first height distribution information is to be performed once. Therefore, first, as shown in FIG. 8D, the ink layer 302 corresponding to the wide convex portion 202 is printed. The ink layer 302 is formed by a single printing pass.

Thereafter, the image processing apparatus 20 instructs the printing apparatus 30 to perform printing based on the second height distribution information shown in FIG. 8C, and specifies that printing based on the second height distribution information is to be performed twice. Then, as shown in FIG. 8E, the ink layer 301 corresponding to the narrow convex portion 201 is formed by two printing passes. In this way, a plurality of ink layers (ink layer 301a and ink layer 301b) is formed. Note that the number of printing passes may also be more than two.

Here, it is difficult to form the narrow ink layer 301 in a single printing pass, because the ink sags before curing. In contrast, the image processing apparatus 20 according to the second embodiment specifies a plurality of printing passes (two in FIG. 8E). Accordingly, the ink can be raised higher, and the same height as the original convex portion 201 can be reproduced. On the other hand, it is relatively easy to form the wide ink layer 302 by raising the ink in a single printing pass. Since the ink layer 302 can be formed in one pass, the total number of printing passes, and hence the printing time, can be kept down.

As described above, when a plurality of convex portions (for example, the convex portion 201 and the convex portion 202) are formed on the surface of an object, each convex portion is identified, according to its width, as at least one of a plurality of layers including the first layer and the second layer. By specifying the number of printing passes for each layer, printing can then be performed efficiently and accurately.

Note that, as described above, it is difficult to reproduce the narrow convex portion 201 by raising the ink to its full height in a single printing pass. Increasing the number of printing passes, however, makes it possible to raise the ink higher. Therefore, when the width of the convex portion 201 is less than a predetermined value, the number of printing passes may be increased compared with the case where the width of the convex portion 201 is equal to or larger than the predetermined value. The predetermined value in the second embodiment is in the range of 0.2 mm to 0.5 mm, and varies depending on the printing characteristics of the printing apparatus 30 and the curing characteristics of the ink.
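The width test described here can be sketched as a simple threshold rule. The 0.3 mm default below is an assumed value chosen from within the 0.2 mm to 0.5 mm range stated above, and the function name is hypothetical:

```python
def printing_passes(width_mm: float, threshold_mm: float = 0.3) -> int:
    """Choose the number of printing passes for a convex portion: a
    portion narrower than the threshold is built up in more passes so
    that the ink does not sag before curing."""
    return 2 if width_mm < threshold_mm else 1
```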

In the second embodiment, a convex portion is identified as the first layer when its width is "equal to or greater than" the predetermined value and as the second layer when its width is "less than" the predetermined value. Alternatively, a convex portion may be identified as the first layer when its width is "greater than" the predetermined value and as the second layer when its width is "equal to or less than" the predetermined value. Also, in the second embodiment, the first layer and the second layer are identified by the width of the convex portion, but a convex portion may instead be identified as any one of three or more layers, including a third layer and a fourth layer.

In the description of the second embodiment, the height information is divided according to the width W in the surface direction of the main surface of the object (the X direction or the Y direction in FIG. 7), but the height information may instead be divided according to the area (spatial frequency) of the convex portion. That is, when forming the convex portion 201 and the convex portion 202, which have different areas (spatial frequencies) as shown in FIG. 8A, height information indicating the height of the convex portion 202 with a large area (low spatial frequency) may be used as the first height information, and height information indicating the height of the convex portion 201 with a small area (high spatial frequency) may be used as the second height information.

Further, the first height information and the second height information may be calculated from each shadow information when the object is illuminated from two or more directions. Thereby, the convex part of an object can be duplicated more faithfully.

In the second embodiment, the first height information and the second height information are generated based on the shadow information output from the imaging device 10 and the illumination angle, as in the first embodiment. However, the method of generating the first height information and the second height information is not limited to this. The first height information and the second height information may be obtained using an existing dedicated unevenness measuring device, a laser, or the like, and input to the image processing apparatus 20. Regardless of how the height information is obtained, by identifying a convex portion as at least one of a plurality of layers including the first layer and the second layer under a predetermined condition and printing each layer separately, as shown in the second embodiment, a replica can be produced efficiently and accurately.

(Embodiment 3)
In the third embodiment, as in the second embodiment, when a convex portion is reproduced by printing, the printing is performed in a plurality of passes. That is, the image processing apparatus 20 divides the height distribution information into a plurality of pieces of height distribution information. In particular, in the third embodiment, the height distribution information is divided according to the shape of the convex portion of the painting 200. FIG. 9 is a diagram for explaining image duplication in the third embodiment.

As shown in FIG. 9, the shape of the convex portion 203 of the painting 200 is a multi-stage shape that is identified by dividing it into multiple stages (two stages in FIG. 9) in the height direction. The convex portion 203 is identified by dividing it at the multi-stage boundary line into a lower portion 203a (first stage) and an upper portion 203b (second stage).

That is, the control unit 22 of the image processing apparatus 20 identifies the lower portion 203a as the first layer and the upper portion 203b as the second layer. Further, the control unit 22 generates height information (first height information) of the lower portion 203a and height information (second height information) of the upper portion 203b. The height information of the lower portion 203a is calculated from the shadow S1a and the shadow S2a. The height information of the upper portion 203b is calculated from the shadow S1b and the shadow S2b. The control unit 22 then generates height distribution information (first height distribution information) of the entire surface of the object (for example, the painting 200) based on the first height information indicating the height of the lower portion 203a. Further, the control unit 22 generates height distribution information (second height distribution information) of the entire surface of the object based on the second height information indicating the height of the upper portion 203b. Then, the control unit 22 instructs the printing apparatus 30, via the communication unit 21b, to perform printing based on the first height distribution information and printing based on the second height distribution information separately. Thereby, the convex portion 203 can be formed with a clear boundary between the upper portion 203b and the lower portion 203a.
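The division of the multi-stage convex portion 203 into a lower portion and an upper portion at the multi-stage boundary can be sketched as clipping a height profile at the boundary height. This is an illustrative assumption about the procedure, not the embodiment's exact method:

```python
def split_by_boundary(profile, boundary):
    """Split a height profile at a multi-stage boundary height: the part
    up to the boundary becomes the lower (first) layer, and the excess
    above it becomes the upper (second) layer."""
    lower = [min(h, boundary) for h in profile]
    upper = [max(h - boundary, 0.0) for h in profile]
    return lower, upper
```

Printing the lower map first, curing it, and then printing the upper map reproduces the two-stage shape with a clear edge.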

FIG. 10A shows the height distribution information of the convex part 201 and the convex part 203 generated by the replication system of the third embodiment.

In the third embodiment, the height distribution information is divided into first height distribution information and second height distribution information. That is, the control unit 22 of the image processing apparatus 20 sequentially searches the height information of the convex portion 201 and the convex portion 203 according to their widths, as in the second embodiment.

Since the width of the lower portion of the convex portion 203 (the lower portion 203a shown in FIG. 10B) is equal to or greater than a predetermined value, the control unit 22 identifies the lower portion 203a as the first layer. The control unit 22 then generates the first height distribution information shown in FIG. 10B based on the first height information indicating the height of the first layer.

Further, since the width of the convex portion 201 and the width of the upper portion of the convex portion 203 (the upper portion 203b shown in FIG. 10C) are less than the predetermined value, the control unit 22 identifies the convex portion 201 and the upper portion 203b as the second layer. The control unit 22 then generates the second height distribution information shown in FIG. 10C based on the second height information indicating the height of the second layer.

Then, the control unit 22 designates the divided first height distribution information and second height distribution information via the communication unit 21b, and instructs the printing apparatus 30 to print twice.

The printing apparatus 30 first performs printing based on the first height distribution information. As a result, as shown in FIG. 10D, the ink layer 303a corresponding to the lower portion 203a of the convex portion 203 is printed. The ink layer 303a is then cured by irradiation with ultraviolet rays.

Next, as shown in FIG. 10E, printing is performed based on the second height distribution information. As a result, the ink layer 301 corresponding to the convex portion 201 and the ink layer 303b corresponding to the upper portion 203b of the convex portion 203 are printed. At this time, since the ink layer 303a has already been cured, the upper ink layer 303b is prevented from flowing down over the lower ink layer 303a, and the periphery of the ink layer 303b is prevented from becoming rounded. That is, the edge of the upper ink layer 303b can be formed clearly.

On the other hand, FIG. 10F shows an ink layer 401 and an ink layer 403 produced by another duplication system, for comparison with the third embodiment. The ink layer 401 is a copy of the convex portion 201, and the ink layer 403 is a copy of the convex portion 203. In this comparative example, as in the third embodiment, UV ink that is cured by irradiation with ultraviolet rays is used in the printing apparatus. Unlike the third embodiment, however, the height distribution information is not divided when generated. Therefore, in the comparative example, the printing apparatus prints the two-stage ink layer 403 in one pass. In such a comparative example, the upper part of the ink layer 403 (the part corresponding to the upper portion 203b) is raised while the lower part (the part corresponding to the lower portion 203a) has not yet cured. Therefore, as shown in FIG. 10F, the upper ink flows downward, and the ink layer 403 is formed with a gently sloping (edgeless) upper outer periphery. As a result, it becomes difficult to express a multi-stage shape such as that of the convex portion 203. In particular, if the width of the upper portion 203b is less than a predetermined value, ink may sag from the center of the upper portion 203b, and the height of the upper portion 203b may fall below the desired height. The predetermined value is in the range of 0.5 mm to 0.8 mm, and varies depending on the printing characteristics of the printing apparatus and the curing characteristics of the ink.

In the third embodiment, the height information is divided according to the width W in the surface direction of the main surface of the object (the X direction or the Y direction in FIG. 7), but the height information may instead be divided according to the area (spatial frequency) of the convex portion. Further, the height information may be divided by searching the convex portion in the height direction and detecting a multi-stage step (the boundary between the lower portion 203a and the upper portion 203b shown in FIG. 10A). Further specific examples are described below.

FIG. 11A shows height distribution information generated by the replication system according to another example of the third embodiment. FIGS. 11B and 11C show an example in which convex portions are searched in the height direction and the height distribution information is divided.

First, the control unit 22 of the image processing apparatus 20 searches the height distribution information shown in FIG. 11A from the bottom up, detecting multi-stage steps in the search process. The control unit 22 then identifies the convex portion 201, the lower portion 203a of the convex portion 203, and the lower portion 204a of the convex portion 204 shown in FIG. 11B as lower layer portions (referred to as the first layer). The control unit 22 generates first height distribution information as shown in FIG. 11B based on the height information (first height information) of each lower layer portion. Further, the control unit 22 identifies the upper portion 203b of the convex portion 203 and the upper portion 204b of the convex portion 204 shown in FIG. 11C as upper layer portions (referred to as the second layer), and generates second height distribution information as shown in FIG. 11C based on the height information (second height information) of each upper layer portion. The height distribution information can thereby be divided into the first height distribution information and the second height distribution information.
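The bottom-up search that divides the height distribution at multi-stage steps could be generalized to any number of boundary heights. The following sketch assumes a one-dimensional profile and known boundary heights; both are illustrative assumptions:

```python
def split_into_levels(profile, boundaries):
    """Divide a height profile into stacked layer maps, one per band
    between successive boundary heights, searching from the bottom up.
    Each layer map holds only the thickness contributed by its band."""
    levels = []
    prev = 0.0
    for b in list(boundaries) + [float("inf")]:
        levels.append([min(max(h - prev, 0.0), b - prev) for h in profile])
        prev = b
    return levels
```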

As another method of dividing the height distribution information in FIG. 11A, as in FIGS. 10B and 10C, the information may be divided according to the width W in the surface direction of the main surface of the object (the X direction or the Y direction in FIG. 7) or according to the area (spatial frequency) of the convex portion.

That is, since the width W, in the surface direction of the main surface of the object, of the lower portion 203a of the convex portion 203 and of the lower portion 204a of the convex portion 204 is equal to or greater than a predetermined value (W1), the control unit 22 identifies them as the first layer. The control unit 22 then generates the first height distribution information shown in FIG. 11D based on the first height information indicating the height of each first layer.

Further, since the width W of the upper portion 204b of the convex portion 204 is equal to or greater than the predetermined value W2 and less than the predetermined value W1, the control unit 22 identifies it as the second layer, and generates the second height distribution information shown in FIG. 11E based on the second height information indicating the height of the second layer.

Moreover, since the width W of the convex portion 201 and of the upper portion 203b of the convex portion 203 is less than a predetermined value (W2), the control unit 22 identifies them as the third layer, and generates the third height distribution information shown in FIG. 11F based on the third height information indicating the height of each third layer.
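The three-way classification by the width thresholds W1 and W2 amounts to a small decision rule. The sketch below is illustrative only; the function name is an assumption:

```python
def layer_for_width(width, w1, w2):
    """Classify a convex portion by its width W, per the thresholds
    above: first layer when W >= W1, second layer when W2 <= W < W1,
    and third layer when W < W2 (requires w1 > w2)."""
    if width >= w1:
        return 1
    if width >= w2:
        return 2
    return 3
```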

In addition, when the width W of the convex portion is larger than the predetermined value W1, the convex portion may be identified as the first layer. The convex portion may be identified as the second layer when the width W of the convex portion is larger than the predetermined value W2 and equal to or smaller than the predetermined value W1. Furthermore, the convex portion may be identified as the third layer when the width W of the convex portion is equal to or smaller than the predetermined value W2.

The printing apparatus 30 performs printing based on the first height distribution information shown in FIG. 11D, the second height distribution information shown in FIG. 11E, and the third height distribution information shown in FIG. 11F. Here, when it is desired to print ink higher than the height the printing apparatus 30 can print at one time, or when it is desired to emphasize details, printing based on one piece of height distribution information can be performed in multiple steps. That is, the printing apparatus 30 performs printing in the order of the first height distribution information in FIG. 11D and the second height distribution information in FIG. 11E. Thereafter, when printing based on the third height distribution information in FIG. 11F, first, as shown in FIG. 11G, the ink layer 301c for the lower layer of the convex portion 201 and the ink layer 303c for the lower layer of the upper portion 203b are printed. The ink layer 301c and the ink layer 303c are then cured, and the ink layer 301d for the upper layer of the convex portion 201 and the ink layer 303d for the upper layer of the upper portion 203b are printed again based on the same third height distribution information. In this way, the ink can easily be raised high.

In the third embodiment, the height information of the convex portion 201, the convex portion 203, and the convex portion 204 is generated based on the shadow information output from the imaging device 10 and the illumination angle, as in the first embodiment. However, the method of generating the first height information and the second height information is not limited to this. The first height information and the second height information may be obtained using an existing dedicated unevenness measuring device, a laser, or the like. Regardless of how the height information is obtained, by identifying the convex portion 201, the convex portion 203, and the convex portion 204 as at least one of a plurality of layers including the first layer and the second layer under predetermined conditions and printing each layer separately, as shown in the third embodiment, a replica can be produced efficiently and accurately.

In the above embodiments, the imaging device 10 is a scanner using a line scan camera, but the imaging device is not limited to a scanner. Since it is only necessary to obtain shadow image data in a form from which the height of the convex portion can be calculated, a normal camera that obtains a two-dimensional image, for example, may be used.

In the above embodiments, a painting is described as an example of the replication target of the replication system 100 of the present disclosure, but the replication target is not limited to a painting. For example, in addition to a painting, the target may be a sculpture. In this case, the region between recesses is recognized as a convex portion. The idea of the replication system 100 of the present disclosure can be applied whenever a planar object having convex portions is replicated together with the height information of the object surface.

In the above embodiments, the image processing device 20 is independent of each of the imaging device 10 and the printing device 30, but it may be integrated with the imaging device 10 or the printing device 30. When the image processing device 20 and the imaging device 10 are integrated, the communication unit 11b and the communication unit 21b may be omitted, and the control unit 12 and the control unit 22 may be combined into one.

Further, the duplication system 100 of the embodiments may be configured to duplicate a desired region of an object. For example, the user may be allowed to freely set the region of the image data output by the imaging device 10, the region of the height distribution information output by the image processing device 20, or the region printed by the printing device 30.

The replication system 100 of the present disclosure can be realized by hardware resources such as a processor and memory operating in cooperation with a program.

The components described in the attached drawings and the detailed description may include not only components essential for solving the problem but also components that are not essential, in order to exemplify the above technique. Therefore, it should not be concluded that those non-essential components are essential merely because they are described in the attached drawings and the detailed description.

For example, in the above embodiments, the moving device 16 moves the camera 13, the first illumination unit 14, and the second illumination unit 15 in the scan direction; however, the camera 13, the first illumination unit 14, and the second illumination unit 15 may instead be fixed while the painting 200 is moved. In solving the problem of the present disclosure, it is only necessary that the relative positional relationship between the camera 13, the first illumination unit 14, and the second illumination unit 15 be known, and the scanning method is not essential to solving the problem.

In addition, since the above embodiments are intended to exemplify the technique of the present disclosure, various modifications, replacements, additions, omissions, and the like can be made within the scope of the claims or an equivalent scope thereof.

The present disclosure can be applied to an image processing apparatus that generates data for reproducing a planar object (for example, a painting) having a convex portion, and a reproduction system that duplicates a painting.

DESCRIPTION OF SYMBOLS 10 Imaging device 11 Input / output part 11a Input part 11b Communication part 12 Control part 13 Camera 13a Imaging part 13b Memory 14 1st illumination part 15 2nd illumination part 16 Mobile device 20 Image processing apparatus 21 Input / output part 21a Input part 21b Communication part 22 Control unit 23 Memory 30 Printing device 71 Base material 72 Transparent ink 73 Color ink 100 Replication system 150 Imaged unit 200 Painting 201, 202, 203, 204 Protruding part 203a, 204a Lower part 203b, 204b Upper part 301, 301a, 301b, 301c , 301d Ink layer 302 Ink layer 303a, 303b, 303c, 303d Ink layer 401, 403 Ink layer

Claims (16)

  1. An image processing apparatus comprising:
    a control unit that inputs height information obtained by measuring a shape of a convex portion formed on a surface of an object, identifies the convex portion as at least one of a plurality of layers including a first layer and a second layer, generates first height distribution information indicating a height distribution of the surface of the object based on first height information indicating a height of the first layer among the height information, and generates second height distribution information indicating a height distribution of the surface of the object based on second height information indicating a height of the second layer among the height information; and
    an output unit that outputs the first height distribution information and the second height distribution information.
  2. When the convex portion has a multi-stage shape in which the width in a predetermined direction parallel to a main surface of the object differs between a first stage and a second stage at different positions in the height direction,
    the control unit also inputs information on the width of the convex portion, identifies the first stage as the first layer, and identifies the second stage as the second layer.
    The image processing apparatus according to claim 1.
  3. The control unit
    inputs information on the width of the convex portion in a predetermined direction parallel to the main surface of the object,
    identifies the convex portion as the first layer when the width of the convex portion is equal to or greater than a predetermined value, and
    identifies the convex portion as the second layer when the width of the convex portion is smaller than the predetermined value.
    The image processing apparatus according to claim 1.
  4. An input unit that inputs shadow information indicating a shadow of the convex portion, obtained by illuminating the object having the convex portion from a predetermined direction, is further provided, and
    the control unit
    generates the first height information and the second height information based on the shadow information obtained by illuminating the object from the predetermined direction and on the angle formed by the predetermined direction and the main surface of the object.
    The image processing apparatus according to claim 2.
  5. The input unit further inputs other shadow information indicating a shadow of the convex portion obtained by illuminating the object from another direction different from the predetermined direction, and
    the control unit generates the first height information and the second height information by additionally using the other shadow information and the angle formed by the other direction and the main surface of the object.
    The image processing apparatus according to claim 4.
  6. The first height distribution information and the second height distribution information are image data in which pixels representing the surface of the object are arranged at predetermined intervals and the height of the surface is expressed numerically for each pixel.
    The image processing apparatus according to any one of claims 1 to 5.
  7. The output unit further outputs color information of the object.
    The image processing apparatus according to claim 1.
  8. The color information is photographing data obtained by photographing the object illuminated simultaneously from a plurality of directions.
    The image processing apparatus according to claim 7.
  9. A replication system comprising:
    an image processing apparatus including
    a control unit that inputs height information obtained by measuring the shape of a convex portion formed on a surface of an object,
    identifies the convex portion as a plurality of layers including at least a first layer and a second layer,
    generates first height distribution information indicating a height distribution of the surface of the object based on first height information, of the height information, indicating a height of the first layer, and
    generates second height distribution information indicating a height distribution of the surface of the object based on second height information, of the height information, indicating a height of the second layer, and
    an output unit that outputs the first height distribution information and the second height distribution information; and
    a printing apparatus that performs printing based on the first height distribution information and the second height distribution information to generate a replica of the object.
  10. When the convex portion has a multi-stage shape in which the width in a predetermined direction parallel to a main surface of the object differs between a first stage and a second stage at different positions in the height direction,
    the control unit identifies the first stage as the first layer and identifies the second stage as the second layer.
    The replication system according to claim 9.
  11. The control unit identifies the convex portion as the first layer when the width of the convex portion in a predetermined direction parallel to a main surface of the object is equal to or greater than a predetermined value, and identifies the convex portion as the second layer when the width is smaller than the predetermined value.
    The replication system according to claim 9.
  12. The printing apparatus performs printing based on the second height distribution information after printing based on the first height distribution information.
    The replication system according to any one of claims 9 to 11.
  13. A replication method comprising:
    inputting height information obtained by measuring the shape of a convex portion formed on a surface of an object;
    identifying the convex portion as a plurality of layers including at least a first layer and a second layer;
    generating first height distribution information indicating a height distribution of the surface of the object based on first height information, of the height information, indicating a height of the first layer;
    generating second height distribution information indicating a height distribution of the surface of the object based on second height information, of the height information, indicating a height of the second layer;
    outputting the first height distribution information and the second height distribution information; and
    printing based on the first height distribution information and the second height distribution information to generate a replica of the object.
  14. When the convex portion has a multi-stage shape in which the width in a predetermined direction parallel to a main surface of the object differs between a first stage and a second stage at different positions in the height direction,
    the step of identifying the convex portion as the plurality of layers identifies the first stage as the first layer and identifies the second stage as the second layer.
    The replication method according to claim 13.
  15. The step of identifying the convex portion as the plurality of layers identifies the convex portion as the first layer when the width of the convex portion in a predetermined direction parallel to a main surface of the object is equal to or greater than a predetermined value, and identifies the convex portion as the second layer when the width is smaller than the predetermined value.
    The replication method according to claim 13.
  16. In the printing step, printing based on the second height distribution information is performed after printing based on the first height distribution information.
    The replication method according to any one of claims 13 to 15.
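The layer-splitting step recited in claim 1 (one measured height map divided into a first and a second height distribution) can be illustrated by a minimal sketch. This is not taken from the specification: the array shapes, the 0/1/2 layer-label convention, and the function name are all assumptions for illustration.

```python
import numpy as np

def split_height_distributions(height_map, layer_labels):
    """Split one measured height map into per-layer height distribution maps.

    height_map   -- 2-D array of measured surface heights, one value per pixel
    layer_labels -- 2-D array of the same shape; 1 marks first-layer pixels,
                    2 marks second-layer pixels, 0 marks the flat base
    """
    # First height distribution: keep heights only where the first layer was identified.
    first = np.where(layer_labels == 1, height_map, 0.0)
    # Second height distribution: keep heights only where the second layer was identified.
    second = np.where(layer_labels == 2, height_map, 0.0)
    return first, second

h = np.array([[0.0, 0.2, 0.2],
              [0.0, 0.2, 0.5]])
labels = np.array([[0, 1, 1],
                   [0, 1, 2]])
first, second = split_height_distributions(h, labels)
```

Each resulting map is itself "image data in which the height is expressed numerically for each pixel" in the sense of claim 6, so it can be handed to the printing apparatus as-is.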
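The width test of claims 3, 11, and 15 (wide convex portions become the first layer, narrow ones the second) can be sketched along a single scan line. The run-length measurement of width and the threshold value are illustrative assumptions, not taken from the specification.

```python
def classify_by_width(profile, width_threshold):
    """Label each convex run along one scan line.

    profile: list of bools, True where the surface is raised (convex).
    Returns per-pixel layer labels: 0 flat, 1 first layer (run at least
    width_threshold pixels wide), 2 second layer (narrower run).
    """
    labels = [0] * len(profile)
    i = 0
    while i < len(profile):
        if not profile[i]:
            i += 1
            continue
        # Find the end of this convex run.
        j = i
        while j < len(profile) and profile[j]:
            j += 1
        # Wide runs are the first layer; narrow runs are the second layer.
        layer = 1 if (j - i) >= width_threshold else 2
        for k in range(i, j):
            labels[k] = layer
        i = j
    return labels

labels = classify_by_width(
    [False, True, True, True, False, True, False], width_threshold=3)
# the three-pixel run -> first layer, the single-pixel run -> second layer
```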
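Claims 4 and 5 derive height information from the shadow of the convex portion together with the illumination angle. A minimal sketch of the underlying geometry, under the assumption that the shadow length on the flat main surface can be measured from the image (the function name is illustrative):

```python
import math

def height_from_shadow(shadow_length, elevation_deg):
    """For light arriving at elevation angle theta above the main surface,
    an edge of height h casts a shadow of length L = h / tan(theta),
    so h = L * tan(theta)."""
    return shadow_length * math.tan(math.radians(elevation_deg))

# Illuminating from a second direction (claim 5) lets the opposite-facing
# edge of the same convex portion be measured as well.
h_left = height_from_shadow(0.5, 45.0)   # shadow cast when lit from one side
h_right = height_from_shadow(0.5, 45.0)  # shadow cast when lit from the other side
```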
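The two-pass printing order of claims 12 and 16 (the first height distribution printed first, the second stacked on top) can be sketched as follows; the `deposit` printer callback is purely hypothetical and stands in for whatever interface the printing apparatus exposes.

```python
def print_replica(first_map, second_map, deposit):
    """Deposit ink in two passes: the first height distribution is printed
    first, then the second height distribution is printed on top of it.
    deposit(x, y, height) is a hypothetical printer callback."""
    passes = [first_map, second_map]  # claims 12/16: first pass, then second
    for height_map in passes:
        for y, row in enumerate(height_map):
            for x, h in enumerate(row):
                if h > 0:
                    deposit(x, y, h)

log = []
print_replica([[0.2, 0.0]], [[0.3, 0.0]],
              lambda x, y, h: log.append((x, y, h)))
# the first-layer deposit is recorded before the second-layer deposit
```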
PCT/JP2016/004920 2016-07-29 2016-11-18 Image processing device, replication system, and replication method WO2018020533A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2016-149934 2016-07-29
JP2016149934 2016-07-29

Publications (1)

Publication Number Publication Date
WO2018020533A1 true WO2018020533A1 (en) 2018-02-01

Family

ID=61015745

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/004920 WO2018020533A1 (en) 2016-07-29 2016-11-18 Image processing device, replication system, and replication method

Country Status (1)

Country Link
WO (1) WO2018020533A1 (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004015297A (en) * 2002-06-05 2004-01-15 Keio Gijuku Stereoscopic observation apparatus and method for generating stereoscopic image reproducing color of object surface
JP2004340832A (en) * 2003-05-16 2004-12-02 Matsushita Electric Ind Co Ltd Method and system for visual inspection of circuit board
WO2013080439A1 (en) * 2011-11-28 2013-06-06 パナソニック株式会社 Stereoscopic image processing apparatus and stereoscopic image processing method
JP2013205202A (en) * 2012-03-28 2013-10-07 Azbil Corp Visual inspection apparatus for solder spike
JP2015049806A (en) * 2013-09-03 2015-03-16 株式会社アイジェット Three-dimensional data creation method, three-dimensional shaped article employing the same, and manufacturing method therefor
WO2015050033A1 (en) * 2013-10-04 2015-04-09 株式会社ミマキエンジニアリング Three-dimensional shaping device and method for forming three-dimensional shaping device
JP2015076023A (en) * 2013-10-11 2015-04-20 カシオ計算機株式会社 Image processor, stereoscopic data generation method, and program
JP2015217682A (en) * 2014-05-14 2015-12-07 ソク−ムン,キム 3d printing device and method, and construction method of reinforced concrete structure utilizing the device
JP2016063522A (en) * 2014-09-22 2016-04-25 カシオ計算機株式会社 Image processing system, method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16910438; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
NENP Non-entry into the national phase (Ref country code: JP)
122 Ep: pct application non-entry in european phase (Ref document number: 16910438; Country of ref document: EP; Kind code of ref document: A1)