WO2018042727A1 - Image pickup device, image processing system, image pickup method, and image processing method - Google Patents

Info

Publication number
WO2018042727A1
Authority
WO
WIPO (PCT)
Prior art keywords
illumination
imaging
unit
shadow
image generation
Prior art date
Application number
PCT/JP2017/010125
Other languages
French (fr)
Japanese (ja)
Inventor
美馬 邦啓
島崎 浩昭
田中 義人
Original Assignee
Panasonic Intellectual Property Management Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co., Ltd.
Publication of WO2018042727A1 publication Critical patent/WO2018042727A1/en

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/67 - Focus control based on electronic image sensor signals
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 1/00 - General purpose image data processing
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 - Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/04 - Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa
    • H04N 1/10 - Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa, using flat picture-bearing surfaces
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 - Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/04 - Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa
    • H04N 1/10 - Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa, using flat picture-bearing surfaces
    • H04N 1/107 - Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa, using flat picture-bearing surfaces with manual scanning
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 - Details of television systems
    • H04N 5/222 - Studio circuitry; Studio devices; Studio equipment

Definitions

  • the present disclosure relates to an imaging device that generates image data of an object, an image processing system that uses the imaging device, an imaging method that generates image data of an object, and an image processing method that generates and processes image data of the object.
  • Patent Document 1 discloses an image processing apparatus that generates stereoscopic image data by adding height-direction information to a flat original image. This apparatus separates the original image into regions based on focus information of the original image data, adds height information to each region based on contrast (brightness or darkness) or on object contour information, and can thereby express shadows and texture realistically.
  • An imaging apparatus according to the present disclosure includes: a first image generation unit including a first imaging unit capable of imaging a first imaging region extending in a first direction, and a first illumination unit that illuminates the first imaging region from a first illumination direction; a second image generation unit including a second imaging unit capable of imaging a second imaging region extending in a second direction different from the first direction, and a second illumination unit that illuminates the second imaging region from a second illumination direction different from the first illumination direction when the second imaging region is viewed from above; and a drive unit that moves the first image generation unit and the second image generation unit integrally in a predetermined movement direction.
  • The image processing system includes an imaging device and an image processing device. The imaging device includes: a first image generation unit including a first imaging unit capable of imaging a first imaging region extending in a first direction, and a first illumination unit that illuminates the first imaging region from a first illumination direction; a second image generation unit including a second imaging unit capable of imaging a second imaging region extending in a second direction different from the first direction, and a second illumination unit that illuminates the second imaging region from a second illumination direction different from the first illumination direction when the second imaging region is viewed from above; and a drive unit that moves the first image generation unit and the second image generation unit integrally in a movement direction.
  • The first image generation unit photographs an object having a convex portion in the first imaging region while illuminating it with the first illumination unit, and generates first shadow information indicating the shadow cast by the convex portion. The second image generation unit photographs the object in the second imaging region while illuminating it with the second illumination unit, and generates second shadow information indicating the shadow cast by the convex portion.
  • In the imaging method, an object in a first imaging region extending in a first direction is photographed while being illuminated from a first illumination direction as the region is moved in a predetermined movement direction, and a first image is captured by scanning. A second imaging region extending in a second direction different from the first direction is moved in the movement direction integrally with the first imaging region, the object in the second imaging region is photographed while being illuminated from a second illumination direction that differs from the first illumination direction when the object is viewed from above, and a second image is captured by scanning.
  • In the image processing method, an object having a convex portion on its surface is photographed in a first imaging region extending in a first direction while being illuminated from a first illumination direction, and first shadow information indicating the shadow cast by the convex portion is generated. The object is likewise photographed in a second imaging region extending in a second direction different from the first direction while being illuminated, when the object is viewed from above, from a second illumination direction different from the first illumination direction.
  • the imaging apparatus, image processing system, imaging method, and image processing method according to the present disclosure can generate object image data with high accuracy.
  • FIG. 1 is a block diagram illustrating a configuration of an image processing system and a printing apparatus connected to the image processing system according to the first embodiment.
  • FIG. 2 is a diagram for explaining imaging of a picture by the imaging apparatus according to the first embodiment.
  • FIG. 3 is a diagram for explaining a drive unit of the imaging apparatus according to the first embodiment.
  • FIG. 4 is a diagram for explaining the positional relationship between the first image generation unit and the second image generation unit of the imaging apparatus according to the first embodiment.
  • FIG. 5 is a diagram for explaining the illumination angle of the imaging apparatus according to the first embodiment.
  • FIG. 6 is a diagram for explaining a moving direction during imaging of the imaging apparatus according to the first embodiment.
  • FIG. 7 is a flowchart for explaining the operation of the imaging apparatus according to the first embodiment.
  • FIG. 8A is a diagram for explaining a relationship between an illumination angle and a shadow at the time of imaging in the first embodiment.
  • FIG. 8B is a diagram for explaining a relationship between an illumination angle and a shadow at the time of imaging in the first embodiment.
  • FIG. 9 is a flowchart for explaining the operation of the image processing apparatus according to the first embodiment.
  • FIG. 10 is a diagram for explaining an allowable angle between imaging regions.
  • FIG. 11 is a diagram for describing the positional relationship between the first image generation unit and the second image generation unit of an imaging apparatus according to another embodiment.
  • FIG. 12 is a diagram for explaining a positional relationship among a first image generation unit, a second image generation unit, and a third image generation unit of an imaging apparatus according to another embodiment.
  • FIG. 13 is a diagram for explaining the problem.
  • A painting such as an oil painting is an object having convex portions, such as thick paint or the traces of brush strokes. By photographing such an object under illumination, image data including the shadows of the object can be generated.
  • As shown in FIG. 13, when the object 410 (a painting) includes a low convex portion 412 in the vicinity of a high convex portion 411, the shadow of the convex portion 412 may be hidden within the shadow of the convex portion 411. In such a case, the image data does not include the shadow information of the convex portion 412.
  • the imaging device of the present disclosure can capture an object and generate highly accurate image data.
  • Even when the object is a painting such as an oil painting whose convex portions differ in height and are adjacent to or near one another, the imaging device generates image data including the shadows of the convex portions with high accuracy.
  • In the imaging device of the present disclosure, an object having convex portions (for example, a painting such as an oil painting) is captured while a plurality of image generation units are moved simultaneously with their positional relationship fixed.
  • Each of the plurality of image generation units illuminates the object from a different direction and images it. That is, the direction in which one image generation unit illuminates the object differs, when the object is viewed from above, from the direction in which another image generation unit illuminates it. Accordingly, in the imaging device of the present disclosure, shadows of convex portions that overlap when captured by one image generation unit do not overlap when captured by another. The imaging apparatus thus acquires a plurality of pieces of image data containing different shadow information. Because these image data are generated with the positional relationship among the plurality of image generation units fixed, positional deviation is unlikely to occur and the image data can be combined with high accuracy.
  • the image processing system generates highly accurate height data based on the plurality of image data.
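  • The idea of recovering shadow information from multiple illumination directions can be sketched as follows (an illustrative example, not part of the disclosure; the boolean shadow-mask representation is an assumption):

```python
import numpy as np

def combined_shadow_mask(mask_a: np.ndarray, mask_b: np.ndarray) -> np.ndarray:
    """Union of boolean shadow masks captured under two illumination directions.

    A low convex portion whose shadow is swallowed by a taller neighbour's
    shadow under one illumination direction may still cast a visible shadow
    under the other, so OR-ing the masks keeps shadow evidence from both.
    """
    return mask_a | mask_b

# Toy example: the shadow at index 4 is missing from mask_a (hidden inside a
# larger shadow region) but present in mask_b.
mask_a = np.array([0, 1, 1, 1, 0, 0], dtype=bool)
mask_b = np.array([0, 1, 0, 0, 1, 0], dtype=bool)
combined = combined_shadow_mask(mask_a, mask_b)
```

Because the two scans are taken with the units rigidly fixed to one frame, the masks are already registered and can be combined pixel for pixel.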
  • the imaging apparatus of the present disclosure can also generate image data of an object without a convex portion with high accuracy.
  • a picture having a convex portion is described as an example of an object to be image processed.
  • FIG. 1 shows a configuration of an image processing system 1 according to the first embodiment.
  • The image processing system 1 according to the first embodiment includes an imaging device 100 that captures the surface of a painting (an example of an object), such as an oil painting, and generates image data, and an image processing device 200 that processes the generated image data and outputs information on the painting (height data and color image data).
  • the image processing apparatus 200 is connected to a printing apparatus 300 that duplicates a picture by printing based on output information of the image processing apparatus 200, for example.
  • the imaging apparatus 100 is a scanner using a line scan camera that captures an image of a linear region.
  • The imaging apparatus 100 includes an input/output unit 11 that receives an instruction to start imaging and outputs the image data of the captured painting, a control unit 12 that controls the entire imaging apparatus 100, a first image generation unit 10A and a second image generation unit 10B that illuminate and image the painting, and a drive unit 16 that moves the first image generation unit 10A and the second image generation unit 10B integrally.
  • the input / output unit 11 includes an input unit 11a and a communication unit 11b.
  • the input unit 11a is a keyboard, a mouse, a touch panel, or the like.
  • The communication unit 11b includes an interface circuit for communicating with an external device in conformity with a predetermined communication standard (for example, LAN or Wi-Fi).
  • the imaging apparatus 100 inputs an instruction to start imaging via the input unit 11a or the communication unit 11b, and outputs image data generated by imaging a picture from the communication unit 11b.
  • Based on the input imaging start instruction, the control unit 12 causes the drive unit 16 to move the first image generation unit 10A and the second image generation unit 10B integrally, and causes the first image generation unit 10A and the second image generation unit 10B to image the painting while illuminating it.
  • the control unit 12 can be realized by a semiconductor element or the like.
  • the function of the control unit 12 may be configured only by hardware, or may be realized by combining hardware and software.
  • The control unit 12 is, for example, a microcomputer, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or an ASIC (Application Specific Integrated Circuit).
  • the first image generation unit 10A includes a camera 13a that captures a picture and generates image data, and a first illumination unit 14a and a third illumination unit 15a that illuminate the painting.
  • the camera 13a includes an imaging unit 131a and a memory 132a.
  • the imaging unit 131a includes, for example, a CCD (Charge Coupled Device) line sensor or a CMOS (Complementary Metal Oxide Semiconductor) line sensor.
  • the imaging unit 131a scans and captures a picture line by line, and captures image data of the picture.
  • the image data captured by the imaging unit 131a is stored in the memory 132a.
  • the camera 13a captures a picture, acquires color information (RGB or CMYK) for each pixel, and stores it in the memory 132a.
  • the memory 132a is realized by, for example, a RAM (Random Access Memory), a DRAM (Dynamic Random Access Memory), a ferroelectric memory, a flash memory, a magnetic disk, or a combination thereof.
  • the first illumination unit 14a and the third illumination unit 15a are scanning illumination light sources.
  • The first illumination unit 14a and the third illumination unit 15a are, for example, high-color-rendering straight-tube fluorescent lamps, or line LED illuminations in which high-color-rendering white light-emitting diodes (LEDs) are arranged linearly. Under this illumination, the camera 13a can generate image data of an image including the shadows of the convex portions of the painting.
  • the second image generation unit 10B includes a camera 13b that captures a picture and generates image data, and a second illumination unit 14b and a fourth illumination unit 15b that illuminate the painting.
  • The internal configuration of the second image generation unit 10B (that is, the camera 13b, the second illumination unit 14b, and the fourth illumination unit 15b) is the same as that of the first image generation unit 10A (that is, the camera 13a, the first illumination unit 14a, and the third illumination unit 15a), except for the arrangement positions.
  • the drive unit 16 includes the camera 13a, the first illumination unit 14a, and the third illumination unit 15a of the first image generation unit 10A, and the camera 13b, the second illumination unit 14b, and the fourth of the second image generation unit 10B. It is connected to the illumination unit 15b.
  • the drive unit 16 is controlled by the control unit 12 to move the first image generation unit 10A and the second image generation unit 10B integrally. Thereby, it becomes possible for the camera 13a and the camera 13b to capture the picture one line at a time while moving.
  • the imaging apparatus 100 generates two-dimensional image data from the image data scanned by the camera 13a for each line and captured in the memory 132a, and also obtained from the image data scanned by the camera 13b for each line and captured in the memory 132b. Two-dimensional image data is generated, and the two image data are output from the communication unit 11b.
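  • The per-line assembly described above can be illustrated with a minimal sketch (a hypothetical helper, not the apparatus's actual implementation): each scan step yields one row of pixels, and stacking the rows in scan order yields the two-dimensional image data.

```python
import numpy as np

def assemble_scan(lines):
    """Stack line-scan captures (one 1-D pixel row per scan step) into a
    2-D image, with rows ordered along the scanning direction."""
    return np.vstack([np.asarray(line) for line in lines])

# Three scan steps of a 4-pixel-wide line sensor.
lines = [[10, 20, 30, 40],
         [11, 21, 31, 41],
         [12, 22, 32, 42]]
image = assemble_scan(lines)  # shape (3, 4)
```

The same assembly runs once per camera (13a and 13b), producing the two image data that are then output from the communication unit 11b.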
  • The image processing apparatus 200 includes an input/output unit 21 that receives image data and outputs height information indicating the height of the surface of the object, a control unit 22 that controls the entire image processing apparatus 200 and processes the two input image data to generate the height information indicating the height of the surface of the object, and a memory 23.
  • the input / output unit 21 includes an input unit 21a and a communication unit 21b.
  • the input unit 21a is a keyboard, a mouse, a touch panel, or the like.
  • the communication unit 21b includes an interface circuit for performing communication with an external device in compliance with a predetermined communication standard (for example, LAN, WiFi).
  • When the user inputs an instruction to capture image data via the input unit 21a, the image processing apparatus 200 outputs an image data capture request to the imaging device 100 via the communication unit 21b.
  • the image processing apparatus 200 receives the image data transmitted from the imaging apparatus 100 via the communication unit 21b.
  • the control unit 22 calculates the height of the surface of the painting (the height of the convex portion) from the length of the shadow included in the image of the received image data, and generates height information indicating the calculated height. Specifically, height data that represents the height of the surface of the painting as a numerical value for each pixel is generated as the height information. For example, height data is generated such that the higher the height of the convex portion, the larger the numerical value. The generated information is stored in the memory 23. Further, the control unit 22 outputs the generated height information to the printing apparatus 300 via the communication unit 21b.
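  • The disclosure does not state the exact formula, but a standard relation for this kind of shadow-based measurement is height = shadow length × tan θ, where θ is the illumination angle to the surface. A minimal sketch under that assumption (the function name and millimetre units are illustrative):

```python
import math

def convex_height_mm(shadow_length_mm: float, illumination_angle_deg: float) -> float:
    """Estimate the height of a convex portion from the length of the shadow
    it casts under illumination at a known angle to the surface (for example,
    the constant 30 degrees used by the illumination units in the embodiment)."""
    return shadow_length_mm * math.tan(math.radians(illumination_angle_deg))

# A 1.0 mm shadow under 30-degree illumination implies a height of about 0.58 mm.
height = convex_height_mm(1.0, 30.0)
```

Applied per pixel along the shadow direction, such a relation yields the per-pixel numerical height data described above.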
  • the control unit 22 can be realized by a semiconductor element or the like. The function of the control unit 22 may be configured only by hardware, or may be realized by combining hardware and software.
  • the control unit 22 includes, for example, a microcomputer, CPU, MPU, DSP, FPGA, and ASIC.
  • the memory 23 is realized by, for example, a RAM, a DRAM, a ferroelectric memory, a flash memory, a magnetic disk, or a combination thereof.
  • the printing apparatus 300 prints an image that reproduces the height of the surface of the painting (including the convex portion) based on the height information (height data) received from the image processing apparatus 200.
  • the printing apparatus 300 is, for example, a UV inkjet printer that uses UV ink that is cured by being irradiated with ultraviolet rays.
  • The printing apparatus 300 prints an image including convex portions by multilayer printing, stacking the ink more thickly where the height data indicates a larger height value.
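  • As an illustrative sketch of the multilayer idea (the 0.02 mm layer thickness and the quantisation rule are assumptions, not values given in the disclosure): the height value of each pixel can be converted into a number of ink layers to deposit.

```python
import math

def ink_layers(height_mm: float, layer_thickness_mm: float = 0.02) -> int:
    """Number of UV-ink layers to deposit to reach the target relief height.

    The per-layer thickness is an illustrative assumption; a real UV inkjet
    printer would use its own per-pass deposition thickness.
    """
    return math.ceil(height_mm / layer_thickness_mm)

layers = ink_layers(0.5)  # 0.5 mm relief at 0.02 mm per layer
```

Rounding up ensures the printed relief is never below the measured height of the convex portion.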
  • FIG. 2 shows a state where the imaging apparatus 100 is capturing a picture 400.
  • FIG. 3 shows a schematic side view of the imaging apparatus 100. In FIGS. 2 and 3, the right direction of the painting 400 is defined as the positive X direction, and the left direction of the painting 400 as the negative X direction.
  • the downward direction of the painting 400 is a positive Y direction, and the upward direction of the painting 400 is a negative Y direction.
  • the Y direction is a main scanning direction (so-called scanning direction) for each line of the imaging apparatus 100, and the X direction is a direction (sub scanning direction) orthogonal to the Y direction.
  • the main scanning direction and the sub-scanning direction are moving directions of the imaging apparatus 100.
  • The drive unit 16 of the imaging device 100 includes a first guide rail 16b extending in the Y direction, a first movable body 16a that moves back and forth along the first guide rail 16b, a second guide rail 16c extending in the X direction, a second movable body 16d that moves back and forth along the second guide rail 16c, and a frame 16e connected to the first movable body 16a.
  • the first movable body 16a and the second movable body 16d move forward and backward by driving a motor or the like.
  • Both the first image generation unit 10A and the second image generation unit 10B are fixed to the frame 16e. Thereby, the drive unit 16 moves the first image generation unit 10A and the second image generation unit 10B integrally.
  • The first image generation unit 10A images the painting 400 directly below the camera 13a. That is, the first image generation unit 10A images the painting 400 in the one-line imaging region L1 (first imaging region) immediately below the camera 13a.
  • Similarly, the second image generation unit 10B images the painting 400 directly below the camera 13b. That is, the second image generation unit 10B images the painting 400 in the one-line imaging region L2 (second imaging region) immediately below the camera 13b.
  • FIG. 4 shows a schematic top view of the imaging apparatus 100. That is, FIG. 4 shows the upper surface of the imaging apparatus 100 viewed from a direction (Z direction) perpendicular to the main surface of the painting 400.
  • the main surface of the painting 400 is the XY plane of the painting 400 and is a plane that ignores the convex portion.
  • the camera 13a of the first image generation unit 10A can capture an imaging region L1 that extends in a first direction that intersects the main scanning direction at an angle of 45 degrees.
  • The first illumination unit 14a and the third illumination unit 15a are arranged on both sides of the camera 13a so that their longitudinal directions are parallel to the extending direction of the imaging region L1.
  • the first illumination unit 14a and the third illumination unit 15a illuminate the imaging region L1 from both sides of the camera 13a.
  • the direction in which the first illumination unit 14a illuminates the imaging region L1 is defined as an illumination direction D1 (first illumination direction).
  • the direction in which the third illumination unit 15a illuminates the imaging region L1 is defined as an illumination direction D3 (third illumination direction).
  • the illumination direction D1 and the illumination direction D3 differ by 180 degrees when the painting 400 is viewed from above.
  • the camera 13b of the second image generation unit 10B can image the imaging region L2 extending in the second direction.
  • the second direction is a different direction on the same plane as the first direction (on the XY plane).
  • the second direction intersects the first direction at a predetermined angle (90 degrees in the first embodiment).
  • The second illumination unit 14b and the fourth illumination unit 15b are arranged on both sides of the camera 13b so that their longitudinal directions are parallel to the extending direction of the imaging region L2.
  • the second illumination unit 14b and the fourth illumination unit 15b illuminate the imaging region L2 from both sides of the camera 13b.
  • the direction in which the second illumination unit 14b illuminates the imaging region L2 is defined as an illumination direction D2 (second illumination direction).
  • a direction in which the fourth illumination unit 15b illuminates the imaging region L2 is defined as an illumination direction D4.
  • the illumination direction D2 and the illumination direction D4 differ by 180 degrees when the painting 400 is viewed from above.
  • the illumination direction D1 is a direction different from the illumination direction D2 and the illumination direction D4 when the object 400 is viewed from above.
  • the illumination direction D1 is a direction that is not parallel to the illumination direction D2 and the illumination direction D4 when the object 400 is viewed from above.
  • the illumination direction D1 is a direction that intersects the illumination direction D2 and the illumination direction D4 at an angle of 90 degrees when the object 400 is viewed from above.
  • the illumination direction D3 is a direction different from the illumination direction D2 and the illumination direction D4 when the object 400 is viewed from above.
  • the illumination direction D3 is a direction that is not parallel to the illumination direction D2 and the illumination direction D4 when the object 400 is viewed from above.
  • the illumination direction D3 is a direction that intersects the illumination direction D2 and the illumination direction D4 at an angle of 90 degrees when the object 400 is viewed from above.
  • The relationship in which one direction intersects another includes the case where the arrow indicating one direction and the arrow indicating the other direction intersect on extension lines in their respective positive and negative directions.
  • the imaging area L1 and the imaging area L2 have the same size.
  • the first image generation unit 10A and the second image generation unit 10B are arranged side by side in the main scanning direction (Y direction) during imaging.
  • the imaging apparatus 100 can generate two pieces of image data whose shadow directions differ by 90 degrees in the top view of the painting 400 depending on the illumination direction D1 and the illumination direction D2 with respect to the object. Similarly, the imaging apparatus 100 can generate two pieces of image data that are 90 degrees different in shadow direction in the top view of the painting 400 depending on the illumination direction D3 and the illumination direction D4 with respect to the object.
  • The imaging device 100 further includes a light shielding member 17 between the first image generation unit 10A and the second image generation unit 10B. The light shielding member 17 blocks illumination light so that the illumination light of the first illumination unit 14a and the third illumination unit 15a of the first image generation unit 10A does not leak into the imaging region L2 of the second image generation unit 10B, and vice versa.
  • FIG. 5 schematically shows the first image generation unit 10A and the second image generation unit 10B of the imaging apparatus 100 viewed from the direction of the arrow V1 in FIG.
  • In FIG. 5, the light shielding member 17 is omitted.
  • The first illumination unit 14a illuminates the one-line imaging region L1 so that the smaller angle (illumination angle) θ between the illumination direction D1 (first illumination direction) and the entire surface of the horizontally placed painting 400 is a constant angle. This constant angle θ is, for example, 30°.
  • The first illumination unit 14a and the third illumination unit 15a illuminate the imaging region L1 directly below the camera 13a from obliquely above on either side of the imaging region L1 (that is, of the painting). The first image generation unit 10A can thus generate shadowed image data by illuminating the imaging region L1 of the painting 400 obliquely.
  • Similarly, in the second image generation unit 10B, the smaller angle (illumination angle) θ between the illumination direction D2 (second illumination direction) of the second illumination unit 14b toward the imaging region L2 and the entire surface of the painting 400, and the smaller angle (illumination angle) θ between the illumination direction D4 (fourth illumination direction) of the fourth illumination unit 15b toward the imaging region L2 and the entire surface of the painting 400, are constant angles. This constant angle θ is, for example, 30°.
  • The illumination angle θ of the first illumination unit 14a to the fourth illumination unit 15b may be any angle at which a shadow appears under the illumination; it is not limited to 30°, and 20° to 45° is particularly suitable.
  • the camera 13a, the first illumination unit 14a, and the third illumination unit 15a are fixed to the frame 16e.
  • The drive unit 16 may also have a third movable body 16f that raises and lowers the first illumination unit 14a and the third illumination unit 15a.
  • the camera 13b, the second illumination unit 14b, and the fourth illumination unit 15b of the second image generation unit 10B are fixed to the frame 16e, similarly to the first image generation unit 10A.
  • The control unit 12 controls the drive unit 16 so that the camera 13a, the camera 13b, the first illumination unit 14a, the second illumination unit 14b, the third illumination unit 15a, and the fourth illumination unit 15b are translated integrally in the main scanning direction at a constant speed.
  • FIG. 6 shows a moving direction (main scanning direction) of the imaging device 100 during imaging.
  • The imaging device 100 captures the painting 400, which is held fixed, while moving in the vertical direction (Y direction) of the painting 400.
  • When the imaging apparatus 100 finishes capturing in the positive Y direction from the upper end to the lower end of the painting 400, it moves slightly in the positive X direction and then captures in the negative Y direction from the lower end to the upper end. This is repeated from the start position to the end position until the entire painting 400 has been imaged.
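  • The serpentine motion described above can be sketched as follows (a hypothetical helper; the coordinates and pass count are illustrative): each pass runs the full height of the painting, alternating direction, with a small shift in the positive X direction between passes.

```python
def serpentine_path(num_passes: int, step_x: float, y_top: float, y_bottom: float):
    """Start and end points of each scan pass: down the painting (+Y), shift
    right (+X), back up (-Y), and so on, matching the motion in FIG. 6."""
    path = []
    for i in range(num_passes):
        x = i * step_x
        if i % 2 == 0:      # even pass: top -> bottom (positive Y direction)
            path.append(((x, y_top), (x, y_bottom)))
        else:               # odd pass: bottom -> top (negative Y direction)
            path.append(((x, y_bottom), (x, y_top)))
    return path

path = serpentine_path(3, step_x=50.0, y_top=0.0, y_bottom=800.0)
```

Alternating the pass direction avoids a return stroke, so the two rigidly coupled image generation units sweep the whole painting in minimal travel.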
  • the main scanning direction is not limited to the vertical direction of the painting 400, and may be any direction.
  • the main scanning direction may be a vertical direction, a horizontal direction, or an oblique direction depending on the arrangement or orientation of the painting 400.
  • FIG. 7 shows a picture imaging process performed by the imaging apparatus 100.
  • the painting 400 imaged by the imaging device 100 is fixed directly below the imaging device 100 as shown in FIG. 2, for example.
  • First, the imaging device 100 illuminates the imaging region L1 and the imaging region L2 from one direction in each of the first image generation unit 10A and the second image generation unit 10B, and generates image data (shadow image data) that includes the shadows of the convex portions (S701). Specifically, the imaging apparatus 100 captures the painting 400 while the drive unit 16 moves the first image generation unit 10A and the second image generation unit 10B from the start position to the end position in FIG. 6. During this pass, in the first image generation unit 10A, the camera 13a scans and captures the painting 400 with the imaging region L1 illuminated only by the first illumination unit 14a, generating the first image data (shadow image data).
  • At the same time as the first image data is generated, in the second image generation unit 10B the imaging region L2 is illuminated only by the second illumination unit 14b and the camera 13b scans and captures the painting 400, generating the second image data (shadow image data).
  • the first image data and the second image data are two-dimensional image data including shadows of the convex portions of the painting 400.
  • Next, the imaging device 100 illuminates the imaging region L1 and the imaging region L2 from the other direction in each of the first image generation unit 10A and the second image generation unit 10B, and generates image data that includes the shadows of the convex portions (S702).
  • Specifically, the imaging apparatus 100 captures the painting 400 while the drive unit 16 again moves the first image generation unit 10A and the second image generation unit 10B, this time traveling the path from the start position to the end position in FIG. 6 in the reverse direction.
  • At this time, in the first image generation unit 10A, the camera 13a scans and captures the painting 400 with the imaging region L1 illuminated only by the third illumination unit 15a, generating the third image data (shadow image data); in the second image generation unit 10B, the camera 13b scans and captures the painting 400 with the imaging region L2 illuminated only by the fourth illumination unit 15b, generating the fourth image data (shadow image data).
  • the third image data and the fourth image data are two-dimensional image data including the shadow of the convex portion of the painting 400.
  • Next, the imaging apparatus 100 illuminates the imaging region L1 or the imaging region L2 from both directions in at least one of the first image generation unit 10A and the second image generation unit 10B, and generates image data that does not include the shadows of the convex portions (S703).
  • Specifically, the imaging device 100 captures the painting 400 again while the drive unit 16 moves the first image generation unit 10A and the second image generation unit 10B from the start position to the end position in FIG. 6, or from the end position back to the start position.
  • At this time, the imaging apparatus 100 scans and captures the painting 400 in one or both of the first image generation unit 10A and the second image generation unit 10B, generating the fifth image data (color image data).
  • For example, in the first image generation unit 10A, with the imaging region L1 illuminated simultaneously from above and below at the illumination angle θ by the first illumination unit 14a and the third illumination unit 15a, the camera 13a scans and captures the painting 400, generating the fifth image data (color image data).
  • Alternatively, in the second image generation unit 10B, with the imaging region L2 illuminated simultaneously from above and below at the illumination angle θ by the second illumination unit 14b and the fourth illumination unit 15b, the camera 13b may scan and capture the painting 400 to generate the fifth image data.
  • Both the first image generation unit 10A and the second image generation unit 10B may each generate fifth image data.
  • the fifth image data is two-dimensional image data (color image data) that includes the color information (RGB or CMYK) of each pixel of the painting 400 and does not include the shadow of the convex portion.
  • the imaging apparatus 100 outputs the generated first image data to fifth image data from the communication unit 11b (S704).
  • FIG. 8A shows a shadow generated when the first illumination unit 14a of the first image generation unit 10A illuminates the painting 400 from the diagonal upper right side.
  • FIG. 8B shows a shadow generated when the third illumination unit 15a of the first image generation unit 10A illuminates the painting 400 from the diagonally lower left side.
  • The painting 400 may include, for example, a convex portion (a raised thickness of paint) 401 formed by the repeated application of colors, as in an oil painting.
  • The second illumination unit 14b and the fourth illumination unit 15b of the second image generation unit 10B produce shadows of the convex portion 401 in two directions (for example, upper left and lower right) that differ by 90 degrees on the main surface (XY plane) of the painting 400 from the shadow S1 and the shadow S2 of the first image generation unit 10A. That is, according to the first embodiment, shadows of the convex portion 401 are obtained in four directions.
  • the image processing device 200 calculates the height of the convex portion 401 of the painting as follows.
  • FIG. 9 shows height data generation processing by the control unit 22 of the image processing apparatus 200.
  • the image processing apparatus 200 receives the first to fourth image data (shadow image data) and the fifth image data (color image data) output from the imaging apparatus 100 (S901).
  • First, the image processing apparatus 200 calculates the shadow lengths of the convex portion 401 from the first image data and the third image data generated by the first image generation unit 10A (S902). As illustrated in FIG. 8A, when the painting 400 is illuminated obliquely from the upper right by the first illumination unit 14a, a shadow S1 of the convex portion 401 appears on one side of it (for example, the lower left side of the convex portion 401). The image processing apparatus 200 calculates the length (for example, the number of pixels) of the shadow S1 of the convex portion 401 based on the difference in luminance value or the color difference between corresponding pixels of the first image data and the fifth image data.
  • Likewise, as shown in FIG. 8B, when the painting 400 is illuminated obliquely from the lower left by the third illumination unit 15a, a shadow S2 appears on the other side, and the image processing apparatus 200 calculates the length (for example, the number of pixels) of the shadow S2 of the convex portion 401 based on the difference in luminance value or the color difference between corresponding pixels of the third image data and the fifth image data.
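As a rough sketch of step S902, the shadow length in pixels can be found by comparing the shadow image against the shadow-free color image along a scan line. The function name and the luminance threshold are illustrative assumptions, not values from the patent.

```python
def shadow_length_px(shadow_row, color_row, threshold=30):
    """Count the pixels along one scan row that are darker in the
    one-sided-illumination (shadow) image than in the shadow-free color
    image by more than a threshold (8-bit luminance values, 0-255)."""
    return sum(1 for s, c in zip(shadow_row, color_row) if c - s > threshold)
```

In practice, a per-pixel color difference (for example in an RGB or Lab space) could be used instead of a single luminance channel, as the text also allows.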
  • Next, the image processing apparatus 200 calculates the height H3 of the convex portion 401 based on the calculated lengths of the shadow S1 and the shadow S2 and the illumination angle θ of the first illumination unit 14a and the third illumination unit 15a (S903). Specifically, the image processing apparatus 200 determines the height H1 on one side (for example, the lower left side) of the convex portion 401 from the calculated length (number of pixels) of the shadow S1 and the illumination angle θ of the first illumination unit 14a.
  • The image processing apparatus 200 determines the height H2 on the other side (for example, the upper right side) of the convex portion 401 from the calculated length (number of pixels) of the shadow S2 and the illumination angle θ of the third illumination unit 15a.
  • The image processing apparatus 200 then interpolates the height H3 between the heights H1 and H2 of the convex portion 401 in these two directions (upper right side and lower left side).
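The geometry behind S903 can be sketched as follows: with illumination at angle θ to the surface, an edge of height H casts a shadow of length H / tan θ, so the height is recovered as shadow length × tan θ. The pixel-to-millimetre conversion and the linear interpolation are illustrative assumptions consistent with the 1000 dpi resolution mentioned later in the text.

```python
import math

def height_from_shadow(shadow_len_px, illum_angle_deg, dpi=1000):
    """Height recovered from a shadow: with illumination at angle theta
    to the painting surface, an edge of height H casts a shadow of length
    H / tan(theta), so H = shadow_length * tan(theta). Pixel counts are
    converted to millimetres at the given scan resolution."""
    shadow_len_mm = shadow_len_px * 25.4 / dpi
    return shadow_len_mm * math.tan(math.radians(illum_angle_deg))

def interpolate_height(h1, h2, t=0.5):
    """One way to interpolate the height H3 between the edge heights H1
    and H2 (t = 0.5 gives the midpoint)."""
    return h1 + (h2 - h1) * t
```

At the 30° illumination angle of the embodiment, a 1 mm shadow corresponds to roughly 0.58 mm of relief.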
  • Next, from the second image data, the fourth image data, and the fifth image data generated by the second image generation unit 10B, the image processing apparatus 200 calculates the lengths of the shadows of the convex portion 401 in the directions that differ by 90 degrees from those of the first image generation unit 10A (for example, the upper left and lower right sides of the convex portion 401) (S904). Then, the image processing apparatus 200 calculates the height H3 of the convex portion 401 based on the calculated shadow lengths and the illumination angle θ of the second illumination unit 14b and the fourth illumination unit 15b (S905).
  • That is, from the lengths of the shadows of the convex portion 401 in the directions differing by 90 degrees from those used by the first image generation unit 10A (for example, the upper left and lower right sides of the convex portion 401), the heights of the convex portion 401 in those directions are calculated, and the height H3 between them is interpolated.
  • Next, the image processing apparatus 200 combines the height information from the first image generation unit 10A and the second image generation unit 10B, calculates the height of the entire convex portion 401, and generates height data (S906). That is, the overall height of the convex portion 401 is calculated from the height H3 obtained from the image data of the first image generation unit 10A (the result of S903) and the height H3 obtained from the image data of the second image generation unit 10B (the result of S905).
  • Specifically, the height H3 obtained by interpolation from the heights of the upper right and lower left sides of the convex portion 401 (the result of S903) is compared with the height H3 obtained by interpolation from the heights of its upper left and lower right sides (the result of S905), and the larger value is selected as the overall height of the convex portion 401.
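The comparison in S906 amounts to a per-pixel maximum over the two units' height maps. The sketch below assumes the height data are simple 2-D lists of per-pixel heights; the representation is an assumption, not specified by the patent.

```python
def combine_heights(heights_a, heights_b):
    """Combine the per-pixel height maps from the two image generation
    units by keeping the larger estimate at each pixel: where one unit's
    shadows overlapped (truncating a shadow and underestimating the
    height), the other unit's value is selected instead."""
    return [[max(a, b) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(heights_a, heights_b)]
```

Taking the maximum is what makes the two 90-degree-rotated illumination pairs complementary: an overlap in one pair's shadows only lowers that pair's estimate, never raises it.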
  • Even when, as shown in FIG. 13A, the shadow image data (first image data and third image data) generated by the first image generation unit 10A is captured in a state where the shadow of a high convex portion 411 overlaps the shadow of a low convex portion 412, the shadow image data (second image data and fourth image data) generated by the second image generation unit 10B is captured, as shown in FIG. 13B, in a state where the shadow of the high convex portion 411 and the shadow of the low convex portion 412 do not overlap. Therefore, by using both the shadow image data generated by the first image generation unit 10A and the shadow image data generated by the second image generation unit 10B, the overall height of the convex portion 401 can be calculated accurately.
  • Then, by calculating the heights of all the convex portions 401 included in the painting 400, the image processing apparatus 200 obtains the height of the entire image of the painting 400 (of every pixel constituting the image) and generates height data as height information for the entire image. For example, it generates height data that represents the height of each pixel in the image as a numerical value.
  • the image processing apparatus 200 outputs the color image data and the height data to the printing apparatus 300 (S907).
  • Based on the height data output from the image processing device 200, the printing device 300 prints transparent ink on a substrate (paper, cloth, plastic, etc.), increasing the discharge amount of the transparent ink for pixels whose height value is larger. As a result, pixels receiving a larger amount of transparent ink are raised higher, so the convex portion 401 can be reproduced.
  • Further, based on the color image data output from the image processing apparatus 200, the printing apparatus 300 prints an image in color ink on top of the transparent ink. In this way, a duplicate of the painting 400 that reproduces the convex portion 401 can be produced.
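One simple way to map the height data to a transparent-ink discharge amount is to convert each pixel's height into a number of ink passes. The per-pass layer thickness and the maximum buildup below are purely illustrative assumptions; the patent only states that taller pixels receive more transparent ink.

```python
def transparent_ink_passes(height_mm, layer_mm=0.01, max_passes=50):
    """Number of transparent-ink passes for one pixel: more passes for
    taller relief, capped by the printer's maximum buildup. The layer
    thickness per pass (0.01 mm) and the cap are assumed values."""
    return min(max_passes, round(height_mm / layer_mm))
```

The color image is then printed on top of the built-up transparent layers, as described above.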
  • As described above, the imaging apparatus 100 of the first embodiment includes: a first image generation unit 10A including a first imaging unit (imaging unit 131a) capable of imaging a first imaging region L1 extending in a first direction and a first illumination unit 14a that illuminates the first imaging region L1 from a first illumination direction D1; a second image generation unit 10B including a second imaging unit (imaging unit 131b) capable of imaging a second imaging region L2 extending in a second direction different from the first direction and a second illumination unit 14b that illuminates the second imaging region L2 from a second illumination direction D2 that differs from the first illumination direction D1 when the second imaging region L2 is viewed from above; and a drive unit 16 that moves the first image generation unit 10A and the second image generation unit 10B together in predetermined moving directions (the main scanning direction and the sub scanning direction).
  • With this configuration, the imaging device 100 can perform imaging while illuminating from a plurality of different directions with the first illumination unit 14a and the second illumination unit 14b. That is, the imaging apparatus 100 can capture images under the different illumination directions D1 and D2, as shown in FIG. 13(a) and FIG. 13(b), without rotating the object (for example, a painting). Image data can thereby be synthesized with high accuracy from the plurality of image data. In particular, even when the object has a plurality of convex portions and convex portions of different heights lie close together, image data including shadows can be generated with high accuracy from the plurality of image data.
  • Accordingly, the imaging apparatus 100 can generate image data including shadows with higher accuracy. If the illumination direction D1 and the illumination direction D2 differ when the object is viewed from above, overlap between the shadows of the convex portions can be avoided; in particular, if the two directions are non-parallel in top view, such overlap can be avoided all the more.
  • The first image generation unit 10A photographs an object having a convex portion in the first imaging region L1 while illuminating it with the first illumination unit 14a, and generates first shadow information indicating the shadow cast by the convex portion.
  • Similarly, the second image generation unit 10B photographs the object in the second imaging region L2 while illuminating it with the second illumination unit 14b, and generates second shadow information indicating the shadow cast by the convex portion.
  • The first image generation unit 10A further includes a third illumination unit 15a that is arranged on the opposite side of the first illumination unit 14a across the imaging unit 131a and that illuminates the first imaging region L1 from a third illumination direction D3.
  • The first image generation unit 10A photographs the object in the first imaging region L1 while illuminating it with the third illumination unit 15a, and generates third shadow information indicating the shadow cast by the convex portion.
  • Likewise, the second image generation unit 10B further includes a fourth illumination unit 15b that is arranged on the opposite side of the second illumination unit 14b across the imaging unit 131b and that illuminates the second imaging region L2 from a fourth illumination direction D4.
  • The second image generation unit 10B photographs the object in the second imaging region L2 while illuminating it with the fourth illumination unit 15b, and generates fourth shadow information indicating the shadow cast by the convex portion. In this way, the first imaging region L1 can be imaged while illuminated from two different directions (illumination direction D1 and illumination direction D3), and the second imaging region L2 can be imaged while illuminated from two different directions (illumination direction D2 and illumination direction D4). That is, more shadow information (first through fourth shadow information) can be generated, so image data can be generated with higher accuracy.
  • the imaging apparatus 100 simultaneously performs imaging of the first imaging area L1 by the first image generation unit 10A and imaging of the second imaging area L2 by the second image generation unit 10B. Thereby, a plurality of image data can be generated in a short time.
  • the first direction (the direction in which the imaging region L1 extends) and the longitudinal directions of the first illumination unit 14a and the third illumination unit 15a are parallel to each other.
  • the second direction (the direction in which the imaging region L2 extends) and the longitudinal directions of the second illumination unit 14b and the fourth illumination unit 15b are parallel to each other. That is, in the entire imaging region L1, the distance from the first illumination unit 14a to each pixel in the imaging region L1 is the same. Similarly, in the entire imaging region L1, the distance from the third illumination unit 15a to each pixel in the imaging region L1 is the same. Furthermore, in the entire imaging region L2, the distance from the second illumination unit 14b to each pixel in the imaging region L2 is the same.
  • Likewise, across the entire imaging region L2, the distance from the fourth illumination unit 15b to each pixel in the imaging region L2 is the same. Therefore, every pixel in the imaging region L1 and the imaging region L2 can be illuminated at a uniform angle and with a uniform amount of light. As a result, accurate shadow image data can be obtained.
  • In the imaging apparatus 100 of the first embodiment, the angle at which the first direction (the direction in which the imaging region L1 extends) and the second direction (the direction in which the imaging region L2 extends) intersect is 90 degrees. Therefore, according to the imaging apparatus 100 of Embodiment 1, image data can be acquired with the painting illuminated from directions differing by 90 degrees, without rotating the painting.
  • Further, the directions in which the imaging region L1 and the imaging region L2 extend each intersect the moving direction during imaging (the main scanning direction) at 45 degrees.
  • Accordingly, the sizes of the imaging region L1 and the imaging region L2 in the X direction are the same, and their sizes in the Y direction are also the same. Therefore, the image processing apparatus 200 can easily combine the height information of the image data obtained by imaging in each of the imaging region L1 and the imaging region L2.
  • the first image generation unit 10A and the second image generation unit 10B of the first embodiment are arranged side by side in the main scanning direction. Thereby, height information can be easily synthesized.
  • The imaging apparatus 100 of the first embodiment is provided with a light shielding member 17 between the first image generation unit 10A and the second image generation unit 10B that blocks the illumination light of the first illumination unit 14a, the second illumination unit 14b, the third illumination unit 15a, and the fourth illumination unit 15b.
  • The light shielding member 17 prevents illumination light intended for one imaging region from leaking into the other, where it would weaken the shadows or change their direction. Shadow image data can therefore be generated with higher accuracy.
  • The image processing system 1 of the first embodiment includes the imaging device 100 and an image processing apparatus 200 that generates height information (height data) indicating the height of the surface of the object, based on the angle formed by the illumination direction D1 of the first illumination unit 14a with the main surface of the object together with the first shadow information (first image data), and the angle formed by the illumination direction D2 of the second illumination unit 14b with the main surface of the object together with the second shadow information (second image data).
  • Furthermore, the image processing apparatus 200 in the image processing system 1 according to the first embodiment generates the height information (height data) also using the angle formed by the illumination direction D3 of the third illumination unit 15a with the main surface of the object together with the third shadow information (third image data), and the angle formed by the illumination direction D4 of the fourth illumination unit 15b with the main surface of the object together with the fourth shadow information (fourth image data). According to the imaging apparatus 100 of the first embodiment, image data including shadows can be generated accurately. Therefore, according to the image processing system 1 of the first embodiment using the imaging apparatus 100, the height of the entire convex portion can be calculated accurately from the lengths of the shadows.
  • The image processing method of the first embodiment illuminates and photographs an object having a convex portion on its surface from the first illumination direction D1 in the first imaging region L1 extending in the first direction, and generates first shadow information indicating the shadow cast by the convex portion.
  • In the second imaging region L2 extending in a second direction different from the first direction, the object is illuminated and photographed from the second illumination direction D2, which differs from the first illumination direction D1 when the object is viewed from above, and second shadow information indicating the shadow cast by the convex portion is generated.
  • Then, height information indicating the height of the surface of the object is generated using the angle formed by the first illumination direction D1 with the main surface of the object together with the first shadow information, and the angle formed by the second illumination direction D2 with the main surface of the object together with the second shadow information.
  • Furthermore, the image processing method according to the first embodiment illuminates and photographs the object from the third illumination direction D3 in the first imaging region L1, and generates third shadow information indicating the shadow cast by the convex portion.
  • In the second imaging region L2, this image processing method illuminates and photographs the object from a fourth illumination direction D4 that differs from the third illumination direction D3 when the object is viewed from above, and generates fourth shadow information indicating the shadow cast by the convex portion.
  • Since image data including shadows can thus be generated with high accuracy, the height of the entire convex portion can be calculated accurately from the lengths of the shadows.
  • the first embodiment provides a duplication system including an imaging device 100, an image processing device 200, and a printing device 300.
  • With this duplication system, when a painting is duplicated, the convex portions of the painting (the height of the painting surface) can be reproduced accurately, making it possible to produce a duplicate that is closer to the original.
  • In the first embodiment, illumination is performed from a plurality of different illumination directions to capture a plurality of shadow images that include the shadows of the convex portions of the painting.
  • Alternatively, illumination from a plurality of different illumination directions may be performed to capture a plurality of color images that do not include the shadows of the convex portions of the painting.
  • In that case as well, a plurality of color images can be captured at once by using the imaging device 100 according to the first embodiment.
  • the first embodiment has been described as an example of the technique disclosed in the present application.
  • the technology in the present disclosure is not limited to this, and can also be applied to embodiments in which changes, substitutions, additions, omissions, and the like are appropriately performed.
  • In the first embodiment, the movement from the scan start position to the end position is performed three times, and shadow image data under illumination from four directions (first to fourth image data) and color image data (fifth image data) are thereby obtained.
  • However, the number of movements need not be three. As long as the shadow image data (first to fourth image data) from four directions differing by 90 degrees with respect to the painting 400 (for example, upper right and lower left, upper left and lower right) and the color image data (fifth image data) can be acquired, the number of movements may be set arbitrarily.
  • For example, by illuminating the imaging region L1 and the imaging region L2 in three illumination states (a state in which only the first illumination unit 14a and the second illumination unit 14b are lit, a state in which only the third illumination unit 15a and the fourth illumination unit 15b are lit, and a state in which both pairs are lit), the shadow image data (first to fourth image data) and the color image data (fifth image data) can be obtained.
  • In the first embodiment, the intersection angle between the extending directions of the imaging region L1 and the imaging region L2 is 90 degrees. However, the intersection angle need not be exactly 90 degrees; it may be substantially 90 degrees.
  • FIG. 10 shows an allowable range of the crossing angle according to the number of pixels of the imaging unit 131a and the imaging unit 131b. This allowable range indicates an angle range that can be said to be substantially 90 degrees.
  • Specifically, FIG. 10 shows the allowable crossing-angle range of the imaging region L1 and the imaging region L2 when an error of up to 1 mm in the position of the convex portion on the XY plane can be tolerated for image data captured by the imaging unit 131a and the imaging unit 131b at a resolution of 1000 dpi.
  • The error in the position of the convex portion can be calculated as (number of pixels of the imaging unit) × tan(crossing-angle error), converted to a length using the image-data resolution of 1000 dpi. For example, if the number of pixels of the imaging unit 131a and the imaging unit 131b is 7000, the allowable range of the crossing angle is 89.5° to 90.5°. The crossing angle between the imaging regions L1 and L2 may be determined within the allowable angle range corresponding to the number of pixels of the imaging unit 131a and the imaging unit 131b, as illustrated in FIG. 10.
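The stated relation can be evaluated directly, as sketched below. This only implements the formula as given (pixel count × tan of the angle error, converted at the scan resolution); the tabulated allowable ranges in FIG. 10 remain the authoritative values, and the function name is an assumption.

```python
import math

def position_error_mm(num_pixels, angle_error_deg, dpi=1000):
    """Positional error of a feature at the far end of the sensor line
    caused by a crossing-angle error: the error in pixels is
    num_pixels * tan(angle_error), converted to millimetres at the given
    image resolution (25.4 mm per inch)."""
    error_px = num_pixels * math.tan(math.radians(angle_error_deg))
    return error_px * 25.4 / dpi
```

The error grows with both the sensor's pixel count and the angle deviation, which is why the allowable range in FIG. 10 narrows as the number of pixels increases.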
  • In the first embodiment, the first image generation unit 10A and the second image generation unit 10B are arranged one behind the other along the main scanning direction (along the Y direction), as shown in FIG. 4. However, they may be arranged in other positional relationships.
  • FIG. 11 shows different positional relationships between the first image generation unit 10A and the second image generation unit 10B.
  • the first image generation unit 10A and the second image generation unit 10B may be arranged side by side (along the X direction) with respect to the main scanning direction.
  • In this case as well, the sizes of the imaging region L1 and the imaging region L2 in the X direction and in the Y direction are the same. Therefore, the height information of the image data obtained by imaging in each of the imaging region L1 and the imaging region L2 can be combined easily.
  • Whether the first image generation unit 10A and the second image generation unit 10B are arranged one behind the other along the main scanning direction as shown in FIG. 4, or side by side across the main scanning direction as shown in FIG. 11, may be determined according to the aspect ratio of the painting to be captured. For example, when capturing a vertically long painting, they may be arranged one behind the other along the main scanning direction.
  • In the first embodiment, the case where the imaging apparatus 100 includes two image generation units, the first image generation unit 10A and the second image generation unit 10B, has been described.
  • the number of image generation units is not limited to two.
  • the imaging apparatus 100 may include three or more image generation units.
  • FIG. 12 illustrates a positional relationship when the imaging apparatus 100 includes three image generation units (a first image generation unit 10A, a second image generation unit 10B, and a third image generation unit 10C).
  • the third image generation unit 10C includes an imaging unit (third imaging unit), an illumination unit 14c (fifth illumination unit), and an illumination unit 15c (sixth illumination unit).
  • the imaging unit (third imaging unit) images the third imaging region L3 extending in the third direction.
  • the illumination unit 14c and the illumination unit 15c illuminate the third imaging region L3.
  • the directions in which the imaging regions L1 to L3 extend are different.
  • For example, the imaging apparatus 100 may arrange the three image generation units (first image generation unit 10A to third image generation unit 10C) so that the directions in which the respective imaging regions L1 to L3 extend intersect at 60 degrees. This makes it possible to calculate the height of the convex portion with still higher accuracy.
  • a painting is described as an example of an imaging target of the imaging device 100 of the present disclosure, but the imaging target is not limited to a painting.
  • the idea of the imaging apparatus 100 of the present disclosure can be applied particularly when imaging a planar object having a convex portion.
  • Examples of such objects include art objects such as sculptures, as well as wallpaper and floor or ceiling coverings.
  • a picture is described as an example of an image processing target of the image processing system 1 of the present disclosure, but the image processing target is not limited to a picture.
  • the idea of the image processing system 1 of the present disclosure can be applied when generating height information of the surface of a planar object having a convex portion.
  • The duplication system of the present disclosure can be realized through the cooperation of hardware resources, such as a processor and memory, with a program.
  • The present disclosure is applicable to an imaging device that generates image data including the shadows of an object having convex portions (for example, a painting), and to an image processing system that generates height data of such an object from the image data including the shadows.

Abstract

Disclosed is an image pickup device (100) that is provided with: a first image generation unit (10A), which includes a first image pickup unit (131a) capable of picking up an image of a first image pickup region extending in the first direction, and a first illuminating unit (14a) that illuminates the first image pickup region from the first illuminating direction; a second image generation unit (10B), which includes a second image pickup unit (131b) capable of picking up an image of a second image pickup region extending in the second direction that is different from the first direction, and a second illuminating unit (14b) that illuminates the second image pickup region from the second illuminating direction in plan view of the second image pickup region, said second illuminating direction being different from the first illuminating direction; and a drive unit (16) that integrally moves, in the predetermined moving direction, the first image generating unit (10A) and the second image generating unit (10B).

Description

Imaging apparatus, image processing system, imaging method, and image processing method
The present disclosure relates to an imaging apparatus that generates image data of an object, an image processing system that uses the imaging apparatus, an imaging method for generating image data of an object, and an image processing method for generating and processing image data of an object.
Patent Document 1 discloses an image processing apparatus that generates stereoscopic image data by adding height-direction information to a planar original image. By adding height information to each region separated on the basis of the focus information of the original image data, based either on the level of contrast (brightness and darkness) or on object contour information, this image processing apparatus makes it possible to express shadows and texture realistically.
JP 2016-63522 A
An imaging apparatus according to the present disclosure includes: a first image generation unit including a first imaging unit capable of imaging a first imaging region extending in a first direction, and a first illumination unit that illuminates the first imaging region from a first illumination direction; a second image generation unit including a second imaging unit capable of imaging a second imaging region extending in a second direction different from the first direction, and a second illumination unit that illuminates the second imaging region from a second illumination direction that, when the second imaging region is viewed from above, differs from the first illumination direction; and a drive unit that integrally moves the first image generation unit and the second image generation unit in a predetermined movement direction.
An image processing system according to the present disclosure includes an imaging apparatus and an image processing apparatus. The imaging apparatus includes: a first image generation unit including a first imaging unit capable of imaging a first imaging region extending in a first direction, and a first illumination unit that illuminates the first imaging region from a first illumination direction; a second image generation unit including a second imaging unit capable of imaging a second imaging region extending in a second direction different from the first direction, and a second illumination unit that illuminates the second imaging region from a second illumination direction that, when the second imaging region is viewed from above, differs from the first illumination direction; and a drive unit that integrally moves the first image generation unit and the second image generation unit in a movement direction. The first image generation unit illuminates an object having a convex portion located in the first imaging region with the first illumination unit, images the object, and generates first shadow information indicating a shadow cast by the convex portion. The second image generation unit illuminates the object located in the second imaging region with the second illumination unit, images the object, and generates second shadow information indicating a shadow cast by the convex portion.
An imaging method according to the present disclosure includes: scan-imaging a first image by illuminating and imaging an object in a first imaging region extending in a first direction from a first illumination direction while moving the first imaging region in a predetermined movement direction; and scan-imaging a second image by illuminating and imaging the object in a second imaging region extending in a second direction different from the first direction, from a second illumination direction that differs from the first illumination direction when the object is viewed from above, while moving the second imaging region in the movement direction integrally with the first imaging region.
An image processing method according to the present disclosure includes: in a first imaging region extending in a first direction, illuminating an object having a convex portion on its surface from a first illumination direction, imaging the object, and generating first shadow information indicating a shadow cast by the convex portion; in a second imaging region extending in a second direction different from the first direction, illuminating the object from a second illumination direction that differs from the first illumination direction when the object is viewed from above, imaging the object, and generating second shadow information indicating a shadow cast by the convex portion; and generating height information indicating the height of the surface of the object based on the angle formed by the first illumination direction and the main surface of the object together with the first shadow information, and on the angle formed by the second illumination direction and the main surface of the object together with the second shadow information.
The imaging apparatus, image processing system, imaging method, and image processing method according to the present disclosure can generate image data of an object with high accuracy.
FIG. 1 is a block diagram illustrating the configuration of the image processing system according to the first embodiment and a printing apparatus connected to the image processing system.
FIG. 2 is a diagram for explaining imaging of a painting by the imaging apparatus according to the first embodiment.
FIG. 3 is a diagram for explaining the drive unit of the imaging apparatus according to the first embodiment.
FIG. 4 is a diagram for explaining the positional relationship between the first image generation unit and the second image generation unit of the imaging apparatus according to the first embodiment.
FIG. 5 is a diagram for explaining the illumination angle of the imaging apparatus according to the first embodiment.
FIG. 6 is a diagram for explaining the movement direction during imaging of the imaging apparatus according to the first embodiment.
FIG. 7 is a flowchart for explaining the operation of the imaging apparatus according to the first embodiment.
FIG. 8A is a diagram for explaining the relationship between the illumination angle and shadows during imaging in the first embodiment.
FIG. 8B is a diagram for explaining the relationship between the illumination angle and shadows during imaging in the first embodiment.
FIG. 9 is a flowchart for explaining the operation of the image processing apparatus according to the first embodiment.
FIG. 10 is a diagram for explaining the allowable angle between imaging regions.
FIG. 11 is a diagram for explaining the positional relationship between the first image generation unit and the second image generation unit of an imaging apparatus according to another embodiment.
FIG. 12 is a diagram for explaining the positional relationship among the first image generation unit, the second image generation unit, and the third image generation unit of an imaging apparatus according to another embodiment.
FIG. 13 is a diagram for explaining the problem.
Hereinafter, embodiments will be described in detail with reference to the drawings as appropriate. However, unnecessarily detailed description may be omitted. For example, detailed description of already well-known matters and redundant description of substantially identical configurations may be omitted. This is to avoid making the following description unnecessarily redundant and to facilitate understanding by those skilled in the art. The inventors provide the accompanying drawings and the following description so that those skilled in the art can fully understand the present disclosure, and do not intend thereby to limit the subject matter described in the claims.
(Problem)
A painting such as an oil painting is an object that has convex portions, such as the thickness of paint and brush strokes. By illuminating such an object (subject) with light and imaging it, image data including the shadows of the object can be generated. Here, as shown in the upper part (a) of FIG. 13, when an object 410 (a painting) includes a low convex portion 412 near a tall convex portion 411, the shadow of the convex portion 412 may be hidden by the shadow of the convex portion 411. In such a case, the image data does not contain shadow information for the convex portion 412. To generate image data containing shadow information for the convex portion 412, the object 410 must be rotated as shown in the lower part (b) of FIG. 13, and the shadows of the convex portions 411 and 412 must be imaged again. In this case, the object 410 must be rotated with high accuracy. If the rotation is inaccurate, a positional shift of the convex portions 411 and 412 occurs when the image data captured in the state of FIG. 13(a) is combined with the image data captured in the state of FIG. 13(b). Because of the positional shift, image data containing accurate shadows of the convex portions 411 and 412 cannot be generated.
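The occlusion just described can be reproduced with a one-dimensional sketch. All heights, positions, and the light angle below are illustrative values, not from the disclosure: a tall bump corresponding to the convex portion 411 and a short bump beside it corresponding to the convex portion 412 are lit from one side, and the short bump's shadow vanishes inside the tall bump's.

```python
import numpy as np

def shadow_mask(heights, tan_elev):
    """1-D shadow test: light arrives from the left at an elevation
    angle whose tangent is tan_elev; a point is in shadow if some
    point to its left blocks the descending light ray."""
    mask = np.zeros(len(heights), dtype=bool)
    for x in range(len(heights)):
        for xs in range(x):
            if heights[xs] - (x - xs) * tan_elev > heights[x]:
                mask[x] = True
                break
    return mask

h = np.zeros(20)
h[5] = 4.0   # tall bump, cf. convex portion 411
h[8] = 1.0   # short bump nearby, cf. convex portion 412
print(shadow_mask(h, 0.5)[8])         # True: the short bump lies inside the tall bump's shadow
print(shadow_mask(h[::-1], 0.5)[11])  # False: lit from the opposite side, it is exposed again
```

Reversing the height profile plays the role of rotating the object (or, as in the present disclosure, of illuminating it from a different direction): the short bump then falls outside the tall bump's shadow, so its own shadow becomes observable.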
The imaging apparatus of the present disclosure can image an object and generate highly accurate image data. In particular, even when the object is a painting such as an oil painting, that is, even when the object has a plurality of convex portions of mutually different heights that are adjacent to or near one another, the imaging apparatus generates image data including the shadows of the convex portions with high accuracy. Specifically, the imaging apparatus of the present disclosure images an object having convex portions (for example, a painting such as an oil painting) while simultaneously moving a plurality of image generation units whose positional relationship with one another is fixed. At this time, each of the plurality of image generation units illuminates the object from a respective one of a plurality of directions and images it. That is, the direction in which one image generation unit illuminates the object differs, when the object is viewed from above, from the direction in which another image generation unit illuminates the object. Consequently, in the imaging apparatus of the present disclosure, shadows of convex portions that overlap when imaged by one image generation unit do not overlap when imaged by another image generation unit. In other words, the imaging apparatus acquires a plurality of sets of image data containing mutually different shadow information. Because these sets of image data are generated with the positional relationship among the plurality of image generation units fixed, positional shifts are unlikely to occur, and the image data can be combined with high accuracy. The image processing system of the present disclosure then generates highly accurate height data based on the plurality of sets of image data. Details of the present disclosure are described below. Although the imaging apparatus of the present disclosure can also generate image data of an object without convex portions with high accuracy, the following embodiment is described using a painting having convex portions as an example of the object to be processed.
(Embodiment 1)
1. Configuration
FIG. 1 shows the configuration of the image processing system 1 according to the first embodiment. The image processing system 1 according to the first embodiment includes an imaging apparatus 100 that images the surface of a painting (an example of an object) such as an oil painting and generates image data, and an image processing apparatus 200 that processes the generated image data and outputs information on the painting (height data and color image data). The image processing apparatus 200 is connected, for example, to a printing apparatus 300 that reproduces the painting by printing based on the output information of the image processing apparatus 200.
The imaging apparatus 100 according to the first embodiment is a scanner using a line scan camera that captures an image of a linear region. The imaging apparatus 100 includes an input/output unit 11 that receives an instruction to start imaging and outputs the image data of the imaged painting, a control unit 12 that controls the entire imaging apparatus 100, a first image generation unit 10A and a second image generation unit 10B that illuminate and image the painting, and a drive unit 16 that integrally moves the first image generation unit 10A and the second image generation unit 10B.
The input/output unit 11 includes an input unit 11a and a communication unit 11b. The input unit 11a is a keyboard, a mouse, a touch panel, or the like. The communication unit 11b includes an interface circuit for communicating with external devices in conformity with a predetermined communication standard (for example, Local Area Network (LAN) or Wi-Fi). For example, the imaging apparatus 100 receives an instruction to start imaging via the input unit 11a or the communication unit 11b, and outputs image data generated by imaging the painting from the communication unit 11b.
Based on the input imaging start instruction, the control unit 12 causes the first image generation unit 10A and the second image generation unit 10B to image the painting while illuminating it, while the drive unit 16 moves the first image generation unit 10A and the second image generation unit 10B integrally. The control unit 12 can be realized by a semiconductor element or the like. The functions of the control unit 12 may be configured by hardware alone, or may be realized by combining hardware and software. The control unit 12 is composed of, for example, a microcomputer, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or an ASIC (Application Specific Integrated Circuit).
The first image generation unit 10A includes a camera 13a that images the painting and generates image data, and a first illumination unit 14a and a third illumination unit 15a that illuminate the painting. In the first image generation unit 10A, the camera 13a includes an imaging unit 131a and a memory 132a. The imaging unit 131a includes, for example, a CCD (Charge Coupled Device) line sensor or a CMOS (Complementary Metal Oxide Semiconductor) line sensor. The imaging unit 131a scan-images the painting one line at a time and captures the image data of the painting. The image data captured by the imaging unit 131a is stored in the memory 132a. For example, the camera 13a images the painting, acquires color information (RGB or CMYK) for each pixel, and stores it in the memory 132a. The memory 132a is realized by, for example, a RAM (Random Access Memory), a DRAM (Dynamic Random Access Memory), a ferroelectric memory, a flash memory, a magnetic disk, or a combination thereof.
The first illumination unit 14a and the third illumination unit 15a are illumination light sources for scanning. Specifically, the first illumination unit 14a and the third illumination unit 15a are, for example, high-color-rendering straight-tube fluorescent lamps, or line LED illuminators in which high-color-rendering white light-emitting diodes (LEDs) are arranged in a straight line. When the camera 13a images the painting while the painting is illuminated by the first illumination unit 14a or the third illumination unit 15a, the camera 13a can generate image data of an image that includes the shadows of the convex portions of the painting.
The second image generation unit 10B includes a camera 13b that images the painting and generates image data, and a second illumination unit 14b and a fourth illumination unit 15b that illuminate the painting. The internal configuration of the second image generation unit 10B (that is, the camera 13b, the second illumination unit 14b, and the fourth illumination unit 15b) is identical to that of the first image generation unit 10A (that is, the camera 13a, the first illumination unit 14a, and the third illumination unit 15a) except for the arrangement positions.
The drive unit 16 is connected to the camera 13a, the first illumination unit 14a, and the third illumination unit 15a of the first image generation unit 10A, and to the camera 13b, the second illumination unit 14b, and the fourth illumination unit 15b of the second image generation unit 10B. The drive unit 16, controlled by the control unit 12, moves the first image generation unit 10A and the second image generation unit 10B integrally. This enables the camera 13a and the camera 13b to image the painting simultaneously, one line at a time, while moving.
The imaging apparatus 100 generates two-dimensional image data from the image data that the camera 13a scans line by line and captures into the memory 132a, likewise generates two-dimensional image data from the image data that the camera 13b scans line by line and captures into the memory 132b, and outputs the two sets of image data from the communication unit 11b.
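The line-by-line assembly described above can be sketched briefly. This is a minimal illustration, not the disclosed implementation: it assumes each scan line arrives as a 1-D array of per-pixel RGB values, and the names `assemble_2d_image` and `scan_lines` are hypothetical.

```python
import numpy as np

def assemble_2d_image(scan_lines):
    """Stack successive one-line scans (one per step of the drive
    unit) into a two-dimensional image of shape (lines, width, 3)."""
    return np.stack(scan_lines, axis=0)

# Simulate scanning a 4-pixel-wide line region over 3 movement steps.
scan_lines = [np.full((4, 3), i, dtype=np.uint8) for i in range(3)]
image = assemble_2d_image(scan_lines)
print(image.shape)  # (3, 4, 3)
```

Because the two cameras are moved integrally, the same assembly applied to each camera's lines yields two 2-D images that are already registered to each other, line for line.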
The image processing apparatus 200 includes an input/output unit 21 that receives the image data and outputs height information indicating the height of the surface of the object, a control unit 22 that controls the entire image processing apparatus 200 and processes the two input sets of image data to generate the height information indicating the height of the surface of the object, and a memory 23. The input/output unit 21 includes an input unit 21a and a communication unit 21b. The input unit 21a is a keyboard, a mouse, a touch panel, or the like. The communication unit 21b includes an interface circuit for communicating with external devices in conformity with a predetermined communication standard (for example, LAN or Wi-Fi). For example, when the user inputs an instruction to acquire image data via the input unit 21a, the image processing apparatus 200 outputs an image data acquisition request to the imaging apparatus 100 via the communication unit 21b. The image processing apparatus 200 also receives the image data transmitted from the imaging apparatus 100 via the communication unit 21b.
The control unit 22 calculates the height of the surface of the painting (the height of the convex portions) from the length of the shadows included in the image of the received image data, and generates height information indicating the calculated height. Specifically, as the height information, it generates height data representing the height of the surface of the painting as a numerical value for each pixel. For example, it generates height data whose numerical value increases as the convex portion becomes taller. The generated information is stored in the memory 23. The control unit 22 also outputs the generated height information to the printing apparatus 300 via the communication unit 21b. The control unit 22 can be realized by a semiconductor element or the like. The functions of the control unit 22 may be configured by hardware alone, or may be realized by combining hardware and software. The control unit 22 is composed of, for example, a microcomputer, CPU, MPU, DSP, FPGA, or ASIC. The memory 23 is realized by, for example, a RAM, a DRAM, a ferroelectric memory, a flash memory, a magnetic disk, or a combination thereof.
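The geometry behind this calculation is simple: if the illumination meets the main surface at an elevation angle θ, a convex portion of height h casts a shadow of length s = h / tan θ, so h = s · tan θ can be recovered from the measured shadow length. A minimal sketch under that assumption follows; the function name and the sample numbers are illustrative, not from the disclosure.

```python
import math

def height_from_shadow(shadow_len, elev_deg):
    """Recover the height of a convex portion from the length of the
    shadow it casts under illumination at elevation angle elev_deg
    (measured from the main surface of the object)."""
    return shadow_len * math.tan(math.radians(elev_deg))

# A shadow 2 mm long under 30-degree illumination implies a bump
# of height 2 * tan(30 deg), roughly 1.15 mm.
h = height_from_shadow(2.0, 30.0)
print(round(h, 2))  # 1.15
```

This is why the angle formed by each illumination direction and the main surface of the object enters the height computation alongside the shadow information.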
The printing apparatus 300 prints an image that reproduces the height of the surface of the painting (including the convex portions) based on the height information (height data) received from the image processing apparatus 200. The printing apparatus 300 is, for example, a UV inkjet printer that uses UV ink cured by irradiation with ultraviolet rays. By multilayer printing, the printing apparatus 300 builds up the ink more thickly for larger height values in the height data, thereby printing an image that includes the convex portions.
FIG. 2 shows a state in which the imaging apparatus 100 is imaging a painting 400. FIG. 3 shows a schematic side view of the imaging apparatus 100. In FIGS. 2 and 3, the rightward direction of the painting 400 is the positive X direction, and the leftward direction of the painting 400 is the negative X direction. The downward direction of the painting 400 is the positive Y direction, and the upward direction of the painting 400 is the negative Y direction. The direction perpendicular to the painting is the Z direction. The Y direction is the line-by-line main scanning direction (the so-called scan direction) of the imaging apparatus 100, and the X direction is the direction orthogonal to the Y direction (the sub-scanning direction). The main scanning direction and the sub-scanning direction are the movement directions of the imaging apparatus 100. As shown in FIGS. 2 and 3, the drive unit 16 of the imaging apparatus 100 is composed of a first guide rail 16b extending in the Y direction, a first movable body 16a that moves forward and backward along the first guide rail 16b, a second guide rail 16c extending in the X direction, a second movable body 16d that moves forward and backward along the second guide rail 16c, and a frame 16e connected to the first movable body 16a. The first movable body 16a and the second movable body 16d are moved forward and backward by driving a motor or the like. Both the first image generation unit 10A and the second image generation unit 10B are fixed to the frame 16e. The drive unit 16 thereby moves the first image generation unit 10A and the second image generation unit 10B integrally. The first image generation unit 10A images the painting 400 directly below the camera 13a; that is, it images the painting 400 in the one-line imaging region L1 (first imaging region) directly below the camera 13a. Similarly, the second image generation unit 10B images the painting 400 directly below the camera 13b; that is, it images the painting 400 in the one-line imaging region L2 (second imaging region) directly below the camera 13b.
FIG. 4 shows a schematic top view of the imaging apparatus 100; that is, FIG. 4 shows the imaging apparatus 100 viewed from the direction perpendicular to the main surface of the painting 400 (the Z direction). The main surface of the painting 400 is the XY plane of the painting 400, treated as a plane with the convex portions ignored. The camera 13a of the first image generation unit 10A can image the imaging region L1, which extends in the first direction intersecting the main scanning direction at an angle of 45 degrees. The first illumination unit 14a and the third illumination unit 15a are arranged on both sides of the camera 13a so that their longitudinal directions are parallel to the direction in which the imaging region L1 extends. The first illumination unit 14a and the third illumination unit 15a illuminate the imaging region L1 from the two sides of the camera 13a. The direction in which the first illumination unit 14a illuminates the imaging region L1 is the illumination direction D1 (first illumination direction). The direction in which the third illumination unit 15a illuminates the imaging region L1 is the illumination direction D3 (third illumination direction). The illumination direction D1 and the illumination direction D3 differ by 180 degrees when the painting 400 is viewed from above.
The camera 13b of the second image generation unit 10B can image the imaging region L2, which extends in the second direction. The second direction is a direction that differs from the first direction within the same plane (the XY plane). The second direction intersects the first direction at a predetermined angle (90 degrees in the first embodiment). The second illumination unit 14b and the fourth illumination unit 15b are arranged on both sides of the camera 13b so that their longitudinal directions are parallel to the direction in which the imaging region L2 extends. The second illumination unit 14b and the fourth illumination unit 15b illuminate the imaging region L2 from the two sides of the camera 13b. The direction in which the second illumination unit 14b illuminates the imaging region L2 is the illumination direction D2 (second illumination direction). The direction in which the fourth illumination unit 15b illuminates the imaging region L2 is the illumination direction D4. The illumination direction D2 and the illumination direction D4 differ by 180 degrees when the painting 400 is viewed from above.
 Here, the relationship among illumination directions D1 to D4 is described. When the object 400 is viewed from above, illumination direction D1 differs from illumination directions D2 and D4 and is not parallel to them; in the first embodiment, illumination direction D1 intersects illumination directions D2 and D4 at an angle of 90 degrees when the object 400 is viewed from above. Similarly, when the object 400 is viewed from above, illumination direction D3 differs from illumination directions D2 and D4 and is not parallel to them; in the first embodiment, illumination direction D3 intersects illumination directions D2 and D4 at an angle of 90 degrees when the object 400 is viewed from above. Note that the relation in which one direction intersects another includes the case where the arrow indicating one direction and the arrow indicating the other direction meet on their extensions in the positive or negative direction.
 In the first embodiment, the imaging region L1 and the imaging region L2 have the same size. The first image generation unit 10A and the second image generation unit 10B are arranged side by side in the main scanning direction (Y direction) at the time of imaging. The angle between the direction in which the imaging region L1 extends and the main scanning direction (Y direction), and the angle between the direction in which the imaging region L2 extends and the main scanning direction (Y direction), are each 45 degrees. With this configuration, the imaging apparatus 100 can generate, in a single operation and without rotating the painting 400, image data whose shadow directions differ by 90 degrees, as shown in the upper part (a) and the lower part (b) of FIG. 13. That is, with illumination directions D1 and D2 relative to the object, the imaging apparatus 100 can generate at once two sets of image data whose shadow directions differ by 90 degrees in the top view of the painting 400; likewise with illumination directions D3 and D4.
 The imaging apparatus 100 further includes a light shielding member 17 between the first image generation unit 10A and the second image generation unit 10B, so that the illumination light of the first illumination unit 14a and the third illumination unit 15a of the first image generation unit 10A does not leak into the imaging region L2 of the second image generation unit 10B, and the illumination light of the second illumination unit 14b and the fourth illumination unit 15b of the second image generation unit 10B does not leak into the imaging region L1 of the first image generation unit 10A.
 FIG. 5 schematically shows the first image generation unit 10A and the second image generation unit 10B of the imaging apparatus 100 seen from the direction of arrow V1 in FIG. 4; the light shielding member 17 is omitted in FIG. 5. In the first embodiment, the imaging region L1 is illuminated such that two angles are kept constant: the smaller of the angles (illumination angle) θ between the entire main surface of the horizontally placed painting 400 and illumination direction D1 (first illumination direction), in which the first illumination unit 14a illuminates the imaging region L1 (one line) of the painting 400 directly below the camera 13a; and the smaller of the angles (illumination angle) θ between the entire surface of the painting 400 and illumination direction D3 (third illumination direction), in which the third illumination unit 15a illuminates the imaging region L1. This constant angle θ is, for example, 30°. The first illumination unit 14a and the third illumination unit 15a illuminate the imaging region L1 directly below the camera 13a from the obliquely upper right side and the obliquely lower left side of the imaging region L1 (that is, of the painting), respectively. By illuminating the imaging region L1 obliquely from above or below relative to the painting 400 in this way, the first image generation unit 10A can generate shaded image data. Similarly, the second image generation unit 10B illuminates the imaging region L2 such that the smaller of the angles (illumination angle) θ between illumination direction D2 (second illumination direction) of the second illumination unit 14b and the entire surface of the painting 400, and the smaller of the angles (illumination angle) θ between illumination direction D4 (fourth illumination direction) of the fourth illumination unit 15b and the entire surface of the painting 400, are kept at a constant angle, for example 30°. The illumination angle θ of the first illumination unit 14a to the fourth illumination unit 15b is not limited to 30° and may be any angle at which shadows appear under the illumination; a range of 20° to 45° is particularly suitable.
 In the first image generation unit 10A, the camera 13a, the first illumination unit 14a, and the third illumination unit 15a are fixed to a frame 16e. The drive unit 16 may also have a third movable body 16f that can raise and lower the first illumination unit 14a and the third illumination unit 15a. The camera 13b, the second illumination unit 14b, and the fourth illumination unit 15b of the second image generation unit 10B are likewise fixed to the frame 16e. When the camera 13a and the camera 13b scan and capture the painting 400, the control unit 12 controls the drive unit 16 to translate the cameras 13a and 13b together with the first illumination unit 14a, the second illumination unit 14b, the third illumination unit 15a, and the fourth illumination unit 15b as a unit in the main scanning direction at a constant speed.
 2. Operation
 2.1 Operation of the Imaging Apparatus
 FIG. 6 shows the moving direction (main scanning direction) of the imaging apparatus 100 during imaging. The imaging apparatus 100 captures the painting 400 while moving in the vertical direction (Y direction) of the painting 400, with the painting 400 held fixed. After imaging from the upper end to the lower end of the painting 400 in the positive Y direction, the imaging apparatus 100 moves slightly in the positive X direction and then images from the lower end to the upper end of the painting 400 in the negative Y direction. This is repeated from the start position to the end position to image the entire painting 400. The main scanning direction is not limited to the vertical direction of the painting 400 and may be any direction; for example, it may be vertical, horizontal, or oblique depending on the arrangement or orientation of the painting 400.
 FIG. 7 shows the imaging process of a painting performed by the imaging apparatus 100. First, the painting 400 to be imaged is fixed directly below the imaging apparatus 100, for example as shown in FIG. 2.
 The imaging apparatus 100 illuminates the imaging region L1 and the imaging region L2 from one direction in the first image generation unit 10A and the second image generation unit 10B, respectively, and generates image data including the shadows of the convex portions (shadow image data) (S701). Specifically, the imaging apparatus 100 captures the painting 400 while the drive unit 16 moves the first image generation unit 10A and the second image generation unit 10B from the start position to the end position in FIG. 6. At this time, in the first image generation unit 10A, the camera 13a scans and captures the painting 400 with the imaging region L1 illuminated only by the first illumination unit 14a, generating first image data (shadow image data). Simultaneously, in the second image generation unit 10B, the camera 13b scans and captures the painting 400 with the imaging region L2 illuminated only by the second illumination unit 14b, generating second image data (shadow image data). The first image data and the second image data are two-dimensional image data including the shadows of the convex portions of the painting 400.
 Next, the imaging apparatus 100 illuminates the imaging region L1 and the imaging region L2 from the other direction in the first image generation unit 10A and the second image generation unit 10B, respectively, and generates image data including the shadows of the convex portions (S702). Specifically, the drive unit 16 again moves the first image generation unit 10A and the second image generation unit 10B from the start position to the end position in FIG. 6, or in the reverse direction from the end position to the start position, while the imaging apparatus 100 captures the painting 400. At this time, in the first image generation unit 10A, the camera 13a scans and captures the painting 400 with the imaging region L1 illuminated only by the third illumination unit 15a, generating third image data (shadow image data), and in the second image generation unit 10B, the camera 13b scans and captures the painting 400 with the imaging region L2 illuminated only by the fourth illumination unit 15b, generating fourth image data (shadow image data). The third image data and the fourth image data are two-dimensional image data including the shadows of the convex portions of the painting 400.
 Thereafter, the imaging apparatus 100 illuminates the imaging region L1 or the imaging region L2 from both directions in at least one of the first image generation unit 10A and the second image generation unit 10B, and generates image data that does not include the shadows of the convex portions (S703). Specifically, the drive unit 16 moves the first image generation unit 10A and the second image generation unit 10B from the start position to the end position in FIG. 6, or from the end position to the start position, while the imaging apparatus 100 captures the painting 400 again. At this time, one or both of the first image generation unit 10A and the second image generation unit 10B scan and capture the painting 400 to generate fifth image data (color image data). For example, in the first image generation unit 10A, the camera 13a scans and captures the painting 400 while both the first illumination unit 14a and the third illumination unit 15a simultaneously illuminate the imaging region L1 from its upper and lower sides at the illumination angle θ, generating the fifth image data (color image data). Alternatively, in the second image generation unit 10B, the camera 13b may scan and capture the painting 400 while both the second illumination unit 14b and the fourth illumination unit 15b simultaneously illuminate the imaging region L2 from its upper and lower sides at the illumination angle θ, generating the fifth image data. Further, both the first image generation unit 10A and the second image generation unit 10B may each generate fifth image data. The fifth image data is two-dimensional image data (color image data) that includes the color information (RGB or CMYK) of each pixel of the painting 400 and does not include the shadows of the convex portions. Because both the first illumination unit 14a and the third illumination unit 15a, or both the second illumination unit 14b and the fourth illumination unit 15b, illuminate the painting 400 simultaneously, image data free of the shadows of the convex portions is obtained.
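The three scan passes of steps S701 to S703 can be summarized in a short sketch. The `scan_pass` function and the light identifiers below are hypothetical stand-ins for the apparatus control described above, not part of the patent itself.

```python
# Hypothetical sketch of the capture sequence (S701-S703).
# scan_pass(lights_a, lights_b) stands in for one full scan of the
# painting and returns the image data captured by the first image
# generation unit 10A and the second image generation unit 10B.

def capture_all(scan_pass):
    # S701: each unit lit from one side only -> shadow image data
    img1, img2 = scan_pass(lights_a=("14a",), lights_b=("14b",))
    # S702: each unit lit from the opposite side -> shadow image data
    img3, img4 = scan_pass(lights_a=("15a",), lights_b=("15b",))
    # S703: both lights of one unit on -> shadow-free color image data
    img5, _ = scan_pass(lights_a=("14a", "15a"), lights_b=())
    return img1, img2, img3, img4, img5
```

Note that each call to `scan_pass` corresponds to one traversal from the start position to the end position in FIG. 6; S701 produces the first and second image data in a single pass.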
 The imaging apparatus 100 outputs the generated first to fifth image data from the communication unit 11b (S704).
 FIG. 8A shows the shadow produced when the first illumination unit 14a of the first image generation unit 10A illuminates the painting 400 from the obliquely upper right side, and FIG. 8B shows the shadow produced when the third illumination unit 15a of the first image generation unit 10A illuminates the painting 400 from the obliquely lower left side. The painting 400 may include a convex portion 401 (a thick portion of paint) formed, for example, by layering colors as in an oil painting. The first illumination unit 14a and the third illumination unit 15a of the first image generation unit 10A yield shadows S1 and S2 on two sides of the convex portion 401 (for example, the upper right side and the lower left side). Similarly, the second illumination unit 14b and the fourth illumination unit 15b of the second image generation unit 10B yield shadows on two sides of the convex portion 401 rotated 90 degrees from shadows S1 and S2 in the main surface (XY plane) of the painting 400 (for example, the upper left side and the lower right side). That is, according to the first embodiment, shadows in four directions are obtained for the convex portion 401.
 2.2 Operation of the Image Processing Apparatus
 The image processing apparatus 200 calculates the height of the convex portion 401 of the painting as follows.
 FIG. 9 shows the height data generation process performed by the control unit 22 of the image processing apparatus 200. The image processing apparatus 200 receives the first to fourth image data (shadow image data) and the fifth image data (color image data) output from the imaging apparatus 100 (S901).
 The image processing apparatus 200 calculates the length of the shadow of the convex portion 401 from each of the first image data and the third image data generated by the first image generation unit 10A (S902). As shown in FIG. 8A, when the first illumination unit 14a illuminates the painting 400 from the obliquely upper right side, a shadow S1 of the convex portion 401 appears on one side of the convex portion 401 (for example, its lower left side). The image processing apparatus 200 calculates the length (for example, the number of pixels) of the shadow S1 based on, for example, the difference in luminance value or color between corresponding pixels of the first image data and the fifth image data. Likewise, as shown in FIG. 8B, when the third illumination unit 15a illuminates the painting 400 from the obliquely lower left side, a shadow S2 of the convex portion 401 appears on the other side of the convex portion 401 (for example, its upper right side), and the image processing apparatus 200 calculates the length (for example, the number of pixels) of the shadow S2 based on, for example, the difference in luminance value or color between corresponding pixels of the third image data and the fifth image data.
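A minimal sketch of the shadow-length measurement of S902, assuming grayscale images held as NumPy arrays. The fixed luminance threshold is an illustrative assumption; the text only states that the length is derived from the luminance or color difference between the shadow image and the shadow-free fifth image data.

```python
import numpy as np

def shadow_length(shadow_img, color_img, row, threshold=30):
    """Count, along one image row, the pixels that are darker in the
    shadow image than in the shadow-free color image (fifth image data).
    The threshold value is an illustrative assumption, not from the text."""
    diff = color_img[row].astype(int) - shadow_img[row].astype(int)
    return int(np.count_nonzero(diff > threshold))
```

In practice the row would be chosen perpendicular to the illumination direction so that the counted run of dark pixels spans the shadow S1 or S2.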
 The image processing apparatus 200 calculates the height H3 of the convex portion 401 based on the calculated lengths of the shadows S1 and S2 and the illumination angle θ of the first illumination unit 14a and the third illumination unit 15a (S903). Specifically, the image processing apparatus 200 calculates the height H1 on one side (for example, the lower left side) of the convex portion 401 from the calculated length (number of pixels) of the shadow S1 and the illumination angle θ of the first illumination unit 14a [H1 = length of shadow S1 × tan(θ)]. Similarly, the image processing apparatus 200 calculates the height H2 on the other side (for example, the upper right side) of the convex portion 401 from the calculated length (number of pixels) of the shadow S2 and the illumination angle θ of the third illumination unit 15a [H2 = length of shadow S2 × tan(θ)]. The image processing apparatus 200 then interpolates the height H3 between the heights H1 and H2 on the two sides (the upper right side and the lower left side) of the convex portion 401.
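The height calculation of S903 can be sketched as follows. The pixel-pitch parameter (converting a pixel count to a physical length) and the use of linear interpolation between H1 and H2 are assumptions, since the text specifies only H = shadow length × tan(θ) and that H3 is interpolated between the two edge heights.

```python
import math

def edge_height(shadow_len_px, theta_deg, px_pitch=1.0):
    """H = shadow length x tan(theta), as in S903.
    px_pitch (assumed) converts the pixel count to a physical length."""
    return shadow_len_px * px_pitch * math.tan(math.radians(theta_deg))

def interpolate_profile(h1, h2, n):
    """Fill n height samples between the two edge heights H1 and H2.
    Linear interpolation is an assumed choice; the text does not
    specify the interpolation method."""
    return [h1 + (h2 - h1) * i / (n - 1) for i in range(n)]
```

For example, with the example illumination angle θ = 30° given above, a 10-pixel shadow corresponds to a height of about 5.77 pixel units.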
 Next, from the second image data, the fourth image data, and the fifth image data generated by the second image generation unit 10B, the image processing apparatus 200 calculates the lengths of the shadows of the convex portion 401 in the directions rotated 90 degrees from those of the first image generation unit 10A (for example, the upper left side and the lower right side of the convex portion 401) (S904). The image processing apparatus 200 then calculates the height H3 of the convex portion 401 based on the calculated shadow lengths and the illumination angle θ of the second illumination unit 14b and the fourth illumination unit 15b (S905). That is, from the shadow lengths in these directions, it calculates the heights of the convex portion 401 on the corresponding sides (for example, the upper left side and the lower right side) and interpolates the height H3 between them.
 The image processing apparatus 200 combines the height information from the first image generation unit 10A and the second image generation unit 10B, calculates the height of the entire convex portion 401, and generates height data (S906). That is, the height of the entire convex portion 401 is calculated from the height H3 of the convex portion 401 calculated based on the image data of the first image generation unit 10A (the result of S903) and the height H3 calculated based on the image data of the second image generation unit 10B (the result of S905). For example, for each pixel of the painting 400, the height H3 obtained by interpolation from the heights of the upper right and lower left sides of the convex portion 401 (the result of S903) is compared with the height H3 obtained by interpolation from the heights of the upper left and lower right sides (the result of S905), and the larger value is selected, whereby the height of the entire convex portion 401 is calculated.
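The per-pixel merge of S906 can be expressed directly with NumPy. Representing the two interpolated height maps as arrays is an assumption of this sketch; the selection rule (keep the larger value at each pixel) is stated in the text.

```python
import numpy as np

def combine_heights(h_from_10a, h_from_10b):
    """S906 sketch: merge the height maps derived from the two image
    generation units by keeping the larger value at each pixel."""
    return np.maximum(h_from_10a, h_from_10b)
```

Taking the maximum makes the estimate robust to the case shown in FIG. 13, where one unit's shadow measurement is corrupted by overlapping shadows and therefore underestimates the height.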
 According to the first embodiment, even if the shadow image data generated by the first image generation unit 10A (the first and third image data) is generated in a state in which the shadow of a low convex portion 412 overlaps the shadow of a high convex portion 411, as shown in the upper part (a) of FIG. 13, the shadow image data generated by the second image generation unit 10B (the second and fourth image data) is generated in a state in which the shadow of the high convex portion 411 and the shadow of the low convex portion 412 do not overlap, as shown in the lower part (b) of FIG. 13. Therefore, by using both the shadow image data generated by the first image generation unit 10A and that generated by the second image generation unit 10B, the overall height of the convex portion 401 can be calculated accurately. By calculating the heights of all the convex portions 401 included in the painting 400, the image processing apparatus 200 calculates the height over the entire image of the painting 400 (all pixels constituting the image) and generates height data as height information for the entire image, for example height data expressing the height of each pixel in the image as a numerical value.
 Thereafter, the image processing apparatus 200 outputs the color image data and the height data to the printing apparatus 300 (S907).
 2.3 Operation of the Printing Apparatus
 Based on the height data output from the image processing apparatus 200, the printing apparatus 300 prints transparent ink on a base material (paper, cloth, plastic, or the like), discharging a larger amount of transparent ink for pixels with larger height values. Pixels receiving a larger discharge amount of transparent ink are raised higher, so that the convex portion 401 can be represented. The printing apparatus 300 then prints an image with color ink on top of the transparent ink based on the color image data output from the image processing apparatus 200. The painting 400 with its convex portion 401 reproduced can thereby be duplicated.
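The height-to-discharge mapping used by the printing apparatus can be sketched as follows. The linear mapping and the 0–255 discharge scale are assumptions of this sketch; the text states only that pixels with larger height values receive a larger discharge amount of transparent ink.

```python
def ink_discharge(height, height_max, q_max=255):
    """Map a pixel's height value to a transparent-ink discharge amount.
    A linear mapping onto a 0..q_max scale is an assumed choice; the
    text only requires the mapping to be monotonically increasing."""
    if height_max <= 0:
        return 0
    height = min(max(height, 0.0), height_max)  # clamp to valid range
    return round(q_max * height / height_max)
```

Applying this function to every pixel of the height data yields a discharge map that the printing apparatus could use before overprinting the color image data.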
 3. Effects
 As described above, the imaging apparatus 100 of the first embodiment includes: a first image generation unit 10A including a first imaging unit (imaging unit 131a) capable of capturing a first imaging region L1 extending in a first direction, and a first illumination unit 14a that illuminates the first imaging region L1 from a first illumination direction D1; a second image generation unit 10B including a second imaging unit (imaging unit 131b) capable of capturing a second imaging region L2 extending in a second direction different from the first direction, and a second illumination unit 14b that illuminates the second imaging region L2 from a second illumination direction D2 that differs from the first illumination direction D1 when the second imaging region L2 is viewed from above; and a drive unit 16 that moves the first image generation unit 10A and the second image generation unit 10B as a unit in predetermined moving directions (the main scanning direction and the sub-scanning direction). The imaging apparatus 100 can thus capture images illuminated from a plurality of different directions by the first illumination unit 14a and the second illumination unit 14b. That is, without rotating the object (for example, a painting), the imaging apparatus 100 can capture images under the different illumination directions D1 and D2, as shown in the upper part (a) and the lower part (b) of FIG. 13, so that image data can be synthesized with high accuracy from the plurality of image data sets. In particular, even when the object has a plurality of convex portions and convex portions of different heights lie close together, image data including shadows can be generated accurately based on the plurality of image data sets. Further, since the positional relationship between the first image generation unit 10A and the second image generation unit 10B is fixed, positional misalignment of the images is unlikely to occur when the image data generated by the two units are combined; the imaging apparatus 100 can therefore generate image data including shadows with higher accuracy. If illumination direction D1 and illumination direction D2 are different directions when the object is viewed from above, overlapping of the shadows of the convex portions can be avoided; in particular, if the two directions are non-parallel when the object is viewed from above, such overlapping can be avoided even more reliably.
 The first image generation unit 10A captures an object having convex portions within the first imaging region L1 while illuminating it with the first illumination unit 14a, and generates first shadow information indicating the shadows cast by the convex portions. The second image generation unit 10B captures the object within the second imaging region L2 while illuminating it with the second illumination unit 14b, and generates second shadow information indicating the shadows cast by the convex portions. As a result, even when convex portions of different heights lie close together, image data including shadows can be generated accurately without positional misalignment of the images.
The first image generation unit 10A further includes a third illumination unit 15a that is disposed on the opposite side of the first illumination unit 14a across the imaging unit 131a and illuminates the first imaging region L1 from a third illumination direction D3. The first image generation unit 10A illuminates the object in the first imaging region L1 with the third illumination unit 15a, photographs it, and generates third shadow information indicating the shadows cast by the convex portions. The second image generation unit 10B further includes a fourth illumination unit 15b that is disposed on the opposite side of the second illumination unit 14b across the imaging unit 131b and illuminates the second imaging region L2 from a fourth illumination direction D4. The second image generation unit 10B illuminates the object in the second imaging region L2 with the fourth illumination unit 15b, photographs it, and generates fourth shadow information indicating the shadows cast by the convex portions. The first imaging region L1 can thus be imaged under illumination from two different directions (D1 and D3), and the second imaging region L2 under illumination from two different directions (D2 and D4). More shadow information (the first to fourth shadow information) can therefore be generated, and image data can be generated with still higher accuracy.
The imaging apparatus 100 of the first embodiment photographs the first imaging region L1 with the first image generation unit 10A and the second imaging region L2 with the second image generation unit 10B simultaneously, so that a plurality of sets of image data can be generated in a short time.
The first direction (the direction in which the imaging region L1 extends) is parallel to the longitudinal directions of the first illumination unit 14a and the third illumination unit 15a, and the second direction (the direction in which the imaging region L2 extends) is parallel to the longitudinal directions of the second illumination unit 14b and the fourth illumination unit 15b. Consequently, across the entire imaging region L1 the distance from the first illumination unit 14a to each pixel in the region is the same, as is the distance from the third illumination unit 15a; likewise, across the entire imaging region L2 the distances from the second illumination unit 14b and the fourth illumination unit 15b to each pixel are the same. All pixels in the imaging regions L1 and L2 can therefore be illuminated at a uniform angle and light quantity, yielding accurate shadow image data.
The angle at which the first direction (the direction in which the imaging region L1 extends) and the second direction (the direction in which the imaging region L2 extends) intersect is 90 degrees. The imaging apparatus 100 of the first embodiment can therefore acquire image data of a painting illuminated from directions 90 degrees apart without rotating the painting.
The angle formed between the first direction (the direction in which the imaging region L1 extends) and the movement direction (main scanning direction) of the first image generation unit 10A and the second image generation unit 10B during imaging, and likewise between the second direction (the direction in which the imaging region L2 extends) and that movement direction, is 45 degrees. Because the directions in which the imaging regions L1 and L2 extend cross the main scanning direction at 45 degrees, the X-direction sizes of the imaging regions L1 and L2 are the same, as are their Y-direction sizes. The image processing apparatus 200 can therefore easily combine the height information of the image data obtained by imaging each of the regions L1 and L2.
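The equal-footprint property at 45 degrees can be checked with a short calculation (a sketch; the region length and axis conventions are illustrative and not taken from the embodiment):

```python
import math

def xy_extent(length_mm, angle_deg):
    """X and Y extents of a line-shaped imaging region of the given length
    whose extending direction makes angle_deg with the Y (main scanning) axis."""
    a = math.radians(angle_deg)
    return abs(length_mm * math.sin(a)), abs(length_mm * math.cos(a))

x1, y1 = xy_extent(200.0, 45.0)    # imaging region L1
x2, y2 = xy_extent(200.0, -45.0)   # imaging region L2, crossing L1 at 90 degrees
# At 45 degrees the X and Y footprints of each region coincide, which is
# what makes the height information from L1 and L2 easy to combine.
print(abs(x1 - y1) < 1e-9 and abs(x2 - y2) < 1e-9)
```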
In the first embodiment, the first image generation unit 10A and the second image generation unit 10B are arranged side by side in the main scanning direction, which makes the height information easy to combine.
The imaging apparatus 100 includes, between the first image generation unit 10A and the second image generation unit 10B, a light shielding member 17 that blocks the illumination light of the first illumination unit 14a, the second illumination unit 14b, the third illumination unit 15a, and the fourth illumination unit 15b. The light shielding member 17 prevents illumination light intended for one imaging region from leaking into the other, which would otherwise weaken the shadows or change their direction. Shadow image data can therefore be generated with higher accuracy.
The image processing system 1 of the first embodiment includes the imaging apparatus 100 and an image processing apparatus 200 that generates height information (height data) indicating the height of the surface of the object based on the angle formed by the illumination direction D1 of the first illumination unit 14a and the main surface of the object together with the first shadow information (first image data), and on the angle formed by the illumination direction D2 of the second illumination unit 14b and the main surface of the object together with the second shadow information (second image data). The image processing apparatus 200 further uses the angle formed by the illumination direction D3 of the third illumination unit 15a and the main surface of the object together with the third shadow information (third image data), and the angle formed by the illumination direction D4 of the fourth illumination unit 15b and the main surface of the object together with the fourth shadow information (fourth image data), in generating the height information (height data). Since the imaging apparatus 100 of the first embodiment can generate image data including shadows with high accuracy, the image processing system 1 using it can accurately calculate the height of each convex portion from the length of its shadow.
The image processing method of the first embodiment illuminates an object having convex portions on its surface from the first illumination direction D1 in the first imaging region L1 extending in the first direction, photographs it, and generates first shadow information indicating the shadows cast by the convex portions. In the second imaging region L2 extending in the second direction different from the first direction, the method illuminates the object from the second illumination direction D2, which differs from D1 when the object is viewed from above, photographs it, and generates second shadow information indicating the shadows cast by the convex portions. The method then generates height information indicating the height of the surface of the object based on the angle formed by the first illumination direction D1 and the main surface of the object together with the first shadow information, and on the angle formed by the second illumination direction D2 and the main surface of the object together with the second shadow information. The method further illuminates the object from the third illumination direction D3 in the first imaging region L1, photographs it, and generates third shadow information indicating the shadows cast by the convex portions; and, in the second imaging region L2, illuminates the object from a fourth illumination direction D4, which differs from D3 when the object is viewed from above, photographs it, and generates fourth shadow information indicating the shadows cast by the convex portions. Height information is likewise generated based on the angle formed by D3 and the main surface of the object together with the third shadow information, and on the angle formed by D4 and the main surface of the object together with the fourth shadow information. Because this image processing method can generate image data including shadows with high accuracy, the height of each convex portion can be accurately calculated from the length of its shadow.
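As a minimal sketch of the height calculation described above (the section does not give an explicit formula; the sketch assumes a distant light source whose elevation angle above the object's main surface is known, and uses illustrative values):

```python
import math

def height_from_shadow(shadow_length_mm, elevation_deg):
    """Height of a convex portion estimated from the length of the shadow it
    casts under illumination at elevation_deg above the main surface."""
    return shadow_length_mm * math.tan(math.radians(elevation_deg))

# Shadows measured in two illumination directions (D1 and D2).  When a taller
# neighbouring bump shortens the shadow in one direction, the other direction
# still yields a usable estimate, so taking the larger value is one simple
# merge rule (an assumption for illustration, not taken from the embodiment).
h1 = height_from_shadow(2.0, 30.0)   # full shadow, 2.0 mm at 30 degrees
h2 = height_from_shadow(1.2, 30.0)   # partially occluded shadow
height = max(h1, h2)
print(round(height, 2))  # 1.15
```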
The first embodiment also provides a duplication system including the imaging apparatus 100, the image processing apparatus 200, and a printing apparatus 300. When duplicating a painting, this duplication system can accurately reproduce the convex portions (the height of the painting surface), producing a reproduction that is closer to the original.
In the first embodiment, a plurality of shadow images including the shadows of the convex portions of a painting are captured under illumination from a plurality of different directions; however, a plurality of color images not including those shadows may instead be captured under illumination from a plurality of different directions. For example, for a painting that uses paints whose color varies with the illumination direction, the imaging apparatus 100 of the first embodiment makes it possible to capture a plurality of such color images at once.
(Other embodiments)
As described above, the first embodiment has been described as an illustration of the technique disclosed in the present application. However, the technique of the present disclosure is not limited to it and is also applicable to embodiments in which changes, substitutions, additions, omissions, and the like are made as appropriate. It is also possible to combine the components described in the first embodiment into a new embodiment. Other embodiments are therefore exemplified below.
In the first embodiment, as shown in steps S701 to S703 of FIG. 7, the movement from the scan start position to the end position (or from the end position to the start position) is performed three times to acquire shadow image data (first to fourth image data) under illumination from four directions and color image data (fifth image data). The number of movements, however, need not be three; it may be set arbitrarily as long as shadow image data (first to fourth image data) under illumination from four directions differing by 90 degrees with respect to the painting 400 (for example, upper right and lower left, upper left and lower right) and color image data (fifth image data) can be acquired. For example, by photographing the imaging regions L1 and L2 in three illumination states on each pass in the main scanning direction (illuminated by only the first illumination unit 14a and the second illumination unit 14b; illuminated by only the third illumination unit 15a and the fourth illumination unit 15b; and illuminated by both the first illumination unit 14a and the third illumination unit 15a, or both the second illumination unit 14b and the fourth illumination unit 15b), the shadow image data (first to fourth image data) and the color image data (fifth image data) may be acquired in a single movement from the scan start position to the end position.
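The single-pass variant can be sketched as the following control flow (hypothetical; the function and state names are illustrative and the hardware calls are stubbed):

```python
ILLUMINATION_STATES = [
    ("shadow_D1_D2", ("14a", "14b")),  # first and second illumination units only
    ("shadow_D3_D4", ("15a", "15b")),  # third and fourth illumination units only
    ("color",        ("14a", "15a")),  # both sides of one region for shadow-free color
]

def scan_single_pass(num_steps, set_lights, capture_lines):
    """One movement from the scan start to the end position.  At every step the
    imaging regions L1 and L2 are photographed under each of the three
    illumination states, so the shadow image data (first to fourth image data)
    and the color image data (fifth image data) come out of a single pass."""
    images = {name: [] for name, _ in ILLUMINATION_STATES}
    for step in range(num_steps):
        for name, lights in ILLUMINATION_STATES:
            set_lights(lights)                        # switch illumination units
            images[name].append(capture_lines(step))  # capture the L1 and L2 lines
    return images

# Stubbed usage: record which lights were switched on at each capture.
log = []
result = scan_single_pass(3, log.append, lambda step: ("L1", "L2", step))
print(len(result["color"]), len(log))  # 3 captures per state, 9 light switches
```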
The first embodiment describes an example in which the colors of a painting are reproduced. When only the three-dimensional shape of an object having convex portions is reproduced, without reproducing its colors, the processing for generating the color image data (step S703) can be omitted.
In the first embodiment, the crossing angle between the directions in which the imaging regions L1 and L2 extend is 90 degrees, but it need not be exactly 90 degrees; substantially 90 degrees suffices. FIG. 10 shows the allowable range of the crossing angle according to the number of pixels of the imaging units 131a and 131b, that is, the range of angles that can be regarded as substantially 90 degrees. FIG. 10 assumes that, when the image data captured by the imaging units 131a and 131b is taken at a resolution of 1000 dpi, up to 1 mm of error in the position of a convex portion on the XY plane can be tolerated. In this case, the position error of a convex portion can be calculated from the number of pixels of the imaging unit, the 1000 dpi resolution of the image data, and the tangent of the crossing-angle error; for example, if the imaging units 131a and 131b each have 7000 pixels, the allowable range of the crossing angle is 89.5° to 90.5°. The crossing angle between the imaging regions L1 and L2 may thus be determined within the allowable range corresponding to the number of pixels of the imaging units 131a and 131b, as shown in FIG. 10.
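One way to read the tolerance calculation above is as the lateral displacement that a small crossing-angle error produces at the end of the sensor line. The sketch below assumes the rotation is measured about the sensor center, so only half the line length acts as the lever arm (an assumption, since the passage does not state the pivot point), with the 7000-pixel, 1000 dpi example values:

```python
import math

def edge_displacement_mm(num_pixels, dpi, angle_error_deg, about_center=True):
    """Lateral displacement (mm) at the end of a line sensor caused by a small
    crossing-angle error.  With the pivot at the sensor center, only half the
    line length contributes to the displacement at either end."""
    length_mm = num_pixels / dpi * 25.4              # sensor line length in mm
    lever_mm = length_mm / 2 if about_center else length_mm
    return lever_mm * math.tan(math.radians(angle_error_deg))

# 7000-pixel sensors at 1000 dpi with a 0.5-degree crossing-angle error:
d = edge_displacement_mm(7000, 1000, 0.5)
print(round(d, 2))  # about 0.78 mm, within the 1 mm tolerance assumed in FIG. 10
```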
In the first embodiment, the first image generation unit 10A and the second image generation unit 10B are arranged one behind the other with respect to the main scanning direction (along the Y direction), as shown in FIG. 4, but other arrangements are possible. FIG. 11 shows a different positional relationship: the first image generation unit 10A and the second image generation unit 10B may be arranged side by side with respect to the main scanning direction (along the X direction). In that case too, if the directions in which the imaging regions L1 and L2 extend are arranged to cross the main scanning direction at 45 degrees, the X-direction and Y-direction sizes of the imaging regions L1 and L2 are the same, so the height information of the image data obtained by imaging each region can be combined easily.
The side-by-side arrangement of the first image generation unit 10A and the second image generation unit 10B shown in FIG. 11 can be realized by changing the main scanning direction of the arrangement of FIG. 4 from the Y direction to the X direction. Whether the two units are arranged one behind the other with respect to the main scanning direction as in FIG. 4 or side by side as in FIG. 11 may be decided according to the aspect ratio of the painting to be imaged; for example, when imaging a vertically long painting, the one-behind-the-other arrangement may be used.
The first embodiment describes an example in which the imaging apparatus 100 includes two image generation units, the first image generation unit 10A and the second image generation unit 10B, but the number of image generation units is not limited to two; the imaging apparatus 100 may include three or more. FIG. 12 shows the positional relationship when the imaging apparatus 100 includes three image generation units (the first image generation unit 10A, the second image generation unit 10B, and a third image generation unit 10C). The third image generation unit 10C has an imaging unit (third imaging unit) that images a third imaging region L3 extending in a third direction, and an illumination unit 14c (fifth illumination unit) and an illumination unit 15c (sixth illumination unit) that illuminate the third imaging region L3. The directions in which the imaging regions L1 to L3 extend all differ. As shown in FIG. 12, the three image generation units (first image generation unit 10A to third image generation unit 10C) may be arranged so that the extending directions of their imaging regions L1 to L3 cross at 60 degrees, which makes it possible to calculate the heights of the convex portions with still higher accuracy.
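With three line regions crossing at 60 degrees, each convex portion yields up to three direction-dependent height estimates. One simple, hypothetical way to combine them (not specified in this section) is to take the median, which discards a single occluded or noisy estimate:

```python
import statistics

def combine_height_estimates(estimates_mm):
    """Combine per-direction height estimates for one convex portion.  The
    median keeps the result stable when one direction's shadow is occluded
    by a neighbouring bump (an illustrative rule, assumed for this sketch)."""
    return statistics.median(estimates_mm)

# Estimates from three illumination directions; the short one is occluded.
print(combine_height_estimates([1.15, 1.10, 0.40]))  # 1.1
```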
In the first embodiment, a painting is used as the example of the object imaged by the imaging apparatus 100 of the present disclosure, but the imaging target is not limited to paintings. The idea of the imaging apparatus 100 applies in particular to imaging planar objects having convex portions, for example works of art such as sculpture, wallpaper, and floor and ceiling coverings. Likewise, although a painting is used as the example of the target of the image processing system 1 of the present disclosure, the image processing target is not limited to paintings; the idea of the image processing system 1 applies to generating height information for the surface of a planar object having convex portions.
The duplication system of the present disclosure can be realized through cooperation with hardware resources, for example a processor, a memory, and a program.
The components described in the accompanying drawings and the detailed description may include not only components essential to solving the problem but also components that are not essential and are included only to illustrate the technique. The mere fact that such non-essential components appear in the accompanying drawings or the detailed description should therefore not be taken to mean that they are essential.
Since the above embodiment is intended to illustrate the technique of the present disclosure, various changes, substitutions, additions, omissions, and the like can be made within the scope of the claims or their equivalents.
The present disclosure is applicable to an imaging device that generates image data including the shadows of an object having convex portions (for example, a painting), and to an image processing system that generates height data of such an object from image data including the shadows.
DESCRIPTION OF SYMBOLS
1 Image processing system
10A First image generation unit
10B Second image generation unit
10C Third image generation unit
11 Input/output unit
11a Input unit
11b Communication unit
12 Control unit
13a, 13b Camera
131a, 131b Imaging unit
132a, 132b Memory
14a First illumination unit
14b Second illumination unit
14c Fifth illumination unit
15a Third illumination unit
15b Fourth illumination unit
15c Sixth illumination unit
16 Drive unit
17 Light shielding member
21 Input/output unit
21a Input unit
21b Communication unit
22 Control unit
23 Memory
100 Imaging device
200 Image processing device
300 Printing device

Claims (15)

1.  An imaging device comprising:
    a first image generation unit including a first imaging unit capable of imaging a first imaging region extending in a first direction, and a first illumination unit that illuminates the first imaging region from a first illumination direction;
    a second image generation unit including a second imaging unit capable of imaging a second imaging region extending in a second direction different from the first direction, and a second illumination unit that illuminates the second imaging region from a second illumination direction that differs from the first illumination direction when the second imaging region is viewed from above; and
    a drive unit that integrally moves the first image generation unit and the second image generation unit in a predetermined movement direction.
2.  The imaging device according to claim 1, wherein
    the first image generation unit illuminates an object having a convex portion in the first imaging region with the first illumination unit, photographs the object, and generates first shadow information indicating a shadow cast by the convex portion, and
    the second image generation unit illuminates the object in the second imaging region with the second illumination unit, photographs the object, and generates second shadow information indicating a shadow cast by the convex portion.
3.  The imaging device according to claim 2, wherein
    the first image generation unit further includes a third illumination unit that is disposed on the opposite side of the first illumination unit across the first imaging unit and illuminates the first imaging region from a third illumination direction, and the first image generation unit illuminates the object in the first imaging region with the third illumination unit, photographs the object, and generates third shadow information indicating a shadow cast by the convex portion, and
    the second image generation unit further includes a fourth illumination unit that is disposed on the opposite side of the second illumination unit across the second imaging unit and illuminates the second imaging region from a fourth illumination direction, and the second image generation unit illuminates the object in the second imaging region with the fourth illumination unit, photographs the object, and generates fourth shadow information indicating a shadow cast by the convex portion.
4.  The imaging device according to claim 2, wherein the photographing of the first imaging region by the first image generation unit and the photographing of the second imaging region by the second image generation unit are performed simultaneously.
5.  The imaging device according to claim 1, wherein
    the first direction is parallel to the longitudinal direction of the first illumination unit, and
    the second direction is parallel to the longitudinal direction of the second illumination unit.
6.  The imaging device according to claim 1, wherein the angle at which the first direction and the second direction intersect is 90 degrees.
7.  The imaging device according to claim 6, wherein the angle formed by the first direction and the movement direction and the angle formed by the second direction and the movement direction are both 45 degrees.
8.  The imaging device according to claim 1, wherein the first image generation unit and the second image generation unit are arranged side by side in the movement direction.
9.  The imaging device according to claim 1, further comprising, between the first image generation unit and the second image generation unit, a light shielding member that shields the illumination light of the first illumination unit and the second illumination unit.
10.  The imaging device according to claim 1, further comprising a third image generation unit including a third imaging unit capable of imaging a third imaging region extending in a third direction different from both the first direction and the second direction, and a fifth illumination unit that illuminates the third imaging region from a fifth illumination direction that differs from the first illumination direction and the second illumination direction when the third imaging region is viewed from above.
  11.  The imaging apparatus according to claim 10, wherein the angle formed by the third direction with each of the first direction and the second direction is 60 degrees.
  12.  The imaging apparatus according to claim 3, wherein the object is illuminated and photographed simultaneously by at least one of (a) both the first illumination unit and the third illumination unit and (b) both the second illumination unit and the fourth illumination unit, and image data including color information of the object is generated.
  13.  An image processing system comprising:
     the imaging device according to claim 2; and
     an image processing device that generates height information indicating the height of the surface of the object, based on the angle formed by the first illumination direction and the main surface of the object and the shadow length of the convex portion included in the first shadow information, and on the angle formed by the second illumination direction and the main surface of the object and the shadow length of the convex portion included in the second shadow information.
  14.  An imaging method comprising:
     scan-imaging a first image by illuminating and photographing, from a first illumination direction, an object in a first imaging region extending in a first direction while moving the first imaging region in a predetermined moving direction; and
     scan-imaging a second image by illuminating and photographing the object in a second imaging region extending in a second direction different from the first direction, from a second illumination direction different from the first illumination direction when the object is viewed from above, while moving the second imaging region in the moving direction integrally with the first imaging region.
  15.  An image processing method comprising:
     illuminating and photographing, from a first illumination direction, an object having a convex portion on its surface in a first imaging region extending in a first direction, and generating first shadow information indicating a shadow cast by the convex portion;
     illuminating and photographing the object, from a second illumination direction different from the first illumination direction when the object is viewed from above, in a second imaging region extending in a second direction different from the first direction, and generating second shadow information indicating a shadow cast by the convex portion; and
     generating height information indicating the height of the surface of the object, based on the angle formed by the first illumination direction and the main surface of the object and the shadow length of the convex portion included in the first shadow information, and on the angle formed by the second illumination direction and the main surface of the object and the shadow length of the convex portion included in the second shadow information.
PCT/JP2017/010125 2016-08-31 2017-03-14 Image pickup device, image processing system, image pickup method, and image processing method WO2018042727A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-169640 2016-08-31
JP2016169640A JP2019192948A (en) 2016-08-31 2016-08-31 Imaging apparatus, image processing system, and image processing method

Publications (1)

Publication Number Publication Date
WO2018042727A1 true WO2018042727A1 (en) 2018-03-08

Family

ID=61300317

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/010125 WO2018042727A1 (en) 2016-08-31 2017-03-14 Image pickup device, image processing system, image pickup method, and image processing method

Country Status (2)

Country Link
JP (1) JP2019192948A (en)
WO (1) WO2018042727A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021117099A1 (en) * 2019-12-09 2021-06-17 株式会社アルステクネ Image processing method, program, and image processing device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61162704A (en) * 1985-01-14 1986-07-23 Yokogawa Electric Corp Apparatus for measuring dimension of outer diameter
JPS63273047A (en) * 1987-04-30 1988-11-10 Hitachi Ltd Surface irregularity detector
JP2003262509A (en) * 2002-03-08 2003-09-19 Dainippon Screen Mfg Co Ltd Inspection apparatus, measuring instrument, and method for measurement

Also Published As

Publication number Publication date
JP2019192948A (en) 2019-10-31

Similar Documents

Publication Publication Date Title
CN111497231B (en) 3D printing method and device, storage medium and 3D printing system
US9969121B2 (en) Multifunctional 3D scanning and printing apparatus
Moreno et al. Simple, accurate, and robust projector-camera calibration
KR101513107B1 (en) Adjusting apparatus laser beam machining apparatus adjusting method and adjusting program
Huang et al. A fast and flexible projector-camera calibration system
JP7067554B2 (en) Image processing equipment and methods
TWI618640B (en) Three dimensional printing system, and method for three dimensional printing
WO2018037604A1 (en) Image processing system and image processing method
WO2018042727A1 (en) Image pickup device, image processing system, image pickup method, and image processing method
CN108062790B (en) Three-dimensional coordinate system establishing method applied to object three-dimensional reconstruction
US8908012B2 (en) Electronic device and method for creating three-dimensional image
Vieira et al. A camera-projector system for real-time 3d video
KR20200046789A (en) Method and apparatus for generating 3-dimensional data of moving object
JP2019192949A (en) Image processing system and image processing method
CN1482491A (en) Three-dimensional photographic technology
US11282187B2 (en) Inspection system, inspection apparatus, and method using multiple angle illumination
CN101609284B (en) Method for calibrating bias of exposure image and imaging device
WO2018020533A1 (en) Image processing device, replication system, and replication method
WO2017221286A1 (en) Image processing device, replication system, and replication method
JP3823559B2 (en) How to convert 3D distance data
WO2018037586A1 (en) Image processing system and image processing method
US8654223B2 (en) Image pickup apparatus
JP2006179031A (en) Image input apparatus
WO2017154047A1 (en) Image pickup apparatus
TWI624171B (en) Image Scanning Apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17845738

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 17845738

Country of ref document: EP

Kind code of ref document: A1