WO2005080915A1 - Three-dimensional shape detection device and image pick up device - Google Patents

Three-dimensional shape detection device and image pick up device

Info

Publication number
WO2005080915A1
WO2005080915A1, PCT/JP2005/002297, JP2005002297W
Authority
WO
WIPO (PCT)
Prior art keywords
light
dimensional shape
slit light
pattern light
pattern
Prior art date
Application number
PCT/JP2005/002297
Other languages
French (fr)
Japanese (ja)
Inventor
Hiroshi Uchigashima
Hiroaki Suzuki
Hiroyuki Sasaki
Shiro Yamada
Original Assignee
Brother Kogyo Kabushiki Kaisha
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Brother Kogyo Kabushiki Kaisha
Publication of WO2005080915A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B 11/2513 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object, with several lines being projected in more than one direction, e.g. grids, patterns

Definitions

  • The present invention relates to a three-dimensional shape detection device and an imaging device that can detect not only the three-dimensional shape of a target object curved in a first direction at a given imaging position, but also the three-dimensional shape of a target object curved in a second direction intersecting the first direction.
  • A three-dimensional shape detection device is known in which slit light is projected onto a target object, the target object with the slit light projected on it is imaged by an imaging means, and the three-dimensional shape of the target object is detected on the basis of the locus of the slit light detected in the captured image.
  • Japanese Patent Laying-Open No. 7-318315 describes this type of three-dimensional shape detection device.
  • The device described in this publication splits the light beam from a single light source into two rows with a half mirror, projects the two resulting beams onto a target object as two rows of slit light, and detects the three-dimensional shape of the target object based on the trajectories that the two rows of slit light form on the object.
  • The present invention has been made to solve the problems described above. Its object is to provide a three-dimensional shape detection device and an imaging device that can detect not only the three-dimensional shape of a target object curved in a first direction at a given imaging position, but also the three-dimensional shape of a target object curved in a second direction intersecting the first direction.
  • One aspect of the present invention provides a three-dimensional shape detection device comprising: a light projecting means that projects pattern light; an imaging means that images the target object while the pattern light is projected onto it; and a three-dimensional shape calculating means that detects the position of the pattern light in the image captured by the imaging means and calculates three-dimensional shape information of the target object from the detected pattern light position.
  • As the pattern light, the light projecting means projects onto the target object a first pattern light extending along a plane that passes through a position away from the optical axis of the imaging means, and a second pattern light extending along a plane that intersects the first pattern light.
  • According to this three-dimensional shape detection device, the curvature of a target object curved in the first direction is calculated based on the first pattern light, which extends along a plane passing through a position away from the optical axis of the imaging means, and the curvature of a target object curved in a direction intersecting the first direction is calculated based on the second pattern light, which extends along a plane intersecting the first pattern light. Therefore, in addition to the three-dimensional shape of a target object curved in the first direction, the three-dimensional shape of a target object curved in a second direction intersecting the first direction can also be detected.
  • According to another aspect of the present invention, an imaging device is provided that comprises the three-dimensional shape detection device described above and a planar image correcting means that, based on the three-dimensional shape of the target object calculated by the three-dimensional shape calculating means of the three-dimensional shape detection device, corrects the image of the target object captured by the imaging means without the pattern light projected into a planar image as observed from a direction substantially perpendicular to a predetermined surface of the target object.
  • According to this imaging device, the same effects as those of the three-dimensional shape detection device described above can be obtained; in addition, based on the three-dimensional shape of the target object calculated by the three-dimensional shape detection device, the image of the target object captured by the imaging means without the pattern light projected can be corrected into a planar image as observed from a direction substantially perpendicular to a predetermined surface of the target object.
  • FIG. 1 (a) is an external perspective view of the imaging device
  • FIG. 1 (b) is a schematic cross-sectional view of the imaging device 1 taken along a sectional line I in FIG. 1 (a).
  • FIGS. 2(a) and 2(b) are diagrams showing slit light being projected from the imaging device toward a document: FIG. 2(a) shows the case where a document curved in the horizontal direction (X-axis direction) with respect to the imaging device is imaged, and FIG. 2(b) shows the case where a document curved in the vertical direction (Y-axis direction) is imaged.
  • FIGS. 3 (a) and 3 (b) are diagrams showing the configuration of a slit light projecting unit
  • FIG. 3 (a) is a plan view of the slit light projecting unit.
  • FIG. 3(b) is a side view of the slit light projecting unit.
  • FIG. 4 is a block diagram showing an electrical configuration of the imaging device.
  • FIG. 5 is a flowchart showing a processing procedure in a processor.
  • FIG. 6 (a) and FIG. 6 (b) are diagrams for explaining the above-described triangulation calculation processing.
  • FIG. 7(a) is a diagram showing the image with slit light captured in the state of FIG. 2(a), and FIG. 7(b) is a diagram showing the image with slit light captured in the state of FIG. 2(b).
  • FIGS. 8 (a), 8 (b), and 8 (c) are diagrams for explaining a horizontal curving original posture calculation process.
  • FIG. 9 is a flowchart showing a plane conversion process.
  • FIGS. 10 (a), 10 (b) and 10 (c) are diagrams showing another example of the arrangement of the slit light projecting ports.
  • FIG. 11 is a view for explaining the effect obtained when the slit light projecting port is placed as far from the imaging lens as possible.
  • FIGS. 12(a) and 12(b) are views showing the slit light projecting unit of the second embodiment; FIG. 12(a) shows the state in which the first slit light is projected, and FIG. 12(b) shows the state in which the second slit light is projected.
  • FIGS. 13(a) and 13(b) are views showing the slit light projecting unit of the third embodiment; FIG. 13(a) shows a side view of the state in which the first slit light is projected together with a front view of the diffraction element, and FIG. 13(b) shows a side view of the state in which the second slit light is projected together with a front view of the diffraction element.
  • Imaging device including a three-dimensional shape detection device
  • Imaging lens (imaging means)
  • FIG. 1A is an external perspective view of an imaging device 1 according to an embodiment of the present invention
  • FIG. 1B is a schematic cross-sectional view of the imaging device 1 along a sectional line I in FIG. 1A. Note that the three-dimensional shape detection device according to the embodiment of the present invention is included in the imaging device 1.
  • The imaging device 1 is a device that can detect, from the same imaging position, the three-dimensional shape of a document P, including a document P that is curved in either the vertical or the horizontal direction.
  • the main outer shape of the imaging device 1 is formed by a rectangular box-shaped main body case 10.
  • an imaging lens 31, a finder 53, and a slit light emitting unit 29 are provided on the front surface of the main body case 10.
  • a release button 52, a mode switching switch 59, and a bending direction switching switch 60 are provided on the side surface of the main body case 10.
  • a memory card slot 55a is provided on the side surface of the main body case 10.
  • a liquid crystal display (LCD) 51 for displaying a captured image is provided on the rear surface of the main body case 10.
  • Inside the main body case 10 are incorporated a CCD image sensor 32 arranged behind the imaging lens 31, a slit light projecting unit 20 arranged behind the slit light projecting port 29, a memory card reader 55 disposed behind the memory card slot 55a, and a processor 40.
  • the imaging lens 31 is composed of a plurality of lenses and has an autofocus function.
  • the imaging lens 31 is an optical lens that automatically adjusts the focal length and aperture to form an image of external light on the CCD image sensor 32.
  • the finder 53 is composed of an optical lens disposed from the back to the front of the main body case 10. With this configuration, when the user looks into the imaging device 1 from the back, a range that substantially matches the range in which the imaging lens 31 forms an image on the CCD image sensor 32 can be seen.
  • The slit light projecting port 29 is a light projecting port through which the slit light projected from the slit light projecting unit 20 is directed toward the target object, and it is provided at the lower left corner of the front surface of the main body case 10.
  • The slit light emitted from the slit light projecting port 29 forms a cross-shaped slit light locus on the document P. By detecting this locus, the three-dimensional shape of the document P is detected.
  • the release button 52 is formed by a push-button switch, and is connected to the processor 40.
  • the processor 40 detects that the user presses down the release button 52.
  • The mode switching switch 59 is configured by a slide switch or the like that can be switched between two positions, and is connected to the processor 40. In the processor 40, one switch position of the mode switching switch 59 is assigned to the “normal mode” and the other switch position to the “corrected imaging mode”.
  • Normal mode is a mode in which the imaged original P itself is used as image data.
  • The “corrected imaging mode” is a mode in which, when the document P is imaged from an oblique direction, the image data is corrected so that it appears as if the document P had been imaged from the front.
  • the bending direction switching switch 60 is constituted by a slide switch or the like that can be switched between two positions, and is connected to the processor 40.
  • One switch position of the bending direction switching switch 60 is detected as the “horizontal bending mode”, and the other switch position as the “vertical bending mode”.
  • The “horizontal bending mode” is a mode for detecting the three-dimensional shape of a document P that is bent in the horizontal direction with respect to the imaging device 1 (see the document P in a bent state shown in FIG. 2A).
  • the “vertical bending mode” is a mode for detecting a three-dimensional shape of a document P that is bent in the vertical direction with respect to the imaging device 1 (see the document P in a bent state shown in FIG. 2B).
  • The memory card reader 55 is a device that reads image data stored on a memory card, which holds image data in a non-volatile and rewritable manner, and writes captured image data to the memory card.
  • The LCD 51 is configured by a liquid crystal display or the like that displays images; it receives image signals from the processor 40 and displays them. As necessary, the processor 40 sends to the LCD 51 image signals for displaying the real-time image received by the CCD image sensor 32, images stored on the memory card, and characters showing the settings of the device.
  • the CCD image sensor 32 has a configuration in which photoelectric conversion elements such as CCD (Charge Coupled Device) elements are arranged in a matrix.
  • the CCD image sensor 32 generates a signal corresponding to the color and intensity of the light of the image formed on the surface, converts the signal into digital data, and outputs the digital data to the processor 40.
  • The data from one CCD element corresponds to the pixel data of one pixel of the image, and the image data is made up of as many pieces of pixel data as there are CCD elements.
  • FIG. 2A is a diagram illustrating a case where an image of a document P that is curved in the horizontal direction (X-axis direction) with respect to the imaging device 1 is imaged
  • FIG. 2B is a diagram illustrating a case where an image of a document P curved in the vertical direction (Y-axis direction) with respect to the imaging device 1 is captured. That is, FIG. 2A shows the case where the document P is imaged in the “horizontal bending mode”, and FIG. 2B shows the case where the document P is imaged in the “vertical bending mode”.
  • The first slit light 71 and the second slit light 72 are emitted toward the document P.
  • the first slit light 71 is projected at a position away from the optical axis (Z axis) of the imaging lens 31 so as to extend along a plane parallel to the optical axis.
  • On the document P, a locus 71a of the first slit light is formed by the first slit light 71.
  • The second slit light 72 is projected so as to extend along a plane that is substantially orthogonal to the first slit light 71, passes through a position away from the optical axis (Z axis) of the imaging lens 31, and is parallel to the optical axis. On the document P, a locus 72a of the second slit light is formed by the second slit light 72.
  • The first slit light 71 and the second slit light 72 are projected so as to cross each other, so the locus 71a of the first slit light and the locus 72a of the second slit light together form a cross-shaped slit light trajectory on the document P.
  • The locus 71a of the first slit light is formed along the curve of the document P, so from the trajectory 71a it is possible to calculate the curvature of a document P that is curved in the horizontal direction.
  • FIGS. 3A and 3B are views showing the configuration of the slit light projecting unit 20.
  • FIG. 3A is a plan view of the slit light projecting unit 20
  • FIG. 3B is a side view of the slit light projecting unit 20.
  • The slit light projecting unit 20 includes a laser diode 21, a collimating lens 22, a half mirror 23, two apertures 24a and 24b, a reflection mirror 26, and two cylindrical lenses 27a and 27b.
  • the laser diode 21 is arranged on the most upstream side of the slit light projecting unit 20, and emits one laser beam.
  • the laser diode 21 switches between emission and stop of the laser beam in response to a command from the processor 40.
  • The collimating lens 22 is disposed downstream of the laser diode 21 and focuses the laser beam from the laser diode 21 at a reference distance VP (for example, 330 mm) from the slit light projecting unit 20.
  • the half mirror 23 is located downstream of the collimating lens 22 and is inclined at approximately 45 degrees with respect to the traveling direction of the laser beam passing through the collimating lens 22.
  • the laser beam that has passed through the collimating lens 22 is split by the half mirror 23 into a laser beam that passes through the half mirror 23 as it is and a laser beam that travels substantially vertically downward.
  • the apertures 24a and 24b are each formed of a plate having a rectangular opening.
  • the aperture 24a is disposed downstream of the laser beam passing through the half mirror 23 as it is, and the aperture 24b is disposed downstream of the laser beam traveling substantially vertically downward by the half mirror 23.
  • Each laser beam split by the half mirror 23 passes through the opening formed in the aperture 24a or 24b and is shaped into slit light of a width appropriate for the imaging range.
  • The reflection mirror 26 is formed of a member such as a mirror that totally reflects the laser beam, and is disposed downstream of the slit light passing through the aperture 24b at an angle of approximately 45 degrees with respect to the traveling direction of that slit light. The slit light passing through the aperture 24b is totally reflected by the reflection mirror 26, and its traveling direction is thereby changed by about 90 degrees.
  • The cylindrical lenses 27a and 27b are optical lenses formed in a so-called cylindrical shape, as if a cylinder were split in two along its axis; they condense and diverge light in the cross section that has curvature and pass light unchanged in the cross section that has no curvature.
  • the cylindrical lens 27a is arranged on the downstream side of the aperture 24a so that the curved surface faces the downstream side in a cross-sectional view on the XZ plane.
  • the slit light that has passed through the aperture 24a passes through the cylindrical lens 27a, and is emitted as the above-described first slit light 71 toward the document P via the slit light emitting port 29.
  • FIG. 4 is a block diagram showing an electrical configuration of the imaging device 1.
  • The processor 40 mounted on the imaging device 1 includes a CPU 41, a ROM 42, and a RAM 43.
  • The above-described CCD image sensor 32, memory card reader 55, release button 52, slit light projecting unit 20, LCD 51, mode switching switch 59, and bending direction switching switch 60 are connected to the CPU 41 via signal lines.
  • The CPU 41 performs the following processing using the RAM 43 in accordance with the programs stored in the ROM 42: detection of a pressing operation of the release button 52, capture of image data from the CCD image sensor 32, writing of image data to the memory card, detection of the state of the mode switching switch 59, switching of the emission of the slit light by the slit light projecting unit 20, and so on.
  • the ROM 42 stores a camera control program 421, a difference extraction program 422, a triangulation calculation program 423, a document attitude calculation program 424, and a plane conversion program 425.
  • the camera control program 421 is a program relating to the control of the entire imaging apparatus 1 including the processing of the flowchart shown in FIG. 5 (details will be described later).
  • The difference extraction program 422 is a program that calculates the difference between the image with slit light, obtained by imaging the document P with the slit light projected, and the image without slit light, obtained by imaging the document P without the slit light, and thereby extracts the trajectory of the slit light.
  • the triangulation calculation program 423 is a program for calculating a three-dimensional spatial position with respect to each pixel of the locus of the slit light extracted by the difference extraction program 422.
  • The document attitude calculation program 424 is a program for estimating the three-dimensional shape of the document P from the three-dimensional spatial positions of the locus 71a of the first slit light and the locus 72a of the second slit light.
  • The plane conversion program 425 is a program that, given the three-dimensional shape of the document P, converts the image data stored in the slit light non-image storage unit 432 into an image as if the document P had been imaged from the front.
  • The RAM 43 is assigned, as storage areas, a slit light image storage unit 431 and a slit light non-image storage unit 432, each sized to store image data from the CCD image sensor 32, and a difference image storage unit 433 for storing the image data of the difference between the image with slit light and the image without slit light.
  • The RAM 43 is also assigned a triangulation calculation result storage unit 434 sized to store the three-dimensional spatial positions of the pixels constituting the trajectories 71a and 72a of the slit light, a document orientation calculation result storage unit 435 sized to store the three-dimensional shape of the document P, and a working area 436 used for temporarily storing data for calculations in the CPU 41.
  • FIG. 5 is a flowchart showing a processing procedure in the processor 40 of the imaging device 1.
  • In step S110, when the release button 52 is pressed by the user, the position of the mode switching switch 59 is first detected, and it is determined whether the switch is at the “corrected imaging mode” position (S110). If it is (S110: Yes), the process proceeds to step S120.
  • In step S120, the laser diode 21 is instructed to emit light, the first slit light 71 and the second slit light 72 are projected from the slit light projecting unit 20, and image data represented by RGB values is then acquired from the CCD image sensor 32 as the image with slit light. This image data is read into the slit light image storage unit 431 (S120).
  • Next, the laser diode 21 is commanded to stop emitting, so that the slit light projecting unit 20 stops projecting the first slit light 71 and the second slit light 72, and image data represented by RGB values is acquired from the CCD image sensor 32 as the image without slit light. This image data is read into the slit light non-image storage section 432 (S130).
  • Next, the difference extraction program 422 takes the difference between the image data in the slit light image storage unit 431 and the image data in the slit light non-image storage section 432, that is, it generates image data in which only the trajectory 71a of the first slit light and the trajectory 72a of the second slit light projected onto the document P are extracted, and this image data is read into the difference image storage unit 433.
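  • For illustration, a minimal sketch of this difference computation follows, assuming the two captures are NumPy RGB arrays of the same size taken from the same viewpoint; the function name and the threshold value are illustrative additions, not part of the patent.
```python
import numpy as np

def extract_slit_trace(img_with_slit, img_without_slit, threshold=30):
    """Binary mask of the slit-light trajectories (the difference image).

    Both inputs are H x W x 3 RGB arrays of the same scene, one captured
    with the slit light projected (S120) and one without it (S130).
    """
    # Use a signed type so the subtraction cannot wrap around.
    diff = img_with_slit.astype(np.int16) - img_without_slit.astype(np.int16)
    # The slit light only adds intensity, so keep the positive differences
    # and sum them over the colour channels before thresholding.
    gain = np.clip(diff, 0, None).sum(axis=2)
    return gain > threshold
```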
  • In step S170a, as the document orientation calculation process for the horizontal direction, the document attitude calculation program 424 calculates the three-dimensional shape of the document P based on the three-dimensional spatial positions of each slit light trajectory stored in the triangulation calculation result storage unit 434, and the calculation result is read into the document posture calculation result storage unit 435.
  • In step S170b, as the document orientation calculation process for the vertical direction, the document attitude calculation program 424 similarly calculates, as in the horizontal document orientation calculation process (S170a), the three-dimensional shape of the document P based on the three-dimensional spatial positions of each slit light trajectory stored in the triangulation calculation result storage unit 434, and the calculation result is read into the document posture calculation result storage unit 435.
  • Then, using the three-dimensional shape data read into the document posture calculation result storage section 435, the plane conversion program 425 converts the image data stored in the slit light non-image storage section 432 into image data of an image as viewed from the front (S180).
  • the image data of the generated erect image is written to the memory card (S190), and the process ends.
  • If the switch is not at the “corrected imaging mode” position (S110: No), the laser diode 21 of the slit light projecting unit 20 is not made to emit light, and in a state where the first slit light 71 and the second slit light 72 are not projected, the image without slit light is read from the CCD image sensor 32 into the slit light non-image storage unit 432 (S200). Then, the image data is written to the memory card (S210), and the process ends.
  • FIGS. 6 (a) and 6 (b) are diagrams for explaining the above-described triangulation calculation processing (S150 in FIG. 5).
  • As shown in FIGS. 6(a) and 6(b), the coordinate system of the imaging device 1 with respect to the document P to be imaged is defined so that the point at the reference distance VP from the imaging device 1 is the origin of the X, Y, and Z axes, the horizontal direction with respect to the imaging device 1 is the X axis, the vertical direction is the Y axis, and the optical axis direction is the Z axis.
  • the number of pixels in the X-axis direction of the CCD image sensor 32 is defined as ResX, and the number of pixels in the Y-axis direction is defined as Res Y.
  • the upper end of the position where the CCD image sensor 32 is projected on the XY plane through the imaging lens 31 is Yftop, the lower end is Yfbottom, the left end is Xfstart, and the right end is Xfend.
  • the distance between the optical axis of the imaging lens 31 and the optical axis of the first slit light 71 is D
  • The position in the Y-axis direction at which the first slit light 71 intersects the XY plane is las1, and the position in the Y-axis direction at which the second slit light 72 intersects the XY plane is las2.
  • The three-dimensional spatial position (X1, Y1, Z1) corresponding to the coordinates (ccdx1, ccdy1) on the CCD image sensor 32 of a point of interest 1, taken on one of the pixels of the image of the locus 71a of the first slit light, is obtained from the triangle formed by the corresponding point on the image plane of the CCD image sensor 32, the emission point of the first slit light 71 and the second slit light 72, and the point of intersection with the XY plane.
  • Similarly, the three-dimensional spatial position (X2, Y2, Z2) corresponding to the coordinates (ccdx2, ccdy2) on the CCD image sensor 32 of a point of interest 2, taken on one of the pixels of the image of the locus 72a of the second slit light, is derived from the solution of the five simultaneous equations, one of which is:
  • X2 = -(X2target / VP) Z2 + X2target
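  • As an illustration of this triangulation, a hedged sketch is given below. It treats the camera centre as lying at (0, 0, VP) on the Z axis and describes each slit-light plane by a point and a normal obtained from calibration, which is an equivalent ray-plane formulation of the simultaneous equations above rather than the patent's exact derivation; all function and parameter names are illustrative.
```python
import numpy as np

def pixel_to_target_plane(ccdx, ccdy, res_x, res_y,
                          xf_start, xf_end, yf_top, yf_bottom):
    """Map a CCD pixel to the point where its viewing ray crosses the X-Y
    plane (Z = 0), using the field-of-view bounds Xfstart, Xfend, Yftop
    and Yfbottom defined above."""
    x_t = xf_start + (ccdx / res_x) * (xf_end - xf_start)
    y_t = yf_top - (ccdy / res_y) * (yf_top - yf_bottom)
    return np.array([x_t, y_t, 0.0])

def triangulate(ccdx, ccdy, slit_point, slit_normal, vp,
                res_x, res_y, xf_start, xf_end, yf_top, yf_bottom):
    """Intersect the viewing ray of pixel (ccdx, ccdy) with the slit-light
    plane and return the 3-D position (X, Y, Z) of that trajectory point."""
    cam = np.array([0.0, 0.0, vp])          # assumed camera centre
    target = pixel_to_target_plane(ccdx, ccdy, res_x, res_y,
                                   xf_start, xf_end, yf_top, yf_bottom)
    ray = target - cam                      # direction of the viewing ray
    t = np.dot(slit_normal, slit_point - cam) / np.dot(slit_normal, ray)
    return cam + t * ray
```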
  • FIG. 7 (a) is a diagram showing a slit light image captured in the state of FIG. 2 (a).
  • FIGS. 8(a) to 8(c) are diagrams for explaining the coordinate system used in the horizontal document orientation calculation processing.
  • Based on the obtained inclination θ and the intersection P2 between the trajectory 71a of the first slit light and the YZ plane, the Z coordinate of the point (0, 0, L), that is, the value L, is determined.
  • Next, the curve obtained by approximating the locus 71a of the first slit light with a regression curve is rotated in the opposite direction by the previously obtained inclination θ around the X axis; that is, a state in which the document P is parallel to the XY plane is considered.
  • The cross-sectional shape of the document P in the X-axis direction is then obtained by calculating the displacement in the Z-axis direction at a plurality of points along the X axis for the cross section of the document P on the X-Z plane, and from this the curvature φ(X), a function giving the tilt in the X-axis direction with the position in the X-axis direction as the variable, is obtained. In this way, the inclination θ, the distance L, and the curvature φ(X), which describe the three-dimensional shape of the document P, can be obtained.
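  • A minimal sketch of this posture calculation is shown below, assuming the triangulated points of the two trajectories are available as NumPy arrays. The polynomial degrees, the way L is read off, and the representation of the cross-sectional shape as a fitted polynomial (whose derivative plays the role of the tilt function φ(X) in the text) are assumptions made only for illustration.
```python
import numpy as np

def document_posture_horizontal(trace71, trace72):
    """Estimate the tilt theta, the distance L and the cross-sectional shape
    of the document from the triangulated slit-light trajectories.

    trace71: (N, 3) points of the locus 71a (extends roughly along X).
    trace72: (M, 3) points of the locus 72a (extends roughly along Y).
    Returns theta [rad], L and a polynomial z = cross_section(x); its
    derivative corresponds to the tilt function phi(X) of the text.
    """
    # Tilt about the X axis: linear regression of Z on Y along locus 72a.
    slope = np.polyfit(trace72[:, 1], trace72[:, 2], 1)[0]
    theta = np.arctan(slope)

    # Distance L: Z value of the regression curve of locus 71a where it
    # crosses the Y-Z plane (X = 0).
    z_of_x = np.poly1d(np.polyfit(trace71[:, 0], trace71[:, 2], 4))
    L = float(z_of_x(0.0))

    # Undo the shift and the tilt so the sheet lies parallel to the X-Y
    # plane; what remains in Z is the displacement due to the curvature.
    c, s = np.cos(-theta), np.sin(-theta)
    z_flat = s * trace71[:, 1] + c * (trace71[:, 2] - L)
    cross_section = np.poly1d(np.polyfit(trace71[:, 0], z_flat, 4))
    return theta, L, cross_section
```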
  • FIG. 7 (b) is a diagram showing an image with slit light captured in the state of FIG. 2 (b).
  • In the vertical document posture calculation processing as well, the curvature φ(X), the distance L, and the inclination θ describing the three-dimensional shape of the document P can be obtained in the same way as in the horizontal document posture calculation processing (S170a).
  • First, a processing area for this processing is allocated in the working area 436 of the RAM 43, and initial values of the variables used in the processing, such as the counter variable b, are set (S1300).
  • Next, based on the calculation result of the document posture calculation program 424 (the three-dimensional spatial position (0, 0, L) and so on), the area of the erect image, that is, the image obtained when the surface of the document P on which the characters and the like are written is observed from a substantially perpendicular direction, is set by transforming the points at the four corners of the image without slit light, and the number a of pixels included in this area is obtained (S1301).
  • Each three-dimensional spatial position is displaced in the Z-axis direction based on the curvature φ(X) (S1303), rotated around the X axis by the tilt θ (S1304), and shifted in the Z-axis direction by the distance L (S1305). The resulting three-dimensional spatial position is converted into coordinates (ccdcx, ccdcy) on the CCD image of an ideal camera using the triangulation relational expression described above (S1306), these are converted into coordinates (ccdx, ccdy) on the CCD image of the actual camera by a known calibration method (S1307), and the state of the pixel of the image without slit light at this position is obtained and stored in the working area 436 of the RAM 43 (S1308).
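  • A schematic sketch of this pixel-by-pixel conversion is given below. The mappings erect_to_xy and project_to_ccd stand in for the area set in S1301 and for the combined triangulation and calibration projection of S1306-S1307, and are assumed to be supplied from elsewhere; cross_section is the fitted cross-sectional polynomial from the posture sketch above.
```python
import numpy as np

def plane_conversion(img_no_slit, theta, L, cross_section,
                     erect_shape, erect_to_xy, project_to_ccd):
    """Assemble the erect (front-view) image of the curved document."""
    rows, cols = erect_shape
    out = np.zeros((rows, cols, 3), dtype=img_no_slit.dtype)
    c, s = np.cos(theta), np.sin(theta)
    h, w = img_no_slit.shape[:2]
    for r in range(rows):
        for col in range(cols):
            x, y = erect_to_xy(r, col)                 # document-plane coordinates
            z = cross_section(x)                       # S1303: Z displacement from the curvature
            y_r, z_r = c * y - s * z, s * y + c * z    # S1304: rotate about the X axis by theta
            z_r += L                                   # S1305: shift along Z by the distance L
            ccdx, ccdy = project_to_ccd((x, y_r, z_r)) # S1306/S1307: project to real CCD coords
            u, v = int(round(ccdy)), int(round(ccdx))
            if 0 <= u < h and 0 <= v < w:
                out[r, col] = img_no_slit[u, v]        # S1308: sample the slit-less image
    return out
```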
  • FIGS. 10(a) to 10(c) are diagrams showing other examples of the arrangement of the slit light projecting port 29 on the front surface of the main body case 10.
  • The slit light projecting port 29 must be placed at a position that does not lie on either the vertical or the horizontal line passing through the imaging lens 31, that is, at a position offset from the imaging lens 31 in both directions.
  • With such an arrangement, each of the slit lights 71 and 72 emitted from the slit light projecting port 29 does not overlap the optical axis of the imaging lens 31, so the three-dimensional spatial positions of the locus 71a of the first slit light and the locus 72a of the second slit light calculated by the triangulation calculation processing described above (S150 in FIG. 5) can be obtained with high accuracy.
  • It is preferable that the slit light projecting port 29 be disposed as far away from the imaging lens 31 as possible within the area of the front surface.
  • FIG. 11 is a diagram for explaining that it is preferable to dispose the slit light projecting unit 29 at a position as far away from the imaging lens 31 as possible.
  • a description will be given of a case where the slit light projecting unit 29 is arranged at a position A which is separated from the imaging lens 31 by a in the Y-axis direction and a case where it is arranged at a position B which is separated by b.
  • The second slit light 72 emitted from the slit light projecting port 29 arranged at position A reaches the point P2, while the second slit light 72 emitted from the slit light projecting port 29 arranged at position B reaches the point P3.
  • The second slit light 72 emitted from the slit light projecting port 29 strikes a point (Y1, Z1) on the document P; if the document P, which is tilted with respect to the Y axis, were made parallel to the Y axis, the point (Y1, Z1) on the document P would correspond to the point P1.
  • Therefore, it is preferable that the slit light projecting port 29 be disposed as far away from the imaging lens 31 as possible within the front surface area; the same applies to the X-axis direction.
  • To take a picture, the user switches the mode switching switch 59 to the “corrected imaging mode” side, checks the bending direction of the document P with respect to the imaging position and sets the bending direction switching switch 60 accordingly, confirms through the finder 53 or on the LCD 51 that the desired range of the document P is within the imaging range, and presses the release button 52.
  • When the imaged document is curved in the horizontal direction, its curvature is calculated based on the trajectory 71a of the first slit light, which is formed along the curve; when it is curved in the vertical direction, its curvature is calculated based on the trajectory 72a of the second slit light, which is formed along the curve. Therefore, even when a document curved in either of the two intersecting directions is imaged from a given position, a three-dimensional shape including the curve of the document can be detected.
  • FIGS. 12(a) and 12(b) are side views showing the slit light projecting unit 20a of the second embodiment, which corresponds to the slit light projecting unit 20 of the first embodiment described above.
  • FIG. 12 (a) shows a state in which the first slit light 71 is projected
  • FIG. 12 (b) shows a state in which the second slit light 72 is projected. Note that the same members as those of the slit light projecting unit 20 of the first embodiment are denoted by the same reference numerals, and description thereof will be omitted.
  • the slit light projecting unit 20 of the first embodiment projects first and second slit lights 71 and 72 by two cylindrical lenses 27a and 27b, respectively.
  • In contrast, the slit light projecting unit 20a of the second embodiment rotates a single cylindrical lens 70 so that it projects the first slit light 71 in a first state and the second slit light 72 in a second state.
  • the slit light projecting unit 20a of the second embodiment includes a laser diode 21, a collimating lens 22, an aperture 24, and a cylindrical lens 70 in order from the upstream side.
  • The cylindrical lens 70 is configured to be rotatable by a driving device (not shown) between a first state, in which its curved surface faces the downstream side in a sectional view on the XZ plane, and a second state, in which its curved surface faces the downstream side in a sectional view on the YZ plane.
  • By configuring the cylindrical lens 70 to be rotatable between the first state and the second state in this manner, when the cylindrical lens 70 is placed in the first state and a laser beam is emitted from the laser diode 21, the laser beam passes through the collimating lens 22 and the aperture 24 and is converted into slit light; this slit light then passes through the cylindrical lens 70 and is emitted, with a predetermined angular width, as the planar first slit light 71 extending in the direction orthogonal to the Y axis (parallel to the X axis).
  • When the cylindrical lens 70 is rotated from the first state into the second state and a laser beam is emitted from the laser diode 21, the laser beam passes through the collimating lens 22 and the aperture 24 and is converted into slit light; this slit light then passes through the cylindrical lens 70 and is projected, with a predetermined angular width, as the planar second slit light 72 extending parallel to the Y axis.
  • The first imaging is executed in the first state, and the second imaging is executed in the second state.
  • The trajectory 71a of the first slit light is detected by calculating the difference between the first image with slit light, obtained by the first imaging, and the image without slit light, and the trajectory 72a of the second slit light is detected by calculating the difference between the second image with slit light, obtained by the second imaging, and the image without slit light.
  • If the capture of the image without slit light is executed between the first imaging and the second imaging, the image without slit light can be obtained efficiently.
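  • A possible acquisition sequence reflecting this ordering is sketched below; camera, set_lens_state and the reuse of extract_slit_trace from the earlier difference sketch are illustrative assumptions, not part of the patent.
```python
def capture_two_slit_traces(camera, set_lens_state, threshold=30):
    """First imaging, slit-less imaging, second imaging: the middle frame
    serves both difference computations."""
    set_lens_state("first")              # cylindrical lens 70 in the first state
    camera.laser_on()
    img_slit1 = camera.capture()         # first imaging (first slit light 71)
    camera.laser_off()
    img_plain = camera.capture()         # image without slit light, taken in between
    set_lens_state("second")             # cylindrical lens 70 in the second state
    camera.laser_on()
    img_slit2 = camera.capture()         # second imaging (second slit light 72)
    camera.laser_off()
    trace71 = extract_slit_trace(img_slit1, img_plain, threshold)
    trace72 = extract_slit_trace(img_slit2, img_plain, threshold)
    return trace71, trace72
```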
  • the required number of cylindrical lenses can be reduced as compared with the slit light projecting unit 20 of the first embodiment. Since the number of parts can be reduced, the manufacturing cost of the imaging device 1 can be reduced.
  • FIGS. 13(a) and 13(b) are views showing the slit light projecting unit 20b of the third embodiment, which corresponds to the slit light projecting unit 20 of the first embodiment described above.
  • FIG. 13A shows a side view showing a state where the first slit light 71 is projected and a front view of the diffraction element 80.
  • FIG. 13B shows a side view showing a state in which the second slit light 72 is projected and a front view of the diffraction element 80.
  • the same members as those of the slit light projecting unit 20 of the first embodiment are denoted by the same reference numerals, and description thereof will be omitted.
  • the slit light projecting unit 20 of the first embodiment projects the first and second slit lights 71 and 72 by using two cylindrical lenses 27a and 27b.
  • the slit light projecting unit 20b of the third embodiment projects the first and second slit lights 71, 72 by using a diffraction element plate 80 which is movably arranged in the vertical direction.
  • The slit light projecting unit 20b of the third embodiment includes, in order from the upstream side, the laser diode 21, the collimating lens 22, the aperture 24, and a plate-shaped diffraction element plate 80 arranged at a position facing the aperture 24. A solenoid 81 is connected to the diffraction element plate 80.
  • On the surface of the diffraction element plate 80, a plurality of first diffraction elements 80a extending parallel to the X-axis direction are formed in roughly the upper half, and a plurality of second diffraction elements 80b extending parallel to the Y-axis direction are formed in roughly the lower half.
  • the solenoid 81 converts magnetic energy of the coil into linear motion of the plunger 81b.
  • the solenoid 81 includes a main body 81a containing a coil, and a plunger 81b reciprocating with respect to the main body 81a.
  • a diffraction element plate 80 is connected to one end of the plunger 81b.
  • In the state shown in FIG. 13A, that is, with the portion of the diffraction element plate 80 on which the second diffraction elements 80b are formed placed at the position facing the aperture 24, when a laser beam is emitted from the laser diode 21, the laser beam passes through the collimating lens 22, the aperture 24, and the second diffraction elements 80b, and is emitted as the planar first slit light 71 extending, with a predetermined angular width, in the direction orthogonal to the Y axis (parallel to the X axis).
  • Next, the diffraction element plate 80 is moved downward by the plunger 81b through the action of the solenoid 81. Then, in the state shown in FIG. 13B, that is, with the portion of the diffraction element plate 80 on which the first diffraction elements 80a are formed facing the aperture 24, when a laser beam is emitted from the laser diode 21, the laser beam passes through the collimating lens 22, the aperture 24, and the first diffraction elements 80a, and is projected as the planar second slit light 72 extending, with a predetermined angular width, in the direction parallel to the Y axis.
  • In this way, the trajectories 71a and 72a of the first and second slit lights can be detected in the same manner as with the slit light projecting unit 20a of the second embodiment.
  • Since the diffraction element plate 80 is inexpensive compared with a cylindrical lens, the cost of parts can be reduced, and the manufacturing cost of the imaging device 1 can be reduced.
  • the processing of S170a or the processing of S170b in Fig. 5 is regarded as a three-dimensional shape calculation means.
  • the process of S180 in FIG. 5 is regarded as a planar image correcting unit.
  • In the embodiments above, the case has been described in which the user specifies, via the bending direction switching switch 60, which of the first slit light 71 and the second slit light 72 is the slit light lying along the curve of the document.
  • Alternatively, which of the slit lights lies along the curve may be determined in software, and the document orientation calculation process for the horizontal direction (S170a) or for the vertical direction (S170b) may then be executed based on the result of that determination. By making this determination in software, the switch operation by the user becomes unnecessary and the operability of the device can be improved.
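  • One possible software determination is sketched below: it runs the posture calculation for the direction whose slit-light trajectory deviates more from a straight line in depth. The heuristic and the function name are assumptions added for illustration; the patent does not specify how the determination is made.
```python
import numpy as np

def detect_bend_direction(points71, points72):
    """Choose between the horizontal (S170a) and vertical (S170b) posture
    calculations from the triangulated trajectory points."""
    def depth_residual(points, axis):
        # Mean squared residual of a linear fit of Z against the coordinate
        # along which the slit light extends; large values indicate bending.
        coef = np.polyfit(points[:, axis], points[:, 2], 1)
        fit = np.polyval(coef, points[:, axis])
        return float(np.mean((points[:, 2] - fit) ** 2))

    bend_along_x = depth_residual(points71, axis=0)   # locus 71a extends along X
    bend_along_y = depth_residual(points72, axis=1)   # locus 72a extends along Y
    return "horizontal" if bend_along_x >= bend_along_y else "vertical"
```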
  • the imaging apparatus 1 is configured to capture an image with slit light and an image without slit light using the imaging lens 31 and the CCD image sensor 32.
  • Alternatively, an imaging lens and a CCD image sensor dedicated to capturing the image with slit light may be additionally provided. However, the imaging device 1 of the present embodiment, which shares a single imaging lens and CCD image sensor, can be made smaller and less expensive because it has fewer components.
  • In the embodiments above, the laser diode 21, which emits a laser beam, is used as the light source; however, any other light source that can output a light beam, such as a surface-emitting laser, an LED, or an EL element, may be used.
  • The slit light emitted from the slit light projecting unit 20 is one type of pattern light; the pattern light is not limited to slit light, and, for example, striped pattern light having a certain width may be used instead.
  • the first pattern light and the second pattern light may be projected so as to extend along a plane parallel to the optical axis of the imaging means.
  • When the first pattern light and the second pattern light are projected so as to extend along planes parallel to the optical axis of the imaging means, the first pattern light or the second pattern light can be projected along the curve of the document, so the curvature of the document can be detected with high accuracy.
  • the light projecting means includes two optical lenses that respectively emit the first pattern light and the second pattern light.
  • When the light projecting means includes the two optical lenses that respectively emit the first pattern light and the second pattern light, the first pattern light and the second pattern light can be projected onto the target object simultaneously, so an image containing both the trajectory of the first pattern light and the trajectory of the second pattern light can be acquired efficiently.
  • each of the two optical lenses may be composed of a cylindrical lens.
  • One cylindrical lens is arranged so as to have its curved surface in a sectional view with respect to the plane along which the first pattern light is emitted, and the other cylindrical lens is arranged so as to have its curved surface in a sectional view with respect to the plane along which the second pattern light is emitted.
  • With this arrangement, the first pattern light can be emitted toward the target object from one cylindrical lens, and the second pattern light from the other cylindrical lens.
  • Since the two optical lenses can be constituted by identical cylindrical lenses, parts can be shared.
  • The light projecting means may include one light output means for outputting a light beam, and a splitting means for splitting the light beam output from the light output means in two directions.
  • One of the two optical lenses is disposed in the path of one of the two light beams split by the splitting means, and the other optical lens is disposed in the path of the other light beam.
  • With this configuration, each of the two optical lenses receives one of the light beams obtained by splitting, in two directions, the light beam output from the single light output means; therefore, even when two optical lenses are used, there is no need to mount two light output means, the number of parts can be reduced, and the manufacturing cost of the device can be reduced.
  • the light projecting means may include one optical lens having a first state for emitting the first pattern light and a second state for emitting the second pattern light.
  • the one optical lens is configured to be movable between a first state and a second state.
  • Since the first pattern light and the second pattern light are emitted from a single optical lens, the number of parts can be reduced compared with the case where two optical lenses are used to emit the first pattern light and the second pattern light, and the manufacturing cost of the device can be reduced.
  • one optical lens may be constituted by a cylindrical lens.
  • The cylindrical lens is configured to be movable so that, in the first state, it has its curved surface in a cross-sectional view with respect to the plane along which the first pattern light is emitted, and, in the second state, it has its curved surface in a cross-sectional view with respect to the plane along which the second pattern light is emitted.
  • the first pattern light and the second pattern light can be respectively emitted to the target object only by switching between the first state and the second state.
  • The light projecting means may include a first diffractive element for projecting the first pattern light onto the target object and a second diffractive element for projecting the second pattern light onto the target object.
  • the light projecting unit includes the first diffractive element that projects the first pattern light onto the target object, and the second diffractive element that projects the second pattern light onto the target object. Therefore, the cost of parts can be reduced and the manufacturing cost of the device can be reduced as compared with the case where the light projecting means is configured using an optical lens.
  • In this case, the light projecting means includes a light output means for outputting a light beam, and the first diffraction element and the second diffraction element are configured to be movable so that, in the first state, one of them is placed in the path of the light beam while the other is located outside the path, and, in the second state, the other is placed in the path of the light beam while the first is located outside the path.
  • the first pattern light and the second pattern light can be respectively emitted to the target object only by switching between the first state and the second state.
  • The light projecting means may be configured so that, assuming the pixels that partition the image captured by the imaging means into a grid, the first pattern light is projected along a row of pixels extending in a first direction and the second pattern light is projected along a row of pixels extending in a second direction substantially orthogonal to the first direction.
  • the first pattern light and the second pattern light can be projected along a row of pixels, and the three-dimensional shape of the target object can be detected with high accuracy.
  • The imaging lens of the imaging means may be disposed at substantially the center of the substantially rectangular front surface of the device, and the projecting port of the pattern light may be disposed at any one of the four corners of the front surface.
  • When the imaging lens of the imaging means is disposed at substantially the central portion of the substantially rectangular front surface of the device and the projecting port of the pattern light is provided at one of the four corners of the front surface, the first pattern light and the second pattern light can be emitted without overlapping the optical axis of the imaging means, and the coordinates in real space of the trajectory of the first pattern light and the trajectory of the second pattern light can be obtained with high accuracy by triangulation.
  • The light projecting means may include a light output means that emits a light beam, and a slit light projecting means that converts the light beam output from the light output means into planar slit light having a predetermined angular width and projects the slit light onto the target object.

Abstract

A three-dimensional shape detection device is provided with a light projecting means for projecting pattern light, an image pick up means, which picks up an image of a target object in the conditions where the pattern light is projected from the light projecting means, and a three-dimensional shape calculating means, which detects the position of the pattern light from the image picked up by the image pick up means and calculates three-dimensional shape information of the target object, based on the detected pattern light position. As the pattern light, the light projecting means projects a first pattern light and a second pattern light to the target object. The first pattern light extends along a flat plane passing a position separated from an optical axis of the image pick up means, and the second pattern light extends along a flat plane crossing the first pattern light.

Description

明 細 書  Specification
3次元形状検出装置および撮像装置  3D shape detection device and imaging device
技術分野  Technical field
[0001] 本発明は、所定の撮像位置において第 1方向に湾曲する対象物体の 3次元形状に 加え、第 1方向と交差する第 2方向に湾曲している対象物体の 3次元形状をも検出す ることができる 3次元形状検出装置および撮像装置に関する。  The present invention detects not only the three-dimensional shape of a target object curving in a first direction at a predetermined imaging position, but also the three-dimensional shape of a target object curving in a second direction that intersects the first direction. The present invention relates to a three-dimensional shape detection device and an imaging device that can perform the above-described operations.
背景技術  Background art
[0002] 従来より、対象物体にスリット光を投光して、そのスリット光が投光された対象物体を 撮像手段によって撮像し、その撮像された画像から検出されるスリット光の軌跡に基 づいて、対象物体の 3次元形状を検出する 3次元形状検出装置が知られている。  [0002] Conventionally, slit light is projected onto a target object, the target object on which the slit light is projected is imaged by an imaging means, and the slit light is projected on the basis of the locus of the slit light detected from the captured image. Therefore, a three-dimensional shape detection device that detects a three-dimensional shape of a target object is known.
[0003] 特開平 7-318315号公報には、この種の 3次元形状検出装置が記載されている。  [0003] Japanese Patent Laying-Open No. 7-318315 describes this type of three-dimensional shape detection device.
この公報に記載された装置は、 1つの光源からの光ビームをハーフミラーで 2列に分 割し、その 2列に分割された光ビームを 2列のスリット光として対象物体に投光し、そ の 2列のスリット光によって対象物体上に形成される 2列のスリット光の軌跡に基づい て、対象物体の 3次元形状を検出するよう構成されている。  The device described in this publication divides a light beam from one light source into two rows with a half mirror, and projects the light beam split into the two rows onto a target object as two rows of slit light, It is configured to detect the three-dimensional shape of the target object based on the trajectories of the two rows of slit light formed on the target object by the two rows of slit light.
[0004] この 3次元形状検出装置によって、例えば、長手方向と直交する方向(横方向)に 湾曲する A4サイズの原稿を縦長に撮像する場合、 2列のスリット光の軌跡は、原稿の 湾曲に沿って形成される。したがって、その 2列のスリット光の軌跡に基づいて、たと え原稿が湾曲している場合であっても、その湾曲を含む原稿の 3次元形状を検出す ること力 Sできる。  [0004] For example, when an A4 size document curving in a direction (horizontal direction) orthogonal to the longitudinal direction is imaged vertically by the three-dimensional shape detecting device, the trajectory of the two rows of slit light is caused by the curving of the document. Formed along. Therefore, even if the original is curved, the three-dimensional shape of the original including the curvature can be detected based on the locus of the two rows of slit light.
発明の開示  Disclosure of the invention
[0005] し力しながら、上述した 3次元形状検出装置において、上述したように湾曲した原稿 を同じ撮影位置で横長に撮像すベぐ原稿を回転させた場合には、対象物体上に形 成される 2列のスリット光の軌跡は、その湾曲に沿って形成されないので、その湾曲を 検出することができないとレ、う問題点があった。  [0005] In the above-described three-dimensional shape detection device, while rotating a document that is to capture a curved document horizontally at the same photographing position as described above, the document is formed on the target object. Since the trajectories of the two rows of slit light formed are not formed along the curvature, there is a problem that the curvature cannot be detected.
[0006] 即ち、上述した 3次元形状検出装置によって対象物体上に形成されるスリット光の 軌跡は 1の方向にだけ延びるように形成されるので、所定の撮像位置において、原 稿の湾曲を検出できるのは、原稿が投光されるスリット光の軌跡に沿って湾曲してい る場合に限られる。よって、スリット光の軌跡と交差する方向に湾曲する原稿について は、その原稿の湾曲を検出することができないという問題点があった。 [0006] That is, since the trajectory of the slit light formed on the target object by the above-described three-dimensional shape detection device is formed so as to extend only in the direction of 1, the original light at the predetermined imaging position The curving of the manuscript can be detected only when the manuscript is curved along the path of the slit light projected. Therefore, there is a problem that a document curved in a direction intersecting the locus of the slit light cannot be detected.
[0007] The present invention has been made to solve the problems described above. That is, an object of the present invention is to provide a three-dimensional shape detection device and an imaging device capable of detecting not only the three-dimensional shape of a target object curved in a first direction at a predetermined imaging position but also the three-dimensional shape of a target object curved in a second direction intersecting the first direction.

[0008] In order to achieve the above object, one aspect of the present invention provides a three-dimensional shape detection device comprising: light projecting means for projecting pattern light; imaging means for imaging a target object while the pattern light is projected from the light projecting means; and three-dimensional shape calculation means for detecting the position of the pattern light in the image captured by the imaging means and calculating three-dimensional shape information of the target object based on the detected position of the pattern light. In this configuration, the light projecting means projects onto the target object, as the pattern light, a first pattern light extending along a plane that passes through a position away from the optical axis of the imaging means and a second pattern light extending along a plane that intersects the first pattern light.

[0009] According to this three-dimensional shape detection device, the curvature of a target object curved in the first direction is calculated based on the first pattern light, which extends along a plane passing through a position away from the optical axis of the imaging means, and the curvature of a target object curved in a direction intersecting the first direction is calculated based on the second pattern light, which extends along a plane intersecting the first pattern light. Therefore, in addition to the three-dimensional shape of a target object curved in the first direction, the three-dimensional shape of a target object curved in the second direction intersecting the first direction can also be detected.

[0010] According to another aspect of the present invention, there is provided an imaging device comprising the above three-dimensional shape detection device and plane image correction means for correcting an image of the target object, captured by the imaging means while the pattern light is not projected, into a plane image as observed from a direction substantially perpendicular to a predetermined surface of the target object, based on the three-dimensional shape of the target object calculated by the three-dimensional shape calculation means of the three-dimensional shape detection device.

[0011] This imaging device provides the same effects as the above three-dimensional shape detection device and, based on the three-dimensional shape of the target object calculated by the three-dimensional shape detection device, can correct the image of the target object captured while the pattern light is not projected into a plane image as observed from a direction substantially perpendicular to a predetermined surface of the target object.
Brief Description of Drawings
[FIG. 1] FIG. 1(a) is an external perspective view of the imaging device, and FIG. 1(b) is a schematic cross-sectional view of the imaging device 1 taken along section line I-I in FIG. 1(a).
[FIG. 2] FIGS. 2(a) and 2(b) are diagrams showing states in which slit light is projected from the imaging device toward a document; FIG. 2(a) shows the case of imaging a document curved in the lateral direction (X-axis direction) with respect to the imaging device, and FIG. 2(b) shows the case of imaging a document curved in the vertical direction (Y-axis direction) with respect to the imaging device.
[FIG. 3] FIGS. 3(a) and 3(b) are diagrams showing the configuration of the slit light projecting unit; FIG. 3(a) is a plan view of the slit light projecting unit, and FIG. 3(b) is a side view of the slit light projecting unit.
[FIG. 4] A block diagram showing the electrical configuration of the imaging device.
[FIG. 5] A flowchart showing the processing procedure in the processor.
[FIG. 6] FIGS. 6(a) and 6(b) are diagrams for explaining the triangulation calculation processing described herein.
[FIG. 7] FIG. 7(a) is a diagram showing the image with slit light captured in the state of FIG. 2(a), and FIG. 7(b) is a diagram showing the image with slit light captured in the state of FIG. 2(b).
[FIG. 8] FIGS. 8(a), 8(b) and 8(c) are diagrams for explaining the document posture calculation processing for lateral curvature.
[FIG. 9] A flowchart showing the plane conversion processing.
[FIG. 10] FIGS. 10(a), 10(b) and 10(c) are diagrams showing other examples of the arrangement of the slit light projection port.
[FIG. 11] A diagram for explaining the effect obtained when the slit light projection port is placed as far as possible from the imaging lens.
[FIG. 12] FIGS. 12(a) and 12(b) are diagrams showing the slit light projecting unit of the second embodiment; FIG. 12(a) shows the state in which the first slit light is projected, and FIG. 12(b) shows the state in which the second slit light is projected.
[FIG. 13] FIGS. 13(a) and 13(b) are diagrams showing the slit light projecting unit of the third embodiment; FIG. 13(a) shows a side view of the state in which the first slit light is projected together with a front view of the diffraction element, and FIG. 13(b) shows a side view of the state in which the second slit light is projected together with a front view of the diffraction element.
Explanation of Reference Numerals
[0013]
1 Imaging device (including the three-dimensional shape detection device)
20, 20a, 20b Slit light projecting unit (slit light projecting means)
21 Laser diode (light output means)
27a, 27b, 70 Cylindrical lens
29 Slit light projection port
31 Imaging lens (imaging means)
32 CCD image sensor (imaging means)
80 Diffraction element plate
80a First diffraction element
80b Second diffraction element
81 Solenoid
421 Camera control program (imaging means)
424 Document posture calculation program (three-dimensional shape calculation means)
425 Plane conversion program (plane image correction means)
BEST MODE FOR CARRYING OUT THE INVENTION
[0014] 以下、本発明の好ましい実施例について、添付図面を参照して説明する。図 1 (a) は本発明の実施形態による撮像装置 1の外観斜視図であり、図 1 (b)は図 1 (a)のト I 断面線における撮像装置 1の概略断面図である。尚、本発明実施形態による 3次元 形状検出装置は撮像装置 1に含まれる。 Hereinafter, preferred embodiments of the present invention will be described with reference to the accompanying drawings. FIG. 1A is an external perspective view of an imaging device 1 according to an embodiment of the present invention, and FIG. 1B is a schematic cross-sectional view of the imaging device 1 along a sectional line I in FIG. 1A. Note that the three-dimensional shape detection device according to the embodiment of the present invention is included in the imaging device 1.
[0015] The imaging device 1 is a device capable of detecting the three-dimensional shape, including the curvature, of a document P that is curved in either the vertical or the lateral direction at the same imaging position.

[0016] As shown in FIG. 1(a), the main outer shape of the imaging device 1 is formed by a rectangular box-shaped main body case 10. On the front surface of the main body case 10, an imaging lens 31, a finder 53 and a slit light projection port 29 are provided. On the upper surface of the main body case 10, a release button 52, a mode switching switch 59 and a bending direction switching switch 60 are provided. On a side surface of the main body case 10, a memory card insertion slot 55a is provided.
[0017] また、図 1 (b)に示すように、本体ケース 10の背面には、撮像した画像を表示する L CD (Liquid Crystal Display) 51が設けられている。本体ケース 10の内部には、 結像レンズ 31の後方に配置される CCD画像センサ 32と、スリット光投光口 29の後方 に配置されるスリット光投光ユニット 20と、メモリカードの差込口 55aの後方に配置さ れるメモリカード読取装置 55と、プロセッサ 40とが内蔵されている。  As shown in FIG. 1B, a liquid crystal display (LCD) 51 for displaying a captured image is provided on the rear surface of the main body case 10. Inside the main body case 10, a CCD image sensor 32 arranged behind the imaging lens 31, a slit light projecting unit 20 arranged behind the slit light projecting port 29, and a memory card insertion port A memory card reader 55 disposed behind 55a and a processor 40 are incorporated.
[0018] 結像レンズ 31は、複数枚のレンズで構成され、オートフォーカス機能を有する。結 像レンズ 31は、 自動で焦点距離及び絞りを調整して外部からの光を CCD画像セン サ 32上に結像する光学レンズである。  The imaging lens 31 is composed of a plurality of lenses and has an autofocus function. The imaging lens 31 is an optical lens that automatically adjusts the focal length and aperture to form an image of external light on the CCD image sensor 32.
[0019] ファインダ 53は、本体ケース 10の背面から前面を通して配設される光学レンズで構 成されている。この構成により、撮像装置 1の背面から使用者がのぞき込んだ時に、 結像レンズ 31が CCD画像センサ 32上に結像する範囲とほぼ一致する範囲が見える ようになつている。  The finder 53 is composed of an optical lens disposed from the back to the front of the main body case 10. With this configuration, when the user looks into the imaging device 1 from the back, a range that substantially matches the range in which the imaging lens 31 forms an image on the CCD image sensor 32 can be seen.
[0020] The slit light projection port 29 is an opening through which the slit light emitted by the slit light projecting unit 20 is projected toward the target object, and is arranged at the lower left corner of the front surface of the main body case 10. The slit light projected from the slit light projection port 29 forms a cross-shaped slit light trajectory on the document P. By detecting this slit light trajectory, the three-dimensional shape of the document P is detected.
[0021] レリーズボタン 52は、押しボタン式のスィッチで構成され、プロセッサ 40に接続され ている。プロセッサ 40にて使用者によるレリーズボタン 52の押し下げ操作が検知され る。  The release button 52 is formed by a push-button switch, and is connected to the processor 40. The processor 40 detects that the user presses down the release button 52.
[0022] The mode switching switch 59 is constituted by a slide switch or the like that can be switched between two positions, and is connected to the processor 40. The processor 40 detects one switch position of the mode switching switch 59 as the "normal mode" and the other switch position as the "corrected imaging mode".
[0023] 「ノーマルモード」は、撮像した原稿 Pのそのものを画像データとするモードである。  “Normal mode” is a mode in which the imaged original P itself is used as image data.
「補正撮像モード」は、原稿 Pを斜め方向から撮像場合に、その画像データを原稿 P を正面から撮像したような補正された画像データとするモードである。  The “corrected imaging mode” is a mode in which, when the document P is imaged from an oblique direction, the image data is corrected image data as if the document P was imaged from the front.
[0024] 湾曲方向切替スィッチ 60は、 2つの位置に切替え可能なスライドスィッチなどで構 成され、プロセッサ 40に接続されている。プロセッサ 40において、湾曲方向切替スィ ツチ 60の一方のスィッチ位置は「横方向湾曲モード」として、他方のスィッチ位置は「 縦方向湾曲モード」として検知される。  The bending direction switching switch 60 is constituted by a slide switch or the like that can be switched between two positions, and is connected to the processor 40. In the processor 40, one switch position of the bending direction switching switch 60 is detected as "lateral bending mode", and the other switch position is detected as "vertical bending mode".
[0025] The "lateral bending mode" is a mode for detecting the three-dimensional shape of a document P that is curved in the lateral direction with respect to the imaging device 1 (see the curved document P shown in FIG. 2(a)). The "vertical bending mode" is a mode for detecting the three-dimensional shape of a document P that is curved in the vertical direction with respect to the imaging device 1 (see the curved document P shown in FIG. 2(b)).
[0026] The memory card reader 55 is a device that reads image data stored on a memory card, which stores non-volatile, rewritable image data, and writes captured image data to that memory card.

[0027] The LCD 51 is constituted by a liquid crystal display or the like that displays images, and displays an image upon receiving an image signal from the processor 40. Depending on the situation, the processor 40 sends to the LCD 51 image signals for displaying the real-time image received by the CCD image sensor 32, an image stored on the memory card, characters representing the device settings, and the like.
[0028] CCD画像センサ 32は、 CCD (Charge Coupled Device)素子などの光電変換 素子がマトリクス状に配列されてなる構成を有する。 CCD画像センサ 32は、表面に 結像される画像の光の色及び強さに応じた信号を生成し、これをデジタルデータに 変換してプロセッサ 40へ出力する。尚、 CCD素子一つ分のデータが画像を形成す る画素の画素データであり、画像データは CCD素子の数の画素データで構成される  The CCD image sensor 32 has a configuration in which photoelectric conversion elements such as CCD (Charge Coupled Device) elements are arranged in a matrix. The CCD image sensor 32 generates a signal corresponding to the color and intensity of the light of the image formed on the surface, converts the signal into digital data, and outputs the digital data to the processor 40. The data for one CCD element is the pixel data of the pixels forming the image, and the image data is composed of the pixel data of the number of CCD elements.
[0029] FIG. 2(a) shows the case of imaging a document P that is curved in the lateral direction (X-axis direction) with respect to the imaging device 1, and FIG. 2(b) shows the case of imaging a document P that is curved in the vertical direction (Y-axis direction) with respect to the imaging device 1. That is, FIG. 2(a) shows imaging of the document P in the "lateral bending mode", and FIG. 2(b) shows imaging of the document P in the "vertical bending mode".

[0030] As shown in FIGS. 2(a) and 2(b), the first slit light 71 and the second slit light 72 are emitted from the slit light projection port 29 toward the document P as planar beams with a predetermined angular width.
[0031] 第 1スリット光 71は、結像レンズ 31の光軸 (Z軸)から離れた位置で、その光軸と平 行な平面に沿って延びるように投光される。原稿 P上には、第 1スリット光 71によって 第 1スリット光の軌跡 71aが形成される。 The first slit light 71 is projected at a position away from the optical axis (Z axis) of the imaging lens 31 so as to extend along a plane parallel to the optical axis. On the document P, a locus 71a of the first slit light is formed by the first slit light 71.
[0032] The second slit light 72 is projected so as to extend along a plane that is substantially orthogonal to the first slit light 71, passes through a position away from the optical axis (Z axis) of the imaging lens 31, and is parallel to that optical axis. On the document P, the second slit light 72 forms a trajectory 72a of the second slit light.
[0033] 即ち、第 1スリット光 71と第 2スリット光 72とは、十字にクロスするように投光され、原 稿 Pには第 1スリット光の軌跡 71aと第 2スリット光の軌跡 72aとによって十字形のスリツ ト光の軌跡が形成される。 That is, the first slit light 71 and the second slit light 72 are projected so as to cross each other, and the original P has a locus 71a of the first slit light and a locus 72a of the second slit light. This forms a trajectory of a cross-shaped slit light.
[0034] よって、図 2 (a)に示すように横方向に湾曲する原稿 Pを撮像する場合、第 1スリット 光の軌跡 71aが原稿 Pの湾曲に沿って形成されるので、この第 1スリット光の軌跡 71a を検出することで横方向に湾曲する原稿 Pの湾曲を算出することができる。 Accordingly, when capturing an image of a document P that is curved in the horizontal direction as shown in FIG. 2A, the first slit light path 71a is formed along the curve of the document P. By detecting the trajectory 71a of the light, it is possible to calculate the curvature of the original P that is curved in the horizontal direction.
[0035] 一方、図 2 (b)に示すように縦方向に湾曲する原稿 Pを撮像する場合、第 2スリット光 の軌跡 72aが原稿 Pの湾曲に沿って形成されるので、この第 2スリット光の軌跡 72aを 検出することで縦方向に湾曲する原稿 Pの湾曲を算出することができる。 On the other hand, when an image of a document P curving in the vertical direction is captured as shown in FIG. 2B, the trajectory 72a of the second slit light is formed along the curve of the document P. By detecting the trajectory 72a of the light, the curvature of the original P curving in the vertical direction can be calculated.
[0036] 図 3 (a)及び図 3 (b)は、スリット光投光ユニット 20の構成を示す図である。図 3 (a) は、スリット光投光ユニット 20の平面図であり、図 3 (b)はスリット光投光ユニット 20の 側面図である。 FIGS. 3A and 3B are views showing the configuration of the slit light projecting unit 20. FIG. FIG. 3A is a plan view of the slit light projecting unit 20, and FIG. 3B is a side view of the slit light projecting unit 20.
[0037] The slit light projecting unit 20 includes a laser diode 21, a collimating lens 22, a half mirror 23, two apertures 24a and 24b, a reflection mirror 26, and two cylindrical lenses 27a and 27b.
[0038] レーザーダイオード 21は、スリット光投光ユニット 20の最上流側に配置され、レーザ 一光線を放射する。レーザーダイオード 21は、プロセッサ 40からの指令に応じて、レ 一ザ一光線の放射及び停止を切り換える。 [0039] コリメートレンズ 22は、レーザーダイオード 21の下流側に配置され、レーザーダイォ ード 21からのレーザー光線を、スリット光投光ユニット 20からの基準距離 VP (例えば 330mm)に焦点を結ぶように集光する。 [0038] The laser diode 21 is arranged on the most upstream side of the slit light projecting unit 20, and emits one laser beam. The laser diode 21 switches between emission and stop of the laser beam in response to a command from the processor 40. The collimator lens 22 is disposed downstream of the laser diode 21 and focuses the laser beam from the laser diode 21 so as to focus on a reference distance VP (for example, 330 mm) from the slit light projecting unit 20. I do.
[0040] ハーフミラー 23は、コリメートレンズ 22の下流側であって、コリメートレンズ 22を通過 したレーザー光線の進行方向に対して略 45度傾けて配置されている。コリメートレン ズ 22を通過したレーザー光線は、ハーフミラー 23によって、ハーフミラー 23をそのま ま通過するレーザー光線と、略鉛直下向きに進行するレーザー光線とに分割される。  The half mirror 23 is located downstream of the collimating lens 22 and is inclined at approximately 45 degrees with respect to the traveling direction of the laser beam passing through the collimating lens 22. The laser beam that has passed through the collimating lens 22 is split by the half mirror 23 into a laser beam that passes through the half mirror 23 as it is and a laser beam that travels substantially vertically downward.
[0041] アパーチャ 24a, 24bは、矩形に開口された開口部を有する板で構成される。ァパ 一チヤ 24aはハーフミラー 23をそのまま通過するレーザー光線の下流側に、ァパー チヤ 24bはハーフミラー 23によって略鉛直下向き方向に進行するレーザー光線の下 流側に配置されている。ハーフミラー 23で分割された各レーザー光線は、このァパ 一チヤ 24a, 24bに形成された開口を通過することで撮像範囲内の仕様内で適切な スリット光幅として整形される。  [0041] The apertures 24a and 24b are each formed of a plate having a rectangular opening. The aperture 24a is disposed downstream of the laser beam passing through the half mirror 23 as it is, and the aperture 24b is disposed downstream of the laser beam traveling substantially vertically downward by the half mirror 23. Each laser beam split by the half mirror 23 passes through openings formed in the apertures 24a and 24b, and is shaped as an appropriate slit light width within specifications within an imaging range.
[0042] The reflection mirror 26 is formed of a member that totally reflects the laser beam, such as a mirror, and is arranged downstream of the slit light that has passed through the aperture 24b, inclined by approximately 45 degrees with respect to the traveling direction of that slit light. The slit light that has passed through the aperture 24b is totally reflected by the reflection mirror 26, and its traveling direction is changed by approximately 90 degrees.

[0043] The cylindrical lenses 27a and 27b are optical lenses formed in a so-called half-cylindrical shape obtained by splitting a cylinder in two along its axial direction; they converge or diverge light in the cross section that has curvature, and pass light unchanged in the cross section that has no curvature.
[0044] シリンドリカルレンズ 27aは、アパーチャ 24aの下流側であって、 XZ平面の断面視 において湾曲面が下流側に向くように配置されている。アパーチャ 24aを通過したス リット光は、このシリンドリカルレンズ 27aを通過することで、上述した第 1スリット光 71と してスリット光投光口 29を介して原稿 Pに向かって投光される。  [0044] The cylindrical lens 27a is arranged on the downstream side of the aperture 24a so that the curved surface faces the downstream side in a cross-sectional view on the XZ plane. The slit light that has passed through the aperture 24a passes through the cylindrical lens 27a, and is emitted as the above-described first slit light 71 toward the document P via the slit light emitting port 29.
[0045] 一方、シリンドリカルレンズ 27bは、反射ミラー 26の下流側であって、 YZ平面の断 面視において湾曲面が下流側に向くように配置されている。反射ミラー 26に反射さ れたスリット光は、このシリンドリカルレンズ 27bを通過することで、上述した第 2スリット 光 72としてスリット光投光口 29を介して原稿 Pに向かって投光される。 [0046] 図 4は、撮像装置 1の電気的構成を示したブロック図である。撮像装置 1に搭載され たプロセッサ 40は、 CPU41、 R〇M42、 RAM43を備えている。尚、上述した CCD 画像センサ 32、メモリカード読取装置 55、レリーズボタン 52、スリット光投光ユニット 2 0、 LCD51、モード切替スィッチ 59、湾曲方向切替スィッチ 60は、信号線を介して C PU41と接続されている。 [0045] On the other hand, the cylindrical lens 27b is arranged downstream of the reflection mirror 26 so that the curved surface faces the downstream in a cross-sectional view of the YZ plane. The slit light reflected by the reflection mirror 26 passes through the cylindrical lens 27b, and is emitted as the above-described second slit light 72 toward the document P via the slit light emission port 29. FIG. 4 is a block diagram showing an electrical configuration of the imaging device 1. The processor 40 mounted on the imaging device 1 includes a CPU 41, an R〇M 42, and a RAM 43. The above-described CCD image sensor 32, memory card reader 55, release button 52, slit light projecting unit 20, LCD51, mode switching switch 59, and bending direction switching switch 60 are connected to the CPU 41 via signal lines. Have been.
[0047] CPU41は、 ROM42に記憶されたプログラムによる処理に応じて、 RAM43を利用 して次のような処理を行う。すなわち、 CPU41による処理には、レリーズボタン 52の 押し下げ操作の検知、 CCD画像センサ 32から画像データの取り込み、画像データ のメモリカードへの書込み、モード切替スィッチ 59の状態検出、スリット光投光ュニッ ト 20によるスリット光の出射切替え等が含まれる。  The CPU 41 performs the following processing using the RAM 43 in accordance with the processing by the program stored in the ROM 42. That is, the processing performed by the CPU 41 includes detection of a pressing operation of the release button 52, capture of image data from the CCD image sensor 32, writing of image data to a memory card, detection of the state of the mode switching switch 59, detection of a slit light emitting unit, and the like. Switching of the emission of the slit light by 20 and the like are included.
[0048] ROM42には、カメラ制御プログラム 421と、差分抽出プログラム 422と、三角測量 演算プログラム 423と、原稿姿勢演算プログラム 424と、平面変換プログラム 425とが 格納されている。カメラ制御プログラム 421は、図 5に示すフローチャートの処理(詳 細は後述する。)を含む撮像装置 1全体の制御に関するプログラムである。差分抽出 プログラム 422は、スリット光を投光した原稿 Pを撮像したスリット光有画像と、スリット 光を投光していない原稿 Pを撮像したスリット光無画像との差分を取り、スリット光の軌 跡を抽出するためのプログラムである。三角測量演算プログラム 423は、差分抽出プ ログラム 422で抽出されたスリット光の軌跡の各画素に対する 3次元空間位置を演算 するためのプログラムである。原稿姿勢演算プログラム 424は、第 1スリット光の軌跡 7 la及び第 2スリット光の軌跡 72aの 3次元空間位置から、原稿 Pの 3次元形状を推定 して求めるプログラムである。平面変換プログラム 425は、原稿 Pの 3次元形状が与え られて、スリット光無画像格納部 432に格納された画像データを、原稿 Pの正面から 撮像したような画像に変換するためのプログラムである。  [0048] The ROM 42 stores a camera control program 421, a difference extraction program 422, a triangulation calculation program 423, a document attitude calculation program 424, and a plane conversion program 425. The camera control program 421 is a program relating to the control of the entire imaging apparatus 1 including the processing of the flowchart shown in FIG. 5 (details will be described later). The difference extraction program 422 calculates a difference between an image with slit light obtained by imaging the document P on which the slit light is projected and an image without slit light obtained by imaging the document P on which the slit light is not emitted, and determines the track of the slit light. This is a program for extracting traces. The triangulation calculation program 423 is a program for calculating a three-dimensional spatial position with respect to each pixel of the locus of the slit light extracted by the difference extraction program 422. The document attitude calculation program 424 is a program for estimating and obtaining the three-dimensional shape of the document P from the three-dimensional spatial positions of the locus 7 la of the first slit light and the locus 72 a of the second slit light. The plane conversion program 425 is a program for converting the image data, which is given the three-dimensional shape of the document P and stored in the slit light non-image storage unit 432, into an image as if it were taken from the front of the document P. .
[0049] In the RAM 43, the following storage areas are allocated: an image-with-slit-light storage section 431 and an image-without-slit-light storage section 432, each large enough to store data in the format of the image data from the CCD image sensor 32, and a difference image storage section 433 that stores the image data of the difference between the image with slit light and the image without slit light. The RAM 43 is further allocated a triangulation calculation result storage section 434 large enough to store the three-dimensional spatial positions of the pixels constituting the slit light trajectories 71a and 72a, a document posture calculation result storage section 435 large enough to store the three-dimensional shape of the document P, and a working area 436 of a size used for temporarily storing data for calculations in the CPU 41.
[0050] 次に、図 5のフローチャートを参照して、上述したように構成された撮像装置 1に関 し、使用者によりレリーズボタン 52が押されてからの動作について説明する。図 5は、 撮像装置 1のプロセッサ 40での処理手順を示すフローチャートである。  Next, with reference to the flowchart of FIG. 5, the operation of the imaging device 1 configured as described above after the user presses the release button 52 will be described. FIG. 5 is a flowchart showing a processing procedure in the processor 40 of the imaging device 1.
[0051] When the user presses the release button 52, first, the position of the mode switching switch 59 is detected, and it is determined whether the switch is in the "corrected imaging mode" position (S110). If it is in the "corrected imaging mode" position (S110: Yes), the process proceeds to step S120. In step S120, the laser diode 21 is instructed to emit light, the first slit light 71 and the second slit light 72 are projected from the slit light projecting unit 20, and image data expressed as RGB values is then acquired from the CCD image sensor 32 as the image with slit light. This image data is read into the image-with-slit-light storage section 431 (S120).

[0052] Next, the laser diode 21 is instructed to stop emitting light, and after the first slit light 71 and the second slit light 72 are no longer projected from the slit light projecting unit 20, image data expressed as RGB values is acquired from the CCD image sensor 32 as the image without slit light. This image data is read into the image-without-slit-light storage section 432 (S130).

[0053] Next, as the difference extraction processing (S140), the difference extraction program 422 generates image data representing the difference between the image data in the image-with-slit-light storage section 431 and the image data in the image-without-slit-light storage section 432, that is, image data in which the trajectory 71a of the first slit light and the trajectory 72a of the second slit light projected onto the document P are extracted, and this image data is read into the difference image storage section 433.
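As a concrete illustration of the difference extraction step (S140), the following is a minimal sketch in Python/NumPy, assuming two 8-bit RGB frames captured from the same viewpoint, one with and one without the slit light. The function name and the fixed threshold are illustrative assumptions and do not appear in the embodiment.

```python
import numpy as np

def extract_slit_trace(image_with_slit: np.ndarray,
                       image_without_slit: np.ndarray,
                       threshold: int = 30) -> np.ndarray:
    """Return a binary mask of the slit-light trajectories.

    Both inputs are H x W x 3 RGB arrays taken from the same viewpoint;
    only the laser illumination differs between the two exposures.
    """
    # Per-pixel difference; the slit trajectories are the only regions
    # that change between the two captures.
    diff = image_with_slit.astype(np.int16) - image_without_slit.astype(np.int16)
    # Summing the positive part of all channels keeps the sketch generic
    # with respect to the laser colour.
    intensity = np.clip(diff, 0, None).sum(axis=2)
    return intensity > threshold
```

A practical implementation would additionally suppress sensor noise and thin the mask before triangulation, but the per-pixel difference is the essential operation described in this paragraph.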
[0054] 次に、三角測量演算処理として(S150)、三角測量演算プログラム 423によって、 第 1スリット光の軌跡 71a及び第 2スリット光の軌跡 72aの画素毎の 3次元空間位置が 演算され、その演算結果がそれぞれ三角測量演算結果格納部 434へ読み込まれる  Next, as triangulation calculation processing (S 150), the three-dimensional spatial position of each pixel of the trajectory 71 a of the first slit light and the trajectory 72 a of the second slit light is calculated by the triangulation calculation program 423. The calculation results are read into the triangulation calculation result storage unit 434, respectively.
[0055] Next, the position of the bending direction switching switch 60 is detected, and it is determined whether the switch is in the "lateral bending mode" position (S160). If it is determined to be in the "lateral bending mode" position (S160: Yes), the process proceeds to step S170a. In step S170a, as the document posture calculation processing for the lateral direction, the document posture calculation program 424 calculates the three-dimensional shape of the document P based on the three-dimensional spatial positions of the slit light trajectories stored in the triangulation calculation result storage section 434, and the calculation result is read into the document posture calculation result storage section 435.
[0056] 一方、湾曲方向切替スィッチ 60の位置が「横方向湾曲モード」の位置にない場合( S160 : No)、即ち、湾曲方向切替スィッチ 60の位置が「縦方向湾曲モード」の位置 である場合には、処理はステップ S170bに進む。ステップ S170bでは、縦方向用原 稿姿勢演算処理として、横方向用原稿姿勢演算処理 (S170a)と同様に、原稿姿勢 演算プログラム 424により、三角測量演算結果格納部 434に格納されている各スリツ ト光の 3次元空間位置に基づいて、原稿 Pの 3次元形状を演算され、その演算結果が 原稿姿勢演算結果格納部 435へ読み込まれる。  On the other hand, when the position of the bending direction switching switch 60 is not in the “horizontal bending mode” position (S160: No), that is, the position of the bending direction switching switch 60 is the position in the “vertical bending mode”. In this case, the process proceeds to step S170b. In step S170b, each of the slits stored in the triangulation calculation result storage unit 434 by the document attitude calculation program 424 is used as the vertical document attitude calculation processing, similarly to the horizontal document attitude calculation processing (S170a). The three-dimensional shape of the document P is calculated based on the three-dimensional spatial position of the light, and the calculation result is read into the document posture calculation result storage unit 435.
[0057] 次に、平面変換処理として(S180)、平面変換プログラム 425により、原稿姿勢演 算結果格納部 435へ読込まれた 3次元形状データから、スリット光無画像格納部 43 2に記憶された画像データは、正面から観察されたような画像の画像データに変換さ れる。そして、生成された正立画像の画像データがメモリカードへ書き込まれ(S190 )、当該処理は終了する。  Next, as a plane conversion process (S180), the three-dimensional shape data read into the document posture calculation result storage section 435 by the plane conversion program 425 is stored in the slit light non-image storage section 432. The image data is converted into image data of an image as viewed from the front. Then, the image data of the generated erect image is written to the memory card (S190), and the process ends.
[0058] If the result of the determination in S110 is the "normal mode" position rather than the "corrected imaging mode" position (S110: No), the laser diode 21 of the slit light projecting unit 20 does not emit light, and an image without slit light is read from the CCD image sensor 32 into the image-without-slit-light storage section 432 while neither the first slit light 71 nor the second slit light 72 is projected (S200). The image data is then written to the memory card (S210), and the processing ends.
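The overall control flow of FIG. 5 (S110-S210) can be summarised by the sketch below. The `device` object and the helper names are hypothetical placeholders standing in for the camera control program 421 and the other programs stored in the ROM 42; only the ordering of steps follows the flowchart.

```python
def on_release_button(device):
    """Sketch of the control flow of FIG. 5, with placeholder helpers."""
    if device.mode_switch != "corrected":                # S110: normal mode
        image = device.capture(slit=False)               # S200
        device.write_to_card(image)                      # S210
        return
    with_slit = device.capture(slit=True)                # S120
    without_slit = device.capture(slit=False)            # S130
    traces = extract_slit_trace(with_slit, without_slit)             # S140
    points_3d = triangulate(traces)                                   # S150
    if device.curve_switch == "lateral":                              # S160
        shape = posture_lateral(points_3d)                            # S170a
    else:
        shape = posture_vertical(points_3d)                           # S170b
    erect = plane_convert(without_slit, shape)                        # S180
    device.write_to_card(erect)                                       # S190
```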
[0059] FIGS. 6(a) and 6(b) are diagrams for explaining the triangulation calculation processing described above (S150 in FIG. 5). In FIGS. 6(a) and 6(b), the coordinate system of the imaging device 1 with respect to the document P imaged as shown in FIG. 2(a) is defined as follows: the Z axis is the direction of the optical axis of the imaging lens 31, the origin of the X, Y and Z axes is the position at the reference distance VP from the imaging device 1, the X axis is the horizontal direction with respect to the imaging device 1, and the Y axis is the vertical direction.
[0060] また、 CCD画像センサ 32の X軸方向の画素数を ResX、 Y軸方向の画素数を Res Yとしている。また、 X— Y平面に結像レンズ 31を通して CCD画像センサ 32を投影し た位置の上端を Yftop、下端を Yfbottom、左端を Xfstart、右端を Xf endとしている The number of pixels in the X-axis direction of the CCD image sensor 32 is defined as ResX, and the number of pixels in the Y-axis direction is defined as Res Y. The upper end of the position where the CCD image sensor 32 is projected on the XY plane through the imaging lens 31 is Yftop, the lower end is Yfbottom, the left end is Xfstart, and the right end is Xfend.
[0061] また、結像レンズ 31の光軸と第 1スリット光 71の光軸との間の距離を D、第 1スリット 光 71が X— Y平面に交差する Y軸方向の位置を las 1、第 2スリット光 72が X— Y平面に 交差する Y軸方向の位置を las2とする。 The distance between the optical axis of the imaging lens 31 and the optical axis of the first slit light 71 is D, and the position of the first slit light 71 in the Y-axis direction at which the first slit light 71 intersects the XY plane is las 1. Let las2 be the position in the Y-axis direction where the second slit light 72 intersects the XY plane.
[0062] In this case, the three-dimensional spatial position (X1, Y1, Z1) corresponding to the coordinates (ccdx1, ccdy1) on the CCD image sensor 32 of a point of interest 1, taken at one of the pixels of the image of the trajectory 71a of the first slit light, is derived from the solution of the following five simultaneous equations, set up for the triangle formed by the point on the imaging surface of the CCD image sensor 32, the emission point of the first slit light 71 and the second slit light 72, and the point of intersection with the X-Y plane.
(1) Y1 = -((las1 + D)/VP) × Z1 + las1
(2) Y1 = -(Y1target/VP) × Z1 + Y1target
(3) X1 = -(X1target/VP) × Z1 + X1target
(4) X1target = Xfstart + (ccdx1/ResX) × (Xfend - Xfstart)
(5) Y1target = Yftop - (ccdy1/ResY) × (Yftop - Yfbottom)
In this embodiment, since the first slit light 71 is parallel to the X-Z plane, las1 = -D and Y1 = -D.
[0063] Similarly, the three-dimensional spatial position (X2, Y2, Z2) corresponding to the coordinates (ccdx2, ccdy2) on the CCD image sensor 32 of a point of interest 2, taken at one of the pixels of the image of the trajectory 72a of the second slit light, is derived from the solution of the following five simultaneous equations.
(6) X2 = -((las2 + D)/VP) × Z2 + las2
(7) X2 = -(X2target/VP) × Z2 + X2target
(8) Y2 = -(Y2target/VP) × Z2 + Y2target
(9) Ytarget = Ybottom + (ccdx2/ResX) × (Ytop - Ybottom)
(10) X2target = Xend - (ccdy2/ResY) × (Xend - Xtop)
In this embodiment, since the second slit light 72 is parallel to the Y-Z plane, las2 = D and X2 = D.
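For the first slit light, equations (1) to (5) can be solved in closed form for each pixel of the trajectory 71a. The sketch below assumes that the symbol in equation (4) denotes X1target and that the parameters carry the meanings defined with FIG. 6; the function itself is an illustrative assumption, not part of the embodiment.

```python
def triangulate_first_slit(ccdx1: float, ccdy1: float,
                           VP: float, D: float, las1: float,
                           ResX: int, ResY: int,
                           Xfstart: float, Xfend: float,
                           Yftop: float, Yfbottom: float):
    """Solve equations (1)-(5) for one pixel of the first slit trajectory."""
    # (4), (5): project the CCD pixel onto the X-Y reference plane (Z = 0).
    X1target = Xfstart + (ccdx1 / ResX) * (Xfend - Xfstart)
    Y1target = Yftop - (ccdy1 / ResY) * (Yftop - Yfbottom)
    # Equating (1) and (2) and solving for Z1:
    Z1 = VP * (Y1target - las1) / (Y1target - las1 - D)
    # Back-substitute into (2) and (3).
    Y1 = Y1target * (1.0 - Z1 / VP)
    X1 = X1target * (1.0 - Z1 / VP)
    return X1, Y1, Z1
```

With las1 = -D, as in this embodiment, Y1 evaluates to -D for every pixel, which matches the remark above.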
[0064] 次に、図 7 (a)、および図 8 (a)から図 8 (c)を参照して、上述した横方向用原稿姿勢 演算処理(図 5の S170a)について説明する。図 7 (a)は図 2 (a)の状態において撮像 されるスリット光有画像を示す図である。図 8 (a)から図 8 (c)は、横方向用原稿姿勢 演算処理の際の座標系を説明するための図である。  Next, with reference to FIG. 7 (a) and FIGS. 8 (a) to 8 (c), a description will be given of the above-described document orientation calculation processing for the horizontal direction (S170a in FIG. 5). FIG. 7 (a) is a diagram showing a slit light image captured in the state of FIG. 2 (a). 8 (a) to 8 (c) are diagrams for explaining a coordinate system at the time of the horizontal document orientation calculation processing.
[0065] In this document posture calculation processing for the lateral direction (S170a), first, from the three-dimensional spatial positions of the individual pixels of the slit light trajectories 71a and 72a stored in the triangulation calculation result storage section 434 by the triangulation calculation processing (S150 in FIG. 5), two arbitrary points (for example, points P1 and P4) on the trajectory 72a of the second slit light shown in FIG. 7(a) are extracted. Next, from the three-dimensional spatial positions of these two points, the inclination θ about the X axis is obtained as shown in FIG. 8(a).

[0066] Next, based on the obtained inclination θ and the intersection point P2 between the trajectory 71a of the first slit light and the Y-Z plane, the Z coordinate value L of the point P(X, Y, Z) = (0, 0, L) is obtained.

[0067] Next, as shown in FIG. 8(b), a line obtained by approximating the trajectory 71a of the first slit light with a regression curve is rotated back about the X axis by the previously obtained inclination θ, that is, the document P is considered in a state parallel to the X-Y plane. Then, as shown in FIG. 8(c), for the cross-sectional shape of the document P in the X-axis direction, the displacement in the Z-axis direction of the cross section of the document P in the X-Z plane is obtained at a plurality of points in the X-axis direction, and from these displacements the curvature φ(X), a function giving the inclination in the X-axis direction with the X-axis position as a variable, is obtained. In this way, the inclination θ, the distance L and the curvature φ(X) are obtained as the three-dimensional shape of the document P.

[0068] FIG. 7(b) shows the image with slit light captured in the state of FIG. 2(b). In the document posture calculation processing for the vertical direction (S170b), processing similar to the document posture calculation processing for the lateral direction (S170a) is performed, so that, as in the lateral case, the curvature θ, the distance L and the inclination φ are obtained as the three-dimensional shape of the document P. In this case, however, θ is not an angle but is expressed as the curve Z = θ(Y), and φ is expressed as an angle with respect to the X axis.
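A compact sketch of the lateral posture calculation is given below, assuming the slit trajectories have already been triangulated into arrays of (X, Y, Z) points. The cubic polynomial fit stands in for the regression-curve approximation of the embodiment, and the way L is extrapolated from P2 is one possible reading of paragraph [0066]; the function and its names are illustrative only.

```python
import numpy as np

def posture_lateral(trace1: np.ndarray, trace2: np.ndarray):
    """Estimate (theta, L, phi) for a laterally curved document.

    trace1, trace2: N x 3 arrays of (X, Y, Z) points sampled on the
    trajectories 71a and 72a of the first and second slit light.
    """
    # Inclination theta about the X axis from two well-separated points
    # of trace 72a (e.g. P1 and P4 in FIG. 7(a)).
    (_, ya, za), (_, yb, zb) = trace2[0], trace2[-1]
    theta = np.arctan2(zb - za, yb - ya)

    # Z coordinate L of the point (0, 0, L): extend the tilted document
    # plane through P2, the trace-71a point closest to the Y-Z plane.
    p2 = trace1[np.argmin(np.abs(trace1[:, 0]))]
    L = p2[2] + np.tan(theta) * p2[1]

    # Rotate trace 71a back by -theta about the X axis so the document
    # lies parallel to the X-Y plane, then fit Z as a function of X.
    z_flat = -np.sin(theta) * trace1[:, 1] + np.cos(theta) * trace1[:, 2]
    coeffs = np.polyfit(trace1[:, 0], z_flat, deg=3)
    phi = np.polyder(np.poly1d(coeffs))   # slope dZ/dX at any X position
    return theta, L, phi
```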
[0069] Next, the plane conversion processing described above (S180 in FIG. 5) will be described with reference to the flowchart of FIG. 9. For simplicity, the flowchart shows the case in which neither θ nor φ describes a curved surface. For the plane conversion processing in the curved case, see "Curved document imaging with an eye scanner", Transactions of the IEICE, D-II, Vol. J86-D2, No. 3, p. 409.

[0070] In the plane conversion processing, first, a processing area for this processing is allocated in the working area 436 of the RAM 43, and the initial values of the variables used in the processing, such as the variable for the counter b, are set (S1300).

[0071] Next, the region of the erect image, that is, the image obtained when the surface of the document P on which characters and the like are written is observed from a substantially perpendicular direction, is set by transforming the four corner points of the image without slit light based on the three-dimensional spatial position (0, 0, L) of the document P, the inclination θ about the X axis and the curvature φ(X) obtained by the document posture calculation program 424, and the number a of pixels contained in this region is obtained (S1301).

[0072] Next, it is determined whether the value of the counter b has reached the number a of pixels in the region (S1302). If the value of the counter b has not yet reached the number a of pixels (S1302: No), the region of the set erect image is first placed on the X-Y plane, and for each pixel contained in it, its three-dimensional spatial position is displaced in the Z-axis direction based on the curvature φ(X) (S1303), rotated about the X axis by the inclination θ (S1304), and shifted in the Z-axis direction by the distance L (S1305).

[0073] Next, the obtained three-dimensional spatial position is converted into coordinates (ccdcx, ccdcy) on the CCD image as captured by an ideal camera, using the triangulation relational expressions given above (S1306); these are then converted into coordinates (ccdx, ccdy) on the CCD image as captured by the actual camera by a known calibration method, in accordance with the aberration characteristics of the imaging lens 31 in use (S1307); and the state of the pixel of the image without slit light at this position is obtained and stored in the working area 436 of the RAM 43 (S1308).

[0074] When the processing up to this point is finished, "1" is added to the counter b (S1309), and the processing from S1302 is repeated, thereby generating the image data of the erect image. When the value of the counter b reaches the number a of pixels in the determination of S1302 (S1302: Yes), the processing area allocated in S1300 is released (S1310), and the processing ends.

[0075] Although in S160 the document posture calculation processing is branched by detecting the switch position of the bending direction switching switch 60, the trajectory 71a of the first slit light and the trajectory 72a of the second slit light may instead be obtained in advance, and the processing may be branched according to whether the trajectory is curved in the lateral direction or in the vertical direction. In this case, the bending direction switching switch 60 becomes unnecessary, and the operability of the device can be improved.
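The per-pixel loop S1301 to S1309 can be illustrated for the simplified flat-document case mentioned at the start of this flowchart, in which the curvature is zero (so S1303 is a no-op) and the lens-aberration calibration of S1307 is skipped. All function and parameter names below are illustrative assumptions; the image is taken to have ResX by ResY pixels, so its width and height play those roles.

```python
import numpy as np

def plane_convert(slitless: np.ndarray, theta: float, L: float,
                  VP: float, Xfstart: float, Xfend: float,
                  Yftop: float, Yfbottom: float) -> np.ndarray:
    """Map a flat, tilted document back to a front-on (erect) view.

    slitless: H x W x 3 image captured without slit light.
    """
    H, W = slitless.shape[:2]
    erect = np.zeros_like(slitless)
    xs = np.linspace(Xfstart, Xfend, W)    # erect-image sample points on the
    ys = np.linspace(Yftop, Yfbottom, H)   # X-Y plane, one per output pixel
    cos_t, sin_t = np.cos(theta), np.sin(theta)
    for row, y in enumerate(ys):
        for col, x in enumerate(xs):
            # S1304, S1305: rotate the point about the X axis by theta,
            # then shift it along the Z axis by L.
            X, Y, Z = x, y * cos_t, y * sin_t + L
            # S1306: ideal-camera projection through the lens at (0, 0, VP)
            # back onto the X-Y reference plane (equations (2), (3) inverted).
            s = VP / (VP - Z)
            Xt, Yt = X * s, Y * s
            # Convert the plane position to CCD pixel indices by inverting
            # equations (4) and (5).
            ccdx = int((Xt - Xfstart) / (Xfend - Xfstart) * (W - 1))
            ccdy = int((Yftop - Yt) / (Yftop - Yfbottom) * (H - 1))
            if 0 <= ccdx < W and 0 <= ccdy < H:
                erect[row, col] = slitless[ccdy, ccdx]
    return erect
```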
[0076] Each of FIGS. 10(a) to 10(c) shows another example of the arrangement position of the slit light projection port 29 on the front surface of the main body case 10. That is, whereas in the example described above the slit light projection port 29 is arranged at the lower left corner of the front surface of the main body case 10, it is arranged at the upper left corner in FIG. 10(a), at the upper right corner in FIG. 10(b), and at the lower right corner in FIG. 10(c).

[0077] That is, when the imaging lens 31 is arranged at the substantially central portion of the front surface of the main body case 10 as in this embodiment, the slit light projection port 29 must be arranged at a position that does not lie on the vertical or horizontal line through the imaging lens 31. As a result, the slit lights 71 and 72 projected from the slit light projection port 29 do not overlap the optical axis of the imaging lens 31, and the three-dimensional spatial positions of the trajectory 71a of the first slit light and the trajectory 72a of the second slit light calculated in the triangulation calculation processing described above (S150 in FIG. 5) can be obtained with high accuracy. It is also preferable to arrange the slit light projection port 29 as far away from the imaging lens 31 as possible within the front surface region.
[0078] 図 11は、スリット光投光ロ 29を結像レンズ 31から出来るだけ離した位置に配置す ることが好ましいことを説明するための図である。ここでは、スリット光投光ロ 29を結像 レンズ 31から Y軸方向に aだけ離した位置 Aに配置した場合と、 bだけ離した位置 Bに 配置した場合とについて説明する。  FIG. 11 is a diagram for explaining that it is preferable to dispose the slit light projecting unit 29 at a position as far away from the imaging lens 31 as possible. Here, a description will be given of a case where the slit light projecting unit 29 is arranged at a position A which is separated from the imaging lens 31 by a in the Y-axis direction and a case where it is arranged at a position B which is separated by b.
[0079] まず、原稿 Pが Y軸に対して傾斜していない場合を考える。この場合、位置 Aに配 置されたスリット光投光口 29から投光される第 2スリット光 72はポイント P2に到達し、 位置 Bに配置されたスリット光投光口 29から投光される第 2スリット光 72はポイント P3 に到達することになる。  First, consider a case where the document P is not inclined with respect to the Y axis. In this case, the second slit light 72 emitted from the slit light emitting port 29 arranged at the position A reaches the point P2 and is emitted from the slit light emitting port 29 arranged at the position B. The second slit light 72 reaches the point P3.
[0080] On the other hand, when the document P is inclined at a predetermined angle with respect to the Y axis, the second slit light 72 projected from the slit light projection port 29 at position A and the second slit light 72 projected from the slit light projection port 29 at position B both reach the same point (Y1, Z1) on the document P; and if the document P inclined with respect to the Y axis is raised so as to become parallel to the Y axis, the point (Y1, Z1) on the document P corresponds to point P1.
[0081] 即ち、ポイント P1を演算により求めるために必要なポイント P3からポイント P1までの 変化量は、ポイント P2からポイント P1までの変化量より大きいので、その変化量の検 出をより高精度に算出することができ、ひいては、高精度に原稿 Pの 3次元形状を算 出すること力 Sできる。よって、スリット光投光ロ 29は、前面の領域内で出来るだけ結像 レンズ 31から離して配置することが好ましい。尚、これは X軸方向についても同様な 事が言える。  That is, since the amount of change from point P3 to point P1 required to obtain point P1 by calculation is larger than the amount of change from point P2 to point P1, detection of the amount of change is more accurately performed. It is possible to calculate the three-dimensional shape of the document P with high accuracy. Therefore, it is preferable that the slit light projecting unit 29 be disposed as far away from the imaging lens 31 as possible within the front area. The same applies to the X-axis direction.
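The advantage of a large offset can be quantified with a deliberately simplified model in which a single slit ray travels parallel to the optical axis at an offset D below it and the document is tilted by an angle about the X axis through the origin of FIG. 6. This is only an illustration of the trend argued above, not the geometry of FIG. 11 itself; the numerical offsets are arbitrary examples, and VP = 330 mm is the reference distance mentioned earlier.

```python
import numpy as np

def observed_shift(D: float, theta_deg: float, VP: float = 330.0) -> float:
    """Shift, on the X-Y reference plane, of the point where a slit ray
    parallel to the optical axis at offset D meets the document, caused
    by tilting the document about the X axis by theta_deg degrees."""
    t = np.tan(np.radians(theta_deg))
    z_hit = D * t                        # where the ray meets the tilted page
    y_seen = -D * VP / (VP - z_hit)      # line of sight projected back to Z = 0
    return abs(y_seen + D)               # deviation from the untilted case

for D in (20.0, 60.0):                   # two hypothetical port offsets in mm
    print(f"D = {D:4.0f} mm -> shift = {observed_shift(D, 20.0):.2f} mm")
```

Under these assumptions the observed shift grows rapidly with D, which is consistent with the preference for placing the projection port far from the imaging lens.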
[0082] 以上説明したとおり、上述した撮像装置 1によれば、使用者は、モード切替スィッチ 59を「補正撮像モード」側に切替え、撮像位置に対する原稿 Pの湾曲方向を確認し て湾曲方向切替スィッチ 60を設定し、ファインダ 53、又は、 LCD51で原稿 Pの所望 の範囲が撮像範囲に入っているか確認し、レリーズボタン 52を押して画像を撮影す る。これによつて、例えば図 2 (a)に示すように撮像した原稿の湾曲は、その湾曲に沿 つて形成される第 1スリット光 71aの軌跡に基づき算出される。  As described above, according to the imaging apparatus 1 described above, the user switches the mode switching switch 59 to the “correction imaging mode” side, checks the bending direction of the document P with respect to the imaging position, and switches the bending direction. Set the switch 60, check that the desired range of the document P is within the imaging range with the viewfinder 53 or the LCD 51, and press the release button 52 to take an image. Thereby, for example, as shown in FIG. 2A, the curvature of the imaged document is calculated based on the trajectory of the first slit light 71a formed along the curvature.
[0083] また、図 2 (b)に示すように撮像した原稿の湾曲は、その湾曲に沿って形成される第 2スリット光 72aの軌跡に基づき算出される。よって、交差する方向に湾曲する原稿を 所定の位置から撮像したとしても、その原稿の湾曲を含む 3次元形状を検出すること ができる。  As shown in FIG. 2 (b), the curvature of the imaged document is calculated based on the trajectory of the second slit light 72a formed along the curvature. Therefore, even if a document curving in the intersecting direction is imaged from a predetermined position, a three-dimensional shape including the curve of the document can be detected.
[0084] 図 12 (a)および図 12 (b)は、上述した第 1実施例のスリット光投光ユニット 20に関 する第 2実施例のスリット光投光ユニット 20aを示す側面図である。図 12 (a)は第 1ス リット光 71を投光している状態を示し、図 12 (b)は第 2スリット光 72を投光している状 態を示している。尚、第 1実施例のスリット光投光ユニット 20と同一の部材については 、同一の符号を付して、その説明を省略する。  FIGS. 12 (a) and 12 (b) are side views showing the slit light projecting unit 20a of the second embodiment with respect to the slit light projecting unit 20 of the first embodiment described above. FIG. 12 (a) shows a state in which the first slit light 71 is projected, and FIG. 12 (b) shows a state in which the second slit light 72 is projected. Note that the same members as those of the slit light projecting unit 20 of the first embodiment are denoted by the same reference numerals, and description thereof will be omitted.
[0085] Whereas the slit light projecting unit 20 of the first embodiment projects the first and second slit lights 71 and 72 with two cylindrical lenses 27a and 27b, respectively, the slit light projecting unit 20a of the second embodiment rotates a single cylindrical lens 70 and projects the first slit light 71 in a first state and the second slit light 72 in a second state.
[0086] 第 2実施例のスリット光投光ユニット 20aは、上流側から順番に、レーザーダイォー ド 21と、コリメートレンズ 22と、アパーチャ 24と、シリンドリカルレンズ 70とを備えている  [0086] The slit light projecting unit 20a of the second embodiment includes a laser diode 21, a collimating lens 22, an aperture 24, and a cylindrical lens 70 in order from the upstream side.
[0087] The cylindrical lens 70 is configured to be rotatable in place by a driving device (not shown); in the first state it is arranged so that no curved surface appears in a cross-sectional view in the Y-Z plane, as shown in FIG. 12(a), and in the second state it is arranged so that the curved surface faces downstream in a cross-sectional view in the Y-Z plane, as shown in FIG. 12(b).
[0088] このようにシリンドリカルレンズ 70を第 1の状態と第 2の状態とに回転可能構成する ことで、シリンドリカルレンズ 70が第 1の状態に配置されている場合にレーザーダイォ ード 21からレーザー光線を発射すれば、そのレーザー光線はコリメートレンズ 22、ァ パーチヤ 24を通過してスリット光に変換される。更に、そのスリット光はシリンドリカル レンズ 70を通過することで、所定の角度幅で Y軸と直交 (X軸と平行)する方向に延 びる平面状の第 1スリット光 71として投光される。  [0088] By configuring the cylindrical lens 70 to be rotatable between the first state and the second state in this manner, when the cylindrical lens 70 is disposed in the first state, the laser beam is emitted from the laser diode 21. When emitted, the laser beam passes through a collimating lens 22 and aperture 24 and is converted into slit light. Further, the slit light passes through the cylindrical lens 70 and is emitted as a first planar slit light 71 extending in a direction orthogonal to the Y axis (parallel to the X axis) with a predetermined angular width.
[0089] 一方、その後、第 1の状態にあるシリンドリカルレンズ 70を第 2の状態になるように回 転移動させ、シリンドリカルレンズ 70が第 2の状態に配置された場合に、レーザーダ ィオード 21からレーザー光線を発射すれば、そのレーザー光線はコリメートレンズ 22 、アパーチャ 24を通過してスリット光に変換される。更に、そのスリット光はシリンドリカ ルレンズ 70を通過することで、所定の角度幅で Y軸と平行に延びる平面状の第 2スリ ット光 72として投光される。  On the other hand, after that, the cylindrical lens 70 in the first state is rotated so as to be in the second state, and when the cylindrical lens 70 is arranged in the second state, the laser beam is emitted from the laser diode 21. Is emitted, the laser beam passes through a collimating lens 22 and an aperture 24 and is converted into slit light. Further, the slit light passes through the cylindrical lens 70 and is projected as a planar second slit light 72 extending in a predetermined angular width in parallel with the Y axis.
[0090] また、この第 2実施例のスリット光投光ユニット 20bを搭載した撮像装置 1では、第 1 の状態において第 1の撮像を実行し、第 2の状態において第 2の撮像を実行する。  Further, in the imaging apparatus 1 equipped with the slit light projecting unit 20b of the second embodiment, the first imaging is executed in the first state, and the second imaging is executed in the second state. .
[0091] この構成により、第 1の撮像によって取得する第 1スリット光有画像とスリット光無画 像との差分をとることで第 1スリット光の軌跡 71aが検出され。第 2の撮像によって取得 する第 2スリット光有画像とスリット光無画像との差分をとることで第 2スリット光の軌跡 72aが検出される。  [0091] With this configuration, the trajectory 71a of the first slit light is detected by calculating the difference between the first slit light image and the slit light non-image obtained by the first imaging. The trajectory 72a of the second slit light is detected by calculating the difference between the second image with slit light and the image without slit light acquired by the second imaging.
[0092] 尚、スリット光無画像を撮像するタイミングとして、第 1の撮像と第 2の撮像との間に 実行するように構成すれば、効率良くスリット光無画像を取得することができる。 [0093] 以上説明した通り、この第 2実施例のスリット光投光ユニット 20bによれば、第 1実施 例のスリット光投光ユニット 20と比べて、必要なシリンドリカルレンズの個数を減らすこ とができるので、部品点数の削減により、撮像装置 1の製造コストを低減することがで きる。 [0092] Note that if the timing for capturing the slit lightless image is configured to be executed between the first imaging and the second imaging, the slit lightless image can be obtained efficiently. As described above, according to the slit light projecting unit 20b of the second embodiment, the required number of cylindrical lenses can be reduced as compared with the slit light projecting unit 20 of the first embodiment. Since the number of parts can be reduced, the manufacturing cost of the imaging device 1 can be reduced.
[0094] 図 13 (a)および図 13 (b)は、上述した第 1実施例のスリット光投光ユニット 20に関 する第 3実施例のスリット光投光ユニット 20bを示す図である。図 13 (a)は第 1スリット 光 71を投光している状態を示す側面図と回折素子 80の正面図とを示す。図 13 (b) は第 2スリット光 72を投光している状態を示す側面図と回折素子 80の正面図とを示し ている。尚、第 1実施例のスリット光投光ユニット 20と同一の部材については、同一の 符号を付して、その説明を省略する。  FIGS. 13 (a) and 13 (b) are views showing the slit light projecting unit 20b of the third embodiment related to the slit light projecting unit 20 of the first embodiment described above. FIG. 13A shows a side view showing a state where the first slit light 71 is projected and a front view of the diffraction element 80. FIG. 13B shows a side view showing a state in which the second slit light 72 is projected and a front view of the diffraction element 80. Note that the same members as those of the slit light projecting unit 20 of the first embodiment are denoted by the same reference numerals, and description thereof will be omitted.
[0095] 第 1実施例のスリット光投光ユニット 20は、 2つのシリンドリカルレンズ 27a, 27bを利 用して、第 1、第 2スリット光 71 , 72を投光するものであるのに対し、この第 3実施例の スリット光投光ユニット 20bは、上下方向に移動可能に配置された回折素子板 80を 利用して第 1、第 2スリット光 71 , 72を投光するものである。  The slit light projecting unit 20 of the first embodiment projects the first and second slit lights 71 and 72 by using two cylindrical lenses 27a and 27b. The slit light projecting unit 20b of the third embodiment projects the first and second slit lights 71, 72 by using a diffraction element plate 80 which is movably arranged in the vertical direction.
[0096] The slit light projecting unit 20b of the third embodiment includes, in order from the upstream side, the laser diode 21, the collimating lens 22, the aperture 24, and a diffraction element plate 80 formed in a plate shape at a position facing the aperture 24. A solenoid 81 is connected to the diffraction element plate 80.
[0097] On the diffraction element plate 80, a plurality of first diffraction elements 80a extending parallel to the X-axis direction are formed on roughly the upper half of its surface, and a plurality of second diffraction elements 80b extending parallel to the Y-axis direction are formed on roughly the lower half of its surface.
[0098] The solenoid 81 converts the magnetic energy of its coil into linear motion of a plunger 81b. The solenoid 81 includes a main body 81a housing the coil and the plunger 81b, which reciprocates relative to the main body 81a, and the diffraction element plate 80 is connected to one end of the plunger 81b.
[0099] With the slit light projecting unit 20b configured in this way, when a laser beam is emitted from the laser diode 21 in the state shown in FIG. 13(a), that is, with the surface of the diffraction element plate 80 on which the second diffraction elements 80b are formed facing the aperture 24, the laser beam passes through the collimating lens 22, the aperture 24, and the second diffraction elements 80b, and is thereby projected as a planar first slit light 71 extending, over a predetermined angular width, in a direction orthogonal to the Y axis (parallel to the X axis).
[0100] Thereafter, the diffraction element plate 80 is moved downward by the plunger 81b through the action of the solenoid 81. When a laser beam is then emitted from the laser diode 21 in the state shown in FIG. 13(b), that is, with the surface of the diffraction element plate 80 on which the first diffraction elements 80a are formed facing the aperture 24, the laser beam passes through the collimating lens 22, the aperture 24, and the first diffraction elements 80a, and is thereby projected as a planar second slit light 72 extending, over a predetermined angular width, in a direction parallel to the Y axis.
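The projection and capture sequence of this third embodiment could be driven by control logic along the lines of the sketch below. The driver interface (set_plate_lowered, set_laser, capture_frame) is a hypothetical placeholder invented for the example, since the publication describes hardware rather than a software interface; the ordering simply follows the two states of FIG. 13 and the suggestion above of capturing the slit-light-free frame between the two exposures.

```python
class SlitUnitDriver:
    """Hypothetical low-level interface to the third-embodiment hardware.

    A real implementation would drive the solenoid 81, the laser diode 21,
    and the CCD image sensor 32; here the methods are stubs.
    """

    def set_plate_lowered(self, lowered: bool):
        """Extend or retract plunger 81b to move diffraction element plate 80."""
        raise NotImplementedError

    def set_laser(self, on: bool):
        raise NotImplementedError

    def capture_frame(self):
        raise NotImplementedError


def capture_sequence(drv: SlitUnitDriver):
    """Capture the three frames needed to detect trajectories 71a and 72a."""
    # First state (FIG. 13(a)): the second diffraction elements 80b face the
    # aperture, so the unit projects the first slit light 71 (parallel to X).
    drv.set_plate_lowered(False)
    drv.set_laser(True)
    img_slit1 = drv.capture_frame()
    drv.set_laser(False)

    # Slit-light-free reference frame, taken between the two exposures so
    # that a single frame serves both difference operations.
    img_no_slit = drv.capture_frame()

    # Second state (FIG. 13(b)): plate 80 is moved down so the first
    # diffraction elements 80a face the aperture, projecting the second
    # slit light 72 (parallel to Y).
    drv.set_plate_lowered(True)
    drv.set_laser(True)
    img_slit2 = drv.capture_frame()
    drv.set_laser(False)

    return img_slit1, img_no_slit, img_slit2
```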
[0101] In the imaging device 1 equipped with the slit light projecting unit 20b of the third embodiment, the trajectories 71a and 72a of the first and second slit lights can be detected in the same manner as with the slit light projecting unit 20a of the second embodiment.
[0102] As described above, with the slit light projecting unit 20b of the third embodiment, the diffraction element plate 80 is less expensive than a cylindrical lens, so the cost of parts, and in turn the manufacturing cost of the imaging device 1, can be reduced.
[0103] In the embodiments described above, the processing of S170a or S170b in FIG. 5 corresponds to the three-dimensional shape calculation means, and the processing of S180 in FIG. 5 corresponds to the planar image correction means.
[0104] Although the present invention has been described on the basis of the embodiments, the present invention is not limited to these embodiments, and it will readily be appreciated that various improvements and modifications are possible without departing from the gist of the present invention.
[0105] For example, in the embodiments described above, which of the first slit light 71 and the second slit light 72 follows the curvature of the document is specified by the user via the bending direction switching switch 60.
[0106] Alternatively, the device may be configured to determine, from the position coordinates in three-dimensional space of each slit light calculated in the triangulation calculation processing described above (S150 in FIG. 5), which slit light follows the curvature, and to decide on the basis of that determination whether to execute the horizontal document posture calculation processing (S170a) or the vertical document posture calculation processing (S170b). Making this determination in software eliminates the need for the user to operate the switch and improves the operability of the device.
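A software decision of this kind might look like the following sketch, which compares how strongly each detected slit-light trajectory deviates from a straight line in three-dimensional space and selects the posture calculation branch accordingly. The decision rule, the function names, and the use of NumPy are assumptions made for illustration; the publication does not fix a particular criterion.

```python
import numpy as np

def curvature_score(points_3d):
    """Rough curvature measure for one slit-light trajectory.

    points_3d is an (N, 3) array of coordinates obtained from the
    triangulation step (S150).  The score is the RMS distance of the
    points from the straight line fitted through them: a trajectory
    lying along the curl of the page scores high, one lying along a
    flat direction scores near zero.  Requires at least two points.
    """
    centered = points_3d - points_3d.mean(axis=0)
    # Principal direction of the trajectory via SVD.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    direction = vt[0]
    residual = centered - np.outer(centered @ direction, direction)
    return float(np.sqrt((residual ** 2).sum(axis=1).mean()))

def choose_posture_processing(first_slit_points, second_slit_points):
    """Return which posture calculation to run, 'S170a' or 'S170b'."""
    # The trajectory with the larger curvature is taken here to follow
    # the curl of the document.
    if curvature_score(first_slit_points) >= curvature_score(second_slit_points):
        return "S170a"   # document assumed to curl along the first slit light
    return "S170b"       # document assumed to curl along the second slit light
```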
[0107] The imaging device 1 is configured to capture both the image with slit light and the image without slit light using the imaging lens 31 and the CCD image sensor 32. Alternatively, a separate imaging lens and CCD image sensor dedicated to capturing the image with slit light may be provided in addition to the imaging lens 31 and the CCD image sensor 32.
[0108] Such a configuration eliminates the time that elapses between capturing the image with slit light and the image without slit light (for example, the time needed to transfer the image data of the CCD image sensor 32), so there is no shift in imaging range between the two images and the three-dimensional shape of the target object can be detected with high accuracy. The imaging device 1 of the present embodiment, however, has fewer components and can be made smaller and less expensive.
[0109] Although the present embodiment uses the laser diode 21, which emits a laser beam, as the light source, any other light source capable of outputting a light beam, such as a surface emitting laser, an LED, or an EL element, may be used instead.
[0110] The slit light emitted from the slit light projecting unit 20 is one type of pattern light; besides a fine line sharply narrowed in the direction orthogonal to its longitudinal direction, a stripe-shaped light pattern having a certain width may also be used.
[0111] In one embodiment of the present invention, the first pattern light and the second pattern light may be projected so as to extend along planes parallel to the optical axis of the imaging means.
[0112] With such a configuration, the first pattern light and the second pattern light are projected so as to extend along planes parallel to the optical axis of the imaging means. When imaging a rectangular document that curves in the vertical or horizontal direction, for example, the document is usually imaged from the front, and in such a case the first pattern light or the second pattern light can be projected along the curvature of the document. The curvature of the document can therefore be detected with high accuracy.
[0113] In one embodiment of the present invention, the light projecting means may include two optical lenses that respectively emit the first pattern light and the second pattern light.
[0114] With such a configuration, because the light projecting means includes two optical lenses that respectively emit the first pattern light and the second pattern light, the first pattern light and the second pattern light can be projected onto the target object simultaneously. An image containing both the trajectory of the first pattern light and the trajectory of the second pattern light can therefore be acquired efficiently.
[0115] In one embodiment of the present invention, each of the two optical lenses may be a cylindrical lens. In this case, one cylindrical lens is arranged so as to present a curved surface in a cross-sectional view with respect to the plane along which the first pattern light is emitted, and the other cylindrical lens is arranged so as to present a curved surface in a cross-sectional view with respect to the plane along which the second pattern light is emitted.
[0116] With such a configuration, simply fixing the arrangement of each cylindrical lens allows the first pattern light to be emitted toward the target object from one cylindrical lens and the second pattern light from the other. Moreover, since the two optical lenses can be identical cylindrical lenses, the same part can be used for both.
[0117] In one embodiment of the present invention, the light projecting means may include one light output means for outputting a light beam and splitting means for splitting the light beam output from the light output means into two directions. In this case, one of the two optical lenses is arranged partway along one of the two paths split by the splitting means, and the other optical lens is arranged partway along the other path.
[0118] With such a configuration, each of the two optical lenses receives one of the two light beams into which the splitting means divides the light beam output from the single light output means. Even though two optical lenses are used, there is no need to provide two light output means, so the number of parts and the manufacturing cost of the device can be reduced.
[0119] In one embodiment of the present invention, the light projecting means may include one optical lens having a first state in which it emits the first pattern light and a second state in which it emits the second pattern light. In this case, the optical lens is configured to be movable between the first state and the second state.
[0120] With such a configuration, the first pattern light and the second pattern light are emitted from a single optical lens, so the number of parts and the manufacturing cost of the device can be reduced compared with emitting the first pattern light and the second pattern light from two optical lenses.
[0121] In one embodiment of the present invention, the single optical lens may be a cylindrical lens. In this case, the cylindrical lens is configured to be movable so that, in the first state, it presents a curved surface in a cross-sectional view with respect to the plane along which the first pattern light is emitted and, in the second state, it presents a curved surface in a cross-sectional view with respect to the plane along which the second pattern light is emitted.
[0122] With such a configuration, the first pattern light and the second pattern light can each be emitted toward the target object simply by switching between the first state and the second state.
[0123] In one embodiment of the present invention, the light projecting means may include a first diffraction element that projects the first pattern light onto the target object and a second diffraction element that projects the second pattern light onto the target object.
[0124] With such a configuration, because the light projecting means includes a first diffraction element that projects the first pattern light onto the target object and a second diffraction element that projects the second pattern light onto the target object, the cost of parts, and hence the manufacturing cost of the device, can be reduced compared with constructing the light projecting means from optical lenses.
[0125] In one embodiment of the present invention, the light projecting means may include light output means for outputting a light beam, and the first diffraction element and the second diffraction element may be configured to be movable such that, in a first state, one of them is arranged partway along the path of the light beam output from the light output means while the other is arranged outside the path, and, in a second state, the other is arranged partway along the path of the light beam while the one is arranged outside the path.
[0126] With such a configuration, the first pattern light and the second pattern light can each be emitted toward the target object simply by switching between the first state and the second state.
[0127] 本発明の一つの実施形態において、撮像手段によって撮像される画像を升目状に 区画する各画素を想定した場合、投光手段は、第 1方向に延びる画素に沿って前記 第 1パターン光を投光し、第 1方向と略直交する第 2方向に延びる画素に沿って前記 第 2パターン光を投光するよう構成されていても良い。  [0127] In one embodiment of the present invention, when assuming each pixel that partitions the image captured by the imaging unit into a grid, the light projecting unit performs the first pattern along the pixel extending in the first direction. It may be configured to project light and project the second pattern light along a pixel extending in a second direction substantially orthogonal to the first direction.
[0128] このような構成によれば、第 1パターン光および第 2パターン光を画素の並びに沿つ て投光でき、高精度に対象物体の 3次元形状を検出することができる。 [0129] 本発明の一つの実施形態において、撮像手段の撮像レンズは、本装置の略矩形 状の前面の略中央部に配置され、パターン光の投光ロは、その前面の四隅部のい ずれかに配置されてレ、ても良レ、。 According to such a configuration, the first pattern light and the second pattern light can be projected along a row of pixels, and the three-dimensional shape of the target object can be detected with high accuracy. [0129] In one embodiment of the present invention, the imaging lens of the imaging means is disposed at substantially the center of the substantially rectangular front surface of the present apparatus, and the projection light of the pattern light is located at the four corners of the front surface. Even if it is placed somewhere, it is good.
[0130] With such a configuration, because the imaging lens of the imaging means is arranged at the approximate center of the substantially rectangular front face of the device and the projection port for the pattern light is arranged at one of the four corners of that front face, the first pattern light and the second pattern light can be emitted without overlapping the optical axis of the imaging means. In addition, the coordinates in real space of the trajectory of the first pattern light and the trajectory of the second pattern light can be determined with high accuracy by triangulation.
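As a concrete illustration of the triangulation mentioned here, the sketch below back-projects a detected slit pixel through a simple pinhole camera model and intersects the resulting viewing ray with the known plane of the projected slit light. The camera model and every parameter name are assumptions made for the example; the actual geometry is that defined by the embodiments described above.

```python
import numpy as np

def triangulate_slit_pixel(u, v, fx, fy, cx, cy, plane_normal, plane_point):
    """Back-project pixel (u, v) and intersect it with the slit-light plane.

    The camera sits at the origin looking along +Z (pinhole model with
    focal lengths fx, fy in pixels and principal point (cx, cy)).  The
    slit light is a plane known from the projector geometry, given by a
    normal vector and one point on the plane, both in camera coordinates.
    Returns the 3D point on the object surface hit by the slit light.
    """
    # Direction of the viewing ray through the pixel.
    ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])

    n = np.asarray(plane_normal, dtype=float)
    p0 = np.asarray(plane_point, dtype=float)

    # Ray: X = t * ray.  Plane: n . (X - p0) = 0  =>  t = (n . p0) / (n . ray)
    denom = n @ ray
    if abs(denom) < 1e-9:
        raise ValueError("viewing ray is parallel to the slit-light plane")
    t = (n @ p0) / denom
    return t * ray   # 3D coordinates in the camera frame
```

Applying such a calculation to every pixel of trajectory 71a or 72a yields the three-dimensional point cloud on which the posture calculations operate.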
[0131] In one embodiment of the present invention, the light projecting means may include light output means for outputting a light beam and slit light projecting means for converting the light beam output from the light output means into slit light, that is, a flux of light radiated in a planar shape over a predetermined angular width, and for projecting that slit light onto the target object.

Claims

[1] A three-dimensional shape detection device comprising:
light projecting means for projecting pattern light;
imaging means for imaging a target object in a state in which the pattern light is projected from the light projecting means; and
three-dimensional shape calculation means for detecting the position of the pattern light from the image captured by the imaging means and calculating three-dimensional shape information of the target object based on the detected position of the pattern light,
wherein the light projecting means projects onto the target object, as the pattern light, a first pattern light extending along a plane passing through a position away from the optical axis of the imaging means and a second pattern light extending along a plane intersecting the first pattern light.
[2] The three-dimensional shape detection device according to claim 1, wherein the first pattern light and the second pattern light are projected so as to extend along planes parallel to the optical axis of the imaging means.
[3] The three-dimensional shape detection device according to claim 1, wherein the light projecting means comprises two optical lenses that respectively emit the first pattern light and the second pattern light.
[4] The three-dimensional shape detection device according to claim 3, wherein each of the two optical lenses is a cylindrical lens, and one cylindrical lens is arranged so as to present a curved surface in a cross-sectional view with respect to the plane along which the first pattern light is emitted, while the other cylindrical lens is arranged so as to present a curved surface in a cross-sectional view with respect to the plane along which the second pattern light is emitted.
[5] The three-dimensional shape detection device according to claim 3, wherein the light projecting means comprises:
one light output means for outputting a light beam; and
splitting means for splitting the light beam output from the light output means into two directions,
and wherein, of the two optical lenses, one optical lens is arranged partway along one of the two paths split by the splitting means and the other optical lens is arranged partway along the other path.
[6] The three-dimensional shape detection device according to claim 1, wherein the light projecting means comprises one optical lens having a first state in which it emits the first pattern light and a second state in which it emits the second pattern light, the optical lens being configured to be movable between the first state and the second state.
[7] The three-dimensional shape detection device according to claim 6, wherein the one optical lens is a cylindrical lens, and the cylindrical lens is configured to be movable so that, in the first state, it presents a curved surface in a cross-sectional view with respect to the plane along which the first pattern light is emitted and, in the second state, it presents a curved surface in a cross-sectional view with respect to the plane along which the second pattern light is emitted.
[8] The three-dimensional shape detection device according to claim 1, wherein the light projecting means comprises a first diffraction element that projects the first pattern light onto the target object and a second diffraction element that projects the second pattern light onto the target object.
[9] The three-dimensional shape detection device according to claim 8, wherein the light projecting means comprises light output means for outputting a light beam, and the first diffraction element and the second diffraction element are configured to be movable such that, in a first state, one of them is arranged partway along the path of the light beam output from the light output means while the other is arranged outside the path of the light beam, and, in a second state, the other is arranged partway along the path of the light beam while the one is arranged outside the path of the light beam.
[10] The three-dimensional shape detection device according to claim 1, wherein, assuming pixels that partition the image captured by the imaging means into a grid, the light projecting means projects the first pattern light along pixels extending in a first direction and projects the second pattern light along pixels extending in a second direction substantially orthogonal to the first direction.
[11] The three-dimensional shape detection device according to claim 1, wherein the imaging lens of the imaging means is arranged at the approximate center of the substantially rectangular front face of the device, and the projection port for the pattern light is arranged at one of the four corners of the front face.
[12] The three-dimensional shape detection device according to claim 1, wherein the light projecting means comprises:
light output means for outputting a light beam; and
slit light projecting means for converting the light beam output from the light output means into slit light, which is a flux of light radiated in a planar shape over a predetermined angular width, and projecting the slit light onto the target object.
[13] An imaging device comprising:
the three-dimensional shape detection device according to claim 1; and
planar image correction means for correcting, based on the three-dimensional shape of the target object calculated by the three-dimensional shape calculation means of the three-dimensional shape detection device, an image of the target object captured by the imaging means in a state in which no pattern light is projected into a planar image as observed from a direction substantially perpendicular to a predetermined surface of the target object.
PCT/JP2005/002297 2004-02-19 2005-02-16 Three-dimensional shape detection device and image pick up device WO2005080915A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2004-042647 2004-02-19
JP2004042647A JP2005233748A (en) 2004-02-19 2004-02-19 Device for detecting three-dimensional shape and imaging device

Publications (1)

Publication Number Publication Date
WO2005080915A1 true WO2005080915A1 (en) 2005-09-01

Family

ID=34879267

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2005/002297 WO2005080915A1 (en) 2004-02-19 2005-02-16 Three-dimensional shape detection device and image pick up device

Country Status (2)

Country Link
JP (1) JP2005233748A (en)
WO (1) WO2005080915A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1829453A1 (en) * 2006-02-28 2007-09-05 Ali SpA Automatic machine for producing and dispensing semi-liquid foodstuffs

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6176957B2 (en) * 2013-03-18 2017-08-09 株式会社ミツトヨ Shape measuring device
KR102591766B1 (en) * 2016-02-04 2023-10-23 삼성전자주식회사 Connector

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS549644A (en) * 1977-06-22 1979-01-24 Mitsubishi Electric Corp Linear pattern projector
JPS63109307A (en) * 1986-10-27 1988-05-14 Sharp Corp Apparatus for inspecting mounting of chip part
JPS63145904A (en) * 1986-12-09 1988-06-18 Fujitsu Ltd Slit light irradiator
JP2927955B2 (en) * 1990-04-05 1999-07-28 インテリジェント オートメイション システムズ インコーポレイテッド Real-time three-dimensional sensing device
JP2000136913A (en) * 1998-11-02 2000-05-16 Minolta Co Ltd Image reader


Also Published As

Publication number Publication date
JP2005233748A (en) 2005-09-02

Similar Documents

Publication Publication Date Title
US9826216B1 (en) Systems and methods for compact space-time stereo three-dimensional depth sensing
US11029762B2 (en) Adjusting dimensioning results using augmented reality
US20060164526A1 (en) Image processing device and image capturing device
US10805546B2 (en) Image processing system, image processing device, and image processing program
EP2063347A1 (en) Projector and method for projecting image
CN108700408B (en) Three-dimensional shape data and texture information generation system, method and shooting control method
JP2018536892A (en) Multi-aperture imaging device having channel-specific adjustability, method for manufacturing multi-aperture imaging device, and full-field capturing method
WO2005095886A1 (en) 3d shape detection device, 3d shape detection method, and 3d shape detection program
CN110308557A (en) Display system, electronic mirror system, moving body and display methods
US7440119B2 (en) Three-dimensional shape detecting device, image capturing device, and three-dimensional shape detecting method
JP3690581B2 (en) POSITION DETECTION DEVICE AND METHOD THEREFOR, PLAIN POSITION DETECTION DEVICE AND METHOD THEREOF
WO2005080915A1 (en) Three-dimensional shape detection device and image pick up device
JPH09145318A (en) Three-dimensional measuring equipment
JP2007033087A (en) Calibration device and method
JP4360145B2 (en) Three-dimensional shape detection device, imaging device, and three-dimensional shape detection method
KR101539425B1 (en) 3-dimensional scanner capable of acquiring ghostimage-free pictures
KR20230101899A (en) 3D scanner with sensors of overlapping field of view
JP5785896B2 (en) 3D shape measuring device
JP4608855B2 (en) Three-dimensional shape detection device, imaging device, and three-dimensional shape detection method
JP2005148813A (en) Three-dimensional shape detecting device, image pickup device and three-dimensional shape detecting program
JP2005148813A5 (en)
US20060158661A1 (en) Three-dimensional shape detecting device, three-dimensional shape detecting system, and three-dimensional shape detecting program
JP2005189021A (en) Imaging device
TW202037964A (en) Device comprising a multi-aperture imaging device for generating a depth map
WO2005031253A1 (en) Three-dimensional shape measuring device, imaging device, and three-dimensional shape measuring program

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

122 Ep: pct application non-entry in european phase