WO2017090268A1 - Three-dimensional measurement apparatus - Google Patents

Three-dimensional measurement apparatus

Info

Publication number
WO2017090268A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
light
image data
light intensity
intensity distribution
Prior art date
Application number
PCT/JP2016/070238
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
信行 梅村
大山 剛
憲彦 坂井田
学 奥田
Original Assignee
CKD Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CKD Corporation
Priority to DE112016005425.4T priority Critical patent/DE112016005425T5/de
Priority to CN201680046181.9A priority patent/CN107923736B/zh
Priority to MX2018001490A priority patent/MX366402B/es
Publication of WO2017090268A1 publication Critical patent/WO2017090268A1/ja

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518Projection by scanning of the object
    • G01B11/2527Projection by scanning of the object with phase change by in-plane movement of the patern

Definitions

  • the present invention relates to a three-dimensional measurement apparatus that performs three-dimensional measurement using a phase shift method.
  • cream solder is first printed on a predetermined electrode pattern disposed on the printed circuit board.
  • an electronic component is temporarily fixed on the printed circuit board based on the viscosity of the cream solder.
  • the printed circuit board is guided to a reflow furnace, and soldering is performed through a predetermined reflow process.
  • Prior to such mounting and reflow, the printed state of the cream solder needs to be inspected, and a three-dimensional measuring device is sometimes used for such inspection.
  • a predetermined fringe pattern is projected onto the object to be measured by a predetermined projection means.
  • the projection unit includes a light source that emits predetermined light and a grating that converts light from the light source into a stripe pattern.
  • the grating has a configuration in which light-transmitting portions that transmit light and light-shielding portions that block light are arranged alternately.
  • the fringe pattern projected on the object to be measured is picked up using an image pickup unit arranged right above the object to be measured.
  • As the image pickup unit, a CCD camera or the like including a lens and an imaging element is used.
  • According to the phase shift method, the light intensity I at each pixel of the captured image data is given by I = f · sin θ + e … (U1), where f is the gain, e is the offset, and θ is the phase of the fringe pattern.
  • The phase of the fringe pattern is shifted, for example, in four steps (θ + 0°, θ + 90°, θ + 180°, θ + 270°) by controlling the movement of the grating; image data having the corresponding intensity distributions I0, I1, I2, I3 are sequentially captured, and the phase θ is obtained on the basis of equation (U2).
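Equation (U2) itself is not reproduced in this extract. For orientation only, the following Python/NumPy sketch shows the textbook four-step phase recovery that such an equation typically expresses, assuming per-pixel intensities of the form I_k = f·sin(θ + k·90°) + e; the exact sign convention of the patent's equation (U2) is not confirmed here.

```python
import numpy as np

def phase_from_four_steps(i0, i1, i2, i3):
    """Recover the fringe phase from four images shifted by 90 degrees each.

    Assumes I_k = f*sin(theta + k*90deg) + e per pixel (cf. equation (U1)),
    so that I0 - I2 = 2*f*sin(theta) and I1 - I3 = 2*f*cos(theta).
    """
    i0, i1, i2, i3 = (np.asarray(a, dtype=float) for a in (i0, i1, i2, i3))
    # arctan2 keeps the full -pi..pi range and tolerates I1 == I3
    return np.arctan2(i0 - i2, i1 - i3)

# Toy check: synthesize four shifted sinusoidal "images" and recover the phase.
x = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)   # one fringe period
theta = np.tile(x, (4, 1))                               # 4 x 36 pixel grid
f, e = 0.5, 0.5                                          # gain and offset
frames = [f * np.sin(theta + k * np.pi / 2.0) + e for k in range(4)]
recovered = np.mod(phase_from_four_steps(*frames), 2.0 * np.pi)
print(np.allclose(recovered, theta, atol=1e-9))          # True
```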
  • In practice, however, the light intensity distribution (waveform) of the fringe pattern projected on the object to be measured may be distorted.
  • In that case the light intensity distribution is unlikely to be an exact sinusoid.
  • Furthermore, since the degree of focus of the fringe pattern varies depending on the relative positional relationship with the object to be measured, the light intensity distribution (waveform) of the fringe pattern may also change when that positional relationship changes.
  • the fringe pattern cannot be projected using a telecentric optical system.
  • The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a three-dimensional measurement apparatus capable of dramatically improving measurement accuracy when performing three-dimensional measurement using the phase shift method.
  • Means 1: A three-dimensional measuring apparatus comprising: projection means that includes a light source emitting predetermined light, a grating that converts the light from the light source into a predetermined stripe pattern, and a driving unit capable of moving the grating, and that can project the stripe pattern onto an object to be measured (for example, a printed circuit board);
  • imaging means capable of imaging the object to be measured on which the stripe pattern is projected; image acquisition means that controls the projection means and the imaging means and can acquire a plurality of image data having mutually different light intensity distributions; and image processing means capable of performing three-dimensional measurement of the object to be measured by the phase shift method based on the plurality of image data acquired by the image acquisition means;
  • wherein, in acquiring one of the plurality of image data, the image acquisition means performs a movement process of moving the grating and, in parallel, an imaging process of either imaging (exposing) continuously over a predetermined period at least partially overlapping the movement period of the grating, or imaging (exposing) a plurality of times during such a period, and
  • performs a process of adding or averaging, for each pixel, the imaging results (the luminance values of each pixel of the plurality of captured image data).
  • According to Means 1, a predetermined stripe pattern projected on the object to be measured (for example, a stripe pattern having a rectangular-wave light intensity distribution) is moved, the moving stripe pattern is imaged continuously or in a plurality of exposures, and the imaging results are added or averaged for each pixel.
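The reason this works can be seen in the frequency domain: adding or averaging exposures taken while the fringe moves is equivalent to convolving the projected waveform with a box window, which strongly attenuates the higher harmonics that make a rectangular wave non-sinusoidal while largely preserving the fundamental. A minimal NumPy sketch of this effect follows; the 36-pixel period and the 13-step averaging window are illustrative choices, not values taken from the claims.

```python
import numpy as np

period = 36                                   # pixels per fringe period (illustrative)
x = np.arange(period)
square = (np.sin(2.0 * np.pi * x / period) >= 0.0).astype(float)   # rectangular fringe

def harmonic_amplitude(signal, k):
    """Amplitude of the k-th harmonic of a periodic signal (via the DFT)."""
    spectrum = np.fft.rfft(signal) / len(signal)
    return 2.0 * np.abs(spectrum[k])

# Average the square wave over a window of 13 consecutive one-pixel shifts,
# mimicking per-pixel averaging of frames taken while the grating moves.
window = 13
averaged = np.mean([np.roll(square, s) for s in range(window)], axis=0)

for name, sig in (("square", square), ("averaged", averaged)):
    h1, h3, h5 = (harmonic_amplitude(sig, k) for k in (1, 3, 5))
    print(f"{name:8s} fundamental={h1:.3f}  3rd/1st={h3 / h1:.3f}  5th/1st={h5 / h1:.3f}")
```

Running this shows the 3rd and 5th harmonics dropping to a few percent of the fundamental after averaging, i.e. a waveform much closer to a sine.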
  • "Sinusoidal" here means "shaped like a sine wave"; where the text simply says "sinusoidal", it includes not only an ideal sine wave but also waveforms that approximate a sine wave (the same applies to other waveforms, such as the "rectangular wave" and the other "non-sinusoidal waves" described later).
  • Likewise, the "predetermined stripe pattern" includes a "stripe pattern having a sinusoidal light intensity distribution". That is, it is also possible to project a fringe pattern whose light intensity distribution merely approximates a sine wave rather than being an ideal sine wave, and to obtain image data having a light intensity distribution closer to the ideal sine wave.
  • The grating movement in the "movement process" may be a continuous operation in which the grating moves continuously, or an intermittent operation in which the grating moves intermittently (by a predetermined amount at a time).
  • The "imaging process of imaging (exposing) continuously, or in a plurality of exposures, during a predetermined period at least partially overlapping the movement period of the grating" includes the case where the imaging process is started while the grating is still stopped, before its movement starts, and the case where the imaging process is ended after the movement of the grating has stopped. Accordingly, for example, the imaging process may be started while the grating is stopped, the movement of the grating started thereafter, and the imaging process ended after the movement of the grating has stopped.
  • Means 2: The three-dimensional measuring apparatus according to Means 1, wherein the imaging process is started simultaneously with the start of the movement process of the grating or during the movement process, and the imaging process is ended simultaneously with the stop of the movement process of the grating or during the movement process.
  • According to Means 2, the position (phase) of the fringe pattern being imaged changes continuously throughout the imaging period.
  • As a result, the measurement accuracy can be further improved.
  • Means 3: The three-dimensional measuring apparatus according to Means 1 or 2, wherein the predetermined fringe pattern is a fringe pattern having a non-sinusoidal light intensity distribution.
  • A "non-sinusoidal wave" means a predetermined waveform that is not a sine wave, such as a rectangular wave, a trapezoidal wave, a triangular wave, or a sawtooth wave.
  • According to Means 3, image data having a sinusoidal light intensity distribution can be acquired by comparatively simple control or arithmetic processing while a fringe pattern having a non-sinusoidal (for example, rectangular-wave) light intensity distribution is projected, without complicating the mechanical configuration of the projection means. As a result, complication of the mechanical configuration can be suppressed and the manufacturing cost can be reduced.
  • Means 4 The three-dimensional measurement apparatus according to any one of means 1 to 3, wherein the grating has an arrangement configuration in which light-transmitting portions that transmit light and light-shielding portions that block light are alternately arranged.
  • the same effect as the above means 3 is achieved.
  • With a binary grating of this kind, it is possible to project a fringe pattern whose light intensity distribution has at least a flat peak portion where the luminance is maximum and constant (hereinafter, the "bright portion") and a flat trough portion where the luminance is minimum and constant (hereinafter, the "dark portion"). That is, a fringe pattern having a rectangular-wave or trapezoidal light intensity distribution can be projected.
  • Strictly speaking, because the light passing through the grating is not perfectly parallel light and because of diffraction at the boundaries between the light-transmitting portions and the light-shielding portions, an intermediate-gradation region can arise at the boundary between the "bright portion" and the "dark portion" of the fringe pattern, so the distribution does not become a perfect rectangular wave.
  • Whether the luminance gradient in this intermediate-gradation region at the boundary between the "bright portion" and the "dark portion" is steep or gentle varies depending on the configuration, such as the arrangement interval of the light-transmitting portions and the light-shielding portions of the grating.
  • When the gradient is gentle, the stripe pattern has a trapezoidal light intensity distribution.
  • Means 5 The three-dimensional measuring apparatus according to any one of means 1 to 4, wherein the object to be measured is a printed board on which cream solder is printed or a wafer board on which solder bumps are formed.
  • According to Means 5, the height of cream solder printed on a printed circuit board, or of solder bumps formed on a wafer substrate, can be measured. Consequently, in the inspection of the cream solder or the solder bumps, their quality can be determined on the basis of the measured values; in such an inspection the effects of the respective means described above are obtained, and the quality determination can be performed with high accuracy. As a result, the inspection accuracy of a solder printing inspection apparatus or a solder bump inspection apparatus can be improved.
  • FIGS. 5 to 9 are tables showing the light intensity distribution in the X-axis direction of the image sensor (coordinates X1 to X8, X9 to X16, X17 to X24, X25 to X32, and X33 to X36, respectively) at every elapse of a predetermined time in the first simulation.
  • FIGS. 10 to 13 are tables relating to the first simulation, in which (a) shows the ideal sinusoidal light intensity distribution in the X-axis direction of the image sensor (coordinates X1 to X10, X11 to X20, X21 to X30, and X31 to X36, respectively), (b) shows the various average values of the luminance at each pixel, and (c) shows the differences between the various average values and the ideal values.
  • FIG. 14 is a graph showing the ideal sinusoidal light intensity distribution shown in (a) of FIGS. 10 to 13.
  • Further graphs plot the various average values shown in (b) of FIGS. 10 to 13 and the differences between the various average values and the ideal values shown in (c) of FIGS. 10 to 13.
  • FIGS. 18 to 22 are tables showing the light intensity distribution in the X-axis direction of the image sensor (coordinates X1 to X8, X9 to X16, X17 to X24, X25 to X32, and X33 to X36, respectively) at every elapse of a predetermined time in the second simulation.
  • FIGS. 23 to 26 are tables relating to the second simulation, in which (a) shows the ideal sinusoidal light intensity distribution in the X-axis direction of the image sensor (coordinates X1 to X10, X11 to X20, X21 to X30, and X31 to X36, respectively), (b) shows the various average values of the luminance at each pixel, and (c) shows the differences between the various average values and the ideal values.
  • FIG. 27 is a graph showing the ideal sinusoidal light intensity distribution shown in (a) of FIGS. 23 to 26.
  • Further graphs plot the various average values shown in (b) of FIGS. 23 to 26 and the differences between the various average values and the ideal values shown in (c) of FIGS. 23 to 26.
  • (a) to (d) are timing charts for explaining the processing operations of the camera and the illumination device in another embodiment.
  • FIG. 1 is a schematic configuration diagram schematically illustrating a substrate inspection apparatus 1 including a three-dimensional measurement apparatus according to the present embodiment.
  • As shown in FIG. 1, the substrate inspection apparatus 1 includes: a mounting table 3 for mounting a printed circuit board 2, as the object to be measured, on which cream solder K (see FIG. 3) to be measured is printed; an illumination device 4, as the projection means, for projecting a predetermined stripe pattern (striped light pattern) obliquely from above onto the surface of the printed circuit board 2; a camera 5, as the imaging means, for imaging the portion of the printed circuit board 2 on which the stripe pattern is projected; and a control device 6 for performing various controls in the substrate inspection apparatus 1, such as drive control of the illumination device 4 and the camera 5, as well as image processing and arithmetic processing.
  • The control device 6 constitutes the image acquisition means and the image processing means in the present embodiment.
  • The mounting table 3 is provided with motors 15 and 16; by driving and controlling these motors, the control device 6 can slide the printed circuit board 2 mounted on the mounting table 3 in an arbitrary direction (the X-axis direction and the Y-axis direction).
  • the illumination device 4 includes a light source 4a that emits predetermined light and a lattice plate 4b that converts light from the light source 4a into a stripe pattern, and is driven and controlled by the control device 6.
  • The light emitted from the light source 4a is guided to a condensing lens (not shown), converted there into parallel light, then guided through the lattice plate 4b to a projection lens (not shown), and projected onto the printed circuit board 2 as a stripe pattern.
  • the lattice plate 4b has an arrangement configuration in which linear light-transmitting portions that transmit light and linear light-shielding portions that block light are alternately arranged in a predetermined direction orthogonal to the optical axis of the light source 4a.
  • a fringe pattern having a rectangular wave or trapezoidal light intensity distribution can be projected onto the printed circuit board 2.
  • a fringe pattern in which the fringe direction is orthogonal to the X-axis direction and parallel to the Y-axis direction is projected.
  • the light passing through the grating plate 4b is not completely parallel light, but at the boundary between the “bright part” and the “dark part” of the stripe pattern due to the diffractive action at the boundary part between the light transmitting part and the light shielding part. Since an intermediate gradation region may occur, a perfect rectangular wave is not obtained. However, in FIG. 3, for the sake of simplification, the intermediate gradation region is omitted, and the stripe pattern is illustrated as a light and dark binary stripe pattern.
  • the stripe pattern has a rectangular wave-shaped light intensity distribution (see FIG. 14).
  • the stripe pattern has a trapezoidal wave-shaped light intensity distribution (see FIG. 27).
  • the illuminating device 4 includes driving means (not shown) such as a motor that moves the lattice plate 4b.
  • the control device 6 can perform a moving process of continuously moving the lattice plate 4b at a constant speed in the predetermined direction perpendicular to the optical axis of the light source 4a by controlling the driving of the driving unit.
  • the stripe pattern can be projected onto the printed circuit board 2 so as to move along the X-axis direction.
  • the camera 5 includes a lens, an image sensor, and the like.
  • a CCD sensor is employed as the image sensor.
  • the image sensor of this embodiment has a resolution of, for example, 512 pixels in the X-axis direction (horizontal direction) and 480 pixels in the Y-axis direction (vertical direction).
  • The camera 5 is driven and controlled by the control device 6. More specifically, the control device 6 performs the imaging process while synchronizing the timing of movement of the grid plate 4b with the timing of image capture by the camera 5, based on a signal from an encoder (not shown) provided in the driving means of the grid plate 4b.
  • the image data picked up by the camera 5 is converted into a digital signal inside the camera 5, input to the control device 6 in the form of a digital signal, and stored in an image data storage device 24 described later. Then, the control device 6 performs image processing, calculation processing, and the like as described later based on the image data.
  • The control device 6 includes: a CPU and input/output interfaces 21 (hereinafter, "CPU etc. 21") that control the entire board inspection apparatus 1; an input device 22, as the "input means", composed of a keyboard, a mouse, a touch panel, or the like; a display device 23, as the "display means", having a display screen such as a CRT or liquid crystal display; an image data storage device 24 for storing image data captured by the camera 5; a calculation result storage device 25 for storing various calculation results; and a setting data storage device 26 for storing various information such as design data in advance.
  • FIG. 4 is a timing chart for explaining processing operations of the camera 5 and the illumination device 4.
  • This inspection routine is executed by the control device 6 (CPU etc. 21).
  • In the present embodiment, four types of image data having different light intensity distributions are acquired by performing the image acquisition process four times for each inspection area.
  • the control device 6 first drives and controls the motors 15 and 16 to move the printed circuit board 2, and adjusts the field of view (imaging range) of the camera 5 to a predetermined inspection area on the printed circuit board 2.
  • The inspection area is one of the areas into which the surface of the printed circuit board 2 is divided in advance, with the size of the field of view of the camera 5 as one unit.
  • Next, the control device 6 drives and controls the illumination device 4, sets the position of the grating plate 4b to a first initial setting position (for example, the position of phase "0°"), and performs the first image acquisition process.
  • The initial setting position of the lattice plate 4b differs for each of the four image acquisition processes, and is set so that the phase of the fringe pattern at each initial setting position is shifted by 90° (a quarter pitch) from the previous one.
  • In the image acquisition process, the control device 6 causes the light source 4a of the illumination device 4 to emit light at a predetermined timing M1, starts projection of the fringe pattern, and simultaneously starts the movement process of the lattice plate 4b. Thereby, the fringe pattern projected on the inspection area moves continuously at a constant speed along the X-axis direction.
  • The control device 6 also controls the drive of the camera 5 and starts the imaging process at a predetermined timing N1.
  • In the present embodiment, the start timing M1 of the movement process of the lattice plate 4b and the start timing N1 of the imaging process by the camera 5 are set to coincide.
  • During the execution period of the movement process, imaging is performed by the camera 5 a plurality of times. More specifically, the printed circuit board 2 is imaged each time the fringe pattern moves by a predetermined amount Δx (for example, a distance corresponding to a 10° phase of the fringe pattern), that is, every time a predetermined time Δt elapses.
  • The image data captured by the camera 5 are transferred to the image data storage device 24 and stored as needed.
  • Thereafter, the control device 6 ends the movement process of the lattice plate 4b at a predetermined timing M2 and ends the imaging process by the camera 5 at a predetermined timing N2.
  • In the present embodiment, the end timing M2 of the movement process of the lattice plate 4b and the end timing N2 of the imaging process by the camera 5 are set to coincide.
  • Subsequently, the control device 6 executes a predetermined calculation process based on the imaging results obtained by the imaging process. More specifically, it executes an averaging process in which the luminance values of each pixel of the series of image data (the plurality of image data captured each time the fringe pattern moves by the predetermined amount Δx) are added for each pixel and the average value is calculated. Thereby, image data having a sinusoidal light intensity distribution are acquired.
  • The control device 6 stores the image data acquired in this way in the calculation result storage device 25 and ends the first image acquisition process.
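A minimal sketch of one such image acquisition process follows: capture a frame each time the fringe advances by Δx, accumulate the luminance values per pixel, and divide by the number of frames. The capture_frame and advance_grating callables are hypothetical stand-ins for the camera and the grating drive, not interfaces described in the patent.

```python
import numpy as np

def acquire_averaged_image(capture_frame, advance_grating, n_steps):
    """One image acquisition process: move the grating in n_steps increments of
    delta-x, capture a frame at each position, and average per pixel.

    capture_frame()   -> 2-D ndarray of luminance values
    advance_grating() -> moves the fringe pattern by the predetermined amount delta-x
    """
    accumulator = None
    for _ in range(n_steps):
        frame = np.asarray(capture_frame(), dtype=np.float64)
        accumulator = frame if accumulator is None else accumulator + frame
        advance_grating()
    return accumulator / n_steps        # per-pixel average -> near-sinusoidal image

# Hypothetical usage with stand-ins for the hardware:
rng = np.random.default_rng(0)

def fake_capture():
    return rng.integers(0, 256, size=(480, 512))   # 480 x 512 sensor, as in the embodiment

def fake_advance():
    pass                                           # the real drive would step the grating

averaged = acquire_averaged_image(fake_capture, fake_advance, n_steps=36)
print(averaged.shape, averaged.dtype)
```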
  • Next, after the completion of the first image acquisition process, or during the execution of the averaging process related to the first image acquisition process, the control device 6 drives and controls the illumination device 4 and sets the position of the grid plate 4b to a second initial setting position (for example, the position of phase "90°", in which the phase of the fringe pattern is shifted by a quarter pitch from the first initial setting position).
  • The control device 6 then starts the second image acquisition process.
  • the procedure of the second image acquisition process is the same as that of the first image acquisition process, and a detailed description thereof will be omitted (the same applies to the third and fourth image acquisition processes).
  • When the control device 6 acquires image data having a sinusoidal light intensity distribution by the second image acquisition process, it stores the image data in the calculation result storage device 25 and ends the second image acquisition process.
  • Next, after the end of the second image acquisition process, or during the execution of the averaging process related to the second image acquisition process, the control device 6 drives and controls the illumination device 4, sets the position of the grid plate 4b to a third initial setting position (for example, the position of phase "180°", in which the phase of the fringe pattern is shifted by a quarter pitch from the second initial setting position), and starts the third image acquisition process.
  • When the control device 6 acquires image data having a sinusoidal light intensity distribution by the third image acquisition process, it stores the image data in the calculation result storage device 25 and ends the third image acquisition process.
  • Next, after the completion of the third image acquisition process, or during the execution of the averaging process related to the third image acquisition process, the control device 6 drives and controls the illumination device 4, sets the position of the grid plate 4b to a fourth initial setting position (for example, the position of phase "270°", in which the phase of the fringe pattern is shifted by a quarter pitch from the third initial setting position), and starts the fourth image acquisition process.
  • When the control device 6 acquires image data having a sinusoidal light intensity distribution by the fourth image acquisition process, it stores the image data in the calculation result storage device 25 and ends the fourth image acquisition process.
  • Then, based on the four types of image data (the luminance values of each pixel) acquired as described above, the control device 6 performs three-dimensional measurement (height measurement) by the known phase shift method described in the background art.
  • the measurement result is stored in the calculation result storage device 25.
  • The control device 6 then performs pass/fail judgment processing of the cream solder K based on the three-dimensional measurement results (height data at each coordinate). Specifically, based on the measurement results of the inspection area obtained as described above, the control device 6 detects the printing range of the cream solder K that is higher than the reference surface, and calculates the printed amount of cream solder K by integrating the height of each part within this range.
  • The control device 6 then compares the data thus obtained, such as the position, area, height, and amount of the cream solder K, with reference data (Gerber data or the like) stored in advance in the setting data storage device 26, and judges whether the printing state of the cream solder K in the inspection area is good depending on whether the comparison result falls within an allowable range.
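The pass/fail step can be illustrated as follows: pixels whose measured height exceeds the reference surface are taken as the print range of the cream solder, the excess heights are integrated to estimate the printed amount, and the result is compared with an allowable range. The pixel area, reference height, and tolerance values below are hypothetical placeholders, not values from the patent.

```python
import numpy as np

def judge_solder_print(height_map, reference_height, pixel_area,
                       volume_min, volume_max):
    """Return (is_good, printed_area, printed_volume) for one inspection area.

    height_map       : 2-D ndarray of measured heights per pixel
    reference_height : height of the reference surface (board surface)
    pixel_area       : real-world area covered by one pixel
    volume_min/max   : allowable range for the printed amount of cream solder
    """
    excess = height_map - reference_height
    printed = excess > 0.0                                # print range of the cream solder
    printed_area = printed.sum() * pixel_area
    printed_volume = excess[printed].sum() * pixel_area   # integrate height over the range
    is_good = volume_min <= printed_volume <= volume_max
    return is_good, printed_area, printed_volume

# Hypothetical example with a synthetic 100 x 100 height map.
h = np.zeros((100, 100))
h[40:60, 40:60] = 0.15                                    # a 20 x 20 pixel solder deposit
ok, area, volume = judge_solder_print(h, reference_height=0.0, pixel_area=1.0,
                                      volume_min=50.0, volume_max=70.0)
print(ok, area, volume)                                   # e.g. True 400.0 ~60.0
```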
  • After that, the control device 6 drives and controls the motors 15 and 16 to move the printed circuit board 2 to the next inspection area. The above series of processing is repeated for all inspection areas, whereby the inspection of the entire printed circuit board 2 is completed.
  • FIGS. 5 to 9, which relate to the first simulation, are tables showing, for each pixel coordinate position in the X-axis direction of the image sensor (horizontal axis: coordinates X1 to X36), the luminance value of the fringe pattern as it changes with time (vertical axis: imaging timings t1 to t36).
  • the simulation is performed on the assumption that the luminance value of the “bright portion” at which the luminance is maximum is “1” and the luminance value of the “dark portion” at which the luminance is minimum is “0”.
  • FIGS. 5 to 9 show only one period of the fringe pattern (36 pixels in the X-axis direction); in reality, fringe patterns of a plurality of periods exist continuously in the X-axis direction. That is, the light intensity distribution shown in the range of coordinates X1 to X36 repeats.
  • At the imaging timing t1, the range of coordinates X2 to X17 is the "bright portion" with luminance value "1", and the range of coordinates X20 to X35 is the "dark portion" with luminance value "0".
  • Coordinates X36 and X1, and coordinates X18 and X19, which correspond to the boundaries between the "bright portion" and the "dark portion", each form an intermediate-gradation region of two pixels in which the luminance value changes. That is, the light intensity distribution of the stripe pattern at the imaging timing t1 is as shown in the corresponding graph.
  • At the next imaging timing t2, the range of coordinates X3 to X18 becomes the "bright portion" with luminance value "1", and the range of coordinates X21 to X36 becomes the "dark portion" with luminance value "0".
  • At the imaging timing t3, the range of coordinates X4 to X19 becomes the "bright portion" with luminance value "1", and the range of coordinates X22 to X1 (wrapping around) becomes the "dark portion" with luminance value "0".
  • the light intensity distribution of the stripe pattern moves to the right in FIGS. 5 to 9 by one pixel every time the predetermined time ⁇ t elapses.
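Under the stated assumptions, the rectangular fringe used in this first simulation can be reproduced as follows: a 36-pixel period, a bright portion of luminance 1 at X2 to X17, a dark portion of luminance 0 at X20 to X35, two-pixel intermediate-gradation regions at the boundaries, and a shift of one pixel to the right per time step. The 1/3 and 2/3 values used for the half-tone pixels are an assumption; the extract only states that the luminance changes there.

```python
import numpy as np

def fringe_profile_t1():
    """Rectangular-wave fringe at imaging timing t1 (coordinates X1..X36).

    Bright portion (luminance 1) at X2..X17, dark portion (luminance 0) at
    X20..X35, and two-pixel half-tone ramps at the boundaries (X36/X1 rising,
    X18/X19 falling). The ramp values 1/3 and 2/3 are assumed.
    """
    profile = np.zeros(36)
    profile[1:17] = 1.0            # X2..X17 (0-based indices 1..16)
    profile[35] = 1.0 / 3.0        # X36
    profile[0] = 2.0 / 3.0         # X1
    profile[17] = 2.0 / 3.0        # X18
    profile[18] = 1.0 / 3.0        # X19
    return profile

def fringe_at(t):
    """Fringe profile at imaging timing t (t = 1 corresponds to t1); the
    pattern moves right by one pixel per elapsed time delta-t."""
    return np.roll(fringe_profile_t1(), t - 1)

print(fringe_at(1)[:5])   # X1..X5 at t1: roughly [0.667, 1, 1, 1, 1]
print(fringe_at(2)[:5])   # at t2 the pattern has shifted one pixel to the right
```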
  • FIGS. 10(a) to 13(a) are tables showing the relationship between the coordinate position of each pixel in the X-axis direction of the image sensor (coordinates X1 to X36) and the ideal sinusoidal light intensity distribution (ideal values). Here, an ideal sinusoidal light intensity distribution having the same period, amplitude, and phase as the fringe pattern having the rectangular-wave light intensity distribution at the imaging timing t1 is shown; this ideal sine wave at the imaging timing t1 is as shown in the corresponding graph.
  • FIGS. 10(b) to 13(b) are tables showing, for each pixel coordinate position in the X-axis direction of the image sensor (horizontal axis: coordinates X1 to X36), the results (average values) of performing the averaging process on a plurality of image data (luminance values for each pixel) captured within a predetermined time before and after the image data captured at the imaging timing t1.
  • In FIGS. 10(b) to 13(b), the uppermost row shows, as it is, the image data (luminance value for each pixel) captured at the imaging timing t1 without the averaging process.
  • The "7 average values" row shows the results of averaging the seven image data (luminance values for each pixel) captured within three timings before and after the imaging timing t1, that is, at the imaging timings t34 to t4.
  • The "9 average values" row shows the results of averaging the nine image data (luminance values for each pixel) captured within four timings before and after the imaging timing t1, that is, at the imaging timings t33 to t5.
  • The "11 average values" row shows the results of averaging the eleven image data (luminance values for each pixel) captured within five timings before and after the imaging timing t1, that is, at the imaging timings t32 to t6.
  • FIGS. 10(c) to 13(c) are tables showing, for each pixel coordinate position in the X-axis direction (horizontal axis: coordinates X1 to X36), the differences between the ideal values shown in FIGS. 10(a) to 13(a) and the average values shown in FIGS. 10(b) to 13(b).
  • In these tables, the differences from the corresponding ideal values are shown for each of the 3 average values, 5 average values, 7 average values, 9 average values, 11 average values, and 13 average values.
  • As the number of values averaged increases (5 average values rather than 3, 7 rather than 5, and so on), the error from the ideal sine wave (ideal values) decreases, and the 13 average values give the smallest error. Accordingly, in this simulation it is preferable to perform three-dimensional measurement by the phase shift method using the 13 average values.
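The trend described above can be reproduced numerically with the same assumed profile. The sketch below compares box averages over 1 to 13 frames against an ideal sine wave matched in period, with the mean, amplitude, and phase of that sine chosen by assumption (0.5, 0.5, and the centre of the bright portion); the absolute numbers are therefore only indicative, but the deviation generally shrinks as the averaging window grows.

```python
import numpy as np

period = 36
profile = np.zeros(period)
profile[1:17] = 1.0                                 # bright portion X2..X17
profile[[35, 0, 17, 18]] = [1/3, 2/3, 2/3, 1/3]     # assumed 2-pixel half-tone ramps

# Ideal sine wave matched (by assumption) in period, amplitude and phase:
# mean 0.5, amplitude 0.5, peak at the centre of the bright portion (index 8.5).
i = np.arange(period)
ideal = 0.5 + 0.5 * np.cos(2.0 * np.pi * (i - 8.5) / period)

for window in (1, 3, 5, 7, 9, 11, 13):
    # Averaging `window` frames centred on t1 while the fringe moves one pixel
    # per frame is equivalent to a centred box average over `window` pixels.
    shifts = range(-(window // 2), window // 2 + 1)
    averaged = np.mean([np.roll(profile, s) for s in shifts], axis=0)
    print(f"window={window:2d}  max|avg-ideal|={np.max(np.abs(averaged - ideal)):.3f}")
```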
  • In the second simulation, a stripe pattern was projected having a trapezoidal light intensity distribution with one period equal to 36 pixels in the X-axis direction of the image sensor and with an intermediate-gradation region (luminance gradient) of 12 pixels at each boundary between the "bright portion" and the "dark portion", and the stripe pattern was moved in the X-axis direction by one pixel (corresponding to a 10° phase of the stripe pattern) every time the predetermined time Δt elapsed.
  • At the imaging timing t1, the range of coordinates X7 to X12 is the "bright portion" with luminance value "1", and the range of coordinates X25 to X30 is the "dark portion" with luminance value "0". Coordinates X31 to X6 and coordinates X13 to X24, which correspond to the boundaries between the "bright portion" and the "dark portion", each form an intermediate-gradation region of 12 pixels in which the luminance value changes. That is, the light intensity distribution of the stripe pattern at the imaging timing t1 is as shown in the corresponding graph.
  • At the next imaging timing t2, the range of coordinates X8 to X13 becomes the "bright portion" with luminance value "1", and the range of coordinates X26 to X31 becomes the "dark portion" with luminance value "0".
  • At the imaging timing t3, the range of coordinates X9 to X14 becomes the "bright portion" with luminance value "1", and the range of coordinates X27 to X32 becomes the "dark portion" with luminance value "0".
  • the light intensity distribution of the stripe pattern moves to the right in FIGS. 18 to 22 by one pixel every time the predetermined time ⁇ t elapses.
  • FIGS. 23(a) to 26(a) are tables showing the relationship between the coordinate position of each pixel in the X-axis direction of the image sensor (coordinates X1 to X36) and the ideal sinusoidal light intensity distribution (ideal values). Here, an ideal sinusoidal light intensity distribution having the same period, amplitude, and phase as the fringe pattern having the trapezoidal light intensity distribution at the imaging timing t1 is shown; this ideal sine wave at the imaging timing t1 is as shown in the corresponding graph.
  • FIGS. 23(b) to 26(b) are tables showing, for each pixel coordinate position in the X-axis direction of the image sensor (horizontal axis: coordinates X1 to X36), the results (average values) of performing the averaging process on a plurality of image data (luminance values for each pixel) captured within a predetermined time before and after the image data captured at the imaging timing t1.
  • In FIGS. 23(b) to 26(b), the uppermost row shows, as it is, the image data (luminance value for each pixel) captured at the imaging timing t1 without the averaging process.
  • The "7 average values" row shows the results of averaging the seven image data (luminance values for each pixel) captured within three timings before and after the imaging timing t1, that is, at the imaging timings t34 to t4.
  • The "9 average values" row shows the results of averaging the nine image data (luminance values for each pixel) captured within four timings before and after the imaging timing t1, that is, at the imaging timings t33 to t5.
  • FIGS. 23(c) to 26(c) are tables showing, for each pixel coordinate position in the X-axis direction (horizontal axis: coordinates X1 to X36), the differences between the ideal values shown in FIGS. 23(a) to 26(a) and the average values shown in FIGS. 23(b) to 26(b).
  • As described in detail above, in the present embodiment, the stripe pattern having a rectangular-wave or trapezoidal light intensity distribution projected onto the printed circuit board 2 is moved, the moving stripe pattern is imaged a plurality of times, and the luminance values of each pixel of the series of captured image data are added for each pixel to calculate the average value.
  • As a result, image data having a light intensity distribution closer to an ideal sine wave can be acquired than when a stripe pattern having a rectangular-wave or trapezoidal light intensity distribution is simply projected and imaged.
  • Consequently, image data having a sinusoidal light intensity distribution can be acquired by comparatively simple control and arithmetic processing while projecting a fringe pattern whose light intensity distribution is a rectangular or trapezoidal wave rather than a sine wave, without complicating the mechanical configuration. As a result, complication of the mechanical configuration can be suppressed and the manufacturing cost can be reduced.
  • In the above embodiment, the three-dimensional measuring device is embodied in the substrate inspection apparatus 1 that measures the height of the cream solder K printed on the printed circuit board 2, but the invention is not limited to this; it may be embodied, for example, in a configuration that measures the height of other objects such as solder bumps printed on a substrate or electronic components mounted on a substrate.
  • The number of phase shifts and the phase shift amount are not limited to those of the above embodiment.
  • Other numbers of phase shifts and phase shift amounts that allow three-dimensional measurement by the phase shift method may be employed.
  • For example, three types of image data whose phases differ by 120° (or 90°) may be acquired to perform three-dimensional measurement, or two types of image data whose phases differ by 180° (or 90°) may be acquired to perform three-dimensional measurement.
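For the 120° three-step variant mentioned here, the commonly used closed form is θ = arctan[(2·I0 − I1 − I2) / (√3·(I1 − I2))] for intensities I_k = f·sin(θ + k·120°) + e. This is the standard textbook relation rather than a formula quoted from the patent; a small sketch follows.

```python
import numpy as np

def phase_from_three_steps(i0, i1, i2):
    """Recover the fringe phase from three images shifted by 120 degrees each,
    assuming I_k = f*sin(theta + k*120deg) + e per pixel (standard relation,
    not a formula taken from the patent text)."""
    i0, i1, i2 = (np.asarray(a, dtype=float) for a in (i0, i1, i2))
    return np.arctan2(2.0 * i0 - i1 - i2, np.sqrt(3.0) * (i1 - i2))

# Quick check against synthetic data.
theta = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)
f, e = 0.5, 0.5
frames = [f * np.sin(theta + k * 2.0 * np.pi / 3.0) + e for k in range(3)]
recovered = np.mod(phase_from_three_steps(*frames), 2.0 * np.pi)
print(np.allclose(recovered, theta, atol=1e-9))   # True
```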
  • a stripe pattern having a non-sinusoidal light intensity distribution such as a triangular wave shape or a sawtooth wave shape may be projected to obtain image data having a sinusoidal light intensity distribution.
  • a stripe pattern having a rectangular wave-like light intensity distribution without an intermediate gradation range (luminance gradient) may be projected to obtain image data having a sinusoidal light intensity distribution.
  • Further, a fringe pattern having a light intensity distribution that approximates a sine wave, rather than an ideal sine wave, may be projected to obtain image data having a light intensity distribution closer to the ideal sine wave.
  • the configuration of the projection means is not limited to the illumination device 4 according to the above embodiment.
  • the lattice plate 4b is employed as a lattice that converts light from the light source 4a into a stripe pattern.
  • a liquid crystal panel may be used as a lattice.
  • Such a liquid crystal panel includes a liquid crystal layer formed between a pair of transparent substrates, a common electrode disposed on one transparent substrate, and a plurality of strip electrodes arranged in parallel on the other transparent substrate so as to face the common electrode.
  • By controlling on and off, with a drive circuit, the switching elements (thin-film transistors or the like) connected to the respective strip electrodes, and thereby controlling the voltage applied to each strip electrode, the light transmittance of the grating line corresponding to each strip electrode is switched, forming a grating pattern in which light-transmitting portions with high light transmittance and light-shielding portions with low light transmittance are arranged alternately.
  • The grating can be moved (phase-shifted) by switching and controlling the positions of these light-transmitting and light-shielding portions.
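As an illustration of how such a liquid crystal grating could be shifted electronically, the sketch below computes an on/off (transmitting/blocking) pattern for a row of strip electrodes and shifts it by whole electrodes. The electrode counts and the 50% duty cycle are hypothetical choices, not values taken from the patent.

```python
from typing import List

def electrode_pattern(n_electrodes: int, electrodes_per_period: int,
                      shift: int) -> List[bool]:
    """Return True (transmitting) / False (blocking) for each strip electrode.

    The first half of each period is transmitting; `shift` moves the whole
    pattern by that many electrodes, which shifts the projected fringe phase
    by shift/electrodes_per_period of a period.
    """
    half = electrodes_per_period // 2
    return [((i - shift) % electrodes_per_period) < half
            for i in range(n_electrodes)]

# Hypothetical usage: 24 electrodes, 8 electrodes per fringe period,
# shifted by 2 electrodes (= a quarter period, i.e. a 90-degree phase shift).
print("".join("#" if on else "." for on in electrode_pattern(24, 8, 0)))
print("".join("#" if on else "." for on in electrode_pattern(24, 8, 2)))
```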
  • Alternatively, a digital mirror device (DLP (registered trademark)) may be adopted as the grating.
  • In the above embodiment, a binary grating (the lattice plate 4b) in which light-transmitting portions and light-shielding portions are arranged alternately is employed.
  • However, when a liquid crystal panel or the like is used as described above, a multi-valued grating pattern having different transmittances may instead be formed.
  • In the above embodiment, the start timing M1 of the movement process of the lattice plate 4b and the start timing N1 of the imaging process by the camera 5 are set to coincide, and the end timing M2 of the movement process of the lattice plate 4b and the end timing N2 of the imaging process by the camera 5 are likewise set to coincide; however, this is not restrictive.
  • For example, the imaging process by the camera 5 may be started (start timing N1) after the movement of the lattice plate 4b has started (start timing M1), and the imaging process by the camera 5 may be ended (end timing N2) before the movement of the lattice plate 4b stops (end timing M2).
  • Alternatively, the imaging process by the camera 5 may be started (start timing N1) before the movement of the lattice plate 4b starts (start timing M1), and the imaging process by the camera 5 may be ended (end timing N2) at the same time as, or before, the movement of the lattice plate 4b stops (end timing M2).
  • Alternatively, the imaging process by the camera 5 may be started (start timing N1) simultaneously with, or after, the start of movement of the lattice plate 4b (start timing M1), and the imaging process by the camera 5 may be ended (end timing N2) after the movement of the lattice plate 4b stops (end timing M2).
  • Further, the imaging process by the camera 5 may be started (start timing N1) before the movement of the lattice plate 4b starts (start timing M1), and the imaging process by the camera 5 may be ended (end timing N2) after the movement of the lattice plate 4b stops (end timing M2).
  • In the above embodiment, each image acquisition process performs a movement process in which the lattice plate 4b is moved continuously at a constant speed by driving means such as a motor.
  • The driving means for the lattice plate 4b is not limited to a device that moves the lattice plate 4b continuously, such as a motor; a device that moves the lattice plate 4b intermittently (by a predetermined amount at a time), such as a piezo element, may be adopted instead.
  • In that case, one movement process may be performed by a single intermittent movement operation, or by a plurality of intermittent movement operations of a predetermined amount each.
  • In the above embodiment, the lattice plate 4b is stopped each time one image acquisition process ends.
  • Instead, the lattice plate 4b may continue its movement operation while the image acquisition process is performed four times.
  • In the above embodiment, imaging is performed a plurality of times during the movement of the lattice plate 4b, and the luminance values of each pixel of the series of captured image data are added for each pixel to calculate the average value.
  • The invention is not limited to this; the process of calculating the average value may be omitted, and three-dimensional measurement may be performed based on the addition data (image data) obtained by adding, for each pixel, the luminance values of each pixel of the series of image data.
  • a configuration may be adopted in which imaging (exposure) is continuously performed while the lattice plate 4b is moving, and three-dimensional measurement is performed based on the captured image data.
  • In that case, however, if the imaging (exposure) time is long, the imaging element may reach its saturation level and the image may be overexposed (so-called "white-out").
  • By contrast, when imaging (exposure) is repeated a plurality of times during the movement of the grid plate 4b and the luminance values are added for each pixel, as in the above embodiment, an image with a large amount of received light can be obtained without saturation.
  • Of course, as long as the imaging element does not reach its saturation level, performing imaging (exposure) continuously while the grid plate 4b is moving imposes a smaller processing load.
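The saturation point can be illustrated numerically: one long exposure clips at the sensor's full-scale value, whereas summing several shorter exposures in a wider accumulator preserves the signal. The 8-bit full scale and the frame counts below are illustrative assumptions, not sensor specifications from the patent.

```python
import numpy as np

rng = np.random.default_rng(1)
true_flux = rng.uniform(10.0, 40.0, size=(4, 4))   # photons per unit time, per pixel

def expose(duration, full_scale=255):
    """Simulate one exposure: accumulated signal clipped at the 8-bit limit."""
    return np.minimum(true_flux * duration, full_scale)

# One long exposure of 12 time units: bright pixels clip at 255 ("white-out").
single_long = expose(12.0)

# Twelve short exposures of 1 time unit each, summed per pixel in a wide type:
summed_short = np.sum([expose(1.0) for _ in range(12)], axis=0, dtype=np.float64)

print("clipped pixels (long):  ", int(np.count_nonzero(single_long >= 255)))
print("clipped pixels (summed):", int(np.count_nonzero(summed_short >= 12 * 255)))
```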
  • a CCD sensor is employed as the image sensor of the camera 5, but the image sensor is not limited to this, and a CMOS sensor or the like may be employed, for example.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
PCT/JP2016/070238 2015-11-27 2016-07-08 三次元計測装置 WO2017090268A1 (ja)

Priority Applications (3)

Application Number Priority Date Filing Date Title
DE112016005425.4T DE112016005425T5 (de) 2015-11-27 2016-07-08 Vorrichtung zur dreidimensionalen Messung
CN201680046181.9A CN107923736B (zh) 2015-11-27 2016-07-08 三维测量装置
MX2018001490A MX366402B (es) 2015-11-27 2016-07-08 Dispositivo de medicion tridimensional.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-231661 2015-11-27
JP2015231661A JP6062523B1 (ja) 2015-11-27 2015-11-27 三次元計測装置

Publications (1)

Publication Number Publication Date
WO2017090268A1 true WO2017090268A1 (ja) 2017-06-01

Family

ID=57800032

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/070238 WO2017090268A1 (ja) 2015-11-27 2016-07-08 三次元計測装置

Country Status (6)

Country Link
JP (1) JP6062523B1 (es)
CN (1) CN107923736B (es)
DE (1) DE112016005425T5 (es)
MX (1) MX366402B (es)
TW (1) TWI610061B (es)
WO (1) WO2017090268A1 (es)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7371443B2 (ja) * 2019-10-28 2023-10-31 株式会社デンソーウェーブ 三次元計測装置

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007113958A (ja) * 2005-10-18 2007-05-10 Yamatake Corp 3次元計測装置、3次元計測方法、及び3次元計測プログラム
JP2013167464A (ja) * 2012-02-14 2013-08-29 Ckd Corp 三次元計測装置
JP2015087244A (ja) * 2013-10-30 2015-05-07 キヤノン株式会社 画像処理装置、画像処理方法

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4701948B2 (ja) * 2005-09-21 2011-06-15 オムロン株式会社 パタン光照射装置、3次元形状計測装置、及びパタン光照射方法
CN100561258C (zh) * 2008-02-01 2009-11-18 黑龙江科技学院 一种三维测量系统中相移光栅
JP2010276607A (ja) * 2009-05-27 2010-12-09 Koh Young Technology Inc 3次元形状測定装置および測定方法
JP2013124938A (ja) * 2011-12-15 2013-06-24 Ckd Corp 三次元計測装置

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007113958A (ja) * 2005-10-18 2007-05-10 Yamatake Corp 3次元計測装置、3次元計測方法、及び3次元計測プログラム
JP2013167464A (ja) * 2012-02-14 2013-08-29 Ckd Corp 三次元計測装置
JP2015087244A (ja) * 2013-10-30 2015-05-07 キヤノン株式会社 画像処理装置、画像処理方法

Also Published As

Publication number Publication date
MX366402B (es) 2019-07-08
JP6062523B1 (ja) 2017-01-18
TWI610061B (zh) 2018-01-01
MX2018001490A (es) 2018-08-01
TW201719112A (zh) 2017-06-01
CN107923736B (zh) 2020-01-24
JP2017096866A (ja) 2017-06-01
CN107923736A (zh) 2018-04-17
DE112016005425T5 (de) 2018-08-16

Similar Documents

Publication Publication Date Title
JP6027220B1 (ja) 三次元計測装置
JP4744610B2 (ja) 三次元計測装置
JP6109255B2 (ja) 三次元計測装置
JP6189984B2 (ja) 三次元計測装置
JP5443303B2 (ja) 外観検査装置及び外観検査方法
JP5847568B2 (ja) 三次元計測装置
JP5957575B1 (ja) 三次元計測装置
JP6353573B1 (ja) 三次元計測装置
JP2013124938A (ja) 三次元計測装置
JP5640025B2 (ja) 三次元計測装置
JP6062523B1 (ja) 三次元計測装置
WO2016203668A1 (ja) 三次元計測装置
TWI580927B (zh) Three - dimensional measuring device and three - dimensional measurement method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16868219

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: MX/A/2018/001490

Country of ref document: MX

WWE Wipo information: entry into national phase

Ref document number: 112016005425

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16868219

Country of ref document: EP

Kind code of ref document: A1