US20120194647A1 - Three-dimensional measuring apparatus - Google Patents

Three-dimensional measuring apparatus

Info

Publication number: US20120194647A1
Application number: US 13/357,042
Authority: US (United States)
Language: English (en)
Inventor: Takumi Tomaru
Original/Current Assignee: Sony Corporation (application filed by Sony Corp; assignor: Tomaru, Takumi)
Legal status: Abandoned

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B 11/2518 - Projection by scanning of the object
    • G01B 11/2527 - Projection by scanning of the object with phase change by in-plane movement of the pattern
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/50 - Depth or shape recovery
    • G06T 7/521 - Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/50 - Depth or shape recovery
    • G06T 7/55 - Depth or shape recovery from multiple images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10016 - Video; Image sequence
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10141 - Special mode during image acquisition
    • G06T 2207/10152 - Varying illumination
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30108 - Industrial image inspection
    • G06T 2207/30141 - Printed circuit board [PCB]

Definitions

  • The present disclosure relates to a three-dimensional measuring apparatus capable of three-dimensionally measuring a measurement object using a phase shift method or the like.
  • a method of analyzing images obtained by imaging a measurement object and inspecting the quality of the measurement object has been used as a method of inspecting the quality of a measurement object such as a print substrate.
  • a method of measuring the three-dimensional shape of a measurement object through three-dimensional image analysis and inspecting the quality of the measurement object has recently been used.
  • A phase shift method (time stripe analysis method), which is a kind of optical cutting method, is widely used (for example, see Japanese Unexamined Patent Application Publication No. 2010-169433, paragraphs [0033], [0042] to [0044], and FIG. 1).
  • The three-dimensional measuring apparatus disclosed in Japanese Unexamined Patent Application Publication No. 2010-169433 includes a placing stand on which a print substrate printed with a solder is placed and an illumination device that obliquely illuminates the surface of the print substrate from above with a stripe-shaped light pattern. The three-dimensional measuring apparatus further includes a CCD camera that images a portion illuminated with the stripe-shaped light pattern from directly above the print substrate and a control device that performs image processing or arithmetic processing.
  • the illumination device illuminates the print substrate with a light pattern by shifting the phase of the light pattern by a 1 ⁇ 4 pitch.
  • the CCD camera captures images of the print substrate illuminated with the light pattern when the phase of the light pattern is shifted and acquires a total of four images.
  • the control device calculates the height of each of the coordinates of the print substrate by the phase shift method by processing the acquired four images.
  • On the surface of the measurement object, inspection objects such as a solder protruding from a reference surface or a cavity recessed from the reference surface are formed.
  • a shadow may be formed on the surface of the measurement object.
  • Halation may occur depending on the angle of the light incident on the inspection object. In this case, the three-dimensional shape of the inspection object may not accurately be calculated.
  • According to an embodiment of the present disclosure, there is provided a three-dimensional measuring apparatus including a projecting unit, an imaging unit, and a control unit.
  • the projecting unit projects a stripe to a projectable region which is a peripheral region of an intersection point on a measurement object when a vertical line is drawn down toward the measurement object.
  • the imaging unit includes a plurality of imaging regions at which the measurement object to which the stripe is projected is imaged within the projectable region.
  • the control unit performs a process of three-dimensionally measuring the measurement object based on an image imaged by the imaging unit.
  • the projecting unit is capable of projecting the stripe to the projectable region which is the peripheral region of the intersection point on the measurement object when the vertical line is drawn down toward the measurement object. Accordingly, in this embodiment, the stripe can be projected in a wide range by the projecting unit. Further, the projectable region of the wide range is effectively used and the plurality of imaging regions are disposed in the projectable region. For example, by appropriately disposing the plurality of imaging regions in the projectable region and imaging the inspection region of the measurement object at these imaging regions, the problem with shadow or the problem with halation can be resolved with the inspection object in the inspection region.
  • the three-dimensional shapes of the plurality of inspection regions can be calculated at high speed by simultaneously locating the two or more inspection regions at two or more imaging regions and allowing the imaging unit to simultaneously image the inspection regions.
  • the three-dimensional measuring apparatus may further include a moving unit.
  • the moving unit moves relative positions of the measurement object with respect to the plurality of imaging regions.
  • The control unit may control the movement of the relative positions by the moving unit such that the moving unit moves one inspection region in the measurement object sequentially to two or more imaging regions among the plurality of imaging regions, and may control the imaging unit such that the imaging unit images the inspection region whenever the inspection region is matched with the positions of the imaging regions.
  • the relative positions of the measurement object with respect to the plurality of imaging regions can be moved and one inspection region in the measurement object can be imaged at each of the different imaging regions.
  • For example, by disposing the imaging regions at the positions for appropriately cancelling the shadow occurring in the inspection object in the inspection regions and imaging the one inspection region at each of the imaging regions, the problem with the shadow can be resolved.
  • One inspection region can be imaged at each of the different imaging regions. For example, even when halation occurs when the inspection region is imaged at a given imaging region, the problem with the halation can be resolved by imaging the inspection object at a different imaging region.
  • the control unit controls the movement of the relative positions by the moving unit such that the moving unit simultaneously moves two or more inspection regions in the measurement object to two or more imaging regions among the plurality of imaging regions and controls the imaging unit such that the imaging unit simultaneously images the two or more inspection regions.
  • the two or more inspection regions can simultaneously be located at the two or more imaging regions and can simultaneously be imaged by the imaging unit. Therefore, the three-dimensional shapes of the plurality of inspection regions can be calculated at high speed.
  • The control unit may switch between a first mode and a second mode.
  • In the first mode, the control unit controls the movement of the relative positions by the moving unit such that the moving unit moves one inspection region in the measurement object sequentially to two or more imaging regions among the plurality of imaging regions and controls the imaging unit such that the imaging unit images the inspection region whenever the inspection region is matched with the positions of the imaging regions.
  • In the second mode, the control unit controls the movement of the relative positions by the moving unit such that the moving unit simultaneously moves two or more inspection regions in the measurement object to two or more imaging regions among the plurality of imaging regions and controls the imaging unit such that the imaging unit simultaneously images the two or more inspection regions.
  • The control unit can arbitrarily switch between the first mode, in which the problem with the shadow is resolved and the inspection region is accurately inspected, and the second mode, in which two or more inspection regions are simultaneously imaged at high speed by the imaging unit and the three-dimensional shapes of the inspection regions are calculated.
  • At least one of the plurality of imaging regions may be located at a position in a range of 180°±90° with respect to the other imaging regions in the projectable region.
  • the influence of the shadow occurring in the inspection object can appropriately be excluded.
  • At least one of the plurality of imaging regions may be located at a position in a range of 180°±45° with respect to the other imaging regions in the projectable region.
  • the influence of the shadow occurring in the inspection object can appropriately be excluded.
  • At least one of the plurality of imaging regions may be located at a position deviated from 0°, ±90°, and 180° in the projectable region.
  • the influence of the halation can appropriately be excluded.
  • At least one of the plurality of imaging regions may be located at a position deviated from ±45° and ±135° in the projectable region.
  • the influence of the halation can appropriately be excluded.
  • At least one of the plurality of imaging regions may be located at a position deviated from 0°, ±45°, ±90°, ±135°, and 180° in the projectable region.
  • the influence of the halation can appropriately be excluded.
  • At least one of the plurality of imaging regions may be different from the other imaging regions in a distance between the imaging region and the intersection point in the projectable region.
  • the imaging unit may include a plurality of imaging sections corresponding to the plurality of imaging regions.
  • the imaging unit may include an imaging section that is able to singly image two or more imaging regions among the plurality of imaging regions.
  • In the three-dimensional measuring apparatus, when the imaging unit includes the imaging section capable of singly imaging two or more imaging regions, the three-dimensional measuring apparatus may further include a reflection unit and a driving unit.
  • the driving unit drives the reflection unit so as to switch between a first incident state in which light from one of the plurality of imaging regions is incident on the imaging section and a second incident state in which light from another imaging region is guided to the imaging section by the reflection unit.
  • the projecting unit may include a mask that restricts the projection of the stripe so that the stripe is not projected to a region other than the plurality of imaging regions in the projectable region.
  • As described above, according to the embodiments of the present disclosure, it is possible to provide the three-dimensional measuring apparatus capable of resolving the problem with shadow or halation.
  • FIG. 1 is a diagram illustrating a three-dimensional measuring apparatus according to an embodiment of the present disclosure
  • FIG. 2 is a diagram illustrating an example of a measurement object to be three-dimensionally measured by the three-dimensional measuring apparatus
  • FIG. 3 is a diagram illustrating an optical system of a projecting unit
  • FIG. 4 is a plan view illustrating a light-shielding mask
  • FIG. 5 is a plan view illustrating the shape of a stripe projected on a substrate and illustrating a relationship between a projectable region and an imaging region;
  • FIG. 6 is a flowchart illustrating a process of a control unit of the three-dimensional measuring apparatus according to an embodiment of the present disclosure
  • FIG. 7 is a diagram illustrating the shape of a shadow formed by an inspection object such as a solder formed in an inspection region when the inspection region is located at a first imaging region;
  • FIG. 8 is a diagram illustrating the shape of a shadow formed by an inspection object when the inspection region is located at a second imaging region
  • FIG. 9 is a diagram illustrating the locations of a plurality of imaging regions for excluding the influence of shadows and illustrating the shapes of the shadows formed when inspection objects are located at the imaging regions;
  • FIG. 10 is a diagram illustrating the shapes of shadows formed when the inspection objects are located at the imaging regions
  • FIG. 11 is a diagram illustrating the shapes of shadows formed when the inspection objects are located at the imaging regions
  • FIG. 12 is a diagram illustrating the shapes of shadows formed when the inspection objects are located at the imaging regions
  • FIG. 13 is a diagram illustrating a plurality of imaging regions disposed in a projectable region according to another embodiment of the present disclosure
  • FIG. 14 is a diagram illustrating the locations of a plurality of imaging regions for excluding the influence of halation and illustrating the influence of the halation when inspection objects are located at the imaging regions;
  • FIG. 15 is a diagram illustrating the influence of the halation when the inspection objects are located at the imaging regions
  • FIG. 16 is a diagram illustrating the influence of the halation when the inspection objects are located at the imaging regions
  • FIG. 17 is a diagram illustrating the influence of the halation when the inspection objects are located at the imaging regions
  • FIG. 18 is a flowchart illustrating a process of a three-dimensional measuring apparatus according to another embodiment
  • FIGS. 19A to 19D are diagrams illustrating the influence of the halation when the imaging regions are disposed at the positions deviated from 0°, ±45°, ±90°, ±135°, and 180°;
  • FIG. 20 is a diagram illustrating a plurality of imaging regions disposed in a projectable region according to still another embodiment.
  • FIG. 21 is a diagram illustrating an example of an imaging section capable of singly imaging two or more imaging regions.
  • FIG. 1 is a diagram illustrating a three-dimensional measuring apparatus 100 according to a first embodiment of the present disclosure.
  • FIG. 2 is a diagram illustrating an example of a measurement object 10 to be three-dimensionally measured by three-dimensional measuring apparatus 100 .
  • an example of a substrate 10 having a plurality of inspection regions 11 ( 11 A to 11 J) will be described as an example of a measurement object 10 to be three-dimensionally measured by the three-dimensional measuring apparatus 100 .
  • In the inspection regions 11, for example, solders are printed as inspection objects 12 (see FIGS. 7 and 8 and the like).
  • the substrate 10 has ten inspection regions 11 .
  • the three-dimensional measuring apparatus 100 includes a stage 45 on which a substrate 10 is placed, a stage moving mechanism 46 (moving unit), a projecting unit 20 , an imaging unit 30 , a control unit 41 , a storage unit 42 , a display unit 43 , and an input unit 44 .
  • the projecting unit 20 is configured by, for example, a projector.
  • the projecting unit 20 is disposed such that an optical axis 4 is perpendicular to the surface (projection surface) of the substrate 10 .
  • FIG. 3 is a diagram illustrating an optical system 21 of the projecting unit 20 .
  • FIG. 4 is a plan view illustrating a light-shielding mask 26 .
  • FIG. 5 is a plan view illustrating the shape of a stripe projected to the substrate 10 and illustrating a relationship between a projectable region 2 and imaging regions 1 .
  • the projecting unit 20 includes the optical system 21 .
  • The optical system 21 includes a light source 22, a condensing lens 23 that condenses light from the light source 22, a phase grating 24 through which the light condensed by the condensing lens 23 passes to form a stripe, and a projecting lens 25 that projects the light passing through the phase grating 24 onto the surface of the substrate 10.
  • the optical system 21 is configured such that the optical axis 4 of the entire optical system 21 is perpendicular to the surface of the substrate 10 .
  • Examples of the light source 22 include a halogen lamp, a xenon lamp, a mercury lamp, and an LED (Light Emitting Diode), but the kind of the light source 22 is not particularly limited.
  • The phase grating 24 includes a plurality of slits. The phase grating 24 forms a stripe whose luminance varies sinusoidally by means of the plurality of slits and projects the stripe to the surface of the substrate 10.
  • the phase grating 24 is provided with a grating moving mechanism (not shown) that moves the phase grating 24 in a direction (X axis direction) perpendicular to a direction in which the slits are formed.
  • the grating moving mechanism moves the phase grating 24 under the control of the control unit 41 and shifts the phase of the stripe projected to the substrate 10 .
  • a liquid crystal grating or the like that displays a grating-shaped stripe may be used instead of the phase grating 24 and the grating moving mechanism.
  • the projecting unit 20 can project the stripe to the peripheral region of an intersection point 3 between the vertical line and the surface (projection surface) of the substrate 10 .
  • a region which is the peripheral region of the intersection point 3 between the vertical line and the surface (projection surface) of the substrate 10 and to which the stripe can be projected is referred to as the projectable region 2 .
  • the vertical line oriented from the projecting unit 20 to the substrate 10 is identical with the optical axis since the optical axis 4 is perpendicular to the surface of the substrate 10 .
  • the peripheral region of the intersection point 3 between the optical axis and the surface (projection surface) of the substrate becomes the projectable region 2 .
  • Since the optical axis 4 of the projecting unit 20 is perpendicular to the surface of the substrate 10, the projecting unit 20 can project the stripe with a uniform width, without image distortion, to the entire projectable region 2.
  • By contrast, if the optical axis were inclined with respect to the surface of the substrate, the stripe image may be distorted in some cases.
  • a plurality of imaging regions 1 are disposed in the projectable region 2 .
  • the inspection regions 11 to which the stripe is projected are imaged at the imaging regions 1 .
  • Since the projectable region 2 is configured as a broad region centered at the intersection point 3, the plurality of imaging regions 1 can be provided within the projectable region 2.
  • Further, since the projecting unit 20 can project the stripe with the uniform width and without image distortion to the entire projectable region 2, the imaging regions 1 can be disposed at any position within the projectable region 2.
  • the number of imaging regions 1 or the positions of the imaging regions 1 in the projectable region 2 may be arbitrarily set in order to exclude the influence of a shadow, exclude the influence of halation, and improve an inspection speed.
  • the plurality of imaging regions 1 are disposed in the projectable region 2 , for example, in order to exclude the influence of a shadow.
  • the number of imaging regions 1 is two.
  • That is, two imaging regions, a first imaging region 1A and a second imaging region 1B, are disposed.
  • the first imaging region 1 A and the second imaging region 1 B are disposed at the positions opposite (180°) to each other with reference to the intersection point 3 .
  • the relative positions of the imaging regions 1 in the projectable region 2 will be described in detail later.
  • the light-shielding mask 26 is formed below the projecting lens 25 .
  • If the stripe is projected to a region other than the first imaging region 1A and the second imaging region 1B in the projectable region 2, diffused reflection occurs and the diffusely reflected stripe is incident on the imaging unit 30. Therefore, there is a concern that the measurement accuracy of the inspection object 12 may deteriorate.
  • the light-shielding mask 26 that restricts the projection of the stripe is provided in order to prevent the stripe from being projected to a region other than the first imaging region 1 A and the second imaging region 1 B in the projectable region 2 .
  • the light-shielding mask 26 is formed, for example, by forming openings 27 ( 27 A and 27 B) in a circular thin plate member.
  • the openings 27 A and 27 B are disposed at the positions corresponding to the positions of the first imaging region 1 A and the second imaging region 1 B disposed in the projectable region 2 .
  • the light-shielding mask 26 can prevent the stripe from being projected to a region other than the first imaging region 1 A and the second imaging region 1 B.
  • Thus, it is possible to prevent the measurement accuracy of the inspection object 12 such as a solder from deteriorating due to the diffusely reflected stripe being incident on the imaging unit 30.
  • the light-shielding mask 26 is disposed below the projecting lens 25 .
  • the light-shielding mask 26 may be disposed between the light source 22 and the condensing lens 23 , between the condensing lens 23 and the phase grating 24 , or between the phase grating 24 and the projecting lens 25 .
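  • As an illustrative sketch only (not part of the disclosure), if the mask plane is assumed to map onto the projection surface by a simple scale factor about the optical axis 4, the center of each opening 27 can be derived from the angle and radial distance of the corresponding imaging region 1. The function name, the scale-factor model, and the numeric values below are assumptions for illustration:

```python
import math

def mask_opening_center(region_angle_deg, region_radius_mm, mask_scale):
    """Approximate center of a mask opening 27 for an imaging region 1,
    assuming the mask plane maps to the projection surface by a pure
    scale factor about the optical axis (a simplifying assumption)."""
    theta = math.radians(region_angle_deg)
    return (mask_scale * region_radius_mm * math.cos(theta),
            mask_scale * region_radius_mm * math.sin(theta))

# First embodiment: imaging regions 1A and 1B at 0° and 180°, assumed to be
# 40 mm from the intersection point 3, with the mask at 1/10 scale.
opening_27a = mask_opening_center(0.0, 40.0, 0.1)    # -> (4.0, 0.0)
opening_27b = mask_opening_center(180.0, 40.0, 0.1)  # -> (-4.0, ~0.0)
```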
  • the imaging unit 30 includes a first imaging section 31 and a second imaging section 32 corresponding to the first imaging region 1 A and the second imaging region 1 B, respectively.
  • the first imaging section 31 and the second imaging section 32 each include an imaging element such as a CCD (Charge Coupled Device) sensor or a CMOS (Complementary Metal Oxide Semiconductor) sensor.
  • the first imaging section 31 and the second imaging section 32 each include an optical system such as an image forming lens that forms an image on the imaging surface of the imaging element with the light from the imaging region 1 .
  • the first imaging section 31 and the second imaging section 32 image the inspection regions 11 on the substrate 10 to which the stripe is projected under the control of the control unit 41 .
  • the stage 45 is configured to hold the substrate 10 .
  • the stage moving mechanism 46 moves the stage 45 in XY directions in response to a driving signal from the control unit 41 .
  • the stage moving mechanism 46 varies the relative position of the inspection region 11 of the substrate 10 with respect to the imaging region 1 by moving the stage 45 in the XY directions.
  • the display unit 43 is configured by, for example, a liquid crystal display.
  • the display unit 43 displays the three-dimensional image of the inspection region 11 on the substrate 10 under the control of the control unit 41 .
  • the input unit 44 is configured by a keyboard, a mouse, a touch panel, or the like. The input unit 44 inputs an instruction from a user.
  • the storage unit 42 includes a non-volatile memory such as a ROM (Read Only Memory) storing various kinds of programs necessary for the process of the three-dimensional measuring apparatus 100 and a volatile memory such as a RAM (Random Access Memory) used as a working area of the control unit 41 .
  • the control unit 41 is configured by, for example, a CPU (Central Processing Unit).
  • the control unit 41 controls the three-dimensional measuring apparatus 100 on the whole based on the various kinds of programs stored in the storage unit 42 .
  • the control unit 41 controls the stage moving mechanism 46 such that the stage moving mechanism 46 varies the relative position of the inspection region 11 with respect to the imaging region 1 to match the inspection region 11 and the position of the imaging region 1 .
  • the control unit 41 controls the grating moving mechanism such that the grating moving mechanism shifts the phase of the stripe projected to the substrate 10 .
  • The control unit 41 controls the imaging unit 30 such that the imaging unit 30 captures the images of the inspection region 11 to which the stripe is projected, and calculates the three-dimensional shape of the inspection region 11 by a phase shift method based on the captured images.
  • FIG. 6 is a flowchart illustrating the process of the control unit 41 of the three-dimensional measuring apparatus 100 .
  • control unit 41 of the three-dimensional measuring apparatus 100 controls the stage moving mechanism 46 such that the stage moving mechanism 46 moves the stage 45 up to the acceptance position of the substrate 10 .
  • the stage moving mechanism 46 accepts the substrate 10 from a substrate delivering device (not shown) (ST 101 ).
  • control unit 41 controls the stage moving mechanism 46 such that the stage moving mechanism 46 moves the stage 45 and matches one inspection region 11 among the plurality of inspection regions 11 with the position of the first imaging region 1 A (ST 102 ).
  • the inspection region 11 A located in the left upper portion in FIG. 2 among the plurality of inspection regions 11 is matched with the position of the first imaging region 1 A.
  • the control unit 41 emits light from the projecting unit 20 such that the projecting unit 20 projects the stripe to the inspection region 11 matched with the position of the first imaging region 1 A (ST 103 ).
  • FIG. 7 is a diagram illustrating the shape of a shadow formed by the inspection object 12 such as a solder formed in the inspection region 11 when the inspection region 11 is located at the first imaging region 1 A.
  • the shadow is formed on the right side of the inspection object 12 .
  • When the stripe is projected, the control unit 41 allows the first imaging section 31 to image the inspection region 11, to which the stripe is projected, at the first imaging region 1A (ST 104). Next, the control unit 41 controls the grating moving mechanism such that the grating moving mechanism moves the phase grating 24 in order to shift the phase of the stripe projected to the inspection region 11 by π/2 [rad] (ST 105). When the control unit 41 shifts the phase of the stripe, the control unit 41 subsequently determines whether four images have been captured (ST 106).
  • When four images have not yet been captured (NO in ST 106), the process returns to ST 104 and the control unit 41 images the inspection region 11 to which the stripe is projected. Then, the control unit 41 again shifts the phase of the stripe by π/2 [rad] (ST 105) and again determines whether four images have been captured (ST 106). In this way, a total of four images whose stripe phases are different from each other are captured.
  • the control unit 41 calculates the height of each pixel of the image based on the four images by the phase shift method (ST 107 ).
  • the control unit 41 extracts the luminance value of each pixel (coordinates (x, y)) from the four images and calculates the phase φ(x, y) of each pixel by applying Equation (1) below.
  • the control unit 41 calculates the height of each pixel by the triangulation principle based on the calculated phase φ(x, y) of each pixel.
  • I0(x, y), Iπ/2(x, y), Iπ(x, y), and I3π/2(x, y) are the luminance values of the pixel at coordinates (x, y) when the phase is 0, π/2, π, and 3π/2, respectively.
  • φ(x, y) = tan⁻¹{ [I3π/2(x, y) − Iπ/2(x, y)] / [I0(x, y) − Iπ(x, y)] }   (1)
  • Note that the control unit 41 may not be able to calculate the height of each pixel in the shadowed portion.
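  • As an illustrative sketch only (not part of the disclosure), the phase computation of Equation (1) and a simplified height conversion could be written as follows. The function names, the use of arctan2 to obtain the full-range phase, the modulation threshold used to mark shadowed pixels as invalid, and the simplified triangulation model are all assumptions for illustration:

```python
import numpy as np

def phase_from_four_images(i_0, i_half_pi, i_pi, i_three_half_pi, modulation_threshold=1.0):
    """Wrapped phase map from four stripe images whose phases differ by pi/2,
    following Equation (1): phi = atan[(I_3pi/2 - I_pi/2) / (I_0 - I_pi)]."""
    num = i_three_half_pi.astype(float) - i_half_pi.astype(float)
    den = i_0.astype(float) - i_pi.astype(float)
    phase = np.arctan2(num, den)       # full-range wrapped phase in (-pi, pi]
    # Pixels with almost no stripe modulation (for example, shadowed pixels)
    # carry no reliable phase, so mark them as invalid.
    modulation = np.hypot(num, den)
    return np.where(modulation < modulation_threshold, np.nan, phase)

def height_from_phase(phase, stripe_pitch_mm, triangulation_angle_rad):
    """Height by a simplified triangulation model in which one fringe period
    of phase corresponds to stripe_pitch_mm / tan(triangulation_angle_rad)."""
    return phase / (2.0 * np.pi) * stripe_pitch_mm / np.tan(triangulation_angle_rad)
```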
  • The control unit 41 subsequently controls the stage moving mechanism 46 such that the stage moving mechanism 46 varies the relative positions of the inspection regions 11 with respect to the imaging regions 1A and 1B. Then, the control unit 41 matches the inspection region 11 that was imaged immediately before at the first imaging region 1A with the position of the second imaging region 1B (ST 108). When the inspection region 11 is matched with the position of the second imaging region 1B, the stripe is projected to the inspection region 11 (ST 109).
  • FIG. 8 is a diagram illustrating the shape of a shadow formed by the inspection object 12 when the inspection region 11 is located at the second imaging region 1 B.
  • a shadow is formed on the left side of the inspection object 12 such as a solder.
  • the stripe can be projected to the right side of the inspection object 12 on which the shadow has been formed in FIG. 7 .
  • When the stripe is projected, the control unit 41 allows the second imaging section 32 to image the inspection region 11 to which the stripe is projected (ST 110). Next, the control unit 41 controls the grating moving mechanism such that the grating moving mechanism moves the phase grating 24 in order to shift the phase of the stripe projected to the inspection region 11 at the second imaging region 1B by π/2 [rad] (ST 111). Next, the control unit 41 determines whether four images have been captured (ST 112).
  • the control unit 41 calculates the height of each pixel of the image based on the four images by the phase shift method (ST 113 ). In this case, the control unit 41 calculates the height of each pixel (coordinates) by the phase shift method using Equation (1) above.
  • In this case, the control unit 41 can interpolate the pixels (coordinates) whose heights could not be calculated due to the shadow in ST 107. In this way, the control unit 41 can accurately calculate the shape of the inspection object 12 in the inspection region 11. That is, the influence of the shadow can be excluded by imaging one inspection region 11 at the plurality of imaging regions 1 and calculating the three-dimensional shape of the inspection object 12. Accordingly, the shape of the inspection object 12 can accurately be calculated.
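  • The interpolation of shadowed pixels from the image captured at the other imaging region could be sketched as follows. This is an illustrative sketch only: it assumes the two height maps have already been registered to a common pixel grid, and averaging the pixels that are valid in both maps is an illustrative choice rather than the method of the disclosure:

```python
import numpy as np

def merge_height_maps(height_a, height_b):
    """Combine the height maps of one inspection region 11 measured at the
    first and second imaging regions. Pixels shadowed at one imaging region
    (NaN) are filled from the other; pixels valid in both maps are averaged."""
    merged = np.where(np.isnan(height_a), height_b, height_a)
    both_valid = ~np.isnan(height_a) & ~np.isnan(height_b)
    merged[both_valid] = 0.5 * (height_a[both_valid] + height_b[both_valid])
    return merged
```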
  • the control unit 41 displays the three-dimensional shape of the inspection region 11 on the display unit 43 .
  • the user views the three-dimensional image and inspects the inspection object 12 formed in the inspection region 11 of the substrate 10 . Since the image displayed on the display unit 43 is an image accurately formed by excluding the influence of the shadow, a minute defect of the inspection object 12 can accurately be determined.
  • The control unit 41 determines whether the calculation of the three-dimensional shape of the inspection region 11 by the phase shift method ends for all of the inspection regions 11 (ST 114). When the calculation of the three-dimensional shape does not end for all of the inspection regions 11 (NO in ST 114), the control unit 41 transitions to the calculation of the three-dimensional shape of the subsequent inspection region 11 (ST 115) and matches the subsequent inspection region 11 with the position of the first imaging region 1A (ST 102). Then, the processes of ST 102 to ST 114 are repeated.
  • the sequence of the calculation of the three-dimensional shapes of the inspection regions 11 is not particularly limited. For example, the three-dimensional shape is calculated in the sequence of the inspection regions 11 A, 11 B, 11 C, . . . , and 11 J.
  • the control unit 41 controls the stage moving mechanism 46 such that the stage moving mechanism 46 moves the substrate 10 up to a discharging position and discharges the substrate 10 (ST 116 ). Then, the control unit 41 accepts a new substrate 10 from the substrate delivering device and executes the processes of ST 101 to ST 116 .
  • In the above description, one inspection region 11 is imaged at both of the imaging regions 1, that is, the first imaging region 1A and the second imaging region 1B, before the subsequent inspection region 11 is imaged. However, all of the inspection regions 11 may first be imaged at the first imaging region 1A, and then all of the inspection regions 11 may be imaged at the second imaging region 1B.
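  • The flow of FIG. 6 (ST 101 to ST 116) can be summarized by the sketch below. The hardware interfaces (stage, projector, grating_mover, camera_a, camera_b) are hypothetical stand-ins for the stage moving mechanism 46, the projecting unit 20, the grating moving mechanism, and the imaging sections 31 and 32, and heights_by_phase_shift and merge_height_maps are assumed to wrap the phase, height, and shadow-interpolation steps sketched above:

```python
import numpy as np

PHASE_STEPS = 4  # four images per imaging region, each phase shifted by pi/2

def capture_phase_shifted_images(camera, grating_mover):
    """ST 104 to ST 106 / ST 110 to ST 112: image the stripe, shift its phase, repeat."""
    images = []
    for _ in range(PHASE_STEPS):
        images.append(camera.capture())
        grating_mover.shift_phase(np.pi / 2.0)
    return images

def measure_substrate(stage, projector, grating_mover, camera_a, camera_b,
                      inspection_regions, heights_by_phase_shift, merge_height_maps):
    """Each inspection region is imaged at both imaging regions and the two
    height maps are merged to exclude the influence of shadows."""
    results = {}
    projector.on()                                          # ST 103 / ST 109
    for region in inspection_regions:                       # ST 115: next region
        stage.move_region_to(region, "imaging_region_1A")   # ST 102
        images_a = capture_phase_shifted_images(camera_a, grating_mover)
        height_a = heights_by_phase_shift(images_a)         # ST 107
        stage.move_region_to(region, "imaging_region_1B")   # ST 108
        images_b = capture_phase_shifted_images(camera_b, grating_mover)
        height_b = heights_by_phase_shift(images_b)         # ST 113
        results[region] = merge_height_maps(height_a, height_b)
    stage.discharge_substrate()                             # ST 116
    return results
```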
  • FIGS. 9 to 12 are diagrams illustrating the positions of the plurality of imaging regions 1 for excluding the influence of a shadow and illustrating the shapes of shadows formed when inspection objects 12 ( 12 A to 12 D) are located at imaging regions 1 ( 1 a to 1 h ).
  • FIGS. 9 to 12 show the shapes of the shadows when the inspection objects 12 are disposed in the inspection regions 11 on the substrate 10 in the lateral direction (0°), the longitudinal direction (90°), the direction of +45°, and the direction of −45°, respectively.
  • The inspection objects 12 such as solders formed at the inspection regions 11 on the substrate 10 generally have a rectangular parallelepiped shape elongated in one direction in many cases.
  • the inspection objects 12 ( 12 A and 12 B) are most commonly formed in the lateral direction (0°) (X axis direction) or the longitudinal direction (90°) (Y axis direction) on the substrate 10 .
  • The inspection objects 12 are most commonly formed in the lateral or longitudinal direction and, as shown in FIGS. 11 and 12, the inspection objects 12 (12C and 12D) are second most commonly formed in the directions of ±45°.
  • the relative position of two imaging regions 1 in the projectable region 2 will be described with reference to FIGS. 9 to 12 .
  • one imaging region 1 of the two imaging regions 1 is disposed at the position (reference position) of the imaging region 1 a .
  • An appropriate relative position of the other imaging region 1 with respect to the one imaging region 1 will be described. The description will be made on the assumption that the position of the imaging region 1 a is 0° in the projectable region 2 .
  • the inspection objects 12 ( 12 A and 12 B) are formed in the lateral or longitudinal direction in the inspection regions 11 .
  • a shadow is formed on the right side of the inspection object 12 .
  • When the inspection object 12 is located at some of the other imaging regions 1, a shadow is again formed in the portion in which the shadow is formed when the inspection object 12 is located at the imaging region 1a.
  • When the inspection object 12 is located at an imaging region 1 within the range of 180°±90° with respect to the imaging region 1a, however, a shadow is not formed in the portion in which the shadow is formed when the inspection object 12 is located at the imaging region 1a.
  • the influence of the shadow can appropriately be excluded as long as the other imaging region 1 falls within a range of 180°±90° with respect to the one imaging region 1.
  • When the inspection object 12 is located at some of the other imaging regions 1, a shadow is again formed in the portion in which the shadow is formed when the inspection object 12 is located at the imaging region 1a.
  • When the inspection object 12 is located at an imaging region 1 within the range of 180°±45° with respect to the imaging region 1a, however, a shadow is not formed in the portion in which the shadow is formed when the inspection object 12 is located at the imaging region 1a.
  • the influence of the shadow can appropriately be excluded as long as the other imaging region 1 falls within a range of 180°±45° with respect to the one imaging region 1.
  • Accordingly, even when the inspection object 12 is formed in any of the lateral direction, the longitudinal direction, the direction of +45°, and the direction of −45°, the influence of the shadow can appropriately be excluded as long as the other imaging region 1 falls within the range of 180°±45° with respect to the one imaging region 1.
  • one inspection object 12 is disposed at one inspection region 11 in order to facilitate the understanding.
  • the plurality of inspection objects 12 in different directions are formed in one inspection region 11 in some cases.
  • The inspection objects 12A, 12B, 12C, and 12D in the lateral direction, the longitudinal direction, the direction of +45°, and the direction of −45° are formed in one inspection region 11 in some cases. Even in such cases, the influence of the shadow can appropriately be excluded as long as the other imaging region 1 falls within the range of 180°±45° with respect to the one imaging region 1.
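  • The 180°±90° and 180°±45° conditions described above can be checked with a small helper such as the following illustrative sketch (the function names, the default tolerance, and the angle convention measured around the intersection point 3 are assumptions):

```python
def angular_separation_deg(angle_a_deg, angle_b_deg):
    """Smallest absolute angular difference between two positions in the
    projectable region, in degrees (result in [0, 180])."""
    diff = abs(angle_a_deg - angle_b_deg) % 360.0
    return min(diff, 360.0 - diff)

def covers_shadow(angle_a_deg, angle_b_deg, tolerance_deg=45.0):
    """True if the two imaging regions lie within 180° ± tolerance_deg of each
    other, so a portion shadowed at one region is illuminated at the other."""
    return abs(angular_separation_deg(angle_a_deg, angle_b_deg) - 180.0) <= tolerance_deg

# First embodiment: imaging regions 1A (0°) and 1B (180°).
assert covers_shadow(0.0, 180.0)
# Second embodiment (FIG. 13): imaging regions 1C (0°) and 1D (135°).
assert covers_shadow(0.0, 135.0)
```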
  • FIG. 13 is a diagram illustrating the plurality of imaging regions 1 disposed in the projectable region 2 according to the second embodiment.
  • a first imaging region 1 C is disposed at the position of 0° and a second imaging region 1 D is disposed at the position of 135° in the projectable region 2 .
  • In the second embodiment, the position of the imaging unit 30 (second imaging section 32) and the position of the opening 27 of the light-shielding mask 26 are appropriately modified so as to be suitable for the position of the second imaging region 1D.
  • the first imaging region 1 C and the second imaging region 1 D are disposed at these positions in order to exclude the influence of halation.
  • the positions of the first imaging region 1 C and the second imaging region 1 D will be described.
  • FIGS. 14 to 17 are diagrams illustrating the positions of the plurality of imaging regions 1 for excluding the influence of halation and illustrating the influence of halation formed when inspection objects 12 ( 12 A to 12 D) are located at imaging regions 1 ( 1 a to 1 h ).
  • FIGS. 14 to 17 show the influence of halation when the inspection objects 12 are located in the inspection regions 11 in the lateral direction (0°), the longitudinal direction (90°), the direction of +45°, and the direction of −45°, respectively.
  • When the light is emitted vertically to the long side of the inspection object 12, halation may occur in some cases. Accordingly, in FIGS. 14 to 17, such positions are given a corresponding mark.
  • When the light is emitted vertically to the short side of the inspection object 12, halation may also occur in some cases, although the possibility is lower than when the light is emitted vertically to the long side of the inspection object 12. In this case, a different mark is given in the drawings. Since there is a low possibility that the halation occurs in other cases, a third mark is given in the drawings.
  • the inspection object 12 A is disposed in the lateral direction, as shown in FIG. 14 .
  • When the inspection object 12A is located at the imaging regions 1c and 1g of ±90°, the light is emitted vertically to the long side of the inspection object 12A. Therefore, the halation may occur in some cases (see the corresponding mark).
  • When the inspection object 12A is located at the imaging regions 1a and 1e of 0° and 180°, the light is emitted vertically to the short side of the inspection object 12A. Therefore, the halation may occur in some cases (see the corresponding mark).
  • When the inspection object 12A is located at the imaging regions 1b, 1h, 1d, and 1f of ±45° and ±135°, the light is not emitted vertically to either the long side or the short side of the inspection object 12A. In this case, there is a low possibility that the halation occurs (see the corresponding mark).
  • The light is emitted vertically to the long side of the inspection object 12B at the imaging regions 1a and 1e of 0° and 180° (see the corresponding mark).
  • The light is emitted vertically to the short side of the inspection object 12B at the imaging regions 1c and 1g of ±90° (see the corresponding mark).
  • The light is not emitted vertically to either the long side or the short side of the inspection object 12B at the imaging regions 1b, 1h, 1d, and 1f of ±45° and ±135° (see the corresponding mark).
  • At least one imaging region 1 among the plurality of imaging regions 1 is disposed at a position deviated from 0°, ±90°, and 180° in the projectable region 2 in order to prevent the light from being vertically incident on the sides of the inspection objects 12A and 12B disposed in the lateral direction or the longitudinal direction.
  • the second imaging region 1 D is disposed at the position of 135° in the projectable region 2 , as shown in FIG. 13 .
  • The light is emitted vertically to the long side of the inspection object 12C at the imaging regions 1h and 1d of −45° and 135° (see the corresponding mark).
  • The light is emitted vertically to the short side of the inspection object 12C at the imaging regions 1b and 1f of 45° and −135° (see the corresponding mark).
  • The light is not emitted vertically to either the long side or the short side of the inspection object 12C at the imaging regions 1a, 1c, 1g, and 1e of 0°, ±90°, and 180° (see the corresponding mark).
  • The light is emitted vertically to the long side of the inspection object 12D at the imaging regions 1b and 1f of 45° and −135° (see the corresponding mark).
  • The light is emitted vertically to the short side of the inspection object 12D at the imaging regions 1h and 1d of −45° and 135° (see the corresponding mark).
  • The light is not emitted vertically to either the long side or the short side of the inspection object 12D at the imaging regions 1a, 1c, 1g, and 1e of 0°, ±90°, and 180° (see the corresponding mark).
  • At least one imaging region 1 among the plurality of imaging regions 1 is disposed at a position deviated from ±45° and ±135° in the projectable region 2 in order to prevent the light from being vertically incident on the sides of the inspection objects 12C and 12D disposed in the directions of ±45°.
  • the first imaging region 1 C is disposed at the position of 0° in the projectable region 2 , as shown in FIG. 13 .
  • FIG. 18 is a flowchart illustrating the process of the three-dimensional measuring apparatus 100 according to the second embodiment.
  • control unit 41 of the three-dimensional measuring apparatus 100 accepts the substrate 10 from the substrate delivering device (ST 201 ) and determines whether the inspection region 11 is a region imaged at the first imaging region 1 C (ST 202 ).
  • Referring to FIG. 2, the user sets in advance, through the input unit 44, which inspection regions 11 among the plurality of inspection regions 11 (11A to 11J) of the substrate 10 are to be imaged at the first imaging region 1C and which are to be imaged at the second imaging region 1D.
  • For example, assume that the inspection objects 12C and 12D in the directions of +45° and −45° are formed at the inspection regions 11A to 11E among the plurality of inspection regions 11, and that the inspection objects 12A and 12B in the lateral direction and the longitudinal direction are formed at the inspection regions 11F to 11J.
  • the user sets the inspection regions such that the inspection regions 11 A to 11 E are imaged at the first imaging region 1 C and the inspection regions 11 F to 11 J are imaged at the second imaging region 1 D.
  • the control unit 41 matches the inspection region 11 with the position of the first imaging region 1 C (ST 203 ). Next, the control unit 41 allows the projecting unit 20 to project the stripe to the inspection region 11 located at the first imaging region 1 C (ST 204 ) and acquires four images for which the phases of the stripe are different from each other (ST 205 to ST 207 ).
  • In this case, when the inspection object 12 is illuminated with the light (stripe), the light incident on the inspection object 12 is not vertically incident on the side of the inspection object 12 (see FIGS. 16 and 17). Therefore, no halation occurs.
  • control unit 41 calculates the height of each pixel (coordinates) of the inspection region 11 based on the four images by the phase shift method (ST 208 ).
  • the control unit 41 matches the inspection region 11 with the position of the second imaging region 1 D (ST 209 ). Next, the control unit 41 projects the stripe to the inspection region 11 located at the second imaging region 1 D (ST 210 ) and acquires four images for which the phases of the stripe are different from each other (ST 211 to ST 213 ).
  • In this case, when the inspection object 12 is illuminated with the light (stripe), the light incident on the inspection object 12 is not vertically incident on the side of the inspection object 12 (see FIGS. 14 and 15). Therefore, no halation occurs.
  • control unit 41 calculates the height of each pixel (coordinates) of the inspection region 11 based on the four images by the phase shift method (ST 214 ).
  • control unit 41 calculates the height of each pixel of the inspection regions 11 .
  • the control unit 41 subsequently determines whether the calculation of the three-dimensional shape of the inspection region 11 ends for all of the inspection regions 11 (ST 215 ).
  • the control unit 41 transitions to the calculation of the three-dimensional shape of the subsequent inspection region 11 (ST 216 ) and repeats the processes of ST 201 to ST 215 .
  • The control unit 41 moves the substrate 10 up to the discharging position and discharges the substrate 10 (ST 217). Then, the control unit 41 accepts a new substrate 10 from the substrate delivering device and executes the processes of ST 201 to ST 217.
  • the problem with the halation can be resolved irrespective of the direction of the inspection object 12 .
  • In the above description, one inspection region 11 is located at either the first imaging region 1C or the second imaging region 1D and is imaged there.
  • However, one inspection region 11 may be moved sequentially to the positions of the first imaging region 1C and the second imaging region 1D and imaged at both the first imaging region 1C and the second imaging region 1D.
  • For example, assume that the inspection object 12 is disposed in the longitudinal direction in the inspection region 11.
  • In this case, when the inspection region 11 is imaged at the first imaging region 1C, the light may be vertically incident on the long side of the inspection object 12. Therefore, halation may occur in some cases.
  • On the other hand, when the inspection region 11 is imaged at the second imaging region 1D, the light is not vertically incident on the side of the inspection object 12. Therefore, the halation does not occur.
  • The same applies to cases where the inspection object 12 is formed in the lateral direction or in the directions of ±45°.
  • That is, the halation may occur at one of the two imaging regions 1, but does not occur at the other. Accordingly, when one inspection region 11 is imaged at each of the two imaging regions 1C and 1D, the problem with the halation can be resolved irrespective of the direction of the inspection object 12.
  • This approach is particularly effective in that the user does not have to set which imaging region 1 each inspection region 11 is imaged at. Further, when one inspection region 11 is imaged at each of the two imaging regions 1C and 1D, the problem with the halation can appropriately be resolved even in a case where a plurality of inspection objects 12 with different directions are formed in one inspection region 11.
  • Moreover, the problem with the shadow and the problem with the halation can be resolved simultaneously. That is, in the second embodiment, the second imaging region 1D is in the range of 180°±45° with respect to the first imaging region 1C (see FIGS. 9 to 12). Accordingly, the problem with the shadow and the problem with the halation can be resolved simultaneously, irrespective of the direction of the inspection object 12, by imaging one inspection region 11 at each of the two imaging regions 1C and 1D.
  • At least one imaging region 1 among the plurality of imaging regions 1 is disposed at a position deviated from 0°, ±90°, and 180° or ±45° and ±135° in the projectable region 2.
  • However, at least one imaging region 1 among the plurality of imaging regions 1 may be disposed at a position deviated from all of 0°, ±45°, ±90°, ±135°, and 180°.
  • FIGS. 19A to 19D are diagrams illustrating the influence of the halation when the imaging region 1 is disposed at a position deviated from 0°, ±45°, ±90°, ±135°, and 180°.
  • FIGS. 19A to 19D show the inspection object 12 formed in the lateral direction (0°), the longitudinal direction (90°), the direction of +45°, and the direction of −45°, respectively. Further, in FIGS. 19A to 19D, for example, the imaging region 1 is disposed at the position of 202.5°.
  • the third embodiment is different from the above-described embodiments in that a three-dimensional measuring apparatus 100 according to the third embodiment can switch between a high-accuracy mode (first mode) and a high-speed mode (second mode).
  • the high-accuracy mode is a mode in which one inspection region 11 is imaged at the plurality of imaging regions 1 by moving the one inspection region 11 to the positions of the plurality of imaging regions 1 .
  • The high-speed mode is a mode in which a plurality of inspection regions 11 are simultaneously imaged by simultaneously locating the plurality of inspection regions 11 at the plurality of imaging regions 1.
  • FIG. 20 is a diagram illustrating the disposition of the plurality of imaging regions 1 in the projectable region 2 according to the third embodiment.
  • five imaging regions 1 E, 1 F, 1 G, 1 H, and 1 I are disposed in the projectable region 2 .
  • the five imaging regions 1 E, 1 F, 1 G, 1 H, and 1 I are disposed at the positions of 0°, 45°, 135°, 202.5°, and 270°, respectively.
  • The imaging regions 1 disposed at the positions of 0°, 45°, 135°, 202.5°, and 270° are referred to as the first imaging region 1E, the second imaging region 1F, the third imaging region 1G, the fourth imaging region 1H, and the fifth imaging region 1I, respectively.
  • In the third embodiment, since the number of imaging regions 1 is five, the imaging unit 30 includes, for example, five imaging sections corresponding to the five imaging regions 1. Further, five openings 27 of the light-shielding mask 26 are formed at positions corresponding to the positions of the five imaging regions 1.
  • The five imaging regions 1 are disposed in the above-described manner in order to resolve the problem with the shadow or the problem with the halation.
  • at least one imaging region 1 among the five imaging regions 1 is disposed in the range of 180°±90° (or 180°±45°) with respect to the other imaging regions 1 (see FIGS. 9 to 12).
  • the second imaging region 1F and the third imaging region 1G are disposed at the positions deviated from 0°, ±90°, and 180° (see FIGS. 14 and 15).
  • the first imaging region 1E and the fifth imaging region 1I are disposed at the positions deviated from ±45° and ±135° (see FIGS. 16 and 17).
  • the fourth imaging region 1H is disposed at the position deviated from 0°, ±45°, ±90°, ±135°, and 180° (see FIGS. 19A to 19D).
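  • The placement of the five imaging regions can be checked against the shadow and halation conditions described above with the following illustrative sketch (the helper names, the set-based check, and the angle convention are assumptions; angles are taken modulo 360°, so that, for example, −45° corresponds to 315°):

```python
def angular_separation_deg(a_deg, b_deg):
    diff = abs(a_deg - b_deg) % 360.0
    return min(diff, 360.0 - diff)

def covers_shadow(a_deg, b_deg, tolerance_deg=45.0):
    # The two regions lie within 180° ± tolerance_deg of each other.
    return abs(angular_separation_deg(a_deg, b_deg) - 180.0) <= tolerance_deg

HALATION_RISK_RECTILINEAR = {0.0, 90.0, 180.0, 270.0}  # 0°, ±90°, 180° (mod 360°)
HALATION_RISK_DIAGONAL = {45.0, 135.0, 225.0, 315.0}   # ±45°, ±135° (mod 360°)

def deviates_from(angle_deg, risky_angles):
    return (angle_deg % 360.0) not in risky_angles

regions = {"1E": 0.0, "1F": 45.0, "1G": 135.0, "1H": 202.5, "1I": 270.0}

# Halation: 1F and 1G avoid 0°, ±90° and 180°; 1E and 1I avoid ±45° and ±135°;
# 1H avoids all of these positions.
assert deviates_from(regions["1F"], HALATION_RISK_RECTILINEAR)
assert deviates_from(regions["1G"], HALATION_RISK_RECTILINEAR)
assert deviates_from(regions["1E"], HALATION_RISK_DIAGONAL)
assert deviates_from(regions["1I"], HALATION_RISK_DIAGONAL)
assert deviates_from(regions["1H"], HALATION_RISK_RECTILINEAR | HALATION_RISK_DIAGONAL)

# Shadow: pairs of regions roughly opposite each other exist, for example
# 1E (0°) with 1H (202.5°), and 1G (135°) with 1I (270°).
assert covers_shadow(regions["1E"], regions["1H"])
assert covers_shadow(regions["1G"], regions["1I"])
```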
  • the second imaging region 1 F is different from the other imaging regions 1 E, 1 G, 1 H, and 1 I in the distance between the imaging region and the center (the intersection point 3 between the optical axis 4 and the surface of the substrate 10 ) of the projectable region 2 .
  • the measurement accuracy and the measurement range can be varied at the second imaging region 1 F since the irradiation angle of the light emitted to the inspection object 12 is changed, compared to the other imaging regions 1 E, 1 G, 1 H, and 1 I.
  • The larger the irradiation angle of the light emitted to the inspection object 12 is, the better the resolving power is, but the narrower the measurement range is.
  • Conversely, the smaller the irradiation angle of the light emitted to the inspection object 12 is, the poorer the resolving power is, but the wider the measurement range is.
  • In the high-accuracy mode, generally the same process as that described above in the first embodiment is performed.
  • the control unit 41 moves one inspection region 11 sequentially to the positions of the first imaging region 1 E and the fourth imaging region 1 H and images one inspection region 11 at each of the first imaging region 1 E and the fourth imaging region 1 H.
  • For example, a combination of the first imaging region 1E and the fourth imaging region 1H, a combination of the first imaging region 1E and the third imaging region 1G, and a combination of the third imaging region 1G and the fifth imaging region 1I can be used.
  • the present disclosure is not limited to the combinations. Of course, other combinations may be used.
  • the number of imaging regions 1 to be used is not limited to two, but three or more imaging regions 1 may be used. In the high-accuracy mode, since the accurate three-dimensional image of the inspection object 12 can be displayed without the problem with the shadow or the problem with the halation, the user can accurately inspect the inspection regions 11 .
  • The high-accuracy mode and the high-speed mode are switched in response to an instruction from the user through the input unit 44.
  • the control unit 41 simultaneously moves the plurality of inspection regions 11 to the plurality of imaging regions 1 and simultaneously images the plurality of inspection regions 11 at the plurality of imaging regions 1 .
  • the plurality of inspection regions 11 are disposed at the positions shown in FIG. 2 on the substrate 10 .
  • For example, the control unit 41 simultaneously moves the inspection region 11C and the inspection region 11A to the first imaging region 1E and the third imaging region 1G and simultaneously images the two inspection regions 11C and 11A.
  • Next, the control unit 41 simultaneously moves the inspection regions 11J, 11I, and 11H to the first imaging region 1E, the third imaging region 1G, and the fourth imaging region 1H and simultaneously images the three inspection regions 11J, 11I, and 11H.
  • the user sets the inspection regions 11 to be simultaneously imaged among the ten inspection regions 11 in advance in the three-dimensional measuring apparatus 100 through the input unit 44 .
  • Then, the control unit 41 sequentially matches each of the other five inspection regions 11B and 11D to 11G with the position of the fourth imaging region 1H and images it at the fourth imaging region 1H.
  • Accordingly, the ten inspection regions 11 can be imaged in seven imaging operations (four images are acquired per imaging operation). In this way, in the high-speed mode, the inspection regions 11 can be imaged at high speed and the three-dimensional shape of the inspection object 12 can be calculated at high speed.
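  • The operation count described above can be reproduced with the following illustrative sketch; the grouping literals mirror the example in the text, and the constant name is an assumption:

```python
# Grouping from the example above: two inspection regions imaged simultaneously,
# then three simultaneously, then the remaining five one at a time at region 1H.
IMAGES_PER_OPERATION = 4  # four phase-shifted images per imaging operation

simultaneous_groups = [
    ["11C", "11A"],                                # imaged together at 1E and 1G
    ["11J", "11I", "11H"],                         # imaged together at 1E, 1G and 1H
    ["11B"], ["11D"], ["11E"], ["11F"], ["11G"],   # imaged one by one at 1H
]

operations = len(simultaneous_groups)
regions_covered = sum(len(group) for group in simultaneous_groups)
assert regions_covered == 10  # all ten inspection regions 11A to 11J
print(f"{operations} imaging operations, {operations * IMAGES_PER_OPERATION} images in total")
# -> 7 imaging operations, 28 images in total
```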
  • In the above description, two or three inspection regions 11 are simultaneously imaged.
  • four or more inspection regions 11 may be moved to the positions of four or more imaging regions 1 and may simultaneously be imaged.
  • the user can arbitrarily switch between the high-accuracy mode and the high-speed mode.
  • when it is necessary to inspect the inspection object 12 accurately, the user can select the high-accuracy mode and display an accurate three-dimensional image of the inspection object 12 .
  • when it is necessary to inspect the inspection object 12 at high speed, the user can select the high-speed mode and display the three-dimensional image of the inspection object 12 at high speed.
  • the case has hitherto been described in which the three-dimensional measuring apparatus 100 has both the high-accuracy mode and the high-speed mode; however, the three-dimensional measuring apparatus 100 may have only the high-speed mode.
  • the stage moving mechanism 46 has hitherto been described as an example of a moving unit that moves the relative position of the substrate 10 with respect to the plurality of imaging regions 1 in the XY directions.
  • in general, the moving unit only needs to move the relative position of the substrate 10 with respect to the plurality of imaging regions 1 in the XY directions.
  • the moving unit may move the projecting unit 20 and the imaging unit 30 in the XY directions.
  • the moving unit may move the stage 45 (substrate 10 ), the projecting unit 20 , and the imaging unit 30 in the XY directions.
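A minimal interface sketch of this point, using hypothetical class names (the disclosure only requires that the relative XY position between the substrate 10 and the imaging regions 1 change, whichever side actually moves):

```python
from typing import Protocol

class MovingUnit(Protocol):
    def move_relative(self, dx_mm: float, dy_mm: float) -> None:
        """Shift the substrate 10 by (dx_mm, dy_mm) relative to the imaging regions 1."""

class StageMover:
    """Moves the stage 45 (and therefore the substrate 10), like the stage moving mechanism 46."""
    def move_relative(self, dx_mm: float, dy_mm: float) -> None:
        print(f"stage moved by ({dx_mm:+.3f}, {dy_mm:+.3f}) mm")

class OpticsMover:
    """Moves the projecting unit 20 and the imaging unit 30 instead; shifting the
    optics by (-dx, -dy) produces the same relative displacement of the substrate."""
    def move_relative(self, dx_mm: float, dy_mm: float) -> None:
        print(f"projecting/imaging units moved by ({-dx_mm:+.3f}, {-dy_mm:+.3f}) mm")

def align(mover: MovingUnit, dx_mm: float, dy_mm: float) -> None:
    """The control logic depends only on the relative motion, not on which part moves."""
    mover.move_relative(dx_mm, dy_mm)

align(StageMover(), 1.5, -0.5)
align(OpticsMover(), 1.5, -0.5)
```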
  • the imaging unit 30 includes the plurality of imaging sections corresponding to the plurality of imaging regions 1 .
  • the imaging unit 30 may include an imaging section capable of singly imaging two or more imaging regions 1 .
  • FIG. 21 is a diagram illustrating an example of an imaging section capable of singly imaging two or more imaging regions 1 . Differences from the first embodiment will be mainly described with reference to FIG. 21 .
  • the reflection unit 50 includes a first reflection section 51 installed below the imaging section 33 (above the first imaging region 1 A) and a second reflection section 52 installed above the second imaging region 1 B.
  • the first reflection section 51 is rotatable about an axis extending in the Y axis direction.
  • the first reflection section 51 is rotated by a driving unit 53 such as a motor.
  • the driving unit 53 rotates the first reflection section 51 under the control of the control unit 41 to switch between a first incident state, in which the light from the first imaging region 1 A is incident on the imaging section 33 , and a second incident state, in which the light from the second imaging region 1 B is guided to the imaging section 33 by the reflection unit 50 and is incident on the imaging section 33 .
  • when there are imaging regions 1 other than the first imaging region 1 A and the second imaging region 1 B, the reflection unit 50 may be configured such that the light from those imaging regions 1 can also be incident on the imaging section 33 . In the example shown in FIG. 21 , the cost can be reduced since the number of imaging sections 33 can be reduced.
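The switching in FIG. 21 amounts to a two-state control sequence. The sketch below uses hypothetical interfaces (rotate_to, capture) for the driving unit 53 and the imaging section 33; the disclosure specifies only that the control unit 41 has the driving unit rotate the first reflection section to select which imaging region's light reaches the single imaging section.

```python
from enum import Enum, auto

class IncidentState(Enum):
    FIRST = auto()    # light from the first imaging region 1A reaches the imaging section 33 directly
    SECOND = auto()   # light from the second imaging region 1B is folded onto it by the reflection unit 50

def image_both_regions(driving_unit, imaging_section):
    """Capture both imaging regions with the single imaging section 33 by
    toggling the rotatable first reflection section between the two states."""
    frames = {}
    for state in (IncidentState.FIRST, IncidentState.SECOND):
        driving_unit.rotate_to(state)            # motor rotates the mirror about the Y-axis
        frames[state] = imaging_section.capture()
    return frames
```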
  • an imaging section whose position relative to the projecting unit 20 can be moved in the XY directions can be used as another example of an imaging section capable of singly imaging two or more imaging regions 1 .
  • for example, such an imaging section is rotatable around the projecting unit 20 and is movable in a direction toward the projecting unit 20 or in a direction away from the projecting unit 20 .
  • the substrate 10 on which solder for soldering a mounted component is printed has hitherto been described as an example of the measurement object 10 .
  • however, the measurement object 10 is not limited thereto.
  • Examples of the measurement object 10 also include a substrate on which an adhesive for adhering a mounted component is applied, a wiring substrate on which a wiring pattern is formed, a substrate on which glass is printed, and a substrate on which a fluorescent substance is printed.
  • the optical axis 4 of the projecting unit 20 is perpendicular to the surface (projecting surface) of the substrate 10 .
  • however, the optical axis 4 of the projecting unit 20 need not be perpendicular to the surface of the substrate 10 , as long as the stripe can be projected onto the region around the intersection point 3 between the surface of the substrate 10 and a vertical line drawn from the projecting unit 20 to the substrate 10 . Even in this case, the problem with the shadow and the problem with the halation can appropriately be excluded.
Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-019431 2011-02-01
JP2011019431A JP5765651B2 (ja) 2011-02-01 2011-02-01 Three-dimensional measuring apparatus
US13/357,042 2011-02-01 2012-01-24 Three-dimensional measuring apparatus Abandoned US20120194647A1 (en)

Publications (1)

Publication Number Publication Date
US20120194647A1 (en) 2012-08-02

Family

ID=46577046

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/357,042 Abandoned US20120194647A1 (en) 2011-02-01 2012-01-24 Three-dimensional measuring apparatus

Country Status (3)

Country Link
US (1) US20120194647A1 (en)
JP (1) JP5765651B2 (ja)
CN (1) CN102628677B (ja)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6507653B2 (ja) * 2015-01-13 2019-05-08 Omron Corporation Inspection device and method for controlling an inspection device
JP6848385B2 (ja) * 2016-11-18 2021-03-24 Omron Corporation Three-dimensional shape measuring device
CN114252024B (zh) * 2020-09-20 2023-09-01 浙江四点灵机器人股份有限公司 Workpiece three-dimensional measuring device and method with a single measuring module and multiple working modes


Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4335024B2 (ja) * 2004-01-27 2009-09-30 Olympus Corporation Three-dimensional shape measuring method and apparatus therefor
CN103134446B (zh) * 2008-02-26 2017-03-01 Koh Young Technology Inc. Three-dimensional shape measuring device and measuring method
CZ2009133A3 (cs) * 2009-03-03 2009-07-08 Witrins S.R.O. Device and method for measuring the external dimensions of a tested product, and use of this device
JP5441840B2 (ja) * 2009-07-03 2014-03-12 Koh Young Technology Inc. Three-dimensional shape measuring device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050047637A1 (en) * 2003-08-29 2005-03-03 Steven Greenbaum Image buffers and access schedules for image reconstruction systems
US20080312866A1 (en) * 2003-09-11 2008-12-18 Katsunori Shimomura Three-dimensional measuring equipment
US7525668B2 (en) * 2005-04-14 2009-04-28 Panasonic Corporation Apparatus and method for appearance inspection
US20070247668A1 (en) * 2006-04-24 2007-10-25 Negevtech Of Rehovot Printed fourier filtering in optical inspection tools
US20080212838A1 (en) * 2006-12-21 2008-09-04 Massachusetts Institute Of Technology Methods and apparatus for 3D surface imaging using active wave-front sampling
US20100008543A1 (en) * 2007-04-05 2010-01-14 Tomoaki Yamada Shape measuring device and shape measuring method

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180040118A1 (en) * 2015-02-27 2018-02-08 Koh Young Technology Inc. Substrate inspection method and system
US10672116B2 (en) * 2015-02-27 2020-06-02 Koh Young Technology Inc. Substrate inspection method and system
US10997714B2 (en) 2017-02-13 2021-05-04 Koh Young Technology Inc. Apparatus for inspecting components mounted on printed circuit board, operating method thereof, and computer-readable recording medium
US11481893B2 (en) 2017-02-13 2022-10-25 Koh Young Technology Inc. Apparatus for inspecting components mounted on printed circuit board, operating method thereof, and computer-readable recording medium
US20210310791A1 (en) * 2018-08-24 2021-10-07 Yamaha Hatsudoki Kabushiki Kaisha Three-dimensional measuring apparatus, three-dimensional measuring method
US11796308B2 (en) * 2018-08-24 2023-10-24 Yamaha Hatsudoki Kabushiki Kaisha Three-dimensional measuring apparatus, three-dimensional measuring method
US20210348918A1 (en) * 2018-09-27 2021-11-11 Yamaha Hatsudoki Kabushiki Kaisha Three-dimensional measuring device

Also Published As

Publication number Publication date
CN102628677A (zh) 2012-08-08
JP5765651B2 (ja) 2015-08-19
JP2012159397A (ja) 2012-08-23
CN102628677B (zh) 2016-12-14

Similar Documents

Publication Publication Date Title
US20120194647A1 (en) Three-dimensional measuring apparatus
US20210207954A1 (en) Apparatus and method for measuring a three-dimensional shape
KR101152842B1 (ko) Three-dimensional measuring device and substrate inspection device
US9441957B2 (en) Three-dimensional shape measuring apparatus
US7019848B2 (en) Three-dimensional measuring instrument, filter striped plate, and illuminating means
TWI622754B (zh) Three-dimensional measuring device
KR101273094B1 (ko) Method for measuring PCB bump height using a three-dimensional shape measuring instrument based on optical triangulation
JP2003279329A (ja) Three-dimensional measuring device
EP3282223A1 (en) Three-dimensional shape measuring apparatus
TWI504856B (zh) Continuous-scanning type measuring device
JP7280774B2 (ja) Three-dimensional shape measuring device, three-dimensional shape measuring method, three-dimensional shape measuring program, computer-readable recording medium, and recorded equipment
JP5594923B2 (ja) Substrate surface height measuring method and device
JP2011089939A (ja) Appearance inspection device and printed solder inspection device
US20180367722A1 (en) Image acquisition device and image acquisition method
JP6097389B2 (ja) Inspection device and inspection method
JP2009092485A (ja) Printed solder inspection device
JP2011021970A (ja) Three-dimensional shape measuring device and three-dimensional shape measuring method
JP6694248B2 (ja) Three-dimensional image inspection device, three-dimensional image inspection method, three-dimensional image inspection program, and computer-readable recording medium
JP2017032340A (ja) Three-dimensional image inspection device, three-dimensional image inspection method, three-dimensional image inspection program, and computer-readable recording medium
WO2020241061A1 (ja) Three-dimensional measuring device and three-dimensional measuring method
JP2009097940A (ja) Shape measuring device
JP2003262509A (ja) Inspection device, measuring device, inspection method, and measuring method
JP6847088B2 (ja) Projection device and three-dimensional measuring device
JP2018197683A (ja) Surface shape distortion measuring device
WO2020116052A1 (ja) Projection device and three-dimensional measuring device

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TOMARU, TAKUMI;REEL/FRAME:028013/0231

Effective date: 20120403

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION